[petsc-users] How to use PETSc4py/SLEPc4py to solve an eigenvalue problem in parallel
Jose E. Roman
jroman at dsic.upv.es
Wed Oct 29 09:23:22 CDT 2014
On 29/10/2014, at 14:56, Mengqi Zhang wrote:
> Hi, Jose
>
> Thanks for your reply.
> I will specify the problem more precisely. I want to solve a big generalized eigenvalue problem. The matrices are sparse. My question is about how to distribute the matrices among the processors. Do I have to do it myself, or is there a routine to do so? I notice that one should do something like
> from petsc4py import PETSc
>
> A = PETSc.Mat().create()
> A.setType('aij')
> A.setSizes([M, N])
> A.setPreallocationNNZ([diag_nz, offdiag_nz])  # optional
> A.setUp()
> I have several questions regarding these lines.
> (1) I presume M and N are the dimensions of the matrix. How do the processors divide the matrix? I guess setPreallocationNNZ does the allocation of the matrix among the processors. What does nz mean here? Why do diag and offdiag appear here?
> (2) I actually saw somewhere else that people use A.setPreallocationNNZ(5), with the single parameter 5. What does the 5 mean here?
> (3) I want to make sure that the matrix generated this way is sparse (since it uses aij). It is sparse, right? This seems tricky to me: if the matrix is stored as sparse, will the distribution/parallelization destroy the efficiency of the sparse matrix?
>
>
> After the matrix is set up, I would like to use SLEPc4py to solve the generalized eigenvalue problem. The example code I found online is:
> E = SLEPc.EPS(); E.create()
> E.setOperators(A)
> E.setProblemType(SLEPc.EPS.ProblemType.GNHEP)
> E.setFromOptions()
> E.solve()
> I'm afraid this script is not set up for parallel computation, since the options give no indication of parallelization. Do you know how to set it up?
>
> Thank you very much for your time. I appreciate it very much.
> Best,
> Mengqi
>
Almost all examples under slepc4py/demo are parallel. You can take, for instance, ex1.py as a basis for your own scripts. In these examples, every process fills its part of the matrix (the locally owned rows), as obtained from getOwnershipRange(). See PETSc's documentation here:
http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetOwnershipRange.html
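For illustration, here is a minimal sketch of that pattern: each process queries its ownership range and sets values only in its own rows. The global size and the tridiagonal entries are made up for the example.

from petsc4py import PETSc

n = 100  # global size, arbitrary for this example

A = PETSc.Mat().create()
A.setType('aij')
A.setSizes([n, n])
A.setPreallocationNNZ([3, 2])  # rough per-row estimates; see below
A.setUp()

# Each process fills only the rows it owns.
rstart, rend = A.getOwnershipRange()
for i in range(rstart, rend):
    A[i, i] = 2.0
    if i > 0:
        A[i, i - 1] = -1.0
    if i < n - 1:
        A[i, i + 1] = -1.0

A.assemblyBegin()
A.assemblyEnd()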
The meaning of the preallocation arguments is also explained in PETSc's manpages:
http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatMPIAIJSetPreallocation.html
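Briefly, for an MPI AIJ matrix each process stores a "diagonal" block (the columns it also owns) and an "off-diagonal" block (the remaining columns), and the two numbers are per-row estimates of the nonzeros in each block. A single number is just one uniform per-row estimate. Preallocation is only a performance hint: the matrix is stored in sparse AIJ format either way, and distributing it across processes does not change that. A rough sketch, with arbitrary counts:

from petsc4py import PETSc

A = PETSc.Mat().create()
A.setType('aij')
A.setSizes([100, 100])
# Two estimates: nonzeros per row in the diagonal block (columns owned
# by this process) and in the off-diagonal block (all other columns).
A.setPreallocationNNZ([3, 2])
# A single value, e.g. A.setPreallocationNNZ(5), gives one uniform
# per-row estimate instead.
A.setUp()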
petsc4py/slepc4py are just Python wrappers to PETSc/SLEPc, so you must understand how PETSc and SLEPc work, and dig into their manuals and manpages.
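As for your last question: the script itself does not change for parallel execution. The matrices and the solver are distributed automatically, and parallelism comes from launching the script under MPI, e.g. mpiexec -n 4 python ex1.py. A minimal sketch of a generalized solve, assuming A and B are already assembled as above (the printing at the end is just illustrative):

from petsc4py import PETSc
from slepc4py import SLEPc

# A and B are assumed to be assembled distributed matrices (see above).
E = SLEPc.EPS()
E.create()
E.setOperators(A, B)                           # generalized problem A*x = lambda*B*x
E.setProblemType(SLEPc.EPS.ProblemType.GNHEP)  # generalized non-Hermitian
E.setFromOptions()                             # picks up -eps_* command-line options
E.solve()

nconv = E.getConverged()
PETSc.Sys.Print("Number of converged eigenpairs: %d" % nconv)
for i in range(nconv):
    PETSc.Sys.Print(E.getEigenvalue(i))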
Jose