[petsc-users] SLEPc eigensolver that uses minimal memory and finds ALL eigenvalues of a real symmetric sparse matrix in reasonable time

Jose E. Roman jroman at dsic.upv.es
Mon Aug 8 04:25:54 CDT 2011


On 08/08/2011, at 09:14, Shitij Bhargava wrote:

> Thank you Jed. That was indeed the problem. I installed a separate MPI for PETSc/SLEPc, but was running my program with a default, already installed one.
> 
> Now, I have a different question. What I want to do is this:
> 
> 1. Only 1 process, say root, calculates the matrix in SeqAIJ format
> 2. Then root creates the EPS context, eps, and initializes it, sets the parameters, problem type, etc. properly
> 3. After this, the root process broadcasts this eps object to the other processes
> 4. I use EPSSolve to solve for the eigenvalues (all processes working together, so that the memory is distributed)
> 5. I get the results from root
> 
> Is this possible? I am not able to broadcast the EPS object, because it is not an MPI_Datatype. Is there any PETSc/SLEPc function for this? I am avoiding MPIAIJ because that would mean making many changes in the existing code, including the numerous write(*,*) statements (I would have to convert them to PetscPrintf in Fortran or something like that). 
> So I want a single process to handle matrix generation and assembly, but I want to solve the eigenproblem in parallel across different processes. Running the subroutine EPSSolve in parallel, and hence distributing memory, is the only reason I want to use MPI.
> 
> Thanks a lot !!
> 
> Shitij

No, you cannot use a solver in parallel with a matrix in SeqAIJ format. The matrix must be MPIAIJ.

If you want to generate the matrix only in process 0, you can call MPI_Comm_rank(PETSC_COMM_WORLD,&rank) and then enclose the matrix generation (the MatSetValues calls only) in an if clause:

if (!rank) {   /* only process 0 inserts values */
    ierr = MatSetValues(A,...);CHKERRQ(ierr);
}
/* assembly is collective: all processes must call it */
ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
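
To make the complete workflow concrete, here is a minimal sketch (not part of the original exchange; the matrix size N and the diagonal entries are placeholders, and minor details may differ between PETSc/SLEPc versions). The matrix is created on PETSC_COMM_WORLD, filled from process 0 only, and then all processes call EPSSolve together:

#include <slepceps.h>

int main(int argc,char **argv)
{
  Mat            A;
  EPS            eps;
  PetscMPIInt    rank;
  PetscInt       i,N=100;            /* placeholder problem size */
  PetscErrorCode ierr;

  ierr = SlepcInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;
  MPI_Comm_rank(PETSC_COMM_WORLD,&rank);

  /* matrix on PETSC_COMM_WORLD: becomes MPIAIJ when run with more than one process */
  ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
  ierr = MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,N,N);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);
  ierr = MatSetUp(A);CHKERRQ(ierr);

  if (!rank) {                       /* only process 0 inserts entries; values for rows
                                        owned by other processes are communicated
                                        automatically during assembly */
    for (i=0;i<N;i++) {
      ierr = MatSetValue(A,i,i,(PetscScalar)(i+1),INSERT_VALUES);CHKERRQ(ierr);
    }
  }
  ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  /* every process creates the EPS on the same communicator; nothing is broadcast */
  ierr = EPSCreate(PETSC_COMM_WORLD,&eps);CHKERRQ(ierr);
  ierr = EPSSetOperators(eps,A,NULL);CHKERRQ(ierr);
  ierr = EPSSetProblemType(eps,EPS_HEP);CHKERRQ(ierr);   /* real symmetric problem */
  ierr = EPSSetFromOptions(eps);CHKERRQ(ierr);
  ierr = EPSSolve(eps);CHKERRQ(ierr);                    /* runs in parallel */

  ierr = EPSDestroy(&eps);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = SlepcFinalize();
  return ierr;
}

Retrieving the results (e.g., with EPSGetEigenvalue) would follow EPSSolve; printing from process 0 only avoids duplicate output.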

However, it is better to do the matrix generation in parallel also.
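
As a rough sketch of what that would look like (again not from the original message; it assumes the same matrix A as in the sketch above and uses placeholder diagonal values), each process queries its row range and fills only the rows it owns:

  PetscInt Istart,Iend,i;

  ierr = MatGetOwnershipRange(A,&Istart,&Iend);CHKERRQ(ierr);
  for (i=Istart;i<Iend;i++) {     /* each process sets only its locally owned rows */
    ierr = MatSetValue(A,i,i,(PetscScalar)(i+1),INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

This avoids storing the whole matrix on process 0 and sending most of it over the network during assembly.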

Jose
