[petsc-users] SLEPc eigensolver that uses minimal memory and finds ALL eigenvalues of a real symmetric sparse matrix in reasonable time

Jose E. Roman jroman at dsic.upv.es
Thu Aug 4 06:27:46 CDT 2011


On 04/08/2011, at 08:48, Shitij Bhargava wrote:

> Thank you very much for your reply.
> 
> I am a summer trainee, and I talked to my guide about this, and now what I want might be a little different.
> 
> I really do want all the eigenvalues; there cannot be any compromise on that. But please note that time is not much of a concern here; the much bigger concern is memory. Even if I can somehow distribute the memory for the computation of all eigenvalues, that is fine (it doesn't matter if it takes a huge amount of time, he is willing to wait). Can any SLEPc eigensolver distribute memory this way while solving for all eigenvalues?
> 
> I am sorry if what I am asking is obvious; I don't know much about distributed memory and related topics yet, but I am learning.
> 
> Also, when I ran the SLEPc program on a small cluster and then ran the top command, I saw that while solving for eigenvalues the program was already using up to 500% CPU. Does this mean that the eigensolver was "distributing memory" automatically among different nodes? I understand that it must be distributing the computational effort, but how do I know whether it was distributing memory as well? top simply showed the RES memory usage to be 1.3 GB and the %MEM to be about 17%. Also, I did not use any MPIXXX data structures; I simply used SeqAIJ. Will using MPIAIJ "distribute memory" while solving for all eigenvalues? (I understand that it will distribute the memory needed to store the matrix, but the memory bottleneck is the eigensolver, so it is more important for the memory to be distributed there; see the sketch right after this message.) My guide said that one node will not have enough memory, but all the nodes combined will, hence the requirement for the memory to be distributed.
> 
> I read about ScaLAPACK, and I have already asked on their forum to confirm whether it is suitable for my problem. I am waiting for an answer.
> 
> Thank you once again.
> 
> Shitij
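
A minimal sketch of the MPIAIJ setup asked about above (not part of the original exchange; the matrix size n and the tridiagonal entries are placeholders). With MATMPIAIJ each rank stores only its own block of rows, and an EPS solver created on the same communicator distributes its basis vectors the same way:

  /* Sketch only: distributed symmetric matrix + SLEPc eigensolver */
  #include <slepceps.h>

  int main(int argc, char **argv)
  {
    Mat       A;
    EPS       eps;
    PetscInt  n = 10000, Istart, Iend, i;   /* placeholder size */

    SlepcInitialize(&argc, &argv, NULL, NULL);

    /* MPIAIJ: matrix storage is split row-wise across MPI ranks */
    MatCreate(PETSC_COMM_WORLD, &A);
    MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
    MatSetType(A, MATMPIAIJ);
    MatMPIAIJSetPreallocation(A, 3, NULL, 3, NULL);  /* ~3 nonzeros per row here */

    MatGetOwnershipRange(A, &Istart, &Iend);
    for (i = Istart; i < Iend; i++) {
      /* placeholder entries: a symmetric tridiagonal matrix */
      if (i > 0)   MatSetValue(A, i, i-1, -1.0, INSERT_VALUES);
      if (i < n-1) MatSetValue(A, i, i+1, -1.0, INSERT_VALUES);
      MatSetValue(A, i, i, 2.0, INSERT_VALUES);
    }
    MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

    /* The solver lives on the same communicator, so its work vectors are
       distributed like the matrix rows; only the small projected problem
       is replicated on every rank. */
    EPSCreate(PETSC_COMM_WORLD, &eps);
    EPSSetOperators(eps, A, NULL);
    EPSSetProblemType(eps, EPS_HEP);   /* real symmetric problem */
    EPSSetFromOptions(eps);            /* picks up -eps_nev, -eps_mpd, ... */
    EPSSolve(eps);

    EPSDestroy(&eps);
    MatDestroy(&A);
    SlepcFinalize();
    return 0;
  }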

Try something like this:

$ mpirun -np 8 ./program -eps_nev 4800 -eps_mpd 400 -eps_largest_real
$ mpirun -np 8 ./program -eps_nev 4800 -eps_mpd 400 -eps_smallest_real

But this may not work for the smallest eigenvalues.
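
For reference, the same choices can be made in the source instead of on the command line; a sketch (the numbers simply mirror the commands above):

  /* Sketch only: programmatic equivalent of
     -eps_nev 4800 -eps_mpd 400 -eps_largest_real */
  EPSSetDimensions(eps, 4800, PETSC_DEFAULT, 400);  /* nev, ncv, mpd */
  EPSSetWhichEigenpairs(eps, EPS_LARGEST_REAL);     /* or EPS_SMALLEST_REAL */

The -eps_mpd value bounds the dimension of the working subspace, which is what keeps the memory footprint manageable while sweeping through such a large number of eigenpairs.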

Jose


