[petsc-users] SLEPc eigensolver that uses minimal memory and finds ALL eigenvalues of a real symmetric sparse matrix in reasonable time

Shitij Bhargava shitij.cse at gmail.com
Wed Aug 3 01:30:42 CDT 2011


Hi all !!



The reason I have started using SLEPc/PETSc is that I want to reduce the
memory requirements of a program that I already have in Intel MKL. The
program's purpose is to calculate ALL eigenvalues and eigenvectors of a real
symmetric sparse matrix. Intel MKL doesn't have a sparse solver for this
(it can't solve for the eigenvalues of a matrix stored in a sparse format),
so I am using SLEPc, which can.
(NOTE: by "sparsity" I mean ZEROES*100/(ZEROES+NONZEROES), i.e. the
percentage of zero entries. The tolerance I am using is 10^-6.)

Until now I have been using the LAPACK method (EPSLAPACK) for the SLEPc
eigensolver, as I found it was the fastest (I configured PETSc to use the
Intel MKL BLAS/LAPACK routines). But I just found out that it uses a lot of
memory irrespective of the sparsity of the matrix. For example, on a matrix
of order 9600x9600, no matter how sparse or dense it is (I tested from
0.02% to 75% sparsity), it used 1.3 GB of peak memory every time, whereas
the part of the program in which the matrix is assembled in AIJ format (in
the case of 75% sparsity) takes only 550 MB. The MKL program takes 3.1 GB,
so the memory requirement is lower, but the difference is constant
irrespective of sparsity. So it seems that the LAPACK method uses memory
dependent on the order of the matrix, irrespective of the sparsity, and I
guess it is not very suitable for my purpose? I also tried changing the NCV
and MPD (maximum projected dimension) parameters, hoping they would reduce
the memory requirements even at the cost of more time, but whatever values
I pass for NCV and MPD (in EPSSetDimensions), they are not considered at
all. They remain at the default values (as I verified with
EPSGetDimensions).
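To make the numbers concrete, here is a rough back-of-the-envelope estimate I did of why a dense (LAPACK) method cannot benefit from sparsity (my own assumed figures: 8-byte reals, 4-byte integer indices; the exact AIJ overhead may differ):

```python
n = 9600

# Dense storage, as any LAPACK-based method needs: n*n 8-byte reals,
# before counting LAPACK workspace or the eigenvector output.
dense_bytes = n * n * 8

# AIJ/CSR storage at 75% zeroes: one 8-byte value plus one 4-byte column
# index per nonzero, plus n+1 row pointers.
nnz = int(0.25 * n * n)
aij_bytes = nnz * (8 + 4) + (n + 1) * 4

print(dense_bytes / 2**30)  # ~0.69 GiB for the dense matrix alone
print(aij_bytes / 2**30)    # ~0.26 GiB for the sparse representation
```

So the ~1.3 GB peak I observe looks consistent with the full dense matrix plus workspace, independent of how many entries are zero.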


So I tried other solvers like the default, krylovschur, and lanczos, but I
think they are built to find only a few eigenvalues rather than all of
them, because they almost never converged to ALL eigenvalues, even when I
gave them a lot of time (a large maximum-iterations parameter). Their
memory requirements, however, seem to be quite low. For example, I ran the
program with EPSLANCZOS on the same 9600x9600 matrix (75% sparsity) with
the maximum number of iterations set to 500,000. It ran for 6.5 hours (then
I killed it), and at the end it was using only 530 MB (although I don't
know how close it was to completion).
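My (possibly naive) understanding of why the Krylov-type solvers stay small is that a restarted method only keeps the sparse matrix plus ncv basis vectors in memory. A rough sketch of that arithmetic (the basis size 32 is my own assumed number, not SLEPc's actual default for my settings):

```python
n = 9600
ncv = 32                    # assumed subspace size; SLEPc's default depends on nev

# ncv basis vectors of length n, 8-byte reals each
basis_bytes = n * ncv * 8

print(basis_bytes / 2**20)  # ~2.3 MiB, tiny next to the ~0.7 GiB dense matrix
```

That would be consistent with the 530 MB I saw being dominated by the AIJ matrix itself rather than by the solver.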

I am somewhat illiterate about the mathematics behind all this and about
how the LAPACK, Krylov-Schur, etc. methods work, so I have no idea whether
what I am experiencing was expected/obvious, although I half-expected some
of it from reading the SLEPc documentation, where a table compares the
different eigensolvers.

Also, I have observed that as the sparsity of the matrix increases (more
zeroes), more eigenvalues also become zero. Is there a way to exploit this,
probably with krylovschur/lanczos, so that there is no need to converge to
all eigenvalues (meaning the time they take will be reasonable), but only
to the eigenvalues that are non-zero, so that I can assume the
non-converged eigenvalues are simply zero? (But is there a way to know
beforehand, from the sparsity, approximately how many non-zero eigenvalues
there will be, and the corresponding maximum number of iterations
required?)
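In case it clarifies what I am asking: my understanding is that for a symmetric matrix the number of non-zero eigenvalues equals its rank, and sparsity alone does not determine the rank. A small numpy experiment I tried (my own illustrative construction, unrelated to my actual matrices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 200, 20

# Build a symmetric matrix of rank r: exactly r eigenvalues are non-zero.
B = rng.standard_normal((n, r))
A = B @ B.T

w = np.linalg.eigvalsh(A)
nonzero = int(np.sum(np.abs(w) > 1e-8 * np.abs(w).max()))
print(nonzero)  # 20: equals rank(A), however sparse or dense A happens to be
```

So I suspect there is no sparsity-to-count formula in general, but I would be happy to be corrected if SLEPc can exploit a known or estimated rank.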

Is there a way to do what I want to do through SLEPc? I want ALL
eigenvalues, and I want the memory requirements of the eigensolver to be
low, in accordance with the sparsity of the matrix; otherwise the very
purpose of using a sparse matrix representation is somewhat defeated. I am
doing all this in FORTRAN, by the way.


Thanks a lot in advance !!!


Shitij