[petsc-users] slepc eating all my ram

Simon Burton simon at arrowtheory.com
Fri Jul 15 07:29:13 CDT 2016


Hi,

I'm running a SLEPc eigenvalue solver on a single machine with 198GB of RAM,
on a solution space of dimension 2^32. With double precision this means
each vector is 32GB. I'm using shell matrices to implement the matrix-vector
product. I figured the easiest way to get eigenvalues would be the
SLEPc power method, but it is still eating all the RAM.

Running under gdb I see that SLEPc allocates a bunch of vectors in
the spectral transform object (in STSetUp), and by that point it has consumed
most of the 198GB of RAM. I don't see why a spectral-transform
shift of zero needs to allocate so much memory.

I'm wondering if there are other SLEPc options that can
reduce the memory footprint? A barebones implementation of the
power method only needs to keep two vectors, so perhaps I should
just implement it directly with PETSc primitives. It's also possible that
I could spread the computation over two or more machines, but
that's a whole other learning curve.

The code I am running is essentially the Laplacian grid
example from SLEPc (src/eps/examples/tutorials/ex3.c):

./ex3 -eps_hermitian -eps_largest_magnitude -eps_monitor ascii -eps_nev 1 -eps_type power -n 65536

I also added this line to the source (nev=1, ncv=2, mpd=1):
EPSSetDimensions(eps,1,2,1);

Cheers,

Simon.
