UMFPACK out of memory

francois pacull fpacull at fluorem.com
Tue Oct 27 10:12:21 CDT 2009


Dear PETSc team,

I have a few questions... While solving a linear system in parallel, I 
am trying to apply a local LU solver to each of the SeqAIJ diagonal 
blocks of an MPIAIJ matrix partitioned with ParMETIS.
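
To make this concrete, here is a minimal sketch of the kind of solver
setup I mean (block Jacobi with one block per process and an LU
sub-solver; the KSP/PC calls are standard PETSc, but the run-time option
names are written from memory and may not be exact for every version):

    /* A is the MPIAIJ matrix, b the right-hand side, x the solution */
    KSP ksp;
    PC  pc;
    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCBJACOBI);  /* one SeqAIJ diagonal block per process */
    KSPSetFromOptions(ksp);    /* choose the sub-solver at run time */
    KSPSolve(ksp, b, x);

with run-time options such as

    -sub_ksp_type preonly -sub_pc_type lu
    -sub_pc_factor_mat_solver_package umfpack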

- I started with MUMPS, but it seems that it only works with 
unpartitioned AIJ matrices. Is that really the case? Or could MUMPS be 
used to build, for example, an additive Schwarz preconditioner, as 
sketched below?
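
What I had in mind is something along these lines (assuming the
sub-solver option accepts mumps, which is precisely what I am unsure
about):

    -pc_type asm -sub_ksp_type preonly -sub_pc_type lu
    -sub_pc_factor_mat_solver_package mumps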

- Then I tried UMFPACK. This works fine when the diagonal blocks (and 
the memory required to store the factors) are small, but it crashes when 
they are a little larger. For example, with a "numeric final size" of 
4203.7 MBytes, I got the message "ERROR: out of memory" even though 
there was plenty of memory left on the machine. I tried both UMFPACK 
version 5.2, downloaded by PETSc, and a manually installed version 5.4 
linked to PETSc. Is this a behavior of UMFPACK that you have already 
encountered?
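
My guess, which may well be wrong, is that this is not a genuine lack of
memory but an overflow related to the 32-bit int type used by the
umfpack_di_* interface that PETSc calls: 4203.7 MB is just above
4096 MB = 2^32 bytes. For reference, that interface looks roughly like
this (sketch only, arrays to be filled from the SeqAIJ block):

    #include "umfpack.h"

    int    n, *Ap, *Ai;   /* column pointers and row indices are int */
    double *Ax;
    double Control[UMFPACK_CONTROL], Info[UMFPACK_INFO];
    void   *Symbolic, *Numeric;

    umfpack_di_defaults(Control);
    umfpack_di_symbolic(n, n, Ap, Ai, Ax, &Symbolic, Control, Info);
    umfpack_di_numeric(Ap, Ai, Ax, Symbolic, &Numeric, Control, Info);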

- Since UMFPACK seemed to have a memory limit of around 4096 MB, I tried 
to build a PETSc version with the option "--with-64-bit-indices"; 
however, none of the partitioning packages could be compiled with this 
option (parmetis, chaco, jostle, party, scotch). Is there a way to 
compile PETSc with 64-bit indices AND a partitioning package?
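
For reference, the configure line was along these lines (simplified, and
the download options for the other partitioners are analogous):

    ./config/configure.py --with-64-bit-indices=1 \
        --download-parmetis=1 ...

and it is the builds of the partitioning packages themselves that fail.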

- Finally, I tried to modify the PETSc source file umfpack.c so that it 
would deal with 64-bit indices, but so far I have only ended up with a 
segmentation violation at execution time... Is this the only way I could 
use UMFPACK with large sparse matrices?
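
In case it clarifies what I attempted: the idea was to switch the wrapper
from the umfpack_di_* routines to the umfpack_dl_* family, which takes
UF_long (64-bit) indices, roughly as follows (a sketch only; the real
umfpack.c does much more, and my conversion between PetscInt and UF_long
arrays is probably where the segmentation violation comes from):

    #include "umfpack.h"

    UF_long n, *Ap, *Ai;                /* 64-bit indices */
    double  *Ax, *x, *b;
    double  Control[UMFPACK_CONTROL], Info[UMFPACK_INFO];
    void    *Symbolic, *Numeric;

    umfpack_dl_defaults(Control);
    umfpack_dl_symbolic(n, n, Ap, Ai, Ax, &Symbolic, Control, Info);
    umfpack_dl_numeric(Ap, Ai, Ax, Symbolic, &Numeric, Control, Info);
    umfpack_dl_solve(UMFPACK_A, Ap, Ai, Ax, x, b, Numeric, Control, Info);
    umfpack_dl_free_symbolic(&Symbolic);
    umfpack_dl_free_numeric(&Numeric);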

Thank you,
Regards,
francois pacull.

