UMFPACK out of memory
francois pacull
fpacull at fluorem.com
Tue Oct 27 14:06:35 CDT 2009
Yes Matt, MUMPS as the subproblem solver works well. I think that when I
tried it earlier, I forgot to convert my SeqBAIJ submatrices into SeqAIJ
ones...
Thanks, francois.
PS, for the community's information: at first I hit a problem similar to
the UMFPACK one (a crash when a submatrix is large), caused by the MUMPS
reordering option mat_mumps_icntl_7, which was set to METIS by default.
After switching to PORD, the problem was solved.
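For anyone who hits the same crash, the ordering can be switched without
touching any code. A minimal sketch, assuming a PETSc 3.x-era API (the
equivalent command-line form is -mat_mumps_icntl_7 4):

    #include <petscksp.h>

    int main(int argc, char **argv)
    {
      PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);
      /* MUMPS ICNTL(7) chooses the symbolic ordering:
         4 = PORD, 5 = METIS (the setting that crashed here). */
      PetscOptionsSetValue("-mat_mumps_icntl_7", "4");
      /* ... assemble the matrix, set up the KSP with
         KSPSetFromOptions(), and solve as usual ... */
      PetscFinalize();
      return 0;
    }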
Matthew Knepley wrote:
> On Tue, Oct 27, 2009 at 10:56 AM, francois pacull
> <fpacull at fluorem.com> wrote:
>
> Thank you Matt for your quick reply.
>
> I will try to use MUMPS soon. About UMFPACK, the OS used is a
> 64-bit Linux one. Also, we do have 64-bit pointers from what I
> understand; the following lines are from the PETSc configure.log file:
>
> #ifndef PETSC_SIZEOF_VOID_P
> #define PETSC_SIZEOF_VOID_P 8
> #endif
>
>
> Yes, then it appears that UMFPACK cannot be used for large memory. Did
> you try MUMPS?
>
> Matt
>
>
> And here is the umfpack_UMF_report_info from the subdomain that
> crashed:
>
>
> UMFPACK V5.4.0 (May 20, 2009), Info:
> matrix entry defined as: double
> Int (generic integer) defined as: int
> BLAS library used: Fortran BLAS. size of BLAS integer: 4
> MATLAB: no.
> CPU timer: POSIX times ( ) routine.
> number of rows in matrix A: 79002
> number of columns in matrix A: 79002
> entries in matrix A: 12030970
> memory usage reported in: 8-byte Units
> size of int: 4 bytes
> size of UF_long: 8 bytes
> size of pointer: 8 bytes
> size of numerical entry: 8 bytes
>
> strategy used: unsymmetric
> ordering used: colamd on A
> modify Q during factorization: yes
> prefer diagonal pivoting: no
> pivots with zero Markowitz cost: 0
> submatrix S after removing zero-cost pivots:
> number of "dense" rows: 0
> number of "dense" columns: 0
> number of empty rows: 0
> number of empty columns 0
> submatrix S square and diagonal preserved
> symbolic factorization defragmentations: 0
> symbolic memory usage (Units): 27435792
> symbolic memory usage (MBytes): 209.3
> Symbolic size (Units): 177636
> Symbolic size (MBytes): 1
> symbolic factorization CPU time (sec): 4.95
> symbolic factorization wallclock time(sec): 4.95
>
> symbolic/numeric factorization:           upper bound   actual   %
> variable-sized part of Numeric object:
>     initial size (Units)                     31236744        -   -
>     peak size (Units)                       597607658        -   -
>     final size (Units)                      550474688        -   -
> Numeric final size (Units)                  550988250        -   -
> Numeric final size (MBytes)                    4203.7        -   -
> peak memory usage (Units)                   598718594        -   -
> peak memory usage (MBytes)                     4567.9        -   -
> numeric factorization flops               1.63141e+12        -   -
> nz in L (incl diagonal)                     171352664        -   -
> nz in U (incl diagonal)                     346187947        -   -
> nz in L+U (incl diagonal)                   517461609        -   -
> largest front (# entries)                    15783705        -   -
> largest # rows in front                          2815        -   -
> largest # columns in front                       5607        -   -
>
>
> UMFPACK V5.4.0 (May 20, 2009): ERROR: out of memory
>
> Thanks again,
> francois.
>
>
>
> Matthew Knepley wrote:
>
> On Tue, Oct 27, 2009 at 10:12 AM, francois pacull
> <fpacull at fluorem.com> wrote:
>
> Dear PETSc team,
>
> I have a few questions... During the parallel solution of a
> linear system, I am trying to apply a local LU solver to each
> of the SeqAIJ diagonal blocks of an MPIAIJ matrix partitioned
> with ParMETIS.
>
> - I started with MUMPS, but it seems that it only works with
> unpartitioned AIJ matrices; is that really the case? Or could
> we use MUMPS to build an additive Schwarz preconditioner, for
> example?
>
>
> You can use MUMPS for the subproblem solver.
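For reference, a minimal sketch of that setup (assuming a PETSc 3.x-era
API; A, b, and x here stand for an already-assembled MPIAIJ matrix and
its vectors):

    KSP ksp;
    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);
    /* Picks up the preconditioner from the command line, e.g.
         -pc_type asm -sub_pc_type lu
         -sub_pc_factor_mat_solver_package mumps
       so that each subdomain block is factored by MUMPS. */
    KSPSetFromOptions(ksp);
    KSPSolve(ksp, b, x);
    KSPDestroy(ksp);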
>
> - Then I tried UMFPACK. This works fine when the diagonal
> blocks (and the memory required to store the factors) are
> small, but it crashes when they are a little larger. For
> example, with a "numeric final size" of 4203.7 MBytes, I got
> the message "ERROR: out of memory" while there was plenty of
> memory left on the machine. I tried both the UMFPACK version
> 5.2 downloaded by PETSc and a manually installed version 5.4
> linked to PETSc. Is this a behavior of UMFPACK that you have
> already experienced?
>
>
> Send all the error output. However, in order to address more
> than 4 GB, you will need 64-bit pointers.
>
> - Since UMFPACK seemed to have a memory limit around 4096 MB,
> I tried to build a PETSc version with the option
> "--with-64-bit-indices"; however, none of the partitioning
> packages (ParMETIS, Chaco, Jostle, Party, Scotch) could be
> compiled with this option. Is there a way to compile PETSc
> with 64-bit indices AND a partitioning package?
>
>
> Not that I know of.
>
> - Finally, I tried to modify the PETSc source file umfpack.c
> so that it would handle 64-bit indices, but so far I have only
> ended up with a segmentation violation at execution time... Is
> that the only way I could use UMFPACK with large sparse
> matrices?
>
>
> Why not just upgrade to a 64-bit OS if you want to address so
> much memory on a single machine?
>
> Matt
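One note for the archives: UMFPACK itself ships a 64-bit-integer
interface, the umfpack_dl_* routines built on UF_long, and that is
presumably what a 64-bit port of umfpack.c would need to call in place
of the umfpack_di_* (plain int) routines. A minimal standalone sketch
of that interface, on a toy 2x2 system:

    #include <stdio.h>
    #include <umfpack.h>

    int main(void)
    {
      /* A = [2 1; 0 3] in compressed-column form, b = [5 6] */
      UF_long Ap[] = {0, 1, 3};
      UF_long Ai[] = {0, 0, 1};
      double  Ax[] = {2.0, 1.0, 3.0};
      double  b[]  = {5.0, 6.0}, x[2];
      void *Symbolic, *Numeric;

      umfpack_dl_symbolic(2, 2, Ap, Ai, Ax, &Symbolic, NULL, NULL);
      umfpack_dl_numeric(Ap, Ai, Ax, Symbolic, &Numeric, NULL, NULL);
      umfpack_dl_solve(UMFPACK_A, Ap, Ai, Ax, x, b, Numeric, NULL, NULL);
      umfpack_dl_free_symbolic(&Symbolic);
      umfpack_dl_free_numeric(&Numeric);
      printf("x = [%g %g]\n", x[0], x[1]); /* expect [1.5 2] */
      return 0;
    }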
>
> Thank you,
> Regards,
> francois pacull.
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener