On Tue, Oct 27, 2009 at 10:56 AM, francois pacull <fpacull@fluorem.com> wrote:

Thank you, Matt, for your quick reply.

I will try MUMPS soon. About UMFPACK: the OS used is 64-bit Linux, and from what I understand we do have 64-bit pointers; the following lines are from the PETSc configure.log file:

#ifndef PETSC_SIZEOF_VOID_P
#define PETSC_SIZEOF_VOID_P 8
#endif

Yes, then it appears that UMFPACK cannot be used for large memory. Did you try MUMPS?
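
For what it's worth, the report below suggests an integer-overflow problem rather than the machine truly running out of RAM: memory is counted in 8-byte Units, but with "Int (generic integer) defined as: int" the sizes are held in 4-byte integers. A rough illustration of the arithmetic (mine, not UMFPACK's actual code path):

/* Illustration only -- not UMFPACK's actual code.  It just redoes the
 * size arithmetic with the numbers from the report below.            */
#include <stdio.h>

int main(void)
{
  long long peak_units = 598718594LL;     /* "peak memory usage (Units)" */
  long long peak_bytes = peak_units * 8;  /* one Unit is 8 bytes         */

  printf("peak bytes = %lld\n", peak_bytes);  /* 4789748752 */
  printf("INT_MAX    = %d\n", 2147483647);    /* 2^31 - 1   */
  /* 4789748752 > 2^31 - 1, so any byte count computed in a 4-byte int
   * wraps; the resulting allocation request is garbage and UMFPACK
   * reports "out of memory" even with plenty of RAM free.             */
  return 0;
}

   Matt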
And here is the umfpack_UMF_report_info from the subdomain that crashed:


UMFPACK V5.4.0 (May 20, 2009), Info:
    matrix entry defined as:          double
    Int (generic integer) defined as: int
    BLAS library used: Fortran BLAS.  size of BLAS integer: 4
    MATLAB: no.
    CPU timer: POSIX times ( ) routine.
    number of rows in matrix A:       79002
    number of columns in matrix A:    79002
    entries in matrix A:              12030970
    memory usage reported in:         8-byte Units
    size of int:                      4 bytes
    size of UF_long:                  8 bytes
    size of pointer:                  8 bytes
    size of numerical entry:          8 bytes

    strategy used:                    unsymmetric
    ordering used:                    colamd on A
    modify Q during factorization:    yes
    prefer diagonal pivoting:         no
    pivots with zero Markowitz cost:              0
    submatrix S after removing zero-cost pivots:
        number of "dense" rows:                   0
        number of "dense" columns:                0
        number of empty rows:                     0
        number of empty columns                   0
        submatrix S square and diagonal preserved
    symbolic factorization defragmentations:      0
    symbolic memory usage (Units):                27435792
    symbolic memory usage (MBytes):               209.3
    Symbolic size (Units):                        177636
    Symbolic size (MBytes):                       1
    symbolic factorization CPU time (sec):        4.95
    symbolic factorization wallclock time (sec):  4.95

    symbolic/numeric factorization:     upper bound       actual   %
    variable-sized part of Numeric object:
        initial size (Units)               31236744            -   -
        peak size (Units)                 597607658            -   -
        final size (Units)                550474688            -   -
    Numeric final size (Units)            550988250            -   -
    Numeric final size (MBytes)              4203.7            -   -
    peak memory usage (Units)             598718594            -   -
    peak memory usage (MBytes)               4567.9            -   -
    numeric factorization flops         1.63141e+12            -   -
    nz in L (incl diagonal)               171352664            -   -
    nz in U (incl diagonal)               346187947            -   -
    nz in L+U (incl diagonal)             517461609            -   -
    largest front (# entries)              15783705            -   -
    largest # rows in front                    2815            -   -
    largest # columns in front                 5607            -   -


UMFPACK V5.4.0 (May 20, 2009): ERROR: out of memory

Thanks again,
francois.



Matthew Knepley wrote:

On Tue, Oct 27, 2009 at 10:12 AM, francois pacull <fpacull@fluorem.com> wrote:

Dear PETSc team,

I have a few questions... During the solution of a linear system in parallel, I am trying to apply a local LU solver to each of the SeqAIJ diagonal blocks of an MPIAIJ matrix partitioned with ParMETIS.

- I started with MUMPS, but it seems to work only with unpartitioned AIJ matrices; is that really the case? Or could we use MUMPS to build, for example, an additive Schwarz preconditioner?

You can use MUMPS for the subproblem solver.
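
For example, at runtime (option names from the release I am looking at; they may differ slightly in yours):

  -pc_type asm -sub_pc_type lu -sub_pc_factor_mat_solver_package mumps

or, as an untested sketch in code, assuming the KSP already has its operators set:

/* Untested sketch: additive Schwarz with a MUMPS LU solve on each
 * local subdomain block.                                            */
#include "petscksp.h"

PetscErrorCode SetupASMWithMUMPS(KSP ksp)
{
  PC             pc,subpc;
  KSP           *subksp;
  PetscInt       i,nlocal,first;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,PCASM);CHKERRQ(ierr);
  ierr = KSPSetUp(ksp);CHKERRQ(ierr);  /* the subdomain KSPs exist only after setup */
  ierr = PCASMGetSubKSP(pc,&nlocal,&first,&subksp);CHKERRQ(ierr);
  for (i=0; i<nlocal; i++) {
    ierr = KSPSetType(subksp[i],KSPPREONLY);CHKERRQ(ierr);
    ierr = KSPGetPC(subksp[i],&subpc);CHKERRQ(ierr);
    ierr = PCSetType(subpc,PCLU);CHKERRQ(ierr);
    /* the string "mumps" avoids depending on the MAT_SOLVER_MUMPS
     * macro, whose name has moved around between releases           */
    ierr = PCFactorSetMatSolverPackage(subpc,"mumps");CHKERRQ(ierr);
  }
  PetscFunctionReturn(0);
}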

- Then I tried UMFPACK. This works fine when the diagonal blocks (and the memory required to store the factors) are small, but it crashes when they are a little larger. For example, with a "numeric final size" of 4203.7 MBytes, I got an "ERROR: out of memory" message even though there was plenty of memory left in the computer. I tried both UMFPACK version 5.2, downloaded by PETSc, and a manually installed version 5.4 linked to PETSc. Is this a behavior of UMFPACK that you have already encountered?

Send all the error output. However, in order to address more than 4G, you will need 64-bit pointers.

- Since UMFPACK seemed to have a memory limit around 4096 MB, I tried to build a PETSc version with the option "--with-64-bit-indices"; however, none of the partitioning packages (ParMETIS, Chaco, Jostle, Party, Scotch) could be compiled with this option. Is there a way to compile PETSc with 64-bit indices AND a partitioning package?

Not that I know of.

- Finally, I tried to modify the PETSc source file umfpack.c so that it would handle 64-bit indices, but so far I have only ended up with a segmentation violation at execution time... Is that the only way I could use UMFPACK with large sparse matrices?

Why not just upgrade to a 64-bit OS if you want to address so much memory on a single machine?
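
If you do keep going down the umfpack.c road: UMFPACK ships a UF_long interface (the umfpack_dl_* routines) alongside the int one, and that is what the wrapper would have to call for matrices this large. The segmentation violation is most likely the width mismatch between PETSc's 4-byte indices and UF_long; the Ap/Ai arrays have to be copied and widened, not just cast. A rough, untested sketch of the dl calling sequence:

/* Rough, untested sketch of UMFPACK's 8-byte-integer ("dl") interface.
 * Ap/Ai/Ax hold the n-by-n matrix in compressed-column form, with the
 * indices already widened to UF_long.                                 */
#include "umfpack.h"

UF_long factor_and_solve(UF_long n, const UF_long *Ap, const UF_long *Ai,
                         const double *Ax, const double *b, double *x)
{
  void    *Symbolic,*Numeric;
  UF_long  status;

  status = umfpack_dl_symbolic(n,n,Ap,Ai,Ax,&Symbolic,NULL,NULL);
  if (status != UMFPACK_OK) return status;
  status = umfpack_dl_numeric(Ap,Ai,Ax,Symbolic,&Numeric,NULL,NULL);
  umfpack_dl_free_symbolic(&Symbolic);
  if (status != UMFPACK_OK) return status;
  status = umfpack_dl_solve(UMFPACK_A,Ap,Ai,Ax,x,b,Numeric,NULL,NULL);
  umfpack_dl_free_numeric(&Numeric);
  return status;
}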

   Matt

Thank you,
Regards,
francois pacull.

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
  -- Norbert Wiener