[petsc-users] Direct solvers

Manav Bhatia bhatiamanav at gmail.com
Thu Feb 5 17:47:39 CST 2015


Hi, 

   I am trying to use an LU factorization for a relatively large matrix (~775,000 DOFs) arising from a thermoelasticity problem.
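
For reference, the solver is set up along the lines of the standard KSP/PC direct-solve usage below (a minimal sketch; in the actual code this goes through libMesh's nonlinear solver, as the trace further down shows, and the names solve_direct, A, b, x are illustrative):

  /* Minimal sketch of a direct LU solve through KSP (PETSc 3.5-era API).
     A, b, x are assumed to be already assembled; names are illustrative. */
  #include <petscksp.h>

  PetscErrorCode solve_direct(Mat A, Vec b, Vec x)
  {
    KSP            ksp;
    PC             pc;
    PetscErrorCode ierr;

    ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
    ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr);  /* apply preconditioner once, no Krylov iterations */
    ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
    ierr = PCSetType(pc,PCLU);CHKERRQ(ierr);          /* sparse direct LU factorization */
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);      /* honor -pc_type, -pc_factor_* runtime options */
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
    ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
    return 0;
  }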

   For the past few weeks, the LU solver in PETSc 3.5.1 has been solving it just fine. I just upgraded to 3.5.2 from MacPorts (running on Mac OS X 10.10.2), and am now getting the following "out of memory" error:

sab_old_mast_structural_analysis(378,0x7fff75f6e300) malloc: *** mach_vm_map(size=18446744066373115904) failed (error code=3)
*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Out of memory. This could be due to allocating
[0]PETSC ERROR: too large an object or bleeding by not properly
[0]PETSC ERROR: destroying unneeded objects.
[0]PETSC ERROR: Memory allocated 3649788624 Memory used by process 3943817216
[0]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info.
[0]PETSC ERROR: Memory requested 18446744066373113856
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.5.2, unknown 
[0]PETSC ERROR: ./sab_old_mast_structural_analysis on a arch-macports named ws243-49.walker.dynamic.msstate.edu by manav Thu Feb  5 17:30:18 2015
[0]PETSC ERROR: Configure options --prefix=/opt/local --prefix=/opt/local/lib/petsc --with-valgrind=0 --with-shared-libraries --with-c2html-dir=/opt/local --with-x=0 --with-blas-lapack-lib=/System/Library/Frameworks/Accelerate.framework/Versions/Current/Accelerate --with-hwloc-dir=/opt/local --with-suitesparse-dir=/opt/local --with-superlu-dir=/opt/local --with-metis-dir=/opt/local --with-parmetis-dir=/opt/local --with-scalapack-dir=/opt/local --with-mumps-dir=/opt/local CC=/opt/local/bin/mpicc-openmpi-mp CXX=/opt/local/bin/mpicxx-openmpi-mp FC=/opt/local/bin/mpif90-openmpi-mp F77=/opt/local/bin/mpif90-openmpi-mp F90=/opt/local/bin/mpif90-openmpi-mp COPTFLAGS=-Os CXXOPTFLAGS=-Os FOPTFLAGS=-Os LDFLAGS="-L/opt/local/lib -Wl,-headerpad_max_install_names" CPPFLAGS=-I/opt/local/include CFLAGS="-Os -arch x86_64" CXXFLAGS=-Os FFLAGS=-Os FCFLAGS=-Os F90FLAGS=-Os PETSC_ARCH=arch-macports --with-mpiexec=mpiexec-openmpi-mp
[0]PETSC ERROR: #1 PetscMallocAlign() line 46 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/sys/memory/mal.c
[0]PETSC ERROR: #2 PetscTrMallocDefault() line 184 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/sys/memory/mtr.c
[0]PETSC ERROR: #3 PetscFreeSpaceGet() line 13 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/mat/utils/freespace.c
[0]PETSC ERROR: #4 MatLUFactorSymbolic_SeqAIJ() line 362 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/mat/impls/aij/seq/aijfact.c
[0]PETSC ERROR: #5 MatLUFactorSymbolic() line 2842 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/mat/interface/matrix.c
[0]PETSC ERROR: #6 PCSetUp_LU() line 127 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/ksp/pc/impls/factor/lu/lu.c
[0]PETSC ERROR: #7 PCSetUp() line 902 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #8 KSPSetUp() line 305 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #9 KSPSolve() line 417 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #10 SNESSolve_NEWTONLS() line 232 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/snes/impls/ls/ls.c
[0]PETSC ERROR: #11 SNESSolve() line 3743 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/snes/interface/snes.c
[0]PETSC ERROR: #12 solve() line 559 in src/solvers/petsc_nonlinear_solver.C
--------------------------------------------------------------------------


A few questions: 

— Has something changed between 3.5.1 and 3.5.2 that might lead to this behavior?

— So far I have tried the following iterative-solver options: -pc_type ilu -pc_factor_levels 1 (and 2), with very slow convergence. Is there a better preconditioner recommended for this problem? This is a solid mechanics problem with a thermal load (not a coupled thermal-structural problem).
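
(In case it matters, I pass those options on the command line; my understanding is that they are equivalent to the following calls on the KSP's PC, continuing the sketch above:)

  /* Sketch: programmatic equivalent of -pc_type ilu -pc_factor_levels 1 */
  ierr = PCSetType(pc,PCILU);CHKERRQ(ierr);
  ierr = PCFactorSetLevels(pc,1);CHKERRQ(ierr);  /* ILU(1); I also tried 2 */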

— I tried using MUMPS through the options -pc_factor_mat_solver_package mumps -mat_mumps_icntl_22 1 -mat_mumps_icntl_23 8000, to enable its out-of-core factorization and limit the working memory to 8 GB, but that too returned an out-of-memory error. Is this the correct format for these options? If so, is the write-to-disk option expected to work when MUMPS is called via PETSc?
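
(For what it's worth, my understanding of the programmatic equivalent is the fragment below, again continuing the sketch above; F is the MUMPS factor matrix obtained from the PC:)

  /* Sketch: select MUMPS as the LU backend and set its ICNTL controls
     (PETSc 3.5 API). */
  Mat F;
  ierr = PCSetType(pc,PCLU);CHKERRQ(ierr);
  ierr = PCFactorSetMatSolverPackage(pc,MATSOLVERMUMPS);CHKERRQ(ierr);
  ierr = PCFactorSetUpMatSolverPackage(pc);CHKERRQ(ierr);  /* create the factor matrix */
  ierr = PCFactorGetMatrix(pc,&F);CHKERRQ(ierr);
  ierr = MatMumpsSetIcntl(F,22,1);CHKERRQ(ierr);     /* ICNTL(22)=1: out-of-core factorization */
  ierr = MatMumpsSetIcntl(F,23,8000);CHKERRQ(ierr);  /* ICNTL(23): working memory limit in MB */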

I would greatly appreciate your input.

Thanks,
Manav



