[petsc-users] Direct solvers
Barry Smith
bsmith at mcs.anl.gov
Thu Feb 5 23:18:54 CST 2015
> On Feb 5, 2015, at 5:47 PM, Manav Bhatia <bhatiamanav at gmail.com> wrote:
>
> Hi,
>
> I am trying to use an lu decomposition method for a relatively large matrix (~775,000 dofs) coming from a thermoelasticity problem.
>
> For the past few weeks, the LU solver in 3.5.1 has been solving it just fine. I just upgraded to 3.5.2 from MacPorts (running on Mac OS 10.10.2), and am getting the following "out of memory" error:
This is surprising. The changes from 3.5.1 to 3.5.2 are supposed to be only minor bug fixes. Is the code otherwise __exactly__ the same with the same options? Are all the external libraries exactly the same in both cases?
> Memory allocated 3649788624 Memory used by process 3943817216
> [0]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info.
> [0]PETSC ERROR: Memory requested 18446744066373113856
^^^^^^^^^^^^^^^^^^^
This memory size is truly absurd. If you have access to a Linux system, I suggest running the program with valgrind to see if memory corruption is messing things up: http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
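A typical valgrind run looks roughly like the sketch below (the executable name and MPI launcher are taken from the log above; -malloc off turns off PETSc's own malloc wrapper so valgrind sees the raw allocations):

```shell
# Sketch of a memcheck run on one process; adjust -n and the log-file name as needed.
mpiexec-openmpi-mp -n 1 valgrind --tool=memcheck -q --num-callers=20 \
    --log-file=valgrind.log.%p ./sab_old_mast_structural_analysis -malloc off
```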
You can also try installing 3.5.2 directly from our tarball instead of macports and see if the same thing happens.
Barry
>
> sab_old_mast_structural_analysis(378,0x7fff75f6e300) malloc: *** mach_vm_map(size=18446744066373115904) failed (error code=3)
> *** error: can't allocate region
> *** set a breakpoint in malloc_error_break to debug
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Out of memory. This could be due to allocating
> [0]PETSC ERROR: too large an object or bleeding by not properly
> [0]PETSC ERROR: destroying unneeded objects.
> [0]PETSC ERROR: Memory allocated 3649788624 Memory used by process 3943817216
> [0]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info.
> [0]PETSC ERROR: Memory requested 18446744066373113856
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.5.2, unknown
> [0]PETSC ERROR: ./sab_old_mast_structural_analysis on a arch-macports named ws243-49.walker.dynamic.msstate.edu by manav Thu Feb 5 17:30:18 2015
> [0]PETSC ERROR: Configure options --prefix=/opt/local --prefix=/opt/local/lib/petsc --with-valgrind=0 --with-shared-libraries --with-c2html-dir=/opt/local --with-x=0 --with-blas-lapack-lib=/System/Library/Frameworks/Accelerate.framework/Versions/Current/Accelerate --with-hwloc-dir=/opt/local --with-suitesparse-dir=/opt/local --with-superlu-dir=/opt/local --with-metis-dir=/opt/local --with-parmetis-dir=/opt/local --with-scalapack-dir=/opt/local --with-mumps-dir=/opt/local CC=/opt/local/bin/mpicc-openmpi-mp CXX=/opt/local/bin/mpicxx-openmpi-mp FC=/opt/local/bin/mpif90-openmpi-mp F77=/opt/local/bin/mpif90-openmpi-mp F90=/opt/local/bin/mpif90-openmpi-mp COPTFLAGS=-Os CXXOPTFLAGS=-Os FOPTFLAGS=-Os LDFLAGS="-L/opt/local/lib -Wl,-headerpad_max_install_names" CPPFLAGS=-I/opt/local/include CFLAGS="-Os -arch x86_64" CXXFLAGS=-Os FFLAGS=-Os FCFLAGS=-Os F90FLAGS=-Os PETSC_ARCH=arch-macports --with-mpiexec=mpiexec-openmpi-mp
> [0]PETSC ERROR: #1 PetscMallocAlign() line 46 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/sys/memory/mal.c
> [0]PETSC ERROR: #2 PetscTrMallocDefault() line 184 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/sys/memory/mtr.c
> [0]PETSC ERROR: #3 PetscFreeSpaceGet() line 13 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/mat/utils/freespace.c
> [0]PETSC ERROR: #4 MatLUFactorSymbolic_SeqAIJ() line 362 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/mat/impls/aij/seq/aijfact.c
> [0]PETSC ERROR: #5 MatLUFactorSymbolic() line 2842 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/mat/interface/matrix.c
> [0]PETSC ERROR: #6 PCSetUp_LU() line 127 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/ksp/pc/impls/factor/lu/lu.c
> [0]PETSC ERROR: #7 PCSetUp() line 902 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: #8 KSPSetUp() line 305 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #9 KSPSolve() line 417 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #10 SNESSolve_NEWTONLS() line 232 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/snes/impls/ls/ls.c
> [0]PETSC ERROR: #11 SNESSolve() line 3743 in /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_math_petsc/petsc/work/v3.5.2/src/snes/interface/snes.c
> [0]PETSC ERROR: #12 solve() line 559 in src/solvers/petsc_nonlinear_solver.C
> --------------------------------------------------------------------------
>
>
> A few questions:
>
> — has something changed between 3.5.1 and 3.5.2 that might lead to this behavior?
>
> — So far I have tried the following iterative solver options: -pc_type ilu -pc_factor_levels 1 (and 2), with very slow convergence. Is there a better preconditioner recommended for this problem? This is a solid mechanics problem with a thermal load (not a coupled thermal-structural problem).
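For reference, those options are passed as separate tokens on the command line; a sketch (executable name taken from the log above, with GMRES shown explicitly since it is PETSc's default KSP):

```shell
# ILU(1) preconditioning with the default GMRES Krylov method.
./sab_old_mast_structural_analysis -ksp_type gmres -pc_type ilu -pc_factor_levels 1
```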
>
> — I tried using MUMPS through the options -pc_factor_mat_solver_package mumps -mat_mumps_icntl_22 1 -mat_mumps_icntl_23 8000 to get it to use disk I/O and limit the memory to 8 GB, but that too returned an out-of-memory error. Is this the correct format for these options? If so, is the write-to-disk option expected to work with MUMPS called via PETSc?
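For reference, the ICNTL settings are single option tokens with no space before the number; a sketch of the intended run (ICNTL(22)=1 requests out-of-core factorization, ICNTL(23) caps the per-process working memory in MB):

```shell
# LU via MUMPS, out-of-core, with an 8000 MB per-process working-memory limit.
./sab_old_mast_structural_analysis -pc_type lu \
    -pc_factor_mat_solver_package mumps \
    -mat_mumps_icntl_22 1 -mat_mumps_icntl_23 8000
```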
>
> I would greatly appreciate your inputs.
>
> Thanks,
> Manav
>
>
>
>