[petsc-users] problem (Segmentation Violation) using -pc_type hypre -pc_hypre_type pilut with multiple nodes in a cluster

Bishesh Khanal bisheshkh at gmail.com
Fri Aug 9 11:52:11 CDT 2013


Dear all,
I was experimenting with my Stokes problem on a 3D staggered grid with a high
viscosity jump, using -pc_fieldsplit with a Schur complement. Using the hypre
pilut preconditioner for the KSP of the A00 block seemed to give nice
results for smaller problem sizes. The following options work fine on my
laptop, and on the cluster when I use ONE node with multiple cores:
-ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur
-pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2
-pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view
-fieldsplit_0_ksp_type gcr -fieldsplit_0_ksp_rtol 1.0e-5
-fieldsplit_0_pc_type hypre -fieldsplit_0_pc_hypre_type pilut
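(For reference, the same option set can also be kept in a PETSc options file, passed with -options_file or placed in ~/.petscrc; the file below is just a restatement of the command line above, one option per line:)

```shell
# Schur-complement fieldsplit with hypre pilut on the A00 (velocity) block
-ksp_type gcr
-pc_type fieldsplit
-pc_fieldsplit_type schur
-pc_fieldsplit_dm_splits 0
-pc_fieldsplit_0_fields 0,1,2
-pc_fieldsplit_1_fields 3
-ksp_converged_reason
-ksp_view
-fieldsplit_0_ksp_type gcr
-fieldsplit_0_ksp_rtol 1.0e-5
-fieldsplit_0_pc_type hypre
-fieldsplit_0_pc_hypre_type pilut
```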

But when I submit jobs on multiple nodes, the process never seems to
end! When I use gamg instead of hypre, the same program works on
multiple nodes of the same cluster.
However, gamg converges much more slowly than hypre, so I would like to
keep using hypre.
When I kill the job and look at the error file, it reports:
[8]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range
[8]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[8]PETSC ERROR: or see
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[8]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory
corruption errors
[8]PETSC ERROR: likely location of problem given in stack below
[8]PETSC ERROR: ---------------------  Stack Frames
------------------------------------
[8]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[8]PETSC ERROR:       INSTEAD the line number of the start of the function
[8]PETSC ERROR:       is given.
[8]PETSC ERROR: [8] HYPRE_SetupXXX line 130
/tmp/petsc-3.4.1/src/ksp/pc/impls/hypre/hypre.c
[8]PETSC ERROR: [8] PCSetUp_HYPRE line 94
/tmp/petsc-3.4.1/src/ksp/pc/impls/hypre/hypre.c
[8]PETSC ERROR: [8] PCSetUp line 868
/tmp/petsc-3.4.1/src/ksp/pc/interface/precon.c
[8]PETSC ERROR: [8] KSPSetUp line 192
/tmp/petsc-3.4.1/src/ksp/ksp/interface/itfunc.c
[8]PETSC ERROR: [8] KSPSolve line 356
/tmp/petsc-3.4.1/src/ksp/ksp/interface/itfunc.c
[8]PETSC ERROR: [8] MatMult_SchurComplement line 75
/tmp/petsc-3.4.1/src/ksp/ksp/utils/schurm.c
[8]PETSC ERROR: [8] MatNullSpaceTest line 408
/tmp/petsc-3.4.1/src/mat/interface/matnull.c
[8]PETSC ERROR: [8] solveModel line 133
"unknowndirectory/"/epi/asclepios2/bkhanal/works/AdLemModel/src/PetscAdLemTaras3D.cxx
[8]PETSC ERROR: --------------------- Error Message
------------------------------------
[8]PETSC ERROR: Signal received!
[8]PETSC ERROR:
------------------------------------------------------------------------
[8]PETSC ERROR: Petsc Release Version 3.4.1, Jun, 10, 2013
[8]PETSC ERROR: See docs/changes/index.html for recent updates.
[8]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[8]PETSC ERROR: See docs/index.html for manual pages.
[8]PETSC ERROR:
------------------------------------------------------------------------
[8]PETSC ERROR:
/epi/asclepios2/bkhanal/works/AdLemModel/build/src/AdLemMain on a
arch-linux2-cxx-debug named nef002 by bkhanal Fri Aug  9 18:00:22 2013
[8]PETSC ERROR: Libraries linked from /home/bkhanal/petsc/lib
[8]PETSC ERROR: Configure run at Mon Jul  1 13:44:30 2013
[8]PETSC ERROR: Configure options --with-mpi-dir=/opt/openmpi-gcc/current/
--with-shared-libraries --prefix=/home/bkhanal/petsc
-download-f-blas-lapack=1 --download-hypre --with-clanguage=cxx
[8]PETSC ERROR:
------------------------------------------------------------------------
[8]PETSC ERROR: User provided function() line 0 in unknown directory
unknown file
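As the error message itself suggests, I could also try running a smaller multi-node case under valgrind to look for memory corruption. Something like the following (the rank count, launcher, and executable name here are illustrative, based on the Open MPI configure line and binary path shown in the log):

```shell
# Run every rank under valgrind, one log file per process (%p = PID);
# --track-origins helps trace uninitialized-value errors back to their source.
mpiexec -n 16 valgrind --track-origins=yes --log-file=valgrind.%p.log \
    ./AdLemMain \
    -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur \
    -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 \
    -pc_fieldsplit_1_fields 3 \
    -fieldsplit_0_ksp_type gcr -fieldsplit_0_ksp_rtol 1.0e-5 \
    -fieldsplit_0_pc_type hypre -fieldsplit_0_pc_hypre_type pilut
```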

