Dear all,
I was experimenting with my Stokes problem on a 3D staggered grid with a high viscosity jump, using -pc_fieldsplit of type Schur complement. Using the hypre pilut preconditioner for the KSP of the A00 block seemed to give nice results for smaller problem sizes. With the following options, the program works fine on my laptop, and also on the cluster I am using as long as I run on ONE node with multiple cores:
-ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view -fieldsplit_0_ksp_type gcr -fieldsplit_0_ksp_rtol 1.0e-5 -fieldsplit_0_pc_type hypre -fieldsplit_0_pc_hypre_type pilut
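For reference, the single-node runs are launched roughly like this (the launcher and process count here are only illustrative; the PETSc options are exactly the ones above, and the executable name is the one that appears in the error log below):

    mpiexec -n 8 ./AdLemMain \
        -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur \
        -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3 \
        -fieldsplit_0_ksp_type gcr -fieldsplit_0_ksp_rtol 1.0e-5 \
        -fieldsplit_0_pc_type hypre -fieldsplit_0_pc_hypre_type pilut \
        -ksp_converged_reason -ksp_view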
But when I submit jobs that use multiple nodes, the process never seems to finish. When I use gamg instead of hypre, the same program works on multiple nodes of the same cluster; however, gamg converges much more slowly than hypre, which is why I would like to stick with hypre.
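For comparison, the multi-node runs that do work use the same options with only the A00 preconditioner swapped, roughly:

    -fieldsplit_0_pc_type gamg    (instead of -fieldsplit_0_pc_type hypre -fieldsplit_0_pc_hypre_type pilut)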
When I kill the hanging hypre job and look at the error file, it reports:
[8]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[8]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[8]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[8]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[8]PETSC ERROR: likely location of problem given in stack below
[8]PETSC ERROR: --------------------- Stack Frames ------------------------------------
[8]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[8]PETSC ERROR:       INSTEAD the line number of the start of the function
[8]PETSC ERROR:       is given.
[8]PETSC ERROR: [8] HYPRE_SetupXXX line 130 /tmp/petsc-3.4.1/src/ksp/pc/impls/hypre/hypre.c
[8]PETSC ERROR: [8] PCSetUp_HYPRE line 94 /tmp/petsc-3.4.1/src/ksp/pc/impls/hypre/hypre.c
[8]PETSC ERROR: [8] PCSetUp line 868 /tmp/petsc-3.4.1/src/ksp/pc/interface/precon.c
[8]PETSC ERROR: [8] KSPSetUp line 192 /tmp/petsc-3.4.1/src/ksp/ksp/interface/itfunc.c
[8]PETSC ERROR: [8] KSPSolve line 356 /tmp/petsc-3.4.1/src/ksp/ksp/interface/itfunc.c
[8]PETSC ERROR: [8] MatMult_SchurComplement line 75 /tmp/petsc-3.4.1/src/ksp/ksp/utils/schurm.c
[8]PETSC ERROR: [8] MatNullSpaceTest line 408 /tmp/petsc-3.4.1/src/mat/interface/matnull.c
[8]PETSC ERROR: [8] solveModel line 133 "unknowndirectory/"/epi/asclepios2/bkhanal/works/AdLemModel/src/PetscAdLemTaras3D.cxx
[8]PETSC ERROR: --------------------- Error Message ------------------------------------
[8]PETSC ERROR: Signal received!
[8]PETSC ERROR: ------------------------------------------------------------------------
[8]PETSC ERROR: Petsc Release Version 3.4.1, Jun, 10, 2013
[8]PETSC ERROR: See docs/changes/index.html for recent updates.
[8]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[8]PETSC ERROR: See docs/index.html for manual pages.
[8]PETSC ERROR: ------------------------------------------------------------------------
[8]PETSC ERROR: /epi/asclepios2/bkhanal/works/AdLemModel/build/src/AdLemMain on a arch-linux2-cxx-debug named nef002 by bkhanal Fri Aug 9 18:00:22 2013
[8]PETSC ERROR: Libraries linked from /home/bkhanal/petsc/lib
[8]PETSC ERROR: Configure run at Mon Jul 1 13:44:30 2013
[8]PETSC ERROR: Configure options --with-mpi-dir=/opt/openmpi-gcc/current/ --with-shared-libraries --prefix=/home/bkhanal/petsc -download-f-blas-lapack=1 --download-hypre --with-clanguage=cxx
[8]PETSC ERROR: ------------------------------------------------------------------------
[8]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
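If it helps, I can rerun the multi-node job under valgrind as the error message and the FAQ suggest; I would do it along these lines (the process count and log-file name are just an example, the options are the same as above):

    mpiexec -n 16 valgrind --tool=memcheck -q --log-file=valgrind.%p.log ./AdLemMain <same options as above>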