[petsc-users] Unknown PETSc-HYPRE Error

Smith, Barry F. bsmith at mcs.anl.gov
Wed Jul 18 16:07:46 CDT 2018


  valgrind   http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind

   Note that if your HPC resource doesn't have valgrind, you can still run your application under it on a local machine (with as big a problem as will fit).
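
   For reference, a typical way to run an MPI job under valgrind looks like the sketch below (the valgrind flags are standard memcheck options; "./app" and the trailing options stand in for your executable and its usual arguments):

```
mpiexec -n 4 valgrind --tool=memcheck -q --num-callers=20 \
    --log-file=valgrind.log.%p ./app -your_petsc_options
```

   With --log-file, each MPI rank writes its own log (%p expands to the process id), so memory errors on any rank can be inspected separately.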


   Barry

> On Jul 18, 2018, at 1:20 PM, Alistair Bentley <alistairbntl at gmail.com> wrote:
> 
> Hello all,
> 
> I'm a first time PETSc mailing list user, so I apologize in advance if this is not the right mailing list for this post!
> 
> I've been using PETSc with HYPRE (BoomerAMG) as part of a Schur complement block preconditioner for a time-dependent 3D Navier-Stokes equation.
> 
> Specifically, for the Schur complement approximation, I've been using PETSc's selfp method (given the simulation's short time steps, this has been producing reasonable results) and a single application of the default HYPRE BoomerAMG.
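> 
> (A rough sketch of this setup as PETSc command-line options; the fieldsplit prefixes here are assumptions, since the actual names depend on how the application registers the splits:)
> 
> ```
> -pc_type fieldsplit
> -pc_fieldsplit_type schur
> -pc_fieldsplit_schur_precondition selfp
> -fieldsplit_1_ksp_type preonly
> -fieldsplit_1_pc_type hypre
> -fieldsplit_1_pc_hypre_type boomeramg
> ```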
> 
> When I run on my local machine (up to 4 processors), the simulation runs to completion smoothly.  The GMRES iterations for the global system (as well as the block solves) appear to scale well with h.  
> 
> Next, I pushed the job to our HPC resource and ran a small case (stabilized P1P1 elements on 329013 tetrahedra and 56970 nodes) with 36 cores.  This time, however, the simulation did not run to completion.  Rather, I began to encounter the following error:
> 
> [ 6] KSPSolve() line 666 in /p/home/abentle/.hashdist/tmp/petsc-mtyptrixhukj/src/ksp/ksp/interface/itfunc.c
> [ 6] KSPSolve_FGMRES() line 291 in /p/home/abentle/.hashdist/tmp/petsc-mtyptrixhukj/src/ksp/ksp/impls/gmres/fgmres/fgmres.c
> [ 6] KSPFGMRESCycle() line 166 in /p/home/abentle/.hashdist/tmp/petsc-mtyptrixhukj/src/ksp/ksp/impls/gmres/fgmres/fgmres.c
> [ 6] KSP_PCApply() line 275 in /p/home/abentle/.hashdist/tmp/petsc-mtyptrixhukj/include/petsc/private/kspimpl.h
> [ 6] PCApply() line 458 in /p/home/abentle/.hashdist/tmp/petsc-mtyptrixhukj/src/ksp/pc/interface/precon.c
> [ 6] PCApply_FieldSplit_Schur() line 900 in /p/home/abentle/.hashdist/tmp/petsc-mtyptrixhukj/src/ksp/pc/impls/fieldsplit/fieldsplit.c
> [ 6] KSPSolve() line 666 in /p/home/abentle/.hashdist/tmp/petsc-mtyptrixhukj/src/ksp/ksp/interface/itfunc.c
> [ 6] KSPSolve_PREONLY() line 22 in /p/home/abentle/.hashdist/tmp/petsc-mtyptrixhukj/src/ksp/ksp/impls/preonly/preonly.c
> [ 6] KSP_PCApply() line 275 in /p/home/abentle/.hashdist/tmp/petsc-mtyptrixhukj/include/petsc/private/kspimpl.h
> [ 6] PCApply() line 458 in /p/home/abentle/.hashdist/tmp/petsc-mtyptrixhukj/src/ksp/pc/interface/precon.c
> [ 6] PCApply_HYPRE() line 351 in /p/home/abentle/.hashdist/tmp/petsc-mtyptrixhukj/src/ksp/pc/impls/hypre/hypre.c
> [ 6] Error in external library
> [ 6] Error in HYPRE solver, error code 1
> 
> Unfortunately, HYPRE lists error code 1 as a generic error, which does not give me much to go on.  Further, while this error appears consistently, it seems to occur at random points.  For instance, the error message above occurred on the 6th MPI process after the simulation had reached t = 0.1.  In a previous run with the exact same configuration, the error occurred on the 28th MPI process after the simulation reached t = 1.1 seconds.  
> 
> I've been experimenting with different configurations to try to find a stable setup, but I'm at a bit of a loss as to how to proceed.  Any suggestions for moving forward would be greatly appreciated.   
> 
> Thanks!
> 
> - Alistair
> 


