[petsc-users] zero pivot in LU factorization when using -fieldsplit_0_pc_type gamg in schur fieldsplit PC

Bishesh Khanal bisheshkh at gmail.com
Mon Dec 7 17:35:16 CST 2015


Hello,
I'm revisiting my code for solving a system of equations similar to Stokes, aiming for better convergence with big jumps in viscosity (mu), and wanted to experiment with PETSc's GAMG:

mu lap(u) + grad(p) = f1
             div(u) = f2

with Dirichlet boundary conditions of zero velocity. The implementation is a finite-difference discretization on a staggered grid.
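
For reference, after discretization the system has the standard saddle-point block form (written in the same notation as the -ksp_view output below, with fields 0,1,2 the velocity components and field 3 the pressure):

[ A00  A01 ] [u]   [f1]
[ A10  A11 ] [p] = [f2],    with Schur complement S = A11 - A10 inv(A00) A01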

With the Schur FieldSplit PC type, and for constant viscosity, a hypre PC for the A00 block had been serving me well. Now, trying GAMG, I get a zero-pivot error. From some past discussions in the mailing-list archives, it seems I have to play around with ..._sub_pc_factor_shift_type, but I could not figure out the exact set of options.
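For example, I have been guessing at variants along these lines (the prefixes here are only my guesses and may well be wrong):
-fieldsplit_0_mg_coarse_pc_factor_shift_type nonzero
-fieldsplit_0_mg_coarse_sub_pc_factor_shift_type nonzero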

Setting mu = 1 everywhere and using the following options:
-pc_type fieldsplit -pc_fieldsplit_type schur
-pc_fieldsplit_schur_precondition self -pc_fieldsplit_dm_splits 0
-pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3
-fieldsplit_0_pc_type gamg
I get an error.
When I use -fieldsplit_0_pc_type lu, it converges in a single iteration, so I take that to mean A00 is not singular.
Below are the results of using:
1. -fieldsplit_0_pc_type hypre (converges)
2. -ksp_view -fieldsplit_0_pc_type lu (converges)
3. -fieldsplit_0_pc_type gamg (full error output)
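
For reference, here is a minimal sketch of the programmatic setup corresponding to the options above (A stands in for my assembled 4-dof-per-node system matrix; error checking with CHKERRQ omitted):

  KSP      ksp;
  PC       pc;
  PetscInt vfields[] = {0, 1, 2};   /* staggered-grid velocity components */
  PetscInt pfields[] = {3};         /* pressure */

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);       /* A: the assembled system matrix */
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCFIELDSPLIT);
  PCFieldSplitSetType(pc, PC_FIELDSPLIT_SCHUR);
  PCFieldSplitSetSchurPre(pc, PC_FIELDSPLIT_SCHUR_PRE_SELF, NULL);
  PCFieldSplitSetBlockSize(pc, 4);  /* 4 dofs per grid point */
  PCFieldSplitSetFields(pc, "0", 3, vfields, vfields);
  PCFieldSplitSetFields(pc, "1", 1, pfields, pfields);
  KSPSetFromOptions(ksp);           /* picks up -fieldsplit_0_pc_type etc. */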


1. ----------------------------------------
With -fieldsplit_0_pc_type hypre
----------------------------------------
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 7
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 7
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 17
Linear solve converged due to CONVERGED_RTOL iterations 2

2. --------------------------------------------
With -fieldsplit_0_pc_type lu -ksp_view
-pc_type fieldsplit -pc_fieldsplit_type schur
-pc_fieldsplit_schur_precondition self -pc_fieldsplit_dm_splits 0
-pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3
-fieldsplit_0_pc_type lu -fieldsplit_1_ksp_rtol 1.0e-10
-fieldsplit_1_ksp_converged_reason -ksp_converged_reason -ksp_view

  Linear fieldsplit_0_ solve converged due to CONVERGED_ATOL iterations 0
  Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1
              ...
  Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22
  Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1
              ...
  Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22
  Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1
Linear solve converged due to CONVERGED_RTOL iterations 1

KSP Object: 1 MPI processes
  type: gmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=10000, initial guess is zero
  tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: 1 MPI processes
  type: fieldsplit
    FieldSplit with Schur preconditioner, blocksize = 4, factorization FULL
    Preconditioner for the Schur complement formed from S itself
    Split info:
    Split number 0 Fields  0, 1, 2
    Split number 1 Fields  3
    KSP solver for A00 block
      KSP Object:      (fieldsplit_0_)       1 MPI processes
        type: gmres
          GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
          GMRES: happy breakdown tolerance 1e-30
        maximum iterations=10000, initial guess is zero
        tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
        left preconditioning
        using PRECONDITIONED norm type for convergence test
      PC Object:      (fieldsplit_0_)       1 MPI processes
        type: lu
          LU: out-of-place factorization
          tolerance for zero pivot 2.22045e-14
          matrix ordering: nd
          factor fill ratio given 5, needed 20.772
            Factored matrix follows:
              Mat Object:               1 MPI processes
                type: seqaij
                rows=54243, cols=54243, bs=3
                package used to perform factorization: petsc
                total: nonzeros=8.41718e+07, allocated nonzeros=8.41718e+07
                total number of mallocs used during MatSetValues calls =0
                  using I-node routines: found 17801 nodes, limit used is 5
        linear system matrix = precond matrix:
        Mat Object:        (fieldsplit_0_)         1 MPI processes
          type: seqaij
          rows=54243, cols=54243, bs=3
          total: nonzeros=4.05217e+06, allocated nonzeros=4.05217e+06
          total number of mallocs used during MatSetValues calls =0
            using I-node routines: found 18081 nodes, limit used is 5
    KSP solver for S = A11 - A10 inv(A00) A01
      KSP Object:      (fieldsplit_1_)       1 MPI processes
        type: gmres
          GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
          GMRES: happy breakdown tolerance 1e-30
        maximum iterations=10000, initial guess is zero
        tolerances:  relative=1e-10, absolute=1e-50, divergence=10000
        left preconditioning
        using PRECONDITIONED norm type for convergence test
      PC Object:      (fieldsplit_1_)       1 MPI processes
        type: none
        linear system matrix = precond matrix:
        Mat Object:        (fieldsplit_1_)         1 MPI processes
          type: schurcomplement
          rows=18081, cols=18081
            has attached null space
            Schur complement A11 - A10 inv(A00) A01
            A11
              Mat Object:              (fieldsplit_1_)               1 MPI processes
                type: seqaij
                rows=18081, cols=18081
                total: nonzeros=450241, allocated nonzeros=450241
                total number of mallocs used during MatSetValues calls =0
                  not using I-node routines
            A10
              Mat Object:               1 MPI processes
                type: seqaij
                rows=18081, cols=54243
                total: nonzeros=1.35072e+06, allocated nonzeros=1.35072e+06
                total number of mallocs used during MatSetValues calls =0
                  not using I-node routines
            KSP of A00
              KSP Object:              (fieldsplit_0_)               1 MPI processes
                type: gmres
                  GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
                  GMRES: happy breakdown tolerance 1e-30
                maximum iterations=10000, initial guess is zero
                tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
                left preconditioning
                using PRECONDITIONED norm type for convergence test
              PC Object:              (fieldsplit_0_)               1 MPI processes
                type: lu
                  LU: out-of-place factorization
                  tolerance for zero pivot 2.22045e-14
                  matrix ordering: nd
                  factor fill ratio given 5, needed 20.772
                    Factored matrix follows:
                      Mat Object:                       1 MPI processes
                        type: seqaij
                        rows=54243, cols=54243, bs=3
                        package used to perform factorization: petsc
                        total: nonzeros=8.41718e+07, allocated nonzeros=8.41718e+07
                        total number of mallocs used during MatSetValues calls =0
                          using I-node routines: found 17801 nodes, limit used is 5
                linear system matrix = precond matrix:
                Mat Object:                (fieldsplit_0_)                 1 MPI processes
                  type: seqaij
                  rows=54243, cols=54243, bs=3
                  total: nonzeros=4.05217e+06, allocated nonzeros=4.05217e+06
                  total number of mallocs used during MatSetValues calls =0
                    using I-node routines: found 18081 nodes, limit used is 5
            A01
              Mat Object:               1 MPI processes
                type: seqaij
                rows=54243, cols=18081, rbs=3, cbs = 1
                total: nonzeros=1.35072e+06, allocated nonzeros=1.35072e+06
                total number of mallocs used during MatSetValues calls =0
                  using I-node routines: found 18081 nodes, limit used is 5
  linear system matrix = precond matrix:
  Mat Object:   1 MPI processes
    type: seqaij
    rows=72324, cols=72324, bs=4
    total: nonzeros=7.20386e+06, allocated nonzeros=7.20386e+06
    total number of mallocs used during MatSetValues calls =0
      has attached null space
      using I-node routines: found 18081 nodes, limit used is 5


3. ----------------------------------------
when using gamg: -pc_type fieldsplit -pc_fieldsplit_type schur
-pc_fieldsplit_schur_precondition self -pc_fieldsplit_dm_splits 0
-pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3
-fieldsplit_0_pc_type gamg -fieldsplit_0_ksp_converged_reason
-fieldsplit_1_ksp_converged_reason -ksp_converged_reason -ksp_view
----------------------------------------

[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[0]PETSC ERROR: Zero pivot in LU factorization:
http://www.mcs.anl.gov/petsc/documentation/faq.html#zeropivot
[0]PETSC ERROR: Zero pivot, row 3
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
[0]PETSC ERROR: /user/bkhanal/home/works/AdLemModel/build/src/AdLemMain on
a arch-linux2-cxx-debug named delorme by bkhanal Mon Dec  7 23:13:00 2015
[0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
--with-fc=gfortran --with-clanguage=cxx --download-fblaslapack
--download-scalapack --download-mpich --download-superlu_dist
--download-mumps --download-hypre --download-metis --download-parmetis
[0]PETSC ERROR: #1 PetscKernel_A_gets_inverse_A_5() line 48 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/impls/baij/seq/dgefa5.c
[0]PETSC ERROR: #2 MatSOR_SeqAIJ_Inode() line 2808 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/impls/aij/seq/inode.c
[0]PETSC ERROR: #3 MatSOR() line 3697 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/interface/matrix.c
[0]PETSC ERROR: #4 PCApply_SOR() line 37 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/sor/sor.c
[0]PETSC ERROR: #5 PCApply() line 483 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #6 KSP_PCApply() line 242 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/include/petsc/private/kspimpl.h
[0]PETSC ERROR: #7 KSPInitialResidual() line 63 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itres.c
[0]PETSC ERROR: #8 KSPSolve_GMRES() line 235 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/impls/gmres/gmres.c
[0]PETSC ERROR: #9 KSPSolve() line 604 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #10 KSPSolve_Chebyshev() line 381 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/impls/cheby/cheby.c
[0]PETSC ERROR: #11 KSPSolve() line 604 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #12 PCMGMCycle_Private() line 19 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: #13 PCMGMCycle_Private() line 48 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: #14 PCApply_MG() line 340 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: #15 PCApply() line 483 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #16 KSP_PCApply() line 242 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/include/petsc/private/kspimpl.h
[0]PETSC ERROR: #17 KSPInitialResidual() line 63 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itres.c
[0]PETSC ERROR: #18 KSPSolve_GMRES() line 235 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/impls/gmres/gmres.c
[0]PETSC ERROR: #19 KSPSolve() line 604 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #20 MatMult_SchurComplement() line 105 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/utils/schurm.c
[0]PETSC ERROR: #21 MatNullSpaceTest() line 431 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/interface/matnull.c

 WARNING: not a valid pressure null space
[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR:  Vec is locked read only, argument # 1
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
[0]PETSC ERROR: /user/bkhanal/home/works/AdLemModel/build/src/AdLemMain on
a arch-linux2-cxx-debug named delorme by bkhanal Mon Dec  7 23:13:00 2015
[0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
--with-fc=gfortran --with-clanguage=cxx --download-fblaslapack
--download-scalapack --download-mpich --download-superlu_dist
--download-mumps --download-hypre --download-metis --download-parmetis
[0]PETSC ERROR: #22 VecSet() line 573 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/vec/vec/interface/rvector.c
[0]PETSC ERROR: #23 MatMultTranspose_SeqAIJ() line 1263 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/impls/aij/seq/aij.c
[0]PETSC ERROR: #24 MatMultTranspose() line 2282 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/interface/matrix.c
[0]PETSC ERROR: #25 MatRestrict() line 7829 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/interface/matrix.c
[0]PETSC ERROR: #26 PCMGMCycle_Private() line 44 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: #27 PCApply_MG() line 340 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: #28 PCApply() line 483 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #29 KSP_PCApply() line 242 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/include/petsc/private/kspimpl.h
[0]PETSC ERROR: #30 KSPInitialResidual() line 63 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itres.c
[0]PETSC ERROR: #31 KSPSolve_GMRES() line 235 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/impls/gmres/gmres.c
[0]PETSC ERROR: #32 KSPSolve() line 604 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #33 PCApply_FieldSplit_Schur() line 898 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/fieldsplit/fieldsplit.c
[0]PETSC ERROR: #34 PCApply() line 483 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #35 KSP_PCApply() line 242 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/include/petsc/private/kspimpl.h
[0]PETSC ERROR: #36 KSPInitialResidual() line 63 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itres.c
[0]PETSC ERROR: #37 KSPSolve_GMRES() line 235 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/impls/gmres/gmres.c
[0]PETSC ERROR: #38 KSPSolve() line 604 in
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #39 solveModel() line 313 in
/home/bkhanal/works/AdLemModel/src/PetscAdLemTaras3D.cxx
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X
to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: ---------------------  Stack Frames
------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR:       INSTEAD the line number of the start of the function
[0]PETSC ERROR:       is given.
[0]PETSC ERROR: [0] getSolVelocityAt line 137
/home/bkhanal/works/AdLemModel/src/PetscAdLem3D.cxx
[0]PETSC ERROR: [0] VecSet line 568
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/vec/vec/interface/rvector.c
[0]PETSC ERROR: [0] MatMultTranspose_SeqAIJ line 1262
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/impls/aij/seq/aij.c
[0]PETSC ERROR: [0] MatMultTranspose line 2263
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/interface/matrix.c
[0]PETSC ERROR: [0] MatRestrict line 7817
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/interface/matrix.c
[0]PETSC ERROR: [0] PCMGMCycle_Private line 17
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: [0] PCApply_MG line 325
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: [0] PCApply line 463
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: [0] KSP_PCApply line 240
/home/bkhanal/Documents/softwares/petsc-3.6.3/include/petsc/private/kspimpl.h
[0]PETSC ERROR: [0] KSPInitialResidual line 44
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itres.c
[0]PETSC ERROR: [0] KSPSolve_GMRES line 225
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/impls/gmres/gmres.c
[0]PETSC ERROR: [0] KSPSolve line 510
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: [0] PCApply_FieldSplit_Schur line 848
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/fieldsplit/fieldsplit.c
[0]PETSC ERROR: [0] PCApply line 463
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: [0] KSP_PCApply line 240
/home/bkhanal/Documents/softwares/petsc-3.6.3/include/petsc/private/kspimpl.h
[0]PETSC ERROR: [0] KSPInitialResidual line 44
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itres.c
[0]PETSC ERROR: [0] KSPSolve_GMRES line 225
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/impls/gmres/gmres.c
[0]PETSC ERROR: [0] KSPSolve line 510
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: [0] PetscKernel_A_gets_inverse_A_5 line 25
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/impls/baij/seq/dgefa5.c
[0]PETSC ERROR: [0] MatSOR_SeqAIJ_Inode line 2764
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/impls/aij/seq/inode.c
[0]PETSC ERROR: [0] MatSOR line 3678
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/interface/matrix.c
[0]PETSC ERROR: [0] PCApply_SOR line 35
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/sor/sor.c
[0]PETSC ERROR: [0] PCApply line 463
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: [0] KSP_PCApply line 240
/home/bkhanal/Documents/softwares/petsc-3.6.3/include/petsc/private/kspimpl.h
[0]PETSC ERROR: [0] KSPInitialResidual line 44
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itres.c
[0]PETSC ERROR: [0] KSPSolve_GMRES line 225
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/impls/gmres/gmres.c
[0]PETSC ERROR: [0] KSPSolve line 510
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: [0] KSPSolve_Chebyshev line 358
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/impls/cheby/cheby.c
[0]PETSC ERROR: [0] KSPSolve line 510
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: [0] PCMGMCycle_Private line 17
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: [0] PCMGMCycle_Private line 17
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: [0] PCApply_MG line 325
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: [0] PCApply line 463
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: [0] KSP_PCApply line 240
/home/bkhanal/Documents/softwares/petsc-3.6.3/include/petsc/private/kspimpl.h
[0]PETSC ERROR: [0] KSPInitialResidual line 44
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itres.c
[0]PETSC ERROR: [0] KSPSolve_GMRES line 225
/home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/impls/gmres/gmres.c
[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
[0]PETSC ERROR: /user/bkhanal/home/works/AdLemModel/build/src/AdLemMain on
a arch-linux2-cxx-debug named delorme by bkhanal Mon Dec  7 23:13:00 2015
[0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
--with-fc=gfortran --with-clanguage=cxx --download-fblaslapack
--download-scalapack --download-mpich --download-superlu_dist
--download-mumps --download-hypre --download-metis --download-parmetis
[0]PETSC ERROR: #40 User provided function() line 0 in  unknown file
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
[unset]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
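
(For context on the "not a valid pressure null space" warning above: the Schur-complement matrix has a constant null space attached, which shows up as "has attached null space" in the -ksp_view output. A minimal sketch of how such a constant null space is typically attached, with S standing for the Schur-complement Mat:)

  MatNullSpace nsp;
  /* PETSC_TRUE: include the constant vector in the null space */
  MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp);
  MatSetNullSpace(S, nsp);
  MatNullSpaceDestroy(&nsp);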