[petsc-users] zero pivot in LU factorization when using -fieldsplit_0_pc_type gamg in schur fieldsplit PC

Bishesh Khanal bisheshkh at gmail.com
Tue Dec 8 03:44:23 CST 2015


On Tue, Dec 8, 2015 at 12:57 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
>    It is getting a zero pivot when factoring the little dense diagonal
> blocks of the smoother matrix (to do SOR).
>
>    What happens if you run with the additional option -mat_no_inode?
>
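(I.e. rerunning with exactly the same options as before, just with
-mat_no_inode appended, roughly:

    ./AdLemMain <same fieldsplit/gamg options as before> -mat_no_inode

Results below.)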

With -mat_no_inode I see that the fieldsplit_0_ solve has the preconditioned
norm converging but not the true residual norm, and therefore the
fieldsplit_1_ solve does not converge.

     Residual norms for fieldsplit_0_ solve.
    0 KSP preconditioned resid norm 0.000000000000e+00 true resid norm
0.000000000000e+00 ||r(i)||/||b||           -nan
    Residual norms for fieldsplit_0_ solve.
    0 KSP preconditioned resid norm 5.366616065799e-01 true resid norm
1.763251719324e+01 ||r(i)||/||b|| 1.000000000000e+00
    1 KSP preconditioned resid norm 2.051153486997e-15 true resid norm
1.763377056173e+01 ||r(i)||/||b|| 1.000071082789e+00
    Residual norms for fieldsplit_1_ solve.
      Residual norms for fieldsplit_0_ solve.
      0 KSP preconditioned resid norm 0.000000000000e+00 true resid norm
0.000000000000e+00 ||r(i)||/||b||           -nan
    0 KSP preconditioned resid norm 7.379043098663e+00 true resid norm
7.379043147181e+00 ||r(i)||/||b|| 1.000000000000e+00
    Residual norms for fieldsplit_0_ solve.
    0 KSP preconditioned resid norm 1.013832784140e-01 true resid norm
1.194932744284e+00 ||r(i)||/||b|| 1.000000000000e+00
    1 KSP preconditioned resid norm 3.651850464015e-16 true resid norm
1.195413534155e+00 ||r(i)||/||b|| 1.000402357265e+00
      Residual norms for fieldsplit_0_ solve.
      0 KSP preconditioned resid norm 9.468489343208e-01 true resid norm
1.115983634787e+01 ||r(i)||/||b|| 1.000000000000e+00
      1 KSP preconditioned resid norm 4.573576948730e-15 true resid norm
1.116432658910e+01 ||r(i)||/||b|| 1.000402357265e+00
    1 KSP preconditioned resid norm 7.376179991979e+00 true resid norm
7.376180019146e+00 ||r(i)||/||b|| 9.996119919645e-01
    Residual norms for fieldsplit_0_ solve.
    0 KSP preconditioned resid norm 4.141204698273e+00 true resid norm
1.010004734003e+00 ||r(i)||/||b|| 1.000000000000e+00
    1 KSP preconditioned resid norm 3.094717134282e-14 true resid norm
1.612661239170e+00 ||r(i)||/||b|| 1.596686812327e+00
      Residual norms for fieldsplit_0_ solve.
      0 KSP preconditioned resid norm 9.488116584954e-01 true resid norm
2.657897132073e+01 ||r(i)||/||b|| 1.000000000000e+00
      1 KSP preconditioned resid norm 2.486948812256e-15 true resid norm
2.658134372650e+01 ||r(i)||/||b|| 1.000089258750e+00
    2 KSP preconditioned resid norm 7.376160898815e+00 true resid norm
7.376160928269e+00 ||r(i)||/||b|| 9.996094047893e-01
                                  .......................
                  .......................
    Residual norms for fieldsplit_0_ solve.
    0 KSP preconditioned resid norm 3.198579793302e+00 true resid norm
1.508640813109e+00 ||r(i)||/||b|| 1.000000000000e+00
    1 KSP preconditioned resid norm 1.839267393927e-14 true resid norm
1.825155061421e+00 ||r(i)||/||b|| 1.209800931781e+00
      Residual norms for fieldsplit_0_ solve.
      0 KSP preconditioned resid norm 1.497257742332e+01 true resid norm
4.292799680684e+14 ||r(i)||/||b|| 1.000000000000e+00
      1 KSP preconditioned resid norm 1.138874539002e-13 true resid norm
4.292799680684e+14 ||r(i)||/||b|| 1.000000000000e+00
   43 KSP preconditioned resid norm 6.372901415746e+00 true resid norm
7.990778566794e+00 ||r(i)||/||b|| 1.082901726878e+00
        .... And continues


>    Run with the additional option -start_in_debugger noxterm
>
> In the debugger type
>
> cont
>
> when it crashes type
>
> bt
>
> then use
>
> up
>
> as many times as needed to get to the routine MatSOR() then type
>
> call MatView(mat,0)
>
> you should see that the diagonal subblock from the 3rd row to the 7th row
> is singular. Normally we would not expect this from the operator
> mu lap(u) unless mu has zero entries?
>
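For reference, that sequence in a gdb session looks roughly like the
following (the executable name and the option placeholder are only
illustrative):

    ./AdLemMain <usual fieldsplit/gamg options> -start_in_debugger noxterm
    (gdb) cont
        ... runs until the zero-pivot error ...
    (gdb) bt
    (gdb) up        <- repeated until the current frame is MatSOR()
    (gdb) call MatView(mat,0)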

Sorry, I didn't understand what exactly the diagonal subblock means here.
The mat is quite big, so it was not easy to inspect with MatView. Looking at
its output, what I see is that every now and then there are 3 or 4
contiguous rows (i, i+1, i+2, ...) having mostly zero values. No, mu does not
have zero entries; I set it to 1 everywhere except at the Dirichlet
boundaries, where mu is not used.
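In case it helps, one way to look at just a small window of rows instead of
the whole matrix is something along these lines (a minimal sketch assuming
PETSc 3.6 on one process; the helper name ViewBlock and the row range are
only illustrative, not from my code):

    #include <petscmat.h>

    /* View only rows/columns [first, first+n) of A, e.g. the small diagonal
       block that the SOR kernel fails to invert, instead of the whole matrix. */
    PetscErrorCode ViewBlock(Mat A, PetscInt first, PetscInt n)
    {
      PetscErrorCode ierr;
      IS             is;
      Mat            sub;

      ierr = ISCreateStride(PETSC_COMM_SELF, n, first, 1, &is);CHKERRQ(ierr);
      /* MatGetSubMatrix in PETSc 3.6 (MatCreateSubMatrix in later releases) */
      ierr = MatGetSubMatrix(A, is, is, MAT_INITIAL_MATRIX, &sub);CHKERRQ(ierr);
      ierr = MatView(sub, PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr);
      ierr = MatDestroy(&sub);CHKERRQ(ierr);
      ierr = ISDestroy(&is);CHKERRQ(ierr);
      return 0;
    }

Called for example as ViewBlock(mat, 3, 5) for the rows mentioned above; the
exact offset of the failing 5x5 inode block would still need to be checked in
the debugger.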


> What happens if you run with -fieldsplit_0_pc_type sor? Does it also give
> a zero pivot in the factorization?
>

No, it does not give a zero pivot; the system converges in 2 iterations.
The -fieldsplit_0_ solves mostly converge in around 27 iterations, and for
-fieldsplit_1_:
Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 7
Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 7
Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 17
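For completeness, this run used the same option set as in my first mail with
the A00 preconditioner swapped to SOR, i.e. roughly:

    -pc_type fieldsplit -pc_fieldsplit_type schur
        -pc_fieldsplit_schur_precondition self -pc_fieldsplit_dm_splits 0
        -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3
        -fieldsplit_0_pc_type sor -fieldsplit_0_ksp_converged_reason
        -fieldsplit_1_ksp_converged_reason -ksp_converged_reason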





>
>
> > On Dec 7, 2015, at 5:35 PM, Bishesh Khanal <bisheshkh at gmail.com> wrote:
> >
> > Hello,
> > I'm revisiting my code for solving a system of equations similar to Stokes
> to get better convergence for big jumps in viscosity (mu), and I wanted to
> experiment with PETSc's GAMG:
> > mu lap(u) + grad(p) = f1
> >              div(u) = f2
> > Dirichlet boundary condition of zero velocity.
> > The implementation is a finite-difference discretization on a staggered
> grid.
> >
> > Using a Schur FieldSplit PC type with constant viscosity, a hypre PC for
> the A00 block had been working well. Now, when trying GAMG, I get a zero
> pivot error. From some past discussions in the mailing list archives, it
> seems I have to play around with ..._sub_pc_factor_shift_type, but I could
> not figure out the exact set of options.
> >
> > Setting mu = 1 everywhere and using the following options:
> > -pc_type fieldsplit -pc_fieldsplit_type schur
> -pc_fieldsplit_schur_precondition self -pc_fieldsplit_dm_splits 0
> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3
> -fieldsplit_0_pc_type gamg
> > I get an error.
> > When I use -fieldsplit_0_pc_type lu, it converges in a single iteration,
> so I guess this means that A00 is not singular.
> > Below are the results of using:
> > 1. -fieldsplit_0_pc_type hypre (converges)
> > 2. -ksp_view -fieldsplit_0_pc_type lu (converges)
> > 3. -fieldsplit_0_pc_type gamg (all error output)
> >
> >
> > 1. ----------------------------------------
> > With -fieldsplit_0_pc_type hypre
> > ----------------------------------------
> >   Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 7
> >   Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 7
> >   Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations
> 17
> > Linear solve converged due to CONVERGED_RTOL iterations 2
> >
> > 2. --------------------------------------------
> > With -fieldsplit_0_pc_type lu -ksp_view
> > -pc_type fieldsplit -pc_fieldsplit_type schur
> -pc_fieldsplit_schur_precondition self -pc_fieldsplit_dm_splits 0
> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3
> -fieldsplit_0_pc_type lu -fieldsplit_1_ksp_rtol 1.0e-10
> -fieldsplit_1_ksp_converged_reason -ksp_converged_reason -ksp_view
> >
> >   Linear fieldsplit_0_ solve converged due to CONVERGED_ATOL iterations 0
> >   Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1
> >   Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1
> >               ...
> >   Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1
> >   Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations
> 22
> >   Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1
> >   Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1
> >               ...
> >   Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1
> >   Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations
> 22
> >   Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1
> > Linear solve converged due to CONVERGED_RTOL iterations 1
> >
> > KSP Object: 1 MPI processes
> >   type: gmres
> >     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
> Orthogonalization with no iterative refinement
> >     GMRES: happy breakdown tolerance 1e-30
> >   maximum iterations=10000, initial guess is zero
> >   tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
> >   left preconditioning
> >   using PRECONDITIONED norm type for convergence test
> > PC Object: 1 MPI processes
> >   type: fieldsplit
> >     FieldSplit with Schur preconditioner, blocksize = 4, factorization
> FULL
> >     Preconditioner for the Schur complement formed from S itself
> >     Split info:
> >     Split number 0 Fields  0, 1, 2
> >     Split number 1 Fields  3
> >     KSP solver for A00 block
> >       KSP Object:      (fieldsplit_0_)       1 MPI processes
> >         type: gmres
> >           GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
> Orthogonalization with no iterative refinement
> >           GMRES: happy breakdown tolerance 1e-30
> >         maximum iterations=10000, initial guess is zero
> >         tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
> >         left preconditioning
> >         using PRECONDITIONED norm type for convergence test
> >       PC Object:      (fieldsplit_0_)       1 MPI processes
> >         type: lu
> >           LU: out-of-place factorization
> >           tolerance for zero pivot 2.22045e-14
> >           matrix ordering: nd
> >           factor fill ratio given 5, needed 20.772
> >             Factored matrix follows:
> >               Mat Object:               1 MPI processes
> >                 type: seqaij
> >                 rows=54243, cols=54243, bs=3
> >                 package used to perform factorization: petsc
> >                 total: nonzeros=8.41718e+07, allocated
> nonzeros=8.41718e+07
> >                 total number of mallocs used during MatSetValues calls =0
> >                   using I-node routines: found 17801 nodes, limit used
> is 5
> >         linear system matrix = precond matrix:
> >         Mat Object:        (fieldsplit_0_)         1 MPI processes
> >           type: seqaij
> >           rows=54243, cols=54243, bs=3
> >           total: nonzeros=4.05217e+06, allocated nonzeros=4.05217e+06
> >           total number of mallocs used during MatSetValues calls =0
> >             using I-node routines: found 18081 nodes, limit used is 5
> >     KSP solver for S = A11 - A10 inv(A00) A01
> >       KSP Object:      (fieldsplit_1_)       1 MPI processes
> >         type: gmres
> >           GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
> Orthogonalization with no iterative refinement
> >           GMRES: happy breakdown tolerance 1e-30
> >         maximum iterations=10000, initial guess is zero
> >         tolerances:  relative=1e-10, absolute=1e-50, divergence=10000
> >         left preconditioning
> >         using PRECONDITIONED norm type for convergence test
> >       PC Object:      (fieldsplit_1_)       1 MPI processes
> >         type: none
> >         linear system matrix = precond matrix:
> >         Mat Object:        (fieldsplit_1_)         1 MPI processes
> >           type: schurcomplement
> >           rows=18081, cols=18081
> >             has attached null space
> >             Schur complement A11 - A10 inv(A00) A01
> >             A11
> >               Mat Object:              (fieldsplit_1_)               1
> MPI processes
> >                 type: seqaij
> >                 rows=18081, cols=18081
> >                 total: nonzeros=450241, allocated nonzeros=450241
> >                 total number of mallocs used during MatSetValues calls =0
> >                   not using I-node routines
> >             A10
> >               Mat Object:               1 MPI processes
> >                 type: seqaij
> >                 rows=18081, cols=54243
> >                 total: nonzeros=1.35072e+06, allocated
> nonzeros=1.35072e+06
> >                 total number of mallocs used during MatSetValues calls =0
> >                   not using I-node routines
> >             KSP of A00
> >               KSP Object:              (fieldsplit_0_)               1
> MPI processes
> >                 type: gmres
> >                   GMRES: restart=30, using Classical (unmodified)
> Gram-Schmidt Orthogonalization with no iterative refinement
> >                   GMRES: happy breakdown tolerance 1e-30
> >                 maximum iterations=10000, initial guess is zero
> >                 tolerances:  relative=1e-05, absolute=1e-50,
> divergence=10000
> >                 left preconditioning
> >                 using PRECONDITIONED norm type for convergence test
> >               PC Object:              (fieldsplit_0_)               1
> MPI processes
> >                 type: lu
> >                   LU: out-of-place factorization
> >                   tolerance for zero pivot 2.22045e-14
> >                   matrix ordering: nd
> >                   factor fill ratio given 5, needed 20.772
> >                     Factored matrix follows:
> >                       Mat Object:                       1 MPI processes
> >                         type: seqaij
> >                         rows=54243, cols=54243, bs=3
> >                         package used to perform factorization: petsc
> >                         total: nonzeros=8.41718e+07, allocated
> nonzeros=8.41718e+07
> >                         total number of mallocs used during MatSetValues
> calls =0
> >                           using I-node routines: found 17801 nodes,
> limit used is 5
> >                 linear system matrix = precond matrix:
> >                 Mat Object:                (fieldsplit_0_)
>    1 MPI processes
> >                   type: seqaij
> >                   rows=54243, cols=54243, bs=3
> >                   total: nonzeros=4.05217e+06, allocated
> nonzeros=4.05217e+06
> >                   total number of mallocs used during MatSetValues calls
> =0
> >                     using I-node routines: found 18081 nodes, limit used
> is 5
> >             A01
> >               Mat Object:               1 MPI processes
> >                 type: seqaij
> >                 rows=54243, cols=18081, rbs=3, cbs = 1
> >                 total: nonzeros=1.35072e+06, allocated
> nonzeros=1.35072e+06
> >                 total number of mallocs used during MatSetValues calls =0
> >                   using I-node routines: found 18081 nodes, limit used
> is 5
> >   linear system matrix = precond matrix:
> >   Mat Object:   1 MPI processes
> >     type: seqaij
> >     rows=72324, cols=72324, bs=4
> >     total: nonzeros=7.20386e+06, allocated nonzeros=7.20386e+06
> >     total number of mallocs used during MatSetValues calls =0
> >       has attached null space
> >       using I-node routines: found 18081 nodes, limit used is 5
> >
> >
> > 3. ----------------------------------------
> > when using gamg: -pc_type fieldsplit -pc_fieldsplit_type schur
> -pc_fieldsplit_schur_precondition self -pc_fieldsplit_dm_splits 0
> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3
> -fieldsplit_0_pc_type gamg -fieldsplit_0_ksp_converged_reason
> -fieldsplit_1_ksp_converged_reason -ksp_converged_reason -ksp_view
> > ----------------------------------------
> >
> > [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > [0]PETSC ERROR: Zero pivot in LU factorization:
> http://www.mcs.anl.gov/petsc/documentation/faq.html#zeropivot
> > [0]PETSC ERROR: Zero pivot, row 3
> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> > [0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
> > [0]PETSC ERROR: /user/bkhanal/home/works/AdLemModel/build/src/AdLemMain
> on a arch-linux2-cxx-debug named delorme by bkhanal Mon Dec  7 23:13:00 2015
> > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
> --with-fc=gfortran --with-clanguage=cxx --download-fblaslapack
> --download-scalapack --download-mpich --download-superlu_dist
> --download-mumps --download-hypre --download-metis --download-parmetis
> > [0]PETSC ERROR: #1 PetscKernel_A_gets_inverse_A_5() line 48 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/impls/baij/seq/dgefa5.c
> > [0]PETSC ERROR: #2 MatSOR_SeqAIJ_Inode() line 2808 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/impls/aij/seq/inode.c
> > [0]PETSC ERROR: #3 MatSOR() line 3697 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/interface/matrix.c
> > [0]PETSC ERROR: #4 PCApply_SOR() line 37 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/sor/sor.c
> > [0]PETSC ERROR: #5 PCApply() line 483 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/interface/precon.c
> > [0]PETSC ERROR: #6 KSP_PCApply() line 242 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/include/petsc/private/kspimpl.h
> > [0]PETSC ERROR: #7 KSPInitialResidual() line 63 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itres.c
> > [0]PETSC ERROR: #8 KSPSolve_GMRES() line 235 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/impls/gmres/gmres.c
> > [0]PETSC ERROR: #9 KSPSolve() line 604 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itfunc.c
> > [0]PETSC ERROR: #10 KSPSolve_Chebyshev() line 381 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/impls/cheby/cheby.c
> > [0]PETSC ERROR: #11 KSPSolve() line 604 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itfunc.c
> > [0]PETSC ERROR: #12 PCMGMCycle_Private() line 19 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/mg/mg.c
> > [0]PETSC ERROR: #13 PCMGMCycle_Private() line 48 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/mg/mg.c
> > [0]PETSC ERROR: #14 PCApply_MG() line 340 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/mg/mg.c
> > [0]PETSC ERROR: #15 PCApply() line 483 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/interface/precon.c
> > [0]PETSC ERROR: #16 KSP_PCApply() line 242 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/include/petsc/private/kspimpl.h
> > [0]PETSC ERROR: #17 KSPInitialResidual() line 63 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itres.c
> > [0]PETSC ERROR: #18 KSPSolve_GMRES() line 235 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/impls/gmres/gmres.c
> > [0]PETSC ERROR: #19 KSPSolve() line 604 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itfunc.c
> > [0]PETSC ERROR: #20 MatMult_SchurComplement() line 105 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/utils/schurm.c
> > [0]PETSC ERROR: #21 MatNullSpaceTest() line 431 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/interface/matnull.c
> >
> >  WARNING: not a valid pressure null space
> > [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > [0]PETSC ERROR: Object is in wrong state
> > [0]PETSC ERROR:  Vec is locked read only, argument # 1
> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> > [0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
> > [0]PETSC ERROR: /user/bkhanal/home/works/AdLemModel/build/src/AdLemMain
> on a arch-linux2-cxx-debug named delorme by bkhanal Mon Dec  7 23:13:00 2015
> > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
> --with-fc=gfortran --with-clanguage=cxx --download-fblaslapack
> --download-scalapack --download-mpich --download-superlu_dist
> --download-mumps --download-hypre --download-metis --download-parmetis
> > [0]PETSC ERROR: #22 VecSet() line 573 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/vec/vec/interface/rvector.c
> > [0]PETSC ERROR: #23 MatMultTranspose_SeqAIJ() line 1263 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/impls/aij/seq/aij.c
> > [0]PETSC ERROR: #24 MatMultTranspose() line 2282 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/interface/matrix.c
> > [0]PETSC ERROR: #25 MatRestrict() line 7829 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/interface/matrix.c
> > [0]PETSC ERROR: #26 PCMGMCycle_Private() line 44 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/mg/mg.c
> > [0]PETSC ERROR: #27 PCApply_MG() line 340 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/mg/mg.c
> > [0]PETSC ERROR: #28 PCApply() line 483 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/interface/precon.c
> > [0]PETSC ERROR: #29 KSP_PCApply() line 242 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/include/petsc/private/kspimpl.h
> > [0]PETSC ERROR: #30 KSPInitialResidual() line 63 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itres.c
> > [0]PETSC ERROR: #31 KSPSolve_GMRES() line 235 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/impls/gmres/gmres.c
> > [0]PETSC ERROR: #32 KSPSolve() line 604 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itfunc.c
> > [0]PETSC ERROR: #33 PCApply_FieldSplit_Schur() line 898 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/fieldsplit/fieldsplit.c
> > [0]PETSC ERROR: #34 PCApply() line 483 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/interface/precon.c
> > [0]PETSC ERROR: #35 KSP_PCApply() line 242 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/include/petsc/private/kspimpl.h
> > [0]PETSC ERROR: #36 KSPInitialResidual() line 63 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itres.c
> > [0]PETSC ERROR: #37 KSPSolve_GMRES() line 235 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/impls/gmres/gmres.c
> > [0]PETSC ERROR: #38 KSPSolve() line 604 in
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itfunc.c
> > [0]PETSC ERROR: #39 solveModel() line 313 in
> /home/bkhanal/works/AdLemModel/src/PetscAdLemTaras3D.cxx
> > [0]PETSC ERROR:
> ------------------------------------------------------------------------
> > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> probably memory access out of range
> > [0]PETSC ERROR: Try option -start_in_debugger or
> -on_error_attach_debugger
> > [0]PETSC ERROR: or see
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> > [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac
> OS X to find memory corruption errors
> > [0]PETSC ERROR: likely location of problem given in stack below
> > [0]PETSC ERROR: ---------------------  Stack Frames
> ------------------------------------
> > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
> available,
> > [0]PETSC ERROR:       INSTEAD the line number of the start of the
> function
> > [0]PETSC ERROR:       is given.
> > [0]PETSC ERROR: [0] getSolVelocityAt line 137
> /home/bkhanal/works/AdLemModel/src/PetscAdLem3D.cxx
> > [0]PETSC ERROR: [0] VecSet line 568
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/vec/vec/interface/rvector.c
> > [0]PETSC ERROR: [0] MatMultTranspose_SeqAIJ line 1262
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/impls/aij/seq/aij.c
> > [0]PETSC ERROR: [0] MatMultTranspose line 2263
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/interface/matrix.c
> > [0]PETSC ERROR: [0] MatRestrict line 7817
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/interface/matrix.c
> > [0]PETSC ERROR: [0] PCMGMCycle_Private line 17
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/mg/mg.c
> > [0]PETSC ERROR: [0] PCApply_MG line 325
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/mg/mg.c
> > [0]PETSC ERROR: [0] PCApply line 463
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/interface/precon.c
> > [0]PETSC ERROR: [0] KSP_PCApply line 240
> /home/bkhanal/Documents/softwares/petsc-3.6.3/include/petsc/private/kspimpl.h
> > [0]PETSC ERROR: [0] KSPInitialResidual line 44
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itres.c
> > [0]PETSC ERROR: [0] KSPSolve_GMRES line 225
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/impls/gmres/gmres.c
> > [0]PETSC ERROR: [0] KSPSolve line 510
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itfunc.c
> > [0]PETSC ERROR: [0] PCApply_FieldSplit_Schur line 848
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/fieldsplit/fieldsplit.c
> > [0]PETSC ERROR: [0] PCApply line 463
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/interface/precon.c
> > [0]PETSC ERROR: [0] KSP_PCApply line 240
> /home/bkhanal/Documents/softwares/petsc-3.6.3/include/petsc/private/kspimpl.h
> > [0]PETSC ERROR: [0] KSPInitialResidual line 44
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itres.c
> > [0]PETSC ERROR: [0] KSPSolve_GMRES line 225
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/impls/gmres/gmres.c
> > [0]PETSC ERROR: [0] KSPSolve line 510
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itfunc.c
> > [0]PETSC ERROR: [0] PetscKernel_A_gets_inverse_A_5 line 25
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/impls/baij/seq/dgefa5.c
> > [0]PETSC ERROR: [0] MatSOR_SeqAIJ_Inode line 2764
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/impls/aij/seq/inode.c
> > [0]PETSC ERROR: [0] MatSOR line 3678
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/mat/interface/matrix.c
> > [0]PETSC ERROR: [0] PCApply_SOR line 35
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/sor/sor.c
> > [0]PETSC ERROR: [0] PCApply line 463
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/interface/precon.c
> > [0]PETSC ERROR: [0] KSP_PCApply line 240
> /home/bkhanal/Documents/softwares/petsc-3.6.3/include/petsc/private/kspimpl.h
> > [0]PETSC ERROR: [0] KSPInitialResidual line 44
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itres.c
> > [0]PETSC ERROR: [0] KSPSolve_GMRES line 225
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/impls/gmres/gmres.c
> > [0]PETSC ERROR: [0] KSPSolve line 510
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itfunc.c
> > [0]PETSC ERROR: [0] KSPSolve_Chebyshev line 358
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/impls/cheby/cheby.c
> > [0]PETSC ERROR: [0] KSPSolve line 510
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itfunc.c
> > [0]PETSC ERROR: [0] PCMGMCycle_Private line 17
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/mg/mg.c
> > [0]PETSC ERROR: [0] PCMGMCycle_Private line 17
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/mg/mg.c
> > [0]PETSC ERROR: [0] PCApply_MG line 325
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/impls/mg/mg.c
> > [0]PETSC ERROR: [0] PCApply line 463
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/pc/interface/precon.c
> > [0]PETSC ERROR: [0] KSP_PCApply line 240
> /home/bkhanal/Documents/softwares/petsc-3.6.3/include/petsc/private/kspimpl.h
> > [0]PETSC ERROR: [0] KSPInitialResidual line 44
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/interface/itres.c
> > [0]PETSC ERROR: [0] KSPSolve_GMRES line 225
> /home/bkhanal/Documents/softwares/petsc-3.6.3/src/ksp/ksp/impls/gmres/gmres.c
> > [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > [0]PETSC ERROR: Signal received
> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> > [0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
> > [0]PETSC ERROR: /user/bkhanal/home/works/AdLemModel/build/src/AdLemMain
> on a arch-linux2-cxx-debug named delorme by bkhanal Mon Dec  7 23:13:00 2015
> > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
> --with-fc=gfortran --with-clanguage=cxx --download-fblaslapack
> --download-scalapack --download-mpich --download-superlu_dist
> --download-mumps --download-hypre --download-metis --download-parmetis
> > [0]PETSC ERROR: #40 User provided function() line 0 in  unknown file
> > application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
> > [unset]: aborting job:
> > application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
> >
> >
>
>