[petsc-users] ML - zero pivot error

Matthew Knepley knepley at gmail.com
Tue Jul 17 07:11:32 CDT 2012


On Tue, Jul 17, 2012 at 7:07 AM, Benjamin Sanderse <B.Sanderse at cwi.nl> wrote:

> The two-processor case does not get far enough for -ksp_view to produce
> any output. The one-processor case gives the output below.
> I also added -mg_levels_pc_factor_shift_type nonzero, but I still get the
> same error.
>

Stick in the identity instead. There must be a reason the solver does not
receive the option. You could also set a breakpoint inside the
PCSetFromOptions_Factor() method to see what is happening.
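
To try the identity quickly from the command line, the standard PCMG
prefixes (the same ones that appear in your -ksp_view output) would be

    -mg_coarse_ksp_type preonly -mg_coarse_pc_type none

One more guess: your two-process traceback goes through PCSetUp_Redundant(),
so in parallel the coarse LU sits inside a PCREDUNDANT and its factor
options pick up an extra prefix. It may be worth trying

    -mg_coarse_redundant_pc_factor_shift_type nonzero

as well.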

    Matt


> Benjamin
>
>
> KSP Object: 1 MPI processes
>   type: cg
>   maximum iterations=500
>   tolerances:  relative=1e-10, absolute=1e-10, divergence=10000
>   left preconditioning
>   has attached null space
>   using nonzero initial guess
>   using PRECONDITIONED norm type for convergence test
> PC Object: 1 MPI processes
>   type: ml
>     MG: type is MULTIPLICATIVE, levels=4 cycles=v
>       Cycles per PCApply=1
>       Using Galerkin computed coarse grid matrices
>   Coarse grid solver -- level -------------------------------
>     KSP Object:    (mg_coarse_)     1 MPI processes
>       type: preonly
>       maximum iterations=1, initial guess is zero
>       tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
>       left preconditioning
>       using NONE norm type for convergence test
>     PC Object:    (mg_coarse_)     1 MPI processes
>       type: lu
>         LU: out-of-place factorization
>         tolerance for zero pivot 2.22045e-14
> *        using diagonal shift to prevent zero pivot*
>         matrix ordering: nd
>         factor fill ratio given 5, needed 1
>           Factored matrix follows:
>             Matrix Object:             1 MPI processes
>               type: seqaij
>               rows=1, cols=1
>               package used to perform factorization: petsc
>               total: nonzeros=1, allocated nonzeros=1
>               total number of mallocs used during MatSetValues calls =0
>                 not using I-node routines
>       linear system matrix = precond matrix:
>       Matrix Object:       1 MPI processes
>         type: seqaij
>         rows=1, cols=1
>         total: nonzeros=1, allocated nonzeros=1
>         total number of mallocs used during MatSetValues calls =0
>           not using I-node routines
>   Down solver (pre-smoother) on level 1 -------------------------------
>     KSP Object:    (mg_levels_1_)     1 MPI processes
>       type: richardson
>         Richardson: damping factor=1
>       maximum iterations=2
>       tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
>       left preconditioning
>       using nonzero initial guess
>       using NONE norm type for convergence test
>     PC Object:    (mg_levels_1_)     1 MPI processes
>       type: sor
>         SOR: type = local_symmetric, iterations = 1, local iterations = 1,
> omega = 1
>       linear system matrix = precond matrix:
>       Matrix Object:       1 MPI processes
>         type: seqaij
>         rows=6, cols=6
>         total: nonzeros=36, allocated nonzeros=36
>         total number of mallocs used during MatSetValues calls =0
>           using I-node routines: found 2 nodes, limit used is 5
>   Up solver (post-smoother) same as down solver (pre-smoother)
>   Down solver (pre-smoother) on level 2 -------------------------------
>     KSP Object:    (mg_levels_2_)     1 MPI processes
>       type: richardson
>         Richardson: damping factor=1
>       maximum iterations=2
>       tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
>       left preconditioning
>       using nonzero initial guess
>       using NONE norm type for convergence test
>     PC Object:    (mg_levels_2_)     1 MPI processes
>       type: sor
>         SOR: type = local_symmetric, iterations = 1, local iterations = 1,
> omega = 1
>       linear system matrix = precond matrix:
>       Matrix Object:       1 MPI processes
>         type: seqaij
>         rows=130, cols=130
>         total: nonzeros=2704, allocated nonzeros=2704
>         total number of mallocs used during MatSetValues calls =0
>           not using I-node routines
>   Up solver (post-smoother) same as down solver (pre-smoother)
>   Down solver (pre-smoother) on level 3 -------------------------------
>     KSP Object:    (mg_levels_3_)     1 MPI processes
>       type: richardson
>         Richardson: damping factor=1
>       maximum iterations=2
>       tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
>       left preconditioning
>       using nonzero initial guess
>       using NONE norm type for convergence test
>     PC Object:    (mg_levels_3_)     1 MPI processes
>       type: sor
>         SOR: type = local_symmetric, iterations = 1, local iterations = 1,
> omega = 1
>       linear system matrix = precond matrix:
>       Matrix Object:       1 MPI processes
>         type: seqaij
>         rows=1000, cols=1000
>         total: nonzeros=6400, allocated nonzeros=6400
>         total number of mallocs used during MatSetValues calls =0
>           not using I-node routines
>   Up solver (post-smoother) same as down solver (pre-smoother)
>   linear system matrix = precond matrix:
>   Matrix Object:   1 MPI processes
>     type: seqaij
>     rows=1000, cols=1000
>     total: nonzeros=6400, allocated nonzeros=6400
>     total number of mallocs used during MatSetValues calls =0
>       not using I-node routines
>
>
> On 17 Jul 2012, at 12:58, Matthew Knepley wrote:
>
> On Tue, Jul 17, 2012 at 3:23 AM, Benjamin Sanderse <B.Sanderse at cwi.nl> wrote:
>
>> Hello all,
>>
>> I am trying to use ML to solve a Poisson equation with Neumann BCs
>> (singular matrix) and found the thread below about avoiding a zero pivot
>> error. I used the option that Barry suggests, -mg_coarse_pc_factor_shift_type
>> nonzero, which works, but only when I run on a single processor. For two or
>> more processors I still get a zero pivot error. Are there more options to
>> be set for the parallel case?
>>
>
> Should still work. Can you check that the prefix for the coarse solver is
> correct with -ksp_view?
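>
> For example, something along these lines (adjust the launcher and process
> count to your setup):
>
>     mpiexec -n 2 bin/navier-stokes -mg_coarse_pc_factor_shift_type nonzero -ksp_view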
>
>    Matt
>
>
>> Benjamin
>>
>>
>> [1]PETSC ERROR: --------------------- Error Message
>> ------------------------------------
>> [1]PETSC ERROR: Detected zero pivot in LU factorization:
>> see http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot!
>> [1]PETSC ERROR: Zero pivot row 1 value 5.7431e-18 tolerance 2.22045e-14!
>> [1]PETSC ERROR:
>> ------------------------------------------------------------------------
>> [1]PETSC ERROR: Petsc Release Version 3.3.0, Patch 1, Fri Jun 15 09:30:49
>> CDT 2012
>> [1]PETSC ERROR: See docs/changes/index.html for recent updates.
>> [1]PETSC ERROR: [0]PETSC ERROR: --------------------- Error Message
>> ------------------------------------
>> [0]PETSC ERROR: Detected zero pivot in LU factorization:
>> see http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot!
>> [0]PETSC ERROR: Zero pivot row 1 value 5.7431e-18 tolerance 2.22045e-14!
>> [0]PETSC ERROR:
>> ------------------------------------------------------------------------
>> [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 1, Fri Jun 15 09:30:49
>> CDT 2012
>> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>> [0]PETSC ERROR: See docs/index.html for manual pages.
>> [0]PETSC ERROR:
>> ------------------------------------------------------------------------
>> [0]PETSC ERROR: bin/navier-stokes on a linux-gnu named
>> gb-r3n32.irc.sara.nl by sanderse Tue Jul 17 10:04:05 2012
>> [0]PETSC ERROR: Libraries linked from
>> /home/sanderse/Software/petsc-3.3-p1/linux-gnu-c-debug/lib
>> [0]PETSC ERROR: Configure run at Mon Jul 16 21:06:33 2012
>> [0]PETSC ERROR: Configure options --with-cc=mpicc --with-fc=mpif90
>> --with-cxx=mpicxx --with-shared-libraries --with-hypre --download-hypre
>> --with-blas-lapack-dir=/sara
>> /sw/intel/Compiler/11.0/069 --with-hdf5 --download-hdf5
>> --with-debugging=0 --download-ml
>> [0]PETSC ERROR:
>> ------------------------------------------------------------------------
>> [0]PETSC ERROR: MatPivotCheck_none() line 583 in
>> /home/sanderse/Software/petsc-3.3-p1/include/petsc-private/matimpl.h
>> [0]PETSC ERROR: MatPivotCheck() line 602 in
>> /home/sanderse/Software/petsc-3.3-p1/include/petsc-private/matimpl.h
>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>> [1]PETSC ERROR: See docs/index.html for manual pages.
>> [1]PETSC ERROR:
>> ------------------------------------------------------------------------
>> [1]PETSC ERROR: bin/navier-stokes on a linux-gnu named
>> gb-r3n32.irc.sara.nl by sanderse Tue Jul 17 10:04:05 2012
>> [1]PETSC ERROR: Libraries linked from
>> /home/sanderse/Software/petsc-3.3-p1/linux-gnu-c-debug/lib
>> [1]PETSC ERROR: Configure run at Mon Jul 16 21:06:33 2012
>> [1]PETSC ERROR: Configure options --with-cc=mpicc --with-fc=mpif90
>> --with-cxx=mpicxx --with-shared-libraries --with-hypre --download-hypre
>> --with-blas-lapack-dir=/sara
>> /sw/intel/Compiler/11.0/069 --with-hdf5 --download-hdf5
>> --with-debugging=0 --download-ml
>> [1]PETSC ERROR:
>> ------------------------------------------------------------------------
>> [1]PETSC ERROR: MatPivotCheck_none() line 583 in
>> /home/sanderse/Software/petsc-3.3-p1/include/petsc-private/matimpl.h
>> MatLUFactorNumeric_SeqAIJ_Inode() line 1469 in
>> /home/sanderse/Software/petsc-3.3-p1/src/mat/impls/aij/seq/inode.c
>> [0]PETSC ERROR: MatLUFactorNumeric() line 2790 in
>> /home/sanderse/Software/petsc-3.3-p1/src/mat/interface/matrix.c
>> [0]PETSC ERROR: PCSetUp_LU() line 160 in
>> /home/sanderse/Software/petsc-3.3-p1/src/ksp/pc/impls/factor/lu/lu.c
>> [0]PETSC ERROR: PCSetUp() line 832 in
>> /home/sanderse/Software/petsc-3.3-p1/src/ksp/pc/interface/precon.c
>> [0]PETSC ERROR: KSPSetUp() line 278 in
>> /home/sanderse/Software/petsc-3.3-p1/src/ksp/ksp/interface/itfunc.c
>> [0]PETSC ERROR: PCSetUp_Redundant() line 176 in
>> /home/sanderse/Software/petsc-3.3-p1/src/ksp/pc/impls/redundant/redundant.c
>> [0]PETSC ERROR: PCSetUp() line 832 in
>> /home/sanderse/Software/petsc-3.3-p1/src/ksp/pc/interface/precon.c
>> [0]PETSC ERROR: KSPSetUp() line 278 in
>> /home/sanderse/Software/petsc-3.3-p1/src/ksp/ksp/interface/itfunc.c
>> [0]PETSC ERROR: PCSetUp_MG() line 729 in
>> /home/sanderse/Software/petsc-3.3-p1/src/ksp/pc/impls/mg/mg.c
>> [0]PETSC ERROR: [1]PETSC ERROR: MatPivotCheck() line 602 in
>> /home/sanderse/Software/petsc-3.3-p1/include/petsc-private/matimpl.h
>> [1]PETSC ERROR: MatLUFactorNumeric_SeqAIJ_Inode() line 1469 in
>> /home/sanderse/Software/petsc-3.3-p1/src/mat/impls/aij/seq/inode.c
>> [1]PETSC ERROR: MatLUFactorNumeric() line 2790 in
>> /home/sanderse/Software/petsc-3.3-p1/src/mat/interface/matrix.c
>> [1]PETSC ERROR: PCSetUp_LU() line 160 in
>> /home/sanderse/Software/petsc-3.3-p1/src/ksp/pc/impls/factor/lu/lu.c
>> [1]PETSC ERROR: PCSetUp_ML() line 820 in
>> /home/sanderse/Software/petsc-3.3-p1/src/ksp/pc/impls/ml/ml.c
>> [0]PETSC ERROR: PCSetUp() line 832 in
>> /home/sanderse/Software/petsc-3.3-p1/src/ksp/pc/interface/precon.c
>> [0]PETSC ERROR: KSPSetUp() line 278 in
>> /home/sanderse/Software/petsc-3.3-p1/src/ksp/ksp/interface/itfunc.c
>> PCSetUp() line 832 in
>> /home/sanderse/Software/petsc-3.3-p1/src/ksp/pc/interface/precon.c
>> [1]PETSC ERROR: KSPSetUp() line 278 in
>> /home/sanderse/Software/petsc-3.3-p1/src/ksp/ksp/interface/itfunc.c
>> [1]PETSC ERROR: PCSetUp_Redundant() line 176 in
>> /home/sanderse/Software/petsc-3.3-p1/src/ksp/pc/impls/redundant/redundant.c
>> [1]PETSC ERROR: PCSetUp() line 832 in
>> /home/sanderse/Software/petsc-3.3-p1/src/ksp/pc/interface/precon.c
>> [1]PETSC ERROR: KSPSetUp() line 278 in
>> /home/sanderse/Software/petsc-3.3-p1/src/ksp/ksp/interface/itfunc.c
>> [1]PETSC ERROR: PCSetUp_MG() line 729 in
>> /home/sanderse/Software/petsc-3.3-p1/src/ksp/pc/impls/mg/mg.c
>> [1]PETSC ERROR: PCSetUp_ML() line 820 in
>> /home/sanderse/Software/petsc-3.3-p1/src/ksp/pc/impls/ml/ml.c
>> [1]PETSC ERROR: PCSetUp() line 832 in
>> /home/sanderse/Software/petsc-3.3-p1/src/ksp/pc/interface/precon.c
>> [1]PETSC ERROR: KSPSetUp() line 278 in
>> /home/sanderse/Software/petsc-3.3-p1/src/ksp/ksp/interface/itfunc.c
>>
>>
>> On 3 Jun 2011, at 14:26, Barry Smith wrote:
>>
>> >
>> >   It is the direct solver on the coarse grid that is finding the zero
>> > pivot (since the coarse-grid problem, like all the levels, has a null
>> > space).
>> >
>> >   You can use the option -mg_coarse_pc_factor_shift_type nonzero (in
>> > petsc-3.1 or petsc-dev). Also keep the KSPSetNullSpace() call you are
>> > using.
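>> >
>> >   If you prefer to set it from code instead of the command line, a
>> > minimal sketch (assuming the Fortran bindings) is
>> >
>> >     call PetscOptionsSetValue('-mg_coarse_pc_factor_shift_type','nonzero',ierr)
>> >
>> > placed before the call to KSPSetFromOptions().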
>> >
>> >
>> >   Barry
>> >
>> > On Jun 3, 2011, at 3:54 AM, Stijn A. M. Vantieghem wrote:
>> >
>> >> Dear all,
>> >>
>> >> I am using PETSc (Fortran interface) to solve a Poisson equation with
>> >> Neumann boundary conditions. Up to now I have managed to do this with
>> >> Hypre's BoomerAMG. Now I am investigating whether I can improve the
>> >> performance of my code by using ML. However, at execution I receive a
>> >> zero pivot error; I tried to remove the (constant) null space with
>> >> KSPSetNullSpace, but this did not solve my problem. Do you have an idea
>> >> of what I am doing wrong? Thanks.
>> >>
>> >> The relevant portions of my code are as follows:
>> >>
>> >> !****************************************************
>> >> ! Create the solver and attach the (singular) Poisson matrix M
>> >> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr)
>> >> call KSPSetOperators(ksp,M,M,DIFFERENT_NONZERO_PATTERN,ierr)
>> >>
>> >> ! Tell the KSP about the constant null space (Neumann BCs)
>> >> call MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,sp,ierr)
>> >> call KSPSetNullSpace(ksp,sp,ierr)
>> >>
>> >> ! Use ML as the preconditioner
>> >> call KSPGetPC(ksp,pc,ierr)
>> >> call PCSetType(pc,PCML,ierr)
>> >>
>> >> ! Pick up any command-line options
>> >> call KSPSetFromOptions(ksp,ierr)
>> >> ...
>> >> call KSPSetInitialGuessNonzero(ksp,PETSC_TRUE,ierr)
>> >> call KSPSolve(ksp,petsc_rhs,petsc_pressure,ierr)
>> >> !****************************************************
>> >>
>> >> and the error message is:
>> >>
>> >> [0]PETSC ERROR: --------------------- Error Message
>> ------------------------------------
>> >> [0]PETSC ERROR: Detected zero pivot in LU factorization
>> >> see
>> http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#ZeroPivot
>> !
>> >> [0]PETSC ERROR: Zero pivot row 0 value 3.57045e-20 tolerance 1e-12!
>> >> [0]PETSC ERROR:
>> ------------------------------------------------------------------------
>> >> ...
>> >> [0]PETSC ERROR:
>> ------------------------------------------------------------------------
>> >> [0]PETSC ERROR: MatLUFactorNumeric_SeqAIJ() line 574 in
>> src/mat/impls/aij/seq/aijfact.c
>> >> [0]PETSC ERROR: MatLUFactorNumeric() line 2587 in
>> src/mat/interface/matrix.c
>> >> [0]PETSC ERROR: PCSetUp_LU() line 158 in
>> src/ksp/pc/impls/factor/lu/lu.c
>> >> [0]PETSC ERROR: PCSetUp() line 795 in src/ksp/pc/interface/precon.c
>> >> [0]PETSC ERROR: KSPSetUp() line 237 in src/ksp/ksp/interface/itfunc.c
>> >> [0]PETSC ERROR: PCSetUp_MG() line 602 in src/ksp/pc/impls/mg/mg.c
>> >> [0]PETSC ERROR: PCSetUp_ML() line 668 in src/ksp/pc/impls/ml/ml.c
>> >> [0]PETSC ERROR: PCSetUp() line 795 in src/ksp/pc/interface/precon.c
>> >> [0]PETSC ERROR: KSPSetUp() line 237 in src/ksp/ksp/interface/itfunc.c
>> >> [0]PETSC ERROR: KSPSolve() line 353 in src/ksp/ksp/interface/itfunc.c
>> >>
>> >> Regards
>> >> Stijn
>> >>
>> >> Stijn A.M. Vantieghem
>> >> Earth and Planetary Magnetism
>> >> Institute for Geophysics
>> >> ETH Zürich
>> >> Sonneggstrasse 5 - CH 8092 Zürich
>> >> tel: +41 44 632 39 90
>> >> e-mail: stijn.vantieghem at erdw.ethz.ch
>> >>
>> >>
>> >>
>> >>
>> >
>>
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
>
> --
> Ir. B. Sanderse
>
> Centrum Wiskunde en Informatica
> Science Park 123
> 1098 XG Amsterdam
>
> t: +31 20 592 4161
> e: sanderse at cwi.nl
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener