[petsc-users] KSP solver always has Zero pivot in LU

Barry Smith bsmith at mcs.anl.gov
Tue Oct 21 19:01:18 CDT 2014


  Your matrix has very differently scaled values on the diagonal:


     (18713,1)           -1.6000e+11
     (18714,1)            2.6550e+13
     (18715,1)            5.0000e+11
     (18716,1)            5.0001e+11
     (18717,1)            1.0000e+00
     (18718,1)            1.0000e+00
     (18719,1)            1.0000e+00
     (18720,1)            1.0000e+00
     (18721,1)           -2.4000e+11
     (18722,1)            4.9999e+11

  You do not want this. First, I would change the meaning of the variables so that all the diagonal entries are positive, and then I would change those values of 1.0 on the diagonal to roughly 1.0e12, so that they match the scale of the other diagonal entries. If you are using MatZeroRowsColumns() or MatZeroRows(), you can do this easily by passing the desired diagonal value. You also need to make sure that, as the matrix gets larger, the diagonal entries you put in scale the same way as the rest of the matrix. For example, if the matrix entries involve a 1/h^2 factor, then the diagonal entries you put in need to scale as 1/h^2 as well.
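
  As a rough sketch of what that can look like (not from the original thread; A, x, and b are assumed to be already assembled, bc_rows/nbc are the rows being eliminated, and h is the grid spacing -- all placeholder names):

    #include <petscmat.h>

    /* Eliminate rows/columns while keeping the diagonal consistently scaled.
       Instead of leaving 1.0 on the eliminated rows, pass a diagonal value on
       the same order as the rest of the operator (~1/h^2, i.e. ~1e12 here). */
    PetscErrorCode EliminateRows(Mat A, Vec x, Vec b, PetscInt nbc, const PetscInt bc_rows[], PetscReal h)
    {
      PetscErrorCode ierr;
      PetscScalar    diag = 1.0/(h*h);  /* stays on the right scale as the grid is refined */

      PetscFunctionBeginUser;
      ierr = MatZeroRowsColumns(A, nbc, bc_rows, diag, x, b);CHKERRQ(ierr);
      /* or MatZeroRows(A, nbc, bc_rows, diag, x, b) if only the rows should be zeroed */
      PetscFunctionReturn(0);
    }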

  Barry





> On Oct 21, 2014, at 3:10 PM, Sharp Stone <thronesf at gmail.com> wrote:
> 
> Hi Barry,
> 
> Thank you very much for your reply. They are very instructive. 
> 
> I use LU and GMRES in the solver. When I use a rather small grid, the KSP solver works fine. However, when a larger domain is used (still small for the problem I'm trying to solve), I still get the error "Zero pivot in LU factorization" from PETSc. As you previously suggested, I solved the problem in MATLAB and obtained a solution, though the matrix is close to singular. Do I have any options for solving such a system with PETSc? For completeness, I attached the matrix and rhs vector, and look forward to hearing from you. Thank you in advance!
> 
> On Mon, Oct 20, 2014 at 11:38 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> 
>   What exact options are you running with?  I tried ILU and LU and all was fine.
> 
> $ ./ex10 -f0 ~/Desktop/binaryoutput -rhs 0 -mat_view -pc_type lu -ksp_view
> Mat Object: 1 MPI processes
>   type: seqaij
> row 0: (0, 1)  (1, 0)  (3, 0)
> row 1: (0, 0)  (1, 1)  (2, 0)  (4, 0)
> row 2: (1, 0)  (2, 1)  (5, 0)
> row 3: (0, -0.219298)  (3, 1)  (4, -0.561404)  (6, -0.219298)
> row 4: (1, -0.304878)  (3, -0.097561)  (4, 1)  (5, -0.292683)  (7, -0.304878)
> row 5: (2, 0)  (4, 0)  (5, 1)  (8, 0)
> row 6: (3, 0)  (6, 1)  (7, 0)
> row 7: (4, 0)  (6, 0)  (7, 1)  (8, 0)
> row 8: (5, 0)  (7, 0)  (8, 1)
> KSP Object: 1 MPI processes
>   type: gmres
>     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
>     GMRES: happy breakdown tolerance 1e-30
>   maximum iterations=10000, initial guess is zero
>   tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
>   left preconditioning
>   using PRECONDITIONED norm type for convergence test
> PC Object: 1 MPI processes
>   type: lu
>     LU: out-of-place factorization
>     tolerance for zero pivot 2.22045e-14
>     matrix ordering: nd
>     factor fill ratio given 5, needed 1.54545
>       Factored matrix follows:
>         Mat Object:         1 MPI processes
>           type: seqaij
>           rows=9, cols=9
>           package used to perform factorization: petsc
>           total: nonzeros=51, allocated nonzeros=51
>           total number of mallocs used during MatSetValues calls =0
>             not using I-node routines
>   linear system matrix = precond matrix:
>   Mat Object:   1 MPI processes
>     type: seqaij
>     rows=9, cols=9
>     total: nonzeros=33, allocated nonzeros=33
>     total number of mallocs used during MatSetValues calls =0
>       not using I-node routines
> Number of iterations =   1
>   Residual norm < 1.e-12
> ~/Src/petsc/src/ksp/ksp/examples/tutorials (barry/fix-setters *)
> $ ./ex10 -f0 ~/Desktop/binaryoutput -rhs 0 -mat_view -pc_type ilu -ksp_view
> Mat Object: 1 MPI processes
>   type: seqaij
> row 0: (0, 1)  (1, 0)  (3, 0)
> row 1: (0, 0)  (1, 1)  (2, 0)  (4, 0)
> row 2: (1, 0)  (2, 1)  (5, 0)
> row 3: (0, -0.219298)  (3, 1)  (4, -0.561404)  (6, -0.219298)
> row 4: (1, -0.304878)  (3, -0.097561)  (4, 1)  (5, -0.292683)  (7, -0.304878)
> row 5: (2, 0)  (4, 0)  (5, 1)  (8, 0)
> row 6: (3, 0)  (6, 1)  (7, 0)
> row 7: (4, 0)  (6, 0)  (7, 1)  (8, 0)
> row 8: (5, 0)  (7, 0)  (8, 1)
> KSP Object: 1 MPI processes
>   type: gmres
>     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
>     GMRES: happy breakdown tolerance 1e-30
>   maximum iterations=10000, initial guess is zero
>   tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
>   left preconditioning
>   using PRECONDITIONED norm type for convergence test
> PC Object: 1 MPI processes
>   type: ilu
>     ILU: out-of-place factorization
>     0 levels of fill
>     tolerance for zero pivot 2.22045e-14
>     using diagonal shift on blocks to prevent zero pivot [INBLOCKS]
>     matrix ordering: natural
>     factor fill ratio given 1, needed 1
>       Factored matrix follows:
>         Mat Object:         1 MPI processes
>           type: seqaij
>           rows=9, cols=9
>           package used to perform factorization: petsc
>           total: nonzeros=33, allocated nonzeros=33
>           total number of mallocs used during MatSetValues calls =0
>             not using I-node routines
>   linear system matrix = precond matrix:
>   Mat Object:   1 MPI processes
>     type: seqaij
>     rows=9, cols=9
>     total: nonzeros=33, allocated nonzeros=33
>     total number of mallocs used during MatSetValues calls =0
>       not using I-node routines
> Number of iterations =   2
>   Residual norm < 1.e-12
> 
> 
> > On Oct 20, 2014, at 10:02 PM, Sharp Stone <thronesf at gmail.com> wrote:
> >
> > Dear All,
> >
> > Last week I raised a question about the KSP solver reporting a zero pivot. Now I have downsized the matrix in my problem and still get the zero pivot error. I looked at the matrix and computed its eigenvalues in MATLAB, and the matrix seems fine (not singular, I mean), but PETSc still reports a zero pivot in the LU factorization. What could possibly cause this? Thank you very much in advance!
> >
> > PS: I attached the matrix output in my simplest test.
> >
> > --
> > Best regards,
> >
> > Feng
> > <binaryoutput>
> 
> 
> 
> 
> -- 
> Best regards,
> 
> Feng
> <Matrix.dat><RHSvec.dat>
