[petsc-users] KSP solver always has Zero pivot in LU

Matthew Knepley knepley at gmail.com
Tue Oct 21 17:04:12 CDT 2014


On Tue, Oct 21, 2014 at 3:10 PM, Sharp Stone <thronesf at gmail.com> wrote:

> Hi Barry,
>
> Thank you very much for your reply. Your suggestions were illuminating.
>
> I use LU and GMRES in the solver. When I use a rather small grid, the KSP
> solver works fine. However, when a larger domain is used (still small for
> the problem I'm trying to solve), I still get the error "Zero pivot in LU
> factorization" from PETSc. As you previously suggested, I solved the
> problem in Matlab and got the solution, though the matrix is close to
> singular. Do I have any options to solve such a system with PETSc? For
> completeness, I attached the matrix and rhs vector,
>

You can possibly use an iterative method with a specified null space. Do
you know what the null space is? Do you have an idea what preconditioner
would be effective for your operator, since black-box PCs tend not to work
on singular problems?
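
For example, a minimal sketch (assuming the null space is spanned by the
constant vector; if yours is different, pass explicit basis vectors
instead of PETSC_TRUE) that attaches the null space to the operator
before solving:

  MatNullSpace nullsp;
  /* declare that the operator A has a one-dimensional null space
     consisting of the constant vector */
  ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nullsp);CHKERRQ(ierr);
  ierr = MatSetNullSpace(A, nullsp);CHKERRQ(ierr);
  ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);
  /* then solve iteratively, e.g. -ksp_type gmres, rather than with LU */
  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);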

   Matt


> and look forward to hearing from you. Thank you in advance!
>
> On Mon, Oct 20, 2014 at 11:38 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
>>
>>   What exact options are you running with?  I tried ILU and LU and all
>> was fine.
>>
>> $ ./ex10 -f0 ~/Desktop/binaryoutput -rhs 0 -mat_view -pc_type lu -ksp_view
>> Mat Object: 1 MPI processes
>>   type: seqaij
>> row 0: (0, 1)  (1, 0)  (3, 0)
>> row 1: (0, 0)  (1, 1)  (2, 0)  (4, 0)
>> row 2: (1, 0)  (2, 1)  (5, 0)
>> row 3: (0, -0.219298)  (3, 1)  (4, -0.561404)  (6, -0.219298)
>> row 4: (1, -0.304878)  (3, -0.097561)  (4, 1)  (5, -0.292683)  (7,
>> -0.304878)
>> row 5: (2, 0)  (4, 0)  (5, 1)  (8, 0)
>> row 6: (3, 0)  (6, 1)  (7, 0)
>> row 7: (4, 0)  (6, 0)  (7, 1)  (8, 0)
>> row 8: (5, 0)  (7, 0)  (8, 1)
>> KSP Object: 1 MPI processes
>>   type: gmres
>>     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
>> Orthogonalization with no iterative refinement
>>     GMRES: happy breakdown tolerance 1e-30
>>   maximum iterations=10000, initial guess is zero
>>   tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
>>   left preconditioning
>>   using PRECONDITIONED norm type for convergence test
>> PC Object: 1 MPI processes
>>   type: lu
>>     LU: out-of-place factorization
>>     tolerance for zero pivot 2.22045e-14
>>     matrix ordering: nd
>>     factor fill ratio given 5, needed 1.54545
>>       Factored matrix follows:
>>         Mat Object:         1 MPI processes
>>           type: seqaij
>>           rows=9, cols=9
>>           package used to perform factorization: petsc
>>           total: nonzeros=51, allocated nonzeros=51
>>           total number of mallocs used during MatSetValues calls =0
>>             not using I-node routines
>>   linear system matrix = precond matrix:
>>   Mat Object:   1 MPI processes
>>     type: seqaij
>>     rows=9, cols=9
>>     total: nonzeros=33, allocated nonzeros=33
>>     total number of mallocs used during MatSetValues calls =0
>>       not using I-node routines
>> Number of iterations =   1
>>   Residual norm < 1.e-12
>> ~/Src/petsc/src/ksp/ksp/examples/tutorials (barry/fix-setters *)
>> $ ./ex10 -f0 ~/Desktop/binaryoutput -rhs 0 -mat_view -pc_type ilu
>> -ksp_view
>> Mat Object: 1 MPI processes
>>   type: seqaij
>> row 0: (0, 1)  (1, 0)  (3, 0)
>> row 1: (0, 0)  (1, 1)  (2, 0)  (4, 0)
>> row 2: (1, 0)  (2, 1)  (5, 0)
>> row 3: (0, -0.219298)  (3, 1)  (4, -0.561404)  (6, -0.219298)
>> row 4: (1, -0.304878)  (3, -0.097561)  (4, 1)  (5, -0.292683)  (7,
>> -0.304878)
>> row 5: (2, 0)  (4, 0)  (5, 1)  (8, 0)
>> row 6: (3, 0)  (6, 1)  (7, 0)
>> row 7: (4, 0)  (6, 0)  (7, 1)  (8, 0)
>> row 8: (5, 0)  (7, 0)  (8, 1)
>> KSP Object: 1 MPI processes
>>   type: gmres
>>     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
>> Orthogonalization with no iterative refinement
>>     GMRES: happy breakdown tolerance 1e-30
>>   maximum iterations=10000, initial guess is zero
>>   tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
>>   left preconditioning
>>   using PRECONDITIONED norm type for convergence test
>> PC Object: 1 MPI processes
>>   type: ilu
>>     ILU: out-of-place factorization
>>     0 levels of fill
>>     tolerance for zero pivot 2.22045e-14
>>     using diagonal shift on blocks to prevent zero pivot [INBLOCKS]
>>     matrix ordering: natural
>>     factor fill ratio given 1, needed 1
>>       Factored matrix follows:
>>         Mat Object:         1 MPI processes
>>           type: seqaij
>>           rows=9, cols=9
>>           package used to perform factorization: petsc
>>           total: nonzeros=33, allocated nonzeros=33
>>           total number of mallocs used during MatSetValues calls =0
>>             not using I-node routines
>>   linear system matrix = precond matrix:
>>   Mat Object:   1 MPI processes
>>     type: seqaij
>>     rows=9, cols=9
>>     total: nonzeros=33, allocated nonzeros=33
>>     total number of mallocs used during MatSetValues calls =0
>>       not using I-node routines
>> Number of iterations =   2
>>   Residual norm < 1.e-12
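>>
>>   If the factorization on your side still hits a zero pivot, one thing
>> to try (a suggestion, untested on your matrix) is shifting the diagonal
>> during factorization, e.g.
>>
>> $ ./ex10 -f0 ~/Desktop/binaryoutput -rhs 0 -pc_type lu
>> -pc_factor_shift_type NONZERO
>>
>> or reordering to avoid zero pivots with
>> -pc_factor_nonzeros_along_diagonal.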
>>
>>
>> > On Oct 20, 2014, at 10:02 PM, Sharp Stone <thronesf at gmail.com> wrote:
>> >
>> > Dear All,
>> >
>> > Last week I raised a question about KSP solver with zero pivot problem.
>> Now I downsized the matrix in my problem and still got the zero pivot
>> problem. I looked at the matrix and computed its eigenvalues in Matlab, and
>> the matrix seems fine (not singular, I mean), but PETSc still reported a
>> zero pivot in LU factorization. What could possibly cause this?
>> Thank you very much in advance!
>> >
>> > PS: I attached the matrix output in my simplest test.
>> >
>> > --
>> > Best regards,
>> >
>> > Feng
>> > <binaryoutput>
>>
>>
>
>
> --
> Best regards,
>
> Feng
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener