[petsc-users] ILU preconditioner hangs with some zero elements on the diagonal

Barry Smith bsmith at mcs.anl.gov
Tue Oct 27 13:50:26 CDT 2015


> On Oct 27, 2015, at 12:40 PM, Hong <hzhang at mcs.anl.gov> wrote:
> 
> Here is the reason why it does not converge:
> ./ex10 -f0 $D/mat.bin -rhs 0 -ksp_monitor_true_residual -ksp_view -ksp_converged_reason
> Linear solve did not converge due to DIVERGED_NANORINF iterations 0

   This means it found a zero pivot either in the factorization or in the first attempt to do a triangular solve.  You can try 

-pc_factor_nonzeros_along_diagonal

and/or

-pc_factor_shift_type nonzero 

to generate a usable LU factorization, but these are ad hoc fixes.
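
If you want to set these from code rather than from the command line, the programmatic equivalents are roughly as follows (a rough sketch; the helper name is made up, and it assumes the KSP has already been created and given the matrix with KSPSetOperators()):

    #include <petscksp.h>

    PetscErrorCode ApplyZeroPivotFixes(KSP ksp)
    {
      PC             pc;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
      ierr = PCSetType(pc,PCILU);CHKERRQ(ierr);
      /* equivalent of -pc_factor_shift_type nonzero:
         shift zero/tiny pivots away from zero */
      ierr = PCFactorSetShiftType(pc,MAT_SHIFT_NONZERO);CHKERRQ(ierr);
      /* equivalent of -pc_factor_nonzeros_along_diagonal:
         permute rows/columns so the diagonal has nonzeros before factoring */
      ierr = PCFactorReorderForNonzeroDiagonal(pc,1.e-10);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }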



  Barry

> 
> KSP Object: 1 MPI processes
>   type: gmres
>     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
>     GMRES: happy breakdown tolerance 1e-30
>   maximum iterations=10000, initial guess is zero
>   tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
>   left preconditioning
>   using PRECONDITIONED norm type for convergence test
> PC Object: 1 MPI processes
>   type: ilu
>     ILU: out-of-place factorization
> ...
> 
> Hong
> 
> On Tue, Oct 27, 2015 at 12:36 PM, Hong <hzhang at mcs.anl.gov> wrote:
> Matt:
> On Tue, Oct 27, 2015 at 11:13 AM, Hong <hzhang at mcs.anl.gov> wrote:
> Gary:
> I tested your mat.bin using
> petsc/src/ksp/ksp/examples/tutorials/ex10.c
> ./ex10 -f0 $D/mat.bin -rhs 0 -ksp_monitor_true_residual -ksp_view
> ...
>   Mat Object:   1 MPI processes
>     type: seqaij
>     rows=588, cols=588
>     total: nonzeros=11274, allocated nonzeros=11274
>     total number of mallocs used during MatSetValues calls =0
>       using I-node routines: found 291 nodes, limit used is 5
> Number of iterations =   0
> Residual norm 24.2487
> 
> It does not converge, but it does not hang either.
>  
> This is the default GMRES/ILU.
> Hong 
>  
> As you said, the matrix is non-singular; LU gives a solution:
> ./ex10 -f0 $D/mat.bin -rhs 0 -ksp_monitor_true_residual -pc_type lu
>   0 KSP preconditioned resid norm 3.298891225772e+03 true resid norm 2.424871130596e+01 ||r(i)||/||b|| 1.000000000000e+00
>   1 KSP preconditioned resid norm 1.918157196467e-12 true resid norm 5.039404549028e-13 ||r(i)||/||b|| 2.078215409241e-14
> Number of iterations =   1
>   Residual norm < 1.e-12
> 
> Is this the same matrix as the one you mentioned?
> 
> Hong, could you run ILU on it as well?
> 
>   Thanks,
> 
>     Matt
>  
> Hong
> 
> 
>  
> 
> On Tue, Oct 27, 2015 at 9:10 AM, Matthew Knepley <knepley at gmail.com> wrote:
> 
> On Tue, Oct 27, 2015 at 9:06 AM, Gary Rebt <gary.rebt at gmx.ch> wrote:
> 
> Dear petsc-users,
>  
> While using the FEniCS package to solve a simple Stokes flow problem, I have run into problems with PETSc preconditioners. In particular, I would like to use ILU (the serial version) along with GMRES to solve my linear system, but the solver just hangs indefinitely in MatLUFactorNumeric_SeqAIJ_Inode without outputting anything. CPU usage is at 100%, but even for a tiny system (59x59 for the minimal test case), the solver does not manage to get through it after 30 minutes.
>  
> PETSc version is 3.6 and the matrix for the minimal test case is as follows :
> http://pastebin.com/t3fvdkaS
>  
> Hanging is a bug. We will check it out.
>  
> I do not have any way to read in this ASCII. Can you output a binary version using
>  
>   -mat_view binary:mat.bin
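> 
> If it is easier, here is a minimal sketch (untested) of writing the matrix to PETSc binary format from code, assuming your assembled Mat is A:
> 
>   PetscErrorCode ierr;
>   PetscViewer    viewer;
>   /* open a binary viewer on mat.bin and dump the matrix into it */
>   ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"mat.bin",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
>   ierr = MatView(A,viewer);CHKERRQ(ierr);
>   ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);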
>  
>   Thanks,
>  
>      Matt
>  
> 
> It contains zero diagonal entries and has a condition number of around 1e3, but it is definitely non-singular. Direct solvers manage to solve the system, as does GMRES without a preconditioner (although only after many iterations for a 59x59 system...).
>  
> This will never work. Direct solvers work because they pivot away the zeros, but ILU is defined by having no pivoting.
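> 
> For example, take the 2x2 matrix [0 1; 1 0]: LU with partial pivoting simply swaps the two rows and succeeds, but ILU(0) must divide by the (1,1) entry, which is zero, so the factorization fails at the very first pivot.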
>  
>   Thanks,
>  
>      Matt
>  
> 
> Playing with the available options at http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCILU.html did not seem to solve the issue (even after activating diagonal_fill and/or nonzeros_along_diagonal), although sometimes error 71 is returned, which stands for "zero pivot detected". Are there other options that I have not considered? The default ILU factorization in MATLAB returns satisfactory results without errors, so surely it must be possible with PETSc?
>  
> As for the choice of ILU, I agree it might be suboptimal in this setting, but I do need it for benchmarking purposes.
>  
> Best regards,
>  
> Gary 
> 
> 
> 
> 
> -- 
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
> 
> 


