<div dir="ltr">I did a quick test:<div>1) ilu with any shift type does not converge</div><div>2) gmres/4 bjacobi/ilu + shift_type nonzero does not converge</div><div>3) mpiexec -n 4 ./ex10 -f0 $D/mat.bin -rhs 0 -ksp_monitor -sub_pc_type lu -sub_pc_factor_shift_type nonzero</div><div>...</div><div><div>0 KSP Residual norm 3.284826093129e+03</div><div> 1 KSP Residual norm 2.802972716423e+03</div><div> 2 KSP Residual norm 2.039112137210e+03</div><div>...</div><div><div> 24 KSP Residual norm 2.666350543810e-02</div><div>Number of iterations = 24</div><div>Residual norm 0.0179698</div></div><div><br></div><div>Hong</div><div class="gmail_extra"><br><div class="gmail_quote">On Tue, Oct 27, 2015 at 1:50 PM, Barry Smith <span dir="ltr"><<a href="mailto:bsmith@mcs.anl.gov" target="_blank">bsmith@mcs.anl.gov</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex"><span class=""><br>
> On Oct 27, 2015, at 12:40 PM, Hong <hzhang@mcs.anl.gov> wrote:
>
> Here is the reason why it does not converge:
> ./ex10 -f0 $D/mat.bin -rhs 0 -ksp_monitor_true_residual -ksp_view -ksp_converged_reason
> Linear solve did not converge due to DIVERGED_NANORINF iterations 0

   This means it found a zero pivot either in the factorization or in the first attempt to do a triangular solve. You can try

-pc_factor_nonzeros_along_diagonal

and/or

-pc_factor_shift_type nonzero

to generate a usable LU factorization, but these are ad hoc fixes.
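
   For reference, the same ad hoc fixes can be set from code as well; a minimal sketch (a fragment assumed to run after PetscInitialize() with petscksp.h included, error checking omitted):

     KSP ksp;
     PC  pc;
     KSPCreate(PETSC_COMM_WORLD, &ksp);
     KSPGetPC(ksp, &pc);
     PCSetType(pc, PCILU);
     /* equivalent of -pc_factor_shift_type nonzero */
     PCFactorSetShiftType(pc, MAT_SHIFT_NONZERO);
     /* equivalent of -pc_factor_nonzeros_along_diagonal <tol>; the tolerance here is illustrative */
     PCFactorReorderForNonzeroDiagonal(pc, 1.e-10);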
<span class=""><font color="#888888"><br>
<br>
<br>
Barry<br>
</font></span><div class=""><div class="h5"><br>
>
> KSP Object: 1 MPI processes
>   type: gmres
>     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
>     GMRES: happy breakdown tolerance 1e-30
>   maximum iterations=10000, initial guess is zero
>   tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
>   left preconditioning
>   using PRECONDITIONED norm type for convergence test
> PC Object: 1 MPI processes
>   type: ilu
>     ILU: out-of-place factorization
> ...
>
> Hong
>
>
> On Tue, Oct 27, 2015 at 12:36 PM, Hong <hzhang@mcs.anl.gov> wrote:
> Matt:
> On Tue, Oct 27, 2015 at 11:13 AM, Hong <hzhang@mcs.anl.gov> wrote:
> Gary:
> I tested your mat.bin using
> petsc/src/ksp/ksp/examples/tutorials/ex10.c
> ./ex10 -f0 $D/mat.bin -rhs 0 -ksp_monitor_true_residual -ksp_view
> ...
> Mat Object: 1 MPI processes
>   type: seqaij
>   rows=588, cols=588
>   total: nonzeros=11274, allocated nonzeros=11274
>   total number of mallocs used during MatSetValues calls =0
>     using I-node routines: found 291 nodes, limit used is 5
> Number of iterations = 0
> Residual norm 24.2487
>
> It does not converge, nor does it hang.
>
> This is the default GMRES/ILU.
> Hong
>
> As you said, the matrix is non-singular; LU gives a solution:
> ./ex10 -f0 $D/mat.bin -rhs 0 -ksp_monitor_true_residual -pc_type lu
>   0 KSP preconditioned resid norm 3.298891225772e+03 true resid norm 2.424871130596e+01 ||r(i)||/||b|| 1.000000000000e+00
>   1 KSP preconditioned resid norm 1.918157196467e-12 true resid norm 5.039404549028e-13 ||r(i)||/||b|| 2.078215409241e-14
> Number of iterations = 1
> Residual norm < 1.e-12
>
> Is this the same matrix as you mentioned?
>
> Hong, could you run ILU on it as well?
>
> Thanks,
>
>    Matt
>
> Hong
>
> On Tue, Oct 27, 2015 at 9:10 AM, Matthew Knepley <knepley@gmail.com> wrote:
>
> On Tue, Oct 27, 2015 at 9:06 AM, Gary Rebt <gary.rebt@gmx.ch> wrote:
>
> Dear petsc-users,
>
> While using the FEniCS package to solve a simple Stokes flow problem, I have run into problems with PETSc preconditioners. In particular, I would like to use ILU (the serial version) along with GMRES to solve my linear system, but the solver just hangs indefinitely in MatLUFactorNumeric_SeqAIJ_Inode without producing any output. CPU usage stays at 100%, yet even for a tiny system (59x59 for the minimal test case) the solver does not get through the factorization after 30 minutes.
>
> PETSc version is 3.6, and the matrix for the minimal test case is here:
> http://pastebin.com/t3fvdkaS
>
> Hanging is a bug. We will check it out.
>
> I do not have any way to read in this ASCII file. Can you output a binary version with
>
> -mat_view binary:mat.bin
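>
> (A minimal sketch of reading such a binary file back in with MatLoad, which is what ex10 does internally; error checking omitted:)
>
>   Mat         A;
>   PetscViewer viewer;
>   PetscViewerBinaryOpen(PETSC_COMM_WORLD, "mat.bin", FILE_MODE_READ, &viewer);
>   MatCreate(PETSC_COMM_WORLD, &A);
>   MatLoad(A, viewer);
>   PetscViewerDestroy(&viewer);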
>
> Thanks,
>
>    Matt
>
>
> It contains zero diagonal entries and has a condition number of around 1e3, but it is definitely non-singular. Direct solvers manage to solve the system, as does GMRES without a preconditioner (although only after many iterations for a 59x59 system).
>
> This will never work. Direct solvers work because they pivot away the zeros, but ILU is defined by having no pivoting.
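>
> (To make the pivoting point concrete, a minimal illustration: the 2x2 matrix A = [0 1; 1 0] is symmetric and non-singular, yet ILU(0) breaks down at the very first elimination step because the pivot a_11 is exactly zero, while LU with partial pivoting simply swaps the two rows and factors A exactly.)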
>
> Thanks,
>
>    Matt
>
>
> Playing with the available options listed at http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCILU.html did not solve the issue (even after activating diagonal_fill and/or nonzeros_along_diagonal), although sometimes error code 71 is returned, which stands for "zero pivot detected". Are there other options that I have not considered? The default ILU factorization in MATLAB returns satisfactory results without errors, so surely it must be possible with PETSc?
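>
> (For reference, the options mentioned above in command-line form; a sketch of the usual combination, with an illustrative shift amount:
>   -pc_factor_diagonal_fill -pc_factor_nonzeros_along_diagonal
>   -pc_factor_shift_type nonzero -pc_factor_shift_amount 1.e-10)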
>
> As for the choice of ILU, I agree it might be suboptimal in this setting, but I do need it for benchmarking purposes.
>
> Best regards,
>
> Gary
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener