<div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">On Tue, Oct 27, 2015 at 11:13 AM, Hong <span dir="ltr"><<a href="mailto:hzhang@mcs.anl.gov" target="_blank">hzhang@mcs.anl.gov</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">Gary:</div><div class="gmail_quote">I tested your mat.bin using</div><div class="gmail_quote">petsc/src/ksp/ksp/examples/tutorials/ex10.c</div><div class="gmail_quote">./ex10 -f0 $D/mat.bin -rhs 0 -ksp_monitor_true_residual -ksp_view</div><div class="gmail_quote">...</div><div class="gmail_quote"><div class="gmail_quote"> Mat Object: 1 MPI processes</div><div class="gmail_quote"> type: seqaij</div><div class="gmail_quote"> rows=588, cols=588</div><div class="gmail_quote"> total: nonzeros=11274, allocated nonzeros=11274</div><div class="gmail_quote"> total number of mallocs used during MatSetValues calls =0</div><div class="gmail_quote"> using I-node routines: found 291 nodes, limit used is 5</div><div class="gmail_quote">Number of iterations = 0</div><div class="gmail_quote">Residual norm 24.2487</div><div class="gmail_quote"><br></div><div class="gmail_quote">It does not converge, but it does not hang either.</div><div class="gmail_quote">As you said, the matrix is non-singular; LU gives a solution:</div><div class="gmail_quote"><div class="gmail_quote">./ex10 -f0 $D/mat.bin -rhs 0 -ksp_monitor_true_residual -pc_type lu</div><div class="gmail_quote"> 0 KSP preconditioned resid norm 3.298891225772e+03 true resid norm 2.424871130596e+01 ||r(i)||/||b|| 1.000000000000e+00</div><div class="gmail_quote"> 1 KSP preconditioned resid norm 1.918157196467e-12 true resid norm 5.039404549028e-13 ||r(i)||/||b|| 2.078215409241e-14</div><div class="gmail_quote">Number of iterations = 1</div><div class="gmail_quote"> Residual norm < 1.e-12</div><div class="gmail_quote"><br></div><div class="gmail_quote">Is this the same 
matrix as you mentioned?</div></div></div></div></div></blockquote><div><br></div><div>Hong, could you run ILU on it as well?</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote"><div class="gmail_quote"><span class="HOEnZb"><font color="#888888"><div class="gmail_quote">Hong</div><div class="gmail_quote"><br></div></font></span></div><div><div class="h5"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex"><br>
<span> <br>
<br>
On Tue, Oct 27, 2015 at 9:10 AM, Matthew Knepley <<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>> wrote:<br>
<br>
</span><span>On Tue, Oct 27, 2015 at 9:06 AM, Gary Rebt <<a href="mailto:gary.rebt@gmx.ch" target="_blank">gary.rebt@gmx.ch</a>> wrote:<br>
<br>
Dear petsc-users,<br>
<br>
While using the FEniCS package to solve a simple Stokes flow problem, I have run into problems with PETSc preconditioners. In particular, I would like to use ILU (the serial version) along with GMRES to solve my linear system, but the solver just hangs indefinitely in MatLUFactorNumeric_SeqAIJ_Inode without outputting anything. CPU usage is at 100%, but even for a tiny system (59x59 for the minimal test case), the solver does not manage to push through after 30 minutes.<br>
<br>
The PETSc version is 3.6, and the matrix for the minimal test case is as follows:<br>
</span><a href="http://pastebin.com/t3fvdkaS" rel="noreferrer" target="_blank">http://pastebin.com/t3fvdkaS</a><br>
<span> <br>
Hanging is a bug. We will check it out.<br>
<br>
I do not have any way to read in this ASCII format. Can you output a binary version with<br>
<br>
-mat_view binary:mat.bin<br>
<br>
Thanks,<br>
<br>
Matt<br>
<br>
<br>
It contains zero diagonal entries and has a condition number of around 1e3, but it is definitely non-singular. Direct solvers manage to solve the system, as does GMRES without a preconditioner (although the latter takes many iterations for a 59x59 system...).<br>
<br>
This will never work. Direct solvers work because they pivot away the zeros, but ILU is defined by having no pivoting.<br>
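To make the point above concrete, here is a tiny stand-alone sketch (plain Python, not PETSc code; the function name is illustrative) of why a factorization with no pivoting, such as ILU(0), breaks down on a matrix whose diagonal contains a structural zero, while a pivoting direct solver succeeds on the very same system:

```python
def lu_no_pivot(a):
    """In-place LU factorization WITHOUT pivoting, as in ILU(0).
    Returns the factored matrix, or None when a zero pivot is hit."""
    n = len(a)
    for k in range(n):
        if a[k][k] == 0.0:
            return None  # zero pivot: factorization breaks down
        for i in range(k + 1, n):
            a[i][k] /= a[k][k]                 # multiplier L[i][k]
            for j in range(k + 1, n):
                a[i][j] -= a[i][k] * a[k][j]   # Schur-complement update
    return a

# Saddle-point-style pattern with a zero diagonal entry, as in Stokes systems:
A = [[0.0, 1.0], [1.0, 0.0]]
print(lu_no_pivot([row[:] for row in A]))  # None: zero pivot at step 0

# A direct solver effectively swaps the rows first (pivoting) and succeeds:
A_pivoted = [A[1][:], A[0][:]]
print(lu_no_pivot(A_pivoted))  # [[1.0, 0.0], [0.0, 1.0]]
```

The same zero pivot is what the hang/error in MatLUFactorNumeric_SeqAIJ_Inode traces back to: ILU keeps the original row order, so a structurally zero diagonal entry becomes a zero pivot.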
<br>
Thanks,<br>
<br>
Matt<br>
<br>
<br>
</span>Playing with the available options at <a href="http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCILU.html" rel="noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCILU.html</a> did not seem to solve the issue (even after activating diagonal_fill and/or nonzeros_along_diagonal), although sometimes error 71 is returned, which stands for zero pivot detected. Are there other options that I have not considered? The default ILU factorization in MATLAB returns satisfactory results without errors, so surely it must be possible with PETSc?<br>
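For reference, the zero-pivot workarounds mentioned above are runtime options; a hedged sketch of how they would be passed to the ex10 test used earlier in the thread (option names as documented on the PCILU manual page; the ex10 binary and mat.bin file are the ones from Hong's run):

```shell
# Allow fill in structurally empty diagonal locations, and reorder rows/columns
# to move nonzeros onto the diagonal before factoring (sketch, PETSc 3.6):
./ex10 -f0 mat.bin -rhs 0 -ksp_type gmres -pc_type ilu \
    -pc_factor_diagonal_fill \
    -pc_factor_nonzeros_along_diagonal \
    -ksp_monitor_true_residual

# Alternatively, shift the factorization to avoid zero pivots:
./ex10 -f0 mat.bin -rhs 0 -ksp_type gmres -pc_type ilu \
    -pc_factor_shift_type NONZERO \
    -ksp_monitor_true_residual
```

These change the factored operator, so a preconditioner produced this way may be weak even when the factorization completes.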
<div><div> <br>
As for the choice of ILU, I agree it might be suboptimal in this setting, but I do need it for benchmarking purposes.<br>
<br>
Best regards,<br>
<br>
Gary <br>
--<br>
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>
-- Norbert Wiener <br>
</div></div></blockquote></div></div></div><br></div></div>
</blockquote></div><br><br clear="all"><div><br></div>-- <br><div class="gmail_signature">What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div>
</div></div>