<div class="gmail_extra">On Fri, Apr 27, 2012 at 2:12 AM, Jed Brown <span dir="ltr"><<a href="mailto:jedbrown@mcs.anl.gov" target="_blank">jedbrown@mcs.anl.gov</a>></span> wrote:<br><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<div class="im"><div class="gmail_extra"><div class="gmail_quote">On Fri, Apr 27, 2012 at 01:00, Dominik Szczerba <span dir="ltr"><<a href="mailto:dominik@itis.ethz.ch" target="_blank">dominik@itis.ethz.ch</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
>> I have properly converging, non-trivial cases with the Newton solver
>> using an analytically derived Jacobian. I want to reproduce these
>> results WITHOUT using my Jacobian at all, via the secant method (i.e.,
>> F'(x) approximated with finite differences). You say my Jacobian is
>> wrong, and maybe you are right and maybe not. But it does not matter:
>> I do not want to use my Jacobian anyway, hoping for PETSc's secant
>> method to approximate it. Does PETSc have a secant method, and does
>> the -snes_mf_operator option enable it? Or does it only work as a
>> preconditioner, requiring the Jacobian in any case?
>
> -snes_mf_operator is NOT the secant method. The secant method is
> strictly one-dimensional; its generalizations to multiple dimensions
> are usually called quasi-Newton methods, of which BFGS is perhaps the
> most popular. Those are available as -snes_type qn; most of that work
> is new in petsc-dev.
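>
> A minimal quasi-Newton run might look like this (a sketch only; ./app
> stands in for your executable, and the qn implementation is still
> settling in petsc-dev, so option details may differ):
>
>   ./app -snes_type qn -snes_monitor -snes_converged_reason
>
> No user Jacobian is needed there; the method builds its own
> secant-style approximation from successive iterates.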
<div class="gmail_extra"><br></div><div class="gmail_extra">-snes_mf_operator is Newton, but with the *operator* applied using matrix-free finite differences. It uses whatever you provide as the preconditioning matrix.</div>
<div class="gmail_extra"><br></div><div class="gmail_extra">As for what is going wrong, some of the symptoms you report often occur when the original system is poorly scaled. What units are you using for state variables and residuals? This has lots of useful tips</div>

I disagree here. I think what is wrong is this:

You are making a mistake in providing the preconditioner matrix. There
is no other way to explain the GMRES performance.

Try using -snes_mf and it will converge, just slowly. This is simply a
matter of correcting an error in how you provide that matrix, I think.
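
As a quick check (a sketch; ./app again stands in for your executable):

  ./app -snes_mf -snes_monitor -ksp_monitor_true_residual -snes_converged_reason

If this converges while -snes_mf_operator with your matrix does not, the
assembly of that matrix is the place to look.

   Matt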
<div class="gmail_extra"><a href="http://scicomp.stackexchange.com/questions/30/why-is-newtons-method-not-converging" target="_blank">http://scicomp.stackexchange.com/questions/30/why-is-newtons-method-not-converging</a></div>
<div class="gmail_extra"><br></div><div class="gmail_extra">See these entries relating to inaccurate finite differencing</div><div class="gmail_extra"><br></div><div class="gmail_extra"><div class="gmail_extra">* Run with -snes_type test -snes_test_display to see if the Jacobian you are using is wrong. Compare the output when you add -mat_fd_type ds to see if the result is sensitive to the choice of differencing parameter.</div>
<div class="gmail_extra">* Run with -snes_mf_operator -pc_type lu to see if the Jacobian you are using is wrong. If the problem is too large for a direct solve, try -snes_mf_operator -pc_type ksp -ksp_ksp_rtol 1e-12. Compare the output when you add -mat_mffd_type ds to see if the result is sensitive to choice of differencing parameter.</div>
<div class="gmail_extra"><div class="gmail_extra">* Run with quad precision (./configure --with-precision=__float128 --download-f2cblaslapack with PETSc 3.2 and later, needs version 4.6 or later GNU compilers)</div><div class="gmail_extra">
* Change the units (nondimensionalization), boundary condition scaling, or formulation so that the Jacobian is better conditioned.</div></div></div>
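>
> For instance (a sketch; ./app stands in for your application binary):
>
>   ./app -snes_type test -snes_test_display
>   ./app -snes_type test -snes_test_display -mat_fd_type ds
>   ./app -snes_mf_operator -pc_type lu -snes_monitor -ksp_monitor
>
> If the two test runs disagree noticeably, suspect the differencing
> parameter; if the lu run needs many linear iterations per Newton step,
> suspect the matrix you are providing.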

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
-- Norbert Wiener