<div dir="ltr"><div class="gmail_quote"><div dir="ltr">On Thu, Jan 3, 2019 at 7:36 AM Yingjie Wu via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov">petsc-users@mcs.anl.gov</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div>Thanks for your reply.</div><div>I read the article you provided. This is my first contact with the quasi-Newton method.</div><div>I have some problem:</div><div>1. From the point of view of algorithm, the quasi-Newton method does not need to be solved by linear equations, so why would KSP be used?</div><div></div></div></blockquote><div><br></div><div>It is solving the equations, but we know the analytical answer. KSP is our abstraction for linear equation solver,</div><div>so we also use it in this case.</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div>2. Where and how to use preconditioner in quasi-Newton method? </div></div></blockquote><div><br></div><div>You do not need a PC here. Note that this is only going to work well if you have a good initial</div><div>guess for the inverse of your Jacobian. The optimization people who invented have that (it is</div><div>the identity). In Jed's paper, they use a V-cycle as the good initial guess.</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div>Thanks,</div><div>Yingjie<br></div></div><br><div class="gmail_quote"><div dir="ltr">Jed Brown <<a href="mailto:jed@jedbrown.org" target="_blank">jed@jedbrown.org</a>> 于2018年12月27日周四 上午10:11写道:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Yingjie Wu via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>> writes:<br>

  Thanks,

     Matt

> Thanks,
> Yingjie

Jed Brown <jed@jedbrown.org> wrote on Thu, Dec 27, 2018 at 10:11 AM:

Yingjie Wu via petsc-users <petsc-users@mcs.anl.gov> writes:
> In my opinion, the difficulty in constructing my Jacobian matrix is the
> complex coefficients (e.g., thermal conductivity λ, density ρ).
> For example, in the temperature equation (T):
> ∇·(λ∇T) − ∇·(ρ Cp u) + Q = 0
> λ is the thermal conductivity, ρ is the density, Cp is the specific heat,
> u is the velocity, and Q is the source.
> λ = 1.9*(1.0e-3)*pow(T+273.0-150.0, 1.29), a function of T
> ρ = (0.4814*P/1.0e3/(T+273.15)) / (1.0 + 0.446*(1.0e-2)*P/1.0e3/pow(T+273.15, 1.2)),
> a function of T and P
>
> In theory, the coefficients depend on the solution variables, so it is
> complicated to calculate the elements of the Jacobian.
> In my snes_mf_operator method, I treat λ and ρ as constants. At every
> nonlinear step, I use the solution to update λ and ρ, and thus update the
> preconditioning matrix. At each residual function call (in
> SNESFormFunction), I also update the coefficients to ensure the
> correctness of the residual function.

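[A minimal, serial 1-D sketch of such a residual, for reference. This is not
the poster's code: the grid, source, and boundary data are invented, the
convection term is omitted, and only the conduction part is discretized.]

    #include <petscsnes.h>

    /* Thermal conductivity law quoted above (T presumably in degrees C) */
    PetscReal Lambda(PetscReal T)
    {
      return 1.9 * 1.0e-3 * PetscPowReal(T + 273.0 - 150.0, 1.29);
    }

    /* Density law quoted above; it would enter the (omitted) convection
       term the same way Lambda enters conduction */
    PetscReal Rho(PetscReal T, PetscReal P)
    {
      return (0.4814 * P / 1.0e3 / (T + 273.15)) /
             (1.0 + 0.446 * 1.0e-2 * P / 1.0e3 / PetscPowReal(T + 273.15, 1.2));
    }

    /* Residual for -d/dx(lambda(T) dT/dx) - Q = 0 on [0,1], Dirichlet ends.
       Registered with SNESSetFunction(snes, r, FormFunction, NULL). */
    PetscErrorCode FormFunction(SNES snes, Vec X, Vec F, void *ctx)
    {
      const PetscScalar *T;
      PetscScalar       *f;
      PetscInt           i, n;
      PetscReal          h, Q = 1.0e3, Tl = 300.0, Tr = 400.0; /* invented */
      PetscErrorCode     ierr;

      PetscFunctionBeginUser;
      ierr = VecGetSize(X, &n);CHKERRQ(ierr);
      h    = 1.0 / (n - 1);
      ierr = VecGetArrayRead(X, &T);CHKERRQ(ierr);
      ierr = VecGetArray(F, &f);CHKERRQ(ierr);
      f[0]   = T[0] - Tl;       /* Dirichlet boundary conditions */
      f[n-1] = T[n-1] - Tr;
      for (i = 1; i < n - 1; i++) {
        /* Face conductivities from the CURRENT iterate, so the residual is
           always consistent even though the preconditioning matrix lags */
        PetscReal lamW = 0.5 * (Lambda(PetscRealPart(T[i-1])) + Lambda(PetscRealPart(T[i])));
        PetscReal lamE = 0.5 * (Lambda(PetscRealPart(T[i])) + Lambda(PetscRealPart(T[i+1])));
        f[i] = -(lamE * (T[i+1] - T[i]) - lamW * (T[i] - T[i-1])) / (h * h) - Q;
      }
      ierr = VecRestoreArray(F, &f);CHKERRQ(ierr);
      ierr = VecRestoreArrayRead(X, &T);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }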
If those Jacobian entries are really that hard to get right, you can try
using quasi-Newton as an alternative to -snes_mf_operator; similar to

https://jedbrown.org/files/BrownBrune-LowRankQuasiNewtonRobustJacobianLagging-2013.pdf

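On the command line, a sketch of a starting point (these SNESQN options
exist, but good settings are problem-dependent):

    -snes_type qn -snes_qn_type lbfgs -snes_qn_m 10 \
      -snes_qn_scale_type jacobian -snes_monitor -snes_converged_reason
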
In general, I would start with an exact Jacobian (computed by coloring,
AD, or just in one specific material/configuration). Test Newton using
direct solvers to see your "best case" convergence (globalization may be
needed). Test fieldsplit using direct solvers on the blocks so you know
how much error is attributable to that approximation. Only then look at
backing off on the approximation of the Jacobian and the preconditioner.
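As a sketch of that progression in options form (the coloring option
assumes a DM is providing the mesh, and the fieldsplit block names depend
on how your fields are set up; treat these as placeholders):

    # Stage 1: exact Jacobian by coloring, direct solve -> best-case Newton
    -snes_fd_color -ksp_type preonly -pc_type lu \
      -snes_monitor -snes_converged_reason

    # Stage 2: fieldsplit with direct solves on each block
    -pc_type fieldsplit -fieldsplit_0_pc_type lu -fieldsplit_1_pc_type lu \
      -ksp_monitor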
</blockquote></div><br clear="all"><div><br></div>-- <br><div dir="ltr" class="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div></div>