[petsc-users] Problems about Picard and NonlinearGS

Jed Brown jed at jedbrown.org
Thu Dec 27 09:10:56 CST 2018


Yingjie Wu via petsc-users <petsc-users at mcs.anl.gov> writes:

> In my opinion, the difficulty in constructing my Jacobian matrix is the
> complex coefficients (e.g., thermal conductivity λ, density ρ).
> For example, in the temperature equation (T):
>                 ∇·(λ∇T) - ∇·(ρ Cp u) + Q = 0
> λ is thermal conductivity, ρ is density, Cp is specific heat, u is
> velocity, Q is the source term.
> λ = 1.9*(1.0e-3)*pow(T+273.0-150.0,1.29)            function of T
> ρ = (0.4814*P/1.0e3/(T+273.15))/(1.0+0.446*(1.0e-2)*P/1.0e3/(pow(T+273.15,1.2)))
>                                                     function of T and P
>
> In theory, the coefficients depend on the solution variables, so it is
> complicated to calculate the elements of the Jacobian.
> In my snes_mf_operator method, I treat λ and ρ as constant. In every nonlinear
> step, I use the current solution to update λ and ρ, and thus update the
> preconditioning matrix. At each residual function call (in
> SNESFormFunction), I also update the coefficients to ensure the correctness
> of the residual function.
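
For reference, a residual callback of the kind described above, one that
reevaluates the coefficients from the current iterate on every call, might be
sketched as follows. The variable names, the pointwise loop, and the omitted
stencil assembly are placeholders rather than anything from your actual code.

  #include <petscsnes.h>

  /* Sketch: recompute solution-dependent coefficients (here only lambda(T))
     from the current iterate X inside the residual evaluation. */
  PetscErrorCode FormFunction(SNES snes,Vec X,Vec F,void *ctx)
  {
    PetscErrorCode    ierr;
    const PetscScalar *x;
    PetscScalar       *f;
    PetscInt          i,n;

    PetscFunctionBeginUser;
    ierr = VecGetLocalSize(X,&n);CHKERRQ(ierr);
    ierr = VecGetArrayRead(X,&x);CHKERRQ(ierr);
    ierr = VecGetArray(F,&f);CHKERRQ(ierr);
    for (i=0; i<n; i++) {
      PetscScalar T      = x[i];   /* current iterate, not a lagged copy */
      PetscScalar lambda = 1.9e-3*PetscPowScalar(T+273.0-150.0,1.29);
      /* a real implementation would assemble the stencil here,
         using lambda, rho(T,P), Cp, u, and Q for this node */
      f[i] = 0.0*lambda;           /* placeholder for the actual residual */
    }
    ierr = VecRestoreArrayRead(X,&x);CHKERRQ(ierr);
    ierr = VecRestoreArray(F,&f);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }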

If those Jacobian entries are really that hard to get right, you can try
using quasi-Newton as an alternative to -snes_mf_operator; similar to

  https://jedbrown.org/files/BrownBrune-LowRankQuasiNewtonRobustJacobianLagging-2013.pdf
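
A quick way to experiment with quasi-Newton from the command line, assuming
your residual function is already in place, is something like the following
(option names as in recent PETSc releases; check -help on your version):

  -snes_type qn -snes_qn_type lbfgs -snes_qn_m 10 \
    -snes_qn_scale_type jacobian -snes_monitor -snes_converged_reason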


In general, I would start with an exact Jacobian (computed by coloring,
AD, or just in one specific material/configuration).  Test Newton using
direct solvers to see your "best case" convergence (globalization may be
needed).  Test fieldsplit using direct solvers on the blocks so you know
how much error is attributable to that approximation.  Only then look at
backing off on the approximation of the Jacobian and the preconditioner.
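
In option form, those stages could look roughly like this (the use of
-snes_fd_color assumes a DM that provides the sparsity pattern, and the
fieldsplit prefixes are placeholders for however your fields are named):

  # 1. Exact Jacobian via finite-difference coloring, Newton with a direct solve
  -snes_fd_color -ksp_type preonly -pc_type lu -snes_monitor -snes_converged_reason

  # 2. Fieldsplit with direct solves on each block, to isolate the splitting error
  -pc_type fieldsplit -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type lu
                      -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_type lu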

