[petsc-users] Problems about GMRES restart and Scaling

Alexander Lindsay alexlindsay239 at gmail.com
Mon Mar 25 15:32:03 CDT 2019


We also see this behavior quite frequently in MOOSE applications that have
physics that generate residuals of largely different scales. Like Matt said,
non-dimensionalization would help a lot. Without proper scaling for some of
these types of problems, even when the GMRES iteration converges the
non-linear step can be garbage for the reason that Barry stated; running
with -ksp_monitor_true_residual for these troublesome cases would
illustrate that we weren't converging the true residual. The failure of
matrix-free under these conditions was one of the biggest motivators for us
to add automatic differentiation to MOOSE. Sometimes biting the bullet and
working on the Jacobian can be worth it.
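Running with -ksp_monitor_true_residual exposes this gap because, with left preconditioning, GMRES by default monitors the preconditioned residual norm, not the true one. A small numpy sketch (hand-picked, hypothetical scales) of how the two norms can disagree by many orders of magnitude on a badly scaled system:

```python
import numpy as np

# Badly scaled 2x2 system: two "fields" whose equations differ by
# eight orders of magnitude (hypothetical illustrative numbers).
A = np.array([[1.0, 0.0],
              [0.0, 1.0e8]])
b = np.array([1.0, 1.0])            # exact solution is [1, 1e-8]

# Left Jacobi preconditioner: GMRES then monitors ||inv(M)(b - A x)||.
Minv = np.diag(1.0 / np.diag(A))

# An approximate "solution" that is still wrong in the stiff field:
x = np.array([1.0, 0.0])

r_true = b - A @ x                  # what -ksp_monitor_true_residual shows
r_prec = Minv @ r_true              # what the default monitor shows

print(np.linalg.norm(r_prec))       # 1e-8: looks fully converged
print(np.linalg.norm(r_true))       # 1.0: the solution is still garbage
```

The same effect appears whenever the preconditioner (here M = diag(A)) has entries much larger than one in the directions where the error lives.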

On Thu, Mar 21, 2019 at 9:02 AM Yingjie Wu via petsc-users <
petsc-users at mcs.anl.gov> wrote:

> Thanks for all the replies.
> The model I simulated is a thermal model that contains multiple physical
> fields (e.g., temperature, pressure, velocity). In the PDEs, these variables
> are preceded by physical parameters, which are in turn functions of these
> variables (e.g., density is a function of pressure and temperature). Because
> of the complexity of these physical-parameter functions, we cannot construct
> the Jacobian matrix analytically for this problem, so I use -snes_mf_operator.
>
> My preconditioner treats these physical parameters as constants. At the
> beginning of each nonlinear (SNES) step, the Jacobian matrix is rebuilt
> from the output of the previous nonlinear step (i.e., the physical
> parameters are updated).
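Rebuilding the operator with coefficients frozen at the previous iterate is essentially a Picard linearization. A minimal numpy sketch (toy 2x2 problem with a hypothetical coefficient law) showing that such a lagged-parameter iteration still converges, just linearly rather than quadratically:

```python
import numpy as np

# Hypothetical model problem: A(u) u = b, where the "material coefficients"
# in A depend on the solution (like density depending on T and p).
def A_of(u):
    return np.diag(1.0 + 0.1 * u**2)

b = np.array([1.0, 2.0])

# Picard-style iteration: freeze the coefficients at the previous iterate,
# exactly as the preconditioning Jacobian above is rebuilt each nonlinear
# step with lagged physical parameters.
u = np.zeros(2)
for _ in range(50):
    u = np.linalg.solve(A_of(u), b)

res = A_of(u) @ u - b
print(np.linalg.norm(res))   # converges (linearly, not quadratically)
```

The error contracts by a fixed factor per step here, which is why frozen-coefficient operators make serviceable preconditioners even when they are too inaccurate to drive Newton's quadratic convergence.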
>
>
> After setting a larger KSP restart (about 60), the KSP can converge
> (ksp_rtol = 1.e-5).
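Why a larger restart can be decisive is easy to reproduce on a textbook worst case: on the cyclic shift matrix, GMRES makes no progress at all until the Krylov space reaches the full dimension n, so any restart below n stagnates forever while a restart of n solves the system exactly. A small numpy sketch (it minimizes the residual over the Krylov space directly instead of calling a GMRES implementation):

```python
import numpy as np

n = 40
# Cyclic shift matrix: A e_i = e_{i+1 mod n}. GMRES on A x = e_1 is the
# classic worst case: the residual does not drop at all until step n.
A = np.roll(np.eye(n), 1, axis=0)
b = np.zeros(n)
b[0] = 1.0

def best_residual(m):
    """Smallest ||b - A z|| over z in the Krylov space K_m(A, b),
    i.e., the residual after one GMRES cycle with restart m."""
    K = np.column_stack([np.linalg.matrix_power(A, j) @ b for j in range(m)])
    y, *_ = np.linalg.lstsq(A @ K, b, rcond=None)
    return np.linalg.norm(b - A @ (K @ y))

print(best_residual(30))   # ~1.0: a restart-30 cycle makes no progress
print(best_residual(40))   # ~0.0: a large enough restart solves it
```

Real problems are rarely this extreme, but a badly conditioned or badly scaled operator pushes restarted GMRES toward the same stagnation.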
>
>
> I have a feeling that my initial values are too large, and that this is
> what causes the phenomenon.
>
>
> snes/ex19 is actually a lot like my problem. Running it with -da_grid_x 200
> -da_grid_y 200 -snes_mf also shows a residual rise, at KSP step 1290.
>
> But not all examples produce this phenomenon.
>
>
> Thanks, Yingjie
>
> Smith, Barry F. <bsmith at mcs.anl.gov> wrote on Thursday, March 21, 2019 at 1:18 AM:
>
>>
>>
>> > On Mar 20, 2019, at 5:52 AM, Yingjie Wu via petsc-users <
>> petsc-users at mcs.anl.gov> wrote:
>> >
>> > Dear PETSc developers:
>> > Hi,
>> > Recently, I used PETSc to solve nonlinear PDEs for a thermodynamics
>> problem. In the process of solving it, I observed the following two
>> phenomena, and I hope to get some help and suggestions.
>> >
>> > 1. Because my problem involves many physical parameters, evaluating the
>> residual requires calling a series of functions, and the Jacobian matrix
>> cannot be constructed analytically, so I use -snes_mf_operator and supply
>> an approximate Jacobian matrix as the preconditioner. Because of the large
>> dimension of the problem and the difference in magnitude of the physical
>> variables involved, the linear-step residual increases at each restart
>> (by default, every 30th linear step). This problem can be avoided by
>> setting a larger restart length. Could you explain the reason for this
>> phenomenon? What background or articles should I study to understand it?
>>
>>    I've seen this behavior. I think in your case it is likely that
>> -snes_mf_operator is not producing an "accurate enough" Jacobian-vector
>> product (and the "solution" being generated by GMRES may be garbage). Run
>> with -ksp_monitor_true_residual.
>>
>>    If your residual function has if() statements in it or other very
>> sharp changes (discontinuities), then it may not even have a true Jacobian
>> at the locations where it is being evaluated, in the sense that the
>> "Jacobian" you are applying via finite differences is not a linear
>> operator, and hence GMRES will fail on it.
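Barry's point can be seen directly: a forward-difference Jacobian-vector product of a residual with an if() branch, evaluated at (or near) the branch point, is not even a linear operator. A tiny numpy sketch with a hypothetical regime-switch residual:

```python
import numpy as np

def residual(u):
    # Toy residual with a sharp if() branch (e.g., a flow-regime switch).
    if u[0] > 1.0:
        return np.array([10.0 * u[0], u[1]])
    return np.array([u[0], u[1]])

def Jv(u, v, h=1e-8):
    """Matrix-free J(u) v as -snes_mf computes it (forward difference)."""
    return (residual(u + h * v) - residual(u)) / h

u = np.array([1.0, 1.0])   # sitting exactly on the discontinuity
v = np.array([1.0, 0.0])

print(Jv(u, v))    # steps across the branch: a huge spurious entry (~9e8)
print(Jv(u, -v))   # stays on this side: looks like [-1, 0]
# A true linear operator would satisfy J(-v) == -J(v); here it does not,
# so GMRES is being fed products from something that is not a matrix.
```

Smoothing the switch (e.g., a regularized transition over a small interval) restores a well-defined Jacobian and usually fixes this failure mode.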
>>
>>     What are you using for a preconditioner? And roughly how many KSP
>> iterations are being used?
>>
>>    Barry
>>
>> >
>> >
>> > 2. In my problem model there are many physical fields (the variables are
>> discretized by the finite difference method), and the magnitudes of the
>> variables vary greatly. Is there any scaling interface or function in PETSc?
>> >
>> > Thanks,
>> > Yingjie
>>
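On the scaling question in the quoted message: PETSc offers simple diagonal scaling of the linear system (e.g., the -ksp_diagonal_scale option; check the current manual pages for details), but non-dimensionalizing the equations themselves, as suggested above, is usually more robust. The payoff of even a simple symmetric Jacobi equilibration can be seen on a hypothetical badly scaled 2x2 Jacobian:

```python
import numpy as np

# Hypothetical two-field Jacobian mixing scales (e.g., a pressure equation
# at ~1e8 against a temperature equation at ~1e-2); numbers are hand-picked.
A = np.array([[1.0e8, 2.0e3],
              [3.0e3, 4.0e-2]])

# Symmetric Jacobi equilibration: scale row i and column i by
# 1/sqrt(|A[i, i]|), making the scaled diagonal all ones.
d = 1.0 / np.sqrt(np.abs(np.diag(A)))
As = np.diag(d) @ A @ np.diag(d)

print(np.linalg.cond(A))    # ~5e9: hopeless for iterative solvers
print(np.linalg.cond(As))   # ~10: entirely benign
```

Diagonal scaling is equivalent to choosing better units for each field, which is why non-dimensionalization achieves the same effect at the level of the continuous equations.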
>>