[petsc-users] Matrix-free vs finite differenced Jacobian approximation
Smith, Barry F.
bsmith at mcs.anl.gov
Tue Dec 12 13:33:33 CST 2017
> On Dec 12, 2017, at 11:26 AM, Alexander Lindsay <alexlindsay239 at gmail.com> wrote:
>
> Ok, I'm going to go back on my original statement... the physics being run here is a subset of a much larger set of physics; for the current set, the hand-coded Jacobian actually appears to be quite good.
>
> With hand-coded Jacobian, -pc_type lu, the convergence is perfect:
>
> 0 Nonlinear |R| = 2.259203e-02
> 0 Linear |R| = 2.259203e-02
> 1 Linear |R| = 1.129089e-10
> 1 Nonlinear |R| = 6.295583e-11
>
> So yeah, I guess at this point I'm just curious about the different behavior between `-snes_fd` and `-snes_fd -snes_mf_operator`.
Now that you have provided the exact options you are using, yes, this is very unexpected behavior. Is there any chance you can send us the code that reproduces this?
The code that does the differencing for -snes_fd is similar to the code that does the differencing for -snes_mf_operator, so normally one expects similar behavior, but there are a couple of options you can try. Run with -snes_mf_operator and -help | grep mat_mffd; this will show the options that control the differencing for the matrix-free operator. For -snes_fd you have the option -mat_fd_type wp or ds.
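For example, something like the following (the executable name ./app is just a placeholder for your application; the option names are the standard ones mentioned above):

    # explicit finite-difference Jacobian with Walker-Pernice differencing
    ./app -snes_fd -mat_fd_type wp -pc_type lu

    # matrix-free operator (explicit FD matrix still used for the LU preconditioner);
    # list the options that control the matrix-free differencing
    ./app -snes_fd -snes_mf_operator -pc_type lu -help | grep mat_mffd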
> Does the hand-coded result change your opinion, Matt, that the rules for FormFunction/Jacobian might be being violated?
>
> I understand that a finite difference approximation of the true Jacobian is only an approximation. However, in the absence of possible complications like the one Matt suggested, where an on-the-fly calculation might stand a better chance of capturing the behavior, I would expect both `-snes_mf_operator -snes_fd` and `-snes_fd` to suffer from the same approximation errors, right?
>
> On Tue, Dec 12, 2017 at 9:43 AM, Matthew Knepley <knepley at gmail.com> wrote:
> On Tue, Dec 12, 2017 at 11:30 AM, Alexander Lindsay <alexlindsay239 at gmail.com> wrote:
> I'm not using any hand-coded Jacobians.
>
> This looks to me like the rules for FormFunction/Jacobian() are being broken. If the residual function
> depends on some third variable, and it changes between calls independent of the solution U, then
> the stored Jacobian could look wrong, but one done every time on the fly might converge.
>
> Matt
>
> Case 1 options: -snes_fd -pc_type lu
>
> 0 Nonlinear |R| = 2.259203e-02
> 0 Linear |R| = 2.259203e-02
> 1 Linear |R| = 7.821248e-11
> 1 Nonlinear |R| = 2.258733e-02
> 0 Linear |R| = 2.258733e-02
> 1 Linear |R| = 5.277296e-11
> 2 Nonlinear |R| = 2.258733e-02
> 0 Linear |R| = 2.258733e-02
> 1 Linear |R| = 5.993971e-11
> Nonlinear solve did not converge due to DIVERGED_LINE_SEARCH iterations 2
>
> Case 2 options: -snes_fd -snes_mf_operator -pc_type lu
>
> 0 Nonlinear |R| = 2.259203e-02
> 0 Linear |R| = 2.259203e-02
> 1 Linear |R| = 2.258733e-02
> 2 Linear |R| = 3.103342e-06
> 3 Linear |R| = 6.779865e-12
> 1 Nonlinear |R| = 7.497740e-06
> 0 Linear |R| = 7.497740e-06
> 1 Linear |R| = 8.265413e-12
> 2 Nonlinear |R| = 7.993729e-12
> Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 2
>
>
> On Tue, Dec 12, 2017 at 9:12 AM, zakaryah . <zakaryah at gmail.com> wrote:
> When you say "Jacobians are bad" and "debugging the Jacobians", do you mean that the hand-coded Jacobian is wrong? In that case, why would you be surprised that the finite difference Jacobians, which are "correct" to approximation error, perform better? Otherwise, what does "Jacobians are bad" mean - ill-conditioned? Singular? Not symmetric? Not positive definite?
>
>
>
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>