[petsc-users] Matrix-free vs finite differenced Jacobian approximation
Smith, Barry F.
bsmith at mcs.anl.gov
Tue Dec 12 13:53:13 CST 2017
> On Dec 12, 2017, at 1:48 PM, Alexander Lindsay <alexlindsay239 at gmail.com> wrote:
>
> On Tue, Dec 12, 2017 at 10:39 AM, Matthew Knepley <knepley at gmail.com> wrote:
> On Tue, Dec 12, 2017 at 12:26 PM, Alexander Lindsay <alexlindsay239 at gmail.com> wrote:
> OK, I'm going to go back on my original statement... the physics being run here is a subset of a much larger set of physics; for the current set, the hand-coded Jacobian actually appears to be quite good.
>
> With hand-coded Jacobian, -pc_type lu, the convergence is perfect:
>
> 0 Nonlinear |R| = 2.259203e-02
> 0 Linear |R| = 2.259203e-02
> 1 Linear |R| = 1.129089e-10
> 1 Nonlinear |R| = 6.295583e-11
>
> So yes, I guess at this point I'm just curious about the different behavior between `-snes_fd` and `-snes_fd -snes_mf_operator`. Does the hand-coded result change your opinion, Matt, that the rules for FormFunction/Jacobian might be violated?
>
> I understand that a finite difference approximation of the true Jacobian is an approximation. However, in the absence of possible complications like Matt suggested where an on-the-fly calculation might stand a better chance of capturing the behavior, I would expect both `-snes_mf_operator -snes_fd` and `-snes_fd` to suffer from the same approximations, right?
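[Editor's note: one difference between the two cases is how the Jacobian is *applied*. With -snes_mf_operator the operator action is a single directional difference per Krylov iteration, while -snes_fd differences column by column. A minimal Python sketch of the directional product; the 2x2 system F, and the point u and direction v, are purely illustrative stand-ins, not the physics from this thread:]

```python
# Illustrative 2x2 nonlinear system; F, u, v are hypothetical stand-ins.
def F(u):
    return [u[0] ** 2 + u[1] - 3.0, u[0] + u[1] ** 3 - 5.0]

def J_exact(u):
    # Hand-coded Jacobian of F
    return [[2.0 * u[0], 1.0], [1.0, 3.0 * u[1] ** 2]]

def Jv_matrix_free(u, v, h=1.49012e-08):
    # J(u) v ~ (F(u + h v) - F(u)) / h -- the directional difference
    # an mffd operator applies on every Krylov iteration
    Fu = F(u)
    Fp = F([u[i] + h * v[i] for i in range(len(u))])
    return [(Fp[i] - Fu[i]) / h for i in range(len(u))]

u, v = [1.0, 2.0], [0.3, -0.7]
exact = [sum(J_exact(u)[i][j] * v[j] for j in range(2)) for i in range(2)]
approx = Jv_matrix_free(u, v)
err = max(abs(approx[i] - exact[i]) for i in range(2))
# err is small (first order in h) but never exactly zero
```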
>
> There are too many things that do not make sense:
>
> 1) How could LU be working with -snes_mf_operator?
>
> What operator are you using here? The hand-coded Jacobian?
>
> Why wouldn't LU work with -snes_mf_operator? My understanding is that LU can be used interchangeably with iterative preconditioners.
>
Matt didn't see that, in addition to -snes_mf_operator, you also had -snes_fd. Thus it is doing the LU factorization on the fd matrix and using that as a preconditioner for the mf matrix. No mystery.
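[Editor's note: spelling out the two option combinations from the cases below makes the distinction explicit:]

```shell
# Case 1: assemble the Jacobian by finite differences
# (SNESComputeJacobianDefault) and use that one matrix both as the
# operator and for the LU factorization
-snes_fd -pc_type lu

# Case 2: apply the operator matrix-free (mffd), but still assemble the
# finite-difference matrix and hand its LU factorization to the KSP as
# the preconditioner
-snes_fd -snes_mf_operator -pc_type lu
```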
> Jacobian:
>
> (lldb) call MatView(snes->jacobian, 0)
> Mat Object: 1 MPI processes
> type: mffd
> Matrix-free approximation:
> err=1.49012e-08 (relative error in function evaluation)
> The compute h routine has not yet been set
>
> Preconditioner is computed with SNESComputeJacobianDefault
>
>
> 2) How can LU take more than one iterate?
>
> 0 Nonlinear |R| = 2.259203e-02
> 0 Linear |R| = 2.259203e-02
> 1 Linear |R| = 2.258733e-02
> 2 Linear |R| = 3.103342e-06
> 3 Linear |R| = 6.779865e-12
>
> That seems to say that we are not solving a linear system for some reason.
>
> I'm curious about the same thing :-) Perhaps because B^-1 is not a perfect inverse of the action of J on x in the matrix-free case?
Yes, this is completely normal. The matrix-free application of the Jacobian is not identical to the application of the fd matrix, hence the LU factorization of the fd matrix will not be a perfect preconditioner for the matrix-free operator. You will always see this behavior in this situation.
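[Editor's note: a minimal sketch of why the LU-preconditioned matrix-free solve takes more than one iteration. The 2x2 system below is hypothetical, not the actual physics: applying the exact inverse of the column-differenced fd matrix to the matrix-free product J*v recovers v only approximately, because the two differencing schemes commit slightly different errors.]

```python
# Illustrative 2x2 system; F, u, v are hypothetical stand-ins.
def F(u):
    return [u[0] ** 2 + u[1] - 3.0, u[0] + u[1] ** 3 - 5.0]

def fd_jacobian(u, h=1.49012e-08):
    # Column-by-column forward differences, like SNESComputeJacobianDefault
    Fu = F(u)
    J = [[0.0, 0.0], [0.0, 0.0]]
    for j in range(2):
        up = list(u)
        up[j] += h
        Fp = F(up)
        for i in range(2):
            J[i][j] = (Fp[i] - Fu[i]) / h
    return J

def Jv_matrix_free(u, v, h=1.49012e-08):
    # Directional difference the mffd operator applies
    Fu = F(u)
    Fp = F([u[i] + h * v[i] for i in range(2)])
    return [(Fp[i] - Fu[i]) / h for i in range(2)]

def solve_2x2(A, b):
    # Exact solve, standing in for the LU factorization of the fd matrix
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
            (b[1] * A[0][0] - b[0] * A[1][0]) / det]

u, v = [1.0, 2.0], [0.3, -0.7]
B = fd_jacobian(u)
recovered = solve_2x2(B, Jv_matrix_free(u, v))  # B^{-1} (J_mf v)
mismatch = max(abs(recovered[i] - v[i]) for i in range(2))
# mismatch is tiny but nonzero: B^{-1} J_mf != I exactly, so the Krylov
# method needs more than one preconditioned iteration to converge
```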
Please read my email; the issue you report is still there (Matt got sidetracked).
Barry
> If I actually run with matrix-free J and hand-coded B, I do get one linear iteration:
>
> SMP jacobian PJFNK, -pc_type lu:
> 0 Nonlinear |R| = 2.259203e-02
> 0 Linear |R| = 2.259203e-02
> 1 Linear |R| = 2.857149e-11
> 1 Nonlinear |R| = 1.404604e-11
> Solve Converged!
>
>
> 3) Why would -snes_fd be different from a hand-coded Jacobian?
>
> Did you try -snes_type test?
>
> The matrix-ratio norm of the difference is on the order of 1e-6, which isn't perfect, but here the hand-coded Jacobian performs better than the finite-differenced one.
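[Editor's note: for intuition, here is a small sketch of the kind of entrywise comparison -snes_type test performs, on a hypothetical 2x2 system rather than the actual physics. Even when the hand-coded Jacobian is exact, forward differencing with h ~ sqrt(machine epsilon) leaves a visible discrepancy:]

```python
# Hypothetical 2x2 example; F and u are illustrative stand-ins.
def F(u):
    return [u[0] ** 2 + u[1] - 3.0, u[0] + u[1] ** 3 - 5.0]

def J_hand(u):
    # Exact, hand-coded Jacobian of F
    return [[2.0 * u[0], 1.0], [1.0, 3.0 * u[1] ** 2]]

def J_fd(u, h=1.49012e-08):
    # Forward-difference approximation, column by column
    Fu = F(u)
    J = [[0.0, 0.0], [0.0, 0.0]]
    for j in range(2):
        up = list(u)
        up[j] += h
        Fp = F(up)
        for i in range(2):
            J[i][j] = (Fp[i] - Fu[i]) / h
    return J

u = [1.0, 2.0]
A, B = J_hand(u), J_fd(u)
diff = max(abs(A[i][j] - B[i][j]) for i in range(2) for j in range(2))
# diff is well above machine precision: the fd Jacobian is only a
# first-order-accurate approximation of the hand-coded one
```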
>
> Alex
>
> Matt
>
> On Tue, Dec 12, 2017 at 9:43 AM, Matthew Knepley <knepley at gmail.com> wrote:
> On Tue, Dec 12, 2017 at 11:30 AM, Alexander Lindsay <alexlindsay239 at gmail.com> wrote:
> I'm not using any hand-coded Jacobians.
>
> This looks to me like the rules for FormFunction/Jacobian() are being broken. If the residual function
> depends on some third variable, and it changes between calls independent of the solution U, then
> the stored Jacobian could look wrong, but one done every time on the fly might converge.
>
> Matt
>
> Case 1 options: -snes_fd -pc_type lu
>
> 0 Nonlinear |R| = 2.259203e-02
> 0 Linear |R| = 2.259203e-02
> 1 Linear |R| = 7.821248e-11
> 1 Nonlinear |R| = 2.258733e-02
> 0 Linear |R| = 2.258733e-02
> 1 Linear |R| = 5.277296e-11
> 2 Nonlinear |R| = 2.258733e-02
> 0 Linear |R| = 2.258733e-02
> 1 Linear |R| = 5.993971e-11
> Nonlinear solve did not converge due to DIVERGED_LINE_SEARCH iterations 2
>
> Case 2 options: -snes_fd -snes_mf_operator -pc_type lu
>
> 0 Nonlinear |R| = 2.259203e-02
> 0 Linear |R| = 2.259203e-02
> 1 Linear |R| = 2.258733e-02
> 2 Linear |R| = 3.103342e-06
> 3 Linear |R| = 6.779865e-12
> 1 Nonlinear |R| = 7.497740e-06
> 0 Linear |R| = 7.497740e-06
> 1 Linear |R| = 8.265413e-12
> 2 Nonlinear |R| = 7.993729e-12
> Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 2
>
>
> On Tue, Dec 12, 2017 at 9:12 AM, zakaryah . <zakaryah at gmail.com> wrote:
> When you say "Jacobians are bad" and "debugging the Jacobians", do you mean that the hand-coded Jacobian is wrong? In that case, why would you be surprised that the finite difference Jacobians, which are "correct" to approximation error, perform better? Otherwise, what does "Jacobians are bad" mean - ill-conditioned? Singular? Not symmetric? Not positive definite?
>
>
>
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>
>
>
>
>