[petsc-users] A question on finite difference Jacobian

Zou (Non-US), Ling ling.zou at inl.gov
Wed Oct 7 09:54:10 CDT 2015


Thank you Barry.

The background to my question is that I want to reduce (or, you could say,
optimize) the cost of my finite difference Jacobian evaluation, which is used
for preconditioning purposes. The idea is based on my understanding of the
problem I am solving, but I am not sure whether it will work, so I want to do
some tests.

Here is the concept. Assume that my residual reads

F(\vec{U}) = F[\vec{U}, g(\vec{U})]


in which g(\vec{U}) is a quite complicated, and thus expensive, function
evaluation. This function, however, is not very sensitive to \vec{U}, i.e.,
\partial{g(\vec{U})}/\partial{\vec{U}} is not that important.



Normally, a finite difference Jacobian is evaluated as follows (as discussed
in the PETSc manual), where \vec{v} is the perturbation direction,


J(\vec{U}) \vec{v} \approx \frac{F(\vec{U}+\epsilon \vec{v}) - F(\vec{U})}{\epsilon}


In my case, it reads,


J(\vec{U}) \vec{v} \approx \frac{F[\vec{U}+\epsilon \vec{v}, g(\vec{U}+\epsilon \vec{v})] - F[\vec{U}, g(\vec{U})]}{\epsilon}


Because \partial{g(\vec{U})}/\partial{\vec{U}} is not important, the
simplification I want to make is that, when the finite difference Jacobian
(used as a preconditioner) is evaluated, it can be further simplified as


J(\vec{U}) \vec{v} \approx \frac{F[\vec{U}+\epsilon \vec{v}, g(\vec{U})] - F[\vec{U}, g(\vec{U})]}{\epsilon}


Thus, the re-evaluation of g(\vec{U}+\epsilon \vec{v}) is removed. It seems
to me that I need some kind of signal from PETSc so I can tell the code not
to update g(\vec{U}). However, I have never tested this, and I don't know
whether anybody has done something similar before.
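
For concreteness, below is a minimal, untested sketch (in PETSc C) of what I
have in mind. The names AppCtx, FormFunction and FormFunctionFrozenG are mine,
not PETSc's; the idea is simply to carry a flag in the user context and to
register a "frozen-g" residual with MatFDColoringSetFunction, so that the
cached g(\vec{U}) is reused during the coloring sweeps instead of being
recomputed at each perturbed state.

#include <petscsnes.h>

/* Hypothetical user context that caches the expensive quantity g(U). */
typedef struct {
  Vec       g;         /* cached g(U), updated only by the full residual   */
  PetscBool freeze_g;  /* PETSC_TRUE => reuse cached g, skip re-evaluation */
} AppCtx;

/* Full residual F[U, g(U)]; registered with SNESSetFunction(). */
PetscErrorCode FormFunction(SNES snes, Vec U, Vec F, void *ctx)
{
  AppCtx *user = (AppCtx*)ctx;

  PetscFunctionBeginUser;
  if (!user->freeze_g) {
    /* ... expensive evaluation of user->g from U goes here ... */
  }
  /* ... assemble F from U and the (possibly frozen) user->g ... */
  PetscFunctionReturn(0);
}

/* Cheaper residual used only for the coloring-based FD Jacobian:
   it keeps g frozen at the base state, so g(U + eps*v) is never formed. */
PetscErrorCode FormFunctionFrozenG(SNES snes, Vec U, Vec F, void *ctx)
{
  AppCtx         *user = (AppCtx*)ctx;
  PetscErrorCode  ierr;

  PetscFunctionBeginUser;
  user->freeze_g = PETSC_TRUE;
  ierr = FormFunction(snes, U, F, ctx);CHKERRQ(ierr);
  user->freeze_g = PETSC_FALSE;
  PetscFunctionReturn(0);
}

/* Registration, with fdcoloring an existing MatFDColoring:
   ierr = MatFDColoringSetFunction(fdcoloring,
            (PetscErrorCode (*)(void))FormFunctionFrozenG, &user);CHKERRQ(ierr); */

Whether the Jacobian obtained this way is still a good enough preconditioner
is of course exactly what I would like to test.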


Thanks,


Ling



On Tue, Oct 6, 2015 at 7:09 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
> > On Oct 6, 2015, at 4:22 PM, Zou (Non-US), Ling <ling.zou at inl.gov> wrote:
> >
> >
> >
> > On Tue, Oct 6, 2015 at 2:38 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >
> > > On Oct 6, 2015, at 3:29 PM, Zou (Non-US), Ling <ling.zou at inl.gov>
> wrote:
> > >
> > > Hi All,
> > >
> > > If the non-zero pattern of a finite difference Jacobian needs 20
> colors to color it (the 20 comes from MatFDColoringView; the non-zero pattern
> is pre-determined from the mesh connectivity), is it true that PETSc needs 40
> function evaluations to get the full Jacobian matrix filled? This is
> because a perturbation per color needs two function evaluations
> according to the PETSc manual (ver 3.6, page 123, equations shown in the middle
> of the page).
> > > But I only see 20 function evaluations. I probably have some
> misunderstanding somewhere. Any suggestions?
> >
> >    PETSc uses forward differencing to compute the derivatives, hence it
> needs a single function evaluation at the given point (which has almost
> always been previously computed in Newton's method)
> > ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> > Is it a potential problem if the user chooses to use a different (e.g.
> simplified) residual function as the function for MatFDColoringSetFunction?
>
>    Yes, you can do that. But this may result in a "Newton" direction that
> is not a descent direction, hence Newton stalls. If you have 20 colors I
> doubt that it would be a good idea to use a cheaper function there. If you
> have several hundred colors then you can use a simpler function PLUS
> -snes_mf_operator to ensure that the Newton direction is correct.
>
>
>   Barry
>
> >
> >
> > and then one function evaluation for each color. This is why it reports
> 20 function evaluations.
> >
> >   Barry
> >
> >
> > >
> > > Ling
> >
> >
>
>
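
P.S. For the record, the matrix-free operator Barry mentions can be turned on
at run time; an untested sketch of the options would be something like the
line below (my_app is just a placeholder for the executable). With
-snes_mf_operator the Jacobian action is applied via finite differences of the
full residual, while the coloring-based matrix is used only to build the
preconditioner.

./my_app -snes_mf_operator -snes_monitor -ksp_monitor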