[petsc-users] Simultaneously compute Residual+Jacobian in SNES

Matthew Knepley knepley at gmail.com
Mon Dec 12 14:27:58 CST 2016


On Mon, Dec 12, 2016 at 10:36 AM, Derek Gaston <friedmud at gmail.com> wrote:

> On Mon, Dec 12, 2016 at 12:36 AM Jed Brown <jed at jedbrown.org> wrote:
>
>> > Can you expand on that?  Do you believe automatic differentiation in
>> > general to be "bad code management"?
>>
>> AD that prevents calling the non-AD function is bad AD.
>>
>
> That's not exactly the problem.  Even if you can call both an AD and a
> non-AD residual... with AD, computing the residual and the Jacobian
> separately still means evaluating the residual twice.
>
> It's not the end of the world... but it was something that prompted me to
> ask the question.
>
>
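As an aside on the original question: with the existing SNESSetFunction()
and SNESSetJacobian() interface, one way to get a combined evaluation is to
assemble both inside the residual callback and cache the Jacobian. A rough
sketch follows; AppCtx, FormBoth(), and the state check are placeholders
rather than PETSc API, and the price is the extra assembly at every residual
the solver later throws away.

#include <petscsnes.h>

/* Illustrative user context; not PETSc API. */
typedef struct {
  Mat       Jcached;  /* assembled alongside the residual     */
  Vec       Xcached;  /* state at which Jcached was assembled */
  PetscBool valid;
} AppCtx;

/* Hypothetical routine that fills F (if non-NULL) and J in one element loop. */
extern PetscErrorCode FormBoth(Vec X, Vec F, Mat J);

static PetscErrorCode FormFunction(SNES snes, Vec X, Vec F, void *ctx)
{
  AppCtx        *user = (AppCtx *)ctx;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = FormBoth(X, F, user->Jcached);CHKERRQ(ierr);
  ierr = VecCopy(X, user->Xcached);CHKERRQ(ierr);
  user->valid = PETSC_TRUE;
  PetscFunctionReturn(0);
}

static PetscErrorCode FormJacobian(SNES snes, Vec X, Mat J, Mat P, void *ctx)
{
  AppCtx        *user = (AppCtx *)ctx;
  PetscBool      same = PETSC_FALSE;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  if (user->valid) {ierr = VecEqual(X, user->Xcached, &same);CHKERRQ(ierr);}
  if (!same) {
    /* The last residual was at a different state (e.g. a rejected trial
       step), so assemble the Jacobian directly. */
    ierr = FormBoth(X, NULL, user->Jcached);CHKERRQ(ierr);
  }
  if (P != user->Jcached) {
    /* Assumes P was preallocated with the same nonzero pattern. */
    ierr = MatCopy(user->Jcached, P, SAME_NONZERO_PATTERN);CHKERRQ(ierr);
  }
  PetscFunctionReturn(0);
}

Passing user.Jcached as both Mat arguments of SNESSetJacobian() makes the
MatCopy() branch a no-op; the rest is registered as usual with
SNESSetFunction(snes, r, FormFunction, &user).
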
>> Are all the fields in unique function spaces that need different
>> transforms or different quadratures?  If not, it seems like the presence
>> of many fields would already amortize the geometric overhead of visiting
>> an element.
>>
>
> These were two separate examples.  Expensive shape functions, by
> themselves, could warrant computing the residual and Jacobian
> simultaneously.  Also: many variables, by themselves, could do the same.
>
>
>> Alternatively, you could cache the effective material coefficient (and
>> its gradient) at each quadrature point during residual evaluation, thus
>> avoiding a re-solve when building the Jacobian.
>
>
> I agree with this.  We have some support for it in MOOSE now... and plans
> for better support in the future.  It's a classic time/space tradeoff.
>
>
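For reference, the caching can be as simple as stashing the coefficient (and
its derivative) per quadrature point during the residual loop and reading it
back when assembling the Jacobian. A minimal sketch; QuadCache and
EvaluateCoefficient() are made up for illustration and not tied to MOOSE or
PETSc:

/* Illustrative per-element cache of an effective material coefficient
   and its derivative with respect to the solution. */
typedef struct {
  int     nqp;     /* quadrature points per element      */
  double *kappa;   /* kappa(u) at each quadrature point  */
  double *dkappa;  /* d kappa / d u at each point        */
} QuadCache;

/* Hypothetical (possibly expensive) constitutive evaluation. */
extern void EvaluateCoefficient(double u, double *kappa, double *dkappa);

/* Residual loop: pay for the coefficient once and remember it. */
void ResidualOnElement(const double *u_qp, QuadCache *c /*, ... */)
{
  for (int q = 0; q < c->nqp; ++q) {
    EvaluateCoefficient(u_qp[q], &c->kappa[q], &c->dkappa[q]);
    /* ... accumulate residual contributions using c->kappa[q] ... */
  }
}

/* Jacobian loop: reuse the cached values instead of re-solving the
   constitutive model. */
void JacobianOnElement(const QuadCache *c /*, ... */)
{
  for (int q = 0; q < c->nqp; ++q) {
    /* ... accumulate Jacobian contributions from c->kappa[q] and
       c->dkappa[q]; EvaluateCoefficient() is not called again ... */
  }
}

The cache is only valid when the Jacobian is requested at the same state as
the last residual evaluation, which is the common case in Newton since the
Jacobian follows an accepted step.
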
>> I would recommend that unless you know that line searches are rare.
>>
>
> BTW: Many (most?) of our most complex applications _disable_ line
> search.  Over the years we've found line search to be more of a hindrance
> than a help.  We typically prefer some sort of "physics-based" damped
> Newton.
>

We should move this discussion to a separate thread. I think this is the
wrong choice. The analogy would be having a Stokes problem and, after finding
that ILU fails, going back to Gauss-Seidel and the thousands of iterations it
takes to eventually succeed. The line search (or globalization) needs to
respect problem structure, and I think we have the potential to handle that
in PETSc.
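Concretely, disabling the line search amounts to -snes_linesearch_type basic
(take the full Newton step), but the same framework also accepts a
user-defined update, which is one place a physics-based damping rule could
live inside PETSc. Something along these lines, where ComputeDamping() stands
in for whatever problem-specific rule applies:

#include <petscsnes.h>

/* Hypothetical physics-based damping rule, e.g. limiting the update so a
   temperature or saturation stays in bounds.  Not part of PETSc. */
extern PetscErrorCode ComputeDamping(Vec X, Vec Y, PetscReal *lambda);

static PetscErrorCode PhysicsLineSearch(SNESLineSearch ls, void *ctx)
{
  SNES           snes;
  Vec            X, F, Y, W, G;  /* W, G are available as work vectors */
  PetscReal      lambda;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = SNESLineSearchGetSNES(ls, &snes);CHKERRQ(ierr);
  ierr = SNESLineSearchGetVecs(ls, &X, &F, &Y, &W, &G);CHKERRQ(ierr);
  ierr = ComputeDamping(X, Y, &lambda);CHKERRQ(ierr);    /* problem-specific  */
  ierr = VecAXPY(X, -lambda, Y);CHKERRQ(ierr);           /* X <- X - lambda*Y */
  ierr = SNESComputeFunction(snes, X, F);CHKERRQ(ierr);  /* refresh residual  */
  ierr = SNESLineSearchComputeNorms(ls);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

/* Hooked up with:
     SNESLineSearch ls;
     SNESGetLineSearch(snes, &ls);
     SNESLineSearchSetType(ls, SNESLINESEARCHSHELL);
     SNESLineSearchShellSetUserFunc(ls, PhysicsLineSearch, NULL);
*/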

   Matt


> It is far more common that the Jacobian is _much_ more expensive than
>> the residual, in which case the mere possibility of a line search (or of
>> converging) would justify deferring the Jacobian.  I think it's much
>> better to make residuals and Jacobians fast independently, then perhaps
>> make the residual do some cheap caching, and worry about
>> second-guessing Newton only as a last resort.
>
>
> I think I agree.  These are definitely "fringe" cases... for most
> applications Jacobians are _way_ more expensive.
>
>
>> That said, I have no doubt that we could
>> demonstrate some benefit to using heuristics and a relative cost model
>> to sometimes compute residuals and Jacobians together.  It just isn't
>> that interesting and I think the gains are likely small and will
>> generate lots of bikeshedding about the heuristic.
>>
>
> I agree here too.  It could be done... but I think you've convinced me
> that it's not worth the trouble :-)
>
> Thanks for the discussion everyone!
>
> Derek
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener