[petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods

zakaryah . zakaryah at gmail.com
Mon Oct 2 22:54:29 CDT 2017


I'm still working on this.  I've made some progress, and it looks like the
issue is with the KSP, at least for now.  The Jacobian may be
ill-conditioned.  Is it possible to use -snes_test_display during an
intermediate step of the analysis?  I would like to inspect the Jacobian
after several solves have already completed, just to make sure there are no
mistakes there.  I tried

    ierr = SNESSetType(mp->PETSc_snes, SNESTEST);CHKERRQ(ierr);

    ierr = PetscOptionsSetValue(NULL, "-snes_test_display", "");CHKERRQ(ierr);

and the first line works, of course, but the second line doesn't seem to
activate the printing of the Jacobian.  I also tried "true" as the last
argument, and that didn't work either.
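
A minimal sketch of the ordering I would expect to need, assuming the option
is only read when the SNES processes its options (and that the SNES has no
options prefix):

    ierr = PetscOptionsSetValue(NULL, "-snes_test_display", "true");CHKERRQ(ierr);
    ierr = SNESSetType(mp->PETSc_snes, SNESTEST);CHKERRQ(ierr);
    /* set the option and the type before the options are processed, so the
       SNESTEST implementation sees -snes_test_display */
    ierr = SNESSetFromOptions(mp->PETSc_snes);CHKERRQ(ierr);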

On Tue, Sep 5, 2017 at 9:39 AM, Jed Brown <jed at jedbrown.org> wrote:

> "zakaryah ." <zakaryah at gmail.com> writes:
>
> > OK - I've checked the Jacobian and function very thoroughly and I am
> > confident there are no errors.
>
> Does Newton converge quadratically when you have a good enough initial
> guess?
>
> Globalization of large deformation elasticity is a persistent
> engineering challenge.  The standard approach is to use continuation,
> often in the form of load increments.
>
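A minimal sketch of what such load-increment continuation might look like
around SNESSolve, assuming the external forces enter the residual through a
user context with a hypothetical scale factor loadScale, and that x carries
the previous increment's solution as the initial guess for the next one:

    PetscInt            nsteps = 10, step;
    SNESConvergedReason reason;
    for (step = 1; step <= nsteps; step++) {
      user.loadScale = (PetscReal)step/(PetscReal)nsteps;  /* ramp the load up to full strength */
      ierr = SNESSolve(snes, NULL, x);CHKERRQ(ierr);       /* x is reused as the next initial guess */
      ierr = SNESGetConvergedReason(snes, &reason);CHKERRQ(ierr);
      if (reason < 0) break;  /* in practice one would cut the increment and retry */
    }
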
> Regarding trust region documentation, the man page says
>
>    The basic algorithm is taken from "The Minpack Project", by More',
>    Sorensen, Garbow, Hillstrom, pages 88-111 of "Sources and Development
>    of Mathematical Software", Wayne Cowell, editor.
>
> You should be able to make sense of it by reading any other source on
> trust region methods.
>
> > I suspect that I am having problems with a bad starting point, and the
> > SNES cannot find the global minimum from there.  I know that the global
> > minimum (with residual zero) exists in all cases, but I would like the
> > methods for finding it to be more robust to the starting value.
> >
> > The problem comes from the physics of finite deformations of elastic
> > materials.  In short, I have a functional of smooth maps from a 3D domain
> > to itself.  The functional contains two terms.  The first term represents
> > forces which come from external data, and in the Lagrangian this term
> > only involves the values of the map at the point in question.  The second
> > term penalizes fluctuations in the map, and can take various forms.  The
> > simplest form is just the Dirichlet energy, but I'm also interested in
> > the infinitesimal strain energy and the finite strain energy.  The first
> > two have terms in the Lagrangian which are first order in the second
> > spatial derivatives of the map, while the third (finite strain energy)
> > has terms which are up to third order in the first and second spatial
> > derivatives of the map.  It is the finite strain energy term which has
> > been problematic.
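
A rough way to write that two-term structure (f and W are only placeholders
for the data term and the penalty term; the Dirichlet choice of W is shown):

    E[\varphi] = \int_\Omega f(\varphi(x), x)\,dx + \int_\Omega W(\nabla\varphi(x))\,dx,
    \qquad W_{Dirichlet}(\nabla\varphi) = \tfrac{1}{2}\,|\nabla\varphi|^2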
> >
> > The Euler-Lagrange equations are discretized on a cubic grid, with equal
> > interval spacing in each dimension.  The map is the dependent variable,
> > i.e. the x in F(x) = 0.  I prefer Neumann boundary conditions.  Because
> > the spatial derivatives of the map are usually small, the Jacobian
> > typically has large values in 3x3 blocks along the diagonal (which depend
> > on the map and the external data), and up to 54 values which are
> > functions of the spatial derivatives of the map and tend to be smaller.
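
A sketch of the kind of DMDA setup described above (the boundary type and
the box stencil are assumptions on my part; M, N, P are the grid dimensions):

    DM da;
    ierr = DMDACreate3d(PETSC_COMM_WORLD,
                        DM_BOUNDARY_GHOSTED, DM_BOUNDARY_GHOSTED, DM_BOUNDARY_GHOSTED,
                        DMDA_STENCIL_BOX, M, N, P,
                        PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                        3, 1, NULL, NULL, NULL, &da);CHKERRQ(ierr);  /* 3 dof per node, stencil width 1 */
    ierr = DMSetFromOptions(da);CHKERRQ(ierr);
    ierr = DMSetUp(da);CHKERRQ(ierr);
    ierr = SNESSetDM(snes, da);CHKERRQ(ierr);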
> >
> > Do you have any advice on diagnosing and improving situations in which
> > Newton's method finds a stationary point that is not the state with
> > globally minimal residual?  One hint is that -snes_type newtonls does not
> > find as good a solution as -snes_type newtontr, but I don't know much
> > about these trust region methods, or how to customize and assess them.
> > I'm grateful for any advice.
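
A typical set of runtime options for comparing and assessing the two solvers
(standard PETSc monitoring flags) might be something like:

    -snes_type newtontr -snes_monitor -snes_converged_reason \
      -ksp_monitor_true_residual -ksp_converged_reason -snes_view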
> >
> > On Mon, Sep 4, 2017 at 5:44 PM, zakaryah . <zakaryah at gmail.com> wrote:
> >
> >> Yes, it looks like it IS the other way around, and I think the row is
> >>
> >> r.c + r.i*3 + r.j*3*M + r.k*3*M*N, where r.i is in [0,M-1], r.j is in
> >> [0,N-1], and r.k is in [0,P-1].
> >>
> >> That matches the boundary conditions in the displayed Jacobian.
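
A small helper that just encodes that formula, for an M x N x P DMDA with 3
degrees of freedom per node (the helper name is hypothetical; requires
petscmat.h for MatStencil):

    static PetscInt NaturalRow(MatStencil r, PetscInt M, PetscInt N)
    {
      /* natural-ordering row: dof index first, then i, then j, then k */
      return r.c + 3*r.i + 3*M*r.j + 3*M*N*r.k;
    }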
> >>
> >> On Mon, Sep 4, 2017 at 5:33 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >>
> >>>
> >>> > On Sep 4, 2017, at 4:09 PM, zakaryah . <zakaryah at gmail.com> wrote:
> >>> >
> >>> > OK that is super helpful.  Just to be sure - for MxNxP, the row r in
> >>> > the Jacobian is at r.i*P*N*3 + r.j*P*3 + r.k*3 + r.c?
> >>>
> >>>   It is that way, or the other way around: r.k*M*N*3 + r.j*M*3 + r.i*3 + r.c
> >>> >
> >>> >
> >>> > On Mon, Sep 4, 2017 at 4:58 PM, Barry Smith <bsmith at mcs.anl.gov>
> wrote:
> >>> >
> >>> > > On Sep 4, 2017, at 3:48 PM, zakaryah . <zakaryah at gmail.com> wrote:
> >>> > >
> >>> > > One piece of information that would be useful is what ordering PETSc
> >>> > > uses for the Jacobian in the snes_test_display.  Is it a natural
> >>> > > ordering, or the PETSc ordering?  For debugging the Jacobian manually,
> >>> > > the natural ordering is much easier to work with.
> >>> >
> >>> >    What is displayed is always the natural ordering (internally it is
> >>> >    not the natural ordering).
> >>> >
> >>> > >  For -n 1, are the orderings the same?
> >>> >
> >>> >   yes
> >>> >
> >>> >
> >>> >
> >>> > >
> >>> > > If I use a MatStencil r to represent a field with 3 degrees of
> >>> > > freedom, and the dimensions of my 3D DMDA are MxNxP, which row of the
> >>> > > Jacobian corresponds to r.i=x, r.j=y, r.k=z, r.c=f?
> >>> >
> >>> > Internally it is complicated, but for any viewing it is just the
> >>> > natural ordering and all the degrees of freedom for a single point are
> >>> > next to each other in the vector/matrix.
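
For assembly, MatSetValuesStencil lets one stay entirely in (i, j, k, c)
coordinates and leave the internal ordering to PETSc; a sketch for one 3x3
diagonal block, where J, i, j, k and the entries of v are assumed to come
from the surrounding assembly loop:

    MatStencil  row[3], col[3];
    PetscScalar v[9];   /* row-major 3x3 block: v[3*a+b] = dF_a/dx_b at this node */
    PetscInt    c;
    for (c = 0; c < 3; c++) {
      row[c].i = i; row[c].j = j; row[c].k = k; row[c].c = c;
      col[c].i = i; col[c].j = j; col[c].k = k; col[c].c = c;
    }
    /* ... fill v[] with the 3x3 block of derivatives for this grid point ... */
    ierr = MatSetValuesStencil(J, 3, row, 3, col, v, INSERT_VALUES);CHKERRQ(ierr);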
> >>> >
> >>> >
> >>>
> >>>
> >>
>