[petsc-users] SetVariableBounds vs ComputeVariableBounds

Justin Chang jychang48 at gmail.com
Mon Jun 27 17:45:18 CDT 2016


Thanks all,

Btw, do Tao's Hessian evaluation routines also "cheat" the way the
Jacobian routines do? Or is it fine to supply the Hessian only once (assuming
it is independent of the solution)?

Thanks,
Justin

On Monday, June 27, 2016, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
>    There is the same issue with ODE integrators for linear problems. The
> solvers tromp on the Jacobian.
>
>    We should actually add an error indicator in these TAO/TS solvers: if
> the "Jacobian" state value has not increased by the next time step/iteration,
> then the user did not supply a new Jacobian (in other words, the Jacobian is
> still whatever it was tromped to), so the solver should error out and tell
> the user about their mistake.
>
>
>   Barry
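The failure mode Barry describes can be sketched with a toy Newton-type loop in plain NumPy (this is not PETSc code; the matrix, right-hand side, and the +10 diagonal shift are invented for illustration). The "solver" shifts the Jacobian's diagonal in place each iteration, so a user callback that assembles the matrix only once leaves the solver iterating with an increasingly wrong matrix:

```python
import numpy as np

A = np.array([[4.0, 1.0], [1.0, 3.0]])   # true (constant) Jacobian
b = np.array([1.0, 2.0])

def residual(u):
    return A @ u - b

def fill_every_time(u, J):
    J[...] = A                  # refill: correct even if the solver tromped on J

def fill_once(u, J, state={"done": False}):
    if not state["done"]:       # "it's linear, assemble once" -- the mistake
        J[...] = A
        state["done"] = True

def solve(fill_jac, iters=100):
    """Toy stand-in for a solver that modifies the Jacobian it is handed."""
    u = np.zeros(2)
    J = np.empty((2, 2))
    for _ in range(iters):
        fill_jac(u, J)
        J[np.diag_indices_from(J)] += 10.0   # solver tromps on J in place
        u = u - np.linalg.solve(J, residual(u))
    return u
```

With the refilling callback the iteration still converges to A^{-1} b despite the shift; with the fill-once callback the shifts accumulate in the stale matrix and the iterate stalls far from the solution, which is exactly the symptom an error indicator on the Jacobian state value would catch.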
>
>
>
> > On Jun 27, 2016, at 1:41 PM, Munson, Todd <tmunson at mcs.anl.gov> wrote:
> >
> >
> > Hi Justin,
> >
> > I will have to look regarding the TAO semismooth solvers.  The TAO
> > solvers probably "cheated" and modified the Jacobian matrix rather
> > than extracting submatrices and shifting the diagonal or using a
> > matrix-free version.
> >
> > Note: the TAO interior-point and semismooth methods start from an element
> > of the generalized Jacobian matrix for a semismooth reformulation of
> > the VI problem.  This generalized Jacobian is a diagonal
> > perturbation to a scaled version of the Jacobian you input.
> > If we cheat and modify the matrix, then it needs to be
> > filled back in during a Jacobian evaluation.
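For concreteness, here is a small NumPy sketch of the structure Todd describes (this is not the TAO implementation; the Fischer-Burmeister function is one standard semismooth reformulation, and the problem data in the test are made up). An element of the generalized Jacobian of the reformulated system has the form Da + Db*J: a diagonal matrix plus a diagonally scaled copy of the Jacobian J you supply.

```python
import numpy as np

def fb(a, b):
    # Fischer-Burmeister function: fb(a, b) = 0  iff  a >= 0, b >= 0, a*b = 0
    return np.sqrt(a**2 + b**2) - a - b

def semismooth_newton(M, q, u, tol=1e-10, maxit=50):
    """Solve the LCP  u >= 0, F(u) = M u + q >= 0, u . F(u) = 0
    by Newton's method on the semismooth system Phi_i = fb(u_i, F_i(u))."""
    for _ in range(maxit):
        F = M @ u + q
        Phi = fb(u, F)
        if np.linalg.norm(Phi) < tol:
            break
        s = np.sqrt(u**2 + F**2)
        s[s == 0] = 1.0                 # guard the nonsmooth point (0, 0)
        Da = np.diag(u / s - 1.0)       # diagonal perturbation ...
        Db = np.diag(F / s - 1.0)       # ... and diagonal scaling of J = M
        B = Da + Db @ M                 # element of the generalized Jacobian
        u = u - np.linalg.solve(B, Phi)
    return u
```

Note that B is built here from a scaled copy of M; if a solver instead scaled and shifted the user's matrix in place (the "cheat"), that matrix would be destroyed and would have to be filled back in at the next Jacobian evaluation.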
> >
> > At some point, the plan was to move all the VI methods into PETSc proper,
> > but we may have stopped with the active-set (ASLS) method because that
> > tends to have the best performance for PDE-related problems.
> >
> > Todd.
> >
> >> On Jun 27, 2016, at 12:37 PM, Justin Chang <jychang48 at gmail.com> wrote:
> >>
> >> So I figured it out. I had to explicitly form the Tao
> Gradient/Constraints and Jacobian. I couldn't just "pre-process" the
> gradient Vec and Jacobian Mat through SNESComputeXXX. Attached is the
> updated file and makefile.
> >>
> >> My question now is, why exactly is this the case? This preprocessing
> strategy seemed to work for the bound-constrained optimization solvers (e.g.,
> TRON/BLMVM) but apparently not with the MCP ones. The system is linear, so
> my original reasoning was that the Jacobian shouldn't change and thus only
> needs to be assembled once. I faintly recall from a previous discussion
> that the SNESVI solvers will vary the Mat and Vec sizes depending on the
> regions that need to be "constrained" or something?
> >>
> >> Thanks,
> >> Justin
> >>
>> On Sun, Jun 26, 2016 at 5:03 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >>
> >>  I wish I could answer this but I am weak on these algorithms.
> Hopefully Todd has a good understanding of their application and strengths
> and weaknesses.
> >>
> >>  Barry
> >>
>>> On Jun 25, 2016, at 3:31 PM, Justin Chang <jychang48 at gmail.com> wrote:
> >>>
> >>> Hi all,
> >>>
> >>> So I modified SNES ex9.c so that one has the option to use TAO's
> complementarity solvers for this problem. Attached is the file.
> >>>
> >>> I expect the TAO solvers to behave the same as the SNESVI ones, but I
> am having the same issues as before - SSILS and SSFLS do not work
> whatsoever but for some reason ASILS and ASFLS work. Although the latter
> two produce the same results as the SNES VI counterparts, they converge
> much slower, and something tells me I am not doing something correctly.
> Based on what I have seen from the two TAO complementarity examples, I
> would also expect the AS and SS solvers to be roughly the same.
> >>>
> >>> BTW, in the modified code, I made some "shortcuts." Instead of
> explicitly forming the Tao versions of the Gradient and Jacobian, I first
> assemble the residual r and Jacobian J through the SNESComputeXXX
> functions. Then I pass them into the TaoSetConstraints and TaoSetJacobian
> routines. Because this is a linear system, I have:
> >>>
>>> f = J*u^0 - r
>>> gradient g = J*u - f = J*(u - u^0) + r
>>>
>>> where u^0 is the initial vector. I am not sure if this "shortcut" has
> anything to do with the issue at hand. Attached is the makefile, which has
> instructions on how to run the problem.
> >>>
> >>> Any ideas what is going on??
> >>>
> >>> Thanks!
> >>> Justin
> >>>
>>> On Wed, Jun 22, 2016 at 9:42 PM, Ed Bueler <elbueler at alaska.edu> wrote:
> >>> Justin --
> >>>
>>> Yeah, good point.  SNESVISetVariableBounds() works fine, at least in
> ex9.c (see attached patch).  The reason for the other choice, which I found
> in my 5-year-old email, was some bug in PETSc 3.2.
> >>>
> >>> Ed
> >>>
> >>> Date: Wed, 22 Jun 2016 08:42:33 +0100
>>> From: Justin Chang <jychang48 at gmail.com>
>>> To: petsc-users <petsc-users at mcs.anl.gov>
> >>> Subject: [petsc-users] SetVariableBounds vs ComputeVariableBounds
> >>>
> >>> Hi all,
> >>>
>>> I am looking at the SNES tutorials ex9.c and ex58.c and am wondering why
>>> SNESVISetComputeVariableBounds() is called instead of just
>>> SNESVISetVariableBounds(). When would it be appropriate to use only
>>> the latter?
> >>>
> >>> Thanks,
> >>> Justin
> >>>
> >>> <ex9_TAO.c><makefile>
> >>
> >>
> >> <ex9_TAO.c><makefile>
> >
>
>
