[petsc-dev] Discussion about time-dependent optimization moved from PR
Matthew Knepley
knepley at gmail.com
Sun Oct 15 10:50:36 CDT 2017
On Sun, Oct 15, 2017 at 11:35 AM, Stefano Zampini <stefano.zampini at gmail.com> wrote:
> If anyone wants to know what this discussion is about, see
> https://bitbucket.org/petsc/petsc/pull-requests/766/support-for-pde-constrained-optimization
>
> I'll try to summarize the interfaces here. Hong's code API is labeled with
> H, mine with S.
>
> Both methods support cost functions (i.e. objective functions given by a
> time integral): H TSSetCostIntegrand(), S TSSetObjective().
> With TSSetCostIntegrand you set a single function that computes the numcost
> cost integrals; TSSetObjective instead appends to a list.
> The prototype for the function evaluation is similar, except that I also
> carry over a vector which stores the current values of the parameters.
>
As for parameter descriptions, I think these, just like the state
variables, should be described by a DM. This gives us a uniform interface
for parallel update, output, and visualization, and for forming the
subproblems that update the parameters themselves.
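To make the cost-integral idea concrete, here is a toy sketch (plain Python, not PETSc; all names here are invented for illustration) of what both TSSetCostIntegrand and TSSetObjective register: a pointwise integrand r(u, p, t) whose time integral is accumulated as an extra quadrature variable alongside the forward solve.

```python
import math

def integrand(u, p, t):
    """Running cost r(u, p, t) = u^2 (an arbitrary sample choice)."""
    return u * u

def forward_solve_with_cost(p, T=1.0, n=10000):
    """Forward Euler on du/dt = -p*u, u(0) = 1, accumulating J = int_0^T r dt."""
    dt = T / n
    u, J = 1.0, 0.0
    for i in range(n):
        J += dt * integrand(u, p, i * dt)  # quadrature-variable update
        u += dt * (-p * u)                 # state update
    return u, J

u_T, J = forward_solve_with_cost(p=1.0)
# Exact values for comparison: u(T) = e^{-pT}, J = (1 - e^{-2pT}) / (2p)
```

In this reading, Hong's interface computes all numcost such integrals in one callback, while Stefano's appends one objective per call and the solver sums the list.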
> Point-form functionals (i.e., objective functions that are not integrated
> over time, but just sampled at a given time) drive the initialization of
> the adjoint variables, and they are not supported explicitly in Hong's
> code. S: TSSetObjective()
>
Should these just be handled by a particular quadrature rule in the cost
integral? By this I mean that functionals would be specified by their
quadratures.
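That reading can be made precise (my formulation, not an API in either branch): a functional sampled at a single time $\bar t$ is the cost integral taken with a point-mass quadrature,

```latex
\Phi\bigl(u(\bar t), m\bigr)
  = \int_{t_0}^{t_F} \Phi\bigl(u(t), m\bigr)\,\delta(t - \bar t)\,dt ,
```

so specifying the quadrature (smooth weights for running costs, a point mass at the sampling time for point-form functionals) would cover both cases with one interface.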
> Both methods need the Jacobian of the DAE wrt the parameters: H
> TSAdjointSetRHSJacobian(), S TSSetGradientDAE()
>
Here I understand Stefano to be using DAE in the following sense: Suppose
my continuous problem is a Partial Differential-Algebraic Equation (PDAE).
Then after discretizing in space, which I would call Method of Lines, I am
left with a Differential-Algebraic Equation (DAE) in time.
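In symbols (the standard method-of-lines picture, nothing specific to either branch): after spatial discretization the problem is the finite-dimensional system

```latex
F\bigl(t, u, \dot u; m\bigr) = 0, \qquad u(t_0) = u_0(m),
```

where $u$ is the semidiscrete state and $m$ the parameters; when $F$ can be solved for $\dot u$ this is an ODE, otherwise a genuine DAE. The Jacobian both interfaces ask for is $\partial F / \partial m$ along the trajectory.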
> Initial condition dependence on the parameters is implicitly computed in
> Hong's code (limited to linear dependence on all the variables); instead I
> have TSSetGradientIC which is a general way to express initial condition
> dependence by an implicit function.
>
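For what it's worth, the general implicit form (my reading of what TSSetGradientIC expresses; the exact calling sequence is in the PR) would be $G(u_0, m) = 0$, and the implicit function theorem then gives

```latex
G(u_0, m) = 0
\quad\Longrightarrow\quad
\frac{\partial u_0}{\partial m}
  = -\left(\frac{\partial G}{\partial u_0}\right)^{-1}
     \frac{\partial G}{\partial m},
```

which collapses to the linear case when $G$ is affine in $u_0$ and $m$.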
> I'm not very familiar with the TSForward interface; Hong can elaborate
> more. But my gut feeling is that its public API for cost functions
> duplicates the one used by TSAdjoint. TLMTS (my TS that solves
> the tangent linear model) reuses the callbacks set by TSSetGradientDAE and
> TSSetGradientIC. Hong, do you also need to integrate some quadrature
> variable in your TSForward code?
>
For people like me who do not have all the terminology down, is the Tangent
Linear Model (TLM) just the forward sensitivities?
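My understanding (a sketch, not either branch's code): the TLM is the linearization of the forward model along the computed trajectory, so integrating it propagates the forward sensitivity S = du/dp. A toy scalar check, with all names invented here:

```python
import math

def forward_and_tlm(p, T=1.0, n=10000):
    """Euler on du/dt = f(u, p) = -p*u together with its tangent linear model
    dS/dt = f_u * S + f_p = -p*S - u, where S = du/dp and S(0) = 0
    because u(0) = 1 does not depend on p."""
    dt = T / n
    u, S = 1.0, 0.0
    for _ in range(n):
        u_old = u
        u += dt * (-p * u_old)        # forward model step
        S += dt * (-p * S - u_old)    # TLM step, linearized about u_old
    return u, S

u_T, S_T = forward_and_tlm(p=1.0)
# Exact for comparison: u(T) = e^{-pT}, and du/dp at time T is -T e^{-pT}
```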
Matt
>
> 2017-10-15 18:14 GMT+03:00 Matthew Knepley <knepley at gmail.com>:
>
>> Someone had to do it.
>>
>> I will not try to frame the entire discussion. Barry has already thrown
>> down the "show me your interface" gauntlet. However, I want to emphasize
>> one point that may have been lost in the prior discussion. Every example I
>> have looked at so far is focused on the reduced space formulation of the
>> optimization problem. However, I am interested in the full space
>> formulation so that I can do multigrid on the entire optimal control
>> problem. This is not a new idea, in particular Borzi does this in SIAM
>> Review in 2009. I think we have a tremendous opportunity here: other
>> codes cannot do this, it has the potential (I think) for much better
>> globalization, and it may be faster.
>>
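For readers following along, the distinction in symbols (standard constrained-optimization notation, not tied to either API): the problem is

```latex
\min_{u,\,m} \; J(u, m)
\quad \text{subject to} \quad F\bigl(t, u, \dot u; m\bigr) = 0 .
```

The reduced-space approach eliminates $u$ through a forward solve and minimizes $\hat J(m) = J(u(m), m)$, with gradients supplied by adjoints; the full-space approach iterates on state, parameters, and multipliers $(u, m, \lambda)$ simultaneously via the KKT conditions, which is what admits multigrid on the whole optimality system.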
>> So, when we come up with interface proposals, I think we should keep a
>> full space solution method in mind.
>>
>> Thanks,
>>
>> Matt
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>
>
>
> --
> Stefano
>