[petsc-dev] Discussion about time-dependent optimization moved from PR
Jed Brown
jed at jedbrown.org
Sun Oct 15 12:59:46 CDT 2017
Matthew Knepley <knepley at gmail.com> writes:
> On Sun, Oct 15, 2017 at 11:35 AM, Stefano Zampini <stefano.zampini at gmail.com> wrote:
>
>> If anyone wants to know what this discussion is about, see
>> https://bitbucket.org/petsc/petsc/pull-requests/766/
>> support-for-pde-constrained-optimization
>>
>> I'll try to summarize the interfaces here. Hong's code API is labeled with
>> H, mine with S.
>>
>> Both methods support cost functions (i.e., objective functions given by a
>> time integral): H TSSetCostIntegrand(), S TSSetObjective().
>> With TSSetCostIntegrand you set a single function that computes numcost
>> cost integrals; TSSetObjective instead appends to a list.
>> The prototype for the function evaluation is similar, except that I also
>> carry over a vector which stores the current values of the parameters.
>>
>
> As for parameter descriptions, I think these, just like the state
> variables, should be described by a DM. This allows us a uniform interface
> for parallel update, output and visualization, and formation of subproblems
> to update the parameters themselves.
In many cases this would be a DMRedundant. This can be made to work,
but if it is to be provided as an argument in the TS callback, that
change would also need to be made in the forward interfaces. Otherwise
(as currently), parameters just reside in the context, and user functions
are responsible for getting them from their context.
>> Point-form functionals (i.e., objective functions that are not integrated
>> over time, but just sampled at a given time) drive the initialization of
>> the adjoint variables, and they are not supported explicitly in Hong's
>> code. S: TSSetObjective()
>>
>
> Should these just be handled by a particular quadrature in the cost
> integral? By this I mean that functionals would be specified by quadratures.
Yeah, but they are sparse quadratures. That's sort of how Stefano has
implemented it, though the concept should be generalized and
fixtime=PETSC_MIN_REAL is not a nice way to denote an integral.
>> Both methods need the Jacobian of the DAE wrt the parameters: H
>> TSAdjointSetRHSJacobian(), S TSSetGradientDAE()
>>
>
> Here I understand Stefano to be using DAE in the following sense: Suppose
> my continuous problem is a Partial Differential-Algebraic Equation (PDAE).
> Then after discretizing in space, which I would call Method of Lines, I am
> left with a Differential-Algebraic Equation (DAE) in time.
Yeah, but the method isn't very useful in that context. Fortunately,
TSCreateAdjointTS() is an interface that makes it easy to replace the
adjoint of the spatially-discretized operator with the discretization of
the adjoint (i.e., call TSSetComputeRHSJacobian(tsadj, ...)).
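Up to sign conventions (a sketch, not a statement from the PETSc manual), the two objects in play are the semi-discrete forward DAE and its adjoint:

```latex
% Semi-discrete forward model and its adjoint (signs/conventions simplified):
\begin{aligned}
  F(t, u, \dot{u}; p) &= 0, \\
  \frac{d}{dt}\!\left(F_{\dot{u}}^{\top}\lambda\right)
    - F_{u}^{\top}\lambda &= 0.
\end{aligned}
% "Adjoint of the spatially-discretized operator": assemble this system
% from the discrete Jacobians F_u and F_{\dot{u}}.
% "Discretization of the adjoint": derive the continuous adjoint PDE first,
% discretize it, and hand its Jacobian to the adjoint TS directly.
```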
>> Initial condition dependence on the parameters is implicitly computed in
>> Hong's code (limited to linear dependence on all the variables); instead I
>> have TSSetGradientIC, which is a general way to express initial condition
>> dependence via an implicit function.
>>
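My reading of the implicit initial-condition dependence (a hedged sketch; h and its Jacobians are illustrative notation, not the TSSetGradientIC argument names): the initial state is defined through an implicit relation, from which the sensitivity follows by the implicit function theorem.

```latex
% Implicit initial-condition dependence (illustrative notation):
h\bigl(u(0), p\bigr) = 0
\quad\Longrightarrow\quad
\frac{\partial u(0)}{\partial p} = -\,h_{u}^{-1} h_{p},
% assuming h_u is nonsingular; an affine h recovers the linear
% special case handled implicitly in Hong's code.
```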
>> I'm not very familiar with the TSForward interface, Hong can elaborate
>> more. But my gut feeling is that the public API for cost functions is
>> a duplicate of the one used by TSAdjoint. TLMTS (which is my TS that solves
>> the tangent linear model), reuses the callbacks set by TSSetGradientDAE and
>> TSSetGradientIC. Hong, do you also need to integrate some quadrature
>> variable in your TSForward code?
>>
>
> For people like me who do not have all the terminology down, is the Tangent
> Linear Model (TLM) just the forward sensitivities?
Yes.
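For completeness (a standard-definition sketch, notation as in the DAE above): with S(t) the sensitivity matrix, differentiating the DAE with respect to the parameters gives the tangent linear model, whose columns are exactly the forward sensitivities.

```latex
% Tangent linear model: differentiate F(t, u, \dot{u}; p) = 0 w.r.t. p,
% with S(t) = \partial u(t) / \partial p:
F_{\dot{u}}\,\dot{S} + F_{u}\,S + F_{p} = 0,
\qquad
S(0) = \frac{\partial u(0)}{\partial p}.
```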