[petsc-dev] Discussion about time-dependent optimization moved from PR
Jed Brown
jed at jedbrown.org
Sun Oct 15 12:40:44 CDT 2017
Stefano Zampini <stefano.zampini at gmail.com> writes:
> If anyone wants to know what this discussion is about, see
> https://bitbucket.org/petsc/petsc/pull-requests/766/support-for-pde-constrained-optimization
>
> I'll try to summarize the interfaces here. Hong's code API is labeled with
> H, mine with S.
>
> Both methods support cost functions (i.e., objective functions given by a
> time integral): H TSSetCostIntegrand(), S TSSetObjective().
> With TSSetCostIntegrand you set a single function that computes numcost
> cost integrals; TSSetObjective instead appends to a list.
It'd have to be TSAddObjective if you're building a list. Convention is
that Set replaces what might have been there before.
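The naming convention can be sketched in plain C (the types and function names below are illustrative only, not PETSc API): "Set" replaces whatever callback was registered before, while "Add" appends to a growing list.

```c
#include <stdlib.h>
#include <assert.h>

/* Hypothetical sketch of the Set-vs-Add convention: ObjListSet replaces
   any previously registered objective; ObjListAdd appends to the list. */
typedef double (*ObjFn)(double t, const double *u);

typedef struct {
  ObjFn *objs;   /* array of registered objective callbacks */
  int    n;      /* number of callbacks currently registered */
} ObjList;

static void ObjListSet(ObjList *l, ObjFn f)
{
  free(l->objs);                         /* discard anything set before */
  l->objs    = (ObjFn*)malloc(sizeof(ObjFn));
  l->objs[0] = f;
  l->n       = 1;
}

static void ObjListAdd(ObjList *l, ObjFn f)
{
  l->objs = (ObjFn*)realloc(l->objs, (size_t)(l->n + 1) * sizeof(ObjFn));
  l->objs[l->n++] = f;                   /* append, keeping earlier entries */
}
```

Under this convention, calling the Add form twice leaves two objectives registered, while a subsequent Set collapses the list back to one.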
In the case of multiple objectives, there may be a performance reason to
amortize evaluation of several at once, though the list interface is
convenient. Consider common objectives such as lift and drag on
different surfaces of a fluids simulation, or stress/strain at certain
critical joints in a structure. Although these have some locality, it's
reasonable to assume that the state dependence will quickly become
global, so make no attempt to handle sparse representations of the
adjoint vectors lambda.
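The amortization argument can be sketched as follows. The function name and signature are hypothetical, not PETSc API; the point is that, in the spirit of TSSetCostIntegrand's numcost argument, one callback fills all integrand values in a single sweep over the state rather than traversing it once per objective.

```c
#include <assert.h>

/* Hypothetical sketch: evaluate several cost integrands at once.
   r[0] and r[1] stand in for, e.g., lift- and drag-like quantities;
   both are accumulated in the same pass over the state vector u. */
static void CostIntegrands(double t, const double *u, int nstate,
                           int numcost, double *r)
{
  (void)t; (void)numcost;           /* numcost assumed 2 in this sketch */
  double lift = 0.0, drag = 0.0;    /* stand-ins for surface integrals */
  for (int i = 0; i < nstate; i++) {
    lift += u[i];                   /* both quantities share one sweep */
    drag += u[i] * u[i];
  }
  r[0] = lift;
  r[1] = drag;
}
```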
> The prototype for the function evaluation is similar, except that I also
> carry along a vector which stores the current values of the parameters.
>
> Point-form functionals (i.e., objective functions that are not integrated
> over time, but just sampled at a given time) drive the initialization of
> the adjoint variables, and they are not supported explicitly in Hong's
> code. S: TSSetObjective()
How are parameters accessed in TSComputeRHSFunction? It looks like
they're coming out of the context. Why should this be different? (If
parameters need to go into a Vec, we could do that, but it comes at a
readability and possibly parallel cost if the global Vec needs to be
communicated to local vectors.)
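The context pattern being referred to can be sketched like this (types and names are illustrative, not PETSc API): parameters live in the user context that is handed to the RHS callback as an opaque pointer, the same way the void *ctx of TSSetRHSFunction works.

```c
#include <assert.h>

/* Hypothetical application context carrying the model parameters. */
typedef struct {
  double mu;                          /* a model parameter */
} AppCtx;

/* RHS callback in the style of a TS RHS function: the parameters come
   out of the context rather than out of a separate Vec argument. */
static int RHSFunction(double t, const double *u, double *f, int n, void *ctx)
{
  (void)t;
  AppCtx *app = (AppCtx*)ctx;         /* parameters retrieved from ctx */
  for (int i = 0; i < n; i++)
    f[i] = -app->mu * u[i];           /* du/dt = -mu * u */
  return 0;
}
```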
> Both methods need the Jacobian of the DAE wrt the parameters: H
> TSAdjointSetRHSJacobian(), S TSSetGradientDAE()
>
> Initial condition dependence on the parameters is implicitly computed in
> Hong's code (limited to linear dependence on all the variables);
How so? Once the user gets \lambda(time=0), they can apply the chain
rule to produce any dependency on the parameter vector?
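The chain rule in question can be sketched as follows, assuming the initial state is an explicit function u(0) = g(p) of the parameters (the symbol g is introduced here for illustration):

```latex
% Sketch: contribution of initial-condition dependence to the gradient,
% assuming u(0) = g(p) explicitly; \lambda(0) is the adjoint at time 0.
\frac{\mathrm{d}J}{\mathrm{d}p}
  \mathrel{+}= \left(\frac{\partial g}{\partial p}\right)^{\!T} \lambda(0)
```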
> instead I have TSSetGradientIC which is a general way to express
> initial condition dependence by an implicit function.
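One reading of such an implicit formulation, stated generically: if the initial state is defined by a relation h(u(0), p) = 0 (h being an illustrative name here), the sensitivity of the initial condition follows from the implicit function theorem:

```latex
% Sketch: implicit initial condition h(u(0), p) = 0, with
% \partial h / \partial u(0) assumed invertible.
\frac{\partial u(0)}{\partial p}
  = -\left(\frac{\partial h}{\partial u(0)}\right)^{-1}
     \frac{\partial h}{\partial p}
```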
>
> I'm not very familiar with the TSForward interface, Hong can elaborate
> more. But my gut feeling is that the public API for cost functions is
> a duplicate of the one used by TSAdjoint. TLMTS (which is my TS that solves
> the tangent linear model), reuses the callbacks set by TSSetGradientDAE and
> TSSetGradientIC. Hong, do you also need to integrate some quadrature
> variable in your TSForward code?
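For reference, the tangent linear model that a forward-sensitivity integrator solves can be sketched for the ODE form of the problem (this is the standard textbook statement, not a claim about either implementation):

```latex
% Sketch for \dot{u} = f(u, p, t): the sensitivity matrix
% S = \partial u / \partial p satisfies the tangent linear model
\dot{S} = \frac{\partial f}{\partial u}\, S + \frac{\partial f}{\partial p},
\qquad
S(0) = \frac{\partial u(0)}{\partial p}
```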