[petsc-users] Using matrix-free method for SNES problems with an explicit Jacobian
Matthew Knepley
knepley at gmail.com
Wed May 13 17:05:40 CDT 2015
On Wed, May 13, 2015 at 4:59 PM, Justin Chang <jychang48 at gmail.com> wrote:
> Okay, that makes sense. The reason I ask is that I am solving the
> following convex optimization problem for a linear diffusion problem:
>
> min 0.5*x^T*A*x + x^T*g
> s.t. 0 <= x <= 1
>
> Using Tao's BLMVM solver, the objective function and gradient vector read
> as follows:
>
> f = 0.5*x^T*A*x + x^T*(r - A*x_0)
> g = A*(x - x_0) + r
>
> where A, x_0, and r are the Jacobian, initial guess, and residual,
> respectively. To compute these, I have TaoSetObjectiveAndGradientRoutine()
> invoke the following function:
>
> #undef __FUNCT__
> #define __FUNCT__ "FormFunctionGradient"
> /* Computes f(x) = 0.5*x^T*A*x + x^T*(r - A*x_0) and its gradient
>    g(x) = A*x + (r - A*x_0); user->f holds the constant (r - A*x_0). */
> PetscErrorCode FormFunctionGradient(Tao tao, Vec x, PetscReal *f, Vec g,
> void *ctx)
> {
>   AppCtx         *user = (AppCtx*)ctx;
>   PetscScalar    xtHx;
>   PetscErrorCode ierr;
>
>   PetscFunctionBegin;
>   ierr = MatMult(user->A,x,g);CHKERRQ(ierr);    /* g <- A*x */
>   ierr = VecDot(x,g,&xtHx);CHKERRQ(ierr);       /* xtHx <- x^T*A*x */
>   ierr = VecDot(x,user->f,f);CHKERRQ(ierr);     /* *f <- x^T*(r - A*x_0) */
>   *f += 0.5*xtHx;
>   ierr = VecAXPY(g,1.0,user->f);CHKERRQ(ierr);  /* g <- g + (r - A*x_0) */
>   PetscFunctionReturn(0);
> }
>
> where user->f = (r - A*x_0). The timing results from -log_summary seem to
> indicate that MatMult(user->A,x,g) is a bottleneck within TaoSolve(), so I
> was wondering if I could somehow make the computation of Vec g matrix-free,
> considering that user->A is constant. The problem gets worse if I decide
> to use TRON. Or is this unavoidable? (Jason, any thoughts?)
>
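For context, the Tao setup corresponding to the above might look like the
following minimal sketch (tao, x, lb, and ub are illustrative names, not
taken from the message):

/* Solve min 0.5*x^T*A*x + x^T*(r - A*x_0) s.t. 0 <= x <= 1 with BLMVM. */
Tao tao;
ierr = TaoCreate(PETSC_COMM_WORLD,&tao);CHKERRQ(ierr);
ierr = TaoSetType(tao,TAOBLMVM);CHKERRQ(ierr);
ierr = TaoSetInitialVector(tao,x);CHKERRQ(ierr);        /* x = x_0 */
ierr = TaoSetVariableBounds(tao,lb,ub);CHKERRQ(ierr);   /* lb = 0, ub = 1 */
ierr = TaoSetObjectiveAndGradientRoutine(tao,FormFunctionGradient,(void*)&user);CHKERRQ(ierr);
ierr = TaoSolve(tao);CHKERRQ(ierr);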
If you have a first-order discretization, it's still better to form the
matrix, since it uses fewer flops on repeated application.
Thanks,
Matt
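For readers who want to try the matrix-free route anyway, PETSc's MATSHELL
type provides the hook: you register your own multiply routine and the shell
matrix can then stand in for user->A. A minimal sketch, where UserApplyA()
is a hypothetical routine that applies the discretized diffusion operator
without a stored matrix:

extern PetscErrorCode UserApplyA(AppCtx*,Vec,Vec);  /* hypothetical */

/* y <- A*x without a stored A */
PetscErrorCode MyMatMult(Mat A, Vec x, Vec y)
{
  AppCtx         *user;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = MatShellGetContext(A,(void**)&user);CHKERRQ(ierr);
  ierr = UserApplyA(user,x,y);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

/* Create the shell with the same local (m,n) and global (M,N) sizes as A
   and register the multiply; the shell can then replace user->A above. */
Mat S;
ierr = MatCreateShell(PETSC_COMM_WORLD,m,n,M,N,(void*)user,&S);CHKERRQ(ierr);
ierr = MatShellSetOperation(S,MATOP_MULT,(void(*)(void))MyMatMult);CHKERRQ(ierr);

Whether this wins depends on the discretization; as noted above, for a
first-order discretization the assembled MatMult is usually cheaper per
application.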
> Thanks,
> Justin
>
> On Wed, May 13, 2015 at 3:15 PM, Matthew Knepley <knepley at gmail.com>
> wrote:
>
>> On Wed, May 13, 2015 at 2:54 PM, Justin Chang <jychang48 at gmail.com>
>> wrote:
>>
>>> Hello everyone,
>>>
>>> From what I read online, one can use a matrix-free method with the
>>> conjugate gradient method for a symmetric positive definite problem.
>>> Some of the PETSc SNES examples, like ex12, compute the Jacobian A
>>> explicitly, but I was wondering if it's possible to convert this
>>> problem into matrix-free form, that is, to compute the product A*x
>>> directly as a vector without first forming A and then invoking MatMult().
>>>
>>
>> Yes, Jed advocates this all the time, but there is some infrastructure
>> you want in place for this to be really effective. For quadratic and
>> higher orders, it makes sense to compute the action of your operator
>> matrix-free (MF). This is done by defining something that looks a lot
>> like your residual evaluation (element vector in and out) but computes
>> the linearization. However, you usually still need a low-order explicit
>> matrix for preconditioning (a sketch of that pairing follows below). I
>> have not spent time hooking that all together.
>>
>> Thanks,
>>
>> Matt
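A minimal sketch of the operator/preconditioner split described above,
assuming J is a MATSHELL applying the high-order linearization (as in the
shell sketch earlier) and P is an assembled low-order matrix used only to
build the preconditioner (all names illustrative):

/* Krylov matvecs use the matrix-free J; the preconditioner is built from
   the assembled low-order P. FormJacobian would refresh the linearization
   point for J and (re)assemble P. */
ierr = SNESSetJacobian(snes,J,P,FormJacobian,&user);CHKERRQ(ierr);

Alternatively, the option -snes_mf_operator makes PETSc apply the Jacobian
matrix-free by finite-differencing the residual, while still preconditioning
with the user-assembled matrix.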
>>
>>
>>> Thanks,
>>> Justin
>>>
>>
>>
>>
>>
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener