[petsc-users] PetscDSSetJacobianPreconditioner causing DIVERGED_LINE_SEARCH for multi-field problem

Matthew Knepley knepley at gmail.com
Thu Mar 3 07:21:38 CST 2016


On Thu, Mar 3, 2016 at 6:20 AM, Sander Arens <Sander.Arens at ugent.be> wrote:

> Ok, I forgot to call SNESSetJacobian(snes, J, P, NULL, NULL) with J != P,
> which caused the mass matrix to be written into the (otherwise zero) (1,1)
> block of the Jacobian and made the line search fail. However, after fixing
> that and trying to solve with FieldSplit using LU factorization for the
> (0,0) block, it failed because there were zero pivots in every row.
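>
> For reference, a minimal sketch of the corrected setup, assuming J and P
> are both created from the DM (passing NULL for the callback and context
> leaves the Jacobian routines provided through the DM in place):
>
>   Mat J, P;
>
>   ierr = DMCreateMatrix(dm, &J);CHKERRQ(ierr);
>   ierr = DMCreateMatrix(dm, &P);CHKERRQ(ierr);
>   /* Distinct Amat and Pmat, so the (1,1) mass term only goes into P */
>   ierr = SNESSetJacobian(snes, J, P, NULL, NULL);CHKERRQ(ierr);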
>
> Anyway, I found out that attaching the mass matrix to the Lagrange
> multiplier field also worked.
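>
> A rough sketch of that setup; the pointwise function below is written
> against the PetscDS Jacobian pointwise signature of current PETSc, so the
> exact argument list may differ slightly on the next branch:
>
>   /* (1,1) block of the preconditioner: pressure mass matrix */
>   static void g0_pre_mass_pp(PetscInt dim, PetscInt Nf, PetscInt NfAux,
>                              const PetscInt uOff[], const PetscInt uOff_x[],
>                              const PetscScalar u[], const PetscScalar u_t[],
>                              const PetscScalar u_x[],
>                              const PetscInt aOff[], const PetscInt aOff_x[],
>                              const PetscScalar a[], const PetscScalar a_t[],
>                              const PetscScalar a_x[],
>                              PetscReal t, PetscReal u_tShift,
>                              const PetscReal x[], PetscInt numConstants,
>                              const PetscScalar constants[], PetscScalar g0[])
>   {
>     g0[0] = 1.0;
>   }
>
>   ierr = PetscDSSetJacobianPreconditioner(prob, 1, 1, g0_pre_mass_pp,
>                                           NULL, NULL, NULL);CHKERRQ(ierr);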
>
> Another related question for my elasticity problem: after creating the
> rigid body modes with DMPlexCreateRigidBody and attaching them to the
> displacement field, does the matrix block size of the (0,0) block still
> have to be set for good performance with GAMG? If so, how can I do this?
>

Yes, it should be enough to set the block size of the preconditioner matrix.
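
A minimal sketch (assuming P is the preconditioner Mat and dim is the
number of displacement components; MatSetBlockSize requires the local sizes
to be divisible by the block size, so depending on the layout it may have
to be applied to the (0,0) submatrix instead):

  ierr = MatSetBlockSize(P, dim);CHKERRQ(ierr);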

  Matt


> Thanks,
> Sander
>
> On 2 March 2016 at 12:25, Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Wed, Mar 2, 2016 at 5:13 AM, Sander Arens <Sander.Arens at ugent.be>
>> wrote:
>>
>>> Hi,
>>>
>>> I'm trying to set a mass matrix preconditioner for the Schur complement
>>> of an incompressible finite elasticity problem. I tried using the command
>>> PetscDSSetJacobianPreconditioner(prob, 1, 1, g0_pre_mass_pp, NULL, NULL,
>>> NULL) (field 1 is the Lagrange multiplier field).
>>> However, this causes a DIVERGED_LINE_SEARCH due to a NaN or Inf in the
>>> function evaluation after Newton iteration 1. (Btw, I'm using the next
>>> branch).
>>>
>>> Is this because I didn't use PetscDSSetJacobianPreconditioner for the
>>> other blocks (which then use the Jacobian itself for preconditioning)? If
>>> so, how can I tell PETSc to use the Jacobian for those blocks?
>>>
>>
>> 1) I put that code in very recently and do not yet have sufficient tests,
>> so it may be buggy
>>
>> 2) If you are using FieldSplit, you can control which blocks come from A
>> and which come from the preconditioner P:
>>
>>
>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCFieldSplitSetDiagUseAmat.html#PCFieldSplitSetDiagUseAmat
>>
>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCFieldSplitSetOffDiagUseAmat.html#PCFieldSplitSetOffDiagUseAmat
>>
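>> For example, from the options database (or the matching calls in code,
>> assuming pc is the PCFIELDSPLIT object):
>>
>>   -pc_fieldsplit_diag_use_amat        (diagonal blocks taken from Amat)
>>   -pc_fieldsplit_off_diag_use_amat    (off-diagonal blocks taken from Amat)
>>
>>   PCFieldSplitSetDiagUseAmat(pc, PETSC_TRUE);
>>   PCFieldSplitSetOffDiagUseAmat(pc, PETSC_TRUE);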
>>
>>> I guess that when using PetscDSSetJacobianPreconditioner, the
>>> preconditioner matrix is recomputed at every Newton step, which is not
>>> ideal for a constant mass matrix. How can I avoid recomputing it at every
>>> Newton iteration?
>>>
>>
>> Maybe we need another flag like
>>
>>
>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/SNES/SNESSetLagPreconditioner.html
>>
>> or we need to expand
>>
>>
>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/SNES/SNESSetLagJacobian.html
>>
>> to separately cover the preconditioner matrix. However, both matrices are
>> computed by one call, so this would involve interface changes to user code,
>> which we do not like to do. Right now it seems like a small optimization,
>> so I would want to wait and see whether it would really be meaningful.
>>
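>> For reference, a sketch of how the existing lag interface is used today;
>> since the Jacobian callback assembles Amat and Pmat together, the lag
>> applies to both matrices at once:
>>
>>   /* Reassemble the Jacobian (and the preconditioner matrix with it)
>>      only every second Newton iteration */
>>   ierr = SNESSetLagJacobian(snes, 2);CHKERRQ(ierr);
>>   /* Never rebuild the preconditioner after the first setup */
>>   ierr = SNESSetLagPreconditioner(snes, -1);CHKERRQ(ierr);
>>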
>>   Thanks,
>>
>>     Matt
>>
>>
>>> Thanks,
>>> Sander
>>>
>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener