[petsc-users] PetscDSSetJacobianPreconditioner causing DIVERGED_LINE_SEARCH for multi-field problem
Sander Arens
Sander.Arens at UGent.be
Mon Mar 21 03:49:28 CDT 2016
Matt,
I think the problem is in DMCreateSubDM_Section_Private. The block size can
be counted there (much as in DMCreateMatrix_Plex) and then passed
down to the IS. I can create a pull request for this if you'd like (or
should I just add it to my other pull request for the Neumann BCs)?
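
Something like the following is what I have in mind (just a sketch;
"section", "field", and "subis" stand in for the objects already available
in that routine):

  PetscErrorCode ierr;
  PetscInt       nc;

  /* A field with nc components on every point gives block size nc,
     by the same reasoning DMCreateMatrix_Plex uses. */
  ierr = PetscSectionGetFieldComponents(section, field, &nc);CHKERRQ(ierr);
  /* Pass the block size down to the IS so FieldSplit can propagate it. */
  ierr = ISSetBlockSize(subis, nc);CHKERRQ(ierr);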
On 3 March 2016 at 18:02, Sander Arens <Sander.Arens at ugent.be> wrote:
> Ok, thanks for clearing that up!
>
> On 3 March 2016 at 18:00, Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Thu, Mar 3, 2016 at 10:48 AM, Sander Arens <Sander.Arens at ugent.be>
>> wrote:
>>
>>> So is the block size here referring to the dimension of the
>>> near-nullspace, or does it have something to do with the coarsening?
>>>
>>> I would have thought to also see the block size in this part:
>>>
>>
>> Dang, that is a problem. It should have the correct block size when
>> FieldSplit pulls it out of the overall matrix. This should be
>> passed down using the block size of the IS. Something has been broken
>> here. I will put it on my list of things to look at.
>>
>> Thanks,
>>
>> Matt
>>
>>
>>> Mat Object: 1 MPI processes
>>>   type: seqaij
>>>   rows=1701, cols=1701
>>>   total: nonzeros=109359, allocated nonzeros=109359
>>>   total number of mallocs used during MatSetValues calls =0
>>>   has attached near null space
>>>   using I-node routines: found 567 nodes, limit used is 5
>>>
>>> Thanks,
>>> Sander
>>>
>>>
>>> On 3 March 2016 at 17:29, Matthew Knepley <knepley at gmail.com> wrote:
>>>
>>>> You can see the block size in the output
>>>>
>>>> Mat Object: 1 MPI processes
>>>>   type: seqaij
>>>>   rows=12, cols=12, bs=6
>>>>   total: nonzeros=144, allocated nonzeros=144
>>>>   total number of mallocs used during MatSetValues calls =0
>>>>   using I-node routines: found 3 nodes, limit used is 5
>>>> linear system matrix = precond matrix:
>>>> Mat Object: 1 MPI processes
>>>>   type: seqaij
>>>>   rows=12, cols=12, bs=6
>>>>   total: nonzeros=144, allocated nonzeros=144
>>>>   total number of mallocs used during MatSetValues calls =0
>>>>   using I-node routines: found 3 nodes, limit used is 5
>>>>
>>>> Thanks,
>>>>
>>>> Matt
>>>>
>>>> On Thu, Mar 3, 2016 at 10:28 AM, Sander Arens <Sander.Arens at ugent.be>
>>>> wrote:
>>>>
>>>>> Sure, here it is.
>>>>>
>>>>> Thanks,
>>>>> Sander
>>>>>
>>>>> On 3 March 2016 at 15:33, Matthew Knepley <knepley at gmail.com> wrote:
>>>>>
>>>>>> On Thu, Mar 3, 2016 at 8:05 AM, Sander Arens <Sander.Arens at ugent.be>
>>>>>> wrote:
>>>>>>
>>>>>>> Yes, the matrix is created by one of the assembly functions in Plex,
>>>>>>> so that answers my question. I was confused by this because I couldn't see
>>>>>>> this information when using -snes_view.
>>>>>>>
>>>>>>
>> Hmm, then something is wrong, because the block size should be printed
>> for the matrix at the end of the solver in -snes_view. Can you
>> send that to me?
>>>>>>
>>>>>> Thanks,
>>>>>>
>>>>>> Matt
>>>>>>
>>>>>>
>>>>>>> Thanks for the helpful replies,
>>>>>>> Sander
>>>>>>>
>>>>>>> On 3 March 2016 at 14:52, Matthew Knepley <knepley at gmail.com> wrote:
>>>>>>>
>>>>>>>> On Thu, Mar 3, 2016 at 7:49 AM, Sander Arens <Sander.Arens at ugent.be
>>>>>>>> > wrote:
>>>>>>>>
>>>>>>>>> And how can I do this? When I look at all the options with
>>>>>>>>> -help, strangely enough I can only find -fieldsplit_pressure_mat_block_size
>>>>>>>>> and not -fieldsplit_displacement_mat_block_size.
>>>>>>>>>
>>>>>>>>
>>>>>>>> Maybe I am missing something here. The matrix from which you
>>>>>>>> calculate the preconditioner using GAMG must be created somewhere. Why
>>>>>>>> is the block size not specified at creation time? If the matrix is
>>>>>>>> created by one of the assembly functions in Plex, and the PetscSection has
>>>>>>>> a number of field components, then the block size will already be
>>>>>>>> set.
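>>>>>>>>
>>>>>>>> For instance (a sketch; "s" is the PetscSection and "dim" the number
>>>>>>>> of displacement components, both illustrative names):
>>>>>>>>
>>>>>>>>   PetscErrorCode ierr;
>>>>>>>>   Mat            J;
>>>>>>>>
>>>>>>>>   /* Declare dim components for field 0; the Plex-created matrix
>>>>>>>>      then carries bs = dim automatically. */
>>>>>>>>   ierr = PetscSectionSetFieldComponents(s, 0, dim);CHKERRQ(ierr);
>>>>>>>>   ierr = DMCreateMatrix(dm, &J);CHKERRQ(ierr);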
>>>>>>>>
>>>>>>>> Thanks,
>>>>>>>>
>>>>>>>> Matt
>>>>>>>>
>>>>>>>>
>>>>>>>>> Thanks,
>>>>>>>>> Sander
>>>>>>>>>
>>>>>>>>> On 3 March 2016 at 14:21, Matthew Knepley <knepley at gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>>> On Thu, Mar 3, 2016 at 6:20 AM, Sander Arens <
>>>>>>>>>> Sander.Arens at ugent.be> wrote:
>>>>>>>>>>
>>>>>>>>>>> Ok, I forgot to call SNESSetJacobian(snes, J, P, NULL, NULL)
>>>>>>>>>>> with J != P, which caused the mass matrix to be written into the
>>>>>>>>>>> (otherwise zero) (1,1) block of the Jacobian and made the line
>>>>>>>>>>> search fail.
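>>>>>>>>>>>
>>>>>>>>>>> (For the archives, the fix is essentially this sketch, with J and
>>>>>>>>>>> P created as separate Mats:)
>>>>>>>>>>>
>>>>>>>>>>>   Mat J, P;
>>>>>>>>>>>
>>>>>>>>>>>   /* Distinct Mats: the mass-matrix terms now go into the
>>>>>>>>>>>      preconditioner P instead of the (1,1) block of J. */
>>>>>>>>>>>   ierr = DMCreateMatrix(dm, &J);CHKERRQ(ierr);
>>>>>>>>>>>   ierr = DMCreateMatrix(dm, &P);CHKERRQ(ierr);
>>>>>>>>>>>   ierr = SNESSetJacobian(snes, J, P, NULL, NULL);CHKERRQ(ierr);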
>>>>>>>>>>> However, after fixing that and trying to solve with FieldSplit
>>>>>>>>>>> using LU factorization for the (0,0) block, it failed because of
>>>>>>>>>>> zero pivots in all rows.
>>>>>>>>>>>
>>>>>>>>>>> Anyway, I found out that attaching the mass matrix to the
>>>>>>>>>>> Lagrange multiplier field also worked.
>>>>>>>>>>>
>>>>>>>>>>> Another related question for my elasticity problem: after
>>>>>>>>>>> creating the rigid body modes with DMPlexCreateRigidBody and attaching
>>>>>>>>>>> them to the displacement field, does the matrix block size of the (0,0)
>>>>>>>>>>> block still have to be set for good performance with GAMG? If so, how
>>>>>>>>>>> can I do this?
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> Yes, it should be enough to set the block size of the
>>>>>>>>>> preconditioner matrix.
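>>>>>>>>>>
>>>>>>>>>> Something along these lines (a sketch; P is your preconditioner
>>>>>>>>>> Mat and dim the spatial dimension):
>>>>>>>>>>
>>>>>>>>>>   MatNullSpace nullsp;
>>>>>>>>>>
>>>>>>>>>>   ierr = DMPlexCreateRigidBody(dm, &nullsp);CHKERRQ(ierr);
>>>>>>>>>>   ierr = MatSetNearNullSpace(P, nullsp);CHKERRQ(ierr);
>>>>>>>>>>   ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);
>>>>>>>>>>   /* GAMG reads the block size off the matrix it coarsens */
>>>>>>>>>>   ierr = MatSetBlockSize(P, dim);CHKERRQ(ierr);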
>>>>>>>>>>
>>>>>>>>>> Matt
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>> Thanks,
>>>>>>>>>>> Sander
>>>>>>>>>>>
>>>>>>>>>>> On 2 March 2016 at 12:25, Matthew Knepley <knepley at gmail.com>
>>>>>>>>>>> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> On Wed, Mar 2, 2016 at 5:13 AM, Sander Arens <
>>>>>>>>>>>> Sander.Arens at ugent.be> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> Hi,
>>>>>>>>>>>>>
>>>>>>>>>>>>> I'm trying to set a mass matrix preconditioner for the Schur
>>>>>>>>>>>>> complement of an incompressible finite elasticity problem. I tried using
>>>>>>>>>>>>> the command PetscDSSetJacobianPreconditioner(prob, 1, 1, g0_pre_mass_pp,
>>>>>>>>>>>>> NULL, NULL, NULL) (field 1 is the Lagrange multiplier field).
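>>>>>>>>>>>>>
>>>>>>>>>>>>> For context, g0_pre_mass_pp is just the pointwise identity on
>>>>>>>>>>>>> that field, roughly like this (assuming the pointwise-Jacobian
>>>>>>>>>>>>> signature currently in next):
>>>>>>>>>>>>>
>>>>>>>>>>>>> void g0_pre_mass_pp(PetscInt dim, PetscInt Nf, PetscInt NfAux,
>>>>>>>>>>>>>     const PetscInt uOff[], const PetscInt uOff_x[],
>>>>>>>>>>>>>     const PetscScalar u[], const PetscScalar u_t[],
>>>>>>>>>>>>>     const PetscScalar u_x[],
>>>>>>>>>>>>>     const PetscInt aOff[], const PetscInt aOff_x[],
>>>>>>>>>>>>>     const PetscScalar a[], const PetscScalar a_t[],
>>>>>>>>>>>>>     const PetscScalar a_x[],
>>>>>>>>>>>>>     PetscReal t, PetscReal u_tShift, const PetscReal x[],
>>>>>>>>>>>>>     PetscScalar g0[])
>>>>>>>>>>>>> {
>>>>>>>>>>>>>   /* (1,1) block of the preconditioner: pressure mass matrix */
>>>>>>>>>>>>>   g0[0] = 1.0;
>>>>>>>>>>>>> }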
>>>>>>>>>>>>> However, this causes a DIVERGED_LINE_SEARCH due to a NaN or
>>>>>>>>>>>>> Inf in the function evaluation after Newton iteration 1. (Btw, I'm using
>>>>>>>>>>>>> the next branch.)
>>>>>>>>>>>>>
>>>>>>>>>>>>> Is this because I didn't use PetscDSSetJacobianPreconditioner
>>>>>>>>>>>>> for the other blocks (which then use the Jacobian itself for
>>>>>>>>>>>>> preconditioning)? If so, how can I tell PETSc to use the Jacobian
>>>>>>>>>>>>> for those blocks?
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> 1) I put that code in very recently and do not even have
>>>>>>>>>>>> sufficient tests, so it may be buggy.
>>>>>>>>>>>>
>>>>>>>>>>>> 2) If you are using FieldSplit, you can control which blocks
>>>>>>>>>>>> come from A and which come from the preconditioner P
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCFieldSplitSetDiagUseAmat.html#PCFieldSplitSetDiagUseAmat
>>>>>>>>>>>>
>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCFieldSplitSetOffDiagUseAmat.html#PCFieldSplitSetOffDiagUseAmat
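>>>>>>>>>>>>
>>>>>>>>>>>> or equivalently from the command line (these toggle whether the
>>>>>>>>>>>> diagonal/off-diagonal blocks come from A or from P):
>>>>>>>>>>>>
>>>>>>>>>>>>   -pc_fieldsplit_diag_use_amat
>>>>>>>>>>>>   -pc_fieldsplit_off_diag_use_amat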
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>> I guess when using PetscDSSetJacobianPreconditioner the
>>>>>>>>>>>>> preconditioner is recomputed at every Newton step, so for a constant mass
>>>>>>>>>>>>> matrix this might not be ideal. How can I avoid recomputing this at every
>>>>>>>>>>>>> Newton iteration?
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> Maybe we need another flag like
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/SNES/SNESSetLagPreconditioner.html
>>>>>>>>>>>>
>>>>>>>>>>>> or we need to expand
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/SNES/SNESSetLagJacobian.html
>>>>>>>>>>>>
>>>>>>>>>>>> to separately cover the preconditioner matrix. However, both
>>>>>>>>>>>> matrices are computed by one call so this would
>>>>>>>>>>>> involve interface changes to user code, which we do not like to
>>>>>>>>>>>> do. Right now it seems like a small optimization.
>>>>>>>>>>>> I would want to wait and see whether it would really be
>>>>>>>>>>>> meaningful.
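>>>>>>>>>>>>
>>>>>>>>>>>> In the meantime you can lag the whole preconditioner, e.g.
>>>>>>>>>>>>
>>>>>>>>>>>>   /* Rebuild the preconditioner only on every second Jacobian
>>>>>>>>>>>>      build; a lag of -1 means never rebuild it again. */
>>>>>>>>>>>>   ierr = SNESSetLagPreconditioner(snes, 2);CHKERRQ(ierr);
>>>>>>>>>>>>
>>>>>>>>>>>> (or -snes_lag_preconditioner 2 at the command line), though that
>>>>>>>>>>>> lags all blocks at once, not just the mass matrix.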
>>>>>>>>>>>>
>>>>>>>>>>>> Thanks,
>>>>>>>>>>>>
>>>>>>>>>>>> Matt
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>> Thanks,
>>>>>>>>>>>>> Sander
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> --
>>>>>>>>>>>> What most experimenters take for granted before they begin
>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which
>>>>>>>>>>>> their experiments lead.
>>>>>>>>>>>> -- Norbert Wiener
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>
>>
>>
>
>