[petsc-dev] DM not propagating into Fieldsplit when using external DMShell?

Matthew Knepley knepley at gmail.com
Mon Sep 24 14:31:31 CDT 2018


On Mon, Sep 24, 2018 at 2:48 PM Boris Boutkov <borisbou at buffalo.edu> wrote:

> Hello all,
>
> I am trying to solve a Stokes type problem using PETSc's Fieldsplit
> infrastructure while providing external GMG information coming from libMesh
> and am running into issues when using the two simultaneously.
>
> I have verified that FS + MG behaves as expected on its own (I can use
> -pc_type gamg on the velocity block just fine), and I have also checked that
> the external DMShell hierarchy is constructed as expected when using
> -pc_type mg in the absence of fieldsplit; yet when I combine the two I get
> errors from dm.c indicating that "This DM cannot coarsen".
>
> After a little digging, it appears that during PCApply_FieldSplit_Schur the
> DM which I provided to the SNES (and which is attached to the PC) is not
> propagating through to the sub PCs/KSPs: when inspecting the pc->dm data I
> can see the coarsen/refine hooks I provided, while the sub PCs and KSPs seem
> to lack those pointers.
>
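
For reference, the setup described above amounts to something like the
following sketch (AttachHierarchy, MyCoarsen, and the application context are
placeholder names, not PETSc or libMesh API; the point being that every
DMShell produced by coarsening has to carry the same hooks and context as the
fine-level one):

#include <petscsnes.h>
#include <petscdmshell.h>

/* Placeholder coarsen callback: each coarser DMShell it produces carries the
   same context and callbacks as the fine DM, so it can itself be coarsened
   and queried later. */
static PetscErrorCode MyCoarsen(DM fine, MPI_Comm comm, DM *coarse)
{
  void          *ctx;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  if (comm == MPI_COMM_NULL) comm = PetscObjectComm((PetscObject)fine);
  ierr = DMShellGetContext(fine, &ctx);CHKERRQ(ierr);
  ierr = DMShellCreate(comm, coarse);CHKERRQ(ierr);
  ierr = DMShellSetContext(*coarse, ctx);CHKERRQ(ierr);
  ierr = DMShellSetCoarsen(*coarse, MyCoarsen);CHKERRQ(ierr);
  /* also set interpolation/restriction and global vectors on *coarse here */
  PetscFunctionReturn(0);
}

/* Attach the fine-level DMShell to the SNES so PCMG can coarsen it;
   run with e.g. -pc_type fieldsplit -pc_fieldsplit_type schur
   -fieldsplit_0_pc_type mg */
static PetscErrorCode AttachHierarchy(SNES snes, void *appctx)
{
  DM             dm;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = DMShellCreate(PetscObjectComm((PetscObject)snes), &dm);CHKERRQ(ierr);
  ierr = DMShellSetContext(dm, appctx);CHKERRQ(ierr);
  ierr = DMShellSetCoarsen(dm, MyCoarsen);CHKERRQ(ierr);
  ierr = SNESSetDM(snes, dm);CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}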

1) PCApply is when the action of the preconditioner is computed. This is
not the right place for propagating DMs.

2) Could you be more specific about what should be different? It sounds
like you want "your" DM attached to the (0,0) block so that you can do GMG
on the Laplacian there.

3) We do not actually attach the DM proper. Rather, DMCreateSubDM() is
called to create a new DM with only the fields in the 0 block present. It is
likely that your DMShell does not implement this, so something default is
being done which does not work.

   Here is the call:

https://bitbucket.org/petsc/petsc/src/c793126dad5dfafbcc0ba07afda1a0495448c20f/src/ksp/pc/impls/fieldsplit/fieldsplit.c#lines-351

  and this is the function that eventually gets called for you:

https://bitbucket.org/petsc/petsc/src/c793126dad5dfafbcc0ba07afda1a0495448c20f/src/dm/interface/dmi.c#lines-85
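
If the DMShell does not provide this hook, one way to do so is to register a
createsubdm callback with DMShellSetCreateSubDM(). A rough sketch (the body is
schematic: the IS construction from your dof layout is omitted, and
MyCreateSubDM is just a placeholder name):

#include <petscdmshell.h>

static PetscErrorCode MyCreateSubDM(DM dm, PetscInt numFields, const PetscInt fields[], IS *is, DM *subdm)
{
  void          *ctx;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = DMShellGetContext(dm, &ctx);CHKERRQ(ierr);
  if (is) {
    *is = NULL; /* build the IS selecting 'fields' from the application's dof numbering */
  }
  if (subdm) {
    /* a sub-DMShell that still knows about the hierarchy for the selected fields */
    ierr = DMShellCreate(PetscObjectComm((PetscObject)dm), subdm);CHKERRQ(ierr);
    ierr = DMShellSetContext(*subdm, ctx);CHKERRQ(ierr);
    /* re-register the coarsen/refine/interpolation callbacks on *subdm here */
  }
  PetscFunctionReturn(0);
}

and register it on the DM handed to the SNES with
DMShellSetCreateSubDM(dm, MyCreateSubDM).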

What is not getting propagated properly for you?
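
One quick way to check, once the PC has been set up, is to walk the splits and
see which DM each inner KSP ended up with; something like the following
(CheckSplitDMs is just an ad hoc helper name):

#include <petscksp.h>
#include <petscdmshell.h>

/* For each split, report whether the inner KSP's DM is a DMShell and whether
   it still carries an application context. Call after KSPSetUp()/PCSetUp(). */
static PetscErrorCode CheckSplitDMs(PC pc)
{
  KSP           *subksp;
  PetscInt       n, i;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = PCFieldSplitGetSubKSP(pc, &n, &subksp);CHKERRQ(ierr);
  for (i = 0; i < n; i++) {
    DM        dm;
    PetscBool isshell;
    void     *ctx = NULL;

    ierr = KSPGetDM(subksp[i], &dm);CHKERRQ(ierr);
    ierr = PetscObjectTypeCompare((PetscObject)dm, DMSHELL, &isshell);CHKERRQ(ierr);
    if (isshell) {ierr = DMShellGetContext(dm, &ctx);CHKERRQ(ierr);}
    ierr = PetscPrintf(PetscObjectComm((PetscObject)pc), "split %D: DMShell %s, context %s\n",
                       i, isshell ? "yes" : "no", ctx ? "present" : "missing");CHKERRQ(ierr);
  }
  ierr = PetscFree(subksp);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}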

  Thanks,

     Matt


> I tried a naive fix by preempting the schurfactorization switch in
> PCApply_FieldSplit_Schur to extract the DM I provided and push it through
> to the other Schur components, a la:
>
> ierr = PCGetDM(pc,&dm);CHKERRQ(ierr);
> ierr = PCSetDM(kspA->pc,dm);CHKERRQ(ierr);
> ierr = KSPSetDM(kspA,dm);CHKERRQ(ierr);
>
> and likewise for the Upper and Lower PC/KSPs, but with such a change I
> still seem to lose the contexts which I attached to my DMs during the GMG
> setup phase, which in turn prevents PCSetUp_MG from completing successfully.
>
> As such, I'm wondering if there are any additional setup steps which I am
> forgetting to take into account here, or if there's something else I could
> try to accommodate this solver configuration.
>
> Thanks as always for any assistance,
>
> - Boris Boutkov
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/