[petsc-dev] DM not propagating into Fieldsplit when using external DMShell?
borisbou at buffalo.edu
Mon Sep 24 13:45:49 CDT 2018
I am trying to solve a Stokes-type problem using PETSc's Fieldsplit
infrastructure while providing external GMG information coming from libMesh,
and am running into issues when using the two simultaneously.
I have verified that Fieldsplit plus multigrid behaves as expected on its
own (I can use -pc_type gamg on the velocity block just fine), and I have
also checked that the external DMShell hierarchy works correctly with
-pc_type mg in the absence of fieldsplit; yet when I combine the two I get
errors from dm.c indicating "This DM cannot coarsen".
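For reference, the shell hierarchy is set up roughly as below. This is only a sketch with assumed names (MyLevelCtx, MyCoarsen, MyCreateInterpolation are placeholders for the libMesh-side callbacks and context, not code from my application):

```c
#include <petscdmshell.h>

/* Hypothetical per-level context; the real one wraps libMesh data. */
typedef struct {
  DM coarser;   /* next-coarser shell in the hierarchy */
} MyLevelCtx;

/* Coarsen hook: hand PCMG the next-coarser shell. */
static PetscErrorCode MyCoarsen(DM fine, MPI_Comm comm, DM *coarse)
{
  MyLevelCtx    *ctx;
  PetscErrorCode ierr;

  ierr = DMShellGetContext(fine, (void **)&ctx);CHKERRQ(ierr);
  *coarse = ctx->coarser;
  ierr = PetscObjectReference((PetscObject)*coarse);CHKERRQ(ierr);
  return 0;
}

/* Registration on each level's shell: */
/*   ierr = DMShellCreate(comm, &dm);CHKERRQ(ierr);                          */
/*   ierr = DMShellSetContext(dm, levelctx);CHKERRQ(ierr);                   */
/*   ierr = DMShellSetCoarsen(dm, MyCoarsen);CHKERRQ(ierr);                  */
/*   ierr = DMShellSetCreateInterpolation(dm, MyCreateInterpolation);CHKERRQ(ierr); */
```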
After a little digging, it appears that during PCApply_FieldSplit_Schur the
DM which I provided to SNES, and which is attached to the outer PC, is not
propagating through to the sub-PCs/KSPs: inspecting the pc->dm data I can
see the coarsen/refine hooks I provided, while the sub-PCs and KSPs lack
those pointers.
I tried a naive fix by preempting the schurfactorization switch in
PCApply_FieldSplit_Schur to extract the DM I provided and push it through
to the other Schur components, à la:
ierr = PCGetDM(pc,&dm);CHKERRQ(ierr);
ierr = PCSetDM(kspA->pc,dm);CHKERRQ(ierr);
ierr = KSPSetDM(kspA,dm);CHKERRQ(ierr);
and likewise for the upper and lower PC/KSPs; but even with this change I
still seem to lose the contexts which I attached to my DMs during the GMG
setup phase, which in turn prevents PCSetUp_MG from completing successfully.
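One alternative I have been considering is to set the DM on the split KSPs from the outside rather than from inside PCApply_FieldSplit_Schur. A sketch, assuming a velocity-block shell hierarchy named dmU (a name I am inventing here) and that the velocity split is split 0:

```c
/* After the fieldsplit PC has been set up, attach the velocity-block
 * DM to the A00 solve so PCMG can coarsen it, while keeping the
 * user-provided matrices (KSPSetDMActive PETSC_FALSE prevents the DM
 * from being used to build the operators). */
KSP           *subksp;
PetscInt       nsplits;
PetscErrorCode ierr;

ierr = PCFieldSplitGetSubKSP(pc, &nsplits, &subksp);CHKERRQ(ierr);
ierr = KSPSetDM(subksp[0], dmU);CHKERRQ(ierr);               /* velocity split */
ierr = KSPSetDMActive(subksp[0], PETSC_FALSE);CHKERRQ(ierr); /* operators stay user-set */
ierr = PetscFree(subksp);CHKERRQ(ierr);                      /* caller frees the array */
```

I am not sure whether this preserves the DMShell contexts any better than the in-place fix above, which is partly why I am asking.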
As such, I'm wondering whether there are any additional setup steps I am
forgetting to take into account here, or whether there is something else I
could try to accommodate this solver configuration.
Thanks as always for any assistance,
- Boris Boutkov