[petsc-users] Fwd: SNES with Fieldsplit + MatNest in parallel
barnafi
nabw91 at gmail.com
Mon Nov 8 09:59:35 CST 2021
Hi PETSc users,
I am testing a block preconditioner for a nonlinear problem, using
FEniCS for the matrix assembly, and I obtain correct results in serial
only (parallel diverges). The difficulty lies in using Nest matrices,
as I want a performant code... I already see astonishing gains in
serial.
The code seems simple enough: I have a UFL form from which I obtain
all the different blocks. Then, I create the SNES solver as follows:
--------------- SNESSolver.py -----------------
from dolfin import DirichletBC, PETScMatrix, PETScVector, assemble


class SNESProblem():
    def __init__(self, FFs, JJs, uu, pp, bbcs):
        self.Ls = FFs  # UFL forms for the residuals
        self.Js = JJs  # UFL forms for the Jacobian blocks
        self.bcs = bbcs
        # Homogenized boundary conditions for Newton iterations
        self.bcs0 = [DirichletBC(_bc) for _bc in bbcs]
        for bc in self.bcs0:
            bc.homogenize()
        self.u = uu
        self.p = pp

    def assemble_F(self, FF):
        Fu, Fp = FF.getNestSubVecs()
        assemble(self.Ls[0], tensor=PETScVector(Fu))
        assemble(self.Ls[1], tensor=PETScVector(Fp))
        for bc in self.bcs0:
            bc.apply(PETScVector(Fu))
        Fu.assemble()
        FF.assemble()

    def assemble_J(self, JJ):
        Juu = JJ.getNestSubMatrix(0, 0)
        Jup = JJ.getNestSubMatrix(0, 1)
        Jpu = JJ.getNestSubMatrix(1, 0)
        Jpp = JJ.getNestSubMatrix(1, 1)
        assemble(self.Js[0], tensor=PETScMatrix(Juu))
        assemble(self.Js[1], tensor=PETScMatrix(Jup))
        assemble(self.Js[2], tensor=PETScMatrix(Jpu))
        assemble(self.Js[3], tensor=PETScMatrix(Jpp))
        # Average diagonal value used on the BC rows (better performance)
        MM = Juu.getDiagonal().sum() / Juu.getSize()[0]
        for bc in self.bcs0:
            indexes = [*bc.get_boundary_values().keys()]
            Juu.zeroRows(indexes, diag=MM)
            Jup.zeroRows(indexes, diag=0.0)
        Juu.assemble()
        Jup.assemble()
        JJ.assemble()

    def F(self, snes, xx, FF):
        xu, xp = xx.getNestSubVecs()
        # Update the FEniCS vectors to keep track of ghost nodes
        xu.copy(self.u.vector().vec())
        xp.copy(self.p.vector().vec())
        self.u.vector().apply("")
        self.p.vector().apply("")
        self.assemble_F(FF)

    def J(self, snes, xx, JJ, PP):
        xu, xp = xx.getNestSubVecs()
        # Update the FEniCS vectors to keep track of ghost nodes
        xu.copy(self.u.vector().vec())
        xp.copy(self.p.vector().vec())
        self.u.vector().apply("")
        self.p.vector().apply("")
        self.assemble_J(JJ)
----------- SNESSolver.py end -------------
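For context, this class is hooked into the SNES roughly as follows (just a
sketch: Juu_mat, Jup_mat, Jpu_mat, Jpp_mat, Fu_vec and Fp_vec are placeholder
names for the PETSc objects obtained from a first assembly of the blocks, and
FFs, JJs, u, p, bcs are the same objects passed to the class):
------ SNES setup (sketch) ------
from petsc4py import PETSc

# Nest objects built from the already-assembled blocks (placeholder names)
J_mat = PETSc.Mat().createNest([[Juu_mat, Jup_mat],
                                [Jpu_mat, Jpp_mat]])  # no 'isrows'/'iscols'
F_vec = PETSc.Vec().createNest([Fu_vec, Fp_vec])
x_vec = J_mat.createVecRight()

problem = SNESProblem(FFs, JJs, u, p, bcs)
snes = PETSc.SNES().create(PETSc.COMM_WORLD)
snes.setFunction(problem.F, F_vec)  # SNES calls problem.F(snes, x, F)
snes.setJacobian(problem.J, J_mat)  # SNES calls problem.J(snes, x, J, P)
---------------------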
I then initialize the Jacobian matrices to create the Nest (passing neither
'isrows' nor 'iscols'), as in the sketch above, and then initialize the PC
from the SNES with
------ PC init ------
is_0, is_1 = J_mat.getNestISs()[0]
ksp = snes.getKSP()
pc = ksp.getPC()
pc.setType("fieldsplit")
pc.setFieldSplitIS(("0", is_0), ("1", is_1))
---------------------
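The split sub-solvers can then be configured through the options database; as
a rough illustration only (these particular solver choices are placeholders,
not necessarily what I run with):
------ PC options (sketch) ------
from petsc4py import PETSc

# Illustrative options; in practice these can be passed on the command line
opts = PETSc.Options()
opts["pc_fieldsplit_type"] = "additive"
opts["fieldsplit_0_ksp_type"] = "preonly"
opts["fieldsplit_0_pc_type"] = "gamg"
opts["fieldsplit_1_ksp_type"] = "preonly"
opts["fieldsplit_1_pc_type"] = "jacobi"
ksp.setFromOptions()
---------------------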
The current behavior is that in serial everything works as expected,
whereas in parallel not even the boundary conditions are applied
correctly. The two possible culprits I see are the ordering of the
dofs and the update of ghost nodes, but I have tried modifying the
code to account for both without success.
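In case it helps, one sanity check I can sketch (placeholder names; it just
compares the Nest index sets with the FEniCS dof ownership on each rank) is:
------ parallel check (sketch) ------
from petsc4py import PETSc

# Print local/global sizes of the Nest ISs and the dof ownership range per rank
is_0, is_1 = J_mat.getNestISs()[0]
print(PETSc.COMM_WORLD.rank, is_0.getSizes(), is_1.getSizes(),
      u.function_space().dofmap().ownership_range())
---------------------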
If anyone has had similar issues, I would appreciate any comments.
Thanks in advance.
Best,
NB