[petsc-users] Fwd: SNES with Fieldsplit + MatNest in parallel

Matthew Knepley knepley at gmail.com
Mon Nov 8 14:02:42 CST 2021


On Mon, Nov 8, 2021 at 11:00 AM barnafi <nabw91 at gmail.com> wrote:

> Hi PETSc users,
>
> I am testing a block preconditioner for a nonlinear problem, using
> FEniCS for matrix assembly, and I obtain correct results in serial
> only (in parallel the solver diverges). The difficulty lies in using
> Nest matrices, as I want performant code... I already see astonishing
> gains in serial.
>
> The code seems simple enough: I have a UFL form from which I obtain
> all the different blocks. Then I create a SNES problem class as follows:
>
>
> --------------- SNESSolver.py -----------------
> class SNESProblem():
>    def __init__(self, FFs, JJs, uu, pp, bbcs):
>        self.Ls = FFs # UFL form for residuals
>        self.Js = JJs # UFL form for jacobian
>        self.bcs = bbcs
>        # Homogenized boundary conditions for Newton iterations
>        self.bcs0 = [DirichletBC(_bc) for _bc in bbcs]
>        for bc in self.bcs0:
>            bc.homogenize()
>        self.u = uu
>        self.p = pp
>
>    def assemble_F(self, FF):
>        Fu, Fp = FF.getNestSubVecs()
>        assemble(self.Ls[0], tensor=PETScVector(Fu))
>        assemble(self.Ls[1], tensor=PETScVector(Fp))
>        for bc in self.bcs0:
>            bc.apply(PETScVector(Fu))
>        Fu.assemble()
>        FF.assemble()
>
>    def assemble_J(self, JJ):
>        Juu = JJ.getNestSubMatrix(0, 0)
>        Jup = JJ.getNestSubMatrix(0, 1)
>        Jpu = JJ.getNestSubMatrix(1, 0)
>        Jpp = JJ.getNestSubMatrix(1, 1)
>        assemble(self.Js[0], tensor=PETScMatrix(Juu))
>        assemble(self.Js[1], tensor=PETScMatrix(Jup))
>        assemble(self.Js[2], tensor=PETScMatrix(Jpu))
>        assemble(self.Js[3], tensor=PETScMatrix(Jpp))
>        MM = Juu.getDiagonal().sum() / Juu.getSize()[0]  # Better performance
>        for bc in self.bcs0:
>            indexes = [*bc.get_boundary_values().keys()]
>            Juu.zeroRows(indexes, diag=MM)
>            Jup.zeroRows(indexes, diag=0.0)
>        Juu.assemble()
>        Jup.assemble()
>        JJ.assemble()
>
>    def F(self, snes, xx, FF):
>        xu, xp = xx.getNestSubVecs()
>        # Update FEniCS vectors to keep track of ghost nodes
>        xu.copy(self.u.vector().vec())
>        xp.copy(self.p.vector().vec())
>        self.u.vector().apply("")
>        self.p.vector().apply("")
>        self.assemble_F(FF)
>
>    def J(self, snes, xx, JJ, PP):
>        xu, xp = xx.getNestSubVecs()
>        # Update FEniCS vectors to keep track of ghost nodes
>        xu.copy(self.u.vector().vec())
>        xp.copy(self.p.vector().vec())
>        self.u.vector().apply("")
>        self.p.vector().apply("")
>        self.assemble_J(JJ)
> ----------- SNESSolver.py end -------------
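>
> For completeness, this class is hooked into the SNES roughly as follows
> (a sketch only; F_vec, J_mat and x stand for the nested residual vector,
> Jacobian and solution vector, created as shown further below):
>
> ------ SNES setup (sketch) ------
> from petsc4py import PETSc
>
> problem = SNESProblem(FFs, JJs, uu, pp, bbcs)
> snes = PETSc.SNES().create(PETSc.COMM_WORLD)
> snes.setFunction(problem.F, F_vec)  # F_vec: nested residual vector
> snes.setJacobian(problem.J, J_mat)  # J_mat: nested Jacobian matrix
> snes.setFromOptions()
> snes.solve(None, x)                 # x: nested solution vector
> ---------------------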
>
> I then initialize the block Jacobian matrices and create the Nest from
> them (passing no 'isrows' nor 'iscols'; a rough sketch is included after
> the PC init below), and then initialize the PC from the SNES with
>
> ------ PC init ------
> is_0, is_1 = J_mat.getNestISs()[0]
> ksp = snes.getKSP()
> pc = ksp.getPC()
> pc.setType("fieldsplit")
> pc.setFieldSplitIS(("0", is_0), ("1", is_1))
> ---------------------
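>
> The Nest objects themselves are created along these lines (again only a
> sketch; the blocks are the same ones assembled in assemble_J/assemble_F,
> and PETSc is petsc4py's module as imported above):
>
> ------ Nest init (sketch) ------
> J_mat = PETSc.Mat().createNest([[Juu, Jup], [Jpu, Jpp]])  # no isrows/iscols
> F_vec = PETSc.Vec().createNest([Fu, Fp])
> J_mat.assemble()
> ---------------------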
>
> The current behavior is that in serial everything works as expected,
> whereas in parallel not even the boundary conditions are set up correctly.
> The two possible problems I see are the ordering of the dofs and the
> update of ghost nodes, but I have tried modifying the code to account
> for both without success.
>
> If anyone has had similar issues, I would appreciate any comments.
>

I am not adept enough at FEniCS to look at the code and debug it. However,
I would make a very small problem for 2 processes, say two cells on each,
and then print out is_0 and is_1. That should tell you whether you have the
right unknowns in each block. If so, then something is likely not correct in
your assembly routine, which is probably better debugged on the FEniCS list.
If they are not right, however, we can help get them right.
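
Something like this (an untested sketch, assuming J_mat is your nested
Jacobian) will print the locally owned indices of each IS on every rank:

  from petsc4py import PETSc

  is_rows, _ = J_mat.getNestISs()
  is_0, is_1 = is_rows
  rank = PETSc.COMM_WORLD.getRank()
  PETSc.Sys.syncPrint("[rank %d] is_0: %s" % (rank, is_0.getIndices()))
  PETSc.Sys.syncPrint("[rank %d] is_1: %s" % (rank, is_1.getIndices()))
  PETSc.Sys.syncFlush()

Run it with 2 processes and check that each field's indices match the dofs
you expect that rank to own.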

  Thanks,

    Matt


> Thanks in advance.
>
> Best,
> NB
>
>
>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/