[petsc-users] MatNest with Shell blocks for multiphysics

Jed Brown jed at jedbrown.org
Thu Jul 1 10:52:07 CDT 2021


Matteo Semplice <matteo.semplice at uninsubria.it> writes:

> Hi.
>
> We are designing a PETSc application that will employ a SNES solver on a
> multiphysics problem whose Jacobian will have a 2x2 block form, say
> A=[A00,A01;A10,A11]. We already have code for the top-left block A00 (a
> MatShell and a related shell preconditioner) that we wish to reuse. We
> could implement the other blocks as shells or assembled matrices. We'd
> also like to compare our method with existing ones, so we'd like to be
> quite flexible in the choice of KSP and PC within the SNES. (To this
> end, implementing an assembled version of A00 and the other blocks
> would be easy.)
>
> I am assuming that, in order to have one or more shell blocks, the full
> Jacobian should be a nested matrix, and I am wondering what the best
> way to design the code is.
>
> We are going to use DMDAs to manipulate Vecs for both variable sets, so
> the DMComposite approach of
> https://www.mcs.anl.gov/petsc/petsc-current/src/snes/tutorials/ex28.c.html
> is intriguing, but I have read in the comments that it has issues with
> the MATNEST type.

I think ex28 is a better way to organize the code. You can DMCreateMatrix() and then set the types/preallocation for the off-diagonal blocks of the MatNest. I think the comment is unclear and doesn't quite reflect what was intended and what originally worked (which was to assemble the off-diagonal blocks despite bad preallocation).

https://gitlab.com/petsc/petsc/-/commit/6bdeb4dbc27a59cf9af4930e08bd1f9937e47c2d

https://www.mcs.anl.gov/petsc/petsc-current/src/snes/tutorials/ex28.c.html#line410
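
Roughly, this could look like the following sketch (not code from ex28; the 1-D DMDA sizes and preallocation counts are placeholders, and since the off-diagonal blocks may be left empty by DMCreateMatrix() the sketch creates them explicitly and drops them, along with your existing shell for A00, into the MatNest with MatNestSetSubMat()):

#include <petscdmda.h>
#include <petscdmcomposite.h>

static PetscErrorCode CreateCoupledJacobian(MPI_Comm comm,Mat A00shell,Mat *J)
{
  DM             da0,da1,pack;
  Vec            v0,v1;
  PetscInt       m0,m1;
  Mat            A01,A10;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /* One DMDA per variable set; sizes and stencils are placeholders */
  ierr = DMDACreate1d(comm,DM_BOUNDARY_NONE,128,1,1,NULL,&da0);CHKERRQ(ierr);
  ierr = DMSetUp(da0);CHKERRQ(ierr);
  ierr = DMDACreate1d(comm,DM_BOUNDARY_NONE,128,1,1,NULL,&da1);CHKERRQ(ierr);
  ierr = DMSetUp(da1);CHKERRQ(ierr);

  ierr = DMCompositeCreate(comm,&pack);CHKERRQ(ierr);
  ierr = DMCompositeAddDM(pack,da0);CHKERRQ(ierr);
  ierr = DMCompositeAddDM(pack,da1);CHKERRQ(ierr);

  /* Ask the DMComposite for a MatNest so the blocks stay individually addressable */
  ierr = DMSetMatType(pack,MATNEST);CHKERRQ(ierr);
  ierr = DMCreateMatrix(pack,J);CHKERRQ(ierr);

  /* Reuse the existing shell as the (0,0) block; it must have the same
     parallel layout as the block it replaces */
  ierr = MatNestSetSubMat(*J,0,0,A00shell);CHKERRQ(ierr);

  /* Size the coupling blocks from the field Vec layouts and insert them;
     the preallocation counts (3 diagonal, 2 off-process) are placeholders */
  ierr = DMCreateGlobalVector(da0,&v0);CHKERRQ(ierr);
  ierr = DMCreateGlobalVector(da1,&v1);CHKERRQ(ierr);
  ierr = VecGetLocalSize(v0,&m0);CHKERRQ(ierr);
  ierr = VecGetLocalSize(v1,&m1);CHKERRQ(ierr);
  ierr = MatCreateAIJ(comm,m0,m1,PETSC_DETERMINE,PETSC_DETERMINE,3,NULL,2,NULL,&A01);CHKERRQ(ierr);
  ierr = MatCreateAIJ(comm,m1,m0,PETSC_DETERMINE,PETSC_DETERMINE,3,NULL,2,NULL,&A10);CHKERRQ(ierr);
  ierr = MatNestSetSubMat(*J,0,1,A01);CHKERRQ(ierr);
  ierr = MatNestSetSubMat(*J,1,0,A10);CHKERRQ(ierr);

  ierr = MatDestroy(&A01);CHKERRQ(ierr);
  ierr = MatDestroy(&A10);CHKERRQ(ierr);
  ierr = VecDestroy(&v0);CHKERRQ(ierr);
  ierr = VecDestroy(&v1);CHKERRQ(ierr);
  /* da0/da1/pack would normally be kept and attached via SNESSetDM(snes,pack) */
  PetscFunctionReturn(0);
}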


Note that if you're using DMDA and have collocated fields, you can skip all this complexity. And if you have a staggered discretization, consider DMStag. ex28 shows how to solve a coupled problem when there is no suitable structure to convey the relation between the discretizations.
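
For the collocated case, a minimal sketch (grid sizes and field names are placeholders): one DMDA carries both fields (dof = 2), DMCreateMatrix() then gives a single assembled Jacobian, and the block structure can still be recovered at solve time with -pc_type fieldsplit.

#include <petscdmda.h>

static PetscErrorCode CreateCollocatedDM(MPI_Comm comm,DM *dm,Mat *J)
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /* One structured grid, two collocated fields per node */
  ierr = DMDACreate2d(comm,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,
                      64,64,PETSC_DECIDE,PETSC_DECIDE,2,1,NULL,NULL,dm);CHKERRQ(ierr);
  ierr = DMSetUp(*dm);CHKERRQ(ierr);
  ierr = DMDASetFieldName(*dm,0,"u");CHKERRQ(ierr);
  ierr = DMDASetFieldName(*dm,1,"v");CHKERRQ(ierr);
  /* A single matrix holding all blocks; PCFIELDSPLIT can split it by field */
  ierr = DMCreateMatrix(*dm,J);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

With SNESSetDM(snes,dm) this works with the usual assembled preconditioners, and -pc_type fieldsplit splits by the named fields at runtime.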

> My next guess would be to create the four submatrices ahead of time and
> then insert them into a MatNest, like in the Stokes example of
> https://www.mcs.anl.gov/petsc/petsc-current/src/snes/tutorials/ex70.c.html.
> However, in order to have shell blocks, I guess it is almost mandatory
> for the matrix to be partitioned among CPUs the same way the Vecs are,
> and I don't understand how the Vecs end up being partitioned in ex70.
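
For the ex70-style approach, the layout question comes down to matching local sizes: PETSc distributes Vecs and matrix rows as contiguous per-rank ranges, so a shell block lines up with the Vecs as long as its local row/column sizes are taken from the Vecs it acts on, and MatCreateNest() with NULL index sets then uses exactly those layouts. A rough sketch (not from ex70; x0 and x1 are assumed to be the global Vecs of the two fields, and MyA01Mult is a placeholder multiply routine):

#include <petscmat.h>

static PetscErrorCode MyA01Mult(Mat A,Vec in,Vec out)
{
  PetscFunctionBeginUser;
  /* apply the (0,1) coupling: 'in' has the layout of x1, 'out' that of x0 */
  PetscFunctionReturn(0);
}

static PetscErrorCode CreateNestWithShellBlock(Mat A00,Mat A10,Mat A11,Vec x0,Vec x1,Mat *J)
{
  PetscInt       m0,m1;
  Mat            A01,blocks[4];
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /* Take the local sizes of the shell block from the Vecs it maps between */
  ierr = VecGetLocalSize(x0,&m0);CHKERRQ(ierr);
  ierr = VecGetLocalSize(x1,&m1);CHKERRQ(ierr);
  ierr = MatCreateShell(PetscObjectComm((PetscObject)x0),m0,m1,PETSC_DETERMINE,PETSC_DETERMINE,NULL,&A01);CHKERRQ(ierr);
  ierr = MatShellSetOperation(A01,MATOP_MULT,(void (*)(void))MyA01Mult);CHKERRQ(ierr);

  /* Row-major 2x2 block layout; NULL index sets give the contiguous layout
     implied by the blocks themselves */
  blocks[0] = A00; blocks[1] = A01; blocks[2] = A10; blocks[3] = A11;
  ierr = MatCreateNest(PetscObjectComm((PetscObject)x0),2,NULL,2,NULL,blocks,J);CHKERRQ(ierr);
  ierr = MatDestroy(&A01);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}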

