[petsc-users] MatNest with Shell blocks for multiphysics

Matteo Semplice matteo.semplice at uninsubria.it
Thu Jul 1 03:04:26 CDT 2021


Hi.

We are designing a PETSc application that will employ a SNES solver on a 
multiphysics problem whose Jacobian has a 2x2 block form, say 
A = [A00, A01; A10, A11]. We already have code for the top-left block A00 (a 
MatShell and a related shell preconditioner) that we wish to reuse. We 
could implement the other blocks as shells or as assembled matrices. We would 
also like to compare our method with existing ones, so we want to be 
quite flexible in the choice of KSP and PC within the SNES. (To this 
end, implementing assembled versions of A00 and the other blocks 
would be easy.)
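
For reference, the A00 code we want to reuse is wrapped roughly along these 
lines (a minimal sketch, not the actual code; A00Ctx, MatMult_A00, nloc and 
Nglob are placeholder names):

      typedef struct {
        PetscReal dt;   /* ... whatever data the operator needs ... */
      } A00Ctx;

      static PetscErrorCode MatMult_A00(Mat A, Vec x, Vec y)
      {
        A00Ctx *ctx;
        PetscFunctionBeginUser;
        MatShellGetContext(A, &ctx);
        /* ... apply y = A00 * x using the data in ctx ... */
        PetscFunctionReturn(0);
      }

      /* nloc / Nglob = local / global size of the first variable set */
      A00Ctx ctx;
      Mat    A00;
      MatCreateShell(PETSC_COMM_WORLD, nloc, nloc, Nglob, Nglob, &ctx, &A00);
      MatShellSetOperation(A00, MATOP_MULT, (void (*)(void))MatMult_A00);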

I am assuming that, in order to have one or more shell blocks, the full 
Jacobian should be a nested (MATNEST) matrix, and I am wondering about the 
best way to design the code.

We are going to use DMDAs to manipulate the Vecs for both variable sets, so 
the DMComposite approach of 
https://www.mcs.anl.gov/petsc/petsc-current/src/snes/tutorials/ex28.c.html 
is intriguing, but I have read in its comments that it has issues with the 
MATNEST type.
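
Concretely, I would picture the composite setup along these lines (a sketch; 
the grid sizes nx, ny, the dof counts dofU, dofV and the names daU, daV, pack 
are placeholders):

      DM  daU, daV, pack;
      Vec X;

      DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                   DMDA_STENCIL_STAR, nx, ny, PETSC_DECIDE, PETSC_DECIDE,
                   dofU, 1, NULL, NULL, &daU);
      DMSetUp(daU);
      DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                   DMDA_STENCIL_STAR, nx, ny, PETSC_DECIDE, PETSC_DECIDE,
                   dofV, 1, NULL, NULL, &daV);
      DMSetUp(daV);

      DMCompositeCreate(PETSC_COMM_WORLD, &pack);
      DMCompositeAddDM(pack, daU);
      DMCompositeAddDM(pack, daV);
      DMCreateGlobalVector(pack, &X);   /* X holds both variable sets */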

My next guess would be to create the four submatrices ahead of time and then 
insert them into a MatNest, as in the Stokes example of 
https://www.mcs.anl.gov/petsc/petsc-current/src/snes/tutorials/ex70.c.html. 
However, in order to have shell blocks, I guess it is almost mandatory that 
the matrix be partitioned among processes in the same way as the Vecs, and I 
don't understand how the Vecs end up being partitioned in ex70.
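
If I read ex70 correctly, the nest itself would then be built roughly like 
this (a sketch; J and the block names are placeholders), with every block 
already created and sized beforehand:

      Mat subA[4] = {A00, A01, A10, A11};   /* 2x2 blocks in row-major order */
      Mat J;
      MatCreateNest(PETSC_COMM_WORLD, 2, NULL, 2, NULL, subA, &J);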

We could

- create a DMComposite and create the Vecs with it

- get the local sizes of the Vecs and subVecs for the two variable groups

- create the block matrices as in ex70, using the shell type where/when 
needed, but instead of

      MatSetSizes(matblock, PETSC_DECIDE, PETSC_DECIDE, globalRows, globalCols);

call

      MatSetSizes(matblock, localRows, localCols, PETSC_DETERMINE, PETSC_DETERMINE);

using the local sizes of the subvectors (a sketch of this workflow follows 
below).
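
Putting these steps together, the block creation I have in mind would look 
roughly like this (a sketch under the assumptions above; pack is the 
DMComposite and A01 stands for any of the blocks):

      Vec      X, Xu, Xv;
      PetscInt nu, nv;
      Mat      A01;

      DMCreateGlobalVector(pack, &X);
      DMCompositeGetAccess(pack, X, &Xu, &Xv);
      VecGetLocalSize(Xu, &nu);   /* locally owned rows of variable set 0 */
      VecGetLocalSize(Xv, &nv);   /* locally owned rows of variable set 1 */
      DMCompositeRestoreAccess(pack, X, &Xu, &Xv);

      MatCreate(PETSC_COMM_WORLD, &A01);
      MatSetSizes(A01, nu, nv, PETSC_DETERMINE, PETSC_DETERMINE);
      MatSetType(A01, MATAIJ);    /* or MATSHELL where we need a shell block */
      MatSetUp(A01);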

Does this sound like a viable approach? Or do you have different 
suggestions?

Thanks

     Matteo



