[petsc-users] MatNest with Shell blocks for multiphysics

Barry Smith bsmith at petsc.dev
Thu Jul 1 14:42:19 CDT 2021



> On Jul 1, 2021, at 11:44 AM, Matteo Semplice <matteo.semplice at uninsubria.it> wrote:
> 
> Il 01/07/21 17:52, Jed Brown ha scritto:
>> I think ex28 is a better organization of the code. You can DMCreateMatrix() and then set types/preallocation for off-diagonal blocks of the MatNest. I think the comment is unclear and does not quite reflect what was intended and originally worked (which was to assemble the off-diagonal blocks despite bad preallocation).
>> 
>> https://gitlab.com/petsc/petsc/-/commit/6bdeb4dbc27a59cf9af4930e08bd1f9937e47c2d
>> 
>> https://www.mcs.anl.gov/petsc/petsc-current/src/snes/tutorials/ex28.c.html#line410
> 
> Thanks! Yesterday I was unable to make it work, but I'll have another go with ex28 then...
> 
>> Note that if you're using DMDA and have collocated fields, you can skip all this complexity. And if you have a scattered discretization, consider DMStag. ex28 is showing how to solve a coupled problem where there is no suitable structure to convey the relation between discretizations.
> 
> The current discretization is in fact collocated, with the variables on the same grid, and we might stick to that for a while.
> 
> However, one key point in the design is that the Jacobian will be [A00, A01; A10, A11], and we already have a tailor-made shell preconditioner for A00, with A00 itself implemented as a shell matrix; the preconditioner for the full Jacobian will neglect the A10 block and do a block-triangular solve, inverting A00 approximately with the shell preconditioner.
> 
> I do not understand how creating a DMDA with n0+n1 dofs will let me easily reuse my shell preconditioner code on the top-left block.
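
   For concreteness, the MatNest-with-shell-block setup being discussed would look roughly like the sketch below (the local sizes n0loc/n1loc, the preallocation counts, and MyShellMult/shellctx are placeholder names, not code from this thread; error checking omitted):

  Mat A00, A01, A10, A11, J;
  Mat blocks[4];

  /* (0,0) block: shell matrix wrapping the existing operator */
  MatCreateShell(PETSC_COMM_WORLD, n0loc, n0loc, PETSC_DETERMINE, PETSC_DETERMINE,
                 (void *)shellctx, &A00);
  MatShellSetOperation(A00, MATOP_MULT, (void (*)(void))MyShellMult);

  /* off-diagonal and (1,1) blocks: ordinary AIJ matrices; set their
     type/preallocation up front, then assemble them as usual */
  MatCreateAIJ(PETSC_COMM_WORLD, n0loc, n1loc, PETSC_DETERMINE, PETSC_DETERMINE,
               5, NULL, 2, NULL, &A01);
  MatCreateAIJ(PETSC_COMM_WORLD, n1loc, n0loc, PETSC_DETERMINE, PETSC_DETERMINE,
               5, NULL, 2, NULL, &A10);
  MatCreateAIJ(PETSC_COMM_WORLD, n1loc, n1loc, PETSC_DETERMINE, PETSC_DETERMINE,
               5, NULL, 2, NULL, &A11);

  blocks[0] = A00; blocks[1] = A01; blocks[2] = A10; blocks[3] = A11;
  MatCreateNest(PETSC_COMM_WORLD, 2, NULL, 2, NULL, blocks, &J);

   The ISs describing the block layout can then be retrieved with MatNestGetISs() and handed to PCFieldSplitSetIS() if a fieldsplit preconditioner is built on top of J.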

   PCFIELDSPLIT (and friends) do not order the dofs by block; rather, they "pull out" the required pieces of the vector (using ISs) when needed. Your shell preconditioner will just operate on the "pulled out" vectors. If you use DMDAVecGetArray() etc. in your shell preconditioner, you can create an auxiliary DMDA with just that smaller number of dofs so that you can still use the DMDAVecGetArray() constructs.
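
   Roughly, as a sketch only (ApplyA00PC, da0, pc00, Mx, My, and n0 are placeholder names, not code from this thread):

  /* PCSHELL apply: x and y are already the "pulled out" block-0 vectors,
     read through an auxiliary DMDA (da0) that carries only the n0 dofs */
  PetscErrorCode ApplyA00PC(PC pc, Vec x, Vec y)
  {
    DM             da0;
    PetscScalar ***xa, ***ya;              /* indexed [j][i][dof] for a 2d DMDA */

    PCShellGetContext(pc, (void **)&da0);  /* stored with PCShellSetContext() */
    DMDAVecGetArrayDOFRead(da0, x, &xa);
    DMDAVecGetArrayDOF(da0, y, &ya);
    /* ... apply the existing tailor-made preconditioner through xa/ya ... */
    DMDAVecRestoreArrayDOFRead(da0, x, &xa);
    DMDAVecRestoreArrayDOF(da0, y, &ya);
    return 0;
  }

  /* auxiliary DMDA: same grid as the full "all dof" DMDA but only n0 dofs;
     in practice match its ownership ranges, e.g. via DMDAGetOwnershipRanges() */
  DM da0;
  DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
               DMDA_STENCIL_STAR, Mx, My, PETSC_DECIDE, PETSC_DECIDE,
               n0, 1, NULL, NULL, &da0);
  DMSetUp(da0);
  PCSetType(pc00, PCSHELL);                /* pc00 = PC of the "0" split, from
                                              PCFieldSplitGetSubKSP() after setup */
  PCShellSetContext(pc00, da0);
  PCShellSetApply(pc00, ApplyA00PC);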

   You definitely should use an "all dof" DMDA to define your entire problem and not try to "glue" together vectors using DMComposite or other such things. Taking apart is much easier in parallel computing than putting together.
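
   Along those lines, a minimal sketch of the "all dof" setup (Mx, My, n0, n1, and the split grouping are placeholders; the Jacobian/residual callbacks are elided):

  DM  da;
  KSP ksp;
  PC  pc;

  /* one DMDA carrying all n0+n1 collocated dofs; PCFIELDSPLIT pulls the
     blocks apart with index sets, no DMComposite-style gluing needed */
  DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
               DMDA_STENCIL_STAR, Mx, My, PETSC_DECIDE, PETSC_DECIDE,
               n0 + n1, 1, NULL, NULL, &da);
  DMSetUp(da);

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetDM(ksp, da);            /* operators/residuals supplied through the DM */
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCFIELDSPLIT);
  /* group the n0 "A00" fields into split 0 and the n1 remaining fields into
     split 1 with PCFieldSplitSetFields() (or -pc_fieldsplit_0_fields ...) */
  PCFieldSplitSetType(pc, PC_COMPOSITE_MULTIPLICATIVE);
  /* multiplicative = block Gauss-Seidel, i.e. a block-triangular solve; the
     order of the splits decides which off-diagonal block is kept/neglected */
  KSPSetFromOptions(ksp);

   After PCSetUp(), PCFieldSplitGetSubKSP() gives the per-split KSPs, and the shell preconditioner from the sketch above is attached to the split containing A00.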

  Barry

> 
> Matteo
> 
> 


