[petsc-users] FieldSplit, multigrid and blocksize
Eric Chamberland
Eric.Chamberland at giref.ulaval.ca
Wed Dec 7 09:55:30 CST 2016
Hi Nicolas,
For us the solution has been to "manually" create a MatNest, i.e., with
block A00 containing only the u-u coupling and block A11 containing the
p-p coupling.
Thus, we are able to assign a block size of 3 to A00 and a block size of 1
to A11.
The other thing we did was to number the unknowns in u-then-p order (on
each process), so that the ISs are contiguous strides per process.
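Roughly, the setup looks like the sketch below. This is not our actual
code, just an illustration of the idea: A00, A01, A10 and A11 are assumed
to be AIJ matrices already created with the right local sizes (block
sizes set before preallocation).

    PetscErrorCode ierr;
    Mat            blocks[4];
    Mat            A;
    IS             isRow[2];

    /* u-u coupling has 3 dofs per node, p-p coupling has 1 */
    ierr = MatSetBlockSize(A00, 3);CHKERRQ(ierr);
    ierr = MatSetBlockSize(A11, 1);CHKERRQ(ierr);

    blocks[0] = A00; blocks[1] = A01;
    blocks[2] = A10; blocks[3] = A11;

    /* Because u unknowns are numbered before p unknowns on each process,
       the row/column index sets of the nest are contiguous strides;
       passing NULL lets MatCreateNest() build them itself. */
    ierr = MatCreateNest(PETSC_COMM_WORLD, 2, NULL, 2, NULL, blocks, &A);CHKERRQ(ierr);

    /* The ISs describing each field can be retrieved afterwards, e.g.
       to hand them to PCFieldSplitSetIS(). */
    ierr = MatNestGetISs(A, isRow, NULL);CHKERRQ(ierr);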
This allowed us to do the assembly efficiently, *directly* into the
MatNest (which is not a native PETSc feature). We did not have to touch
the complex part of the assembly in our code, only the small function
where we call MatSetValues, so that the elementary indices are split
between A00, A01, A10 and A11 (we keep the same elementary matrix, but
just translate the indices, at most 4 times, into the sub-matrix ranges
and negate the ones that must not be assembled into a given sub-matrix).
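Conceptually, that small function does something like the sketch below
(simplified, not our real code; nElemDof, globalRow, isPressureDof,
uOffset and pOffset are made-up names for illustration):

    /* Translate the element's global indices into each sub-matrix's own
       numbering.  Indices that do not belong to a given field are set to
       a negative value, because MatSetValues() simply ignores negative
       row/column indices. */
    for (i = 0; i < nElemDof; i++) {
      if (isPressureDof[i]) {
        rowsU[i] = -1;                      /* skipped for the u field    */
        rowsP[i] = globalRow[i] - pOffset;  /* shift into A11's numbering */
      } else {
        rowsU[i] = globalRow[i] - uOffset;  /* shift into A00's numbering */
        rowsP[i] = -1;                      /* skipped for the p field    */
      }
    }
    /* Same elementary matrix Ke, four calls, one per sub-matrix: */
    ierr = MatSetValues(A00, nElemDof, rowsU, nElemDof, rowsU, Ke, ADD_VALUES);CHKERRQ(ierr);
    ierr = MatSetValues(A01, nElemDof, rowsU, nElemDof, rowsP, Ke, ADD_VALUES);CHKERRQ(ierr);
    ierr = MatSetValues(A10, nElemDof, rowsP, nElemDof, rowsU, Ke, ADD_VALUES);CHKERRQ(ierr);
    ierr = MatSetValues(A11, nElemDof, rowsP, nElemDof, rowsP, Ke, ADD_VALUES);CHKERRQ(ierr);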
Have a nice day!
Eric
On 07/12/16 08:43 AM, Karin&NiKo wrote:
> Thanks Barry.
> I must emphasize that my unknowns are not numbered in a regular way: I
> am using a P2-P1 finite element and the mid-side nodes do not carry a
> pressure DOF. So the global numbering is somewhat like:
> ----------------------------------------------------------------------------------
>  u1x, u1y, u1z, p1 | u2x, u2y, u2z, p2 | u3x, u3y, u3z | u4x, u4y, u4z, p4 | .....
>      node 1 DOF    |     node 2 DOF    |  node 3 DOF   |     node 4 DOF    |
> ----------------------------------------------------------------------------------
>
> So my global matrix does not have a block size of 4. Nevertheless, the
> A00 matrix does have a block size of 3!
> Is there a way to specify that only on the A00 sub-matrix?
>
> Nicolas
>
>
>
> 2016-12-07 14:22 GMT+01:00 Barry Smith <bsmith at mcs.anl.gov>:
>
>
> > On Dec 7, 2016, at 7:06 AM, Karin&NiKo <niko.karin at gmail.com> wrote:
> >
> > Dear PETSc gurus,
> >
> > I am using FieldSplit to solve a poro-mechanics problem. Thus, I am dealing with 3 displacement DOFs and 1 pressure DOF.
> > In order to precondition the 00 block (aka the displacement block), I am using a multigrid method (ml or gamg). Nevertheless, I have the feeling that the multigrid's performance is much lower than when it is used on pure displacement problems (say, elasticity). Indeed, I do not know how to set the block size of the 00 block when using FieldSplit!
> > Could you please give me some hint on that?
>
> In your case you can use a block size of 4. The first field is
> defined by "components" 0, 1, and 2 and the second field (the
> pressure) is defined by component 3. Use PCFieldSplitSetFields() to
> set the fields and set the matrix block size to 4 (use AIJ matrix).
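For a matrix that really does have a constant block size of 4, that setup
would look roughly like this (illustrative sketch only: A is assumed to be
the assembled AIJ system matrix and pc the PC of the outer KSP):

    PetscInt dispFields[3] = {0, 1, 2};   /* ux, uy, uz */
    PetscInt presFields[1] = {3};         /* p          */

    ierr = MatSetBlockSize(A, 4);CHKERRQ(ierr);  /* before preallocation/assembly */

    ierr = PCSetType(pc, PCFIELDSPLIT);CHKERRQ(ierr);
    ierr = PCFieldSplitSetFields(pc, "u", 3, dispFields, dispFields);CHKERRQ(ierr);
    ierr = PCFieldSplitSetFields(pc, "p", 1, presFields, presFields);CHKERRQ(ierr);

With named splits like these the option prefixes become -fieldsplit_u_ and
-fieldsplit_p_.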
>
> If the displacement block corresponds to a true displacement
> problem then one should expect similar convergence of the multigrid.
> BUT note that usually with PCFIELDSPLIT one just does a single
> V-cycle of multigrid (KSP type of preonly) on the 00 block in each
> iteration. Run with -ksp_view to see what the solve is actually doing.
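For reference, the splits can also be defined entirely from the options
database (in which case they get the default names 0 and 1); a typical,
purely illustrative set of options would be:

    -pc_type fieldsplit
    -pc_fieldsplit_block_size 4
    -pc_fieldsplit_0_fields 0,1,2
    -pc_fieldsplit_1_fields 3
    -fieldsplit_0_ksp_type preonly
    -fieldsplit_0_pc_type gamg
    -fieldsplit_1_ksp_type preonly
    -fieldsplit_1_pc_type jacobi
    -ksp_view

(-fieldsplit_0_pc_type ml works the same way, and the jacobi on the
pressure block is only a placeholder for whatever you actually use there.)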
>
> > (the phrase "The fieldsplit preconditioner cannot currently be used with the BAIJ or SBAIJ data formats if the blocksize is larger than 1." is not clear enough for me...).
>
> To use fieldsplit you should use an AIJ matrix, not BAIJ or SBAIJ
> (don't worry about impacting performance: fieldsplit pulls apart
> the blocks anyway, so there would be no advantage to BAIJ or SBAIJ).
> >
> > Thanks in advance,
> > Nicolas
>
>