[petsc-dev] Block system

Jed Brown jed at jedbrown.org
Mon Feb 13 10:07:21 CST 2017


Pierre Jolivet <Pierre.Jolivet at enseeiht.fr> writes:

> Hello,
> Given this block matrix:
> A = [A11,A12,A13,A14;
>      A21,A22,A23,A24;
>      A31,A32,A33,A34;
>      A41,A42,A43,A44];
> It is trivial to precondition Ax = b with
> M^-1 = diag(A11^-1, A22^-1, A33^-1, A44^-1).
> My application requires a slightly fancier preconditioner, which
> should be M^-1 = diag(inv([A11,A12;A21,A22]), inv([A33,A34;A43,A44])).
> I'm not sure what the right tool is for this.
> I've stopped at a 4x4 block matrix, but at scale I have a matrix with
> a few thousand x a few thousand blocks (still with the nested 2x2
> block structure).

Are all of these blocks distributed on your communicator or do they have
some locality?  PCFieldSplit is intended for problems where the blocks
are all distributed and solving them sequentially is acceptable.  The
other limiting case for an additive preconditioner like you have above
is block Jacobi (perhaps with multi-process subdomains or multiple
subdomains per process; such decompositions are supported).
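
If each pair of fields can be described by an index set, one way to
express that nested preconditioner is PCFieldSplit with user-provided
splits.  Below is a minimal sketch, assuming A, b, x and the index sets
is01 and is23 (the rows/columns of [A11,A12;A21,A22] and
[A33,A34;A43,A44]) are already built; the function name and error
handling are illustrative, not from the original message:

#include <petscksp.h>

/* Hypothetical sketch: solve A x = b with an additive PCFieldSplit,
 * i.e. M^-1 = diag(inv([A11,A12;A21,A22]), inv([A33,A34;A43,A44])). */
static PetscErrorCode SolveWithPairedSplits(Mat A,Vec b,Vec x,IS is01,IS is23)
{
  KSP            ksp;
  PC             pc;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = KSPCreate(PetscObjectComm((PetscObject)A),&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,PCFIELDSPLIT);CHKERRQ(ierr);
  /* Additive composition applies the splits block-diagonally, which
     matches the diag(...) preconditioner above */
  ierr = PCFieldSplitSetType(pc,PC_COMPOSITE_ADDITIVE);CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pc,"01",is01);CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pc,"23",is23);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

The solver inside each split can then be chosen at run time through the
split prefixes, e.g. -fieldsplit_01_ksp_type preonly
-fieldsplit_01_pc_type lu (and likewise for "23").  In the block Jacobi
limit, -pc_type bjacobi -pc_bjacobi_blocks <n> with -sub_-prefixed
options for the subdomain solvers is the usual route.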