[petsc-dev] Block system

Pierre Jolivet Pierre.Jolivet at enseeiht.fr
Mon Feb 13 10:19:55 CST 2017


On Mon, 13 Feb 2017 17:07:21 +0100, Jed Brown wrote:
> Pierre Jolivet <Pierre.Jolivet at enseeiht.fr> writes:
>
>> Hello,
>> Given this block matrix:
>> A = [A11,A12,A13,A14;
>>      A21,A22,A23,A24;
>>      A31,A32,A33,A34;
>>      A41,A42,A43,A44];
>> It is trivial to precondition Ax = b with
>> M^-1 = diag(A11^-1, A22^-1, A33^-1, A44^-1);
>> My application requires a slightly fancier preconditioner, which
>> should be M^-1 = diag(inv([A11,A12;A21,A22]), inv([A33,A34;A43,A44]));
>> I'm not sure what the right tool for this is.
>> I've stopped at a 4x4 block matrix, but at scale I have a matrix
>> with a few thousand x a few thousand blocks (still with the nested
>> 2 x 2 block structure).
>
> Are all of these blocks distributed on your communicator or do they
> have some locality?  PCFieldSplit is intended for problems where the
> blocks

Indeed, all the blocks are distributed.

> are all distributed and solving them sequentially is acceptable.  The
> other limiting case for an additive preconditioner like the one you
> have above is block Jacobi (perhaps with multi-process subdomains or
> multiple subdomains per process; such decompositions are supported).

Yes, that is basically what I need: block Jacobi with subdomains
defined as aggregations of multiple processes. I don't know how to do
this, though, which is why I thought of using an additive FieldSplit.
Could you give me a pointer to such a decomposition, please?
Thanks in advance.
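
For reference, a minimal sketch (not from the thread) of the
multi-process block Jacobi setup Jed describes, using the standard
PCBJACOBI interface: with 4 MPI ranks and 2 blocks, each diagonal
block (e.g. [A11,A12;A21,A22]) is solved on its own 2-process
sub-communicator. The 1-D Laplacian is only a stand-in operator so
the example runs; error checking is omitted for brevity.

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat      A;
  Vec      x, b;
  KSP      ksp;
  PC       pc;
  PetscInt i, rstart, rend, n = 64;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Stand-in operator: a distributed 1-D Laplacian; replace with
     the real matrix carrying the nested 2 x 2 block structure. */
  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
  MatSetFromOptions(A);
  MatSetUp(A);
  MatGetOwnershipRange(A, &rstart, &rend);
  for (i = rstart; i < rend; i++) {
    if (i > 0)   MatSetValue(A, i, i-1, -1.0, INSERT_VALUES);
    if (i < n-1) MatSetValue(A, i, i+1, -1.0, INSERT_VALUES);
    MatSetValue(A, i, i, 2.0, INSERT_VALUES);
  }
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  MatCreateVecs(A, &x, &b);
  VecSet(b, 1.0);

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCBJACOBI);
  /* Two blocks over the whole communicator: on 4 ranks each block
     gets a 2-process sub-communicator (NULL = equal block sizes). */
  PCBJacobiSetTotalBlocks(pc, 2, NULL);
  KSPSetFromOptions(ksp);
  KSPSolve(ksp, b, x);

  KSPDestroy(&ksp);
  VecDestroy(&x);
  VecDestroy(&b);
  MatDestroy(&A);
  PetscFinalize();
  return 0;
}

The same decomposition can be selected at run time with
-pc_type bjacobi -pc_bjacobi_blocks 2, and the inner solvers are then
controlled through the -sub_ksp_type / -sub_pc_type option prefixes.
Note that when a block spans several processes, its inner solver must
itself be parallel (e.g. a parallel direct solver from an external
package rather than the sequential default).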



