[petsc-dev] Block system

Pierre Jolivet Pierre.Jolivet at enseeiht.fr
Mon Feb 13 10:14:59 CST 2017


On Mon, 13 Feb 2017 10:00:48 -0600, Matthew Knepley wrote:
> On Mon, Feb 13, 2017 at 9:46 AM, Pierre Jolivet  wrote:
>
>> Hello,
>> Given this block matrix:
>> A = [A11,A12,A13,A14;
>>      A21,A22,A23,A24;
>>      A31,A32,A33,A34;
>>      A41,A42,A43,A44];
>> It is trivial to precondition Ax = b with M^-1 = diag(A11^-1,
>> A22^-1, A33^-1, A44^-1);
>> My application requires a slightly fancier preconditioner, which
>> should be M^-1 =
>> diag(inv([A11,A12;A21,A22]),inv([A33,A34;A43,A44]));
>> I'm not sure what the right tool for this is.
>> I've stopped at a 4x4 block matrix, but at scale I have a matrix
>> with a few thousand x a few thousand blocks (still with the nested
>> 2x2 block structure).
>>
>> 1) should I implement a PCSHELL myself, or use a fieldsplit
>> preconditioner with "few thousands / 2" fields (i.e., does
>> PCFIELDSPLIT scale relatively well with the number of fields, or do
>> you recommend it only for "Stokes-like" problems?)?
>
> FieldSplit is not that scalable right now (I think). For 4x4 blocks,
> you want to solve 8x8 systems. You could use MATBAIJ with block size 8
> and then PBJACOBI. Would that work for you?
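For reference, the MATBAIJ/PBJACOBI suggestion above would, assuming a
uniform point-block size of 8, amount to run-time options along these
lines (a sketch, not taken from the thread; `./app` is a hypothetical
application binary):

```shell
# Sketch: select a block AIJ matrix with 8x8 point blocks and
# point-block Jacobi as the preconditioner, all from the options
# database (assumes the application calls MatSetFromOptions /
# PCSetFromOptions).
./app -mat_type baij -mat_block_size 8 -pc_type pbjacobi
```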

In the application, all my blocks are sparse matrices, and the 
assumption that the numbers of rows of A_11, A_22, A_33, and A_44 are 
equal does not hold. (That's also why, in the MWE I sent, I'm not using 
ISCreateBlock.)
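For the variable-size case, one possible PCFIELDSPLIT setup is sketched
below (untested, using the current PETSc C idiom; `npairs`, `nidx`, and
`idx` are hypothetical arrays built by the application, not from the
thread). It registers one split per 2x2 pair with ISCreateGeneral, which
unlike ISCreateBlock does not require a uniform block size:

```c
#include <petscksp.h>

/* Sketch: register one field per 2x2 diagonal pair of the nested block
   matrix, so PCFIELDSPLIT with additive composition applies
   M^-1 = diag(inv([A11,A12;A21,A22]), inv([A33,A34;A43,A44]), ...).
   nidx[k] / idx[k] hold the count and list of global rows belonging to
   the k-th pair (application-provided). */
PetscErrorCode SetupPairwiseFieldSplit(PC pc, PetscInt npairs,
                                       const PetscInt *nidx,
                                       PetscInt **idx)
{
  PetscFunctionBeginUser;
  PetscCall(PCSetType(pc, PCFIELDSPLIT));
  /* Additive composition gives the block-diagonal preconditioner. */
  PetscCall(PCFieldSplitSetType(pc, PC_COMPOSITE_ADDITIVE));
  for (PetscInt k = 0; k < npairs; ++k) {
    IS is;
    /* General (non-blocked) IS, since the pairs have unequal sizes. */
    PetscCall(ISCreateGeneral(PetscObjectComm((PetscObject)pc), nidx[k],
                              idx[k], PETSC_USE_POINTER, &is));
    PetscCall(PCFieldSplitSetIS(pc, NULL, is));
    PetscCall(ISDestroy(&is)); /* PC keeps its own reference */
  }
  PetscFunctionReturn(PETSC_SUCCESS);
}
```

The inner solver for each split can then be chosen on the command line,
e.g. with `-fieldsplit_ksp_type preonly -fieldsplit_pc_type lu`.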

>   Thanks,
>
>      Matt
>  
>
>> 2) I gave PCFIELDSPLIT a go, but I'm failing miserably. In the
>> attached tarball, I'm loading matrix A on four processes. Each
>> process owns 2 rows of A. I'm thus creating two ISes:
>
> I will look at this as soon as I can, but I am really swamped right
> now.
>
>   Matt
>  What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener
>




More information about the petsc-dev mailing list