<div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">On Mon, Feb 13, 2017 at 9:46 AM, Pierre Jolivet <span dir="ltr"><<a href="mailto:Pierre.Jolivet@enseeiht.fr" target="_blank">Pierre.Jolivet@enseeiht.fr</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Hello,<br>
Given this block matrix:<br>
A = [A11,A12,A13,A14;<br>
A21,A22,A23,A24;<br>
A31,A32,A33,A34;<br>
A41,A42,A43,A44];<br>
It is trivial to precondition Ax = b with M^-1 = diag(A11^-1, A22^-1, A33^-1, A44^-1);<br>
My application requires a slightly fancier preconditioner, which should be M^-1 = diag(inv([A11,A12;A21,A22]),inv([A33,A34;A43,A44]));<br>
I'm not sure what the right tool for this is.<br>
I've stopped at a 4x4 block matrix, but at scale I have a matrix with a few thousand x a few thousand blocks (still with the nested 2x2 block structure).<br>
<br>
1) Should I implement a PCSHELL myself, or use a fieldsplit preconditioner with "a few thousand / 2" fields (i.e., does PCFIELDSPLIT scale relatively well with the number of fields, or do you recommend it only for "Stokes-like" problems)?<br></blockquote><div><br></div><div>FieldSplit is not that scalable right now (I think). For 4x4 blocks, you want to solve 8x8 systems. You could use MATBAIJ with block size 8 and then PBJACOBI. Would that work for you?</div><div><br></div><div>  Thanks,</div><div><br></div><div>     Matt</div><div> </div>
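<div><br></div><div>For concreteness, here is a minimal sketch of that BAIJ + point-block Jacobi route. This is not the application code: the sizes, the placeholder matrix entries, and all names below are illustrative only, and error checking is omitted.</div><div><br></div><div><pre>
/* Sketch of MATBAIJ with block size 8 plus PCPBJACOBI, as suggested above.
   The matrix entries below are placeholders; a real run would assemble or
   load the application matrix instead. Error checking omitted for brevity. */
#include &lt;petscksp.h&gt;

int main(int argc, char **argv)
{
  Mat      A;
  Vec      b, x;
  KSP      ksp;
  PC       pc;
  PetscInt i, rstart, rend, nlocal = 8; /* e.g. one 8x8 point block per process */

  PetscInitialize(&argc, &argv, NULL, NULL);

  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, nlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE);
  MatSetType(A, MATBAIJ);
  MatSetBlockSize(A, 8);     /* one nested 2x2 group of sub-blocks = one point block */
  MatSetUp(A);
  MatGetOwnershipRange(A, &rstart, &rend);
  for (i = rstart; i < rend; i++) MatSetValue(A, i, i, 2.0, INSERT_VALUES); /* placeholder entries */
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  MatCreateVecs(A, &x, &b);
  VecSet(b, 1.0);

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCPBJACOBI); /* inverts each diagonal point block exactly */
  KSPSetFromOptions(ksp);
  KSPSolve(ksp, b, x);

  KSPDestroy(&ksp); MatDestroy(&A); VecDestroy(&b); VecDestroy(&x);
  PetscFinalize();
  return 0;
}
</pre></div><div><br></div><div>With the matrix stored as BAIJ and the block size chosen as the size of one nested 2x2 group, PCPBJACOBI inverts each diagonal point block exactly, i.e. it applies the diag(inv([A11,A12;A21,A22]),inv([A33,A34;A43,A44])) preconditioner described above.</div>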
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">2) I gave PCFIELDSPLIT a go, but I'm failing miserably. In the attached tarball, I'm loading matrix A on four processes. Each process owns 2 rows of A. I'm thus creating two ISes:<br></blockquote><div><br></div><div>I will look at this as soon as I can, but I am really swamped right now.</div><div><br></div><div>     Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
IS Object: 4 MPI processes<br>
type: general<br>
[0] Number of indices in set 2<br>
[0] 0 0<br>
[0] 1 1<br>
[1] Number of indices in set 2<br>
[1] 0 2<br>
[1] 1 3<br>
[2] Number of indices in set 0<br>
[3] Number of indices in set 0<br>
IS Object: 4 MPI processes<br>
type: general<br>
[0] Number of indices in set 0<br>
[1] Number of indices in set 0<br>
[2] Number of indices in set 2<br>
[2] 0 4<br>
[2] 1 5<br>
[3] Number of indices in set 2<br>
[3] 0 6<br>
[3] 1 7<br>
<br>
which looks good to me. But when I run:<br>
$ mpirun -np 4 ./a.out -f almost_full_jolivet -pc_type fieldsplit -pc_fieldsplit_type additive -fieldsplit_ksp_type preonly -fieldsplit_pc_type lu -fieldsplit_pc_factor_mat_solver_package mumps<br>
It fails during setup:<br>
[0]PETSC ERROR: Petsc has generated inconsistent data<br>
[0]PETSC ERROR: Unhandled case, must have at least two fields, not 1<br>
<br>
I would appreciate any feedback and/or a fix for this.<br>
Thanks, and feel free to move the thread to petsc-users.<br>
Pierre</blockquote></div>
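<div><br></div><div>For reference, since the tarball is not inlined here, below is a sketch of the kind of PCFIELDSPLIT setup described in 2). The stride ISes and the split names "0"/"1" are a reconstruction from the IS dump above, not the attached code; it assumes a 3.7-era PETSc API, and error checking is omitted.</div><div><br></div><div><pre>
/* Reconstruction (not the attached code) of the described setup: load A,
   build one IS per pair of ranks, register both with PCFIELDSPLIT.
   Error checking omitted for brevity. */
#include &lt;petscksp.h&gt;

int main(int argc, char **argv)
{
  Mat         A;
  Vec         b, x;
  KSP         ksp;
  PC          pc;
  IS          is0, is1;
  PetscViewer viewer;
  PetscInt    rstart, rend;
  PetscMPIInt rank;
  PetscBool   flg;
  char        file[PETSC_MAX_PATH_LEN];

  PetscInitialize(&argc, &argv, NULL, NULL);
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

  PetscOptionsGetString(NULL, NULL, "-f", file, sizeof(file), &flg);
  if (!flg) { PetscPrintf(PETSC_COMM_WORLD, "must provide a matrix file with -f\n"); PetscFinalize(); return 1; }
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, file, FILE_MODE_READ, &viewer);
  MatCreate(PETSC_COMM_WORLD, &A);
  MatLoad(A, viewer);
  PetscViewerDestroy(&viewer);

  MatGetOwnershipRange(A, &rstart, &rend);
  /* Field 0 = rows owned by ranks 0-1, field 1 = rows owned by ranks 2-3,
     matching the IS dump above (the other ranks contribute empty local parts). */
  ISCreateStride(PETSC_COMM_WORLD, rank < 2 ? rend - rstart : 0, rstart, 1, &is0);
  ISCreateStride(PETSC_COMM_WORLD, rank < 2 ? 0 : rend - rstart, rstart, 1, &is1);

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCFIELDSPLIT);
  PCFieldSplitSetIS(pc, "0", is0);
  PCFieldSplitSetIS(pc, "1", is1);
  KSPSetFromOptions(ksp); /* picks up -pc_fieldsplit_type additive, -fieldsplit_pc_type lu, ... */

  MatCreateVecs(A, &x, &b);
  VecSet(b, 1.0);
  KSPSolve(ksp, b, x);

  ISDestroy(&is0); ISDestroy(&is1);
  VecDestroy(&b); VecDestroy(&x); MatDestroy(&A); KSPDestroy(&ksp);
  PetscFinalize();
  return 0;
}
</pre></div><div><br></div><div>With the splits registered under the names "0" and "1", each split can also be controlled individually through the -fieldsplit_0_ and -fieldsplit_1_ option prefixes.</div>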
<br><br clear="all"><div><br></div>-- <br><div class="gmail_signature" data-smartmail="gmail_signature">What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div>
</div></div>