<div>Francesco,</div><div><br></div>Are you using any pressure stabilization scheme? If not, the (1,1) block of your Jacobian <div>A = [A00 A01; A10 A11] would typically be zero, and preconditioning it with Jacobi will not work.</div>
<div><br></div><div>If A11 = 0, you ought to use a fieldsplit of Schur type, for example (assuming petsc-3.3; prior versions use slightly different option names):</div><div>-pc_fieldsplit_type schur -pc_fieldsplit_schur_fact_type full</div>
<div>The question is then how you want to solve the Schur complement system.</div><div>Since the A11 block for you is zero, the easiest choice is to use S as the preconditioning matrix:</div><div>-pc_fieldsplit_schur_precondition self</div>
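For reference, with A11 = 0 the full Schur factorization used above is (in LaTeX, following the block notation of this thread):

```latex
% Block LDU factorization of the saddle point matrix A with a zero (1,1) block:
% the Schur complement S reduces the coupled system to a pressure-only solve.
A =
\begin{pmatrix} A_{00} & A_{01} \\ A_{10} & 0 \end{pmatrix}
=
\begin{pmatrix} I & 0 \\ A_{10} A_{00}^{-1} & I \end{pmatrix}
\begin{pmatrix} A_{00} & 0 \\ 0 & S \end{pmatrix}
\begin{pmatrix} I & A_{00}^{-1} A_{01} \\ 0 & I \end{pmatrix},
\qquad
S = -A_{10} A_{00}^{-1} A_{01}.
```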
<div><br></div><div>Preconditioning options for the Schur solver have prefix -fieldsplit_1_.</div><div>Since for Stokes the Schur complement is spectrally equivalent to the pressure mass matrix, you can </div><div>hope that an unpreconditioned GMRES (default when using -pc_fieldsplit_schur_precondition self) will solve it.</div>
<div><br></div><div><br></div><div>A better choice might be to use the Least Squares Commutator preconditioner </div><div>-fieldsplit_1_pc_type lsc</div><div>See <a href="http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCLSC.html" target="_blank">http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCLSC.html</a></div>
<div>and note the relevant references at the bottom of that manual page.</div><div>You might also find this useful: <a href="http://www.icfd.rdg.ac.uk/ICFD25/Talks/AWathen.pdf" target="_blank">http://www.icfd.rdg.ac.uk/ICFD25/Talks/AWathen.pdf</a></div>
<div><br></div><div>PCLSC preconditions S = A10 inv(A00) A01 with inv(A10 A01) A10 A00 A01 inv(A10 A01).</div><div>The "Laplacian" A10 A01 (a true Laplacian in the case of incompressible Stokes) can be preconditioned</div>
<div>using AMG:</div><div>-fieldsplit_1_lsc_pc_type ml</div><div>(assuming you have ML built, e.g., with --download-ml)</div><div>or </div><div>-fieldsplit_1_lsc_pc_type gamg </div><div>with some appropriate choice of gamg options:</div>
<div><a href="http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCGAMG.html" target="_blank">http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCGAMG.html</a></div><div><br></div><div>To start with I would use -pc_fieldsplit_type schur -pc_fieldsplit_schur_fact_type full -pc_fieldsplit_schur_precondition self -fieldsplit_1_pc_type lsc -fieldsplit_1_lsc_pc_type ml</div>
<div>gamg may also be very good in place of ml, but will require more option setting.</div><div><br></div>
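Putting the pieces above together, the full invocation might look like this (./stokes3d is a stand-in for your own executable; the option names assume petsc-3.3):

```shell
# Outer GMRES with a full Schur-factorization fieldsplit preconditioner,
# LSC on the Schur complement, and ML as the AMG for the LSC "Laplacian".
./stokes3d \
  -ksp_type gmres \
  -pc_type fieldsplit \
  -pc_fieldsplit_detect_saddle_point \
  -pc_fieldsplit_type schur \
  -pc_fieldsplit_schur_fact_type full \
  -pc_fieldsplit_schur_precondition self \
  -fieldsplit_1_pc_type lsc \
  -fieldsplit_1_lsc_pc_type ml \
  -ksp_monitor -ksp_converged_reason
```

Add -ksp_view to verify that the solver you think you configured is the one PETSc actually built.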
<div><br></div><div>You might find src/ksp/ksp/examples/tutorials/ex42.c and ex43.c useful, albeit possibly a bit heavyweight for your needs.</div><div><br></div><div>A few more comments below:</div><div><br><div class="gmail_quote">
On Mon, Feb 4, 2013 at 10:20 AM, John Peterson <span dir="ltr"><<a href="mailto:jwpeterson@gmail.com" target="_blank">jwpeterson@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<div><div>On Thu, Dec 27, 2012 at 4:31 AM, Francesco Ballarin<br>
<<a href="mailto:francesco.ballarin@mail.polimi.it" target="_blank">francesco.ballarin@mail.polimi.it</a>> wrote:<br>
> Dear libMesh users,<br>
> I am trying to solve a Stokes problem in a 3D domain, but I am currently<br>
> struggling with the choice of the preconditioner.<br>
> I have slightly modified systems_of_equations_ex1 to accommodate for the<br>
> third velocity component and boundary conditions of the problem at hand<br>
> (Dirichlet BCs on the inlet on the walls, enforced using a<br>
> DirichletBoundary object, and homogeneous Neumann BC on the outflow)<br>
><br>
> I am trying to use PETSc block preconditioners as shown, e.g., in<br>
> <a href="http://acts.nersc.gov/events/Workshop2012/slides/PETSc.pdf" target="_blank">http://acts.nersc.gov/events/Workshop2012/slides/PETSc.pdf</a><br>
> section 4. I have tried most of the preconditioners discussed during the<br>
> talk, but I am not satisfied because I get either PETSc errors or a<br>
> performance that I hope can be increased.<br>
><br>
> For example, with the following options:<br>
> *******<br>
> PetscOptionsSetValue("-ksp_type", "gmres");<br>
> PetscOptionsSetValue("-ksp_right_pc", PETSC_NULL);<br>
> PetscOptionsSetValue("-pc_type", "fieldsplit");<br>
> PetscOptionsSetValue("-pc_fieldsplit_detect_saddle_point", PETSC_NULL);<br>
> PetscOptionsSetValue("-pc_fieldsplit_type", "multiplicative");<br>
> PetscOptionsSetValue("-fieldsplit_0_ksp_type", "preonly");<br>
> PetscOptionsSetValue("-fieldsplit_0_pc_type", "ml");<br>
> PetscOptionsSetValue("-fieldsplit_1_ksp_type", "preonly");<br>
> PetscOptionsSetValue("-fieldsplit_1_pc_type", "jacobi");<br></div></div></blockquote><div><br></div><div>You do not need to set options to PETSC_NULL if you do not intend to use them.</div><div>
Also, how are you forming the splits? </div>
<div>-pc_fieldsplit_detect_saddle_point is actually rather useful here.</div><div><br></div><div>Further, there are programmatic ways to set most of these options, for example:</div><div><a href="http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCFieldSplitSetType.html" target="_blank">http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCFieldSplitSetType.html</a></div>
<div><br></div><div>Even so, we typically recommend setting these on the command line whenever possible:</div><div>this way you do not hardwire your solver choices and can quickly experiment with different</div><div>solvers without recompiling your code.</div>
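For instance, the same binary can be run with entirely different solver stacks just by changing its arguments (./stokes3d is a stand-in for your executable; both option sets assume petsc-3.3):

```shell
# Run 1: multiplicative fieldsplit, ML on velocity, Jacobi on pressure
./stokes3d -ksp_type gmres -pc_type fieldsplit \
           -pc_fieldsplit_detect_saddle_point \
           -pc_fieldsplit_type multiplicative \
           -fieldsplit_0_pc_type ml -fieldsplit_1_pc_type jacobi

# Run 2: Schur-complement fieldsplit with LSC -- no recompilation needed
./stokes3d -ksp_type gmres -pc_type fieldsplit \
           -pc_fieldsplit_detect_saddle_point \
           -pc_fieldsplit_type schur \
           -pc_fieldsplit_schur_fact_type full \
           -pc_fieldsplit_schur_precondition self \
           -fieldsplit_1_pc_type lsc
```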
<div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<div><div>
> *******<br>
> I obtain a reasonable solution but after thousands of gmres iterations. Is<br>
> there any way to improve that?<br>
><br>
> Or, if I try to embed the previous block preconditioner in mg:<br>
> *******<br>
> PetscOptionsSetValue("-pc_type", "mg");<br>
> PetscOptionsSetValue("-pc_mg_levels", "5");<br>
> PetscOptionsSetValue("-pc_mg_galerkin", PETSC_NULL);<br>
> PetscOptionsSetValue("-ksp_type", "gmres");<br>
> PetscOptionsSetValue("-mg_levels_pc_type", "fieldsplit");<br>
> PetscOptionsSetValue("-mg_levels_pc_fieldsplit_detect_saddle_point",<br>
> PETSC_NULL);<br>
> PetscOptionsSetValue("-mg_levels_pc_fieldsplit_type", "multiplicative");<br>
> PetscOptionsSetValue("-mg_levels_fieldsplit_0_ksp_type", "preonly");<br>
> PetscOptionsSetValue("-mg_levels_fieldsplit_0_pc_type", "gamg");<br>
> PetscOptionsSetValue("-mg_levels_fieldsplit_1_ksp_type", "preonly");<br>
> PetscOptionsSetValue("-mg_levels_fieldsplit_1_pc_type", "jacobi");<br>
> *******<br>
> I get the following error:<br></div></div></blockquote><div>Always send the full stack trace when reporting errors.</div><div><br></div><div>Finally, in the future you might want to post these questions to <a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>.</div>
<div><br></div><div>Dmitry. </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div><div>
> *******<br>
> PETSC ERROR: Null argument, when expecting valid pointer!<br>
> PETSC ERROR: Null Object: Parameter # 2!<br>
> PETSC ERROR: MatPtAP() line 8237 in petsc-3.3-p4/src/mat/interface/matrix.c<br>
> *******<br>
><br>
> Do you have any suggestion? What are you currently using as preconditioner<br>
> for saddle point problems?<br>
<br>
</div></div>Apologies, it appears that this email fell through the cracks<br>
somewhere back around the holidays.<br>
<br>
Did you ever find a satisfactory answer to your question?<br>
<br>
Petsc developers may occasionally read this list, but for questions<br>
that are this advanced you would probably have better luck mailing the<br>
petsc developers list directly.<br>
<br>
And of course if you do find a good block preconditioning strategy for<br>
systems_of_equations_ex1, please feel free to report it here!<br>
<br>
--<br>
John<br>
<br>
<div><div>_______________________________________________<br>
Libmesh-users mailing list<br>
<a href="mailto:Libmesh-users@lists.sourceforge.net" target="_blank">Libmesh-users@lists.sourceforge.net</a><br>
<a href="https://lists.sourceforge.net/lists/listinfo/libmesh-users" target="_blank">https://lists.sourceforge.net/lists/listinfo/libmesh-users</a><br>
</div></div></blockquote></div><br></div>
<br clear="all"><div><br></div>-- <br><div>Dmitry Karpeev, Ph.D.</div><div>Assistant Computational Mathematician</div><div>Mathematics and Computer Science</div><div>Argonne National Laboratory</div><div>Argonne, Illinois, USA</div>
<div>and</div><div>Fellow</div><div>Computation Institute</div><div>University of Chicago</div><div>5735 S. Ellis Avenue</div><div>Chicago, IL 60637</div><div>-----------------------</div><div>Phone: <a href="tel:630-252-1229" value="+16302521229" target="_blank">630-252-1229</a></div>
<div>
Fax: <a href="tel:630-252-5986" value="+16302525986" target="_blank">630-252-5986</a></div>