<div class="gmail_quote">On Tue, May 3, 2011 at 15:50, <span dir="ltr"><<a href="mailto:domenico.borzacchiello@univ-st-etienne.fr">domenico.borzacchiello@univ-st-etienne.fr</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
> Hi,
> I'm solving the Stokes equations with PCFieldSplit and Schur complement
> preconditioning. If I use a direct solver for the A(0,0) block and no
> preconditioning for the Schur complement (just FGMRES), the outer solver
> converges in typically 1-2 iterations and the inner solver for the Schur
> complement in 10-15 iterations, depending on the problem size. If I try to
> precondition the Schur complement with PCLSC (dev version), with a direct
> solve for the product A(1,0)*A(0,1), the Schur complement solver takes
> 350+ iterations to converge. What could be the cause of this behavior?
> I've used LSC before (not in PETSc, however) for transient Navier-Stokes,
> and it used to work fine.

I suspect it is because you need -ksp_diagonal_scale. In petsc-dev, look at src/ksp/ksp/examples/tests/makefile, targets runex11 and runex11_2, for working configurations for a hard Stokes problem in geodynamics.
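For reference, here is a rough sketch of the kind of option set those targets exercise; the executable name and the specific inner solver choices are illustrative assumptions rather than a verbatim copy of the makefile targets:

    # illustrative options only; -ksp_diagonal_scale is the relevant addition
    ./ex11 -ksp_type fgmres -ksp_diagonal_scale \
      -pc_type fieldsplit -pc_fieldsplit_type schur \
      -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type lu \
      -fieldsplit_1_ksp_type fgmres -fieldsplit_1_pc_type lsc

The point is simply that the diagonal scaling is applied to the outer solve on top of whatever fieldsplit/LSC configuration you already have.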
LSC is normally described with the scaling "built in". Perhaps we should identify the fieldsplit/lsc combination and choose scaling automatically, since it's usually necessary.