[petsc-dev] FieldSplit and PCLSC

Jed Brown jed at 59A2.org
Tue May 3 09:26:39 CDT 2011


On Tue, May 3, 2011 at 15:50, <domenico.borzacchiello at univ-st-etienne.fr> wrote:

> Hi,
> I'm solving the Stokes equations with PCFieldSplit and Schur complement
> preconditioning. If I use a direct solver for the A(0,0) block and no
> preconditioner for the Schur complement (just FGMRES), the outer solver
> converges in typically 1-2 iterations and the inner Schur complement
> solver in 10-15 iterations, depending on the problem size. If I instead
> precondition the Schur complement with PCLSC (dev version), with a direct
> solve for the product A(1,0)*A(0,1), the Schur complement solver takes
> 350+ iterations to converge. What could be the cause of this behavior?
> I've used LSC before (though not in PETSc) for transient Navier-Stokes
> and it worked fine.
>
>
I suspect it is because you need -ksp_diagonal_scale. In petsc-dev, look at
src/ksp/ksp/examples/tests/makefile, targets runex11 and runex11_2, for
working configurations for a hard Stokes problem in geodynamics.
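
For concreteness, here is a sketch of the kind of options I mean (the
fieldsplit_0_/fieldsplit_1_ prefixes assume the default naming of the splits,
so adjust to your setup rather than taking this as a verified configuration):

  -ksp_type fgmres -ksp_diagonal_scale -ksp_diagonal_scale_fix \
  -pc_type fieldsplit -pc_fieldsplit_type schur \
  -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type lu \
  -fieldsplit_1_ksp_type fgmres -fieldsplit_1_pc_type lsc \
  -fieldsplit_1_lsc_ksp_type preonly -fieldsplit_1_lsc_pc_type lu

The fieldsplit_1_lsc_ prefix reaches the inner solve that PCLSC does with the
A(1,0)*A(0,1) product, which is where the direct solve you describe would go.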

LSC is normally described with the scaling "built in". Perhaps we should
identify the fieldsplit/lsc combination and choose scaling automatically
since it's usually necessary.
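
For reference, what I mean by the scaling being built in (my notation,
following the usual presentation in the literature rather than anything
specific to PETSc): with B = A(1,0) and a scaling matrix T, typically the
inverse of the diagonal of the velocity mass matrix or of A(0,0), scaled LSC
approximates

  inv(S) ~ inv(B T B') (B T A(0,0) T B') inv(B T B')

Running with T = I, as the unscaled variant effectively does, leaves the
method sensitive to how the system happens to be scaled; -ksp_diagonal_scale
rescales the system symmetrically so that T = I becomes a reasonable choice.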

