FieldSplit Schur preconditioner

Jed Brown jed at 59A2.org
Sat Sep 6 11:31:07 CDT 2008


On Sat 2008-09-06 10:57, Kees, Christopher E wrote:
> I'm interested in these types of Schur complement preconditioners as well. I
> had stopped working with petsc-dev some time back because I wasn't using new
> functionality. Would this be a good time to start working with petsc-dev in
> order to start testing these? I'm primarily interested in Stokes and
> Navier-Stokes type systems, but I also have some other coupled systems that an
> "operator-split" preconditioner should work well on.

My opinion is YES, now is a good time to start using petsc-dev.
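
For a Stokes-type saddle-point system, the kind of run-time options the
fieldsplit Schur support is aimed at look roughly like the sketch below.
This is only a sketch: the flag names are taken from the PCFieldSplit
interface and may differ in petsc-dev at any given date, so check
-help output before relying on them.

  # Hedged sketch of a 2x2 (velocity/pressure) split; flag names assumed
  -pc_type fieldsplit
  -pc_fieldsplit_type schur          # Schur-complement factorization
  -fieldsplit_0_pc_type ilu          # preconditioner for the (0,0) block
  -fieldsplit_1_pc_type jacobi       # preconditioner for the Schur block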

> With regard to testing, I am also looking for a solid parallel direct solver
> to verify everything from the nonlinear solver up. I have been using superlu
> in serial and superlu_dist in parallel (through PETSc) but have noticed some
> reduction in accuracy as I go to more processors with superlu_dist. This may
> just be a bug in our code, but I thought if I get petsc-dev the new approach
> to external direct solvers that Barry mentioned earlier might make it easier
> to look at some other direct solvers. Several years ago I got better results
> with spooles, but I'd be interested in any current recommendations.

From what I've seen, this is more of a code refactoring and interface
change than a change in functionality.  The old way would be

  ./ex2 -pc_type lu -mat_type {aijspooles,superlu_dist,aijmumps,umfpack}

but now it looks like

  ./ex2 -pc_type lu -pc_factor_mat_solver_package {spooles,superlu_dist,mumps,...}
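
The same selection can also be put in a PETSc options file (e.g. one
passed with -options_file), dropping the leading dash; for example, to
pick mumps:

  # options-file equivalent of the command line above
  pc_type lu
  pc_factor_mat_solver_package mumps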

Jed

