FieldSplit Schur preconditioner
Chris Kees
Christopher.E.Kees at usace.army.mil
Sat Sep 6 22:05:12 CDT 2008
OK. Thanks for the suggestions. I'll pull petsc-dev next week and see if
mumps gives me the same behavior as superlu_dist before trying to set up a
Schur complement preconditioner for Stokes.
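A rough sketch of the PCFieldSplit Schur options involved (assuming recent PCFieldSplit option names and the default split prefixes fieldsplit_0_ for velocity and fieldsplit_1_ for pressure; the executable name and the inner solver choices are only illustrative, and the exact spellings may differ in petsc-dev of this vintage):

./stokes -pc_type fieldsplit -pc_fieldsplit_type schur \
         -pc_fieldsplit_schur_fact_type diag \
         -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type lu \
         -fieldsplit_1_ksp_type gmres -fieldsplit_1_pc_type jacobi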
Chris
On 9/6/08 11:31 AM, "Jed Brown" <jed at 59A2.org> wrote:
> On Sat 2008-09-06 10:57, Kees, Christopher E wrote:
>> I'm interested in these types of Schur complement preconditioners as well. I
>> had stopped working with petsc-dev some time back because I wasn't using new
>> functionality. Would this be a good time to start working with petsc-dev in
>> order to start testing these? I'm primarily interested in Stokes and
>> Navier-Stokes type systems but also have some other coupled systems that an
>> "operator-split" preconditioner should work well on.
>
> My opinion is YES, now is a good time to start using petsc-dev.
>
>> With regard to testing, I am also looking for a solid parallel direct solver
>> to verify everything from the nonlinear solver up. I have been using superlu
>> in serial and superlu_dist in parallel (through PETSc) but have noticed some
>> reduction in accuracy as I go to more processors with superlu_dist. This may
>> just be a bug in our code, but I thought that if I get petsc-dev, the new
>> approach to external direct solvers that Barry mentioned earlier might make it easier
>> to look at some other direct solvers. Several years ago I got better results
>> with spooles, but I'd be interested in any current recommendations.
>
> From what I've seen, this is more of a code refactoring and interface
> change than a change in functionality. The old way would be
>
> ./ex2 -pc_type lu -mat_type {aijspooles,superlu_dist,aijmumps,umfpack}
>
> but now it looks like
>
> ./ex2 -pc_type lu -pc_factor_mat_solver_package {spooles,superlu_dist,mumps,...}
>
> Jed
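The new -pc_factor_mat_solver_package option also composes with solver prefixes, so one sketch of using a parallel direct solver inside a block of the field split (assuming mumps is installed and the same illustrative executable name and split prefixes as above) would be:

./stokes -pc_type fieldsplit -pc_fieldsplit_type schur \
         -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type lu \
         -fieldsplit_0_pc_factor_mat_solver_package mumps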