[petsc-dev] Using PCFieldSplitSetIS

Jed Brown jed at 59A2.org
Mon Mar 14 07:45:13 CDT 2011


On Mon, Mar 14, 2011 at 12:32, Thomas Witkowski <
thomas.witkowski at tu-dresden.de> wrote:

> Should I define blocks or splits for the subdomains and the interior nodes?
> And what is the best way to force PETSc to compute an LU factorization on
> each subdomain and to store it (it is needed to create the reduced Schur
> system, to define the action of the Schur complement operator, and to solve
> for the subdomain unknowns in the last step) and to reuse it later?


Okay, define two splits. The first consists of all the interior nodes and the
second of all the interface nodes. Now use -pc_fieldsplit_type schur
-fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type bjacobi
-fieldsplit_0_sub_pc_type lu. Remember to look at -ksp_view and -help for
options. You have a choice of how to precondition the Schur complement; by
default it just uses the interface matrix itself (which is usually nearly
diagonal).
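
For concreteness, here is a minimal sketch of that setup in C. It is untested;
A, b, x, isInterior, and isInterface are placeholders for your own matrix,
vectors, and index sets, error checking is omitted, and depending on the PETSc
release KSPSetOperators may take an additional MatStructure argument.

  KSP ksp;
  PC  pc;

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);                   /* assembled parallel matrix */
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCFIELDSPLIT);
  PCFieldSplitSetIS(pc, "0", isInterior);       /* split 0: interior nodes  */
  PCFieldSplitSetIS(pc, "1", isInterface);      /* split 1: interface nodes */
  PCFieldSplitSetType(pc, PC_COMPOSITE_SCHUR);  /* same as -pc_fieldsplit_type schur */
  KSPSetFromOptions(ksp);                       /* picks up the -fieldsplit_0_* options */
  KSPSolve(ksp, b, x);

Then run with the options above; -ksp_view will show which inner solvers were
actually configured.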

That is, performing block Jacobi with direct subdomain solves on the
(parallel) interior matrix will be the same as a direct solve with this
matrix because all the subdomains are actually uncoupled.

My point about exposing less concurrency had to do with always needing to
solve problems with the parallel interior-node matrix, which could actually
be stored separately since the subdomain systems are not truly coupled. This
is most relevant with multiple subdomains per process or if you are forming an
explicit Schur complement (to build a coarse-level operator, such as with
FETI-DP/BDDC).