Schur complement system

Matthew Knepley knepley at gmail.com
Mon Mar 10 10:53:55 CDT 2008


On Mon, Mar 10, 2008 at 8:00 AM, Lisandro Dalcin <dalcinl at gmail.com> wrote:
> On 3/9/08, Kathrin Burckhardt <tribur at vision.ee.ethz.ch> wrote:
>  >  uff, sounds in fact quite complicated.
>
>  Yes, it is  :-(
>
>  > In particular, I don't
>  >  understand why you split the "local process matrix" (the local Schur
>  >  complement, right?) into four (sub?)submatrices? Is it because of the
>  >  structure of your domain?
>
>  Well, perhaps I did not understand you.
>
>  My implementation is intended to work with a MATIS global matrix. In
>  this matrix, each process assembles its local subdomain matrix (this
>  is somewhat natural in finite element methods). For a finite element
>  partitioning (as described in Y. Saad's book on iterative methods,
>  http://www-users.cs.umn.edu/~saad/books.html), the 'local' Schur
>  complement is defined by four submatrices obtained from the local
>  subdomain matrix. Those 'local' Schur complements in turn define the
>  'global' one, which is never explicitly assembled but implicitly
>  handled by defining its matrix-vector product.
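
For concreteness, the splitting and the implicit Schur product can be
sketched in petsc4py roughly as follows (the toy matrix, the index lists,
and the LU inner solve are illustrative assumptions, not the actual
petsc4py implementation):

    from petsc4py import PETSc

    # Stand-ins for what the MATIS machinery would provide: a tiny
    # sequential 'subdomain' matrix (1D Laplacian) and an assumed
    # interior/interface splitting of its local indices.
    n = 8
    A_loc = PETSc.Mat().createAIJ((n, n), comm=PETSc.COMM_SELF)
    A_loc.setUp()
    for i in range(n):
        A_loc.setValue(i, i, 2.0)
        if i > 0:
            A_loc.setValue(i, i - 1, -1.0)
        if i < n - 1:
            A_loc.setValue(i, i + 1, -1.0)
    A_loc.assemble()
    interior = list(range(n - 2))   # hypothetical interior nodes
    interface = [n - 2, n - 1]      # hypothetical interface nodes

    is_I = PETSc.IS().createGeneral(interior, comm=PETSc.COMM_SELF)
    is_B = PETSc.IS().createGeneral(interface, comm=PETSc.COMM_SELF)

    # The four submatrices defining the local Schur complement
    A_II = A_loc.createSubMatrix(is_I, is_I)   # interior-interior
    A_IB = A_loc.createSubMatrix(is_I, is_B)   # interior-interface
    A_BI = A_loc.createSubMatrix(is_B, is_I)   # interface-interior
    A_BB = A_loc.createSubMatrix(is_B, is_B)   # interface-interface

    class LocalSchur(object):
        """Applies S = A_BB - A_BI * inv(A_II) * A_IB without forming S."""
        def __init__(self, A_II, A_IB, A_BI, A_BB):
            self.A_IB, self.A_BI, self.A_BB = A_IB, A_BI, A_BB
            self.ksp = PETSc.KSP().create(PETSc.COMM_SELF)
            self.ksp.setOperators(A_II)
            self.ksp.setType(PETSc.KSP.Type.PREONLY)
            self.ksp.getPC().setType(PETSc.PC.Type.LU)  # direct interior solve
            self.t = A_II.createVecRight()  # interior work vectors
            self.u = A_II.createVecRight()
            self.w = A_BB.createVecRight()  # interface work vector

        def mult(self, mat, x, y):
            self.A_IB.mult(x, self.t)       # t = A_IB x
            self.ksp.solve(self.t, self.u)  # u = inv(A_II) t
            self.A_BB.mult(x, y)            # y = A_BB x
            self.A_BI.mult(self.u, self.w)  # w = A_BI u
            y.axpy(-1.0, self.w)            # y = (A_BB - A_BI inv(A_II) A_IB) x

    nB = is_B.getSize()
    S = PETSc.Mat().createPython((nB, nB),
                                 LocalSchur(A_II, A_IB, A_BI, A_BB),
                                 comm=PETSc.COMM_SELF)
    S.setUp()

Each application of S then costs one interior solve plus three sparse
products; the 'global' Schur complement is just the subassembly of these
local actions over the interface nodes.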

By 4 matrices, I assume you mean A, B, and C (with B^T as the fourth
block), where

  S = C - B^T A^{-1} B

right?
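
(For reference, that S is just block elimination: writing the subdomain
system as

  [ A    B ] [ x_I ]   [ f ]
  [ B^T  C ] [ x_B ] = [ g ]

and eliminating the interior unknowns, x_I = A^{-1} (f - B x_B), leaves
the interface system

  (C - B^T A^{-1} B) x_B = g - B^T A^{-1} f

so S = C - B^T A^{-1} B acts only on the interface unknowns.)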

  Thanks,

    Matt

>  >
>  >
>  >
>  >  On Fri, 7 Mar 2008, Lisandro Dalcin wrote:
>  >
>  >  > Well, I did. I have a preliminary implementation of this inside
>  >  > petsc4py (PETSc for Python). However, it is fully implemented in C as
>  >  > a standard preconditioner intended to be used with KSPPREONLY. The
>  >  > preconditioner has a sub-KSP iterating only on the global interface
>  >  > nodes (a skeleton of this arrangement is sketched after the quoted
>  >  > message below). It should be almost trivial to incorporate it inside
>  >  > other codes.
>  >  >
>  >  > A warning: all this needs a MATIS matrix to work, that is, each
>  >  > processor holds its local portion of the global matrix in an
>  >  > unassembled fashion. In order to define the Schur complement system,
>  >  > the local process matrix is split into four submatrices. At each
>  >  > processor, the 'subdomain' can be further partitioned; this is done
>  >  > with a simple-minded graph partitioning that reuses some of the core
>  >  > routines PETSc uses internally for sparse matrix reordering. In order
>  >  > to precondition the Schur complement system, a subsidiary problem can
>  >  > be defined by taking some layers of nodes around the interface nodes;
>  >  > it is iterated globally, typically with a few sweeps of a sub-sub KSP.
>  >  >
>  >  > In short, the implementation is really involved, comparable in
>  >  > difficulty to the Neumann-Neumann preconditioner (which is also
>  >  > implemented to work with MATIS). At the user level, the only
>  >  > parameters to tweak are the 'sub-subdomain' size and, of course, the
>  >  > tolerances of the inner KSP solver inside the 'SCHUR' preconditioner.
>  >  >
>  >  > Why did I never add this to PETSc? Time, lack of serious testing,
>  >  > etc., but mainly because I'm not sure it is really a good choice
>  >  > compared to PCASM. In fact, some day I should implement my tricks for
>  >  > sub-subdomain partitioning (or perhaps use METIS for this) inside ASM.
>  >  > Then the sub-subdomain problems would have a reasonable size for using
>  >  > LU, and finally I could compare PCASM with my PCSCHUR.
>  >  >
>  >  > Currently, we are using this for solving incompressible NS equations
>  >  > with a monolithic stabilized FEM formulation, and other problems with
>  >  > badly conditioned global systems. We have found no better way to solve
>  >  > our linear systems for our application, probably because we are not
>  >  > smart enough.
>  >  >
>  >  > Hope you understood me...
>
>  >  >
>  >  > On 3/7/08, Kathrin Burckhardt <tribur at vision.ee.ethz.ch> wrote:
>  >  >> Dear nice people,
>  >  >>
>  >  >>  Did you ever try to solve a Schur complement system using PETSc?
>  >  >>
>  >  >>
>  >  >
>  >  >
>  >  >
>  >
>  >
>
>
>  --
>  Lisandro Dalcín
>  ---------------
>  Centro Internacional de Métodos Computacionales en Ingeniería (CIMEC)
>  Instituto de Desarrollo Tecnológico para la Industria Química (INTEC)
>  Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET)
>  PTLC - Güemes 3450, (3000) Santa Fe, Argentina
>  Tel/Fax: +54-(0)342-451.1594
>
>
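
As for the 'SCHUR' preconditioner described in the quoted message
(KSPPREONLY outside, a sub-KSP on the interface inside), the overall
arrangement in petsc4py terms would be roughly the skeleton below; the
class name and the stubbed-out apply() steps are assumptions about the
structure, not the actual code:

    class SchurPC(object):
        """Skeleton of a shell PC: each apply() solves the interface
        (Schur complement) system with a sub-KSP and then recovers the
        interior unknowns by local back-substitution."""
        def setUp(self, pc):
            # Split the local MATIS matrix, build the implicit S
            # (as in the earlier sketch), and set up the sub-KSP on it.
            pass
        def apply(self, pc, r, z):
            # 1. restrict the global residual r to the interface nodes;
            # 2. run the sub-KSP on S for the interface values;
            # 3. back-substitute locally for the interior part of z.
            pass

    ksp = PETSc.KSP().create(PETSc.COMM_WORLD)
    ksp.setOperators(A)                  # A: the global MATIS matrix (given)
    ksp.setType(PETSc.KSP.Type.PREONLY)  # outer solver does no iteration
    pc = ksp.getPC()
    pc.setType(PETSc.PC.Type.PYTHON)
    pc.setPythonContext(SchurPC())
    ksp.solve(b, x)                      # b, x: global vectors (given)

With KSPPREONLY the outer solve reduces to a single preconditioner
application, so all the iteration happens in the sub-KSP on the interface
unknowns, as described above.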



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
-- Norbert Wiener



