[petsc-users] Using the PCASM interface to define minimally overlapping subdomains

Matthew Knepley knepley at gmail.com
Tue Sep 16 14:29:46 CDT 2014


On Tue, Sep 16, 2014 at 2:23 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
>    Patrick,
>
>      This "local part of the subdomains for this processor" term in
> PCASMSetLocalSubdomains is, IMHO, extremely confusing. WTHWTS? Anyways, I
> think that if you set the is_local[] to be different than the is[] you will
> always end up with a nonsymmetric preconditioner. I think for one dimension
> you need to use


No, I don't think that is right. The problem below is that you have overlap
in only one direction: Process 0 overlaps
Process 1, but Process 1 has no overlap of Process 0. This is not how
Schwarz is generally envisioned.

Imagine the linear algebra viewpoint, which I think is cleaner here. You
partition the matrix rows into non-overlapping
sets. These sets are is_local[]. Then any information you get from another
domain is another row, which is put into
is[]. You can certainly have a non-symmetric overlap, as you do below,
but it means one-way information
transmission, which is strange for convergence.

  Matt


>
> > is[0] <-- 0 1 2 3
> > is[1] <-- 3 4 5 6
> > is_local[0] <-- 0 1 2 3
> > is_local[1] <-- 3 4 5 6
>
> Or you can pass NULL for is_local and use PCASMSetOverlap(pc,0);
>
>   Barry
>
>
> Note that is_local[] doesn’t have to be non-overlapping or anything.
>
>
> On Sep 16, 2014, at 10:48 AM, Patrick Sanan <patrick.sanan at gmail.com>
> wrote:
>
> > For the purposes of reproducing an example from a paper, I'd like to use
> PCASM with subdomains which 'overlap minimally' (though this is probably
> never a good idea in practice).
> >
> > In one dimension with 7 unknowns and 2 domains, this might look like
> >
> > 0  1  2  3  4  5  6  (unknowns)
> > ------------          (first subdomain  : 0 .. 3)
> >         -----------  (second subdomain : 3 .. 6)
> >
> > The subdomains share only a single grid point, which differs from the
> way PCASM is used in most of the examples.
> >
> > In two dimensions, minimally overlapping rectangular subdomains would
> overlap in exactly one row or column of the grid. Thus, for example, if
> the grid unknowns were
> >
> > 0  1  2  3  4  5  |
> > 6  7  8  9  10 11 | |
> > 12 13 14 15 16 17   |
> >         --------
> > -----------
> >
> > then one minimally-overlapping set of 4 subdomains would be
> > 0 1 2 3 6 7 8 9
> > 3 4 5 9 10 11
> > 6 7 8 9 12 13 14 15
> > 9 10 11 15 16 17
> > as suggested by the dashes and pipes above. The subdomains only overlap
> by a single row or column of the grid.
> >
> > My question is whether and how one can use the PCASM interface to work
> with these sorts of decompositions (it's fine for my purposes to use a
> single MPI process). In particular, I don't quite understand whether it
> should be possible to define these decompositions by correctly providing
> is and is_local arguments to PCASMSetLocalSubdomains.
> >
> > I have gotten code to run by defining the is_local entries to be subsets
> of the is entries which define a partition of the global degrees of
> freedom*, but I'm not certain that this was the correct choice, as it
> appears to produce a nonsymmetric preconditioner for a symmetric system
> when I use direct subdomain solves and the 'basic' type for PCASM.
> >
> > * For example, in the 1D example above this would correspond to
> > is[0] <-- 0 1 2 3
> > is[1] <-- 3 4 5 6
> > is_local[0] <-- 0 1 2
> > is_local[1] <-- 3 4 5 6
> >
> >
> >
> >
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

