[petsc-dev] PCASM uses DMCreateDecomposition which splits field, NOT expected behavior

Dmitry Karpeev karpeev at mcs.anl.gov
Tue May 29 06:42:26 CDT 2012


On Mon, May 28, 2012 at 5:22 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:

> On Mon, May 28, 2012 at 5:10 PM, Dmitry Karpeev <karpeev at mcs.anl.gov> wrote:
>
>> On Mon, May 28, 2012 at 5:02 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
>>
>>> This is an enormous amount of new code and changes to put in during a
>>> code freeze, but my test case works now.
>>
>> We can revert if it starts causing problems.  The extra code is due to
>> the API change, hence the several new interface functions.
>>
>>
>>> A couple of comments from looking at the patch:
>>>
>>> I'm not sure I like the direction of the link in DMSetEmbedding(DM dm,
>>> IS embedding, DM ambientdm). Normally the composite DM knows how to fit its
>>> pieces together, but the sub-DM does not have "upward links" to the global
>>> problem.
>>>
>> This is to be able to combine domain and field splits: sometimes a subDM
>> has to know how to "relativize" itself with respect to another subDM.
>>
>
> I don't believe it has to. The current approach in PETSc has always been
> that we store links going the other direction. I don't like the
> inconsistency, and I think the model of holding only downward links is
> much more sustainable.
>

I removed the embedding code, so now there is just about the minimum
necessary to resolve the issue of field/subdomain decomposition mixing:
http://petsc.cs.iit.edu/petsc/petsc-dev/rev/b207eabfcac1
Little of PETSc's code depends on this, but external DM impls (e.g.,
DM_libMesh) need this API to serve up decompositions correctly.


>
> What algorithm can only be implemented with these upward links?
>

The "upward" links are still necessary at least in my experience, but they
can be hidden in the impl of the decomposition DM.  The difference with
other PETSc code (I assume you are alluding to DMComposite, which only
needs "downward" links to its constituent pieces) is that of putting things
together versus taking them apart: DMComposite inductively _defines_ a
global numbering by ordering the constituent pieces with an existing
relative numbering.  Those pieces don't need to know whether/how their
parent composite DM might be embedded into something else.  A DM wrapping a
piece of an existing mesh, on the other hand, already has the global
numbering of that piece, but may need to compute the relative numbering
inside another, bigger mesh piece.  To do that, the smaller DM needs to
know how the ambient DM is embedded into the overall mesh. For example, say
I have a problem with variables (u,v,p) numbered globally by libMesh.  I
first want to split off p decomposing the system into (u,v) and (p), and
then further decompose the first piece into (u) and (v).  I only have the
global numberings of the u, v and p fields to work with. In order to know
how u is embedded into (u,v) I need to know (u,v) --> (u,v,p) and then make
(u) --> (u,v,p) factor through that.  The problem is that there can in
principle be an arbitrary number of various decompositions, so storing all
of the "downward" links is impractical.  In contrast, DMComposite only
deals with a single decomposition.  Of course, the decomposition DM
obtained via DMCreateField/DomainDecomposition() is supposed to resolve
that issue by serving as a proxy for a particular collection of "downward"
or "upward" links in an impl-defined way.

Incidentally, another use of embedding for relativization is in composing
active-set DMs with (other) field/domain decompositions: if I am only
solving a subsystem defined by the (in)active variables, want to use
PCFIELDSPLIT or PC(G)ASM with it, and want to reuse the splits/subdomains
defined by the original DM, I have to recalculate the ISs defining the
original subdomains, since I am now only interested in their
intersections with the smaller (in)active space.
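In terms of the sketch above, that recalculation is just the drop case:
relativizing an original subdomain IS against the (in)active IS yields
their intersection, renumbered in the reduced space (again hypothetical
code, not the committed API):

/* isSubdomain: global indices of an original subdomain (from the full DM);
   isActive:    global indices of the (in)active variables */
IS isSubInActive;
ierr = RelativizeIS(isSubdomain, isActive, PETSC_TRUE, &isSubInActive);CHKERRQ(ierr);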

>
>
>>
>>> There are dead links in your documentation because you reference
>>> functions that don't exist (were renamed), e.g. DMCreateDecomposition() in
>>> the DMSetEmbedding() man page. There may be more. You should generate the
>>> docs and check the man pages.
>>>
>> These should be gone now.

>
>>> How does your interface support "Neumann" subdomains?
>>>
>>
>> I think this issue is orthogonal to the definition of the decomposition,
>> since there has to be additional support at the PC and Mat level to
>> assemble the Neumann subproblems.  At this point you would have to
>> combine PCASMSetModifySubmatrices() with null space removal.
>>
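To be concrete, the linear-case combination above would look roughly like
this.  It is only a sketch against the current API: I am assuming the
callback shape documented for the modify-submatrices hook
(PCSetModifySubMatrices() in the PC man pages; check the exact name and
signature in your version) and a pure Neumann subproblem whose null space
is the constant vector.

#include <petscpc.h>

/* Sketch: attach the constant null space to each "Neumann" submatrix.
   The callback shape is assumed from the modify-submatrices man page. */
static PetscErrorCode ModifySubMatrices(PC pc, PetscInt nsub, IS row[], IS col[], Mat submat[], void *ctx)
{
  PetscErrorCode ierr;
  PetscInt       i;

  PetscFunctionBegin;
  for (i = 0; i < nsub; ++i) {
    MatNullSpace nsp;
    /* pure Neumann subproblem: constants are in the null space */
    ierr = MatNullSpaceCreate(PETSC_COMM_SELF, PETSC_TRUE, 0, PETSC_NULL, &nsp);CHKERRQ(ierr);
    ierr = MatSetNullSpace(submat[i], nsp);CHKERRQ(ierr);
    ierr = MatNullSpaceDestroy(&nsp);CHKERRQ(ierr);
  }
  PetscFunctionReturn(0);
}

/* registration (after PCSetType(pc, PCASM)):
   ierr = PCSetModifySubMatrices(pc, ModifySubMatrices, PETSC_NULL);CHKERRQ(ierr); */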
>
> Sure, you have to assemble to a different format, but what interface are
> we going to use. In the linear case, we can, in principle, do everything
> with matrices. For the nonlinear case, we need DM APIs. So in the nonlinear
> context, how are we going to build Neumann problems?
>
I think the "PETSc-facing" API is more or less clear: DMFunction,
DMJacobian on each of the decomposition pieces define the subproblem. The
issue with the user-facing API is how to make defining the subproblems
relatively easy?
DMDAComputeFunctionLocal()?  In general you'd want something like the Moose
API: the user provides a quadrature-point kernel (maybe that's too fine a
granularity; an element-loop instead?), and the DM traverses the subdomain
and puts the kernel invocations together?
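For instance, the user-facing piece might be as small as this (entirely
hypothetical names and granularity, just to fix ideas):

#include <petscdm.h>

/* Hypothetical user-facing API: the user supplies an element kernel and the
   DM drives the loop over a subdomain's elements, gathering the element
   state, invoking the kernel, and scattering the element residual back. */
typedef PetscErrorCode (*DMElementKernel)(DM dm, PetscInt element,
                                          const PetscScalar *xElem, /* element-local state    */
                                          PetscScalar *fElem,       /* element-local residual */
                                          void *ctx);

/* hypothetical registration on a decomposition piece:
   ierr = DMSubdomainSetElementKernel(subdm, MyKernel, userctx);CHKERRQ(ierr); */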

Dmitry.

>
>
>>> Your DMCreateDomainDecomposition() seems to provide nominally overlapping
>>> index sets, but how do we determine what part is owned? (RASM, especially,
>>> needs both an overlapping decomposition and a (non-overlapping) partition.)
>>>
>> My intent was to have these be nonoverlapping and expand them with
>> another call (e.g., DMDomainDecompositionIncreaseOverlap()).
>>
>
> But with separate functions, we have to jump through extra hoops to use a
> user-specified decomposition (e.g. that carefully selects Lagrange
> multipliers or pressure variables to maintain inf-sup stability).
>
>
>>  The current API plays better with GASM, since that PC assumes that the
>> basic subdomains are nonoverlapping. That fix will have to wait for the
>> next release, I guess.
>>
>
> PCGASMSetSubdomains() also has two IS arguments, so I don't know what you
> mean.
>
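For reference, the two-IS usage I had in mind is the following (a sketch;
I am assuming iis holds the inner, nonoverlapping subdomains and ois the
outer, overlapping ones, per the PCGASMSetSubdomains() man page):

/* n local subdomains: inner (nonoverlapping) ISs iis[] and outer
   (overlapping) ISs ois[], both assumed already created */
ierr = PCSetType(pc, PCGASM);CHKERRQ(ierr);
ierr = PCGASMSetSubdomains(pc, n, iis, ois);CHKERRQ(ierr);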
>
> We need to work out an interface for nonlinear DD (like ASPIN) this
> summer, but I don't think it belongs in this release. I'm not so wild about
> putting new interfaces into this release that will be changed immediately
> after.
>