[petsc-dev] Redistribution of a DMPlex onto a new communicator

Matthew Knepley knepley at gmail.com
Mon Aug 19 08:45:38 CDT 2019


On Mon, Aug 19, 2019 at 9:43 AM Lawrence Mitchell <wence at gmx.li> wrote:

> On Mon, 19 Aug 2019 at 13:53, Matthew Knepley <knepley at gmail.com> wrote:
>
> [...]
>
> >> OK, so I think I am getting there. Presently I am abusing
> >> DMPlexCreateFromDAG to migrate a DM on oldComm onto newComm, but this
> >> is very fragile. I attach what I have right now. You have to run it
> >> with PT-Scotch, because ParMETIS refuses to partition graphs with no
> >> vertices on a process: again, this would be avoided if the
> >> partitioning were done on a source communicator with the number of
> >> partitions given by a target communicator.
> >
> >
> > Sorry I am just getting to this. It's a great example. I was thinking
> > of just pushing this stuff into the library, but I had the following
> > thought: what if we reused DMClone() to put things on another Comm,
> > since we do not actually want a copy? The internals would then have to
> > contend with a NULL impl, which might be a lot of trouble. I was going
> > to try it out in a branch. It seems more elegant to me.
>
> If you want to start from some slightly more debugged code, use the
> branch wence/feature/dmplex-distribute-onto-comm.
>
> If the internals of DMPlexDistribute are going to be refactored to
> contend with a NULL implementation, then I think we should go the
> whole hog and disconnect the number of target partitions from the
> communicator of the to-be-distributed DM.
>
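
[For context, the DMPlexCreateFromDAG workaround described above boils
down to rebuilding the DAG on the target communicator. Below is a minimal
sketch of that core step, assuming the ranks of newcomm already hold the
(redistributed) DAG arrays; the inter-communicator transfer of those
arrays is elided, and MigratePlexToComm is an illustrative name, not
PETSc API:

    #include <petscdmplex.h>

    /* Sketch: rebuild a DMPlex from olddm's DAG on newcomm.  Call this only
       on ranks that belong to newcomm; moving the DAG arrays between the two
       communicators is the part elided here. */
    static PetscErrorCode MigratePlexToComm(DM olddm, MPI_Comm newcomm, DM *newdm)
    {
      Vec                coordinates;
      const PetscScalar *coords;
      PetscInt          *numPoints, *coneSizes, *cones, *ornts;
      PetscInt           dim, cdim, depth, d, p, pStart, pEnd, off = 0;
      PetscErrorCode     ierr;

      PetscFunctionBeginUser;
      ierr = DMGetDimension(olddm, &dim);CHKERRQ(ierr);
      ierr = DMGetCoordinateDim(olddm, &cdim);CHKERRQ(ierr);
      ierr = DMPlexGetDepth(olddm, &depth);CHKERRQ(ierr);
      ierr = DMPlexGetChart(olddm, &pStart, &pEnd);CHKERRQ(ierr);
      /* Count the points in each depth stratum */
      ierr = PetscMalloc1(depth+1, &numPoints);CHKERRQ(ierr);
      for (d = 0; d <= depth; ++d) {
        PetscInt s, e;
        ierr = DMPlexGetDepthStratum(olddm, d, &s, &e);CHKERRQ(ierr);
        numPoints[d] = e - s;
      }
      /* Flatten cone sizes, cones, and orientations in chart order */
      ierr = PetscMalloc1(pEnd-pStart, &coneSizes);CHKERRQ(ierr);
      for (p = pStart; p < pEnd; ++p) {
        ierr = DMPlexGetConeSize(olddm, p, &coneSizes[p-pStart]);CHKERRQ(ierr);
        off += coneSizes[p-pStart];
      }
      ierr = PetscMalloc2(off, &cones, off, &ornts);CHKERRQ(ierr);
      for (p = pStart, off = 0; p < pEnd; ++p) {
        const PetscInt *cone, *ornt;
        PetscInt        c;

        ierr = DMPlexGetCone(olddm, p, &cone);CHKERRQ(ierr);
        ierr = DMPlexGetConeOrientation(olddm, p, &ornt);CHKERRQ(ierr);
        for (c = 0; c < coneSizes[p-pStart]; ++c, ++off) {
          cones[off] = cone[c] - pStart; /* shift so the new chart starts at 0 */
          ornts[off] = ornt[c];
        }
      }
      /* Create an empty DM on the target communicator and fill in the DAG */
      ierr = DMCreate(newcomm, newdm);CHKERRQ(ierr);
      ierr = DMSetType(*newdm, DMPLEX);CHKERRQ(ierr);
      ierr = DMSetDimension(*newdm, dim);CHKERRQ(ierr);
      ierr = DMSetCoordinateDim(*newdm, cdim);CHKERRQ(ierr);
      ierr = DMGetCoordinatesLocal(olddm, &coordinates);CHKERRQ(ierr);
      ierr = VecGetArrayRead(coordinates, &coords);CHKERRQ(ierr);
      ierr = DMPlexCreateFromDAG(*newdm, depth, numPoints, coneSizes, cones, ornts, coords);CHKERRQ(ierr);
      ierr = VecRestoreArrayRead(coordinates, &coords);CHKERRQ(ierr);
      ierr = PetscFree(numPoints);CHKERRQ(ierr);
      ierr = PetscFree(coneSizes);CHKERRQ(ierr);
      ierr = PetscFree2(cones, ornts);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }

The PT-Scotch dependence enters earlier, at partitioning time: selecting
it with DMPlexGetPartitioner() and PetscPartitionerSetType(part,
PETSCPARTITIONERPTSCOTCH) avoids ParMETIS's refusal to partition graphs
with empty ranks.]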

Hmm, that is nicer. We just allow a Comm argument to Distribute, and
underneath it works like your example.

   Matt
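
[Concretely, the change being discussed would let the distribution target
a communicator other than the DM's own. A hypothetical sketch of such an
interface and its use follows; DMPlexDistributeOntoComm is a placeholder
name taken from the spirit of Lawrence's branch, not existing PETSc API:

    /* Hypothetical interface: partition dm on its own communicator into as
       many parts as newcomm has ranks, then migrate the mesh there.  This
       function does not exist in PETSc; the name is a placeholder. */
    PetscErrorCode DMPlexDistributeOntoComm(DM dm, PetscInt overlap, MPI_Comm newcomm, PetscSF *sf, DM *dmParallel);

    /* Possible usage: shrink a mesh from PETSC_COMM_WORLD onto the even ranks */
    MPI_Comm    newcomm;
    PetscMPIInt rank;
    PetscSF     sf;
    DM          dmDist;

    MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
    MPI_Comm_split(PETSC_COMM_WORLD, rank % 2 ? MPI_UNDEFINED : 0, rank, &newcomm);
    /* Every rank of dm's communicator participates in the partitioning; only
       ranks where newcomm != MPI_COMM_NULL receive a piece of the mesh. */
    DMPlexDistributeOntoComm(dm, 0, newcomm, &sf, &dmDist);

This would decouple the number of target partitions from the communicator
of the to-be-distributed DM, as proposed above.]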


> Lawrence
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/