[petsc-users] DMPlex overlap
Nicolas Barral
nicolas.barral at math.u-bordeaux.fr
Wed Mar 31 11:22:52 CDT 2021
On 31/03/2021 17:51, Matthew Knepley wrote:
> On Sat, Mar 27, 2021 at 9:27 AM Nicolas Barral
> <nicolas.barral at math.u-bordeaux.fr
> <mailto:nicolas.barral at math.u-bordeaux.fr>> wrote:
>
> Hi all,
>
> First, I'm not sure I understand what the overlap parameter in
> DMPlexDistributeOverlap does. I tried the following: generate a small
> mesh on 1 rank with DMPlexCreateBoxMesh, then distribute it with
> DMPlexDistribute. At this point I have two nice partitions, with shared
> vertices and no overlapping cells. Then I call DMPlexDistributeOverlap
> with the overlap parameter set to 0 or 1, and get the same resulting
> plex in both cases. Why is that?
>
>
> The overlap parameter says how many cell adjacencies to go out. You
> should not get the same
> mesh out. We have lots of examples that use this. If you send your small
> example, I can probably
> tell you what is happening.
>
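(For reference, a minimal sketch of the call under discussion; the helper
name AddOverlap and its nlayers argument are made up for illustration and
are not from the attached file. The overlap argument of
DMPlexDistributeOverlap is the number of layers of adjacent cells added
to each rank's local mesh.)

#include <petscdmplex.h>

/* Hedged sketch: wrap the overlap distribution so the number of layers
 * is easy to vary. */
PetscErrorCode AddOverlap(DM dmDist, PetscInt nlayers, DM *dmOver)
{
  PetscSF        migrationSF = NULL;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /* nlayers = 1 adds one layer of cells adjacent to the local partition,
     nlayers = 2 adds two layers, and so on. */
  ierr = DMPlexDistributeOverlap(dmDist, nlayers, &migrationSF, dmOver);CHKERRQ(ierr);
  ierr = PetscSFDestroy(&migrationSF);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}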
OK, I do have a small example illustrating both that and the DMClone
question, which I set up to understand what is going on. I attach it to
this email.
For the overlap, you can change the OVERLAP constant at the top of the
file. With OVERLAP=0 or 1, the distributed overlapping mesh (DMover,
shown with -over_dm_view) is the same in both cases, and different from
the mesh before the overlap is distributed (shown with
-distrib_dm_view). For larger overlap values the resulting meshes do
differ.
The process is:
1/ create a DM dm on 1 rank
2/ clone dm into dm2
3/ distribute dm
4/ clone dm into dm3
5/ distribute the overlap of dm
I print all the DMs after each step: dm ends up with a distributed
overlap, dm2 is not distributed, and dm3 is distributed but without
overlap. Since distribute and distributeOverlap create new DMs, I don't
seem to have a problem with the shallow copies.
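For readers without the attachment, here is a rough reconstruction of
those five steps (a sketch only, not the attached test_overlap.c; it
assumes the 9-argument DMPlexCreateBoxMesh signature of the PETSc
releases from that time):

#include <petscdmplex.h>

#define OVERLAP 1  /* change to compare overlap values */

int main(int argc, char **argv)
{
  DM             dm, dm2, dm3, dmDist = NULL, dmOver = NULL;
  PetscSF        sf = NULL;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  /* 1/ create a small box mesh (serial, to be distributed next) */
  ierr = DMPlexCreateBoxMesh(PETSC_COMM_WORLD, 2, PETSC_TRUE, NULL, NULL, NULL,
                             NULL, PETSC_TRUE, &dm);CHKERRQ(ierr);
  /* 2/ clone dm into dm2 (shallow copy, still sequential) */
  ierr = DMClone(dm, &dm2);CHKERRQ(ierr);
  /* 3/ distribute dm with no overlap */
  ierr = DMPlexDistribute(dm, 0, &sf, &dmDist);CHKERRQ(ierr);
  if (dmDist) {
    ierr = PetscSFDestroy(&sf);CHKERRQ(ierr);
    ierr = DMDestroy(&dm);CHKERRQ(ierr);
    dm   = dmDist;
  }
  ierr = DMViewFromOptions(dm, NULL, "-distrib_dm_view");CHKERRQ(ierr);
  /* 4/ clone the distributed, non-overlapping mesh into dm3 */
  ierr = DMClone(dm, &dm3);CHKERRQ(ierr);
  /* 5/ distribute the overlap of dm: OVERLAP layers of ghost cells */
  ierr = DMPlexDistributeOverlap(dm, OVERLAP, &sf, &dmOver);CHKERRQ(ierr);
  if (dmOver) {
    ierr = PetscSFDestroy(&sf);CHKERRQ(ierr);
    ierr = DMViewFromOptions(dmOver, NULL, "-over_dm_view");CHKERRQ(ierr);
    ierr = DMDestroy(&dmOver);CHKERRQ(ierr);
  }
  ierr = DMDestroy(&dm3);CHKERRQ(ierr);
  ierr = DMDestroy(&dm2);CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Running with, e.g., mpiexec -n 2 ./test_overlap -distrib_dm_view
-over_dm_view then lets one compare the mesh before and after the
overlap is distributed.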
> Second, I'm wondering what would be a good way to handle two overlaps
> and associated local vectors. In my adaptation code, the remeshing
> library requires a non-overlapping mesh, while the refinement criterion
> computation is based on Hessians, which require a layer of
> overlap. What I can do is clone the dm before distributing the overlap,
> then manage two independent plex objects with their own local sections
> etc. and copy/trim local vectors manually. Is there a more automatic
> way to do this?
>
>
> DMClone() is a shallow copy, so that will not work. You would maintain
> two different Plexes, overlapping
> and non-overlapping, with their own sections and vecs. Are you sure you
> need to keep around the non-overlapping one?
> Maybe if I understood what operations you want to work, I could say
> something more definitive.
>
I need to be able to pass the non-overlapping mesh to the remesher. I
can either maintain two plexes, or trim the overlapping plex when I
create the arrays I give to the remesher. I'm not sure which is the
better (or least bad) option.
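For what it's worth, one possible way to do the trimming (a sketch under
assumptions, not code from the attached file): on a distributed Plex,
points that were copied from another rank are exactly the leaves of the
point SF, so the overlap cells can be flagged and skipped when building
the arrays passed to the remesher.

#include <petscdmplex.h>

/* Count the locally owned (non-overlap) cells of an overlapping Plex. */
PetscErrorCode CountOwnedCells(DM dmOver, PetscInt *nOwned)
{
  PetscSF         pointSF;
  const PetscInt *leaves;
  PetscInt        nleaves, cStart, cEnd, c;
  PetscBool      *isGhost;
  PetscErrorCode  ierr;

  PetscFunctionBeginUser;
  ierr = DMPlexGetHeightStratum(dmOver, 0, &cStart, &cEnd);CHKERRQ(ierr);
  ierr = DMGetPointSF(dmOver, &pointSF);CHKERRQ(ierr);
  ierr = PetscSFGetGraph(pointSF, NULL, &nleaves, &leaves, NULL);CHKERRQ(ierr);
  if (nleaves < 0) nleaves = 0; /* graph not set, e.g. serial run */
  ierr = PetscCalloc1(cEnd - cStart, &isGhost);CHKERRQ(ierr);
  /* Flag cells that are leaves of the point SF: these live on the overlap. */
  for (c = 0; c < nleaves; ++c) {
    PetscInt p = leaves ? leaves[c] : c;
    if (p >= cStart && p < cEnd) isGhost[p - cStart] = PETSC_TRUE;
  }
  *nOwned = 0;
  for (c = cStart; c < cEnd; ++c) if (!isGhost[c - cStart]) (*nOwned)++;
  /* ...fill the remesher's cell array with the cells not flagged here... */
  ierr = PetscFree(isGhost);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}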
Thanks
--
Nicolas
> Thanks,
>
> Matt
>
> Thanks
>
> --
> Nicolas
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/ <http://www.cse.buffalo.edu/~knepley/>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: test_overlap.c
Type: text/x-csrc
Size: 2429 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20210331/7e586b40/attachment-0001.bin>