[petsc-users] DMPlex overlap
Matthew Knepley
knepley at gmail.com
Wed Mar 31 14:01:35 CDT 2021
On Wed, Mar 31, 2021 at 2:59 PM Nicolas Barral
<nicolas.barral at math.u-bordeaux.fr> wrote:
> On 31/03/2021 20:10, Matthew Knepley wrote:
> > On Wed, Mar 31, 2021 at 1:41 PM Nicolas Barral
> > <nicolas.barral at math.u-bordeaux.fr> wrote:
> >
> > Thanks Matt, but sorry I still don't get it. Why does:
> >
> > static char help[] = "Tests plex distribution and overlaps.\n";
> >
> > #include <petsc/private/dmpleximpl.h>
> >
> > int main (int argc, char * argv[]) {
> >
> > DM dm;
> > MPI_Comm comm;
> > PetscErrorCode ierr;
> >
> > ierr = PetscInitialize(&argc, &argv, NULL, help);if (ierr) return ierr;
> > comm = PETSC_COMM_WORLD;
> >
> > ierr = DMPlexCreateBoxMesh(comm, 2, PETSC_TRUE, NULL, NULL, NULL,
> > NULL, PETSC_TRUE, &dm);CHKERRQ(ierr);
> > ierr = DMSetFromOptions(dm);CHKERRQ(ierr);
> > ierr = PetscObjectSetName((PetscObject) dm, "Initial DM");CHKERRQ(ierr);
> > ierr = DMViewFromOptions(dm, NULL, "-initial_dm_view");CHKERRQ(ierr);
> >
> >
> > ierr = DMDestroy(&dm);CHKERRQ(ierr);
> > ierr = PetscFinalize();
> > return ierr;
> > }
> >
> > called with mpiexec -n 2 ./test_overlapV2 -initial_dm_view
> > -dm_plex_box_faces 5,5 -dm_distribute -dm_distribute_overlap 0
> >
> > give
> > DM Object: Initial DM 2 MPI processes
> > type: plex
> > Initial DM in 2 dimensions:
> > 0-cells: 21 21
> > 1-cells: 45 45
> > 2-cells: 25 25
> > Labels:
> > depth: 3 strata with value/size (0 (21), 1 (45), 2 (25))
> > celltype: 3 strata with value/size (0 (21), 1 (45), 3 (25))
> > marker: 1 strata with value/size (1 (21))
> > Face Sets: 1 strata with value/size (1 (10))
> >
> > which is what I expect, while
> >
> > static char help[] = "Tests plex distribution and overlaps.\n";
> >
> > #include <petsc/private/dmpleximpl.h>
> >
> > int main (int argc, char * argv[]) {
> >
> > DM dm, odm;
> > MPI_Comm comm;
> > PetscErrorCode ierr;
> >
> > ierr = PetscInitialize(&argc, &argv, NULL, help);if (ierr) return ierr;
> > comm = PETSC_COMM_WORLD;
> >
> > ierr = DMPlexCreateBoxMesh(comm, 2, PETSC_TRUE, NULL, NULL, NULL,
> > NULL, PETSC_TRUE, &dm);CHKERRQ(ierr);
> > ierr = DMSetFromOptions(dm);CHKERRQ(ierr);
> > odm = dm;
> >
> >
> > This is just setting pointers, so no copy.
> >
> > ierr = DMPlexDistributeOverlap(odm, 0, NULL, &dm);CHKERRQ(ierr);
> >
> >
> > Here you have overwritten the DM. Don't do this.
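> >
> > For example, something like this (just a sketch) keeps the input DM
> > intact and replaces it only after the call:
> >
> > ierr = DMPlexDistributeOverlap(dm, 0, NULL, &odm);CHKERRQ(ierr);
> > if (odm) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = odm;} /* guard: odm can be NULL */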
>
>
> I just copied what's in plex ex19!
>
> But OK, if I change it to:
>
> static char help[] = "Tests plex distribution and overlaps.\n";
>
> #include <petsc/private/dmpleximpl.h>
>
> int main (int argc, char * argv[]) {
>
> DM dm, odm;
> MPI_Comm comm;
> PetscErrorCode ierr;
>
> ierr = PetscInitialize(&argc, &argv, NULL, help);if (ierr) return ierr;
> comm = PETSC_COMM_WORLD;
>
> ierr = DMPlexCreateBoxMesh(comm, 2, PETSC_TRUE, NULL, NULL, NULL,
> NULL, PETSC_TRUE, &dm);CHKERRQ(ierr);
> ierr = DMSetFromOptions(dm);CHKERRQ(ierr);
> ierr = DMPlexDistributeOverlap(dm, 0, NULL, &odm);CHKERRQ(ierr);
> ierr = PetscObjectSetName((PetscObject) odm, "Initial DM");CHKERRQ(ierr);
> ierr = DMViewFromOptions(odm, NULL, "-initial_dm_view");CHKERRQ(ierr);
>
>
> ierr = DMDestroy(&dm);CHKERRQ(ierr);
> ierr = DMDestroy(&odm);CHKERRQ(ierr);
> ierr = PetscFinalize();
> return ierr;
> }
>
> I still get:
>
> DM Object: Initial DM 2 MPI processes
> type: plex
> Initial DM in 2 dimensions:
> 0-cells: 29 29
> 1-cells: 65 65
> 2-cells: 37 37
> Labels:
> depth: 3 strata with value/size (0 (29), 1 (65), 2 (37))
> celltype: 3 strata with value/size (0 (29), 1 (65), 3 (37))
> marker: 1 strata with value/size (1 (27))
> Face Sets: 1 strata with value/size (1 (13))
>
> which is the mesh with a 1-overlap. So what's wrong now?
>
Oh, you mean why did passing "0" to DMPlexDistributeOverlap not reduce your
overlap to 0? That function takes an existing mesh and gives you "k" more
levels of overlap. It does not set your overlap to level k. Maybe that is
not clear from the docs.
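For example, if you need both meshes, a sketch (with placeholder names
dmNoOverlap/dmOverlap) is to keep the non-overlapping DM around rather
than trying to shrink the overlap afterwards:

  ierr = DMPlexDistribute(dm, 0, NULL, &dmNoOverlap);CHKERRQ(ierr);               /* parallel mesh, overlap 0 */
  ierr = DMPlexDistributeOverlap(dmNoOverlap, 1, NULL, &dmOverlap);CHKERRQ(ierr); /* adds one level */
  /* use dmOverlap where you need the halo, dmNoOverlap elsewhere, and destroy both */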
Thanks,
Matt
> Thanks,
>
> --
> Nicolas
>
> >
> > Thanks,
> >
> > Matt
> >
> > if (!dm) {printf("Big problem\n"); dm = odm;}
> > else {DMDestroy(&odm);}
> > ierr = PetscObjectSetName((PetscObject) dm, "Initial DM");CHKERRQ(ierr);
> > ierr = DMViewFromOptions(dm, NULL, "-initial_dm_view");CHKERRQ(ierr);
> >
> >
> > ierr = DMDestroy(&dm);CHKERRQ(ierr);
> > ierr = PetscFinalize();
> > return ierr;
> > }
> >
> > called with mpiexec -n 2 ./test_overlapV3 -initial_dm_view
> > -dm_plex_box_faces 5,5 -dm_distribute
> >
> > gives:
> > DM Object: Initial DM 2 MPI processes
> > type: plex
> > Initial DM in 2 dimensions:
> > 0-cells: 29 29
> > 1-cells: 65 65
> > 2-cells: 37 37
> > Labels:
> > depth: 3 strata with value/size (0 (29), 1 (65), 2 (37))
> > celltype: 3 strata with value/size (0 (29), 1 (65), 3 (37))
> > marker: 1 strata with value/size (1 (27))
> > Face Sets: 1 strata with value/size (1 (13))
> >
> > which is not what I expect?
> >
> > Thanks,
> >
> > --
> > Nicolas
> >
> > On 31/03/2021 19:02, Matthew Knepley wrote:
> > > Alright, I think the problem had to do with keeping track of what DM
> > > you were looking at. This code increases the overlap of an initial DM:
> > >
> > > static char help[] = "Tests plex distribution and overlaps.\n";
> > >
> > > #include <petsc/private/dmpleximpl.h>
> > >
> > > int main (int argc, char * argv[]) {
> > >
> > > DM dm, dm2;
> > > PetscInt overlap;
> > > MPI_Comm comm;
> > > PetscErrorCode ierr;
> > >
> > > ierr = PetscInitialize(&argc, &argv, NULL, help);if (ierr) return ierr;
> > > comm = PETSC_COMM_WORLD;
> > >
> > > ierr = DMPlexCreateBoxMesh(comm, 2, PETSC_TRUE, NULL, NULL, NULL,
> > > NULL, PETSC_TRUE, &dm);CHKERRQ(ierr);
> > > ierr = DMSetFromOptions(dm);CHKERRQ(ierr);
> > > ierr = PetscObjectSetName((PetscObject) dm, "Initial DM");CHKERRQ(ierr);
> > > ierr = DMViewFromOptions(dm, NULL, "-initial_dm_view");CHKERRQ(ierr);
> > >
> > > ierr = DMPlexGetOverlap(dm, &overlap);CHKERRQ(ierr);
> > > ierr = DMPlexDistributeOverlap(dm, overlap+1, NULL, &dm2);CHKERRQ(ierr);
> > > ierr = PetscObjectSetName((PetscObject) dm2, "More Overlap DM");CHKERRQ(ierr);
> > > ierr = DMViewFromOptions(dm2, NULL, "-over_dm_view");CHKERRQ(ierr);
> > >
> > > ierr = DMDestroy(&dm2);CHKERRQ(ierr);
> > > ierr = DMDestroy(&dm);CHKERRQ(ierr);
> > > ierr = PetscFinalize();
> > > return ierr;
> > > }
> > >
> > > and when we run it we get the expected result
> > >
> > > master *:~/Downloads/tmp/Nicolas$ /PETSc3/petsc/apple/bin/mpiexec -n 2
> > > ./test_overlap -initial_dm_view -dm_plex_box_faces 5,5 -dm_distribute
> > > -dm_distribute_overlap 1 -over_dm_view
> > > DM Object: Initial DM 2 MPI processes
> > > type: plex
> > > Initial DM in 2 dimensions:
> > > 0-cells: 29 29
> > > 1-cells: 65 65
> > > 2-cells: 37 37
> > > Labels:
> > > depth: 3 strata with value/size (0 (29), 1 (65), 2 (37))
> > > celltype: 3 strata with value/size (0 (29), 1 (65), 3 (37))
> > > marker: 1 strata with value/size (1 (27))
> > > Face Sets: 1 strata with value/size (1 (13))
> > > DM Object: More Overlap DM 2 MPI processes
> > > type: plex
> > > More Overlap DM in 2 dimensions:
> > > 0-cells: 36 36
> > > 1-cells: 85 85
> > > 2-cells: 50 50
> > > Labels:
> > > depth: 3 strata with value/size (0 (36), 1 (85), 2 (50))
> > > celltype: 3 strata with value/size (0 (36), 1 (85), 3 (50))
> > > marker: 1 strata with value/size (1 (40))
> > > Face Sets: 1 strata with value/size (1 (20))
> > >
> > > Thanks,
> > >
> > > Matt
> > >
> > On Wed, Mar 31, 2021 at 12:57 PM Matthew Knepley
> > <knepley at gmail.com> wrote:
> > >
> > > Okay, let me show a really simple example that gives the expected
> > > result before I figure out what is going wrong for you. This code
> > >
> > > static char help[] = "Tests plex distribution and overlaps.\n";
> > >
> > > #include <petsc/private/dmpleximpl.h>
> > >
> > > int main (int argc, char * argv[]) {
> > > DM dm;
> > > MPI_Comm comm;
> > > PetscErrorCode ierr;
> > >
> > > ierr = PetscInitialize(&argc, &argv, NULL, help);if (ierr) return ierr;
> > > comm = PETSC_COMM_WORLD;
> > >
> > > ierr = DMPlexCreateBoxMesh(comm, 2, PETSC_TRUE, NULL, NULL, NULL,
> > > NULL, PETSC_TRUE, &dm);CHKERRQ(ierr);
> > > ierr = DMSetFromOptions(dm);CHKERRQ(ierr);
> > > ierr = PetscObjectSetName((PetscObject) dm, "Initial DM");CHKERRQ(ierr);
> > > ierr = DMViewFromOptions(dm, NULL, "-initial_dm_view");CHKERRQ(ierr);
> > > ierr = DMDestroy(&dm);CHKERRQ(ierr);
> > > ierr = PetscFinalize();
> > > return ierr;
> > > }
> > >
> > > can do all the overlap tests. For example, you can run it naively
> > > and get a serial mesh
> > >
> > > master *:~/Downloads/tmp/Nicolas$ /PETSc3/petsc/apple/bin/mpiexec -n 2
> > > ./test_overlap -initial_dm_view -dm_plex_box_faces 5,5
> > > DM Object: Initial DM 2 MPI processes
> > > type: plex
> > > Initial DM in 2 dimensions:
> > > 0-cells: 36 0
> > > 1-cells: 85 0
> > > 2-cells: 50 0
> > > Labels:
> > > celltype: 3 strata with value/size (0 (36), 3 (50), 1 (85))
> > > depth: 3 strata with value/size (0 (36), 1 (85), 2 (50))
> > > marker: 1 strata with value/size (1 (40))
> > > Face Sets: 1 strata with value/size (1 (20))
> > >
> > > Then run it telling Plex to distribute after creating the mesh
> > >
> > > master *:~/Downloads/tmp/Nicolas$ /PETSc3/petsc/apple/bin/mpiexec -n 2
> > > ./test_overlap -initial_dm_view -dm_plex_box_faces 5,5 -dm_distribute
> > > DM Object: Initial DM 2 MPI processes
> > > type: plex
> > > Initial DM in 2 dimensions:
> > > 0-cells: 21 21
> > > 1-cells: 45 45
> > > 2-cells: 25 25
> > > Labels:
> > > depth: 3 strata with value/size (0 (21), 1 (45), 2 (25))
> > > celltype: 3 strata with value/size (0 (21), 1 (45), 3 (25))
> > > marker: 1 strata with value/size (1 (21))
> > > Face Sets: 1 strata with value/size (1 (10))
> > >
> > > Then we get the same thing back with overlap = 0
> > >
> > > master *:~/Downloads/tmp/Nicolas$ /PETSc3/petsc/apple/bin/mpiexec -n 2
> > > ./test_overlap -initial_dm_view -dm_plex_box_faces 5,5
> > > -dm_distribute -dm_distribute_overlap 0
> > > DM Object: Initial DM 2 MPI processes
> > > type: plex
> > > Initial DM in 2 dimensions:
> > > 0-cells: 21 21
> > > 1-cells: 45 45
> > > 2-cells: 25 25
> > > Labels:
> > > depth: 3 strata with value/size (0 (21), 1 (45), 2 (25))
> > > celltype: 3 strata with value/size (0 (21), 1 (45), 3 (25))
> > > marker: 1 strata with value/size (1 (21))
> > > Face Sets: 1 strata with value/size (1 (10))
> > >
> > > and get larger local meshes with overlap = 1
> > >
> > > master *:~/Downloads/tmp/Nicolas$ /PETSc3/petsc/apple/bin/mpiexec -n 2
> > > ./test_overlap -initial_dm_view -dm_plex_box_faces 5,5
> > > -dm_distribute -dm_distribute_overlap 1
> > > DM Object: Initial DM 2 MPI processes
> > > type: plex
> > > Initial DM in 2 dimensions:
> > > 0-cells: 29 29
> > > 1-cells: 65 65
> > > 2-cells: 37 37
> > > Labels:
> > > depth: 3 strata with value/size (0 (29), 1 (65), 2 (37))
> > > celltype: 3 strata with value/size (0 (29), 1 (65), 3 (37))
> > > marker: 1 strata with value/size (1 (27))
> > > Face Sets: 1 strata with value/size (1 (13))
> > >
> > > Thanks,
> > >
> > > Matt
> > >
> > > On Wed, Mar 31, 2021 at 12:22 PM Nicolas Barral
> > > <nicolas.barral at math.u-bordeaux.fr> wrote:
> > >
> > >
> > >
> > > @+
> > >
> > > --
> > > Nicolas
> > >
> > > On 31/03/2021 17:51, Matthew Knepley wrote:
> > > > On Sat, Mar 27, 2021 at 9:27 AM Nicolas Barral
> > > > <nicolas.barral at math.u-bordeaux.fr> wrote:
> > > >
> > > > Hi all,
> > > >
> > > > First, I'm not sure I understand what the overlap parameter in
> > > > DMPlexDistributeOverlap does. I tried the following: generate a small
> > > > mesh on 1 rank with DMPlexCreateBoxMesh, then distribute it with
> > > > DMPlexDistribute. At this point I have two nice partitions, with
> > > > shared vertices and no overlapping cells. Then I call
> > > > DMPlexDistributeOverlap with the overlap parameter set to 0 or 1, and
> > > > get the same resulting plex in both cases. Why is that?
> > > >
> > > >
> > > > The overlap parameter says how many cell adjacencies to go out. You
> > > > should not get the same mesh out. We have lots of examples that use
> > > > this. If you send your small example, I can probably tell you what is
> > > > happening.
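> > > >
> > > > For example (a sketch), each extra level pulls in one more ring of
> > > > adjacent cells around the local partition:
> > > >
> > > > ierr = DMPlexDistributeOverlap(dm, k, NULL, &odm);CHKERRQ(ierr); /* odm has k more rings than dm */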
> > > >
> > >
> > > Ok so I do have a small example on that and the DMClone thing I set up
> > > to understand! I attach it to the email.
> > >
> > > For the overlap, you can change the overlap constant at the top of the
> > > file. With OVERLAP=0 or 1, the distributed overlapping meshes (shown
> > > using -over_dm_view, it's DMover) are the same, and different from the
> > > mesh before distributing the overlap (shown using -distrib_dm_view).
> > > For larger overlap values they're different.
> > >
> > > The process is (see the sketch below):
> > > 1/ create a DM dm on 1 rank
> > > 2/ clone dm into dm2
> > > 3/ distribute dm
> > > 4/ clone dm into dm3
> > > 5/ distribute dm overlap
> > >
> > > I print all the DMs after each step. dm has a distributed overlap, dm2
> > > is not distributed, dm3 is distributed but without overlap. Since
> > > distribute and distributeOverlap create new DMs, I don't seem to have a
> > > problem with the shallow copies.
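> > >
> > > In code the steps look roughly like this (a sketch; dmDist/dmOver are
> > > placeholder names, OVERLAP is the constant mentioned above):
> > >
> > > ierr = DMPlexCreateBoxMesh(comm, 2, PETSC_TRUE, NULL, NULL, NULL,
> > > NULL, PETSC_TRUE, &dm);CHKERRQ(ierr);                  /* 1/ dm on 1 rank */
> > > ierr = DMClone(dm, &dm2);CHKERRQ(ierr);                /* 2/ clone dm into dm2 */
> > > ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);  /* 3/ distribute dm */
> > > if (dmDist) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmDist;}
> > > ierr = DMClone(dm, &dm3);CHKERRQ(ierr);                /* 4/ clone dm into dm3 */
> > > ierr = DMPlexDistributeOverlap(dm, OVERLAP, NULL, &dmOver);CHKERRQ(ierr); /* 5/ */
> > > if (dmOver) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmOver;}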
> > >
> > >
> > > > Second, I'm wondering what would be a good way to handle two overlaps
> > > > and associated local vectors. In my adaptation code, the remeshing
> > > > library requires a non-overlapping mesh, while the refinement
> > > > criterion computation is based on hessian computations, which require
> > > > a layer of overlap. What I can do is clone the dm before distributing
> > > > the overlap, then manage two independent plex objects with their own
> > > > local sections etc. and copy/trim local vectors manually. Is there a
> > > > more automatic way to do this?
> > > >
> > > >
> > > > DMClone() is a shallow copy, so that will not work. You would
> > > > maintain two different Plexes, overlapping and non-overlapping, with
> > > > their own sections and vecs. Are you sure you need to keep around the
> > > > non-overlapping one? Maybe if I understood what operations you want
> > > > to perform, I could say something more definitive.
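> > > >
> > > > A sketch of maintaining both (placeholder names; each DM would get
> > > > its own section set on it before creating vectors):
> > > >
> > > > ierr = DMPlexDistribute(dmSerial, 0, NULL, &dmNoOv);CHKERRQ(ierr);    /* for the remesher */
> > > > ierr = DMPlexDistributeOverlap(dmNoOv, 1, NULL, &dmOv);CHKERRQ(ierr); /* for the hessians */
> > > > ierr = DMCreateLocalVector(dmNoOv, &lNoOv);CHKERRQ(ierr); /* sized by dmNoOv's section */
> > > > ierr = DMCreateLocalVector(dmOv, &lOv);CHKERRQ(ierr);     /* sized by dmOv's section */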
> > > >
> > > I need to be able to pass the non-overlapping mesh to the remesher. I
> > > can either maintain 2 plexes, or trim the overlapping plex when I
> > > create the arrays I give to the remesher. I'm not sure which is the
> > > best/worst?
> > >
> > > Thanks
> > >
> > > --
> > > Nicolas
> > >
> > >
> > > > Thanks,
> > > >
> > > > Matt
> > > >
> > > > Thanks
> > > >
> > > > --
> > > > Nicolas
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/