[petsc-users] DMCloning from a DMPlex has changed in Petsc-3.17.0?

Berend van Wachem berend.vanwachem at ovgu.de
Tue Apr 12 06:02:17 CDT 2022


Dear Matt,

In our code, the size of the overlap is determined at runtime, based on 
some calculations. Therefore, we cannot specify it with the 
-dm_distribute_overlap command-line option.

Is there a way to set the overlap from within the code before 
calling DMPlexDistribute()? Or is the best approach to call 
DMPlexDistributeSetDefault(*NewDM, PETSC_FALSE) and then call 
DMPlexDistribute() with the required overlap?
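
Something along these lines is what we have in mind (just a sketch with
placeholder names; 'overlap' is the value our code computes at runtime):

   DM             dm, dmDist = NULL;
   PetscInt       overlap;
   PetscErrorCode ierr;

   /* ... create dm (DMPlexCreateFromFile / DMPlexCreateBoxMesh) and
      compute the required overlap ... */

   /* stop DMSetFromOptions() from distributing the mesh automatically */
   ierr = DMPlexDistributeSetDefault(dm, PETSC_FALSE);CHKERRQ(ierr);
   ierr = DMSetFromOptions(dm);CHKERRQ(ierr);

   /* distribute explicitly with the overlap determined at runtime */
   ierr = DMPlexDistribute(dm, overlap, NULL, &dmDist);CHKERRQ(ierr);
   if (dmDist) {
     ierr = DMDestroy(&dm);CHKERRQ(ierr);
     dm   = dmDist;
   }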

Thanks, best, Berend.



On 4/12/22 12:49, Matthew Knepley wrote:
> On Tue, Apr 12, 2022 at 2:50 AM Berend van Wachem 
> <berend.vanwachem at ovgu.de> wrote:
> 
>     Dear Matt,
> 
>     Thank you very much - I can confirm that that works.
> 
>     I have one question about your remark:
> 
>       > Also, the call to DMPlexDistribute() here (and the Partitioner
>     calls)
>       > are now superfluous.
> 
>     If I shouldn't call DMPlexDistribute(),
> 
> 
> I just meant that DMPlexDistribute() was being called automatically from 
> within DMSetFromOptions().
> 
>     how should I set the overlap of
>     the DM from within the code (our code determines the overlap required)?
> 
> 
> -dm_distribute_overlap <n> - The size of the overlap halo
> 
> from https://petsc.org/main/docs/manualpages/DM/DMSetFromOptions.html
> 
>    Thanks,
> 
>       Matt
> 
>     Many thanks, best regards,
> 
>     Berend.
> 
> 
> 
> 
>     On 4/11/22 16:23, Matthew Knepley wrote:
>      > On Wed, Apr 6, 2022 at 9:41 AM Berend van Wachem
>      > <berend.vanwachem at ovgu.de> wrote:
>      >
>      >     Dear Matt,
>      >
>      >     I have made a small working example of cloning a DM, illustrating
>      >     the problem we have. In the attached code, I wrote a function
>      >     'CloneDMWithNewSection', which clones a section and puts a different
>      >     number of fields on it.
>      >
>      >     The code itself prints the number of local cells of the DM, which
>      >     changes as the DM is cloned.
>      >     In our code, we assume that the cloned DM should have exactly the
>      >     same partitioning - this was the behaviour in PETSc versions prior
>      >     to 3.17.
>      >
>      >     If I run the attached code on 2 processors, I get:
>      >
>      >     First DM: Processor 1 reports Start: 0, End 4000 giving number of local cells: 4000
>      >     First DM: Processor 0 reports Start: 0, End 4000 giving number of local cells: 4000
>      >
>      >     Cloned DM: Processor 1 reports Start: 0, End 3984 giving number of local cells: 3984
>      >     Cloned DM: Processor 0 reports Start: 0, End 4016 giving number of local cells: 4016
>      >
>      >     Maybe we are doing something wrong in the function
>      >     CloneDMWithNewSection?
>      >
>      >
>      > I apologize for taking so long on this. Jed persuaded me to change the
>      > default. Now, when you call DMSetFromOptions() it distributes by default,
>      > rather than requiring you to explicitly call it.
>      > You can shut this behavior off, so that if you add
>      >
>      >    ierr = DMPlexDistributeSetDefault(*NewDM, PETSC_FALSE);CHKERRQ(ierr);
>      >
>      > right after DMClone(), you will preserve the layout you have.
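>      >
>      > So, at the top of your CloneDMWithNewSection() the sequence would look
>      > roughly like this (a sketch, using the names from your example):
>      >
>      >    ierr = DMClone(OriginalDM, NewDM);CHKERRQ(ierr);
>      >    ierr = DMPlexDistributeSetDefault(*NewDM, PETSC_FALSE);CHKERRQ(ierr);
>      >    /* DMSetFromOptions() will now leave the distribution of *NewDM alone */
>      >    ierr = DMSetFromOptions(*NewDM);CHKERRQ(ierr);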
>      >
>      > Also, the call to DMPlexDistribute() here (and the Partitioner calls)
>      > are now superfluous.
>      >
>      >    Thanks,
>      >
>      >       Matt
>      >
>      >     Many thanks for looking into this, best regards,
>      >     Berend.
>      >
>      >
>      >
>      >     On 4/4/22 23:05, Matthew Knepley wrote:
>      >      > On Mon, Apr 4, 2022 at 3:36 PM Berend van Wachem
>      >      > <berend.vanwachem at ovgu.de> wrote:
>      >      >
>      >      >     Dear Petsc team,
>      >      >
>      >      >     For about two years we have been using Petsc with DMPlex,
>      >      >     but since upgrading our code to Petsc-3.17.0 something has
>      >      >     broken.
>      >      >
>      >      >     First we generate a DM from a DMPlex with DMPlexCreateFromFile,
>      >      >     or create one with DMPlexCreateBoxMesh. Then the DM is distributed
>      >      >     with DMPlexDistribute. This DM works fine; we set a number of
>      >      >     fields on it and attach a section to it.
>      >      >     However, on the same mesh we also want to solve a problem with
>      >      >     a different number of fields, and therefore we create a clone of
>      >      >     this original DM, using the code:
>      >      >
>      >      >     DMClone(OriginalDM, NewDM);
>      >      >     DMClearDS(*NewDM);
>      >      >     PetscCalloc2(1, &NumComp, 4, &NumDof);
>      >      >     NumComp[0] = 1;
>      >      >     NumDof[3] = NFields;
>      >      >     DMSetNumFields(*NewDM, 1);
>      >      >     DMSetFromOptions(*NewDM);
>      >      >     DMPlexCreateSection(*NewDM, NULL, NumComp, NumDof, 0, NULL,
>      >      >                         NULL, NULL, NULL, &section);
>      >      >     DMSetLocalSection(*NewDM, section);
>      >      >     PetscFree2(NumComp, NumDof);
>      >      >     PetscSectionDestroy(&section);
>      >      >
>      >      >     However, with Petsc-3.17.0, the *NewDM is corrupt: when I call
>      >      >     DMGlobalToLocalBegin with a global and a local vector created
>      >      >     with this NewDM, the code crashes. Indeed, the cloned DM seems
>      >      >     to be partitioned differently from the original DM, as these two
>      >      >     DMs have a different number of local cells.
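>      >      >
>      >      >     The crashing sequence is essentially the following (a sketch;
>      >      >     the actual code checks all return values):
>      >      >
>      >      >     Vec GlobalVec, LocalVec;
>      >      >     DMCreateGlobalVector(*NewDM, &GlobalVec);
>      >      >     DMCreateLocalVector(*NewDM, &LocalVec);
>      >      >     DMGlobalToLocalBegin(*NewDM, GlobalVec, INSERT_VALUES, LocalVec);
>      >      >     DMGlobalToLocalEnd(*NewDM, GlobalVec, INSERT_VALUES, LocalVec);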
>      >      >
>      >      >
>      >      > The cloned DM will have exactly the same topology and
>      >      > distribution. This must be a misinterpretation of what is happening.
>      >      > We can do a few things:
>      >      >
>      >      > 1) Make a small example to show what you are talking about
>      >      >
>      >      > 2) Look at a PETSc example that does something similar
>      >      >
>      >      > 3) Look directly at your code if I can somehow run it here
>      >      >
>      >      > 4) Start doing diagnostics on your code to see what is going on
>      >      >
>      >      > Which one do you prefer?
>      >      >
>      >      >     This worked fine in Petsc releases before 3.17 (e.g. 3.16.5).
>      >      >     So my question is: what has changed? Am I doing something wrong
>      >      >     that should be changed for use with Petsc-3.17?
>      >      >
>      >      >
>      >      > I don't think any of this should have changed, so this should be
>      >      > something simple.
>      >      >
>      >      >    Thanks,
>      >      >
>      >      >       Matt
>      >      >
>      >      >     Thanks, best regards,
>      >      >
>      >      >     Berend.
>      >      >
>      >      >
>      >      >
>      >
>      >
>      >
> 
> 
> 
> -- 
> What most experimenters take for granted before they begin their 
> experiments is infinitely more interesting than any results to which 
> their experiments lead.
> -- Norbert Wiener
> 
> https://www.cse.buffalo.edu/~knepley/

