[petsc-users] reading and writing periodic DMPlex to file

Matthew Knepley knepley at gmail.com
Wed May 24 16:38:11 CDT 2023


Checking back. What does not work?

  Thanks,

    Matt

On Tue, Jan 24, 2023 at 11:26 AM Matthew Knepley <knepley at gmail.com> wrote:

> On Tue, Jan 24, 2023 at 10:39 AM Berend van Wachem <berend.vanwachem at ovgu.de> wrote:
>
>> Dear Matt,
>>
>> I have been working on this now with PETSc 3.18.3.
>>
>> 1) I can confirm that enforcing periodicity works for a single core
>> simulation.
>>
>> 2) However, when using multiple cores, the code still hangs. Is there
>> something I should do to fix this? Or should this be fixed in the next
>> PETSc version?
>>
>
> Dang dang dang. I forgot to merge this fix. Thanks for reminding me. It is
> now here:
>
>   https://gitlab.com/petsc/petsc/-/merge_requests/6001
>
>
>> 3) This is strange, as it works fine for me.
>>
>
> Will try again with current main.
>
>   Thanks
>
>      Matt
>
>
>> Thanks, best, Berend.
>>
>>
>> On 12/15/22 18:56, Matthew Knepley wrote:
>> > On Wed, Dec 14, 2022 at 3:58 AM Berend van Wachem <berend.vanwachem at ovgu.de> wrote:
>> >
>> >
>> >     Dear PETSc team and users,
>> >
>> >     I have asked a few times about this before, but we haven't really
>> >     gotten this to work yet.
>> >
>> >     In our code, we use the DMPlex framework and are also interested in
>> >     periodic geometries.
>> >
>> >     As our simulations typically require many time-steps, we would like
>> >     to be able to save the DM to file and to read it again to resume the
>> >     simulation (a restart).
>> >
>> >     Although this works for a non-periodic DM, we haven't been able to
>> >     get this to work for a periodic one. To illustrate this, I have made
>> >     a working example, consisting of 2 files, createandwrite.c and
>> >     readandcreate.c. I have attached these 2 working examples. We are
>> >     using PETSc 3.18.2.
>> >
>> >     In the first file (createandwrite.c) a DMPlex is created and written
>> >     to a file. Periodicity is activated on lines 52-55 of the code.
>> >
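>> >     In outline, the write side amounts to something like the following
>> >     (a minimal sketch, not the attached file itself; the mesh size, the
>> >     file name "periodic.h5", and the PETSc 3.18 call signatures are
>> >     assumptions):
>> >
>> >       DM             dm;
>> >       PetscViewer    viewer;
>> >       PetscInt       faces[3] = {8, 8, 8};
>> >       DMBoundaryType per[3]   = {DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC};
>> >
>> >       /* Periodic unit-box hex mesh, periodic in all directions */
>> >       PetscCall(DMPlexCreateBoxMesh(PETSC_COMM_WORLD, 3, PETSC_FALSE, faces, NULL, NULL, per, PETSC_TRUE, &dm));
>> >       PetscCall(DMLocalizeCoordinates(dm));
>> >       /* Write the DM in PETSc's native HDF5 format */
>> >       PetscCall(PetscViewerHDF5Open(PETSC_COMM_WORLD, "periodic.h5", FILE_MODE_WRITE, &viewer));
>> >       PetscCall(PetscViewerPushFormat(viewer, PETSC_VIEWER_HDF5_PETSC));
>> >       PetscCall(DMView(dm, viewer));
>> >       PetscCall(PetscViewerPopFormat(viewer));
>> >       PetscCall(PetscViewerDestroy(&viewer));
>> >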
>> >     In the second file (readandcreate.c) a DMPlex is read from the file.
>> >     When a periodic DM is read, this does not work. Also, trying to
>> >     'enforce' periodicity, lines 55 - 66, does not work if the number of
>> >     processes is larger than 1 - the code "hangs" without producing an
>> >     error.
>> >
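>> >     The read side is essentially the reverse (again only a sketch under
>> >     the same assumptions):
>> >
>> >       DM          dm;
>> >       PetscViewer viewer;
>> >
>> >       PetscCall(PetscViewerHDF5Open(PETSC_COMM_WORLD, "periodic.h5", FILE_MODE_READ, &viewer));
>> >       PetscCall(PetscViewerPushFormat(viewer, PETSC_VIEWER_HDF5_PETSC));
>> >       PetscCall(DMCreate(PETSC_COMM_WORLD, &dm));
>> >       PetscCall(DMSetType(dm, DMPLEX));
>> >       PetscCall(DMLoad(dm, viewer));   /* topology and coordinates come back, periodicity does not */
>> >       PetscCall(PetscViewerPopFormat(viewer));
>> >       PetscCall(PetscViewerDestroy(&viewer));
>> >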
>> >     Could you indicate what I am missing? I have really tried many
>> >     different
>> >     options, without finding a solution.
>> >
>> >
>> > Hi Berend,
>> >
>> > There are several problems. I will eventually fix all of them, but I
>> > think we can get this working quickly.
>> >
>> > 1) Periodicity information is not saved. I will fix this, but forcing it
>> > should work.
>> >
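>> > For completeness, "forcing" the periodicity after the load would look
>> > roughly like this (a sketch only; the domain extents L and the maxCell
>> > values are placeholders, and the call uses the 3.18 signature of
>> > DMSetPeriodicity):
>> >
>> >   PetscReal L[3]       = {1.0, 1.0, 1.0};  /* placeholder domain extents */
>> >   PetscReal maxCell[3] = {0.1, 0.1, 0.1};  /* roughly the largest cell size per direction */
>> >
>> >   PetscCall(DMSetPeriodicity(dm, maxCell, NULL, L));
>> >   PetscCall(DMLocalizeCoordinates(dm));
>> >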
>> > 2) You were getting a hang because the blocksize on the local
>> > coordinates was not set correctly after loading, since the vector had
>> > zero length. This does not happen in any test because HDF5 loads a
>> > global vector, but most other things create local coordinates. I have a
>> > fix for this, which I will get in an MR. Also, I moved
>> > DMLocalizeCoordinates() after distribution, since this is where it
>> > belongs.
>> >
>> > knepley/fix-plex-periodic-faces *$:/PETSc3/petsc/petsc-dev$ git diff
>> > diff --git a/src/dm/interface/dmcoordinates.c b/src/dm/interface/dmcoordinates.c
>> > index a922348f95b..6437e9f7259 100644
>> > --- a/src/dm/interface/dmcoordinates.c
>> > +++ b/src/dm/interface/dmcoordinates.c
>> > @@ -551,10 +551,14 @@ PetscErrorCode DMGetCoordinatesLocalSetUp(DM dm)
>> >    PetscFunctionBegin;
>> >    PetscValidHeaderSpecific(dm, DM_CLASSID, 1);
>> >    if (!dm->coordinates[0].xl && dm->coordinates[0].x) {
>> > -    DM cdm = NULL;
>> > +    DM       cdm = NULL;
>> > +    PetscInt bs;
>> >
>> >      PetscCall(DMGetCoordinateDM(dm, &cdm));
>> >      PetscCall(DMCreateLocalVector(cdm, &dm->coordinates[0].xl));
>> > +    // If the size of the vector is 0, it will not get the right block size
>> > +    PetscCall(VecGetBlockSize(dm->coordinates[0].x, &bs));
>> > +    PetscCall(VecSetBlockSize(dm->coordinates[0].xl, bs));
>> >      PetscCall(PetscObjectSetName((PetscObject)dm->coordinates[0].xl, "coordinates"));
>> >      PetscCall(DMGlobalToLocalBegin(cdm, dm->coordinates[0].x, INSERT_VALUES, dm->coordinates[0].xl));
>> >      PetscCall(DMGlobalToLocalEnd(cdm, dm->coordinates[0].x, INSERT_VALUES, dm->coordinates[0].xl));
>> >
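>> > At the user level, the analogous ordering after this change would be
>> > (sketch; overlap 0, and maxCell/L are the same placeholders as in the
>> > snippet above):
>> >
>> >   DM dmDist = NULL;
>> >
>> >   PetscCall(DMPlexDistribute(dm, 0, NULL, &dmDist));
>> >   if (dmDist) {
>> >     PetscCall(DMDestroy(&dm));
>> >     dm = dmDist;
>> >   }
>> >   /* Set periodicity and localize coordinates only after distribution */
>> >   PetscCall(DMSetPeriodicity(dm, maxCell, NULL, L));
>> >   PetscCall(DMLocalizeCoordinates(dm));
>> >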
>> >   3) If I comment out forcing the periodicity, your example does not run
>> > for me. I will try to figure it out.
>> >
>> > [0]PETSC ERROR: --------------------- Error Message
>> > --------------------------------------------------------------
>> > [0]PETSC ERROR: Nonconforming object sizes
>> > [0]PETSC ERROR: SF roots 4400 < pEnd 6000
>> > [1]PETSC ERROR: --------------------- Error Message
>> > --------------------------------------------------------------
>> > [0]PETSC ERROR: WARNING! There are option(s) set that were not used!
>> > Could be the program crashed before they were used or a spelling
>> > mistake, etc!
>> > [1]PETSC ERROR: Nonconforming object sizes
>> > [0]PETSC ERROR: Option left: name:-start_in_debugger_no (no value)
>> > source: command line
>> > [1]PETSC ERROR: SF roots 4400 < pEnd 6000
>> > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>> > [0]PETSC ERROR: Petsc Development GIT revision: v3.18.1-494-g16200351da0  GIT Date: 2022-12-12 23:42:20 +0000
>> > [1]PETSC ERROR: WARNING! There are option(s) set that were not used!
>> > Could be the program crashed before they were used or a spelling
>> > mistake, etc!
>> > [1]PETSC ERROR: Option left: name:-start_in_debugger_no (no value)
>> > source: command line
>> > [0]PETSC ERROR: ./readandcreate on a arch-master-debug named MacBook-Pro.cable.rcn.com by knepley Thu Dec 15 12:50:26 2022
>> > [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>> > [0]PETSC ERROR: Configure options --PETSC_ARCH=arch-master-debug
>> > --download-bamg --download-bison --download-chaco --download-ctetgen
>> > --download-egads --download-eigen --download-exodusii --download-fftw
>> > --download-hpddm --download-ks --download-libceed --download-libpng
>> > --download-metis --download-ml --download-mumps --download-muparser
>> > --download-netcdf --download-opencascade --download-p4est
>> > --download-parmetis --download-pnetcdf --download-pragmatic
>> > --download-ptscotch --download-scalapack --download-slepc
>> > --download-suitesparse --download-superlu_dist --download-tetgen
>> > --download-triangle --with-cmake-exec=/PETSc3/petsc/apple/bin/cmake
>> > --with-ctest-exec=/PETSc3/petsc/apple/bin/ctest
>> > --with-hdf5-dir=/PETSc3/petsc/apple --with-mpi-dir=/PETSc3/petsc/apple
>> > --with-petsc4py=1 --with-shared-libraries --with-slepc --with-zlib
>> > [1]PETSC ERROR: Petsc Development GIT revision: v3.18.1-494-g16200351da0  GIT Date: 2022-12-12 23:42:20 +0000
>> > [0]PETSC ERROR: #1 PetscSectionCreateGlobalSection() at
>> > /PETSc3/petsc/petsc-dev/src/vec/is/section/interface/section.c:1322
>> > [1]PETSC ERROR: ./readandcreate on a arch-master-debug named MacBook-Pro.cable.rcn.com by knepley Thu Dec 15 12:50:26 2022
>> > [0]PETSC ERROR: #2 DMGetGlobalSection() at
>> > /PETSc3/petsc/petsc-dev/src/dm/interface/dm.c:4527
>> > [1]PETSC ERROR: Configure options --PETSC_ARCH=arch-master-debug
>> > --download-bamg --download-bison --download-chaco --download-ctetgen
>> > --download-egads --download-eigen --download-exodusii --download-fftw
>> > --download-hpddm --download-ks --download-libceed --download-libpng
>> > --download-metis --download-ml --download-mumps --download-muparser
>> > --download-netcdf --download-opencascade --download-p4est
>> > --download-parmetis --download-pnetcdf --download-pragmatic
>> > --download-ptscotch --download-scalapack --download-slepc
>> > --download-suitesparse --download-superlu_dist --download-tetgen
>> > --download-triangle --with-cmake-exec=/PETSc3/petsc/apple/bin/cmake
>> > --with-ctest-exec=/PETSc3/petsc/apple/bin/ctest
>> > --with-hdf5-dir=/PETSc3/petsc/apple --with-mpi-dir=/PETSc3/petsc/apple
>> > --with-petsc4py=1 --with-shared-libraries --with-slepc --with-zlib
>> > [0]PETSC ERROR: #3 DMPlexSectionLoad_HDF5_Internal() at
>> > /PETSc3/petsc/petsc-dev/src/dm/impls/plex/plexhdf5.c:2750
>> > [1]PETSC ERROR: #1 PetscSectionCreateGlobalSection() at
>> > /PETSc3/petsc/petsc-dev/src/vec/is/section/interface/section.c:1322
>> > [0]PETSC ERROR: #4 DMPlexSectionLoad() at
>> > /PETSc3/petsc/petsc-dev/src/dm/impls/plex/plex.c:2364
>> > [1]PETSC ERROR: #2 DMGetGlobalSection() at
>> > /PETSc3/petsc/petsc-dev/src/dm/interface/dm.c:4527
>> > [0]PETSC ERROR: #5 main() at
>> > /Users/knepley/Downloads/tmp/Berend/readandcreate.c:85
>> > [1]PETSC ERROR: #3 DMPlexSectionLoad_HDF5_Internal() at
>> > /PETSc3/petsc/petsc-dev/src/dm/impls/plex/plexhdf5.c:2750
>> > [0]PETSC ERROR: PETSc Option Table entries:
>> > [0]PETSC ERROR: -malloc_debug (source: environment)
>> > [1]PETSC ERROR: #4 DMPlexSectionLoad() at
>> > /PETSc3/petsc/petsc-dev/src/dm/impls/plex/plex.c:2364
>> > [1]PETSC ERROR: #5 main() at
>> > /Users/knepley/Downloads/tmp/Berend/readandcreate.c:85
>> > [0]PETSC ERROR: -start_in_debugger_no (source: command line)
>> > [1]PETSC ERROR: PETSc Option Table entries:
>> > [0]PETSC ERROR: ----------------End of Error Message -------send entire
>> > error message to petsc-maint at mcs.anl.gov----------
>> > [1]PETSC ERROR: -malloc_debug (source: environment)
>> > application called MPI_Abort(MPI_COMM_SELF, 60) - process 0
>> > [1]PETSC ERROR: -start_in_debugger_no (source: command line)
>> > [1]PETSC ERROR: ----------------End of Error Message -------send entire
>> > error message to petsc-maint at mcs.anl.gov----------
>> > application called MPI_Abort(MPI_COMM_SELF, 60) - process 0
>> > 4) We now have parallel HDF5 loading, so you should not have to manually
>> > distribute. I will change your example to use it and send it back when I
>> > am done.
>> >
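>> > With the parallel loading path, the read would look roughly like this
>> > (sketch; the file name is a placeholder and the calls are the DMPlex
>> > HDF5 API as of 3.18):
>> >
>> >   DM          dm;
>> >   PetscSF     sfO;   /* maps on-disk point numbering to in-memory points */
>> >   PetscViewer viewer;
>> >
>> >   PetscCall(PetscViewerHDF5Open(PETSC_COMM_WORLD, "periodic.h5", FILE_MODE_READ, &viewer));
>> >   PetscCall(PetscViewerPushFormat(viewer, PETSC_VIEWER_HDF5_PETSC));
>> >   PetscCall(DMCreate(PETSC_COMM_WORLD, &dm));
>> >   PetscCall(DMSetType(dm, DMPLEX));
>> >   PetscCall(DMPlexTopologyLoad(dm, viewer, &sfO));   /* loaded in parallel across ranks */
>> >   PetscCall(DMPlexCoordinatesLoad(dm, viewer, sfO));
>> >   PetscCall(DMPlexLabelsLoad(dm, viewer, sfO));
>> >   PetscCall(PetscSFDestroy(&sfO));
>> >   PetscCall(PetscViewerPopFormat(viewer));
>> >   PetscCall(PetscViewerDestroy(&viewer));
>> >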
>> >    Thanks!
>> >
>> >       Matt
>> >
>> >     Many thanks and kind regards,
>> >     Berend.
>> >
>> >
>> >
>>
>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/