<div dir="ltr">Checking back. What does not work?<div><br></div><div> Thanks,</div><div><br></div><div> Matt</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Tue, Jan 24, 2023 at 11:26 AM Matthew Knepley <<a href="mailto:knepley@gmail.com">knepley@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr">On Tue, Jan 24, 2023 at 10:39 AM Berend van Wachem <<a href="mailto:berend.vanwachem@ovgu.de" target="_blank">berend.vanwachem@ovgu.de</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Dear Matt,<br>
<br>
I have now been working on this with PETSc 3.18.3.<br>
<br>
1) I can confirm that enforcing periodicity works for a single-core <br>
simulation.<br>
<br>
2) However, when using multiple cores, the code still hangs. Is there <br>
something I should do to fix this? Or should this be fixed in the next <br>
PETSc version?<br></blockquote><div><br></div><div>Dang dang dang. I forgot to merge this fix. Thanks for reminding me. It is now here:</div><div><br></div><div> <a href="https://gitlab.com/petsc/petsc/-/merge_requests/6001" target="_blank">https://gitlab.com/petsc/petsc/-/merge_requests/6001</a></div><div> <br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
3) This is strange, as it works fine for me.<br></blockquote><div><br></div><div>Will try again with current main.</div><div><br> Thanks<br><br> Matt<br></div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
Thanks, best, Berend.<br>
<br>
<br>
On 12/15/22 18:56, Matthew Knepley wrote:<br>
> On Wed, Dec 14, 2022 at 3:58 AM Berend van Wachem <br>
> <<a href="mailto:berend.vanwachem@ovgu.de" target="_blank">berend.vanwachem@ovgu.de</a> <mailto:<a href="mailto:berend.vanwachem@ovgu.de" target="_blank">berend.vanwachem@ovgu.de</a>>> wrote:<br>
> <br>
> <br>
> Dear PETSc team and users,<br>
> <br>
> I have asked a few times about this before, but we haven't really<br>
> gotten this to work yet.<br>
> <br>
> In our code, we use the DMPlex framework and are also interested in<br>
> periodic geometries.<br>
> <br>
> As our simulations typically require many time-steps, we would like to<br>
> be able to save the DM to file and to read it again to resume the<br>
> simulation (a restart).<br>
> <br>
> Although this works for a non-periodic DM, we haven't been able to get<br>
> this to work for a periodic one. To illustrate this, I have made a<br>
> working example, consisting of 2 files, createandwrite.c and<br>
> readandcreate.c. I have attached these 2 working examples. We are using<br>
> PETSc 3.18.2.<br>
> <br>
> In the first file (createandwrite.c) a DMPlex is created and written to<br>
> a file. Periodicity is activated on lines 52-55 of the code.<br>
> <br>
> In the second file (readandcreate.c) a DMPlex is read from the file.<br>
> When a periodic DM is read, this does not work. Also, trying to<br>
> 'enforce' periodicity (lines 55-66) does not work if the number of<br>
> processes is larger than 1: the code hangs without producing an<br>
> error.<br>
> <br>
> Could you indicate what I am missing? I have really tried many different<br>
> options, without finding a solution.<br>
> <br>
> <br>
> Hi Berend,<br>
> <br>
> There are several problems. I will eventually fix all of them, but I <br>
> think we can get this working quickly.<br>
> <br>
> 1) Periodicity information is not saved. I will fix this, but forcing it <br>
> should work.<br>
> <br>
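> A minimal sketch of what that forcing step might look like after loading,<br>
> assuming the 3.18-style DMSetPeriodicity(dm, maxCell, Lstart, L) signature;<br>
> the box extents and the cell count used for maxCell are placeholders for<br>
> your own domain, not values from the attached example:<br>
> <br>
> ```c<br>
> PetscReal Lstart[3]   = {0.0, 0.0, 0.0}; /* placeholder: lower box corner */<br>
> PetscReal L[3]        = {1.0, 1.0, 1.0}; /* placeholder: box lengths */<br>
> PetscInt  cellsPerDim = 16;              /* placeholder: cells per direction */<br>
> PetscReal maxCell[3];<br>
> <br>
> /* maxCell must exceed the largest cell size so point matching across<br>
>    the periodic boundary works; 1.1x the mesh spacing is a common choice */<br>
> for (PetscInt d = 0; d < 3; ++d) maxCell[d] = 1.1 * L[d] / cellsPerDim;<br>
> PetscCall(DMSetPeriodicity(dm, maxCell, Lstart, L));<br>
> /* Localize coordinates so cells crossing the boundary get their own copies */<br>
> PetscCall(DMLocalizeCoordinates(dm));<br>
> ```<br>
> <br>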
> 2) You were getting a hang because the blocksize on the local <br>
> coordinates was not set correctly after loading<br>
> since the vector had zero length. This does not happen in any test <br>
> because HDF5 loads a global vector, but<br>
> most other things create local coordinates. I have a fix for this, <br>
> which I will get in an MR. Also, I moved DMLocalizeCoordinates()<br>
> after distribution, since this is where it belongs.<br>
> <br>
> knepley/fix-plex-periodic-faces *$:/PETSc3/petsc/petsc-dev$ git diff<br>
> diff --git a/src/dm/interface/dmcoordinates.c b/src/dm/interface/dmcoordinates.c<br>
> index a922348f95b..6437e9f7259 100644<br>
> --- a/src/dm/interface/dmcoordinates.c<br>
> +++ b/src/dm/interface/dmcoordinates.c<br>
> @@ -551,10 +551,14 @@ PetscErrorCode DMGetCoordinatesLocalSetUp(DM dm)<br>
> PetscFunctionBegin;<br>
> PetscValidHeaderSpecific(dm, DM_CLASSID, 1);<br>
> if (!dm->coordinates[0].xl && dm->coordinates[0].x) {<br>
> - DM cdm = NULL;<br>
> + DM cdm = NULL;<br>
> + PetscInt bs;<br>
> <br>
> PetscCall(DMGetCoordinateDM(dm, &cdm));<br>
> PetscCall(DMCreateLocalVector(cdm, &dm->coordinates[0].xl));<br>
> + // If the size of the vector is 0, it will not get the right block size<br>
> + PetscCall(VecGetBlockSize(dm->coordinates[0].x, &bs));<br>
> + PetscCall(VecSetBlockSize(dm->coordinates[0].xl, bs));<br>
> PetscCall(PetscObjectSetName((PetscObject)dm->coordinates[0].xl, "coordinates"));<br>
> PetscCall(DMGlobalToLocalBegin(cdm, dm->coordinates[0].x, INSERT_VALUES, dm->coordinates[0].xl));<br>
> PetscCall(DMGlobalToLocalEnd(cdm, dm->coordinates[0].x, INSERT_VALUES, dm->coordinates[0].xl));<br>
> <br>
> 3) If I comment out forcing the periodicity, your example does not run <br>
> for me. I will try to figure it out<br>
> <br>
> [0]PETSC ERROR: --------------------- Error Message <br>
> --------------------------------------------------------------<br>
> [0]PETSC ERROR: Nonconforming object sizes<br>
> [0]PETSC ERROR: SF roots 4400 < pEnd 6000<br>
> [1]PETSC ERROR: --------------------- Error Message <br>
> --------------------------------------------------------------<br>
> [0]PETSC ERROR: WARNING! There are option(s) set that were not used! <br>
> Could be the program crashed before they were used or a spelling <br>
> mistake, etc!<br>
> [1]PETSC ERROR: Nonconforming object sizes<br>
> [0]PETSC ERROR: Option left: name:-start_in_debugger_no (no value) <br>
> source: command line<br>
> [1]PETSC ERROR: SF roots 4400 < pEnd 6000<br>
> [0]PETSC ERROR: See <a href="https://petsc.org/release/faq/" rel="noreferrer" target="_blank">https://petsc.org/release/faq/</a> <br>
> <<a href="https://petsc.org/release/faq/" rel="noreferrer" target="_blank">https://petsc.org/release/faq/</a>> for trouble shooting.<br>
> [0]PETSC ERROR: Petsc Development GIT revision: v3.18.1-494-g16200351da0 <br>
> GIT Date: 2022-12-12 23:42:20 +0000<br>
> [1]PETSC ERROR: WARNING! There are option(s) set that were not used! <br>
> Could be the program crashed before they were used or a spelling <br>
> mistake, etc!<br>
> [1]PETSC ERROR: Option left: name:-start_in_debugger_no (no value) <br>
> source: command line<br>
> [0]PETSC ERROR: ./readandcreate on a arch-master-debug named <br>
> <a href="http://MacBook-Pro.cable.rcn.com" rel="noreferrer" target="_blank">MacBook-Pro.cable.rcn.com</a> <<a href="http://MacBook-Pro.cable.rcn.com" rel="noreferrer" target="_blank">http://MacBook-Pro.cable.rcn.com</a>> by knepley <br>
> Thu Dec 15 12:50:26 2022<br>
> [1]PETSC ERROR: See <a href="https://petsc.org/release/faq/" rel="noreferrer" target="_blank">https://petsc.org/release/faq/</a> <br>
> <<a href="https://petsc.org/release/faq/" rel="noreferrer" target="_blank">https://petsc.org/release/faq/</a>> for trouble shooting.<br>
> [0]PETSC ERROR: Configure options --PETSC_ARCH=arch-master-debug <br>
> --download-bamg --download-bison --download-chaco --download-ctetgen <br>
> --download-egads --download-eigen --download-exodusii --download-fftw <br>
> --download-hpddm --download-ks --download-libceed --download-libpng <br>
> --download-metis --download-ml --download-mumps --download-muparser <br>
> --download-netcdf --download-opencascade --download-p4est <br>
> --download-parmetis --download-pnetcdf --download-pragmatic <br>
> --download-ptscotch --download-scalapack --download-slepc <br>
> --download-suitesparse --download-superlu_dist --download-tetgen <br>
> --download-triangle --with-cmake-exec=/PETSc3/petsc/apple/bin/cmake <br>
> --with-ctest-exec=/PETSc3/petsc/apple/bin/ctest <br>
> --with-hdf5-dir=/PETSc3/petsc/apple --with-mpi-dir=/PETSc3/petsc/apple <br>
> --with-petsc4py=1 --with-shared-libraries --with-slepc --with-zlib<br>
> [1]PETSC ERROR: Petsc Development GIT revision: v3.18.1-494-g16200351da0 <br>
> GIT Date: 2022-12-12 23:42:20 +0000<br>
> [0]PETSC ERROR: #1 PetscSectionCreateGlobalSection() at <br>
> /PETSc3/petsc/petsc-dev/src/vec/is/section/interface/section.c:1322<br>
> [1]PETSC ERROR: ./readandcreate on a arch-master-debug named <br>
> <a href="http://MacBook-Pro.cable.rcn.com" rel="noreferrer" target="_blank">MacBook-Pro.cable.rcn.com</a> <<a href="http://MacBook-Pro.cable.rcn.com" rel="noreferrer" target="_blank">http://MacBook-Pro.cable.rcn.com</a>> by knepley <br>
> Thu Dec 15 12:50:26 2022<br>
> [0]PETSC ERROR: #2 DMGetGlobalSection() at <br>
> /PETSc3/petsc/petsc-dev/src/dm/interface/dm.c:4527<br>
> [1]PETSC ERROR: Configure options --PETSC_ARCH=arch-master-debug <br>
> --download-bamg --download-bison --download-chaco --download-ctetgen <br>
> --download-egads --download-eigen --download-exodusii --download-fftw <br>
> --download-hpddm --download-ks --download-libceed --download-libpng <br>
> --download-metis --download-ml --download-mumps --download-muparser <br>
> --download-netcdf --download-opencascade --download-p4est <br>
> --download-parmetis --download-pnetcdf --download-pragmatic <br>
> --download-ptscotch --download-scalapack --download-slepc <br>
> --download-suitesparse --download-superlu_dist --download-tetgen <br>
> --download-triangle --with-cmake-exec=/PETSc3/petsc/apple/bin/cmake <br>
> --with-ctest-exec=/PETSc3/petsc/apple/bin/ctest <br>
> --with-hdf5-dir=/PETSc3/petsc/apple --with-mpi-dir=/PETSc3/petsc/apple <br>
> --with-petsc4py=1 --with-shared-libraries --with-slepc --with-zlib<br>
> [0]PETSC ERROR: #3 DMPlexSectionLoad_HDF5_Internal() at <br>
> /PETSc3/petsc/petsc-dev/src/dm/impls/plex/plexhdf5.c:2750<br>
> [1]PETSC ERROR: #1 PetscSectionCreateGlobalSection() at <br>
> /PETSc3/petsc/petsc-dev/src/vec/is/section/interface/section.c:1322<br>
> [0]PETSC ERROR: #4 DMPlexSectionLoad() at <br>
> /PETSc3/petsc/petsc-dev/src/dm/impls/plex/plex.c:2364<br>
> [1]PETSC ERROR: #2 DMGetGlobalSection() at <br>
> /PETSc3/petsc/petsc-dev/src/dm/interface/dm.c:4527<br>
> [0]PETSC ERROR: #5 main() at <br>
> /Users/knepley/Downloads/tmp/Berend/readandcreate.c:85<br>
> [1]PETSC ERROR: #3 DMPlexSectionLoad_HDF5_Internal() at <br>
> /PETSc3/petsc/petsc-dev/src/dm/impls/plex/plexhdf5.c:2750<br>
> [0]PETSC ERROR: PETSc Option Table entries:<br>
> [0]PETSC ERROR: -malloc_debug (source: environment)<br>
> [1]PETSC ERROR: #4 DMPlexSectionLoad() at <br>
> /PETSc3/petsc/petsc-dev/src/dm/impls/plex/plex.c:2364<br>
> [1]PETSC ERROR: #5 main() at <br>
> /Users/knepley/Downloads/tmp/Berend/readandcreate.c:85<br>
> [0]PETSC ERROR: -start_in_debugger_no (source: command line)<br>
> [1]PETSC ERROR: PETSc Option Table entries:<br>
> [0]PETSC ERROR: ----------------End of Error Message -------send entire <br>
> error message to petsc-maint@mcs.anl.gov----------<br>
> [1]PETSC ERROR: -malloc_debug (source: environment)<br>
> application called MPI_Abort(MPI_COMM_SELF, 60) - process 0<br>
> [1]PETSC ERROR: -start_in_debugger_no (source: command line)<br>
> [1]PETSC ERROR: ----------------End of Error Message -------send entire <br>
> error message to petsc-maint@mcs.anl.gov----------<br>
> application called MPI_Abort(MPI_COMM_SELF, 60) - process 0<br>
> 4) We now have parallel HDF5 loading, so you should not have to manually <br>
> distribute. I will change your example to use it<br>
> and send it back when I am done.<br>
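> Roughly, the parallel loading path looks like this — the file name is<br>
> illustrative, and the PetscSF returned by the topology load is what ties<br>
> the on-disk point ordering to the in-memory one:<br>
> <br>
> ```c<br>
> PetscViewer viewer;<br>
> DM          dm;<br>
> PetscSF     sfXC;<br>
> <br>
> PetscCall(PetscViewerHDF5Open(PETSC_COMM_WORLD, "mesh.h5", FILE_MODE_READ, &viewer));<br>
> PetscCall(DMCreate(PETSC_COMM_WORLD, &dm));<br>
> PetscCall(DMSetType(dm, DMPLEX));<br>
> /* Each rank loads its own share of the topology: no manual DMPlexDistribute() */<br>
> PetscCall(DMPlexTopologyLoad(dm, viewer, &sfXC));<br>
> PetscCall(DMPlexCoordinatesLoad(dm, viewer, sfXC));<br>
> PetscCall(DMPlexLabelsLoad(dm, viewer, sfXC));<br>
> PetscCall(PetscSFDestroy(&sfXC));<br>
> PetscCall(PetscViewerDestroy(&viewer));<br>
> ```<br>
> <br>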
> <br>
> Thanks!<br>
> <br>
> Matt<br>
> <br>
> Many thanks and kind regards,<br>
> Berend.<br>
> <br>
> <br>
> <br>
> -- <br>
> What most experimenters take for granted before they begin their <br>
> experiments is infinitely more interesting than any results to which <br>
> their experiments lead.<br>
> -- Norbert Wiener<br>
> <br>
> <a href="https://www.cse.buffalo.edu/~knepley/" rel="noreferrer" target="_blank">https://www.cse.buffalo.edu/~knepley/</a> <<a href="http://www.cse.buffalo.edu/~knepley/" rel="noreferrer" target="_blank">http://www.cse.buffalo.edu/~knepley/</a>><br>
</blockquote></div><br clear="all"><div><br></div>-- <br><div dir="ltr"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div></div>
</blockquote></div><br clear="all"><div><br></div><span class="gmail_signature_prefix">-- </span><br><div dir="ltr" class="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div>