> On Aug 14, 2019, at 1:19 AM, Jed Brown via petsc-dev <petsc-dev@mcs.anl.gov> wrote:
>
> [Cc: petsc-dev]
>
> Also, why is our current mode of localized coordinates preferred over
> the coordinate DM being non-periodic? Is the intent to preserve that
> for every point in a DM, the point is also valid in the coordinate DM?
> Can there be "gaps" in a chart?
>
> I've been digging around in the implementation because there is no
> documentation of localized coordinates, but it feels more complicated
> than I'd have hoped.

A while ago, “localization” of coordinates supported only very simple cases, where periodic points were identified through the ‘maxCell’ parameter (used to compute the proper cell coordinates). I think this is why you need at least 3 cells to support periodicity: the BoxMesh constructor uses the maxCell trick.

Now you can also inform Plex about periodicity without the maxCell trick; see e.g. https://bitbucket.org/petsc/petsc/src/6a494beb09767ff86fff34131928e076224d7569/src/dm/impls/plex/plexgmsh.c#lines-1468. In that case it is the user’s responsibility to populate the cell part of the coordinate section with the proper localized coordinates.
The DMPlex code fully supports coordinates localized only in those cells touching the periodic boundary. (I’m not a fan of this, since it requires a lot of ‘if’/‘else’ switches.)

I think domain_box_size 1 is not possible; we can probably allow domain_box_size 2.
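To make the first route concrete, here is a rough, untested sketch of the maxCell path (assuming the current DMPlexCreateBoxMesh()/DMLocalizeCoordinates() signatures in master; the array sizes and option names are just for illustration):

#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM             dm;
  PetscInt       faces[1] = {3};                    /* >= 3 cells so the maxCell trick can disambiguate periodic images */
  DMBoundaryType bd[1]    = {DM_BOUNDARY_PERIODIC};
  PetscReal      lower[1] = {0.0}, upper[1] = {1.0};
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  /* Route 1: BoxMesh sets up periodicity internally and uses maxCell to map
     vertex coordinates back into the primary periodic cell. */
  ierr = DMPlexCreateBoxMesh(PETSC_COMM_WORLD, 1, PETSC_FALSE, faces, lower, upper, bd, PETSC_TRUE, &dm);CHKERRQ(ierr);
  /* Localize: cells touching the periodic boundary get their own (unwrapped)
     copy of the vertex coordinates in the cell part of the coordinate section. */
  ierr = DMLocalizeCoordinates(dm);CHKERRQ(ierr);
  ierr = DMViewFromOptions(dm, NULL, "-dm_view");CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

The Gmsh route linked above instead (if I recall correctly) sets the periodicity flag without a maxCell and writes the localized coordinates of the cells on the periodic boundary directly into the cell part of the coordinate section.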
> Jed Brown <jed@jedbrown.org> writes:
>
>> Can this be fixed? Even better, can we allow -domain_box_size 1?
>>
>> $ mpich/tests/dm/impls/plex/examples/tests/ex1 -dim 1 -domain_box_sizes 2 -cell_simplex 0 -x_periodicity periodic
>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>> [0]PETSC ERROR: Invalid argument
>> [0]PETSC ERROR: Mesh cell 1 is inverted, |J| = -0.25
>> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>> [0]PETSC ERROR: Petsc Development GIT revision: v3.11.3-1683-g1ac5c604ca GIT Date: 2019-08-13 14:39:38 +0000
>> [0]PETSC ERROR: mpich/tests/dm/impls/plex/examples/tests/ex1 on a mpich named joule.cs.colorado.edu by jed Tue Aug 13 17:11:25 2019
>> [0]PETSC ERROR: Configure options --download-chaco --download-ctetgen --download-exodusii --download-hypre --download-med --download-ml --download-mumps --download-pnetcdf --download-pragmatic --download-scalapack --download-spai --download-sundials --download-superlu --download-superlu_dist --download-triangle --with-c2html --with-eigen-dir=/usr --with-hdf5-dir=/opt/hdf5-mpich --with-lgrind --with-metis --with-mpi-dir=/home/jed/usr/ccache/mpich/ --download-netcdf --download-conduit --with-parmetis --with-single-library=0 --with-suitesparse --with-yaml --with-zlib -PETSC_ARCH=mpich COPTFLAGS="-Og -march=native -g" CXXOPTFLAGS="-Og -march=native -g" FOPTFLAGS="-Og -march=native -g"
>> [0]PETSC ERROR: #1 DMPlexCheckGeometry() line 7029 in /home/jed/petsc/src/dm/impls/plex/plex.c
>> [0]PETSC ERROR: #2 CreateMesh() line 412 in /home/jed/petsc/src/dm/impls/plex/examples/tests/ex1.c
>> [0]PETSC ERROR: #3 main() line 426 in /home/jed/petsc/src/dm/impls/plex/examples/tests/ex1.c
>> [0]PETSC ERROR: PETSc Option Table entries:
>> [0]PETSC ERROR: -cell_simplex 0
>> [0]PETSC ERROR: -dim 1
>> [0]PETSC ERROR: -domain_box_sizes 2
>> [0]PETSC ERROR: -malloc_test
>> [0]PETSC ERROR: -x_periodicity periodic
>> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
>> application called MPI_Abort(MPI_COMM_WORLD, 62) - process 0
>> [unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=62
>> :
>> system msg for write_line failure : Bad file descriptor