[petsc-dev] Periodic meshes with <3 elements per edge?
Stefano Zampini
stefano.zampini at gmail.com
Tue Aug 13 18:35:41 CDT 2019
> On Aug 14, 2019, at 1:19 AM, Jed Brown via petsc-dev <petsc-dev at mcs.anl.gov> wrote:
>
> [Cc: petsc-dev]
>
> Also, why is our current mode of localized coordinates preferred over
> the coordinate DM being non-periodic? Is the intent to preserve that
> for every point in a DM, the point is also valid in the coordinate DM?
> Can there be "gaps" in a chart?
>
> I've been digging around in the implementation because there is no
> documentation of localized coordinates, but it feels more complicated
> than I'd have hoped.
A while ago, “localization” of coordinates supported only very simple cases, where periodic points were identified through the ‘maxCell’ parameter (used to compute the proper cell coordinates). I think this is why you need at least 3 cells to support periodicity, since the BoxMesh constructor uses the maxCell trick.
Now you can also inform Plex about periodicity without the maxCell trick; see e.g. https://bitbucket.org/petsc/petsc/src/6a494beb09767ff86fff34131928e076224d7569/src/dm/impls/plex/plexgmsh.c#lines-1468. In this case, it is the user's responsibility to populate the cell part of the coordinate section with the proper localized coordinates.
The DMPlex code fully supports coordinates localized only in the cells touching the periodic boundary. (I’m not a fan of this, since it requires a lot of ‘if’/‘else’ switches.)
I think domain_box_size 1 is not possible; we can probably allow domain_box_size 2.
>
> Jed Brown <jed at jedbrown.org> writes:
>
>> Can this be fixed? Even better, can we allow -domain_box_size 1?
>>
>> $ mpich/tests/dm/impls/plex/examples/tests/ex1 -dim 1 -domain_box_sizes 2 -cell_simplex 0 -x_periodicity periodic
>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>> [0]PETSC ERROR: Invalid argument
>> [0]PETSC ERROR: Mesh cell 1 is inverted, |J| = -0.25
>> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>> [0]PETSC ERROR: Petsc Development GIT revision: v3.11.3-1683-g1ac5c604ca GIT Date: 2019-08-13 14:39:38 +0000
>> [0]PETSC ERROR: mpich/tests/dm/impls/plex/examples/tests/ex1 on a mpich named joule.cs.colorado.edu by jed Tue Aug 13 17:11:25 2019
>> [0]PETSC ERROR: Configure options --download-chaco --download-ctetgen --download-exodusii --download-hypre --download-med --download-ml --download-mumps --download-pnetcdf --download-pragmatic --download-scalapack --download-spai --download-sundials --download-superlu --download-superlu_dist --download-triangle --with-c2html --with-eigen-dir=/usr --with-hdf5-dir=/opt/hdf5-mpich --with-lgrind --with-metis --with-mpi-dir=/home/jed/usr/ccache/mpich/ --download-netcdf --download-conduit --with-parmetis --with-single-library=0 --with-suitesparse --with-yaml --with-zlib -PETSC_ARCH=mpich COPTFLAGS="-Og -march=native -g" CXXOPTFLAGS="-Og -march=native -g" FOPTFLAGS="-Og -march=native -g"
>> [0]PETSC ERROR: #1 DMPlexCheckGeometry() line 7029 in /home/jed/petsc/src/dm/impls/plex/plex.c
>> [0]PETSC ERROR: #2 CreateMesh() line 412 in /home/jed/petsc/src/dm/impls/plex/examples/tests/ex1.c
>> [0]PETSC ERROR: #3 main() line 426 in /home/jed/petsc/src/dm/impls/plex/examples/tests/ex1.c
>> [0]PETSC ERROR: PETSc Option Table entries:
>> [0]PETSC ERROR: -cell_simplex 0
>> [0]PETSC ERROR: -dim 1
>> [0]PETSC ERROR: -domain_box_sizes 2
>> [0]PETSC ERROR: -malloc_test
>> [0]PETSC ERROR: -x_periodicity periodic
>> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
>> application called MPI_Abort(MPI_COMM_WORLD, 62) - process 0
>> [unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=62
>> :
>> system msg for write_line failure : Bad file descriptor