[petsc-dev] Periodic meshes with <3 elements per edge?
Matthew Knepley
knepley at gmail.com
Tue Aug 13 20:25:41 CDT 2019
On Tue, Aug 13, 2019 at 7:35 PM Stefano Zampini <stefano.zampini at gmail.com>
wrote:
>
>
> On Aug 14, 2019, at 1:19 AM, Jed Brown via petsc-dev <
> petsc-dev at mcs.anl.gov> wrote:
>
> [Cc: petsc-dev]
>
> Also, why is our current mode of localized coordinates preferred over
> the coordinate DM being non-periodic? Is the intent to preserve the
> invariant that every point in a DM is also a valid point in the coordinate DM?
>
> Yes.
> Can there be "gaps" in a chart?
>
> Yes.
> I've been digging around in the implementation because there is no
> documentation of localized coordinates, but it feels more complicated
> than I'd have hoped.
>
>
> A while ago, “localization” of coordinates supported only very simple
> cases, where periodic points were identified through the ‘maxCell’ parameter
> (used to compute the proper cell coordinates). I think this is the reason
> why you need at least 3 cells to support periodicity, since the BoxMesh
> constructor uses the maxCell trick.
>
This was my original conception since it was the only fully automatic way
to apply periodicity on an unstructured mesh that
was read from a file or generator.
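A sketch of that automatic path, against the v3.11-era signatures (check
DMPlexCreateBoxMesh and DMLocalizeCoordinates for your version; this is
untested and only meant to show the shape of it):

#include <petscdmplex.h>

/* Sketch (v3.11-era API): periodic box mesh the "automatic" way.
   The constructor derives maxCell from the cell size, and any coordinate
   jump larger than maxCell is interpreted as a wrap across the periodic
   boundary, which is why fewer than 3 cells per periodic direction
   confuses that heuristic. */
int main(int argc, char **argv)
{
  DM             dm;
  PetscInt       faces[1] = {3};                  /* >= 3 cells along the periodic direction */
  DMBoundaryType bd[1]    = {DM_BOUNDARY_PERIODIC};
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  ierr = DMPlexCreateBoxMesh(PETSC_COMM_WORLD, 1, PETSC_FALSE /* tensor cells */,
                             faces, NULL /* lower */, NULL /* upper */,
                             bd, PETSC_TRUE /* interpolate */, &dm);CHKERRQ(ierr);
  ierr = DMLocalizeCoordinates(dm);CHKERRQ(ierr); /* may already be done by the constructor */
  ierr = DMViewFromOptions(dm, NULL, "-dm_view");CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}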
> Now, you can also inform Plex about periodicity without the maxCell trick,
> see e.g.
> https://bitbucket.org/petsc/petsc/src/6a494beb09767ff86fff34131928e076224d7569/src/dm/impls/plex/plexgmsh.c#lines-1468.
> In this case, it is the user's responsibility to populate the cell part of
> the coordinate section with the proper localized coordinates.
>
This is a great addition from Stefano and Lisandro, but note that it is
nontrivial. The user has to identify the
periodic boundary.
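The hand-localization has roughly this shape (a sketch only, not the actual
plexgmsh.c code; CellTouchesPeriodicBoundary() and the coordinate fill are
placeholders the user must supply, and the Section/Vec calls follow the
v3.11-era API):

#include <petscdmplex.h>

/* user-provided predicate (placeholder): does cell c touch the periodic boundary? */
extern PetscBool CellTouchesPeriodicBoundary(DM dm, PetscInt c);

/* Rough sketch: only cells touching the periodic boundary get their own
   (unwrapped) copy of the corner coordinates in the coordinate section. */
static PetscErrorCode LocalizeByHand(DM dm, PetscInt dim, PetscInt numCorners)
{
  PetscSection   cs;
  Vec            coordinates;
  PetscScalar   *coords;
  PetscInt       cStart, cEnd, vStart, vEnd, c, v, coordSize;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr); /* cells    */
  ierr = DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd);CHKERRQ(ierr);  /* vertices */
  ierr = DMGetCoordinateSection(dm, &cs);CHKERRQ(ierr);
  ierr = PetscSectionSetNumFields(cs, 1);CHKERRQ(ierr);
  ierr = PetscSectionSetFieldComponents(cs, 0, dim);CHKERRQ(ierr);
  ierr = PetscSectionSetChart(cs, cStart, vEnd);CHKERRQ(ierr);
  for (c = cStart; c < cEnd; ++c) {
    if (!CellTouchesPeriodicBoundary(dm, c)) continue;
    ierr = PetscSectionSetDof(cs, c, dim*numCorners);CHKERRQ(ierr);
    ierr = PetscSectionSetFieldDof(cs, c, 0, dim*numCorners);CHKERRQ(ierr);
  }
  for (v = vStart; v < vEnd; ++v) {
    ierr = PetscSectionSetDof(cs, v, dim);CHKERRQ(ierr);
    ierr = PetscSectionSetFieldDof(cs, v, 0, dim);CHKERRQ(ierr);
  }
  ierr = PetscSectionSetUp(cs);CHKERRQ(ierr);
  ierr = PetscSectionGetStorageSize(cs, &coordSize);CHKERRQ(ierr);
  ierr = VecCreate(PETSC_COMM_SELF, &coordinates);CHKERRQ(ierr);
  ierr = VecSetSizes(coordinates, coordSize, PETSC_DETERMINE);CHKERRQ(ierr);
  ierr = VecSetBlockSize(coordinates, dim);CHKERRQ(ierr);
  ierr = VecSetType(coordinates, VECSTANDARD);CHKERRQ(ierr);
  ierr = VecGetArray(coordinates, &coords);CHKERRQ(ierr);
  /* ... fill the vertex dofs as usual, and for each flagged cell write an
     unwrapped copy of its corner coordinates at that cell's offset ... */
  ierr = VecRestoreArray(coordinates, &coords);CHKERRQ(ierr);
  ierr = DMSetCoordinatesLocal(dm, coordinates);CHKERRQ(ierr);
  ierr = VecDestroy(&coordinates);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}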
> The DMPlex code fully supports coordinates localized only in those cells
> touching the periodic boundary. (I’m not a fan of this, since it requires a
> lot of ‘if’/‘else’ switches.)
>
I believe that Lisandro wanted this to cut down on redundant storage of
coordinates.
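The ‘if’/‘else’ Stefano mentions looks roughly like this whenever a cell's
coordinates are pulled out (again a sketch against the v3.11-era API, not the
actual plexgeometry.c code):

#include <petscdmplex.h>

/* Sketch of the per-cell branch: localized cells carry their own unwrapped
   coordinates on the cell point; everything else gathers vertex coordinates
   through the closure. */
static PetscErrorCode GetCellCoords(DM dm, PetscInt cell)
{
  PetscSection       cs;
  Vec                coordsLocal;
  PetscInt           dof, off, csize;
  PetscScalar       *closure = NULL;
  const PetscScalar *array;
  PetscErrorCode     ierr;

  PetscFunctionBeginUser;
  ierr = DMGetCoordinateSection(dm, &cs);CHKERRQ(ierr);
  ierr = DMGetCoordinatesLocal(dm, &coordsLocal);CHKERRQ(ierr);
  ierr = PetscSectionGetDof(cs, cell, &dof);CHKERRQ(ierr);
  if (dof) {                       /* localized: coordinates stored on the cell */
    ierr = PetscSectionGetOffset(cs, cell, &off);CHKERRQ(ierr);
    ierr = VecGetArrayRead(coordsLocal, &array);CHKERRQ(ierr);
    /* &array[off] holds dim*numCorners values, already unwrapped */
    ierr = VecRestoreArrayRead(coordsLocal, &array);CHKERRQ(ierr);
  } else {                         /* not localized: gather vertex coordinates */
    ierr = DMPlexVecGetClosure(dm, cs, coordsLocal, cell, &csize, &closure);CHKERRQ(ierr);
    /* ... use closure ... */
    ierr = DMPlexVecRestoreClosure(dm, cs, coordsLocal, cell, &csize, &closure);CHKERRQ(ierr);
  }
  PetscFunctionReturn(0);
}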
> I think domain_box_size 1 is not possible; we can probably allow
> domain_box_size 2.
>
Technically, a single box is now possible with higher-order coordinate
spaces, but you have to do everything by hand and it is completely untested.
Thanks,
Matt
>
> Jed Brown <jed at jedbrown.org> writes:
>
> Can this be fixed? Even better, can we allow -domain_box_size 1?
>
> $ mpich/tests/dm/impls/plex/examples/tests/ex1 -dim 1 -domain_box_sizes 2
> -cell_simplex 0 -x_periodicity periodic
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Invalid argument
> [0]PETSC ERROR: Mesh cell 1 is inverted, |J| = -0.25
> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.11.3-1683-g1ac5c604ca
> GIT Date: 2019-08-13 14:39:38 +0000
> [0]PETSC ERROR: mpich/tests/dm/impls/plex/examples/tests/ex1 on a mpich
> named joule.cs.colorado.edu by jed Tue Aug 13 17:11:25 2019
> [0]PETSC ERROR: Configure options --download-chaco --download-ctetgen
> --download-exodusii --download-hypre --download-med --download-ml
> --download-mumps --download-pnetcdf --download-pragmatic
> --download-scalapack --download-spai --download-sundials
> --download-superlu --download-superlu_dist --download-triangle
> --with-c2html --with-eigen-dir=/usr --with-hdf5-dir=/opt/hdf5-mpich
> --with-lgrind --with-metis --with-mpi-dir=/home/jed/usr/ccache/mpich/
> --download-netcdf --download-conduit --with-parmetis
> --with-single-library=0 --with-suitesparse --with-yaml --with-zlib
> PETSC_ARCH=mpich COPTFLAGS="-Og -march=native -g" CXXOPTFLAGS="-Og
> -march=native -g" FOPTFLAGS="-Og -march=native -g"
> [0]PETSC ERROR: #1 DMPlexCheckGeometry() line 7029 in
> /home/jed/petsc/src/dm/impls/plex/plex.c
> [0]PETSC ERROR: #2 CreateMesh() line 412 in
> /home/jed/petsc/src/dm/impls/plex/examples/tests/ex1.c
> [0]PETSC ERROR: #3 main() line 426 in
> /home/jed/petsc/src/dm/impls/plex/examples/tests/ex1.c
> [0]PETSC ERROR: PETSc Option Table entries:
> [0]PETSC ERROR: -cell_simplex 0
> [0]PETSC ERROR: -dim 1
> [0]PETSC ERROR: -domain_box_sizes 2
> [0]PETSC ERROR: -malloc_test
> [0]PETSC ERROR: -x_periodicity periodic
> [0]PETSC ERROR: ----------------End of Error Message -------send entire
> error message to petsc-maint at mcs.anl.gov----------
> application called MPI_Abort(MPI_COMM_WORLD, 62) - process 0
> [unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=62
> :
> system msg for write_line failure : Bad file descriptor
>
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/