[petsc-dev] Periodic meshes with <3 elements per edge?

Jed Brown jed at jedbrown.org
Tue Aug 13 18:19:48 CDT 2019


[Cc: petsc-dev]

Also, why is our current mode of localized coordinates preferred over
making the coordinate DM non-periodic?  Is the intent to preserve the
invariant that every point in a DM is also a valid point in the
coordinate DM?  Can there be "gaps" in a chart?

I've been digging around in the implementation because there is no
documentation of localized coordinates, but it feels more complicated
than I'd hoped.
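
Concretely, the kind of query I have in mind is sketched below (the
helper name CheckLocalizedCoordinates is just for illustration, not
anything in PETSc): ask whether coordinates are localized, then walk
the coordinate DM's chart and count points that carry no coordinate
dofs, which is what I mean by "gaps".

  #include <petscdmplex.h>

  /* Illustrative helper: report whether coordinates are localized and
     how many points in the coordinate DM's chart carry no coordinate
     dofs. */
  static PetscErrorCode CheckLocalizedCoordinates(DM dm)
  {
    DM             cdm;
    PetscSection   csec;
    PetscBool      localized;
    PetscInt       pStart, pEnd, p, nodof = 0;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = DMGetCoordinatesLocalized(dm, &localized);CHKERRQ(ierr);
    ierr = DMGetCoordinateDM(dm, &cdm);CHKERRQ(ierr);
    ierr = DMGetCoordinateSection(dm, &csec);CHKERRQ(ierr);
    ierr = DMPlexGetChart(cdm, &pStart, &pEnd);CHKERRQ(ierr);
    for (p = pStart; p < pEnd; ++p) {
      PetscInt dof;
      ierr = PetscSectionGetDof(csec, p, &dof);CHKERRQ(ierr);
      if (!dof) ++nodof;
    }
    ierr = PetscPrintf(PETSC_COMM_SELF,
                       "localized %d  chart [%D,%D)  points without coordinate dofs %D\n",
                       (int)localized, pStart, pEnd, nodof);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }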

Jed Brown <jed at jedbrown.org> writes:

> Can this be fixed?  Even better, can we allow -domain_box_size 1?
>
> $ mpich/tests/dm/impls/plex/examples/tests/ex1 -dim 1 -domain_box_sizes 2 -cell_simplex 0 -x_periodicity periodic
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Invalid argument
> [0]PETSC ERROR: Mesh cell 1 is inverted, |J| = -0.25
> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.11.3-1683-g1ac5c604ca  GIT Date: 2019-08-13 14:39:38 +0000
> [0]PETSC ERROR: mpich/tests/dm/impls/plex/examples/tests/ex1 on a mpich named joule.cs.colorado.edu by jed Tue Aug 13 17:11:25 2019
> [0]PETSC ERROR: Configure options --download-chaco --download-ctetgen --download-exodusii --download-hypre --download-med --download-ml --download-mumps --download-pnetcdf --download-pragmatic --download-scalapack --download-spai --download-sundials --download-superlu --download-superlu_dist --download-triangle --with-c2html --with-eigen-dir=/usr --with-hdf5-dir=/opt/hdf5-mpich --with-lgrind --with-metis --with-mpi-dir=/home/jed/usr/ccache/mpich/ --download-netcdf --download-conduit --with-parmetis --with-single-library=0 --with-suitesparse --with-yaml --with-zlib -PETSC_ARCH=mpich COPTFLAGS="-Og -march=native -g" CXXOPTFLAGS="-Og -march=native -g" FOPTFLAGS="-Og -march=native -g"
> [0]PETSC ERROR: #1 DMPlexCheckGeometry() line 7029 in /home/jed/petsc/src/dm/impls/plex/plex.c
> [0]PETSC ERROR: #2 CreateMesh() line 412 in /home/jed/petsc/src/dm/impls/plex/examples/tests/ex1.c
> [0]PETSC ERROR: #3 main() line 426 in /home/jed/petsc/src/dm/impls/plex/examples/tests/ex1.c
> [0]PETSC ERROR: PETSc Option Table entries:
> [0]PETSC ERROR: -cell_simplex 0
> [0]PETSC ERROR: -dim 1
> [0]PETSC ERROR: -domain_box_sizes 2
> [0]PETSC ERROR: -malloc_test
> [0]PETSC ERROR: -x_periodicity periodic
> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
> application called MPI_Abort(MPI_COMM_WORLD, 62) - process 0
> [unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=62
> :
> system msg for write_line failure : Bad file descriptor
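
For reference, a standalone sketch of the setup above (going through
DMPlexCreateBoxMesh directly rather than ex1's CreateMesh, so the
details may not match exactly) would be:

  #include <petscdmplex.h>

  int main(int argc, char **argv)
  {
    DM              dm;
    const PetscInt  faces[1]       = {2};                /* two cells in x */
    const PetscReal lower[1]       = {0.0}, upper[1] = {1.0};
    DMBoundaryType  periodicity[1] = {DM_BOUNDARY_PERIODIC};
    PetscErrorCode  ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
    /* 1D tensor-cell box mesh, periodic in x, interpolated */
    ierr = DMPlexCreateBoxMesh(PETSC_COMM_WORLD, 1, PETSC_FALSE, faces, lower, upper,
                               periodicity, PETSC_TRUE, &dm);CHKERRQ(ierr);
    ierr = DMPlexCheckGeometry(dm);CHKERRQ(ierr);  /* the check that errors above */
    ierr = DMDestroy(&dm);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }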

