[petsc-users] DMPlex export to hdf5/vtk for triangle/prism mesh
Fabian.Jakub
Fabian.Jakub at physik.uni-muenchen.de
Fri May 26 12:27:25 CDT 2017
Dear PETSc Team,
I am playing around with DMPlex, using it to generate the mesh for the
ICON weather model (http://doi.org/10.1002/2015MS000431), which employs a
triangular mesh horizontally and columns vertically.
This results in a grid of prism-shaped cells, where the top and bottom
faces are triangles and the side faces are rectangles.
I was delighted to see that I could export the triangle DMPlex (2D mesh)
to HDF5 and use petsc_gen_xdmf.py to then visualize the mesh in
VisIt/ParaView.
This is especially nice when exporting PetscSections/vectors directly to
VTK.
I then tried the same approach for the prism grid in 3D.
I attached the code for a single cell, as well as the output in HDF5.
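For reference, here is a rough petsc4py sketch of the same idea (the attached
plex_prism.F90 is my actual code; the vertex ordering, file name, and use of
createFromCellList here are only illustrative):

# Rough sketch: one triangular prism (wedge) cell from its 6 vertices,
# written to HDF5. interpolate=False, so only cell-vertex connectivity is stored.
import numpy as np
from petsc4py import PETSc

cells  = np.asarray([[0, 1, 2, 3, 4, 5]], dtype=PETSc.IntType)
coords = np.asarray([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.],   # bottom triangle
                     [0., 0., 1.], [1., 0., 1.], [0., 1., 1.]],  # top triangle
                    dtype=PETSc.RealType)

dm = PETSc.DMPlex().createFromCellList(3, cells, coords, interpolate=False)

viewer = PETSc.Viewer().createHDF5('prism.h5', mode='w')
dm.view(viewer)
viewer.destroy()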
However, trying to convert the HDF5 output fails with:
make prism.xmf
$PETSC_DIR/bin/petsc_gen_xdmf.py prism.h5
Traceback (most recent call last):
  File "/software/meteo/xenial/x86_64/petsc/master/debug_gcc/..//bin/petsc_gen_xdmf.py", line 241, in <module>
    generateXdmf(f)
  File "/software/meteo/xenial/x86_64/petsc/master/debug_gcc/..//bin/petsc_gen_xdmf.py", line 235, in generateXdmf
    Xdmf(xdmfFilename).write(hdfFilename, topoPath, numCells, numCorners, cellDim, geomPath, numVertices, spaceDim, time, vfields, cfields)
  File "/software/meteo/xenial/x86_64/petsc/master/debug_gcc/..//bin/petsc_gen_xdmf.py", line 193, in write
    self.writeSpaceGridHeader(fp, numCells, numCorners, cellDim, spaceDim)
  File "/software/meteo/xenial/x86_64/petsc/master/debug_gcc/..//bin/petsc_gen_xdmf.py", line 75, in writeSpaceGridHeader
    ''' % (self.cellMap[cellDim][numCorners], numCells, "XYZ" if spaceDim > 2 else "XY"))
KeyError: 6
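From the traceback I guess the script's cellMap simply has no entry for 3D
cells with 6 corners. Just to illustrate what I mean (assuming cellMap is a
nested dict of cell dimension -> corner count -> XDMF topology type, as the
traceback suggests; the existing entries shown here are my guess), something
like a 'Wedge' entry, the XDMF name for a 6-node triangular prism, might be
what is missing:

# Hypothetical extension of the nested cellMap used in petsc_gen_xdmf.py
cellMap = {
    2: {3: 'Triangle', 4: 'Quadrilateral'},
    3: {4: 'Tetrahedron', 6: 'Wedge', 8: 'Hexahedron'},
}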
Also, if I try to export a vector directly to VTK, VisIt and ParaView
fail to open it.
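Roughly what I do there, as a petsc4py sketch (reusing the wedge dm from the
sketch above; the section layout, field name, and file name are made up):

# Attach a trivial PetscSection (one scalar dof per vertex), create a global
# Vec on it, and try to write it with the VTK viewer.
sec = PETSc.Section().create()
sec.setChart(*dm.getChart())
vStart, vEnd = dm.getDepthStratum(0)   # depth 0 = vertices
for v in range(vStart, vEnd):
    sec.setDof(v, 1)
sec.setUp()
dm.setSection(sec)

vec = dm.createGlobalVec()
vec.setName('dummy_field')             # hypothetical field name
viewer = PETSc.Viewer().createVTK('prism.vtk', mode='w')
vec.view(viewer)
viewer.destroy()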
My question is:
Is this a general limitation of these output formats, i.e. that I cannot mix
faces with 3 and 4 vertices, or is it a limitation of petsc_gen_xdmf.py or
the VTK viewer?
I'd also welcome any thoughts on the prism mesh in general.
Is it that uncommon to use, and do you foresee other complications with it?
I fear I cannot change the discretization of the host model, but maybe it
makes sense to use a different grid for my radiative transfer code?
Many thanks,
Fabian
-------------- next part --------------
include ${PETSC_DIR}/lib/petsc/conf/variables
include ${PETSC_DIR}/lib/petsc/conf/rules

prism.xmf:: prism.h5
	${PETSC_DIR}/bin/petsc_gen_xdmf.py prism.h5

prism.h5:: plex_prism
	./plex_prism -show_plex ::ascii_info_detail
	./plex_prism -show_plex hdf5:prism.h5

plex_prism:: plex_prism.F90
	${PETSC_FCOMPILE} -c plex_prism.F90
	${FLINKER} plex_prism.o -o plex_prism ${PETSC_LIB}

clean::
	rm -rf *.o prism.h5 prism.xmf plex_prism
-------------- next part --------------
A non-text attachment was scrubbed...
Name: plex_prism.F90
Type: text/x-fortran
Size: 5515 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20170526/92c3cbf5/attachment-0001.bin>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: prism.h5
Type: application/x-hdf
Size: 25400 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20170526/92c3cbf5/attachment-0001.hdf>