[petsc-dev] getting ready for PETSc release

Blaise A Bourdin bourdin at lsu.edu
Mon Jun 9 22:16:03 CDT 2014


Hi,

I agree with Barry that since this will not mean any interface changes, it can be done with patch releases.

Sounds like a plan



I still can’t figure out how to deal with cell sets of different types (say, quads and tris).

I have not tried this. I had been focusing on getting higher order (Q2) and either quads or tris to work. This
all seems to be working correctly now. I have been using it to visualize/restart a magma dynamics app
with three fields, all with different discretizations.

Nice.


All cell and vertex sets seem to be contained in the HDF5 file, but not in a way that is usable by post-processing tools (VisIt, ParaView, EnSight).

For restarting, mixed meshes should work fine. They are stored in

  /topology/cells
           /cones
           /order
           /orientation

and the field values are in

  /fields/<name>

For visualization and most post-processing, there are separate arrays

  /viz/topology
      /geometry

which are exactly what I needed to make ParaView understand the XDMF. The fields
sampled down to cells and vertices are in

  /vertex_fields/<name>
  /cell_fields/<name>
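[For reference, a minimal sketch of the write path that produces /fields/<name>: give the Vec a name and VecView it into the HDF5 viewer. The helper, the field name "pressure", and the assumption that the Vec comes from a DMPlex and that the viewer is already open are all illustrative and not taken from the attached code.]

  #include <petscdmplex.h>
  #include <petscviewerhdf5.h>

  /* Hypothetical helper: write one field Vec into an already-open HDF5
     viewer so that, as described above, it lands under /fields/<name>
     (with sampled copies under /vertex_fields and /cell_fields).
     The field name "pressure" is just an example. */
  PetscErrorCode WriteFieldForRestart(Vec pressure, PetscViewer hdf5)
  {
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = PetscObjectSetName((PetscObject) pressure, "pressure");CHKERRQ(ierr);
    ierr = VecView(pressure, hdf5);CHKERRQ(ierr);   /* becomes /fields/pressure */
    PetscFunctionReturn(0);
  }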

I saw that. Does it mean that all heavy data is duplicated in the HDF5 file, i.e. that /fields contains values that PETSc understands and /viz is for visualization?


The XDMF generation script bin/pythonscripts/petsc_gen_xdmf.py is quite fragile.

I have not had it fail for me, but would be happy to look at the failure you are getting.

Cool.

I am attaching a very simple program that reads an Exodus file and saves it, along with two simple Exodus meshes.
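[The attached testHDF5Plex.c is not reproduced in the archive; below is a minimal sketch of what such a read-and-save program presumably looks like. The mesh file name matches the attachment listed at the end, the output file name is invented, and none of this is the actual attachment.]

  #include <petscdmplex.h>
  #include <petscviewerhdf5.h>

  int main(int argc, char **argv)
  {
    DM             dm;
    PetscViewer    viewer;
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
    /* Read the Exodus mesh; PETSC_TRUE interpolates it (builds edges/faces),
       which, per the note below, petsc_gen_xdmf.py needs. */
    ierr = DMPlexCreateExodusFromFile(PETSC_COMM_WORLD, "TwoSquares.gen", PETSC_TRUE, &dm);CHKERRQ(ierr);
    /* Save the mesh (topology, /viz arrays, coordinates) to HDF5. */
    ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "TwoSquares.h5", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
    ierr = DMView(dm, viewer);CHKERRQ(ierr);
    ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
    ierr = DMDestroy(&dm);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }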

The XDMF generation script fails on it, most likely because the file has no /time section. My workaround is to replace line 209 with time = [0,1].

When I read the generated .xmf file in VisIt, I see only one cell set. In ParaView, I see two blocks, "domain" and "domain[1]", both of which contain the entire mesh.

If I do not interpolate the mesh in DMPlexCreateExodusFromFile, petsc_gen_xdmf.py fails.

DMView fails in parallel. I must be doing something wrong.
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Argument out of range
[0]PETSC ERROR: Point 8 has 0 constraints > -3 dof
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.4.4-4444-g8c25fe2  GIT Date: 2014-06-07 16:01:56 -0500
[0]PETSC ERROR: ./testHDF5Plex on a Darwin-intel14.0-g named iMac.local by blaise Mon Jun  9 22:14:29 2014
[0]PETSC ERROR: Configure options CFLAGS= CXXFLAGS= LDFLAGS=-Wl,-no_pie --download-chaco=1 --download-exodusii=1 --download-hdf5=1 --download-metis=1 --download-netcdf=1 --download-parmetis=1 --download-sowing=1 --download-triangle=1 --download-yaml=1 --with-blas-lapack-dir=/opt/intel/composerxe/mkl --with-cmake=cmake --with-debugging=1 --with-mpi-dir=/opt/HPC/mpich-3.0.4-intel14.0 --with-pic --with-shared-libraries=1 --with-vendor-compilers=intel --with-x11=1
[0]PETSC ERROR: #1 DMCreateDefaultSF() line 3065 in /opt/HPC/petsc-dev/src/dm/interface/dm.c
[0]PETSC ERROR: #2 DMGetDefaultSF() line 2985 in /opt/HPC/petsc-dev/src/dm/interface/dm.c
[0]PETSC ERROR: #3 DMLocalToGlobalBegin() line 1737 in /opt/HPC/petsc-dev/src/dm/interface/dm.c
[0]PETSC ERROR: #4 VecView_Plex_Local_HDF5() line 122 in /opt/HPC/petsc-dev/src/dm/impls/plex/plexhdf5.c
[0]PETSC ERROR: #5 VecView_Plex_Local() line 86 in /opt/HPC/petsc-dev/src/dm/impls/plex/plex.c
[0]PETSC ERROR: #6 VecView() line 601 in /opt/HPC/petsc-dev/src/vec/vec/interface/vector.c
[0]PETSC ERROR: #7 DMPlexWriteCoordinates_HDF5_Static() line 396 in /opt/HPC/petsc-dev/src/dm/impls/plex/plexhdf5.c
[0]PETSC ERROR: #8 DMPlexView_HDF5() line 485 in /opt/HPC/petsc-dev/src/dm/impls/plex/plexhdf5.c
[0]PETSC ERROR: #9 DMView_Plex() line 450 in /opt/HPC/petsc-dev/src/dm/impls/plex/plex.c
[0]PETSC ERROR: #10 DMView() line 648 in /opt/HPC/petsc-dev/src/dm/interface/dm.c
[0]PETSC ERROR: #11 main() line 79 in /Users/blaise/Development/DMComplex/testHDF5Plex.c
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------


Finally, DMView fails on the mesh with mixed element types, with the following error message:
Writing to TwoSquaresMixed_seq.h5
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: No support for this operation for this object type
[0]PETSC ERROR: Visualization topology currently only supports identical cell shapes
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.4.4-4444-g8c25fe2  GIT Date: 2014-06-07 16:01:56 -0500
[0]PETSC ERROR: ./testHDF5Plex on a Darwin-intel14.0-g named iMac.local by blaise Mon Jun  9 22:11:48 2014
[0]PETSC ERROR: Configure options CFLAGS= CXXFLAGS= LDFLAGS=-Wl,-no_pie --download-chaco=1 --download-exodusii=1 --download-hdf5=1 --download-metis=1 --download-netcdf=1 --download-parmetis=1 --download-sowing=1 --download-triangle=1 --download-yaml=1 --with-blas-lapack-dir=/opt/intel/composerxe/mkl --with-cmake=cmake --with-debugging=1 --with-mpi-dir=/opt/HPC/mpich-3.0.4-intel14.0 --with-pic --with-shared-libraries=1 --with-vendor-compilers=intel --with-x11=1
[0]PETSC ERROR: #1 DMPlexWriteTopology_Vertices_HDF5_Static() line 322 in /opt/HPC/petsc-dev/src/dm/impls/plex/plexhdf5.c
[0]PETSC ERROR: #2 DMPlexView_HDF5() line 488 in /opt/HPC/petsc-dev/src/dm/impls/plex/plexhdf5.c
[0]PETSC ERROR: #3 DMView_Plex() line 450 in /opt/HPC/petsc-dev/src/dm/impls/plex/plex.c
[0]PETSC ERROR: #4 main() line 59 in /Users/blaise/Development/DMComplex/testHDF5Plex.c
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------

-------------- next part --------------
A non-text attachment was scrubbed...
Name: testHDF5Plex.c
Type: application/octet-stream
Size: 3994 bytes
Desc: testHDF5Plex.c
URL: <http://lists.mcs.anl.gov/pipermail/petsc-dev/attachments/20140610/b19a8eb3/attachment.obj>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: TwoSquares.gen
Type: application/octet-stream
Size: 2672 bytes
Desc: TwoSquares.gen
URL: <http://lists.mcs.anl.gov/pipermail/petsc-dev/attachments/20140610/b19a8eb3/attachment-0001.obj>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: TwoSquaresMixed.gen
Type: application/octet-stream
Size: 2596 bytes
Desc: TwoSquaresMixed.gen
URL: <http://lists.mcs.anl.gov/pipermail/petsc-dev/attachments/20140610/b19a8eb3/attachment-0002.obj>