[petsc-dev] DMplex: Natural Ordering and subDM
Jed Brown
jed at jedbrown.org
Tue Nov 28 00:39:25 CST 2017
Blaise A Bourdin <bourdin at lsu.edu> writes:
> There may be good reasons to want to read / write in a given ordering: post-processing an existing computation, applying non trivial boundary conditions that need to be computed separately, or restarting a computation on a different number of processors.
> Also, Exodus has restrictions on cell ordering (cells in an element block must be numbered sequentially), so the distributed cell ordering may not be acceptable.
Does PETSc have the ability to write Exodus files? Yes, if this is a
requirement, then that viewer needs to reorder in some way when writing.
For concreteness, what postprocessors are you thinking about here that
need an ordering that matches that of the input file? (I ask because I
usually see the simulation writing a different format from the input
file, and often viewed using different tools.)
> Exodus 6 introduced element sets, which are free of this limitation, but as far as I know, no mesh generator or post-processing tool deals with element sets.
>
> I ended up inverting the migration SF, broadcasting the distributed section back to the original mesh, then generating SFnatural using DMPlexCreateGlobalToNaturalSF. It is ugly, but done only once.
>
> Still on Exodus: I now have parallel I/O (MPI-IO through parallel netCDF) working for nodal fields (linear and quadratic Lagrange elements) and zonal fields (in Exodus jargon), in both natural and standard ordering. I can have a pull request with documented examples and tests ready in a few days.
Cool!
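For readers following along, the workaround Blaise describes (ending with DMPlexCreateGlobalToNaturalSF) can be sketched roughly as follows. This is an untested sketch against the PETSc C API of this era; it assumes `dmDist` and `sfMigration` came from a prior `DMPlexDistribute()` call, and the function name `AttachNaturalSF` is illustrative, not part of PETSc:

```c
#include <petscdmplex.h>

/* Sketch (assumptions above): attach a global-to-natural SF to a
 * distributed DMPlex so that global vectors can later be permuted back
 * to the original (file) ordering. sfMigration is assumed to be the
 * migration SF returned by DMPlexDistribute(). */
static PetscErrorCode AttachNaturalSF(DM dmDist, PetscSF sfMigration)
{
  PetscSection   section;
  PetscSF        sfNatural;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /* Section describing the field layout on the distributed mesh */
  ierr = DMGetDefaultSection(dmDist, &section);CHKERRQ(ierr);
  /* Build the global-to-natural SF from the migration SF */
  ierr = DMPlexCreateGlobalToNaturalSF(dmDist, section, sfMigration, &sfNatural);CHKERRQ(ierr);
  /* Attach it so viewers can use it */
  ierr = DMPlexSetGlobalToNaturalSF(dmDist, sfNatural);CHKERRQ(ierr);
  ierr = PetscSFDestroy(&sfNatural);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}
```

Once the SF is attached, DMPlexGlobalToNaturalBegin/End can map a global vector into natural ordering before it is written out.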