[petsc-users] PetscSection and DMPlexVTKWriteAll in parallel

Matthew Knepley knepley at gmail.com
Mon Mar 21 17:22:57 CDT 2022


On Mon, Mar 21, 2022 at 11:22 AM Ferrand, Jesus A. <FERRANJ2 at my.erau.edu>
wrote:

> Greetings.
>
> I am having trouble exporting a vertex-based solution field to ParaView
> when I run my PETSc script in parallel (see screenshots). The smoothly
> changing field is produced by my serial runs whereas the "messed up" one is
> produced by my parallel runs. This is not a calculation bug; rather, it
> concerns the VTK output only (the solution field is the same in parallel
> and serial). I am using DMPlexVTKWriteAll() but will make the switch to
> HDF5 at some point.
>

For output, I would suggest keeping it as simple as possible inside your
code. For example, I would use

  ierr = DMViewFromOptions(dm, NULL, "-dm_view");CHKERRQ(ierr);
  ierr = VecViewFromOptions(sol, NULL, "-sol_view");CHKERRQ(ierr);

as the only output in my program (until you need something more
sophisticated). Then you could use

  -sol_view vtk:sol.vtu

for output, or

  -dm_view hdf5:sol.h5 -sol_view hdf5:sol.h5::append

to get HDF5 output. For HDF5 you run

  ${PETSC_DIR}/lib/petsc/bin/petsc_gen_xdmf.py sol.h5

to get sol.xmf, which can be loaded into ParaView.
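
Putting it together (with ./my_app standing in for your executable, just as an
example), the whole HDF5 workflow would look like

  ./my_app -dm_view hdf5:sol.h5 -sol_view hdf5:sol.h5::append
  ${PETSC_DIR}/lib/petsc/bin/petsc_gen_xdmf.py sol.h5

and then you open sol.xmf in ParaView.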


> Anyway, my suspicion is about the PetscSection and how I am setting it up. I
> call PetscSectionSetChart() with "pStart" and "pEnd" collected from
> DMPlexGetDepthStratum() with "depth" set to zero (for vertices), and then I
> call DMSetLocalSection(). After tinkering with DMPlex routines, I realized
> that DMPlexGetDepthStratum() returns "pStart" and "pEnd" in local numbering
> when the run is in parallel. Thus, I think my serial output is correct
> because in that case the local numbering matches the global numbering.
>
> So, am I correct in believing that the PetscSectionSetChart() call should
> be done with global numbering?
>

No, the reason it is called DMSetLocalSection() is that the section is
explicitly local. Also, even the global section uses local numbering for the
points (global point numbering is never used inside Plex, so that it does not
inhibit scalability).
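
For concreteness, a minimal sketch of a vertex-based local section (assuming
one scalar dof per vertex and a DMPlex called dm) would be

  PetscSection s;
  PetscInt     vStart, vEnd, v;

  ierr = DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd);CHKERRQ(ierr); /* local vertices, ghosts included */
  ierr = PetscSectionCreate(PetscObjectComm((PetscObject) dm), &s);CHKERRQ(ierr);
  ierr = PetscSectionSetChart(s, vStart, vEnd);CHKERRQ(ierr);        /* local numbering is what it expects */
  for (v = vStart; v < vEnd; ++v) {
    ierr = PetscSectionSetDof(s, v, 1);CHKERRQ(ierr);
  }
  ierr = PetscSectionSetUp(s);CHKERRQ(ierr);
  ierr = DMSetLocalSection(dm, s);CHKERRQ(ierr);
  ierr = PetscSectionDestroy(&s);CHKERRQ(ierr);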


> Also, I noticed that the parallel DMPlex counts ghost vertices towards
> "pStart" and "pEnd". So, when I set the chart in the local PetscSection,
> should I figure out the chart for the owned vertices only, or can PETSc sort
> out the ghost/owned split when the local PetscSections have overlapping
> charts?
>

You do not have to. PETSc will do that automatically.
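The global section (and with it the ghost/owned split) is built from the local
section and the point SF. As a sketch (assuming the local section above has
already been set on dm), the global vector holds only the owned dofs while the
local vector also holds the ghosts:

  Vec locSol, globSol;

  ierr = DMCreateLocalVector(dm, &locSol);CHKERRQ(ierr);   /* includes ghost vertices */
  ierr = DMCreateGlobalVector(dm, &globSol);CHKERRQ(ierr); /* owned dofs only */
  /* ... fill locSol ... */
  ierr = DMLocalToGlobalBegin(dm, locSol, INSERT_VALUES, globSol);CHKERRQ(ierr);
  ierr = DMLocalToGlobalEnd(dm, locSol, INSERT_VALUES, globSol);CHKERRQ(ierr);
  ierr = VecViewFromOptions(globSol, NULL, "-sol_view");CHKERRQ(ierr);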

  Thanks,

     Matt


> Sincerely:
>
> J.A. Ferrand
>
> Embry-Riddle Aeronautical University - Daytona Beach FL
>
> M.Sc. Aerospace Engineering | May 2022
>
> B.Sc. Aerospace Engineering
>
> B.Sc. Computational Mathematics
>
>
>
> Sigma Gamma Tau
>
> Tau Beta Pi
>
>
>
> Phone: (386)-843-1829
>
> Email(s): ferranj2 at my.erau.edu
>
>     jesus.ferrand at gmail.com
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/