[petsc-users] node DG with DMPlex

Matthew Knepley knepley at gmail.com
Thu Mar 26 17:38:33 CDT 2020


On Thu, Mar 26, 2020 at 5:14 PM Yann Jobic <yann.jobic at univ-amu.fr> wrote:

> Hi Matt,
>
> On 3/23/2020 at 2:24 PM, Matthew Knepley wrote:
> > On Wed, Mar 18, 2020 at 12:58 PM Yann Jobic <yann.jobic at univ-amu.fr
> > <mailto:yann.jobic at univ-amu.fr>> wrote:
> >
> >     Hi Matt,
> >
> >     On 3/17/2020 at 4:00 PM, Matthew Knepley wrote:
> >      > On Mon, Mar 16, 2020 at 5:20 PM Yann Jobic
> >     <yann.jobic at univ-amu.fr <mailto:yann.jobic at univ-amu.fr>
> >      > <mailto:yann.jobic at univ-amu.fr <mailto:yann.jobic at univ-amu.fr>>>
> >     wrote:
> >      >
> >      >     Hi all,
> >      >
> >      >     I would like to implement a nodal DG with the DMPlex
> interface.
> >      >     Therefore, I must add the internal nodes to the DM (GLL
> >     nodes), with
> >      >     the constraints:
> >      >     1) Add them as solution points, with correct coordinates (and
> >     keep the
> >      >     correct rotational ordering)
> >      >     2) Find the shared nodes at faces in order to compute the
> fluxes
> >      >     3) For parallel use, synchronize the ghost nodes at each
> >     time step
> >      >
> >      >
> >      > Let me get the fundamentals straight before advising, since I
> >     have never
> >      > implemented nodal DG.
> >      >
> >      >    1) What is shared?
> >     I need to duplicate an edge in 2D, or a facet in 3D, and to sync it
> >     after a time step, in order to compute the numerical fluxes
> >     (Lax-Friedrichs at the beginning).
> >
> >
> > I should have been more specific, but I think I see what you want. You
> > do not "share" unknowns between cells,
> > so all unknowns should be associated with some cell in the Section.
> >
> > You think of some cell unknowns as being "connected" to a face, so when
> > you want to calculate a flux, you need
> > the unknowns from the adjacent cell in order to do it. In order to do
> > this, I would partition with overlap=1, which
> > is what we do for finite volume, which has the same adjacency needs. You
> > might also set
> >
> >
> https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMSetAdjacency.html
> > to PETSC_TRUE, PETSC_FALSE, but you are probably doing everything
> > matrix-free if you are using DG.
> > The above is optimal for FV, but not for DG because you communicate more
> > than you absolutely have to.
> >
> > A more complicated, but optimal, thing to do would be to assign interior
> > dofs to the cell, and two sets of dofs
> > to each face, one for each cell. Then you only communicate the face
> > dofs. It's just more bookkeeping for you,
> > but it will work in parallel just fine.
> I'm going this way.
> So I should use dm/impls/plex/examples/tutorials/ex1.c as a reference.
> I should define the internal nodes on cells:
> I define the section with 3 fields (field 0 on cells, fields 1 and 2 on
> faces), as:
> numComp[0] = Nr; /* Total number of dof per Cell */
> numDof[0*(dim+1)+0] = dim; /* defined over the Cell */
> And on the same section, the dofs at faces :
> numComp[1] = NumDofPerFace;
> numComp[2] = NumDofPerFace;
> numDof[1*(dim+1)+dim-1] = dim-1; /* internal dof of the cell */
> numDof[2*(dim+1)+dim-1] = dim-1; /* external dof of the cell */
>
> Is this a good way to create my section?
>

I would put them all in one field. A "field" is supposed to be a physical
thing, like velocity or pressure.


> Thus the data is duplicated at the faces, which means that I have to
> sync the internal face dofs with their corresponding values from
> the cell interior.
>

That is not how I was envisioning it. Perhaps a drawing will help. Suppose
you have DG2; then you have

         17          18
   7-----8-----9-----14----15
   |           |           |
 16,3    4    5,6    11  12,13
   |           |           |
   1-----2-----3-----9-----10
         19          20

so each face gets 2 dofs, one for each cell. When doing a cell integral, you
only use the dof that is for that cell.
The local-to-global map would update the face dofs, so you would get each side.

There is a reordering when you extract the closure. I have written one for
spectral elements. We would need
another here that orders all the "other" face dofs to the end.

This seems a little complicated to me. Do you know how Andreas Klockner
does it in Hedge? Or Tim Warburton?
I just want to make sure I am not missing an elegant way to handle this.


> Here field 1 is synchronised with field 0, locally.
> But the external face dofs, field 2, have to be synchronised with the
> values of the adjacent cell.
> Is it possible to use something like DMPlexGetFaceFields?
> Is there an example of such a use of PetscSection and the synchronisation
> process?
>
> For the parallel part, should i use PetscSF object ?
>

In parallel, integrals would be summed into the global vector, so each side
has a 0 for the other face dof and the right contribution
for its face dof. Then both sides get both solution dofs. It seems to work
in my head.


> I read your article "Mesh Algorithms for PDE with Sieve I: Mesh
> Distribution". But it's referring to: Matthew G. Knepley and Dmitry A.
> Karpeev. Sieve implementation. Technical Report ANL/MCS to appear,
> Argonne National Laboratory, January 2008.
> I couldn't find it. Is it freely available?
>

Don't bother reading that. There are two later sources that are pretty good:

  https://arxiv.org/abs/1505.04633
  https://arxiv.org/abs/1506.06194

The last one is a follow-on to this paper:

  https://arxiv.org/abs/0908.4427

  Thanks,

     Matt


> >
> > I don't think you need extra vertices or coordinates, and for output I
> > recommend using DMPlexProject() to get
> > the solution in some space that can be plotted, like P1, or anything else
> > supported by your visualization.
>
> I would like to use DMPlex as much as I can, as I would like to locally
> refine the mesh in the future.
>
> I hope you're doing well in this difficult situation (Covid-19),
>
> Best regards,
>
> Yann
>
> >
> >    Thanks,
> >
> >       Matt
> >
> >      >
> >      >        We have an implementation of spectral element ordering
> >      >
> >     (
> https://gitlab.com/petsc/petsc/-/blob/master/src/dm/impls/plex/examples/tutorials/ex6.c
> ).
> >
> >      > Those share
> >      >        the whole element boundary.
> >      >
> >      >    2) What ghosts do you need?
> >     In order to compute the numerical fluxes of one element, I need the
> >     values of the surrounding nodes connected to the adjacent elements.
> >      >
> >      >    3) You want to store real space coordinates for a quadrature?
> >     It should be basically the same as PetscFE of higher order.
> >     I add the vertices needed to compute a polynomial solution of the
> >     desired order. That means that if I have N, the order of the local
> >     approximation, I need 0.5*(N+1)*(N+2) vertices to store in the
> >     DMPlex (in 2D), in order to:
> >     1) have the correct number of dofs
> >     2) use ghost nodes to sync the values of the vertices/edges/facets
> >     for 1D/2D/3D problems
> >     3) save the solution correctly
> >
> >     Does it make sense to you ?
> >
> >     Maybe like
> >
> https://www.mcs.anl.gov/petsc/petsc-current/src/ts/examples/tutorials/ex11.c.html
> >     with the use of the function SplitFaces, which I haven't fully
> >     understood so far.
> >
> >     Thanks,
> >
> >     Yann
> >
> >      >
> >      >        We usually define a quadrature on the reference element
> once.
> >      >
> >      >    Thanks,
> >      >
> >      >      Matt
> >      >
> >      >     I found elements of answers in these threads:
> >      >
> >
> https://lists.mcs.anl.gov/pipermail/petsc-users/2016-August/029985.html
> >      >
> >
> https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2019-October/039581.html
> >      >
> >      >     However, it's not clear to me where to begin.
> >      >
> >      >     Quoting Matt, I should:
> >      >     "  DMGetCoordinateDM(dm, &cdm);
> >      >         <Set field information into cdm instead of dm>
> >      >        DMCreateLocalVector(cdm, &coordinatesLocal);
> >      >        <Fill in higher order coordinate values>
> >      >        DMSetCoordinatesLocal(dm, coordinatesLocal);"
> >      >
> >      >     However, I will not create ghost nodes this way. And I'm not
> >     sure it
> >      >     keeps the correct ordering.
> >      >     This part should be implemented in the PetscFE interface, for
> >     high
> >      >     order
> >      >     discrete solutions.
> >      >     I did not succeed in finding the part of the source
> >     that does it.
> >      >
> >      >     Could you please give me some hints to begin those
> >     tasks correctly?
> >      >
> >      >     Thanks,
> >      >
> >      >     Yann
> >      >
> >      >
> >      >
> >
> >
> >
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ <http://www.cse.buffalo.edu/~knepley/>

