[petsc-users] [EXTERNAL] Re: What exactly is the GlobalToNatural PetscSF of DMPlex/DM?

Matthew Knepley knepley at gmail.com
Sat Jul 13 15:51:21 CDT 2024


On Sat, Jul 13, 2024 at 4:39 PM Ferrand, Jesus A. <FERRANJ2 at my.erau.edu>
wrote:

> Matt:
>
> Thank you for the reply.
> The bulk of it makes a lot of sense.
> Yes! The need to keep track of the original mesh numbers (AKA "natural")
> is what I find most pressing for my research group.
> Awesome! I had been keeping track of these numbers separately with a
> PetscSection that I fed to DMSetLocalSection(), but on the coordinate DM,
> not the plex.
> It is good to know the "correct" way to do it.
>
> "What is repetitive? It should be able to be automated."
>
> Absolutely, since the underlying process is common across mesh formats.
> What I meant by "repetitive" is the information that gets reused by
> different API calls (namely, global stratum sizes and the local point
> numbers of owned DAG points).
> I need to define a struct to bookkeep this. It's not really an issue,
> rather a minor annoyance (for me).
> I need the stratum sizes to offset the DMPlex numbering, where cells lie
> in [0, nCell) and vertices in [nCell, nCell+nVert), into other mesh
> numberings where cells range over [1, nCell] and vertices over [1, nVert]
> (see the small sketch after the snippet below). In my experience, this
> information is needed at least three times: during coordinate writes,
> during element connectivity writes, and during DMLabel writes for BCs and
> other labelled data.
>

This is a good point, and I think it supports my argument that these formats
are insane. What you point out below is that the format demands
a completely artificial division of points when writing. I don't do this
when writing HDF5. This division can be recovered in linear time, completely
locally, after a read, so I think by any metric it is crazy to put it in the
file. However, I recognize that supporting previous formats is a good
thing, so I do not complain too loudly :)
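
For instance, a minimal sketch (assuming an interpolated DMPlex dm that has
just been read) of recovering that division locally:

  PetscInt cStart, cEnd, vStart, vEnd;

  /* Cells are the height-0 points, vertices the depth-0 points; both queries are purely local. */
  PetscCall(DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd));
  PetscCall(DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd));
  /* Any per-format split (cells first, then vertices, 1-based, ...) can be rebuilt from these ranges. */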

  Thanks,

     Matt


> This information I determine using a code snippet like this:
>
>  MPI_Comm        mpiComm;
>  PetscMPIInt     mpiRank, mpiCommSize;
>  IS              GlobalNumberIS;
>  const PetscInt *IdxPtr;
>  PetscInt        Depth, pStart, pEnd, kk;
>  PetscInt      **LocalIdxPtrPtr, *pOwnedPtr, *GlobalStratumSizePtr;
>
>  PetscCall(PetscObjectGetComm((PetscObject)plex, &mpiComm));
>  PetscCallMPI(MPI_Comm_rank(mpiComm, &mpiRank));
>  PetscCallMPI(MPI_Comm_size(mpiComm, &mpiCommSize));
>  /* Global point numbering: owned points get their global index, unowned points get -(global + 1). */
>  PetscCall(DMPlexCreatePointNumbering(plex, &GlobalNumberIS));
>  PetscCall(ISGetIndices(GlobalNumberIS, &IdxPtr));
>  PetscCall(DMPlexGetDepth(plex, &Depth));
>  /* Strata run 0..Depth inclusive (vertices up through cells), hence Depth + 1 entries. */
>  PetscCall(PetscMalloc3(
>    Depth + 1, &LocalIdxPtrPtr,       /* Indices in the local stratum of owned points.         */
>    Depth + 1, &pOwnedPtr,            /* Number of points in the local stratum that are owned. */
>    Depth + 1, &GlobalStratumSizePtr  /* Global stratum size.                                  */
>  ));
>  for (PetscInt jj = 0; jj <= Depth; jj++) {
>    PetscCall(DMPlexGetDepthStratum(plex, jj, &pStart, &pEnd));
>    pOwnedPtr[jj] = 0;
>    for (PetscInt ii = pStart; ii < pEnd; ii++) {
>      if (IdxPtr[ii] >= 0) pOwnedPtr[jj]++;  /* A non-negative index means this rank owns the point. */
>    }
>    /* Sum the owned counts over all ranks to get the global size of this stratum. */
>    PetscCallMPI(MPI_Allreduce(&pOwnedPtr[jj], &GlobalStratumSizePtr[jj], 1, MPIU_INT, MPI_SUM, mpiComm));
>    PetscCall(PetscMalloc1(pOwnedPtr[jj], &LocalIdxPtrPtr[jj]));
>    kk = 0;
>    for (PetscInt ii = pStart; ii < pEnd; ii++) {
>      if (IdxPtr[ii] >= 0) {
>        LocalIdxPtrPtr[jj][kk] = ii;  /* Record the local point numbers of owned points, in order. */
>        kk++;
>      }
>    }
>  }
>  PetscCall(ISRestoreIndices(GlobalNumberIS, &IdxPtr));
>  PetscCall(ISDestroy(&GlobalNumberIS));
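>
> A rough sketch of the offsetting I described above (illustrative names only:
> plexCell and plexVert are DMPlex point numbers, nCell and nVert the global
> counts):
>
>  /* DMPlex: cells occupy [0, nCell), vertices occupy [nCell, nCell + nVert). */
>  PetscInt fmtCell = plexCell + 1;          /* [0, nCell)             -> [1, nCell] */
>  PetscInt fmtVert = plexVert - nCell + 1;  /* [nCell, nCell + nVert) -> [1, nVert] */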
> ------------------------------
> *From:* Matthew Knepley <knepley at gmail.com>
> *Sent:* Thursday, July 11, 2024 8:32 PM
> *To:* Ferrand, Jesus A. <FERRANJ2 at my.erau.edu>
> *Cc:* petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
> *Subject:* [EXTERNAL] Re: [petsc-users] What exactly is the
> GlobalToNatural PetscSF of DMPlex/DM?
>
>
> On Mon, Jul 8, 2024 at 10:28 PM Ferrand, Jesus A. <FERRANJ2 at my.erau.edu>
> wrote:
>
>
> Dear PETSc team:
>
> Greetings.
> I keep working on mesh I/O utilities using DMPlex.
> Specifically, for the output stage I need a solid grasp of the global
> numbers: ideally, how to set them into the DMPlex during an input
> operation and then carry them through calls to DMPlexDistribute() or
> DMPlexMigrate(), and hopefully also through some of the mesh adaptation
> APIs. I was wondering whether the GlobalToNatural PetscSF manages these
> global numbers. The next most useful object is the PointSF, but to me it
> seems to only establish DAG point ownership, not DAG point global
> indices.
>
>
> This is a good question, and gets at a design point of Plex. I don't
> believe global numbers are the "right" way to talk about mesh points, or
> even a very useful way to do it, for several reasons. Plex is designed to
> run just fine without any global numbers. It can, of course, produce
> them on command, as many people remain committed to their existence.
>
> Thus, the first idea is that global numbers should not be stored, since
> they can always be created on command very cheaply. It is much more
> costly to write global numbers to disk, or pull them through memory, than
> compute them.
>
> The second idea is that we use a combination of local numbers, namely
> (rank, point num) pairs, and PetscSF objects to establish sharing relations
> for parallel meshes. Global numbering is a particular traversal of a mesh,
> running over the locally owned parts of each mesh in local order. Thus an
> SF + a local order = a global order, and the local order is provided by the
> point numbering.
>
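> (A minimal sketch of that recipe, assuming a distributed DMPlex dm; this is
> roughly what DMPlexCreatePointNumbering() already does for you, so it is for
> illustration only:)
>
>   MPI_Comm  comm;
>   PetscSF   sf;
>   PetscInt  nRoots, nLeaves, nOwned, offset = 0;
>
>   PetscCall(PetscObjectGetComm((PetscObject)dm, &comm));
>   PetscCall(DMGetPointSF(dm, &sf));
>   PetscCall(PetscSFGetGraph(sf, &nRoots, &nLeaves, NULL, NULL));
>   nOwned = nRoots - nLeaves;                      /* SF leaves are the points owned elsewhere */
>   PetscCallMPI(MPI_Scan(&nOwned, &offset, 1, MPIU_INT, MPI_SUM, comm));
>   offset -= nOwned;                               /* exclusive prefix sum over the ranks      */
>   /* Owned points then get global numbers offset, offset + 1, ... in local point order. */
>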
> The third idea is that a "natural" order is just the global order in which
> a mesh is first fed to Plex. When I redistribute and reorder for good
> performance, I keep track of a PetscSF that can map the mesh back to the
> original order in which it was provided. I see this as an unneeded expense,
> but many many people want output written in the original order (mostly
> because processing tools are so poor). This management is what we mean by
> GlobalToNatural.
>
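> (In code the setup looks roughly like this; creating the vectors and sections
> is omitted, and gvec/nvec are just illustrative names for a global-order and a
> natural-order vector:)
>
>   DM dmDist;
>
>   PetscCall(DMSetUseNatural(dm, PETSC_TRUE));        /* must precede distribution        */
>   PetscCall(DMPlexDistribute(dm, 0, NULL, &dmDist)); /* records the global-to-natural SF */
>   /* ... set up sections, gvec, nvec ... */
>   PetscCall(DMPlexGlobalToNaturalBegin(dmDist, gvec, nvec));
>   PetscCall(DMPlexGlobalToNaturalEnd(dmDist, gvec, nvec));
>   /* nvec now holds the data in the order the mesh was originally provided. */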
>
> Otherwise, I have been working with the IS obtained from
> DMPlexCreatePointNumbering() and manually determining global stratum
> sizes, offsets, and numbers by looking at the signs of the involuted index
> list inside that IS. It is working for now (I can monolithically write
> meshes to CGNS in parallel), but it is producing repetitive code that I
> will also need for another mesh format I want to support.
>
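> (For reference, decoding that involuted list looks like this, with IdxPtr the
> raw indices of the IS and p a point number: a non-negative entry is an owned
> point's global number, and a negative entry encodes the global number of a
> point owned elsewhere as -(value + 1).)
>
>  PetscInt  gIdx  = (IdxPtr[p] >= 0) ? IdxPtr[p] : -(IdxPtr[p] + 1); /* global number either way    */
>  PetscBool owned = (PetscBool)(IdxPtr[p] >= 0);                     /* does this rank own point p? */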
>
> What is repetitive? It should be able to be automated.
>
>   Thanks,
>
>     Matt
>
>
> Sincerely:
>
> *J.A. Ferrand*
>
> Embry-Riddle Aeronautical University - Daytona Beach - FL
> Ph.D. Candidate, Aerospace Engineering
>
> M.Sc. Aerospace Engineering
>
> B.Sc. Aerospace Engineering
>
> B.Sc. Computational Mathematics
>
>
> *Phone:* (386)-843-1829
>
> *Email(s):* ferranj2 at my.erau.edu
>
>     jesus.ferrand at gmail.com
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/