[petsc-users] Petsc Section in DMPlex

Matthew Knepley knepley at gmail.com
Mon Nov 28 05:18:36 CST 2022


On Sun, Nov 27, 2022 at 10:22 PM Nicholas Arnold-Medabalimi <
narnoldm at umich.edu> wrote:

> Hi Petsc Users
>
> I have a question about properly using PetscSection to assign state
> variables to a DM. I have an existing DMPlex mesh distributed on 2
> processors. My goal is to have state variables set at the cell centers. I
> then want to call DMPlexDistribute, which I hope will balance the mesh
> elements and transport the state variables to their new hosting processors,
> whether the cells are being distributed to a different processor count or
> simply redistributed after mesh adaptation.
>
> Looking at the DMPlex users guide, I should be able to achieve this with a
> single-field section, using SetDof to assign the DOFs to the points
> corresponding to cells.
>

Note that if you want several different fields, you can clone the DM
first for this field,

  call DMClone(dm,dmState,ierr)

and use dmState in your calls below.


>
> call DMPlexGetHeightStratum(dm,0,c0,c1,ierr)
> call DMPlexGetChart(dm,p0,p1,ierr)
> call PetscSectionCreate(PETSC_COMM_WORLD,section,ierr)
> call PetscSectionSetNumFields(section,1,ierr)
> call PetscSectionSetChart(section,p0,p1,ierr)
> do i = c0, (c1-1)
>     call PetscSectionSetDof(section,i,nvar,ierr)
> end do
> call PetscSectionSetUp(section,ierr)
> call DMSetLocalSection(dm,section,ierr)
>

In the loop, I would add a call to

  call PetscSectionSetFieldDof(section,i,0,nvar,ierr)

This also records the field breakdown. It is not essential, but it is nicer.
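
In C, the combined loop would look something like this (a sketch reusing
the c0/c1 cell range and nvar from your code above):

  PetscInt c;

  for (c = c0; c < c1; ++c) {
    PetscSectionSetDof(section, c, nvar);         /* total dofs on the cell */
    PetscSectionSetFieldDof(section, c, 0, nvar); /* all of them in field 0 */
  }
  PetscSectionSetUp(section);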


> From here, it looks like I can access and set the state vars using
>
> call DMGetGlobalVector(dmplex,state,ierr)
> call DMGetGlobalSection(dmplex,section,ierr)
> call VecGetArrayF90(state,stateVec,ierr)
> do i = c0, (c1-1)
>     call PetscSectionGetOffset(section,i,offset,ierr)
>     stateVec((offset+1):(offset+nvar))=state_i(:) !simplified assignment
> end do
> call VecRestoreArrayF90(state,stateVec,ierr)
> call DMRestoreGlobalVector(dmplex,state,ierr)
>
> To my understanding, I should be using a global vector since this is a pure
> assignment operation and I don't need the ghost cells.
>

Yes.

> But the behavior I am seeing isn't exactly what I'd expect.
>
> To be honest, I'm somewhat unclear on a few things
>
> 1) Should I be using nvar fields with 1 DOF each, or 1 field with nvar
> DOFs? What is the distinction between the two methods?
>

We have two divisions in a Section. A field can have a number of
components; this is intended to model a vector or tensor field. Then a
Section can have a number of fields, such as velocity and pressure for a
Stokes problem. The division is mainly there to help the user, so I would
use whichever is most natural.
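
As an illustration, a minimal sketch (not from this thread) of a Section
with two fields, a vector velocity and a scalar pressure; the field names
and "dim" here are assumptions:

  PetscSection sec;
  PetscInt     dim = 3; /* spatial dimension, assumed */

  PetscSectionCreate(PETSC_COMM_WORLD, &sec);
  PetscSectionSetNumFields(sec, 2);
  PetscSectionSetFieldName(sec, 0, "velocity");
  PetscSectionSetFieldComponents(sec, 0, dim); /* vector field: dim components */
  PetscSectionSetFieldName(sec, 1, "pressure");
  PetscSectionSetFieldComponents(sec, 1, 1);   /* scalar field: one component  */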


> 2) Adding a print statement after the offset assignment, I get (on rank 0
> of 2)
>
>  cell           1 offset           0
>  cell           2 offset          18
>  cell           3 offset          36
>
> which is expected and works, but on rank 1 I get
>
>  cell           1 offset        9000
>  cell           2 offset        9018
>  cell           3 offset        9036
>
> which isn't exactly what I would expect. Shouldn't the offsets reset to 0
> for the next rank?
>

The local and global sections hold different information; this is the
source of the confusion. The local section describes a local vector, and
thus includes overlap or "ghost" dofs. The global section describes a
global vector. However, it is intended to deliver global indices, and thus
the offsets it returns are global indices. When you use VecGetArray*() you
are getting out the local array, and thus you have to subtract the first
index owned by this process. You can get that from

  VecGetOwnershipRange(v, &rStart, &rEnd);

This is the same whether you are using DMDA or DMPlex or any other DM.
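
Concretely, a minimal sketch of the subtraction in C (assuming state and
section are the global vector and global section from the code above,
c0/c1 the cell range, and nvar dofs per cell):

  PetscScalar *a;
  PetscInt     rStart, rEnd, off, c, i;

  VecGetOwnershipRange(state, &rStart, &rEnd);
  VecGetArray(state, &a);
  for (c = c0; c < c1; ++c) {
    PetscSectionGetOffset(section, c, &off);  /* off is a *global* index */
    for (i = 0; i < nvar; ++i) a[off - rStart + i] = 0.0; /* your cell values */
  }
  VecRestoreArray(state, &a);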


> 3) Does calling DMPlexDistribute also distribute the section data
> associated with the DOFs? Based on the description of DMPlexDistribute, it
> looks like it should.
>

No. By default, DMPlexDistribute() only distributes coordinate data. If you
want to distribute your field, it would look something like this:

  DMPlexDistribute(dm, 0, &sfDist, &dmDist);  /* sfDist maps old points to new */
  VecCreate(comm, &stateDist);
  VecSetDM(stateDist, dmDist);
  PetscSectionCreate(comm, &sectionDist);
  DMSetLocalSection(dmDist, sectionDist);
  DMPlexDistributeField(dmDist, sfDist, section, state, sectionDist, stateDist);

We do this in src/dm/impls/plex/tests/ex36.c
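Note that, if I remember the internals correctly, DMPlexDistributeField
fills in sectionDist from the original section and sizes stateDist to
match, so you should not need to set dofs on sectionDist by hand; check
against ex36.c.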

  Thanks,

     Matt

> I'd appreciate any insight into the specifics of this usage. I expect I
> have a misconception about the local vs. global section. Thank you.
>
> Sincerely
> Nicholas
>
> --
> Nicholas Arnold-Medabalimi
>
> Ph.D. Candidate
> Computational Aeroscience Lab
> University of Michigan
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/