[petsc-users] Petsc Section in DMPlex

Matthew Knepley knepley at gmail.com
Wed Dec 7 20:28:36 CST 2022


On Wed, Dec 7, 2022 at 9:21 PM Nicholas Arnold-Medabalimi <
narnoldm at umich.edu> wrote:

> Thank you for the help.
>
> I think the last piece of the puzzle is how do I create the "expanded IS"
> from the subpoint IS using the section?
>

Loop over the points in the IS. For each point, get the dof and offset from
the Section. Make a new IS that has all the
dofs, namely each run [offset, offset+dof).
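
A minimal sketch in C of what I mean (error checking omitted; "section" here
would be the Section whose offsets match the vector you plan to index with
VecISCopy(), "subpointsIS" is the output of DMPlexGetSubpointIS(), and the
variable names are just placeholders):

  const PetscInt *points;
  PetscInt       *idx, n, i, d, dof, off, nidx = 0;
  IS              expandedIS;

  ISGetLocalSize(subpointsIS, &n);
  ISGetIndices(subpointsIS, &points);
  for (i = 0; i < n; ++i) {                 /* count the dofs first */
    PetscSectionGetDof(section, points[i], &dof);
    nidx += dof;
  }
  PetscMalloc1(nidx, &idx);
  for (i = 0, nidx = 0; i < n; ++i) {       /* each point contributes the run [off, off+dof) */
    PetscSectionGetDof(section, points[i], &dof);
    PetscSectionGetOffset(section, points[i], &off);
    for (d = 0; d < dof; ++d) idx[nidx++] = off + d;
  }
  ISRestoreIndices(subpointsIS, &points);
  ISCreateGeneral(PETSC_COMM_SELF, nidx, idx, PETSC_OWN_POINTER, &expandedIS);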

  Thanks,

     Matt


> Sincerely
> Nicholas
>
>
> On Wed, Dec 7, 2022 at 7:06 AM Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Wed, Dec 7, 2022 at 6:51 AM Nicholas Arnold-Medabalimi <
>> narnoldm at umich.edu> wrote:
>>
>>> Hi
>>>
>>> Thank you so much for your patience. One thing to note: I don't have any
>>> need to go from the filtered distributed mapping back to the full one, but
>>> it is good to know.
>>>
>>> One aside question.
>>> 1) Is natural and global ordering the same in this context?
>>>
>>
>> No.
>>
>>
>>> As far as implementing what you have described.
>>>
>>> When I call ISView on the generated subpointsIS, I get an unusual error
>>> which I'm not sure how to interpret. (This case is running on 2 ranks, and
>>> the filter label has points located on both ranks of the original DM.)
>>> However, if I manually get the indices (the commented lines), it seems to
>>> have no issues.
>>> call DMPlexFilter(dmplex_full, iBlankLabel, 1, dmplex_filtered,ierr)
>>> call DMPlexGetSubpointIS(dmplex_filtered, subpointsIS,ierr)
>>> !call ISGetIndicesF90(subpointsIS, subPointKey,ierr)
>>> !write(*,*) subPointKey
>>> !call ISRestoreIndicesF90(subpointsIS, subPointKey,ierr)
>>> call ISView(subpointsIS,PETSC_VIEWER_STDOUT_WORLD,ierr)
>>>
>>> [1]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>> [1]PETSC ERROR: Arguments must have same communicators
>>> [1]PETSC ERROR: Different communicators in the two objects: Argument # 1
>>> and 2 flag 3
>>> [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>>> [1]PETSC ERROR: Petsc Development GIT revision: v3.18.1-320-g7810d690132
>>>  GIT Date: 2022-11-20 20:25:41 -0600
>>> [1]PETSC ERROR: Configure options with-fc=mpiifort with-mpi-f90=mpiifort
>>> --download-triangle --download-parmetis --download-metis --with-debugging=1
>>> --download-hdf5 --prefix=/home/narnoldm/packages/petsc_install
>>> [1]PETSC ERROR: #1 ISView() at
>>> /home/narnoldm/packages/petsc/src/vec/is/is/interface/index.c:1629
>>>
>>
>> The problem here is that the subpointsIS is a _serial_ object, and you are
>> using a parallel viewer. You can use PETSC_VIEWER_STDOUT_SELF,
>> or you can pull out the singleton viewer from STDOUT_WORLD if you want
>> them all to print in order.
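>>
>> For the second option, a small sketch (assuming the subpointsIS really is
>> on PETSC_COMM_SELF, as the error suggests):
>>
>>   PetscViewer sub;
>>   PetscViewerGetSubViewer(PETSC_VIEWER_STDOUT_WORLD, PETSC_COMM_SELF, &sub);
>>   ISView(subpointsIS, sub);
>>   PetscViewerRestoreSubViewer(PETSC_VIEWER_STDOUT_WORLD, PETSC_COMM_SELF, &sub);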
>>
>>
>>> As far as the overall process you have described, my question at first
>>> glance is: do I have to allocate/create the vector that is output by
>>> VecISCopy before calling it, or does it create the vector automatically?
>>>
>>
>> You create both vectors. I would do it using DMCreateGlobalVector() from
>> both DMs.
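>>
>> For example (sketch; the names are placeholders):
>>
>>   Vec Xfull, Xfilt;
>>   DMCreateGlobalVector(dmFull,     &Xfull);
>>   DMCreateGlobalVector(dmFiltered, &Xfilt);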
>>
>>
>>> I think I would need to create it first using a section and setting the
>>> Vec in the filtered DM?
>>>
>>
>> Setting the Section in the filtered DM.
>>
>>
>>> And I presume in this case I would be using the scatter reverse option
>>> to go from the full set to the reduced set?
>>>
>>
>> Yes
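>>
>> If I have the convention right, something like this (sketch, with
>> expandedIS and the two vectors built as above):
>>
>>   /* SCATTER_REVERSE: Xfilt[i] = Xfull[is[i]], i.e. full -> filtered */
>>   VecISCopy(Xfull, expandedIS, SCATTER_REVERSE, Xfilt);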
>>
>>   Thanks
>>
>>      Matt
>>
>>
>>> Sincerely
>>> Nicholas
>>>
>>>
>>> On Wed, Dec 7, 2022 at 6:00 AM Matthew Knepley <knepley at gmail.com>
>>> wrote:
>>>
>>>> On Wed, Dec 7, 2022 at 3:35 AM Nicholas Arnold-Medabalimi <
>>>> narnoldm at umich.edu> wrote:
>>>>
>>>>> Hi Matthew
>>>>>
>>>>> Thank you for the help. This clarified a great deal.
>>>>>
>>>>> I have a follow-up question related to DMPlexFilter. It may be better
>>>>> to describe what I'm trying to achieve.
>>>>>
>>>>> I have a general mesh I am solving which has a section with
>>>>> cell-center finite volume states, as described in my initial email. After
>>>>> calculating some metrics, I tag a bunch of cells with an identifying Label
>>>>> and use DMPlexFilter to generate a new DM which is only that subset of cells.
>>>>> Generally, this leads to a pretty unbalanced DM, so I then plan to use
>>>>> DMPlexDistribute to balance that DM across the processors. The coordinates
>>>>> pass along fine, but the state (or I should say the Section) does not, at
>>>>> least as far as I can tell.
>>>>>
>>>>> Assuming I can get a filtered DM, I then distribute the DM and state
>>>>> using the method you described above, and it seems to be working OK.
>>>>>
>>>>> The last connection I have to make is the transfer of information from
>>>>> the full mesh to the "sampled" filtered mesh. From what I can gather, I
>>>>> would need to get the mapping of points using DMPlexGetSubpointIS and then
>>>>> manually copy the values from the full DM section to the filtered DM? I
>>>>> have the process from full->filtered->distributed all working for the
>>>>> coordinates, so it's just a matter of transferring the section data correctly.
>>>>>
>>>>> I appreciate all the help you have provided.
>>>>>
>>>>
>>>> Let's do this in two steps, which makes it easier to debug. First, do
>>>> not redistribute the submesh. Just use DMPlexGetSubpointIS()
>>>> to get the mapping of filtered points to points in the original mesh.
>>>> Then create an expanded IS using the Section which makes
>>>> dofs in the filtered mesh to dofs in the original mesh. From this use
>>>>
>>>>   https://petsc.org/main/docs/manualpages/Vec/VecISCopy/
>>>>
>>>> to move values between the original vector and the filtered vector.
>>>>
>>>> Once that works, you can try redistributing the filtered mesh. Before
>>>> calling DMPlexDistribute() on the filtered mesh, you need to call
>>>>
>>>>   https://petsc.org/main/docs/manualpages/DM/DMSetUseNatural/
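>>>>
>>>> which in C looks roughly like this (sketch; the names are placeholders):
>>>>
>>>>   DMSetUseNatural(dmFiltered, PETSC_TRUE);   /* must come before distribution */
>>>>   DMPlexDistribute(dmFiltered, 0, &migrationSF, &dmFiltDist);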
>>>>
>>>> When you redistribute, it will compute a mapping back to the original
>>>> layout. Now when you want to transfer values, you
>>>>
>>>>   1) Create a natural vector with DMCreateNaturalVec()
>>>>
>>>>   2) Use DMGlobalToNaturalBegin/End() to move values from the filtered
>>>> vector to the natural vector
>>>>
>>>>   3) Use VecISCopy() to move values from the natural vector to the
>>>> original vector
>>>>
>>>> Let me know if you have any problems.
>>>>
>>>>   Thanks,
>>>>
>>>>      Matt
>>>>
>>>>
>>>>> Sincerely
>>>>> Nicholas
>>>>>
>>>>>
>>>>>
>>>>> On Mon, Nov 28, 2022 at 6:19 AM Matthew Knepley <knepley at gmail.com>
>>>>> wrote:
>>>>>
>>>>>> On Sun, Nov 27, 2022 at 10:22 PM Nicholas Arnold-Medabalimi <
>>>>>> narnoldm at umich.edu> wrote:
>>>>>>
>>>>>>> Hi Petsc Users
>>>>>>>
>>>>>>> I have a question about properly using PetscSection to assign state
>>>>>>> variables to a DM. I have an existing DMPlex mesh distributed on 2
>>>>>>> processors. My goal is to have state variables set at the cell centers. I
>>>>>>> then want to call DMPlexDistribute, which I hope will balance the mesh
>>>>>>> elements and transport the state variables to the hosting processors,
>>>>>>> whether the cells are being distributed to a different processor count or
>>>>>>> simply redistributed after doing mesh adaption.
>>>>>>>
>>>>>>> Looking at the DMPlex User guide, I should be able to achieve this
>>>>>>> with a single-field section using SetDof and assigning the DOFs to the
>>>>>>> points corresponding to cells.
>>>>>>>
>>>>>>
>>>>>> Note that if you want several different fields, you can clone the DM
>>>>>> first for this field
>>>>>>
>>>>>>   call DMClone(dm,dmState,ierr)
>>>>>>
>>>>>> and use dmState in your calls below.
>>>>>>
>>>>>>
>>>>>>>
>>>>>>> call DMPlexGetHeightStratum(dm,0,c0,c1,ierr)
>>>>>>> call DMPlexGetChart(dm,p0,p1,ierr)
>>>>>>> call PetscSectionCreate(PETSC_COMM_WORLD,section,ierr)
>>>>>>> call PetscSectionSetNumFields(section,1,ierr)
>>>>>>> call PetscSectionSetChart(section,p0,p1,ierr)
>>>>>>> do i = c0, (c1-1)
>>>>>>>     call PetscSectionSetDof(section,i,nvar,ierr)
>>>>>>> end do
>>>>>>> call PetscSectionSetup(section,ierr)
>>>>>>> call DMSetLocalSection(dm,section,ierr)
>>>>>>>
>>>>>>
>>>>>> In the loop, I would add a call to
>>>>>>
>>>>>>   call PetscSectionSetFieldDof(section,i,0,nvar,ierr)
>>>>>>
>>>>>> This also puts in the field breakdown. It is not essential, but nicer.
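>>>>>>
>>>>>> In C the setup would then look roughly like this (sketch; the bounds and
>>>>>> names are the same placeholders as in your Fortran above):
>>>>>>
>>>>>>   PetscInt c;
>>>>>>
>>>>>>   PetscSectionSetNumFields(section, 1);
>>>>>>   PetscSectionSetChart(section, p0, p1);
>>>>>>   for (c = c0; c < c1; ++c) {
>>>>>>     PetscSectionSetDof(section, c, nvar);
>>>>>>     PetscSectionSetFieldDof(section, c, 0, nvar);  /* field 0 gets all nvar dofs */
>>>>>>   }
>>>>>>   PetscSectionSetUp(section);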
>>>>>>
>>>>>>
>>>>>>> From here, it looks like I can access and set the state vars using
>>>>>>>
>>>>>>> call DMGetGlobalVector(dmplex,state,ierr)
>>>>>>> call DMGetGlobalSection(dmplex,section,ierr)
>>>>>>> call VecGetArrayF90(state,stateVec,ierr)
>>>>>>> do i = c0, (c1-1)
>>>>>>>     call PetscSectionGetOffset(section,i,offset,ierr)
>>>>>>>     stateVec(offset+1:offset+nvar) = state_i(:) !simplified assignment
>>>>>>> end do
>>>>>>> call VecRestoreArrayF90(state,stateVec,ierr)
>>>>>>> call DMRestoreGlobalVector(dmplex,state,ierr)
>>>>>>>
>>>>>>> To my understanding, I should be using a global vector since this is a
>>>>>>> pure assignment operation and I don't need the ghost cells.
>>>>>>>
>>>>>>
>>>>>> Yes.
>>>>>>
>>>>>>> But the behavior I am seeing isn't exactly what I'd expect.
>>>>>>>
>>>>>>> To be honest, I'm somewhat unclear on a few things
>>>>>>>
>>>>>>> 1) Should I be using nvar fields with 1 DOF each, or 1 field with nvar
>>>>>>> DOFs? And what is the distinction between the two methods?
>>>>>>>
>>>>>>
>>>>>> We have two divisions in a Section. A field can have a number of
>>>>>> components. This is intended to model a vector or tensor field.
>>>>>> Then a Section can have a number of fields, such as velocity and
>>>>>> pressure for a Stokes problem. The division is mainly to help the
>>>>>> user, so I would use the most natural one.
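>>>>>>
>>>>>> For your case, one field with nvar components seems most natural, e.g.
>>>>>> (sketch):
>>>>>>
>>>>>>   PetscSectionSetFieldComponents(section, 0, nvar);  /* one field, nvar components */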
>>>>>>
>>>>>>
>>>>>>> 2) Adding a print statement after the offset assignment I get (on
>>>>>>> rank 0 of 2)
>>>>>>> cell           1 offset           0
>>>>>>>  cell           2 offset          18
>>>>>>>  cell           3 offset          36
>>>>>>> which is expected and works but on rank 1 I get
>>>>>>>  cell           1 offset        9000
>>>>>>>  cell           2 offset        9018
>>>>>>>  cell           3 offset        9036
>>>>>>>
>>>>>>> which isn't exactly what I would expect. Shouldn't the offsets reset
>>>>>>> at 0 for the next rank?
>>>>>>>
>>>>>>
>>>>>> The local and global sections hold different information. This is the
>>>>>> source of the confusion. The local section does describe a local
>>>>>> vector, and thus includes overlap or "ghost" dofs. The global section
>>>>>> describes a global vector. However, it is intended to deliver
>>>>>> global indices, and thus the offsets give back global indices. When
>>>>>> you use VecGetArray*() you are getting out the local array, and
>>>>>> thus you have to subtract the first index on this process. You can
>>>>>> get that from
>>>>>>
>>>>>>   VecGetOwnershipRange(v, &rstart, &rEnd);
>>>>>>
>>>>>> This is the same whether you are using DMDA or DMPlex or any other DM.
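>>>>>>
>>>>>> So in C the assignment loop would look roughly like this (sketch, for
>>>>>> points this process owns; gsection, c0, c1, nvar, and state_c are
>>>>>> placeholders for your own names):
>>>>>>
>>>>>>   PetscScalar *a;
>>>>>>   PetscInt     rStart, rEnd, off, c, d;
>>>>>>
>>>>>>   VecGetOwnershipRange(state, &rStart, &rEnd);
>>>>>>   VecGetArray(state, &a);
>>>>>>   for (c = c0; c < c1; ++c) {
>>>>>>     PetscSectionGetOffset(gsection, c, &off);   /* global index */
>>>>>>     for (d = 0; d < nvar; ++d) a[off - rStart + d] = state_c[d];
>>>>>>   }
>>>>>>   VecRestoreArray(state, &a);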
>>>>>>
>>>>>>
>>>>>>> 3) Does calling DMPlexDistribute also distribute the section data
>>>>>>> associated with the DOFs? Based on the description of DMPlexDistribute, it
>>>>>>> looks like it should.
>>>>>>>
>>>>>>
>>>>>> No. By default, DMPlexDistribute() only distributes coordinate data.
>>>>>> If you want to distribute your field, it would look something like this:
>>>>>>
>>>>>> DMPlexDistribute(dm, 0, &sfDist, &dmDist);
>>>>>> VecCreate(comm, &stateDist);
>>>>>> VecSetDM(stateDist, dmDist);
>>>>>> PetscSectionCreate(comm, &sectionDist);
>>>>>> DMSetLocalSection(dmDist, sectionDist);
>>>>>> DMPlexDistributeField(dmDist, sfDist, section, state, sectionDist,
>>>>>>                       stateDist);
>>>>>>
>>>>>> We do this in src/dm/impls/plex/tests/ex36.c
>>>>>>
>>>>>>   Thanks,
>>>>>>
>>>>>>      Matt
>>>>>>
>>>>>>> I'd appreciate any insight into the specifics of this usage. I expect
>>>>>>> I have a misconception on the local vs global section. Thank you.
>>>>>>>
>>>>>>> Sincerely
>>>>>>> Nicholas
>>>>>>>
>>>>>>> --
>>>>>>> Nicholas Arnold-Medabalimi
>>>>>>>
>>>>>>> Ph.D. Candidate
>>>>>>> Computational Aeroscience Lab
>>>>>>> University of Michigan
>>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> What most experimenters take for granted before they begin their
>>>>>> experiments is infinitely more interesting than any results to which their
>>>>>> experiments lead.
>>>>>> -- Norbert Wiener
>>>>>>
>>>>>> https://www.cse.buffalo.edu/~knepley/
>>>>>> <http://www.cse.buffalo.edu/~knepley/>
>>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Nicholas Arnold-Medabalimi
>>>>>
>>>>> Ph.D. Candidate
>>>>> Computational Aeroscience Lab
>>>>> University of Michigan
>>>>>
>>>>
>>>>
>>>> --
>>>> What most experimenters take for granted before they begin their
>>>> experiments is infinitely more interesting than any results to which their
>>>> experiments lead.
>>>> -- Norbert Wiener
>>>>
>>>> https://www.cse.buffalo.edu/~knepley/
>>>> <http://www.cse.buffalo.edu/~knepley/>
>>>>
>>>
>>>
>>> --
>>> Nicholas Arnold-Medabalimi
>>>
>>> Ph.D. Candidate
>>> Computational Aeroscience Lab
>>> University of Michigan
>>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>> <http://www.cse.buffalo.edu/~knepley/>
>>
>
>
> --
> Nicholas Arnold-Medabalimi
>
> Ph.D. Candidate
> Computational Aeroscience Lab
> University of Michigan
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ <http://www.cse.buffalo.edu/~knepley/>