[petsc-users] Nodes coordinates in distributed dmplex
Matthew Knepley
knepley at gmail.com
Sat Jul 16 08:27:36 CDT 2022
On Fri, Jul 15, 2022 at 7:29 PM David Fuentes <fuentesdt at gmail.com> wrote:
> Hi, when looping over the nodal coordinates, is there a way to get the
> corresponding DOF in the index set for the DM Field(s) used in the
> residual vector?
> I would like to update the Jacobian/residual entries obtained from
> DMCreateFieldIS that correspond to particular nodal coordinates that are
> not on the boundary.
>
I think I just need to understand exactly what you want. DMCreateFieldIS gives
you the global indices into the residual/Jacobian. You can get these same indices
from the PetscSection that you get from DMGetGlobalSection(). You can put any
mesh point into the Section and get back an index using PetscSectionGetOffset().
Maybe give me a more specific example and I can help write some code.
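In the meantime, here is a minimal sketch of that lookup (hedged: error checking
omitted, it assumes a single field whose unknowns live on vertices, and with
several fields PetscSectionGetFieldOffset() would be used in place of
PetscSectionGetOffset()):

  PetscSection gs;
  PetscInt     vStart, vEnd, v;

  DMGetGlobalSection(dm, &gs);
  DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd);   /* depth 0 = vertices */
  for (v = vStart; v < vEnd; ++v) {
    PetscInt dof, off;

    PetscSectionGetDof(gs, v, &dof);
    PetscSectionGetOffset(gs, v, &off);
    if (off < 0) continue; /* assumption: negative offsets mark points owned by another rank */
    /* off, ..., off + dof - 1 are global rows/columns of the residual/Jacobian */
  }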
Thanks,
Matt
On Fri, Jul 10, 2015 at 11:14 AM Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Fri, Jul 10, 2015 at 6:48 AM, Alejandro D Otero <aotero at fi.uba.ar>
>> wrote:
>>
>>> Hi Matt, thanks for your answer.
>>>
>>> I got a vector from getCoordinates(). How are its components indexed?
>>> Is it (p * dim + d), with p the node, dim the dimension of the problem,
>>> and d the coordinate of the node?
>>> Which numbering is used for p: the local node number, the number of the
>>> point in the DAG of the dm, or the original node number?
>>>
>>
>> All data layouts for Plex are described by a PetscSection (which is how
>> it should be in the rest of PETSc as well). The PetscSection is a map
>> from mesh points to (# dof, offset) pairs. Thus, the way I get
>> coordinates is the following:
>>
>> DMGetCoordinatesLocal(dm, &coordinates);
>> DMGetCoordinateDM(dm, &cdm);
>> DMGetDefaultSection(cdm, &cs);
>> PetscSectionGetChart(cs, &pStart, &pEnd);
>> VecGetArrayRead(coordinates, &coords);
>> for (p = pStart; p < pEnd; ++p) {
>>   PetscInt dof, off;
>>
>>   PetscSectionGetDof(cs, p, &dof);
>>   PetscSectionGetOffset(cs, p, &off);
>>   for (d = 0; d < dof; ++d) <compute with coords[off+d]>
>> }
>> VecRestoreArrayRead(coordinates, &coords);
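>>
>> (As an aside: in current PETSc releases, DMGetDefaultSection() has become
>> DMGetLocalSection(). Also, a hedged variant if only vertex coordinates are
>> wanted: the loop can be restricted to the vertex stratum instead of the
>> whole chart,
>>
>>   DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd);   /* depth 0 = vertices */
>>   for (p = vStart; p < vEnd; ++p) { /* same body as above */ }
>>
>> since for a linear coordinate layout all other points report dof == 0, so
>> either form visits the same data.)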
>>
>>
>>> I am trying a simple square mesh with 16 4-node square elements partitioned
>>> across 2 processes, 25 nodes in total.
>>> The distributed dm seems alright to me. Each process gets a dm with 8
>>> elements and 15 nodes, which means that the 5 shared nodes are local to each
>>> process. But one of the processes gives negative values for the shared nodes.
>>> How do they need to be mapped to get the right numbers?
>>>
>>
>> Where do you get negative point numbers? I encode off-process point
>> numbers as -(remote point + 1).
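>>
>> As an illustration of that convention (an assumption about the decoding, not
>> a particular API call): if a returned point number q is negative, the point
>> number on the owning process is recovered by
>>
>>   if (q < 0) remotePoint = -(q + 1);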
>>
>>
>>> It seems I'm using the wrong approach to this. Maybe I need to get the
>>> coordinates in a somewhat different way. I'm working on a from-scratch
>>> implementation of a FEM code based on petsc4py. I want to code the assembly
>>> of the problem matrices from elemental matrices. I've already done this
>>> sequentially, but I got stuck when trying to compute elemental matrices in
>>> parallel because I don't understand how to obtain the coordinates of the
>>> nodes for each element.
>>>
>>
>> The way I get coordinates for the element c is
>>
>> PetscScalar *coords = NULL;
>> PetscInt csize;
>>
>> ierr = DMPlexVecGetClosure(dm, cs, coordinates, c, &csize, &coords);CHKERRQ(ierr);
>> /* coords[] now holds the csize coordinate values over the closure of cell c */
>> ierr = DMPlexVecRestoreClosure(dm, cs, coordinates, c, &csize, &coords);CHKERRQ(ierr);
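>>
>> A hedged sketch of how this might sit in an element loop (hypothetical, error
>> checking abbreviated; it computes a cell centroid only to show how the closure
>> coordinates are laid out):
>>
>>   PetscInt cStart, cEnd, c, dim;
>>
>>   DMGetCoordinateDim(dm, &dim);
>>   DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);   /* height 0 = cells */
>>   for (c = cStart; c < cEnd; ++c) {
>>     PetscScalar *coords = NULL;
>>     PetscInt     csize, i, d;
>>     PetscReal    centroid[3] = {0., 0., 0.};
>>
>>     ierr = DMPlexVecGetClosure(dm, cs, coordinates, c, &csize, &coords);CHKERRQ(ierr);
>>     for (i = 0; i < csize/dim; ++i)
>>       for (d = 0; d < dim; ++d) centroid[d] += PetscRealPart(coords[i*dim + d]);
>>     for (d = 0; d < dim; ++d) centroid[d] /= csize/dim;
>>     /* ...assemble the element matrix for cell c from these vertex coordinates... */
>>     ierr = DMPlexVecRestoreClosure(dm, cs, coordinates, c, &csize, &coords);CHKERRQ(ierr);
>>   }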
>>
>> Thanks,
>>
>> Matt
>>
>>
>>> Again, thanks for your help,
>>>
>>> A.
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>> On Thu, Jul 9, 2015 at 5:18 PM, Matthew Knepley <knepley at gmail.com>
>>> wrote:
>>>
>>>> On Thu, Jul 9, 2015 at 7:42 AM, Alejandro D Otero <aotero at fi.uba.ar>
>>>> wrote:
>>>>
>>>>> Hi, sorry if this is an obvious question, but I cannot figure out how
>>>>> to recover the finite element node coordinates once I have distributed a
>>>>> mesh stored as a dmplex. I am using petsc4py as the interface to PETSc routines.
>>>>>
>>>>> I first created a dmplex using:
>>>>> dm.createFromCellList()
>>>>>
>>>>> In a sequential run I got the coordinates with:
>>>>> Coords = dm.getCoordinates()
>>>>>
>>>>> which gave a sequential vector with the coordinates of the mesh nodes.
>>>>>
>>>>> When I distribute the mesh with:
>>>>> dm.distribute()
>>>>>
>>>>> each MPI process has its own dm, but the indexing of the vector
>>>>> resulting from getCoordinates() or getCoordinatesLocal() does not seem
>>>>> consistent with the local numbering of the cells and nodes.
>>>>>
>>>>
>>>> When the mesh is distributed, the vertices are renumbered. Thus the
>>>> coordinates you get out are for the reordered local vertices, but they are
>>>> consistent with the local topology (cells still contain the right vertices)
>>>> and with the overlap mapping (the SF still connects the shared vertices).
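>>>>
>>>> (If a numbering that is consistent across processes is needed, one hedged
>>>> option is DMPlexGetVertexNumbering(); it gives a global numbering of the
>>>> distributed mesh, not necessarily the pre-distribution one:
>>>>
>>>>   IS              globalVertexNumbers;
>>>>   const PetscInt *gidx;
>>>>   PetscInt        vStart, vEnd;
>>>>
>>>>   DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd);
>>>>   DMPlexGetVertexNumbering(dm, &globalVertexNumbers);
>>>>   ISGetIndices(globalVertexNumbers, &gidx);
>>>>   /* gidx[v - vStart] is the global number of local vertex v; negative entries,
>>>>      encoded as -(global + 1), mark vertices owned by another process */
>>>>   ISRestoreIndices(globalVertexNumbers, &gidx);
>>>>
>>>> )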
>>>>
>>>> What do you need it to do?
>>>>
>>>> Thanks,
>>>>
>>>> Matt
>>>>
>>>>
>>>>> Which is the correct way of doing this in PETSc philosophy?
>>>>>
>>>>> Thanks in advance,
>>>>> Alejandro
>>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>
>>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/