[petsc-users] Node coordinates in a distributed DMPlex

Matthew Knepley knepley at gmail.com
Thu Jul 9 10:18:38 CDT 2015


On Thu, Jul 9, 2015 at 7:42 AM, Alejandro D Otero <aotero at fi.uba.ar> wrote:

> Hi, sorry if this is an obvious question, but I cannot figure out how to
> recover finite element node coordinates once I have distributed a mesh
> stored as a DMPlex. I am using petsc4py as an interface to PETSc routines.
>
> I first created a dmplex using:
> dm.createFromCellList()
>
> In a sequential run I got the coordinates with:
> Coords = dm.getCoordinates()
>
> which gave a sequential vector with the coordinates of the mesh nodes.
>
> When I distribute the mesh with:
> dm.distribute()
>
> each MPI process has its own dm, but the indexing of the vector resulting
> from getCoordinates() or getCoordinatesLocal() seems inconsistent with
> the local numbering of the cells and nodes.
>
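
(Editor's note: a minimal petsc4py sketch of the workflow described above;
the mesh arrays are hypothetical example data, not from the original mail.)

    import numpy as np
    from petsc4py import PETSc

    # Hypothetical 2D mesh: two triangles sharing an edge
    cells  = np.array([[0, 1, 2], [1, 3, 2]], dtype=np.int32)
    coords = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.]])

    dm = PETSc.DMPlex().createFromCellList(2, cells, coords)

    # Sequential run: one Vec holding all vertex coordinates
    Coords = dm.getCoordinates()

    # Parallel run: distribution renumbers the points on each rank
    sf = dm.distribute()              # returns the migration SF (None on one rank)
    local = dm.getCoordinatesLocal()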

When the mesh is distributed, the vertices are renumbered. Thus the
coordinates you get out are for the reordered local vertices, but they are
consistent with the local topology (cells still contain the right vertices)
and with the overlap mapping (the SF still connects the shared vertices).
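
(Editor's note: a sketch, not part of the reply, of one way to read
per-vertex coordinates consistently with the local numbering, assuming a
distributed DMPlex dm as above.)

    coords = dm.getCoordinatesLocal()    # local Vec, includes shared vertices
    sect   = dm.getCoordinateSection()   # maps mesh points to offsets in coords
    dim    = dm.getCoordinateDim()
    vStart, vEnd = dm.getDepthStratum(0) # local vertex point range

    arr = coords.getArray()
    for v in range(vStart, vEnd):
        off = sect.getOffset(v)
        print(v, arr[off:off + dim])     # coordinates of local vertex v

The cone of each local cell, dm.getCone(c), refers to these same local
vertex numbers, which is the consistency described above.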

What do you need it to do?

  Thanks,

    Matt


> What is the correct way of doing this in the PETSc philosophy?
>
> Thanks in advance,
> Alejandro
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

