[petsc-users] Distributing data along with DMPlex

Justin Chang jychang48 at gmail.com
Wed Dec 16 16:01:47 CST 2015


I think you would follow this order:

1*) Create a DMPlex (chart, cones, depth, etc.) on rank 0; the other ranks
have an empty DM

2) DMPlexDistribute()

3*) Create the PetscSection

4) DMCreateGlobalVector()

5) DMCreateLocalVector()

Now you have a global vector and a local vector for your distributed
DMPlex. The mapping/ghosting/etc. of the dofs is already taken care of.
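
A minimal sketch of steps 1-5 in C, assuming a recent PETSc (PetscCall()
and DMPlexCreateFromCellListPetsc() postdate this thread; the older names
were CHKERRQ and DMPlexCreateFromCellList). The two-triangle mesh and the
one-dof-per-vertex section here are invented just for illustration:

#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM           dm, dmDist = NULL;
  PetscSection section;
  Vec          gvec, lvec;
  PetscMPIInt  rank;
  PetscInt     pStart, pEnd, vStart, vEnd, v;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));

  /* 1) Two triangles on rank 0; every other rank passes zero cells
        and so ends up with an empty DM */
  {
    const PetscInt  cells[]  = {0, 1, 2,  1, 3, 2};
    const PetscReal coords[] = {0, 0,  1, 0,  0, 1,  1, 1};
    PetscCall(DMPlexCreateFromCellListPetsc(PETSC_COMM_WORLD, 2,
                rank == 0 ? 2 : 0, rank == 0 ? 4 : 0, 3, PETSC_TRUE,
                cells, 2, coords, &dm));
  }

  /* 2) Distribute with zero overlap; returns NULL on one process */
  PetscCall(DMPlexDistribute(dm, 0, NULL, &dmDist));
  if (dmDist) {
    PetscCall(DMDestroy(&dm));
    dm = dmDist;
  }

  /* 3) Build the section on the *distributed* mesh: 1 dof per vertex */
  PetscCall(DMPlexGetChart(dm, &pStart, &pEnd));
  PetscCall(DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd)); /* vertices */
  PetscCall(PetscSectionCreate(PETSC_COMM_WORLD, &section));
  PetscCall(PetscSectionSetChart(section, pStart, pEnd));
  for (v = vStart; v < vEnd; ++v) PetscCall(PetscSectionSetDof(section, v, 1));
  PetscCall(PetscSectionSetUp(section));
  PetscCall(DMSetLocalSection(dm, section)); /* DMSetDefaultSection back then */

  /* 4) + 5) Vectors; GlobalToLocal fills the ghost (shared) dofs */
  PetscCall(DMCreateGlobalVector(dm, &gvec));
  PetscCall(DMCreateLocalVector(dm, &lvec));
  PetscCall(DMGlobalToLocalBegin(dm, gvec, INSERT_VALUES, lvec));
  PetscCall(DMGlobalToLocalEnd(dm, gvec, INSERT_VALUES, lvec));

  PetscCall(VecDestroy(&lvec));
  PetscCall(VecDestroy(&gvec));
  PetscCall(PetscSectionDestroy(&section));
  PetscCall(DMDestroy(&dm));
  PetscCall(PetscFinalize());
  return 0;
}

Because the section is created after DMPlexDistribute(), each rank only
sets dofs on the points it actually holds, and the two DMCreateVector
calls size and map everything (including ghosts) automatically.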

* If you're using standard Galerkin FE, then in SNES examples 12 and 62
(and maybe others?) step 1 is handled by the mesh generation functions and
step 3 is handled automatically through step 4.
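
For reference, the FE path that footnote describes looks roughly like the
sketch below in present-day PETSc (these function names are from current
releases, not the 2015-era API): the mesh comes from -dm_plex_* options,
and the section of step 3 is built internally when step 4 asks for a
vector.

#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM        dm;
  PetscFE   fe;
  Vec       u;
  PetscInt  dim;
  PetscBool simplex;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(DMCreate(PETSC_COMM_WORLD, &dm));
  PetscCall(DMSetType(dm, DMPLEX));
  PetscCall(DMSetFromOptions(dm)); /* mesh shape, size, distribution: -dm_plex_* */
  PetscCall(DMGetDimension(dm, &dim));
  PetscCall(DMPlexIsSimplex(dm, &simplex));
  /* One scalar P1-by-default field; options prefix omitted */
  PetscCall(PetscFECreateDefault(PETSC_COMM_WORLD, dim, 1, simplex, NULL,
                                 PETSC_DETERMINE, &fe));
  PetscCall(DMSetField(dm, 0, NULL, (PetscObject)fe));
  PetscCall(PetscFEDestroy(&fe));
  PetscCall(DMCreateDS(dm));
  PetscCall(DMCreateGlobalVector(dm, &u)); /* section built here, internally */

  PetscCall(VecDestroy(&u));
  PetscCall(DMDestroy(&dm));
  PetscCall(PetscFinalize());
  return 0;
}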

Thanks,
Justin

On Wednesday, December 16, 2015, Alejandro D Otero <aotero at fi.uba.ar> wrote:

> Hi, I need some help understanding how to distribute data together with a
> DMPlex representing an FE mesh.
> At the beginning I define the structure of the DMPlex, assigning a certain
> number of DoFs to cells, edges, and vertices, on one process (the DMPlex on
> the other processes is empty).
> I create a PetscSection and an associated global vector with the
> quantities I want to store.
> Then I distribute the DMPlex over all the processes.
> * Although this does not perform well, it is just a starting point; I know
> it has to be improved.
>
> I would like to have the global vector distributed accordingly, so that
> each process has access to the corresponding local part with its DoFs
> (possibly with some ghost values for the shared DoFs it does not own).
>
> Is there any 'correct' way to do that in PETSc?
>
> Thanks in advance,
>
> Alejandro
>