[petsc-users] Distributing data along with DMPlex

Alejandro D Otero aotero at fi.uba.ar
Tue Dec 22 07:17:13 CST 2015


Thanks for your answer.
In principle, this is what I want, although I know that it doesn't scale.
At this point I am reading the mesh on only one proc, so I have all the
properties there; that is why, after that, I need to distribute the mesh
along with the data.
Is there a better solution within this scheme (only one proc reads the mesh
data)?
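
The distribution step I mean is roughly the following (just a sketch with
placeholder names: dmSerial, sectionSerial and vecSerial exist only on rank 0
and are empty on the other ranks; error checking omitted):

  DM      dmDist;
  PetscSF sfMigration;  /* records where each original point was sent */

  DMPlexDistribute(dmSerial, 0, &sfMigration, &dmDist);
  /* vecSerial, laid out by sectionSerial, still follows the original
     numbering of dmSerial, so its values have to be moved as well */
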
Regards,

Alejandro

On Tue, Dec 22, 2015 at 1:11 AM, Matthew Knepley <knepley at gmail.com> wrote:

> On Thu, Dec 17, 2015 at 1:27 PM, Justin Chang <jychang48 at gmail.com> wrote:
>
>> So you would use something like DMPlexDistributeField()
>> <http://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/DM/DMPlexDistributeField.html> in
>> that case. You have your original/global DM, PetscSection, and Vec of
>> values. Then, using the PetscSF that was created during DMPlexDistribute(),
>> you map the original data onto the distributed layout.
>>
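>> A minimal sketch of that mapping (all names are placeholders: dmSerial and
>> dmDist are the original and distributed DMs, sfMigration is the SF returned
>> by DMPlexDistribute(), and sectionSerial/vecSerial hold the rank-0 layout
>> and data; error checking omitted):
>>
>>   PetscSection sectionDist;
>>   Vec          vecDist;
>>
>>   PetscSectionCreate(PetscObjectComm((PetscObject) dmDist), &sectionDist);
>>   VecCreate(PETSC_COMM_SELF, &vecDist);  /* the result is a local vector */
>>   DMPlexDistributeField(dmSerial, sfMigration, sectionSerial, vecSerial,
>>                         sectionDist, vecDist);
>>   /* vecDist now holds the same field values, laid out by sectionDist on
>>      the distributed point numbering */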
>
> Justin is correct. However, I will note that this strategy is not
> scalable. Usually the data is much larger than the mesh,
> so this runs out of memory much sooner. Are you sure that this is what you
> want?
>
>   Thanks,
>
>      Matt
>
>
>> On Thu, Dec 17, 2015 at 11:21 AM, Alejandro D Otero <aotero at fi.uba.ar>
>> wrote:
>>
>>> Thanks,
>>> The problem then is that after DMPlexDistribute the DMPlex 'points' are
>>> renumbered. So if the values are tied to each point in the original
>>> numbering, how do I set the values after the distribution? I know the
>>> property values stored in the vector only in terms of the numbering of the
>>> original mesh, the one I used to create the first DMPlex.
>>>
>>> Ideally, I would like to set the values in the vector before
>>> DMPlexDistribute, have the vector components renumbered and redistributed
>>> accordingly into a global vector, and then get the local vector.
>>>
>>> Hope it could be more clear now.
>>> Regards,
>>>
>>> Alejandro
>>>
>>>
>>>
>>> On Wed, Dec 16, 2015 at 7:01 PM, Justin Chang <jychang48 at gmail.com>
>>> wrote:
>>>
>>>> I think you would follow this order:
>>>>
>>>> 1*) Create a DMPlex (depth, chart, etc.) on rank 0. The other ranks have
>>>> an empty DM
>>>>
>>>> 2) DMPlexDistribute()
>>>>
>>>> 3*) Create the PetscSection
>>>>
>>>> 4) DMCreateGlobalVector()
>>>>
>>>> 5) DMCreateLocalVector()
>>>>
>>>> Now you have a global vector and a local vector for your distributed
>>>> DMPlex. The mapping/ghosting/etc of dofs is already taken care of.
>>>>
>>>> * If you're using standard Galerkin FE, then in SNES examples 12 and 62
>>>> (and maybe others?) the first step is handled through the mesh generation
>>>> functions and step 3 is handled through step 4.
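>>>>
>>>> Roughly, in code (a sketch only; the one-dof-per-point layout is just a
>>>> placeholder, and the section may be named differently in other PETSc
>>>> versions):
>>>>
>>>>   DM           dmDist;
>>>>   PetscSF      sfDist;
>>>>   PetscSection section;
>>>>   Vec          gvec, lvec;
>>>>   PetscInt     pStart, pEnd, p;
>>>>
>>>>   DMPlexDistribute(dm, 0, &sfDist, &dmDist);               /* step 2 */
>>>>
>>>>   PetscSectionCreate(PetscObjectComm((PetscObject) dmDist), &section);
>>>>   DMPlexGetChart(dmDist, &pStart, &pEnd);                  /* step 3 */
>>>>   PetscSectionSetChart(section, pStart, pEnd);
>>>>   for (p = pStart; p < pEnd; ++p) PetscSectionSetDof(section, p, 1);
>>>>   PetscSectionSetUp(section);
>>>>   DMSetDefaultSection(dmDist, section);
>>>>
>>>>   DMCreateGlobalVector(dmDist, &gvec);                     /* step 4 */
>>>>   DMCreateLocalVector(dmDist, &lvec);                      /* step 5 */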
>>>>
>>>> Thanks,
>>>> Justin
>>>>
>>>> On Wednesday, December 16, 2015, Alejandro D Otero <aotero at fi.uba.ar>
>>>> wrote:
>>>>
>>>>> Hi, I need some help understanding how to distribute data together
>>>>> with a DMPlex representing an FE mesh.
>>>>> At the beginning I define the structure of the DMPlex, assigning a
>>>>> certain number of DoFs to cells, edges and vertices, on one process (the
>>>>> DMPlex on the other processes is empty).
>>>>> I create a PetscSection and an associated global vector with the
>>>>> quantities I want to store.
>>>>> Then I distribute the DMPlex over all the processes.
>>>>> * Although this does not perform well, it is just a starting point; I
>>>>> know it has to be improved.
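>>>>>
>>>>> For concreteness, the layout step I mean looks roughly like this (a
>>>>> sketch with made-up dof counts, done on the still-serial DMPlex dm):
>>>>>
>>>>>   PetscSection section;
>>>>>   PetscInt     pStart, pEnd, vStart, vEnd, eStart, eEnd, cStart, cEnd, p;
>>>>>
>>>>>   PetscSectionCreate(PetscObjectComm((PetscObject) dm), &section);
>>>>>   DMPlexGetChart(dm, &pStart, &pEnd);
>>>>>   PetscSectionSetChart(section, pStart, pEnd);
>>>>>   DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd);   /* vertices */
>>>>>   DMPlexGetDepthStratum(dm, 1, &eStart, &eEnd);   /* edges    */
>>>>>   DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);  /* cells    */
>>>>>   for (p = vStart; p < vEnd; ++p) PetscSectionSetDof(section, p, 2);
>>>>>   for (p = eStart; p < eEnd; ++p) PetscSectionSetDof(section, p, 1);
>>>>>   for (p = cStart; p < cEnd; ++p) PetscSectionSetDof(section, p, 3);
>>>>>   PetscSectionSetUp(section);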
>>>>>
>>>>> I would like to have the global vector distributed accordingly, so that
>>>>> each process has access to the corresponding local part with its DoFs
>>>>> (possibly plus some ghost values for the shared DoFs it does not own).
>>>>>
>>>>> Is there any 'correct' way to do that in PETSc?
>>>>>
>>>>> Thanks in advance,
>>>>>
>>>>> Alejandro
>>>>>
>>>>
>>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>