[petsc-users] Creating a 3D dmplex mesh with cell list and distributing it

Matthew Knepley knepley at gmail.com
Wed Aug 14 22:46:58 CDT 2019


On Wed, Aug 14, 2019 at 10:48 PM Swarnava Ghosh <swarnava89 at gmail.com>
wrote:

> Hi Matthew,
>
> "It looks like you are running things with the wrong 'mpirun' ", Could you
> please elaborate on this? I have another DMDA in my code, which is
> correctly being parallelized.
>

Ah, I see it now. You are feeding the initial mesh in on every process.
Normally a single process supplies the mesh, and the other processes pass
0 for the number of cells and vertices.
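
A minimal sketch of that pattern, reusing the array and flag names from the
code quoted below (interpolate, pCgdft->cellsESP, pCgdft->vertexCoordsESP, and
so on); rank, the zero-initialized counts, dm, and dmDist are illustrative
local names:

  PetscMPIInt rank;
  PetscInt    numCells = 0, numVertices = 0, numCorners = 0;
  DM          dm, dmDist = NULL;

  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
  if (!rank) {  /* only rank 0 hands in the serial mesh */
    numCells    = pCgdft->numCellsESP;
    numVertices = pCgdft->NESP;
    numCorners  = pCgdft->numCornersESP;
  }
  ierr = DMPlexCreateFromCellList(PETSC_COMM_WORLD, 3, numCells, numVertices,
           numCorners, interpolate, pCgdft->cellsESP, 3,
           pCgdft->vertexCoordsESP, &dm);CHKERRQ(ierr);
  /* overlap 0; if a parallel mesh is created, it replaces the serial one */
  ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);
  if (dmDist) {
    ierr = DMDestroy(&dm);CHKERRQ(ierr);
    dm   = dmDist;
  }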

  Thanks,

     Matt


> Thanks,
> SG
>
> On Wed, Aug 14, 2019 at 7:35 PM Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Wed, Aug 14, 2019 at 10:23 PM Swarnava Ghosh <swarnava89 at gmail.com>
>> wrote:
>>
>>> Hi Matthew,
>>>
>>> I added DMView(pCgdft->dmplex,PETSC_VIEWER_STDOUT_WORLD); before and
>>> after distribution, and I get the following:
>>>
>>
>> It looks like you are running things with the wrong 'mpirun'.
>>
>>   Thanks,
>>
>>     Matt
>>
>>
>>> dmplex before distribution
>>> DM Object: 3 MPI processes
>>>   type: plex
>>> DM_0x84000004_0 in 3 dimensions:
>>>   0-cells: 7 7 7
>>>   1-cells: 17 17 17
>>>   2-cells: 17 17 17
>>>   3-cells: 6 6 6
>>> Labels:
>>>   depth: 4 strata with value/size (0 (7), 1 (17), 2 (17), 3 (6))
>>> dmplex after distribution
>>> DM Object: Parallel Mesh 3 MPI processes
>>>   type: plex
>>> Parallel Mesh in 3 dimensions:
>>>   0-cells: 7 7 7
>>>   1-cells: 17 17 17
>>>   2-cells: 17 17 17
>>>   3-cells: 6 6 6
>>> Labels:
>>>   depth: 4 strata with value/size (0 (7), 1 (17), 2 (17), 3 (6))
>>>
>>> Thanks,
>>> SG
>>>
>>> On Wed, Aug 14, 2019 at 6:48 PM Matthew Knepley <knepley at gmail.com>
>>> wrote:
>>>
>>>> DMView() the mesh before and after distribution, so we can see what we
>>>> have.
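>>>>
>>>> For example (a minimal sketch; dm and dmDist are placeholder names for
>>>> the serial and distributed meshes):
>>>>
>>>>   ierr = DMView(dm, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr); /* before */
>>>>   ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);
>>>>   if (dmDist) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmDist;}
>>>>   ierr = DMView(dm, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr); /* after */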
>>>>
>>>>   Thanks,
>>>>
>>>>      Matt
>>>>
>>>> On Wed, Aug 14, 2019 at 5:30 PM Swarnava Ghosh via petsc-users <
>>>> petsc-users at mcs.anl.gov> wrote:
>>>>
>>>>> Hi PETSc team and users,
>>>>>
>>>>> I am trying to create a 3D DMPlex mesh using DMPlexCreateFromCellList,
>>>>> then distribute it, and find the coordinates of the vertices owned by
>>>>> each process.
>>>>> My cell list is as follows:
>>>>> numCells: 6
>>>>> numVertices: 7
>>>>> numCorners: 4
>>>>> cells:
>>>>> 0 3 2 1
>>>>> 4 0 2 1
>>>>> 6 4 2 1
>>>>> 3 6 2 1
>>>>> 5 4 6 1
>>>>> 3 5 6 1
>>>>> vertexCoords:
>>>>> -6.043000 -5.233392 -4.924000
>>>>> -3.021500  0.000000 -4.924000
>>>>> -3.021500 -3.488928  0.000000
>>>>> -6.043000  1.744464  0.000000
>>>>>  0.000000 -5.233392 -4.924000
>>>>>  3.021500  0.000000 -4.924000
>>>>>  3.021500 -3.488928  0.000000
>>>>>
>>>>> After reading this information, I do the following:
>>>>>
>>>>> ierr = DMPlexCreateFromCellList(PETSC_COMM_WORLD, 3, pCgdft->numCellsESP,
>>>>>          pCgdft->NESP, pCgdft->numCornersESP, interpolate,
>>>>>          pCgdft->cellsESP, 3, pCgdft->vertexCoordsESP,
>>>>>          &pCgdft->dmplex);CHKERRQ(ierr);
>>>>>
>>>>> ierr = DMPlexDistribute(pCgdft->dmplex, 0, &pCgdft->dmplexSF,
>>>>>          &distributedMesh);CHKERRQ(ierr);
>>>>>
>>>>> if (distributedMesh) {
>>>>>   printf("mesh is distributed\n");
>>>>>   ierr = DMDestroy(&pCgdft->dmplex);CHKERRQ(ierr);
>>>>>   pCgdft->dmplex = distributedMesh;
>>>>> }
>>>>>
>>>>> ierr = DMGetCoordinates(pCgdft->dmplex, &VC);CHKERRQ(ierr);
>>>>> ierr = VecView(VC, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
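>>>>>
>>>>> As a side check (a minimal sketch; rstart and rend are illustrative
>>>>> names), the slice of the global coordinate vector owned by each process
>>>>> can be queried with:
>>>>>
>>>>>   PetscInt rstart, rend;
>>>>>   ierr = VecGetOwnershipRange(VC, &rstart, &rend);CHKERRQ(ierr);
>>>>>   /* entries [rstart, rend) of VC are stored on this process */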
>>>>>
>>>>> Running this with 3 MPI processes, I see from the VecView output that
>>>>> every process owns all of the vertices. Why is the dmplex not being
>>>>> distributed?
>>>>>
>>>>> The VecView output is:
>>>>> Process [0]
>>>>> -6.043
>>>>> -5.23339
>>>>> -4.924
>>>>> -3.0215
>>>>> 0.
>>>>> -4.924
>>>>> -3.0215
>>>>> -3.48893
>>>>> 0.
>>>>> -6.043
>>>>> 1.74446
>>>>> 0.
>>>>> 0.
>>>>> -5.23339
>>>>> -4.924
>>>>> 3.0215
>>>>> 0.
>>>>> -4.924
>>>>> 3.0215
>>>>> -3.48893
>>>>> 0.
>>>>> Process [1]
>>>>> -6.043
>>>>> -5.23339
>>>>> -4.924
>>>>> -3.0215
>>>>> 0.
>>>>> -4.924
>>>>> -3.0215
>>>>> -3.48893
>>>>> 0.
>>>>> -6.043
>>>>> 1.74446
>>>>> 0.
>>>>> 0.
>>>>> -5.23339
>>>>> -4.924
>>>>> 3.0215
>>>>> 0.
>>>>> -4.924
>>>>> 3.0215
>>>>> -3.48893
>>>>> 0.
>>>>> Process [2]
>>>>> -6.043
>>>>> -5.23339
>>>>> -4.924
>>>>> -3.0215
>>>>> 0.
>>>>> -4.924
>>>>> -3.0215
>>>>> -3.48893
>>>>> 0.
>>>>> -6.043
>>>>> 1.74446
>>>>> 0.
>>>>> 0.
>>>>> -5.23339
>>>>> -4.924
>>>>> 3.0215
>>>>> 0.
>>>>> -4.924
>>>>> 3.0215
>>>>> -3.48893
>>>>> 0.
>>>>>
>>>>> Thanks,
>>>>> SG
>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/