[petsc-users] Uninterpolating a distributed mesh

Matthew Knepley knepley at gmail.com
Mon Apr 13 16:28:59 CDT 2015


On Mon, Apr 13, 2015 at 4:22 PM, Justin Chang <jchang27 at uh.edu> wrote:

> Is there an example somewhere that does something similar to this?
>

There is DMPlexShiftSF_Internal() in plexsubmesh.c

  Matt


> Thanks,
>
> On Mon, Apr 13, 2015 at 12:31 PM, Matthew Knepley <knepley at gmail.com>
> wrote:
>
>> On Sat, Apr 11, 2015 at 6:55 PM, Justin Chang <jchang27 at uh.edu> wrote:
>>
>>> Hello,
>>>
>>> When I call DMPlexUninterpolate(...) on a distributed mesh (say 2
>>> processors), it seems to overwrite the "ghost" points (i.e., the points not
>>> locally owned by the processor) and to treat all points as if they were
>>> local to the processor.
>>>
>>
>> Yes, I wrote Uninterpolate() just for testing, and it does not currently
>> handle the SF. I have put it on my TODO list.
>>
>> It's not hard if you want to try. You just filter out of the SF any points
>> that are not cells or vertices, so
>>
>>   PetscSFGetGraph()
>>   for (leaves)
>>     if leaf not a cell or vertex, skip
>>   PetscSFSetGraph()
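>>
>> Fleshed out a bit, that might look something like the sketch below (untested;
>> dmInt and dmUn are placeholder names for the distributed interpolated DM and
>> the DM returned by DMPlexUninterpolate(), and it assumes cells and vertices
>> keep the same point numbers in the uninterpolated DM, since they are numbered
>> before the edges, as in your example mesh with cells 0-7 and vertices 8-16):
>>
>>   PetscSF            sfInt, sfUn;
>>   const PetscInt    *local;
>>   const PetscSFNode *remote;
>>   PetscInt          *newLocal;
>>   PetscSFNode       *newRemote;
>>   PetscInt           nroots, nleaves, newNleaves = 0, l;
>>   PetscInt           vStart, vEnd, cStart, cEnd, pEnd;
>>   PetscErrorCode     ierr;
>>
>>   /* the point SF of the distributed, interpolated DM */
>>   ierr = DMGetPointSF(dmInt, &sfInt);CHKERRQ(ierr);
>>   ierr = DMPlexGetDepthStratum(dmInt, 0, &vStart, &vEnd);CHKERRQ(ierr);  /* vertices */
>>   ierr = DMPlexGetHeightStratum(dmInt, 0, &cStart, &cEnd);CHKERRQ(ierr); /* cells */
>>   ierr = PetscSFGetGraph(sfInt, &nroots, &nleaves, &local, &remote);CHKERRQ(ierr);
>>   ierr = PetscMalloc1(nleaves, &newLocal);CHKERRQ(ierr);
>>   ierr = PetscMalloc1(nleaves, &newRemote);CHKERRQ(ierr);
>>   for (l = 0; l < nleaves; ++l) {
>>     const PetscInt p = local ? local[l] : l;
>>
>>     /* keep only leaves that are cells or vertices, drop edges/faces */
>>     if ((p >= vStart && p < vEnd) || (p >= cStart && p < cEnd)) {
>>       newLocal[newNleaves]  = p;
>>       newRemote[newNleaves] = remote[l];
>>       ++newNleaves;
>>     }
>>   }
>>   /* install the filtered graph on the uninterpolated DM; its chart is the new root space */
>>   ierr = DMPlexGetChart(dmUn, NULL, &pEnd);CHKERRQ(ierr);
>>   ierr = DMGetPointSF(dmUn, &sfUn);CHKERRQ(ierr);
>>   ierr = PetscSFSetGraph(sfUn, pEnd, newNleaves, newLocal, PETSC_OWN_POINTER, newRemote, PETSC_OWN_POINTER);CHKERRQ(ierr);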
>>
>>   Thanks,
>>
>>     Matt
>>
>>
>>> Say I have this 2D cell-vertex mesh:
>>>
>>> 14------15------16
>>> | \  5  | \  7  |
>>> |   \   |   \   |
>>> |  4  \ |  6  \ |
>>> 11------12------13
>>> | \  1  | \  3  |
>>> |   \   |   \   |
>>> |  0  \ |  2  \ |
>>> 8-------9-------10
>>>
>>> Which results in the following DM:
>>>
>>> DM Object: 2 MPI processes
>>>   type: plex
>>> DM_0x84000004_0 in 2 dimensions:
>>>   0-cells: 9 0
>>>   2-cells: 8 0
>>> Labels:
>>>   marker: 1 strata of sizes (8)
>>>   depth: 2 strata of sizes (9, 8)
>>>
>>> I proceed by interpolating this DM:
>>>
>>> DM Object: 2 MPI processes
>>>   type: plex
>>> DM_0x84000004_1 in 2 dimensions:
>>>   0-cells: 9 0
>>>   1-cells: 16 0
>>>   2-cells: 8 0
>>> Labels:
>>>   marker: 1 strata of sizes (16)
>>>   depth: 3 strata of sizes (9, 16, 8)
>>>
>>> Then distributing across 2 processors:
>>>
>>> DM Object:Parallel Mesh 2 MPI processes
>>>   type: plex
>>> Parallel Mesh in 2 dimensions:
>>>   0-cells: 6 6
>>>   1-cells: 9 9
>>>   2-cells: 4 4
>>> Labels:
>>>   marker: 1 strata of sizes (9)
>>>   depth: 3 strata of sizes (6, 9, 4)
>>>
>>> I have the option of uniformly refining the mesh here, but I choose not
>>> to for now. If my dofs are vertex-based, then the global size of my DM
>>> vector is 9, and the local sizes for ranks 0 and 1 are 3 and 6,
>>> respectively. However, if I choose to uninterpolate the mesh by calling
>>> DMPlexUninterpolate(...), I get this:
>>>
>>> DM Object: 2 MPI processes
>>>   type: plex
>>> DM_0x84000004_2 in 2 dimensions:
>>>   0-cells: 6 6
>>>   2-cells: 4 4
>>> Labels:
>>>   marker: 1 strata of sizes (5)
>>>   depth: 2 strata of sizes (6, 4)
>>>
>>> And the global size of my DM vector becomes 12, and the local size for
>>> both ranks is 6. It looks like the ghost points on rank 0 have been
>>> duplicated, which is not supposed to happen.
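>>>
>>> (For context, my vertex-based layout is essentially one dof per vertex,
>>> roughly like the sketch below; the names are placeholders and this is not
>>> my exact code.)
>>>
>>>   PetscSection   section;
>>>   Vec            gvec;
>>>   PetscInt       vStart, vEnd, p, gsize, lsize;
>>>   PetscErrorCode ierr;
>>>
>>>   ierr = DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd);CHKERRQ(ierr); /* vertices */
>>>   ierr = PetscSectionCreate(PetscObjectComm((PetscObject) dm), &section);CHKERRQ(ierr);
>>>   ierr = PetscSectionSetChart(section, vStart, vEnd);CHKERRQ(ierr);
>>>   for (p = vStart; p < vEnd; ++p) {
>>>     ierr = PetscSectionSetDof(section, p, 1);CHKERRQ(ierr); /* one dof per vertex */
>>>   }
>>>   ierr = PetscSectionSetUp(section);CHKERRQ(ierr);
>>>   ierr = DMSetDefaultSection(dm, section);CHKERRQ(ierr);
>>>   ierr = DMCreateGlobalVector(dm, &gvec);CHKERRQ(ierr);
>>>   ierr = VecGetSize(gvec, &gsize);CHKERRQ(ierr);      /* 9 before uninterpolating, 12 after */
>>>   ierr = VecGetLocalSize(gvec, &lsize);CHKERRQ(ierr); /* 3 and 6 before, 6 and 6 after */
>>>   ierr = VecDestroy(&gvec);CHKERRQ(ierr);
>>>   ierr = PetscSectionDestroy(&section);CHKERRQ(ierr);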
>>>
>>> Is there a way to capture the ghost point information when
>>> uninterpolating the DM?
>>>
>>> Thanks,
>>>
>>> --
>>> Justin Chang
>>> PhD Candidate, Civil Engineering - Computational Sciences
>>> University of Houston, Department of Civil and Environmental Engineering
>>> Houston, TX 77004
>>> (512) 963-3262
>>>
>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>
>
>
> --
> Justin Chang
> PhD Candidate, Civil Engineering - Computational Sciences
> University of Houston, Department of Civil and Environmental Engineering
> Houston, TX 77004
> (512) 963-3262
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener