[petsc-dev] Cell overlap in DMPlexDistribute

Michael Lange michael.lange at imperial.ac.uk
Tue Mar 4 09:16:41 CST 2014


Hi Matt,

Are there any updates on this?

Thanks,

Michael

On 23/02/14 20:50, Matthew Knepley wrote:
> On Fri, Feb 21, 2014 at 6:56 AM, Michael Lange
> <michael.lange at imperial.ac.uk> wrote:
>
>     Hi Matt,
>
>     Have you had a chance to think about the cell overlap problem? I'm
>     proposing to make the center dimension a general Plex attribute
>     with the current behaviour as the default. We can then generate
>     the cell overlap in DMPlexDistribute according to the center
>     dimension using a utility function for point adjacency, something
>     like DMPlexGetAdjacentPoints(dm, p).
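>
>     As a usage sketch (the helper does not exist yet, so the output
>     arguments below are only a guess at what the final signature could
>     look like, and dm/ierr are assumed to be in scope), the overlap
>     computation in DMPlexDistribute would then be driven by a loop of
>     the form:
>
>         PetscInt  cStart, cEnd, c, adjSize;
>         PetscInt *adj = NULL;
>
>         ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr);
>         for (c = cStart; c < cEnd; ++c) {
>           /* Points adjacent to cell c; whether "adjacent" means sharing a
>              vertex or sharing a face would follow the center dimension */
>           ierr = DMPlexGetAdjacentPoints(dm, c, &adjSize, &adj);CHKERRQ(ierr);
>           /* ... add adj[0..adjSize-1] to the overlap sent along with c ... */
>         }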
>
>     I'm happy to provide a pull request for this change; just tell me
>     if you agree with this approach or if you see any problems.
>
>
> Yes, you are correct in this. We are having another discussion on
> petsc-maint about the kinds of partitions that are appropriate for
> different problems. I will try to get through this as quickly as I can,
> and I will solicit your feedback once the implementation plan is done.
>
>   Thanks,
>
>      Matt
>
>     Thanks and kind regards,
>
>     Michael
>
>
>
>     On 03/02/14 11:32, Michael Lange wrote:
>>     Hi Matt,
>>
>>     On 31/01/14 05:11, Matthew Knepley wrote:
>>>     On Tue, Jan 28, 2014 at 8:57 AM, Michael Lange
>>>     <michael.lange at imperial.ac.uk> wrote:
>>>
>>>         Hi,
>>>
>>>         I noticed that the cell overlap created during
>>>         DMPlexDistribute does not include cells that only share a
>>>         vertex but no edge with an owned cell. This causes problems
>>>         when performing local assembly (MAT_IGNORE_OFF_PROC_ENTRIES)
>>>         based on information from the plex, because one contribution
>>>         to the shared vertex is missing.
>>>
>>>         As an example, consider the 2x2 square with 8 cells
>>>         (attached). When partitioned across two ranks with overlap=1,
>>>         each rank owns 4 cells and, in the current version, knows
>>>         about 2 halo cells, giving a total of 6 cells. The central
>>>         vertex, however, touches 3 owned and 3 non-owned cells, one
>>>         of which doesn't share an edge with any owned cell. So, in
>>>         order to correctly assemble the central vertex locally, the
>>>         rank owning it needs to know about 7 cells in total.
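>>>
>>>         This is easy to check on the owning rank by counting the cells
>>>         in the star of the central vertex (v below stands for its local
>>>         point number, and dm/ierr are assumed to be in scope); with the
>>>         current overlap the count comes out short of the 6 cells that
>>>         actually touch it:
>>>
>>>             PetscInt *star = NULL;
>>>             PetscInt  starSize, s, cStart, cEnd, nCells = 0;
>>>
>>>             ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr);
>>>             ierr = DMPlexGetTransitiveClosure(dm, v, PETSC_FALSE, &starSize, &star);CHKERRQ(ierr);
>>>             for (s = 0; s < starSize*2; s += 2) {
>>>               /* the star also contains edges and faces, so keep only cells */
>>>               if (star[s] >= cStart && star[s] < cEnd) ++nCells;
>>>             }
>>>             ierr = DMPlexRestoreTransitiveClosure(dm, v, PETSC_FALSE, &starSize, &star);CHKERRQ(ierr);
>>>             ierr = PetscPrintf(PETSC_COMM_SELF, "cells around central vertex: %D\n", nCells);CHKERRQ(ierr);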
>>>
>>>         I have attached a patch that fixes this problem by going
>>>         over the inverse closure of all vertices associated with a
>>>         given cell instead of using the provided edge graph. Please
>>>         tell me what you think and whether there might be an easier
>>>         way to fix this.
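>>>
>>>         In terms of the existing Plex API, the vertex-based cell
>>>         adjacency the patch computes boils down to something like the
>>>         following (only a sketch of the idea, not the patch itself;
>>>         p is a local cell, dm/ierr are assumed to be in scope, and
>>>         DMPlexGetTransitiveClosure with useCone = PETSC_FALSE yields
>>>         the star):
>>>
>>>             PetscInt *closure = NULL, *star = NULL;
>>>             PetscInt  closureSize, starSize, cl, st;
>>>             PetscInt  vStart, vEnd, cStart, cEnd;
>>>
>>>             ierr = DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd);CHKERRQ(ierr);
>>>             ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr);
>>>             ierr = DMPlexGetTransitiveClosure(dm, p, PETSC_TRUE, &closureSize, &closure);CHKERRQ(ierr);
>>>             for (cl = 0; cl < closureSize*2; cl += 2) {
>>>               const PetscInt v = closure[cl];
>>>               if (v < vStart || v >= vEnd) continue;  /* keep only vertices */
>>>               ierr = DMPlexGetTransitiveClosure(dm, v, PETSC_FALSE, &starSize, &star);CHKERRQ(ierr);
>>>               for (st = 0; st < starSize*2; st += 2) {
>>>                 const PetscInt q = star[st];
>>>                 if (q >= cStart && q < cEnd) {
>>>                   /* q shares at least a vertex with p: add it to the overlap */
>>>                 }
>>>               }
>>>               ierr = DMPlexRestoreTransitiveClosure(dm, v, PETSC_FALSE, &starSize, &star);CHKERRQ(ierr);
>>>             }
>>>             ierr = DMPlexRestoreTransitiveClosure(dm, p, PETSC_TRUE, &closureSize, &closure);CHKERRQ(ierr);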
>>>
>>>
>>>     This is true, but unfortunately not what everyone wants. FVM
>>>     people want just what is there now. The same choice comes up in
>>>     preallocation. There I have put in the "center dimension", which
>>>     says what kind of point you use for the center: a vertex or a
>>>     face. I guess we need that here as well.
>>     Yes, I think the center dimension is exactly what we need here,
>>     because my proposed fix is to switch from using star(cone(p)) to
>>     star(closure(p)). So, do you want to make the center dimension a
>>     general Plex attribute and move the Set/Get functions to plex.c?
>>     If so, what would be the default? In that case, a
>>     DMPlexGetAdjacentPoints(dm, p) function might also be helpful if
>>     this adjacency is used in several places. I'm happy to provide a
>>     pull request; just tell me how you want this to be structured.
>>
>>     Thanks
>>     Michael
>
>
>
>
> -- 
> What most experimenters take for granted before they begin their 
> experiments is infinitely more interesting than any results to which 
> their experiments lead.
> -- Norbert Wiener
