[petsc-users] ParMETIS question

Matthew Knepley knepley at gmail.com
Wed Dec 22 18:43:30 CST 2010


On Wed, Dec 22, 2010 at 6:01 AM, Thomas Witkowski <
thomas.witkowski at tu-dresden.de> wrote:

> So, I found the problem related to empty partitions. It is not possible to
> weight vertices (i.e., elements of a mesh) in such a way that one weight is
> much higher than the others. For more details see
>
> http://glaros.dtc.umn.edu/flyspray/task/11
>
> It's a pity that ParMetis makes it very hard to find this kind of error.
>
> The open question for me is about the non-contiguous partitions. Is it
> normal behavior of ParMetis to create partitions that are not contiguous?


Yes, this is normal.

   Matt
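
Non-contiguous partitions can be detected after the fact with a simple
connectivity check on the element (dual) graph. Below is a minimal sketch,
assuming the graph has been gathered onto one rank in the same CSR form
(xadj/adjncy) that ParMETIS uses, together with the partition array it
returns; the function and variable names are only illustrative.

/* Sketch: count the connected components of each partition in the element
 * (dual) graph.  Assumes a serial CSR graph over all n elements plus the
 * partition array part[] returned by ParMETIS; names are illustrative. */
#include <stdio.h>
#include <stdlib.h>

static int count_components(int n, const int *xadj, const int *adjncy,
                            const int *part, int p)
{
  int *seen  = calloc(n, sizeof(int));
  int *stack = malloc(n * sizeof(int));
  int ncomp = 0;

  for (int s = 0; s < n; s++) {
    if (part[s] != p || seen[s]) continue;
    ncomp++;                          /* a new component of partition p */
    int top = 0;
    stack[top++] = s; seen[s] = 1;
    while (top > 0) {                 /* DFS restricted to partition p */
      int v = stack[--top];
      for (int j = xadj[v]; j < xadj[v + 1]; j++) {
        int w = adjncy[j];
        if (part[w] == p && !seen[w]) { seen[w] = 1; stack[top++] = w; }
      }
    }
  }
  free(seen); free(stack);
  return ncomp;
}

void report_partition_connectivity(int n, const int *xadj, const int *adjncy,
                                   const int *part, int nparts)
{
  for (int p = 0; p < nparts; p++) {
    int c = count_components(n, xadj, adjncy, part, p);
    if (c == 0)     printf("partition %d is empty\n", p);
    else if (c > 1) printf("partition %d consists of %d components\n", p, c);
  }
}

A partition is contiguous exactly when this reports a single component for it;
empty partitions also show up directly.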


>
> Thomas
>
>
> Thomas Witkowski wrote:
>
>> Okay, in my computations I have empty partitions on some ranks and
>> definitely not minimal boundary sizes. So maybe I generate wrong input.
>> But if this is the case, I wonder why the resulting mesh partitioning is
>> quite good. If I neglect the problem of empty partitions, the
>> redistributed mesh leads to very good load balancing. Is there any
>> meaningful way to debug the problem? Is there something like a "verbose
>> mode" in ParMetis that tells me what happens with the input data?
>> Otherwise I have to print all the input data to the screen and check it
>> by hand. Even though I have a quite small example, with 128 coarse mesh
>> elements overall on 8 ranks, this is not much fun :)
>>
>> Thomas
>>
>> @Matthew: By mistake I answered your mail directly to you and not to
>> the mailing list, so I am sending it here again.
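
Since ParMETIS has no switch that echoes its input, one pragmatic alternative
to checking everything on screen is to dump the distributed CSR arrays rank by
rank before the call and inspect (or replay) them offline, which is perfectly
feasible for a 128-element coarse mesh. A minimal sketch, assuming the usual
vtxdist/xadj/adjncy/vwgt arrays with idxtype mapped to int and an illustrative
file-naming scheme:

/* Sketch: write one rank's ParMETIS input to a text file so it can be
 * checked by hand or fed to a small serial test driver.  Array names follow
 * the ParMETIS manual; the file naming is only an example. */
#include <stdio.h>
#include <mpi.h>

void dump_parmetis_input(const int *vtxdist, const int *xadj, const int *adjncy,
                         const int *vwgt, int ncon, MPI_Comm comm)
{
  int rank, size;
  MPI_Comm_rank(comm, &rank);
  MPI_Comm_size(comm, &size);

  char fname[64];
  snprintf(fname, sizeof(fname), "parmetis_input_rank%04d.txt", rank);
  FILE *f = fopen(fname, "w");
  if (!f) return;

  fprintf(f, "vtxdist:");
  for (int i = 0; i <= size; i++) fprintf(f, " %d", vtxdist[i]);
  fprintf(f, "\n");

  int nlocal = vtxdist[rank + 1] - vtxdist[rank];   /* local elements */
  for (int i = 0; i < nlocal; i++) {
    fprintf(f, "element %d  weights:", vtxdist[rank] + i);
    for (int c = 0; c < ncon; c++) fprintf(f, " %d", vwgt ? vwgt[i * ncon + c] : 1);
    fprintf(f, "  adj:");
    for (int j = xadj[i]; j < xadj[i + 1]; j++) fprintf(f, " %d", adjncy[j]);
    fprintf(f, "\n");
  }
  fclose(f);
}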
>>
>> Matthew Knepley wrote:
>>
>>> On Tue, Dec 21, 2010 at 5:49 AM, Thomas Witkowski <
>>> thomas.witkowski at tu-dresden.de> wrote:
>>>
>>>    Hi,
>>>
>>>    I have a question that is not directly PETSc related, but I hope to
>>>    get some answers from the community here. In my FEM code, I make use
>>>    of ParMETIS to partition the mesh. I use this library directly and
>>>    not through PETSc's ParMETIS integration. The initial partition is
>>>    always fine, but I use the ParMETIS_V3_AdaptiveRepart function to
>>>    repartition the mesh after local mesh adaption. In most cases the
>>>    result is fine, but there are two points I have trouble with:
>>>
>>>    1) Sometimes ParMETIS generates empty partitions, i.e., a
>>>    processor has zero mesh elements. This is something my code cannot
>>>    handle. Is this a bug or a feature? If it is a feature, is there
>>>    any possibility to disable it?
>>>
>>>
>>> ParMetis has a balance constraint if you weight the vertices. This will
>>> enforce equal-size partitions.
>>>
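
For reference, a minimal sketch of such a weighted call, assuming the
ParMETIS 3.x interface of ParMETIS_V3_AdaptiveRepart as described in the
manual (idxtype vertex weights, float target fractions); the exact prototype,
the 1.05 imbalance tolerance, and the ipc2redist value should be checked
against your parmetis.h and tuned for your problem:

/* Sketch: adaptive repartitioning with one weight per vertex and a 5%
 * imbalance tolerance.  Follows the ParMETIS 3.x manual; check the exact
 * prototype in your parmetis.h before using. */
#include <parmetis.h>
#include <stdlib.h>

void repartition_weighted(idxtype *vtxdist, idxtype *xadj, idxtype *adjncy,
                          idxtype *vwgt, idxtype *part, int nparts,
                          MPI_Comm comm)
{
  int   wgtflag    = 2;       /* vertex weights only, no edge weights */
  int   numflag    = 0;       /* C-style (0-based) numbering */
  int   ncon       = 1;       /* one balance constraint per vertex */
  float ipc2redist = 1000.0f; /* commonly used weighting of edge cut vs. migration */
  int   options[3] = {0, 0, 0}; /* defaults; options[1]/[2] give dbglvl/seed if options[0]=1 */
  int   edgecut;

  float  ubvec[1] = {1.05f};  /* allow 5% imbalance on the single constraint */
  float *tpwgts   = malloc(ncon * nparts * sizeof(float));
  for (int i = 0; i < ncon * nparts; i++)
    tpwgts[i] = 1.0f / nparts;        /* equal target size for every partition */

  ParMETIS_V3_AdaptiveRepart(vtxdist, xadj, adjncy, vwgt,
                             NULL /* vsize: uniform redistribution cost */,
                             NULL /* adjwgt: no edge weights */,
                             &wgtflag, &numflag, &ncon, &nparts,
                             tpwgts, ubvec, &ipc2redist,
                             options, &edgecut, part, &comm);
  free(tpwgts);
}

As the flyspray report linked at the top of the thread suggests, keeping the
individual vertex weights within a moderate range (rather than making one
weight much larger than all others) is also needed for this balance constraint
to avoid empty partitions.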
>>>    2) In most cases the resulting partitions are not connected. Assuming
>>>    I pass all data to ParMETIS correctly, is this okay? My code can
>>>    handle it, but it slows down the computation due to larger interior
>>>    boundaries and therefore more communication.
>>>
>>>
>>> ParMetis minimizes the overall boundary size, so I do not understand how
>>> you could see this slowdown.
>>>
>>>   Matt
>>>
>>>    Does any of you know an answer to these questions? Is there a
>>>    debug mode in ParMETIS where I can see which data is passed to its
>>>    function calls?
>>>
>>>    Regards,
>>>
>>>    Thomas
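
As for the debug-mode question: the V3 routines take an options array which,
according to the ParMETIS manual, selects a debug level and a random seed once
options[0] is set to 1; the individual debug-level bits are listed in
parmetis.h. This only makes ParMETIS report on its own progress; it does not
echo the input arrays (for that, see the dump sketch earlier in the thread).
A minimal sketch under that assumption:

/* Sketch: fill the options array of the ParMETIS V3 routines so that they
 * print extra information.  options[0]=1 enables the user-supplied values,
 * options[1] is the debug level, options[2] the random seed (3.x convention);
 * the available debug-level bits are defined in parmetis.h. */
static void enable_parmetis_debug(int options[3])
{
  options[0] = 1;   /* use the values below instead of the defaults */
  options[1] = 1;   /* debug level; 1 typically enables timing output */
  options[2] = 15;  /* random-number seed used by the V3 routines */
}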
>>>
>>>
>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>
>>
>>
>>
>


-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener