[petsc-users] sequential partitioning?

Dominik Szczerba dominik at itis.ethz.ch
Sat Oct 22 10:00:34 CDT 2011


Many thanks, Barry, you saved my Saturday afternoon...!
(so I can directly proceed to the valgrind issue reported separately... :))

Dominik

On Sat, Oct 22, 2011 at 4:22 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
>  Both MatCreateMPIAdj() and MatPartitioningCreate() need to take PETSC_COMM_SELF so that they are sequential.
>
>   Barry
>
> On Oct 22, 2011, at 9:18 AM, Dominik Szczerba wrote:
>
>> Thanks, I had no way of knowing that. Is the same true for MatPartitioningCreate?
>> Which calls should be run collectively and which only on the master rank? My
>> code is along these lines:
>>
>> MatCreateMPIAdj
>> MatPartitioningCreate
>> MatPartitioningSetAdjacency
>> MatPartitioningSetNParts
>> MatPartitioningApply
>>
>> Many thanks!
>> Dominik
>>
>> On Sat, Oct 22, 2011 at 4:05 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>>
>>>   You need to create the MPIAdj matrix with PETSC_COMM_SELF, not PETSC_COMM_WORLD.
>>>
>>>   Barry
>>>
>>> On Oct 22, 2011, at 8:45 AM, Dominik Szczerba wrote:
>>>
>>>> I cannot find any guidance on this subject in the documentation.
>>>> I built PETSc with Chaco and Party to attempt sequential partitioning
>>>> where ParMETIS fails.
>>>> However, I get the error:
>>>>
>>>> [0]PETSC ERROR: No support for this operation for this object type!
>>>> [0]PETSC ERROR: Distributed matrix format MPIAdj is not supported for
>>>> sequential partitioners!
>>>>
>>>>
>>>> I do not seem to find a sequential equivalent of MatCreateMPIAdj...
>>>> Are there any examples of how to perform partitioning sequentially?
>>>> My mesh/graph is located entirely on the master rank.
>>>>
>>>> Thanks a lot,
>>>> Dominik
>>>
>>>
>
>
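
For reference, a minimal self-contained sketch of the sequential setup Barry
describes, with both the adjacency matrix and the partitioner created on
PETSC_COMM_SELF. The 4-vertex ring graph, the choice of Chaco, and the use of
modern-PETSc calls (PetscMalloc1, pointer-taking destroy routines) are
illustrative assumptions, not part of the original thread; error checking is
omitted for brevity.

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat             adj;
  MatPartitioning part;
  IS              is;
  PetscInt       *ia, *ja, k;
  /* CSR adjacency of a 4-vertex ring 0-1-2-3-0 (illustrative graph) */
  const PetscInt  ia0[] = {0, 2, 4, 6, 8};
  const PetscInt  ja0[] = {1, 3, 0, 2, 1, 3, 0, 2};

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* MatCreateMPIAdj takes ownership of ia/ja and frees them when the
     matrix is destroyed, so they must come from PetscMalloc, not the
     stack. */
  PetscMalloc1(5, &ia);
  PetscMalloc1(8, &ja);
  for (k = 0; k < 5; k++) ia[k] = ia0[k];
  for (k = 0; k < 8; k++) ja[k] = ja0[k];

  /* Both the adjacency matrix and the partitioner live on
     PETSC_COMM_SELF, which makes the whole partitioning sequential. */
  MatCreateMPIAdj(PETSC_COMM_SELF, 4, 4, ia, ja, NULL, &adj);

  MatPartitioningCreate(PETSC_COMM_SELF, &part);
  MatPartitioningSetAdjacency(part, adj);
  MatPartitioningSetNParts(part, 2);
  MatPartitioningSetType(part, MATPARTITIONINGCHACO); /* or Party */
  MatPartitioningApply(part, &is);   /* is[v] = part assigned to vertex v */

  ISView(is, PETSC_VIEWER_STDOUT_SELF);

  ISDestroy(&is);
  MatPartitioningDestroy(&part);
  MatDestroy(&adj);
  PetscFinalize();
  return 0;
}

Replacing MatPartitioningSetType() with MatPartitioningSetFromOptions() would
allow choosing the partitioner at runtime via -mat_partitioning_type chaco
(or party).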

