[petsc-users] sequential partitioning?

Barry Smith bsmith at mcs.anl.gov
Sat Oct 22 09:22:47 CDT 2011


  Both MatCreateMPIAdj and MatPartitioningCreate need to take PETSC_COMM_SELF so that they are sequential. 

   Barry
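
[For reference, a minimal sketch of the sequence Dominik lists below, with everything on PETSC_COMM_SELF as Barry advises. The 4-vertex path graph, the choice of the Chaco partitioner, and the 2-way split are illustrative assumptions, not from the thread; MatCreateMPIAdj takes ownership of the ia/ja arrays, so they are allocated with PetscMalloc. Untested sketch; run on a single rank.]

```c
/* Sketch (assumed example, not from the thread): sequential graph
 * partitioning with all PETSc objects created on PETSC_COMM_SELF,
 * so a sequential partitioner such as Chaco can be used. */
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat             adj;
  MatPartitioning part;
  IS              is;
  PetscInt       *ia, *ja;
  PetscErrorCode  ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);

  /* 4-vertex path graph 0-1-2-3 in CSR form. MatCreateMPIAdj takes
   * ownership of ia and ja and frees them itself, so they must be
   * obtained from PetscMalloc, not stack arrays. */
  ierr = PetscMalloc(5*sizeof(PetscInt), &ia);CHKERRQ(ierr);
  ierr = PetscMalloc(6*sizeof(PetscInt), &ja);CHKERRQ(ierr);
  ia[0]=0; ia[1]=1; ia[2]=3; ia[3]=5; ia[4]=6;
  ja[0]=1; ja[1]=0; ja[2]=2; ja[3]=1; ja[4]=3; ja[5]=2;

  /* Sequential communicator here, NOT PETSC_COMM_WORLD. */
  ierr = MatCreateMPIAdj(PETSC_COMM_SELF, 4, 4, ia, ja, NULL, &adj);CHKERRQ(ierr);

  ierr = MatPartitioningCreate(PETSC_COMM_SELF, &part);CHKERRQ(ierr);
  ierr = MatPartitioningSetAdjacency(part, adj);CHKERRQ(ierr);
  ierr = MatPartitioningSetType(part, MATPARTITIONINGCHACO);CHKERRQ(ierr);
  ierr = MatPartitioningSetNParts(part, 2);CHKERRQ(ierr);
  ierr = MatPartitioningApply(part, &is);CHKERRQ(ierr);

  /* The IS maps each vertex to its assigned partition. */
  ierr = ISView(is, PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr);

  ierr = ISDestroy(&is);CHKERRQ(ierr);
  ierr = MatPartitioningDestroy(&part);CHKERRQ(ierr);
  ierr = MatDestroy(&adj);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}
```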

On Oct 22, 2011, at 9:18 AM, Dominik Szczerba wrote:

> Thanks, I had no way of knowing this. Is the same true for MatPartitioningCreate?
> Which calls should be run collectively and which only on the master rank? My
> code is along these lines:
> 
> MatCreateMPIAdj
> MatPartitioningCreate
> MatPartitioningSetAdjacency
> MatPartitioningSetNParts
> MatPartitioningApply
> 
> Many thanks!
> Dominik
> 
> On Sat, Oct 22, 2011 at 4:05 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>> 
>>   You need to create the MPIAdj matrix with PETSC_COMM_SELF, not PETSC_COMM_WORLD.
>> 
>>   Barry
>> 
>> On Oct 22, 2011, at 8:45 AM, Dominik Szczerba wrote:
>> 
>>> I do not seem to find any guidance on this subject in the documentation.
>>> I built PETSc with Chaco and Party to attempt sequential partitioning,
>>> in cases where ParMETIS fails.
>>> However, I get the error:
>>> 
>>> [0]PETSC ERROR: No support for this operation for this object type!
>>> [0]PETSC ERROR: Distributed matrix format MPIAdj is not supported for
>>> sequential partitioners!
>>> 
>>> 
>>> I do not seem to find a sequential equivalent of MatCreateMPIAdj...
>>> Are there any examples of how to perform partitioning sequentially?
>>> My mesh/graph is located entirely on the master rank.
>>> 
>>> Thanks a lot,
>>> Dominik
>> 
>> 


