MPI-layout of PETSc

Barry Smith bsmith at mcs.anl.gov
Fri Jul 3 10:56:20 CDT 2009


    In designing the PETSc DA I did not (out of ignorance) follow the  
layout approach of the MPI Cartesian MPI_Cart_create (which places the  
first cpus along the y-direction).
I had it place the first cpus along the x-direction.

    What you need to do is create a new communicator that reorders the  
processors so that, when used by the PETSc DA, they are laid out in the  
ordering that matches the other code. You will need to read up on the  
MPI_Cart routines.

    To change PETSC_COMM_WORLD you simply set PETSC_COMM_WORLD =  
yournewcomm BEFORE calling PetscInitialize().
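
    For example, here is a minimal sketch of the idea (untested; it uses
MPI_Comm_split instead of the MPI_Cart routines, and the 3 x 2 process
grid and the index arithmetic are only assumptions chosen to match the
layout pictures quoted below):

#include <mpi.h>
#include <petsc.h>

int main(int argc, char **argv)
{
  MPI_Comm newcomm;
  int      rank;
  int      px = 3, py = 2;        /* assumed 3 x 2 process grid */

  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);

  /* The other code numbers consecutive ranks first along y, so find
     where this rank sits in the process grid ...                   */
  int ix = rank / py;             /* x (column) index in the grid */
  int iy = rank % py;             /* y (row) index in the grid    */
  /* ... and the rank the PETSc DA expects at that position
     (the DA numbers consecutive ranks first along x).      */
  int newrank = iy * px + ix;

  /* One group for everyone (color 0); the key reorders the ranks. */
  MPI_Comm_split(MPI_COMM_WORLD, 0, newrank, &newcomm);

  PETSC_COMM_WORLD = newcomm;     /* must be set BEFORE PetscInitialize() */
  PetscInitialize(&argc, &argv, NULL, NULL);

  /* ... create the DA on PETSC_COMM_WORLD and run as before ... */

  PetscFinalize();
  MPI_Comm_free(&newcomm);
  MPI_Finalize();
  return 0;
}

    Since you call MPI_Init() yourself here, you are also responsible for
calling MPI_Finalize() after PetscFinalize().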

    Barry

On Jul 3, 2009, at 3:52 AM, Rolf Kuiper wrote:

> Hi,
>
> On Jun 30, 2009, at 02:24, Barry Smith wrote:
>>
>> On Jun 29, 2009, at 7:07 PM, Rolf Kuiper wrote:
>>
>>> Hi PETSc users,
>>>
>>> I ran into trouble in combining my developed PETSc application  
>>> with another code (based on another library called "ArrayLib").
>>> The problem is the parallel layout for MPI; e.g. in 2D with 6 cpus  
>>> the ArrayLib code assigns the names/ranks of the local cpus first in  
>>> the y-direction, then in x (from last to first, in the same way  
>>> MPI arrays are indexed, like 3Darray[z][y][x]):
>>>
>>> y
>>> ^
>>> | 2-4-6
>>> | 1-3-5
>>> |--------> x
>>>
>>> If I call DACreate() from PETSc, it will assume an ordering  
>>> with the names/ranks set first in the x-direction, then in y:
>>>
>>> y
>>> ^
>>> | 4-5-6
>>> | 1-2-3
>>> |--------> x
>>>
>>> Of course, if I now communicate the boundary values, I mix up the  
>>> domain (built by the other program).
>>>
>>> Is there a possibility / a flag to set the naming of the ranks?
>>> Because my application is written and working in curvilinear  
>>> coordinates and not in Cartesian ones, I cannot just switch  
>>> the directions.
>>
>>  What we recommend in this case is to just change the meaning of x,  
>> y, and z when you use the PETSc DA.  This does mean changing your  
>> code that uses the PETSc DA.
>
> The code is used as a module by many codes, so I would prefer not to  
> change the code (or the meaning of the directions; that's not user- 
> friendly), but 'just' change the communicator.
>
>> I do not understand why curvilinear coordinates have anything to do  
>> with it. Another choice is to create a new MPI communicator that  
>> has the different ordering of the ranks of the processors and then  
>> use that comm to create the PETSc DA objects; then you would not  
>> need to change your code that calls PETSc.
>
> I tried some time ago to use the PetscSetCommWorld() routine, but  
> I can't find it anymore; how can I set a new communicator in PETSc 3.0?
> The communicator I want to use is MPI_COMM_WORLD, which follows  
> the first ordering described above.
> Now I read that MPI_COMM_WORLD is the default communicator for  
> PETSc. But why is the ordering then different?
>
> Sorry for all these questions, but (as you can see) I really don't  
> understand this comm problem at the moment.
> Thanks for all,
> Rolf
>
>>  Unfortunately PETSc doesn't have any way to flip how the DA  
>> handles the layout automatically.
>>
>>   Barry
>>
>>>
>>> Thanks a lot for your help,
>>> Rolf
>>
>
