process id in DA
Barry Smith
bsmith at mcs.anl.gov
Wed Nov 12 19:15:27 CST 2008
On Nov 12, 2008, at 6:39 PM, Sean Dettrick wrote:
>
> On Nov 12, 2008, at 4:24 PM, Shao-Ching Huang wrote:
>
>> Hi,
>>
>> Is there a PETSc call to obtain the "process coordinates" in a
>> process
>> grid, partitioned by a DA? (Something like what MPI_Cart_coords
>> does.)
>>
>> Thanks,
>>
>> Shao-Ching
>>
>
> A while ago (May 2006) I found that the PETSc DA orders its processes
> in the TRANSPOSE of the way MPI Cartesian communicators do, so you
> might have to map back and forth between the two Cartesian
> representations.
>
> Barry said that he might change the DA so that the two agree with
> each other, but I don't know if he ever did...
I got lazy and never did it. In the abstract, changing this is easy,
but getting everything right requires care. It would be nice if someone
did this.
Barry
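
(A minimal sketch, not from the original thread, of one way to answer the
question directly: the DA's process-grid coordinates can be computed from
the MPI rank, assuming the DA interface of this era (DAGetInfo) and the
x-fastest rank ordering of the DA described in the quoted exchange below.
The helper name DAGetProcCoords2d is illustrative.)

#include "petscda.h"

/* Sketch: recover this rank's coordinates in the process grid of a 2d DA.
   Assumes the DAGetInfo() calling sequence of the PETSc 2.3.x/3.0 series
   and the DA's x-fastest rank ordering (0 1 2 along x, then the next row
   in y), as described below. */
PetscErrorCode DAGetProcCoords2d(DA da, PetscInt *px, PetscInt *py)
{
  PetscErrorCode ierr;
  PetscInt       m;                 /* number of processes in x */
  MPI_Comm       comm;
  PetscMPIInt    rank;

  /* Only the process-grid size in x is needed; other outputs are skipped. */
  ierr = DAGetInfo(da, PETSC_NULL, PETSC_NULL, PETSC_NULL, PETSC_NULL,
                   &m, PETSC_NULL, PETSC_NULL, PETSC_NULL, PETSC_NULL,
                   PETSC_NULL, PETSC_NULL);CHKERRQ(ierr);
  ierr = PetscObjectGetComm((PetscObject)da, &comm);CHKERRQ(ierr);
  ierr = MPI_Comm_rank(comm, &rank);CHKERRQ(ierr);

  *px = rank % m;   /* x varies fastest across the DA's ranks */
  *py = rank / m;   /* 2d only; in 3d use (rank/m) % n and rank/(m*n) */
  return 0;
}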
>
>
> Does anybody know the status of that?
>
> In case it helps, below is part of our exchange on the matter.
>
> Sean
> ---------------------------
>
>
>
> OK, so it is laid out like a matrix, not like x, y coordinates.
>
> Will put the change to DA on the list of things to do.
>
>
> Barry
>
> On Thu, 18 May 2006, Sean Dettrick wrote:
>
>> Barry Smith wrote:
>>
>>> Bill, does the MPI standard dictate this decomposition or
>>> could different implementations do it the opposite way?
>>> Then we'd have to make the DA logic a bit more complicated.
>>
>> I don't have a copy of the standard, but to quote page 255 of "MPI:
>> The Complete Reference" by Snir et al.:
>> "Row-major numbering is always used for the processes in a
>> Cartesian structure."
>> Their diagram in Figure 6.1 matches my code's output for the coords
>> couplets (i,j):
>>
>>   0      1      2      3
>> (0,0)  (0,1)  (0,2)  (0,3)
>>
>>   4      5      6      7
>> (1,0)  (1,1)  (1,2)  (1,3)
>>
>>   8      9     10     11
>> (2,0)  (2,1)  (2,2)  (2,3)
>>
>>
>> By the way, I agree with you: I *should* be able to swap the x and y
>> myself. I just haven't had much luck in that regard yet.
>>
>> Sean
>>
>>> On Thu, 18 May 2006, Sean Dettrick wrote:
>>>> Barry Smith wrote:
>>>>> On Thu, 18 May 2006, Sean Dettrick wrote:
>>>>>> Hi Barry,
>>>>>> the order is determined by MPI_Cart_create.
>>>>>
>>>>> Do you mean that MPI_Cart_create() orders across the second (y) axis
>>>>> fastest and then the first (x) axis? Hmmm, maybe we should change the
>>>>> DA? Changing it once and for all (not supporting both) is probably
>>>>> not a big deal and shouldn't break much (I hope).
>>>> Hi Barry,
>>>> it depends: what do you call x and what do you call y?
>>>> MPI_Cart_coords returns a vector, coords; I tend to say x is
>>>> coords[0], y is coords[1], and z is coords[2]. For what it's
>>>> worth, there's a short program appended to this email, which produces:
>>>> rank = 0 has Cartesian coords = { 0, 0 }
>>>> rank = 1 has Cartesian coords = { 0, 1 }
>>>> rank = 2 has Cartesian coords = { 1, 0 }
>>>> rank = 3 has Cartesian coords = { 1, 1 }
>>>> rank = 0 has DA range x=[0,50) and y=[0,50)
>>>> rank = 1 has DA range x=[50,100) and y=[0,50)
>>>> rank = 2 has DA range x=[0,50) and y=[50,100)
>>>> rank = 3 has DA range x=[50,100) and y=[50,100)
>>>>
>>>>>>> I don't completely understand what goes wrong. Is it because YOUR
>>>>>>> application orders the processors, relative to the geometry, in the
>>>>>>> following way?
>>>>>>>
>>>>>>> ^ y direction
>>>>>>> |
>>>>>>> 2 5 8
>>>>>>> 1 4 7
>>>>>>> 0 3 6
>>>>>>>
>>>>>>> -> x direction
>>>>>>> Or is this something inherent in MPI_Cart_create?
>>>> For my interpretation of x and y, MPI_Cart_create produces the
>>>> above layout. But if I said x=coords[1] and y=coords[0], then it
>>>> would match the one below.
>>>>>>> PETSc does it so
>>>>>>>
>>>>>>> ^ y direction
>>>>>>> |
>>>>>>> 6 7 8
>>>>>>> 3 4 5
>>>>>>> 0 1 2
>>>>>>>
>>>>>>> -> x direction
>>>> Code and makefile attached ... hopefully within the message size
>>>> limit.
>>>> Just make cartcommtest.
>>>> Sean
>>
>>
>
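
(The output quoted above came from a short test program and makefile
attached to the original message; the attachment is not reproduced here.
Below is a rough sketch, not the original cartcommtest, of that kind of
side-by-side comparison: each rank prints its MPI Cartesian coordinates
and its DA ownership range. It assumes the PETSc 2.3.x/3.0-era DA
interface and the 100 x 100 global grid implied by the quoted ranges;
everything in it is illustrative.)

#include "petscda.h"

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  PetscMPIInt    rank;
  int            size, dims[2] = {0, 0}, periods[2] = {0, 0}, coords[2];
  MPI_Comm       cart;
  DA             da;
  PetscInt       xs, ys, xm, ym;

  ierr = PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);CHKERRQ(ierr);
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
  ierr = MPI_Comm_size(PETSC_COMM_WORLD, &size);CHKERRQ(ierr);

  /* MPI Cartesian communicator: row-major numbering, so the last
     coordinate (coords[1]) varies fastest as the rank increases. */
  ierr = MPI_Dims_create(size, 2, dims);CHKERRQ(ierr);
  ierr = MPI_Cart_create(PETSC_COMM_WORLD, 2, dims, periods, 0, &cart);CHKERRQ(ierr);
  ierr = MPI_Cart_coords(cart, rank, 2, coords);CHKERRQ(ierr);
  ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD,
           "rank = %d has Cartesian coords = { %d, %d }\n",
           rank, coords[0], coords[1]);CHKERRQ(ierr);

  /* DA decomposition: ranks are laid out with x varying fastest. */
  ierr = DACreate2d(PETSC_COMM_WORLD, DA_NONPERIODIC, DA_STENCIL_STAR,
                    100, 100, PETSC_DECIDE, PETSC_DECIDE, 1, 1,
                    PETSC_NULL, PETSC_NULL, &da);CHKERRQ(ierr);
  ierr = DAGetCorners(da, &xs, &ys, PETSC_NULL, &xm, &ym, PETSC_NULL);CHKERRQ(ierr);
  ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD,
           "rank = %d has DA range x=[%d,%d) and y=[%d,%d)\n",
           rank, (int)xs, (int)(xs + xm), (int)ys, (int)(ys + ym));CHKERRQ(ierr);
  ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD);CHKERRQ(ierr);

  ierr = MPI_Comm_free(&cart);CHKERRQ(ierr);
  ierr = DADestroy(da);CHKERRQ(ierr);
  ierr = PetscFinalize();CHKERRQ(ierr);
  return 0;
}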