MPI-layout of PETSc
Barry Smith
bsmith at mcs.anl.gov
Mon Jun 29 19:24:21 CDT 2009
On Jun 29, 2009, at 7:07 PM, Rolf Kuiper wrote:
> Hi PETSc users,
>
> I ran into trouble in combining my developed PETSc application with
> another code (based on another library called "ArrayLib").
> The problem is the parallel layout for MPI: e.g. in 2D with 6 CPUs
> the ArrayLib code numbers the ranks of the local CPUs first in the y-
> direction, then in x (from last to first, in the same way the MPI
> arrays are indexed, like 3Darray[z][y][x]):
>
> y
> ^
> | 2-4-6
> | 1-3-5
> |--------> x
>
> If I call DACreate() from PETSc, it assumes an ordering in which
> the ranks are numbered first in the x-direction, then in y:
>
> y
> ^
> | 4-5-6
> | 1-2-3
> |--------> x
>
> Of course, if I now communicate the boundary values, I mix up the
> domain (built by the other program).
>
> Is there a possibility / a flag to set the ordering of the ranks?
> Because my application is written and working in curvilinear
> coordinates and not in Cartesian ones, I cannot just switch the
> directions.
What we recommend in this case is to just change the meaning of x,
y, and z when you use the PETSc DA. This does mean changing your code
that uses the PETSc DA.
I do not understand why curvilinear coordinates have anything to do
with it. Another choice is to create a new MPI communicator that has
the desired ordering of the processor ranks and then use that comm to
create the PETSc DA objects; then you would not need to change your
code that calls PETSc.
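   For example, here is a rough (untested) sketch of that approach for
the 2d layout above. It assumes 6 processes in a 3 x 2 grid (Px = 3,
Py = 2), that a process's ArrayLib rank equals its rank in
PETSC_COMM_WORLD (0-based, numbered first in y, then in x), and the
DACreate2d() calling sequence; the 128 x 128 global grid size is just a
placeholder. MPI_Comm_split() with a single color and a permuted key
produces a communicator whose rank numbering matches what the DA
expects; note that m = Px and n = Py must be passed explicitly so the
process grid shape matches.

#include "petscda.h"

int main(int argc, char **argv)
{
  DA       da;
  MPI_Comm newcomm;
  int      rank, Px = 3, Py = 2;   /* process grid: 3 in x, 2 in y */
  int      px, py, key;

  PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

  /* position of this process in the ArrayLib layout
     (ranks run first in y, then in x) */
  py = rank % Py;
  px = rank / Py;

  /* rank this process should have in the ordering the DA expects
     (first in x, then in y) */
  key = py * Px + px;

  /* same color everywhere; the key just permutes the rank numbering */
  MPI_Comm_split(PETSC_COMM_WORLD, 0, key, &newcomm);

  /* create the DA on the reordered communicator, so the subdomain at
     (px,py) lands on the same physical process ArrayLib puts there */
  DACreate2d(newcomm, DA_NONPERIODIC, DA_STENCIL_STAR,
             128, 128, Px, Py, 1, 1, PETSC_NULL, PETSC_NULL, &da);

  /* ... use the DA as usual ... */

  DADestroy(da);
  MPI_Comm_free(&newcomm);
  PetscFinalize();
  return 0;
}

Any other PETSc objects that interact with the DA's vectors and
matrices (KSP, etc.) should then also be created on newcomm.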
   Unfortunately, PETSc does not have a flag to make the DA use a
different rank ordering automatically.
Barry
>
> Thanks a lot for your help,
> Rolf