MPI-layout of PETSc
Rolf Kuiper
kuiper at mpia-hd.mpg.de
Mon Jun 29 19:07:35 CDT 2009
Hi PETSc users,
I ran into trouble combining the PETSc application I developed with
another code (based on a library called "ArrayLib").
The problem is the parallel MPI layout: e.g. in 2D with 6 CPUs, the
ArrayLib code assigns the names/ranks of the local CPUs first in the y-
direction, then in x (from last to first, in the same way array
indices are ordered, like 3Darray[z][y][x]):
y
^
| 2-4-6
| 1-3-5
|--------> x
If I call DACreate() from PETSc, it assumes an ordering in which the
names/ranks increase first in the x-direction, then in y:
y
^
| 4-5-6
| 1-2-3
|--------> x
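Just to illustrate what I mean with a minimal standalone example
(the 30x20 grid and the 3x2 process grid are only placeholder values);
printing the owned index ranges should show the x-first layout
sketched above:

  #include "petscda.h"

  int main(int argc, char **argv)
  {
    DA          da;
    PetscInt    xs, ys, xm, ym;
    PetscMPIInt rank;

    PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);
    MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

    /* 30x20 global grid on a 3x2 process grid, example values only */
    DACreate2d(PETSC_COMM_WORLD, DA_NONPERIODIC, DA_STENCIL_BOX,
               30, 20, 3, 2, 1, 1, PETSC_NULL, PETSC_NULL, &da);

    /* print which part of the global grid each rank owns */
    DAGetCorners(da, &xs, &ys, PETSC_NULL, &xm, &ym, PETSC_NULL);
    PetscSynchronizedPrintf(PETSC_COMM_WORLD,
        "[%d] x = %D..%D, y = %D..%D\n", rank, xs, xs+xm-1, ys, ys+ym-1);
    PetscSynchronizedFlush(PETSC_COMM_WORLD);

    DADestroy(da);
    PetscFinalize();
    return 0;
  }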
Of course, if I now communicate the boundary values, I mix up the
domain (built by the other program).
Is there a possibility / a flag to control the naming/ordering of the ranks?
Because my application is written and working in curvilinear
coordinates rather than Cartesian ones, I cannot simply swap the
directions.
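The only workaround I can see so far is to permute the ranks myself
before creating the DA, e.g. with MPI_Comm_split, and to pass the
resulting communicator to DACreate2d. A minimal sketch of what I mean
(px, py and the grid sizes are example values; this is not an existing
PETSc option, just my attempt):

  #include "petscda.h"

  int main(int argc, char **argv)
  {
    MPI_Comm    layoutcomm;
    DA          da;
    PetscMPIInt rank, col, row;
    PetscInt    px = 3, py = 2;   /* processes in x and y, example values */

    PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);
    MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

    /* position of this rank in the ArrayLib layout (ranks increase in y first) */
    col = rank / py;
    row = rank % py;

    /* reorder the ranks so that they increase first in x, as the DA expects;
       MPI_Comm_split sorts the new ranks by the key row*px + col             */
    MPI_Comm_split(PETSC_COMM_WORLD, 0, row*px + col, &layoutcomm);

    /* create the DA on the permuted communicator instead of PETSC_COMM_WORLD */
    DACreate2d(layoutcomm, DA_NONPERIODIC, DA_STENCIL_BOX,
               30, 20, px, py, 1, 1, PETSC_NULL, PETSC_NULL, &da);

    DADestroy(da);
    MPI_Comm_free(&layoutcomm);
    PetscFinalize();
    return 0;
  }

But I am not sure whether this is the intended way to do it, or whether
it has side effects elsewhere in PETSc.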
Thanks a lot for your help,
Rolf