[petsc-users] Mapping between application ordering and Petsc ordering

Mani Chandra mc0710 at gmail.com
Sun Aug 9 16:57:15 CDT 2015


Thank you! This was *exactly* what I was looking for. It fixed the problem.


On Sun, Aug 9, 2015 at 2:31 PM, Fabian <Fabian.Jakub at physik.uni-muenchen.de>
wrote:

> If the problem is due to the rank-ordering, the following excerpt from the
> PETSc FAQ section may help:
>
> <http://www.mcs.anl.gov/petsc/documentation/faq.html#da_mpi_cart>
>
> The PETSc DA object decomposes the domain differently than the
> MPI_Cart_create() command. How can one use them together?
>
> The MPI_Cart_create() first divides the mesh along the z direction, then
> the y, then the x. DMDA divides along the x, then y, then z. Thus, for
> example, rank 1 of the processes will be in a different part of the mesh
> for the two schemes. To resolve this you can create a new MPI communicator
> that you pass to DMDACreate() that renumbers the process ranks so that each
> physical process shares the same part of the mesh with both the DMDA and
> the MPI_Cart_create(). The code to determine the new numbering was provided
> by Rolf Kuiper.
>
> // the numbers of processors per direction are (int) x_procs, y_procs, z_procs respectively
> // (no parallelization in direction 'dir' means dir_procs = 1)
>
> MPI_Comm NewComm;
> int MPI_Rank, NewRank, x,y,z;
>
> // get rank from MPI ordering:
> MPI_Comm_rank(MPI_COMM_WORLD, &MPI_Rank);
>
> // calculate coordinates of cpus in MPI ordering:
> x = MPI_Rank / (z_procs*y_procs);
> y = (MPI_Rank % (z_procs*y_procs)) / z_procs;
> z = (MPI_Rank % (z_procs*y_procs)) % z_procs;
>
> // set new rank according to PETSc ordering:
> NewRank = z*y_procs*x_procs + y*x_procs + x;
>
> // create communicator with new ranks according to PETSc ordering:
> MPI_Comm_split(PETSC_COMM_WORLD, 1, NewRank, &NewComm);
>
> // override the default communicator (was MPI_COMM_WORLD as default)
> PETSC_COMM_WORLD = NewComm;
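>
> A minimal end-to-end sketch of how this renumbering might be wired up in 2D
> (the 2x2 process grid, mesh sizes, and ghost width below are made up for
> illustration; the renumbering is just the 2D specialization of the 3D
> formula above):
>
> #include <petscdmda.h>
>
> int main(int argc, char **argv)
> {
>   int      x_procs = 2, y_procs = 2;        /* hypothetical process grid  */
>   int      N1 = 16, N2 = 32, nghost = 1;    /* hypothetical mesh and halo */
>   int      MPI_Rank, NewRank, x, y;
>   MPI_Comm NewComm;
>   DM       dmda;
>
>   MPI_Init(&argc, &argv);
>   MPI_Comm_rank(MPI_COMM_WORLD, &MPI_Rank);
>
>   /* coordinates in a row-major 2D MPI_Cart_create() ordering (y fastest) */
>   x = MPI_Rank / y_procs;
>   y = MPI_Rank % y_procs;
>
>   /* PETSc DMDA ordering: x fastest */
>   NewRank = y*x_procs + x;
>   MPI_Comm_split(MPI_COMM_WORLD, 1, NewRank, &NewComm);
>
>   /* the override must happen before PetscInitialize() */
>   PETSC_COMM_WORLD = NewComm;
>   PetscInitialize(&argc, &argv, NULL, NULL);
>
>   DMDACreate2d(PETSC_COMM_WORLD,
>                DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC,
>                DMDA_STENCIL_BOX,
>                N1, N2, x_procs, y_procs,
>                1, nghost, NULL, NULL, &dmda);
>
>   /* ... fill and use the DMDA vectors here ... */
>
>   DMDestroy(&dmda);
>   PetscFinalize();
>   MPI_Finalize();
>   return 0;
> }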
>
>
>
>
> On 08.08.2015 23:58, Matthew Knepley wrote:
>
> On Sat, Aug 8, 2015 at 4:56 PM, Mani Chandra <mc0710 at gmail.com> wrote:
>
>> So basically one needs to correctly map
>>
>> iPetsc, jPetsc -> iApplication, jApplication ?
>>
>> Is there any standard way to do this? Can I get petsc to automatically
>> follow the same parallel topology as the host application?
>>
>
> If you want to use DMDA, there is only one mapping of ranks, namely
> lexicographic. However, every structured grid code I have
> ever seen uses that mapping, perhaps with a permutation of the directions
> {x, y, z}. Thus, the user needs to map the directions
> in PETSc in the right order for the application. I am not sure how you
> would automate this seeing as it depends on the application.
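>
> A purely hypothetical sketch of that direction mapping, assuming the
> application's second index (sizes N2 globally and NX2 per process, following
> the code quoted below) is its fastest-varying one and should line up with the
> DMDA's x direction:
>
> DM dmda;
> DMDACreate2d(PETSC_COMM_WORLD,
>              DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC,
>              DMDA_STENCIL_BOX,
>              N2, N1,            /* global sizes along DMDA x, then y   */
>              N2/NX2, N1/NX1,    /* process counts along DMDA x, then y */
>              1, nghost, NULL, NULL, &dmda);
>
> The array returned by DMDAVecGetArray() would then be indexed as
> arrayPetsc[y][x], with y running over the application's first index.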
>
>   Thanks,
>
>      Matt
>
>
>> Thanks,
>> Mani
>>
>> On Sat, Aug 8, 2015 at 3:12 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>
>>>
>>> > On Aug 8, 2015, at 3:08 PM, Mani Chandra <mc0710 at gmail.com> wrote:
>>> >
>>> > I tried flipping the indices, but I get a seg fault.
>>>
>>>   You would have to be careful in exactly what you flip.  Note that the
>>> meaning of N1 and N2 etc would also be reversed between your code and the
>>> PETSc DMDA code.
>>>
>>>   I would create a tiny DMDA and put entries like 1 2 3 4 ... into the
>>> array so you can track where the values go.
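>>>
>>>   A tiny sketch of that check (the 4x4 grid below is made up; purely
>>> illustrative):
>>>
>>> DM          da;
>>> Vec         v;
>>> PetscScalar **a;
>>> PetscInt    i, j, xs, ys, xm, ym, N1 = 4, N2 = 4;
>>>
>>> DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC,
>>>              DMDA_STENCIL_BOX, N1, N2, PETSC_DECIDE, PETSC_DECIDE,
>>>              1, 1, NULL, NULL, &da);
>>> DMCreateGlobalVector(da, &v);
>>> DMDAGetCorners(da, &xs, &ys, NULL, &xm, &ym, NULL);
>>> DMDAVecGetArray(da, v, &a);
>>> for (j = ys; j < ys + ym; j++)
>>>   for (i = xs; i < xs + xm; i++)
>>>     a[j][i] = j*N1 + i;                /* natural (application) numbering */
>>> DMDAVecRestoreArray(da, v, &a);
>>> VecView(v, PETSC_VIEWER_STDOUT_WORLD);  /* shows PETSc's parallel ordering */
>>> VecDestroy(&v);
>>> DMDestroy(&da);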
>>>
>>>   Barry
>>>
>>> >
>>> > On Sat, Aug 8, 2015 at 3:03 PM, Barry Smith <bsmith at mcs.anl.gov>
>>> wrote:
>>> >
>>> > > On Aug 8, 2015, at 2:45 PM, Mani Chandra <mc0710 at gmail.com> wrote:
>>> > >
>>> > > Thanks. Any suggestions for a fix?
>>> >
>>> >   Just flip the meaning of the x indices and the y indices in the
>>> PETSc parts of the code?
>>> >
>>> >   Also run with very different N1 and N2 (instead of equal sizes) to
>>> better test the code coupling.
>>> >
>>> >   Barry
>>> >
>>> >
>>> > >
>>> > > Reorder the indices in arrayApplication?
>>> > >
>>> > > On Sat, Aug 8, 2015 at 2:19 PM, Matthew Knepley <knepley at gmail.com>
>>> wrote:
>>> > > On Sat, Aug 8, 2015 at 1:52 PM, Mani Chandra <mc0710 at gmail.com>
>>> wrote:
>>> > > Hi,
>>> > >
>>> > > I'm having trouble interfacing petsc to an application, which I think
>>> is related to the ordering of the nodes. Here's what I'm trying to do:
>>> > >
>>> > > The application uses a structured grid with a global array having
>>> dimensions N1 x N2, which is then decomposed into a local array with
>>> dimensions NX1 x NX2.
>>> > >
>>> > > I create a Petsc DMDA using
>>> > >
>>> > >     DMDACreate2d(MPI_COMM_WORLD,
>>> > >                  DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC,
>>> > >                  DMDA_STENCIL_BOX,
>>> > >                  N1, N2,
>>> > >                  N1/NX1, N2/NX2,
>>> > >                  1, nghost, PETSC_NULL, PETSC_NULL,
>>> > >                  &dmda);
>>> > >
>>> > > and then use this to create a vec:
>>> > >
>>> > >   DMCreateGlobalVector(dmda, &vec);
>>> > >
>>> > > Now I copy the local contents of the application array to the petsc
>>> array using the following:
>>> > >
>>> > > Let i, j be the application indices and iPetsc and jPetsc be petsc's
>>> indices, then:
>>> > >
>>> > > DMDAGetCorners(dmda, &iStart, &jStart, &kStart,
>>> > >                &iSize, &jSize, &kSize);
>>> > >
>>> > >
>>> > > double **arrayPetsc;
>>> > > DMDAVecGetArray(dmda, vec, &arrayPetsc);
>>> > >
>>> > > for (int j=0, jPetsc=jStart; j<NX2 && jPetsc<jStart+jSize; j++, jPetsc++)
>>> > > {
>>> > >   for (int i=0, iPetsc=iStart; i<NX1 && iPetsc<iStart+iSize; i++, iPetsc++)
>>> > >   {
>>> > >      arrayPetsc[jPetsc][iPetsc] = arrayApplication[j][i];
>>> > >   }
>>> > > }
>>> > >
>>> > > DMDAVecRestoreArray(dmda, vec, &arrayPetsc);
>>> > >
>>> > > Now if I VecView(vec, viewer) and look at the data that petsc has,
>>> it looks right when run with 1 proc, but if I use 4 procs it's all messed
>>> up (see attached plots).
>>> > >
>>> > > I should probably be using the AO object, but it's not clear how.
>>> Could you help me out?
>>> > >
>>> > > It looks like you have the global order of processes reversed,
>>> meaning you have
>>> > >
>>> > >   1   3
>>> > >
>>> > >   0   2
>>> > >
>>> > > and it should be
>>> > >
>>> > >   2  3
>>> > >
>>> > >   0  1
>>> > >
>>> > >   Thanks,
>>> > >
>>> > >       Matt
>>> > >
>>> > > Thanks,
>>> > > Mani
>>> > > --
>>> > > What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> > > -- Norbert Wiener
>>> > >
>>> >
>>> >
>>>
>>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
>
>