[petsc-users] Mapping between application ordering and Petsc ordering
Mani Chandra
mc0710 at gmail.com
Sat Aug 8 16:56:02 CDT 2015
So basically one needs to correctly map
(iPetsc, jPetsc) -> (iApplication, jApplication)?
Is there any standard way to do this? Can I get PETSc to automatically
follow the same parallel topology as the host application?
Thanks,
Mani
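One standard route to the mapping asked about above, sketched under the
assumption that the application numbers grid points in the DMDA's "natural"
ordering (global index = i + j*N1, with i the fast index): the AO attached to
the DMDA translates such indices into PETSc's parallel ordering. The names
iApp and jApp below are placeholders, not from the original code.

    AO       ao;
    PetscInt idx = jApp*N1 + iApp;     /* natural-ordering global index */

    DMDAGetAO(dmda, &ao);              /* the AO is owned by the DMDA; do not destroy it */
    AOApplicationToPetsc(ao, 1, &idx); /* idx now holds the PETSc global index */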
On Sat, Aug 8, 2015 at 3:12 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
> > On Aug 8, 2015, at 3:08 PM, Mani Chandra <mc0710 at gmail.com> wrote:
> >
> > I tried flipping the indices, but I get a seg fault.
>
> You would have to be careful about exactly what you flip. Note that the
> meaning of N1 and N2 etc. would also be reversed between your code and the
> PETSc DMDA code.
>
> I would create a tiny DMDA and put entries like 1 2 3 4 ... into the
> array so you can track where the values go.
>
> Barry
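A minimal, self-contained sketch of the experiment suggested above (the sizes
are made up and error checking is omitted for brevity): a tiny 4 x 3 DMDA is
filled with each point's natural index, so that VecView on 1 and on 4 ranks
shows exactly where every value lands.

    #include <petscdmda.h>

    int main(int argc, char **argv)
    {
      DM            da;
      Vec           v;
      PetscInt      i, j, iStart, jStart, iSize, jSize;
      PetscScalar **a;

      PetscInitialize(&argc, &argv, NULL, NULL);
      DMDACreate2d(PETSC_COMM_WORLD,
                   DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC,
                   DMDA_STENCIL_BOX,
                   4, 3,                        /* global sizes; x is the fast direction */
                   PETSC_DECIDE, PETSC_DECIDE,  /* let PETSc choose the process grid     */
                   1, 1, NULL, NULL, &da);
      DMSetUp(da);                              /* required on newer PETSc, harmless if already set up */
      DMCreateGlobalVector(da, &v);

      DMDAGetCorners(da, &iStart, &jStart, NULL, &iSize, &jSize, NULL);
      DMDAVecGetArray(da, v, &a);
      for (j = jStart; j < jStart + jSize; j++)
        for (i = iStart; i < iStart + iSize; i++)
          a[j][i] = i + 4*j;                    /* natural index 0, 1, 2, ... */
      DMDAVecRestoreArray(da, v, &a);

      VecView(v, PETSC_VIEWER_STDOUT_WORLD);    /* compare the output on 1 and 4 ranks */

      VecDestroy(&v);
      DMDestroy(&da);
      PetscFinalize();
      return 0;
    }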
>
> >
> > On Sat, Aug 8, 2015 at 3:03 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >
> > > On Aug 8, 2015, at 2:45 PM, Mani Chandra <mc0710 at gmail.com> wrote:
> > >
> > > Thanks. Any suggestions for a fix?
> >
> > Just flip the meaning of the x indices and the y indices in the PETSc
> parts of the code?
> >
> > Also run with very different N1 and N2 values (instead of equal sizes) to
> > better test the coupling between the codes.
> >
> > Barry
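One possible reading of the "flip", sketched only (it has not been verified
against the application, and it assumes the application's process layout runs
fastest along its second dimension): both the DMDACreate2d() sizes and the
copy loop are swapped together, which is what the caution above about N1 and
N2 reversing their meaning amounts to. Changing only one of the two is an
easy way to index out of bounds.

    /* DMDA created with the two dimensions swapped */
    DMDACreate2d(PETSC_COMM_WORLD,
                 DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC,
                 DMDA_STENCIL_BOX,
                 N2, N1,                /* global sizes swapped          */
                 N2/NX2, N1/NX1,        /* process grid swapped to match */
                 1, nghost, PETSC_NULL, PETSC_NULL, &dmda);

    /* ... DMCreateGlobalVector(dmda, &vec); as before ... */

    DMDAGetCorners(dmda, &iStart, &jStart, NULL, &iSize, &jSize, NULL);
    DMDAVecGetArray(dmda, vec, &arrayPetsc);
    for (int j = 0; j < jSize; j++)        /* jSize is NX1 here */
      for (int i = 0; i < iSize; i++)      /* iSize is NX2 here */
        arrayPetsc[jStart + j][iStart + i] = arrayApplication[i][j];  /* transposed copy */
    DMDAVecRestoreArray(dmda, vec, &arrayPetsc);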
> >
> >
> > >
> > > Reorder the indices in arrayApplication?
> > >
> > > On Sat, Aug 8, 2015 at 2:19 PM, Matthew Knepley <knepley at gmail.com>
> wrote:
> > > On Sat, Aug 8, 2015 at 1:52 PM, Mani Chandra <mc0710 at gmail.com> wrote:
> > > Hi,
> > >
> > > I'm having trouble interfacing PETSc to an application; I think the
> > > problem is related to the ordering of the nodes. Here's what I'm trying to do:
> > >
> > > The application uses a structured grid with a global array having
> dimensions N1 x N2, which is then decomposed into a local array with
> dimensions NX1 x NX2.
> > >
> > > I create a PETSc DMDA using
> > >
> > > DMDACreate2d(MPI_COMM_WORLD,
> > >              DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC,
> > >              DMDA_STENCIL_BOX,
> > >              N1, N2,
> > >              N1/NX1, N2/NX2,
> > >              1, nghost, PETSC_NULL, PETSC_NULL,
> > >              &dmda);
> > >
> > > and then use this to create a vec:
> > >
> > > DMCreateGlobalVector(dmda, &vec);
> > >
> > > Now I copy the local contents of the application array to the PETSc
> > > array using the following:
> > >
> > > Let i, j be the application indices and iPetsc, jPetsc be PETSc's
> > > indices; then:
> > >
> > > DMDAGetCorners(dmda, &iStart, &jStart, &kStart,
> > >                &iSize, &jSize, &kSize);
> > >
> > >
> > > double **arrayPetsc;
> > > DMDAVecGetArray(dmda, vec, &arrayPetsc);
> > >
> > > for (int j=0, jPetsc=jStart; j<NX2 && jPetsc<jStart+jSize; j++, jPetsc++)
> > > {
> > >   for (int i=0, iPetsc=iStart; i<NX1 && iPetsc<iStart+iSize; i++, iPetsc++)
> > >   {
> > >     arrayPetsc[jPetsc][iPetsc] = arrayApplication[j][i];
> > >   }
> > > }
> > >
> > > DMDAVecRestoreArray(dmda, vec, &arrayPetsc);
> > >
> > > Now if I VecView(vec, viewer) and look at the data that PETSc has, it
> > > looks right when run with 1 process, but with 4 processes it's all
> > > messed up (see attached plots).
> > >
> > > I should probably be using the AO object, but it's not clear how. Could
> > > you help me out?
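A sketch of how the AO machinery is usually wrapped up for whole arrays,
assuming again that the application's numbering is the natural one
(idx = i + j*N1, i fastest): fill a natural-ordering vector with
VecSetValues() and let DMDANaturalToGlobal scatter it into the DMDA's
parallel layout. Here `vec` is the DMDA global vector created above, and the
loop body is only indicated.

    Vec natural;
    DMDACreateNaturalVector(dmda, &natural);
    /* for each locally owned application point (i, j):
         PetscInt    idx = j*N1 + i;
         PetscScalar val = arrayApplication[...][...];
         VecSetValues(natural, 1, &idx, &val, INSERT_VALUES);      */
    VecAssemblyBegin(natural);
    VecAssemblyEnd(natural);
    DMDANaturalToGlobalBegin(dmda, natural, INSERT_VALUES, vec);
    DMDANaturalToGlobalEnd(dmda, natural, INSERT_VALUES, vec);
    VecDestroy(&natural);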
> > >
> > > It looks like you have the global order of processes reversed, meaning
> you have
> > >
> > > 1 3
> > >
> > > 0 2
> > >
> > > and it should be
> > >
> > > 2 3
> > >
> > > 0 1
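One quick way to see which of the two layouts above a run actually has (a
sketch; `dmda` is the DMDA from the original code and error checking is
omitted): print every rank's owned corner and sizes and compare against the
application's decomposition.

    PetscMPIInt rank;
    PetscInt    iStart, jStart, iSize, jSize;

    MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
    DMDAGetCorners(dmda, &iStart, &jStart, NULL, &iSize, &jSize, NULL);
    PetscSynchronizedPrintf(PETSC_COMM_WORLD, "[%d] owns i = %D..%D, j = %D..%D\n",
                            rank, iStart, iStart + iSize - 1,
                            jStart, jStart + jSize - 1);
    PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);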
> > >
> > > Thanks,
> > >
> > > Matt
> > >
> > > Thanks,
> > > Mani
> > > --
> > > What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> > > -- Norbert Wiener
> > >
> >
> >
>
>