<html>
<head>
<meta content="text/html; charset=utf-8" http-equiv="Content-Type">
</head>
<body text="#000000" bgcolor="#FFFFFF">
If the problem is due to the rank-ordering, the following excerpt
from the PETSc FAQ section may help:<br>
<br>
<a class="moz-txt-link-rfc2396E" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#da_mpi_cart"><http://www.mcs.anl.gov/petsc/documentation/faq.html#da_mpi_cart></a><a
name="da_mpi_cart"><br>
<br>
The PETSc DA object decomposes the domain differently than the
MPI_Cart_create() command. How can one use them together?</a>
    <p> MPI_Cart_create() divides the mesh along the z direction
      first, then y, then x, whereas the DMDA divides along x first,
      then y, then z. Thus, for example, rank 1 will own a different
      part of the mesh under the two schemes. To resolve this, create
      a new MPI communicator with renumbered process ranks and pass it
      to DMDACreate(), so that each physical process is assigned the
      same part of the mesh by both the DMDA and MPI_Cart_create().
      The code to determine the new numbering was provided by Rolf
      Kuiper. </p>
    <pre>// the numbers of processors per direction are (int) x_procs, y_procs, z_procs respectively
// (no parallelization in direction 'dir' means dir_procs = 1)
MPI_Comm NewComm;
int MPI_Rank, NewRank, x, y, z;
// get rank from MPI ordering:
MPI_Comm_rank(MPI_COMM_WORLD, &amp;MPI_Rank);
// calculate coordinates of cpus in MPI ordering:
x = MPI_Rank / (z_procs*y_procs);
y = (MPI_Rank % (z_procs*y_procs)) / z_procs;
z = (MPI_Rank % (z_procs*y_procs)) % z_procs;
// set new rank according to PETSc ordering:
NewRank = z*y_procs*x_procs + y*x_procs + x;
// create communicator with new ranks according to PETSc ordering:
MPI_Comm_split(PETSC_COMM_WORLD, 1, NewRank, &amp;NewComm);
// override the default communicator (was MPI_COMM_WORLD by default)
PETSC_COMM_WORLD = NewComm;
</pre>
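    <br>
    As a minimal usage sketch (not part of the FAQ): instead of
    relying on the PETSC_COMM_WORLD override, the remapped
    communicator can be passed directly when the DMDA is created. The
    names N1, N2, NX1, NX2 and nghost follow the thread below (for
    that 2D case one would take z_procs = 1).<br>
    <pre>// hypothetical sketch: create the DMDA on the remapped communicator so that
// its lexicographic rank layout matches the application's Cartesian one
DM  dmda;
Vec vec;
DMDACreate2d(NewComm,
             DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC,
             DMDA_STENCIL_BOX,
             N1, N2,          /* global sizes */
             N1/NX1, N2/NX2,  /* processes per direction */
             1, nghost,       /* dof, stencil width */
             NULL, NULL,
             &amp;dmda);
DMCreateGlobalVector(dmda, &amp;vec);
</pre>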
<br>
<br>
<div class="moz-cite-prefix">On 08.08.2015 23:58, Matthew Knepley
wrote:<br>
</div>
<blockquote
cite="mid:CAMYG4G=miJSw6Gb2xDQ+9QEDbC3kATZAfhT56LwdT71oPrquYw@mail.gmail.com"
type="cite">
<div dir="ltr">
<div class="gmail_extra">
<div class="gmail_quote">On Sat, Aug 8, 2015 at 4:56 PM, Mani
Chandra <span dir="ltr"><<a moz-do-not-send="true"
href="mailto:mc0710@gmail.com" target="_blank">mc0710@gmail.com</a>></span>
wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0
.8ex;border-left:1px #ccc solid;padding-left:1ex">
<div dir="ltr">So basically one needs to correctly map
<div><br>
</div>
<div>iPetsc, jPetsc -> iApplication, jApplication ?</div>
<div><br>
</div>
              <div>Is there any standard way to do this? Can I get
petsc to automatically follow the same parallel
topology as the host application?</div>
</div>
</blockquote>
<div><br>
</div>
<div>If you want to use DMDA, there is only one mapping of
ranks, namely lexicographic. However, every structured
grid code I have</div>
<div>ever seen uses that mapping, perhaps with a permutation
of the directions {x, y, z}. Thus, the user needs to map
the directions</div>
<div>in PETSc in the right order for the application. I am
not sure how you would automate this seeing as it depends
on the application.</div>
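          <div><br>
          </div>
          <div>As an illustration only, a hypothetical sketch (not
            code from this thread; N1, N2, NX1, NX2 and nghost follow
            the naming used further down): one way to permute the
            directions is to exchange the roles of the two dimensions
            when the DMDA is created, and exchange i and j accordingly
            in the copy loop.</div>
          <pre>/* hypothetical sketch: exchange the two directions so that PETSc's
   lexicographic rank order lines up with the application's layout */
DM dmda;
DMDACreate2d(MPI_COMM_WORLD,
             DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC,
             DMDA_STENCIL_BOX,
             N2, N1,            /* global extents exchanged */
             N2/NX2, N1/NX1,    /* processes per direction exchanged */
             1, nghost,
             NULL, NULL, &amp;dmda);
/* ...and exchange i and j correspondingly when copying between
   arrayApplication and the array from DMDAVecGetArray() */
</pre>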
<div><br>
</div>
<div> Thanks,</div>
<div><br>
</div>
<div> Matt</div>
<div> </div>
<blockquote class="gmail_quote" style="margin:0 0 0
.8ex;border-left:1px #ccc solid;padding-left:1ex">
<div dir="ltr">
<div>Thanks,</div>
<div>Mani</div>
</div>
<div class="gmail_extra"><br>
<div class="gmail_quote">On Sat, Aug 8, 2015 at 3:12 PM,
Barry Smith <span dir="ltr"><<a
moz-do-not-send="true"
href="mailto:bsmith@mcs.anl.gov" target="_blank">bsmith@mcs.anl.gov</a>></span>
wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0
.8ex;border-left:1px #ccc solid;padding-left:1ex"><span
class=""><span><br>
> On Aug 8, 2015, at 3:08 PM, Mani Chandra
<<a moz-do-not-send="true"
href="mailto:mc0710@gmail.com" target="_blank">mc0710@gmail.com</a>>
wrote:<br>
><br>
> Tried flipping the indices, I get a seg
fault.<br>
<br>
              </span></span> You would have to be careful about
              exactly what you flip. Note that the meaning of N1,
              N2, etc. would also be reversed between your code
              and the PETSc DMDA code.<br>
<br>
              I would create a tiny DMDA and put entries like 1
2 3 4 ... into the array so you can track where the
values go<br>
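              <br>
              For example, a hypothetical sketch along those lines
              (error checking omitted):<br>
              <pre>/* tiny 4x4 DMDA; fill each point with its global natural index
   so the rank-to-subdomain mapping can be checked by eye */
DM       da;
Vec      v;
double **a;
PetscInt i, j, xs, ys, xm, ym;
DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
             DMDA_STENCIL_BOX, 4, 4, PETSC_DECIDE, PETSC_DECIDE,
             1, 1, NULL, NULL, &amp;da);
DMCreateGlobalVector(da, &amp;v);
DMDAGetCorners(da, &amp;xs, &amp;ys, NULL, &amp;xm, &amp;ym, NULL);
DMDAVecGetArray(da, v, &amp;a);
for (j = ys; j &lt; ys + ym; j++)
  for (i = xs; i &lt; xs + xm; i++)
    a[j][i] = 4*j + i;   /* values 0,1,2,3,... show where each point lands */
DMDAVecRestoreArray(da, v, &amp;a);
VecView(v, PETSC_VIEWER_STDOUT_WORLD);
</pre>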
<span><font color="#888888"><br>
Barry<br>
</font></span>
<div>
<div><br>
><br>
> On Sat, Aug 8, 2015 at 3:03 PM, Barry Smith
<<a moz-do-not-send="true"
href="mailto:bsmith@mcs.anl.gov"
target="_blank">bsmith@mcs.anl.gov</a>>
wrote:<br>
><br>
> > On Aug 8, 2015, at 2:45 PM, Mani
Chandra <<a moz-do-not-send="true"
href="mailto:mc0710@gmail.com" target="_blank">mc0710@gmail.com</a>>
wrote:<br>
> ><br>
> > Thanks. Any suggestions for a fix?<br>
><br>
> Just flip the meaning of the x indices
and the y indices in the PETSc parts of the
code?<br>
><br>
                  >    Also run with very different N1 and N2
                  (instead of equal sizes) to better test the code
coupling.<br>
><br>
> Barry<br>
><br>
><br>
> ><br>
> > Reorder the indices in
arrayApplication?<br>
> ><br>
> > On Sat, Aug 8, 2015 at 2:19 PM,
Matthew Knepley <<a moz-do-not-send="true"
href="mailto:knepley@gmail.com"
target="_blank">knepley@gmail.com</a>>
wrote:<br>
> > On Sat, Aug 8, 2015 at 1:52 PM, Mani
Chandra <<a moz-do-not-send="true"
href="mailto:mc0710@gmail.com" target="_blank">mc0710@gmail.com</a>>
wrote:<br>
> > Hi,<br>
> ><br>
> > I'm having trouble interfacing petsc
                  to an application, which I think is related to
the ordering of the nodes. Here's what I'm
trying to do:<br>
> ><br>
> > The application uses a structured grid
with a global array having dimensions N1 x N2,
which is then decomposed into a local array with
dimensions NX1 x NX2.<br>
> ><br>
> > I create a Petsc DMDA using<br>
> ><br>
> > DMDACreate2d(MPI_COMM_WORLD,<br>
> > DM_BOUNDARY_PERIODIC,
DM_BOUNDARY_PERIODIC,<br>
> > DMDA_STENCIL_BOX,<br>
> > N1, N2,<br>
> > N1/NX1, N2/NX2,<br>
> > 1, nghost,
PETSC_NULL, PETSC_NULL,<br>
> > &dmda);<br>
> ><br>
> > and then use this to create a vec:<br>
> ><br>
> > DMCreateGlobalVector(dmda,
&vec);<br>
> ><br>
> > Now I copy the local contents of the
application array to the petsc array using the
following:<br>
> ><br>
> > Let i, j be the application indices
and iPetsc and jPetsc be petsc's indices, then:<br>
> ><br>
> > DMDAGetCorners(dmda, &iStart,
&jStart, &kStart,<br>
> >
&iSize, &jSize, &kSize<br>
> > );<br>
> ><br>
> ><br>
> > double **arrayPetsc;<br>
> > DMDAVecGetArray(dmda, vec,
&arrayPetsc);<br>
> ><br>
> > for (int j=0, jPetsc=jStart; j<NX2,
jPetsc<jStart+jSize; j++, jPetsc++)<br>
> > {<br>
> > for (int i=0, iPetsc=iStart;
i<NX1, iPetsc<iStart+iSize; i++, iPetsc++)<br>
> > {<br>
> > arrayPetsc[jPetsc][iPetsc] =
arrayApplication[j][i];<br>
> > }<br>
> > }<br>
> ><br>
> > DMDAVecRestoreArray(dmda, vec,
&arrayPetsc);<br>
> ><br>
> > Now if I VecView(vec, viewer) and look
at the data that petsc has, it looks right when
run with 1 proc, but if I use 4 procs it's all
messed up (see attached plots).<br>
> ><br>
> > I should probably be using the AO
                  object but it's not clear how. Could you help me
out?<br>
> ><br>
> > It looks like you have the global
order of processes reversed, meaning you have<br>
> ><br>
> > 1 3<br>
> ><br>
> > 0 2<br>
> ><br>
> > and it should be<br>
> ><br>
> > 2 3<br>
> ><br>
> > 0 1<br>
> ><br>
> > Thanks,<br>
> ><br>
> > Matt<br>
> ><br>
> > Thanks,<br>
> > Mani<br>
> > --<br>
> > What most experimenters take for
granted before they begin their experiments is
infinitely more interesting than any results to
which their experiments lead.<br>
> > -- Norbert Wiener<br>
> ><br>
><br>
><br>
<br>
</div>
</div>
</blockquote>
</div>
<br>
</div>
</blockquote>
</div>
<br>
<br clear="all">
<div><br>
</div>
-- <br>
<div class="gmail_signature">What most experimenters take for
granted before they begin their experiments is infinitely
more interesting than any results to which their experiments
lead.<br>
-- Norbert Wiener</div>
</div>
</div>
</blockquote>
<br>
</body>
</html>