[petsc-dev] Generality of VecScatter
Jed Brown
jedbrown at mcs.anl.gov
Sat Nov 26 15:03:31 CST 2011
On Sat, Nov 26, 2011 at 14:25, Mark F. Adams <mark.adams at columbia.edu> wrote:
> So each MPI_Get initiates a message and you pack up a message with an
> array of remote pointers or something?
>
MPI_Get() takes an MPI_Datatype for the remote process, which describes where
in the remote buffer the values should be read from. This is much like how
VecScatter can operate on just part of a Vec.
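
As a minimal illustration (not PETSc code), a datatype like the following can
select scattered entries of the remote buffer so that a single MPI_Get gathers
them all; the index array remote_idx and the MPI_INT element type are assumptions:

  #include <mpi.h>

  /* Build a datatype that picks out n scattered integers at positions
   * remote_idx[] in the target window, so one MPI_Get can fetch them all. */
  static MPI_Datatype remote_selection(int n, int remote_idx[])
  {
    MPI_Datatype dtype;
    MPI_Type_create_indexed_block(n, 1, remote_idx, MPI_INT, &dtype);
    MPI_Type_commit(&dtype);
    return dtype;
  }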
>
> It sounds like you are saying that you have an auxiliary array of remote
> global indices for each processor that you communicate with and your
> broadcast code looks like:
>
> for all 'proc' that I talk to
>
for all proc that I _need_ an update from; I don't know or care who needs
an update from me
>   i = 0
>   for all 'v' on proc list
>     if v.state == not-done
>
I didn't think we would bother with communicating only those values that
were actually updated. Since this should converge in just a few rounds, I
figured that we would just update all ghost points. Note that communicating
the metadata to send only those values that have actually changed may make
the algorithm less latency-tolerant.
>       data[i++] = [ &v.state, v.id, STATE ] // vague code here...
>     endif
>   endfor
>   MPI_Get(proc, i, data)
> endfor
>
I thought we would skip any loop over vertices and just do:

one-time setup:
  remotes = {}
  for each ghost vertex v:
    remotes[v.owner_rank] += 1   # add/increment entry for this rank
  nremotes = remotes.size
  build MPI_Datatype their_datatype[nremotes] (using the remote index stored
  for each ghosted vertex) and my_datatype[nremotes] (using the sequential
  locations of the ghost vertices in my array) for each rank in remotes;
  implement as one loop over ghost vertices

communication in each round:
  MPI_Win_fence(0, window)
  for (r, rank) in remotes:
    MPI_Get(my_ghosted_status_array, 1, my_datatype[r], rank, 0, 1,
            their_datatype[r], window)
  MPI_Win_fence(0, window)
All the packing and unpacking is done by the implementation and I never
traverse vertices in user code.
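
For concreteness, here is a rough sketch (plain MPI, not PETSc) of that setup
and the communication round, assuming integer states and that the ghost
indices are already grouped by owner rank; all names here (ghost_update_rounds,
remote_idx, local_idx, and so on) are invented for illustration:

  #include <mpi.h>
  #include <stdlib.h>

  /* Owned states live in status[0..nlocal-1], ghost copies in
   * status[nlocal..]. For each neighbor r we are given counts[r], the
   * owner-side indices remote_idx[r][], and the local ghost-slot indices
   * local_idx[r][]. */
  void ghost_update_rounds(MPI_Comm comm, int *status, int nlocal,
                           int nremotes, const int ranks[], const int counts[],
                           int *remote_idx[], int *local_idx[], int nrounds)
  {
    MPI_Datatype *their_dt = malloc(nremotes * sizeof(MPI_Datatype));
    MPI_Datatype *my_dt    = malloc(nremotes * sizeof(MPI_Datatype));
    MPI_Win win;

    /* One-time setup: one indexed datatype per neighbor for each side */
    for (int r = 0; r < nremotes; r++) {
      MPI_Type_create_indexed_block(counts[r], 1, remote_idx[r], MPI_INT, &their_dt[r]);
      MPI_Type_commit(&their_dt[r]);
      MPI_Type_create_indexed_block(counts[r], 1, local_idx[r], MPI_INT, &my_dt[r]);
      MPI_Type_commit(&my_dt[r]);
    }

    /* Expose the owned part of the status array through a window */
    MPI_Win_create(status, (MPI_Aint)nlocal * sizeof(int), sizeof(int),
                   MPI_INFO_NULL, comm, &win);

    for (int round = 0; round < nrounds; round++) {
      /* ... local work that updates owned states would go here ... */
      MPI_Win_fence(0, win);
      for (int r = 0; r < nremotes; r++) {
        /* Pull every ghosted state from its owner; the datatypes do all the
           packing and unpacking, so there is no per-vertex loop here. */
        MPI_Get(status, 1, my_dt[r], ranks[r], 0, 1, their_dt[r], win);
      }
      MPI_Win_fence(0, win);
    }

    MPI_Win_free(&win);
    for (int r = 0; r < nremotes; r++) {
      MPI_Type_free(&their_dt[r]);
      MPI_Type_free(&my_dt[r]);
    }
    free(their_dt);
    free(my_dt);
  }

The two fences bracket the access epoch; everything between them is just the
MPI_Get calls, with the datatypes doing the gather on the target side and the
scatter into the ghost slots on the origin side.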