[petsc-dev] programming model for PETSc

Jed Brown jedbrown at mcs.anl.gov
Thu Nov 24 17:54:47 CST 2011


On Thu, Nov 24, 2011 at 17:45, Matthew Knepley <knepley at gmail.com> wrote:

> This is not my current model, but it is a possible one. I do not
> "ghost" the layout, only the points.
>

I wasn't asking that. How do you represent a ghosted point? I thought you
do it by (rank, index). I have everything I need if the ghoster has (owner
rank, index), or equivalently (modulo MPI_Scan), the global index of the
point it is ghosting.
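
For concreteness, a minimal sketch of the (owner rank, local index) <->
global index conversion via MPI_Scan(); the per-rank count nlocal is a
made-up placeholder:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
  int rank, nlocal = 4, start, end;

  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);
  /* Inclusive prefix sum over the owned point counts */
  MPI_Scan(&nlocal, &end, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);
  start = end - nlocal; /* global index of the first owned point */
  /* (owner rank, local index i) corresponds to global index start + i */
  printf("[%d] owns global points [%d, %d)\n", rank, start, end);
  MPI_Finalize();
  return 0;
}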


>> In that case, the fact that remote processes have (owner rank, offset)
>> means that I can broadcast (local_offset, size) with purely local setup
>> (this is the first primitive which can be implemented using MPI_Get()).
>>
>
> Okay, I would really like this coded up. We can do a 1-D mesh of
> Lagrangian elements just to show me what is going on.
>

Sure, it's simple.
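
A minimal sketch, assuming one mesh point per rank with a made-up 3 dofs
per point, and simplifying the ghost pattern to the right neighbor on a
periodic 1-D arrangement: each rank exposes its (offset, size) in an MPI
window, and the ghosting rank fetches its neighbor's entry with a single
MPI_Get(), with no setup beyond creating the window:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
  int rank, size, owner, layout[2], ghost[2] = {-1, -1};
  MPI_Win win;

  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);
  MPI_Comm_size(MPI_COMM_WORLD, &size);

  /* Each rank owns one point with 3 dofs; expose (offset, size) */
  layout[0] = 3 * rank;
  layout[1] = 3;
  MPI_Win_create(layout, 2 * sizeof(int), sizeof(int), MPI_INFO_NULL,
                 MPI_COMM_WORLD, &win);

  /* Ghost the right neighbor's point: one MPI_Get(), purely local setup */
  owner = (rank + 1) % size;
  MPI_Win_fence(0, win);
  MPI_Get(ghost, 2, MPI_INT, owner, 0, 2, MPI_INT, win);
  MPI_Win_fence(0, win);

  printf("[%d] ghosted point on rank %d: offset %d, size %d\n",
         rank, owner, ghost[0], ghost[1]);
  MPI_Win_free(&win);
  MPI_Finalize();
  return 0;
}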


> I am still not sure how much this buys you since you had to communicate
> that offset info somehow.
>

It buys you that the next operation is non-synchronizing, has no
user-visible packing, and does not have memory scalability issues if every
process needs data from one point.


>
>
>> Alternatively, given (owner rank, offset, size), we can literally call
>> VecScatterCreate() after just an MPI_Scan(), which is logarithmic, plus
>> local setup. But VecScatterCreate() does lots of unnecessary setup to
>> build the two-way representation.
>>
>>
>> Redistributing a mesh after partitioning is slightly more demanding.
>> First, senders are enumerated using a fetch-and-add, initiated by the
>> sending process, which has the side effect of counting the number of
>> nodes that will be in the new partition and informing the senders of
>> the offsets at which to deposit those nodes. Then we broadcast the
>> (rank, offset) of each node on the sender to the receiver. Then we send
>> connectivity using a non-uniform broadcast. Now, before moving data, we
>> can reorder locally, inform the sender of the new ordering, and then
>> move all the data.
>>
>
> Great. This is exactly what I hate about this crap. It always seems
> specially coded for the problem. I think we can use exactly the
> primitive above to do your non-uniform broadcast step. Mesh topology
> is just another function over the mesh points.
>

Yeah, there is still a question of what to store (e.g. FV and DG can throw
away vertices and edges, lowest-order FE can throw away faces and edges),
but that is orthogonal to communication, which I agree is just a (usually
variable-sized) function on the mesh points.
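
As a sketch of the fetch-and-add step in the redistribution above, here
it is in terms of MPI-3's MPI_Fetch_and_op(); the counts and the single
target rank 0 are made up for illustration, whereas the real thing would
target each actual receiving process:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
  int rank, nsend, offset, counter = 0;
  MPI_Win win;

  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);

  /* Each rank exposes a counter of nodes deposited in its new partition */
  MPI_Win_create(&counter, sizeof(int), sizeof(int), MPI_INFO_NULL,
                 MPI_COMM_WORLD, &win);

  /* Hypothetical: every rank sends rank+1 nodes to rank 0. The atomic
     fetch-and-add returns the offset at which to deposit them and leaves
     the receiver's counter holding the new partition size. */
  nsend = rank + 1;
  MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
  MPI_Fetch_and_op(&nsend, &offset, MPI_INT, 0, 0, MPI_SUM, win);
  MPI_Win_unlock(0, win);

  printf("[%d] deposits %d nodes at offset %d\n", rank, nsend, offset);
  MPI_Win_free(&win);
  MPI_Finalize();
  return 0;
}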