[petsc-dev] proposed minor PetscPartitioner changes

Jed Brown jed at jedbrown.org
Wed Nov 8 19:20:32 CST 2017


Matthew Knepley <knepley at gmail.com> writes:

> On Wed, Nov 8, 2017 at 2:27 PM, Jed Brown <jed at jedbrown.org> wrote:
>
>> Matthew Knepley <knepley at gmail.com> writes:
>>
>> > On Wed, Nov 8, 2017 at 1:49 PM, Jed Brown <jed at jedbrown.org> wrote:
>> >
>> >> Matthew Knepley <knepley at gmail.com> writes:
>> >>
>> >> >> > No, this is the right structure.
>> >> >>
>> >> >> Oh come on.  You're defending a quadratic algorithm.
>> >> >>
>> >> >>       ierr = ParMETIS_V3_PartKway(vtxdist, xadj, adjncy, vwgt, adjwgt,
>> >> >>                                    &wgtflag, &numflag, &ncon, &nparts, tpwgts,
>> >> >>                                    ubvec, options, &edgeCut, assignment, &comm);
>> >> >>       // ...
>> >> >>   for (p = 0, i = 0; p < nparts; ++p) {
>> >> >>     for (v = 0; v < nvtxs; ++v) {
>> >> >>       if (assignment[v] == p) points[i++] = v;
>> >> >>     }
>> >> >>   }
>> >> >>
>> >> >> MatPartitioningApply creates an IS containing "assignment", which can
>> >> >> be converted to a global numbering with ISPartitioningToNumbering.  You
>> >> >> could as well have an ISPartitioningToSectionAndIS() that produces your
>> >> >> representation, preferably without this silly quadratic algorithm.
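
For reference, the same bucketing can be done in two linear passes instead of
the nested loop quoted above; a rough sketch, reusing the nvtxs, nparts,
assignment, and points names from that snippet:

  PetscInt *offset, p, v;

  ierr = PetscCalloc1(nparts + 1, &offset);CHKERRQ(ierr);
  for (v = 0; v < nvtxs; ++v) offset[assignment[v] + 1]++;           /* count vertices per part */
  for (p = 0; p < nparts; ++p) offset[p + 1] += offset[p];           /* exclusive prefix sum */
  for (v = 0; v < nvtxs; ++v) points[offset[assignment[v]]++] = v;   /* stable scatter into points[] */
  ierr = PetscFree(offset);CHKERRQ(ierr);

This gives the same points[] ordering as the quadratic loop, in O(nvtxs + nparts).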
>> >> >>
>> >> >
>> >> > Time it. Tell me if it matters. Telling me it matters in the long run
>> >> > is metaphysics.
>> >>
>> >> I realize ParMETIS isn't scalable and that if you have a modest number
>> >> of parts and only a million or so elements per rank, the cost of what
>> >> you do here will be acceptable for most uses.
>> >>
>> >> But you didn't refute my point that ISPartitioningToSectionAndIS can
>> >> produce the representation you want.
>> >
>> >
>> > I do not think it's an either/or thing. Many equivalent interfaces are
>> > possible, so I should have told Vaclav "I think this is the right one",
>> > but I thought that was implicit in my being the responder, and in none of
>> > us thinking that there is a true "right" in interface design.
>> >
>> >
>> >> The IS you're creating is similar
>> >> to the inverse of the Numbering (which is a permutation).  You are the
>> >> one who replaced a scalable algorithm that has existed for a long time
>> >> and uses types correctly with PetscPartitioner, which has some ugly
>> >> warts, duplicates a lot of code, and isn't a viable replacement for
>> >> MatPartitioning.
>> >
>> >
>> > 1) MatPartitioning is not a replacement for what I do. According to you,
>> > that should end the argument right there.
>> >
>> > 2) Deep inside MatPartitioning, it must split things up as I do in order
>> > to pack stuff to be sent. I think it should be exposed as I have done.
>>
>> Pack what stuff to be sent?  PetscPartitionerPartition() takes the
>> arguments to ParMETIS directly as arrays.  To use MatPartitioning, you
>> pass those same arrays to MatCreateMPIAdj, which literally just holds the
>> pointers.  Then MatPartitioningApply passes those on to ParMETIS or
>> whichever partitioning package.  The PetscPartitionerPartition_*
>> implementations depend on DM strictly as a way to define the weights.
>> That isn't more general or better; it's more restrictive.
>>
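
(For concreteness, the path I am describing looks roughly like the sketch
below; nvtxs, NGlobalVtxs, xadj, adjncy, and nparts stand in for whatever the
caller already has, error handling is abbreviated, and MPIAdj holds the CSR
pointers directly rather than copying them.)

  Mat             adj;
  MatPartitioning part;
  IS              assignment, numbering;

  ierr = MatCreateMPIAdj(comm, nvtxs, NGlobalVtxs, xadj, adjncy, NULL, &adj);CHKERRQ(ierr);
  ierr = MatPartitioningCreate(comm, &part);CHKERRQ(ierr);
  ierr = MatPartitioningSetAdjacency(part, adj);CHKERRQ(ierr);
  ierr = MatPartitioningSetNParts(part, nparts);CHKERRQ(ierr);
  ierr = MatPartitioningSetFromOptions(part);CHKERRQ(ierr);      /* -mat_partitioning_type parmetis|ptscotch|... */
  ierr = MatPartitioningApply(part, &assignment);CHKERRQ(ierr);  /* target rank for each local vertex */
  ierr = ISPartitioningToNumbering(assignment, &numbering);CHKERRQ(ierr);
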
>
> This sounds like deliberate misunderstanding. We are not talking about
> the input to ParMETIS, but the output. The biggest shortcoming of
> ParMETIS (and indeed all mesh partitioners) is that they tell you what
> must move, but do not move it.  This is the value-add of Plex: it
> moves the mesh (including all connectivity) and the field data
> according to the partition. In order to do this communication, you
> always want the partition segmented by rank, as it is in the
> Section/IS output.

Then have your mesh partitioner+distribution call MatPartitioning rather
than copying MatPartitioning code to support multiple partitioning
packages.  It's the duplication of code that we are complaining about,
not the fact that partitioning and distributing a mesh is more than what
MatPartitioning does.
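
If it helps, a hypothetical ISPartitioningToSectionAndIS() along the lines I
suggested could be a thin, linear-time wrapper over the assignment IS that
MatPartitioningApply already returns; the name and signature below are an
assumption for illustration, not existing PETSc API:

  /* Hypothetical: convert a MatPartitioningApply()-style assignment IS into the
     rank-segmented Section/IS representation Plex wants, in one linear pass. */
  PetscErrorCode ISPartitioningToSectionAndIS(IS assignment, PetscInt nparts, PetscSection *section, IS *points)
  {
    const PetscInt *part;
    PetscInt       *perm, *off, n, p, v;
    MPI_Comm        comm;
    PetscErrorCode  ierr;

    PetscFunctionBegin;
    ierr = PetscObjectGetComm((PetscObject) assignment, &comm);CHKERRQ(ierr);
    ierr = ISGetLocalSize(assignment, &n);CHKERRQ(ierr);
    ierr = ISGetIndices(assignment, &part);CHKERRQ(ierr);
    /* Section point p = target rank, dof = number of local vertices sent to rank p */
    ierr = PetscSectionCreate(comm, section);CHKERRQ(ierr);
    ierr = PetscSectionSetChart(*section, 0, nparts);CHKERRQ(ierr);
    for (v = 0; v < n; ++v) {ierr = PetscSectionAddDof(*section, part[v], 1);CHKERRQ(ierr);}
    ierr = PetscSectionSetUp(*section);CHKERRQ(ierr);
    /* Bucket local vertices by target rank using the section offsets */
    ierr = PetscMalloc1(n, &perm);CHKERRQ(ierr);
    ierr = PetscMalloc1(nparts, &off);CHKERRQ(ierr);
    for (p = 0; p < nparts; ++p) {ierr = PetscSectionGetOffset(*section, p, &off[p]);CHKERRQ(ierr);}
    for (v = 0; v < n; ++v) perm[off[part[v]]++] = v;
    ierr = ISCreateGeneral(comm, n, perm, PETSC_OWN_POINTER, points);CHKERRQ(ierr);
    ierr = PetscFree(off);CHKERRQ(ierr);
    ierr = ISRestoreIndices(assignment, &part);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }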

