[petsc-dev] proposed minor PetscPartitioner changes

Jed Brown jed at jedbrown.org
Wed Nov 8 13:27:16 CST 2017


Matthew Knepley <knepley at gmail.com> writes:

> On Wed, Nov 8, 2017 at 1:49 PM, Jed Brown <jed at jedbrown.org> wrote:
>
>> Matthew Knepley <knepley at gmail.com> writes:
>>
>> >> > No, this is the right structure.
>> >>
>> >> Oh come on.  You're defending a quadratic algorithm.
>> >>
>> >>   ierr = ParMETIS_V3_PartKway(vtxdist, xadj, adjncy, vwgt, adjwgt,
>> >>                               &wgtflag, &numflag, &ncon, &nparts,
>> >>                               tpwgts, ubvec, options, &edgeCut,
>> >>                               assignment, &comm);
>> >>   // ...
>> >>   for (p = 0, i = 0; p < nparts; ++p) {
>> >>     for (v = 0; v < nvtxs; ++v) {
>> >>       if (assignment[v] == p) points[i++] = v;
>> >>     }
>> >>   }
>> >>
>> >> MatPartitioningApply creates an IS with "assignment", which can be
>> >> converted to a global numbering with ISPartitioningToNumbering.  You
>> >> could as well have an ISPartitioningToSectionAndIS() that produces your
>> >> representation, preferably without this silly quadratic algorithm.
>> >>
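
For reference, the loop above is O(nparts * nvtxs); the same points
array can be built in O(nparts + nvtxs) with a counting sort over the
parts.  A minimal sketch, reusing the names from the snippet (offsets
is a scratch array, zeroed by PetscCalloc1):

  PetscInt *offsets, p, v;

  ierr = PetscCalloc1(nparts+1, &offsets);CHKERRQ(ierr);
  for (v = 0; v < nvtxs; ++v) offsets[assignment[v]+1]++;  /* part sizes  */
  for (p = 0; p < nparts; ++p) offsets[p+1] += offsets[p]; /* part starts */
  for (v = 0; v < nvtxs; ++v) points[offsets[assignment[v]]++] = v;
  ierr = PetscFree(offsets);CHKERRQ(ierr);

(ISPartitioningToNumbering produces the new global numbers instead:
assignment {1,0,1,0} yields numbering {2,0,3,1}, which is exactly the
inverse of the points permutation {1,3,0,2}.)
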
>> >
>> > Time it. Tell me if it matters. Telling me it matters in the long run is
>> > metaphysics.
>>
>> I realize ParMETIS isn't scalable and that if you have a modest number
>> of parts and only a million or so elements per rank, the cost of what
>> you do here will be acceptable for most uses.
>>
>> But you didn't refute my point that ISPartitioningToSectionAndIS can
>> produce the representation you want.
>
>
> I do not think it's an either/or thing. Many equivalent interfaces are
> possible, so I should have told Vaclav "I think this is the right one",
> but I thought that was implicit in me being the responder, and none of
> us thinking that there is a true "right" in interface design.
>
>
>> The IS you're creating is similar
>> to the inverse of the Numbering (which is a permutation).  You are the
>> one that replaced a scalable algorithm that has existed for a long time
>> and uses types correctly with PetscPartitioner, which has some ugly
>> warts, duplicates a lot of code, and isn't a viable replacement for
>> MatPartitioning.
>
>
> 1) MatPartitioning is not a replacement for what I do. According to
> you, that should end the argument right there.
>
> 2) Deep inside MatPartitioning, it must split things up as I do in
> order to pack stuff to be sent. I think it should be exposed as I have
> done.

Pack what stuff to be sent?  PetscPartitionerPartition() takes the
arguments to ParMETIS directly as arrays.  To use MatPartitioning, you
pass those same arrays to MatCreateMPIAdj which literally just holds the
pointers.  Then MatPartitioningApply passes those on to ParMETIS or
whichever partitioning package.  PetscPartitionerPartition_*
implementations depend on DM strictly as a way to define the weights.
That isn't more general or better; it's more restrictive.
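
Concretely, a minimal sketch of that path (nvtxs, N, xadj, adjncy, and
nparts are the same sizes and CSR arrays that would otherwise go
straight to ParMETIS_V3_PartKway):

  Mat             adj;
  MatPartitioning mp;
  IS              assignment;

  /* MPIAdj holds the pointers; it does not copy the graph */
  ierr = MatCreateMPIAdj(comm, nvtxs, N, xadj, adjncy, NULL, &adj);CHKERRQ(ierr);
  ierr = MatPartitioningCreate(comm, &mp);CHKERRQ(ierr);
  ierr = MatPartitioningSetAdjacency(mp, adj);CHKERRQ(ierr);
  ierr = MatPartitioningSetNParts(mp, nparts);CHKERRQ(ierr);
  ierr = MatPartitioningSetFromOptions(mp);CHKERRQ(ierr);
  ierr = MatPartitioningApply(mp, &assignment);CHKERRQ(ierr); /* part of each vertex */
  ierr = MatPartitioningDestroy(&mp);CHKERRQ(ierr);
  ierr = MatDestroy(&adj);CHKERRQ(ierr);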

I see no reason why PetscPartitioner cannot be implemented as a very
small amount of code around MatPartitioning (little more than the
interface function).  The opposite would necessarily bring in concepts
like DM that are unnecessary when a user is partitioning matrices and
graphs.
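
A hypothetical shape for that layer (the signature is illustrative,
and ISPartitioningToSectionAndIS() is the proposed helper, not an
existing PETSc function):

  static PetscErrorCode PetscPartitionerPartition_MatPartitioning(
    MPI_Comm comm, PetscInt nvtxs, PetscInt N, PetscInt xadj[],
    PetscInt adjncy[], PetscInt nparts, PetscSection partSection,
    IS *pointIS)
  {
    Mat             adj;
    MatPartitioning mp;
    IS              assignment;
    PetscErrorCode  ierr;

    PetscFunctionBegin;
    /* same calls as the sketch above: wrap the CSR arrays and apply */
    ierr = MatCreateMPIAdj(comm, nvtxs, N, xadj, adjncy, NULL, &adj);CHKERRQ(ierr);
    ierr = MatPartitioningCreate(comm, &mp);CHKERRQ(ierr);
    ierr = MatPartitioningSetAdjacency(mp, adj);CHKERRQ(ierr);
    ierr = MatPartitioningSetNParts(mp, nparts);CHKERRQ(ierr);
    ierr = MatPartitioningApply(mp, &assignment);CHKERRQ(ierr);
    /* linear-time conversion to the (PetscSection, IS) pair DMPlex wants */
    ierr = ISPartitioningToSectionAndIS(assignment, partSection, pointIS);CHKERRQ(ierr);
    ierr = ISDestroy(&assignment);CHKERRQ(ierr);
    ierr = MatPartitioningDestroy(&mp);CHKERRQ(ierr);
    ierr = MatDestroy(&adj);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }

Weights (what the DM is used for now) would be attached with
MatPartitioningSetVertexWeights() before the Apply.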

> 3) I think you exaggerate the faults of the code for effect. Do what you
> must.
>
>
>> And now you seem to be arguing against Vaclav unifying
>> it, claiming a technical rationale that doesn't appear to hold up.
>
>
> Of course, I disagree with you, and think you make no attempt to
> understand why someone would write this code, because you never solve
> the problems it's necessary for.

I have written a parallel unstructured FEM that did this sort of
partitioning and redistribution on the fly.  I wrote it in terms of
MatPartitioning and it worked fine.  I abandoned it because it had
screwy dependencies and you had abandoned Sieve to work on
DMComplex/DMPlex.  I'm happy to have made that change, but the concepts
and requirements are hardly foreign to me.

>> I don't
>> understand why, but it certainly looks like PetscPartitioner can be
>> implemented as a thin layer around MatPartitioning, then that layer can
>> be refactored into helper functions at the IS/PetscSection level.
>
>
> It might be true. To me, it looked like Mat stuff was draped on so it
> would be hard to pull the raw result out of the partitioner, and that
> things would get created (like the MatMult Scatter), which I did not
> want.

Barry created the MPIADJ representations in 1997 precisely for this purpose.

