[petsc-dev] testing code which uses partitioners

Fande Kong fdkong.jd at gmail.com
Sun Jan 6 23:46:00 CST 2019


It would definitely be valuable to have the partitioners produce
machine-independent results.  Right now we have to generate different
output files for different machines in the testing system.

So far we have found that only a few partitioners (namely METIS, ParMETIS,
and PTScotch) are of interest. Is it possible to have these packages take
a user-provided random number generator?
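
As far as I know, none of these packages exposes such a hook today. Purely
as an illustration of the idea, a callback-style interface might look like
the following; every name here is hypothetical:

    /* Hypothetical sketch only: neither METIS, ParMETIS, nor PTScotch
     * provides such a hook today; all names below are made up.        */
    #include <stdlib.h>

    typedef int (*PartRandFn)(void *ctx);  /* returns a value in [0, RAND_MAX] */

    static PartRandFn part_rand     = NULL; /* user-installed generator */
    static void      *part_rand_ctx = NULL;

    /* What a package-level registration routine might look like. */
    void PartitionerSetRandom(PartRandFn f, void *ctx)
    {
      part_rand     = f;
      part_rand_ctx = ctx;
    }

    /* The library would call this wherever it currently calls rand(). */
    static int part_randint(void)
    {
      return part_rand ? part_rand(part_rand_ctx) : rand();
    }

With such a hook, PETSc could install a PetscRandom-backed generator and
get identical partitions on every machine.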

BTW, a while ago Jed recommended taking a look at KaHIP. I actually looked
into it a few months ago, found that KaHIP does not even provide a parallel
partitioning interface, and gave up on spending more time on it. We are
definitely interested in parallel partitioning, since a parallel
partitioner can potentially handle much larger meshes.


Thanks,

Fande Kong



On Sun, Jan 6, 2019 at 9:49 PM Jed Brown via petsc-dev <
petsc-dev at mcs.anl.gov> wrote:

> METIS will use its internal random number generator if USE_GKRAND is
> defined.  Maybe other partitioners have similar functionality.
>
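For context, GKlib (bundled with METIS) switches at compile time between
its own Mersenne-twister-based generator and the C library's rand().
Paraphrased, the pattern in its random number code is roughly the
following; the real twister implementation is elided here:

    /* Paraphrased sketch of GKlib's compile-time RNG switch; the actual
     * Mersenne-twister body in GKlib's random.c is elided.             */
    #include <stdint.h>
    #include <stdlib.h>

    #ifdef USE_GKRAND
    void gk_randinit(uint64_t seed) { /* seed GKlib's own twister */ (void)seed; }
    uint32_t gk_randint32(void)     { /* draw from that twister  */ return 0; }
    #else
    void gk_randinit(uint64_t seed) { srand((unsigned int)seed); }
    uint32_t gk_randint32(void)     { return (uint32_t)rand(); }
    #endif

Building METIS with USE_GKRAND defined therefore removes the dependence on
the platform's rand(), which is what makes its results machine-independent.
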
> I'm not wild about the complexity of providing our own definitions.
> Note that libmetis.so does not link to libpetsc.so, so it would be
> nontrivial to make this work with both shared and static libraries.
>
> "Smith, Barry F. via petsc-dev" <petsc-dev at mcs.anl.gov> writes:
>
> >    It really irks me that we can't use the various partitioners in our
> > nightly tests because they use rand() and produce different results on
> > different machines.
> >
> >    Here is what I propose
> >
> > 1) add a ./configure flag --with-petsc-rand
> >
> > 2) if the flag is set, then provide our own rand() and srand() (by
> > crudely wrapping PetscRandom) in pinit.c, compiled only when the flag
> > is set (a rough sketch appears below)
> >
> > 3) add --with-petsc-rand to all the nightly builds
> >
> >   Problems with this plan?
> >
> >     Barry
>
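
A minimal sketch of what the wrappers in step 2 could look like, assuming
--with-petsc-rand defines a hypothetical macro PETSC_USE_PETSC_RAND; PETSc
error checking is omitted for brevity:

    /* Minimal sketch of step 2 (hypothetical PETSC_USE_PETSC_RAND macro);
     * PETSc error checking omitted for brevity.                          */
    #include <petscsys.h>
    #include <stdlib.h>

    #if defined(PETSC_USE_PETSC_RAND)
    static PetscRandom petsc_libc_rand = NULL;

    /* Override libc srand(): (re)seed a process-local PetscRandom. */
    void srand(unsigned int seed)
    {
      if (!petsc_libc_rand) {
        PetscRandomCreate(PETSC_COMM_SELF, &petsc_libc_rand);
        PetscRandomSetInterval(petsc_libc_rand, 0.0, (PetscReal)RAND_MAX + 1.0);
      }
      PetscRandomSetSeed(petsc_libc_rand, seed);
      PetscRandomSeed(petsc_libc_rand);
    }

    /* Override libc rand(): return a value in [0, RAND_MAX]. */
    int rand(void)
    {
      PetscReal v;
      if (!petsc_libc_rand) srand(1); /* C standard: default seed is 1 */
      PetscRandomGetValueReal(petsc_libc_rand, &v);
      return (int)v;
    }
    #endif

Whether these overrides actually win inside libmetis.so is exactly the
shared- versus static-library wrinkle Jed raises above.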