[petsc-dev] proposed minor PetscPartitioner changes

Smith, Barry F. bsmith at mcs.anl.gov
Tue Nov 20 11:26:09 CST 2018


   MatPartitioning is the future. It is just a question of stripping out the PetscPartitioner and replacing it with MatPartitioning. Yes, once this was determined we lost interest in actually doing the work of stripping it out.
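
   For reference, the basic MatPartitioning usage is just (a minimal sketch; error checking omitted, A is an adjacency matrix of the graph to partition):

     Mat             A;     /* adjacency graph */
     MatPartitioning part;
     IS              is;    /* target rank for each local vertex */

     MatPartitioningCreate(PETSC_COMM_WORLD, &part);
     MatPartitioningSetAdjacency(part, A);
     MatPartitioningSetFromOptions(part);   /* e.g. -mat_partitioning_type parmetis */
     MatPartitioningApply(part, &is);
     MatPartitioningDestroy(&part);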


     Barry


> On Nov 20, 2018, at 11:12 AM, Fande Kong via petsc-dev <petsc-dev at mcs.anl.gov> wrote:
> 
> I was wondering what is the conclusion on this?
> 
> PetscPartitioner and MatPartitioning, which one will be kept? There is code duplication between them. I am asking because I am working on a hierarchical partitioning implementation based on the MatPartitioning interface, and we also use the MatPartitioning interface in MOOSE to access all the external partitioners.
> 
> Just want to make sure I am using the right partitioning interface.
> 
> Thanks,
> 
> Fande,
> 
> 
> 
> On Thu, Dec 7, 2017 at 8:33 AM Matthew Knepley <knepley at gmail.com> wrote:
> On Thu, Dec 7, 2017 at 10:26 AM, Vaclav Hapla <vaclav.hapla at erdw.ethz.ch> wrote:
> 
> 
> > On Nov 9, 2017, at 12:53, Matthew Knepley <knepley at gmail.com> wrote:
> >
> > I think I need to create a proof of concept. I would start by employing MatPartitioning in PetscPartitionerPartition, leaving anything outside of this function untouched for now (as already suggested in #192), if you agree.
> >
> > Or what about implementing a special temporary PetscPartitioner implementation wrapping MatPartitioning?
> > PETSCPARTITIONERMATPARTITIONING sounds crazy, though :) But it could be a good starting point.
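> >
> > A rough sketch of what the core of such a wrapper's partition routine could look like (the names and the surrounding plumbing are hypothetical, error checking abbreviated):
> >
> >   Mat             A;
> >   MatPartitioning mp;
> >   IS              ranks;   /* target rank per point */
> >
> >   /* wrap the dual-graph CSR arrays in a MATMPIADJ matrix; note that
> >      MatCreateMPIAdj takes ownership of the arrays, so copies may be needed */
> >   ierr = MatCreateMPIAdj(comm, numVertices, numVerticesGlobal, start, adjacency, NULL, &A);CHKERRQ(ierr);
> >   ierr = MatPartitioningCreate(comm, &mp);CHKERRQ(ierr);
> >   ierr = MatPartitioningSetAdjacency(mp, A);CHKERRQ(ierr);
> >   ierr = MatPartitioningSetFromOptions(mp);CHKERRQ(ierr);
> >   ierr = MatPartitioningApply(mp, &ranks);CHKERRQ(ierr);
> >   /* then translate ranks into the PetscSection + IS pair PetscPartitioner expects */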
> >
> > This is probably easier, and allows nice regression testing, namely run all tests using EXTRA_OPTIONS="-petscpartitioner_type matpartitioning". I think
> > that should be alright for correctness. There are some parallel redistribution tests in Plex.
> >
> > We will need at least one performance regression. Look at how I do it here:
> >
> >   https://bitbucket.org/petsc/petsc/src/312beb00c9b3e1e8ec8fac64a948a1af779da02f/src/dm/impls/plex/examples/tests/ex9.c?at=master&fileviewer=file-view-default
> >
> > You can make custom events, directly access the times, and compare. You could run the two versions
> > and check for degradation for a sequence of meshes in parallel.
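> >
> > I.e., something along these lines (a sketch; the event name is arbitrary):
> >
> >   PetscLogEvent      partEvent;
> >   PetscEventPerfInfo info;
> >
> >   ierr = PetscLogEventRegister("Partition", DM_CLASSID, &partEvent);CHKERRQ(ierr);
> >   ierr = PetscLogEventBegin(partEvent, 0, 0, 0, 0);CHKERRQ(ierr);
> >   /* ... the partitioning call being timed ... */
> >   ierr = PetscLogEventEnd(partEvent, 0, 0, 0, 0);CHKERRQ(ierr);
> >   ierr = PetscLogEventGetPerfInfo(0, partEvent, &info);CHKERRQ(ierr);  /* stage 0 = Main Stage */
> >   /* info.time holds the accumulated time; compare it across the two versions */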
> >
> >   Thanks,
> >
> >      Matt
> 
> I have made some progress in this, see https://bitbucket.org/haplav/petsc/branch/haplav/feature-petscpartitionermatpartitioning
> There's a new test ex23 which shows basically that
>   -petscpartitioner_type parmetis
> and
>   -petscpartitioner_type matpartitioning -mat_partitioning_type parmetis
> give exactly the same results.
> 
> I have not created a PR yet since the regression testing you propose fails in some cases. When running
>   make -f gmakefile.test test EXTRA_OPTIONS="-petscpartitioner_type matpartitioning -options_left 0" search='dm_impls_plex_tests%'
> it seems the majority of tests pass, but e.g. dm_impls_plex_tests-ex7_7 fails since its reference output was saved for -petscpartitioner_type simple. How should I deal with that, please?
> 
> Ignore the ones with simple. I think all the SNES tests now use the simple partitioner, so for ex12, ex62, and ex77 can you just do a few runs
> and confirm that both ways give the same results?
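> 
> For example, something like (keeping whatever other arguments the tests already use):
> 
>   ./ex12 <usual test args> -petscpartitioner_type parmetis
>   ./ex12 <usual test args> -petscpartitioner_type matpartitioning -mat_partitioning_type parmetis
> 
> and diff the outputs.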
> 
>   Thanks,
> 
>      Matt
>  
> As can be seen in my PetscPartitionerPartition_MatPartitioning, the needed manipulations with IS are slightly more complicated than what I inferred from Jed's comments. Still, the existing IS methods suffice. See bitbucket.org/haplav/petsc/src/ab7d5f43fd87d1d57b51b6e7ff8de0ef3e904673/src/dm/impls/plex/petscpartmatpart.c?at=haplav%2Ffeature-petscpartitionermatpartitioning#petscpartmatpart.c-92
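> 
> Roughly, the conversion amounts to something like this (a sketch with made-up variable names, not the actual code from the branch):
> 
>   const PetscInt *ranks;
>   PetscInt        npoints, *counts;   /* counts[r] = #points assigned to rank r */
> 
>   /* is = output of MatPartitioningApply, giving the target rank per local point */
>   ierr = ISGetLocalSize(is, &npoints);CHKERRQ(ierr);
>   ierr = ISPartitioningCount(is, numRanks, counts);CHKERRQ(ierr);
>   ierr = ISGetIndices(is, &ranks);CHKERRQ(ierr);
>   /* set up the PetscSection from counts, then bucket the npoints point numbers
>      by target rank to build the points IS that PetscPartitionerPartition returns */
>   ierr = ISRestoreIndices(is, &ranks);CHKERRQ(ierr);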
> 
> Vaclav
> 
> 
> 
> -- 
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
> 
> https://www.cse.buffalo.edu/~knepley/


