[petsc-users] Load balancing / redistributing a 1D DM
Åsmund Ervik
Asmund.Ervik at sintef.no
Mon Mar 5 08:25:02 CST 2018
My only argument "against" using Plex is that I don't understand how to use it. Is there a simple example anywhere that shows how to set up a 1D simplical (?) mesh, and then just get/return data between vectors associated with the Plex and (local) Fortran arrays on each proc? I don't have any KSP, SNES etc.
I assume such a code would go something like
<Plex mesh init stuff>
<Plex initial (uniform) parallel distribution>
<My code's init stuff>
<loop over time>
<Plex equivalent of DMDAGetCorners>
<Plex equivalent of DMDAVecGetArrayF90>
<My code that computes things with the array, and finds weights for redistribution>
<Plex equivalent of DMDAVecRestoreArrayF90>
<Plex redistribution>
<end loop>
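In C (which I would then translate to Fortran), my best guess at fleshing out that skeleton is below. It is only a sketch, not working code: the signatures seem to change between PETSc versions, and the redistribution-with-weights step at the end is exactly the part I cannot figure out.

#include <petsc.h>

int main(int argc, char **argv)
{
  DM               dm, dmDist = NULL;
  PetscSection     sec;
  PetscPartitioner part;
  Vec              u;
  PetscScalar     *a;
  PetscInt         cStart, cEnd, c;
  PetscMPIInt      rank;
  /* a tiny serial mesh, built on rank 0 only: two segment cells, three vertices */
  int              cells[]  = {0, 1, 1, 2};
  double           coords[] = {0.0, 0.5, 1.0};

  PetscInitialize(&argc, &argv, NULL, NULL);
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

  /* <Plex mesh init stuff>: in 1D the "simplices" are just segments
     (this call is named DMPlexCreateFromCellListPetsc in newer PETSc) */
  DMPlexCreateFromCellList(PETSC_COMM_WORLD, 1, rank ? 0 : 2, rank ? 0 : 3,
                           2, PETSC_TRUE, cells, 1, coords, &dm);

  /* <Plex initial (uniform) parallel distribution> */
  DMPlexDistribute(dm, 0, NULL, &dmDist);
  if (dmDist) { DMDestroy(&dm); dm = dmDist; }

  /* <Plex equivalent of DMDAGetCorners>: cells are the height-0 points */
  DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);

  /* one dof per cell, described by a PetscSection so the DM can build vectors */
  PetscSectionCreate(PETSC_COMM_WORLD, &sec);
  PetscSectionSetChart(sec, cStart, cEnd);
  for (c = cStart; c < cEnd; c++) PetscSectionSetDof(sec, c, 1);
  PetscSectionSetUp(sec);
  DMSetDefaultSection(dm, sec); /* DMSetLocalSection in newer PETSc */

  /* <Plex equivalent of DMDAVecGet/RestoreArrayF90> */
  DMCreateLocalVector(dm, &u);
  VecGetArray(u, &a);
  /* ... compute with a[c - cStart], accumulate per-cell weights ... */
  VecRestoreArray(u, &a);

  /* <Plex redistribution>: call DMPlexDistribute again, steered by the
     DM's partitioner -- but how to feed my per-cell weights into it is
     exactly what I am asking about */
  DMPlexGetPartitioner(dm, &part);
  PetscPartitionerSetFromOptions(part);
  DMPlexDistribute(dm, 0, NULL, &dmDist);
  if (dmDist) { DMDestroy(&dm); dm = dmDist; }

  VecDestroy(&u);
  PetscSectionDestroy(&sec);
  DMDestroy(&dm);
  PetscFinalize();
  return 0;
}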
Regards,
Åsmund
> -----Original Message-----
> From: Matthew Knepley [mailto:knepley at gmail.com]
> Sent: Monday, March 5, 2018 3:08 PM
> To: Tobin Isaac <tisaac at cc.gatech.edu>
> Cc: Jed Brown <jed at jedbrown.org>; Dave May
> <dave.mayhem23 at gmail.com>; Åsmund Ervik <Asmund.Ervik at sintef.no>;
> petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] Load balancing / redistributing a 1D DM
>
> On Mon, Mar 5, 2018 at 9:01 AM, Tobin Isaac <tisaac at cc.gatech.edu> wrote:
>
>
> This is a somewhat incomplete description of the steps in linear
> partitioning. The rest can be accomplished with PetscSF calls, but I should
> wrap it up in a PetscPartitioner because it's a mistake-prone operation.
>
> Jed likes to do everything by hand because it is transparent, but then you
> become the maintainer.
> I think this is easy to do in Plex, and we maintain the code. It is less
> transparent, which is the tradeoff.
>
> Matt
>
>
> On March 5, 2018 8:31:42 AM EST, Jed Brown <jed at jedbrown.org> wrote:
> >Dave May <dave.mayhem23 at gmail.com> writes:
> >
> >> For a 1D problem such as yours, I would use your favourite graph
> >> partitioner (Metis, ParMetis, Scotch) together with your cell-based
> >> weighting and repartition the data yourself.
> >
> >That's overkill in 1D. You can MPI_Allreduce(SUM) and MPI_Scan(SUM) the
> >weights, then find the transition indices in each subdomain. It'll be
> >cheaper, more intuitive/deterministic, and avoid the extra library
> >dependency. Of course if you think you may want to move to multiple
> >dimensions, it would make sense to consider DMPlex or DMForest.
>
> --
>
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
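PS: To check that I understand Jed's Allreduce/Scan recipe above, I tried writing it out. Is the sketch below roughly what you mean? (All the names here are mine, not from any PETSc API.)

#include <petsc.h>

/* Give every local cell a new owner so that cumulative weight is balanced:
   rank r should own the cells whose running weight falls in
   [ r*wtotal/size, (r+1)*wtotal/size ). */
static PetscErrorCode ComputeNewOwners(MPI_Comm comm, PetscInt nlocal,
                                       const PetscReal weights[],
                                       PetscMPIInt newowner[])
{
  PetscReal   wlocal = 0.0, wtotal, wbefore, cum;
  PetscMPIInt size;
  PetscInt    i;

  PetscFunctionBeginUser;
  for (i = 0; i < nlocal; i++) wlocal += weights[i];
  MPI_Comm_size(comm, &size);
  MPI_Allreduce(&wlocal, &wtotal, 1, MPIU_REAL, MPI_SUM, comm);
  MPI_Scan(&wlocal, &wbefore, 1, MPIU_REAL, MPI_SUM, comm);
  wbefore -= wlocal; /* exclusive prefix sum: total weight on lower ranks */

  cum = wbefore;
  for (i = 0; i < nlocal; i++) {
    /* use the cell's weight midpoint to pick a side at the transitions */
    PetscMPIInt r = (PetscMPIInt)(size * (cum + 0.5 * weights[i]) / wtotal);
    newowner[i]   = (r < size) ? r : size - 1;
    cum          += weights[i];
  }
  PetscFunctionReturn(0);
}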
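PPS: And for moving the data afterwards, I guess the PetscSF calls Tobin mentions would be used roughly like below, assuming both the old and new distributions are contiguous ranges of global cell indices. Again the names and bookkeeping are my own invention, and the exact PetscSFBcast signature may differ in other PETSc versions.

#include <petsc.h>

/* Pull per-cell data from the old contiguous distribution into the new one
   with a PetscSF. oldStarts[r] = global index of rank r's first cell before
   the move (with oldStarts[size] = total number of cells), and
   [newStart, newEnd) is this rank's range afterwards. */
static PetscErrorCode MigrateCellData(MPI_Comm comm, const PetscInt oldStarts[],
                                      PetscInt newStart, PetscInt newEnd,
                                      const PetscScalar olddata[],
                                      PetscScalar newdata[])
{
  PetscSF      sf;
  PetscSFNode *remote;
  PetscMPIInt  size, rank;
  PetscInt     nNew = newEnd - newStart, nOld, i;

  PetscFunctionBeginUser;
  MPI_Comm_size(comm, &size);
  MPI_Comm_rank(comm, &rank);
  nOld = oldStarts[rank + 1] - oldStarts[rank]; /* #cells I used to own */
  PetscMalloc1(nNew, &remote);
  for (i = 0; i < nNew; i++) {
    PetscInt    g     = newStart + i; /* global cell index */
    PetscMPIInt owner = 0;
    /* find the old owner of cell g (bisection in real code) */
    while (owner + 1 < size && oldStarts[owner + 1] <= g) owner++;
    remote[i].rank  = owner;
    remote[i].index = g - oldStarts[owner]; /* local index at the old owner */
  }
  PetscSFCreate(comm, &sf);
  PetscSFSetGraph(sf, nOld, nNew, NULL, PETSC_OWN_POINTER,
                  remote, PETSC_OWN_POINTER);
  /* one-sided "pull": roots = old layout, leaves = new layout
     (newer PETSc adds an MPI_Op argument to PetscSFBcast) */
  PetscSFBcastBegin(sf, MPIU_SCALAR, olddata, newdata);
  PetscSFBcastEnd(sf, MPIU_SCALAR, olddata, newdata);
  PetscSFDestroy(&sf);
  PetscFunctionReturn(0);
}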
More information about the petsc-users mailing list