[petsc-users] Automatic load balancing for PETSc using third party library (e.g. Zoltan)
Matthew Knepley
knepley at gmail.com
Fri Oct 11 07:15:32 CDT 2019
On Fri, Oct 11, 2019 at 4:09 AM von Ramm, Alexander via petsc-users <
petsc-users at mcs.anl.gov> wrote:
> Hi all,
> I'm currently evaluating PETSc as a framework for our next software
> project.
> My question concerns dynamic repartitioning: I have already found in the
> documentation (Chapter 3.5) that PETSc does not currently support dynamic
> repartitioning, load balancing by migrating matrix entries between
> processes, etc. I know that Zoltan, for example, with which PETSc can be
> configured, offers this functionality. Is there a tutorial demonstrating
> how the automatic load-balancing features of Zoltan can be used in
> combination with the linear/non-linear solvers offered by PETSc? Or should
> one, for problems requiring dynamic load-balancing strategies, move to
> Trilinos altogether?
>
I am not sure Zoltan offers anything beyond what we have now. Here is a
sketch of dynamic load balancing for a given matrix:
1) Develop a partition that is good for your problem using MatPartitioning.
   This is the problem-specific part, since you have to use weights to
   tell the partitioner about your computation. No package can do this
   automatically.
2) Move the matrix using MatPermute (a sketch of both steps follows below).
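For concreteness, here is a minimal sketch of those two steps, not code from
the PETSc tree verbatim: it assumes a distributed Mat A already exists,
omits error checking, and performs the move with MatCreateSubMatrix() on the
rows each rank receives, which accomplishes the same redistribution as the
MatPermute() route. The problem-specific weights from step 1) would be
attached with MatPartitioningSetVertexWeights().

  #include <petscmat.h>

  /* Repartition a distributed matrix A and move it into the new layout. */
  static PetscErrorCode RebalanceMatrix(Mat A, Mat *Anew)
  {
    MatPartitioning part;
    IS              is, isrows;

    MatPartitioningCreate(PetscObjectComm((PetscObject)A), &part);
    MatPartitioningSetAdjacency(part, A);  /* partition the nonzero graph of A */
    MatPartitioningSetFromOptions(part);   /* e.g. -mat_partitioning_type parmetis */
    MatPartitioningApply(part, &is);       /* is[i] = new owner rank of local row i */

    ISBuildTwoSided(is, NULL, &isrows);    /* global rows this rank will now own */
    MatCreateSubMatrix(A, isrows, isrows, MAT_INITIAL_MATRIX, Anew);

    ISDestroy(&isrows);
    ISDestroy(&is);
    MatPartitioningDestroy(&part);
    return 0;
  }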
However, very few problems actually work this way. We have the same
workflow for meshes:
  DMPlexDistribute()
and, based upon the partition you get out, you can permute Vec/Mat and also
arbitrary data:
  DMPlexDistributeData()
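In code, the usual idiom looks like this (a sketch, assuming "dm" is an
existing DMPlex; error checking omitted):

  #include <petscdmplex.h>

  DM      dmDist      = NULL;
  PetscSF migrationSF = NULL;

  DMPlexDistribute(dm, 0, &migrationSF, &dmDist);  /* overlap = 0 */
  if (dmDist) {  /* dmDist is NULL when nothing needed to move */
    /* migrationSF records where each mesh point went;
       DMPlexDistributeData() uses it to migrate arbitrary user arrays */
    DMDestroy(&dm);
    dm = dmDist;
  }
  PetscSFDestroy(&migrationSF);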
Thanks,
Matt
> Thank you for any input,
> Best Regards,
> Alex von Ramm
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/