[petsc-users] repartition for dynamic load balancing

Xiangdong epscodes at gmail.com
Thu Jan 28 15:02:45 CST 2016


I am thinking of using ParMETIS to repartition the mesh (based on updated
vertex weights) and then using some function (perhaps DMPlexMigrate) to
redistribute the data. I will look into Matt's paper to see whether this is
possible.
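
For reference, a minimal sketch of that approach on a DMPlex, assuming PETSc
was configured with ParMETIS. The names dm and dmRedist are placeholders for
the application's own objects, attaching the updated cell weights is
partitioner-specific and omitted, and whether DMPlexDistribute can be
re-applied to an already-distributed mesh is worth checking for the PETSc
version at hand:

#include <petscdmplex.h>

/* Hedged sketch: switch the DMPlex partitioner to ParMETIS and redistribute.
   "dm" is an existing DMPlex; per-cell weight handling is omitted here. */
PetscErrorCode RedistributeMesh(DM dm, DM *dmRedist)
{
  PetscPartitioner part;
  PetscSF          migrationSF = NULL;
  PetscErrorCode   ierr;

  PetscFunctionBeginUser;
  ierr = DMPlexGetPartitioner(dm, &part);CHKERRQ(ierr);
  ierr = PetscPartitionerSetType(part, PETSCPARTITIONERPARMETIS);CHKERRQ(ierr);
  ierr = PetscPartitionerSetFromOptions(part);CHKERRQ(ierr);

  /* DMPlexDistribute computes the new partition and migrates the mesh points;
     the returned star forest (SF) describes that migration and is the basis
     for DMPlexMigrate-style movement of field data. */
  ierr = DMPlexDistribute(dm, 0, &migrationSF, dmRedist);CHKERRQ(ierr);
  if (*dmRedist) {
    /* ... move application data using the migration SF, then clean up ... */
    ierr = PetscSFDestroy(&migrationSF);CHKERRQ(ierr);
  }
  PetscFunctionReturn(0);
}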

Thanks.
Xiangdong



On Thu, Jan 28, 2016 at 2:41 PM, Matthew Knepley <knepley at gmail.com> wrote:

> On Thu, Jan 28, 2016 at 1:37 PM, Dave May <dave.mayhem23 at gmail.com> wrote:
>
>>
>>
>> On Thursday, 28 January 2016, Matthew Knepley <knepley at gmail.com> wrote:
>>
>>> On Thu, Jan 28, 2016 at 11:36 AM, Xiangdong <epscodes at gmail.com> wrote:
>>>
>>>> What functions/tools can I use for dynamic migration in DMPlex
>>>> framework?
>>>>
>>>
>>> In this paper, http://arxiv.org/abs/1506.06194, we explain how to use
>>> the DMPlexMigrate() function to redistribute data.
>>> In the future, it is likely we will add a function that wraps it together
>>> with determination of the new partition at the same time.
>>>
>>>
>>>> Can you also name some external mesh management systems? Thanks.
>>>>
>>>
>>> I will note that if load balance in the solve is your only concern,
>>> PCTelescope can redistribute the DMDA solve.
>>>
>>
>> Currently Telescope will only repartition 2D and 3D DMDAs. It
>> does perform data migration and allows users to specify the number of ranks
>> to be used in each i,j,k direction via -xxx_grid_x etc. I wouldn't say it
>> supports "load balancing", as there is no mechanism to define the number of
>> points in each sub-domain.
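
A minimal sketch of driving PCTelescope from code, assuming an existing KSP
named ksp attached to the DMDA; the reduction factor of 2 below is only an
example value, and the command-line equivalents are noted in the comment:

/* Hedged sketch: wrap the solve's preconditioner in PCTelescope so the
   repartitioned sub-solve runs on fewer ranks. */
PC             pc;
PetscErrorCode ierr;

ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
ierr = PCSetType(pc, PCTELESCOPE);CHKERRQ(ierr);
ierr = PCTelescopeSetReductionFactor(pc, 2);CHKERRQ(ierr); /* sub-solve on (comm size)/2 ranks */
/* command-line equivalent: -pc_type telescope -pc_telescope_reduction_factor 2
   the inner solver is configured via the telescope_ prefix, e.g. -telescope_pc_type lu */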
>>
>
> Let me be more precise. All I have suggested for any of this are
> redistribution tools. You will have to determine
> the right weights for "load balance", which I think is always true. Using
> the default weights is crazy.
>
>   Matt
>
>
>> Cheers
>>   Dave
>>
>>
>>
>>>
>>>   Thanks,
>>>
>>>     Matt
>>>
>>>
>>>>
>>>> Xiangdong
>>>>
>>>> On Thu, Jan 28, 2016 at 12:21 PM, Barry Smith <bsmith at mcs.anl.gov>
>>>> wrote:
>>>>
>>>>>
>>>>> > On Jan 28, 2016, at 11:11 AM, Xiangdong <epscodes at gmail.com> wrote:
>>>>> >
>>>>> > Yes, it can be either DMDA or DMPlex. For example, I have a 1D DMDA
>>>>> > with Nx=10 and np=2. At the beginning each processor owns 5 cells.
>>>>> > After some simulation time, I find that repartitioning the 10 cells
>>>>> > into 3 and 7 is better for load balancing. Is there an easy/efficient
>>>>> > way to migrate data from one partition to another? I am wondering
>>>>> > whether there are functions or libraries that help me manage this
>>>>> > redistribution.
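
For reference, the 3/7 ownership itself can be prescribed when a new 1D DMDA
is created, via the lx argument; a minimal sketch on two ranks (moving field
data from the old 5/5 DMDA into the new one is the part PETSc does not
automate, as Barry notes below):

/* Hedged sketch: create a replacement 1D DMDA with Nx = 10 and an explicit
   3/7 split across 2 ranks; data migration from the old DMDA is left to
   the application. */
DM             daNew;
PetscInt       lx[2] = {3, 7};   /* grid points owned by rank 0 and rank 1 */
PetscErrorCode ierr;

ierr = DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE,
                    10,   /* global number of grid points */
                    1,    /* degrees of freedom per point  */
                    1,    /* stencil width                 */
                    lx, &daNew);CHKERRQ(ierr);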
>>>>>
>>>>>   For DMDA we don't provide tools for doing this, nor do we expect to.
>>>>> For this kind of dynamic migration we recommend using DMPlex or
>>>>> some external mesh management system.
>>>>>
>>>>>   Barry
>>>>>
>>>>> >
>>>>> > Thanks.
>>>>> > Xiangdong
>>>>> >
>>>>> > On Thu, Jan 28, 2016 at 12:20 AM, Jed Brown <jed at jedbrown.org>
>>>>> > wrote:
>>>>> > Xiangdong <epscodes at gmail.com> writes:
>>>>> >
>>>>> > > I have a question on dynamic load balancing in PETSc. I started
>>>>> > > running a simulation with one partition. As the simulation goes on,
>>>>> > > that partition may lead to load imbalance since it is a non-steady
>>>>> > > problem. If it is worthwhile to perform load balancing, is there an
>>>>> > > easy way to re-partition the mesh and continue the simulation?
>>>>> >
>>>>> > Are you using a PETSc DM?  What "mesh"?  If you own it, then
>>>>> > repartitioning it is entirely your business.
>>>>> >
>>>>> > In general, after adapting the mesh, you rebuild all algebraic data
>>>>> > structures.  Solvers can be reset (SNESReset, etc.).
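
A minimal sketch of that reset pattern, assuming the application holds a SNES
named snes and has built a new DM dmNew for the repartitioned mesh:

/* Hedged sketch: discard the solver's internal data structures and attach
   the new DM; matrices and vectors are recreated on the next SNESSolve. */
ierr = SNESReset(snes);CHKERRQ(ierr);
ierr = SNESSetDM(snes, dmNew);CHKERRQ(ierr);
ierr = SNESSetFromOptions(snes);CHKERRQ(ierr);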
>>>>> >
>>>>>
>>>>>
>>>>
>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>