[petsc-users] Load imbalance in DMSwarm & DMDA simulations

Matthew Knepley knepley at gmail.com
Tue Mar 17 09:11:54 CDT 2026


On Tue, Mar 17, 2026 at 8:53 AM Mark Adams <mfadams at lbl.gov> wrote:

> This tool works at the Mat level so it does not matter, but you need to
> work at the DM level anyway.
>
> I don't know of DM tools for this. Let Matt chime in.
>

Hi Miguel,

DMPlex can definitely do point location and load balancing, which is an
advantage.

A disadvantage is that you have to work harder to get structured variable
access, which DMDA has
automatically. Depending on your discretization on this grid, this can be
restored, but it is some work.

What does the algorithm look like?

  Thanks,

      Matt


> On Tue, Mar 17, 2026 at 7:54 AM Miguel Molinos <m.molinos at upm.es> wrote:
>
>> Thank you Mark. Indeed my question was oriented towards the tools PETSc
>> can offer to address this problem. Currently I’m using DMDA, perhaps DMPlex
>> is more suited?
>>
>> Thanks,
>> Miguel
>>
>> On 17 Mar 2026, at 12:17, Mark Adams <mfadams at lbl.gov> wrote:
>>
>> Hi Miguel,
>>
>> This is a common problem. To my knowledge you need to deal with it on
>> your side, but PETSc does have tools to help.
>> I would look in the literature to get ideas and acclimate yourself to the
>> problem.
>>
>> The tool PETSc offers is mesh repartitioning. Create a dual graph (are
>> there any PETSc examples?), put weights on the vertices (which are now the
>> elements; use the particle count), and repartition.
>> I am not sure of the best way to proceed from there ... you could create a
>> new DMSwarm with this new "cell" DM, add particles from the old swarm to the
>> new one, delete the old, and call DMSwarmMigrate on the new DMSwarm; that
>> will do the moving that we want.
>>
>> Others will probably be able to add to this.
>>
>> Thanks,
>> Mark
>>
>>
>> On Tue, Mar 17, 2026 at 4:49 AM Miguel Molinos <m.molinos at upm.es> wrote:
>>
>>>
>>> Dear all,
>>>
>>> I am currently running a PIC-like implementation based on DMSwarm and
>>> DMDA. In my setup, a particle (DMSwarm) discretization represents, for
>>> example, a sphere placed at the center of a box-shaped domain discretized
>>> with a regular mesh (DMDA).
>>>
>>> The implementation uses both MPI and OpenMP to accelerate the
>>> computations. The domain decomposition of the sphere follows a PIC-like
>>> approach: each particle is assigned to a rank based on the mesh element it
>>> belongs to, and thus inherits the rank that “owns” that element. This is
>>> essentially the same strategy used in one of the PETSc examples.
>>>
>>> However, I have observed that, for certain configurations, some ranks
>>> end up with very few or even no particles, which leads to load imbalance.
>>>
>>> Has anyone experienced a similar issue?
>>>
>>> Thanks,
>>> Miguel
>>>
>>
>>
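To illustrate the weighted repartitioning Mark describes above, here is a
minimal conceptual sketch in plain Python (not PETSc API; in practice the
weights would go on the dual-graph vertices and a partitioner such as
ParMETIS would compute the new distribution). Each cell is weighted by its
particle count, and cells are split into contiguous ranges of roughly equal
total weight, so ranks covering empty regions of the box receive more cells:

```python
# Conceptual sketch of weight-based repartitioning (plain Python, no PETSc).
# Each "cell" carries a weight equal to its particle count; we split the
# cell list into contiguous chunks of roughly equal total weight, mimicking
# what a graph partitioner does with vertex weights.

def partition_by_weight(particle_counts, nranks):
    """Return a rank id per cell, balancing the total particle count."""
    total = sum(particle_counts)
    target = total / nranks
    ranks = []
    rank, acc = 0, 0.0
    for w in particle_counts:
        # Move to the next rank once this one has reached its share,
        # but never run past the last rank.
        if acc >= target and rank < nranks - 1:
            rank += 1
            acc = 0.0
        ranks.append(rank)
        acc += w
    return ranks

# A sphere of particles in the middle of the box: the central cells carry
# all the weight, so the balanced split gives each rank fewer central cells.
print(partition_by_weight([0, 0, 10, 10, 10, 0, 0, 0], 2))
```

In the actual code one would attach these weights to the dual graph, let the
partitioner produce the new "cell" DM, and then migrate the particles with
DMSwarmMigrate as sketched in the quoted message.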

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/