[MOAB-dev] Help for migration of cells between MPI subdomains with moab

Olivier Jamond olivier.jamond at cea.fr
Fri Jun 11 10:57:49 CDT 2021


Hi Vijay and Iulian,

Thanks for your answers. I attach my simple toy code to this mail. 
Please tell me if you get it this time!

I just had a look at the iMesh interface. Do you carry out the 
migration through disk (the hdds), using the write/load_mesh functions?
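
To make my question concrete: I would imagine such a disk-based 
migration looking roughly like the sketch below. This is only my guess, 
based on the documented parallel I/O options; the file name and the 
option strings are assumptions to be checked against the MOAB user 
guide.

    #include <mpi.h>
    #include "moab/Core.hpp"
    #include "moab/ParallelComm.hpp"

    void dump_and_reload(MPI_Comm comm)
    {
      moab::Core mb;
      moab::ParallelComm pcomm(&mb, comm);

      // Every process writes its own part into a single h5m file...
      mb.write_file("mesh.h5m", 0, "PARALLEL=WRITE_PART");

      // ...and each process then reads back the parts assigned to it
      // (via the PARALLEL_PARTITION tag) and resolves shared entities.
      // In practice the reload would happen into a fresh instance.
      mb.load_file("mesh.h5m", 0,
                   "PARALLEL=READ_PART;"
                   "PARTITION=PARALLEL_PARTITION;"
                   "PARALLEL_RESOLVE_SHARED_ENTS");
    }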

In some of our cases, we plan to do quite 'high-frequency' load 
balancing, mainly for fast transient explicit dynamics simulations. In 
this scheme we rebalance the mesh quite often (also using Zoltan), but 
at each rebalancing the set of cells that has to be migrated is only a 
small fraction of the cells involved in the calculation.
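
Schematically, the loop we have in mind is the following (a sketch; all 
names below are hypothetical placeholders for our own solver step, the 
Zoltan call, and the migration operation I am trying to implement):

    #include <map>
    #include "moab/Types.hpp"

    // Explicit-dynamics time loop with periodic rebalancing (sketch).
    void run(int n_steps, int rebalance_period)
    {
      for (int step = 0; step < n_steps; ++step) {
        advance_one_step();  // hypothetical solver step
        if (step % rebalance_period == 0) {
          // The partitioner gives, for each local cell, the rank that
          // should own it next; only a small fraction changes owner.
          std::map<moab::EntityHandle, int> new_owner = run_partitioner();
          migrate_cells(new_owner);  // the operation in question
        }
      }
    }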

We do not use the iMesh/ITAPS interface, so no problem with that! In 
fact, our 'Mesh' class wraps a moab::Core instance and a 
moab::ParallelComm instance.
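
Concretely, the wrapping is along these lines (simplified; everything 
except the two MOAB classes is our own naming):

    #include <mpi.h>
    #include "moab/Core.hpp"
    #include "moab/ParallelComm.hpp"

    class Mesh {
    public:
      explicit Mesh(MPI_Comm comm) : mb_(), pcomm_(&mb_, comm) {}

      moab::Interface&    moab()  { return mb_; }
      moab::ParallelComm& pcomm() { return pcomm_; }

    private:
      moab::Core mb_;             // the mesh database
      moab::ParallelComm pcomm_;  // parallel services built on top of it
    };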

Thanks a lot for your help,
Best regards,
Olivier

On 11/06/2021 17:19, Vijay S. Mahadevan wrote:
> Hi Olivier,
>
> As Iulian mentioned, there is a lot of infrastructure written for our
> Climate workflows to migrate a mesh from a process set A to a process
> set B, where A and B can have overlaps or can be completely disjoint.
> And during the process of migration, we can do repartitioning as well
> since typically n(A) != n(B). Currently, this functionality is
> available and heavily tested with the iMOAB interface, as it is
> consumed by a Fortran90-based code. However, if you are interested in
> this for
> arbitrary meshes, we can work with you to update that interface more
> generally to support all dimensions. Note that this iMOAB migration
> implementation currently works for all unstructured meshes on a sphere
> (so the dimension is hard-coded in places).
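>
> For orientation, the call sequence looks roughly like the sketch
> below. This paraphrases the C interface from memory; please take the
> exact prototypes (and possible trailing string-length arguments) from
> iMOAB.h, and treat the component ids and communicators as
> placeholders.
>
>     #include "moab/iMOAB.h"
>
>     // commA/commB: communicators of the two process sets; jointComm
>     // spans both; groupA/groupB are their MPI groups (setup not shown).
>     int appA, appB;              // iMOAB application ids
>     int compA = 10, compB = 20;  // assumed component ids
>
>     // On process set A (the sender):
>     iMOAB_RegisterApplication("SRC", &commA, &compA, &appA);
>     // ... load or build the mesh under appA ...
>     int method = 0;  // repartitioning method applied during migration
>     iMOAB_SendMesh(&appA, &jointComm, &groupB, &compB, &method);
>
>     // On process set B (the receiver):
>     iMOAB_RegisterApplication("TGT", &commB, &compB, &appB);
>     iMOAB_ReceiveMesh(&appB, &jointComm, &groupA, &compA);
>
>     // Back on A, once the transfer is complete:
>     iMOAB_FreeSenderBuffers(&appA, &compB);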
>
> I suggest you not use the iMesh/ITAPS framework as this is a legacy
> interface and we are not too keen on supporting it going forward.
>
> Thanks,
> Vijay
>
> On Fri, Jun 11, 2021 at 10:52 AM Grindeanu, Iulian R.
> <iulian at mcs.anl.gov> wrote:
>> Hello Olivier,
>> Migration of meshes with the ParallelComm methods is not a well-tested piece of code; it is rather old.
>> Your attachment did not go through; can you send it again to me directly?
>>
>> I would be happy to assist. In principle, the data needs to be packed, then sent/received using MPI, then unpacked.
>> We use the packing and unpacking quite extensively in currently tested code; we now prefer migrating full meshes from one set of processes to another, repartitioning on the fly with Zoltan-based methods (more precisely, via the newer iMOAB interface).
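>>
>> Schematically, the pattern is the one below (plain MPI; moab_pack and
>> moab_unpack are placeholders for what ParallelComm's pack_entities /
>> unpack_entities machinery does internally):
>>
>>     #include <mpi.h>
>>     #include <vector>
>>
>>     // Sender side: serialize the entities into a byte buffer, ship it.
>>     std::vector<unsigned char> buf = moab_pack(cells_to_migrate);
>>     MPI_Send(buf.data(), (int)buf.size(), MPI_UNSIGNED_CHAR,
>>              dest_rank, 0, MPI_COMM_WORLD);
>>
>>     // Receiver side: probe for the message size, receive, rebuild the
>>     // entities (with their vertices and tags) in the local database.
>>     MPI_Status st;
>>     MPI_Probe(src_rank, 0, MPI_COMM_WORLD, &st);
>>     int nbytes;
>>     MPI_Get_count(&st, MPI_UNSIGNED_CHAR, &nbytes);
>>     std::vector<unsigned char> rbuf(nbytes);
>>     MPI_Recv(rbuf.data(), nbytes, MPI_UNSIGNED_CHAR, src_rank, 0,
>>              MPI_COMM_WORLD, MPI_STATUS_IGNORE);
>>     moab_unpack(rbuf);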
>>
>> I am pretty sure ParallelComm methods will work with minor fixes.
>>
>> We also support an iMesh/ITAPS-based framework, which could assist in migrating elements, but that is just another layer/API; the heavy-duty work is still performed by ParallelComm.
>>
>> Thanks,
>> Iulian
>>
>> ________________________________________
>> From: moab-dev <moab-dev-bounces at mcs.anl.gov> on behalf of Olivier Jamond <olivier.jamond at cea.fr>
>> Sent: Thursday, June 10, 2021 7:32 AM
>> To: moab-dev at mcs.anl.gov
>> Subject: [MOAB-dev] Help for migration of cells between MPI subdomains with moab
>>
>> Dear Moab developers,
>>
>> I am working for the French CEA
>> (https://en.wikipedia.org/wiki/French_Alternative_Energies_and_Atomic_Energy_Commission)
>> on the development of a new code dedicated to HPC simulations of
>> various mechanical scenarios (structures, fluids, FSI, other couplings,
>> ...) using mesh-based methods (finite elements, finite volumes,
>> hybrid discontinuous Galerkin, ...).
>>
>> This new code uses moab to manage the topological connections between
>> geometrical entities in a distributed memory context.
>>
>> For several reasons, especially for dynamic load balancing, we will need
>> to be able to migrate cells between MPI processes during calculations.
>> But at this time I am quite stuck on this with moab... So I wonder if
>> maybe you could help me with that...
>>
>> I struggled for some time with the moab::ParallelComm class, trying to
>> migrate a cell from one process to another, but without success. I
>> attached to this email a very simple toy piece of code which illustrates
>> that. In this simple program, I construct a basic mesh with 2 quads on
>> proc 0, and I would like to migrate one of these quads to proc 1, and
>> then migrate it back to proc 0. As I wrote in some comments in this
>> code, I tried using different functions ('high-level' send/recv_entities,
>> send_recv_entities, 'low-level' pack/unpack_entities), but without
>> success yet...
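>>
>> (For reference, the toy mesh is set up roughly along these lines; this
>> is a condensed sketch, not the attached file itself:)
>>
>>     #include <algorithm>
>>     #include "moab/Core.hpp"
>>
>>     // Two unit quads sharing an edge, created on proc 0 only.
>>     moab::Core mb;
>>     double coords[] = { 0,0,0,  1,0,0,  2,0,0,
>>                         0,1,0,  1,1,0,  2,1,0 };
>>     moab::Range verts;
>>     mb.create_vertices(coords, 6, verts);
>>
>>     moab::EntityHandle v[6];
>>     std::copy(verts.begin(), verts.end(), v);
>>     moab::EntityHandle conn1[4] = { v[0], v[1], v[4], v[3] };
>>     moab::EntityHandle conn2[4] = { v[1], v[2], v[5], v[4] };
>>     moab::EntityHandle quad1, quad2;
>>     mb.create_element(moab::MBQUAD, conn1, 4, quad1);
>>     mb.create_element(moab::MBQUAD, conn2, 4, quad2);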
>>
>> I would really appreciate it if you could help me with this and maybe
>> take a look at the attached simple code.
>>
>> Best regards,
>> Olivier Jamond
>>
>>

