[petsc-users] Mixing PETSc Parallelism with Serial MMG3D Workflow
neil liu
liufield at gmail.com
Wed Apr 23 11:56:38 CDT 2025
Thanks a lot, Pierre.
Do you have any suggestions for building a real serial DM from this gatherDM?
I have tried several approaches, but none of them work.
DMClone?
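
To make it concrete, the direction I have been trying most recently looks
roughly like this (an untested sketch on my side: it rebuilds the mesh on
PETSC_COMM_SELF with DMPlexCreateFromCellListPetsc, assumes an all-tet mesh,
and still drops all the labels, so it is certainly not complete):

  DM                 unint = NULL, serialDM = NULL;
  Vec                coordVec;
  const PetscScalar *xc;
  PetscMPIInt        rank;

  /* collective calls stay on the original communicator, on every rank */
  PetscCall(DMPlexUninterpolate(gatherDM, &unint)); /* cell cones become vertices */
  PetscCall(DMGetCoordinatesLocal(unint, &coordVec));
  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));
  if (rank == 0) { /* only rank 0 owns points after the gather */
    PetscInt   dim, cdim, cStart, cEnd, vStart, vEnd, nC, nV, corners;
    PetscInt  *cells;
    PetscReal *coords;

    PetscCall(DMGetDimension(unint, &dim));
    PetscCall(DMGetCoordinateDim(unint, &cdim));
    PetscCall(DMPlexGetHeightStratum(unint, 0, &cStart, &cEnd)); /* cells */
    PetscCall(DMPlexGetDepthStratum(unint, 0, &vStart, &vEnd));  /* vertices */
    nC      = cEnd - cStart;
    nV      = vEnd - vStart;
    corners = dim + 1; /* tetrahedra assumed */
    PetscCall(PetscMalloc2(nC * corners, &cells, nV * cdim, &coords));
    for (PetscInt c = cStart; c < cEnd; ++c) {
      const PetscInt *cone;
      PetscCall(DMPlexGetCone(unint, c, &cone));
      for (PetscInt i = 0; i < corners; ++i) cells[(c - cStart) * corners + i] = cone[i] - vStart;
    }
    PetscCall(VecGetArrayRead(coordVec, &xc));
    for (PetscInt i = 0; i < nV * cdim; ++i) coords[i] = PetscRealPart(xc[i]);
    PetscCall(VecRestoreArrayRead(coordVec, &xc));
    /* build the serial mesh on PETSC_COMM_SELF so later MMG-related calls
       on it do not wait on the other ranks */
    PetscCall(DMPlexCreateFromCellListPetsc(PETSC_COMM_SELF, dim, nC, nV, corners, PETSC_TRUE, cells, cdim, coords, &serialDM));
    PetscCall(PetscFree2(cells, coords));
  }
  PetscCall(DMDestroy(&unint));
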
Thanks,
On Wed, Apr 23, 2025 at 11:39 AM Pierre Jolivet <pierre at joliv.et> wrote:
>
>
> On 23 Apr 2025, at 5:31 PM, neil liu <liufield at gmail.com> wrote:
>
> Thanks a lot, Stefano.
> I tried DMPlexGetGatherDM and DMPlexDistributeField, and they give what we
> expected.
> The final gatherDM is listed below: rank 0 has all the information (which
> is correct) while rank 1 has nothing.
> Then I tried to feed this gatherDM into the MMG adaptor on rank 0 only
> (MMG seems to work better than ParMMG, which is why I want to try MMG
> first), but it gets stuck at collective PETSc functions in
> DMAdaptMetric_Mmg_Plex().
> By the way, the present workflow works well with 1 rank.
>
> Do you have any suggestions? Should I build a real serial DM?
>
>
> Yes, you need to change the underlying MPI_Comm as well, but I’m not sure
> if there is any user-facing API for doing this with a one-liner.
>
> Thanks,
> Pierre
>
> Thanks a lot.
> Xiaodong
>
> DM Object: Parallel Mesh 2 MPI processes
> type: plex
> Parallel Mesh in 3 dimensions:
> Number of 0-cells per rank: 56 0
> Number of 1-cells per rank: 289 0
> Number of 2-cells per rank: 452 0
> Number of 3-cells per rank: 216 0
> Labels:
> depth: 4 strata with value/size (0 (56), 1 (289), 2 (452), 3 (216))
> celltype: 4 strata with value/size (0 (56), 1 (289), 3 (452), 6 (216))
> Cell Sets: 2 strata with value/size (29 (152), 30 (64))
> Face Sets: 3 strata with value/size (27 (8), 28 (40), 101 (20))
> Edge Sets: 1 strata with value/size (10 (10))
> Vertex Sets: 5 strata with value/size (27 (2), 28 (6), 29 (2), 101 (4),
> 106 (4))
> Field Field_0:
> adjacency FEM
>
>
>
> On Fri, Apr 18, 2025 at 10:09 AM Stefano Zampini <stefano.zampini at gmail.com> wrote:
>
>> If you have a vector distributed on the original mesh, then you can use
>> the SF returned by DMPlexGetGatherDM and use that in a call to
>> DMPlexDistributeField.
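>>
>> Roughly like this (untested sketch; "errLocal" here stands for a local Vec
>> laid out by the DM's local section, which is my assumption about your setup):
>>
>>   PetscSF      gatherSF;
>>   DM           gatherDM;
>>   PetscSection locSec, gatherSec;
>>   Vec          errGathered;
>>
>>   PetscCall(DMPlexGetGatherDM(dm, &gatherSF, &gatherDM));
>>   PetscCall(DMGetLocalSection(dm, &locSec));
>>   PetscCall(PetscSectionCreate(PetscObjectComm((PetscObject)dm), &gatherSec));
>>   PetscCall(VecCreate(PetscObjectComm((PetscObject)dm), &errGathered));
>>   /* same SF that gathered the mesh, so the field values end up in the
>>      gathered DM's point order; errLocal is the assumed local field Vec */
>>   PetscCall(DMPlexDistributeField(dm, gatherSF, locSec, errLocal, gatherSec, errGathered));
>>
>> On rank 0, gatherDM and errGathered should then describe the whole mesh and
>> field consistently.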
>>
>> On Fri, Apr 18, 2025 at 5:02 PM neil liu <liufield at gmail.com> wrote:
>>
>>> Dear PETSc developers and users,
>>>
>>> I am currently exploring the integration of MMG3D with PETSc. Since
>>> MMG3D supports only serial execution, I am planning to combine parallel and
>>> serial computing in my workflow. Specifically, after solving the linear
>>> systems in parallel using PETSc:
>>>
>>> 1. I intend to use DMPlexGetGatherDM to collect the entire mesh on the
>>> root process for input to MMG3D.
>>>
>>> 2. Additionally, I plan to gather the error field onto the root process
>>> using VecScatter (a sketch of what I have in mind follows).
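>>>
>>> Concretely, for step 2 (sketch only; "errGlobal" is my name for the global
>>> error Vec on the distributed mesh):
>>>
>>>   VecScatter toZero;
>>>   Vec        errOnRoot;
>>>
>>>   PetscCall(VecScatterCreateToZero(errGlobal, &toZero, &errOnRoot));
>>>   PetscCall(VecScatterBegin(toZero, errGlobal, errOnRoot, INSERT_VALUES, SCATTER_FORWARD));
>>>   PetscCall(VecScatterEnd(toZero, errGlobal, errOnRoot, INSERT_VALUES, SCATTER_FORWARD));
>>>   /* this concatenates values in the parallel (global PETSc) ordering, which
>>>      is why I am unsure the nth entry matches the nth point of the gathered mesh */
>>>   PetscCall(VecScatterDestroy(&toZero));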
>>>
>>> However, I am concerned that the nth value in the gathered error vector
>>> (step 2) may not correspond to the nth element in the gathered mesh (step
>>> 1). Is this a valid concern?
>>>
>>> Do you have any suggestions or recommended practices for ensuring
>>> correct correspondence between the solution fields and the mesh when
>>> switching from parallel to serial mode?
>>>
>>> Thanks,
>>>
>>> Xiaodong
>>>
>>
>>
>> --
>> Stefano
>>
>
>