<div dir="ltr">If you have a vector distributed on the original mesh, then you can use the SF returned by DMPlexGetGatherDM and use that in a call to DMPlexDistributeField</div><br><div class="gmail_quote gmail_quote_container"><div dir="ltr" class="gmail_attr">Il giorno ven 18 apr 2025 alle ore 17:02 neil liu <<a href="mailto:liufield@gmail.com">liufield@gmail.com</a>> ha scritto:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr"><p>Dear PETSc developers and users,</p>
On Fri, Apr 18, 2025 at 5:02 PM neil liu <liufield@gmail.com> wrote:

> Dear PETSc developers and users,
>
> I am currently exploring the integration of MMG3D with PETSc. Since MMG3D supports only serial execution, I plan to combine parallel and serial computing in my workflow. Specifically, after solving the linear systems in parallel with PETSc:
>
> 1. I intend to use DMPlexGetGatherDM to collect the entire mesh on the root process for input to MMG3D.
> 2. Additionally, I plan to gather the error field onto the root process using VecScatter.
>
> However, I am concerned that the nth value in the gathered error vector (step 2) may not correspond to the nth element in the gathered mesh (step 1). Is this a valid concern?
>
> Do you have any suggestions or recommended practices for ensuring correct correspondence between the solution fields and the mesh when switching from parallel to serial mode?
>
> Thanks,
>
> Xiaodong
--
Stefano