<!DOCTYPE html>
<html style="scroll-behavior: smooth;">
<head>
<meta http-equiv="content-type" content="text/html; charset=UTF-8">
</head>
<body>
<p>Dear PETSc users,</p>
<p>My name is Matthieu Noel, and I am a research software engineer
at INRIA.<br>
<br>
During my PhD thesis, I developed <strong>EasyFEA</strong>, an
open-source finite element analysis tool documented here: <a class="moz-txt-link-freetext" href="https://easyfea.readthedocs.io/en/stable/">https://easyfea.readthedocs.io/en/stable/</a>.</p>
<p>I am currently working on <strong>Issue #26</strong> (<a class="moz-txt-link-freetext" href="https://github.com/matnoel/EasyFEA/issues/26">https://github.com/matnoel/EasyFEA/issues/26</a>),
where I aim to use <strong>petsc4py</strong> to solve a simple
steady-state thermal problem in parallel.<br>
<br>
The main challenge is <strong>constructing and solving the
parallel matrix system</strong> (<code>Ax = b</code>).</p>
<h3>Current Status:</h3>
<ul>
<li><strong>Mesh partitioning</strong> has been successfully
performed using <strong>GMSH</strong>.</li>
<li>EasyFEA provides the <strong>assembled matrix system</strong>
(<code>Ax = b</code>) with applied boundary
conditions.</li>
<li>My current implementation using <strong>petsc4py</strong>
runs without errors but produces incorrect results.</li>
</ul>
<p><img src="cid:part1.I6q4VcHx.fv30lC2y@inria.fr" alt=""><br>
I suspect the problem lies in how the <strong>parallel matrix and
vectors are assembled or solved</strong>, particularly in
handling <strong>ghost DOFs</strong> and ensuring proper
communication between MPI ranks.</p>
<p>I would be extremely grateful for any guidance, resources, or
examples you could share to help me address this issue.<br>
My current development branch is available here:
<a class="moz-txt-link-freetext" href="https://github.com/matnoel/EasyFEA/tree/26_mpi">https://github.com/matnoel/EasyFEA/tree/26_mpi</a>.</p>
<p>The petsc4py solver routine can be found here:
<a class="moz-txt-link-freetext" href="https://github.com/matnoel/EasyFEA/blob/26_mpi/EasyFEA/Simulations/Solvers.py#L649-L767">https://github.com/matnoel/EasyFEA/blob/26_mpi/EasyFEA/Simulations/Solvers.py#L649-L767</a></p>
<p>The script I am currently running is attached.</p>
<p>Thank you in advance for your time and expertise.</p>
<p>Best regards,<br>
Matthieu Noel</p>
</body>
</html>