[petsc-users] Request for Assistance with Parallel Matrix Solving Using petsc4py in EasyFEA
Matthieu Noel
matthieu.noel at inria.fr
Sat Mar 21 05:01:40 CDT 2026
Sorry for my late response.
I found out how to fix the issue: I had to renumber the DOFs so that the
rows owned by each MPI rank form a contiguous block of the matrix system:
https://github.com/matnoel/EasyFEA/blob/dev/EasyFEA/Simulations/Solvers.py#L523-L618
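For anyone hitting the same problem: the renumbering step can be sketched in plain NumPy (illustrative names only, not EasyFEA's actual code). Given the MPI rank that owns each global DOF, a stable sort yields a permutation in which every rank's DOFs occupy one contiguous block, matching PETSc's requirement that each process own a contiguous range of matrix rows.

```python
import numpy as np

def contiguous_renumbering(dof_rank):
    """Map old global DOF numbers to new ones so that each rank's
    DOFs form one contiguous block (rank 0 first, then rank 1, ...)."""
    dof_rank = np.asarray(dof_rank)
    # Stable sort keeps the relative order of DOFs within each rank.
    old_of_new = np.argsort(dof_rank, kind="stable")
    # Invert the permutation: new_of_old[old] = new.
    new_of_old = np.empty_like(old_of_new)
    new_of_old[old_of_new] = np.arange(old_of_new.size)
    return new_of_old

# Example: 6 DOFs owned by ranks [1, 0, 1, 0, 0, 1]
# -> new numbering [3, 0, 4, 1, 2, 5]: rank 0 owns new rows 0-2,
#    rank 1 owns new rows 3-5.
```

The assembled matrix and right-hand side are then permuted with the same map before being handed to PETSc.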
Best regards,
Matthieu Noel
Le 11/03/2026 à 21:56, Barry Smith a écrit :
>
> Does it compute the “correct” result with one MPI process?
>
>> On Mar 10, 2026, at 4:29 AM, Matthieu Noel <matthieu.noel at inria.fr>
>> wrote:
>>
>> Dear PETSc users,
>>
>> My name is Matthieu Noel, and I am a research software engineer at INRIA.
>>
>> During my PhD thesis, I developed *EasyFEA*, an open-source finite
>> element analysis tool documented here:
>> https://easyfea.readthedocs.io/en/stable/ .
>>
>> I am currently working on *Issue #26*
>> (https://github.com/matnoel/EasyFEA/issues/26), where I aim to use
>> *petsc4py* to solve a simple stationary thermal problem in parallel.
>>
>> The main challenge is *constructing and solving the parallel matrix
>> system* (Ax = b).
>>
>>
>> Current Status:
>>
>> * *Mesh partitioning* has been successfully performed using *GMSH*.
>> * EasyFEA provides the *assembled matrix system* (Ax = b) with
>> applied boundary conditions.
>> * My current implementation using *petsc4py* runs without errors,
>> but it produces incorrect results.
>>
>> <DbhO7xbQSISvapJk.png>
>> I suspect the problem lies in how the *parallel matrix and vectors
>> are assembled or solved*, particularly in handling *ghost DOFs* and
>> ensuring proper communication between MPI ranks.
>>
>> I would be extremely grateful for any guidance, resources, or
>> examples you could share to help me address this issue.
>> My current development branch is available here:
>> https://github.com/matnoel/EasyFEA/tree/26_mpi .
>>
>> The PETSc function can be found at this link:
>> https://github.com/matnoel/EasyFEA/blob/26_mpi/EasyFEA/Simulations/Solvers.py#L649-L767
>>
>> I am currently running the attached script (Thermal1.py).
>>
>>
>> Thank you in advance for your time and expertise.
>>
>> Best regards,
>> Matthieu Noel
>>
>>
>>
>> <Thermal1.py>
>