<html><head><meta http-equiv="content-type" content="text/html; charset=utf-8"></head><body style="overflow-wrap: break-word; -webkit-nbsp-mode: space; line-break: after-white-space;"><div><br></div> If the program does not complete successfully, that is, if it terminates early due to an error condition, we do not attempt to recover all of the memory and resources.<br><div><br><blockquote type="cite"><div>On Jun 9, 2023, at 4:33 PM, neil liu <liufield@gmail.com> wrote:</div><br class="Apple-interchange-newline"><div><div dir="ltr">Thanks a lot, Matt and Barry. <div><br><div>Indeed, I found that the original leak leads to something related to DMGetWorkArray. </div><div>==15547== 50,000 bytes in 1 blocks are definitely lost in loss record 2,786 of 2,791<br>==15547== at 0x4C37135: malloc (vg_replace_malloc.c:381)<br>==15547== by 0x9BE4E43: MPL_malloc (mpl_trmem.h:373)<br>==15547== by 0x9BE6B3B: PMIU_cmd_add_int (pmi_wire.c:538)<br>==15547== by 0x9BEB7C6: PMIU_msg_set_query_abort (pmi_msg.c:322)<br>==15547== by 0x9BE16F0: PMI_Abort (pmi_v1.c:327)<br>==15547== by 0x9A8E3E7: MPIR_pmi_abort (mpir_pmi.c:243)<br>==15547== by 0x9B20BC7: MPID_Abort (mpid_abort.c:67)<br>==15547== by 0x9A22823: MPIR_Abort_impl (init_impl.c:270)<br>==15547== by 0x97FFF02: internal_Abort (abort.c:65)<br>==15547== by 0x98000C3: PMPI_Abort (abort.c:112)<br>==15547== by 0x58FE116: PetscError (err.c:403)<br>==15547== by 0x410CA3: main (ex1.c:764) //Call DMDestroy();<br></div></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Fri, Jun 9, 2023 at 12:38 PM Barry Smith <<a href="mailto:bsmith@petsc.dev" target="_blank">bsmith@petsc.dev</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><br>
These are MPI objects PETSc creates in PetscInitialize(); for successful runs they should all be destroyed in PetscFinalize(), hence they should not appear as valgrind leaks.<br>
<br>
Are you sure PetscFinalize() is called and completes?<br>
<br>
We'll need the exact PETSc version you are using to know exactly which MPI object is not being destroyed.<br>
<br>
<br>
Barry<br>
<br>
<br>
> On Jun 9, 2023, at 12:01 PM, neil liu <<a href="mailto:liufield@gmail.com" target="_blank">liufield@gmail.com</a>> wrote:<br>
> <br>
> Dear Petsc developers, <br>
> <br>
> I am using valgrind to check for memory leaks. It shows: <br>
> <image.png><br>
> Finally, I found that DMPlexRestoreTransitiveClosure can resolve this memory leak. <br>
> <br>
> My question is: from the above screenshot, it seems the leak is related to MPI. How can I relate that report to DMPlexRestoreTransitiveClosure?<br>
> <br>
> Thanks, <br>
> <br>
> Xiaodong <br>
<br>
</blockquote></div>
</div></blockquote></div><br></body></html>