[petsc-users] Inquiry about the definitely lost memory

Barry Smith bsmith at petsc.dev
Fri Jun 9 16:36:39 CDT 2023


   If the program does not run to completion, that is, if it terminates early due to an error condition, we do not attempt to recover all the memory and resources, so valgrind will report them as lost.
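   For example, in a minimal sketch like the following (hypothetical code, not your actual ex1.c), an error detected by any PetscCall() ends up in MPI_Abort() (as in your trace), so execution never reaches the cleanup calls at the bottom and valgrind reports those resources as definitely lost:

#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM dm;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(DMCreate(PETSC_COMM_WORLD, &dm));
  PetscCall(DMSetType(dm, DMPLEX));
  PetscCall(DMSetFromOptions(dm));
  /* if any call above or here fails, the error path ends in MPI_Abort()
     and the two cleanup calls below never run */
  PetscCall(DMDestroy(&dm));   /* skipped on early termination */
  PetscCall(PetscFinalize());  /* ditto: the MPI/PMI objects stay live */
  return 0;
}

   That is consistent with your stack trace below: the "definitely lost" block is allocated inside the PMI abort path itself, reached via PetscError() from main() at ex1.c:764.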

> On Jun 9, 2023, at 4:33 PM, neil liu <liufield at gmail.com> wrote:
> 
> Thanks a lot,  Matt and Barry.  
> 
> Indeed, I found the original leak; it traces back to something related to DMGetWorkArray. 
> ==15547== 50,000 bytes in 1 blocks are definitely lost in loss record 2,786 of 2,791
> ==15547==    at 0x4C37135: malloc (vg_replace_malloc.c:381)
> ==15547==    by 0x9BE4E43: MPL_malloc (mpl_trmem.h:373)
> ==15547==    by 0x9BE6B3B: PMIU_cmd_add_int (pmi_wire.c:538)
> ==15547==    by 0x9BEB7C6: PMIU_msg_set_query_abort (pmi_msg.c:322)
> ==15547==    by 0x9BE16F0: PMI_Abort (pmi_v1.c:327)
> ==15547==    by 0x9A8E3E7: MPIR_pmi_abort (mpir_pmi.c:243)
> ==15547==    by 0x9B20BC7: MPID_Abort (mpid_abort.c:67)
> ==15547==    by 0x9A22823: MPIR_Abort_impl (init_impl.c:270)
> ==15547==    by 0x97FFF02: internal_Abort (abort.c:65)
> ==15547==    by 0x98000C3: PMPI_Abort (abort.c:112)
> ==15547==    by 0x58FE116: PetscError (err.c:403)
> ==15547==    by 0x410CA3: main (ex1.c:764)  // this line in ex1.c calls DMDestroy()
> 
> On Fri, Jun 9, 2023 at 12:38 PM Barry Smith <bsmith at petsc.dev> wrote:
>> 
>>   These are MPI objects PETSc creates in PetscInitialize(); for successful runs they should all be removed in PetscFinalize(), hence they should not appear as valgrind leaks.
>> 
>>   Are you sure PetscFinalize() is called and completes?
>> 
>>   We'll need the exact PETSc version you are using to know exactly which MPI object is not being destroyed.
>> 
>> 
>>   Barry
>> 
>> 
>> > On Jun 9, 2023, at 12:01 PM, neil liu <liufield at gmail.com> wrote:
>> > 
>> > Dear Petsc developers, 
>> > 
>> > I am using valgrind to check for memory leaks. It shows: 
>> > [screenshot attachment: valgrind output showing MPI-related leaks]
>> > Finally, I found that calling DMPlexRestoreTransitiveClosure resolves this memory leak. 
>> > 
>> > My question is: from the above screenshot, it seems the leak is related to MPI. How can I relate that report to DMPlexRestoreTransitiveClosure?
>> > 
>> > Thanks, 
>> > 
>> > Xiaodong 
>> 
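   Regarding the DMPlexRestoreTransitiveClosure() question above: DMPlexGetTransitiveClosure() hands back an array that PETSc obtains internally through DMGetWorkArray(), and the matching Restore call is what returns it, which is consistent with the DMGetWorkArray connection mentioned earlier. A minimal sketch of the pairing (hypothetical helper; the DM is assumed to come from elsewhere):

#include <petscdmplex.h>

/* Hypothetical helper: visit the transitive closure of every mesh point. */
static PetscErrorCode VisitClosures(DM dm)
{
  PetscInt pStart, pEnd;

  PetscFunctionBeginUser;
  PetscCall(DMPlexGetChart(dm, &pStart, &pEnd));
  for (PetscInt p = pStart; p < pEnd; ++p) {
    PetscInt clSize, *closure = NULL;

    PetscCall(DMPlexGetTransitiveClosure(dm, p, PETSC_TRUE, &clSize, &closure));
    for (PetscInt i = 0; i < 2 * clSize; i += 2) {
      PetscInt q = closure[i]; /* entries come in (point, orientation) pairs */
      (void)q;                 /* ... use q here ... */
    }
    /* every Get must be paired with a Restore with the same arguments;
       otherwise the work array is never returned and shows up as a leak */
    PetscCall(DMPlexRestoreTransitiveClosure(dm, p, PETSC_TRUE, &clSize, &closure));
  }
  PetscFunctionReturn(PETSC_SUCCESS);
}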

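   For reference, two handy checks (command lines illustrative, assuming the executable is ./ex1): valgrind under MPI reports leaks per process,

$ mpiexec -n 2 valgrind --leak-check=full ./ex1

while PETSc's own tracing,

$ ./ex1 -malloc_debug -malloc_dump

prints whatever PetscMalloc()'d memory is still held when PetscFinalize() runs, which also confirms that PetscFinalize() is actually reached.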

