[petsc-dev] Valgrind defect: memory leak with PetscCommDuplicate?
Boyce Griffith
griffith at cims.nyu.edu
Fri Feb 6 13:13:00 CST 2015
> On Feb 6, 2015, at 2:04 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
>
> If you don't see any memory leaks directly from PETSc routines then it is a problem in OpenMPI.
OpenMPI intentionally does stuff that is not Valgrind clean:
http://www.open-mpi.org/faq/?category=debugging#valgrind_clean
I've personally had mixed success with the OpenMPI-provided suppression file in the past, but I haven't tried it in a while.
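If you do want to give it another try, the idea from that FAQ page is just to point valgrind at the suppressions file that ships with OpenMPI, something along these lines (the exact path depends on where your OpenMPI is installed, so adjust as needed):

    mpirun -np 2 valgrind --leak-check=full \
        --suppressions=/path/to/openmpi/share/openmpi/openmpi-valgrind.supp \
        ./your_app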
-- Boyce
>
> We've found that MPICH seems to have fewer memory leaks than OpenMPI.
>
> Barry
>
>> On Feb 6, 2015, at 8:52 AM, Brendan Kochunas <bkochuna at umich.edu> wrote:
>>
>> Hi, we are trying to clear Valgrind defects from our code, and presently Valgrind is reporting memory leaks like the following:
>>
>> ==2884== at 0x4A07EB7: malloc (vg_replace_malloc.c:296)
>> ==2884== by 0x8519874: set_value.isra.0.part.1 (in /gcc-4.6.1/toolset/openmpi-1.4.3/lib/libmpi.so.0.0.2)
>> ==2884== by 0x8547E4D: PMPI_Attr_put (in /gcc-4.6.1/toolset/openmpi-1.4.3/lib/libmpi.so.0.0.2)
>> ==2884== by 0x113A153: PetscCommDuplicate
>> ==2884== by 0x113BFA3: PetscHeaderCreate_Private
>> ==2884== by 0x129ADC6: MatCreate
>> ==2884== by 0x123C355: MatMPIAIJSetPreallocation_MPIAIJ
>> ==2884== by 0x124DAB4: MatMPIAIJSetPreallocation
>> ==2884== by 0x12549B0: MatSetUp_MPIAIJ
>> ==2884== by 0x1191186: MatSetUp
>> ==2884== by 0x10D9333: matsetup_
>>
>> and...
>>
>> ==2884== at 0x4A07EB7: malloc (vg_replace_malloc.c:296)
>> ==2884== by 0x8519874: set_value.isra.0.part.1 (in /gcc-4.6.1/toolset/openmpi-1.4.3/lib/libmpi.so.0.0.2)
>> ==2884== by 0x8547E4D: PMPI_Attr_put (in /gcc-4.6.1/toolset/openmpi-1.4.3/lib/libmpi.so.0.0.2)
>> ==2884== by 0x113A22D: PetscCommDuplicate
>> ==2884== by 0x113BFA3: PetscHeaderCreate_Private
>> ==2884== by 0x10ED42B: KSPCreate
>> ==2884== by 0x10DA55C: kspcreate_
>>
>> Is the development team aware of any memory leaks that may originate in PetscCommDuplicate when it is used as in the call stacks shown above?
>>
>> The version of PETSc we are linking with is 3.3 patch 4, built against OpenMPI 1.4.3.
>>
>> We are trying to determine if the leak is due to:
>> 1. Our code's usage of PETSc
>> 2. The actual PETSc library
>> 3. PETSc's usage of MPI
>> 4. The OpenMPI library that PETSc was built against
>>
>> Any help pointing to the culprit, or suggestions for particular tests (e.g., a PETSc example) worth running to identify the root issue, would be appreciated.
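>>
>> One test we could run ourselves is a stripped-down driver that only creates and destroys the objects appearing in the traces above, with none of our other code involved. Something along these lines (just a sketch in C, even though our code goes through the Fortran bindings; error handling is the usual CHKERRQ):
>>
>>   #include <petscksp.h>
>>
>>   int main(int argc, char **argv)
>>   {
>>     Mat            A;
>>     KSP            ksp;
>>     PetscErrorCode ierr;
>>
>>     ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);
>>
>>     /* Same path as the first trace: MatSetUp on an MPIAIJ matrix,
>>        which preallocates via MatMPIAIJSetPreallocation internally. */
>>     ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
>>     ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 100, 100);CHKERRQ(ierr);
>>     ierr = MatSetType(A, MATMPIAIJ);CHKERRQ(ierr);
>>     ierr = MatSetUp(A);CHKERRQ(ierr);
>>
>>     /* Same path as the second trace: KSPCreate. */
>>     ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
>>
>>     /* Destroy everything before PetscFinalize so any remaining report
>>        points at PETSc/MPI rather than at our own object management. */
>>     ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
>>     ierr = MatDestroy(&A);CHKERRQ(ierr);
>>     ierr = PetscFinalize();
>>     return 0;
>>   }
>>
>> If a driver like this comes out Valgrind-clean (or only shows the OpenMPI-related reports), we would know the remaining leaks come from our own usage.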
>>
>> Thanks in advance!
>> -Brendan
>>