[petsc-dev] MPICH from --download-mpich reports inconsistent allocs/frees with valgrind

Patrick Sanan patrick.sanan at gmail.com
Fri Jul 26 04:10:59 CDT 2019


https://github.com/pmodels/mpich/issues/3945

On Mon, Jul 22, 2019 at 8:24 PM Smith, Barry F. <bsmith at mcs.anl.gov> wrote:

>
>   Bug report to MPICH.
>
> > On Jul 22, 2019, at 1:22 PM, Balay, Satish via petsc-dev <petsc-dev at mcs.anl.gov> wrote:
> >
> > Hm - I don't think we were monitoring the leaks via valgrind that closely.
> >
> > Looking at my old mpich install - I don't see a problem - so likely
> > it's an issue with newer versions of mpich.
> >
> > Satish
> >
> > -------
> > balay at sb /home/balay/tmp
> > $ mpichversion
> > MPICH Version:        3.3
> > MPICH Release date:   Wed Nov 21 11:32:40 CST 2018
> > MPICH Device:         ch3:sock
> > MPICH configure:      --prefix=/home/balay/soft/mpich-3.3
> >   MAKE=/usr/bin/gmake --libdir=/home/balay/soft/mpich-3.3/lib CC=gcc
> >   CFLAGS=-fPIC -g -O AR=/usr/bin/ar ARFLAGS=cr CXX=g++ CXXFLAGS=-g -O -fPIC
> >   F77=gfortran FFLAGS=-fPIC -g -O FC=gfortran FCFLAGS=-fPIC -g -O
> >   --enable-shared --with-device=ch3:sock --with-pm=hydra --enable-fast=no
> >   --enable-error-messages=all --enable-g=meminit
> > MPICH CC:     gcc -fPIC -g -O   -O0
> > MPICH CXX:    g++ -g -O -fPIC  -O0
> > MPICH F77:    gfortran -fPIC -g -O  -O0
> > MPICH FC:     gfortran -fPIC -g -O  -O0
> > MPICH Custom Information:
> > balay at sb /home/balay/tmp
> > $ printf "#include<mpi.h>\nint main(int a,char**b){MPI_Init(&a,&b);MPI_Finalize();}" > t.c && mpicc t.c && valgrind ./a.out
> > ==9024== Memcheck, a memory error detector
> > ==9024== Copyright (C) 2002-2017, and GNU GPL'd, by Julian Seward et al.
> > ==9024== Using Valgrind-3.15.0 and LibVEX; rerun with -h for copyright info
> > ==9024== Command: ./a.out
> > ==9024==
> > ==9024==
> > ==9024== HEAP SUMMARY:
> > ==9024==     in use at exit: 0 bytes in 0 blocks
> > ==9024==   total heap usage: 1,886 allocs, 1,886 frees, 4,884,751 bytes allocated
> > ==9024==
> > ==9024== All heap blocks were freed -- no leaks are possible
> > ==9024==
> > ==9024== For lists of detected and suppressed errors, rerun with: -s
> > ==9024== ERROR SUMMARY: 0 errors from 0 contexts (suppressed: 0 from 0)
> > balay at sb /home/balay/tmp
> >
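> > BTW: to double-check which MPI library a binary actually picks up at
> > runtime (as opposed to which mpicc is on PATH), one can print the
> > library version string via the standard MPI-3 call
> > MPI_Get_library_version(). A minimal sketch (version.c is just an
> > example name):
> >
> >     /* version.c: print the MPI library version string at runtime */
> >     #include <mpi.h>
> >     #include <stdio.h>
> >
> >     int main(int argc, char **argv)
> >     {
> >         char version[MPI_MAX_LIBRARY_VERSION_STRING];
> >         int len;
> >         MPI_Init(&argc, &argv);
> >         MPI_Get_library_version(version, &len);
> >         printf("%s\n", version);
> >         MPI_Finalize();
> >         return 0;
> >     }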
> >
> >
> > On Mon, 22 Jul 2019, Patrick Sanan via petsc-dev wrote:
> >
> >> It was pointed out to me that valgrind memcheck reports inconsistent
> >> heap usage information when running PETSc examples. All blocks are
> >> reported freed, yet the number of allocations and frees are different.
> >> My guess as to what's going on is that this is an MPICH issue, as I can
> >> reproduce the behavior with a minimal MPI program.
> >>
> >> From the PETSc perspective, is this a known issue? I'm wondering if this
> >> inconsistency was always there, whether it's worth looking into more, etc.
> >>
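> >> Normally memcheck's heap summary is self-consistent: the number of
> >> blocks in use at exit equals total allocs minus total frees. So alloc
> >> and free counts that differ, alongside "0 bytes in 0 blocks" in use,
> >> are themselves the anomaly. For contrast, here is a deliberately leaky
> >> sketch where the arithmetic does line up (leak.c and its contents are
> >> hypothetical, just for illustration):
> >>
> >>     /* leak.c: leak one block on purpose; in memcheck's heap summary,
> >>        allocs minus frees should then match the single 16-byte block
> >>        still in use at exit */
> >>     #include <stdlib.h>
> >>
> >>     int main(void)
> >>     {
> >>         void *freed  = malloc(16);
> >>         void *leaked = malloc(16);
> >>         free(freed);
> >>         (void)leaked; /* never freed: counted as in use at exit */
> >>         return 0;
> >>     }
> >>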
> >> Here's a one-liner to reproduce, using a PETSc master build with
> >> --download-mpich (though note that this doesn't use anything from PETSc
> >> except the MPICH it builds for you); the same program is written out as
> >> a standalone file after the output below.
> >>
> >>     printf "#include<mpi.h>\nint main(int a,char **b){MPI_Init(&a,&b);MPI_Finalize();}" > t.c && $PETSC_DIR/$PETSC_ARCH/bin/mpicc t.c && valgrind ./a.out
> >>
> >> ==14242== Memcheck, a memory error detector
> >> ==14242== Copyright (C) 2002-2017, and GNU GPL'd, by Julian Seward et al.
> >> ==14242== Using Valgrind-3.13.0 and LibVEX; rerun with -h for copyright info
> >> ==14242== Command: ./a.out
> >> ==14242==
> >> ==14242==
> >> ==14242== HEAP SUMMARY:
> >> ==14242==     in use at exit: 0 bytes in 0 blocks
> >> ==14242==   total heap usage: *1,979 allocs, 1,974 frees*, 4,720,483 bytes allocated
> >> ==14242==
> >> ==14242== All heap blocks were freed -- no leaks are possible
> >> ==14242==
> >> ==14242== For counts of detected and suppressed errors, rerun with: -v
> >> ==14242== ERROR SUMMARY: 0 errors from 0 contexts (suppressed: 0 from 0)
> >>
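> >> For readability, this is the program the printf above writes to t.c,
> >> expanded into a standalone file (the same code, just with conventional
> >> argc/argv names and an explicit return):
> >>
> >>     /* t.c: minimal MPI program - MPI_Init and MPI_Finalize only, so
> >>        any alloc/free imbalance valgrind reports comes from the MPI
> >>        library or runtime rather than user code */
> >>     #include <mpi.h>
> >>
> >>     int main(int argc, char **argv)
> >>     {
> >>         MPI_Init(&argc, &argv);
> >>         MPI_Finalize();
> >>         return 0;
> >>     }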
> >
>
>