[petsc-users] Error in VecAssemblyBegin after VecView

Satish Balay balay at mcs.anl.gov
Fri Jan 6 08:35:00 CST 2012


These messages are from within MPI. If you wish to have a valgrind-clean
MPI, build one with --download-mpich=1. [Use a different PETSC_ARCH for
this build.]
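
For example, something along these lines (a minimal sketch; the
PETSC_ARCH name and the extra options are only placeholders - reuse
whatever options your current configure line already has):

  ./configure PETSC_ARCH=arch-mpich-valgrind --download-mpich=1 \
      --with-debugging=1 <your other configure options>
  make PETSC_ARCH=arch-mpich-valgrind all

Then rebuild your test against that PETSC_ARCH and run it under
valgrind with the mpiexec from that build, so your original build
stays untouched.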

Satish

On Fri, 6 Jan 2012, Johannes.Huber at unibas.ch wrote:

> Quoting Jed Brown <jedbrown at mcs.anl.gov>:
> 
> > On Thu, Jan 5, 2012 at 02:39, <Johannes.Huber at unibas.ch> wrote:
> > 
> > > When I take a look at the valgrind output, I see a few lines about
> > > unaddressable bytes. Is this a reason for concern?
> > > 
> > 
> > Please paste lines for questions like this.
> > 
> Here are the lines:
> 
> ==22507== Unaddressable byte(s) found during client check request
> ==22507==    at 0x4E34BFD: check_mem_is_defined_untyped (libmpiwrap.c:953)
> ==22507==    by 0x4E499AA: walk_type (libmpiwrap.c:691)
> ==22507==    by 0x4E4D7A3: PMPI_Allreduce (libmpiwrap.c:924)
> ==22507==    by 0x74846D0: MPIR_Get_contextid (in /usr/lib/libmpich.so.1.2)
> ==22507==    by 0x7484CF1: MPIR_Comm_copy (in /usr/lib/libmpich.so.1.2)
> ==22507==    by 0x747D280: PMPI_Comm_dup (in /usr/lib/libmpich.so.1.2)
> ==22507==    by 0x4E48C4A: PMPI_Comm_dup (libmpiwrap.c:2110)
> ==22507==    by 0x586878D: PetscCommDuplicate (tagm.c:149)
> ==22507==    by 0x54718AD: PetscHeaderCreate_Private (inherit.c:51)
> ==22507==    by 0x5891130: VecCreate (veccreate.c:39)
> ==22507==    by 0x400E3A: main (Test.C:10)
> ==22507==  Address 0xffffffffffffffff is not stack'd, malloc'd or (recently)
> free'd
> 
> and
> 
> ==22508== Unaddressable byte(s) found during client check request
> ==22508==    at 0x4E34BFD: check_mem_is_defined_untyped (libmpiwrap.c:953)
> ==22508==    by 0x4E499AA: walk_type (libmpiwrap.c:691)
> ==22508==    by 0x4E4D7A3: PMPI_Allreduce (libmpiwrap.c:924)
> ==22508==    by 0x74846D0: MPIR_Get_contextid (in /usr/lib/libmpich.so.1.2)
> ==22508==    by 0x7484CF1: MPIR_Comm_copy (in /usr/lib/libmpich.so.1.2)
> ==22508==    by 0x747D280: PMPI_Comm_dup (in /usr/lib/libmpich.so.1.2)
> ==22508==    by 0x4E48C4A: PMPI_Comm_dup (libmpiwrap.c:2110)
> ==22508==    by 0x586878D: PetscCommDuplicate (tagm.c:149)
> ==22508==    by 0x54718AD: PetscHeaderCreate_Private (inherit.c:51)
> ==22508==    by 0x5891130: VecCreate (veccreate.c:39)
> ==22508==    by 0x400E3A: main (Test.C:10)
> ==22508==  Address 0xffffffffffffffff is not stack'd, malloc'd or (recently)
> free'd
> ==22508==
> --22508-- REDIR: 0x744f8c0 (PMPI_Attr_put) redirected to 0x4e46b46
> (PMPI_Attr_put)
> 
> ...
> 
> --22508-- REDIR: 0x74df090 (PMPI_Recv) redirected to 0x4e4e4b4 (PMPI_Recv)
> ==22508== Uninitialised byte(s) found during client check request
> ==22508==    at 0x4E49738: PMPI_Get_count (libmpiwrap.c:953)
> ==22508==    by 0x4E4E704: PMPI_Recv (libmpiwrap.c:419)
> ==22508==    by 0x56BE525: VecView_MPI_ASCII (pdvec.c:78)
> ==22508==    by 0x56C1BDA: VecView_MPI (pdvec.c:837)
> ==22508==    by 0x58A9C9D: VecView (vector.c:746)
> ==22508==    by 0x400EEA: main (Test.C:21)
> ==22508==  Address 0x7fefffe30 is on thread 1's stack
> ==22508==  Uninitialised value was created by a stack allocation
> ==22508==    at 0x56BD490: VecView_MPI_ASCII (pdvec.c:36)
> ==22508==
> ==22508== Uninitialised byte(s) found during client check request
> ==22508==    at 0x4E49738: PMPI_Get_count (libmpiwrap.c:953)
> ==22508==    by 0x56BE57F: VecView_MPI_ASCII (pdvec.c:79)
> ==22508==    by 0x56C1BDA: VecView_MPI (pdvec.c:837)
> ==22508==    by 0x58A9C9D: VecView (vector.c:746)
> ==22508==    by 0x400EEA: main (Test.C:21)
> ==22508==  Address 0x7fefffe30 is on thread 1's stack
> ==22508==  Uninitialised value was created by a stack allocation
> ==22508==    at 0x56BD490: VecView_MPI_ASCII (pdvec.c:36)
> 
> > 
> > > BTW: I also see memory leaks from getpwuid. Does anybody know about a
> > > patch for this?
> > > 
> > 
> > I think this is a libc issue.
> > 
> 
> 
> 


