[mpich2-dev] nesting level warning with MPI_File_preallocate(fh, 0)

Rajeev Thakur thakur at mcs.anl.gov
Tue Dec 9 15:02:48 CST 2008


I guess that's because you have built a version with memory tracing turned
on.

Rajeev 

> -----Original Message-----
> From: mpich2-dev-bounces at mcs.anl.gov 
> [mailto:mpich2-dev-bounces at mcs.anl.gov] On Behalf Of Lisandro Dalcin
> Sent: Tuesday, December 09, 2008 3:01 PM
> To: mpich2-dev at mcs.anl.gov
> Subject: Re: [mpich2-dev] nesting level warning with 
> MPI_File_preallocate(fh,0)
> 
> Thanks, Rajeev. I'm not using 1.1 yet for production; I just wanted to
> point out the issue.
> 
> BTW, here is another problem I'm experiencing. I get the
> following warning (see the last line of the output below). Sorry again
> for not diving into the sources and providing a patch; I'm really busy
> with the next PETSc/SLEPc releases and my personal MPI/Python-related
> projects.
> 
> 
> $ cat testfile.c
> #include <mpi.h>
> int main( int argc, char ** argv ) {
>   MPI_File fh;
>   int amode = MPI_MODE_RDWR | MPI_MODE_CREATE | MPI_MODE_DELETE_ON_CLOSE;
>   MPI_Init(&argc, &argv);
>   MPI_File_open(MPI_COMM_WORLD, "/tmp/datafile",
> 		amode, MPI_INFO_NULL, &fh);
>   MPI_File_close(&fh);
>   MPI_Finalize();
>   return 0;
> }
> 
> $ mpicc testfile.c
> 
> $ ./a.out
> In direct memory block for handle type ATTRIBUTE KEY, 2 handles are
> still allocated
> 
> 
> 
> 
> 
> On Tue, Dec 9, 2008 at 6:43 PM, Rajeev Thakur <thakur at mcs.anl.gov> wrote:
> > In src/mpi/romio/mpi-io/prealloc.c, change line 77
> >      if (size == 0) return MPI_SUCCESS;
> > to
> >      if (size == 0) goto fn_exit;
> >
> > That should fix it.
> >
> > Rajeev
> >
> >
> >> -----Original Message-----
> >> From: mpich2-dev-bounces at mcs.anl.gov
> >> [mailto:mpich2-dev-bounces at mcs.anl.gov] On Behalf Of Lisandro Dalcin
> >> Sent: Tuesday, December 09, 2008 11:28 AM
> >> To: mpich2-dev at mcs.anl.gov
> >> Subject: [mpich2-dev] nesting level warning with
> >> MPI_File_preallocate(fh, 0)
> >>
> >> Consider the following Python snippet (sorry, I'm too busy to
> >> write C/C++, so I used mpi4py):
> >>
> >> from mpi4py import MPI
> >>
> >> amode = MPI.MODE_RDWR | MPI.MODE_CREATE | MPI.MODE_DELETE_ON_CLOSE
> >>
> >> fh = MPI.File.Open(MPI.COMM_WORLD,
> >>                    '/tmp/datafile', amode,
> >>                    MPI.INFO_NULL)
> >>
> >> N = 0
> >> fh.Preallocate(N)
> >> size = fh.Get_size()
> >> assert size == N
> >>
> >> fh.Close()
> >> print 'Bye!!!'
> >>
> >>
> >> Then, when I run the code (MPICH2 1.0.8 built with
> >> --enable-g=all), I get the following output:
> >>
> >> Bye!!!
> >> Unexpected value for nesting level = 1
> >> Nest stack is:
> >>       [0] :0
> >>
> >> So the warning is likely emitted at MPI_Finalize().
> >>
> >> If I try to preallocate sizes larger than 0, then the warning
> >> does not appear.
> >>
> >>
> >>
> >>
> >> --
> >> Lisandro Dalcín
> >> ---------------
> >> Centro Internacional de Métodos Computacionales en Ingeniería (CIMEC)
> >> Instituto de Desarrollo Tecnológico para la Industria Química (INTEC)
> >> Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET)
> >> PTLC - Güemes 3450, (3000) Santa Fe, Argentina
> >> Tel/Fax: +54-(0)342-451.1594
> >>
> >
> >
> 
> 
> 
> -- 
> Lisandro Dalcín
> ---------------
> Centro Internacional de Métodos Computacionales en Ingeniería (CIMEC)
> Instituto de Desarrollo Tecnológico para la Industria Química (INTEC)
> Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET)
> PTLC - Güemes 3450, (3000) Santa Fe, Argentina
> Tel/Fax: +54-(0)342-451.1594
> 



