[mpich-discuss] Fatal error in MPI_Test: Invalid MPI_Request

Gus Correa gus at ldeo.columbia.edu
Mon Feb 23 12:20:45 CST 2009


Hi Samir, list

You must launch your Ring program with mpiexec,
not just "[skhanal at comet ~]$ ./Ring" as you did.

Please use the full path to mpiexec as well, to make sure the same
build of MPICH2 is being used all the way through.
This may not solve the problem, but it will avoid a lot of confusion
when diagnosing the real source of the error.
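
For example, reusing the mpiexec that your "which" output points to,
and the same process count you tried earlier, the launch would look
something like this:

    /home/skhanal/mpich2/bin/mpiexec -n 4 ./Ring

That way there is no doubt about which mpiexec (and which MPICH2 build)
is actually starting the job.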

Gus Correa

Samir Khanal wrote:
> Hi
> 
> [skhanal at comet ~]$ g++ -v
> Using built-in specs.
> Target: x86_64-redhat-linux
> Configured with: ../configure --prefix=/usr --mandir=/usr/share/man --infodir=/usr/share/info --enable-shared --enable-threads=posix --enable-checking=release --with-system-zlib --enable-__cxa_atexit --disable-libunwind-exceptions --enable-libgcj-multifile --enable-languages=c,c++,objc,obj-c++,java,fortran,ada --enable-java-awt=gtk --disable-dssi --enable-plugin --with-java-home=/usr/lib/jvm/java-1.4.2-gcj-1.4.2.0/jre --with-cpu=generic --host=x86_64-redhat-linux
> Thread model: posix
> gcc version 4.1.2 20071124 (Red Hat 4.1.2-42)
> [skhanal at comet ~]$ which mpicxx
> ~/mpich2/bin/mpicxx
> [skhanal at comet ~]$ which mpicc
> ~/mpich2/bin/mpicc
> [skhanal at comet ~]$ which mpiexec
> ~/mpich2/bin/mpiexec
> 
> I have installed everything in my home directory.
> 
> When I compile, I do:
> [skhanal at comet ~]$ /home/skhanal/mpich2/bin/mpicxx -L /home/skhanal/bgtw/lib -lbgtw bgtwRingTest.cpp -o Ring
> 
> [skhanal at comet ~]$ ./Ring
> Fatal error in MPI_Test: Invalid MPI_Request, error stack:
> MPI_Test(152): MPI_Test(request=0x16ae9388, flag=0x7fff7a7599c4, status=0x7fff7a759930) failed
> MPI_Test(75).: Invalid MPI_Request
> 
> The library needs an mpi.h file to include; I gave /home/skhanal/mpich2/include/mpi.h as an absolute path.
> 
> any clues?
> 
> Thanks
> Samir
> 
> ________________________________________
> From: mpich-discuss-bounces at mcs.anl.gov [mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Gus Correa [gus at ldeo.columbia.edu]
> Sent: Monday, February 23, 2009 12:32 PM
> To: Mpich Discuss
> Subject: Re: [mpich-discuss] Fatal error in MPI_Test: Invalid MPI_Request
> 
> Hi Samir, list
> 
> I am wondering if the mpicxx and mpiexec you are using
> belong to the same MPICH2 build (considering the problems you
> reported before).
> 
> What is the output of "which mpicxx" and "which mpiexec"?
> 
> You may want to use full path names to mpicxx and mpiexec,
> as Anthony Chan recommended in another email.
> Problems with PATH settings, and the multiple versions and builds of MPI
> that hang around on most Linux computers,
> have been an endless source of frustration for many.
> I myself prefer to use full path names when I am testing
> MPI programs, to avoid any confusion and distress.
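> 
> For example, with full paths the compile and run would look roughly
> like this (substitute the actual location of your MPICH2 install):
> 
>    /path/to/mpich2/bin/mpicxx -L /home/skhanal/bgtw/lib -lbgtw bgtwRingTest.cpp -o Ring
>    /path/to/mpich2/bin/mpiexec -n 4 ./Ring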
> 
> I hope this helps,
> Gus Correa
> ---------------------------------------------------------------------
> Gustavo Correa
> Lamont-Doherty Earth Observatory - Columbia University
> Palisades, NY, 10964-8000 - USA
> ---------------------------------------------------------------------
> 
> Samir Khanal wrote:
>> Hi All
>> I tried and did the following.
>>
>> [skhanal at comet ~]$ mpicxx -L /home/skhanal/bgtw/lib -lbgtw bgtwRingTest.cpp -o Ring
>> [skhanal at comet ~]$ mpiexec -n 4 ./Ring
>> Fatal error in MPI_Test: Invalid MPI_Request, error stack:
>> MPI_Test(152): MPI_Test(request=0x1f820388, flag=0x7fffb8236134, status=0x7fffb82360a0) failed
>> MPI_Test(75).: Invalid MPI_Requestrank 0 in job 35  comet.cs.bgsu.edu_35155   caused collective abort of all ranks
>>   exit status of rank 0: killed by signal 9
>>
>> What does this mean?
>> Samir
>>
>> PS: I am using MPICH2 1.0.8


