[mpich-discuss] Fatal error in MPI_Test: Invalid MPI_Request
Samir Khanal
skhanal at bgsu.edu
Thu Feb 26 17:10:46 CST 2009
I tried that too...
In the place where it says
INCLUDES = -I../include
I tried
INCLUDES = -I ../include /home/skhanal/mpich2/include
but no success...
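
(Note: each directory needs its own -I flag, so presumably that line
would have to be

INCLUDES = -I../include -I/home/skhanal/mpich2/include

although, as the replies below suggest, it is cleaner to let mpicxx
supply its own include path.)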
________________________________________
From: mpich-discuss-bounces at mcs.anl.gov [mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Anthony Chan [chan at mcs.anl.gov]
Sent: Thursday, February 26, 2009 6:08 PM
To: mpich-discuss at mcs.anl.gov
Subject: Re: [mpich-discuss] Fatal error in MPI_Test: Invalid MPI_Request
Besides INCLUDE and INC that Gus mentioned, look for CPPFLAGS
which some people like to use for -I directives.
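
For example, something along these lines should list every place the
makefile sets include or preprocessor flags (assuming the file is
simply called Makefile):

   grep -nE 'INCLUDE|INC|CPPFLAGS|-I' Makefile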
----- "Gus Correa" <gus at ldeo.columbia.edu> wrote:
> Hi Samir
>
> I am guessing the makefile that you are using to compile your
> libbgtw library is messing up the paths to the include files.
> The makefile may have some include paths hardwired that break
> what mpicxx needs to do.
> Otherwise mpicxx would find the right includes without a problem.
>
> The standard and easy thing to do is to use the MPICH2
> wrappers (mpicxx) to compile **both** the library and
> your test program.
> It is probably easier to straighten up the makefile to use
> CXX=mpicxx as the C++ compiler,
> than to tweak the makefile to work with CXX=g++ and do everything
> exactly as mpicxx would do.
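>
> For example, a minimal sketch, assuming the makefile uses the
> conventional CC/CXX variables (full paths avoid picking up another
> MPI build by accident):
>
>    CC  = /home/skhanal/mpich2/bin/mpicc
>    CXX = /home/skhanal/mpich2/bin/mpicxx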
>
> Search the makefile for "-I" directives, and also for INCLUDE or INC
> strings, and the like. See if anything is pointing directly
> to a path to mpi.h.
>
> You may also post the makefile here.
> Maybe somebody here will be able to find what is causing the problem.
>
> I hope this helps,
>
> Gus Correa
> ---------------------------------------------------------------------
> Gustavo Correa
> Lamont-Doherty Earth Observatory - Columbia University
> Palisades, NY, 10964-8000 - USA
> ---------------------------------------------------------------------
>
> Samir Khanal wrote:
> > When I changed
> > g++ to mpicxx and
> > gcc to mpicc,
> > the compilation completed without a problem.
> >
> > I can run the program without any problem when it is standalone,
> > but when I do
> >
> > [skhanal at comet ~]$ /home/skhanal/mpich2/bin/mpicxx -L /home/skhanal/bgtw/lib -lbgtw bgtwTorusTest.cpp -o Ring
> > [skhanal at comet ~]$ ./Ring
> > // this works
> >
> > But when n > 1, I have problems.
> >
> > [skhanal at comet ~]$ which mpiexec
> > ~/mpich2/bin/mpiexec
> > [skhanal at comet ~]$ mpdringtest
> > time for 1 loops = 0.00102400779724 seconds
> > [skhanal at comet ~]$ mpdtrace
> > comet
> > compute-0-3
> > compute-0-2
> > compute-0-1
> > compute-0-0
> > compute-0-5
> > compute-0-4
> > [skhanal at comet ~]$ ldd ./Ring
> > libbgtw.so.0 => /home/skhanal/bgtw/lib/libbgtw.so.0 (0x00002b7b11ae4000)
> > libpthread.so.0 => /lib64/libpthread.so.0 (0x0000003614600000)
> > librt.so.1 => /lib64/librt.so.1 (0x0000003615600000)
> > libstdc++.so.6 => /usr/lib64/libstdc++.so.6 (0x0000003626000000)
> > libm.so.6 => /lib64/libm.so.6 (0x0000003613e00000)
> > libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x0000003624c00000)
> > libc.so.6 => /lib64/libc.so.6 (0x0000003613a00000)
> > /lib64/ld-linux-x86-64.so.2 (0x0000003613600000)
> >
> > [skhanal at comet ~]$ ~/mpich2/bin/mpiexec -n 2 ./Ring
> > Fatal error in MPI_Test: Invalid MPI_Request, error stack:
> > MPI_Test(152): MPI_Test(request=0x1c95a388, flag=0x7fff096c34d4, status=0x7fff096c3440) failed
> > MPI_Test(75).: Invalid MPI_Request
> > rank 0 in job 8 comet.cs.bgsu.edu_60252 caused collective abort of all ranks
> > exit status of rank 0: killed by signal 9
> >
> > What am I doing wrong here?
> > Samir
> >
> > ________________________________________
> > From: mpich-discuss-bounces at mcs.anl.gov [mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Rajeev Thakur [thakur at mcs.anl.gov]
> > Sent: Thursday, February 26, 2009 5:00 PM
> > To: mpich-discuss at mcs.anl.gov
> > Subject: Re: [mpich-discuss] Fatal error in MPI_Test: Invalid MPI_Request
> >
> > Make sure that particular file is being compiled with mpicc, not just gcc.
> > Set the C compiler in the Makefile to mpicc.
> >
> > Rajeev
> >
> >
> >> -----Original Message-----
> >> From: mpich-discuss-bounces at mcs.anl.gov [mailto:mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Samir Khanal
> >> Sent: Thursday, February 26, 2009 3:51 PM
> >> To: mpich-discuss at mcs.anl.gov
> >> Subject: Re: [mpich-discuss] Fatal error in MPI_Test: Invalid MPI_Request
> >>
> >> In fact I am trying to compile a library with a Makefile.
> >>
> >> I did as you suggested:
> >> #include "mpi.h"
> >>
> >> It says
> >> ../include/message_handler.h:17:17: error: mpi.h: No such file or directory
> >>
> >> BTW, I have installed mpich2 in my home directory; do I need
> >> to give the include directory while compiling then?
> >>
> >> I think it is not finding the file in the current directory.
> >> I am using the full path to the compiler (mpicxx).
> >> Samir
> >>
> >>
> >> ________________________________________
> >> From: mpich-discuss-bounces at mcs.anl.gov [mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Rajeev Thakur [thakur at mcs.anl.gov]
> >> Sent: Thursday, February 26, 2009 4:40 PM
> >> To: mpich-discuss at mcs.anl.gov
> >> Subject: Re: [mpich-discuss] Fatal error in MPI_Test: Invalid MPI_Request
> >>
> >> Don't even give the full path to mpi.h. Just say #include "mpi.h".
> >> Give the full path to mpicc when you invoke it.
> >>
> >> Rajeev
> >>
> >>
> >>> -----Original Message-----
> >>> From: mpich-discuss-bounces at mcs.anl.gov [mailto:mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Samir Khanal
> >>> Sent: Thursday, February 26, 2009 3:33 PM
> >>> To: mpich-discuss at mcs.anl.gov
> >>> Subject: Re: [mpich-discuss] Fatal error in MPI_Test: Invalid MPI_Request
> >>>
> >>> Hi Rajeev,
> >>>
> >>> There is no mpi.h in the application directory; I am pointing
> >>> to the correct mpi.h (/home/skhanal/mpich2/include/mpi.h).
> >>> I even renamed the mpi.h from the default openmpi installation
> >>> to mpi.h.bak, but still the same problem.
> >>> Samir
> >>> ________________________________________
> >>> From: mpich-discuss-bounces at mcs.anl.gov [mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Rajeev Thakur [thakur at mcs.anl.gov]
> >>> Sent: Thursday, February 26, 2009 4:31 PM
> >>> To: mpich-discuss at mcs.anl.gov
> >>> Subject: Re: [mpich-discuss] Fatal error in MPI_Test: Invalid MPI_Request
> >>>
> >>> In general, remove all your copies of mpi.h from the application directory.
> >>> Let mpicc/mpif90 pick up the right mpi.h from its installation.
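> >>>
> >>> For example, a quick (hypothetical) check for stray copies, run
> >>> from the application directory:
> >>>
> >>>    find . -name 'mpi*.h'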
> >>>
> >>> Rajeev
> >>>
> >>>> -----Original Message-----
> >>>> From: mpich-discuss-bounces at mcs.anl.gov [mailto:mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Samir Khanal
> >>>> Sent: Thursday, February 26, 2009 2:32 PM
> >>>> To: mpich-discuss at mcs.anl.gov
> >>>> Subject: Re: [mpich-discuss] Fatal error in MPI_Test: Invalid MPI_Request
> >>>>
> >>>> Hi Rajeev, list
> >>>>
> >>>> I am using the mpi.h from mpich2.
> >>>> The program compiled fine (I am building a library) on an x86 box
> >>>> and ran without error with mpiexec 0.82 and mpich 1.2.7.
> >>>> I am trying to replicate the same on an x86_64 system with
> >>>> mpich 1.2.7 and mpiexec 0.83, and with mpich2 and its mpiexec.
> >>>> The library compiles, but I get the same error as in my previous email.
> >>>> With mpich 1.2.7 / mpiexec 0.82 I get a P4 sig 15 error
> >>>> (which Gus suggested was due to the old library, so I compiled
> >>>> mpich2 with the nemesis channel).
> >>>> With mpich2 and its mpiexec, I get the following job output.
> >>>> This job is running on following Processors
> >>>> compute-0-3 compute-0-3 compute-0-3 compute-0-3 compute-0-2
> >>>> compute-0-2 compute-0-2 compute-0-2
> >>>>
> >>>> Fatal error in MPI_Test: Invalid MPI_Request, error stack:
> >>>> MPI_Test(152): MPI_Test(request=0x7098388, flag=0x7fffdda3ea34, status=0x7fffdda3e9a0) failed
> >>>> MPI_Test(75).: Invalid MPI_Request
> >>>> Fatal error in MPI_Test: Invalid MPI_Request, error stack:
> >>>> MPI_Test(152): MPI_Test(request=0x3b95388, flag=0x7fffb21504b4, status=0x7fffb2150420) failed
> >>>> MPI_Test(75).: Invalid MPI_Request
> >>>> rank 3 in job 1 compute-0-3.local_43455 caused collective abort of all ranks
> >>>> exit status of rank 3: killed by signal 9
> >>>>
> >>>> FYI, this application was written to run on a Gentoo box with
> >>>> mpich 1.2.5/7 and mpiexec (from OSC) v0.75.
> >>>> I am trying to port it to a new 64-bit cluster, with all
> >>>> sorts of problems.
> >>>>
> >>>> :-(
> >>>> Samir
> >>>>
> >>>> ________________________________________
> >>>> From: mpich-discuss-bounces at mcs.anl.gov [mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Rajeev Thakur [thakur at mcs.anl.gov]
> >>>> Sent: Monday, February 23, 2009 2:07 PM
> >>>> To: mpich-discuss at mcs.anl.gov
> >>>> Subject: Re: [mpich-discuss] Fatal error in MPI_Test: Invalid MPI_Request
> >>>>
> >>>> This can happen if you use an mpif.h or mpi.h from some other
> >>>> implementation. Remove any mpi*.h in the application directory and
> >>>> don't provide any paths to mpi*.h; mpic* will pick up the right file.
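> >>>>
> >>>> For reference, here is a minimal sketch of the request life cycle
> >>>> MPI_Test expects (a generic nonblocking ring exchange, not the
> >>>> actual bgtw code). The code itself is correct; but if it is
> >>>> compiled against one implementation's mpi.h and linked against
> >>>> another's library, the MPI_Request handle passed to MPI_Test is
> >>>> garbage and fails with exactly the Invalid MPI_Request error in
> >>>> this thread.
> >>>>
> >>>> #include "mpi.h"   /* must come from the same MPI as the library */
> >>>> #include <cstdio>
> >>>>
> >>>> int main(int argc, char **argv)
> >>>> {
> >>>>     MPI_Init(&argc, &argv);
> >>>>     int rank, size;
> >>>>     MPI_Comm_rank(MPI_COMM_WORLD, &rank);
> >>>>     MPI_Comm_size(MPI_COMM_WORLD, &size);
> >>>>
> >>>>     int sendval = rank, recvval = -1;
> >>>>     MPI_Request sreq, rreq;
> >>>>     /* nonblocking receive from the left neighbor, send to the right */
> >>>>     MPI_Irecv(&recvval, 1, MPI_INT, (rank + size - 1) % size, 0,
> >>>>               MPI_COMM_WORLD, &rreq);
> >>>>     MPI_Isend(&sendval, 1, MPI_INT, (rank + 1) % size, 0,
> >>>>               MPI_COMM_WORLD, &sreq);
> >>>>
> >>>>     /* poll the send request until it completes */
> >>>>     int flag = 0;
> >>>>     MPI_Status status;
> >>>>     while (!flag)
> >>>>         MPI_Test(&sreq, &flag, &status);
> >>>>
> >>>>     MPI_Wait(&rreq, &status);
> >>>>     printf("rank %d received %d\n", rank, recvval);
> >>>>     MPI_Finalize();
> >>>>     return 0;
> >>>> }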
> >>>>
> >>>> Rajeev
> >>>>
> >>>>
> >>>>> -----Original Message-----
> >>>>> From: mpich-discuss-bounces at mcs.anl.gov [mailto:mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Samir Khanal
> >>>>> Sent: Monday, February 23, 2009 11:35 AM
> >>>>> To: mpich-discuss at mcs.anl.gov
> >>>>> Subject: Re: [mpich-discuss] Fatal error in MPI_Test: Invalid MPI_Request
> >>>>>
> >>>>> Hi
> >>>>>
> >>>>> [skhanal at comet ~]$ g++ -v
> >>>>> Using built-in specs.
> >>>>> Target: x86_64-redhat-linux
> >>>>> Configured with: ../configure --prefix=/usr
> >>>>> --mandir=/usr/share/man --infodir=/usr/share/info
> >>>>> --enable-shared --enable-threads=posix
> >>>>> --enable-checking=release --with-system-zlib
> >>>>> --enable-__cxa_atexit --disable-libunwind-exceptions
> >>>>> --enable-libgcj-multifile
> >>>>> --enable-languages=c,c++,objc,obj-c++,java,fortran,ada
> >>>>> --enable-java-awt=gtk --disable-dssi --enable-plugin
> >>>>> --with-java-home=/usr/lib/jvm/java-1.4.2-gcj-1.4.2.0/jre
> >>>>> --with-cpu=generic --host=x86_64-redhat-linux
> >>>>> Thread model: posix
> >>>>> gcc version 4.1.2 20071124 (Red Hat 4.1.2-42)
> >>>>> [skhanal at comet ~]$ which mpicxx
> >>>>> ~/mpich2/bin/mpicxx
> >>>>> [skhanal at comet ~]$ which mpicc
> >>>>> ~/mpich2/bin/mpicc
> >>>>> [skhanal at comet ~]$ which mpiexec
> >>>>> ~/mpich2/bin/mpiexec
> >>>>>
> >>>>> I have installed everything in my home directory.
> >>>>>
> >>>>> When I compile, I do:
> >>>>> [skhanal at comet ~]$ /home/skhanal/mpich2/bin/mpicxx -L /home/skhanal/bgtw/lib -lbgtw bgtwRingTest.cpp -o Ring
> >>>>>
> >>>>> [skhanal at comet ~]$ ./Ring
> >>>>> Fatal error in MPI_Test: Invalid MPI_Request, error stack:
> >>>>> MPI_Test(152): MPI_Test(request=0x16ae9388, flag=0x7fff7a7599c4, status=0x7fff7a759930) failed
> >>>>> MPI_Test(75).: Invalid MPI_Request
> >>>>>
> >>>>> The library needs an mpi.h file to include; I gave
> >>>>> /home/skhanal/mpich2/include/mpi.h as an absolute path.
> >>>>>
> >>>>> Any clues?
> >>>>>
> >>>>> Thanks
> >>>>> Samir
> >>>>>
> >>>>> ________________________________________
> >>>>> From: mpich-discuss-bounces at mcs.anl.gov [mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Gus Correa [gus at ldeo.columbia.edu]
> >>>>> Sent: Monday, February 23, 2009 12:32 PM
> >>>>> To: Mpich Discuss
> >>>>> Subject: Re: [mpich-discuss] Fatal error in MPI_Test: Invalid MPI_Request
> >>>>>
> >>>>> Hi Samir, list
> >>>>>
> >>>>> I am wondering if the mpicxx and mpiexec you are using
> >>>>> belong to the same MPICH2 build (considering the problems you
> >>>>> reported before).
> >>>>>
> >>>>> What is the output of "which mpicxx" and "which mpiexec"?
> >>>>>
> >>>>> You may want to use full path names to mpicxx and mpiexec,
> >>>>> as Anthony Chan recommended in another email.
> >>>>> Problems with PATH and the multiple versions and builds of MPI
> >>>>> that hang around on all Linux computers
> >>>>> have been an endless source of frustration for many.
> >>>>> I myself prefer to use full path names when I am testing
> >>>>> MPI programs, to avoid any confusion and distress.
> >>>>>
> >>>>> I hope this helps,
> >>>>> Gus Correa
> >>>>>
> >>>>> ---------------------------------------------------------------------
> >>>>> Gustavo Correa
> >>>>> Lamont-Doherty Earth Observatory - Columbia University
> >>>>> Palisades, NY, 10964-8000 - USA
> >>>>> ---------------------------------------------------------------------
> >>>>> Samir Khanal wrote:
> >>>>>> Hi All
> >>>>>> I tried and did the following.
> >>>>>>
> >>>>>> [skhanal at comet ~]$ mpicxx -L /home/skhanal/bgtw/lib -lbgtw bgtwRingTest.cpp -o Ring
> >>>>>> [skhanal at comet ~]$ mpiexec -n 4 ./Ring
> >>>>>> Fatal error in MPI_Test: Invalid MPI_Request, error stack:
> >>>>>> MPI_Test(152): MPI_Test(request=0x1f820388, flag=0x7fffb8236134, status=0x7fffb82360a0) failed
> >>>>>> MPI_Test(75).: Invalid MPI_Request
> >>>>>> rank 0 in job 35 comet.cs.bgsu.edu_35155 caused collective abort of all ranks
> >>>>>> exit status of rank 0: killed by signal 9
> >>>>>>
> >>>>>> What does this mean?
> >>>>>> Samir
> >>>>>>
> >>>>>> PS: I am using mpich2 1.0.8.