[mpich-discuss] Fatal error in MPI_Test: Invalid MPI_Request

Gus Correa gus at ldeo.columbia.edu
Thu Feb 26 19:02:49 CST 2009


Hi Samir, list

Looking at the src makefile and this
error that you reported before:

 >> In fact I am trying to compile a library with a Makefile.
 >>
 >> I did as you suggested:
 >> #include "mpi.h"
 >>
 >> It says
 >> ../include/message_handler.h:17:17: error: mpi.h: No such
 >> file or directory
 >>
 >>

I am wondering if the problem stems from the nesting
of include files, which sometimes confuses the pre-processor.

Would there be any logical difference in your pre-processed code
if you included mpi.h directly in
message_handler.cpp, rather than in message_handler.h?
Something like this:

Inside message_handler.cpp

...
#include "mpi.h"
#include "message_handler.h"
...

(and no #include "mpi.h" inside message_handler.h)

Maybe mpicxx can find mpi.h this way.
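
If that still fails, it may also be worth checking what the wrapper
actually passes to the underlying compiler.  The MPICH2 wrappers accept
a -show option that just prints the real compile command, so (using the
install path from your earlier messages) something like

   /home/skhanal/mpich2/bin/mpicxx -show

should print a g++ command line whose -I flag points at your MPICH2
include directory.  If that path looks wrong, the wrapper itself is
misconfigured; if it looks right, the makefile is the likely culprit.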

Anyway, just a guess.

Gus Correa
---------------------------------------------------------------------
Gustavo Correa
Lamont-Doherty Earth Observatory - Columbia University
Palisades, NY, 10964-8000 - USA
---------------------------------------------------------------------

Samir Khanal wrote:
> There are three makefiles:
> one in src,
> one in include, and
> one in the parent directory.
> 
> 
> parent makefile
> -----------------------------------------
> SHELL = /bin/sh
> 
> srcdir = .
> top_srcdir = .
> 
> prefix = /home/skhanal/bgtw
> exec_prefix = ${prefix}
> 
> bindir = ${exec_prefix}/bin
> sbindir = ${exec_prefix}/sbin
> libexecdir = ${exec_prefix}/libexec
> datadir = ${prefix}/share
> sysconfdir = ${prefix}/etc
> sharedstatedir = ${prefix}/com
> localstatedir = ${prefix}/var
> libdir = ${exec_prefix}/lib
> infodir = ${prefix}/info
> mandir = ${prefix}/man
> includedir = ${prefix}/include
> oldincludedir = /usr/include
> 
> DESTDIR =
> 
> pkgdatadir = $(datadir)/BGTW
> pkglibdir = $(libdir)/BGTW
> pkgincludedir = $(includedir)/BGTW
> 
> top_builddir = .
> 
> ACLOCAL = /home/skhanal/Desktop/bgtwNew/missing aclocal-1.4
> AUTOCONF = autoconf
> AUTOMAKE = /home/skhanal/Desktop/bgtwNew/missing automake-1.4
> AUTOHEADER = autoheader
> 
> INSTALL = /usr/bin/install -c
> INSTALL_PROGRAM = ${INSTALL} $(AM_INSTALL_PROGRAM_FLAGS)
> INSTALL_DATA = ${INSTALL} -m 644
> INSTALL_SCRIPT = ${INSTALL}
> transform = s,x,x,
> 
> NORMAL_INSTALL = :
> PRE_INSTALL = :
> POST_INSTALL = :
> NORMAL_UNINSTALL = :
> PRE_UNINSTALL = :
> POST_UNINSTALL = :
> host_alias =
> host_triplet = x86_64-unknown-linux-gnu
> AS = @AS@
> CXXFLAGS =
> DLLTOOL = @DLLTOOL@
> ECHO = echo
> EXEEXT =
> LIBTOOL = $(SHELL) $(top_builddir)/libtool
> LN_S = ln -s
> MAKEINFO = /home/skhanal/Desktop/bgtwNew/missing makeinfo
> NO_PREFIX_PACKAGE_DATA_DIR = share
> NO_PREFIX_PACKAGE_DOC_DIR = doc/BGTW
> NO_PREFIX_PACKAGE_HELP_DIR = share/help
> NO_PREFIX_PACKAGE_MENU_DIR = share
> NO_PREFIX_PACKAGE_PIXMAPS_DIR = share/pixmaps
> OBJDUMP = @OBJDUMP@
> OBJEXT = o
> PACKAGE = BGTW
> PACKAGE_DATA_DIR = /home/skhanal/bgtw/share
> PACKAGE_DOC_DIR = /home/skhanal/bgtw/doc/BGTW
> PACKAGE_HELP_DIR = /home/skhanal/bgtw/share/help
> PACKAGE_MENU_DIR = /home/skhanal/bgtw/share
> PACKAGE_PIXMAPS_DIR = /home/skhanal/bgtw/share/pixmaps
> RANLIB = ranlib
> STRIP = strip
> VERSION = 0.1
> 
> SUBDIRS = include src doc
> 
> libbgtwdocdir = ${prefix}/doc/BGTW
> libbgtwdoc_DATA =       README  COPYING         AUTHORS         ChangeLog       INSTALL         NEWS    TODO
> 
> 
> EXTRA_DIST = $(libbgtwdoc_DATA)
> 
> # this is an mpi program
> CC = /home/skhanal/mpich2/bin/mpicc
> CXX = /home/skhanal/mpich2/bin/mpicxx
> ACLOCAL_M4 = $(top_srcdir)/aclocal.m4
> mkinstalldirs = $(SHELL) $(top_srcdir)/mkinstalldirs
> CONFIG_HEADER = config.h
> CONFIG_CLEAN_FILES =
> DATA =  $(libbgtwdoc_DATA)
> 
> DIST_COMMON =  README ./stamp-h.in AUTHORS COPYING ChangeLog INSTALL \
> Makefile.am Makefile.in NEWS TODO acconfig.h acinclude.m4 aclocal.m4 \
> config.guess config.h.in config.sub configure configure.in install-sh \
> ltmain.sh missing mkinstalldirs
> 
> .....................................................
> there are no -I flags.
> 
> The one in the src dir might be of interest
> 
> 
> host_triplet = x86_64-unknown-linux-gnu
> AS = @AS@
> CC = /home/skhanal/mpich2/bin/mpicc
> CXX = /home/skhanal/mpich2/bin/mpicxx
> CXXFLAGS =
> DLLTOOL = @DLLTOOL@
> ECHO = echo
> EXEEXT =
> LIBTOOL = $(SHELL) $(top_builddir)/libtool
> LN_S = ln -s
> MAKEINFO = /home/skhanal/Desktop/bgtwNew/missing makeinfo
> NO_PREFIX_PACKAGE_DATA_DIR = share
> NO_PREFIX_PACKAGE_DOC_DIR = doc/BGTW
> NO_PREFIX_PACKAGE_HELP_DIR = share/help
> NO_PREFIX_PACKAGE_MENU_DIR = share
> NO_PREFIX_PACKAGE_PIXMAPS_DIR = share/pixmaps
> OBJDUMP = @OBJDUMP@
> OBJEXT = o
> PACKAGE = BGTW
> PACKAGE_DATA_DIR = /home/skhanal/bgtw/share
> PACKAGE_DOC_DIR = /home/skhanal/bgtw/doc/BGTW
> PACKAGE_HELP_DIR = /home/skhanal/bgtw/share/help
> PACKAGE_MENU_DIR = /home/skhanal/bgtw/share
> PACKAGE_PIXMAPS_DIR = /home/skhanal/bgtw/share/pixmaps
> RANLIB = ranlib
> STRIP = strip
> VERSION = 0.1
> 
> INCLUDES =       -I../include
> #these are the header files for the program
> 
> 
> AM_CXXFLAGS =    -DDEBUG -DNDEBUG        -Wall   -O3
> 
> 
> lib_LTLIBRARIES = libbgtw.la
> 
> libbgtw_la_SOURCES =    lp.cpp  message_handler.cpp     process.cpp     thread.cpp      twevent.cpp     twlp.cpp        random.cpp      defines.cpp     threadedgvt.cpp         state.cpp       memorymgr.cpp
> 
> 
> libbgtw_la_LDFLAGS =
> 
> libbgtw_la_LIBADD =      -lpthread
> 
> mkinstalldirs = $(SHELL) $(top_srcdir)/mkinstalldirs
> CONFIG_HEADER = ../config.h
> CONFIG_CLEAN_FILES =
> LTLIBRARIES =  $(lib_LTLIBRARIES)
> 
> 
> DEFS = -DHAVE_CONFIG_H -I. -I$(srcdir) -I..
> CPPFLAGS =
> LDFLAGS =
> LIBS =
> libbgtw_la_DEPENDENCIES =
> libbgtw_la_OBJECTS =  lp.lo message_handler.lo process.lo thread.lo \
> twevent.lo twlp.lo random.lo defines.lo threadedgvt.lo state.lo \
> memorymgr.lo
> 
> CXXCOMPILE = $(CXX) $(DEFS) $(INCLUDES) $(AM_CPPFLAGS) $(CPPFLAGS) $(AM_CXXFLAGS) $(CXXFLAGS)
> LTCXXCOMPILE = $(LIBTOOL) --mode=compile $(CXX) $(DEFS) $(INCLUDES) $(AM_CPPFLAGS) $(CPPFLAGS) $(AM_CXXFLAGS) $(CXXFLAGS)
> CXXLD = $(CXX)
> CXXLINK = $(LIBTOOL) --mode=link $(CXXLD) $(AM_CXXFLAGS) $(CXXFLAGS) $(LDFLAGS) -o $@
> DIST_COMMON =  Makefile.am Makefile.in
> 
> 
> DISTFILES = $(DIST_COMMON) $(SOURCES) $(HEADERS) $(TEXINFOS) $(EXTRA_DIST)
> 
> 
> --------------------------
> 
> Just a reminder that the program runs fine for n=1, producing two output files as desired.
> (This is simulation software developed in-house some 5-6 years ago...)
> 
> Samir
> 
> 
> 
> 
> 
> 
> ________________________________________
> From: mpich-discuss-bounces at mcs.anl.gov [mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Gus Correa [gus at ldeo.columbia.edu]
> Sent: Thursday, February 26, 2009 5:57 PM
> To: Mpich Discuss
> Subject: Re: [mpich-discuss] Fatal error in MPI_Test: Invalid MPI_Request
> 
> Hi Samir
> 
> I am guessing the makefile that you are using to compile your
> libbgtw library is messing up the paths to the include files.
> The makefile may have some include paths hardwired that break
> what mpicxx needs to do.
> Otherwise mpicxx would find the right includes without a problem.
> 
> The standard and easy thing to do is to use the MPICH2
> wrappers (mpicxx) to compile **both** the library and
> your test program.
> It is probably easier to straighten up the makefile to use
> CXX=mpicxx as the C++ compiler,
> than to tweak the makefile to work with CXX=g++ and do everything
> exactly as mpicxx would do.
> 
> Search the makefile for "-I" directives, and also for INCLUDE or INC
> strings, and the like.  See if anything is pointing directly
> to a path to mpi.h.
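> 
> For instance, something along these lines (the makefile names are just
> assumed from your description of the source tree) should list the
> suspect lines:
> 
>    grep -n -e '-I' -e INCLUDE -e INC Makefile src/Makefile include/Makefile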
> 
> You may also post the makefile here.
> Maybe somebody here will be able to find what is causing the problem.
> 
> I hope this helps,
> 
> Gus Correa
> ---------------------------------------------------------------------
> Gustavo Correa
> Lamont-Doherty Earth Observatory - Columbia University
> Palisades, NY, 10964-8000 - USA
> ---------------------------------------------------------------------
> 
> Samir Khanal wrote:
>> When I changed g++ to mpicxx and gcc to mpicc,
>> the compilation completed without problems.
>>
>> I can run the program without any problem when it is standalone, but when I do
>>
>> [skhanal at comet ~]$ /home/skhanal/mpich2/bin/mpicxx -L /home/skhanal/bgtw/lib -lbgtw bgtwTorusTest.cpp -o Ring
>> [skhanal at comet ~]$ ./Ring
>> //this works
>>
>> But when n>1, I have problems.
>>
>> [skhanal at comet ~]$ which mpiexec
>> ~/mpich2/bin/mpiexec
>> [skhanal at comet ~]$ mpdringtest
>> time for 1 loops = 0.00102400779724 seconds
>> [skhanal at comet ~]$ mpdtrace
>> comet
>> compute-0-3
>> compute-0-2
>> compute-0-1
>> compute-0-0
>> compute-0-5
>> compute-0-4
>> [skhanal at comet ~]$ ldd ./Ring
>>         libbgtw.so.0 => /home/skhanal/bgtw/lib/libbgtw.so.0 (0x00002b7b11ae4000)
>>         libpthread.so.0 => /lib64/libpthread.so.0 (0x0000003614600000)
>>         librt.so.1 => /lib64/librt.so.1 (0x0000003615600000)
>>         libstdc++.so.6 => /usr/lib64/libstdc++.so.6 (0x0000003626000000)
>>         libm.so.6 => /lib64/libm.so.6 (0x0000003613e00000)
>>         libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x0000003624c00000)
>>         libc.so.6 => /lib64/libc.so.6 (0x0000003613a00000)
>>         /lib64/ld-linux-x86-64.so.2 (0x0000003613600000)
>>
>> [skhanal at comet ~]$ ~/mpich2/bin/mpiexec -n 2 ./Ring
>> Fatal error in MPI_Test: Invalid MPI_Request, error stack:
>> MPI_Test(152): MPI_Test(request=0x1c95a388, flag=0x7fff096c34d4, status=0x7fff096c3440) failed
>> MPI_Test(75).: Invalid MPI_Requestrank 0 in job 8  comet.cs.bgsu.edu_60252   caused collective abort of all ranks
>>   exit status of rank 0: killed by signal 9
>>
>> What am I doing wrong here?
>> Samir
>>
>>
>>
>>
>>
>> ________________________________________
>> From: mpich-discuss-bounces at mcs.anl.gov [mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Rajeev Thakur [thakur at mcs.anl.gov]
>> Sent: Thursday, February 26, 2009 5:00 PM
>> To: mpich-discuss at mcs.anl.gov
>> Subject: Re: [mpich-discuss] Fatal error in MPI_Test: Invalid MPI_Request
>>
>> Make sure that particular file is being compiled with mpicc, not just gcc.
>> Set the C compiler in the Makefile to mpicc.
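>>
>> For example, something like this in the Makefile (a sketch only;
>> adjust the path to wherever your MPICH2 lives):
>>
>> CC  = /home/skhanal/mpich2/bin/mpicc
>> CXX = /home/skhanal/mpich2/bin/mpicxx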
>>
>> Rajeev
>>
>>
>>> -----Original Message-----
>>> From: mpich-discuss-bounces at mcs.anl.gov
>>> [mailto:mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Samir Khanal
>>> Sent: Thursday, February 26, 2009 3:51 PM
>>> To: mpich-discuss at mcs.anl.gov
>>> Subject: Re: [mpich-discuss] Fatal error in MPI_Test: Invalid
>>> MPI_Request
>>>
>>> In fact I am trying to compile a library with a Makefile.
>>>
>>> I did as you suggested:
>>> #include "mpi.h"
>>>
>>> It says
>>> ../include/message_handler.h:17:17: error: mpi.h: No such
>>> file or directory
>>>
>>> BTW I have installed mpich2 in my home directory; do I need
>>> to give the include directory while compiling then?
>>>
>>> I think it is not finding the file in the current directory.
>>> I am using the full path for the compiler (mpicxx).
>>> Samir
>>>
>>>
>>> ________________________________________
>>> From: mpich-discuss-bounces at mcs.anl.gov
>>> [mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Rajeev
>>> Thakur [thakur at mcs.anl.gov]
>>> Sent: Thursday, February 26, 2009 4:40 PM
>>> To: mpich-discuss at mcs.anl.gov
>>> Subject: Re: [mpich-discuss] Fatal error in MPI_Test: Invalid
>>> MPI_Request
>>>
>>> Don't even give the full path to mpi.h. Just say #include "mpi.h".
>>> Give the full path to mpicc when you invoke it.
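>>>
>>> For example (using the install path from your earlier messages):
>>>
>>>    /home/skhanal/mpich2/bin/mpicxx -c message_handler.cpp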
>>>
>>> Rajeev
>>>
>>>
>>>> -----Original Message-----
>>>> From: mpich-discuss-bounces at mcs.anl.gov
>>>> [mailto:mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Samir Khanal
>>>> Sent: Thursday, February 26, 2009 3:33 PM
>>>> To: mpich-discuss at mcs.anl.gov
>>>> Subject: Re: [mpich-discuss] Fatal error in MPI_Test: Invalid
>>>> MPI_Request
>>>>
>>>> Hi Rajeev
>>>>
>>>> There is no mpi.h in the application directory; I am
>>>> pointing to the correct mpi.h
>>>> (/home/skhanal/mpich2/include/mpi.h).
>>>> I even renamed the mpi.h from the default installation of
>>>> OpenMPI to mpi.h.bak,
>>>> but still the same problem.
>>>> Samir
>>>> ________________________________________
>>>> From: mpich-discuss-bounces at mcs.anl.gov
>>>> [mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Rajeev
>>>> Thakur [thakur at mcs.anl.gov]
>>>> Sent: Thursday, February 26, 2009 4:31 PM
>>>> To: mpich-discuss at mcs.anl.gov
>>>> Subject: Re: [mpich-discuss] Fatal error in MPI_Test: Invalid
>>>> MPI_Request
>>>>
>>>> In general, remove all your copies of mpi.h from the
>>>> application directory.
>>>> Let mpicc/mpif90 pick up the right mpi.h from its installation.
>>>>
>>>> Rajeev
>>>>
>>>>> -----Original Message-----
>>>>> From: mpich-discuss-bounces at mcs.anl.gov
>>>>> [mailto:mpich-discuss-bounces at mcs.anl.gov] On Behalf Of
>>>>> Samir Khanal
>>>>> Sent: Thursday, February 26, 2009 2:32 PM
>>>>> To: mpich-discuss at mcs.anl.gov
>>>>> Subject: Re: [mpich-discuss] Fatal error in MPI_Test: Invalid
>>>>> MPI_Request
>>>>>
>>>>> Hi Rajeev, list
>>>>>
>>>>> I am using the mpi.h from mpich2.
>>>>> The program compiled fine (I am building a library) on an
>>>>> x86 box and ran without error with mpiexec 0.82 and mpich 1.2.7.
>>>>> I am trying to replicate the same on an x86_64 system with
>>>>> mpich 1.2.7 and mpiexec 0.83,
>>>>> and with mpich2 and its mpiexec.
>>>>> The library compiles, but I get the same error as in my
>>>>> previous email.
>>>>> With mpich 1.2.7 / mpiexec 0.82 I get the P4 sigx 15 error
>>>>> (which Gus suggested was due to the old library, so I compiled
>>>>> mpich2 with the nemesis channel..)
>>>>> With mpich2 and its mpiexec, the job output is the following:
>>>>> This job is running on following Processors
>>>>> compute-0-3 compute-0-3 compute-0-3 compute-0-3 compute-0-2
>>>>> compute-0-2 compute-0-2 compute-0-2
>>>>>
>>>>> Fatal error in MPI_Test: Invalid MPI_Request, error stack:
>>>>> MPI_Test(152): MPI_Test(request=0x7098388,
>>>>> flag=0x7fffdda3ea34, status=0x7fffdda3e9a0) failed
>>>>> MPI_Test(75).: Invalid MPI_RequestFatal error in MPI_Test:
>>>>> Invalid MPI_Request, error stack:
>>>>> MPI_Test(152): MPI_Test(request=0x3b95388,
>>>>> flag=0x7fffb21504b4, status=0x7fffb2150420) failed
>>>>> MPI_Test(75).: Invalid MPI_Requestrank 3 in job 1
>>>>> compute-0-3.local_43455   caused collective abort of all ranks
>>>>>   exit status of rank 3: killed by signal 9
>>>>>
>>>>> FYI, this application was written to run on a Gentoo box with
>>>>> mpich 1.2.5/7 and mpiexec (from OSC) v0.75.
>>>>> I am trying to port it to a new 64-bit cluster, with all
>>>>> sorts of problems.
>>>>>
>>>>> :-(
>>>>> Samir
>>>>>
>>>>> ________________________________________
>>>>> From: mpich-discuss-bounces at mcs.anl.gov
>>>>> [mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Rajeev
>>>>> Thakur [thakur at mcs.anl.gov]
>>>>> Sent: Monday, February 23, 2009 2:07 PM
>>>>> To: mpich-discuss at mcs.anl.gov
>>>>> Subject: Re: [mpich-discuss] Fatal error in MPI_Test: Invalid
>>>>> MPI_Request
>>>>>
>>>>> This can happen if you use an mpif.h or mpi.h from some other
>>>>> implementation. Remove any mpi*.h in the application directory
>>>>> and don't provide any paths to mpi*.h; mpic* will pick up the
>>>>> right file.
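>>>>>
>>>>> (Roughly why this breaks: the two headers declare the request
>>>>> handle quite differently, e.g.
>>>>>
>>>>>    /* MPICH2 mpi.h */
>>>>>    typedef int MPI_Request;
>>>>>
>>>>>    /* Open MPI mpi.h */
>>>>>    typedef struct ompi_request_t *MPI_Request;
>>>>>
>>>>> so an object file built against one header hands MPI_Test a value
>>>>> the other library cannot interpret, which shows up as "Invalid
>>>>> MPI_Request".)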
>>>>>
>>>>> Rajeev
>>>>>
>>>>>
>>>>>> -----Original Message-----
>>>>>> From: mpich-discuss-bounces at mcs.anl.gov
>>>>>> [mailto:mpich-discuss-bounces at mcs.anl.gov] On Behalf Of
>>>>>> Samir Khanal
>>>>>> Sent: Monday, February 23, 2009 11:35 AM
>>>>>> To: mpich-discuss at mcs.anl.gov
>>>>>> Subject: Re: [mpich-discuss] Fatal error in MPI_Test: Invalid
>>>>>> MPI_Request
>>>>>>
>>>>>> Hi
>>>>>>
>>>>>> [skhanal at comet ~]$ g++ -v
>>>>>> Using built-in specs.
>>>>>> Target: x86_64-redhat-linux
>>>>>> Configured with: ../configure --prefix=/usr
>>>>>> --mandir=/usr/share/man --infodir=/usr/share/info
>>>>>> --enable-shared --enable-threads=posix
>>>>>> --enable-checking=release --with-system-zlib
>>>>>> --enable-__cxa_atexit --disable-libunwind-exceptions
>>>>>> --enable-libgcj-multifile
>>>>>> --enable-languages=c,c++,objc,obj-c++,java,fortran,ada
>>>>>> --enable-java-awt=gtk --disable-dssi --enable-plugin
>>>>>> --with-java-home=/usr/lib/jvm/java-1.4.2-gcj-1.4.2.0/jre
>>>>>> --with-cpu=generic --host=x86_64-redhat-linux
>>>>>> Thread model: posix
>>>>>> gcc version 4.1.2 20071124 (Red Hat 4.1.2-42)
>>>>>> [skhanal at comet ~]$ which mpicxx
>>>>>> ~/mpich2/bin/mpicxx
>>>>>> [skhanal at comet ~]$ which mpicc
>>>>>> ~/mpich2/bin/mpicc
>>>>>> [skhanal at comet ~]$ which mpiexec
>>>>>> ~/mpich2/bin/mpiexec
>>>>>>
>>>>>> I have installed everything in my home directory.
>>>>>>
>>>>>> When I compile, I do:
>>>>>> [skhanal at comet ~]$ /home/skhanal/mpich2/bin/mpicxx -L
>>>>>> /home/skhanal/bgtw/lib -lbgtw bgtwRingTest.cpp -o Ring
>>>>>>
>>>>>> [skhanal at comet ~]$ ./Ring
>>>>>> Fatal error in MPI_Test: Invalid MPI_Request, error stack:
>>>>>> MPI_Test(152): MPI_Test(request=0x16ae9388,
>>>>>> flag=0x7fff7a7599c4, status=0x7fff7a759930) failed
>>>>>> MPI_Test(75).: Invalid MPI_Request
>>>>>>
>>>>>> The library needs an mpi.h file to include; I gave
>>>>>> /home/skhanal/mpich2/include/mpi.h as an absolute path.
>>>>>>
>>>>>> Any clues?
>>>>>>
>>>>>> Thanks
>>>>>> Samir
>>>>>>
>>>>>> ________________________________________
>>>>>> From: mpich-discuss-bounces at mcs.anl.gov
>>>>>> [mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Gus Correa
>>>>>> [gus at ldeo.columbia.edu]
>>>>>> Sent: Monday, February 23, 2009 12:32 PM
>>>>>> To: Mpich Discuss
>>>>>> Subject: Re: [mpich-discuss] Fatal error in MPI_Test: Invalid
>>>>>> MPI_Request
>>>>>>
>>>>>> Hi Samir, list
>>>>>>
>>>>>> I am wondering if the mpicxx and mpiexec you are using
>>>>>> belong to the same MPICH2 build (considering the problems you
>>>>>> reported before).
>>>>>>
>>>>>> What is the output of "which mpicxx" and "which mpiexec"?
>>>>>>
>>>>>> You may want to use full path names to mpicxx and mpiexec,
>>>>>> as Anthony Chan recommended in another email.
>>>>>> Problems with PATH, and with the multiple versions and builds of MPI
>>>>>> that hang around on all Linux computers,
>>>>>> have been an endless source of frustration for many.
>>>>>> I myself prefer to use full path names when I am testing
>>>>>> MPI programs, to avoid any confusion and distress.
>>>>>>
>>>>>> I hope this helps,
>>>>>> Gus Correa
>>>>>>
>>>>>> ---------------------------------------------------------------------
>>>>>> Gustavo Correa
>>>>>> Lamont-Doherty Earth Observatory - Columbia University
>>>>>> Palisades, NY, 10964-8000 - USA
>>>>>>
>>>>>> ---------------------------------------------------------------------
>>>>>> Samir Khanal wrote:
>>>>>>> Hi All
>>>>>>> I tried and did the following.
>>>>>>>
>>>>>>> [skhanal at comet ~]$ mpicxx -L /home/skhanal/bgtw/lib -lbgtw
>>>>>> bgtwRingTest.cpp -o Ring
>>>>>>> [skhanal at comet ~]$ mpiexec -n 4 ./Ring
>>>>>>> Fatal error in MPI_Test: Invalid MPI_Request, error stack:
>>>>>>> MPI_Test(152): MPI_Test(request=0x1f820388,
>>>>>> flag=0x7fffb8236134, status=0x7fffb82360a0) failed
>>>>>>> MPI_Test(75).: Invalid MPI_Requestrank 0 in job 35
>>>>>> comet.cs.bgsu.edu_35155   caused collective abort of all ranks
>>>>>>>   exit status of rank 0: killed by signal 9
>>>>>>>
>>>>>>> What does this mean?
>>>>>>> Samir
>>>>>>>
>>>>>>> Ps: I am using mpich2 1.0.8


