[mpich-discuss] MPI_Comm_accept() / connect() errors

Blesson Varghese b.varghese at student.reading.ac.uk
Thu Oct 1 05:56:52 CDT 2009

The following is the information you have requested. 

- The output of the "ompi_info --all" is attached with the email
- PATH Variable:
- LD_LIBRARY_PATH variable was empty
- The following is the output of ifconfig on hpcc00 from where the error has
been generated:
eth0      Link encap:Ethernet  HWaddr 00:12:3f:4c:2d:78
          inet addr:  Bcast:
          inet6 addr: fe80::212:3fff:fe4c:2d78/64 Scope:Link
          RX packets:15912728 errors:0 dropped:0 overruns:0 frame:0
          TX packets:15312376 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000
          RX bytes:2951880321 (2.7 GB)  TX bytes:2788249498 (2.5 GB)

lo        Link encap:Local Loopback
          inet addr:  Mask:
          inet6 addr: ::1/128 Scope:Host
          UP LOOPBACK RUNNING  MTU:16436  Metric:1
          RX packets:3507489 errors:0 dropped:0 overruns:0 frame:0
          TX packets:3507489 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:0
          RX bytes:1794266658 (1.6 GB)  TX bytes:1794266658 (1.6 GB)


-----Original Message-----
From: mpich-discuss-bounces at mcs.anl.gov
[mailto:mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Jeff Squyres
Sent: 30 September 2009 13:23
To: mpich-discuss at mcs.anl.gov
Subject: Re: [mpich-discuss] MPI_Comm_accept() / connect() errors

Can you send all the information listed here:



On Sep 30, 2009, at 8:16 AM, Blesson Varghese wrote:

> Hi,
> I am running MPI 2.0 on Ubuntu 4.2.4, kernel version 2.6.24.
> I have been trying to execute the server.c and client.c programs  
> provided at http://www.mpi-forum.org/docs/mpi21-report/node213.htm#Node213,  
> which use the MPI_Comm_accept() and MPI_Comm_connect() functions in MPI.  
> However, the following errors are generated:
> [hpcc00:16522] *** An error occurred in MPI_Comm_connect
> [hpcc00:16522] *** on communicator MPI_COMM_WORLD
> [hpcc00:16522] *** MPI_ERR_INTERN: internal error
> [hpcc00:16522] *** MPI_ERRORS_ARE_FATAL (goodbye)
> I ran the server program as mpirun -np 1 server. This program  
> reported the port name 0.1.0:2000. I used this port name as the  
> command-line argument to run the client program as mpirun -np 1  
> client 0.1.1:2000
> Could you please advise?
> Many thanks,
> Blesson.
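
For readers following the thread, a minimal sketch of the accept/connect pattern from the MPI-Forum example referenced above (not the exact server.c/client.c from that page, and untested against the poster's setup) looks roughly like this:

```c
/* Sketch of the MPI-2 dynamic-process pattern: run one copy with no
 * arguments to act as the server, and a second copy with the server's
 * printed port name as argv[1] to act as the client. Requires an MPI
 * implementation (compile with mpicc, launch with mpirun). */
#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv)
{
    char port_name[MPI_MAX_PORT_NAME];
    MPI_Comm other;

    MPI_Init(&argc, &argv);

    if (argc > 1) {
        /* Client: connect using the exact port string the server printed. */
        MPI_Comm_connect(argv[1], MPI_INFO_NULL, 0, MPI_COMM_SELF, &other);
    } else {
        /* Server: obtain a system-chosen port name and print it for the
         * client to use on its command line. */
        MPI_Open_port(MPI_INFO_NULL, port_name);
        printf("server listening on port: %s\n", port_name);
        fflush(stdout);
        MPI_Comm_accept(port_name, MPI_INFO_NULL, 0, MPI_COMM_SELF, &other);
    }

    /* ... communicate over 'other' ... */

    MPI_Comm_disconnect(&other);
    if (argc <= 1)
        MPI_Close_port(port_name);
    MPI_Finalize();
    return 0;
}
```

Note that MPI_Comm_connect() must be given the port name exactly as MPI_Open_port() produced it; in the report above the server printed 0.1.0:2000 but the client was invoked with 0.1.1:2000, and any such mismatch would by itself prevent the connection from being established.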

Jeff Squyres
jsquyres at cisco.com

-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: outputompi.txt
URL: <http://lists.mcs.anl.gov/pipermail/mpich-discuss/attachments/20091001/011908e6/attachment-0001.txt>
