[mpich-discuss] MPICH2 Between Linux/Windows

Jayesh Krishna jayesh at mcs.anl.gov
Mon Aug 23 16:26:21 CDT 2010


Hi,
 From the results it looks like the Linux machine is unable to communicate with the Windows machine using the sock channel (connecting via TCP).

# Did you compile the same test case on both machines? (There are two versions: icpi.c, the interactive version, and cpi.c, the non-interactive version.)
# Did you specify the IP addresses of the machines in the machinefile? (If you are specifying hostnames in the machinefile, try specifying IP addresses instead; see the example below.)
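For example, a machinefile with the IP address of each machine on its own line, along these lines (the addresses below are only placeholders; use the actual addresses of your Windows and Linux machines):

    192.168.1.10
    192.168.1.20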

Regards,
Jayesh

----- Original Message -----
From: "Stephan Hackstedt" <stephan.hackstedt at googlemail.com>
To: jayesh at mcs.anl.gov
Sent: Monday, August 23, 2010 4:11:17 PM GMT -06:00 US/Canada Central
Subject: Re: [mpich-discuss] MPICH2 Between Linux/Windows

Hi, 

thanks for your advice. 

# Turn off any firewalls (windows firewall, linux firewall) running on the machines 
->done 
# Configure MPICH2 on linux with the sock channel (./configure ... --with-pm=smpd --with-device=ch3:sock ...) & SMPD process manager & install it. 
->done -> ./configure --prefix=.../project/mpich2_121 --with-pm=smpd --with-device=ch3:sock 
# Install the same version of MPICH2 (Same architecture & MPICH2 version) on the windows system. 
->done, I used the mpich2-1.2.1p1-win-ia32.msi installer 
# Make sure that you have the same user on both the machines (Either the same username/password or Use a domain user). Register that user on the windows machine (mpiexec -register). 
->done 
# Try running MPI jobs locally on the windows machine and the linux machine 
->works! 
# Try pinging each machine from the other machine 
->works! 
# Now try running a non-MPI job from the windows machine (mpiexec -n 2 -machinefile mf.txt -channel sock hostname; where mf.txt contains the ipaddresses of both the machines in separate lines) 
->works; the output of both processes is printed in the Windows terminal, and the program ends successfully. 
# If the above step is successful, try running cpi ( https://svn.mcs.anl.gov/repos/mpi/mpich2/trunk/examples/icpi.c ) on both the machines. e.g.: If you have icpi.exe in c:\test on the windows machine and icpi.exe (unix executable) on /home/stephan/test , you should specify the paths as below, 
mpiexec -n 2 -machinefile mf.txt -channel sock -path "c:\test;/home/stephan/test" icpi.exe 

->the job hangs directly after I enter the number of intervals, even if I choose 0 to quit; I also get no error message. 
>Enter the number of intervals: (0 quits) 1 
>_ 
>Enter the number of intervals: (0 quits) 0 
>_ 

I also tried the cpi example from the source tree, cpi.c. I debugged it, and as far as I can tell the error occurs exactly when MPI_Bcast is called: the program blocks and waits until I cancel it manually. Maybe the data transmission does not work correctly in my case? I don't know what could cause this. 
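(For reference, the part of cpi.c where it blocks is essentially the following; this is a shortened sketch based on the standard MPICH2 example, not my exact code:)

    #include <stdio.h>
    #include "mpi.h"

    int main(int argc, char *argv[])
    {
        int n, rank, size;
        double mypi = 0.0, pi = 0.0;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        if (rank == 0)
            n = 10000;   /* number of intervals; icpi.c reads this interactively instead */

        /* This is where the run blocks for me: rank 0 broadcasts n to the other ranks. */
        MPI_Bcast(&n, 1, MPI_INT, 0, MPI_COMM_WORLD);

        /* ... each rank would compute its partial sum mypi here ... */

        MPI_Reduce(&mypi, &pi, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);
        if (rank == 0)
            printf("pi is approximately %.16f\n", pi);

        MPI_Finalize();
        return 0;
    }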

The output is: 

>Process 0 of 2 is on stephanxp <- WinXP Machine 
>Process 1 of 2 is on stephan-desktop <- Linux Machine 
>_ 


regards, 
stephan 


2010/8/23 < jayesh at mcs.anl.gov > 


Hi, 
 Did you try explicitly specifying the channel on Windows (mpiexec -n 2 -channel sock -machinefile mf.txt hostname)? 
Please try the following steps (even if you have tried them before, please try them exactly as specified below; this helps us debug your problem) and let us know the results, 

# Turn off any firewalls (windows firewall, linux firewall) running on the machines 
# Configure MPICH2 on linux with the sock channel (./configure ... --with-pm=smpd --with-device=ch3:sock ...) & SMPD process manager & install it. 
# Install the same version of MPICH2 (Same architecture & MPICH2 version) on the windows system. 
# Make sure that you have the same user on both the machines (Either the same username/password or Use a domain user). Register that user on the windows machine (mpiexec -register). 
# Try running MPI jobs locally on the windows machine and the linux machine 
# Try pinging each machine from the other machine 
# Now try running a non-MPI job from the windows machine (mpiexec -n 2 -machinefile mf.txt -channel sock hostname; where mf.txt contains the ipaddresses of both the machines in separate lines) 
# If the above step is successful, try running cpi ( https://svn.mcs.anl.gov/repos/mpi/mpich2/trunk/examples/icpi.c ) on both the machines. eg: If you have icpi.exe in c:\test on the windows machine and icpi.exe (unix executable) on /home/stephan/test , you should specify the paths as below, 

mpiexec -n 2 -machinefile mf.txt -channel sock -path "c:\test;/home/stephan/test" icpi.exe 

Let us know the results (please provide as many details as possible, including the commands typed and the stdout/stderr output). 

Regards, 
Jayesh 




----- Original Message ----- 
From: "Stephan Hackstedt" < stephan.hackstedt at googlemail.com > 
To: mpich-discuss at mcs.anl.gov 
Sent: Monday, August 23, 2010 1:19:24 PM GMT -06:00 US/Canada Central 
Subject: Re: [mpich-discuss] MPICH2 Between Linux/Windows 


I also observed that, when I use the ch3:sock build, MPI_comm_open always reports the internal IP address, like: 

tag=0 port=46264 description=stephan-desktop ifname=127.0.1.1 
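(If it matters: the Ubuntu default /etc/hosts maps the hostname to the loopback address 127.0.1.1, roughly as below, and I have not changed mine, so that is presumably where the 127.0.1.1 above comes from:)

    127.0.0.1   localhost
    127.0.1.1   stephan-desktop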


regards, 
stephan 


2010/8/23 Stephan Hackstedt < stephan.hackstedt at googlemail.com > 


Hi, 

I'm trying to establish a connection between a process on Linux (Ubuntu 10.04) and a process on a WinXP machine, with MPICH2 1.2.1 on both machines: WinXP installed via the installer, Ubuntu built from source. 
I configured the linux side with 

./configure --with-pm=smpd --with-device=ch3:nemesis 

and another build with 

./configure --with-pm=smpd --with-device=ch3:sock , 

Neither of them works. I also disabled all firewalls. What puzzles me is that when I start a virtual machine on the WinXP machine, I can establish a connection to the Linux machine. Has anybody dealt with such problems before? 
One other question: do I have to rebuild an MPI application when using another MPICH2 build with a different process manager, as above? 

regards, 

stephan 




_______________________________________________ 
mpich-discuss mailing list 
mpich-discuss at mcs.anl.gov 
https://lists.mcs.anl.gov/mailman/listinfo/mpich-discuss 


