Hi Will,<br> You have to create the machinefile yourself; you can give it any name you like, and the process-to-node mappings in it should be written in the format described in the link I sent earlier. Also, first check that passwordless SSH login works between the nodes on which you want to run your MPI program. Finally, the mpiexec syntax you gave appears to be wrong; it should be "mpiexec -f machinefile -np "no-of-processes" "./name-of-your-mpi-executable".<br>
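For example, a minimal sketch of the steps above might look like this (node1, node2, the process counts, and ./hello are just placeholder names; substitute your own hostnames and executable):

```shell
# Create the machinefile yourself; each line is "hostname:processes-on-that-host".
# node1 and node2 are placeholders for your actual node hostnames.
cat > machinefile <<'EOF'
node1:2
node2:2
EOF

# One-time passwordless-ssh setup between the nodes (sketch):
#   ssh-keygen -t rsa            # accept the defaults
#   ssh-copy-id user@node2       # repeat for every node in the machinefile

# Then launch, e.g., 4 processes running a hypothetical ./hello executable:
#   mpiexec -f machinefile -np 4 ./hello
```

The mpiexec line is shown commented out since it only works once MPICH2 and passwordless ssh are set up on all the listed nodes.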
<br><br><br>Regards,<br>-- Mahesh Doijade <br>
<br><br><br><div class="gmail_quote">2011/8/24 张巍 <span dir="ltr"><<a href="mailto:zhweizi@126.com">zhweizi@126.com</a>></span><br><blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt 0.8ex; border-left: 1px solid rgb(204, 204, 204); padding-left: 1ex;">
<div style="line-height: 1.7; color: rgb(0, 0, 0); font-size: 14px; font-family: arial;">Hi Mahesh, thanks for your reply, I re-install mpich2-1.4, and here are "c.txt", "m.txt", "mi.txt",. <div>
<br></div><div>After "make", "make install", "Add the bin subdirectory", "check my 'mpicc' and 'mpiexec' ", and "ssh", when I type:<div> "mpiexec -f machinefile -n hostname", it says the "machinefile" can't be found.</div>
<div><br></div><div>I know the machinefile must be written with the hostnames and numbers of processors; however, I just can't find a machinefile on my computer...</div><div><br></div><div>I hate myself....</div><div><br></div><div>
Sorry to disturb you again, and with my regards.</div><div><br></div><div>Will ZHANG</div><div><div></div><div class="h5"><div><br><div></div><br>At 2011-08-15 23:44:51,"Mahesh Doijade" <<a href="mailto:maheshdoijade@gmail.com" target="_blank">maheshdoijade@gmail.com</a>> wrote:<br>
<blockquote style="padding-left: 1ex; margin: 0px 0px 0px 0.8ex; border-left: 1px solid rgb(204, 204, 204);"><br>Hi Will,<br> mpd has been replaced by hydra, which is now the default process manager for MPICH2. If you want to verify whether MPICH2 has been installed, check whether 'mpiexec' and 'mpicc' are present; to do this, run the commands 'which mpicc' and 'which mpiexec'.<br>
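For example, one rough way to script that check (check_cmd is just a made-up helper name, not part of MPICH2):

```shell
# Report whether a command is on the PATH; check_cmd is a made-up helper name.
check_cmd() {
    if command -v "$1" >/dev/null 2>&1; then
        echo "$1: found at $(command -v "$1")"
    else
        echo "$1: not found in PATH"
    fi
}

# If either of these reports "not found", the MPICH2 bin directory
# is probably not on your PATH (or the install did not finish).
check_cmd mpicc
check_cmd mpiexec
```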
Regarding the machinefile and executing MPI programs on a cluster, follow the steps given at this link: <a href="http://wiki.mcs.anl.gov/mpich2/index.php/Using_the_Hydra_Process_Manager" target="_blank">http://wiki.mcs.anl.gov/mpich2/index.php/Using_the_Hydra_Process_Manager</a>.<br>
<br>Regards,<br>-- Mahesh Doijade. <br><span style="font-size: large;"> </span><br><br><div class="gmail_quote">2011/8/15 张巍 <span dir="ltr"><<a href="mailto:zhweizi@126.com" target="_blank">zhweizi@126.com</a>></span><br>
<blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt 0.8ex; border-left: 1px solid rgb(204, 204, 204); padding-left: 1ex;">
<div><div><span style="font-size: large;">Hi, I'm a Linux amateur, and I have recently been studying clustering with MPI. The version I downloaded is the newest, "mpich2-1.4", and the OS is Red Hat Enterprise Linux 4 ES. I have two questions to ask, and I appreciate your attention. </span></div>
<div><span style="font-size: large;"><br></span></div><div><span style="font-size: large;">First, in the manual "Installer's Guide", 2.2-8, I should use the commands "which mpicc" and "which mpiexec" to check my installation. But according to information I've found on the internet, there should be four commands: "which mpd", "which mpicc", "which mpiexec", and "which mpirun". Someone also told me that the "mpd" command has been replaced by other commands such as "hydra". Does the fact that I can't use the "mpd" command mean that MPICH has not been installed successfully?
</span></div><div><span style="font-size: large;"><br></span></div><div><span style="font-size: large;">Second, in the manual "Installer's Guide", 2.2-10, I should modify the "machinefile" to ensure every computer in the cluster is listed, but I can't find a way to open or modify the "machinefile".</span></div>
<div><span style="font-size: large;"><br></span></div><div><span style="font-size: large;">Sorry to take your time; I'm looking forward to your reply. <br><br></span></div></div><div><span style="font-size: large;">Best regards.</span></div>
<div><span style="font-size: large;"><br></span></div><div><span style="font-size: large;">Will ZHANG </span></div><div><span style="font-size: large;">Tongji University, Shanghai, China</span></div>
<br><br><span title="neteasefooter"><span></span></span><br>_______________________________________________<br>
mpich-discuss mailing list<br>
<a href="mailto:mpich-discuss@mcs.anl.gov" target="_blank">mpich-discuss@mcs.anl.gov</a><br>
<a href="https://lists.mcs.anl.gov/mailman/listinfo/mpich-discuss" target="_blank">https://lists.mcs.anl.gov/mailman/listinfo/mpich-discuss</a><br>
<br></blockquote></div><br><br clear="all"><br><br>
<br><br>
</blockquote></div></div></div></div></div><br><br><span title="neteasefooter"><span></span></span></blockquote></div> <br><br>