[mpich-discuss] how to make the hosts files

Reuti reuti at staff.uni-marburg.de
Sun May 15 11:50:21 CDT 2011


Hi,

Am 12.05.2011 um 12:48 schrieb Leon Yuhanov:

> Is there any way to do this with mpd instead?

You mean something like what was available in LAM/MPI, where you could
simply use all the known machines in the ring?

Wouldn't this only relocate the problem, as you would then need some
kind of known-machines file to start the mpd ring in the first place?
The same holds for PVM, where at some point you have to add each
machine.
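For reference, bringing up an mpd ring from such a file would look
roughly like this (just a sketch; the file name mpd.hosts and the
hosts listed in it are placeholders):

   mpdboot -n 3 -f mpd.hosts   # start mpd here and on the hosts in mpd.hosts
   mpdtrace                    # list the machines currently in the ring
   mpiexec -n 6 ./a.out        # launch across the ring
   mpdallexit                  # shut the ring down again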

-- Reuti


> Leon Yuhanov
> IT Manager - Philip Chun Advanced Technology
> Cell: +614 16 029 852
> Email: leon at philipchun.com
>
>
> Mandar Gurav <mandarwce at gmail.com> wrote:
>
> I would suggest you consult this page:
>
> http://wiki.mcs.anl.gov/mpich2/index.php/Using_the_Hydra_Process_Manager
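>
> If I remember the Hydra syntax right, you can also pass the hosts
> directly on the command line instead of writing a file, roughly like
> this (placeholder host names):
>
>   mpiexec -hosts host1,host2 -n 4 ./a.out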
>
> --Mandar Gurav
>
> On Thu, May 12, 2011 at 4:02 PM, Leon Yuhanov <leon at philipchun.com>  
> wrote:
>> Is there a way to run this process on multiple machines without a
>> file?
>> Leon Yuhanov
>> IT Manager - Philip Chun Advanced Technology
>> Cell: +614 16 029 852
>> Email: leon at philipchun.com
>>
>>
>> Mandar Gurav <mandarwce at gmail.com> wrote:
>>
>> Ohhh!
>>
>> You are running your job on a single machine! Just forget about the
>> machinefile altogether and execute
>>
>> "mpiexec -np 2 ./a.out"
>>
>> That's it.
>>
>> The machinefile or hosts file is required when you have two (or more)
>> machines with MPI installed on each of them and you want to run a
>> single program that spawns across those machines.
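>>
>> For example (just a sketch with placeholder host names), a hosts file
>> could contain
>>
>>   host1:6
>>   host2:6
>>
>> and you would launch with something like
>>
>>   mpiexec -f hostsfile -n 12 ./a.out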
>>
>> -- Mandar Gurav
>>
>> On Thu, May 12, 2011 at 2:03 PM, Pavan Balaji <balaji at mcs.anl.gov>  
>> wrote:
>>>
>>> Is there an actual machine with the name "host1" or "host2" in  
>>> your setup?
>>>
>>> If you are just running it on the local node, you should not give  
>>> the
>>> -machinefile or -f option.
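>>>
>>> In that case (a sketch based on the cpi example mentioned below) the
>>> local run would simply be
>>>
>>>   mpiexec -n 10 /usr/local/mpich2-1.3.2p1/examples/cpi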
>>>
>>> -- Pavan
>>>
>>> On 05/12/2011 03:28 AM, hyunduk kim wrote:
>>>>
>>>> Thanks for your response.
>>>> However, my setup is still not working.
>>>>
>>>> Here is what I have checked:
>>>> 1) I installed MPICH2 on a single Intel multi-core, 2-CPU machine.
>>>> 2) Checked the /etc/hosts file:
>>>>   127.0.0.1               localhost.localdomain localhost
>>>>   ::1                        localhost6.localdomain6 localhost6
>>>>
>>>> 3) Made the machinefile for mpiexec at
>>>> /usr/local/mpich2/machine/machinefile:
>>>>
>>>> host1:6
>>>> host2:6
>>>>
>>>> 4) Ran: [root@francium machine]# mpiexec -n 10 -machinefile
>>>> ./machinefile /usr/local/mpich2-1.3.2p1/examples/cpi
>>>>  ==> I received the messages below:
>>>>        ssh: connect to host host1 port 22: Connection timed out
>>>>        ssh: connect to host host2 port 22: Connection timed out
>>>>
>>>> My questions are:
>>>> 1) Why do I need to set up passwordless login between the two hosts?
>>>> 2) MPICH2 was installed on just the one multi-core, 2-CPU machine.
>>>> Why does mpiexec try to connect to host1 and host2 using port 22?
>>>> 3) Is there another method for defining the machinefile on a
>>>> multi-core, 2-CPU machine?
>>>>
>>>> I will attach my log files.
>>>>
>>>>
>>>>
>>>>
>>>
>>> --
>>> Pavan Balaji
>>> http://www.mcs.anl.gov/~balaji
>>>
>>
>>
>>
>> --
>> Mandar Gurav
>> http://www.mandargurav.org
>>
>
>
>
> -- 
> Mandar Gurav
> http://www.mandargurav.org
> _______________________________________________
> mpich-discuss mailing list
> mpich-discuss at mcs.anl.gov
> https://lists.mcs.anl.gov/mailman/listinfo/mpich-discuss


