[mpich-discuss] MPI_Publish_name/Open_port on windows

Jerome Soumagne soumagne at cscs.ch
Wed Sep 23 11:10:50 CDT 2009


Jayesh,

thanks for your response. We will use the "port in a file" method...
I think it would not be too hard to implement it ourselves, but it is
probably not worth doing since we only have one process doing the
Open_port, followed by a collective accept.
Thus only one process in our case would write to the file.
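
Something like this minimal sketch is what we have in mind on the
server side (the file name "port_name.txt" and the missing error
checking are illustrative only):

    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        char port_name[MPI_MAX_PORT_NAME];
        MPI_Comm client;
        int rank;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {
            /* only rank 0 opens the port and writes it to the file */
            MPI_Open_port(MPI_INFO_NULL, port_name);
            FILE *f = fopen("port_name.txt", "w");
            fprintf(f, "%s\n", port_name);
            fclose(f);
        }

        /* collective accept over MPI_COMM_WORLD; the port name is only
           significant at root 0, so the other ranks never touch the file */
        MPI_Comm_accept(port_name, MPI_INFO_NULL, 0, MPI_COMM_WORLD,
                        &client);

        /* ... exchange data with the other application over 'client' ... */

        MPI_Comm_disconnect(&client);
        MPI_Finalize();
        return 0;
    }

The client application would then read the string back from
port_name.txt and pass it to MPI_Comm_connect in the same way.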

Anyway, we will let you know if we write a patch to implement this
info key.

Best regards,

Jerome

On 23/09/2009 17:35, Jayesh Krishna wrote:
> Hi,
> From the code it looks like the info key, "ip_port", is not yet
> implemented for MPI_Open_port().
> Regards,
> Jayesh
>
> ------------------------------------------------------------------------
> *From:* mpich-discuss-bounces at mcs.anl.gov 
> [mailto:mpich-discuss-bounces at mcs.anl.gov] *On Behalf Of *John Biddiscombe
> *Sent:* Wednesday, September 23, 2009 9:46 AM
> *To:* mpich-discuss at mcs.anl.gov
> *Subject:* Re: [mpich-discuss] MPI_Publish_name/Open_port on windows
>
> Jayesh
>
>     MPI_Info info;
>     MPI_Info_create(&info);
>     MPI_Info_set(info, "ip_port", "1957");
>     MPI_Open_port(info, port_name);
>
> doesn't work either - by which we mean it doesn't give us the port we
> asked for. Is this simply a case of "nobody implemented it", so that
> if we modify the source to check for the info key we can make it
> work? In that case we'll patch the source and recompile.
>
> Thanks
>
> JB
>
>
>>   You can try opening an MPI port, writing the returned port name to
>> a file, and having the other processes read it. (You have to make
>> sure that only one process writes to the file, or use some locking
>> mechanism when modifying it.)
>>
>> Regards,
>> Jayesh
>>
>> -----Original Message-----
>> From: mpich-discuss-bounces at mcs.anl.gov 
>> [mailto:mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Jerome Soumagne
>> Sent: Wednesday, September 23, 2009 7:20 AM
>> To: mpich-discuss at mcs.anl.gov
>> Subject: [mpich-discuss] MPI_Publish_name/Open_port on windows
>>
>> Hi,
>>
>> I'm using MPI_Comm_connect and MPI_Comm_accept to set up the
>> communication between two different applications (simulation and
>> visualization apps), and I have two questions:
>>
>> - On Linux, I use Open_port/Publish_name/Lookup_name to get the
>> ip/port of an app server registered in the same mpd ring, and this
>> works fine. The problem is on Windows with the smpd daemon: whenever
>> I try to connect two simple tests (client/server), the client never
>> finds the registered name, even though the smpd service is up and
>> running. Is there a way to make the smpd service provide the same
>> features as the mpd daemon, so that I could still use Publish_name
>> and Lookup_name? (A sketch of the pattern that works for us under mpd
>> follows below.)
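>>
>> For reference, here is roughly the pattern that works for us under
>> mpd (the service name "my_visu_service" is illustrative):
>>
>>     /* in the server program: publish the port under a service name */
>>     char port_name[MPI_MAX_PORT_NAME];
>>     MPI_Comm client;
>>     MPI_Open_port(MPI_INFO_NULL, port_name);
>>     MPI_Publish_name("my_visu_service", MPI_INFO_NULL, port_name);
>>     MPI_Comm_accept(port_name, MPI_INFO_NULL, 0, MPI_COMM_WORLD,
>>                     &client);
>>
>>     /* in the client program: resolve the name instead of the port */
>>     char port_name[MPI_MAX_PORT_NAME];
>>     MPI_Comm server;
>>     MPI_Lookup_name("my_visu_service", MPI_INFO_NULL, port_name);
>>     MPI_Comm_connect(port_name, MPI_INFO_NULL, 0, MPI_COMM_WORLD,
>>                      &server);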
>>
>> - Instead of using Publish_name/Lookup_name, I tried to simply
>> define a port with MPI_Open_port by specifying an Info argument, but
>> it does not work. How can I specify an ip/port to use? I want to
>> always use the same specific port, and the MPI_PORT_RANGE environment
>> variable only lets me specify a range of ports. Depending on which
>> other MPI processes are running on the same machine for different
>> tasks, I get different ports, and if I restrict the range to
>> 2646:2646, the other MPI processes crash because they have no
>> available socket. How can I tell MPI_Open_port to use a specific
>> port? (A sketch of what I tried follows below.)
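>>
>> What I tried looks roughly like this ("ip_port" is the reserved
>> MPI-2 info key for this purpose, though an implementation is free to
>> ignore it):
>>
>>     MPI_Info info;
>>     char port_name[MPI_MAX_PORT_NAME];
>>     MPI_Info_create(&info);
>>     /* reserved MPI-2 key; may be ignored by the implementation */
>>     MPI_Info_set(info, "ip_port", "2646");
>>     MPI_Open_port(info, port_name);
>>     MPI_Info_free(&info);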
>>
>> Thank you very much in advance.
>>
>> Jerome Soumagne
>>
>> --
>> Jérôme Soumagne
>> CSCS, Swiss National Supercomputing Centre
>> Galleria 2, Via Cantonale  | Tel: +41 (0)91 610 8258
>> CH-6928 Manno, Switzerland | Fax: +41 (0)91 610 8282
>>
>>
>
>
> -- 
> John Biddiscombe,                            email:biddisco @ cscs.ch
> http://www.cscs.ch/
> CSCS, Swiss National Supercomputing Centre  | Tel:  +41 (91) 610.82.07
> Via Cantonale, 6928 Manno, Switzerland      | Fax:  +41 (91) 610.82.82


-- 
Jérôme Soumagne
CSCS, Swiss National Supercomputing Centre
Galleria 2, Via Cantonale  | Tel: +41 (0)91 610 8258
CH-6928 Manno, Switzerland | Fax: +41 (0)91 610 8282

