[mpich-discuss] mpich-g2 + firewalls + mpich-g2 over MPICH-based vendor-MPI

Rajeev Thakur thakur at mcs.anl.gov
Fri Mar 26 16:02:04 CDT 2010


The mailing list for MPICH-G2 is mpi at globus.org, which I have CCed here.
 
Rajeev


  _____  

From: mpich-discuss-bounces at mcs.anl.gov
[mailto:mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Vladimir Janjic
Sent: Friday, March 26, 2010 9:48 AM
To: mpich-discuss at mcs.anl.gov
Subject: [mpich-discuss] mpich-g2 + firewalls + mpich-g2 over
MPICH-based vendor-MPI


Hi all,

I have the following situation: I have four 8-core servers (say,
servers s1a, s1b, s1c, s1d) on one site, and six 4-core servers
(s2a, s2b, s2c, s2d, s2e, s2f) on the other site.
I want to run MPI applications on all of them, using the Globus
middleware. I have Globus installed on s1a, and on all machines on site
2 (s2a-s2f).
The firewall between s1a and the s2a-s2f machines has all the necessary
ports open, and I can use MPICH-G2 to run MPI applications on a grid
consisting of {s1a, s2a, s2b, s2c, s2d, s2e, s2f}.
However, I want to include the s1b-s1d machines in this system. The
problem is that there is a firewall blocking traffic from s1b-s1d to
s2a-s2f, and arranging for the institutions to open new holes in the
firewall is a long and tiresome process.
I have read that it is possible to set up MPICH-G2 to use a vendor MPI
for running programs. I have MPICH2 set up on the s1a-s1d machines, so I
can run MPI programs on them. Is it possible to set up
Globus and MPICH-G2 so that all communication among s1a-s1d uses the
local MPICH2 implementation, and all communication between s1b-s1d and
the s2a-s2f machines (in either direction) goes through the
head node s1a (since the firewall will not block that traffic)? So, for
example, if s2a needs to send a message to s1b, it sends it via s1a.
Do I need to have Globus and MPICH-G2 installed on the s1b-s1d servers
in that case?
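[Editor's note: the documented way to make MPICH-G2 use a vendor MPI within one cluster is to set jobtype=mpi on that cluster's subjob in the RSL multirequest; the GRAM jobmanager then launches the subjob through the local mpirun, so intra-cluster messages travel over the vendor MPI while inter-subjob messages travel over TCP via Globus. A rough sketch of such a multirequest follows; the contact strings, counts, and paths below are placeholders, not values from this thread:]

```
+
( &(resourceManagerContact="s1a.example.org/jobmanager-pbs")
   (count=32)
   (jobtype=mpi)
   (label="subjob 0")
   (environment=(GLOBUS_DUROC_SUBJOB_INDEX 0))
   (directory=/home/user)
   (executable=/home/user/myapp)
)
( &(resourceManagerContact="s2a.example.org/jobmanager-fork")
   (count=24)
   (label="subjob 1")
   (environment=(GLOBUS_DUROC_SUBJOB_INDEX 1))
   (directory=/home/user)
   (executable=/home/user/myapp)
)
```

[A file like this is typically handed to MPICH-G2's mpirun with -globusrsl, or submitted directly with globusrun. Note that this only selects the vendor MPI inside each subjob; whether inter-subjob traffic can be relayed through a single head node is a separate question.]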

I hope my question is at least remotely clear. :)

Thanks a lot,
Vladimir


