tma@freims:~$ mpiexec -l -n 2 -binding cpu -f ~/host_mpich env
[0] SHELL=/bin/bash
[0] SSH_CLIENT=192.168.159.239 59246 22
[0] LC_ALL=en_US.UTF-8
[0] USER=tma
[0] MAIL=/var/mail/tma
[0] PATH=/home/tma/opt/bin:/home/tma/opt/mpi/bin:/usr/local/bin:/usr/bin:/bin:/usr/games:/grid5000/code/bin
[0] PWD=/home/tma
[0] LANG=en_US.UTF-8
[0] SHLVL=1
[0] HOME=/home/tma
[0] LOGNAME=tma
[0] SSH_CONNECTION=192.168.159.239 59246 172.16.175.100 22
[0] _=/home/tma/opt/mpi/bin/mpiexec
[0] TERM=xterm
[0] OLDPWD=/home/tma/opt/mpi
[0] SSH_TTY=/dev/pts/26
[0] GFORTRAN_UNBUFFERED_PRECONNECTED=y
[0] MPICH_INTERFACE_HOSTNAME=stremi-4.reims.grid5000.fr
[0] PMI_RANK=0
[0] PMI_FD=6
[0] PMI_SIZE=2
[1] SHELL=/bin/bash
[1] SSH_CLIENT=192.168.159.239 59246 22
[1] LC_ALL=en_US.UTF-8
[1] USER=tma
[1] MAIL=/var/mail/tma
[1] PATH=/home/tma/opt/bin:/home/tma/opt/mpi/bin:/usr/local/bin:/usr/bin:/bin:/usr/games:/grid5000/code/bin
[1] PWD=/home/tma
[1] LANG=en_US.UTF-8
[1] SHLVL=1
[1] HOME=/home/tma
[1] LOGNAME=tma
[1] SSH_CONNECTION=192.168.159.239 59246 172.16.175.100 22
[1] _=/home/tma/opt/mpi/bin/mpiexec
[1] TERM=xterm
[1] OLDPWD=/home/tma/opt/mpi
[1] SSH_TTY=/dev/pts/26
[1] GFORTRAN_UNBUFFERED_PRECONNECTED=y
[1] MPICH_INTERFACE_HOSTNAME=stremi-4.reims.grid5000.fr
[1] PMI_RANK=1
[1] PMI_FD=7
[1] PMI_SIZE=2

and

tma@freims:~$ mpiexec -l -n 2 -f ~/host_mpich env
[0] SHELL=/bin/bash
[0] SSH_CLIENT=192.168.159.239 59246 22
[0] LC_ALL=en_US.UTF-8
[0] USER=tma
[0] MAIL=/var/mail/tma
[0] PATH=/home/tma/opt/bin:/home/tma/opt/mpi/bin:/usr/local/bin:/usr/bin:/bin:/usr/games:/grid5000/code/bin
[0] PWD=/home/tma
[0] LANG=en_US.UTF-8
[0] SHLVL=1
[0] HOME=/home/tma
[0] LOGNAME=tma
[0] SSH_CONNECTION=192.168.159.239 59246 172.16.175.100 22
[0] _=/home/tma/opt/mpi/bin/mpiexec
[0] TERM=xterm
[0] OLDPWD=/home/tma/opt/mpi
[0] SSH_TTY=/dev/pts/26
[0] GFORTRAN_UNBUFFERED_PRECONNECTED=y
[0] MPICH_INTERFACE_HOSTNAME=stremi-4.reims.grid5000.fr
[0] PMI_RANK=0
[0] PMI_FD=5
[0] PMI_SIZE=2
[1] SHELL=/bin/bash
[1] SSH_CLIENT=192.168.159.239 59246 22
[1] LC_ALL=en_US.UTF-8
[1] USER=tma
[1] MAIL=/var/mail/tma
[1] PATH=/home/tma/opt/bin:/home/tma/opt/mpi/bin:/usr/local/bin:/usr/bin:/bin:/usr/games:/grid5000/code/bin
[1] PWD=/home/tma
[1] LANG=en_US.UTF-8
[1] SHLVL=1
[1] HOME=/home/tma
[1] LOGNAME=tma
[1] SSH_CONNECTION=192.168.159.239 59246 172.16.175.100 22
[1] _=/home/tma/opt/mpi/bin/mpiexec
[1] TERM=xterm
[1] OLDPWD=/home/tma/opt/mpi
[1] SSH_TTY=/dev/pts/26
[1] GFORTRAN_UNBUFFERED_PRECONNECTED=y
[1] MPICH_INTERFACE_HOSTNAME=stremi-4.reims.grid5000.fr
[1] PMI_RANK=1
[1] PMI_FD=6
[1] PMI_SIZE=2
<br><br><br><div class="gmail_quote">On Tue, Aug 2, 2011 at 1:49 PM, Darius Buntinas <span dir="ltr"><<a href="mailto:buntinas@mcs.anl.gov">buntinas@mcs.anl.gov</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt 0.8ex; border-left: 1px solid rgb(204, 204, 204); padding-left: 1ex;">

Can you send us the output of the following?

mpiexec -l -n 2 -binding cpu -f ~/host_mpich env
and
mpiexec -l -n 2 -f ~/host_mpich env

Thanks,
-d

On Aug 2, 2011, at 12:18 PM, teng ma wrote:

> If -binding is removed, there is no problem scaling to 768 processes (32 nodes, 24 cores/node). Without the binding parameter, what binding strategy will mpich2 use? (Fill all slots of one node and then move to the next, or round-robin across nodes?)
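>
> (To check the actual placement empirically, here is a minimal Linux-only sketch, independent of MPICH internals, in which each rank reads back whatever affinity mask the launcher set and prints it:
>
>   #define _GNU_SOURCE
>   #include <stdio.h>
>   #include <sched.h>
>   #include <mpi.h>
>
>   int main(int argc, char **argv)
>   {
>       int rank, len, c;
>       char host[MPI_MAX_PROCESSOR_NAME];
>       cpu_set_t mask;
>
>       MPI_Init(&argc, &argv);
>       MPI_Comm_rank(MPI_COMM_WORLD, &rank);
>       MPI_Get_processor_name(host, &len);
>       /* Linux-specific: read back the affinity mask set (or not) by the launcher */
>       sched_getaffinity(0, sizeof(mask), &mask);
>       printf("rank %d on %s: cores", rank, host);
>       for (c = 0; c < CPU_SETSIZE; c++)
>           if (CPU_ISSET(c, &mask))
>               printf(" %d", c);
>       printf("\n");
>       MPI_Finalize();
>       return 0;
>   }
>
> Running this with and without -binding shows which node each rank lands on and which cores it is allowed to run on.)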
>
> Thanks
> Teng
>
> On Tue, Aug 2, 2011 at 1:14 PM, Pavan Balaji <balaji@mcs.anl.gov> wrote:
>
> Please keep mpich-discuss cc'ed. The error below doesn't seem to be a binding issue. Did you try removing the -binding option to see if it works without it?
>
>
> On 08/02/2011 12:12 PM, teng ma wrote:
> Thanks for the answer. I ran into another issue with hydra binding: when 408 or more processes are launched, it throws errors like the following:
>
>
> I run it like:
> mpiexec -n 408 -binding cpu -f ~/host_mpich ./IMB-MPI1 Bcast -npmin 408
> Fatal error in PMPI_Init_thread: Other MPI error, error stack:
> MPIR_Init_thread(388)..............:
> MPID_Init(139).....................: channel initialization failed
> MPIDI_CH3_Init(38).................:
> MPID_nem_init(234).................:
> MPID_nem_tcp_init(99)..............:
> MPID_nem_tcp_get_business_card(325):
> MPIDI_Get_IP_for_iface(276)........: ioctl failed errno=19 - No such device
> [the same eight-line error stack is repeated 12 more times, once per failing process]
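>
> (For reference, errno 19 is ENODEV, and the failing step is an interface-lookup ioctl. A standalone sketch of that kind of call, not MPICH's actual code, reproduces the message when the interface name does not exist on the node:
>
>   #include <stdio.h>
>   #include <string.h>
>   #include <errno.h>
>   #include <unistd.h>
>   #include <sys/ioctl.h>
>   #include <sys/socket.h>
>   #include <netinet/in.h>
>   #include <arpa/inet.h>
>   #include <net/if.h>
>
>   int main(int argc, char **argv)
>   {
>       const char *ifname = (argc > 1) ? argv[1] : "eth0";
>       struct ifreq ifr;
>       int fd = socket(AF_INET, SOCK_DGRAM, 0);
>
>       memset(&ifr, 0, sizeof(ifr));
>       strncpy(ifr.ifr_name, ifname, IFNAMSIZ - 1);
>       /* SIOCGIFADDR asks the kernel for the IPv4 address of ifname;
>        * it fails with errno 19 (ENODEV) if no such interface exists. */
>       if (ioctl(fd, SIOCGIFADDR, &ifr) < 0)
>           printf("%s: ioctl failed errno=%d - %s\n", ifname, errno, strerror(errno));
>       else
>           printf("%s -> %s\n", ifname,
>                  inet_ntoa(((struct sockaddr_in *)&ifr.ifr_addr)->sin_addr));
>       close(fd);
>       return 0;
>   }
>
> For example, "./a.out nosuchif0" prints "nosuchif0: ioctl failed errno=19 - No such device".)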
>
>
> When running up to 407 processes, -binding cpu/rr looks fine. If I remove -binding cpu/rr and use just -f ~/host_mpich, it is still OK no matter how many processes. My host_mpich (the :24 suffix gives each host 24 slots) is:
>
> <a href="http://stremi-7.reims.grid5000.fr:24" target="_blank">stremi-7.reims.grid5000.fr:24</a> <<a href="http://stremi-7.reims.grid5000.fr:24" target="_blank">http://stremi-7.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-35.reims.grid5000.fr:24" target="_blank">stremi-35.reims.grid5000.fr:24</a> <<a href="http://stremi-35.reims.grid5000.fr:24" target="_blank">http://stremi-35.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-28.reims.grid5000.fr:24" target="_blank">stremi-28.reims.grid5000.fr:24</a> <<a href="http://stremi-28.reims.grid5000.fr:24" target="_blank">http://stremi-28.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-38.reims.grid5000.fr:24" target="_blank">stremi-38.reims.grid5000.fr:24</a> <<a href="http://stremi-38.reims.grid5000.fr:24" target="_blank">http://stremi-38.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-32.reims.grid5000.fr:24" target="_blank">stremi-32.reims.grid5000.fr:24</a> <<a href="http://stremi-32.reims.grid5000.fr:24" target="_blank">http://stremi-32.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-26.reims.grid5000.fr:24" target="_blank">stremi-26.reims.grid5000.fr:24</a> <<a href="http://stremi-26.reims.grid5000.fr:24" target="_blank">http://stremi-26.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-22.reims.grid5000.fr:24" target="_blank">stremi-22.reims.grid5000.fr:24</a> <<a href="http://stremi-22.reims.grid5000.fr:24" target="_blank">http://stremi-22.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-43.reims.grid5000.fr:24" target="_blank">stremi-43.reims.grid5000.fr:24</a> <<a href="http://stremi-43.reims.grid5000.fr:24" target="_blank">http://stremi-43.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-30.reims.grid5000.fr:24" target="_blank">stremi-30.reims.grid5000.fr:24</a> <<a href="http://stremi-30.reims.grid5000.fr:24" target="_blank">http://stremi-30.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-41.reims.grid5000.fr:24" target="_blank">stremi-41.reims.grid5000.fr:24</a> <<a href="http://stremi-41.reims.grid5000.fr:24" target="_blank">http://stremi-41.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-4.reims.grid5000.fr:24" target="_blank">stremi-4.reims.grid5000.fr:24</a> <<a href="http://stremi-4.reims.grid5000.fr:24" target="_blank">http://stremi-4.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-34.reims.grid5000.fr:24" target="_blank">stremi-34.reims.grid5000.fr:24</a> <<a href="http://stremi-34.reims.grid5000.fr:24" target="_blank">http://stremi-34.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-24.reims.grid5000.fr:24" target="_blank">stremi-24.reims.grid5000.fr:24</a> <<a href="http://stremi-24.reims.grid5000.fr:24" target="_blank">http://stremi-24.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-23.reims.grid5000.fr:24" target="_blank">stremi-23.reims.grid5000.fr:24</a> <<a href="http://stremi-23.reims.grid5000.fr:24" target="_blank">http://stremi-23.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-20.reims.grid5000.fr:24" target="_blank">stremi-20.reims.grid5000.fr:24</a> <<a href="http://stremi-20.reims.grid5000.fr:24" target="_blank">http://stremi-20.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-36.reims.grid5000.fr:24" target="_blank">stremi-36.reims.grid5000.fr:24</a> <<a href="http://stremi-36.reims.grid5000.fr:24" target="_blank">http://stremi-36.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-29.reims.grid5000.fr:24" target="_blank">stremi-29.reims.grid5000.fr:24</a> <<a href="http://stremi-29.reims.grid5000.fr:24" target="_blank">http://stremi-29.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-19.reims.grid5000.fr:24" target="_blank">stremi-19.reims.grid5000.fr:24</a> <<a href="http://stremi-19.reims.grid5000.fr:24" target="_blank">http://stremi-19.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-42.reims.grid5000.fr:24" target="_blank">stremi-42.reims.grid5000.fr:24</a> <<a href="http://stremi-42.reims.grid5000.fr:24" target="_blank">http://stremi-42.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-39.reims.grid5000.fr:24" target="_blank">stremi-39.reims.grid5000.fr:24</a> <<a href="http://stremi-39.reims.grid5000.fr:24" target="_blank">http://stremi-39.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-27.reims.grid5000.fr:24" target="_blank">stremi-27.reims.grid5000.fr:24</a> <<a href="http://stremi-27.reims.grid5000.fr:24" target="_blank">http://stremi-27.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-44.reims.grid5000.fr:24" target="_blank">stremi-44.reims.grid5000.fr:24</a> <<a href="http://stremi-44.reims.grid5000.fr:24" target="_blank">http://stremi-44.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-37.reims.grid5000.fr:24" target="_blank">stremi-37.reims.grid5000.fr:24</a> <<a href="http://stremi-37.reims.grid5000.fr:24" target="_blank">http://stremi-37.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-31.reims.grid5000.fr:24" target="_blank">stremi-31.reims.grid5000.fr:24</a> <<a href="http://stremi-31.reims.grid5000.fr:24" target="_blank">http://stremi-31.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-6.reims.grid5000.fr:24" target="_blank">stremi-6.reims.grid5000.fr:24</a> <<a href="http://stremi-6.reims.grid5000.fr:24" target="_blank">http://stremi-6.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-33.reims.grid5000.fr:24" target="_blank">stremi-33.reims.grid5000.fr:24</a> <<a href="http://stremi-33.reims.grid5000.fr:24" target="_blank">http://stremi-33.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-3.reims.grid5000.fr:24" target="_blank">stremi-3.reims.grid5000.fr:24</a> <<a href="http://stremi-3.reims.grid5000.fr:24" target="_blank">http://stremi-3.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-2.reims.grid5000.fr:24" target="_blank">stremi-2.reims.grid5000.fr:24</a> <<a href="http://stremi-2.reims.grid5000.fr:24" target="_blank">http://stremi-2.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-40.reims.grid5000.fr:24" target="_blank">stremi-40.reims.grid5000.fr:24</a> <<a href="http://stremi-40.reims.grid5000.fr:24" target="_blank">http://stremi-40.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-21.reims.grid5000.fr:24" target="_blank">stremi-21.reims.grid5000.fr:24</a> <<a href="http://stremi-21.reims.grid5000.fr:24" target="_blank">http://stremi-21.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-5.reims.grid5000.fr:24" target="_blank">stremi-5.reims.grid5000.fr:24</a> <<a href="http://stremi-5.reims.grid5000.fr:24" target="_blank">http://stremi-5.reims.grid5000.fr:24</a>><br>
> <a href="http://stremi-25.reims.grid5000.fr:24" target="_blank">stremi-25.reims.grid5000.fr:24</a> <<a href="http://stremi-25.reims.grid5000.fr:24" target="_blank">http://stremi-25.reims.grid5000.fr:24</a>><br>
>
>
> mpich2 was built with the default configure options.
>
> Thanks
> Teng
>
> On Tue, Aug 2, 2011 at 12:43 PM, Pavan Balaji <balaji@mcs.anl.gov> wrote:
>
> mpiexec -binding rr
>
> -- Pavan
>
>
> On 08/02/2011 11:35 AM, teng ma wrote:
>
> If I want to do process-core binding like MVAPICH2's scatter mode, i.e. assign MPI ranks round-robin across the nodes in the host file, e.g. with hosts
> host1
> host2
> host3
>
> rank 0 -> host1, core 0
> rank 1 -> host2, core 0
> rank 2 -> host3, core 0
> rank 3 -> host1, core 1
> rank 4 -> host2, core 1
> rank 5 -> host3, core 1
>
> is there any easy method in mpich2-1.4 to achieve this binding?
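>
> (One possible way to get the round-robin rank placement itself, an untested sketch that assumes Hydra wraps around the host list again once every listed slot has been used, is a hostfile that advertises one slot per host:
>
>   host1:1
>   host2:1
>   host3:1
>
> With -n 6 the ranks would then land on host1, host2, host3, host1, host2, host3, and the -binding option would control which core each rank is pinned to on its node.)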
>
> Teng Ma
>
>
>
>
>
> --
> Pavan Balaji
> http://www.mcs.anl.gov/~balaji
>
>
>
> --
> Pavan Balaji
> http://www.mcs.anl.gov/~balaji
>
_______________________________________________
mpich-discuss mailing list
mpich-discuss@mcs.anl.gov
https://lists.mcs.anl.gov/mailman/listinfo/mpich-discuss