tma@freims:~$ mpiexec -l -n 2 -binding cpu -f ~/host_mpich env
[0] SHELL=/bin/bash
[0] SSH_CLIENT=192.168.159.239 59246 22
[0] LC_ALL=en_US.UTF-8
[0] USER=tma
[0] MAIL=/var/mail/tma
[0] PATH=/home/tma/opt/bin:/home/tma/opt/mpi/bin:/usr/local/bin:/usr/bin:/bin:/usr/games:/grid5000/code/bin
[0] PWD=/home/tma
[0] LANG=en_US.UTF-8
[0] SHLVL=1
[0] HOME=/home/tma
[0] LOGNAME=tma
[0] SSH_CONNECTION=192.168.159.239 59246 172.16.175.100 22
[0] _=/home/tma/opt/mpi/bin/mpiexec
[0] TERM=xterm
[0] OLDPWD=/home/tma/opt/mpi
[0] SSH_TTY=/dev/pts/26
[0] GFORTRAN_UNBUFFERED_PRECONNECTED=y
[0] MPICH_INTERFACE_HOSTNAME=stremi-4.reims.grid5000.fr
[0] PMI_RANK=0
[0] PMI_FD=6
[0] PMI_SIZE=2
[1] SHELL=/bin/bash
[1] SSH_CLIENT=192.168.159.239 59246 22
[1] LC_ALL=en_US.UTF-8
[1] USER=tma
[1] MAIL=/var/mail/tma
[1] PATH=/home/tma/opt/bin:/home/tma/opt/mpi/bin:/usr/local/bin:/usr/bin:/bin:/usr/games:/grid5000/code/bin
[1] PWD=/home/tma
[1] LANG=en_US.UTF-8
[1] SHLVL=1
[1] HOME=/home/tma
[1] LOGNAME=tma
[1] SSH_CONNECTION=192.168.159.239 59246 172.16.175.100 22
[1] _=/home/tma/opt/mpi/bin/mpiexec
[1] TERM=xterm
[1] OLDPWD=/home/tma/opt/mpi
[1] SSH_TTY=/dev/pts/26
[1] GFORTRAN_UNBUFFERED_PRECONNECTED=y
[1] MPICH_INTERFACE_HOSTNAME=stremi-4.reims.grid5000.fr
[1] PMI_RANK=1
[1] PMI_FD=7
[1] PMI_SIZE=2

and

tma@freims:~$ mpiexec -l -n 2 -f ~/host_mpich env
[0] SHELL=/bin/bash
[0] SSH_CLIENT=192.168.159.239 59246 22
[0] LC_ALL=en_US.UTF-8
[0] USER=tma
[0] MAIL=/var/mail/tma
[0] PATH=/home/tma/opt/bin:/home/tma/opt/mpi/bin:/usr/local/bin:/usr/bin:/bin:/usr/games:/grid5000/code/bin
[0] PWD=/home/tma
[0] LANG=en_US.UTF-8
[0] SHLVL=1
[0] HOME=/home/tma
[0] LOGNAME=tma
[0] SSH_CONNECTION=192.168.159.239 59246 172.16.175.100 22
[0] _=/home/tma/opt/mpi/bin/mpiexec
[0] TERM=xterm
[0] OLDPWD=/home/tma/opt/mpi
[0] SSH_TTY=/dev/pts/26
[0] GFORTRAN_UNBUFFERED_PRECONNECTED=y
[0] MPICH_INTERFACE_HOSTNAME=stremi-4.reims.grid5000.fr
[0] PMI_RANK=0
[0] PMI_FD=5
[0] PMI_SIZE=2
[1] SHELL=/bin/bash
[1] SSH_CLIENT=192.168.159.239 59246 22
[1] LC_ALL=en_US.UTF-8
[1] USER=tma
[1] MAIL=/var/mail/tma
[1] PATH=/home/tma/opt/bin:/home/tma/opt/mpi/bin:/usr/local/bin:/usr/bin:/bin:/usr/games:/grid5000/code/bin
[1] PWD=/home/tma
[1] LANG=en_US.UTF-8
[1] SHLVL=1
[1] HOME=/home/tma
[1] LOGNAME=tma
[1] SSH_CONNECTION=192.168.159.239 59246 172.16.175.100 22
[1] _=/home/tma/opt/mpi/bin/mpiexec
[1] TERM=xterm
[1] OLDPWD=/home/tma/opt/mpi
[1] SSH_TTY=/dev/pts/26
[1] GFORTRAN_UNBUFFERED_PRECONNECTED=y
[1] MPICH_INTERFACE_HOSTNAME=stremi-4.reims.grid5000.fr
[1] PMI_RANK=1
[1] PMI_FD=6
[1] PMI_SIZE=2
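For what it's worth, a minimal check program (just a sketch; it assumes mpicc from this MPICH2 install and glibc's Linux-specific sched_getcpu()) makes it easy to see where each rank actually lands under -binding cpu, -binding rr, or no binding at all:

    /* whereami.c -- illustrative only: report host and current CPU per rank.
     * sched_getcpu() is a glibc/Linux extension, hence _GNU_SOURCE. */
    #define _GNU_SOURCE
    #include <sched.h>
    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, size, len;
        char host[MPI_MAX_PROCESSOR_NAME];

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        MPI_Get_processor_name(host, &len);

        /* With -binding cpu the CPU reported here should stay fixed for the
         * life of the process; without binding it may wander. */
        printf("rank %d of %d on %s cpu %d\n", rank, size, host, sched_getcpu());

        MPI_Finalize();
        return 0;
    }

Compiled with mpicc and launched the same way as IMB-MPI1 (for example, mpiexec -f ~/host_mpich -n 48 ./whereami), the reported host/CPU pairs show both the rank-to-node mapping and whether each process stays pinned.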
<br><br><br><div class="gmail_quote">On Tue, Aug 2, 2011 at 1:49 PM, Darius Buntinas <span dir="ltr">&lt;<a href="mailto:buntinas@mcs.anl.gov">buntinas@mcs.anl.gov</a>&gt;</span> wrote:<br><blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt 0.8ex; border-left: 1px solid rgb(204, 204, 204); padding-left: 1ex;">
<br>
Can you send us the output of the following?<br>
<br>
    mpiexec -l -n 2 -binding cpu -f ~/host_mpich env<br>
and<br>
    mpiexec -l -n 2 -f ~/host_mpich env<br>
<br>
Thanks,<br>
<font color="#888888">-d<br>
</font><div><div></div><div class="h5"><br>
On Aug 2, 2011, at 12:18 PM, teng ma wrote:<br>
<br>
> If -binding is removed, there is no problem scaling to 768 processes (32 nodes, 24 cores/node). Without the binding parameter, what binding strategy does MPICH2 use? (Fill all the slots of one node before moving to the next, or round-robin across nodes?)

>
> Thanks
> Teng
>
> On Tue, Aug 2, 2011 at 1:14 PM, Pavan Balaji <balaji@mcs.anl.gov> wrote:
>
> Please keep mpich-discuss cc'ed. The error below doesn't look like a binding issue. Did you try removing the -binding option to see whether it works without it?
>
>
> On 08/02/2011 12:12 PM, teng ma wrote:
> Thanks for the answer. I ran into another issue with Hydra's binding. When
> the number of launched processes exceeds 408, it throws errors like the following:
>
>
> I run it like:
> mpiexec -n 408 -binding cpu -f ~/host_mpich ./IMB-MPI1 Bcast -npmin 408
> Fatal error in PMPI_Init_thread: Other MPI error, error stack:
> MPIR_Init_thread(388)..............:
> MPID_Init(139).....................: channel initialization failed
> MPIDI_CH3_Init(38).................:
> MPID_nem_init(234).................:
> MPID_nem_tcp_init(99)..............:
> MPID_nem_tcp_get_business_card(325):
> MPIDI_Get_IP_for_iface(276)........: ioctl failed errno=19 - No such device
> [the same eight-line error stack is repeated verbatim for each of the other failing processes]
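The failing step is a per-interface ioctl that comes back with errno 19 (ENODEV, "No such device"), which means the interface name being queried does not exist on that node. A standalone sketch of that kind of lookup (an SIOCGIFADDR query on a user-supplied interface name; this is an illustration, not the actual MPICH code path) reproduces the same errno when pointed at a nonexistent interface:

    /* ifcheck.c -- illustrative sketch, not MPICH source: ask the kernel for
     * the IPv4 address of a named interface via SIOCGIFADDR and report errno
     * on failure (ENODEV == 19, "No such device"). */
    #include <arpa/inet.h>
    #include <errno.h>
    #include <net/if.h>
    #include <netinet/in.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/ioctl.h>
    #include <sys/socket.h>
    #include <unistd.h>

    int main(int argc, char **argv)
    {
        if (argc != 2) {
            fprintf(stderr, "usage: %s <interface>\n", argv[0]);
            return 1;
        }

        int fd = socket(AF_INET, SOCK_DGRAM, 0);
        if (fd < 0) {
            perror("socket");
            return 1;
        }

        struct ifreq ifr;
        memset(&ifr, 0, sizeof(ifr));
        strncpy(ifr.ifr_name, argv[1], IFNAMSIZ - 1);

        if (ioctl(fd, SIOCGIFADDR, &ifr) < 0) {
            /* For a missing interface this prints errno=19 (No such device). */
            fprintf(stderr, "ioctl failed errno=%d - %s\n", errno, strerror(errno));
            close(fd);
            return 1;
        }

        struct sockaddr_in *sin = (struct sockaddr_in *) &ifr.ifr_addr;
        printf("%s -> %s\n", argv[1], inet_ntoa(sin->sin_addr));
        close(fd);
        return 0;
    }

Running it on each host with the interface name in question (or simply checking the interfaces listed there) shows which nodes lack that interface.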
>
>
> When the process count is less than 407, -binding cpu/rr works fine. If I
> remove -binding cpu/rr and just use -f ~/host_mpich, it is still fine no
> matter how many processes I launch. My host_mpich looks like:
>
> stremi-7.reims.grid5000.fr:24
> stremi-35.reims.grid5000.fr:24
> stremi-28.reims.grid5000.fr:24
> stremi-38.reims.grid5000.fr:24
> stremi-32.reims.grid5000.fr:24
> stremi-26.reims.grid5000.fr:24
> stremi-22.reims.grid5000.fr:24
> stremi-43.reims.grid5000.fr:24
> stremi-30.reims.grid5000.fr:24
> stremi-41.reims.grid5000.fr:24
> stremi-4.reims.grid5000.fr:24
> stremi-34.reims.grid5000.fr:24
> stremi-24.reims.grid5000.fr:24
> stremi-23.reims.grid5000.fr:24
> stremi-20.reims.grid5000.fr:24
> stremi-36.reims.grid5000.fr:24
> stremi-29.reims.grid5000.fr:24
> stremi-19.reims.grid5000.fr:24
> stremi-42.reims.grid5000.fr:24
> stremi-39.reims.grid5000.fr:24
> stremi-27.reims.grid5000.fr:24
> stremi-44.reims.grid5000.fr:24
> stremi-37.reims.grid5000.fr:24
> stremi-31.reims.grid5000.fr:24
> stremi-6.reims.grid5000.fr:24
> stremi-33.reims.grid5000.fr:24
> stremi-3.reims.grid5000.fr:24
> stremi-2.reims.grid5000.fr:24
> stremi-40.reims.grid5000.fr:24
> stremi-21.reims.grid5000.fr:24
> stremi-5.reims.grid5000.fr:24
> stremi-25.reims.grid5000.fr:24

>
>
> MPICH2 was configured with just the default configure options.
>
> Thanks
> Teng
>
> On Tue, Aug 2, 2011 at 12:43 PM, Pavan Balaji <balaji@mcs.anl.gov> wrote:
>
>
>    mpiexec -binding rr
>
>      -- Pavan
>
>
>    On 08/02/2011 11:35 AM, teng ma wrote:
>
>        If I want to do process-core binding like MVAPICH2's scatter mode,
>        i.e. assign MPI ranks round-robin across the nodes in the host file, e.g.
>        host1
>        host2
>        host3
>
>        rank 0 -> host1's core 0
>        rank 1 -> host2's core 0
>        rank 2 -> host3's core 0
>        rank 3 -> host1's core 1
>        rank 4 -> host2's core 1
>        rank 5 -> host3's core 1
>
>        Is there an easy way in mpich2-1.4 to achieve this binding?
>
>        Teng Ma
>
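For reference, the scatter layout asked about above is just: with H hosts, rank r goes to host r mod H, core r / H. A tiny sketch (hypothetical host names and sizes hard-coded) that prints that intended mapping, so it can be compared against what a given mpiexec option actually produces:

    /* scatter_map.c -- illustrative only: print the MVAPICH2-style "scatter"
     * placement described above: rank r -> host (r % H), core (r / H). */
    #include <stdio.h>

    int main(void)
    {
        const char *hosts[] = { "host1", "host2", "host3" };   /* hypothetical */
        const int nhosts = 3;
        const int cores_per_host = 2;
        const int nranks = nhosts * cores_per_host;

        for (int r = 0; r < nranks; r++)
            printf("rank %d -> %s core %d\n", r, hosts[r % nhosts], r / nhosts);

        return 0;
    }

Whether -binding rr (suggested above) yields exactly this layout on a given host file is easiest to confirm empirically, for example with a rank/host/CPU check program like the one sketched near the top of this message.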
>
>
>        _______________________________________________
>        mpich-discuss mailing list
>        mpich-discuss@mcs.anl.gov
>        https://lists.mcs.anl.gov/mailman/listinfo/mpich-discuss
>
>
>    --
>    Pavan Balaji
>    http://www.mcs.anl.gov/~balaji
>
>
>
> --
> Pavan Balaji
> http://www.mcs.anl.gov/~balaji
>
> _______________________________________________
> mpich-discuss mailing list
> mpich-discuss@mcs.anl.gov
> https://lists.mcs.anl.gov/mailman/listinfo/mpich-discuss

_______________________________________________
mpich-discuss mailing list
mpich-discuss@mcs.anl.gov
https://lists.mcs.anl.gov/mailman/listinfo/mpich-discuss