Is it possible for me to modify mpiexec (the Python script) so that it uses the pbind command to bind each instance of the executing program to a different processor? I would like to give that a shot.
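In case modifying mpiexec turns out to be messy, I was also thinking each rank could bind itself right after MPI_Init using processor_bind(2), which I believe is the system call behind pbind. A rough, untested sketch (assuming the Solaris processor IDs 0-7 are all online; the rank-to-CPU mapping is just a guess):

    #include <mpi.h>
    #include <stdio.h>
    #include <unistd.h>          /* sysconf() */
    #include <sys/types.h>
    #include <sys/processor.h>   /* processor_bind() */
    #include <sys/procset.h>     /* P_PID, P_MYID */

    int main(int argc, char *argv[])
    {
        int rank, ncpus;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Number of online processors (8 on this machine). */
        ncpus = (int) sysconf(_SC_NPROCESSORS_ONLN);

        /* Bind the calling process (P_MYID) to one CPU per rank.
           Assumes processor IDs are contiguous from 0, which is
           not guaranteed on every Solaris box. */
        if (processor_bind(P_PID, P_MYID, rank % ncpus, NULL) != 0)
            perror("processor_bind");

        /* ... the rest of the application ... */

        MPI_Finalize();
        return 0;
    }

If the processor IDs on the box are not contiguous, I suppose the IDs reported by psrinfo could be used for the mapping instead.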
Warm Regards,
Christina.
On 6/20/07, Rajeev Thakur <thakur@mcs.anl.gov> wrote:
> MPICH2 leaves the scheduling of processes to the OS. If the OS has some
> way to bind processes to processors, you could try using it.
>
> Rajeev
>
> > -----Original Message-----
> > From: owner-mpich-discuss@mcs.anl.gov
> > [mailto:owner-mpich-discuss@mcs.anl.gov] On Behalf Of Christina Patrick
> > Sent: Wednesday, June 20, 2007 4:12 PM
> > To: mpich-discuss-digest@mcs.anl.gov
> > Subject: [MPICH] Binding an instance of the MPI program to a
> > particular processor
> >
> > Hi everybody,
> >
> > I have an 8-processor Solaris 9 machine on which I want to execute an
> > MPI program. The problem is that the tasks created by mpiexec keep
> > migrating between the different processors. Since it is only one
> > machine, there is only one instance of the mpdboot daemon running on
> > it. Hence, when I execute the command below on the machine with 8
> > processors (for example, if the MPI program is named "finalized"), I
> > get this output:
> >
> >   # mpiexec -n 8 ./finalized
> >   0: No Errors
> >
> > When I examined the system using the "prstat" command, I observed
> > that the tasks migrate between the different processors.
> >
> > Is there any way in which I could bind each instance of the MPI
> > program to a different processor?
> >
> > Your suggestions and help are appreciated,
> >
> > Thanks,
> > Christina.