[mpich-discuss] -channel shm
Jayesh Krishna
jayesh at mcs.anl.gov
Thu Oct 1 13:46:05 CDT 2009
Hi,
Make sure that you are using the latest stable version of MPICH2
(1.1.1p1) available at
http://www.mcs.anl.gov/research/projects/mpich2/downloads/index.php?s=downloads .
Let us know if you still have problems (if so, include in your email the
complete command that you use to launch your MPI job and, if possible, a
test program so that we can recreate the problem).
Regards,
Jayesh
_____
From: dave waite [mailto:waitedm at gmail.com]
Sent: Thursday, October 01, 2009 1:26 PM
To: 'Jayesh Krishna'
Subject: RE: [mpich-discuss] -channel shm
Jayesh,
When I run using -n 5 -channel shm, my application has one master rank
and 4 slave ranks. When I run using -n 5 -channel nemesis, all 5
instances of my application have rank = master. Are there some other
mpiexec arguments needed when running nemesis?
Thanks,
Dave Waite.
From: Jayesh Krishna [mailto:jayesh at mcs.anl.gov]
Sent: Thursday, October 01, 2009 11:00 AM
To: 'dave waite'
Cc: mpich-discuss at mcs.anl.gov
Subject: RE: [mpich-discuss] -channel shm
Hi,
The SHM channel is old (it is already deprecated on Unix and will be
deprecated on Windows in the upcoming release). Use the newer Nemesis
channel (-channel nemesis) instead.
Regards,
Jayesh
_____
From: mpich-discuss-bounces at mcs.anl.gov
[mailto:mpich-discuss-bounces at mcs.anl.gov] On Behalf Of dave waite
Sent: Thursday, October 01, 2009 12:43 PM
To: mpich-discuss at mcs.anl.gov
Subject: [mpich-discuss] -channel shm
Using mpich2 on Windows, when we run mpiexec using the -localonly -channel
shm options, we see that all cores spin at all times (100% CPU Usage
History, Windows Task Manager). Without the -channel shm option, cores
show CPU activity only when they are actively computing. Is there a way to
use the -channel shm option and not have the cores spin constantly?