<br><div class="gmail_quote">On Thu, Nov 10, 2011 at 10:56 PM, Jed Brown <span dir="ltr"><<a href="mailto:jedbrown@mcs.anl.gov">jedbrown@mcs.anl.gov</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
<div class="gmail_quote"><div class="im">On Thu, Nov 10, 2011 at 21:14, Kai Germaschewski <span dir="ltr"><<a href="mailto:kai.germaschewski@unh.edu" target="_blank">kai.germaschewski@unh.edu</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<div>Just a (kinda arbitrary) data point: I introduced MPI_IN_PLACE in my code a year or so ago (not realizing that it requires MPI-2.0 at the time) and within a year, I hit two cases of machines that were still running MPI 1.x -- and the code doesn't get to run in all that many places in the first place.<br>
</div></blockquote><div><br></div></div><div>What sort of configurations were these?</div></div></blockquote><div><br>One was NASA's CCMC (community coordinated modeling center) running mvapich, the other, IIRC, was a local cluster with mpich-gm (Myrinet).<br>
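(For concreteness, the construct that bit me is roughly the following -- just a
minimal sketch, not the actual code; global_sum(), vals, and nlocal are
invented names, and it assumes the implementation defines MPI_VERSION, which
MPI-1.2 and later do:)

#include <stdlib.h>
#include <string.h>
#include <mpi.h>

/* Sketch only: in-place global sum (MPI-2) vs. the MPI-1 two-buffer way. */
void global_sum(double *vals, int nlocal)
{
#if MPI_VERSION >= 2
    /* MPI-2: reduce in place, no scratch send buffer needed */
    MPI_Allreduce(MPI_IN_PLACE, vals, nlocal, MPI_DOUBLE, MPI_SUM,
                  MPI_COMM_WORLD);
#else
    /* MPI-1: copy into a temporary send buffer first */
    double *tmp = malloc(nlocal * sizeof(*tmp));
    memcpy(tmp, vals, nlocal * sizeof(*tmp));
    MPI_Allreduce(tmp, vals, nlocal, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);
    free(tmp);
#endif
}

The MPI-2 form is exactly the convenience the data point was about: same
result, just without carrying a second buffer around.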
<br></div><blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt 0.8ex; border-left: 1px solid rgb(204, 204, 204); padding-left: 1ex;"><div>I agree, this is mostly cosmetic, but there are more MPI-2 features that would also be useful. There are places where having MPI 1-sided would greatly simplify, for example.</div>
If there are other useful features of MPI-2 you want to use, that seems like a much better reason.
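For reference, I take the one-sided pattern to look roughly like this -- just a
rough sketch of MPI-2 RMA, not any actual code; fetch_remote(), table, and n
are invented names:

#include <mpi.h>

/* Rough sketch of MPI-2 one-sided access: every rank exposes an array
   through a window, and any rank can then read a remote entry with
   MPI_Get without the target having to post a matching receive.
   Window creation and the fences are collective, so all ranks in
   comm must call this together. */
void fetch_remote(double *table, int n, int target_rank, int target_index,
                  double *result, MPI_Comm comm)
{
    MPI_Win win;

    MPI_Win_create(table, (MPI_Aint)n * sizeof(double), sizeof(double),
                   MPI_INFO_NULL, comm, &win);

    MPI_Win_fence(0, win);
    MPI_Get(result, 1, MPI_DOUBLE, target_rank,
            (MPI_Aint)target_index, 1, MPI_DOUBLE, win);
    MPI_Win_fence(0, win);

    MPI_Win_free(&win);
}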
--Kai

--
Kai Germaschewski
Assistant Professor, Dept of Physics / Space Science Center
University of New Hampshire, Durham, NH 03824
office: Morse Hall 245E
phone: +1-603-862-2912
fax: +1-603-862-2771