[petsc-dev] ugly shit

Kai Germaschewski kai.germaschewski at unh.edu
Fri Nov 11 00:09:11 CST 2011


On Thu, Nov 10, 2011 at 10:56 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:

> On Thu, Nov 10, 2011 at 21:14, Kai Germaschewski <
> kai.germaschewski at unh.edu> wrote:
>
>> Just a (kinda arbitrary) data point: I introduced MPI_IN_PLACE in my code
>> a year or so ago (not realizing that it requires MPI-2.0 at the time) and
>> within a year, I hit two cases of machines that were still running MPI 1.x
>> -- and the code doesn't get to run in all that many places in the first
>> place.
>>
>
> What sort of configurations were these?
>

One was NASA's CCMC (Community Coordinated Modeling Center) running
mvapich; the other, IIRC, was a local cluster with mpich-gm (Myrinet).

> I agree, this is mostly cosmetic, but there are more MPI-2 features that
> would also be useful. There are places where having MPI 1-sided would
> greatly simplify things, for example.
>

If there are other useful features of MPI-2 you want to use, that seems
like a much better reason.

--Kai



-- 
Kai Germaschewski
Assistant Professor, Dept of Physics / Space Science Center
University of New Hampshire, Durham, NH 03824
office: Morse Hall 245E
phone:  +1-603-862-2912
fax: +1-603-862-2771