[petsc-dev] Bad use of defined(MPI_XXX)

Jeff Hammond jeff.science at gmail.com
Fri May 24 17:23:26 CDT 2019


No, it's really not better to keep it.  MPI 2.2 support is ubiquitous.  It
has been 10 years, which is 1-2 lifetimes of an HPC system or PC.  Anybody
who insists on using an MPI library that doesn't support 2.2 should accept
that they must use a version of PETSc from 2018 or earlier.

In the HPC space, MPI 3.0 has been available on most machines for 5+
years.  The last platform that I used that didn't have MPI 2.2 support was
IBM Blue Gene/P and all of those machines were taken offline long ago.  As
of SC18, the MPI 3.1 support matrix (see below) is essentially complete and
the only feature that PETSc would need to test for is MS-MPI's lack of
neighborhood collectives.
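For what it's worth, mpi.h is required to define the MPI_VERSION and
MPI_SUBVERSION macros, so the standard level an implementation claims is
already checkable at compile time without any configure machinery.  A minimal
sketch (plain MPI, not PETSc code):

  #include <mpi.h>
  #include <stdio.h>

  int main(int argc, char **argv)
  {
    MPI_Init(&argc, &argv);
  /* MPI_VERSION and MPI_SUBVERSION must be provided by every mpi.h */
  #if MPI_VERSION > 3 || (MPI_VERSION == 3 && MPI_SUBVERSION >= 1)
    printf("mpi.h claims MPI %d.%d (>= 3.1)\n", MPI_VERSION, MPI_SUBVERSION);
  #else
    printf("mpi.h claims MPI %d.%d (< 3.1)\n", MPI_VERSION, MPI_SUBVERSION);
  #endif
    MPI_Finalize();
    return 0;
  }

The one MS-MPI gap would still want its own configure probe, since a library's
reported version and its actual feature set do not always line up.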

I am aware that people are using Open-MPI 1.10 in production today.  These
people are bad.  Don't allow their poor life choices to force the pollution
of PETSc source code with unnecessary macros.
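To make the pollution concrete, here is a schematic (not the actual PetscSF
source) of the guarded pattern around MPI_Reduce_local, which arrived in MPI
2.2 and is what PETSC_HAVE_MPI_REDUCE_LOCAL guards:

  #include <mpi.h>

  /* Guarded pattern forced by an MPI-2.0 baseline: keep a hand-rolled
     fallback alive for implementations that predate MPI 2.2. */
  static void local_sum_guarded(int *in, int *inout, int n)
  {
  #if defined(PETSC_HAVE_MPI_REDUCE_LOCAL)
    MPI_Reduce_local(in, inout, n, MPI_INT, MPI_SUM);
  #else
    for (int i = 0; i < n; i++) inout[i] += in[i];
  #endif
  }

  /* With MPI 2.2 as the minimum, both the macro and the fallback go away. */
  static void local_sum(int *in, int *inout, int n)
  {
    MPI_Reduce_local(in, inout, n, MPI_INT, MPI_SUM);
  }

Multiply that by every such guard in PetscSF and the maintenance cost of
catering to pre-2.2 MPIs is obvious.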

https://lists.mpi-forum.org/pipermail/mpi-forum/2014-June/006086.html <- MPI 3.0
https://lists.mpi-forum.org/pipermail/mpi-forum/2016-November/006532.html <- MPI 3.1
https://lists.mpi-forum.org/pipermail/mpi-forum/2018-November/006783.html <- MPI 3.1

Jeff

On Fri, May 24, 2019 at 2:15 PM Zhang, Junchao via petsc-dev <
petsc-dev at mcs.anl.gov> wrote:

> PetscSF has many PETSC_HAVE_MPI_REDUCE_LOCAL guards. It is disturbing. But
> considering the time gap between MPI-2.0 (1998) and MPI-2.2 (2009), it is
> better to keep it.
>
>
> On Fri, May 24, 2019 at 3:53 PM Jed Brown <jed at jedbrown.org> wrote:
>
>> "Zhang, Junchao" <jczhang at mcs.anl.gov> writes:
>>
>> > How about stuff in MPI-2.2 (approved in 2009), the last of MPI-2.x,
>> e.g., PETSC_HAVE_MPI_REDUCE_LOCAL?
>>
>> Currently we only require MPI-2.0, but I would not object to increasing
>> to MPI-2.1 or 2.2 if such systems are sufficiently rare (almost
>> nonexistent) in the wild.  I'm not sure how great the benefits are.
>>
>> > On Fri, May 24, 2019 at 2:51 PM Jed Brown via petsc-dev <
>> petsc-dev at mcs.anl.gov> wrote:
>> > Lisandro Dalcin via petsc-dev <petsc-dev at mcs.anl.gov> writes:
>> >
>> >> These two are definitely wrong; we need PETSC_HAVE_MPI_XXX instead.
>> >
>> > Thanks, we can delete both of these cpp guards.
>> >
>> >> include/petscsf.h:#if defined(MPI_REPLACE)
>> >
>> > MPI-2.0
>> >
>> >> src/sys/objects/init.c:#if defined(PETSC_USE_64BIT_INDICES) ||
>> >> !defined(MPI_2INT)
>> >
>> > MPI-1.0
>>
>
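On the two guards quoted above: MPI_REPLACE (an MPI_Op, MPI-2.0) and MPI_2INT
(an MPI_Datatype, MPI-1.0) are predefined handles, and the standard does not
guarantee they are preprocessor macros, so defined(MPI_XXX) can be silently
false even when the symbol exists.  The portable test is a configure-time
compile probe that sets a PETSC_HAVE_MPI_XXX macro; a sketch of such a probe
(hypothetical, not PETSc's actual BuildSystem test):

  #include <mpi.h>

  /* Compiles and links only if MPI_REPLACE really exists, whether the
     implementation spells it as a macro, an enum, or an extern handle. */
  int main(void)
  {
    MPI_Op op = MPI_REPLACE;
    (void)op;
    return 0;
  }

In this particular case the resolution above is even simpler: both symbols are
at or below the current MPI-2.0 baseline, so the guards can just be deleted.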

-- 
Jeff Hammond
jeff.science at gmail.com
http://jeffhammond.github.io/