[petsc-users] [petsc-dev] configure option missing for MPI.h / on IBM machine

Aron Ahmadia aron.ahmadia at kaust.edu.sa
Thu Feb 10 13:45:36 CST 2011


I was wondering this myself.  On the BlueGene line the MPI installation is
based on MPICH, so the mpi* compilers behave as you'd expect on any other
MPICH install.  I'm not familiar with the voodoo in the IBM-HPC toolkit or
the intricacies of this particular machine, but since it's obviously being
administered by *somebody* (see the modules in Leo's environment), I'd
expect the administrators to have gotten it right.

A

On Thu, Feb 10, 2011 at 8:58 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
>  Aron,
>
>   Shouldn't mpcc and mpfort manage providing the include directories
> and libraries automatically (like everyone else's mpicc etc. does)? It seems very
> cumbersome that users need to know these strange directories and include them
> themselves. A real step backwards in usability.
>
>    Barry
>
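For reference, an MPICH-style wrapper will report the include and library flags it injects, which is the behaviour Barry is describing; whether IBM's mpcc accepts a comparable option is not clear from this thread, so the command below is only a sketch of the usual MPICH check:

    # MPICH wrappers print the full underlying compile line, including the
    # -I, -L and -l flags they add automatically:
    mpicc -show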
> On Feb 10, 2011, at 7:44 AM, Aron Ahmadia wrote:
>
> > Add /opt/ibmhpc/ppe.poe/include/ibmmpi/ to your ./configure options like
> this:
> >
> > --with-mpi-include=/opt/ibmhpc/ppe.poe/include/ibmmpi/
> >
> > You may have to manually add the MPI libraries and their path as well,
> since BuildSystem tends to like these packaged together.  Ping the list back
> if you can't figure it out from there.
> >
> > -Aron
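A minimal sketch of the resulting configure invocation, assuming the PE MPI library sits alongside the headers (the library directory and library name below are guesses and should be checked against the actual install, e.g. by listing /opt/ibmhpc/ppe.poe/lib):

    ./configure --with-cc="mpcc -compiler xlc_r -q64" \
                --with-fc="mpfort -compiler xlf_r -q64" \
                --with-mpi-include=/opt/ibmhpc/ppe.poe/include/ibmmpi \
                --with-mpi-lib=/opt/ibmhpc/ppe.poe/lib/libmpi.a   # path and library name unverified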
> >
> > On Thu, Feb 10, 2011 at 3:23 PM, lvankampenhout at gmail.com <
> lvankampenhout at gmail.com> wrote:
> > Hi all, I'm getting this error when configuring the latest petsc-dev on an
> IBM PPC system.
> >
> >
> > TESTING: CxxMPICheck from
> config.packages.MPI(/gpfs/h01/vkampenh/install/petsc-dev/config/BuildSystem/config/packages/MPI.py:611)
> >
> *******************************************************************************
> >          UNABLE to CONFIGURE with GIVEN OPTIONS    (see configure.log for
> details):
> >
> -------------------------------------------------------------------------------
> > C++ error! mpi.h could not be located at: []
> >
> *******************************************************************************
> >
> >
> > My configure options: ./configure --with-batch=1 --with-mpi-shared=0
> --with-endian=big --with-memcmp-ok --sizeof-void-p=8 --sizeof-char=1
> --sizeof-short=2 --sizeof-int=4 --sizeof-long=8 --sizeof-size-t=8
> --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8
> --sizeof-MPI-Comm=8 --sizeof-MPI-Fint=4 --have-mpi-long-double=1
> --with-f90-interface=rs6000 --with-cc="mpcc -compiler xlc_r -q64"
> --with-fc="mpfort -compiler xlf_r -q64" --FFLAGS="-O3 -qhot -qstrict
> -qarch=auto -qtune=auto" --CFLAGS="-O3 -qhot -qstrict -qarch=auto
> -qtune=auto" --LIBS=-lmass_64 --with-ar=/usr/bin/ar
> --prefix=/sara/sw/petsc/3.0.0-p8/real --with-scalar-type=real
> PETSC_ARCH=linux-ibm-pwr6-xlf-real-64 --with-shared=0 -with-debugging=0
> --download-ml --download-hypre
> >
> >
> > vkampenh at p6012:~/install/petsc-dev> module list
> > Currently Loaded Modulefiles:
> >  1) compilerwrappers/yes  4) c++/ibm/11.1          7) upc/ibm/11.1
> >  2) java/ibm/1.5          5) fortran/ibm/13.1
> >  3) c/ibm/11.1            6) sara
> >
> >
> > vkampenh at p6012:~/install/petsc-dev> locate mpi.h
> > /opt/ibm/java2-ppc64-50/include/jvmpi.h
> > /opt/ibmhpc/ppe.poe/include/ibmmpi/mpi.h
> > /opt/mpich/include/mpi.h
> > /usr/include/boost/mpi.hpp
> > /usr/lib64/gcc/powerpc64-suse-linux/4.3/include/jvmpi.h
> > /usr/lib64/mpi/gcc/openmpi/include/mpi.h
> > /usr/lib64/mpi/gcc/openmpi/include/openmpi/ompi/mpi/f77/prototypes_mpi.h
> >
> /usr/src/linux-2.6.32.27-0.2-obj/ppc64/default/include/config/usb/serial/siemens/mpi.h
> >
> /usr/src/linux-2.6.32.27-0.2-obj/ppc64/ppc64/include/config/usb/serial/siemens/mpi.h
> >
> /usr/src/linux-2.6.32.27-0.2-obj/ppc64/trace/include/config/usb/serial/siemens/mpi.h
> > /usr/src/linux-2.6.32.27-0.2/drivers/message/fusion/lsi/mpi.h
> >
> >
> > Is there an easy way to add the IBMHPC/PPE.POE directory to the configure
> options, so that it gets recognized? The machine uses the LoadLeveler scheduling
> system, which handles the MPI settings.
> >
> > Thanks,
> > Leo
> >
> >
> >
>
>

