[petsc-users] 3.11 configure error on pleiades

Matthew Knepley knepley at gmail.com
Sat Mar 30 15:41:46 CDT 2019


On Sat, Mar 30, 2019 at 4:31 PM Kokron, Daniel S. (ARC-606.2)[InuTeq, LLC]
via petsc-users <petsc-users at mcs.anl.gov> wrote:

> Last time I built PETSc on Pleiades it was version 3.8.3.  Using the same
> build procedure with the same compilers and MPI libraries with 3.11 does
> not work.  Is there a way to enable more verbose diagnostics during the
> configure phase so I can figure out what executable was being run and how
> it was compiled?
>
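
As a general note, configure writes a complete record to configure.log at the top of the PETSc tree; every test's compile and link commands and the raw output of anything it runs should be captured there, so that is the first place to look for which executable was built and how. A minimal sketch of checking it (assuming a source-tree build where $PETSC_DIR points at the tree):

  less $PETSC_DIR/configure.log
  grep -n "mpiexec" $PETSC_DIR/configure.log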

This is not the right option:

  --with-mpi-exec=mpiexec

it is

  --with-mpiexec=mpiexec
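
For example, the MPI-related part of the configure line would become something like the snippet below; since the CMPT errors further down insist on mpiexec_mpt, pointing --with-mpiexec at mpiexec_mpt is likely what this system needs (a sketch only, not a tested command):

  ./configure ... --with-mpi=true --with-mpiexec=mpiexec_mpt \
      --with-precision=double --with-scalar-type=real --with-x=0 ...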

  Thanks,

      Matt

> PBS r147i6n10 24> ./configure --prefix=/nobackupp8/XXX
> /Projects/CHEM/BoA_Case/Codes-2018.3.222/binaries/petsc-3.11+
> --with-debugging=0 --with-shared-libraries=1 --with-cc=mpicc
> --with-fc=mpif90 --with-cxx=mpicxx
> --with-blas-lapack-dir=$MKLROOT/lib/intel64
> --with-scalapack-include=$MKLROOT/include
> --with-scalapack-lib="$MKLROOT/lib/intel64/libmkl_scalapack_lp64.so
> $MKLROOT/lib/intel64/libmkl_blacs_sgimpt_lp64.so" --with-cpp=/usr/bin/cpp
> --with-gnu-compilers=0 --with-vendor-compilers=intel -COPTFLAGS="-g -O3
> -xCORE-AVX2 -diag-disable=cpu-dispatch" -CXXOPTFLAGS="-g -O3 -xCORE-AVX2
> -diag-disable=cpu-dispatch" -FOPTFLAGS="-g -O3 -xCORE-AVX2
> -diag-disable=cpu-dispatch" --with-mpi=true --with-mpi-exec=mpiexec
> --with-mpi-compilers=1 --with-precision=double --with-scalar-type=real
> --with-x=0 --with-x11=0 --with-memalign=32
>
>
>
> I get this, which usually means that an executable was linked with libmpi
> but was not launched with mpiexec.
>
>
>
> TESTING: configureMPITypes from
> config.packages.MPI(/nobackupp8/dkokron/Projects/CHEM/BoA_Case/Codes-2018.3.222/petsc/config/BuildSystem/config/packages/MPI.py:283)
>
> CMPT ERROR: mpiexec_mpt must be used to launch all MPI applications
>
> CMPT ERROR: mpiexec_mpt must be used to launch all MPI applications
>
> CMPT ERROR: mpiexec_mpt must be used to launch all MPI applications
>
>
>
> If I let it continue, configure reports that MPI is empty.
>
>
>
> make:
>
> BLAS/LAPACK:
> -Wl,-rpath,/nasa/intel/Compiler/2018.3.222/compilers_and_libraries_2018.3.222/linux/mkl/lib/intel64
> -L/nasa/intel/Compiler/2018.3.222/compilers_and_libraries_2018.3.222/linux/mkl/lib/intel64
> -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread
>
> MPI:
>
> cmake:
>
> pthread:
>
> scalapack:
>
>
>
> Daniel Kokron
> Redline Performance Solutions
> SciCon/APP group
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/