[petsc-users] 3.11 configure error on pleiades
Balay, Satish
balay at mcs.anl.gov
Sat Mar 30 15:35:33 CDT 2019
configure creates configure.log with all the debugging details.
It's best to compare the configure.log from the successful 3.8.3 build with the
current one - and see what changed between the two builds.
[you can send us both logs at petsc-maint]
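For example (paths below are illustrative; each configure run leaves
configure.log at the top of the PETSc source tree, and the previous run is
typically saved as configure.log.bkp):

  diff /path/to/petsc-3.8.3/configure.log /path/to/petsc-3.11/configure.log | less

Since the failure is in the MPI checks, the MPI sections of the two logs are
the natural place to start comparing.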
Satish
On Sat, 30 Mar 2019, Kokron, Daniel S. (ARC-606.2)[InuTeq, LLC] via petsc-users wrote:
> The last time I built PETSc on Pleiades was with version 3.8.3. The same build procedure, with the same compilers and MPI libraries, does not work with 3.11. Is there a way to enable more verbose diagnostics during the configure phase so I can figure out which executable was being run and how it was compiled?
>
> PBS r147i6n10 24> ./configure --prefix=/nobackupp8/XXX /Projects/CHEM/BoA_Case/Codes-2018.3.222/binaries/petsc-3.11+ --with-debugging=0 --with-shared-libraries=1 --with-cc=mpicc --with-fc=mpif90 --with-cxx=mpicxx --with-blas-lapack-dir=$MKLROOT/lib/intel64 --with-scalapack-include=$MKLROOT/include --with-scalapack-lib="$MKLROOT/lib/intel64/libmkl_scalapack_lp64.so $MKLROOT/lib/intel64/libmkl_blacs_sgimpt_lp64.so" --with-cpp=/usr/bin/cpp --with-gnu-compilers=0 --with-vendor-compilers=intel -COPTFLAGS="-g -O3 -xCORE-AVX2 -diag-disable=cpu-dispatch" -CXXOPTFLAGS="-g -O3 -xCORE-AVX2 -diag-disable=cpu-dispatch" -FOPTFLAGS="-g -O3 -xCORE-AVX2 -diag-disable=cpu-dispatch" --with-mpi=true --with-mpi-exec=mpiexec --with-mpi-compilers=1 --with-precision=double --with-scalar-type=real --with-x=0 --with-x11=0 --with-memalign=32
>
> I get this, which usually means that an executable was linked against libmpi but was not launched with mpiexec.
>
> TESTING: configureMPITypes from config.packages.MPI(/nobackupp8/dkokron/Projects/CHEM/BoA_Case/Codes-2018.3.222/petsc/config/BuildSystem/config/packages/MPI.py:283)
>         CMPT ERROR: mpiexec_mpt must be used to launch all MPI applications
>         CMPT ERROR: mpiexec_mpt must be used to launch all MPI applications
>         CMPT ERROR: mpiexec_mpt must be used to launch all MPI applications
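>
> For reference, the same message can be reproduced outside of configure with a trivial MPI program (hello_mpi.c below is a hypothetical example, not part of the PETSc build):
>
>   /* hello_mpi.c - minimal MPI program: init, report rank, finalize */
>   #include <mpi.h>
>   #include <stdio.h>
>   int main(int argc, char **argv) {
>     int rank;
>     MPI_Init(&argc, &argv);
>     MPI_Comm_rank(MPI_COMM_WORLD, &rank);
>     printf("rank %d\n", rank);
>     MPI_Finalize();
>     return 0;
>   }
>
>   mpicc hello_mpi.c -o hello_mpi
>   ./hello_mpi                      # run directly -> CMPT ERROR: mpiexec_mpt must be used ...
>   mpiexec_mpt -np 1 ./hello_mpi    # launched through MPT's launcher -> runs normally
>
> Presumably some of configure's MPI test executables are being run the first way.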
>
> If I let it continue, configure's final summary shows an empty MPI entry:
>
> make:
> BLAS/LAPACK: -Wl,-rpath,/nasa/intel/Compiler/2018.3.222/compilers_and_libraries_2018.3.222/linux/mkl/lib/intel64 -L/nasa/intel/Compiler/2018.3.222/compilers_and_libraries_2018.3.222/linux/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread
> MPI:
> cmake:
> pthread:
> scalapack:
>
> Daniel Kokron
> Redline Performance Solutions
> SciCon/APP group
> --