[petsc-dev] Building PETSc on LLNL Lassen

Jed Brown jed at jedbrown.org
Sun Dec 13 15:36:30 CST 2020


I use Clang because the XL compilers are buggy and not faster. This configure script worked as of a few months ago:

#!/usr/tce/packages/python/python-3.7.2/bin/python3
if __name__ == '__main__':
  import sys
  import os
  sys.path.insert(0, os.path.abspath('config'))
  import configure
  configure_options = [
    '--with-blaslapack-lib=/usr/tcetmp/packages/essl/essl-6.2.1/lib64/liblapackforessl.so /usr/tcetmp/packages/essl/essl-6.2.1/lib64/libessl.so',
    '--with-cuda=1',
    '--with-debugging=0',
    '--with-fc=0',
    '--with-mpi-dir=/usr/tce/packages/spectrum-mpi/spectrum-mpi-rolling-release-clang-ibm-2019.10.03',
    'COPTFLAGS=-O3 -mcpu=native -ffp-contract=fast',
    'CUDAFLAGS=--gpu-architecture=sm_70 -ccbin clang++',
    'CXXOPTFLAGS=-O3 -mcpu=native -ffp-contract=fast',
    'PETSC_ARCH=lassen-clang-essl-opt',
  ]
  configure.petsc_configure(configure_options)
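
A minimal sketch of running it from the top of a PETSc source tree (the script name, and whether you need to load clang/cuda modules first, are assumptions about your environment):

chmod +x reconfigure-lassen.py                        # hypothetical name for the script above
module load clang cuda                                # clang and nvcc need to be on PATH
./reconfigure-lassen.py
make PETSC_DIR=$PWD PETSC_ARCH=lassen-clang-essl-opt all   # PETSC_ARCH matches the value set in the script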

Jacob Faibussowitsch <jacob.fai at gmail.com> writes:

> Hello All,
>
> Does anyone have any experience building PETSc with CUDA support on Lassen? I've been having trouble building with the IBM XL compilers + Spectrum MPI + nvcc. nvcc seems not to like the -std=c++14 argument, complaining that its configured host compiler doesn't support it, yet compiling the following test.cc
>
> #include <stdlib.h>
>
> int main(int argc, char **argv)
> {
>   int i = 1;
>   i += argc;
>   return(i);
> }
>
> with mpicc -std=c++14 test.cc produces zero errors.
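>
> For reference, nvcc's support for -std=c++14 depends on the host compiler it
> is configured with (selectable via -ccbin), so a quick way to test a given
> host compiler outside of configure is something like (the clang++ choice here
> is just illustrative):
>
> nvcc -x cu -std=c++14 -ccbin clang++ test.cc -o test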
> ------------------------------------------------------------------------
>
> Modules loaded:
>
> module load xl/2020.11.12-cuda-11.1.1
> module load spectrum-mpi
> module load cuda/11.1.1
> module load python/3.8.2
> module load cmake
> module load valgrind
> module load lapack
>
> My configure commands:
>
> ./configure  --with-cc=mpicc --with-cxx=mpiCC --with-fc=mpifort --with-cuda --with-debugging=1 PETSC_ARCH=arch-linux-c-debug
>
> The error:
>
> TESTING: findMPIInc from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:636)
> *******************************************************************************
>          UNABLE to CONFIGURE with GIVEN OPTIONS    (see configure.log for details):
> -------------------------------------------------------------------------------
> Bad compiler flag: -I/usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/include
> *******************************************************************************
>
> The actual configure.log error:
>
> Executing: nvcc -c -o /var/tmp/petsc-2v0k4k61/config.setCompilers/conftest.o -I/var/tmp/petsc-2v0k4k61/config.setCompilers -I/var/tmp/petsc-2v0k4k61/config.types  -g -std=c++14 -I/usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/include  -Wno-deprecated-gpu-targets /var/tmp/petsc-2v0k4k61/config.setCompilers/conftest.cu
> Possible ERROR while running compiler:
> stderr:
> nvcc warning : The -std=c++14 flag is not supported with the configured host compiler. Flag will be ignored.
> Source:
> #include "confdefs.h"
> #include "conffix.h"
>
> int main() {
> ;
>   return 0;
> }
>                   Rejecting compiler flag -I/usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/include  due to 
> nvcc warning : The -std=c++14 flag is not supported with the configured host compiler. Flag will be ignored.
>
>
> Best regards,
>
> Jacob Faibussowitsch
> (Jacob Fai - booss - oh - vitch)
> Cell: (312) 694-3391

