[petsc-users] Compiling PETSc in for Grace-Hopper nodes

Vanella, Marcos (Fed) marcos.vanella at nist.gov
Fri Nov 8 14:49:33 CST 2024


Thank you Junchao, we'll work on this compatibility issue.
Best,
Marcos
________________________________
From: Junchao Zhang <junchao.zhang at gmail.com>
Sent: Friday, November 8, 2024 3:47 PM
To: Vanella, Marcos (Fed) <marcos.vanella at nist.gov>
Cc: Satish Balay <balay at mcs.anl.gov>; petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>; Victor Eijkhout <eijkhout at tacc.utexas.edu>
Subject: Re: [petsc-users] Compiling PETSc in for Grace-Hopper nodes

Yes, the error message indicates gcc-14 is not supported by cuda-12.5.
  According to the CUDA 12.5.0 installation guide's host compiler support policy (https://docs.nvidia.com/cuda/archive/12.5.0/cuda-installation-guide-linux/index.html#host-compiler-support-policy),
 it supports gcc only up to version 13.2.

Perhaps the best approach is to ask your sysadmin to install
compatible gcc and CUDA versions.
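
Alternatively, the error message itself mentions the nvcc flag
-allow-unsupported-compiler. A minimal sketch of both options is below;
the gcc-13 module name is only a guess at what TACC provides, and I
believe PETSc's configure passes CUDAFLAGS through to nvcc. NVIDIA warns
that an unsupported host compiler may miscompile or misbehave at run
time, so option 2 is at your own risk:

  # Option 1 (preferred): switch to a gcc release supported by cuda-12.5
  # (module name is hypothetical; check "module avail gcc")
  module swap gcc/14.2.0 gcc/13.2.0

  # Option 2 (unsupported): tell nvcc to accept the gcc-14 host compiler
  ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpifort \
      --with-cuda --with-cudac=nvcc --with-cuda-arch=90 \
      CUDAFLAGS=-allow-unsupported-compiler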

--Junchao Zhang

On Fri, Nov 8, 2024 at 2:34 PM Vanella, Marcos (Fed)
<marcos.vanella at nist.gov> wrote:
>
> Hi Satish and Junchao, this is what I'm getting at the end of the configure.log. I guess there is an incompatibility between nvcc and the gcc version I'm using.
>
>
> ...
> =============================================================================================
> TESTING: checkCUDACompiler from config.setCompilers(/home1/09805/mnv/Software/petsc/config/BuildSystem/config/setCompilers.py:1541)
>   Locate a functional CUDA compiler
>     Checking for program /opt/apps/xalt/xalt/bin/nvcc...not found
>     Checking for program /opt/apps/gcc14/cuda12/openmpi/5.0.5/libexec/osu-micro-benchmarks/mpi/one-sided/nvcc...not found
>     Checking for program /opt/apps/gcc14/cuda12/openmpi/5.0.5/libexec/osu-micro-benchmarks/mpi/collective/nvcc...not found
>     Checking for program /opt/apps/gcc14/cuda12/openmpi/5.0.5/libexec/osu-micro-benchmarks/mpi/pt2pt/nvcc...not found
>     Checking for program /opt/apps/gcc14/cuda12/openmpi/5.0.5/libexec/osu-micro-benchmarks/mpi/startup/nvcc...not found
>     Checking for program /opt/apps/gcc14/cuda12/openmpi/5.0.5/bin/nvcc...not found
>     Checking for program /home1/apps/nvidia/Linux_aarch64/24.7/cuda/12.5/bin/nvcc...found
>               Defined make macro "CUDAC" to "nvcc"
> Executing: nvcc -c -o /tmp/petsc-qlfa8fb8/config.setCompilers/conftest.o -I/tmp/petsc-qlfa8fb8/config.setCompilers   /tmp/petsc-qlfa8fb8/config.setCompilers/conftest.cu
> stdout:
> In file included from /home1/apps/nvidia/Linux_aarch64/24.7/cuda/12.5/bin/../targets/sbsa-linux/include/cuda_runtime.h:82,
>                  from <command-line>:
> /home1/apps/nvidia/Linux_aarch64/24.7/cuda/12.5/bin/../targets/sbsa-linux/include/crt/host_config.h:143:2: error: #error -- unsupported GNU version! gcc versions later than 13 are not supported! The nvcc flag '-allow-unsupported-compiler' can be used to override this version check; however, using an unsupported host compiler may cause compilation failure or incorrect run time execution. Use at your own risk.
>   143 | #error -- unsupported GNU version! gcc versions later than 13 are not supported! The nvcc flag '-allow-unsupported-compiler' can be used to override this version check; however, using an unsupported host compiler may cause compilation failure or incorrect run time execution. Use at your own risk.
>       |  ^~~~~
> Possible ERROR while running compiler: exit code 1
> stderr:
> In file included from /home1/apps/nvidia/Linux_aarch64/24.7/cuda/12.5/bin/../targets/sbsa-linux/include/cuda_runtime.h:82,
>                  from <command-line>:
> /home1/apps/nvidia/Linux_aarch64/24.7/cuda/12.5/bin/../targets/sbsa-linux/include/crt/host_config.h:143:2: error: #error -- unsupported GNU version! gcc versions later than 13 are not supported! The nvcc flag '-allow-unsupported-compiler' can be used to override this version check; however, using an unsupported host compiler may cause compilation failure or incorrect run time execution. Use at your own risk.
>   143 | #error -- unsupported GNU version! gcc versions later than 13 are not supported! The nvcc flag '-allow-unsupported-compiler' can be used to override this version check; however, using an unsupported host compiler may cause compilation failure or incorrect run time execution. Use at your own risk.
>       |  ^~~~~
> Source:
> #include "confdefs.h"
> #include "conffix.h"
>
> int main(void) {
>   return 0;
> }
>
>           Error testing CUDA compiler: Cannot compile CUDA with nvcc.
>             Deleting "CUDAC"
> *********************************************************************************************
>            UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
> ---------------------------------------------------------------------------------------------
>   CUDA compiler you provided with -with-cudac=nvcc cannot be found or does not work.
>   Cannot compile CUDA with nvcc.
> *********************************************************************************************
>   File "/home1/09805/mnv/Software/petsc/config/configure.py", line 461, in petsc_configure
>     framework.configure(out = sys.stdout)
>   File "/home1/09805/mnv/Software/petsc/config/BuildSystem/config/framework.py", line 1460, in configure
>     self.processChildren()
>   File "/home1/09805/mnv/Software/petsc/config/BuildSystem/config/framework.py", line 1448, in processChildren
>     self.serialEvaluation(self.childGraph)
>   File "/home1/09805/mnv/Software/petsc/config/BuildSystem/config/framework.py", line 1423, in serialEvaluation
>     child.configure()
>   File "/home1/09805/mnv/Software/petsc/config/BuildSystem/config/setCompilers.py", line 2846, in configure
>     self.executeTest(getattr(self,LANG.join(('check','Compiler'))))
>   File "/home1/09805/mnv/Software/petsc/config/BuildSystem/config/base.py", line 138, in executeTest
>     ret = test(*args,**kargs)
>   File "/home1/09805/mnv/Software/petsc/config/BuildSystem/config/setCompilers.py", line 1544, in checkCUDACompiler
>     for compiler in self.generateCUDACompilerGuesses():
>   File "/home1/09805/mnv/Software/petsc/config/BuildSystem/config/setCompilers.py", line 1527, in generateCUDACompilerGuesses
>     raise RuntimeError('CUDA compiler you provided with -with-cudac='+self.argDB['with-cudac']+' cannot be found or does not work.'+'\n'+self.mesg)
> ================================================================================
> Finishing configure run at Fri, 08 Nov 2024 14:28:04 -0600
> ================================================================================
> ________________________________
> From: Junchao Zhang <junchao.zhang at gmail.com>
> Sent: Friday, November 8, 2024 3:25 PM
> To: Vanella, Marcos (Fed) <marcos.vanella at nist.gov>
> Cc: petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
> Subject: Re: [petsc-users] Compiling PETSc in for Grace-Hopper nodes
>
> Hi, Marcos
>   Could you attach the configure.log?
> --Junchao Zhang
>
>
> On Fri, Nov 8, 2024 at 2:19 PM Vanella, Marcos (Fed) via petsc-users <petsc-users at mcs.anl.gov> wrote:
>
> Hi all, does anyone have experience compiling PETSc with GNU OpenMPI and cross-compiling with CUDA nvcc on these systems?
> We have access to Vista, a machine at TACC, and I was trying to build PETSc with these libraries. I need GNU OpenMPI to compile my code (Fortran 2018 standard) and would like to keep the same CPU compiler/OpenMPI for PETSc. I have the following modules loaded:
>
> Currently Loaded Modules:
>   1) ucc/1.3.0   2) ucx/1.17.0   3) cmake/3.29.5   4) xalt/3.1   5) TACC   6) gcc/14.2.0   7) cuda/12.5 (g)   8) openmpi/5.0.5
>
>   Where:
>    g:  built for GPU
>
> Here mpicc points to the gcc compiler, etc. When configuring PETSc as follows, I get an error that nvcc is not working:
>
> $ ./configure COPTFLAGS="-O2 -g" CXXOPTFLAGS="-O2 -g" FOPTFLAGS="-O2 -g" FCOPTFLAGS="-O2 -g" CUDAOPTFLAGS="-O2 -g" --with-debugging=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpifort --with-cuda --with-cudac=nvcc --with-cuda-arch=90 --download-fblaslapack=1 --with-make-np=8
>
> =============================================================================================
>                          Configuring PETSc to compile on your system
> =============================================================================================
> TESTING: checkCUDACompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:1541)
> *********************************************************************************************
>            UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
> ---------------------------------------------------------------------------------------------
>   CUDA compiler you provided with -with-cudac=nvcc cannot be found or does not work.
>   Cannot compile CUDA with nvcc.
> *********************************************************************************************
>
> I have nvcc in my path:
>
> $ nvcc --version
> nvcc: NVIDIA (R) Cuda compiler driver
> Copyright (c) 2005-2024 NVIDIA Corporation
> Built on Thu_Jun__6_02:26:10_PDT_2024
> Cuda compilation tools, release 12.5, V12.5.82
> Build cuda_12.5.r12.5/compiler.34385749_0
>
> I remember being able to do this cross-compilation on Polaris. Any help is most appreciated,
> Marcos