[petsc-users] Compiling PETSc in Polaris with gnu
Vanella, Marcos (Fed)
marcos.vanella at nist.gov
Thu May 2 16:02:41 CDT 2024
Thank you, Satish and Junchao! I was able to compile PETSc with your configure options plus SuiteSparse and hypre, and then compile my Fortran code linking against PETSc.
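For reference, the extra packages came in through PETSc's own download options, appended to the configure line Satish posted below (a sketch of just the added flags):

./configure <options as below> --download-suitesparse --download-hypre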
But when I try my test run I pick up an error at the very beginning:
MPICH ERROR [Rank 0] [job id 01eb3c4a-28a7-4178-aced-512b4fb704c6] [Thu May 2 20:44:26 2024] [x3006c0s19b1n0] - Abort(-1) (rank 0 in comm 0): MPIDI_CRAY_init: GPU_SUPPORT_ENABLED is requested, but GTL library is not linked
(Other MPI error)
aborting job:
MPIDI_CRAY_init: GPU_SUPPORT_ENABLED is requested, but GTL library is not linked
It says in the Polaris user guide that:
The environment variable MPICH_GPU_SUPPORT_ENABLED=1 needs to be set if your application requires MPI-GPU support whereby the MPI library sends and receives data directly from GPU buffers. In this case, it will be important to have the craype-accel-nvidia80 module loaded both when compiling your application and during runtime to correctly link against a GPU Transport Layer (GTL) MPI library. Otherwise, you'll likely see GPU_SUPPORT_ENABLED is requested, but GTL library is not linked errors during runtime.
I tried loading this module (I also needed to load nvhpc-mixed) in my submission script, but I get the same result.
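For reference, the relevant part of my submission script now looks roughly like this (a minimal sketch; the allocation name, queue, walltime, rank counts, and executable/input names are placeholders, and per the user guide quoted above craype-accel-nvidia80 presumably also needs to be loaded when the executable is linked, not only at runtime):

#!/bin/bash
#PBS -l select=1:system=polaris
#PBS -l walltime=00:30:00
#PBS -q debug
#PBS -A MYPROJECT                    # placeholder allocation

# Same environment as the build, plus the GPU transport modules
module use /soft/modulefiles
module load PrgEnv-gnu
module load nvhpc-mixed
module load craype-accel-nvidia80    # should select the GTL MPI library
module load cudatoolkit-standalone

export MPICH_GPU_SUPPORT_ENABLED=1

cd $PBS_O_WORKDIR
mpiexec -n 4 --ppn 4 ./my_app my_input.in   # placeholder binary and input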
I'll get in touch with ALCF support about this.
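In the meantime, a quick way to check whether the GTL library actually made it into the link (a sketch; ./my_app stands in for the actual binary, and on Cray MPICH the CUDA GTL library should show up as libmpi_gtl_cuda.so):

ldd ./my_app | grep -i gtl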
________________________________
From: Satish Balay <balay at mcs.anl.gov>
Sent: Thursday, May 2, 2024 11:58 AM
To: Junchao Zhang <junchao.zhang at gmail.com>
Cc: petsc-users <petsc-users at mcs.anl.gov>; Vanella, Marcos (Fed) <marcos.vanella at nist.gov>; Mueller, Eric V. (Fed) <eric.mueller at nist.gov>
Subject: Re: [petsc-users] Compiling PETSc in Polaris with gnu
I just tried a build (using default versions), and the following builds for me [on the login node]:
module use /soft/modulefiles
module load PrgEnv-gnu
module load cudatoolkit-standalone
module load cray-libsci
./configure --with-cc=cc --with-fc=ftn --with-cxx=CC --with-make-np=4 --with-cuda=1 --with-cudac=nvcc --with-cuda-arch=80 \
--with-debugging=0 COPTFLAGS=-O2 CXXOPTFLAGS=-O2 FOPTFLAGS=-O2 CUDAOPTFLAGS=-O2 --download-kokkos --download-kokkos-kernels
make
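To sanity-check the build afterwards, PETSc's usual test target works here too (a sketch; use the PETSC_DIR/PETSC_ARCH values that configure actually reports, arch-linux-c-opt is just the common default for an optimized build):

make PETSC_DIR=$PWD PETSC_ARCH=arch-linux-c-opt check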
Satish
---
balay at polaris-login-01:~> module list
Currently Loaded Modules:
1) libfabric/1.15.2.0
2) craype-network-ofi
3) perftools-base/23.12.0
4) darshan/3.4.4
5) gcc-native/12.3
6) craype/2.7.30
7) cray-dsmml/0.2.2
8) cray-mpich/8.1.28
9) cray-pmi/6.1.13
10) cray-pals/1.3.4
11) cray-libpals/1.3.4
12) craype-x86-milan
13) PrgEnv-gnu/8.5.0
14) cudatoolkit-standalone/12.2.2
15) cray-libsci/23.12.5
On Thu, 2 May 2024, Junchao Zhang wrote:
> I used cudatoolkit-standalone/12.4.1 and gcc-12.3.
>
> Be sure to use the latest petsc/main or petsc/release, which contains fixes
> for Polaris.
>
> --Junchao Zhang
>
>
> On Thu, May 2, 2024 at 10:23 AM Satish Balay via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
>
> > Try:
> >
> > module use /soft/modulefiles
> >
> > Satish
> >
> > On Thu, 2 May 2024, Vanella, Marcos (Fed) via petsc-users wrote:
> >
> > > Hi all, it seems the modules in Polaris have changed (I can't find cudatoolkit-standalone anymore).
> > > Does anyone have recent experience compiling the library with gnu and cuda on the machine?
> > > Thank you!
> > > Marcos
> > >
> >
> >
>