[petsc-users] Problem building petsc +rocm variant

Roskop, Luke B luke.roskop at hpe.com
Fri Mar 11 14:34:00 CST 2022


Hi, I’m hoping you can help me figure out how to build PETSc targeting AMD GPUs (specifically the gfx90a architecture).

I’m attempting to build on crusher, the Cray EX system at ORNL, using ROCmCC (AMD’s compiler) and cray-mpich. In case it helps, I’m using spack to build petsc with the “petsc@main%rocmcc+batch+rocm amdgpu_target=gfx90a” spec.

Spack ends up invoking the following configure command for PETSc:

'/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/python-3.9.10-7y7mxajn5rywz5xdnba4azphcdodxiub/bin/python3.9' 'configure' '--prefix=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/petsc-main-mccbycx66la7rlx6jv44f6zd63cmdzm7' '--with-ssl=0' '--download-c2html=0' '--download-sowing=0' '--download-hwloc=0' 'CFLAGS=' 'FFLAGS=-fPIC' 'CXXFLAGS=' 'LDFLAGS=-Wl,-z,notext' '--with-cc=/opt/cray/pe/mpich/8.1.12/ofi/amd/4.4/bin/mpicc' '--with-cxx=/opt/cray/pe/mpich/8.1.12/ofi/amd/4.4/bin/mpicxx' '--with-fc=/opt/cray/pe/mpich/8.1.12/ofi/amd/4.4/bin/mpif90' '--with-precision=double' '--with-scalar-type=real' '--with-shared-libraries=1' '--with-debugging=0' '--with-openmp=0' '--with-64-bit-indices=0' 'COPTFLAGS=' 'FOPTFLAGS=' 'CXXOPTFLAGS=' '--with-blas-lapack-lib=/opt/cray/pe/libsci/21.08.1.2/AMD/4.0/x86_64/lib/libsci_amd.so' '--with-batch=1' '--with-x=0' '--with-clanguage=C' '--with-cuda=0' '--with-hip=1' '--with-hip-dir=/opt/rocm-4.5.0/hip' '--with-metis=1' '--with-metis-include=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/metis-5.1.0-zn5rn5srr7qzxyo5tq36d46adcsyc5a7/include' '--with-metis-lib=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/metis-5.1.0-zn5rn5srr7qzxyo5tq36d46adcsyc5a7/lib/libmetis.so' '--with-hypre=1' '--with-hypre-include=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/hypre-develop-vgfx3lhhloq4cnethsrrpz7iez7x6wad/include' '--with-hypre-lib=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/hypre-develop-vgfx3lhhloq4cnethsrrpz7iez7x6wad/lib/libHYPRE.so' '--with-parmetis=1' '--with-parmetis-include=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/parmetis-4.0.3-6jqxqmt7qqq73rxmx3beu5ba4vj3r253/include' 
'--with-parmetis-lib=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/parmetis-4.0.3-6jqxqmt7qqq73rxmx3beu5ba4vj3r253/lib/libparmetis.so' '--with-superlu_dist=1' '--with-superlu_dist-include=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/superlu-dist-develop-mpiyhomp4k72bilqn6xk7uol36ulsdve/include' '--with-superlu_dist-lib=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/superlu-dist-develop-mpiyhomp4k72bilqn6xk7uol36ulsdve/lib/libsuperlu_dist.so' '--with-ptscotch=0' '--with-suitesparse=0' '--with-hdf5=1' '--with-hdf5-include=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/hdf5-1.12.1-dp5vqo4tjh6oi7szpcsqkdlifgjxknzf/include' '--with-hdf5-lib=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/hdf5-1.12.1-dp5vqo4tjh6oi7szpcsqkdlifgjxknzf/lib/libhdf5.so' '--with-zlib=1' '--with-zlib-include=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/zlib-1.2.11-2ciasfxwyxanyohroisdpvidg4gs2fdy/include' '--with-zlib-lib=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/zlib-1.2.11-2ciasfxwyxanyohroisdpvidg4gs2fdy/lib/libz.so' '--with-mumps=0' '--with-trilinos=0' '--with-fftw=0' '--with-valgrind=0' '--with-gmp=0' '--with-libpng=0' '--with-giflib=0' '--with-mpfr=0' '--with-netcdf=0' '--with-pnetcdf=0' '--with-moab=0' '--with-random123=0' '--with-exodusii=0' '--with-cgns=0' '--with-memkind=0' '--with-p4est=0' '--with-saws=0' '--with-yaml=0' '--with-hwloc=0' '--with-libjpeg=0' '--with-scalapack=1' '--with-scalapack-lib=/opt/cray/pe/libsci/21.08.1.2/AMD/4.0/x86_64/lib/libsci_amd.so' '--with-strumpack=0' '--with-mmg=0' '--with-parmmg=0' '--with-tetgen=0' '--with-cxx-dialect=C++11'

Using spack, I see this error at compile time:


/tmp/lukebr/spack-stage/spack-stage-petsc-main-5jlv6jcfdaa37iy5zm77umvb6uvgwdo7/spack-src/src/vec/is/sf/impls/basic/sfpack.c:463:19: error: static declaration of 'MPI_Type_dup' follows non-static declaration
static inline int MPI_Type_dup(MPI_Datatype datatype,MPI_Datatype *newtype)
                  ^
/opt/cray/pe/mpich/8.1.12/ofi/cray/10.0/include/mpi.h:1291:5: note: previous declaration is here
int MPI_Type_dup(MPI_Datatype oldtype, MPI_Datatype *newtype) MPICH_API_PUBLIC;
    ^
1 error generated.

To get around this error, I pass “-DPETSC_HAVE_MPI_TYPE_DUP”, but then I see the following linking error:


CLINKER arch-linux-c-opt/lib/libpetsc.so.3.016.5
ld.lld: error: undefined hidden symbol: PetscSFCreate_Window
>>> referenced by sfregi.c
>>>               arch-linux-c-opt/obj/vec/is/sf/interface/sfregi.o:(PetscSFRegisterAll)
clang-13: error: linker command failed with exit code 1 (use -v to see invocation)
gmake[3]: *** [gmakefile:113: arch-linux-c-opt/lib/libpetsc.so.3.016.5] Error 1
gmake[2]: *** [/tmp/lukebr/spack-stage/spack-stage-petsc-main-5jlv6jcfdaa37iy5zm77umvb6uvgwdo7/spack-src/lib/petsc/conf/rules:56: libs] Error 2
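An undefined hidden symbol at link time typically means the object that should define the symbol was compiled out by a preprocessor guard while another object still references it; defining a feature macro by hand (e.g. in CFLAGS) rather than letting configure set it consistently is one common way translation units end up disagreeing. A minimal sketch of that failure mode, with invented file and macro names (not PETSc's):

```shell
# Illustration only: names are made up. One file defines a function only
# under a feature macro; another references it unconditionally.
cat > feature.c <<'EOF'
typedef int feature_dummy_t;   /* keep the translation unit non-empty */
#ifdef HAVE_FEATURE
int feature_init(void) { return 0; }
#endif
EOF
cat > registry.c <<'EOF'
int feature_init(void);        /* registration code always references it */
int main(void) { return feature_init(); }
EOF
cc -c feature.c -o feature.o   # note: compiled WITHOUT -DHAVE_FEATURE
cc -c registry.c -o registry.o
cc feature.o registry.o -o demo 2>/dev/null || echo "link failed: feature_init undefined"
```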


Before I continue down this path, is there a preferred way to build PETSc for an AMD GPU system? If so, could you share it?

Thanks,
Luke
