<div dir="ltr"><div>Hi, Luke,</div><div><div>  I removed check of MPI_Type_dup in <a href="https://gitlab.com/petsc/petsc/-/merge_requests/4965">https://gitlab.com/petsc/petsc/-/merge_requests/4965</a><br>   I hope with that you could build petsc +batch or ~batch</div><div><br><div><div><div dir="ltr" class="gmail_signature" data-smartmail="gmail_signature"><div dir="ltr">--Junchao Zhang</div></div></div><br></div></div></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Fri, Mar 11, 2022 at 4:53 PM Satish Balay via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov">petsc-users@mcs.anl.gov</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Looks like an issue with --with-batch=1 [where the MPI checks are skipped]<br>
<br>
Can you try the build without '+batch'?<br>
<br>
Satish<br>
<br>
<br>
On Fri, 11 Mar 2022, Roskop, Luke B wrote:<br>
<br>
> As requested, I attached the configure.log and make.log files<br>
> Thanks!<br>
> Luke<br>
> <br>
> <br>
> From: Barry Smith <<a href="mailto:bsmith@petsc.dev" target="_blank">bsmith@petsc.dev</a>><br>
> Date: Friday, March 11, 2022 at 2:47 PM<br>
> To: "Roskop, Luke B" <<a href="mailto:luke.roskop@hpe.com" target="_blank">luke.roskop@hpe.com</a>><br>
> Cc: "<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>" <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>><br>
> Subject: Re: [petsc-users] Problem building petsc +rocm variant<br>
> <br>
> <br>
>   Please send configure.log and make.log to <a href="mailto:petsc-maint@mcs.anl.gov" target="_blank">petsc-maint@mcs.anl.gov</a>. For some reason PETSc's configure does not detect that this MPI provides MPI_Type_dup(), even though it is prototyped in the MPI include file.<br>
> <br>
> <br>
> <br>
> On Mar 11, 2022, at 3:34 PM, Roskop, Luke B <<a href="mailto:luke.roskop@hpe.com" target="_blank">luke.roskop@hpe.com</a>> wrote:<br>
> <br>
> Hi, I’m hoping you can help me figure out how to build PETSc targeting AMD GPUs (specifically the gfx90a GPU).<br>
> <br>
> I’m attempting to build on the ORNL Cray EX system called Crusher, using ROCmCC (AMD’s compiler) and cray-mpich. In case it helps, I’m using spack to build petsc with the “petsc@main%rocmcc+batch+rocm amdgpu_target=gfx90a” spec.<br>
> <br>
> Spack ends up invoking the following configure for PETSc:<br>
> '/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/python-3.9.10-7y7mxajn5rywz5xdnba4azphcdodxiub/bin/python3.9' 'configure' '--prefix=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/petsc-main-mccbycx66la7rlx6jv44f6zd63cmdzm7' '--with-ssl=0' '--download-c2html=0' '--download-sowing=0' '--download-hwloc=0' 'CFLAGS=' 'FFLAGS=-fPIC' 'CXXFLAGS=' 'LDFLAGS=-Wl,-z,notext' '--with-cc=/opt/cray/pe/mpich/8.1.12/ofi/amd/4.4/bin/mpicc' '--with-cxx=/opt/cray/pe/mpich/8.1.12/ofi/amd/4.4/bin/mpicxx' '--with-fc=/opt/cray/pe/mpich/8.1.12/ofi/amd/4.4/bin/mpif90' '--with-precision=double' '--with-scalar-type=real' '--with-shared-libraries=1' '--with-debugging=0' '--with-openmp=0' '--with-64-bit-indices=0' 'COPTFLAGS=' 'FOPTFLAGS=' 'CXXOPTFLAGS=' '--with-blas-lapack-lib=/opt/cray/pe/libsci/21.08.1.2/AMD/4.0/x86_64/lib/libsci_amd.so' '--with-batch=1' '--with-x=0' '--with-clanguage=C' '--with-cuda=0' '--with-hip=1' '--with-hip-dir=/opt/rocm-4.5.0/hip' '--with-metis=1' '--with-metis-include=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/metis-5.1.0-zn5rn5srr7qzxyo5tq36d46adcsyc5a7/include' '--with-metis-lib=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/metis-5.1.0-zn5rn5srr7qzxyo5tq36d46adcsyc5a7/lib/libmetis.so' '--with-hypre=1' '--with-hypre-include=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/hypre-develop-vgfx3lhhloq4cnethsrrpz7iez7x6wad/include' '--with-hypre-lib=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/hypre-develop-vgfx3lhhloq4cnethsrrpz7iez7x6wad/lib/libHYPRE.so' '--with-parmetis=1' '--with-parmetis-include=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/parmetis-4.0.3-6jqxqmt7qqq73rxmx3beu5ba4vj3r253/include' '--with-parmetis-lib=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/parmetis-4.0.3-6jqxqmt7qqq73rxmx3beu5ba4vj3r253/lib/libparmetis.so' '--with-superlu_dist=1' '--with-superlu_dist-include=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/superlu-dist-develop-mpiyhomp4k72bilqn6xk7uol36ulsdve/include' '--with-superlu_dist-lib=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/superlu-dist-develop-mpiyhomp4k72bilqn6xk7uol36ulsdve/lib/libsuperlu_dist.so' '--with-ptscotch=0' '--with-suitesparse=0' '--with-hdf5=1' '--with-hdf5-include=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/hdf5-1.12.1-dp5vqo4tjh6oi7szpcsqkdlifgjxknzf/include' '--with-hdf5-lib=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/hdf5-1.12.1-dp5vqo4tjh6oi7szpcsqkdlifgjxknzf/lib/libhdf5.so' '--with-zlib=1' '--with-zlib-include=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/zlib-1.2.11-2ciasfxwyxanyohroisdpvidg4gs2fdy/include' '--with-zlib-lib=/gpfs/alpine/ven114/scratch/lukebr/confidence/spack/install_tree/cray-sles15-zen3/rocmcc-4.5.0/zlib-1.2.11-2ciasfxwyxanyohroisdpvidg4gs2fdy/lib/libz.so' '--with-mumps=0' '--with-trilinos=0' '--with-fftw=0' '--with-valgrind=0' '--with-gmp=0' '--with-libpng=0' '--with-giflib=0' '--with-mpfr=0' '--with-netcdf=0' '--with-pnetcdf=0' '--with-moab=0' '--with-random123=0' '--with-exodusii=0' '--with-cgns=0' '--with-memkind=0' '--with-p4est=0' '--with-saws=0' '--with-yaml=0' '--with-hwloc=0' '--with-libjpeg=0' '--with-scalapack=1' '--with-scalapack-lib=/opt/cray/pe/libsci/21.08.1.2/AMD/4.0/x86_64/lib/libsci_amd.so' '--with-strumpack=0' '--with-mmg=0' '--with-parmmg=0' '--with-tetgen=0' '--with-cxx-dialect=C++11'<br>
> <br>
> Using spack, I see this error at compile time:<br>
> <br>
> /tmp/lukebr/spack-stage/spack-stage-petsc-main-5jlv6jcfdaa37iy5zm77umvb6uvgwdo7/spack-src/src/vec/is/sf/impls/basic/sfpack.c:463:19: error: static declaration of 'MPI_Type_dup' follows non-static declaration<br>
> static inline int MPI_Type_dup(MPI_Datatype datatype,MPI_Datatype *newtype)<br>
>                   ^<br>
> /opt/cray/pe/mpich/8.1.12/ofi/cray/10.0/include/mpi.h:1291:5: note: previous declaration is here<br>
> int MPI_Type_dup(MPI_Datatype oldtype, MPI_Datatype *newtype) MPICH_API_PUBLIC;<br>
>     ^<br>
> 1 error generated.<br>
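> <br>
> If I read this right, the clash comes from a guarded fallback in sfpack.c roughly like the following (my simplified sketch, not the exact PETSc source), which gets compiled because the batch-mode configure never defined PETSC_HAVE_MPI_TYPE_DUP even though mpi.h already declares the function:<br>
> <br>
> #include <mpi.h><br>
> <br>
> #if !defined(PETSC_HAVE_MPI_TYPE_DUP)<br>
> /* Fallback used only when configure did not detect MPI_Type_dup():<br>
>    emulate it with a committed contiguous copy of the old datatype. */<br>
> static inline int MPI_Type_dup(MPI_Datatype datatype,MPI_Datatype *newtype)<br>
> {<br>
>   int err = MPI_Type_contiguous(1,datatype,newtype);<br>
>   if (!err) err = MPI_Type_commit(newtype);<br>
>   return err;<br>
> }<br>
> #endif<br>
> <br>
> Because mpi.h already declares MPI_Type_dup with external linkage, the compiler rejects the static redefinition, which matches the error above.<br>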
> <br>
> To get around this error, I pass “-DPETSC_HAVE_MPI_TYPE_DUP”, but then I see the following linking error:<br>
> <br>
> CLINKER arch-linux-c-opt/lib/libpetsc.so.3.016.5<br>
> ld.lld: error: undefined hidden symbol: PetscSFCreate_Window<br>
> >>> referenced by sfregi.c<br>
> >>>               arch-linux-c-opt/obj/vec/is/sf/interface/sfregi.o:(PetscSFRegisterAll)<br>
> clang-13: error: linker command failed with exit code 1 (use -v to see invocation)<br>
> gmake[3]: *** [gmakefile:113: arch-linux-c-opt/lib/libpetsc.so.3.016.5] Error 1<br>
> gmake[2]: *** [/tmp/lukebr/spack-stage/spack-stage-petsc-main-5jlv6jcfdaa37iy5zm77umvb6uvgwdo7/spack-src/lib/petsc/conf/rules:56: libs] Error 2<br>
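> <br>
> I assume the hand-added define leaves the configure-generated feature macros inconsistent with what actually got built, so sfregi.c registers the window SF while the file that should define PetscSFCreate_Window was compiled (or skipped) under a different set of guards. As a generic illustration of that failure mode (hypothetical names, not PETSc's real guards):<br>
> <br>
> /* registry.c -- compiled WITH -DHAVE_FEATURE_X, so it references the symbol */<br>
> #if defined(HAVE_FEATURE_X)<br>
> extern int CreateImpl_X(void);<br>
> #endif<br>
> int RegisterAll(void)<br>
> {<br>
> #if defined(HAVE_FEATURE_X)<br>
>   return CreateImpl_X();   /* link-time reference to CreateImpl_X */<br>
> #else<br>
>   return 0;<br>
> #endif<br>
> }<br>
> <br>
> /* impl_x.c -- compiled WITHOUT -DHAVE_FEATURE_X, so no definition is emitted<br>
>    and the link fails with "undefined symbol: CreateImpl_X" */<br>
> #if defined(HAVE_FEATURE_X)<br>
> int CreateImpl_X(void) { return 0; }<br>
> #endif<br>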
> <br>
> <br>
> Before I continue, is there a preferred way to build PETSc on an AMD GPU system? If so, could you share it?<br>
> <br>
> Thanks,<br>
> Luke<br>
> <br>
> <br>
</blockquote></div>