[petsc-users] Compilation issues on cluster - PETSC ERROR: Unknown Mat type given: mpiaijmkl

Matthew Knepley knepley at gmail.com
Thu Feb 10 07:48:17 CST 2022


On Thu, Feb 10, 2022 at 8:30 AM Juan Salazar via petsc-users <
petsc-users at mcs.anl.gov> wrote:

> Hello,
>
> I am having issues compiling PETSc on a cluster using the following
> configure command.
>
> ./configure  --force \
>     --with-64-bit-indices=1 \
>     --with-precision=double \
>     --with-debugging=0 \
>     --COPTFLAGS=-O3 \
>     --CXXOPTFLAGS=-O3 \
>     --FOPTFLAGS=-O3 \
>     PETSC_ARCH=$WM_OPTIONS \
>     --with-blaslapack-dir=$MKLROOT  \
>     --with-mkl_sparse-dir=$MKLROOT \
>     --with-mkl_sparse_optimize-dir=$MKLROOT \
>     --with-mpi-dir=$MPI_ARCH_PATH  \
>     --download-hypre
>
> Where
>
> MKLROOT=/scratch/app_sequana/intel-oneapi/2021.1.0-2659/mkl/2021.1.1
> WM_OPTIONS=linux64GccDPInt64Opt
> MPI_ARCH_PATH=/scratch/app_sequana/openmpi/2.1.1
>
> -----
> $ make --version
> GNU Make 3.82
> Built for x86_64-redhat-linux-gnu
> ------
>
>
> ------
> $ ls $MKLROOT/lib/intel64
>
> libmkl_avx.so.1  libmkl_avx2.so.1  libmkl_avx512.so.1
> libmkl_avx512_mic.so.1  libmkl_blacs_intelmpi_ilp64.a  libmkl_blacs_intelmpi_ilp64.so
> libmkl_blacs_intelmpi_ilp64.so.1  libmkl_blacs_intelmpi_lp64.a  libmkl_blacs_intelmpi_lp64.so
> libmkl_blacs_intelmpi_lp64.so.1  libmkl_blacs_openmpi_ilp64.a  libmkl_blacs_openmpi_ilp64.so
> libmkl_blacs_openmpi_ilp64.so.1  libmkl_blacs_openmpi_lp64.a  libmkl_blacs_openmpi_lp64.so
> libmkl_blacs_openmpi_lp64.so.1  libmkl_blacs_sgimpt_ilp64.a  libmkl_blacs_sgimpt_ilp64.so
> libmkl_blacs_sgimpt_ilp64.so.1  libmkl_blacs_sgimpt_lp64.a  libmkl_blacs_sgimpt_lp64.so
> libmkl_blacs_sgimpt_lp64.so.1  libmkl_blas95_ilp64.a  libmkl_blas95_lp64.a
> libmkl_cdft_core.a  libmkl_cdft_core.so  libmkl_cdft_core.so.1
> libmkl_core.a  libmkl_core.so  libmkl_core.so.1
> libmkl_def.so.1  libmkl_gf_ilp64.a  libmkl_gf_ilp64.so
> libmkl_gf_ilp64.so.1  libmkl_gf_lp64.a  libmkl_gf_lp64.so
> libmkl_gf_lp64.so.1  libmkl_gnu_thread.a  libmkl_gnu_thread.so
> libmkl_gnu_thread.so.1  libmkl_intel_ilp64.a  libmkl_intel_ilp64.so
> libmkl_intel_ilp64.so.1  libmkl_intel_lp64.a  libmkl_intel_lp64.so
> libmkl_intel_lp64.so.1  libmkl_intel_thread.a  libmkl_intel_thread.so
> libmkl_intel_thread.so.1  libmkl_lapack95_ilp64.a  libmkl_lapack95_lp64.a
> libmkl_mc.so.1  libmkl_mc3.so.1  libmkl_pgi_thread.a
> libmkl_pgi_thread.so  libmkl_pgi_thread.so.1  libmkl_rt.so
> libmkl_rt.so.1  libmkl_scalapack_ilp64.a  libmkl_scalapack_ilp64.so
> libmkl_scalapack_ilp64.so.1  libmkl_scalapack_lp64.a  libmkl_scalapack_lp64.so
> libmkl_scalapack_lp64.so.1  libmkl_sequential.a  libmkl_sequential.so
> libmkl_sequential.so.1  libmkl_sycl.a  libmkl_sycl.so
> libmkl_sycl.so.1  libmkl_tbb_thread.a  libmkl_tbb_thread.so
> libmkl_tbb_thread.so.1  libmkl_vml_avx.so.1  libmkl_vml_avx2.so.1
> libmkl_vml_avx512.so.1  libmkl_vml_avx512_mic.so.1  libmkl_vml_cmpt.so.1
> libmkl_vml_def.so.1  libmkl_vml_mc.so.1  libmkl_vml_mc2.so.1
> libmkl_vml_mc3.so.1  locale
> ------
>
> I am running code that requires mat_type mpiaijmkl, but unfortunately it
> seems that mpiaijmkl.c is not compiled, and I get the error: PETSC ERROR:
> Unknown Mat type given: mpiaijmkl
>
> ------
> $ ls linux64GccDPInt64Opt/obj/mat/impls/aij/mpi/
>
> aijperm  aijsell  crl  fdmpiaij.d  fdmpiaij.o
> ftn-auto  ftn-custom  mmaij.d  mmaij.o  mpb_aij.d
> mpb_aij.o  mpiaij.d  mpiaij.o  mpiaijpc.d  mpiaijpc.o
> mpimatmatmatmult.d  mpimatmatmatmult.o  mpimatmatmult.d  mpimatmatmult.o
> mpimattransposematmult.d  mpimattransposematmult.o  mpiov.d  mpiov.o
> mpiptap.d  mpiptap.o
> ------
>
> In the make.log I see:
>
> PETSC_HAVE_MKL 1
>
> But the variable PETSC_HAVE_MKL_SPARSE is not set, and according
> to src/mat/impls/aij/mpi/aijmkl/makefile it should be set to 1 for the file
> to be included in the compilation.
>
> I have searched the user list and tried different configure options, but
> so far without success. Any guidance is highly appreciated. Attached are
> the configure and make logs.
>

Hi Juan,

I believe the problem is that you pass --with-mkl_sparse-dir, but that
option is never consulted: the directory is already handled by the
BLAS/LAPACK logic, so what you actually need is plain --with-mkl_sparse.
Normally a "-dir" option would enable the package automatically, but since
this one is not used, that logic never kicks in. Please tell me if
this works.
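For reference, a sketch of the adjusted invocation under that reading of the
suggestion (untested; the plain enable flags replace the -dir forms, and MKL
itself is still located through --with-blaslapack-dir):

```shell
# Sketch only (untested): the original configure command, but with the
# unused --with-mkl_sparse-dir / --with-mkl_sparse_optimize-dir options
# replaced by the plain enable flags. $WM_OPTIONS, $MKLROOT, and
# $MPI_ARCH_PATH are as defined in the original message.
./configure --force \
    --with-64-bit-indices=1 \
    --with-precision=double \
    --with-debugging=0 \
    --COPTFLAGS=-O3 \
    --CXXOPTFLAGS=-O3 \
    --FOPTFLAGS=-O3 \
    PETSC_ARCH=$WM_OPTIONS \
    --with-blaslapack-dir=$MKLROOT \
    --with-mkl_sparse \
    --with-mkl_sparse_optimize \
    --with-mpi-dir=$MPI_ARCH_PATH \
    --download-hypre
```

If detection succeeds, PETSC_HAVE_MKL_SPARSE should then appear in
$PETSC_ARCH/include/petscconf.h.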

  Thanks,

     Matt


> Cheers,
> Juan S.
>
-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/