[petsc-users] MKL_PARDISO question

Smith, Barry F. bsmith at mcs.anl.gov
Wed Sep 18 22:46:33 CDT 2019


   This is easy thanks to the additional debugging I added recently. Your install of MKL does not have CPardiso support. When you install MKL you have to make sure you select the "extra" cluster option; otherwise the installer omits parts of the library, including CPardiso. I only learned this myself recently from another PETSc user.

   So please try again after you install the full MKL, and send configure.log if it fails. (By the way, just use --with-mkl_cpardiso, not --with-mkl_cpardiso-dir, since configure always has to find CPardiso in the MKL BLAS/LAPACK directory.) After you install the full MKL you will see that directory also has files with *blacs* in them.
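
   For example, a configure line along these lines should work once the full MKL is installed (a sketch reusing the MPI_ROOT and MKL_ROOT variables from your message; adjust to your own paths):

      ./configure PETSC_ARCH=arch-debug --with-debugging=1 --with-mpi-dir=$MPI_ROOT --with-blaslapack-dir=${MKL_ROOT} --with-mkl_cpardiso

   Note that --with-mkl_cpardiso takes no value; configure locates CPardiso inside the libraries found via --with-blaslapack-dir.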

   Barry




Executing: ls /home/epscodes/MyLocal/intel/mkl/lib/intel64
stdout:
libmkl_avx2.so
libmkl_avx512_mic.so
libmkl_avx512.so
libmkl_avx.so
libmkl_blas95_ilp64.a
libmkl_blas95_lp64.a
libmkl_core.a
libmkl_core.so
libmkl_def.so
libmkl_gf_ilp64.a
libmkl_gf_ilp64.so
libmkl_gf_lp64.a
libmkl_gf_lp64.so
libmkl_gnu_thread.a
libmkl_gnu_thread.so
libmkl_intel_ilp64.a
libmkl_intel_ilp64.so
libmkl_intel_lp64.a
libmkl_intel_lp64.so
libmkl_intel_thread.a
libmkl_intel_thread.so
libmkl_lapack95_ilp64.a
libmkl_lapack95_lp64.a
libmkl_mc3.so
libmkl_mc.so
libmkl_rt.so
libmkl_sequential.a
libmkl_sequential.so
libmkl_tbb_thread.a
libmkl_tbb_thread.so
libmkl_vml_avx2.so
libmkl_vml_avx512_mic.so
libmkl_vml_avx512.so
libmkl_vml_avx.so
libmkl_vml_cmpt.so
libmkl_vml_def.so
libmkl_vml_mc2.so
libmkl_vml_mc3.so
libmkl_vml_mc.so
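
For comparison, a full MKL install that includes the cluster components also contains libraries along these lines (example names only; the exact files vary with the MKL version and the MPI flavor selected at install time):

libmkl_blacs_intelmpi_lp64.so
libmkl_blacs_openmpi_lp64.so
libmkl_scalapack_lp64.so

The absence of any *blacs* files in the listing above is what indicates the CPardiso support is missing.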



> On Sep 18, 2019, at 9:40 PM, Xiangdong <epscodes at gmail.com> wrote:
> 
> Thank you very much for your information. I pulled the master branch but got an error when configuring it.
> 
> When I run configure without mkl_cpardiso (configure.log_nocpardiso): ./configure PETSC_ARCH=arch-debug --with-debugging=1 --with-mpi-dir=$MPI_ROOT --with-blaslapack-dir=${MKL_ROOT}, it works fine.
> 
> However, when I add mkl_cpardiso (configure.log_withcpardiso): ./configure PETSC_ARCH=arch-debug --with-debugging=1 --with-mpi-dir=$MPI_ROOT --with-blaslapack-dir=${MKL_ROOT} --with-mkl_cpardiso-dir=${MKL_ROOT}, it complains "Could not find a functional BLAS.", even though BLAS was provided through MKL the same way as in the previous configuration.
> 
> Can you help me with the configuration? Thank you.
> 
> Xiangdong
> 
> On Wed, Sep 18, 2019 at 2:39 PM Smith, Barry F. <bsmith at mcs.anl.gov> wrote:
> 
> 
> > On Sep 18, 2019, at 9:15 AM, Xiangdong via petsc-users <petsc-users at mcs.anl.gov> wrote:
> > 
> > Hello everyone,
> > 
> > From here,
> > https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MATSOLVERMKL_PARDISO.html
> > 
> > It seems that MKL_PARDISO only works for seqaij. I am curious whether one can use mkl_pardiso in PETSc with multiple threads.
> 
>    You can use mkl_pardiso for multi-threaded parallelism and mkl_cpardiso for MPI parallelism.
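> 
>    For example, once PETSc is built with these packages you can select them at run time through PCLU and the factor solver type option (a sketch; ./ex2 stands in for any PETSc executable, and -pc_factor_mat_solver_type is the option name in master and recent releases):
> 
>       ./ex2 -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mkl_pardiso
>       mpiexec -n 2 ./ex2 -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mkl_cpardiso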
> 
>    In both cases you must use the master branch of PETSc (or the next release of PETSc) to do this easily.
> 
>    Note that when you use mkl_pardiso with multiple threads, the matrix comes from a single MPI process (or from the single program if not running with MPI), so it is not MPI parallelism that matches the rest of PETSc's parallelism. One must therefore be a little careful: for example, if you have 4 cores, use them all with mpiexec -n 4, and then run mkl_pardiso with 4 threads on each process, you have 16 threads fighting over 4 cores. So you need to select the number of MPI processes and the number of threads wisely.
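> 
>    For instance, on a machine with 4 cores a reasonable split is 2 MPI processes with 2 MKL threads each, so that processes times threads matches the core count. A sketch using mkl_pardiso as a block Jacobi subsolver (MKL_NUM_THREADS is the standard MKL environment variable; ./ex2 again stands in for your application):
> 
>       MKL_NUM_THREADS=2 mpiexec -n 2 ./ex2 -pc_type bjacobi -sub_pc_type lu -sub_pc_factor_mat_solver_type mkl_pardiso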
> 
> > 
> > Is there any reason that MKL_PARDISO is not listed in the linear solver table?
> > https://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html
> > 
> 
>    Just an oversight; thanks for letting us know. I have added it.
> 
> 
> > Thank you.
> > 
> > Best,
> > Xiangdong
> 
> [Attachments: configure.log_nocpardiso, configure.log_withcpardiso]


