[petsc-users] configuration option for PETSc on cluster

Samuel Lanthaler s.lanthaler at gmail.com
Tue Apr 24 08:55:09 CDT 2018


Dear Satish,

I get the following when doing the ls:

[lanthale at daint103 mkl]$ ls /opt/intel/compilers_and_libraries_2017.4.196/linux/mkl/lib/intel64/
libmkl_ao_worker.so
libmkl_avx.so
libmkl_avx2.so
libmkl_avx512.so
libmkl_avx512_mic.so
libmkl_blacs_intelmpi_ilp64.a
libmkl_blacs_intelmpi_ilp64.so
libmkl_blacs_intelmpi_lp64.a
libmkl_blacs_intelmpi_lp64.so
libmkl_blacs_openmpi_ilp64.a
libmkl_blacs_openmpi_ilp64.so
libmkl_blacs_openmpi_lp64.a
libmkl_blacs_openmpi_lp64.so
libmkl_blacs_sgimpt_ilp64.a
libmkl_blacs_sgimpt_ilp64.so
libmkl_blacs_sgimpt_lp64.a
libmkl_blacs_sgimpt_lp64.so
libmkl_blas95_ilp64.a
libmkl_blas95_lp64.a
libmkl_cdft_core.a
libmkl_cdft_core.so
libmkl_core.a
libmkl_core.so
libmkl_def.so
libmkl_gf_ilp64.a
libmkl_gf_ilp64.so
libmkl_gf_lp64.a
libmkl_gf_lp64.so
libmkl_gnu_thread.a
libmkl_gnu_thread.so
libmkl_intel_ilp64.a
libmkl_intel_ilp64.so
libmkl_intel_lp64.a
libmkl_intel_lp64.so
libmkl_intel_thread.a
libmkl_intel_thread.so
libmkl_lapack95_ilp64.a
libmkl_lapack95_lp64.a
libmkl_mc.so
libmkl_mc3.so
libmkl_pgi_thread.a
libmkl_pgi_thread.so
libmkl_rt.so
libmkl_scalapack_ilp64.a
libmkl_scalapack_ilp64.so
libmkl_scalapack_lp64.a
libmkl_scalapack_lp64.so
libmkl_sequential.a
libmkl_sequential.so
libmkl_tbb_thread.a
libmkl_tbb_thread.so
libmkl_vml_avx.so
libmkl_vml_avx2.so
libmkl_vml_avx512.so
libmkl_vml_avx512_mic.so
libmkl_vml_cmpt.so
libmkl_vml_def.so
libmkl_vml_mc.so
libmkl_vml_mc2.so
libmkl_vml_mc3.so
locale
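The listing contains both a static (.a) and a shared (.so) build of each library the failing link line needs. As an illustrative check only (a sketch, using a made-up five-file sample in place of the full directory), one can filter such a listing down to the shared libraries and turn them into -l flags like the ones on the link line:

```shell
# Sketch: keep only shared (.so) libraries from a listing and convert
# them to -l flags, since configure was resolving -lmkl_core to the
# static libmkl_core.a instead of libmkl_core.so.
printf '%s\n' libmkl_core.a libmkl_core.so libmkl_sequential.so \
  libmkl_intel_lp64.a libmkl_intel_lp64.so |
  grep '\.so$' |
  sed -e 's/^lib/-l/' -e 's/\.so$//'
```

With the real directory, `ls *.so` in place of the printf would produce the same kind of list.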


Cheers,
Samuel

On 04/24/2018 03:50 PM, Satish Balay wrote:
> Executing: cc  -o /tmp/petsc-tyd_Hm/config.libraries/conftest     -O /tmp/petsc-tyd_Hm/config.libraries/conftest.o  -Wl,-rpath,/opt/intel/compilers_and_libraries_2017.4.196/linux/mkl/lib/intel64 -L/opt/intel/compilers_and_libraries_2017.4.196/linux/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lstdc++ -ldl
> Possible ERROR while running linker: exit code 256
> stderr:
> /opt/intel/compilers_and_libraries_2017.4.196/linux/mkl/lib/intel64/libmkl_core.a(mkl_semaphore.o): In function `mkl_serv_load_inspector':
> mkl_semaphore.c:(.text+0x123): warning: Using 'dlopen' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking
> /opt/intel/compilers_and_libraries_2017.4.196/linux/mkl/lib/intel64/libmkl_core.a(dgetrs.o): In function `mkl_lapack_dgetrs':
> dgetrs_gen.f:(.text+0x224): undefined reference to `mkl_blas_dtrsm'
> dgetrs_gen.f:(.text+0x2b1): undefined reference to `mkl_blas_dtrsm'
> dgetrs_gen.f:(.text+0x2e7): undefined reference to `mkl_lapack_dlaswp'
> <<<<<<<<
>
> For some reason the compiler is picking up the static version of MKL - instead of the shared libraries.
>
> What do you have for:
>
> ls /opt/intel/compilers_and_libraries_2017.4.196/linux/mkl/lib/intel64/
>
> Satish
>
> On Tue, 24 Apr 2018, Samuel Lanthaler wrote:
>
>> Dear Satish and Stefano,
>>
>> Thank you for your answers. I believe I had initially tried to use the option
>> --with-blaslapack-dir=[...], instead of specifying lib and include directly.
>> But that gives me an error message:
>>
>> *******************************************************************************
>>           UNABLE to CONFIGURE with GIVEN OPTIONS    (see configure.log for
>> details):
>> -------------------------------------------------------------------------------
>> You set a value for --with-blaslapack-dir=<dir>, but
>> /opt/intel/compilers_and_libraries_2017.4.196/linux/mkl cannot be used
>> *******************************************************************************
>>
>> I attach the new configure.log. Do you think there is something wrong with
>> the mkl version that I'm trying to use? Or is it only related to PETSc?
>>
>> Cheers,
>> Samuel
>>
>>
>> On 04/24/2018 02:55 PM, Satish Balay wrote:
>>> On Tue, 24 Apr 2018, Samuel Lanthaler wrote:
>>>
>>>> Hi there,
>>>>
>>>> I was wondering if you could help me with the correct configuration of the
>>>> PETSc-dev version on a cluster (https://www.cscs.ch/)? I'm not sure which
>>>> information would be useful to you, but basically the problem seems to be
>>>> in correctly compiling it with the Intel compiler and the existing MKL
>>>> library.
>>>>
>>>> The pre-installed mkl version is
>>>>
>>>> intel/17.0.4.196
>>>>
>>>> I have tried various things and finally got it to compile with the
>>>> following options (chosen with reference to the MKL link advisor...):
>>>>
>>>> ./configure --with-fc=ftn --with-cc=cc --with-cxx=CC --with-debugging=0
>>>> --with-scalar-type=complex --download-scalapack --download-mumps=yes
>>>> --download-superlu --with-batch --known-mpi-shared-libraries=1
>>>> --with-blaslapack-lib=" ${MKLROOT}/lib/intel64/libmkl_blas95_lp64.a
>>>> ${MKLROOT}/lib/intel64/libmkl_lapack95_lp64.a -Wl,--start-group
>>>> ${MKLROOT}/lib/intel64/libmkl_intel_lp64.a
>>>> ${MKLROOT}/lib/intel64/libmkl_sequential.a
>>>> ${MKLROOT}/lib/intel64/libmkl_core.a -Wl,--end-group -lpthread -lm -ldl"
>>>> --with-blaslapack-include="[/opt/intel/compilers_and_libraries_2017.4.196/linux/mkl/include,/opt/intel/compilers_and_libraries_2017.4.196/linux/mkl/include/intel64/lp64]"
>>> Can you remove the above 2 options [--with-blaslapack-lib,
>>> --with-blaslapack-include] and use:
>>>
>>> --with-blaslapack-dir=${MKLROOT}
>>>
>>> And see if you still have this problem?
>>>
>>> Satish
>>>
>>>> --known-64-bit-blas-indices=0
>>>>
>>>> After compilation, when trying to compile
>>>> /users/lanthale/petsc-git/src/snes/examples/tutorials/ex5f I get linking
>>>> errors (attached). Would anyone be able to help me out here? I really don't
>>>> have a good understanding of this.
>>>>
>>>> I'm attaching the configuration.log file, as well as the linking errors
>>>> when
>>>> trying to compile ex5f.
>>>>
>>>> Thank you very much in advance!
>>>>
>>>> Best regards,
>>>> Samuel
>>>>
>>
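For readers following the thread: combining Satish's suggestion with Samuel's original invocation, the simplified configure command would look roughly like the following. This is a sketch only, assuming the same Cray compiler wrappers, download options, and MKLROOT from this thread, not a verified recipe:

```shell
# Sketch: same options as the original configure, but with the explicit
# --with-blaslapack-lib/--with-blaslapack-include pair replaced by the
# single directory option Satish suggested. MKLROOT is set by the
# intel/17.0.4.196 module.
./configure --with-fc=ftn --with-cc=cc --with-cxx=CC --with-debugging=0 \
    --with-scalar-type=complex --download-scalapack --download-mumps=yes \
    --download-superlu --with-batch --known-mpi-shared-libraries=1 \
    --known-64-bit-blas-indices=0 \
    --with-blaslapack-dir=${MKLROOT}
```

Letting configure search ${MKLROOT} itself leaves the choice between static and shared MKL to its own library tests, which is what the rest of the thread is debugging.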


