[petsc-dev] Error compiling PETSc on Windows
Smith, Barry F.
bsmith at mcs.anl.gov
Sat Jul 7 20:08:51 CDT 2018
> On Jul 7, 2018, at 4:40 PM, Hector E Barrios Molano <hectorb at utexas.edu> wrote:
>
> Thanks Barry and Satish for your answers.
>
> I installed the correct version of hypre. Also, I changed the paths to short DOS paths as Satish suggested. Now PETSc compiles without problems and the tests pass.
>
> Regarding Satish notes:
>
> I had to include -LIBS because otherwise the configure script stops while testing blas-lapack with the following error:
>
> ===============================================================================
> TESTING: checkLib from config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:114)*******************************************************************************
> UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
> -------------------------------------------------------------------------------
> You set a value for --with-blaslapack-lib=<lib>, but ['-L/cygdrive/c/PROGRA~2/INTELS~1/COMPIL~1/windows/mkl/lib/intel64', 'mkl_intel_lp64.lib', 'mkl_core.lib', 'mkl_intel_thread.lib'] cannot be used
> *******************************************************************************
>
> Do I need to turn on the MKL sparse functionality? What is the difference between MKL_SPARSE, MKL_SPARSE_OPTIMIZE, and MKL_SPARSE_SP2M in the configure options?
>
> Why is it better to use sequential MKL? Is it possible to use a hybrid MPI + OpenMP approach, for example, MPI for internode communication and OpenMP for intranode computation?
As of 2018, the PETSc team still highly recommends using the pure MPI model. We have not seen any convincing evidence that MPI + OpenMP provides performance even near that of pure MPI. For codes that are not written to be memory scalable, for example, because the mesh or some other data structure is stored in full on each MPI process, we recommend storing this data in shared memory on each node (using MPI 3 shared memory constructs for portable code) rather than using a hybrid MPI + OpenMP model.
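A minimal sketch of the MPI 3 shared-memory approach (the communicator setup, array size, and variable names here are only illustrative):

#include <mpi.h>

int main(int argc, char **argv)
{
  MPI_Comm  nodecomm;                 /* the ranks that share memory on one node */
  MPI_Win   win;
  double   *mesh;                     /* one copy of the non-distributed data per node */
  MPI_Aint  nbytes = 0;
  int       noderank;

  MPI_Init(&argc, &argv);
  MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0, MPI_INFO_NULL, &nodecomm);
  MPI_Comm_rank(nodecomm, &noderank);

  if (noderank == 0) nbytes = 1000000*sizeof(double);   /* only one rank per node allocates */
  MPI_Win_allocate_shared(nbytes, sizeof(double), MPI_INFO_NULL, nodecomm, &mesh, &win);
  if (noderank != 0) {                                  /* the other ranks attach to that allocation */
    MPI_Aint size; int dispunit;
    MPI_Win_shared_query(win, 0, &size, &dispunit, &mesh);
  }

  /* every rank on the node now reads the same copy of the data through 'mesh' */

  MPI_Win_free(&win);
  MPI_Comm_free(&nodecomm);
  MPI_Finalize();
  return 0;
}

The data is then stored once per node instead of once per MPI process, so its memory footprint does not grow with the number of ranks on a node.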
If you could provide more information as to why you are considering an MPI + OpenMP programming model, we might be able to provide more specific advice on how to proceed.
Barry
> In that case would it be good to use threaded MKL + MPI?
>
> Thanks,
>
> Hector
>
>
>
> On 07/05/2018 08:54 PM, Satish Balay wrote:
>> A few additional notes:
>>
>> On Thu, 5 Jul 2018, Satish Balay wrote:
>>
>>
>>> Using configure Options: --prefix=/cygdrive/c/installed/petsc_git-intel-debug/ --PETSC_DIR=/cygdrive/c/sources/petsc --PETSC_ARCH=windows-intel-debug --with-cc="win32fe cl" --with-fc="win32fe ifort" --with-mpi-include="[/cygdrive/c/Program Files (x86)/IntelSWTools/compilers_and_libraries_2018/windows/mpi/intel64/include]" --with-mpi-lib="[/cygdrive/c/Program Files (x86)/IntelSWTools/compilers_and_libraries_2018/windows/mpi/intel64/lib/release_mt/impi.lib]"
>>>
>>> --with-openmp=1
>>>
>> I don't think this option does anything [on Windows].
>>
>>
>>> --with-parmetis-lib=/cygdrive/c/installed/parmetis/lib/parmetis.lib --with-parmetis-include=/cygdrive/c/installed/parmetis/include --with-metis-lib=/cygdrive/c/installed/parmetis/lib/metis.lib --with-metis-include=/cygdrive/c/installed/parmetis/include --with-zoltan-include=/cygdrive/c/installed/zoltan/include --with-zoltan-lib=/cygdrive/c/installed/zoltan/lib/zoltan.lib --with-hypre-include=/cygdrive/c/installed/hypre/include --with-hypre-lib=/cygdrive/c/installed/hypre/lib/HYPRE.lib --with-blaslapack-lib="[/cygdrive/c/Program Files (x86)/IntelSWTools/compilers_and_libraries_2018/windows/mkl/lib/intel64/mkl_intel_lp64.lib,/cygdrive/c/Program Files (x86)/IntelSWTools/compilers_and_libraries_2018/windows/mkl/lib/intel64/mkl_core.lib,/cygdrive/c/Program Files (x86)/IntelSWTools/compilers_and_libraries_2018/windows/mkl/lib/intel64/mkl_intel_thread.lib]"
>>>
>>> --with-scalapack-include="/cygdrive/c/Program Files (x86)/IntelSWTools/compilers_and_libraries_2018/windows/mkl/include" --with-scalapack-lib="[/cygdrive/c/Program Files (x86)/IntelSWTools/compilers_and_libraries_2018/windows/mkl/lib/intel64/mkl_scalapack_lp64.lib,/cygdrive/c/Program Files (x86)/IntelSWTools/compilers_and_libraries_2018/windows/mkl/lib/intel64/mkl_blacs_intelmpi_lp64.lib]"
>>>
>> PETSc does not use ScaLAPACK or BLACS; they are dependencies of MUMPS [which PETSc has an interface to].
>>
>>
>>> --with-shared-libraries=0
>>>
>>> -LIBS=""/cygdrive/c/Program Files (x86)/IntelSWTools/compilers_and_libraries_2018/windows/mkl/lib/intel64/mkl_intel_lp64.lib" "/cygdrive/c/Program Files (x86)/IntelSWTools/compilers_and_libraries_2018/windows/mkl/lib/intel64/mkl_intel_thread.lib" "/cygdrive/c/Program Files (x86)/IntelSWTools/compilers_and_libraries_2018/windows/mkl/lib/intel64//mkl_core.lib" "/cygdrive/c/Program Files (x86)/IntelSWTools/compilers_and_libraries_2018/windows/mkl/lib/intel64/mkl_scalapack_lp64.lib" "/cygdrive/c/Program Files (x86)/IntelSWTools/compilers_and_libraries_2018/windows/mkl/lib/intel64/mkl_blacs_intelmpi_lp64.lib" libiomp5md.lib"
>>>
>> I'm not sure why you are having to respecify MKL via the LIBS option.
>>
>> Also, with PETSc's MPI usage it is best to use sequential MKL rather than threaded MKL, for example:
>>
>>
>>> --with-blaslapack-lib='-L/cygdrive/c/PROGRA~2/INTELS~1/COMPIL~2/windows/mkl/lib/intel64 mkl_intel_lp64_dll.lib mkl_sequential_dll.lib mkl_core_dll.lib'
>>>
>>>> mhypre.c
>>>> C:\sources\petsc\src\mat\impls\hypre\mhypre.c(1453): warning C4002: too many actual parameters for macro 'hypre_TFree'
>>>>
>> You can switch to the compatible version of hypre, or revert the code change:
>>
>>
>> https://bitbucket.org/petsc/petsc/commits/e6de09342ce9c4562cc062ff2c1bac4bd956bda0
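As a rough illustration of where that warning comes from: newer hypre releases pass an extra memory-location argument to hypre_TFree, so PETSc code written for the newer API but compiled against an older hypre header invokes the macro with too many arguments. A tiny standalone example (with a made-up macro, not hypre's actual header) that triggers the same MSVC warning:

#include <stdlib.h>

#define my_free(ptr) free(ptr)   /* an "old" one-argument macro, standing in for the old hypre_TFree */

int main(void)
{
  double *x = malloc(10*sizeof(double));
  my_free(x, 0);                 /* call written for a two-argument API: MSVC emits warning C4002
                                    ("too many actual parameters for macro") and drops the extra argument */
  return 0;
}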
>>
>>
>> Satish
>>
>