[petsc-users] Is there a way of having PETSc compile hypre with provided intel/mkl?

Bastian Löhrer bastian.loehrer at tu-dresden.de
Mon Jul 30 04:48:19 CDT 2018

Dear PETSc users,

I'm configuring PETSc to use an Intel stack, including Intel MKL:

./configure PETSC_ARCH=$PETSC_ARCH \
  --with-vendor-compilers=intel \
  --with-large-file-io=1 \
  --CFLAGS="-L${I_MPI_ROOT}/intel64/lib -I${I_MPI_ROOT}/intel64/include -lmpi" \
  --CXXFLAGS="-L${I_MPI_ROOT}/intel64/lib -I${I_MPI_ROOT}/intel64/include -lmpi -lmpicxx" \
  --FFLAGS="-L${I_MPI_ROOT}/intel64/lib -I${I_MPI_ROOT}/intel64/include -lmpi" \
  --LDFLAGS="-L${I_MPI_ROOT}/intel64/lib -I${I_MPI_ROOT}/intel64/include -lmpi" \
  --with-blas-lapack-dir="${MKLROOT}/lib/intel64" \
  --download-hypre
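As an aside, a possible alternative invocation, sketched here only as an illustration: instead of passing the MPI include/link paths by hand through CFLAGS/CXXFLAGS/FFLAGS/LDFLAGS, one could let the Intel MPI compiler wrappers supply them. The wrapper names (mpiicc, mpiicpc, mpiifort) are the ones mentioned among the options further down in this message and assume an Intel MPI installation is on the PATH:

```shell
# Illustrative sketch, not a tested recipe: let the Intel MPI wrappers
# handle the -L/-I/-lmpi flags instead of writing them out manually.
./configure PETSC_ARCH=$PETSC_ARCH \
  --with-cc=mpiicc \
  --with-cxx=mpiicpc \
  --with-fc=mpiifort \
  --with-large-file-io=1 \
  --with-blas-lapack-dir="${MKLROOT}/lib/intel64" \
  --download-hypre
```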

Two questions:

1) The --with-blas-lapack-dir option is not passed down to the compilation of
hypre, according to $PETSC_DIR/$PETSC_ARCH/conf/hypre.
Is there a way of having PETSc compile hypre with my Intel MKL?

2) Should I omit or include any options? I have come across a few
options in previous configuration calls used at my department, which I
have removed from my configuration call because

    a) I had the impression that they were of no additional use:

      * --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort --with-mpi=1
      * --with-pic, because --with-pic=1 is the default anyway since
        2.3.1, is it not?
      * --with-mpiexec=mpirun
      * --with-mpi-compilers=1
      * --known-mpi-shared=1
      * --with-mpi-dir=...
      * COPTFLAGS="-g" CXXOPTFLAGS="-g" FOPTFLAGS="-g" because
        --with-debugging=yes adds them anyway
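To illustrate the last point about the OPTFLAGS variables (an illustrative sketch, not from the original message): the -g flags are redundant in a debug build, since --with-debugging=yes supplies them; the COPTFLAGS/CXXOPTFLAGS/FOPTFLAGS variables mainly matter for an optimized build, where one might instead pass something like:

```shell
# Illustrative only: a production (non-debug) build, where the OPTFLAGS
# variables carry explicit optimization flags instead of -g.
./configure --with-debugging=0 \
  COPTFLAGS="-O3" CXXOPTFLAGS="-O3" FOPTFLAGS="-O3"
```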

    b) I couldn't figure out what they were actually for:

      * --configModules=PETSc.Configure
      * --optionsModule=PETSc.compilerOptions

    c) others:

      * --known-64-bit-blas-indices: I suppose it wouldn't hurt, so I
        plan to include this option the next time I configure PETSc

Looking forward to your advice.

Kind regards,
