<html>
  <head>

    <meta http-equiv="content-type" content="text/html; charset=utf-8">
  </head>
  <body text="#000000" bgcolor="#FFFFFF">
    <p>Dear PETSc users,</p>
    <p>I'm configuring PETSc to use an Intel stack including intel/mkl:</p>
    <pre>./configure PETSC_ARCH=$PETSC_ARCH \
  --with-gnu-compilers=0 --with-vendor-compilers=intel \
  --with-large-file-io=1 \
  --CFLAGS="-L${I_MPI_ROOT}/intel64/lib -I${I_MPI_ROOT}/intel64/include -lmpi" \
  --CXXFLAGS="-L${I_MPI_ROOT}/intel64/lib -I${I_MPI_ROOT}/intel64/include -lmpi -lmpicxx" \
  --FFLAGS="-L${I_MPI_ROOT}/intel64/lib -I${I_MPI_ROOT}/intel64/include -lmpi" \
  --LDFLAGS="-L${I_MPI_ROOT}/intel64/lib -I${I_MPI_ROOT}/intel64/include -lmpi" \
  --with-blas-lapack-dir="${MKLROOT}/lib/intel64" \
  --download-hypre \
  --with-debugging=yes</pre>
    <br>
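    <p>(For completeness: before running configure I do a quick sanity check
      that the Intel environment variables the command relies on are actually
      set and point somewhere sensible, assuming the corresponding intel/mpi
      and intel/mkl modules are loaded:)</p>
    <pre># Intel MPI prefix used in the *FLAGS above
echo "I_MPI_ROOT = ${I_MPI_ROOT}"
# MKL prefix used for --with-blas-lapack-dir
echo "MKLROOT    = ${MKLROOT}"
# the directory passed to --with-blas-lapack-dir should not be empty
ls "${MKLROOT}/lib/intel64" | head</pre>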
    <p><b>Two questions:</b></p>
    <p><b>1)</b> The --with-blas-lapack-dir option is not passed down to the
      compilation of hypre, according to $PETSC_DIR/$PETSC_ARCH/conf/hypre.<br>
      <b>Is there a way to have PETSc compile hypre against my intel/mkl?</b></p>
    <br>
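    <p>(For reference, this is roughly how I checked which BLAS/LAPACK hypre
      ended up with; the lib path below is only my guess at where PETSc puts
      the hypre build:)</p>
    <pre># what did hypre's configure see? (the log mentioned above)
grep -i -E 'blas|lapack|mkl' $PETSC_DIR/$PETSC_ARCH/conf/hypre
# assumed install location of the hypre library built by --download-hypre
ls -l $PETSC_DIR/$PETSC_ARCH/lib | grep -i hypre</pre>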
    <p><b>2)</b> <b>Should I omit or include any options?</b> I have come
      across a few options in previous configuration calls used at my
      department which I have removed from my own call because:</p>
    <blockquote>a) I had the impression that they were of no additional
      use (see the configure.log check after this list):<br>
      <ul>
        <li><tt>--with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort
            --with-mpi=1</tt></li>
        <li><tt>--with-pic</tt>, because <tt>--with-pic=1</tt> is the
          default anyway since 2.3.1, is it not? <br>
          (<a href="https://www.mcs.anl.gov/petsc/documentation/changes/231.html">https://www.mcs.anl.gov/petsc/documentation/changes/231.html</a>)</li>
        <li><tt>--with-mpiexec=mpirun</tt></li>
        <li><tt>--with-mpi-compilers=1</tt></li>
        <li><tt>--known-mpi-shared=1</tt></li>
        <li><tt>--with-mpi-dir=...</tt></li>
        <li><tt>COPTFLAGS="-g"
            CXXOPTFLAGS="-g" FOPTFLAGS="-g"</tt> because <tt>--with-debugging=yes</tt>
          adds them anyway</li>
      </ul>
      b) I couldn't figure out what they were actually for:<br>
      <ul>
        <li><tt><tt>--configModules=PE</tt><tt>TSc.Configure</tt></tt></li>
        <li><tt><tt>--optionsModule=PETSc.compilerOptions</tt></tt><br>
        </li>
      </ul>
    </blockquote>
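    <p>(What I have been doing instead of passing the compilers explicitly is
      to check afterwards what configure actually picked up, by grepping the
      configure.log that it writes in $PETSC_DIR; a rough sketch:)</p>
    <pre># did configure end up with the Intel MPI wrappers / Intel compilers?
grep -i -E 'mpiicc|mpiicpc|mpiifort' configure.log
# did it pick up MKL for BLAS/LAPACK?
grep -i mkl configure.log | head</pre>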
    <blockquote>
      <p>c) others:</p>
      <ul>
        <li><tt>--known-64-bit-blas-indices</tt>: I guess it wouldn't
          hurt, so I'll probably include it the next time I configure
          PETSc (see the quick check after this list)<br>
(<a href="http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/PetscBLASInt.html">http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/PetscBLASInt.html</a>)<br>
        </li>
      </ul>
    </blockquote>
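    <p>(As far as I understand, --known-64-bit-blas-indices is only meant for
      a BLAS/LAPACK built with 64-bit integer indices, which for MKL would be
      the ilp64 rather than the lp64 interface libraries. This is how I looked
      at which interface libraries my MKL actually provides, assuming the
      standard MKL directory layout:)</p>
    <pre># lp64 = 32-bit integer interface, ilp64 = 64-bit integer interface
ls ${MKLROOT}/lib/intel64 | grep -E 'mkl_intel_(i)?lp64'</pre>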
    <p>Looking forward to your advice.<br>
    </p>
    <p>Kind regards,<br>
      Bastian<br>
    </p>
  </body>
</html>