[petsc-dev] MKL issue

Jeff Hammond jeff.science at gmail.com
Thu Jan 28 23:54:57 CST 2016


On Thu, Jan 28, 2016 at 9:11 PM, Jed Brown <jed at jedbrown.org> wrote:

> Stefano Zampini <stefano.zampini at gmail.com> writes:
>
> > Just for the record:
> >
> > I have an installation that uses the MKL distribution of ScaLAPACK but
> > does not use Intel MPI
>
>
I don't understand your comment.  MKL ScaLAPACK is built against Intel MPI,
Open MPI, and (maybe) SGI MPT.

Because of MPICH ABI compatibility (https://www.mpich.org/abi/), you should
be able to use the MKL ScaLAPACK libraries compiled against Intel MPI with
MPICH, MVAPICH2, Cray MPI, and any other late-model MPICH derivative, as
long as the binary representation of MPICH objects (e.g. the MPI_Status
object) does not change.
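Concretely, that means a link line along these lines should work with an MPICH-family compiler wrapper.  This is a sketch only: the library names follow the usual MKL convention, but the exact list depends on your MKL version and threading layer, and MKLROOT must point at your installation.

```shell
# Hypothetical link line: compile with MPICH's wrapper while taking
# ScaLAPACK and the Intel-MPI-flavored BLACS from MKL (ABI-compatible
# with MPICH derivatives).  Adjust MKLROOT and the threading layer
# (here sequential) for your installation.
mpicc example.c -o example \
    -L${MKLROOT}/lib/intel64 \
    -lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64 \
    -lmkl_intel_lp64 -lmkl_sequential -lmkl_core \
    -lpthread -lm
```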


> > I've just got a very strange error using MUMPS from PETSc
> >
> > Intel MKL FATAL ERROR: Cannot load symbol MKLMPI_Get_wrappers.
>
>
Can you provide an MCVE (http://stackoverflow.com/help/mcve)?  If not, I'll
take a non-minimal CVE.


> > The error goes away with -mat_mumps_icntl_13 1 (which disables ScaLAPACK)
>
> Do you want that to happen automatically?  Could we try and catch the
> error instead of relying on configure to determine this run-time bug?
>
> > This issue is related with
> >
> >
> https://software.intel.com/en-us/articles/using-intel-mkl-mpi-wrapper-with-the-intel-mkl-cluster-functions
>
>
I was aware this was coming down the line.  Since it is new, it is
potentially imperfect.
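If I'm reading that article right, the intended workaround is to build MKL's custom BLACS/MPI wrapper against the MPI you actually run with, rather than relying on the prebuilt Intel MPI flavor.  The sketch below is from memory: the directory layout, make target, and option names are assumptions and may differ between MKL releases, so check the article and your MKL install.

```shell
# Assumed layout: recent MKL versions ship wrapper source under
# $MKLROOT/interfaces/mklmpi.  Target and option names are from memory.
cd $MKLROOT/interfaces/mklmpi
make libintel64 interface=lp64 MPICC=mpicc
# Then link the resulting custom BLACS wrapper library in place of
# libmkl_blacs_intelmpi_lp64.
```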


> > Is it something we can check for at configure time, or do we just wait
> > until Intel ships MKL with a shared version of libmkl_blacs_(i)lp64?
>
> You can write a test for it or wait.  Since the test needs to execute
> code, it would need to run as part of conftest on a batch system.
>

Jeff, who works for Intel but is not responsible for any of our software
products

-- 
Jeff Hammond
jeff.science at gmail.com
http://jeffhammond.github.io/

