[petsc-users] Problems with MKL?

Satish Balay balay at mcs.anl.gov
Thu Mar 5 11:29:27 CST 2020


On Thu, 5 Mar 2020, Adolfo Rodriguez wrote:

> Satish,
> 
> Thank you for the super-fast reply. Unfortunately, I cannot follow your
> suggestion because the application I am linking PETSc into is compiled
> with the Intel compilers, so gcc will not work; the same goes for MPICH.
> However, I am quite sure the problem is that the original application
> links to MKL STATICALLY, while PETSc links to the dynamic libraries.
> Also, I would like to compile PETSc statically.

--with-shared-libraries=0

> 
> My question is: can I link to the static MKL libraries? I tried
> --with-blas-lapack-lib=mkl_xx.a and it did not work (I do exactly that
> on Windows and it works).

With --with-shared-libraries=0, this won't matter [as long as MKL is used during configure].
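
For example, a static build against MKL could look something like the
sketch below (the MKL path is illustrative - adjust it to your install,
or list the static .a files explicitly via --with-blas-lapack-lib):

  ./configure --with-shared-libraries=0 \
    --with-blas-lapack-dir=/opt/intel/mkl \
    --with-mpi-dir=/apps/Intel/XEcluster/compilers_and_libraries_2018.0.128/linux/mpi/intel64

Once MKL is found at configure time, applications linked through the
PETSc makefiles should pick up the same MKL libraries on their link line.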

Satish

> 
> Thanks for your help,
> 
> Adolfo
> 
> On Thu, Mar 5, 2020 at 10:25 AM Satish Balay <balay at mcs.anl.gov> wrote:
> 
> > You can try running the code in a debugger and checking the values
> > before the BLAS call.
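> >
> > For example, something along these lines (gdb and the binary name
> > ./app are illustrative):
> >
> >   mpiexec -n 1 ./app -start_in_debugger noxterm
> >
> > then set a breakpoint with "break VecNorm_Seq", continue, and inspect
> > the vector and its array pointer when it stops.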
> >
> > Or make a debug build with gcc/gfortran/--download-fblaslapack
> > --download-mpich and try valgrind again.
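> >
> > A minimal sketch of such a build (compiler names may vary on your
> > system):
> >
> >   ./configure --with-debugging=yes --with-cc=gcc --with-fc=gfortran \
> >     --download-fblaslapack --download-mpich
> >
> > and then, for example:
> >
> >   ${PETSC_DIR}/${PETSC_ARCH}/bin/mpiexec -n 1 valgrind --track-origins=yes ./app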
> >
> > BTW: I don't see MKL in the configure options here - so most likely
> > the system blas/lapack is being used.
> >
> > Satish
> >
> > On Thu, 5 Mar 2020, Adolfo Rodriguez wrote:
> >
> > > I am experiencing a very stubborn issue, apparently related to MKL. I
> > > am solving a linear system using KSP, and it works well on Windows. I
> > > am now trying to port the program to Linux, but I keep getting an
> > > error from the solver (the matrix, right-hand side, and initial
> > > solution vectors are constructed without any issues). In fact,
> > > computing the norm of any vector gives an error. Running the same
> > > program with the debug option on, I get the message shown below. I
> > > tried valgrind, but it did not help. Any suggestions?
> > >
> > > Regards,
> > >
> > > Adolfo
> > >
> > > [0]PETSC ERROR: ------------------------------------------------------------------------
> > > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> > > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> > > [0]PETSC ERROR: or see https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> > > [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> > > [0]PETSC ERROR: likely location of problem given in stack below
> > > [0]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
> > > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> > > [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> > > [0]PETSC ERROR:       is given.
> > > [0]PETSC ERROR: [0] BLASasum line 259 /home/rodriad/CODE/petsc-3.12.3/src/vec/vec/impls/seq/bvec2.c
> > > [0]PETSC ERROR: [0] VecNorm_Seq line 221 /home/rodriad/CODE/petsc-3.12.3/src/vec/vec/impls/seq/bvec2.c
> > > [0]PETSC ERROR: [0] VecNorm line 213 /home/rodriad/CODE/petsc-3.12.3/src/vec/vec/interface/rvector.c
> > > [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > > [0]PETSC ERROR: Signal received
> > > [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > > [0]PETSC ERROR: Petsc Release Version 3.12.3, Jan, 03, 2020
> > > [0]PETSC ERROR: Unknown Name on a arch-linux-oxy-dbg named ohylrss0 by rodriad Thu Mar  5 10:08:03 2020
> > > [0]PETSC ERROR: Configure options --with-debugging=yes --with-mpi-dir=/apps/Intel/XEcluster/compilers_and_libraries_2018.0.128/linux/mpi/intel64 COPTFLAGS=-debug CXXOPTFLAGS=-debug FOPTFLAGS=-debug PETSC_ARCH=arch-linux-oxy-dbg
> > > [0]PETSC ERROR: #1 User provided function() line 0 in unknown file
> > > application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0


