[petsc-users] problems after glibc upgrade to 2.17-157
Matthew Knepley
knepley at gmail.com
Wed Jan 4 06:43:10 CST 2017
On Wed, Jan 4, 2017 at 4:32 AM, Klaij, Christiaan <C.Klaij at marin.nl> wrote:
> Satish,
>
> I tried your suggestion:
>
> --with-clib-autodetect=0 --with-fortranlib-autodetect=0
> --with-cxxlib-autodetect=0 LIBS=LIBS=/path_to/libifcore.a
>
> I guess I don't really need "LIBS= " twice (?) so I've used this line:
>
> LIBS=/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a
>
> Unfortunately, this approach also fails (attached log):
>
Ah, this error is much easier:
Executing: mpif90 -o /tmp/petsc-3GfeyZ/config.compilers/conftest -fPIC
-g -O3 /tmp/petsc-3GfeyZ/config.compilers/conftest.o
/tmp/petsc-3GfeyZ/config.compilers/cxxobj.o
/tmp/petsc-3GfeyZ/config.compilers/confc.o -ldl
/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a
Possible ERROR while running linker: exit code 256
stderr:
/tmp/petsc-3GfeyZ/config.compilers/cxxobj.o:(.gnu.linkonce.d.DW.ref.__gxx_personality_v0+0x0):
undefined reference to `__gxx_personality_v0'
Intel was lazy writing its C++ compiler, so it uses some of g++. If you want
to use C++, you will need to add -lstdc++ to your LIBS variable (I think).
Otherwise, please turn it off using --with-cxx=0.
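For example, the configure invocation would then look something like this (a
sketch; the libifcore.a path is the one from your log, adjust to your site):

  ./configure --with-clib-autodetect=0 --with-fortranlib-autodetect=0 \
      --with-cxxlib-autodetect=0 \
      LIBS="-lstdc++ /cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a"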
Thanks,
Matt
> *******************************************************************************
> UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
> -------------------------------------------------------------------------------
> Fortran could not successfully link C++ objects
> *******************************************************************************
>
> There are multiple libifcore.a files in the intel compiler lib: one in
> intel64_lin and one in intel64_lin_mic. I tried both and got the same error.
>
> Chris
>
>
>
> dr. ir. Christiaan Klaij | CFD Researcher | Research & Development
> MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl
>
> MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of-uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm
>
> ________________________________________
> From: Satish Balay <balay at mcs.anl.gov>
> Sent: Tuesday, January 03, 2017 4:37 PM
> To: Klaij, Christiaan
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157
>
> Do you have similar issues with gnu compilers?
>
> It must be some incompatibility with intel compilers with this glibc
> change.
>
> >>>>>>>>>
> compilers: Check that C libraries can be used from Fortran
> Pushing language FC
> Popping language FC
> Pushing language FC
> Popping language FC
> Pushing language FC
> Popping language FC
> **** Configure header /tmp/petsc-rOjdnN/confdefs.h ****
> <<<<<<<<<<
>
> There is a bug in configure [Matt?] that eats away some of the log - so
> I don't see the exact error you are getting.
>
> If standalone mpicc/mpif90 etc. work - then you can try the following
> additional options:
>
> --with-clib-autodetect=0 --with-fortranlib-autodetect=0
> --with-cxxlib-autodetect=0 LIBS=LIBS=/path_to/libifcore.a
>
> [replace "path_to" with the correct path to the ifort libifcore.a library]
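> A minimal standalone check could look like this (the file name and test
> program are just examples):
>
>   $ cat > conftest.f90 <<'EOF'
>   program conftest
>     print *, 'mpif90 links and runs'
>   end program conftest
>   EOF
>   $ mpif90 -o conftest conftest.f90 && ./conftest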
>
> Note: I have a RHEL7 box with this glibc - and I don't see this issue.
>
> >>>>
> -bash-4.2$ cat /etc/redhat-release
> Red Hat Enterprise Linux Server release 7.3 (Maipo)
> -bash-4.2$ rpm -q glibc
> glibc-2.17-157.el7_3.1.x86_64
> glibc-2.17-157.el7_3.1.i686
> -bash-4.2$ mpiicc --version
> icc (ICC) 17.0.0 20160721
> Copyright (C) 1985-2016 Intel Corporation. All rights reserved.
>
> -bash-4.2$
> <<<<
>
> Satish
>
> On Tue, 3 Jan 2017, Klaij, Christiaan wrote:
>
> >
> > I've been using petsc-3.7.4 with intel mpi and compilers,
> > superlu_dist, metis and parmetis on a cluster running
> > SL7. Everything was working fine until SL7 got an update where
> > glibc was upgraded from 2.17-106 to 2.17-157.
> >
> > This update seemed to have broken (at least) parmetis: the
> > standalone binary gpmetis started to give a segmentation
> > fault. The core dump shows this:
> >
> > Core was generated by `gpmetis'.
> > Program terminated with signal 11, Segmentation fault.
> > #0 0x00002aaaac6b865e in memmove () from /lib64/libc.so.6
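> > For reference, the full call chain behind that memmove frame can be
> > recovered by loading the core into gdb and asking for a backtrace (the
> > core file name below is illustrative):
> >
> >   $ gdb $(which gpmetis) core
> >   (gdb) bt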
> >
> > That's when I decided to recompile, but to my surprise I cannot
> > even get past the configure stage (log attached)!
> >
> > *******************************************************************************
> > UNABLE to EXECUTE BINARIES for ./configure
> > -------------------------------------------------------------------------------
> > Cannot run executables created with FC. If this machine uses a batch system
> > to submit jobs you will need to configure using ./configure with the additional option --with-batch.
> > Otherwise there is problem with the compilers. Can you compile and run code with your compiler 'mpif90'?
> > See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf
> > *******************************************************************************
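> > The FAQ entry linked above concerns executables failing at run time
> > because libimf.so cannot be found. If that is the cause here, a common
> > workaround is along these lines, with the path adjusted to the local
> > Intel install (the directory shown is the one from the log above):
> >
> >   export LD_LIBRARY_PATH=/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin:$LD_LIBRARY_PATH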
> >
> > Note the following:
> >
> > 1) Configure was done with the exact same options that worked
> > fine before the update of SL7.
> >
> > 2) The intel mpi and compilers are exactly the same as before the
> > update of SL7.
> >
> > 3) The cluster does not require a batch system to run code.
> >
> > 4) I can compile and run code with mpif90 on this cluster.
> >
> > 5) The problem also occurs on a workstation running SL7.
> >
> > Any clues on how to proceed?
> > Chris
> >
> >
> > dr. ir. Christiaan Klaij | CFD Researcher | Research & Development
> > MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl
> >
> > MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of-uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm
> >
> >
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener