[petsc-users] problems after glibc upgrade to 2.17-157

Matthew Knepley knepley at gmail.com
Wed Jan 4 09:37:46 CST 2017


On Wed, Jan 4, 2017 at 9:19 AM, Klaij, Christiaan <C.Klaij at marin.nl> wrote:

> Attached is the log for
>
>
> LIBS="-L/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin -lifcore"
>
Something is strange with the quotes in this shell. Can you use this instead?

LIBS=[-L/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin,-lifcore]
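For concreteness, a full configure invocation using this bracketed form might look like the sketch below. This is only an illustration: the autodetect options are the ones Satish suggested elsewhere in this thread, and the rest of your option set will differ.

```shell
# Sketch only: PETSc's configure accepts a bracketed, comma-separated
# list for LIBS, which sidesteps shell quoting of the embedded space.
./configure \
  --with-clib-autodetect=0 --with-fortranlib-autodetect=0 \
  --with-cxxlib-autodetect=0 \
  LIBS=[-L/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin,-lifcore]
```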

  Thanks,

     Matt

> No luck there.
>
>
> Chris
>
>
>
> dr. ir. Christiaan Klaij | CFD Researcher | Research & Development
> MARIN | T +31 317 49 33 44 <+31%20317%20493%20344> | C.Klaij at marin.nl |
> www.marin.nl
>
> MARIN news: MARIN Report 119: highlighting naval research projects
> <http://www.marin.nl/web/News/News-items/MARIN-Report-119-highlighting-naval-research-projects.htm>
>
> ------------------------------
> *From:* Klaij, Christiaan
> *Sent:* Wednesday, January 04, 2017 4:05 PM
>
> *To:* Matthew Knepley
> *Cc:* petsc-users; Satish Balay
> *Subject:* Re: [petsc-users] problems after glibc upgrade to 2.17-157
>
>
> By the way, petsc did compile and install metis and parmetis successfully
> before the make error. However, running the newly compiled gpmetis program
> gives the same segmentation fault! So the original problem was not solved
> by recompiling, unfortunately.
>
>
> Chris
>
>
> ------------------------------
> *From:* Klaij, Christiaan
> *Sent:* Wednesday, January 04, 2017 3:53 PM
> *To:* Matthew Knepley
> *Cc:* petsc-users; Satish Balay
> *Subject:* Re: [petsc-users] problems after glibc upgrade to 2.17-157
>
>
> So how would I do that? Does LIBS=<string> accept spaces in the
> string? Something like this perhaps:
>
>
> LIBS="-L/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin -lifcore"
>
>
> But I'm starting to believe that my intel install is somehow broken. I'm
> getting these intel compilers from rpm's provided by our cluster vendor. On
> a workstation I can try yum remove and install of the intel packages. Not
> so easy on a production cluster. Is this worth a try? Or will it just
> copy/paste the same broken (?) stuff in the same place?
>
>
> Chris
>
>
> ------------------------------
> *From:* Matthew Knepley <knepley at gmail.com>
> *Sent:* Wednesday, January 04, 2017 3:13 PM
> *To:* Klaij, Christiaan
> *Cc:* petsc-users; Satish Balay
> *Subject:* Re: [petsc-users] problems after glibc upgrade to 2.17-157
>
> On Wed, Jan 4, 2017 at 7:37 AM, Klaij, Christiaan <C.Klaij at marin.nl>
> wrote:
>
>> I've tried with:
>>
>>
>>  --LIBS=/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a -lstdc++\\
>>
> This is likely connected to the problem below, but I would have to see the
> log.
>
>> but that doesn't seem to make a difference.
>>
>>
>> With the option --with-cxx=0 the configure part does work(!), but then I
>> get
>>
>>
>> **************************ERROR*************************************
>>   Error during compile, check Linux-x86_64-Intel/lib/petsc/conf/make.log
>>   Send it and Linux-x86_64-Intel/lib/petsc/conf/configure.log to
>> petsc-maint at mcs.anl.gov
>> *******************************************************************
>>
> Here is the problem:
>
>      CLINKER /projects/developers/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc-3.7.4/Linux-x86_64-Intel/lib/libpetsc.so.3.7.4
> ld: /cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a(for_init.o): relocation R_X86_64_32 against `.rodata.str1.4' can not be used when making a shared object; recompile with -fPIC
> /cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a: could not read symbols: Bad value
>
> Clearly there is something wrong with the compiler install.
>
> However, can you put a libifcore.so in LIBS instead?
>
>    Matt
>
>> See the attached log files.
>>
>>
>> Chris
>>
>>
>>
>>
>> ------------------------------
>> *From:* Matthew Knepley <knepley at gmail.com>
>> *Sent:* Wednesday, January 04, 2017 1:43 PM
>> *To:* Klaij, Christiaan
>> *Cc:* petsc-users; Satish Balay
>> *Subject:* Re: [petsc-users] problems after glibc upgrade to 2.17-157
>>
>> On Wed, Jan 4, 2017 at 4:32 AM, Klaij, Christiaan <C.Klaij at marin.nl>
>> wrote:
>>
>>> Satish,
>>>
>>> I tried your suggestion:
>>>
>>> --with-clib-autodetect=0 --with-fortranlib-autodetect=0
>>> --with-cxxlib-autodetect=0 LIBS=LIBS=/path_to/libifcore.a
>>>
>>> I guess I don't really need "LIBS= " twice (?) so I've used this line:
>>>
>>> LIBS=/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a
>>>
>>> Unfortunately, this approach also fails (attached log):
>>>
>>
>> Ah, this error is much easier:
>>
>> Executing: mpif90  -o /tmp/petsc-3GfeyZ/config.compilers/conftest  -fPIC -g -O3  /tmp/petsc-3GfeyZ/config.compilers/conftest.o /tmp/petsc-3GfeyZ/config.compilers/cxxobj.o  /tmp/petsc-3GfeyZ/config.compilers/confc.o   -ldl /cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a
>> Possible ERROR while running linker: exit code 256
>> stderr:
>> /tmp/petsc-3GfeyZ/config.compilers/cxxobj.o:(.gnu.linkonce.d.DW.ref.__gxx_personality_v0+0x0): undefined reference to `__gxx_personality_v0'
>>
>> Intel was lazy writing its C++ compiler, so it reuses some of g++. If you
>> want to use C++, you will need to add -lstdc++ to your LIBS variable (I
>> think). Otherwise, please turn it off using --with-cxx=0.
>>
>>   Thanks,
>>
>>      Matt
>>
>>
>>> *******************************************************************************
>>>          UNABLE to CONFIGURE with GIVEN OPTIONS    (see configure.log for details):
>>> -------------------------------------------------------------------------------
>>> Fortran could not successfully link C++ objects
>>> *******************************************************************************
>>>
>>> There are multiple libifcore.a in the intel compiler lib: one in
>>> intel64_lin and one in intel64_lin_mic. Tried both, got same error.
>>>
>>> Chris
>>>
>>>
>>>
>>>
>>> ________________________________________
>>> From: Satish Balay <balay at mcs.anl.gov>
>>> Sent: Tuesday, January 03, 2017 4:37 PM
>>> To: Klaij, Christiaan
>>> Cc: petsc-users at mcs.anl.gov
>>> Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157
>>>
>>> Do you have similar issues with gnu compilers?
>>>
>>> It must be some incompatibility with intel compilers with this glibc
>>> change.
>>>
>>> >>>>>>>>>
>>>           compilers: Check that C libraries can be used from Fortran
>>>                       Pushing language FC
>>>                       Popping language FC
>>>                       Pushing language FC
>>>                       Popping language FC
>>>                           Pushing language FC
>>>                           Popping language FC
>>> **** Configure header /tmp/petsc-rOjdnN/confdefs.h ****
>>> <<<<<<<<<<
>>>
>>> There is a bug in configure [Matt?] that eats away some of the log - so
>>> I don't see the exact error you are getting.
>>>
>>> If standalone mpicc/mpif90 etc. work - then you can try the following
>>> additional options:
>>>
>>> --with-clib-autodetect=0 --with-fortranlib-autodetect=0
>>> --with-cxxlib-autodetect=0 LIBS=LIBS=/path_to/libifcore.a
>>>
>>> [replace "path_to" with the correct path to the ifort libifcore.a
>>> library]
>>>
>>> Note: I have a RHEL7 box with this glibc - and I don't see this issue.
>>>
>>> >>>>
>>> -bash-4.2$ cat /etc/redhat-release
>>> Red Hat Enterprise Linux Server release 7.3 (Maipo)
>>> -bash-4.2$ rpm -q glibc
>>> glibc-2.17-157.el7_3.1.x86_64
>>> glibc-2.17-157.el7_3.1.i686
>>> -bash-4.2$ mpiicc --version
>>> icc (ICC) 17.0.0 20160721
>>> Copyright (C) 1985-2016 Intel Corporation.  All rights reserved.
>>>
>>> -bash-4.2$
>>> <<<<
>>>
>>> Satish
>>>
>>> On Tue, 3 Jan 2017, Klaij, Christiaan wrote:
>>>
>>> >
>>> > I've been using petsc-3.7.4 with intel mpi and compilers,
>>> > superlu_dist, metis and parmetis on a cluster running
>>> > SL7. Everything was working fine until SL7 got an update where
>>> > glibc was upgraded from 2.17-106 to 2.17-157.
>>> >
>>> > This update seemed to have broken (at least) parmetis: the
>>> > standalone binary gpmetis started to give a segmentation
>>> > fault. The core dump shows this:
>>> >
>>> > Core was generated by `gpmetis'.
>>> > Program terminated with signal 11, Segmentation fault.
>>> > #0  0x00002aaaac6b865e in memmove () from /lib64/libc.so.6
>>> >
>>> > That's when I decided to recompile, but to my surprise I cannot
>>> > even get past the configure stage (log attached)!
>>> >
>>> > *******************************************************************************
>>> >                     UNABLE to EXECUTE BINARIES for ./configure
>>> > -------------------------------------------------------------------------------
>>> > Cannot run executables created with FC. If this machine uses a batch system to submit jobs you will need to configure using ./configure with the additional option  --with-batch.
>>> >  Otherwise there is problem with the compilers. Can you compile and run code with your compiler 'mpif90'?
>>> > See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf
>>> > *******************************************************************************
>>> >
>>> > Note the following:
>>> >
>>> > 1) Configure was done with the exact same options that worked
>>> > fine before the update of SL7.
>>> >
>>> > 2) The intel mpi and compilers are exactly the same as before the
>>> > update of SL7.
>>> >
>>> > 3) The cluster does not require a batch system to run code.
>>> >
>>> > 4) I can compile and run code with mpif90 on this cluster.
>>> >
>>> > 5) The problem also occurs on a workstation running SL7.
>>> >
>>> > Any clues on how to proceed?
>>> > Chris
>>> >
>>> >
>>> >
>>> >
>>>
>>>
>>
>>
>>
>>
>
>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

