[petsc-users] problems after glibc upgrade to 2.17-157

Matthew Knepley knepley at gmail.com
Tue Jan 3 09:50:53 CST 2017


On Tue, Jan 3, 2017 at 9:46 AM, Klaij, Christiaan <C.Klaij at marin.nl> wrote:

> I downloaded the tarball on October 24th:
>
> $ ls -lh petsc-lite-3.7.4.tar.gz
> -rw-r--r-- 1 cklaij domain users 8.4M Oct 24 11:07 petsc-lite-3.7.4.tar.gz
>
> (no direct internet access on cluster)
>
That tarball is missing the fix. You can apply those few lines from this commit:

    https://bitbucket.org/petsc/petsc/commits/32cc76960ddbb48660f8e7c667e293c0ccd0e7d7

Or get the new tarball when it is regenerated tonight, since Satish has just
added the fix to the maint branch.
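
If you cannot fetch a fresh tarball on the cluster, one option is to save that
commit as a patch elsewhere, copy it over, and apply it in the unpacked source
tree. A minimal sketch (the patch file name below is just an example):

    # on a machine with internet access: save the commit as a patch, e.g. via
    # the Bitbucket web interface or 'git format-patch -1 32cc7696' in a clone
    $ scp 32cc7696.patch cluster:

    # on the cluster, inside the unpacked petsc-3.7.4 source directory
    $ cd petsc-3.7.4
    $ patch -p1 < ~/32cc7696.patch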

  Thanks,

     Matt


> dr. ir. Christiaan Klaij | CFD Researcher | Research & Development
> MARIN | T +31 317 49 33 44 | C.Klaij at marin.nl |
> www.marin.nl
>
> MARIN news: Modelling natural transition on hydrofoils for application in
> underwater gliders
> <http://www.marin.nl/web/News/News-items/Modelling-natural-transition-on-hydrofoils-for-application-in-underwater-gliders-1.htm>
>
> ------------------------------
> From: Matthew Knepley <knepley at gmail.com>
> Sent: Tuesday, January 03, 2017 4:36 PM
> To: Klaij, Christiaan
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157
>
> On Tue, Jan 3, 2017 at 9:04 AM, Klaij, Christiaan <C.Klaij at marin.nl>
> wrote:
>
>>
>> I've been using petsc-3.7.4 with Intel MPI and compilers,
>> superlu_dist, metis and parmetis on a cluster running
>> SL7. Everything was working fine until an SL7 update upgraded
>> glibc from 2.17-106 to 2.17-157.
>>
>
> I cannot see the error in your log. We previously fixed a bug with this
> error reporting:
>
>   https://bitbucket.org/petsc/petsc/commits/32cc76960ddbb48660f8e7c667e293c0ccd0e7d7
>
> in August. Is it possible that your PETSc is older than this? Could you
> apply that patch, or run configure with 'master'?
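>
> For the master route, something along these lines should work (just a
> sketch; substitute your usual configure options):
>
>   $ git clone https://bitbucket.org/petsc/petsc petsc-master
>   $ cd petsc-master
>   $ git checkout master
>   $ ./configure <your usual options>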
>
> My guess is this is a dynamic library path problem, as it always is after
> upgrades.
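>
> A quick way to check that (just a sketch; the test file name is arbitrary)
> is to build a trivial Fortran program with mpif90 and look at which runtime
> libraries it picks up after the upgrade:
>
>   $ printf 'program t\n  print *, "ok"\nend program t\n' > t.f90
>   $ mpif90 -o t t.f90
>   $ ./t                    # does a freshly built binary run at all?
>   $ ldd ./t                # which libc and Intel runtime libraries resolve?
>   $ echo $LD_LIBRARY_PATH  # is the Intel runtime (libimf etc.) still on the path?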
>
>   Thanks,
>
>     Matt
>
>
>> This update seems to have broken (at least) parmetis: the standalone
>> binary gpmetis now gives a segmentation fault. The core dump shows
>> this:
>>
>> Core was generated by `gpmetis'.
>> Program terminated with signal 11, Segmentation fault.
>> #0  0x00002aaaac6b865e in memmove () from /lib64/libc.so.6
>>
>> That's when I decided to recompile, but to my surprise I cannot
>> even get past the configure stage (log attached)!
>>
>> *******************************************************************************
>>                     UNABLE to EXECUTE BINARIES for ./configure
>> -------------------------------------------------------------------------------
>> Cannot run executables created with FC. If this machine uses a batch system
>> to submit jobs you will need to configure using ./configure with the
>> additional option --with-batch.
>> Otherwise there is problem with the compilers. Can you compile and run
>> code with your compiler 'mpif90'?
>> See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf
>> *******************************************************************************
>>
>> Note the following:
>>
>> 1) Configure was done with the exact same options that worked
>> fine before the update of SL7.
>>
>> 2) The intel mpi and compilers are exactly the same as before the
>> update of SL7.
>>
>> 3) The cluster does not require a batch system to run code.
>>
>> 4) I can compile and run code with mpif90 on this cluster.
>>
>> 5) The problem also occurs on a workstation running SL7.
>>
>> Any clues on how to proceed?
>> Chris
>>
>>
>> dr. ir. Christiaan Klaij  | CFD Researcher | Research & Development
>> MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl |
>> http://www.marin.nl
>>
>> MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of-uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener