<div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">On Tue, Jan 3, 2017 at 9:04 AM, Klaij, Christiaan <span dir="ltr"><<a href="mailto:C.Klaij@marin.nl" target="_blank">C.Klaij@marin.nl</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><br>
I've been using petsc-3.7.4 with intel mpi and compilers,<br>
superlu_dist, metis and parmetis on a cluster running<br>
SL7. Everything was working fine until SL7 got an update where<br>
glibc was upgraded from 2.17-106 to 2.17-157.<br></blockquote><div><br></div><div>I cannot see the error in your log. We previously fixed a bug with this error reporting:</div><div><br></div><div> <a href="https://bitbucket.org/petsc/petsc/commits/32cc76960ddbb48660f8e7c667e293c0ccd0e7d7">https://bitbucket.org/petsc/petsc/commits/32cc76960ddbb48660f8e7c667e293c0ccd0e7d7</a></div><div><br></div><div>in August. Is it possible that your PETSc is older than this? Could you apply that patch, or</div><div>run the configure with 'master'?</div><div><br></div><div>My guess is this is a dynamic library path problem, as it always is after upgrades.</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
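A quick way to test that guess is to ask the loader directly. A sketch (the
Intel environment script location is only a typical default, adjust to your
install):

  # where does the failing binary resolve its libraries?
  ldd $(which gpmetis)
  # is the Intel runtime still on the loader path?
  echo $LD_LIBRARY_PATH
  # typical way to restore the Intel environment; path is a guess
  source /opt/intel/bin/compilervars.sh intel64

If ldd reports 'not found' for libimf.so and friends, that is exactly the
FAQ entry the configure message below points at.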
  Thanks,

     Matt

> This update seemed to have broken (at least) parmetis: the
> standalone binary gpmetis started to give a segmentation
> fault. The core dump shows this:
>
> Core was generated by `gpmetis'.
> Program terminated with signal 11, Segmentation fault.
> #0  0x00002aaaac6b865e in memmove () from /lib64/libc.so.6
>
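(As an aside: a full backtrace from that core would show which caller hands
memmove a bad pointer, and which libraries were actually mapped in. A sketch,
assuming the core file is available next to the binary:

  gdb $(which gpmetis) core    # core file name/location varies per system
  (gdb) bt                     # backtrace of the crashing thread
  (gdb) info sharedlibrary     # which libc was actually loaded
)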
> That's when I decided to recompile, but to my surprise I cannot
> even get past the configure stage (log attached)!
>
> *******************************************************************************
> UNABLE to EXECUTE BINARIES for ./configure
> -------------------------------------------------------------------------------
> Cannot run executables created with FC. If this machine uses a batch system
> to submit jobs you will need to configure using ./configure with the additional option --with-batch.
> Otherwise there is problem with the compilers. Can you compile and run code with your compiler 'mpif90'?
> See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf
> *******************************************************************************
>
> Note the following:
>
> 1) Configure was done with the exact same options that worked
> fine before the update of SL7.
>
> 2) The intel mpi and compilers are exactly the same as before the
> update of SL7.
>
> 3) The cluster does not require a batch system to run code.
>
> 4) I can compile and run code with mpif90 on this cluster.
>
> 5) The problem also occurs on a workstation running SL7.
>
> Any clues on how to proceed?
> Chris
>
>
> dr. ir. Christiaan Klaij | CFD Researcher | Research & Development
> MARIN | T +31 317 49 33 44 | mailto:C.Klaij@marin.nl | http://www.marin.nl
>
> MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of-uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm
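On your point 4: the test configure runs is essentially "compile a trivial
program with FC and run the result", so it is worth reproducing that by hand
in the same shell you run configure from. A minimal sketch (file name made up):

  # mirror configure's compile-and-run check
  echo 'print *, "hello"; end' > conftest.f90
  mpif90 -o conftest conftest.f90
  ./conftest    # if this fails, 'ldd ./conftest' usually names the culprit

If that works interactively but configure still fails, compare the environment
configure sees with the one your interactive shell has.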

--
What most experimenters take for granted before they begin their experiments is
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener