<div dir="ltr">Thank you! I tried the suggested invocation, but with a different result:<div><br></div><div>./configure --with-clanguage=cxx --with-mpi-dir=/usr/lib64/openmpi/bin --with-c2html=0<br></div><div><div>===============================================================================</div><div> Configuring PETSc to compile on your system </div><div>===============================================================================</div><div>TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) *******************************************************************************</div><div> UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):</div><div>-------------------------------------------------------------------------------</div><div>--with-mpi-dir=/usr/lib64/openmpi/bin did not work</div><div>*******************************************************************************</div></div><div><br></div><div>The configure.log ends by suggesting that I consult configure.log itself, then generates the error message above; tracing back from the end of configure.log, the first error message is:</div><div><br></div><div><div> Pushing language CXX</div><div> Popping language CXX</div><div>sh: g++ -o /tmp/petsc-HWuXr9/config.libraries/conftest -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -g /tmp/petsc-HWuXr9/config.libraries/conftest.o /usr/lib64/openmpi/bin/lib64/i386/msmpi.lib -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -L/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -ldl -lgcc_s -ldl </div><div>Executing: g++ -o /tmp/petsc-HWuXr9/config.libraries/conftest -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -g /tmp/petsc-HWuXr9/config.libraries/conftest.o /usr/lib64/openmpi/bin/lib64/i386/msmpi.lib -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -L/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -ldl -lgcc_s -ldl </div><div>sh: </div><div>Possible ERROR while running linker: g++: error: 
/usr/lib64/openmpi/bin/lib64/i386/msmpi.lib: No such file or directory</div><div> output: ret = 256</div><div>error message = {g++: error: /usr/lib64/openmpi/bin/lib64/i386/msmpi.lib: No such file or directory</div><div>}</div></div><div><br></div><div>Indeed, there is no /usr/lib64/openmpi/bin/lib64, but I'm not clear whether this is merely a 'possible' error or the actual failure. </div><div><br></div><div>Perhaps this all boils down to an incomplete development environment. </div><div><br></div><div><br></div></div><div class="gmail_extra"><br><div class="gmail_quote">On Fri, Nov 18, 2016 at 4:03 PM, Satish Balay <span dir="ltr"><<a href="mailto:balay@mcs.anl.gov" target="_blank">balay@mcs.anl.gov</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">I just tried a build of petsc-3.4 on centos-7.2 - and that went through fine.<br>
<br>
./configure --with-clanguage=cxx --with-mpi-dir=/usr/lib64/openmpi/bin --with-c2html=0<br>
<span class="HOEnZb"><font color="#888888"><br>
Satish<br>
</font></span><div class="HOEnZb"><div class="h5"><br>
On Fri, 18 Nov 2016, Satish Balay wrote:<br>
<br>
> The stack says the crash is in OpenMPI's orterun [aka mpiexec].<br>
><br>
> Perhaps it's broken?<br>
><br>
> you can run PETSc examples without mpiexec as:<br>
><br>
> cd src/ksp/ksp/examples/tutorials<br>
> make ex2<br>
> ./ex2<br>
><br>
> I don't understand the tweak. Usually 'compat' packages are used by<br>
> precompiled binaries - that were compiled with old compilers. And<br>
> PETSc shouldn't need it.<br>
><br>
> Also PETSc can use system blas/lapack - instead of<br>
> --download-f2cblaslapack=1 [but that shouldn't cause issues]<br>
><br>
> Also gcc, openmpi are in the base CentOS 7.2 repo - so I'm not sure I<br>
> understand the reference to EPEL.<br>
><br>
> Satish<br>
><br>
> On Fri, 18 Nov 2016, Park, Joseph wrote:<br>
><br>
> > I'm having difficulty configuring petsc 3.4.5 on a CentOS 7.2 machine. I'm<br>
> > forced to use petsc 3.4.5 until the application (C++) can be upgraded. We<br>
> > are restricted to EPEL packages for gcc, OpenMPI and all libraries.<br>
> ><br>
> > The only 'tweak' is that EPEL package compat-openmpi16 is used instead<br>
> > of openmpi, as the latter seems incompatible with petsc 3.4.2.<br>
> ><br>
> > Petsc configures and builds fine via:<br>
> ><br>
> > ./configure --download-f2cblaslapack=1<br>
> > --with-mpi-dir=/usr/lib64/compat-openmpi16 --with-debugging=1<br>
> > --with-clanguage=cxx --with-fc=0<br>
> ><br>
> > make PETSC_DIR=/opt/sfwmd_rsm/apps/petsc-3.4.5<br>
> > PETSC_ARCH=arch-linux2-cxx-debug all<br>
> ><br>
> > However, at test time:<br>
> ><br>
> > make PETSC_DIR=/opt/sfwmd_rsm/apps/petsc-3.4.5<br>
> > PETSC_ARCH=arch-linux2-cxx-debug test<br>
> ><br>
> > Running test examples to verify correct installation<br>
> > Using PETSC_DIR=/opt/sfwmd_rsm/apps/petsc-3.4.5 and<br>
> > PETSC_ARCH=arch-linux2-cxx-debug<br>
> > /usr/bin/sh: line 20: 24136 Segmentation fault<br>
> > /usr/lib64/compat-openmpi16/bin/mpiexec -n 1 ./ex19 -da_refine 3 -pc_type<br>
> > mg -ksp_type fgmres > ex19_1.tmp 2>&1<br>
> > Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1 MPI<br>
> > process<br>
> > See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" rel="noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/documentation/faq.html</a><br>
> > [snailkite:24136] *** Process received signal ***<br>
> > [snailkite:24136] Signal: Segmentation fault (11)<br>
> > [snailkite:24136] Signal code: Address not mapped (1)<br>
> > [snailkite:24136] Failing at address: (nil)<br>
> > [snailkite:24136] [ 0] /lib64/libpthread.so.0(+<wbr>0xf100) [0x7ff020de8100]<br>
> > [snailkite:24136] [ 1] /lib64/libc.so.6(+0x85346) [0x7ff020a9c346]<br>
> > [snailkite:24136] [ 2] /lib64/libhwloc.so.5(+0x7ccb) [0x7ff021206ccb]<br>
> > [snailkite:24136] [ 3]<br>
> > /lib64/libhwloc.so.5(hwloc__insert_object_by_cpuset+0xa7) [0x7ff021206e87]<br>
> > [snailkite:24136] [ 4] /lib64/libhwloc.so.5(+0x2196e) [0x7ff02122096e]<br>
> > [snailkite:24136] [ 5] /lib64/libhwloc.so.5(+0x22828) [0x7ff021221828]<br>
> > [snailkite:24136] [ 6] /lib64/libhwloc.so.5(+0x228a3) [0x7ff0212218a3]<br>
> > [snailkite:24136] [ 7] /lib64/libhwloc.so.5(hwloc_topology_load+0x13b)<br>
> > [0x7ff0212098bb]<br>
> > [snailkite:24136] [ 8]<br>
> > /usr/lib64/compat-openmpi16/lib/libopen-rte.so.4(orte_odls_base_open+0x7ab)<br>
> > [0x7ff021fa6dbb]<br>
> > [snailkite:24136] [ 9]<br>
> > /usr/lib64/compat-openmpi16/lib/openmpi/mca_ess_hnp.so(+0x2e54)<br>
> > [0x7ff01f233e54]<br>
> > [snailkite:24136] [10]<br>
> > /usr/lib64/compat-openmpi16/lib/libopen-rte.so.4(orte_init+0x193)<br>
> > [0x7ff021f7dd83]<br>
> > [snailkite:24136] [11] /usr/lib64/compat-openmpi16/bin/mpiexec() [0x403dd5]<br>
> > [snailkite:24136] [12] /usr/lib64/compat-openmpi16/bin/mpiexec() [0x403430]<br>
> > [snailkite:24136] [13] /lib64/libc.so.6(__libc_start_main+0xf5)<br>
> > [0x7ff020a38b15]<br>
> > [snailkite:24136] [14] /usr/lib64/compat-openmpi16/bin/mpiexec() [0x403349]<br>
> > [snailkite:24136] *** End of error message ***<br>
> > /usr/bin/sh: line 20: 24139 Segmentation fault<br>
> ><br>
> > Under gdb:<br>
> ><br>
> > Program received signal SIGSEGV, Segmentation fault.<br>
> > 0x00007ffff6647346 in __strcmp_sse2 () from /lib64/libc.so.6<br>
> > (gdb)<br>
> > (gdb) where<br>
> > #0 0x00007ffff6647346 in __strcmp_sse2 () from /lib64/libc.so.6<br>
> > #1 0x00007ffff6db1ccb in hwloc_obj_cmp () from /lib64/libhwloc.so.5<br>
> > #2 0x00007ffff6db1e87 in hwloc__insert_object_by_cpuset () from<br>
> > /lib64/libhwloc.so.5<br>
> > #3 0x00007ffff6dcb96e in summarize () from /lib64/libhwloc.so.5<br>
> > #4 0x00007ffff6dcc828 in hwloc_look_x86 () from /lib64/libhwloc.so.5<br>
> > #5 0x00007ffff6dcc8a3 in hwloc_x86_discover () from /lib64/libhwloc.so.5<br>
> > #6 0x00007ffff6db48bb in hwloc_topology_load () from /lib64/libhwloc.so.5<br>
> > #7 0x00007ffff7b51dbb in orte_odls_base_open ()<br>
> > from /usr/lib64/compat-openmpi16/lib/libopen-rte.so.4<br>
> > #8 0x00007ffff4ddee54 in rte_init () from<br>
> > /usr/lib64/compat-openmpi16/lib/openmpi/mca_ess_hnp.so<br>
> > #9 0x00007ffff7b28d83 in orte_init () from<br>
> > /usr/lib64/compat-openmpi16/lib/libopen-rte.so.4<br>
> > #10 0x0000000000403dd5 in orterun ()<br>
> > #11 0x0000000000403430 in main ()<br>
> ><br>
> > Any suggestions are most welcome!<br>
> ><br>
><br>
><br>
<br>
<br>
</div></div></blockquote></div><br></div>
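The linker error quoted at the top of the thread suggests that the value passed to --with-mpi-dir points at OpenMPI's bin/ directory rather than at the installation prefix that configure expects, i.e. the directory that itself contains bin/, include/ and lib/ (or lib64/). That layout is an assumption based on common MPI packaging, and the helper script below is a hypothetical illustration, not part of PETSc; it checks a candidate directory against a throwaway tree:

```shell
#!/bin/sh
# check_mpi_dir: hypothetical helper to sanity-check a --with-mpi-dir value.
# Assumes the conventional MPI prefix layout: bin/, include/mpi.h, lib/ or lib64/.
check_mpi_dir() {
    dir="$1"
    [ -d "$dir" ]               || { echo "$dir: not a directory"; return 1; }
    [ -f "$dir/include/mpi.h" ] || { echo "$dir: no include/mpi.h, probably not an MPI prefix"; return 1; }
    [ -d "$dir/lib" ] || [ -d "$dir/lib64" ] || { echo "$dir: no lib/ or lib64/"; return 1; }
    echo "$dir: looks like an MPI installation prefix"
}

# Demo against a throwaway tree standing in for an OpenMPI install:
prefix=$(mktemp -d)
mkdir -p "$prefix/bin" "$prefix/include" "$prefix/lib64"
touch "$prefix/include/mpi.h"

bad=$(check_mpi_dir "$prefix/bin")   # bin/ is *below* the prefix, so this fails
good=$(check_mpi_dir "$prefix")      # the prefix itself passes
echo "$bad"
echo "$good"
rm -rf "$prefix"
```

If the check fails for the directory being handed to --with-mpi-dir, the usual next step is to try the parent directory, or to point configure at the MPI compiler wrappers instead (e.g. --with-cc=mpicc --with-cxx=mpicxx).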