Dear support team,

I am trying to install petsc4py on top of my miniconda3 Python environment.

I recently filed an issue on Bitbucket (https://bitbucket.org/petsc/petsc4py/issues/50/cannot-pass-runtestpy#comment-30006281) about the trouble I was having installing the development version of petsc4py. I have since marked that issue as invalid, because it no longer corresponds to the trouble I am having.

Since I suspect the problem comes from my (possibly broken) environment, I reckoned it was best to ask for support by e-mail.

I had previously succeeded in installing the development versions of both PETSc and petsc4py on my office computer running Mac OS X. But when I try to do the same on my laptop (same OS, exactly the same commands), I keep failing.

PETSc itself seems to compile without trouble. The configuration step produces the following output:

jrek@MacJerem:petsc$ python2.7 ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --with-scalar-type=complex --download-scalapack --download-mumps
===============================================================================
            Configuring PETSc to compile on your system
===============================================================================
Compilers:
  C Compiler:       mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -Qunused-arguments -fvisibility=hidden -g3
  C++ Compiler:     mpicxx -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g
  Fortran Compiler: mpif90 -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g
Linkers:
  Shared linker:  mpicc -dynamiclib -single_module -undefined dynamic_lookup -multiply_defined suppress -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -Qunused-arguments -fvisibility=hidden -g3
  Dynamic linker: mpicc -dynamiclib -single_module -undefined dynamic_lookup -multiply_defined suppress -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -Qunused-arguments -fvisibility=hidden -g3
make:
BLAS/LAPACK: -llapack -lblas
MPI:
  Includes: -I/usr/local/Cellar/open-mpi/1.10.3/include
cmake:
  Arch:
hwloc:
  Includes: -I/usr/local/include
  Library:  -Wl,-rpath,/usr/local/lib -L/usr/local/lib -lhwloc
scalapack:
  Library:  -Wl,-rpath,/Users/jrek/softs/petsc/arch-darwin-c-opt/lib -L/Users/jrek/softs/petsc/arch-darwin-c-opt/lib -lscalapack
MUMPS:
  Includes: -I/Users/jrek/softs/petsc/arch-darwin-c-opt/include
  Library:  -Wl,-rpath,/Users/jrek/softs/petsc/arch-darwin-c-opt/lib -L/Users/jrek/softs/petsc/arch-darwin-c-opt/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord
X:
  Includes: -I/opt/X11/include
  Library:  -Wl,-rpath,/opt/X11/lib -L/opt/X11/lib -lX11
pthread:
sowing:
ssl:
  Includes: -I/usr/local/opt/openssl/include
  Library:  -Wl,-rpath,/usr/local/opt/openssl/lib -L/usr/local/opt/openssl/lib -lssl -lcrypto
PETSc:
  PETSC_ARCH: arch-darwin-c-opt
  PETSC_DIR: /Users/jrek/softs/petsc
  Scalar type: complex
  Precision: double
  Clanguage: C
  Integer size: 32
  shared libraries: enabled
  Memory alignment: 16
xxx=========================================================================xxx
 Configure stage complete. Now build PETSc libraries with (gnumake build):
   make PETSC_DIR=/Users/jrek/softs/petsc PETSC_ARCH=arch-darwin-c-opt all
xxx=========================================================================xxx
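For completeness, the build and test commands I run next are the ones configure suggests; I am quoting them from memory, so the test target below is my assumption rather than a copy of my terminal:

  # build the libraries, exactly as suggested at the end of configure
  make PETSC_DIR=/Users/jrek/softs/petsc PETSC_ARCH=arch-darwin-c-opt all
  # run the basic checks afterwards (target name from memory)
  make PETSC_DIR=/Users/jrek/softs/petsc PETSC_ARCH=arch-darwin-c-opt test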
Then everything works smoothly through building and testing, until I get to the point of actually installing petsc4py. There, the error I get is this one:

jrek@MacJerem:petsc4py$ python setup.py install
running install
running build
running build_src
running build_py
running build_ext
PETSC_DIR:  /Users/jrek/softs/petsc
PETSC_ARCH: arch-darwin-c-opt
version:      3.7.3 development
integer-size: 32-bit
scalar-type:  complex
precision:    double
language:     CONLY
compiler:     mpicc
linker:       mpicc
building 'PETSc' extension
mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -Qunused-arguments -g3 -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -I/Users/jrek/miniconda3/include -arch x86_64 -DPETSC_DIR=/Users/jrek/softs/petsc -I/usr/local/Cellar/open-mpi/1.10.3/include -I/usr/local/opt/openssl/include -I/opt/X11/include -I/usr/local/include -I/Users/jrek/softs/petsc/arch-darwin-c-opt/include -I/Users/jrek/softs/petsc/include -Isrc/include -I/Users/jrek/miniconda3/lib/python3.5/site-packages/numpy/core/include -I/Users/jrek/miniconda3/include/python3.5m -c src/PETSc.c -o build/temp.macosx-10.6-x86_64-3.5/arch-darwin-c-opt/src/PETSc.o
In file included from src/PETSc.c:3:
In file included from src/petsc4py.PETSc.c:271:
In file included from /usr/local/include/petsc.h:5:
In file included from /usr/local/include/petscbag.h:4:
/usr/local/include/petscsys.h:152:6: error: "PETSc was configured with one OpenMPI mpi.h version but now appears to be compiling using a different OpenMPI mpi.h version"
# error "PETSc was configured with one OpenMPI mpi.h version but now appears to be compiling using a different OpenMPI mpi.h version"

This is followed by other errors which, I suspect, are consequences of this first one.

I guess I must have conflicting OpenMPI installs, but I do not understand why this compilation fails when PETSc itself compiled just fine a moment ago.

How can I be sure which mpi.h I am actually using, and would specifying one explicitly solve my problem?
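In case it is useful, here is roughly what I was planning to run to see which MPI wrapper and headers are being picked up. This is only my guess at a sensible check, assuming the Homebrew Open MPI under /usr/local and whatever conda may have installed are the only candidates:

  # every mpicc visible on my PATH
  which -a mpicc
  # what the Open MPI wrapper actually expands to, and the include dirs it injects
  mpicc --showme
  mpicc --showme:incdirs
  # look for stray mpi.h copies outside the Homebrew keg (these paths are just the ones I suspect)
  ls -l /usr/local/include/mpi.h /Users/jrek/miniconda3/include/mpi.h

But I am not sure this is the right way to go about it.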
</div><div class="">How can be sure of the “mpi.h” that I am using and would specify one solve my problem ? </div><div class=""><br class=""></div><div class="">Any help would be greatly appreciated as this is slowly driving me insane :)</div><div class="">Thanks very much in advance :D</div><div class=""><br class=""></div><div class="">Cheers,</div><div class="">Jerem </div><div class=""><br class=""></div><div class=""><br class=""></div><div class=""><br class=""></div><div class=""> </div></div></body></html>