[petsc-users] petsc installs okay, cannot install petsc4py

Matthew Knepley knepley at gmail.com
Sun Aug 21 10:35:02 CDT 2016


On Sun, Aug 21, 2016 at 10:04 AM, Jérémy REKIER <jrekier at gmail.com> wrote:

>
> Dear support team,
>
> I am trying to install petsc4py on top of my miniconda3 python
> environment.
>
> I recently filed an issue
> <https://bitbucket.org/petsc/petsc4py/issues/50/cannot-pass-runtestpy#comment-30006281> on
> Bitbucket about the trouble I was having installing the development
> version of petsc4py. I have since marked that issue as invalid, as it no
> longer corresponds to the trouble I am having.
>
> Since I think the problem probably comes from my (possibly broken)
> environment, I reckoned it best to ask for support via e-mail.
>
> I had previously succeeded in installing the development versions of both
> PETSc and petsc4py on my office computer running Mac OS X. But when I try
> to do the same on my laptop (running the same OS), with exactly the same
> commands, I keep failing.
>
> Now, PETSc itself seems to compile without trouble. The configuration
> step produces the following output:
>
>
> jrek at MacJerem:petsc$ python2.7 ./configure --with-cc=mpicc
> --with-cxx=mpicxx --with-fc=mpif90 --with-scalar-type=complex
> --download-scalapack --download-mumps
> ===============================================================================
>              Configuring PETSc to compile on your system
>
> ===============================================================================
> Compilers:
>
>
>   C Compiler:         mpicc    -Wall -Wwrite-strings -Wno-strict-aliasing
> -Wno-unknown-pragmas -Qunused-arguments -fvisibility=hidden -g3
>   C++ Compiler:       mpicxx  -Wall -Wwrite-strings -Wno-strict-aliasing
> -Wno-unknown-pragmas -fvisibility=hidden -g
>   Fortran Compiler:   mpif90   -Wall -ffree-line-length-0
> -Wno-unused-dummy-argument -g
> Linkers:
>   Shared linker:   mpicc  -dynamiclib -single_module -undefined
> dynamic_lookup -multiply_defined suppress    -Wall -Wwrite-strings
> -Wno-strict-aliasing -Wno-unknown-pragmas -Qunused-arguments
> -fvisibility=hidden -g3
>   Dynamic linker:   mpicc  -dynamiclib -single_module -undefined
> dynamic_lookup -multiply_defined suppress    -Wall -Wwrite-strings
> -Wno-strict-aliasing -Wno-unknown-pragmas -Qunused-arguments
> -fvisibility=hidden -g3
> make:
> BLAS/LAPACK: -llapack -lblas
> MPI:
>   Includes: -I/usr/local/Cellar/open-mpi/1.10.3/include
> cmake:
>   Arch:
> hwloc:
>   Includes: -I/usr/local/include
>   Library:  -Wl,-rpath,/usr/local/lib -L/usr/local/lib -lhwloc
> scalapack:
>   Library:  -Wl,-rpath,/Users/jrek/softs/petsc/arch-darwin-c-opt/lib
> -L/Users/jrek/softs/petsc/arch-darwin-c-opt/lib -lscalapack
> MUMPS:
>   Includes: -I/Users/jrek/softs/petsc/arch-darwin-c-opt/include
>   Library:  -Wl,-rpath,/Users/jrek/softs/petsc/arch-darwin-c-opt/lib
> -L/Users/jrek/softs/petsc/arch-darwin-c-opt/lib -lcmumps -ldmumps
> -lsmumps -lzmumps -lmumps_common -lpord
> X:
>   Includes: -I/opt/X11/include
>   Library:  -Wl,-rpath,/opt/X11/lib -L/opt/X11/lib -lX11
> pthread:
> sowing:
> ssl:
>   Includes: -I/usr/local/opt/openssl/include
>   Library:  -Wl,-rpath,/usr/local/opt/openssl/lib
> -L/usr/local/opt/openssl/lib -lssl -lcrypto
> PETSc:
>   PETSC_ARCH: arch-darwin-c-opt
>   PETSC_DIR: /Users/jrek/softs/petsc
>   Scalar type: complex
>   Precision: double
>   Clanguage: C
>   Integer size: 32
>   shared libraries: enabled
>   Memory alignment: 16
> xxx=========================================================================xxx
>  Configure stage complete. Now build PETSc libraries with (gnumake build):
>    make PETSC_DIR=/Users/jrek/softs/petsc PETSC_ARCH=arch-darwin-c-opt all
> xxx=========================================================================xxx
>
> Building and testing then go smoothly, right up until I actually install
> petsc4py, at which point I hit this error:
>
> jrek at MacJerem:petsc4py$ python setup.py install
> running install
> running build
> running build_src
> running build_py
> running build_ext
> PETSC_DIR:    /Users/jrek/softs/petsc
> PETSC_ARCH:   arch-darwin-c-opt
> version:      3.7.3 development
> integer-size: 32-bit
> scalar-type:  complex
> precision:    double
> language:     CONLY
> compiler:     mpicc
> linker:       mpicc
> building 'PETSc' extension
> mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas
> -Qunused-arguments -g3 -Wno-unused-result -Wsign-compare -Wunreachable-code
> -DNDEBUG -g -fwrapv -O3 -Wall -I/Users/jrek/miniconda3/include -arch
> x86_64 -DPETSC_DIR=/Users/jrek/softs/petsc -I/usr/local/Cellar/open-mpi/1.10.3/include
> -I/usr/local/opt/openssl/include -I/opt/X11/include -I/usr/local/include
> -I/Users/jrek/softs/petsc/arch-darwin-c-opt/include
> -I/Users/jrek/softs/petsc/include -Isrc/include
> -I/Users/jrek/miniconda3/lib/python3.5/site-packages/numpy/core/include
> -I/Users/jrek/miniconda3/include/python3.5m -c src/PETSc.c -o
> build/temp.macosx-10.6-x86_64-3.5/arch-darwin-c-opt/src/PETSc.o
> In file included from src/PETSc.c:3:
> In file included from src/petsc4py.PETSc.c:271:
> In file included from /usr/local/include/petsc.h:5:
> In file included from /usr/local/include/petscbag.h:4:
> /usr/local/include/petscsys.h:152:6: error: "PETSc was configured with
> one OpenMPI mpi.h version but now appears to be compiling using a different
> OpenMPI mpi.h version"
> #    error "PETSc was configured with one OpenMPI mpi.h version but now
> appears to be compiling using a different OpenMPI mpi.h version"
>
>
> Plus other errors which I suspect are all consequences of this one.
>
> I guess I must have conflicting OpenMPI installs, but I do not understand
> why PETSc cannot compile just as it did a moment ago. How can I be sure
> which “mpi.h” I am using, and would specifying one solve my problem?
>

I believe this can happen because Python's build machinery is not as
careful about keeping default include directories from sneaking in, since
its authors believe you should always be using them.
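
Note that in the failing compile line above, -I/Users/jrek/miniconda3/include
comes before the Open MPI include directory, so a stray mpi.h under
miniconda3/include would win. An illustrative way to see which header the
preprocessor actually resolves (the -I path here is just the one taken from
your compile line) is:

  # Print the first mpi.h the preprocessor pulls in, mimicking the
  # include order of the failing petsc4py compile:
  echo '#include <mpi.h>' | \
    mpicc -E -x c -I/Users/jrek/miniconda3/include - | grep 'mpi\.h' | head -1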

So the MPI that PETSc used was

  mpicc in your path
  Includes: -I/usr/local/Cellar/open-mpi/1.10.3/include

but I bet that you have

  /usr/include/mpi.h

as well. I guess it's also possible that you had a slightly different PATH
when you installed petsc4py than when you installed PETSc, so that two
different 'mpicc' compilers were picked up.
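
To check, you can compare the wrapper picked up in the shell where petsc4py
fails against the one PETSc's configure recorded. A sketch (the --showme
flag is specific to Open MPI's wrappers, and petscvariables is where a
PETSc 3.7 build normally records its settings):

  # In the exact shell where 'python setup.py install' fails:
  which mpicc
  mpicc --showme    # Open MPI wrappers print the underlying compile line
  # What PETSc's configure saw:
  grep '^CC ' /Users/jrek/softs/petsc/arch-darwin-c-opt/lib/petsc/conf/petscvariables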

I would say that having multiple installs of MPI is always the road to
disaster.
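
If you want to see just how many you have, something like this (the paths
are a guess at the usual suspects on a Homebrew + conda machine) will list
them:

  # Every mpi.h in the likely search prefixes; more than one hit outside
  # the Homebrew Cellar usually spells trouble:
  find /usr/include /usr/local /opt /Users/jrek/miniconda3 \
       -name mpi.h 2>/dev/null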

   Matt


> Any help would be greatly appreciated as this is slowly driving me insane
> :)
> Thanks very much in advance :D
>
> Cheers,
> Jerem
>
-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener