[petsc-users] How to compile PETSC4py correctly with MPI ?

Satish Balay balay at mcs.anl.gov
Mon Sep 10 09:12:03 CDT 2018


1. You might want to locate and check configure.log for this build of PETSc
to see exactly how it was built [and which MPI got used] - see the example
commands after item 3.

2. And if it is not built as you desired - use the env variable
PETSC_CONFIGURE_OPTIONS to pass the additional options needed by
PETSc configure - and rebuild it as desired.

3. I'm not sure how pip chooses the MPI used by mpi4py - Lisandro might
have to help with this.
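
For example - a rough sketch of the kind of check/rebuild I mean [the
paths, the petsc.get_petsc_dir() helper and the mpicc/mpicxx/mpif90
names below are assumptions about a typical pip install - adjust to
your setup]:

# 1. locate configure.log - an installed PETSc usually keeps a copy
#    under <petsc-dir>/lib/petsc/conf/
python3 -c "import petsc; print(petsc.get_petsc_dir())"
grep -i -e '--with-cc' -e '--with-mpi' -e '--download-mpich' -e '--download-openmpi' <petsc-dir>/lib/petsc/conf/configure.log

# 2. rebuild the pip packages against the cluster MPI compiler wrappers
export PETSC_CONFIGURE_OPTIONS="--with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90"
pip3 install --no-cache-dir --force-reinstall petsc petsc4py

# 3. check which MPI mpi4py itself was built with
python3 -c "import mpi4py; print(mpi4py.get_config())"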

There is a non-Python way to build this [which Lisandro doesn't recommend]:
use PETSc configure to build mpi4py and petsc4py, i.e.:

./configure --download-mpi4py --download-petsc4py

[but then you would have to install slepc/slepc4py manually after that]
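
[In that case the Python modules usually end up under the PETSc arch tree,
so - roughly, the exact location depends on your PETSC_DIR/PETSC_ARCH -
you would run the script with something like:

export PYTHONPATH=$PETSC_DIR/$PETSC_ARCH/lib:$PYTHONPATH
mpirun -n 2 python3 test3.py
]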

Satish

On Mon, 10 Sep 2018, Jan Grießer wrote:

> Hey,
> I started trying again to solve an eigenvalue problem with petsc4py and
> slepc4py. I followed the instructions on the website and installed petsc and
> petsc4py via pip3. mpi4py and the corresponding MPI version are already
> compiled on our cluster.
> Unfortunately, when I start the following script with:
> mpirun -n 2 python3 test3.py
> 
> import time
> import numpy as np
> import sys, petsc4py
> petsc4py.init(sys.argv)
> from petsc4py import PETSc
> 
> from mpi4py import MPI
> print(MPI.COMM_WORLD.Get_size())
> 
> 
> # For print
> Print = PETSc.Sys.Print
> 
> def construct_operator(m, n):
>     """
>     Standard symmetric eigenproblem corresponding to the
>     Laplacian operator in 2 dimensions.
>     """
>     # Create matrix for 2D Laplacian operator
>     A = PETSc.Mat().create()
>     A.setSizes([m*n, m*n])
>     A.setFromOptions()
>     A.setUp()
>     # Fill matrix
>     hx = 1.0/(m-1) # x grid spacing
>     hy = 1.0/(n-1) # y grid spacing
>     diagv = 2.0*hy/hx + 2.0*hx/hy
>     offdx = -1.0*hy/hx
>     offdy = -1.0*hx/hy
>     Istart, Iend = A.getOwnershipRange()
>     print("Istart: ", Istart)
>     print("Iend: ", Iend)
>     for I in range(Istart, Iend) :
>         A[I,I] = diagv
>         i = I//n    # map row number to
>         j = I - i*n # grid coordinates
>         if i> 0  : J = I-n; A[I,J] = offdx
>         if i< m-1: J = I+n; A[I,J] = offdx
>         if j> 0  : J = I-1; A[I,J] = offdy
>         if j< n-1: J = I+1; A[I,J] = offdy
>     A.assemble()
>     return A
> 
> def main():
>     opts = PETSc.Options()
>     N = opts.getInt('N', 96)
>     m = opts.getInt('m', N)
>     n = opts.getInt('n', m)
>     Print("Symmetric Eigenproblem (sparse matrix), "
>           "N=%d (%dx%d grid)" % (m*n, m, n))
>     A = construct_operator(m,n)
> 
> if __name__ == '__main__':
>     main()
> 
> it returns the following output:
> 
> Symmetric Eigenproblem (sparse matrix), N=9216 (96x96 grid)
> Symmetric Eigenproblem (sparse matrix), N=9216 (96x96 grid)
> 2
> Istart:  0
> Iend:  9216
> 2
> Istart:  0
> Iend:  9216
> 
> instead of the one I expected from a local execution of my file:
> 2
> 2
> Symmetric Eigenproblem (sparse matrix), N=9216 (96x96 grid)
> Istart:  4608
> Iend:  9216
> Istart:  0
> Iend:  4608
> 
> Therefore I assume that somehow my MPI version is not linked correctly to my
> file, and I checked my petsc file with "ldd":
> linux-vdso.so.1 =>  (0x00007ffd8db6a000)
> liblapack.so.3 => /lib64/liblapack.so.3 (0x00007f1bd95e8000)
> libblas.so.3 => /lib64/libblas.so.3 (0x00007f1bd938e000)
> libm.so.6 => /lib64/libm.so.6 (0x00007f1bd908c000)
> libX11.so.6 => /lib64/libX11.so.6 (0x00007f1bd8d4e000)
> libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f1bd8b31000)
> libstdc++.so.6 => /lib64/libstdc++.so.6 (0x00007f1bd8829000)
> libdl.so.2 => /lib64/libdl.so.2 (0x00007f1bd8625000)
> libgfortran.so.3 => /lib64/libgfortran.so.3 (0x00007f1bd8302000)
> libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007f1bd80ec000)
> libquadmath.so.0 => /lib64/libquadmath.so.0 (0x00007f1bd7eb0000)
> libc.so.6 => /lib64/libc.so.6 (0x00007f1bd7aec000)
> /lib64/ld-linux-x86-64.so.2 (0x0000557325576000)
> libxcb.so.1 => /lib64/libxcb.so.1 (0x00007f1bd78c4000)
> libXau.so.6 => /lib64/libXau.so.6 (0x00007f1bd76bf000)
> I assume that pip3 was for some reason not able to link my MPI version to
> it. Can I force pip to link MPI, or is there another mistake somewhere?
> 

