[petsc-users] [petsc4py] dm/examples/tutorials/ex2 in python

Francesco Caimmi francesco.caimmi at gmail.com
Tue Jun 2 04:52:14 CDT 2015


Dear PETSc users,

first of all, many thanks to the developers for making PETSc available.

I am trying to get familiar with the library using petsc4py (my C/C++
knowledge is rudimentary, to say the least), and I am now reproducing the
examples in dm/examples/tutorials/ in Python to get accustomed to DMs.

However, while translating ex2.c, I get an error whose cause I cannot
understand.
Here is what I am doing (I hope it is OK to post this short code snippet):

import sys
import petsc4py
petsc4py.init(sys.argv)
from petsc4py import PETSc
stype = PETSc.DMDA.StencilType.BOX
bx = PETSc.DMDA.BoundaryType.NONE
by = PETSc.DMDA.BoundaryType.NONE
comm = PETSc.COMM_WORLD
rank = comm.rank
OptDB = PETSc.Options()  # access the PETSc options database
M = OptDB.getInt('M', 10)
N = OptDB.getInt('N', 8)
m = OptDB.getInt('m', PETSc.DECIDE)
n = OptDB.getInt('n', PETSc.DECIDE)
dm = PETSc.DMDA().create(dim=2, sizes=(M, N), proc_sizes=(m, n),
                         boundary_type=(bx, by), stencil_type=stype,
                         stencil_width=1, dof=1, comm=comm)
global_vec = dm.createGlobalVector()
start, end = global_vec.getOwnershipRange()
# set the locally owned entries to 5.0*rank, as ex2.c does
with global_vec as v:
    for i in xrange(start, end):
        v[i] = 5.0*rank

As far as I understand, this should be the closest Python translation of ex2.c
up to line 48, except for the viewer part, which I still have to translate (a
rough guess at it is sketched below).
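
For completeness, my rough guess at that viewer part would be something like
this (untested, and assuming a draw viewer obtained via PETSc.Viewer.DRAW is
what the C code intends there):

# tentative translation of the viewer part of ex2.c (untested);
# PETSc.Viewer.DRAW(comm) is assumed to give the draw viewer on comm
viewer = PETSc.Viewer.DRAW(comm)
global_vec.view(viewer)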

If run on a single processor everything is fine, but when I run with
mpiexec -n 2 <file_name> I get the following error (on the second rank):

#############################################
	v[i] = 5.0*rank
IndexError: index 40 is out of bounds for axis 0 with size 40
#############################################
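
If I read the error right, each of the two ranks owns 40 of the 80 global
entries, so on the second rank getOwnershipRange() returns (40, 80), while the
array I get from the context manager seems to be only 40 entries long. A small
diagnostic along these lines (not part of ex2.c; it continues from the snippet
above, right after createGlobalVector) should show what each rank actually sees:

# hypothetical diagnostic: compare the ownership range with the length
# of the array exposed by the context manager on each rank
start, end = global_vec.getOwnershipRange()
with global_vec as v:
    print("rank %d: ownership range (%d, %d), local array length %d"
          % (rank, start, end, len(v)))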

I am on Linux/x86-64 and I get this behaviour both with PETSc 3.4.3 + petsc4py
3.4.2 (the packages available from the openSUSE repositories) and with
PETSc 3.5.4 + petsc4py 3.5.1 (which I built myself).

I would have expected the code to handle the transition from one to multiple
processes seamlessly, so am I doing something wrong, or is this some other kind
of problem? Up to now I have never seen anything like this with
vectors/matrices created directly with PETSc.Vec()/PETSc.Mat(); the kind of
pattern I have been using with those is sketched below.
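
To give an idea of what I mean, this is the sort of thing I have been doing so
far with plain vectors (a minimal sketch written from memory, so details may be
off):

# sketch of the plain-Vec pattern I am used to: global indices from
# getOwnershipRange() together with setValue(), followed by assembly
from petsc4py import PETSc

comm = PETSc.COMM_WORLD
vec = PETSc.Vec().create(comm=comm)
vec.setSizes(80)  # global size
vec.setFromOptions()
start, end = vec.getOwnershipRange()
for i in xrange(start, end):
    vec.setValue(i, 5.0*comm.rank)  # global index
vec.assemble()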

Thank you for your attention, 
-- 
Francesco Caimmi

