[petsc-users] petsc4py Vec().setSizes()

Jed Brown jedbrown at mcs.anl.gov
Mon Feb 4 17:40:46 CST 2013


petsc4py is "too clever" in the sense that it tries to interpret many
different kinds of "sizes" arguments. You can always pass the pair
(localsize, globalsize), but you can also pass only a global size (in which
case PETSc decides how to split the vector across processes). If you want
to set only the local size, you should pass (localsize, None).

Your example is invalid: each process passes a different global size.
petsc-dev will now raise an error if you do this.

I changed your example to:


X = PETSc.Vec().create(comm=PETSc.COMM_WORLD)
X.setSizes((sizes[mpi_rank],PETSc.DECIDE),bsize=1)
X.setFromOptions()
ilow,ihigh = X.getOwnershipRange()

PETSc.Sys.syncPrint("rank: ",mpi_rank,"low/high: ",ilow,ihigh)
PETSc.Sys.syncFlush()

and now get the output:

rank:  0 low/high:  0 35675
rank:  1 low/high:  35675 401185
rank:  2 low/high:  401185 766927
rank:  3 low/high:  766927 802370
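
As a sanity check (plain Python, no petsc4py needed), those ownership
ranges are just the running sums of the per-rank local sizes passed to
setSizes:

```python
# Per-rank local sizes from the example; rank r owns
# [sum(sizes[:r]), sum(sizes[:r+1])).
sizes = [35675, 365510, 365742, 35443]

low = 0
for rank, n in enumerate(sizes):
    high = low + n
    print("rank:", rank, "low/high:", low, high)
    low = high
```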



On Mon, Feb 4, 2013 at 4:01 PM, Weston Lowrie <wlowrie at uw.edu> wrote:

> Hi,
> I'm confused what the Vec().setSizes() routine is doing in petsc4py.
>  Consider this example:
>
> #!/usr/bin/env python
> import sys,os
> from petsc4py import PETSc
> from numpy import *
>
> mpi_rank = PETSc.COMM_WORLD.getRank()
> mpi_size = PETSc.COMM_WORLD.getSize()
>
> sizes = zeros(4)
> sizes[0] = 35675
> sizes[1] = 365510
> sizes[2] = 365742
> sizes[3] = 35443
>
> X = PETSc.Vec().create(comm=PETSc.COMM_WORLD)
> X.setSizes(mpi_size*sizes[mpi_rank],bsize=1)
> X.setFromOptions()
> ilow,ihigh = X.getOwnershipRange()
>
> print "rank: ",mpi_rank,"low/high: ",ilow,ihigh
>
>
>
> Why is it that when setting the local sizes explicitly I need to
> multiply by the mpi_size?  My understanding is that this routine tells
> PETSc what the local size on each processor core should be.  It seems
> to divide it by the total number of processor cores.
>
> Thanks,
> Wes
>

