[petsc-users] Understanding MatCreate bsize parameter

Matthew Knepley knepley at gmail.com
Fri Mar 27 07:34:56 CDT 2015


On Fri, Mar 27, 2015 at 7:31 AM, Florian Lindner <mailinglists at xgm.de>
wrote:
>
> On Friday, March 27, 2015, at 07:26:11, Matthew Knepley wrote:
> > On Fri, Mar 27, 2015 at 4:28 AM, Florian Lindner <mailinglists at xgm.de>
> > wrote:
> >
> > > On Thursday, March 26, 2015, at 07:34:27, Jed Brown wrote:
> > > > Florian Lindner <mailinglists at xgm.de> writes:
> > > >
> > > > > Hello,
> > > > >
> > > > > I'm using petsc with petsc4py.
> > > > >
> > > > > A matrix is created like that
> > > > >
> > > > >     MPIrank = MPI.COMM_WORLD.Get_rank()
> > > > >     MPIsize = MPI.COMM_WORLD.Get_size()
> > > > >     print("MPI Rank = ", MPIrank)
> > > > >     print("MPI Size = ", MPIsize)
> > > > >     parts = partitions()
> > > > >
> > > > >     print("Dimension= ", nSupport + dimension, "bsize = ",
> > > > >           len(parts[MPIrank]))
> > > > >
> > > > >     MPI.COMM_WORLD.Barrier() # Just to keep the output together
> > > > >     A = PETSc.Mat(); A.createDense( (nSupport + dimension,
> > > > >         nSupport + dimension), bsize = len(parts[MPIrank]) ) # <-- crash here
> > > >
> > > > bsize is collective (must be the same on all processes).  It is used
> > > > for vector-valued problems (like elasticity -- bs=3 in 3 dimensions).
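Jed's point can be sketched numerically. The helper below is not from the thread; it just illustrates how a uniform block size bs maps (node, component) unknowns to matrix rows, and why the global size must then be a multiple of bs:

```python
# Illustration (assumption: standard blocked ordering, not thread code):
# with a uniform block size bs, each mesh node owns bs consecutive rows.
def block_row(node, component, bs=3):
    """Global row index of `component` at `node` when unknowns are
    stored in blocks of size bs (e.g. x, y, z displacement per node)."""
    return bs * node + component

# A 4-node mesh with bs=3 yields a 12x12 matrix: the global size is
# always a multiple of bs, which is why PETSc rejects a bsize that
# does not divide the matrix dimension.
rows = [block_row(n, c) for n in range(4) for c in range(3)]
print(rows)  # rows 0..11, one row per (node, component) pair
```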
> > >
> > > It seems I'm still misunderstanding the bsize parameter.
> > >
> > > If I distribute a 10x10 matrix over three ranks, I need a
> > > non-homogeneous distribution, and that's what PETSc does itself:
> > >
> >
> > blockSize really means the uniform block size of the matrix, so it HAS
> > to divide the global size. If it does not, you do not have a uniform
> > block size; you have a bunch of different-sized blocks.
>
> But how can I set a parallel layout when the size of the matrix is not
> divisible by the number of ranks? When I omit bsize, PETSc does that for me,
> by using block sizes of 4, 3 and 3 on the three different ranks. How can I
> set such a parallel layout manually?
>

I am going to reply in C because it is my native language:

  MatCreate(comm, &A);
  MatSetSizes(A, m, n, PETSC_DETERMINE, PETSC_DETERMINE); /* m, n are the LOCAL sizes */
  MatSetFromOptions(A);
  <Preallocation stuff here>

You have each proc give its local size, and PETSc determines the global size.
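In petsc4py the same idea looks roughly like the sketch below. The `split_ownership` helper reproduces the default decomposition shown earlier in the thread (4/3/3 for a size-10 matrix on three ranks); the petsc4py calls are left as comments because they need a running MPI job, and the exact call shape is an assumption, not thread-verified code:

```python
# Sketch, not a drop-in script: give each rank its own local size so a
# 10x10 matrix splits as 4/3/3 over three ranks.

def split_ownership(N, size):
    """Local sizes matching PETSc's default decomposition: the first
    N % size ranks get one extra row (reproduces the 4, 3, 3 layout
    reported in the thread for N=10 on 3 ranks)."""
    return [N // size + (1 if rank < N % size else 0)
            for rank in range(size)]

print(split_ownership(10, 3))  # [4, 3, 3]

# With petsc4py (assumption: run under mpiexec), each rank would pass
# its local size and let PETSc determine the global one, e.g.:
#
#   m = split_ownership(10, comm.Get_size())[comm.Get_rank()]
#   A = PETSc.Mat().createDense(((m, PETSc.DETERMINE),
#                                (m, PETSc.DETERMINE)))
```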

  Thanks,

     Matt


> Thanks,
> Florian
>
>
> > > A.createDense( (n, n) )
> > >
> > > print("Rank = ", rank, "Range    = ", A.owner_range, "Size = ",
> > >       A.owner_range[1] - A.owner_range[0])
> > > print("Rank = ", rank, "ColRange = ", A.getOwnershipRangeColumn(), "Size = ",
> > >       A.getOwnershipRangeColumn()[1] - A.getOwnershipRangeColumn()[0])
> > >
> > > gives:
> > >
> > > Rank =  2 Range    =  (7, 10) Size =  3
> > > Rank =  2 ColRange =  (7, 10) Size =  3
> > > Rank =  0 Range    =  (0, 4)  Size =  4
> > > Rank =  0 ColRange =  (0, 4)  Size =  4
> > > Rank =  1 Range    =  (4, 7)  Size =  3
> > > Rank =  1 ColRange =  (4, 7)  Size =  3
> > >
> > >
> > > How can I manually set a distribution of rows like above? My approach
> > > was to call create with bsize = [3,3,4][rank], but that obviously is
> > > not the way...
> > >
> > > Thanks,
> > > Florian
> > >
> >
> >
> >
> >
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
