[petsc-users] Understanding MatCreate bsize parameter

Matthew Knepley knepley at gmail.com
Mon Mar 30 06:41:45 CDT 2015


On Mon, Mar 30, 2015 at 5:59 AM, Florian Lindner <mailinglists at xgm.de>
wrote:

>
>
> On Friday, 27 March 2015, 07:34:56, Matthew Knepley wrote:
> > On Fri, Mar 27, 2015 at 7:31 AM, Florian Lindner <mailinglists at xgm.de>
> > wrote:
> > >
> > > On Friday, 27 March 2015, 07:26:11, Matthew Knepley wrote:
> > > > On Fri, Mar 27, 2015 at 4:28 AM, Florian Lindner <
> mailinglists at xgm.de>
> > > > wrote:
> > > >
> > > > > On Thursday, 26 March 2015, 07:34:27, Jed Brown wrote:
> > > > > > Florian Lindner <mailinglists at xgm.de> writes:
> > > > > >
> > > > > > > Hello,
> > > > > > >
> > > > > > > I'm using petsc with petsc4py.
> > > > > > >
> > > > > > > A matrix is created like that
> > > > > > >
> > > > > > >     MPIrank = MPI.COMM_WORLD.Get_rank()
> > > > > > >     MPIsize = MPI.COMM_WORLD.Get_size()
> > > > > > >     print("MPI Rank = ", MPIrank)
> > > > > > >     print("MPI Size = ", MPIsize)
> > > > > > >     parts = partitions()
> > > > > > >
> > > > > > >     print("Dimension= ", nSupport + dimension, "bsize = ", len(parts[MPIrank]))
> > > > > > >
> > > > > > >     MPI.COMM_WORLD.Barrier() # Just to keep the output together
> > > > > > >     A = PETSc.Mat()
> > > > > > >     A.createDense( (nSupport + dimension, nSupport + dimension), bsize = len(parts[MPIrank]) ) # <-- crash here
> > > > > >
> > > > > > bsize is collective (must be the same on all processes).  It is used for
> > > > > > vector-valued problems (like elasticity -- bs=3 in 3 dimensions).
> > > > >
> > > > > It seems I'm still misunderstanding the bsize parameter.
> > > > >
> > > > > If I distribute a 10x10 matrix on three ranks I need to have a
> > > > > non-homogeneous distribution, and that's what PETSc does itself:
> > > > >
> > > >
> > > > blockSize really means the uniform block size of the matrix, thus it HAS
> > > > to divide the global size. If it does not, you do not have a uniform
> > > > block size, you have a bunch of different-sized blocks.
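
To make that concrete, here is a minimal petsc4py sketch of a uniform block
size that does divide the global size; the 12x12 size and bs = 3 are
illustrative only (think 4 nodes with 3 components each, as in 3D elasticity):

    from petsc4py import PETSc

    # 12 global rows = 4 nodes x 3 components, so bs = 3 divides the size;
    # the same bsize is passed on every rank, since it is collective
    A = PETSc.Mat()
    A.createAIJ(size=(12, 12), bsize=3, comm=PETSc.COMM_WORLD)
    print("block size =", A.getBlockSize())  # -> 3 on every rank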
> > >
> > > But how can I set a parallel layout when the size of the matrix is not
> > > divisible by the number of ranks? When I omit bsize, PETSc does that for me,
> > > by using block sizes of 4, 3 and 3 on the three different ranks. How can I
> > > set such a parallel layout manually?
> > >
> >
> > I am going to reply in C because it is my native language:
> >
> >   MatCreate(comm, &A);
> >   MatSetSizes(A, m, n, PETSC_DETERMINE, PETSC_DETERMINE);
> >   MatSetFromOptions(A);
> >   <Preallocation stuff here>
> >
> > You have each proc give its local size.
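
In the petsc4py that the original code uses, a rough equivalent of this
sketch would be as follows; the 4/3/3 local sizes are just the example split
for a 10x10 matrix on three ranks, and setUp() stands in for the
preallocation step:

    from petsc4py import PETSc

    comm = PETSc.COMM_WORLD
    rank = comm.getRank()
    sizes = [4, 3, 3]  # local sizes, one entry per rank

    A = PETSc.Mat()
    A.create(comm)
    # each process gives its local size; PETSc determines the global size
    A.setSizes([(sizes[rank], PETSc.DETERMINE), (sizes[rank], PETSc.DETERMINE)])
    A.setFromOptions()
    A.setUp()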
>
> Ok, that seems to be what I'm looking for...
>
> I've run into some things where my understanding of PETSc's parallel layout
> still seems to be far off:
>
> I have this code:
>
>   ierr = MatSetSizes(matrix, 10, 10, PETSC_DECIDE, PETSC_DECIDE); CHKERRQ(ierr);
>
>   MatGetOwnershipRange(matrix, &ownerBegin, &ownerEnd);
>   cout << "Rank = " << MPIrank << " Begin = " << ownerBegin << " End = " << ownerEnd << endl;
>
> Complete test code: http://pastebin.com/xFM1fJnQ
>
> If started with mpirun -n 3, it prints:
>
> Rank = 2 Begin = 0 End = 10
> Rank = 1 Begin = 0 End = 10
> Rank = 0 Begin = 0 End = 10
>

You created three serial matrices since you used PETSC_COMM_SELF in
MatCreate().
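
The fix is to create the matrix on the shared communicator, i.e. pass
PETSC_COMM_WORLD to MatCreate(). A minimal petsc4py sketch of the corrected
setup; with a local size of 10 per rank, three ranks should now report the
disjoint ranges (0, 10), (10, 20) and (20, 30) of a 30x30 global matrix:

    from petsc4py import PETSc

    A = PETSc.Mat()
    A.create(PETSc.COMM_WORLD)  # one parallel matrix, not one serial matrix per rank
    A.setSizes([(10, PETSc.DETERMINE), (10, PETSc.DETERMINE)])
    A.setUp()

    begin, end = A.getOwnershipRange()
    print("Rank =", PETSc.COMM_WORLD.getRank(), "Begin =", begin, "End =", end)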

  Thanks,

     Matt


> The same happens when I manually set the sizes per processor, like you
> suggested.
>
>   int sizes[] = {4, 3, 3};
>   MatSetSizes(matrix, sizes[MPIrank], sizes[MPIrank], PETSC_DECIDE, PETSC_DECIDE);
>
> I wonder why the range always starts from 0; I was rather expecting
> something like:
>
> Rank = 2 Begin = 7 End = 10
> Rank = 1 Begin = 4 End = 7
> Rank = 0 Begin = 0 End = 4
>
> Petsc4py prints what I expect:
>
> Rank =  1 Range    =  (4, 7) Size =  3
> Rank =  2 Range    =  (7, 10) Size =  3
> Rank =  0 Range    =  (0, 4) Size =  4
>
>
> Is this the way it should be?
>
>
> petsc4py's Mat::setSizes combines MatSetSizes and MatSetBlockSizes. I had
> some trouble figuring out what the correct datatype for size was, but I've got it now:
>
> size = [ (m, M), (n, N) ]
> size = [ (sizes[rank], PETSc.DETERMINE), (sizes[rank], PETSc.DETERMINE) ]
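
A short self-contained sketch of that call, including the optional bsize
argument that covers the MatSetBlockSizes half; the 4/3/3 sizes are the
example split from above, and with three ranks this should reproduce the
(0, 4), (4, 7), (7, 10) ranges shown earlier:

    from petsc4py import PETSc

    comm = PETSc.COMM_WORLD
    rank = comm.getRank()
    sizes = [4, 3, 3]  # local sizes for three ranks

    A = PETSc.Mat()
    A.create(comm)
    # size = [ (m, M), (n, N) ]; bsize is optional and defaults to 1
    A.setSizes([(sizes[rank], PETSc.DETERMINE), (sizes[rank], PETSc.DETERMINE)], bsize=1)
    A.setUp()
    print("Rank =", rank, "Range =", A.getOwnershipRange())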
>
> The documentation on the Python bindings is rather sparse...
>
> Thanks,
> Florian


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

