[petsc-users] Understanding global and local rows in a distributed dense matrix

Matthew Knepley knepley at gmail.com
Wed Jan 13 07:36:07 CST 2021


On Wed, Jan 13, 2021 at 8:26 AM Stefano Zampini <stefano.zampini at gmail.com>
wrote:

> MATMPIDENSE does not implement any cyclic distribution. In parallel, a
> dense matrix is split by rows: each process owns localrows*globalcols
> entries. The local sizes should be understood as the sizes of the left and
> right vectors used in matvec operations; they are not strictly related to
> storage considerations.
> https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetLocalSize.html
>
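
Concretely, a minimal (untested) sketch along these lines: on two ranks it
reports a local size of 4 x 4, while each rank actually stores 4 x 8 entries
of the 8 x 8 matrix.

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            A;
  PetscInt       lrows, lcols, M, N, lda;
  PetscMPIInt    rank;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

  /* 8 x 8 dense matrix, PETSc decides the contiguous row split */
  ierr = MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, 8, 8, NULL, &A); CHKERRQ(ierr);
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);

  ierr = MatGetSize(A, &M, &N); CHKERRQ(ierr);
  ierr = MatGetLocalSize(A, &lrows, &lcols); CHKERRQ(ierr); /* local sizes of y and x in y = A*x */
  ierr = MatDenseGetLDA(A, &lda); CHKERRQ(ierr);            /* leading dimension of the local array */

  /* On two ranks: lrows = lcols = 4, yet the local storage holds lrows*N = 32 entries */
  ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD,
           "[%d] local %D x %D, global %D x %D, lda %D, local storage %D entries\n",
           rank, lrows, lcols, M, N, lda, lrows*N); CHKERRQ(ierr);
  ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT); CHKERRQ(ierr);

  ierr = MatDestroy(&A); CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}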

To expand on Stefano's point: MATDENSE is intended for very simple matrix
handling, and its optimizations are most appropriate for blocks of vectors.
Sophisticated dense matrix operations are better handled by the MATELEMENTAL
type. What you give up there is the ability to manipulate the storage
directly, since it is then distributed cyclically over a 2D process grid
rather than by contiguous rows. Thus most users actually want MATDENSE.
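
If you do need the Elemental route, and PETSc was configured with it (e.g.
--download-elemental), an existing dense matrix can be handed over with
MatConvert(). A rough, untested sketch, assuming A is an assembled
MATMPIDENSE matrix:

Mat            Ae;
PetscErrorCode ierr;

/* Elemental distributes entries cyclically over a 2D process grid, so the
   local storage of Ae can no longer be addressed row by row as with A.   */
ierr = MatConvert(A, MATELEMENTAL, MAT_INITIAL_MATRIX, &Ae); CHKERRQ(ierr);
/* ... use Ae through the usual Mat interface (MatMult, MatMatMult, solvers) ... */
ierr = MatDestroy(&Ae); CHKERRQ(ierr);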

  Thanks,

     Matt


> On Wed, Jan 13, 2021 at 15:40 Roland Richter <roland.richter at ntnu.no>
> wrote:
>
>> Hi,
>>
>> I am currently struggling a bit with my understanding of local and global
>> rows/columns in distributed dense matrices in PETSc.
>>
>> The attached short program creates a local matrix of global size
>> [matrix_size_rows, matrix_size_cols] in every MPI process. Afterwards, I
>> flatten the matrix into a vector and split this vector using the boost
>> function split() when running with several MPI processes. In addition, I
>> create a PETSc-based dense MPI-distributed matrix with the same input
>> sizes and set all values to zero (as I am not interested in the content
>> itself, just the shape). Finally, I retrieve the local numbers of rows and
>> columns using MatGetLocalSize(), the global sizes using MatGetSize(), and
>> the first and last owned row of each matrix using MatGetOwnershipRange(),
>> and print them. According to the documentation I would expect the
>> PETSc-based matrix to be split up row-wise across the processes.
>>
>> Now, for an input size of [8, 8] and a single process I get
>>
>> Rank 0 has a total size of 8, 8
>> Rank 0 has an initial size of 64
>> Rank 0 has a local matrix size of 8, 8
>> Rank 0 has a global matrix size of 8, 8
>> Rank 0 has a matrix spanning from row 0 to 8
>> Rank 0 has a vector size of 64
>>
>> which is what I expect. For two processes, though, I get
>>
>> Rank 0 has a total size of 8, 8
>> Rank 0 has an initial size of 64
>> Rank 0 has a local matrix size of 4, 4
>> Rank 0 has a global matrix size of 8, 8
>> Rank 0 has a matrix spanning from row 0 to 4
>> Rank 0 has a vector size of 32
>> Rank 1 has a total size of 8, 8
>> Rank 1 has an initial size of 64
>> Rank 1 has a local matrix size of 4, 4
>> Rank 1 has a global matrix size of 8, 8
>> Rank 1 has a matrix spanning from row 4 to 8
>> Rank 1 has a vector size of 32
>>
>> Here, most entries make sense, except for the size of the local matrices.
>> Why do I get a size of [4, 4] and not [4, 8]? Each owned row should be
>> stored contiguously in the local process and should therefore contain all
>> columns, not just a part of them.
>>
>> Am I misunderstanding how to use MatGetLocalSize(), or is something else
>> going on?
>>
>> Thanks!
>>
>> Regards,
>>
>> Roland Richter
>>
>>
>>
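
Regarding the [4, 4] vs. [4, 8] question above: the 4 "local columns" are the
local length of the right vector x in y = A*x, not the number of columns
stored on the rank; each rank stores all 8 columns of its 4 owned rows.
Extending the sketch earlier in this message (A and ierr as defined there,
untested):

Vec      x, y;
PetscInt xlocal, ylocal, rstart, rend;

ierr = MatGetOwnershipRange(A, &rstart, &rend); CHKERRQ(ierr); /* owned rows [rstart, rend) */
ierr = MatCreateVecs(A, &x, &y); CHKERRQ(ierr);    /* x: right vector, y: left vector of y = A*x */
ierr = VecGetLocalSize(x, &xlocal); CHKERRQ(ierr); /* matches the reported "local columns" (4 on two ranks) */
ierr = VecGetLocalSize(y, &ylocal); CHKERRQ(ierr); /* matches the reported "local rows" (4 on two ranks) */
ierr = VecDestroy(&x); CHKERRQ(ierr);
ierr = VecDestroy(&y); CHKERRQ(ierr);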

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/