[petsc-users] petsc4py mpi matrix size

Lukas Razinkovas lukasrazinkovas at gmail.com
Fri Jan 10 07:52:53 CST 2020


Hello,

I am trying to use petsc4py and slepc4py for parallel sparse matrix
diagonalization.
However, I am a bit confused about the increase in matrix memory usage when I
switch from a single process to multiple processes. For example, a 100 x 100
matrix with 298 nonzero elements consumes 8820 bytes of memory
(mat.getInfo()["memory"]) on one process, but 20552 bytes on two processes and
33528 bytes on four. My matrix is taken from slepc4py/demo/ex1.py, where the
nonzero elements lie on three diagonals.
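
For reference, here is a minimal sketch of the setup, roughly following
slepc4py/demo/ex1.py (n = 100, simplified; the choice of GLOBAL_SUM for
getInfo is mine), run with e.g. mpiexec -n 2 python ex.py:

    from petsc4py import PETSc

    n = 100  # global matrix size

    A = PETSc.Mat().create(comm=PETSc.COMM_WORLD)
    A.setSizes([n, n])
    A.setFromOptions()
    A.setUp()

    # each process fills only its locally owned rows (diagonals -1, 2, -1)
    rstart, rend = A.getOwnershipRange()
    for i in range(rstart, rend):
        if i > 0:
            A.setValue(i, i - 1, -1.0)
        A.setValue(i, i, 2.0)
        if i < n - 1:
            A.setValue(i, i + 1, -1.0)
    A.assemble()

    # "memory" is reported in bytes; GLOBAL_SUM sums over all processes
    info = A.getInfo(PETSc.Mat.InfoType.GLOBAL_SUM)
    PETSc.Sys.Print("memory: %d bytes, nonzeros: %d"
                    % (info["memory"], info["nz_used"]))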

Why does the memory usage increase with the number of MPI processes?
I thought each process stores only its own rows, so the total should stay the
same. Or are some elements stored globally?

Lukas
