[petsc-users] Storage space for symmetric (SBAIJ) matrix

Daniel Langr daniel.langr at gmail.com
Fri Sep 17 06:15:41 CDT 2010


Hi all,

I do not quite understand how PETSc works with symmetric matrices. I tried
some tests with a symmetric matrix that has nonzeros only on the main
diagonal and in the last column/row, i.e., the following pattern:

* 0 0 *
0 * 0 *
0 0 * *
* * * *

for a 4 x 4 matrix. Since PETSc requires setting values only in the upper
triangular part of a symmetric matrix, I set two values for every row except
the last one, which gets only one value. My code looks like:

/* Create a distributed symmetric (SBAIJ) matrix with block size 1 and
   preallocate 1 nonzero per row in both the diagonal and off-diagonal blocks. */
MatCreate(PETSC_COMM_WORLD, &A);
MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
MatSetType(A, MATMPISBAIJ);
MatMPISBAIJSetPreallocation(A, 1, 1, PETSC_NULL, 1, PETSC_NULL);
MatGetOwnershipRange(A, &first, &last);
last--; /* MatGetOwnershipRange returns one past the last owned row */

/* Set the diagonal entry and, except for the last row, the entry in the
   last column (upper triangular part only). */
for (i = first; i <= last; i++) {
   MatSetValue(A, i, i, a, INSERT_VALUES);
   if (i != (n - 1))
     MatSetValue(A, i, n - 1, a, INSERT_VALUES);
}

MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

(The value of the variable "a" changes for every nonzero entry. I omitted
that code, together with the error checking, for better readability.)

The resulting matrix works fine, but when I check the matrix info (e.g.,
using the -mat_view_info_detailed option) for matrix size n = 10000 and 2
MPI processes, I get:

// First and last row for every MPI process
Rank: 0, First row: 0, Last row: 4999
Rank: 1, First row: 5000, Last row: 9999

Matrix Object:
   type=mpisbaij, rows=10000, cols=10000
   total: nonzeros=19999, allocated nonzeros=69990
     [0] Local rows 10000 nz 10000 nz alloced 10000 bs 1 mem 254496
     [0] on-diagonal part: nz 5000
     [0] off-diagonal part: nz 5000
     [1] Local rows 10000 nz 9999 nz alloced 59990 bs 1 mem 264494
     [1] on-diagonal part: nz 9999
     [1] off-diagonal part: nz 0
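
For reference, I assume the same detailed info could also be requested from
code rather than via the command-line option; a minimal sketch, assuming the
viewer format name PETSC_VIEWER_ASCII_INFO_DETAILED of this PETSc release:

PetscViewerSetFormat(PETSC_VIEWER_STDOUT_WORLD, PETSC_VIEWER_ASCII_INFO_DETAILED);
MatView(A, PETSC_VIEWER_STDOUT_WORLD);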

1. My problem is the amount of memory used. For the 10000 nonzeros of the
first process I would expect the memory needed for the CSR storage format
to be approximately:

nz * sizeof(PetscScalar) + nz * sizeof(PetscInt) + n_local_rows * 
sizeof(PetscInt)
= 10000 * 8 + 10000 * 4 + 5000 * 4
= 140000 bytes

while the matrix info reports 254496 bytes, and similarly for the second
process. I would understand some additional space being needed for
efficiency, but this is more than 180 percent of the space actually required
to store the CSR matrix, which is unacceptable for large problems.
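
Just to make the arithmetic explicit, a minimal standalone sketch of that
estimate (assuming an 8-byte PetscScalar and a 4-byte PetscInt, as in my
build; strictly, CSR needs n_local_rows + 1 row pointers, which does not
change the picture):

#include <stdio.h>

int main(void)
{
  size_t nz           = 10000;   /* local nonzeros on the first process */
  size_t n_local_rows = 5000;    /* locally owned rows */
  size_t scalar_size  = 8;       /* sizeof(PetscScalar), real double precision */
  size_t int_size     = 4;       /* sizeof(PetscInt), 32-bit indices */

  size_t bytes = nz * scalar_size           /* values */
               + nz * int_size              /* column indices */
               + n_local_rows * int_size;   /* row pointers */

  printf("estimated CSR bytes: %zu\n", bytes);   /* prints 140000 */
  return 0;
}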

2. Why does it say "Local rows 10000"? Shouldn't this be 5000 for each
process?

3. Why does it say "nz alloced 59990" for the second process? And why
"total: nonzeros=19999, allocated nonzeros=69990"?

4. Why are there 9999 on-diagonal and 0 off-diagonal nonzeros reported for
the second process? That does not match my matrix.

There is no information about symmetric matrices in the PETSc Users Manual.
I would really welcome some hints on how to work with them, for example, how
to construct such matrices efficiently. When I have to set values only for
the upper triangular part, and every full row of the matrix has approximately
similar fill, then giving every process the same number of rows (as
MatGetOwnershipRange indicates) would lead to terrible load balancing, at
least for the matrix construction process; see the sketch below.
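
To illustrate the imbalance, here is a small standalone sketch (plain C, not
PETSc) that counts how many upper-triangular entries each of two equal row
blocks would own for a symmetric matrix whose full rows all have similar
fill (a dense matrix, for simplicity):

#include <stdio.h>

int main(void)
{
  const long n = 10000;             /* matrix dimension, as in the example above */
  long half = n / 2, rank0 = 0, rank1 = 0;
  long i;

  for (i = 0; i < n; i++) {
    long upper = n - i;             /* entries of row i in the upper triangle */
    if (i < half) rank0 += upper;   /* rows owned by the first process */
    else          rank1 += upper;   /* rows owned by the second process */
  }
  printf("rank 0: %ld entries, rank 1: %ld entries\n", rank0, rank1);
  /* prints 37502500 vs 12502500, i.e., a 3:1 ratio */
  return 0;
}

With an even row split, the first process ends up setting and storing three
times as many entries as the second one.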

Thanks,

Daniel


