[petsc-users] Storage space for symmetric (SBAIJ) matrix

Daniel Langr daniel.langr at gmail.com
Fri Sep 17 06:33:15 CDT 2010


>> Hi all,
>>
>> I do not quite understand how PETSc works with symmetric matrices. I tried
>> some tests with a symmetric matrix, which has nonzeros only
>> on the main diagonal and in the last column/row, e.g., the following
>> pattern:
>>
>> * 0 0 *
>> 0 * 0 *
>> 0 0 * *
>> * * * *
>>
>> for a 4 x 4 matrix. Since PETSc requires setting values only in the upper
>> triangular part of a symmetric matrix, I set two values for every row except
>> the last one, which gets only one value. My code looks like:
>>
>> MatCreate(PETSC_COMM_WORLD,&A);
>> MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
>> MatSetType(A, MATMPISBAIJ);
>> MatMPISBAIJSetPreallocation(A, 1, 1, PETSC_NULL, 1, PETSC_NULL);
>> MatGetOwnershipRange(A,&first,&last);
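>> /* MatGetOwnershipRange() returns "last" as one past the final locally owned row */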
>> last--;
>>
>> for (i = first; i <= last; i++) {
>>   MatSetValue(A, i, i, a, INSERT_VALUES);
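>>   /* the last-column entry lies in the upper triangle for every row above it */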
>>   if (i != (n - 1))
>>     MatSetValue(A, i, n - 1, a, INSERT_VALUES);
>> }
>>
>> MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
>> MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
>>
>> (The value of the variable "a" changes for every nonzero entry. I omitted
>> this code, together with the error checking, from the above printout for
>> better readability.)
>>
>> The resulting matrix works quite well, but when I check the matrix info (e.g.,
>> using the -mat_view_info_detailed option) for matrix size n = 10000 and 2 MPI
>> processes, I get:
>>
>> // First and last row for every MPI process
Rank: 0, First row: 0, Last row: 4999
Rank: 1, First row: 5000, Last row: 9999
>>
>> Matrix Object:
>>   type=mpisbaij, rows=10000, cols=10000
>>   total: nonzeros=19999, allocated nonzeros=69990
>>     [0] Local rows 10000 nz 10000 nz alloced 10000 bs 1 mem 254496
>>     [0] on-diagonal part: nz 5000
>>     [0] off-diagonal part: nz 5000
>>     [1] Local rows 10000 nz 9999 nz alloced 59990 bs 1 mem 264494
>>     [1] on-diagonal part: nz 9999
>>     [1] off-diagonal part: nz 0
>
> How were you preallocating this matrix?
>
> Jed

There is a preallocation routine call in my printout. Again:
MatMPISBAIJSetPreallocation(A, 1, 1, PETSC_NULL, 1, PETSC_NULL);
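For comparison, here is a minimal sketch of what exact per-row preallocation
could look like for this pattern (my own illustration, not tested code; error
checking omitted as in the original post). For an MPISBAIJ matrix, d_nnz and
o_nnz count only the upper-triangular entries of each local row, and the last
column falls into the diagonal block only on the process that owns the last
block of rows, so a constant d_nz/o_nz cannot match every row exactly. Since
the layout is not set up before preallocation, the sketch recovers the
ownership range with PetscSplitOwnership() and a prefix sum instead of
MatGetOwnershipRange():

#include "petscmat.h"

int main(int argc, char **argv)
{
  Mat      A;
  PetscInt n = 10000, m, first, end, i;
  PetscInt *d_nnz, *o_nnz;

  PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);

  /* Split the n rows across the processes the same way PETSC_DECIDE
     would, then recover [first, end) with an inclusive prefix sum. */
  m = PETSC_DECIDE;
  PetscSplitOwnership(PETSC_COMM_WORLD, &m, &n);
  MPI_Scan(&m, &end, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD);
  first = end - m;

  /* Count, per local row, the upper-triangular entries that fall into
     the diagonal block (locally owned columns) and into the
     off-diagonal block. */
  PetscMalloc(m * sizeof(PetscInt), &d_nnz);
  PetscMalloc(m * sizeof(PetscInt), &o_nnz);
  for (i = 0; i < m; i++) {
    d_nnz[i] = 1;                      /* the diagonal entry itself */
    o_nnz[i] = 0;
    if (first + i < n - 1) {           /* entry in the last column */
      if (n - 1 < end) d_nnz[i]++;     /* last column owned locally */
      else             o_nnz[i]++;     /* last column on another process */
    }
  }

  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, m, m, n, n);
  MatSetType(A, MATMPISBAIJ);
  MatMPISBAIJSetPreallocation(A, 1, 0, d_nnz, 0, o_nnz);
  PetscFree(d_nnz);
  PetscFree(o_nnz);

  /* Placeholder value 1.0; the real code varies it per entry. */
  for (i = first; i < end; i++) {
    MatSetValue(A, i, i, 1.0, INSERT_VALUES);
    if (i != n - 1) MatSetValue(A, i, n - 1, 1.0, INSERT_VALUES);
  }
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  MatDestroy(A);   /* pre-3.2 calling convention, as in this thread's era */
  return PetscFinalize();
}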

Daniel


