[petsc-users] Matrix preallocation - d_nz and o_nz

Matthew Knepley knepley at gmail.com
Thu Apr 7 05:47:24 CDT 2022


On Thu, Apr 7, 2022 at 6:12 AM Gabriela Nečasová <necaga at gmail.com> wrote:

> Dear PETSc team,
>
> I would like to ask you a question about the matrix preallocation.
> I am using the routine MatMPIAIJSetPreallocation().
>
> Example: The matrix A has size 18 x 18 with 168 nonzeros:
> A =
>   106.21 -91.667  26.042       0       0       0       0       0       0 -2704.2  2862.5 -1354.2  260.42       0       0       0       0       0
>  -91.667  132.25 -91.667  26.042       0       0       0       0       0  2862.5 -4058.3  3122.9 -1354.2  260.42       0       0       0       0
>   26.042 -91.667  132.25 -91.667  26.042       0       0       0       0 -1354.2  3122.9 -4058.3  3122.9 -1354.2  260.42       0       0       0
>        0  26.042 -91.667  132.25 -91.667  26.042       0       0       0  260.42 -1354.2  3122.9 -4058.3  3122.9 -1354.2  260.42       0       0
>        0       0  26.042 -91.667  132.25 -91.667  26.042       0       0       0  260.42 -1354.2  3122.9 -4058.3  3122.9 -1354.2  260.42       0
>        0       0       0  26.042 -91.667  132.25 -91.667  26.042       0       0       0  260.42 -1354.2  3122.9 -4058.3  3122.9 -1354.2  260.42
>        0       0       0       0  26.042 -91.667  132.25 -91.667  26.042       0       0       0  260.42 -1354.2  3122.9 -4058.3  3122.9 -1354.2
>        0       0       0       0       0  26.042 -91.667  132.25 -91.667       0       0       0       0  260.42 -1354.2  3122.9 -4058.3  2862.5
>        0       0       0       0       0       0  26.042 -91.667  106.21       0       0       0       0       0  260.42 -1354.2  2862.5 -2704.2
>   9.3542 -8.3333  2.6042       0       0       0       0       0       0  106.21 -91.667  26.042       0       0       0       0       0       0
>  -8.3333  11.958 -8.3333  2.6042       0       0       0       0       0 -91.667  132.25 -91.667  26.042       0       0       0       0       0
>   2.6042 -8.3333  11.958 -8.3333  2.6042       0       0       0       0  26.042 -91.667  132.25 -91.667  26.042       0       0       0       0
>        0  2.6042 -8.3333  11.958 -8.3333  2.6042       0       0       0       0  26.042 -91.667  132.25 -91.667  26.042       0       0       0
>        0       0  2.6042 -8.3333  11.958 -8.3333  2.6042       0       0       0       0  26.042 -91.667  132.25 -91.667  26.042       0       0
>        0       0       0  2.6042 -8.3333  11.958 -8.3333  2.6042       0       0       0       0  26.042 -91.667  132.25 -91.667  26.042       0
>        0       0       0       0  2.6042 -8.3333  11.958 -8.3333  2.6042       0       0       0       0  26.042 -91.667  132.25 -91.667  26.042
>        0       0       0       0       0  2.6042 -8.3333  11.958 -8.3333       0       0       0       0       0  26.042 -91.667  132.25 -91.667
>        0       0       0       0       0       0  2.6042 -8.3333  9.3542       0       0       0       0       0       0  26.042 -91.667  106.21
>
> I am wondering how the d_nz and o_nz values should be set.
> I read the docs:
> https://petsc.org/release/docs/manualpages/Mat/MatMPIAIJSetPreallocation.html
>
> *For example, for 2 processes P0, P1:*
> mi denotes the number of rows that belong to process Pi:
> P0: m0 = 9
> P1: m1 = 9
>
> The d_nz and o_nz should be:
> P0: d_nz = 5; o_nz = 7
> P1: d_nz = 5; o_nz = 7
>
> Therefore, we are allocating m*(d_nz+o_nz) storage locations for every
> process:
> P0: 9*(5+7) = 108
> P1: 9*(5+7) = 108
> So, in total, we have 216 storage locations to store 168 nonzeros.
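>
> In code, this corresponds to a call like (a sketch; A denotes the matrix
> handle):
> ierr = MatMPIAIJSetPreallocation(A, 5, NULL, 7, NULL);CHKERRQ(ierr);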
>
> But I am running more experiments using various numbers of processes
> - e.g. 2, 4, 8, 19, 32, etc.
>
> *For example, if we take 3 processes - P0, P1, P2:*
> P0: m0 = 6
> P1: m1 = 6
> P2: m2 = 6
>
> The d_nz and o_nz should be:
> P0: d_nz = 5; o_nz = 7 (I took the maximum number of o_nz)
> P1: d_nz = 3; o_nz = 8  (I took the maximum number of o_nz)
> P2: d_nz = 5; o_nz = 7 (I took the maximum number of o_nz)
>
> Therefore, we are allocating m*(d_nz+o_nz) storage locations for every
> process:
> P0: 6*(5+7) = 72
> P1: 6*(3+8) = 66
> P2: 6*(5+7) = 72
> So, in total, we have 210 storage locations to store 168 nonzeros.
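>
> Since each rank passes its own counts to the routine, a sketch (with rank
> obtained from MPI_Comm_rank()) would be:
> if (rank == 1) { ierr = MatMPIAIJSetPreallocation(A, 3, NULL, 8, NULL);CHKERRQ(ierr); }
> else           { ierr = MatMPIAIJSetPreallocation(A, 5, NULL, 7, NULL);CHKERRQ(ierr); }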
>
> *My questions are:*
> 1) How should I set the d_nz and o_nz values when the number of processes
> changes in every experiment?
> -- I suppose there has to be some "general" setting, so that you do not have
> to change these values for every experiment, right?
> -- For better illustration, I am sending you figures of the matrix A in
> the attachment.
>

I think the correct answer is that d_nz/o_nz should only be used for testing
or example codes. Production codes should specify the number of nonzeros in
every row, d_nnz/o_nnz. We provide support for doing this easily with the
MatPreallocator class.
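
For example, a minimal sketch (assuming a square matrix of global size N on
communicator comm; the insertion loops are elided) could look like:

  Mat prealloc, A;

  ierr = MatCreate(comm, &prealloc);CHKERRQ(ierr);
  ierr = MatSetType(prealloc, MATPREALLOCATOR);CHKERRQ(ierr);
  ierr = MatSetSizes(prealloc, PETSC_DECIDE, PETSC_DECIDE, N, N);CHKERRQ(ierr);
  ierr = MatSetUp(prealloc);CHKERRQ(ierr);
  /* first pass: the same MatSetValues() calls as the real assembly;
     MATPREALLOCATOR only records the nonzero pattern */
  ierr = MatAssemblyBegin(prealloc, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(prealloc, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  ierr = MatCreate(comm, &A);CHKERRQ(ierr);
  ierr = MatSetType(A, MATAIJ);CHKERRQ(ierr); /* SEQAIJ on one rank, MPIAIJ otherwise */
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N);CHKERRQ(ierr);
  /* transfer the recorded pattern into A's preallocation */
  ierr = MatPreallocatorPreallocate(prealloc, PETSC_TRUE, A);CHKERRQ(ierr);
  ierr = MatDestroy(&prealloc);CHKERRQ(ierr);
  /* second pass: repeat the MatSetValues() calls to insert the actual
     values into A, then assemble A as usual */

You pay for looping over the insertion twice, but you no longer have to guess
d_nz/o_nz for every process count.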


> 2) If I run the code sequentially, will the MatMPIAIJSetPreallocation()
> routine crash?
>

If you run sequentially, meaning with the type MATSEQAIJ, then this function
is ignored, so your matrix will not be preallocated. This can still work, but
it will be slow.
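
If you want one code path that works both sequentially and in parallel, a
common pattern (a sketch, reusing the counts from your 2-process example; the
sequential routine takes a single per-row count, here 5+7) is to call both
routines, since each is ignored unless it matches the matrix type:

  ierr = MatSeqAIJSetPreallocation(A, 12, NULL);CHKERRQ(ierr);
  ierr = MatMPIAIJSetPreallocation(A, 5, NULL, 7, NULL);CHKERRQ(ierr);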


> 3) If I need to generate a large identity matrix, is the following a correct
> approach?
> // create the identity matrix Ione (diagonal matrix)
> ierr = MatCreate(comm, &Ione);CHKERRQ(ierr);
> ierr = MatSetType(Ione, MATMPIAIJ);CHKERRQ(ierr);
> ierr = MatSetSizes(Ione, PETSC_DECIDE, PETSC_DECIDE, data_size, data_size);CHKERRQ(ierr);
>
> // parallel sparse - preallocate 1 diagonal / 0 off-diagonal nonzeros per row
> ierr = MatMPIAIJSetPreallocation(Ione, 1, NULL, 0, NULL);CHKERRQ(ierr);
>
> // create a vector of ones
> ierr = VecCreate(PETSC_COMM_WORLD, &diagVec);CHKERRQ(ierr);
> ierr = VecSetSizes(diagVec, PETSC_DECIDE, data_size);CHKERRQ(ierr);
> ierr = VecSetType(diagVec, VECMPI);CHKERRQ(ierr);
> ierr = VecSet(diagVec, 1.0);CHKERRQ(ierr);
>
> // set the diagonal of Ione using the vector diagVec
> ierr = MatDiagonalSet(Ione, diagVec, INSERT_VALUES);CHKERRQ(ierr);
>
> ierr = MatAssemblyBegin(Ione, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
> ierr = MatAssemblyEnd(Ione, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
>

Yes, that is a fine way to do it.
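
An equivalent sketch without the work vector, assuming Ione has been created
and preallocated as above, sets each locally owned diagonal entry directly:

  PetscInt rstart, rend, row;

  ierr = MatGetOwnershipRange(Ione, &rstart, &rend);CHKERRQ(ierr);
  /* each rank inserts only into the rows it owns */
  for (row = rstart; row < rend; row++) {
    ierr = MatSetValue(Ione, row, row, 1.0, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(Ione, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(Ione, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);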

  Thanks,

     Matt


> Thank you very much in advance.
> Kind regards
> Gabi
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/