[petsc-users] Problem when solving matrices with identity matrices as diagonal block domains
Adrián Amor
aamor at pa.uc3m.es
Thu Feb 1 02:45:21 CST 2018
Hi,
First, I am a novice in the use of PETSc, so apologies if this is a newbie
mistake, but maybe you can help me! I am solving a matrix of the form:
( Identity            block (50% dense) )
( block (50% dense)   Identity          )
I have found a performance problem in the solver when I treat the
diagonal blocks as sparse matrices in Fortran. In other words, I use the
routine
MatCreateSeqAIJ
to preallocate the matrix, and then I have tried two approaches (a
simplified sketch of both follows the list):
1. Calling MatSetValues for every entry of the identity block, zeros
included. That is, if the identity block is 22x22, I call MatSetValues
22*22 times.
2. Calling MatSetValues only once per row. If the identity block is
22x22, I call MatSetValues only 22 times.
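
To make the two cases concrete, here is a simplified free-form Fortran
sketch of what I mean for a single n x n identity block (this is not my
actual code; the use of the module interface of recent PETSc versions
and the assumption that case 2 inserts only the diagonal entry are just
for illustration):

program identity_block
#include <petsc/finclude/petscmat.h>
  use petscmat
  implicit none
  Mat            :: A
  PetscErrorCode :: ierr
  PetscInt       :: i, j, n, ione
  PetscInt       :: row(1), col(1)
  PetscScalar    :: v(1)

  call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
  n    = 22
  ione = 1

  ! Case 1: one MatSetValues call per entry, zeros included, so every
  ! row of the AIJ structure ends up with n stored entries.
  call MatCreateSeqAIJ(PETSC_COMM_SELF, n, n, n, PETSC_NULL_INTEGER, A, ierr)
  do i = 0, n - 1
    do j = 0, n - 1
      row(1) = i
      col(1) = j
      if (i == j) then
        v(1) = 1.0
      else
        v(1) = 0.0
      end if
      call MatSetValues(A, ione, row, ione, col, v, INSERT_VALUES, ierr)
    end do
  end do
  call MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY, ierr)
  call MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY, ierr)
  call MatDestroy(A, ierr)

  ! Case 2: one call per row, inserting only the diagonal 1, so every
  ! row stores a single nonzero (no explicit zeros).
  call MatCreateSeqAIJ(PETSC_COMM_SELF, n, n, ione, PETSC_NULL_INTEGER, A, ierr)
  do i = 0, n - 1
    row(1) = i
    col(1) = i
    v(1)   = 1.0
    call MatSetValues(A, ione, row, ione, col, v, INSERT_VALUES, ierr)
  end do
  call MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY, ierr)
  call MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY, ierr)
  call MatDestroy(A, ierr)

  call PetscFinalize(ierr)
end program identity_block

If I understand the AIJ format correctly, both versions define the same
entries, but case 1 stores the explicit zeros in the matrix structure
while case 2 does not.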
With case 1, the iterative solver (I have tried the default one and
KSPBCGS) takes only one iteration to converge, with a residual of 1e-14.
With case 2, however, the solver takes, say, 9 iterations and converges
with a residual of 1e-04. The matrices loaded into PETSc are exactly the
same (I have written them to a file, extracting the entries of the solved
matrix with MatGetValues).
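(For reference, here is a simplified sketch of how the assembled matrix
could be dumped to an ASCII file for this comparison; in my code I
actually gather the entries with MatGetValues, so the viewer-based
helper below is only illustrative:)

! Illustrative helper, not my real code: write matrix A to an ASCII file.
subroutine dump_matrix(A, fname, ierr)
#include <petsc/finclude/petscmat.h>
  use petscmat
  implicit none
  Mat, intent(in)              :: A
  character(len=*), intent(in) :: fname
  PetscErrorCode, intent(out)  :: ierr
  PetscViewer                  :: viewer

  call PetscViewerASCIIOpen(PETSC_COMM_SELF, fname, viewer, ierr)
  call MatView(A, viewer, ierr)
  call PetscViewerDestroy(viewer, ierr)
end subroutine dump_matrix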
What can be happening? I know that converging in a single iteration just
means the iterative solver is "lucky" and its first guess happens to be
right, but I don't understand the difference in performance, since the
matrix is the same. I would like to use case 2, since my matrices are
quite large and it is much more efficient.
Please help me! Thanks!
Adrian.