[petsc-users] MatSetValues is expensive
M. Scot Breitenfeld
brtnfld at uiuc.edu
Thu Feb 24 09:33:59 CST 2011
Hi,
I'm working on a particle-type method and I'm using MatSetValues with
ADD_VALUES to insert values into my matrix. Currently I do:
do i = 1, number of particles
  do j = 1, number of particles in i's family
    ...
    in the rows of i's dofs: insert values in the columns of j's (x,y,z) dofs
      (3 calls to MatSetValues, one per dof of i)
    in the rows of j's dofs: insert values in the columns of i's (x,y,z) dofs
      (3 calls to MatSetValues, one per dof of j)
    ...
  enddo
enddo
Running serially, assembling the matrix with MatSetValues takes 294.8
sec.; if I remove the calls to MatSetValues, running through the same
loops takes 29.5 sec., so the MatSetValues calls account for 90% of the
assembly time. I'm preallocating the A matrix, specifying d_nnz and o_nnz.
I guess I need to add extra storage so I can pass more values to each
MatSetValues call and call it less often, or do a lot of recomputation
of values so that I can add an entire row at once. I just want to make
sure this is expected behavior, and not something I'm doing wrong,
before I start rewriting my assembly routine. A hash table would
probably be better, but I don't want to store one and then convert it
to a CSR matrix; I'm already running into memory issues as it is.
Just out of curiosity: wouldn't a finite element code face a similar
situation? There you would form the local stiffness matrix and then
insert it into the global stiffness matrix, calling MatSetValues
"number of elements" times.