[petsc-users] Need advice to do blockwise matrix-vector multiplication

Fangbo Wang fangbowa at buffalo.edu
Thu Jan 19 16:10:49 CST 2017


Hi,

*Background:*

I am using stochastic finite elements to solve a solid mechanics problem
with random material properties. At the end of the day, I get a linear
system of equations Ax=b to solve.

The matrix A is very large, 1.3 million by 1.3 million, and storing it
explicitly would take more than 100 GB of memory. Fortunately, A has a
nice structure: it is a block matrix (symmetric block Toeplitz, see the
attached figure) in which most of the blocks are repeated, and each
block is 10,000 by 10,000.


Hence, I only need to store a few distinct blocks (45 in my case):
since 1.3 million / 10,000 = 130, A is a 130 by 130 array of blocks,
but only those 45 have to be kept in memory. Most of the computation
in my iterative solver is matrix-vector multiplication, which is why I
want to carry it out blockwise.
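To make this concrete, here is roughly the kind of shell-matrix matvec
I have in mind (just a sketch, not working code: BlockCtx, blockId,
xsub, and ysub are names I made up, and the scatter/gather between the
global vectors and the sub-vectors is omitted):

    #include <petscmat.h>

    /* the 45 distinct blocks plus a lookup table mapping each block
       position (i,j) to one of the stored blocks */
    typedef struct {
      PetscInt nblk;      /* number of block rows/columns, 130 here     */
      Mat      *blocks;   /* the 45 distinct 10,000 x 10,000 blocks     */
      PetscInt *blockId;  /* blockId[i*nblk+j] = stored block for (i,j) */
      Vec      *xsub;     /* the 130 pieces of x, length 10,000 each    */
      Vec      *ysub;     /* work pieces of y, one per block row        */
    } BlockCtx;

    /* y = A*x computed block row by block row: y_i = sum_j A_ij x_j */
    static PetscErrorCode MatMult_Block(Mat A, Vec x, Vec y)
    {
      BlockCtx       *ctx;
      PetscInt       i, j;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = MatShellGetContext(A, (void**)&ctx);CHKERRQ(ierr);
      /* ... scatter x into ctx->xsub[0..nblk-1] (omitted) ... */
      for (i = 0; i < ctx->nblk; i++) {
        ierr = VecZeroEntries(ctx->ysub[i]);CHKERRQ(ierr);
        for (j = 0; j < ctx->nblk; j++) {
          Mat B = ctx->blocks[ctx->blockId[i*ctx->nblk + j]];
          /* ysub[i] += B * xsub[j]; the same stored block is reused
             at every position (i,j) where it appears */
          ierr = MatMultAdd(B, ctx->xsub[j], ctx->ysub[i], ctx->ysub[i]);CHKERRQ(ierr);
        }
      }
      /* ... gather ctx->ysub[0..nblk-1] back into y (omitted) ... */
      PetscFunctionReturn(0);
    }

I would then wrap this with MatCreateShell and MatShellSetOperation(A,
MATOP_MULT, ...) so that KSP can use it like any other matrix.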



*Current:*
I distribute each of the 45 block matrices, and the corresponding 45
block vectors, across all of the processors. However, the computation
is very slow and shows no scalability at all.
I am now thinking of splitting the work among small groups of
processors, for example with intra-communicators and
inter-communicators. Maybe this will help reduce the communication.
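To illustrate the idea (again only a sketch; the group size of 8 and
the way blocks are assigned to groups are placeholders I made up):

    #include <petscmat.h>

    int main(int argc, char **argv)
    {
      MPI_Comm       subcomm;
      PetscMPIInt    rank;
      Mat            B;
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
      ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);

      /* split the world into groups of 8 ranks; ranks with the same
         color land in the same sub-communicator */
      ierr = MPI_Comm_split(PETSC_COMM_WORLD, rank/8, rank, &subcomm);CHKERRQ(ierr);

      /* each group builds and applies only the blocks assigned to it */
      ierr = MatCreate(subcomm, &B);CHKERRQ(ierr);
      ierr = MatSetSizes(B, PETSC_DECIDE, PETSC_DECIDE, 10000, 10000);CHKERRQ(ierr);
      ierr = MatSetFromOptions(B);CHKERRQ(ierr);
      ierr = MatSetUp(B);CHKERRQ(ierr);
      /* ... assemble this block, do this group's matvecs, then
         combine the partial results across groups ... */

      ierr = MatDestroy(&B);CHKERRQ(ierr);
      ierr = MPI_Comm_free(&subcomm);CHKERRQ(ierr);
      ierr = PetscFinalize();
      return ierr;
    }

But I am not sure this is the right way to organize it in PETSc, hence
my question below.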

Does anyone have experience with this? Are there PETSc functions for
this kind of job? I am open to any suggestions.

Thank you very much!



Fangbo Wang

-- 
Fangbo Wang, PhD student
Stochastic Geomechanics Research Group
Department of Civil, Structural and Environmental Engineering
University at Buffalo
Email: fangbowa at buffalo.edu
[Attached figure: a symmetric block-Toeplitz matrix; each block is
also symmetric.]

