[petsc-users] KSP with large sparse kronecker product

Donald Planalp donaldrexplanalpjr at outlook.com
Thu Jan 16 10:00:21 CST 2025


Hello,

I am writing to ask what the best approach would be for a problem I am encountering in my quantum mechanics research.

For some context, at each time step I solve a linear system Ax = b in parallel, where the right-hand side is b = By. Both A and B can be written as a sum of 4-5 Kronecker products. A and B are built from the same terms, with a few signs flipped in the sum, and some of the scalings are time dependent, so the sum is currently re-evaluated at every step.
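
In symbols (my own shorthand, with c_k(t) the scalings and P_k, Q_k the sparse factors):

A(t) x = b,   b = B(t) y
A(t) = \sum_{k=1}^{K} c_k(t) (P_k \otimes Q_k)
B(t) = \sum_{k=1}^{K} s_k c_k(t) (P_k \otimes Q_k),   s_k = +1 or -1,   K = 4 or 5

where the s_k encode the flipped signs and some of the c_k depend on time.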

The issue I'm having is balancing GMRES solve speed against memory usage. After summing, A and B have the sparsity structure of a Kronecker product of a 1000x1000 matrix with 5 nonzeros per row (concentrated near the diagonal) and a 2000x2000 matrix with 13 nonzeros per row (banded around the diagonal).
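
To put rough numbers on that (back-of-the-envelope, so only approximate): the two factors hold about 1000*5 = 5,000 and 2000*13 = 26,000 nonzeros, while the explicit product is 2,000,000 x 2,000,000 with up to 5*13 = 65 nonzeros per row, on the order of 1.3e8 nonzeros, i.e. roughly 1.5 GB per stored matrix in AIJ format (8-byte values plus 4-byte column indices), before counting the 4-5 separate terms.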

Storing the Kronecker product explicitly therefore takes a lot of memory, which seems wasteful since the product contains so much repeated information. On top of that, re-summing the matrices at every step because of the time-dependent scalings, combined with the problem's dimensionality, quickly makes the solver much slower.
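
For reference, the per-step rebuild I am doing now is essentially the following (simplified; Aterm[] are the pre-assembled explicit Kronecker terms and coeff[] the scalings evaluated at the current time):

#include <petscmat.h>

/* Rebuild A(t) = sum_k coeff[k] * Aterm[k] at each time step, where every
   Aterm[k] is an explicitly assembled Kronecker product. */
static PetscErrorCode BuildA(Mat A, PetscInt nterms, Mat Aterm[], const PetscScalar coeff[])
{
  PetscFunctionBeginUser;
  PetscCall(MatZeroEntries(A)); /* keep A's preallocated pattern, reset the values */
  for (PetscInt k = 0; k < nterms; k++) {
    /* SUBSET_NONZERO_PATTERN assumes A was preallocated with the union of all term patterns */
    PetscCall(MatAXPY(A, coeff[k], Aterm[k], SUBSET_NONZERO_PATTERN));
  }
  PetscFunctionReturn(PETSC_SUCCESS);
}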

I've looked into various matrix types. MATMPIKAIJ seems close, but I would need more terms than it supports, and all of my factor matrices are sparse, not just some of them. I've also tried MATCOMPOSITE to at least avoid the explicit sums, but the solve time actually gets slower.
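
The MATCOMPOSITE variant I tried was along these lines (again simplified, and assuming MatCompositeSetScalings is the right way to attach the time-dependent coefficients):

#include <petscmat.h>

/* Additive composite: the sum is never formed; each MatMult loops over the
   terms and accumulates the scaled term-by-term products. */
static PetscErrorCode BuildComposite(MPI_Comm comm, PetscInt nterms, Mat Aterm[], const PetscScalar coeff[], Mat *A)
{
  PetscFunctionBeginUser;
  PetscCall(MatCreateComposite(comm, nterms, Aterm, A));
  PetscCall(MatCompositeSetType(*A, MAT_COMPOSITE_ADDITIVE));
  PetscCall(MatCompositeSetScalings(*A, coeff)); /* per-term time-dependent scalings */
  PetscFunctionReturn(PETSC_SUCCESS);
}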

Is there a better way to handle this in PETSc that achieves the best of both worlds: low memory usage and no explicit evaluation of the Kronecker products, while keeping similar matrix-vector product speed for GMRES?
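
To make concrete the kind of thing I am hoping is possible, here is a rough serial-only sketch (just an illustration, not code I have working) of a matrix-free matvec for a single Kronecker term through a MATSHELL, based on the identity (P kron Q) vec(X) = vec(Q X P^T). The names and sizes are placeholders, the real operator would accumulate all 4-5 scaled terms, and I have ignored the parallel data layout, which is part of what I am unsure how to handle:

#include <petscksp.h>

/* Context for a shell matrix that applies a single Kronecker term P kron Q
   without ever forming it: P is m x m, Q is n x n, the shell is (m*n) x (m*n). */
typedef struct {
  Mat      P, Q;            /* small sparse factors */
  PetscInt m, n;
  Vec      row_in, row_out; /* length-m work vectors for the P sweep */
} KronCtx;

/* y = (P kron Q) x, using (P kron Q) vec(X) = vec(Q X P^T) with X of size n x m
   stored column-major, so column j of X is the contiguous block x[j*n .. (j+1)*n-1]. */
static PetscErrorCode MatMult_Kron(Mat A, Vec x, Vec y)
{
  KronCtx           *ctx;
  const PetscScalar *xa, *ro;
  PetscScalar       *ya, *ra;

  PetscFunctionBeginUser;
  PetscCall(MatShellGetContext(A, &ctx));
  PetscCall(VecGetArrayRead(x, &xa));
  PetscCall(VecGetArray(y, &ya));
  /* Sweep 1: W = Q X, applying Q to each of the m contiguous columns of X */
  for (PetscInt j = 0; j < ctx->m; j++) {
    Vec xj, wj;
    PetscCall(VecCreateSeqWithArray(PETSC_COMM_SELF, 1, ctx->n, xa + j * ctx->n, &xj));
    PetscCall(VecCreateSeqWithArray(PETSC_COMM_SELF, 1, ctx->n, ya + j * ctx->n, &wj));
    PetscCall(MatMult(ctx->Q, xj, wj));
    PetscCall(VecDestroy(&xj));
    PetscCall(VecDestroy(&wj));
  }
  /* Sweep 2: Y = W P^T, i.e. apply P to each length-m row of W (strided in ya) */
  for (PetscInt i = 0; i < ctx->n; i++) {
    PetscCall(VecGetArray(ctx->row_in, &ra));
    for (PetscInt j = 0; j < ctx->m; j++) ra[j] = ya[j * ctx->n + i];
    PetscCall(VecRestoreArray(ctx->row_in, &ra));
    PetscCall(MatMult(ctx->P, ctx->row_in, ctx->row_out));
    PetscCall(VecGetArrayRead(ctx->row_out, &ro));
    for (PetscInt j = 0; j < ctx->m; j++) ya[j * ctx->n + i] = ro[j];
    PetscCall(VecRestoreArrayRead(ctx->row_out, &ro));
  }
  PetscCall(VecRestoreArray(y, &ya));
  PetscCall(VecRestoreArrayRead(x, &xa));
  PetscFunctionReturn(PETSC_SUCCESS);
}

/* Wrap the matvec in a MATSHELL; the result can be handed to KSPSetOperators
   for GMRES.  A full version would keep all terms in the context and sum
   c_k(t) * (P_k kron Q_k) x inside MatMult_Kron. */
static PetscErrorCode CreateKronShell(Mat P, Mat Q, PetscInt m, PetscInt n, Mat *A)
{
  KronCtx *ctx;

  PetscFunctionBeginUser;
  PetscCall(PetscNew(&ctx));
  ctx->P = P; ctx->Q = Q; ctx->m = m; ctx->n = n;
  PetscCall(VecCreateSeq(PETSC_COMM_SELF, m, &ctx->row_in));
  PetscCall(VecCreateSeq(PETSC_COMM_SELF, m, &ctx->row_out));
  PetscCall(MatCreateShell(PETSC_COMM_SELF, m * n, m * n, m * n, m * n, ctx, A));
  PetscCall(MatShellSetOperation(*A, MATOP_MULT, (void (*)(void))MatMult_Kron));
  PetscFunctionReturn(PETSC_SUCCESS);
}

In principle this keeps only the small factors in memory and the matvec cost is two sweeps with the sparse factors, but I do not know whether this approach plays well with PETSc's parallel layouts or with preconditioning for GMRES.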

Thank you for your time.

