[petsc-users] optimizing repeated calls to KSPsolve?

Jed Brown jed at 59A2.org
Fri Dec 10 17:30:39 CST 2010


On Sat, Dec 11, 2010 at 00:03, Luke Bloy <luke.bloy at gmail.com> wrote:

> I'm solving a diffusion problem. Essentially I have 2,000,000 possible
> states for my system to be in. The system evolves based on a Markov matrix
> M, which describes the probability that the system moves from one state to
> another. This matrix is extremely sparse, with fewer than 100,000,000
> nonzero elements. The problem is to pump mass/energy into the system at
> certain states. What I'm interested in is the steady-state behavior of the
> system.
>
> Basically, the dynamics can be summarized as
>
> d_{t+1} = M d_{t} + d_i
>
> Where d_t is the state vector at time t and d_i shows the states I am
> pumping energy into. I want to find d_t as t goes to infinity.
>
> My current approach is to solve the following system.
>
> (I-M) d = d_i
>
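[Editor's note: the equivalence between the fixed point of the dynamics and the linear system is easy to check on a toy problem. The sketch below uses a made-up 3-state substochastic matrix (spectral radius < 1, so the iteration converges), not the poster's actual data, and plain NumPy rather than PETSc.]

```python
import numpy as np

# Toy substochastic transition matrix (row sums < 1, Gershgorin gives
# spectral radius <= 0.9, so the iteration converges). NOT the real system.
M = np.array([[0.5, 0.2, 0.1],
              [0.2, 0.5, 0.2],
              [0.1, 0.2, 0.5]])
d_i = np.array([1.0, 0.0, 0.0])  # constant injection vector

# Run the dynamics d_{t+1} = M d_t + d_i until the state stops changing.
d = np.zeros(3)
for _ in range(10000):
    d = M @ d + d_i

# The same steady state comes from solving (I - M) d = d_i directly.
d_direct = np.linalg.solve(np.eye(3) - M, d_i)
print(np.allclose(d, d_direct))  # prints True
```

The fixed point d = M d + d_i and the solution of (I - M) d = d_i are the same vector whenever the iteration converges, which is why the linear-system formulation is the right one to hand to a solver.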

So you want to do this for some 500,000 d_i?  What problem are you really
trying to solve?  Is it really to just brute-force compute states for all
these inputs?  What are you doing with the resulting 500k states (all 8
terabytes of it)?  Are you, for example, looking for some d_i that would
change the steady state d in a certain way?

>> 2. If you can afford the memory, a direct solve probably makes sense.
>
>
> My understanding is that inverses would generally be dense. I certainly
> don't have the memory to hold a 2 million by 2 million dense matrix; I
> have about 40 GB to play with. So perhaps a decomposition might work?
> Which might you suggest?
>

While inverses are almost always dense, sparse factorization is far from
dense.  For PDE problems factored in an optimal ordering, the memory
asymptotics are n*log n in 2D and n^{4/3} in 3D.  The time asymptotics are
n^{3/2} and n^2 respectively.  Compare to n^2 memory, n^3 time for dense.
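[Editor's note: the practical upshot for many right-hand sides is to factor (I - M) once and reuse the factors for every d_i; in PETSc this is KSPSetOperators once plus repeated KSPSolve with a direct preconditioner such as -pc_type lu. A hedged, self-contained SciPy sketch of the same pattern, on a small made-up sparse matrix rather than the poster's 2M-state system:]

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 1000
rng = np.random.default_rng(0)
# Toy sparse M, ~0.5% fill; a stand-in for the real Markov matrix.
M = sp.random(n, n, density=0.005, format="csr", random_state=rng) * 0.1
A = (sp.eye(n) - M).tocsc()  # (I - M); splu wants CSC format

lu = spla.splu(A)  # sparse LU factorization: the expensive step, done ONCE

# Each subsequent solve reuses the factors and costs only two sparse
# triangular back-substitutions, not a new factorization.
solutions = []
for k in range(5):
    d_i = np.zeros(n)
    d_i[k] = 1.0  # pump energy into state k
    solutions.append(lu.solve(d_i))

# Sanity check: the first solution really satisfies (I - M) d = d_i.
e0 = np.zeros(n)
e0[0] = 1.0
residual = np.linalg.norm(A @ solutions[0] - e0)
print(residual < 1e-10)  # prints True
```

With 40 GB of memory and the factorization asymptotics above, a sparse direct factorization of a 2M-row matrix may well fit, and amortizing the single factorization over hundreds of thousands of solves is exactly where the approach pays off.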

Jed
