[petsc-users] optimizing repeated calls to KSPsolve?

Luke Bloy luke.bloy at gmail.com
Fri Dec 10 17:03:31 CST 2010


Thanks for the response.

On 12/10/2010 04:18 PM, Jed Brown wrote:
> On Fri, Dec 10, 2010 at 22:15, Luke Bloy <luke.bloy at gmail.com> wrote:
>
>     My problem is that I have a large number (~500,000) of b vectors
>     that I would like to find solutions for. My plan is to call
>     KSPsolve repeatedly with each b. However I wonder if there are any
>     solvers or approaches that might benefit from the fact that my A
>     matrix does not change. Are there any decompositions that might
>     still be sparse that would offer a speed up?
>
>
> 1. What is the high-level problem you are trying to solve?  There 
> might be a better way.
>
I'm solving a diffusion problem. Essentially, I have 2,000,000 possible 
states for my system to be in. The system evolves according to a Markov 
matrix M, which describes the probability that the system moves from one 
state to another. This matrix is extremely sparse, with fewer than 
100,000,000 nonzero elements. I pump mass/energy into the system at 
certain states, and what I'm interested in is the steady-state behavior 
of the system.

Basically, the dynamics can be summarized as

d_{t+1} = M d_{t} + d_i

where d_t is the state vector at time t and d_i represents the states I am 
pumping energy into. I want to find d_t as t goes to infinity.

At steady state d_{t+1} = d_t = d, so d = M d + d_i. My current approach is 
therefore to solve the following system:

(I - M) d = d_i

I'm certainly open to any suggestions you might have.
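
For reference, my current plan looks roughly like the sketch below (error 
checking omitted; solve_all and nrhs are just placeholder names, and I'm 
assuming a roughly petsc-3.1-era C API):

#include <petscksp.h>

/* Reuse one KSP (and its preconditioner setup) for every right-hand side,
   since A = I - M never changes. */
PetscErrorCode solve_all(Mat A, Vec *b, Vec *d, PetscInt nrhs)
{
  KSP      ksp;
  PetscInt i;

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  /* SAME_PRECONDITIONER: the operator never changes, so the
     preconditioner is built once and reused for every solve */
  KSPSetOperators(ksp, A, A, SAME_PRECONDITIONER);
  KSPSetFromOptions(ksp);

  for (i = 0; i < nrhs; i++) {
    KSPSolve(ksp, b[i], d[i]);  /* setup happens on the first call only */
  }

  KSPDestroy(ksp);
  return 0;
}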

> 2. If you can afford the memory, a direct solve probably makes sense.

My understanding is that the inverse would generally be dense. I certainly 
don't have enough memory to hold a 2,000,000 x 2,000,000 dense matrix; I 
have about 40 GB to play with. So perhaps a sparse factorization might 
work? Which would you suggest?
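
(From the manual it looks like something along these lines would request a 
sparse LU factorization through an external package, though I may well have 
the option names wrong for my PETSc version:

  -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps

so that the factorization is computed once and each subsequent KSPSolve is 
just a forward/back substitution.)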

Thanks
Luke
