[petsc-users] ksp for AX=B system

Barry Smith bsmith at mcs.anl.gov
Tue Apr 16 19:49:58 CDT 2013


  Shuangshuang

     This is what I was expecting, thanks for the confirmation. For problems of these sizes you definitely want to use a direct solver (parallel for the larger matrices, though not for the smaller ones) and solve all the right hand sides together. This means you will not use the KSP solver that is standard for most PETSc work; instead you will work directly with the MatGetFactor(), MatGetOrdering(), MatLUFactorSymbolic(), MatLUFactorNumeric(), MatMatSolve() paradigm, where the A matrix is stored as a MATAIJ matrix and B (the multiple right hand sides) as a MATDENSE matrix. 

   An example that displays this paradigm is src/mat/examples/tests/ex125.c 
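
   For reference, the core of the sequence in that example looks roughly like this (just a sketch with error checking omitted; A is the assembled MATAIJ matrix, B the MATDENSE right hand sides, and the built-in PETSc LU with nested-dissection ordering is only one possible choice):

     Mat           F, X;              /* factor matrix and dense solution matrix */
     IS            rowperm, colperm;
     MatFactorInfo info;

     MatGetFactor(A, MATSOLVERPETSC, MAT_FACTOR_LU, &F);
     MatGetOrdering(A, MATORDERINGND, &rowperm, &colperm);
     MatFactorInfoInitialize(&info);
     info.fill = 5.0;                                  /* expected fill ratio, as in ex125.c */
     MatLUFactorSymbolic(F, A, rowperm, colperm, &info);
     MatLUFactorNumeric(F, A, &info);

     MatDuplicate(B, MAT_DO_NOT_COPY_VALUES, &X);      /* X has the same dimensions as B */
     MatMatSolve(F, B, X);                             /* solves A X = B for all columns at once */

   For a parallel run you would pass an external package such as MATSOLVERSUPERLU_DIST or MATSOLVERMUMPS to MatGetFactor() instead of MATSOLVERPETSC.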

   Once you have something running that is of interest to you, we would like to work with you to improve the performance; we have some "tricks" we haven't yet implemented that will make these solvers much faster than they are by default. 

   Barry



On Apr 16, 2013, at 7:38 PM, "Jin, Shuangshuang" <Shuangshuang.Jin at pnnl.gov> wrote:

> Hi, Barry, thanks for your prompt reply.
> 
> We have problems of various sizes, from (n=9, m=3) and (n=1081, m=288) up to (n=16072, m=2361) or even larger ultimately. 
> 
> Usually the dimension n of the square matrix A is much larger than the column dimension m of B.
> 
> As you said, I'm using the loop to deal with the small (n=9, m=3) case. However, for bigger problems I hope there's a better approach.
> 
> This is a power flow problem. When we parallelized it in OpenMP previously, we just parallelized the outside loop, and used a direct solver to solve it.
> 
> We are now switching to MPI and would like to use the PETSc KSP solver to solve it in parallel. However, I don't know which solver is best here: a direct solver, or an iterative one with a preconditioner?
> 
> I would very much appreciate your recommendations. 
> 
> Thanks,
> Shuangshuang
> 
> 
> -----Original Message-----
> From: petsc-users-bounces at mcs.anl.gov [mailto:petsc-users-bounces at mcs.anl.gov] On Behalf Of Barry Smith
> Sent: Tuesday, April 16, 2013 5:16 PM
> To: PETSc users list
> Subject: Re: [petsc-users] ksp for AX=B system
> 
> 
>   Shuangshuang,
> 
>    How large are n and m?  PETSc does not have any built-in "multiple right hand side" iterative solvers. Generally if m is small, say m < 10-20, we recommend just solving each one in a loop as you suggest below.  If m is large and n is not "too large" we recommend using a direct solver and MatMatSolve() to solve all the right hand sides "together". If n is not "too large" and m is very large, it would also be reasonable to solve different "sets" of right hand sides in parallel, where each "set" is solved using a direct solver with MatMatSolve(); we don't have specific "canned" code set up to do this, but it is pretty straightforward. 
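> 
>    For the loop case the idea is simply to create the KSP once and reuse it for every right hand side, something like the following sketch (error checking omitted; b[i] and x[i] stand for the Vecs holding the i-th right hand side and solution):
> 
>      KSP      ksp;
>      PetscInt i;
> 
>      KSPCreate(PETSC_COMM_WORLD, &ksp);
>      KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);
>      KSPSetFromOptions(ksp);           /* e.g. -ksp_type preonly -pc_type lu for a direct solve */
>      for (i = 0; i < m; i++) {
>        KSPSolve(ksp, b[i], x[i]);      /* the factorization/preconditioner is built once and reused */
>      }
>      KSPDestroy(&ksp);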
> 
>   Also, where is your matrix coming from? A PDE for which there are good known preconditioners (like multigrid), or some other type of problem without good preconditioners?
> 
>   Once we know the type of problem and some idea of n and m we can make more specific recommendations.
> 
>    Barry
> 
> 
> On Apr 16, 2013, at 6:21 PM, "Jin, Shuangshuang" <Shuangshuang.Jin at pnnl.gov> wrote:
> 
>> Hi, PETSc developers, I have another question regarding solving AX=B linear systems.
>> 
>> I know that we can use the PETSc KSP solver to solve an Ax=b linear system in parallel, where A is a square matrix and b is a column vector, for example.
>> 
>> What about solving AX=B in parallel, where A is still n*n and B is an n*m matrix?
>> 
>> If I solve each column of B one by one, such as:
>> for (i = 0; i < m; i++)
>>   Callkspsolver(A, xi, bi); // user-defined wrapper function that calls the PETSc KSP solver
>> 
>> Then each individual A*xi = bi solve runs in parallel. However, if m is big, the sequential outside loop is quite inefficient.
>> 
>> What is the best approach to parallelize the outside loop as well to speed up the overall computation?
>> 
>> Thanks,
>> Shuangshuang
>> 
> 


