[petsc-users] Code performance for solving multiple RHS

Harshad Ranadive harshadranadive at gmail.com
Wed Aug 24 20:07:37 CDT 2016


Thanks a lot, Barry, for your help. I have resolved, to some degree, the issue
of the linear solver being slow for a very large number of RHS vectors.

*To summarize:*

1) In my case, I wanted to solve the *same linear system a large number of
times for different RHS vectors*. Previously I was using the iterative
solver KSPGMRES with the preconditioner PCJACOBI.

2) I had a huge number of RHS vectors to solve for (~1M per time step). This
had a computational cost *9.6 times* greater than my explicit code (which did
not need any matrix inversion, only RHS evaluation).

3) Based on Barry's suggestion to use a direct solver, I changed the KSP
type to KSPPREONLY and used the preconditioner PCLU (see the sketch after
this list).

4) This approach has the small drawback that each linear system is solved
sequentially on a single processor, although different systems are now
solved by different processors in parallel.

5) The current computational cost is only *1.16 times* the explicit code.
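
A minimal sketch of this setup with the PETSc C API (error checking omitted;
A, b and x stand for the already assembled matrix and vectors):

    KSP ksp;
    PC  pc;
    KSPCreate(PETSC_COMM_SELF, &ksp);   /* each process owns its own sequential system */
    KSPSetOperators(ksp, A, A);
    KSPSetType(ksp, KSPPREONLY);        /* no Krylov iterations, apply the PC only */
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCLU);                /* direct LU factorization */
    KSPSetUp(ksp);                      /* LU factors computed here (or at first solve) */

    /* afterwards, every solve with a new RHS reuses the stored factors */
    KSPSolve(ksp, b, x);

The factorization cost is paid once; each subsequent KSPSolve only performs
the forward/backward triangular solves.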

Thanks and Regards,
Harshad


On Fri, Aug 12, 2016 at 1:27 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
> > On Aug 11, 2016, at 10:14 PM, Harshad Ranadive <
> harshadranadive at gmail.com> wrote:
> >
> > Hi Barry,
> >
> > Thanks for this recommendation.
> >
> > As you mention, the matrix factorization should be on a single processor.
> > If the factored matrix A is available on all processors, can I then use
> MatMatSolve(A,B,X) in parallel? That is, could the RHS block matrix 'B' and
> solution matrix 'X' be distributed across different processors, as is done
> when using MatCreateDense(...)?
>
>   Not sure what you mean.
>
>   You can have different processes handle different right hand sides. So
> give the full linear system matrix to each process; each process factors it
> and then each process solves a different set of right hand sides.
> Embarrassingly parallel except for any communication you need to do to get
> the matrix and right hand sides to the right processes.
>
>    If the linear system involves, say, millions of unknowns, this is the way
> to go. If the linear system is over, say, 1 billion unknowns, then it might
> be worth solving each linear system in parallel.
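
A rough sketch of that layout (each rank holds a sequential copy of the
matrix and takes every size-th right-hand side; n_rhs and build_rhs are
hypothetical application-side names, error checking omitted):

    PetscMPIInt rank, size;
    MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
    MPI_Comm_size(PETSC_COMM_WORLD, &size);

    /* A, ksp, b, x all live on PETSC_COMM_SELF, so every rank factors its
       own copy of A once and then works through its share of the RHSs */
    for (PetscInt i = rank; i < n_rhs; i += size) {
      build_rhs(i, b);      /* hypothetical routine filling the i-th RHS */
      KSPSolve(ksp, b, x);
      /* ... store or use the solution for RHS i ... */
    }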
>
>    Barry
>
> >
> > Thanks,
> > Harshad
> >
> >
> >
> > On Fri, Aug 12, 2016 at 2:09 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >
> >    If it is sequential, which it probably should be, then you can use
> MatLUFactorSymbolic(), MatLUFactorNumeric() and MatMatSolve() where you put
> a bunch of your right hand side vectors into a dense array; not all million
> of them but maybe 10 to 100 at a time.
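
A minimal sketch of that sequence (assuming a sequential AIJ matrix A of
size n; the batch size of 50 is arbitrary and error checking is omitted):

    Mat           F, B, X;
    IS            isrow, iscol;
    MatFactorInfo info;

    MatFactorInfoInitialize(&info);
    MatGetOrdering(A, MATORDERINGND, &isrow, &iscol);
    MatGetFactor(A, MATSOLVERPETSC, MAT_FACTOR_LU, &F);
    MatLUFactorSymbolic(F, A, isrow, iscol, &info);
    MatLUFactorNumeric(F, A, &info);

    /* pack a block of right-hand sides as columns of a dense matrix */
    MatCreateSeqDense(PETSC_COMM_SELF, n, 50, NULL, &B);
    MatCreateSeqDense(PETSC_COMM_SELF, n, 50, NULL, &X);
    /* ... fill the columns of B, then MatAssemblyBegin/End(B, ...) ... */
    MatMatSolve(F, B, X);   /* X now holds the 50 solutions */

MatMatSolve requires B and X to be dense; the same B and X can be reused for
successive batches to avoid repeated allocation.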
> >
> >    Barry
> >
> > > On Aug 10, 2016, at 10:18 PM, Harshad Ranadive <
> harshadranadive at gmail.com> wrote:
> > >
> > > Hi Barry
> > >
> > > The matrix A is mostly tridiagonal:
> > >
> > > 1 α 0 ......... 0
> > > α 1 α 0 ....... 0
> > > 0 α 1 α 0 ..... 0
> > > ..................
> > > 0 ......... α 1
> > >
> > > In some cases (periodic boundaries) there would be an 'α' in
> the top-right and bottom-left corners.
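
A sketch of how such a periodic tridiagonal matrix could be assembled in
PETSc (sequential AIJ format; n and alpha are assumed to be defined, error
checking omitted):

    Mat A;
    MatCreateSeqAIJ(PETSC_COMM_SELF, n, n, 3, NULL, &A);   /* 3 nonzeros per row */
    for (PetscInt i = 0; i < n; i++) {
      PetscInt    cols[3] = {(i - 1 + n) % n, i, (i + 1) % n};  /* wrap for periodicity */
      PetscScalar vals[3] = {alpha, 1.0, alpha};
      MatSetValues(A, 1, &i, 3, cols, vals, INSERT_VALUES);
    }
    MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

For the non-periodic case, the first and last rows would simply drop the
wrapped entry.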
> > >
> > > I am not using a multigrid approach. I just implemented an implicit
> filtering approach (instead of the existing explicit one), which requires the
> solution of the above system.
> > >
> > > Thanks
> > > Harshad
> > >
> > > On Thu, Aug 11, 2016 at 1:07 PM, Barry Smith <bsmith at mcs.anl.gov>
> wrote:
> > >
> > >   Effectively utilizing multiple right hand sides with the same system
> can result in roughly a 2 times, or at the absolute most 3 times, improvement
> in solve time. A great improvement, but when you have a million right hand
> sides, not a giant one.
> > >
> > >   The first step is to get the best (most efficient) preconditioner
> for your problem. Since you have many right hand sides it obviously pays to
> spend more time building the preconditioner so that each solve is faster.
> If you provide more information on your linear system we might have
> suggestions.  This is CFD, so is your linear system a Poisson problem? Are
> you using geometric or algebraic multigrid with PETSc? If it is not a Poisson
> problem, how would you describe the linear system?
> > >
> > >   Barry
> > >
> > >
> > >
> > > > On Aug 10, 2016, at 9:54 PM, Harshad Ranadive <
> harshadranadive at gmail.com> wrote:
> > > >
> > > > Hi All,
> > > >
> > > > I have currently integrated the PETSc library with our CFD solver.
> > > >
> > > > In this I need to use KSPSolve(...) multiple times for the same
> matrix A. I have read that PETSc does not support passing multiple RHS
> vectors in the form of a matrix and that the only solution to this is calling
> KSPSolve multiple times, as in the example given here:
> > > > http://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examples/tutorials/ex16.c.html
> > > >
> > > > I have followed this technique, but I find that the performance of
> the code is very slow now. I basically have a mesh size of 8-10 million, and
> I need to solve with the matrix A a very large number of times. I have
> checked that the statement KSPSolve(...) is taking close to 90% of my
> computation time.
> > > >
> > > > I am setting up the matrix A, KSPCreate, KSPSetUp, etc. just once at
> the start. Only the following statements are executed in a repeated loop:
> > > >
> > > > Loop begin: (say million times !!)
> > > >
> > > >    loop over vector length
> > > >        VecSetValues( ....)
> > > >    end
> > > >
> > > >    VecAssemblyBegin( ... )
> > > >    VecAssemblyEnd (...)
> > > >
> > > >     KSPSolve (...)
> > > >
> > > >     VecGetValues
> > > >
> > > > Loop end.
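
In PETSc C calls, the body of that loop might look roughly like this
(rstart, rend, rhs_value, nloc, idx and sol are hypothetical application-side
names; error checking omitted):

    for (PetscInt i = rstart; i < rend; i++) {   /* locally owned rows */
      PetscScalar v = rhs_value(i);              /* hypothetical RHS evaluation */
      VecSetValues(b, 1, &i, &v, INSERT_VALUES);
    }
    VecAssemblyBegin(b);
    VecAssemblyEnd(b);

    KSPSolve(ksp, b, x);

    VecGetValues(x, nloc, idx, sol);             /* read back the local entries */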
> > > >
> > > > Is there an efficient way of doing this rather than using KSPSolve
> multiple times?
> > > >
> > > > Please note my matrix A never changes during the time steps or
> across the mesh ... So essentially, if I can get the inverse once, would it
> be good enough?  It has been recommended in the FAQ that computing the matrix
> inverse should be avoided, but would it be okay to use in my case?
> > > >
> > > > Also, could someone please provide an example of how to use
> MatLUFactor() and MatCholeskyFactor() to find the matrix inverse... the
> arguments below were not clear to me:
> > > > IS row
> > > > IS col
> > > > const MatFactorInfo *info
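
For reference, a sketch of how those arguments are typically supplied for an
in-place LU factorization (A assumed sequential; error checking omitted):

    IS            isrow, iscol;
    MatFactorInfo info;

    MatFactorInfoInitialize(&info);                          /* default options */
    MatGetOrdering(A, MATORDERINGNATURAL, &isrow, &iscol);   /* fills the IS row/col args */
    MatLUFactor(A, isrow, iscol, &info);                     /* A now holds its LU factors */

    MatSolve(A, b, x);   /* single RHS; MatMatSolve(A, B, X) for a dense block of RHS */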
> > > >
> > > > Apologies for the long email, and thanks to anyone who can help.
> > > >
> > > > Regards
> > > > Harshad
> > > >

