[petsc-dev] How to precondition a direct solve ?

Franck Houssen franck.houssen at inria.fr
Sun Aug 27 11:16:47 CDT 2017


OK, thanks.

Franck

----- Original Message -----
> From: "Barry Smith" <bsmith at mcs.anl.gov>
> To: "Franck Houssen" <franck.houssen at inria.fr>
> Cc: "For users of the development version of PETSc" <petsc-dev at mcs.anl.gov>
> Sent: Friday, August 25, 2017 23:21:35
> Subject: Re: [petsc-dev] How to precondition a direct solve ?
> 
> 
>   By "precondition" a direct solver I assume you mean "scale the linear
>   system so that the direct solver produces a more accurate solution"?
> 

I meant getting a better condition number.

To make a long story short, I face a problem whose root cause I have no idea about: it works, but it's slow. I read the docs again and again to see if I had "missed" something. I changed the command line heavily (ksp_type, pc_type, ...) to see if it helps (it doesn't really). I profiled (-log_view) to locate the slowdown. I stepped into PETSc with gdb without getting any real idea of the root cause (lost in low-level details). I'm still trying to "change/test stuff/ideas" (quick code modifications, options, ...) hoping to "see" whether it helps or not (get some clue). For one given kind of matrix, I got lucky: after I changed the direct solve into an iterative one, performance came back to something comparable to the expected one. But unfortunately, for "my real" kind of matrix, this does not help. So, since changing from direct to iterative can help at least in some cases, I came to this question: "is it also possible to precondition a direct solve?" (hopefully easily, at the command line? seems not).

Anyway, "my" problem may be elsewhere (I'm not even sure testing a preconditioned direct solve would help). This was more a general purpose question (the answer seems to be no).

> 1) For most matrices this is not necessary
> 
> 2) Some of the direct solvers do some of this internally so no reason for you
> to do this
> 
> 3) Instead of doing a scaling you can just use "iterative refinement" where
> you run with, for example GMRES, instead of PREONLY and it does additional
> iterations, if needed to get a more accurate solution
> 

Ah OK, I see; that turns out to be a smart feature!
Unfortunately, this does not help in my case (with or without MUMPS and its ICNTL(10) iterative refinement steps).
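[Editor's note: in code form, Barry's point 3 just swaps the KSP type in the direct-solve setup quoted further down in this thread. A minimal sketch, assuming `ksp`, `pc`, `B`, and `X` are already created as in those snippets (error checking omitted):]

```c
/* Iterative refinement around a direct solve: keep the LU/MUMPS
   factorization as the preconditioner, but use GMRES instead of
   KSPPREONLY so extra Krylov iterations can, if needed, polish the
   solution to higher accuracy. */
KSPSetType(ksp, KSPGMRES);                 /* was: KSPPREONLY */
KSPGetPC(ksp, &pc);
PCSetType(pc, PCLU);
PCFactorSetMatSolverPackage(pc, "mumps");  /* same direct solver as before */
KSPSolve(ksp, B, X);                       /* b in, x out */
```

[The same thing is available at the command line with -ksp_type gmres -pc_type lu -pc_factor_mat_solver_package mumps.]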

> 4) If you really want to do it then yes you need to do the product yourself,
> but first try 3 since it is much cheaper and tells you if it is really
> needed.
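[Editor's note: a sketch of what "doing the product yourself" in point 4 could look like. This assumes you already hold the scaling matrix M as an assembled PETSc Mat; the variable names MA and Mb are illustrative, and error checking is abbreviated:]

```c
/* Explicitly form the scaled system (M*A) x = (M*b), then direct-solve it.
   Assumes ksp, A, M, b, x already exist and are assembled. */
Mat            MA;
Vec            Mb;
PC             pc;
PetscErrorCode ierr;

/* MA = M*A (new matrix allocated; PETSC_DEFAULT lets PETSc guess the fill) */
ierr = MatMatMult(M, A, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &MA);CHKERRQ(ierr);
/* Mb = M*b */
ierr = VecDuplicate(b, &Mb);CHKERRQ(ierr);
ierr = MatMult(M, b, Mb);CHKERRQ(ierr);

/* Direct solve on the scaled system */
ierr = KSPSetOperators(ksp, MA, MA);CHKERRQ(ierr);
ierr = KSPSetType(ksp, KSPPREONLY);CHKERRQ(ierr);
ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
ierr = KSPSolve(ksp, Mb, x);CHKERRQ(ierr);
```

[Note that MatMatMult with a general M produces a second full matrix, which is why Barry suggests trying the much cheaper option 3 first.]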
> 
> 
>   Barry
> 
> 
> 
> 
> 
> > On Aug 25, 2017, at 10:51 AM, Franck Houssen <franck.houssen at inria.fr>
> > wrote:
> > 
> > How to precondition a direct solve ?
> > 
> > My understanding is that to solve AX=B with an iterative method (say GMRES
> > + Jacobi), I need to follow these steps:
> > KSPSetType(ksp, KSPGMRES);
> > PCSetType(pc, PCJACOBI);
> > KSPSolve(ksp, B, X);
> > 
> > Also, to solve AX=B with a direct method, my understanding is that I need
> > to follow these steps:
> > KSPSetType(ksp, KSPPREONLY);
> > PCSetType(pc, PCLU);
> > PCFactorSetMatSolverPackage(pc, "mumps"); // To choose the direct solver
> > KSPSolve(ksp, B, X);
> > 
> > OK, so far so good.
> > 
> > Now what if I want to solve a preconditioned system (say MAX=MB that is
> > AX=B "modified" by M) with a direct method : what is the "PETSc way" to do
> > that ? What would be the "steps to follow" ?
> > The short answer would be : replace A and B by MA and MB. But then, I'll
> > have to compute M by myself which may not be an easy thing (if not a
> > jacobi - ilu, sor, mg, ....). I wondered if there is a more natural way to
> > do that with PETSc.
> > 
> > My understanding is that KSPPREONLY is a "trick" to mimic a direct solve
> > within an iterative solve. Is there such a trick to precondition a direct
> > solve ?
> > 
> > Franck
> 
> 

