[petsc-users] KSPsolve performance tuning

Smith, Barry F. bsmith at mcs.anl.gov
Fri Nov 9 13:04:37 CST 2018


    The code below looks OK (you don't need to call KSPSetOperators() repeatedly; calling it once is enough). You can run with -info | grep "Leaving PC with identical preconditioner since reuse preconditioner is set" to see if the flag is being respected. See src/ksp/pc/interface/precon.c and the function PCSetUp() for how PETSc determines whether the preconditioner should be rebuilt.
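
    For example (here "./mysolver" is a placeholder for your actual executable and its arguments):

```shell
# Run with PETSc's verbose logging and filter for the message that
# PCSetUp() prints when it skips rebuilding the preconditioner.
# Replace ./mysolver with your own run command.
./mysolver -info 2>&1 | grep "Leaving PC with identical preconditioner"
```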

    Barry


> On Nov 9, 2018, at 4:52 AM, Edoardo alinovi <edoardo.alinovi at gmail.com> wrote:
> 
> Hello Barry,
> 
> sorry for digging up this post again, but I have a doubt about how to reuse a preconditioner.
> 
> Actually, I am calling the functions in this order:
> 
> - refill the new matrix
> 
> - MatAssemblyBegin/End
> 
> - KSPSetReusePreconditioner(ksp, PETSC_TRUE, ierr)
> 
> - KSPSetOperators(ksp, A, A, ierr)
> 
> - KSPSolve(ksp, rhs, x, ierr)
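> 
> As a compact sketch, the sequence above in PETSc's C API (assuming ksp, A, rhs, and x already exist and the nonzero pattern of A is unchanged between solves; my Fortran calls are analogous):
> 
> ```c
> /* Refill the values of A elsewhere, then: */
> ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
> ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
> /* Keep the existing preconditioner instead of rebuilding it */
> ierr = KSPSetReusePreconditioner(ksp, PETSC_TRUE);CHKERRQ(ierr);
> ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
> ierr = KSPSolve(ksp, rhs, x);CHKERRQ(ierr);
> ```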
> 
> Is this correct? Moreover, is there a way to check whether the preconditioner has been reused or not?
> 
> Thank you very much for your kind comments
> 
> ------
> 
> Edoardo Alinovi, Ph.D.
> 
> DICCA, Scuola Politecnica,
> Universita' degli Studi di Genova,
> 1, via Montallegro,
> 16145 Genova, Italy
> 
> Email: edoardo.alinovi at dicca.unige.it
> Tel: +39 010 353 2540
> 
> 
> 
> 
> On Thu, Sep 13, 2018 at 9:25 PM Edoardo alinovi <edoardo.alinovi at gmail.com> wrote:
> Yes, this is due to the fact that convection is treated implicitly in my code. This gives more stability, but at each time step the coefficients change because the mass fluxes change. I can possibly skip the PC setup during the non-orthogonal correction, where only the RHS changes.
> 
> On Sep 13, 2018 at 9:19 PM, "Smith, Barry F." <bsmith at mcs.anl.gov> wrote:
> 
> 
> > On Sep 13, 2018, at 2:16 PM, Edoardo alinovi <edoardo.alinovi at gmail.com> wrote:
> > 
> > Yes, I need it, or at least I think so. I am using the PISO algorithm, which requires solving the pressure equation twice per time step. Since the matrix coefficients are different in the two cases, I have to set up the PC every time.
> 
>    What about at the next time-step? Are the coefficients different yet again at each new timestep?
> 
> 
>    Barry
> 
> 
> > 
> > Thanks Barry! 
> > 
> > On Thu, Sep 13, 2018 at 9:08 PM Smith, Barry F. <bsmith at mcs.anl.gov> wrote:
> > 
> > 
> > > On Sep 13, 2018, at 1:55 PM, Edoardo alinovi <edoardo.alinovi at gmail.com> wrote:
> > > 
> > > Hello Barry,
> > > 
> > > Thank you very much for your reply! Maybe the best feature of PETSc is this mailing list :)  You are right: CG + block Jacobi is twice as fast as hypre, but still a bit slower than OpenFOAM. I definitely have to test a bigger case. It seems that hypre spends a lot of time setting up the PC. Is this normal?
> > 
> >     Yes, BoomerAMG (or any AMG method) has "large" setup times that don't scale as well as the solve time. Thus if you can reuse the setup for multiple solves you gain a great deal. The best case is, of course, where the same Poisson problem (with a different right hand side obviously) has to be solved many times. I noticed in your log_view that it seems to need to do a new setup for each solve.
> > 
> >    Barry
> > 
> > > 
> > > Thank you!
> > > 
> > > 
> > > 
> > > 
> > > 
> > > 2018-09-13 20:10 GMT+02:00 Smith, Barry F. <bsmith at mcs.anl.gov>:
> > > 
> > >    What pressure solver is OpenFOAM using? Are you using the same convergence tolerance for the pressure solver for the two approaches? Have you tried PETSc with a simpler solver: -pc_type bjacobi or -pc_type asm ; the problem is pretty small and maybe at this size hypre is overkill?
> > > 
> > >    Barry
> > > 
> > > 
> > > > On Sep 13, 2018, at 12:58 PM, Edoardo alinovi <edoardo.alinovi at gmail.com> wrote:
> > > > 
> > > > Hello PETSc friends,
> > > > 
> > > > For a couple of weeks I have been trying to enhance the performance of my code. I am solving the NS equations for a 3D problem of 220k cells with 4 processes on my laptop (i7-7800k @ 2.3 GHz with dynamic overclocking). I have installed PETSc under Linux SUSE 15 in a virtual machine (I do not know if this is important or not).
> > > > 
> > > > After some profiling, I can see that the bottleneck is inside KSPSolve while solving the pressure equation (solved with CG + a hypre PC). For this reason my code runs twice as slow as OpenFOAM, and the gap is due only to the solution of the pressure. Have you got some hints for me? At this point I am sure I am doing something wrong! I have attached the log of a test simulation.
> > > > 
> > > > Thank you very much!
> > > > 
> > > > 
> > > > 
> > > > <run_log_view.txt>
> > > 
> > > 
> > 
> 
> 


