[Nek5000-users] Proper setup for AMG solver

nek5000-users at lists.mcs.anl.gov
Sun Sep 2 05:38:14 CDT 2018


Do you really need to solve for 10 scalar fields?

I would turn off the residual projection for the scalars. However, it's hard to give a general recommendation; that's something you may want to experiment with. Turn it on and off and see what happens.

How much faster was the pressure solve with XXT vs AMG? I have never seen a case with more than a 10-15% difference. 

Based on my experience the crossing point (where AMG beats XXT) is often somewhere between 300-500k elements. The XXT work scales as O(N^5/3) vs O(N) for AMG. Also, the XXT setup time can be significant for larger element counts.
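To put a number on that crossover, here is a back-of-the-envelope sketch (my own illustration, not solver output): assume the coarse-solve costs scale as stated, c_xxt*N^(5/3) for XXT and c_amg*N for AMG, and pick the constants so the two curves cross at a hypothetical N0 = 400k elements. The cost ratio then depends only on (N/N0)^(2/3):

```python
# Illustrative cost-ratio sketch for the coarse-grid solve.
# Assumed scalings (from the discussion above): XXT ~ N^(5/3), AMG ~ N.
# N0 = 400k is a hypothetical crossover element count, picked from the
# 300-500k range quoted above.

def xxt_over_amg(n_elements, n_crossover=400_000):
    """Ratio of XXT cost to AMG cost under the assumed scalings.

    (c_xxt * N^(5/3)) / (c_amg * N) = (N / N0)^(2/3) once the constants
    are normalized so both costs are equal at N0.
    """
    return (n_elements / n_crossover) ** (2.0 / 3.0)

if __name__ == "__main__":
    # Below the crossover XXT is cheaper; above it AMG wins.
    for n in (250_000, 400_000, 800_000):
        print(f"{n:>7d} elements: XXT/AMG cost ratio = {xxt_over_amg(n):.2f}")
```

At the ~250k elements discussed in this thread the ratio comes out to about 0.73, i.e. XXT roughly 25% cheaper, which is consistent with Steffen's observation that AMG was slower for his case.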

For OIFS I would use either targetCFL = 1.9 or 3.9 (1 or 2 RK4 substeps). Are you using a variable dt? 

Whether your time step size is small enough to be accurate depends heavily on what you are interested in. I have done similar pipe and channel flow simulations using different time step sizes, and the results (first- and second-order statistics) were more or less the same. Again, that's something you need to check to be sure it holds for your case. 

Please contact me off-list. I am sure there are ways to speed-up your case ;)

Cheers,
Stefan


-----Original message-----
> From:nek5000-users at lists.mcs.anl.gov <nek5000-users at lists.mcs.anl.gov>
> Sent: Sunday 2nd September 2018 12:06
> To: nek5000-users at lists.mcs.anl.gov
> Subject: Re: [Nek5000-users] Proper setup for AMG solver
> 
> Hello Stefan & Paul,
> 
> thanks for your suggestions. I use Nek5000 v17 (mver 17.0.4 in makenek).
> 
> I have taken a closer look at the logfiles as suggested by Paul. It seems I spend the most time on the scalar fields, second most on the velocity fields, and the pressure solve is the smallest share. The numbers below are in seconds for a typical timestep without calculating statistics or writing out files, at Re_b = 5300 (Re_t = u_t D / nu = 360, 768 cores) and Re_b = 37700 (Re_t = 2000, 6144 cores).
> I use projection for all 10 scalar fields.
> 
> Re_t = 360:
> Scalars done	0.064
> Fluid done	0.039
> U-PRES gmres	0.021
> Step		0.127
> 
> Re_t = 2000:
> Scalars done	0.94
> Fluid done	0.51
> U-PRES gmres	0.21
> Step		1.72
> 
> 
> Paul, could you elaborate on why you would not use characteristics when running 10 scalar fields? 
> In my short tests at Re_b=5300, each step is more expensive with characteristics, but the larger DT makes it worthwhile: the time per timestep for targetCFL=2.0 increases by a factor of 3, while DT increases by a factor of 6.
> Besides, are such increased timesteps still small enough to capture the temporal evolution of the flow?
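> As a sanity check on that trade-off, the arithmetic works out like this (my own sketch, using the factors quoted above):

```python
# Wall-clock cost per unit of simulated time, comparing BDF3/EXT3 at a
# small dt with BDF2/OIFS (characteristics) at targetCFL=2.0.
# The factors are the ones reported in this thread: with characteristics
# each step costs ~3x more, but DT grows ~6x.

cost_per_step_factor = 3.0   # characteristics step is ~3x as expensive
dt_factor = 6.0              # but the step size is ~6x larger

# Relative wall-clock time to advance one unit of simulated time:
relative_cost = cost_per_step_factor / dt_factor
speedup = 1.0 / relative_cost

print(f"relative cost: {relative_cost:.2f}")  # 0.50
print(f"net speedup:   {speedup:.1f}x")       # 2.0x
```

> So despite the 3x more expensive steps, characteristics halve the wall-clock time per simulated time unit in this test.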
> 
> 
> When I am back at my workstation, I will create a different mesh with lx1=8, lxd=10 and run with the settings Stefan suggested. I expect a significant speedup from using the relaxed tolerances also for velocity and scalars and from switching to characteristics.
> 
> I know about the limit of 350k elements for XXT, as I once commented out the part of the code where this limit is checked.
> Since AMG was always slower than XXT for my setups, I am thinking about sticking with XXT.
> Is there any reason to enforce AMG other than your experience with AMG vs. XXT at large element counts?
> 
> 
> Best Regards,
> Steffen
> 
> 
> 
> Message: 5
> Date: Sat, 1 Sep 2018 17:10:54 +0200
> From: nek5000-users at lists.mcs.anl.gov
> To: nek5000-users at lists.mcs.anl.gov     <nek5000-users at lists.mcs.anl.gov>
> Subject: Re: [Nek5000-users] Proper setup for AMG solver
> Message-ID:
>         <mailman.9805.1535814716.86639.nek5000-users at lists.mcs.anl.gov>
> Content-Type: text/plain; charset=utf-8
> 
> Try to use
> 
> lx1=8/lxd=10 with a (potentially) finer mesh
> BDF2 + OIFS with a targetCFL=3.5
> set dt = 0 (this will adjust dt to targetCFL)
> pressure tol = 1e-5 (residual projection turned on), 1e-6 for velocity and scalars (residual projection turned off)
> 
> Note, a mesh with more than 350k elements requires AMG. The default parameters are fine.
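> For reference, a sketch of how these settings could look in the .par file (key names as I recall them from Nek5000 v17 -- double-check against your release; lx1=8/lxd=10 are compile-time parameters set in the SIZE file, not in the .par):

```ini
# Illustrative .par fragment for the settings above (Nek5000 v17-style
# key names; verify against your release's documentation).
[GENERAL]
dt = 0                     # 0 lets the variable time stepper target the CFL
targetCFL = 3.5
timeStepper = bdf2
extrapolation = OIFS       # characteristics

[PRESSURE]
residualTol = 1e-5
residualProj = yes         # residual projection on for pressure
preconditioner = semg_amg  # required above ~350k elements

[VELOCITY]
residualTol = 1e-6
residualProj = no

[TEMPERATURE]
residualTol = 1e-6
residualProj = no
```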
> 
> What version of Nek5000 are you using?
> 
> Cheers,
> Stefan
> 
> 
> -----Original message-----
> > From:nek5000-users at lists.mcs.anl.gov <nek5000-users at lists.mcs.anl.gov>
> > Sent: Saturday 1st September 2018 16:50
> > To: nek5000-users at lists.mcs.anl.gov
> > Subject: [Nek5000-users] Proper setup for AMG solver
> >
> > Dear Nek users & experts,
> >
> > I am currently running a turbulent pipe flow very similar to the simulations from El Khoury et al 2013.
> >
> > Additionally, I solve for 10 thermal fields being treated as passive scalars.
> >
> > The Reynolds number is the same as the highest in El Khoury (2013), Re_b = 37700.
> >
> > As I am using the relaxation term filtering (RT-Filter), I have a slightly lower resolution of about 250,000 elements at N=11 (a factor of 5 fewer than El Khoury).
> >
> > As the simulation is still very heavy, I have been looking into ways for speeding it up.
> > I found some good suggestions here:
> > http://nek5000.github.io/NekDoc/faq.html?highlight=amg#computational-speed
> >
> > and here (older version?)
> > http://nek5000.github.io/NekDoc/large_scale.html
> >
> >
> > However, I have some questions regarding these suggestions.
> > 1) Dealiasing:
> > Usually I use lxd = 3/2*lx1. Can I lower that or even use lxd=lx1?
> >
> > 2) Tolerances:
> > I have tried reducing the pressure tolerance from 1e-8 to 5e-5 for a run at Re_b=5300 without any significant speedup. Would you consider 5e-5 for pressure accurate enough for evaluating statistics like turbulent kinetic energy budgets, Reynolds shear stress budgets, or budgets of turbulent heat fluxes?
> >
> > 3) Time discretisation: BDF2 and OIFS with Courant=2-5
> > If I go from BDF3/EXT3 at C=0.5 to BDF2/OIFS at C=5.0, will I not miss high frequency fluctuations in time, since DT is much larger?
> >
> > 4) AMG instead of XXT:
> > I have tested AMG instead of XXT for both Re_b=5300 and Re_b=37700 without any speedup; the time per timestep is even higher with AMG. My workflow looks like this:
> > 4.1) Set SEMG_AMG in the par file.
> > 4.2) Run the simulation once to dump the AMG files.
> > 4.3) Run amg_hypre (here I do not know which options to choose, so I have only used the default settings).
> > 4.4) Run the simulation.
> > Maybe I should choose different options for amg_hypre, or should I rather use the amg_matlab2 tools? For the Matlab tools I have not found an explanation of how to use them.
> >
> >
> > I am grateful for any advice regarding these aspects.
> >
> > Best Regards,
> > Steffen
> >
> >
> >
> > _______________________________________________
> > Nek5000-users mailing list
> > Nek5000-users at lists.mcs.anl.gov
> > https://lists.mcs.anl.gov/mailman/listinfo/nek5000-users
> 
> 
> ------------------------------
> 
> Message: 6
> Date: Sat, 1 Sep 2018 15:15:29 +0000
> From: nek5000-users at lists.mcs.anl.gov
> To: "nek5000-users at lists.mcs.anl.gov"
>         <nek5000-users at lists.mcs.anl.gov>
> Subject: Re: [Nek5000-users] Proper setup for AMG solver
> Message-ID:
>         <mailman.9806.1535814932.86639.nek5000-users at lists.mcs.anl.gov>
> Content-Type: text/plain; charset="us-ascii"
> 
> 
> How much time are you spending in your scalar fields?
> 
> 
> Do you have projection turned on for all these fields?
> 
> 
> grep tep logfile
> 
> 
> will tell you how much time per step
> 
> 
> grep gmr logfile will tell you how much time in the pressure on each step
> 
> 
> What's left over is mostly passive scalar, unless you are using characteristics.
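> Plugging in the per-step timings Steffen reported earlier in this thread (Re_t = 2000 case) gives a rough split (my own arithmetic, not solver output):

```python
# Rough breakdown of a time step at Re_t = 2000, using the numbers
# quoted earlier in this thread (seconds per step, no I/O/statistics).
step    = 1.72   # total time per step  ("grep tep logfile")
gmres   = 0.21   # pressure solve       ("grep gmr logfile")
fluid   = 0.51   # velocity solve
scalars = 0.94   # the 10 passive scalars

for name, t in [("scalars", scalars), ("fluid", fluid), ("pressure", gmres)]:
    print(f"{name:8s} {100 * t / step:5.1f}% of the step")

# Whatever is unaccounted for (here ~0.06 s) is overhead such as
# advection/dealiasing and bookkeeping.
leftover = step - scalars - fluid - gmres
print(f"leftover {leftover:.2f} s")
```

> The scalars account for over half the step time, consistent with the point above.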
> 
> 
> I would not recommend characteristics when running 10 scalar fields.
> 
> 
> Paul
> 
> 
> ------------------------------
> 
> End of Nek5000-users Digest, Vol 115, Issue 1
> *********************************************
> 

