<html><head><meta http-equiv="Content-Type" content="text/html; charset=us-ascii"></head><body style="word-wrap: break-word; -webkit-nbsp-mode: space; line-break: after-white-space;" class=""><div class=""><br class=""></div> It is using <div class=""><br class=""></div><div class="">MatSOR 369 1.0 9.1214e+00 1.0 7.32e+09 1.0 0.0e+00 0.0e+00 0.0e+00 29 27 0 0 0 29 27 0 0 0 803 0 0 0.00e+00 565 1.35e+03 0</div><div class=""><br class=""></div><div class="">which runs on the CPU, not the GPU, hence the large amount of time spent in memory copies and the poor performance. We are switching the default smoother to Chebyshev/Jacobi, which runs completely on the GPU (it may already be switched in the main branch).</div><div class=""><br class=""></div><div class="">You can run with <span style="font-family: Menlo; font-size: 14px;" class="">-mg_levels_pc_type</span><span style="font-family: Menlo; font-size: 14px;" class=""> jacobi</span>. You should then see almost the entire solver running on the GPU. For example, adding that option to the command from your message below:</div><div class=""><br class=""></div><div class=""><span style="font-family: Menlo; font-size: 14px;" class="">mpiexec -n 1 ./ex50 -da_grid_x 3000 -da_grid_y 3000 -ksp_type cg -pc_type gamg -pc_gamg_type agg -pc_gamg_agg_nsmooths 1 -mg_levels_pc_type jacobi -vec_type cuda -mat_type aijcusparse -ksp_monitor -ksp_view -log_view</span></div><div class=""><font face="Menlo" class=""><span style="font-size: 14px;" class=""><br class=""></span></font></div><div class="">You may need to tune the number of smoothing steps or other GAMG parameters to get the best solution time.</div><div class=""><br class=""></div><div class=""> Barry</div><div class=""><br class=""><div><br class=""><blockquote type="cite" class=""><div class="">On Mar 22, 2022, at 10:30 AM, Qi Yang <<a href="mailto:qiyang@oakland.edu" class="">qiyang@oakland.edu</a>> wrote:</div><br class="Apple-interchange-newline"><div class=""><div dir="ltr" class=""><div dir="ltr" class=""><div dir="ltr" class="">To whom it may concern,<div class=""><div class=""><br class=""></div><div class="">I have tried PETSc ex50 (Poisson) with CUDA, using the KSP CG solver and the GAMG preconditioner; however, it ran for about 30 s. I also tried NVIDIA AMGX with the same solver and the same grid (3000*3000), and it took only 2 s.
I used the Nsight Systems software to analyze the two cases and found that PETSc spent much of its time in memory operations (63% of the total time, whereas AMGX spent only 19%). Screenshots of both are attached.</div><div class=""><br class=""></div><div class="">The PETSc command is: mpiexec -n 1 ./ex50 -da_grid_x 3000 -da_grid_y 3000 -ksp_type cg -pc_type gamg -pc_gamg_type agg -pc_gamg_agg_nsmooths 1 -vec_type cuda -mat_type aijcusparse -ksp_monitor -ksp_view -log_view</div><div class=""><br class=""></div><div class="">The log file is also attached.</div><div class=""><br class=""></div><div class="">Regards,</div><div class="">Qi</div><div class=""><br class=""></div><div class=""><span id="cid:ii_l1288l930"><1.png></span><br class=""></div></div><div class=""><span id="cid:ii_l1288w5h1"><2.png></span><br class=""></div></div></div></div>
<span id="cid:f_l128i6sx2"><log.PETSc_cg_amg_ex50_gpu_cuda></span></div></blockquote></div><br class=""></div></body></html>