<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=Windows-1252">
<style type="text/css" style="display:none;"><!-- P {margin-top:0;margin-bottom:0;} --></style>
</head>
<body dir="ltr">
<div id="divtagdefaultwrapper" style="font-size:12pt;color:#000000;font-family:Calibri,Helvetica,sans-serif;" dir="ltr">
<p>Hi all, <br>
</p>
<p><br>
</p>
<p>We are using PETSc 3.20 in our code and are successfully running several solvers on NVIDIA GPUs with an OpenMPI library that is not GPU-aware (so I need to add the flag <b>-use_gpu_aware_mpi 0</b>).</p>
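<p><br>
</p>
<p>Concretely, without GPU-aware MPI we launch along these lines (the binary name is just a placeholder; the type options are how we select the cuSPARSE/CUDA backends):</p>
<p><b>mpiexec -n 8 ./our_app -mat_type aijcusparse -vec_type cuda -use_gpu_aware_mpi 0</b></p>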
<p><br>
</p>
<p>But now, when using a GPU-aware OpenMPI library (OpenMPI 4.0.5 or 4.1.5 from NVHPC), some parallel calculations fail with
<b>KSP_DIVERGED_ITS</b> or <b>KSP_DIVERGED_DTOL</b></p>
<p>in several configurations. It may run well on a small test case (the matrix is symmetric) with:<br>
</p>
<p><b><br>
</b></p>
<p><b>-ksp_type cg -pc_type gamg -pc_gamg_type classical</b><br>
</p>
<p><br>
</p>
<p>But with a larger number of devices, for instance more than 4 or 8, it may suddenly fail.
<br>
</p>
<p><br>
</p>
<p>If I switch to another solver (BiCGStab), it may converge:<br>
</p>
<p></p>
<p><br>
</p>
<p><b>-ksp_type bcgs -pc_type gamg -pc_gamg_type classical</b><br>
</p>
<p><br>
</p>
<p>The cases most prone to divergence are the following:</p>
<p><b>-ksp_type cg -pc_type hypre -pc_hypre_type boomeramg<br>
</b></p>
<p><b>-ksp_type cg -pc_type gamg -pc_gamg_type classical</b></p>
<p><br>
</p>
<p></p>
<p>And the <b>bcgs</b> workaround doesn't work every time...</p>
<p><br>
</p>
<p></p>
<p>It seems to work without problems with aggregation-based coarsening (up to at least 128 GPUs in my simulation):<br>
</p>
<p></p>
<p><b>-ksp_type cg -pc_type gamg -pc_gamg_type agg</b></p>
<p></p>
<br>
<p>So I suspect something odd is happening in my code during the PETSc solve with GPU-aware MPI, since all the previous configurations work with non-GPU-aware MPI.</p>
<p><br>
</p>
<p>Here is the <b>-ksp_view</b> output from one failure with the first configuration:</p>
<p><br>
</p>
<p></p>
<div><b>KSP Object: () 8 MPI processes<br>
type: cg<br>
maximum iterations=10000, nonzero initial guess<br>
tolerances: relative=0., absolute=0.0001, divergence=10000.<br>
left preconditioning<br>
using UNPRECONDITIONED norm type for convergence test<br>
PC Object: () 8 MPI processes<br>
type: hypre<br>
HYPRE BoomerAMG preconditioning<br>
Cycle type V<br>
Maximum number of levels 25<br>
Maximum number of iterations PER hypre call 1<br>
Convergence tolerance PER hypre call 0.<br>
Threshold for strong coupling 0.7<br>
Interpolation truncation factor 0.<br>
Interpolation: max elements per row 0<br>
Number of levels of aggressive coarsening 0<br>
Number of paths for aggressive coarsening 1<br>
Maximum row sums 0.9<br>
Sweeps down 1<br>
Sweeps up 1<br>
Sweeps on coarse 1<br>
Relax down l1scaled-Jacobi<br>
Relax up l1scaled-Jacobi<br>
Relax on coarse Gaussian-elimination<br>
Relax weight (all) 1.<br>
Outer relax weight (all) 1.<br>
Maximum size of coarsest grid 9<br>
Minimum size of coarsest grid 1<br>
Not using CF-relaxation<br>
Not using more complex smoothers.<br>
Measure type local<br>
Coarsen type PMIS<br>
Interpolation type ext+i<br>
SpGEMM type cusparse<br>
linear system matrix = precond matrix:<br>
Mat Object: () 8 MPI processes<br>
type: mpiaijcusparse<br>
rows=64000, cols=64000<br>
total: nonzeros=311040, allocated nonzeros=311040<br>
total number of mallocs used during MatSetValues calls=0<br>
not using I-node (on process 0) routines</b><br>
</div>
<p></p>
<p><br>
</p>
<p>For the moment I haven't succeeded in creating a reproducer with the ex.c examples...</p>
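<p><br>
</p>
<p>For what it's worth, this is the kind of command line I was trying (KSP tutorial ex45 and the grid sizes are only an example chosen to match the 64000-row size above, not necessarily representative of my application):</p>
<p><b>mpiexec -n 8 ./ex45 -da_grid_x 40 -da_grid_y 40 -da_grid_z 40 -dm_mat_type aijcusparse -dm_vec_type cuda -ksp_type cg -pc_type hypre -pc_hypre_type boomeramg</b></p>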
<p><br>
</p>
<p>Have you seen this kind of behaviour before?</p>
<p>Should I update my PETSc version?<br>
</p>
<p><br>
</p>
<p>Thanks for any advice,<br>
</p>
<p><br>
</p>
<p></p>
<div id="Signature">
<div id="divtagdefaultwrapper" dir="ltr" style="font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, Helvetica, sans-serif, "EmojiFont", "Apple Color Emoji", "Segoe UI Emoji", NotoColorEmoji, "Segoe UI Symbol", "Android Emoji", EmojiSymbols;">
<div style="font-family:Tahoma; font-size:13px">
<div class="BodyFragment"><font size="2"><span style="font-size:10pt">
<div class="PlainText">Pierre LEDAC<br>
Commissariat à l’énergie atomique et aux énergies alternatives<br>
Centre de SACLAY<br>
DES/ISAS/DM2S/SGLS/LCAN<br>
Bâtiment 451 – point courrier n°43<br>
F-91191 Gif-sur-Yvette<br>
+33 1 69 08 04 03<br>
+33 6 83 42 05 79</div>
</span></font></div>
</div>
</div>
</div>
</div>
</body>
</html>