<div dir="ltr"><div dir="ltr">On Tue, May 3, 2022 at 3:28 PM Barry Smith <<a href="mailto:bsmith@petsc.dev">bsmith@petsc.dev</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div style="overflow-wrap: break-word;"><div><br></div> A difficult question with no easy answers. <div><br></div><div> First, do you have a restart system so you can save your state just before your "bad behavior" and run experiments easily at the bad point? </div><div><br></div><div> You could try to use SLEPc to compute the first few eigenmodes (presumably associated with excessively small eigenvalues) and visualize them? You could restart with and without preconditioning to see how badly the conditioning becomes for both the system and the preconditioned system to help see if the problem comes from just the preconditioner starting to behavior poorly or because the operator starts to behave poorly.</div><div><br></div><div> There should be a way to allow doing this directly from within the KSPSolve trivially with appropriate monitors but I suspect that does not exist because calling SLEPc trivially from PETSc is a nightmare because of the dependency diamond. </div></div></blockquote><div> </div><div>A quick and dirty thing is the plot the residual when your system does not converge. You can get an idea where</div><div>the algebraic error is largest. There is already a -ksp_monitor_range, and Barry had code to cut out the region of</div><div>high residual in a DMDA woth a halo, solve that, and project it back in, but I cannot remember where it is. Barry?</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div style="overflow-wrap: break-word;"><div> Barry</div><div><br><div><br><blockquote type="cite"><div>On May 3, 2022, at 11:58 AM, Alfredo J Duarte Gomez <<a href="mailto:aduarteg@utexas.edu" target="_blank">aduarteg@utexas.edu</a>> wrote:</div><br><div><div dir="ltr"><div>Good morning PETSC team,</div><div><br></div><div>I have a bit of an open question on diagnosing preconditioner performance.</div><div><br></div><div>For a bit of background, I am using the TS object in combination with the matrix-free snes, and a custom user defined preconditioner (PCSHELL). Everything is implemented with the help of a DMDA. Smallest problem size that I can get away with is a grid with 2.1 million points with 4 fields each, for a total of 8.4 million equations.</div><div><br></div><div>The preconditioner works very well overall, but at some stages of the solution it performs poorly, evidenced by significant increases in the number of GMRES iterations and the maximum/minimum eigenvalue computed using KSPComputeExtremeSingularValues().</div><div><br></div><div>I am trying to understand the locations where the preconditioner is not working well, so for example, is there any way to map the maximum eigenvalue to a particular location/field in the DMDA. Alternatively, are there any other ways of diagnosing where the preconditioner is not doing a good job? 
>   Barry
>
> On May 3, 2022, at 11:58 AM, Alfredo J Duarte Gomez <aduarteg@utexas.edu> wrote:
>
>> Good morning PETSc team,
>>
>> I have a bit of an open question on diagnosing preconditioner performance.
>>
>> For a bit of background, I am using the TS object in combination with the matrix-free SNES and a custom user-defined preconditioner (PCSHELL). Everything is implemented with the help of a DMDA. The smallest problem size I can get away with is a grid of 2.1 million points with 4 fields each, for a total of 8.4 million equations.
>>
>> The preconditioner works very well overall, but at some stages of the solution it performs poorly, evidenced by significant increases in the number of GMRES iterations and in the maximum/minimum singular values computed using KSPComputeExtremeSingularValues().
>>
>> I am trying to understand the locations where the preconditioner is not working well. For example, is there any way to map the maximum eigenvalue to a particular location/field in the DMDA? Alternatively, are there other ways of diagnosing where the preconditioner is not doing a good job?
>>
>> GMRES iterations and the max/min eigenvalue provide a good overall picture, but I am struggling to get preconditioner metrics that are specific to a location and field.
>>
>> So far I have taken a close look at fields such as the residual and the Newton updates, but it is difficult to tell how to assess these in combination with the preconditioner.
>>
>> I appreciate any suggestions.
>>
>> Thank you and have a good day.
>>
>> -Alfredo
>>
>> --
>> Alfredo Duarte
>> Graduate Research Assistant
>> The University of Texas at Austin
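On your question about mapping the extreme eigenvalues back to a location/field: if you try Barry's SLEPc suggestion, something like the untested sketch below would do it. It wraps the preconditioned operator M^{-1} A in a MatShell, asks SLEPc for a few eigenpairs, and writes each eigenvector out on the DMDA so you can see where, and in which field, the troublesome modes are concentrated. Here A, pc, da, nev, and the output names are placeholders for your matrix-free operator, your PCSHELL, your DMDA, and the number of modes you want; PETSc does not provide this combination out of the box.

#include <slepceps.h>
#include <petscdmda.h>

typedef struct {
  Mat A;    /* (matrix-free) operator */
  PC  pc;   /* your PCSHELL */
  Vec work;
} PrecOpCtx;

/* y = M^{-1} A x */
static PetscErrorCode PrecOpMult(Mat shell, Vec x, Vec y)
{
  PrecOpCtx *ctx;

  PetscFunctionBeginUser;
  PetscCall(MatShellGetContext(shell, &ctx));
  PetscCall(MatMult(ctx->A, x, ctx->work));
  PetscCall(PCApply(ctx->pc, ctx->work, y));
  PetscFunctionReturn(0);
}

PetscErrorCode ViewPrecondEigenmodes(Mat A, PC pc, DM da, PetscInt nev)
{
  PrecOpCtx   ctx;
  Mat         shell;
  EPS         eps;
  Vec         vr, vi;
  PetscScalar kr, ki;
  PetscInt    m, n, M, N, i, nconv;
  char        name[64];
  PetscViewer viewer;

  PetscFunctionBeginUser;
  ctx.A  = A;
  ctx.pc = pc;
  PetscCall(MatCreateVecs(A, &ctx.work, NULL));
  PetscCall(MatGetLocalSize(A, &m, &n));
  PetscCall(MatGetSize(A, &M, &N));
  PetscCall(MatCreateShell(PETSC_COMM_WORLD, m, n, M, N, &ctx, &shell));
  PetscCall(MatShellSetOperation(shell, MATOP_MULT, (void (*)(void))PrecOpMult));

  PetscCall(EPSCreate(PETSC_COMM_WORLD, &eps));
  PetscCall(EPSSetOperators(eps, shell, NULL));
  PetscCall(EPSSetProblemType(eps, EPS_NHEP));
  /* smallest-magnitude modes may converge slowly matrix-free; adjust with -eps_* options */
  PetscCall(EPSSetWhichEigenpairs(eps, EPS_SMALLEST_MAGNITUDE));
  PetscCall(EPSSetDimensions(eps, nev, PETSC_DEFAULT, PETSC_DEFAULT));
  PetscCall(EPSSetFromOptions(eps));
  PetscCall(EPSSolve(eps));

  /* DMDA vectors have the same layout as the operator, assuming A comes from the DMDA problem */
  PetscCall(DMCreateGlobalVector(da, &vr));
  PetscCall(DMCreateGlobalVector(da, &vi));
  PetscCall(EPSGetConverged(eps, &nconv));
  for (i = 0; i < PetscMin(nconv, nev); i++) {
    PetscCall(EPSGetEigenpair(eps, i, &kr, &ki, vr, vi));
    PetscCall(PetscPrintf(PETSC_COMM_WORLD, "mode %" PetscInt_FMT ": eigenvalue %g + %g i\n", i, (double)PetscRealPart(kr), (double)PetscRealPart(ki)));
    PetscCall(PetscSNPrintf(name, sizeof(name), "eigenmode_%" PetscInt_FMT ".vts", i));
    PetscCall(PetscViewerVTKOpen(PETSC_COMM_WORLD, name, FILE_MODE_WRITE, &viewer));
    PetscCall(VecView(vr, viewer));
    PetscCall(PetscViewerDestroy(&viewer));
  }
  PetscCall(VecDestroy(&vr));
  PetscCall(VecDestroy(&vi));
  PetscCall(VecDestroy(&ctx.work));
  PetscCall(EPSDestroy(&eps));
  PetscCall(MatDestroy(&shell));
  PetscFunctionReturn(0);
}

Skipping the PCApply() in the shell multiply gives you the eigenmodes of the operator alone, so you can compare the operator against the preconditioned operator, as Barry suggests.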
  Thanks,

     Matt

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/