Mark,<br><br>I've run GAMG twice, with coarse grid sizes of 2 and 8, using one sweep of SOR on the coarse grid. For a size of 8 it converges nicely, but for a size of 2, I think the null space is causing too many problems. If GAMG were to coarsen to a size of 1, then there would be no hope because only the null space would remain, right? This never seems to occur with ML because there are at least as many rows as processors.<br>
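For reference, here is a sketch of the option set I'm comparing; the executable name is a placeholder, and -pc_gamg_coarse_eq_limit is my understanding of the flag that bounds the coarse grid size (set to 8 in the run that converged, 2 in the one that didn't):

```shell
# Placeholder executable; option names reflect my understanding of
# petsc-dev and may need adjusting.
./app -pc_type gamg \
      -pc_gamg_coarse_eq_limit 8 \
      -mg_coarse_ksp_type richardson \
      -mg_coarse_ksp_max_it 1 \
      -mg_coarse_pc_type sor
```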
<br>John<br><br><div class="gmail_quote">On Fri, Mar 30, 2012 at 8:42 AM, Mark F. Adams <span dir="ltr"><<a href="mailto:mark.adams@columbia.edu">mark.adams@columbia.edu</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<div style="word-wrap:break-word"><br><div><div class="im"><div>On Mar 29, 2012, at 2:40 PM, John Mousel wrote:</div><br><blockquote type="cite">I'm attempting to solve a non-symmetric discretization of a 3D Poisson problem. The problem is singular. I've attached the results of KSPView from runs with ML and GAMG. When I run ML, I get convergence in 30 iterations. When I attempt to use the same settings with GAMG, I'm not getting convergence at all. The two things I notice are:<br>
<br>1. GAMG is using KSPType preonly, even though I've set it to be Richardson in my command line options.<br></blockquote><div><br></div></div><div>PETSc seems to switch the coarse grid solver to GMRES in Setup. This seems to be a bug, and I unwisely decided to override this manually. I will undo this in the next checkin. This should not be the problem, however.</div>
<div class="im"><br><blockquote type="cite">2. ML only coarsens down to 4 rows while GAMG coarsens to 2. My problem is singular, and whenever I try to use LU, I get zero pivot problems. To mitigate this, I've been using Richardson with SOR on the coarse matrix. Could the smaller coarse grid size of GAMG be causing problems with SOR? If so, is there a way to put a lower limit on the coarse grid size?<br>
<br></blockquote><div><br></div></div><div>I'm thinking that with a 2x2 coarse grid, 8 iterations of SOR is picking up the null space. Maybe try just one SOR iteration on the coarse grid.</div><div><br></div><div>Also, can you run with -options_left so that I can see your arguments. One known bug is that -mat_diagonal_scale breaks GAMG, but it should also break ML.</div>
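Something along these lines should do it; the executable name is a placeholder, and the mg_coarse prefix is the usual way to address the coarse-level solver:

```shell
# One Richardson/SOR sweep on the coarse grid, plus -options_left to
# report any options that were set but never used (./app is a placeholder).
./app -pc_type gamg \
      -mg_coarse_ksp_type richardson \
      -mg_coarse_ksp_max_it 1 \
      -mg_coarse_pc_type sor \
      -options_left
```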
<div><br></div><div>Mark</div><br><blockquote type="cite"><div class="im">John<br><br><div class="gmail_quote">On Thu, Mar 29, 2012 at 11:03 AM, Jed Brown <span dir="ltr"><<a href="mailto:jedbrown@mcs.anl.gov" target="_blank">jedbrown@mcs.anl.gov</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<div><div class="gmail_quote">On Thu, Mar 29, 2012 at 09:18, John Mousel <span dir="ltr"><<a href="mailto:john.mousel@gmail.com" target="_blank">john.mousel@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<div>[0]PETSC ERROR: Error in external library!<br>[0]PETSC ERROR: Cannot disable floating point exceptions!</div></blockquote></div><div><br></div></div><div>Looks like something is strange with your environment because fesetenv() is returning an error. I have disabled the call if the trap mode is not changing.</div>
<br><div><a href="http://petsc.cs.iit.edu/petsc/petsc-dev/rev/352b4c19e451" target="_blank">http://petsc.cs.iit.edu/petsc/petsc-dev/rev/352b4c19e451</a></div>
</blockquote></div><br>
</div><span><KSPView_GAMG.txt></span><span><KSPView_ML.txt></span></blockquote></div><br></div></blockquote></div><br>