[petsc-users] GAMG issue

Mark F. Adams mark.adams at columbia.edu
Fri Mar 30 12:55:55 CDT 2012


Hmm, that is mysterious.

ML uses the same PETSc MG infrastructure ... so there must be something in its prolongation operators...

I thought of what _might_ be a better way to run your problems.  Set -pc_gamg_coarse_eq_limit to 1 (or a very small integer) and then use PC "none" on the coarse grid.  Maybe even KSP "preonly", so the coarse grid does nothing.
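
Something along these lines (a sketch; the option names assume the current petsc-dev GAMG and PCMG prefixes):

  -pc_type gamg -pc_gamg_coarse_eq_limit 1 -mg_coarse_ksp_type preonly -mg_coarse_pc_type none

With preonly and no PC, the coarse grid then really does nothing, as above.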

Mark


On Mar 30, 2012, at 11:58 AM, John Mousel wrote:

> Mark,
> 
> I've just run on one core, which allows me to get ML to produce a one-row coarse grid. The problem converges. I'm a bit confused.
> 
> John
> 
> On Fri, Mar 30, 2012 at 10:24 AM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
> -mg_coarse_pc_type svd?
> 
> (Use redundant for parallel.)
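> 
> In parallel that might look like this (a sketch; assuming the standard PCREDUNDANT option prefix, so each process solves the coarse problem redundantly with SVD):
> 
>   -mg_coarse_pc_type redundant -mg_coarse_redundant_pc_type svd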
> 
> On Mar 30, 2012 9:21 AM, "Mark F. Adams" <mark.adams at columbia.edu> wrote:
> 
> On Mar 30, 2012, at 10:52 AM, John Mousel wrote:
> 
>> Mark,
>> 
>> I've run GAMG twice, with coarse grid sizes of 2 and 8, using 1 sweep of SOR on the coarse grid. For a size of 8 it converges nicely, but for a size of 2, I think the null space is causing too many problems.
> 
> Yes, the iterative method is seeing the null space because of floating-point error.
> 
>> If GAMG were to coarsen to a size of 1, then there would be no hope because only the null space would remain, right? This doesn't ever seem to occur with ML because there are at least as many rows as processors.
> 
> Yes, that seems like a good assumption.  The right thing to do here would probably be to do an SVD and filter out the very low modes explicitly.  For now I guess tweaking -pc_gamg_coarse_eq_limit n is all that can be done.  Not very satisfying.  We will think about this ... any thoughts, anyone?
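> 
> For example (the value 8 is only an illustration; it is the coarse size that converged in your runs):
> 
>   -pc_gamg_coarse_eq_limit 8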
> 
> Mark
> 
>> 
>> John
>> 
>> On Fri, Mar 30, 2012 at 8:42 AM, Mark F. Adams <mark.adams at columbia.edu> wrote:
>> 
>> On Mar 29, 2012, at 2:40 PM, John Mousel wrote:
>> 
>>> I'm attempting to solve a non-symmetric discretization of a 3D Poisson problem. The problem is singular. I've attached the results of KSPView from runs with ML and GAMG. When I run ML, I get convergence in 30 iterations. When I attempt to use the same settings with GAMG, I'm not getting convergence at all. The two things I notice are:
>>> 
>>> 1. GAMG is using KSPType preonly, even though I've set it to Richardson in my command-line options.
>> 
>> PETSc seems to switch the coarse grid solver to GMRES during setup.  This seems to be a bug, and I unwisely decided to override it manually. I will undo this in the next checkin.  This should not be the problem, however.
>> 
>>> 2. ML only coarsens down to 4 rows while GAMG coarsens to 2. My problem is singular, and whenever I try to use LU, I get zero-pivot problems. To mitigate this, I've been using Richardson with SOR on the coarse matrix. Could the smaller coarse grid size of GAMG be causing problems with SOR? If so, is there a way to put a lower limit on the coarse grid size?
>>> 
>> 
>> I'm thinking that with a 2x2 coarse grid, 8 iterations of SOR are picking up the null space.  Maybe try just one SOR iteration on the coarse grid.
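>> 
>> That is, something like this (a sketch of the coarse-level options with the standard PCMG prefix):
>> 
>>   -mg_coarse_ksp_type richardson -mg_coarse_ksp_max_it 1 -mg_coarse_pc_type sor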
>> 
>> Also, can you run with -options_left so that I can see your arguments?  One known bug is that mat_diagonal_scale breaks GAMG, but it should also break ML.
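>> 
>> For example (hypothetical executable name; -options_left prints any options that were set but never used):
>> 
>>   mpiexec -n 4 ./your_app <your usual options> -options_left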
>> 
>> Mark
>> 
>>> John
>>> 
>>> On Thu, Mar 29, 2012 at 11:03 AM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
>>> On Thu, Mar 29, 2012 at 09:18, John Mousel <john.mousel at gmail.com> wrote:
>>> [0]PETSC ERROR: Error in external library!
>>> [0]PETSC ERROR: Cannot disable floating point exceptions!
>>> 
>>> Looks like something is strange with your environment, because fesetenv() is returning an error. I have disabled the call when the trap mode is not changing.
>>> 
>>> http://petsc.cs.iit.edu/petsc/petsc-dev/rev/352b4c19e451
>>> 
>>> <KSPView_GAMG.txt><KSPView_ML.txt>
>> 
> 
