[petsc-users] Multigrid with defect correction

Barry Smith bsmith at mcs.anl.gov
Sat Feb 25 18:26:34 CST 2017


> On Feb 25, 2017, at 2:21 PM, Matt Landreman <matt.landreman at gmail.com> wrote:
> 
> Thanks everyone for the help.
> 
> On item 1 (using Amat different from Pmat with geometric multigrid), I tried Barry's suggestion but it did not seem to resolve the issue. For example, in ksp ex25.c, I tried adding the following lines after line 112:
>   if (J == jac) {
>     /* create a separate Amat so it can differ from Pmat */
>     ierr = PetscPrintf(PETSC_COMM_WORLD,"Creating a new Amat\n");CHKERRQ(ierr);
>     ierr = DMCreateMatrix(da,&J);CHKERRQ(ierr);
>     ierr = KSPSetOperators(ksp,J,jac);CHKERRQ(ierr);
>   }
>   ierr = MatShift(J,1.0);CHKERRQ(ierr);
> 
> This change should set Amat (J) to be different from Pmat (jac); specifically, Amat becomes the identity matrix, so the KSP solution should be completely different from that of the original ex25.c. But viewing the solution vector shows it is unchanged, so it seems PETSc is ignoring the Amat created in this approach.

   Ahh, yes, a number of errors on my part.

1) There was old code that immediately overwrote the operators newly set by KSPSetOperators() after the call to ComputeMatrix(), simply discarding the new user-provided matrix; fixed in commit d73a0ed5842d2f684a9729e0b9c14ba0e3d554c3.

2) The default behavior of PCMG was to use the pmat, not the mat, on each level to compute the residual; fixed in commit 13842ffb25245dcab28ec15282db994aa7b56c77. So even after 1) was fixed, it would still have been using "the wrong" matrix to compute the residuals.

3) Even after 2) was fixed, my advice still would not do what I wanted, because the default setting of the matrix used to compute the residual on each level (the call to PCMGSetResidual() in mg.c) occurs BEFORE any calls to ComputeMatrix(), so it would still use the original matrix provided by the DM and not the new one provided by the user.
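   Once 1) and 2) are in place, one could in principle reset that matrix by hand after KSPSetUp() has run. An untested sketch, assuming the high-order matrix is J and the PC is PCMG:

     PC       pc;
     PetscInt nlevels;
     ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
     ierr = PCMGGetLevels(pc,&nlevels);CHKERRQ(ierr);
     /* levels are numbered 0 (coarsest) .. nlevels-1 (finest) */
     ierr = PCMGSetResidual(pc,nlevels-1,PCMGResidualDefault,J);CHKERRQ(ierr);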

  The best solution I see moving forward is to properly support providing two DMs to KSP: one responsible for the "true" operator and one responsible for the "approximate" operator used in constructing the preconditioner. Conceptually, looking back, this makes perfect sense and should have been done the very first day I introduced the concept of providing a DM to the KSP. Unfortunately this requires a good amount of replumbing: wherever we do refinement/coarsening of a DM we need to manage it for both; the horrible DMKSP/SNES/TS construct that Jed needed to introduce to get user callbacks working needs to handle the two DMs in some way; and TS and SNES need to handle the two DMs as well.

  There are additional complications in managing the user callbacks for function evaluations, namely when they should utilize the first DM and when the second: for Newton-based TS and SNES solves they should presumably almost always use just the higher-order function, but for nonlinear multigrid and other non-Newton nonlinear solvers there are presumably places where the lower-order function should be called. The end result of the refactoring is very good, however, since it will then be straightforward to support many cases of the "high-order discretization for operators, but low-order discretization for preconditioners" paradigm, not just for DMDA but for other DMs without the usual PETSc model and code.
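  As a purely hypothetical illustration of where this could head (no such call exists as of this writing; the second function name below is made up):

     /* Hypothetical, NOT an existing PETSc API: a second DM that
        discretizes the low-order operator used to build the PC */
     ierr = KSPSetDM(ksp,dm_high_order);CHKERRQ(ierr);      /* "true" operator */
     ierr = KSPSetDMForPC(ksp,dm_low_order);CHKERRQ(ierr);  /* made-up name */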

  Matt,
    
    So yes, we definitely want to support what you want to do and have always intended to support it eventually; now is the time to start adding that support. I'll start a git branch shortly that begins to make it simple for your KSP case; extending it everywhere will take more time.

  Barry

> 
> Matt K's suggestion of switching from KSP to SNES does work, allowing Amat to differ from Pmat on the finest multigrid level. (On coarser levels, it seems PETSc still forces Amat = Pmat on entry to ComputeMatrix.)
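A minimal sketch of that SNES route, untested; FormFunction, FormJacobian, user, da, Amat, Pmat, and x are placeholder names:

     SNES snes;
     ierr = SNESCreate(PETSC_COMM_WORLD,&snes);CHKERRQ(ierr);
     ierr = SNESSetDM(snes,da);CHKERRQ(ierr);
     ierr = SNESSetFunction(snes,NULL,FormFunction,&user);CHKERRQ(ierr);
     /* Amat (high order) and Pmat (low order) stay distinct here */
     ierr = SNESSetJacobian(snes,Amat,Pmat,FormJacobian,&user);CHKERRQ(ierr);
     ierr = SNESSetType(snes,SNESKSPONLY);CHKERRQ(ierr);  /* one linear solve */
     ierr = SNESSolve(snes,NULL,x);CHKERRQ(ierr);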
> 
> On Jed's comment: the application I have in mind is indeed a convection-dominated equation (a steady linear 3D convection-diffusion equation with smoothly varying anisotropic coefficients and recirculating convection). GAMG and hypre BoomerAMG have been working ok on it if I discretize with low-order upwind differences in Pmat and use Amat = Pmat, but I'd like higher-order accuracy. Using GMRES with a higher-order discretization in Amat and the low-order Pmat works ok, but the number of KSP iterations required grows as the diffusion becomes small compared to the convection, even with -pc_type lu. So I'm working to see if geometric multigrid with defect correction at each level can do better.
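In sketch form, that operator pairing is (matrix names illustrative):

     /* high-order operator defines the residual; low-order upwind
        discretization builds the preconditioner */
     ierr = KSPSetOperators(ksp,A_high,P_upwind);CHKERRQ(ierr);
     ierr = KSPSetType(ksp,KSPGMRES);CHKERRQ(ierr);
     ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);  /* e.g. -pc_type gamg */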
> 
> Thanks,
> Matt Landreman
> 
> 
> 
> 
> On Thu, Feb 23, 2017 at 5:23 PM, Jed Brown <jed at jedbrown.org> wrote:
> Matt Landreman <matt.landreman at gmail.com> writes:
> > 3. Is it at all sensible to do this second kind of defect correction with
> > _algebraic_ multigrid? Perhaps Amat for each level could be formed from the
> > high-order matrix at the fine level by the Galerkin operator R A P, after
> > getting all the restriction matrices created by gamg for Pmat?
> 
> Note that defect correction is most commonly used for
> transport-dominated processes for which the high order discretization is
> not h-elliptic.  AMG heuristics are typically really bad for such
> problems so stabilizing a smoother isn't really a relevant issue.  Also,
> it is usually used for strongly nonlinear problems where AMG's setup
> costs are likely overkill.  This is not to say that defect correction
> AMG won't work, but there is reason to believe that it doesn't matter
> for many important problems.
> 
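For the Galerkin R A P idea raised in the question above: with GAMG the restriction is R = P^T, so a coarse high-order operator could in principle be formed from the interpolation GAMG already built for Pmat. An untested sketch (names illustrative; pc is the PCMG/PCGAMG preconditioner):

     Mat      P, A_coarse;
     PetscInt level = 1;  /* interpolation from level-1 up to level */
     ierr = PCMGGetInterpolation(pc,level,&P);CHKERRQ(ierr);
     /* Galerkin coarse operator: A_coarse = P^T * A_fine * P */
     ierr = MatPtAP(A_fine,P,MAT_INITIAL_MATRIX,2.0,&A_coarse);CHKERRQ(ierr);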
