[petsc-dev] [petsc-users] Multigrid with defect correction

Lawrence Mitchell wencel at gmail.com
Tue Mar 7 02:47:29 CST 2017


> On 5 Mar 2017, at 23:02, Barry Smith <bsmith at mcs.anl.gov> wrote:
> 
>> 
>> On Mar 5, 2017, at 11:18 AM, Jed Brown <jed at jedbrown.org> wrote:
>> 
>> Barry Smith <bsmith at mcs.anl.gov> writes:
>> 
>>>  I've looked at the code again and I'm afraid that just "adding a second DM" is not trivial. The DMKSP/SNES/TS stuff is all built around a single DM and has stuff like ->dmoriginal in it (do we have two dmoriginal with all its complications?). 
>>> 
>>>  Perhaps the entire KSPSetComputeOperators() model was misguided and
>>>  I should have stuck with the "you fill up all the needed
>>>  information for the levels of PCMG" before you do the multigrid
>>>  setup instead of providing callbacks that can fill them up on
>>>  the fly "as needed".  
>> 
>> How would this work when the PCMG is deeply nested and you want to
>> rediscretize on coarse grids?
> 
>    So, for example, a PCFIELDSPLIT with a PCMG on one of the fields. 
> 
>    Good point, but I don't think anyone can actually do this now without having Jed hold our hand and write more obscure hook code. (And any design that requires the knowledge of a single guru is not a good design).
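
(For concreteness, I read the nested case as something like the following options sketch, a PCFIELDSPLIT whose first split is a PCMG that wants rediscretized level operators; this is purely illustrative, not taken from any real setup:)

    from petsc4py import PETSc

    # Hypothetical illustration of the nesting under discussion: a
    # fieldsplit preconditioner whose first split is solved with
    # geometric multigrid, so the MG levels live inside the split.
    opts = PETSc.Options()
    opts["pc_type"] = "fieldsplit"
    opts["fieldsplit_0_ksp_type"] = "preonly"
    opts["fieldsplit_0_pc_type"] = "mg"
    opts["fieldsplit_0_pc_mg_levels"] = 3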


FWIW, here's how I do it at the moment (which doesn't use the hook system, and so is plausibly fragile):

In DMCreateFieldDecomposition, I do:

    if ctx is not None:
        # The DM comes from a hierarchy, so split the appctx apart in
        # case we want to use it inside a solve.  If we're not from a
        # hierarchy, this information is not used, so don't bother
        # splitting, since it costs some time.
        if dm.getRefineLevel() - dm.getCoarsenLevel() != 0:
            ctxs = ctx.split([i for i in range(len(W))])
            for d, c in zip(dms, ctxs):
                set_appctx(d, c)

So if I notice I have an appctx lying around, I split it as well as the DMs, and hang the pieces on the decomposed DMs.
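
The set_appctx/get_appctx helpers aren't shown here; a minimal sketch of what they might do, assuming petsc4py's per-DM application-context slot (the real helpers do a bit more bookkeeping):

    def set_appctx(dm, ctx):
        # Stash the Python application context on the DM so that
        # callbacks which only receive the DM can recover it later.
        dm.setAppCtx(ctx)

    def get_appctx(dm):
        # Returns None if nothing has been stashed on this DM yet.
        return dm.getAppCtx()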

Inside DMCoarsen I do:

    ctx = get_appctx(dm)
    if ctx is not None:
        set_appctx(cdm, coarsen(ctx))
        # Necessary for MG inside a fieldsplit in a SNES.
        cdm.setKSPComputeOperators(myComputeOperators)

So I manually transfer the context across.  Since we always use a SNES, SNESComputeJacobian is transferred automatically.
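
myComputeOperators isn't shown either; it has the shape of petsc4py's KSP compute-operators callback.  A minimal sketch, assuming a hypothetical assemble_jacobian method on the coarsened context:

    def myComputeOperators(ksp, J, P):
        # PCMG calls this on each level; the level KSP's DM is the
        # coarsened DM, so the coarsened appctx hangs off it.
        dm = ksp.getDM()
        ctx = get_appctx(dm)
        # Hypothetical: the context assembles its linearisation into P.
        ctx.assemble_jacobian(P)
        P.assemble()
        if J.handle != P.handle:
            J.assemble()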

For the linearisation around the current solution, my app contexts know where they were coarsened from, so I can inject the solution the fine context knows about into the coarse contexts and then compute the linearisation.
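
Sketched out (with hypothetical names, since the real contexts also coarsen the form, boundary conditions, and so on), that looks roughly like:

    def coarsen(ctx):
        # Build a coarse context that remembers its fine parent, so a
        # later linearisation can pull the current fine solution down.
        cctx = rediscretize(ctx)  # hypothetical: rebuild ctx's problem coarsely
        cctx._fine = ctx
        return cctx

    def update_state(cctx):
        # Inject the fine solution into the coarse context before
        # assembling the linearisation around it.
        inject(cctx._fine.solution, cctx.solution)  # hypothetical transfer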

This allows PCMG inside PCFieldSplit (nested), and also FAS (either as an NPC or as the solver).  But it is all a bit flaky right now, since I'm sure I'm not doing things right.

Cheers,

Lawrence