[petsc-dev] Telescope usage

Loic Gouarin loic.gouarin at polytechnique.edu
Tue May 17 04:56:45 CDT 2022


Hi Dave,

Could you explain what you mean by "state"?

Thanks,

Loic

On 17/05/2022 at 11:50, Dave May wrote:
> Hi Loic,
>
> Can you confirm if your problem has stored state which needs to be 
> repartitioned?
>
> Thanks,
> Dave
>
> On Tue 17. May 2022 at 09:07, Loic Gouarin 
> <loic.gouarin at polytechnique.edu> wrote:
>
>
>     On 17/05/2022 at 01:43, Dave May wrote:
>>
>>
>>     On Tue, 17 May 2022 at 00:12, Matthew Knepley <knepley at gmail.com>
>>     wrote:
>>
>>         On Mon, May 16, 2022 at 9:59 AM Loic Gouarin
>>         <loic.gouarin at polytechnique.edu> wrote:
>>
>>             On 16/05/2022 at 21:54, Barry Smith wrote:
>>
>>>
>>>
>>>>             On May 16, 2022, at 3:50 PM, Loic Gouarin
>>>>             <loic.gouarin at polytechnique.edu> wrote:
>>>>
>>>>             Thanks Barry for your quick reply.
>>>>
>>>>             On 16/05/2022 at 21:41, Barry Smith wrote:
>>>>>
>>>>>               Loic,
>>>>>
>>>>>                 From your code it looks like you are using a DM.
>>>>>             Is it a DMDA or a shell DM? If it is a DMDA then the
>>>>>             process is intended to be pretty straightforward.
>>>>>             PCTELESCOPE should create a new DMDA that has the same
>>>>>             properties as the coarse grid DMDA but lives on a
>>>>>             smaller number of ranks. From this you can just
>>>>>             provide multiple levels of coarsening of the DMDA to
>>>>>             produce the smaller multigrid problems.
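>>>>>
>>>>>             A minimal sketch of that flow, once PCSetUp has run
>>>>>             (pc_telescope and dm_small are placeholder names):
>>>>>
>>>>>             KSP sub_ksp;
>>>>>             DM  dm_small;
>>>>>             PC  sub_pc;
>>>>>             ierr = PCTelescopeGetKSP(pc_telescope, &sub_ksp);CHKERRQ(ierr);
>>>>>             if (sub_ksp) { /* non-NULL only on the ranks of the reduced communicator */
>>>>>               /* the DM attached to the sub-KSP is the repartitioned DMDA */
>>>>>               ierr = KSPGetDM(sub_ksp, &dm_small);CHKERRQ(ierr);
>>>>>               ierr = KSPGetPC(sub_ksp, &sub_pc);CHKERRQ(ierr);
>>>>>               ierr = PCSetType(sub_pc, PCMG);CHKERRQ(ierr);
>>>>>               /* coarsen dm_small further to define the inner multigrid levels */
>>>>>             }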
>>>>             It's a DMDA. So what you mean is to take the KSP from
>>>>             PCTELESCOPE, get its DM (which is the same as the
>>>>             coarse-grid DMDA), and then build all the levels of
>>>>             PCTELESCOPE the same way as before?
>>>
>>>               Yes, I think you should be able to pull a DM out of
>>>             the coarse-level PC (on the subset of ranks) and then
>>>             set up coarse levels from that. But you need to look at
>>>             an example that uses telescope, and probably even the
>>>             telescope code itself, to see all the details. I don't
>>>             know them anymore.
>>
>>             I looked at the source code but it didn't help. I will
>>             try to explain my experiments more clearly so you can see
>>             what I am doing wrong and help more easily.
>>
>>         Hi Loic!
>>
>>         Here is my guess. When TELESCOPE redistributes the coarse
>>         grid DMDA, it creates a new DMDA and copies over all the
>>         information it can see. However, I think you may have state
>>         that is attached to the old DMDA that is not getting
>>         redistributed. If this is true, I can help you write a hook
>>         that redistributes that state when TELESCOPE maps between
>>         distributions.
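>>
>>         For concreteness, one common way such state ends up attached
>>         (the name "myapp_state" and the Vec are just illustrative):
>>
>>         /* application state composed with the fine DMDA; a plain
>>            copy of the DMDA knows nothing about this */
>>         ierr = PetscObjectCompose((PetscObject)da_fine, "myapp_state",
>>                                   (PetscObject)state_vec);CHKERRQ(ierr);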
>>
>>
>>     If you have state you need to propagate between different
>>     communicators, you will have to use the telescope type which is
>>     internally referred to as
>>     TELESCOPE_COARSEDM
>>
>>     To activate this type you need the option
>>     -pc_telescope_use_coarse_dm
>>     or you can call PCTelescopeSetUseCoarseDM().
>>     The assumptions behind using this option are described here:
>>     https://petsc.org/main/docs/manualpages/PC/PCTelescopeSetUseCoarseDM
>>
>>     and the source associated with the COARSEDM variant is here:
>>     https://petsc.org/main/src/ksp/pc/impls/telescope/telescope_coarsedm.c.html
>>
>>     Using TELESCOPE_COARSEDM you will have access to the hooks Matt
>>     is referring to.
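>>
>>     A minimal sketch of the setup this variant expects (da_fine and
>>     da_coarse are placeholders; da_coarse must be a DM you created
>>     yourself on a sub-communicator of da_fine's communicator):
>>
>>     /* attach the coarse DM so telescope can find it */
>>     ierr = DMSetCoarseDM(da_fine, da_coarse);CHKERRQ(ierr);
>>     ierr = PCTelescopeSetUseCoarseDM(pc_telescope, PETSC_TRUE);CHKERRQ(ierr);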
>>
>     Thanks Matthew and Dave.
>
>     I used TELESCOPE_COARSEDM in my previous tests but it tells me
>     that "Zero instances of a coarse DM were found". And indeed, my
>     coarse solver only has the DM of the coarse mesh, not a fine-level
>     DM with a coarse DM attached. I think it's because I use
>     DMKSPSetComputeOperators to create the KSP on each level.
>
>     So for now, I just added
>
>     KSP coarse;
>     ierr = PCMGGetCoarseSolve(pc_i, &coarse);CHKERRQ(ierr);      /* coarse-level KSP of my PCMG hierarchy */
>     PC pc_coarse;
>     ierr = KSPGetPC(coarse, &pc_coarse);CHKERRQ(ierr);
>     ierr = PCSetType(pc_coarse, PCTELESCOPE);CHKERRQ(ierr);      /* wrap the coarse solve in telescope */
>     ierr = PCTelescopeSetUseCoarseDM(pc_coarse, PETSC_TRUE);CHKERRQ(ierr);
>
>     but it's not enough.
>
>     Cheers,
>
>     Loic
>
>
>>     Cheers,
>>     Dave
>>
>>
>>
>>            Thanks,
>>
>>              Matt
>>
>>             Thanks,
>>
>>             Loic
>>
>>>
>>>               Barry
>>>
>>>>>
>>>>>                Can you let us know more details of what exactly
>>>>>             goes wrong with your attempts? It is likely
>>>>>             straightforward to get things working; any difficulty
>>>>>             is perhaps due to our unclear documentation.
>>>>
>>>>             I essentially end up with wrong state or NULL matrices
>>>>             inside PCTELESCOPE.
>>>>
>>>>             Loic
>>>>
>>>>>
>>>>>               Barry
>>>>>
>>>>>
>>>>>>             On May 16, 2022, at 3:28 PM, Loic Gouarin
>>>>>>             <loic.gouarin at polytechnique.edu> wrote:
>>>>>>
>>>>>>             Hello,
>>>>>>
>>>>>>             I could have posted my message on the user list, but
>>>>>>             it seems to me that this belongs more on the PETSc
>>>>>>             development side. Don't hesitate to tell me if I'm wrong.
>>>>>>             I am developing a code called cafes that studies
>>>>>>             fluid-particle interactions in a Stokes fluid. For
>>>>>>             that, we implemented the whole solver matrix-free on
>>>>>>             a Cartesian grid. The preconditioner is naturally a
>>>>>>             geometric multigrid for the velocity. The matrix-free
>>>>>>             operator for each level is set up by hand (I worked
>>>>>>             with Matthew on this a long time ago now!).
>>>>>>             I would like to use telescope on this part. I started
>>>>>>             it but could not get it working, so I would like to
>>>>>>             have your help.
>>>>>>             What I tried to do was to describe my classical
>>>>>>             multigrid up to a level N1 and then take the coarse
>>>>>>             solver and apply the telescope preconditioner to it
>>>>>>             up to a level N2. I don't know if this is the best
>>>>>>             way, but it seemed the most intuitive. The problem is
>>>>>>             that I could not get the coarse matrix because the
>>>>>>             setup has not been done yet (I use
>>>>>>             DMKSPSetComputeOperators for the matrices).
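>>>>>>
>>>>>>             For reference, the pattern I mean is roughly the
>>>>>>             following (ComputeStokesMatrix and ctx stand for the
>>>>>>             cafes callbacks):
>>>>>>
>>>>>>             /* the level matrix is only assembled later, during
>>>>>>                KSPSetUp, which is why it is not available at
>>>>>>                this point */
>>>>>>             ierr = DMKSPSetComputeOperators(dm_level, ComputeStokesMatrix, &ctx);CHKERRQ(ierr);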
>>>>>>
>>>>>>             The construction of the matrix free for each level is
>>>>>>             here:
>>>>>>             https://github.com/gouarin/cafes/blob/master/cafes/problem/stokes.hpp#L59-L88
>>>>>>
>>>>>>             The description of the Stokes preconditioner is
>>>>>>             described here :
>>>>>>             https://github.com/gouarin/cafes/blob/master/cafes/problem/stokes.hpp#L109-L162
>>>>>>
>>>>>>             I have tried multiple implementations without
>>>>>>             success. For me, the starting point is to add, at
>>>>>>             line 105, something like
>>>>>>
>>>>>>             KSP coarse;
>>>>>>             ierr = PCMGGetCoarseSolve(pc_i, &coarse);CHKERRQ(ierr);
>>>>>>             PC pc_coarse;
>>>>>>             ierr = KSPGetPC(coarse, &pc_coarse);CHKERRQ(ierr);
>>>>>>             ierr = PCSetType(pc_coarse, PCTELESCOPE);CHKERRQ(ierr);
>>>>>>
>>>>>>             Then, to create the matrices for the levels inside
>>>>>>             TELESCOPE, I thought of using the same process as
>>>>>>             before with DMKSPSetComputeOperators. But it doesn't
>>>>>>             work... Could you tell me how I can make it work?
>>>>>>
>>>>>>             Thanks,
>>>>>>             Loic
>>>>>>
>>>>>
>>>
>>
>>
>>         -- 
>>         What most experimenters take for granted before they begin
>>         their experiments is infinitely more interesting than any
>>         results to which their experiments lead.
>>         -- Norbert Wiener
>>
>>         https://www.cse.buffalo.edu/~knepley/
>>

