[petsc-users] PETSc / AMRex

Matthew Knepley knepley at gmail.com
Fri Jul 15 14:26:40 CDT 2022


On Fri, Jul 15, 2022 at 2:12 PM Randall Mackie <rlmackie862 at gmail.com>
wrote:

>
>
> On Jul 15, 2022, at 11:58 AM, Matthew Knepley <knepley at gmail.com> wrote:
>
> On Fri, Jul 15, 2022 at 1:46 PM Randall Mackie <rlmackie862 at gmail.com>
> wrote:
>
>> On Jul 15, 2022, at 11:20 AM, Matthew Knepley <knepley at gmail.com> wrote:
>>
>> On Fri, Jul 15, 2022 at 11:01 AM Randall Mackie <rlmackie862 at gmail.com>
>> wrote:
>>
>>> I am also interested in converting my DMDA code to DMPlex so that I can
>>> use octree grids.
>>>
>>> Is there a simple example that would show how to do a box grid in
>>> DMPlex, or more information about how to convert a DMDA grid to DMPlex?
>>>
>>
>> Hi Randy,
>>
>> Creating a box mesh is easy and can be done from the command line.
>>
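A minimal sketch of that (not from this thread, just for reference): an empty
DM of type DMPLEX picks up its box description from the options database, so
no meshing code is needed beyond the boilerplate below. The option values are
illustrative, and PetscCall() assumes PETSc 3.17 or later.

#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM dm;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* Run with e.g. -dm_plex_dim 3 -dm_plex_simplex 0 -dm_plex_box_faces 8,8,8 */
  PetscCall(DMCreate(PETSC_COMM_WORLD, &dm));
  PetscCall(DMSetType(dm, DMPLEX));
  PetscCall(DMSetFromOptions(dm));
  PetscCall(DMViewFromOptions(dm, NULL, "-dm_view")); /* add -dm_view to inspect it */
  PetscCall(DMDestroy(&dm));
  PetscCall(PetscFinalize());
  return 0;
}
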
>> The hard part is usually converting the loop structure. Plex is set up to
>> support FEM and FVM, which are both cell-oriented. DMDA, on the other hand,
>> tends to support a stencil of cells/vertices. Is this how your code looks?
>>
>>
>> Hi Matt,
>>
>> I figured the hard part was the loop structure.
>>
>> Yes, my DMDA code is pretty standard. I have fields defined on block edges
>> (it’s a staggered-grid implementation, but written long before PETSc had
>> staggered-grid capability), and I just loop over the DMDA grid points like:
>>
>> do k=zs,ze
>>   do j=ys,ye
>>     do i=xs,xe
>>
>> I’d be very interested to see what you can show us.
>>
>
> What information do you need when computing an entry, for the cell and
> then for the face?
>
>
> I currently set up a 3D DMDA using a box stencil and a stencil width of 2.
> The i,j,k coordinates refer both to the cell (where there is a physical
> value assigned) and to the 3 edges of the cell at the top SW corner.
> For local computations, I need to be able to access the values up to +/- 2
> grid points away.
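
For reference, a minimal sketch of that kind of setup through the C API (the
global sizes and the 3 dof per point are illustrative assumptions; PetscCall()
assumes PETSc 3.17 or later):

#include <petscdmda.h>

static PetscErrorCode CreateEdgeDMDA(DM *da)
{
  PetscFunctionBeginUser;
  /* 3D DMDA, box stencil, stencil width 2; dof = 3 for the three edge values
     stored at each (i,j,k) point. The 64^3 global size is made up. */
  PetscCall(DMDACreate3d(PETSC_COMM_WORLD,
                         DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                         DMDA_STENCIL_BOX, 64, 64, 64,
                         PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                         3 /* dof */, 2 /* stencil width */,
                         NULL, NULL, NULL, da));
  PetscCall(DMSetUp(*da));
  PetscFunctionReturn(0);
}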
>
> I don’t really refer to the faces since that is implicitly included in the
> curl-curl formulation I am solving.
>
> Is this what you are asking for?
>

Yes. Unfortunately, this is hard. The topological definitions are all
local, so even 1 layer of cells is awkward, but 2 layers
would be harder. With adaptivity, it gets harder still.

My approach, with Abhishek and Dave Salac, has been to preprocess all
stencils and store them. Since p4est assumes
a Cartesian topology, it might be easier to directly use the p4est
traversal. Toby might be better at explaining that.
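
To make that concrete, an illustrative sketch (not the code referred to above)
of visiting one stencil layer on an interpolated DMPlex by walking cell ->
face -> neighboring cell; a width-2 stencil would repeat the walk from each
neighbor and deduplicate, which is what makes precomputing and storing the
result attractive:

#include <petscdmplex.h>

static PetscErrorCode VisitFaceNeighbors(DM dm)
{
  PetscInt cStart, cEnd, c;

  PetscFunctionBeginUser;
  /* Assumes an interpolated mesh, so the cone of a cell consists of faces */
  PetscCall(DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd)); /* height 0 = cells */
  for (c = cStart; c < cEnd; ++c) {
    const PetscInt *faces;
    PetscInt        nf, f;

    PetscCall(DMPlexGetConeSize(dm, c, &nf));
    PetscCall(DMPlexGetCone(dm, c, &faces));
    for (f = 0; f < nf; ++f) {
      const PetscInt *cells;
      PetscInt        nc, i;

      PetscCall(DMPlexGetSupportSize(dm, faces[f], &nc));
      PetscCall(DMPlexGetSupport(dm, faces[f], &cells));
      for (i = 0; i < nc; ++i) {
        if (cells[i] != c) {
          /* cells[i] shares face faces[f] with c: record it in whatever
             stencil table the application stores */
        }
      }
    }
  }
  PetscFunctionReturn(0);
}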

  Thanks,

     Matt


> Randy
>
>
>   Thanks,
>
>      Matt
>
>
>> Thanks, Randy
>>
>>
>> I have a student working on a PetscFD which I could show you, but it is
>> far from production.
>>
>>   Thanks,
>>
>>      Matt
>>
>>
>>> Thanks, Randy
>>>
>>> On Jun 21, 2022, at 10:57 AM, Mark Adams <mfadams at lbl.gov> wrote:
>>>
>>> (Keep this on the list; you will need Matt and Toby soon anyway.)
>>>
>>> So you want to add AMReX to your code.
>>>
>>> I think the first thing you want to do is move your DMDA code to DMPlex.
>>> You can create a "box" mesh, and it is not hard.
>>> Others like Matt can give advice on how to get started on that translation.
>>> There is then a simple step, which Matt mentioned, to create a DMForest
>>> (p4est/p8est) from the DMPlex.
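
A rough sketch of that conversion step (assuming PETSc was configured
--with-p4est; the refinement option shown is illustrative):

#include <petscdmforest.h>

static PetscErrorCode PlexToForest(DM plex, DM *forest)
{
  PetscFunctionBeginUser;
  /* Use DMP4EST instead of DMP8EST for a 2D base mesh */
  PetscCall(DMConvert(plex, DMP8EST, forest));
  PetscCall(DMSetFromOptions(*forest)); /* e.g. -dm_forest_initial_refinement 1 */
  PetscFunctionReturn(0);
}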
>>>
>>> At this point you can run your current SNES tests and get back to where
>>> you started, but now AMR is easy.
>>> Or as easy as it gets.
>>>
>>> As far as AMReX goes, it's not clear what AMReX does for you at this point.
>>> You don't seem to have AMReX code that you want to reuse.
>>> If there is some functionality that you need, then we can talk about it; or
>>> if you have some programmatic reason to use it (e.g., they are paying you),
>>> then, again, we can talk about it.
>>>
>>> PETSc/p4est and AMReX are similar, with different strengths and designs,
>>> and you could use both, but that would complicate things.
>>>
>>> Hope that helps,
>>> Mark
>>>
>>>
>>> On Tue, Jun 21, 2022 at 1:18 PM Bernigaud Pierre <
>>> pierre.bernigaud at onera.fr> wrote:
>>>
>>>> Hello Mark,
>>>>
>>>> We have a working solver employing SNES, with a DMDA attached to handle
>>>> ghost cells / data sharing between processors for flux evaluation (using
>>>> DMGlobalToLocalBegin / DMGlobalToLocalEnd). We are considering adding an
>>>> AMReX layer to the solver, but no work has been done yet, as we are
>>>> currently evaluating whether it would be feasible without too much trouble.
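
For reference, the ghost-exchange pattern described above is roughly the
following generic sketch (the names are placeholders, not the actual solver
code):

#include <petscdm.h>

static PetscErrorCode UpdateGhosts(DM da, Vec x, Vec *xlocal)
{
  PetscFunctionBeginUser;
  PetscCall(DMGetLocalVector(da, xlocal));
  PetscCall(DMGlobalToLocalBegin(da, x, INSERT_VALUES, *xlocal));
  PetscCall(DMGlobalToLocalEnd(da, x, INSERT_VALUES, *xlocal));
  /* ... the flux evaluation reads the ghosted values in *xlocal;
     afterwards call DMRestoreLocalVector(da, xlocal) ... */
  PetscFunctionReturn(0);
}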
>>>>
>>>> Our main concern is understanding how to correctly interface PETSc
>>>> (SNES + DMDA) with AMReX, as AMReX also appears to have its own methods for
>>>> parallel data management. Hence our request for examples, just to get a
>>>> feel for how it would work out.
>>>>
>>>> Best,
>>>>
>>>> Pierre
>>>> On 21/06/2022 at 18:00, Mark Adams wrote:
>>>>
>>>> Hi Bernigaud,
>>>>
>>>> To be clear, you have SNES working with DMDA in AMReX, but without
>>>> adapting dynamically, and you want to know what to do next.
>>>>
>>>> Is that right?
>>>>
>>>> Mark
>>>>
>>>>
>>>>
>>>>
>>>> On Tue, Jun 21, 2022 at 11:46 AM Bernigaud Pierre <
>>>> pierre.bernigaud at onera.fr> wrote:
>>>>
>>>>> Greetings,
>>>>>
>>>>> I hope you are doing great.
>>>>>
>>>>> We are currently working on a parallel solver employing PETSc for the
>>>>> main numerical methods (GMRES, Newton-Krylov). We would be interested in
>>>>> combining the PETSc solvers with the AMR framework provided by the AMReX
>>>>> library (https://amrex-codes.github.io/amrex/). I know that within the
>>>>> AMReX framework the KSP solvers provided by PETSc can be used, but what
>>>>> about the SNES solvers? More specifically, we are using a DMDA to manage
>>>>> parallel communications during the SNES calculations, and I am wondering
>>>>> how it would behave in a context where the data layout between processors
>>>>> is modified by the AMR code when refining the grid.
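
For context, attaching a DMDA to SNES so that the DM manages the parallel data
layout looks roughly like this sketch (the residual callback and the solution
vector are placeholders):

#include <petscsnes.h>
#include <petscdmda.h>

/* Placeholder residual callback using the DMDASNESSetFunctionLocal calling
   convention: (DMDALocalInfo *, local x array, local f array, user context) */
extern PetscErrorCode FormFunctionLocal(DMDALocalInfo *, void *, void *, void *);

static PetscErrorCode SolveWithDMDA(DM da, Vec x)
{
  SNES snes;

  PetscFunctionBeginUser;
  PetscCall(SNESCreate(PETSC_COMM_WORLD, &snes));
  PetscCall(SNESSetDM(snes, da)); /* SNES takes its vectors and ghosting from the DM */
  PetscCall(DMDASNESSetFunctionLocal(da, INSERT_VALUES,
                                     (DMDASNESFunction)FormFunctionLocal, NULL));
  PetscCall(SNESSetFromOptions(snes));
  PetscCall(SNESSolve(snes, NULL, x));
  PetscCall(SNESDestroy(&snes));
  PetscFunctionReturn(0);
}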
>>>>>
>>>>> Would you have any experience with this? Is there any collaboration going
>>>>> on between PETSc and AMReX, or would you know of a code using both of them?
>>>>>
>>>>> Respectfully,
>>>>>
>>>>> Pierre Bernigaud
>>>>>
>>>>> --
>>>> *Pierre Bernigaud*
>>>> PhD student (Doctorant)
>>>> Département multi-physique pour l’énergétique
>>>> Modélisation Propulsion Fusée
>>>> Tel: +33 1 80 38 62 33
>>>>
>>>>
>>>> ONERA - The French Aerospace Lab - Centre de Palaiseau
>>>> 6, Chemin de la Vauve aux Granges - 91123 PALAISEAU
>>>> GPS coordinates: 48.715169, 2.232833
>>>>
>>>>
>>>>
>>>>
>>>
>>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

