[petsc-users] PETSc / AMReX

Matthew Knepley knepley at gmail.com
Fri Jul 15 13:20:42 CDT 2022


On Fri, Jul 15, 2022 at 11:01 AM Randall Mackie <rlmackie862 at gmail.com>
wrote:

> I am also interested in converting my DMDA code to DMPlex so that I can
> use octree grids.
>
> Is there a simple example that would show how to do a box grid in DMPlex,
> or more information about how to convert a DMDA grid to DMPlex?
>

Hi Randy,

Creating a box mesh is easy and can be done from the command line.
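For example, a minimal sketch (the box options shown, e.g. -dm_plex_box_faces,
and the PetscCall() macro are current as of PETSc 3.17; older code would use
CHKERRQ()):

  #include <petscdmplex.h>

  int main(int argc, char **argv)
  {
    DM dm;

    PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
    PetscCall(DMCreate(PETSC_COMM_WORLD, &dm));
    PetscCall(DMSetType(dm, DMPLEX));  /* an empty Plex defaults to a box */
    PetscCall(DMSetFromOptions(dm));   /* dimension, cell type, faces, ... */
    PetscCall(DMViewFromOptions(dm, NULL, "-dm_view"));
    PetscCall(DMDestroy(&dm));
    PetscCall(PetscFinalize());
    return 0;
  }

run with, e.g.,

  ./app -dm_plex_dim 3 -dm_plex_simplex 0 -dm_plex_box_faces 16,16,16 -dm_view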

The hard part is usually converting the loop structure. Plex is set up to
support FEM and FVM, which are both cell-oriented. DMDA, on the other hand,
tends to support a stencil of cells/vertices. Is this how your code looks?
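For concreteness, a cell loop in Plex usually has this shape (just a sketch):

  PetscInt cStart, cEnd, c;

  /* cells are the height-0 points of the Plex DAG */
  PetscCall(DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd));
  for (c = cStart; c < cEnd; ++c) {
    /* gather unknowns on the closure of cell c, compute, scatter back,
       e.g. with DMPlexVecGetClosure()/DMPlexVecSetClosure() */
  }

whereas a DMDA loop indexes i/j/k neighbors directly.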

I have a student working on a PetscFD which I could show you, but it is far
from production.

  Thanks,

     Matt


> Thanks, Randy
>
> On Jun 21, 2022, at 10:57 AM, Mark Adams <mfadams at lbl.gov> wrote:
>
> (keep on the list, you will need Matt and Toby soon anyway).
>
> So you want to add AMReX to your code.
>
> I think the first thing you want to do is move your DMDA code to DMPlex.
> You can create a "box" mesh, and it is not hard.
> Others like Matt can give advice on how to get started on that translation.
> From the DMPlex there is then a simple step, which Matt mentioned, to
> create a DMForest (p4est/p8est); a sketch follows.
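> A sketch of that step (it assumes PETSc was configured with p4est, e.g.
> --download-p4est; use DMP8EST for 3D meshes):
>
>   DM forest;
>
>   PetscCall(DMConvert(plex, DMP4EST, &forest));
>   PetscCall(DMSetFromOptions(forest)); /* e.g. -dm_forest_initial_refinement 2 */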
>
> At this point you can run your current SNES tests and get back to where
> you started, but now AMR is easy (see the sketch below).
> Or as easy as it gets.
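> Adapting then looks roughly like this (a sketch; the refinement criterion
> and the cell index c are your own):
>
>   DMLabel adaptLabel;
>   DM      dmAdapt;
>
>   PetscCall(DMLabelCreate(PETSC_COMM_SELF, "adapt", &adaptLabel));
>   /* inside your cell loop: flag cell c when your criterion fires */
>   PetscCall(DMLabelSetValue(adaptLabel, c, DM_ADAPT_REFINE));
>   PetscCall(DMAdaptLabel(forest, adaptLabel, &dmAdapt));
>   PetscCall(DMLabelDestroy(&adaptLabel));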
>
> As for AMReX, well, it's not clear what AMReX does for you at this point.
> You don't seem to have AMReX code that you want to reuse.
> If there is some functionality that you need, then we can talk about it,
> or if you have some programmatic reason to use it (e.g., they are paying
> you), then, again, we can talk about it.
>
> PETSc/p4est and AMReX are similar, with different strengths and designs,
> and you could use both, but that would complicate things.
>
> Hope that helps,
> Mark
>
>
> On Tue, Jun 21, 2022 at 1:18 PM Bernigaud Pierre <
> pierre.bernigaud at onera.fr> wrote:
>
>> Hello Mark,
>>
>> We have a working solver employing SNES, to which is attached a DMDA to
>> handle ghost cells / data sharing between processors for flux evaluation
>> (using DMGlobalToLocalBegin / DMGlobalToLocalEnd) . We are considering to
>> add an AMReX layer to the solver, but no work has been done yet, as we are
>> currently evaluating if it would be feasible without too much trouble.
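>> For reference, the ghost-exchange pattern is roughly the following (a
>> minimal sketch, where da is our DMDA and x the global solution vector; the
>> DMGetLocalVector / DMGlobalToLocal calls are generic over DMs, so the same
>> calls carry over to a DMPlex):
>>
>>   Vec                  xLoc;
>>   const PetscScalar ***a;
>>
>>   PetscCall(DMGetLocalVector(da, &xLoc));
>>   PetscCall(DMGlobalToLocalBegin(da, x, INSERT_VALUES, xLoc));
>>   PetscCall(DMGlobalToLocalEnd(da, x, INSERT_VALUES, xLoc)); /* ghosts filled */
>>   PetscCall(DMDAVecGetArrayRead(da, xLoc, &a));
>>   /* stencil access for the flux evaluation, e.g. a[k][j][i-1] */
>>   PetscCall(DMDAVecRestoreArrayRead(da, xLoc, &a));
>>   PetscCall(DMRestoreLocalVector(da, &xLoc));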
>>
>> Our main subject of concern would be understanding how to correctly
>> interface PETSc (SNES + DMDA) with AMReX, as AMReX also appears to have
>> its own methods for parallel data management. Hence our inquiry for
>> examples, just to get a feel for how it would work out.
>>
>> Best,
>>
>> Pierre
>> On 21/06/2022 at 18:00, Mark Adams wrote:
>>
>> Hi Bernigaud,
>>
>> To be clear, you have SNES working with DMDA in AMReX, but
>> without adapting dynamically, and you want to know what to do next.
>>
>> Is that right?
>>
>> Mark
>>
>>
>>
>>
>> On Tue, Jun 21, 2022 at 11:46 AM Bernigaud Pierre <
>> pierre.bernigaud at onera.fr> wrote:
>>
>>> Greetings,
>>>
>>> I hope you are doing great.
>>>
>>> We are currently working on a parallel solver employing PETSc for the
>>> main numerical methods (GMRES, Newton-Krylov). We would be interested in
>>> combining the PETSc solvers with the AMR framework provided by the AMReX
>>> library (https://amrex-codes.github.io/amrex/). I know that within the
>>> AMReX framework the KSP solvers provided by PETSc can be used, but what
>>> about the SNES solvers? More specifically, we are using a DMDA to manage
>>> parallel communications during the SNES calculations, and I am wondering
>>> how it would behave in a context where the data layout between
>>> processors is modified by the AMR code when refining the grid.
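>>> Concretely, the DMDA is attached to the solver along these lines (a
>>> sketch; FormFunction and user stand in for our own residual callback and
>>> application context):
>>>
>>>   SNES snes;
>>>
>>>   PetscCall(SNESCreate(PETSC_COMM_WORLD, &snes));
>>>   PetscCall(SNESSetDM(snes, da)); /* SNES takes layouts/vectors from it */
>>>   PetscCall(SNESSetFunction(snes, NULL, FormFunction, &user));
>>>   PetscCall(SNESSetFromOptions(snes));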
>>>
>>> Would you have any experience on this matter? Is there any collaboration
>>> going on between PETSc and AMReX, or would you know of a code using both
>>> of them?
>>>
>>> Respectfully,
>>>
>>> Pierre Bernigaud
>>>
>>> --
>> *Pierre Bernigaud*
>> PhD student (Doctorant)
>> Multi-Physics Department for Energetics
>> Rocket Propulsion Modeling
>> Tel: +33 1 80 38 62 33
>>
>>
>> ONERA - The French Aerospace Lab - Centre de Palaiseau
>> 6, Chemin de la Vauve aux Granges - 91123 PALAISEAU
>>
>>
>>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/