[petsc-users] PETSc / AMRex
Randall Mackie
rlmackie862 at gmail.com
Fri Jul 15 14:47:06 CDT 2022
> On Jul 15, 2022, at 12:40 PM, Jed Brown <jed at jedbrown.org> wrote:
>
> Matthew Knepley <knepley at gmail.com> writes:
>
>>> I currently set up a 3D DMDA using a box stencil and a stencil width of 2.
>>> The i,j,k coordinates refer both to the cell (where there is a physical
>>> value assigned) and to the 3 edges of the cell at the top SW corner.
>>> For local computations, I need to be able to access the values up to +/- 2
>>> grid points away.
>>>
>>> I don’t really refer to the faces since that is implicitly included in the
>>> curl-curl formulation I am solving.
>>>
>>> Is this what you are asking for?
>>>
>>
>> Yes. Unfortunately, this is hard. The topological definitions are all
>> local, so even 1 layer of cells is awkward, but 2 layers
>> would be harder. With adaptivity, it gets harder still.
>>
>> My approach, with Abhishek and Dave Salac, has been to preprocess all
>> stencils and store them. Since p4est assumes
>> a Cartesian topology, it might be easier to directly use the p4est
>> traversal. Toby might be better at explaining that.
>
> Randall, do you want a single globally structured block or could you have a classical structured discretization inside each hex "element" (one or more per process) with interface conditions?
Hi Jed,
I currently have a 3D staggered-grid finite difference code using PETSc that models EM fields and has been working well for many years. I now want to use OcTree grids to refine the discretization in certain parts of the model in a reasonable way. The design of the OcTree grid would be external to PETSc. I am just looking for the best and easiest way to solve this within the PETSc framework, since I am already very familiar with the DMDA capability and my code is fully functional there.
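For concreteness, the kind of DMDA setup I described earlier (box stencil, stencil width 2) looks roughly like the sketch below; the 64^3 grid size, single degree of freedom, and non-periodic boundary types are placeholders rather than the values from my actual code:

```c
#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM da;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* 3D DMDA with a box stencil of width 2, so local computations can
     access values up to +/- 2 grid points away.  Grid size, dof, and
     boundary types here are placeholders. */
  PetscCall(DMDACreate3d(PETSC_COMM_WORLD,
                         DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                         DMDA_STENCIL_BOX,
                         64, 64, 64,                         /* global grid size M,N,P */
                         PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, /* procs per dim */
                         1,                                  /* dof per grid point */
                         2,                                  /* stencil width */
                         NULL, NULL, NULL, &da));
  PetscCall(DMSetFromOptions(da));
  PetscCall(DMSetUp(da));

  /* ... create vectors with DMCreateGlobalVector/DMCreateLocalVector,
     scatter with DMGlobalToLocal, and compute on the local ghosted array ... */

  PetscCall(DMDestroy(&da));
  PetscCall(PetscFinalize());
  return 0;
}
```

After DMGlobalToLocal, the local array then carries the two ghost layers needed for the +/- 2 stencil on each process.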
Since the boundary values are imposed on the outer boundary of the entire model domain, I suspect the answer to your question is that I would want one globally structured block (if I am understanding it correctly).
Thanks, Randy