[petsc-users] Structured multi-block topology
Mark Lohry
mlohry at gmail.com
Fri Apr 25 18:02:15 CDT 2014
On Fri, Apr 25, 2014 at 6:03 PM, Matthew Knepley <knepley at gmail.com> wrote:
> On Fri, Apr 25, 2014 at 2:47 PM, Mark Lohry <mlohry at gmail.com> wrote:
>>
>> Most common use case would be from 1 to 15 blocks or so, although I
>> have occasionally seen need for 100+. This also brings to mind a
>> question I had about more general connectivity in single-block domains
>> like a C-mesh, where the block has halo dependencies on itself that are
>> not simply "periodic."
>
>
> Jed and I have discussed this. DMComposite is really not great for even
> moderate numbers of blocks because it serializes. Also, the real performance
> advantage of structured blocks is saturated by a small number of structured
> cells (32 or so), and thus large structured pieces do not make a lot of
> sense
> (this is the same intuition behind the performance of spectral elements).
Huh? Perhaps this is the case in finite elements, but in FV you see
great performance (and accuracy vs nDOFs) benefits from relatively
large structured blocks in conjunction with FAS multigrid. As a visual
aid, I'm thinking of meshes like this one, which have on the order of
64^3 - 128^3 cells:
http://spitfire.princeton.edu/mesh3.png
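(For concreteness, the kind of multi-block setup I'd been hoping to express
through DMComposite is roughly the untested sketch below, one DMDA per block.
The block count, sizes, dof, and stencil width are placeholders, and the
boundary-type enum names have shifted between PETSc versions, so please read
it as a sketch rather than working code.)

  #include <petscdmda.h>
  #include <petscdmcomposite.h>

  /* Sketch only: pack a handful of independent structured blocks (DMDA)
     into a DMComposite. Sizes, dof, and stencil width are placeholders. */
  PetscErrorCode BuildMultiBlock(MPI_Comm comm, PetscInt nblocks, DM *pack)
  {
    PetscErrorCode ierr;
    PetscInt       b;

    PetscFunctionBegin;
    ierr = DMCompositeCreate(comm, pack);CHKERRQ(ierr);
    for (b = 0; b < nblocks; b++) {
      DM da;
      /* one 64^3 block, 5 dof per cell (FV state), box stencil of width 2 */
      ierr = DMDACreate3d(comm, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                          DMDA_STENCIL_BOX, 64, 64, 64,
                          PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                          5, 2, NULL, NULL, NULL, &da);CHKERRQ(ierr);
      ierr = DMSetUp(da);CHKERRQ(ierr);
      ierr = DMCompositeAddDM(*pack, da);CHKERRQ(ierr);
      ierr = DMDestroy(&da);CHKERRQ(ierr); /* the composite keeps its own reference */
    }
    PetscFunctionReturn(0);
  }

DMCreateGlobalVector() on the composite then gives the packed state vector;
the piece I don't see how to get is the halo exchange between blocks, which
is what the original question was about.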
> Our
> recommendation is to have a coarse unstructured mesh with regular refinement
> inside each coarse cell. This is something we could really support well. It's
> not all there yet, but we have all the pieces.
>
Would you still be able to exchange the fine-mesh data across block
boundaries? That would be essential.
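Just so we're talking about the same thing: by "exchange the fine-mesh data"
I mean the usual ghost/halo update, which inside a single DMDA block is just
a global-to-local scatter, roughly as in the untested sketch below (the DM and
Vec names are placeholders). The question is whether the coarse-unstructured
mesh with regular refinement gives the equivalent update across block
boundaries.

  #include <petscdmda.h>

  /* Sketch: the per-block halo update I rely on today with one DMDA.
     "da" is a block's DM, "U" its global state vector (placeholders). */
  static PetscErrorCode HaloUpdate(DM da, Vec U, Vec *Uloc)
  {
    PetscErrorCode ierr;
    PetscFunctionBegin;
    ierr = DMGetLocalVector(da, Uloc);CHKERRQ(ierr);
    ierr = DMGlobalToLocalBegin(da, U, INSERT_VALUES, *Uloc);CHKERRQ(ierr);
    ierr = DMGlobalToLocalEnd(da, U, INSERT_VALUES, *Uloc);CHKERRQ(ierr);
    /* compute the FV residual from *Uloc (owned cells + ghost layer),
       then hand it back with DMRestoreLocalVector(da, Uloc) */
    PetscFunctionReturn(0);
  }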
>>
>> As for the difficult task of setting up the communication, is that
>> even possible to do manually without treating everything as fully
>> unstructured? What other packages are out there for multi-block
>> structured domains?
>
>
> There is Overture.
>
> Matt
>
Fair enough, thanks. For the time being it looks like I'll have to
proceed by managing all the parallelization/message passing internally
and manually filling in PETSc vectors for the various solvers.
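Concretely (in case it helps anyone searching the archives later), the plan
is to keep block connectivity and halo exchange in my own code and just wrap
each rank's contiguous solution array in a Vec before handing it to the
KSP/SNES solvers, something like the untested sketch below, where the names
and sizes are placeholders:

  #include <petscvec.h>

  /* Sketch: expose an application-owned contiguous array of local dofs to
     PETSc without copying. "nlocal" is this rank's dof count, "u" my own
     storage; the array must outlive the Vec, and all ghost exchange stays
     in my own code. */
  static PetscErrorCode WrapLocalState(MPI_Comm comm, PetscInt nlocal,
                                       PetscScalar *u, Vec *U)
  {
    PetscErrorCode ierr;
    PetscFunctionBegin;
    ierr = VecCreateMPIWithArray(comm, 1, nlocal, PETSC_DECIDE, u, U);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }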