[petsc-users] multiblock example

Jeremy Kozdon jkozdon at stanford.edu
Thu May 3 13:05:25 CDT 2012


Thanks for the response Matt. I am clearly a little slow to respond as I
have been working on other things.

Right now there is nothing in particular that isn't working for us, since
we haven't really started things yet.

The interaction between our blocks can be complicated, but we don't use
ghost cells at all and the interaction is done weakly through interfaces,
so I would think this will simplify things (we use summation-by-parts
finite difference methods with the simultaneous approximation term method,
if that happens to mean anything to you). We haven't really thought through
our Jacobians yet, but the nonlinearities in our problems do arise through
the coupling terms.
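
For concreteness, the weak interface coupling I mean is the standard
SBP-SAT form; for the model advection equation u_t + a u_x = 0 on two
abutting blocks it looks roughly like this (generic textbook form, not our
actual equations, with the penalty coefficients left unspecified):

  \frac{d\mathbf{u}}{dt} = -a\,D_L\mathbf{u} + \sigma_L H_L^{-1}\mathbf{e}_N\,(u_N - v_1)
  \frac{d\mathbf{v}}{dt} = -a\,D_R\mathbf{v} + \sigma_R H_R^{-1}\mathbf{e}_1\,(v_1 - u_N)

Here D_{L,R} are the SBP difference operators, H_{L,R} the associated norm
matrices, u_N and v_1 the solution values on the two sides of the shared
interface, and the penalty coefficients \sigma_{L,R} are chosen to give a
discrete energy estimate. The interface mismatch u_N - v_1 is penalized
weakly rather than exchanged through ghost cells, and it is these penalty
terms that carry the (possibly nonlinear) coupling into the Jacobian.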

On Sun, Apr 15, 2012 at 2:54 PM, Matthew Knepley <knepley at gmail.com> wrote:

> On Sun, Apr 15, 2012 at 4:24 PM, Jeremy Kozdon <jkozdon at stanford.edu> wrote:
>
>> Hello All,
>>
>> We are considering using PETSc for a new code we are developing which
>> involves multiblock grids and finite difference methods. I am still pretty
>> new to PETSc (still reading the manual), so I am wondering where a good
>> place to look for examples of this would be (since I assume we are not the
>> first to consider this). I am not sure whether there is a good place in
>> the PETSc source for this, one of the library's examples perhaps, or
>> another open-source package.
>>
>> One of the things we want to be able to do is assign different numbers of
>> processors to different blocks based on the workload we estimate for each
>> block. Currently we do this by decomposing the global communicator into
>> block communicators, but if I understand correctly this is not the right
>> approach to take with PETSc, since we will need to use DMComposite, which
>> relies on collective operations.
>>
>
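
For reference, the communicator splitting I describe above looks roughly
like the following (a minimal hypothetical sketch, not our actual code; the
block ids and the half/half rank assignment are purely illustrative):

/* Split the global communicator (MPI_COMM_WORLD) into per-block
   communicators, assigning ranks to blocks by estimated workload. */
#include <mpi.h>

int main(int argc, char **argv)
{
  MPI_Comm block_comm;
  int      rank, size, block_id;

  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);
  MPI_Comm_size(MPI_COMM_WORLD, &size);

  /* Illustrative assignment: the first half of the ranks work on block 0,
     the rest on block 1. */
  block_id = (rank < size / 2) ? 0 : 1;
  MPI_Comm_split(MPI_COMM_WORLD, block_id, rank, &block_comm);

  /* ... per-block setup and solves would live on block_comm ... */

  MPI_Comm_free(&block_comm);
  MPI_Finalize();
  return 0;
}
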
> The Cartesian grid interface in PETSc is the DMDA class. You can prescribe
> whatever processor layout you want. The key question is how the blocks will
> interact. DMComposite is fine for putting blocks together to generate a
> combined residual, but the Jacobian interface is not as fleshed out; for
> example, it does not automatically allocate the off-diagonal blocks arising
> from block interactions. I don't think there are any problems with
> collectives in DMComposite. Do you have a specific thing that does not work
> for you?
>
>
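
Just to check my understanding of the route you suggest, here is a minimal
sketch of the kind of setup I think you mean (PETSc 3.x-era call names
assumed; the grid sizes and process layouts below are made up, and this
particular layout wants exactly 4 MPI ranks):

#include <petscdmda.h>
#include <petscdmcomposite.h>

int main(int argc, char **argv)
{
  DM  da0, da1, pack;
  Vec U;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Block 0: 64x64 grid with an explicit 2x2 process layout (m=2, n=2). */
  DMDACreate2d(PETSC_COMM_WORLD, DMDA_BOUNDARY_NONE, DMDA_BOUNDARY_NONE,
               DMDA_STENCIL_STAR, 64, 64, 2, 2, 1, 1, NULL, NULL, &da0);
  /* Block 1: 128x32 grid with a 4x1 process layout. */
  DMDACreate2d(PETSC_COMM_WORLD, DMDA_BOUNDARY_NONE, DMDA_BOUNDARY_NONE,
               DMDA_STENCIL_STAR, 128, 32, 4, 1, 1, 1, NULL, NULL, &da1);

  /* Pack both blocks; the composite global vector then holds the combined
     residual over all blocks. */
  DMCompositeCreate(PETSC_COMM_WORLD, &pack);
  DMCompositeAddDM(pack, da0);
  DMCompositeAddDM(pack, da1);
  DMCreateGlobalVector(pack, &U);

  /* ... residual/Jacobian routines would be attached to the composite DM,
     with the interface-coupling (off-diagonal) Jacobian blocks filled by
     hand, since as you note they are not allocated automatically ... */

  VecDestroy(&U);
  DMDestroy(&da0);
  DMDestroy(&da1);
  DMDestroy(&pack);
  PetscFinalize();
  return 0;
}
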
>> Currently everything we are doing is with structured Cartesian grids, but
>> in the future we will also need to be able to handle unstructured grids
>> that are coupled to the structured grids.
>>
>
> This can also be done with DMComposite, using DMComplex for the
> unstructured grids.
>
>    Matt
>
>
>> Thanks!
>> Jeremy
>>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>