[petsc-users] multiblock example

Matthew Knepley knepley at gmail.com
Sun Apr 15 16:54:34 CDT 2012


On Sun, Apr 15, 2012 at 4:24 PM, Jeremy Kozdon <jkozdon at stanford.edu> wrote:

> Hello All,
>
> We are considering using PETSc for a new code we are developing which
> involves multiblock grids and finite difference methods. I am still pretty
> new to PETSc (still reading the manual), so I am wondering where a good
> place to look for examples of this is (I assume we are not the first to
> consider it). I am not sure whether there is a good place to look in the
> PETSc source, in one of the library's examples perhaps, or in another
> open-source package.
>
> One of the things we want to be able to do is assign different numbers of
> processors to different blocks based on the workload we estimate for each
> block. Currently we do this by decomposing the global communicator into
> block communicators, but if I understand correctly this is not the right
> approach to take with PETSc, since we will need to use DMComposite, which
> relies on collective operations.
>

The Cartesian grid interface is the DMDA class in PETSc. You can prescribe
whatever processor layout you want. The key question is how the blocks will
interact. The DMComposite is fine for putting blocks together to generate a
combined residual, but the Jacobian interface is not as fleshed out; for
example, it does not automatically allocate the off-diagonal blocks arising
from block interactions. I don't think there are any problems with
collectives in DMComposite. Do you have a specific thing that does not work
for you?
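
For concreteness, here is a minimal sketch of packing two DMDA blocks with
different process layouts into one DMComposite. The grid sizes and the 4x2
layout are made-up values (that layout needs exactly 8 MPI ranks), and the
DMDA_BOUNDARY_* / DMDA_STENCIL_* names assume a PETSc release of this era:

  #include <petscdmda.h>
  #include <petscdmcomposite.h>

  int main(int argc, char **argv)
  {
    DM             da1, da2, pack;
    Vec            X;
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);

    /* Block 1: 128x128 grid with a prescribed 4x2 process grid */
    ierr = DMDACreate2d(PETSC_COMM_WORLD, DMDA_BOUNDARY_NONE, DMDA_BOUNDARY_NONE,
                        DMDA_STENCIL_STAR, 128, 128, 4, 2,
                        1, 1, NULL, NULL, &da1);CHKERRQ(ierr);

    /* Block 2: 64x64 grid; let PETSc choose the layout */
    ierr = DMDACreate2d(PETSC_COMM_WORLD, DMDA_BOUNDARY_NONE, DMDA_BOUNDARY_NONE,
                        DMDA_STENCIL_STAR, 64, 64, PETSC_DECIDE, PETSC_DECIDE,
                        1, 1, NULL, NULL, &da2);CHKERRQ(ierr);

    /* Both DMs live on the same communicator; the composite assembles one
       global vector spanning the two blocks */
    ierr = DMCompositeCreate(PETSC_COMM_WORLD, &pack);CHKERRQ(ierr);
    ierr = DMCompositeAddDM(pack, da1);CHKERRQ(ierr);
    ierr = DMCompositeAddDM(pack, da2);CHKERRQ(ierr);
    ierr = DMCreateGlobalVector(pack, &X);CHKERRQ(ierr);

    ierr = VecDestroy(&X);CHKERRQ(ierr);
    ierr = DMDestroy(&pack);CHKERRQ(ierr);
    ierr = DMDestroy(&da2);CHKERRQ(ierr);
    ierr = DMDestroy(&da1);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return 0;
  }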


> Currently everything we are doing is with structured Cartesian grids, but
> in the future we will also need to be able to handle unstructured grids
> that are coupled to the structured grids.
>

This can also be done with DMComposite, using DMComplex for the
unstructured grids.
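
A hypothetical sketch of that combination follows; the mesh construction for
the unstructured block is omitted, and both the petscdmcomplex.h header and
the DMCOMPLEX type name are assumptions about this era's API (the class was
later renamed DMPlex):

  #include <petscdmda.h>
  #include <petscdmcomposite.h>
  #include <petscdmcomplex.h>   /* assumed header name for this era */

  int main(int argc, char **argv)
  {
    DM             da, mesh, pack;
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);

    /* Structured block */
    ierr = DMDACreate2d(PETSC_COMM_WORLD, DMDA_BOUNDARY_NONE, DMDA_BOUNDARY_NONE,
                        DMDA_STENCIL_STAR, 64, 64, PETSC_DECIDE, PETSC_DECIDE,
                        1, 1, NULL, NULL, &da);CHKERRQ(ierr);

    /* Unstructured block: generic DM creation; reading and distributing the
       actual mesh is left out of this sketch */
    ierr = DMCreate(PETSC_COMM_WORLD, &mesh);CHKERRQ(ierr);
    ierr = DMSetType(mesh, DMCOMPLEX);CHKERRQ(ierr); /* assumed type name */

    /* Couple the two through one DMComposite on the shared communicator */
    ierr = DMCompositeCreate(PETSC_COMM_WORLD, &pack);CHKERRQ(ierr);
    ierr = DMCompositeAddDM(pack, da);CHKERRQ(ierr);
    ierr = DMCompositeAddDM(pack, mesh);CHKERRQ(ierr);

    ierr = DMDestroy(&pack);CHKERRQ(ierr);
    ierr = DMDestroy(&mesh);CHKERRQ(ierr);
    ierr = DMDestroy(&da);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return 0;
  }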

   Matt


> Thanks!
> Jeremy
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener