[petsc-users] Block Number of Grid; Domain Decomposition

Matthew Knepley knepley at gmail.com
Sun May 5 10:55:02 CDT 2019


On Sun, May 5, 2019 at 9:59 AM tang hongwei via petsc-users <petsc-users at mcs.anl.gov> wrote:

> Dear Developers,
>       Thank you for your work on PETSc. I am using PETSc (version 3.1) for
> CFD, but I am confused by a few things and hope you can help me:
>
> 1. Can a grid have more than one block? I use Pointwise (
> http://www.pointwise.com) to generate grids, and in some cases the grid
> has more than one block.
>

Are you talking about DMDA? It can only handle purely Cartesian meshes.

We now have:

  - DMStag: Cartesian meshes with staggered discretizations
  - DMComposite: Using multiple DMs at a time (this might be what you want
for multiblock; see the sketch after this list)
  - DMForest: Using p4est for structured, adaptive grids (this can also
handle multiblock)
  - DMPlex: Arbitrary meshes (this can also handle multiblock)
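
Since DMComposite may be the relevant one here, a minimal sketch of the
multiblock pattern follows. It is written against a recent PETSc API
(DMDACreate2d / DMSetUp / DMCompositeAddDM); in the 3.1 release mentioned
above the structured-grid object is still called DA, so the names differ,
the block sizes are arbitrary placeholders, and error checking is omitted.
Note that DMComposite only packs the blocks into one global vector;
coupling across block interfaces is still up to the application code.

/* Sketch only: two Cartesian blocks packed into a DMComposite.
   Recent PETSc API; block sizes are placeholders. */
#include <petscdmda.h>
#include <petscdmcomposite.h>

int main(int argc, char **argv)
{
  DM  da0, da1, pack;
  Vec X;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* One DMDA per grid block */
  DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
               DMDA_STENCIL_STAR, 64, 32, PETSC_DECIDE, PETSC_DECIDE,
               1, 1, NULL, NULL, &da0);
  DMSetUp(da0);
  DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
               DMDA_STENCIL_STAR, 32, 32, PETSC_DECIDE, PETSC_DECIDE,
               1, 1, NULL, NULL, &da1);
  DMSetUp(da1);

  /* Pack the blocks so one global vector spans both of them */
  DMCompositeCreate(PETSC_COMM_WORLD, &pack);
  DMCompositeAddDM(pack, da0);
  DMCompositeAddDM(pack, da1);
  DMCreateGlobalVector(pack, &X);

  VecDestroy(&X);
  DMDestroy(&da0);
  DMDestroy(&da1);
  DMDestroy(&pack);
  PetscFinalize();
  return 0;
}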


> 2. How does PETSc decompose the domain?
>

Into blocks.


> In the user manual, I notice that PETSc can decompose the domain
> automatically. However, I don't understand how PETSc assigns processes
> to each sub-domain (how do xs, ys, zs, xm, ym, zm get their values?).
>

It divides each direction into pieces, and then the partitions are the
tensor products of those pieces.
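
To see this concretely, here is a minimal sketch in which each rank prints
the piece it was given. It uses a recent PETSc API (DMDACreate3d,
DMDAGetCorners); in 3.1 the calls are the older DA versions, and the 64^3
grid is just a placeholder. With PETSC_DECIDE for m, n, p, PETSc chooses
how many pieces each direction gets, and xs, ys, zs / xm, ym, zm are the
starting indices and widths of this rank's piece of that tensor-product
decomposition.

/* Sketch only: query the sub-domain a DMDA assigned to this rank. */
#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM          da;
  PetscInt    xs, ys, zs, xm, ym, zm, m, n, p;
  PetscMPIInt rank;

  PetscInitialize(&argc, &argv, NULL, NULL);
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

  /* 64x64x64 global grid; PETSC_DECIDE lets PETSc pick the m x n x p
     process grid whose tensor product defines the sub-domains */
  DMDACreate3d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
               DM_BOUNDARY_NONE, DMDA_STENCIL_STAR, 64, 64, 64,
               PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
               1, 1, NULL, NULL, NULL, &da);
  DMSetUp(da);

  /* How many pieces each direction was divided into */
  DMDAGetInfo(da, NULL, NULL, NULL, NULL, &m, &n, &p,
              NULL, NULL, NULL, NULL, NULL, NULL);
  /* This rank's starting indices (xs,ys,zs) and widths (xm,ym,zm) */
  DMDAGetCorners(da, &xs, &ys, &zs, &xm, &ym, &zm);

  PetscSynchronizedPrintf(PETSC_COMM_WORLD,
    "[%d] process grid %dx%dx%d, owns x:[%d,%d) y:[%d,%d) z:[%d,%d)\n",
    rank, (int)m, (int)n, (int)p,
    (int)xs, (int)(xs + xm), (int)ys, (int)(ys + ym),
    (int)zs, (int)(zs + zm));
  PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);

  DMDestroy(&da);
  PetscFinalize();
  return 0;
}

DMView(da, PETSC_VIEWER_STDOUT_WORLD) prints the same ownership
information in one go, if that is more convenient.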

  Thanks,

    Matt


> Best Regards,
> Hongwei
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/