[petsc-users] implementation of multi-level grid in petsc

Mohammad Mirzadeh mirzadeh at gmail.com
Thu Aug 8 15:44:25 CDT 2013


How big an application are you looking at? If you are thinking in the
range of a couple of tens of millions of grid points on a couple of
hundred processors, then I'd say the simplest approach is to create the
grid in serial and then use PETSc's interface to ParMetis to handle the
partitioning. I did this with my quadtree code and could easily scale
quadtrees on the order of 16.5M grid points at about 75% parallel
efficiency on 256 processors for a Poisson equation test.
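
In case it helps, the ParMetis step through PETSc's MatPartitioning
interface looks roughly like the sketch below. This is just a sketch, not
code from my actual application; the function name and the assumption that
you already have the grid's adjacency graph in CSR form (ia/ja) are mine.

    #include <petscmat.h>

    /* Partition a serially built grid with ParMetis via PETSc.
       ia/ja: CSR adjacency graph of the locally owned grid vertices.
       Note: the MATMPIADJ matrix takes ownership of ia/ja and frees them. */
    PetscErrorCode PartitionGrid(MPI_Comm comm, PetscInt nlocal, PetscInt nglobal,
                                 PetscInt *ia, PetscInt *ja, IS *partition)
    {
      Mat             adj;
      MatPartitioning part;
      PetscErrorCode  ierr;

      PetscFunctionBegin;
      ierr = MatCreateMPIAdj(comm, nlocal, nglobal, ia, ja, NULL, &adj);CHKERRQ(ierr);
      ierr = MatPartitioningCreate(comm, &part);CHKERRQ(ierr);
      ierr = MatPartitioningSetAdjacency(part, adj);CHKERRQ(ierr);
      ierr = MatPartitioningSetType(part, MATPARTITIONINGPARMETIS);CHKERRQ(ierr);
      /* partition[i] = the rank that should own local vertex i */
      ierr = MatPartitioningApply(part, partition);CHKERRQ(ierr);
      ierr = MatPartitioningDestroy(&part);CHKERRQ(ierr);
      ierr = MatDestroy(&adj);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }

After MatPartitioningApply() you still have to migrate the grid data
yourself, e.g. by turning the result into a new global numbering with
ISPartitioningToNumbering() and scattering accordingly.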

If you are thinking of a much larger problem (a couple of hundred million
grid points on the order of several thousand processors), I would
recommend p4est if you want to do tree-based grids. In that case the
deal.II interface will be really beneficial, since p4est alone is a very
bare-bones package. I do not have enough experience with block-structured
AMR, so I cannot comment on that.
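
If you do try bare p4est, the core of it is quite small; a minimal forest
setup looks something like the sketch below (my sketch, with a placeholder
refinement criterion, assuming p4est was built with MPI):

    #include <mpi.h>
    #include <p4est.h>

    /* Placeholder criterion: refine every quadrant below level 4. */
    static int refine_fn(p4est_t *p4est, p4est_topidx_t which_tree,
                         p4est_quadrant_t *quadrant)
    {
      return quadrant->level < 4;
    }

    int main(int argc, char **argv)
    {
      MPI_Init(&argc, &argv);
      p4est_connectivity_t *conn   = p4est_connectivity_new_unitsquare();
      p4est_t              *forest = p4est_new(MPI_COMM_WORLD, conn, 0, NULL, NULL);
      p4est_refine(forest, 1 /* recursive */, refine_fn, NULL);
      p4est_destroy(forest);
      p4est_connectivity_destroy(conn);
      MPI_Finalize();
      return 0;
    }

Everything on top of that (degrees of freedom, interpolation between
levels, solvers) is up to you, which is exactly where deal.II saves you a
lot of work.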

On Thu, Aug 8, 2013 at 1:28 PM, Mark F. Adams <mfadams at lbl.gov> wrote:
>
> On Aug 8, 2013, at 3:32 PM, Roc Wang <pengxwang at hotmail.com> wrote:
>
> Thanks, Matt,
>
>     I tried Chombo for implementing AMR but have not tried SAMRAI yet.
> Chombo can do AMR, but its data structures seem quite complicated to
> customize. What I want to do with PETSc is to compose a simple
> "home-made" blocked multi-level grid, even though it would not be
> automatically adaptive. However, I don't have much experience with PETSc.
> As of now, I plan to use DM to manage the data for the big domain and all
> the small sub-domains. I am not sure whether this is a good idea, so any
> suggestions are appreciated very much. Thanks again.
>
>
> As Matt said, this is most likely not what you want to do. Building AMR
> on DM/DA is a lot of work unless you have a simple application and a
> clear idea of how to do it. Chombo is flexible, but it is complex and
> takes time to get started. I'm not familiar with SAMRAI, but I would
> guess it is like Chombo. deal.II might be worth looking into, though I'm
> not familiar with it either.
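>
> For calibration: what DM/DA gives you out of the box is uniform
> refinement of a single structured grid, not patch-based AMR. A minimal
> sketch against the 3.4-era API (the grid sizes here are arbitrary):
>
>     #include <petscdmda.h>
>
>     DM             da, daf;
>     PetscErrorCode ierr;
>     /* 65x65 coarse grid, 1 dof per node, stencil width 1 */
>     ierr = DMDACreate2d(PETSC_COMM_WORLD, DMDA_BOUNDARY_NONE,
>                         DMDA_BOUNDARY_NONE, DMDA_STENCIL_STAR, 65, 65,
>                         PETSC_DECIDE, PETSC_DECIDE, 1, 1, NULL, NULL,
>                         &da);CHKERRQ(ierr);
>     ierr = DMRefine(da, PETSC_COMM_WORLD, &daf);CHKERRQ(ierr); /* 129x129 */
>
> All the actual AMR machinery (patch placement, coarse-fine interpolation,
> flux correction) would sit on top of that, which is why it is a lot of
> work.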
>
> Best,
>
>
>
>
> ________________________________
> Date: Thu, 8 Aug 2013 14:03:53 -0500
> Subject: Re: [petsc-users] implementation of multi-level grid in petsc
> From: knepley at gmail.com
> To: pengxwang at hotmail.com
> CC: petsc-users at mcs.anl.gov
>
> On Thu, Aug 8, 2013 at 1:29 PM, Roc Wang <pengxwang at hotmail.com> wrote:
>
> Hi,
>
>     I am working on a multi-level grid for the Poisson equation. I need
> to refine some sub-regions of the computational domain. To do this, I
> plan to build some boxes (patches) on top of the coarsest level. I am
> using DM to manage the data. I found there is a new function
> DMPatchCreate() in version 3.4. Is this the right function to use for the
> refined region? If it is not, which one should I use?
>
>
> That is experimental and does not work.
>
>
>     My proposed approach is to start with the code in
> dm/impls/patch/examples/tests/ex1.c and then follow
> dm/examples/tutorials/ex65dm.c. Is this approach the right way to reach
> my goal?
>
>     In addition, I need to use not only the nodes but also the cells
> containing the nodes. Should I use DMMesh to create the cells? I noticed
> DMMesh is mainly for unstructured grids, but I didn't find another class
> that implements structured cells. Can anybody give me some suggestions on
> multi-level grids, or let me know which examples I should start with?
> Thanks.
>
>
> No, that is not appropriate.
>
> It sounds like you want structured AMR. PETSc does not do this, but there
> are packages that do:
>
> a) Chombo
>
> b) SAMRAI
>
> which are both patch-based AMR. If you want octree-style AMR, you could
> use p4est, but that would mean a lot of coding along the lines of
> http://arxiv.org/abs/1308.1472, or you could use deal.II, which is a
> complete package. I think deal.II is the closest to using PETSc solvers.
>
>   Thanks,
>
>      Matt
>
> --
> What most experimenters take for granted before they begin their experiments
> is infinitely more interesting than any results to which their experiments
> lead.
> -- Norbert Wiener
>
>

