[petsc-dev] [petsc-users] Writing a domain decomposition code with PETSc

Matthew Knepley knepley at gmail.com
Fri Oct 4 11:25:12 CDT 2013


On Fri, Oct 4, 2013 at 9:55 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
>    Åsmund,
>
>     You can use the DMDA to manage the layout of your velocity variables
> as well as the pressure variables. You will have two DMDAs: one that manages
> the cell-centered pressure variables (created with a dof argument of 1) and
> one that handles the velocities on the cell "faces" (created with a dof
> argument of 3). Then you can have a ghosted representation of the
> velocities from which you compute the right-hand side for your
> pressure equation.
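
A minimal sketch of that setup in C (error checking omitted; spelled against
a recent PETSc, where older releases use the DMDA_BOUNDARY_* names and do not
require DMSetUp(); M, N, P are placeholder grid sizes):

#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM       dm_p, dm_v;
  PetscInt M = 64, N = 64, P = 64;   /* placeholder global cell counts */

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* dof = 1: one cell-centered pressure unknown per (i,j,k) */
  DMDACreate3d(PETSC_COMM_WORLD,
               DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
               DMDA_STENCIL_STAR, M, N, P,
               PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
               1 /* dof */, 1 /* stencil width */, NULL, NULL, NULL, &dm_p);
  DMSetUp(dm_p);

  /* dof = 3: u, v, w stacked at the same (i,j,k), even though they
     physically live on the three sets of cell faces */
  DMDACreate3d(PETSC_COMM_WORLD,
               DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
               DMDA_STENCIL_BOX, M, N, P,
               PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
               3 /* dof */, 1, NULL, NULL, NULL, &dm_v);
  DMSetUp(dm_v);

  DMDestroy(&dm_v);
  DMDestroy(&dm_p);
  PetscFinalize();
  return 0;
}

Since both DMDAs are created on the same communicator with the same global
sizes and PETSC_DECIDE for the processor counts, they end up with matching
decompositions, which is what lets the two fields line up rank by rank.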

>     What kind of boundary conditions do you have for the velocities? This
> will determine exactly how to create the DMDA for the velocities.
>
>     Note that though the x, y, and z velocities are physically associated
> with the three sets of faces of the cells, and thus not collocated on the
> physical domain, you can stack the three of them up at the same i,j,k mesh
> point of the DMDA vector.  Depending on your boundary conditions there may
> be fewer pressure variables than velocity variables in each direction of
> the grid; to make the two different DMDAs "line up" you can just have an
> extra "slab" of pressure variables in each direction that is never computed
> on. It's easy to draw a picture of the staggered grid in 2D to see what I mean.
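
Continuing the sketch above, computing the Poisson right-hand side from a
ghosted velocity representation might look like the following (the
face-storage convention in the comments is one assumed choice among several,
and the boundary handling that the extra "slab" addresses is omitted):

Vec           v_glob, v_loc;
PetscScalar ****v;                /* v[k][j][i][c], c = 0:u, 1:v, 2:w */
PetscInt      i, j, k, xs, ys, zs, xm, ym, zm;
PetscReal     hx = 1.0/M, hy = 1.0/N, hz = 1.0/P;
PetscScalar   div;

DMCreateGlobalVector(dm_v, &v_glob);
DMGetLocalVector(dm_v, &v_loc);
/* scatter global -> local to fill the ghost points */
DMGlobalToLocalBegin(dm_v, v_glob, INSERT_VALUES, v_loc);
DMGlobalToLocalEnd(dm_v, v_glob, INSERT_VALUES, v_loc);

DMDAVecGetArrayDOF(dm_v, v_loc, &v);
DMDAGetCorners(dm_v, &xs, &ys, &zs, &xm, &ym, &zm);
for (k = zs; k < zs + zm; k++)
  for (j = ys; j < ys + ym; j++)
    for (i = xs; i < xs + xm; i++) {
      /* assumed convention: component c at (i,j,k) sits on the "lower"
         face in direction c; at the upper physical boundary the i+1 /
         j+1 / k+1 accesses need the extra slab (or DM_BOUNDARY_GHOSTED)
         that Barry describes -- not shown here */
      div = (v[k][j][i+1][0] - v[k][j][i][0]) / hx
          + (v[k][j+1][i][1] - v[k][j][i][1]) / hy
          + (v[k+1][j][i][2] - v[k][j][i][2]) / hz;
      /* ... store div into the RHS vector of the pressure DMDA ... */
    }
DMDAVecRestoreArrayDOF(dm_v, v_loc, &v);
DMRestoreLocalVector(dm_v, &v_loc);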


Note that you no longer have to do this, since you can use PetscSection and
DMCreateSubDM(), but you may not want
to tell people to do this yet.
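
For reference, the newer route looks roughly like this (a sketch only: it
assumes a DM whose PetscSection defines two fields with pressure as field 1,
and the setup of that section is omitted):

DM       dm;           /* a DM with two fields defined via a PetscSection */
DM       dm_pressure;
IS       is_pressure;
PetscInt field = 1;    /* assumed numbering: 0 = velocity, 1 = pressure */

DMCreateSubDM(dm, 1, &field, &is_pressure, &dm_pressure);
/* is_pressure maps the pressure dofs into the full vector, and
   dm_pressure can be handed to its own solver */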

   Matt


>
>    Barry
>
> On Oct 4, 2013, at 8:35 AM, Åsmund Ervik <asmund.ervik at ntnu.no> wrote:
>
> > Dear all,
> >
> > We have a two-phase incompressible Navier-Stokes solver written in
> > Fortran where we use PETSc for solving the pressure Poisson equation.
> > Since both PETSc and parallelism were afterthoughts in this code, it
> > doesn't scale well at all, so I am now tasked with rewriting the whole
> > thing. Before I commit any fresh mistakes in the design of this new
> > code, I would like to ask for input on my "design decisions" so far.
> >
> > I want to do domain decomposition on a structured 3D grid. I've been
> > trying to wrap my head around the DM and DMDA parts of PETSc, and as far
> > as I understand, these will help me solve the pressure Poisson equation
> > on a decomposed domain (and with geometric multigrid via Galerkin)
> > fairly easily.
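
For what it's worth, one way to wire up that Poisson solve so that geometric
multigrid with Galerkin coarse operators is reachable from the command line
is sketched below (ComputeMatrix and ComputeRHS are hypothetical application
callbacks, user is an application context, and the exact spelling of the
Galerkin option varies between PETSc versions):

KSP ksp;
KSPCreate(PETSC_COMM_WORLD, &ksp);
KSPSetDM(ksp, dm_p);                            /* the pressure DMDA */
KSPSetComputeOperators(ksp, ComputeMatrix, &user);
KSPSetComputeRHS(ksp, ComputeRHS, &user);
KSPSetFromOptions(ksp);  /* e.g. -pc_type mg -pc_mg_levels 4 -pc_mg_galerkin */
KSPSolve(ksp, NULL, NULL);  /* with a DM attached, vectors come from the DM */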
> >
> > The tricky part, then: it seems that I must handle "the rest" of the
> > domain decomposition myself. Omitting some detail, this means my code
> > will:
> >
> > * set up parameters, initial conditions, etc.
> > * decompose my array for the velocity field into several parts,
> > * time loop (see the sketch after this list):
> >       * communicate e.g. the velocity field on the boundaries
> >       * each MPI worker will calculate, on its local domain, the
> >         intermediate velocity field and the RHS of the Poisson equation,
> >         and set up the correct sparse matrix
> >       * PETSc will solve the Poisson equation to give me the pressure
> >       * each MPI worker will then calculate the updated
> >         divergence-free velocity field
> >       * each MPI worker will calculate the time step (CFL condition),
> >         and we choose the lowest dt across all processes
> > * end time loop
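
Schematically, one pass of that loop maps onto PETSc/MPI calls like this
(ComputeIntermediateVelocity, BuildPoissonRHS, CorrectVelocity and
ComputeLocalCFL are hypothetical application routines; ksp, A, rhs and
pressure are assumed to be set up already, and dm_v, dm_p, v_glob, v_loc
are as in the sketches above):

PetscReal t = 0.0, t_end = 1.0, dt, dt_local;

while (t < t_end) {
  /* ghost exchange for the velocity field */
  DMGlobalToLocalBegin(dm_v, v_glob, INSERT_VALUES, v_loc);
  DMGlobalToLocalEnd(dm_v, v_glob, INSERT_VALUES, v_loc);

  ComputeIntermediateVelocity(dm_v, v_loc);      /* purely local work */
  BuildPoissonRHS(dm_v, dm_p, v_loc, rhs);       /* purely local work */

  /* the matrix is rebuilt every step (older PETSc versions take a
     fourth MatStructure argument here) */
  KSPSetOperators(ksp, A, A);
  KSPSolve(ksp, rhs, pressure);

  CorrectVelocity(dm_p, dm_v, pressure, v_glob); /* purely local work */

  /* CFL: each rank computes its local dt, then take the global minimum */
  dt_local = ComputeLocalCFL(dm_v, v_loc);
  MPI_Allreduce(&dt_local, &dt, 1, MPIU_REAL, MPI_MIN, PETSC_COMM_WORLD);
  t += dt;
}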
> >
> > Have I misunderstood anything here? At first I thought the DMDA would
> > give me the framework for decomposing the velocity field, handling
> > communication of the ghost values at the boundaries, etc., but it seems
> > this is not the case?
> >
> > One further question: is it a good idea to set up the DMDA letting PETSc
> > decide the number of processors in each direction, and then using this
> > same partition for the rest of my code?
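
One common pattern is exactly that: pass PETSC_DECIDE for the processor
counts (as in the sketch above) and then query the resulting partition for
the rest of the code, e.g.:

PetscInt m, n, p;                   /* processors in each direction */
PetscInt xs, ys, zs, xm, ym, zm;    /* this rank's corner and widths */

DMDAGetInfo(dm_v, NULL, NULL, NULL, NULL, &m, &n, &p,
            NULL, NULL, NULL, NULL, NULL, NULL);
DMDAGetCorners(dm_v, &xs, &ys, &zs, &xm, &ym, &zm);
/* this rank owns cells [xs,xs+xm) x [ys,ys+ym) x [zs,zs+zm) */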
> >
> > If there are any unclear details, please ask. If it matters, I am using
> > the level-set and ghost-fluid methods, so the matrix for my Poisson
> > equation must be recomputed each time step. I believe this is the same
> > situation as Michele Rosso, who posted on this list recently.
> >
> > Best regards,
> > Åsmund Ervik
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener