[petsc-users] Writing a domain decomposition code with PETSc

Matthew Knepley knepley at gmail.com
Fri Oct 4 14:22:21 CDT 2013


On Fri, Oct 4, 2013 at 2:10 PM, Åsmund Ervik <asmund.ervik at ntnu.no> wrote:

>  Is there any reason I should prefer Xdmf over CGNS? I think both use
> HDF5 in the background.
>

I guess not. It was just easier for me to get Xdmf going. We have trial
CGNS support as well.

  Matt


> Apologies by the way for my phone's inability to reply inline properly.
>
>  Åsmund
>
>
>  Sent from my VT-102
>
> Matthew Knepley <knepley at gmail.com> skrev:
>  On Fri, Oct 4, 2013 at 1:51 PM, Åsmund Ervik <asmund.ervik at ntnu.no>wrote:
>
>>
>> Matthew Knepley <knepley at gmail.com> skrev:
>> On Fri, Oct 4, 2013 at 12:57 PM, Åsmund Ervik <asmund.ervik at ntnu.no>wrote:
>>
>>>  Barry,
>>>
>>>  Thanks for the quick answer.
>>>
>>>  Good to hear that I can use the DMDA framework for all variables.
>>> Should I put all scalars (e.g. pressure, level set function, etc) in the
>>> same DA, or should I keep a distinct one for the pressure (where I want to
>>> use multigrid)?
>>>
>>
>>  Separate variables which are solved for.
>>
>>  Ok. So only variables that belong together, such as the velocity
>> components, should be grouped via the dof?
>>
>
>  It's just the solved-for / not-solved-for distinction that is important.
>
>
>>     The reason I was unsure is that I can't seem to find an example
>>> which manipulates the local array from a DA. I would've guessed there was
>>> something like
>>>
>>>  real, dimension(:,:,:) :: u, v, w
>>> call DMDAGetLocalArray(da,u,v,w)
>>> ! Some computations looping over local i,j,k that manipulate u,v,w
>>> call DMDARestoreLocalArray(da,u,v,w)
>>>
>>
>>
>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMDAVecGetArray.html
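>>
>> In Fortran the corresponding routine is DMDAVecGetArrayF90(). A minimal
>> sketch of the call sequence (the names da, ploc, p and the 16^3 grid are
>> arbitrary; this assumes a recent PETSc with the Fortran modules, and
>> older releases spell the boundary enum DMDA_BOUNDARY_* instead of
>> DM_BOUNDARY_*):
>>
>> program da_array_sketch
>> #include <petsc/finclude/petscdmda.h>
>>   use petscdmda
>>   implicit none
>>   DM                   :: da
>>   Vec                  :: ploc
>>   PetscScalar, pointer :: p(:,:,:)
>>   PetscInt             :: xs, ys, zs, xm, ym, zm, i, j, k
>>   PetscErrorCode       :: ierr
>>
>>   call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
>>
>>   ! One cell-centred scalar (dof = 1) on a 16^3 grid, stencil width 1.
>>   call DMDACreate3d(PETSC_COMM_WORLD,                                    &
>>        DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC, &
>>        DMDA_STENCIL_STAR, 16, 16, 16,                                    &
>>        PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, 1, 1,                   &
>>        PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, PETSC_NULL_INTEGER,       &
>>        da, ierr)
>>   call DMSetUp(da, ierr)
>>
>>   ! A local (ghosted) vector; the Vec is only the storage format, the
>>   ! F90 getter hands back an ordinary array indexed by global i,j,k.
>>   call DMCreateLocalVector(da, ploc, ierr)
>>   call DMDAGetCorners(da, xs, ys, zs, xm, ym, zm, ierr)
>>   call DMDAVecGetArrayF90(da, ploc, p, ierr)
>>   do k = zs, zs+zm-1
>>      do j = ys, ys+ym-1
>>         do i = xs, xs+xm-1
>>            ! initialise the locally owned values
>>            p(i,j,k) = 0.0
>>         end do
>>      end do
>>   end do
>>   call DMDAVecRestoreArrayF90(da, ploc, p, ierr)
>>
>>   call VecDestroy(ploc, ierr)
>>   call DMDestroy(da, ierr)
>>   call PetscFinalize(ierr)
>> end program da_array_sketch
>>
>> For a DMDA with dof > 1 the returned array gets an extra leading
>> component dimension (0-based), so u, v, and w live in one array.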
>>
>>  Doh. Thanks. I've been looking at that, and kept focusing on the input
>> which must be a vector, but I see that's just the internal storage format.
>>
>>  Speaking of storage formats, I would like to have a unified format for
>> checkpointing and visualization output, so that after analyzing a result I
>> could restart from an arbitrary time step in the output. Is the VTK viewer
>> or the binary PETSc viewer the best way to go? I use Tecplot and VisIt
>> today.
>>
>
>  Right now, this does not exist. The binary format is great for
> checkpointing, but not for viz. In the future, I predict HDF5, since you
> can put Xdmf around it (could also do this for PETSc binary) for viz and
> it has nice Python tools. Jed probably has an opinion.
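>
>  For the checkpointing side, the binary viewer is just VecView() into a
> file and VecLoad() on restart. A rough Fortran sketch (the subroutine
> names and the filename argument are placeholders, not PETSc API):
>
> subroutine checkpoint_write(x, fname, ierr)
> #include <petsc/finclude/petscvec.h>
>   use petscvec
>   implicit none
>   Vec              :: x
>   character(len=*) :: fname
>   PetscErrorCode   :: ierr
>   PetscViewer      :: viewer
>
>   ! Dump the global vector in PETSc's native binary format.
>   call PetscViewerBinaryOpen(PETSC_COMM_WORLD, fname, FILE_MODE_WRITE, viewer, ierr)
>   call VecView(x, viewer, ierr)
>   call PetscViewerDestroy(viewer, ierr)
> end subroutine checkpoint_write
>
> subroutine checkpoint_read(x, fname, ierr)
> #include <petsc/finclude/petscvec.h>
>   use petscvec
>   implicit none
>   Vec              :: x
>   character(len=*) :: fname
>   PetscErrorCode   :: ierr
>   PetscViewer      :: viewer
>
>   ! x should come from DMCreateGlobalVector() on the same DMDA layout
>   ! that wrote the file.
>   call PetscViewerBinaryOpen(PETSC_COMM_WORLD, fname, FILE_MODE_READ, viewer, ierr)
>   call VecLoad(x, viewer, ierr)
>   call PetscViewerDestroy(viewer, ierr)
> end subroutine checkpoint_read
>
>  DMDA vectors go to the binary viewer in natural ordering, so a restart
> on a different number of processes also works.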
>
>     Matt
>
>
>>
>>
>>>  On the BCs for velocity, I would like to support several options. To
>>> get the code up and running I would be OK with just periodic, but I would
>>> eventually like to support full slip and no slip, and preferably a mix of
>>> these for the different faces. Perhaps also inflow and outflow. I don't
>>> need (physical) pressure BCs though.  Would this complicate things much?
>>>
>>
>>  Both periodic and ghost cells are supported. Imposing Dirichlet
>> conditions on an unknown is also easy.
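>>
>>  The choice is made per direction in the DMDACreate3d() call from the
>> sketch above, so you can mix them. For example (nx, ny, nz and da_vel are
>> placeholders; the enum is spelled DMDA_BOUNDARY_* in petsc-3.4 and
>> DM_BOUNDARY_* in later releases):
>>
>>   ! Periodic in x and y; ghosted in z, i.e. an extra boundary layer you
>>   ! fill yourself for no-slip/free-slip walls or in/outflow.
>>   call DMDACreate3d(PETSC_COMM_WORLD,                                   &
>>        DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC, DM_BOUNDARY_GHOSTED, &
>>        DMDA_STENCIL_STAR, nx, ny, nz,                                   &
>>        PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, 3, 1,                  &
>>        PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, PETSC_NULL_INTEGER,      &
>>        da_vel, ierr)
>>   call DMSetUp(da_vel, ierr)
>>
>>  (Dirichlet values on an unknown can then be imposed by, e.g., setting
>> that row of the matrix to the identity and putting the value in the
>> right-hand side.)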
>>
>>  Good. I will come back to this if I have specific questions.
>>
>>  Åsmund
>>
>>
>>    Thanks,
>>
>>      Matt
>>
>>
>>>  I understand the point about the velocity i,j,k lining up; this is how
>>> we do it currently.
>>>
>>>  Åsmund
>>>
>>>  Sent from my VT-102
>>>
>>> Barry Smith <bsmith at mcs.anl.gov> skrev:
>>>
>>>    Asmund,
>>>
>>>     You can use the DMDA to manage the layout of your velocity variables
>>> as well as the pressure variables. You will have two DMDAs: one that manages
>>> the cell-centered pressure variables (created with a dof argument of 1) and
>>> one that handles the velocities on the "faces" (created with a dof argument
>>> of 3). Then you can have a ghosted representation of the velocities from
>>> which you compute the right-hand side for your pressure equation.
>>>
>>>     What kind of boundary conditions do you have for the velocities?
>>> This will determine exactly how to create the DMDA for the velocities.
>>>
>>>     Note that though the x, y, and z velocities are physically associated
>>> with the three sets of faces of the cells, and thus are not collocated on
>>> the physical domain, you can stack the three of them up at the same i,j,k
>>> mesh point of the DMDA vector.  Depending on your boundary conditions there
>>> may be fewer pressure variables than velocity variables in each direction of
>>> the grid; to make the two different DMDAs "line up" you can just have an
>>> extra "slab" of pressure variables in each direction that is never computed
>>> on. It's easy to draw a picture in 2d of the staggered grid to see what I
>>> mean.
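>>>
>>>     In code that plan is roughly the following sketch (bx/by/bz, nx/ny/nz
>>> and the vector names are placeholders; spell the boundary enum
>>> DMDA_BOUNDARY_* or DM_BOUNDARY_* depending on your PETSc version):
>>>
>>>   ! Two DMDAs over the same nx x ny x nz index space and the same
>>>   ! boundary types: dof=1 for the cell-centered pressure, dof=3 for the
>>>   ! face velocities stacked at the same i,j,k.  With identical global
>>>   ! sizes they get the same parallel decomposition.
>>>   call DMDACreate3d(PETSC_COMM_WORLD, bx, by, bz, DMDA_STENCIL_STAR,  &
>>>        nx, ny, nz, PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, 1, 1,    &
>>>        PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, PETSC_NULL_INTEGER,    &
>>>        da_p, ierr)
>>>   call DMSetUp(da_p, ierr)
>>>   call DMDACreate3d(PETSC_COMM_WORLD, bx, by, bz, DMDA_STENCIL_STAR,  &
>>>        nx, ny, nz, PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, 3, 1,    &
>>>        PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, PETSC_NULL_INTEGER,    &
>>>        da_vel, ierr)
>>>   call DMSetUp(da_vel, ierr)
>>>
>>>   ! Ghosted representation of the velocities: refresh the local
>>>   ! (ghosted) vector from the global one before forming the Poisson
>>>   ! right-hand side.
>>>   call DMCreateGlobalVector(da_vel, vel_global, ierr)
>>>   call DMCreateLocalVector(da_vel, vel_local, ierr)
>>>   call DMGlobalToLocalBegin(da_vel, vel_global, INSERT_VALUES, vel_local, ierr)
>>>   call DMGlobalToLocalEnd(da_vel, vel_global, INSERT_VALUES, vel_local, ierr)
>>>
>>>     Using the same nx, ny, nz for both (with the unused extra "slab" of
>>> pressure variables) is what keeps the two decompositions lined up.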
>>>
>>>
>>>    Barry
>>>
>>> On Oct 4, 2013, at 8:35 AM, Åsmund Ervik <asmund.ervik at ntnu.no> wrote:
>>>
>>> > Dear all,
>>> >
>>> > We have a two-phase incompressible Navier-Stokes solver written in
>>> > Fortran where we use PETSc for solving the pressure Poisson equation.
>>> > Since both PETSc and parallelism was an afterthought to this code, it
>>> > doesn't scale well at all, so I am tasked with re-writing the whole
>>> > thing now. Before I commit any fresh mistakes in the design of this new
>>> > code, I will ask for input on my "design decisions" so far.
>>> >
>>> > I want to do domain decomposition on a structured 3D grid. I've been
>>> > trying to wrap my head around the DM and DMDA parts of PETSc, and as far
>>> > as I understand, these will help me solve the pressure Poisson equation
>>> > on a decomposed domain (and with geometric multigrid via Galerkin)
>>> > fairly easily.
>>> >
>>> > The tricky part, then: it seems that I must handle "the rest" of the
>>> > domain decomposition myself. Omitting some detail, this means my code
>>> > will:
>>> >
>>> > * set up parameters, initial conditions, etc.
>>> > * decompose my array for the velocity field into several parts,
>>> > * time loop:
>>> >        * communicate e.g. the velocity field on the boundaries
>>> >        * each MPI worker will calculate on the local domain the
>>> >          intermediate velocity field, the rhs to the Poisson equation,
>>> >          and set up the correct sparse matrix
>>> >        * PETSc will solve the Poisson equation to give me the pressure
>>> >        * each MPI worker will then calculate the updated
>>> >          divergence-free velocity field
>>> >        * each MPI worker will calculate the time step (CFL condition),
>>> >          and we choose the lowest dt among all MPI ranks (sketched just
>>> >          below the list)
>>> > * end time loop
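>>> >
>>> > (For the last step, the reduction I have in mind is just, with dt_local
>>> > holding each rank's own CFL limit:
>>> >
>>> >   call MPI_Allreduce(dt_local, dt, 1, MPI_DOUBLE_PRECISION, MPI_MIN, &
>>> >                      PETSC_COMM_WORLD, ierr)
>>> >
>>> > so every rank ends up with the same global dt.)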
>>> >
>>> > Have I misunderstood anything here? At first I thought the DMDA would
>>> > give me the framework for decomposing the velocity field, handling
>>> > communication of the ghost values at the boundaries etc, but it seems
>>> > this is not the case?
>>> >
>>> > One further question: is it a good idea to set up the DMDA letting PETSc
>>> > decide the number of processors in each direction, and then using this
>>> > same partition for the rest of my code?
>>> >
>>> > If there are any unclear details, please ask. If it matters, I am using
>>> > the level-set and ghost-fluid methods, so the matrix for my Poisson
>>> > equation must be recomputed each time step. I believe this is the same
>>> > situation as Michele Rosso who posted on this list recently.
>>> >
>>> > Best regards,
>>> > Åsmund Ervik
>>>
>>>
>>
>>
>>
>
>
>
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

