[petsc-users] discontinuous viscosity stokes equation 3D staggered grid

Bishesh Khanal bisheshkh at gmail.com
Wed Jul 17 09:31:24 CDT 2013


Dear all,
(I apologize for the long email! I think it's better to explain the
problem clearly, even if that makes it boring, than to be short and confusing.)

I need to solve the following set of equations in 3D on a domain of
around 250^3 grid points:
div(K grad(v)) - grad(p) = f1    (momentum eqn)
div(v) = f2                      (continuity eqn)

Here v is a vector with 3 components and p is a scalar.
K is piecewise discontinuous; say it has two values K1 and K2, with the
ratio K1/K2 reaching up to 10^5.
f1 and f2 are non-zero and are functions of the space variables (x, y, z).
This is thus pretty much a standard Stokes flow problem (except for the
non-zero f2).

Now, I have implemented two different approaches, each for both 2D and 3D,
in MATLAB. They work for smaller sizes, but I have problems solving them for
the problem size I need (250^3 grid points).
I use a staggered grid with p at cell centers and the components of v on
cell faces, with K split up similarly between cell centers and faces to
account for the variable viscosity.
The two approaches I take are:

M1) Set up Ax = b, where x stacks up the components of v and the scalar p
for all discretized points. A therefore has a block with a zero diagonal
(as sketched below). Then use the available iterative solvers. This requires
a preconditioner, and that is where I had problems: I could not find a
suitable preconditioner/iterative-solver pair in MATLAB that could produce
a result for a saddle point problem of this size.

M2) Augmented Lagrangian / penalty method: "uncouple" the pressure and
velocity, i.e. use a pseudotime step to iteratively update p starting from
an initial guess, and after each iteration use the new p to solve the
momentum equation for v. In this case Ax = b has an A with no zeros on the
diagonal, and we solve for an x that contains only the components of v.
Here a preconditioner such as ILU worked with GMRES etc., but for the
bigger problem size (the size I need!) it does not scale.
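
To be concrete, the update I have in mind is roughly the following, with dt
the pseudotime/penalty parameter that needs tuning and k the iteration index:

    solve   div(K grad(v^(k+1))) = f1 + grad(p^(k))      for v^(k+1)
    update  p^(k+1) = p^(k) - dt * (div(v^(k+1)) - f2)

iterated until the continuity residual div(v) - f2 is small enough.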

Now, this is when I started learning PETSc, and after playing around with it
a bit I have a preliminary implementation of the 2D problem with method M1.
What I did was basically use DMDA and KSP. To adapt DMDA to the staggered
grid, I simply used ghost values as explained in Fig. 7.5, page 96 of Gerya's
Introduction to Numerical Geodynamic Modelling:
http://books.google.co.uk/books?id=W83gxp165SkC&printsec=frontcover&dq=Introduction+to+Numerical+Geodynamic+Modelling&hl=en&sa=X&ei=v6TmUaP_L4PuOs3agJgE&ved=0CDIQ6AEwAA
That means that in the 2D case I have a DMDA with 3 dof, set a KSP with this
DM object, and solve the system.
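
For reference, the skeleton of my 2D setup looks roughly like this (a minimal
sketch against a recent PETSc; callback signatures vary a bit between
versions, and ComputeMatrix/ComputeRHS stand in for my own staggered-grid
assembly routines, whose bodies are not shown):

  #include <petscdmda.h>
  #include <petscksp.h>

  /* My own assembly callbacks (staggered-grid stencils using the
     ghost-value convention); bodies omitted here. */
  extern PetscErrorCode ComputeMatrix(KSP,Mat,Mat,void*);
  extern PetscErrorCode ComputeRHS(KSP,Vec,void*);

  int main(int argc,char **argv)
  {
    DM  da;
    KSP ksp;

    PetscInitialize(&argc,&argv,NULL,NULL);
    /* 2D DMDA with 3 dof per node: vx, vy on the staggered faces and p at
       the cell center, using ghost values for the extra unknowns. */
    DMDACreate2d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,
                 DMDA_STENCIL_BOX,65,65,PETSC_DECIDE,PETSC_DECIDE,
                 3,1,NULL,NULL,&da);
    DMSetFromOptions(da);
    DMSetUp(da);
    DMDASetFieldName(da,0,"vx");
    DMDASetFieldName(da,1,"vy");
    DMDASetFieldName(da,2,"p");

    KSPCreate(PETSC_COMM_WORLD,&ksp);
    KSPSetDM(ksp,da);
    KSPSetComputeOperators(ksp,ComputeMatrix,NULL);
    KSPSetComputeRHS(ksp,ComputeRHS,NULL);
    KSPSetFromOptions(ksp);   /* solver/preconditioner chosen at run time */
    KSPSolve(ksp,NULL,NULL);

    KSPDestroy(&ksp);
    DMDestroy(&da);
    PetscFinalize();
    return 0;
  }

(Error checking with CHKERRQ is omitted for brevity.)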

Now my actual question (you are patient :) ):
1) From the 2D example I can see that in 3D the issue will again be having a
proper preconditioner. I saw some discussions in the archive about using
PCFieldSplit or a Schur complement. Before implementing the 3D case, I'd
like some suggestions on whether I could directly use some combination of
solver and preconditioner to get results for my problem size in 3D. Can I
use multigrid with a proper preconditioner in this case? And, by the way,
is there a possibility of having support in PETSc for the implementation in
this paper (Matt from petsc-dev is one of the authors, so I thought I could
ask it here):
http://hpc.sagepub.com/content/early/2013/03/03/1094342013478640.abstract
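
To make question 1 concrete, the kind of thing I imagine trying, assuming a
3D DMDA with 4 dof ordered (vx, vy, vz, p), is an options file roughly like
the one below; the option names are taken from the current manual pages and
may differ slightly between versions, and the inner solvers would surely
need tuning (I would check the actual setup with -ksp_view):

  -ksp_type fgmres
  -pc_type fieldsplit
  -pc_fieldsplit_type schur
  -pc_fieldsplit_schur_fact_type upper
  -pc_fieldsplit_0_fields 0,1,2      # the three velocity components
  -pc_fieldsplit_1_fields 3          # the pressure
  -fieldsplit_0_ksp_type gmres
  -fieldsplit_0_pc_type mg           # or gamg for the velocity block
  -fieldsplit_1_ksp_type gmres       # unpreconditioned Schur solve,
  -fieldsplit_1_pc_type none         # just to get something running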

2) The other option is to use M2 (see above), which worked in MATLAB for
smaller 3D sizes. But the tricky part there would be tuning the pseudotime
parameter for the different viscosity values, and furthermore I'm not sure
whether this will scale up in parallel to give me the results.
I will have to try, but it would be nice if you could suggest which of the
two methods I should try first.
My interest in the first method comes from the possibility of having an
implementation in PETSc that uses multigrid with GPUs (if that speeds up
the whole thing).

Thanks,
Bishesh