[petsc-users] 3D multi-dof DMDA and Poisson equation

Håkon Strandenes haakon at hakostra.net
Wed Oct 15 08:41:21 CDT 2014


Thanks for the reply, both Jed and Matthew. However, I am still a bit 
unsure about a small detail:

I have created a 3D DMDA with ndof = 4, one dof for each velocity 
component (u,v,w) and one for the pressure (p). I only solve a Poisson 
equation to correct p; the velocities are solved explicitly.

My question this time is:
When I create a matrix with DMCreateMatrix(), I get four times as many 
rows as I really need, i.e. it assumes that I want to solve equations 
for all my DOFs. The same happens when I create a vector with 
DMCreateGlobalVector(): it gets one element per dof. This is not 
desirable, since I only solve equations for one unknown per grid point. 
If my domain (for debugging purposes) has 250 points, I want the local 
portion of my matrix to have 250 rows. Instead it gets 1000.

Was I wrong when I created my DMDA with ndof = 4? Should I have created 
one DMDA for the velocity with ndof = 3 and one with ndof = 1 to use as 
a basis for the pressure solution, as in the sketch below? Or have I 
again missed something fundamental?
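
For concreteness, the two-DMDA layout I have in mind would be something 
like this (untested sketch; the grid size is a placeholder for my 
250-point debugging domain):

    DM da_vel, da_p;
    PetscInt M = 10, N = 5, P = 5;   /* 250 grid points */

    /* One DMDA carries the three velocity components... */
    DMDACreate3d(PETSC_COMM_WORLD,
                 DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                 DMDA_STENCIL_STAR, M, N, P,
                 PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                 3 /* dof: u,v,w */, 1 /* stencil width */,
                 NULL, NULL, NULL, &da_vel);

    /* ...and a second, compatible DMDA (same global size and
       PETSC_DECIDE splits, hence the same parallel decomposition)
       carries only p.  (Newer PETSc versions also require
       DMSetFromOptions()/DMSetUp() after DMDACreate3d().) */
    DMDACreate3d(PETSC_COMM_WORLD,
                 DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                 DMDA_STENCIL_STAR, M, N, P,
                 PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                 1 /* dof: p */, 1,
                 NULL, NULL, NULL, &da_p);

    /* The Poisson matrix and vectors then come from the scalar DMDA: */
    Mat A;
    Vec rhs, p;
    DMCreateMatrix(da_p, &A);            /* one row per point, not four */
    DMCreateGlobalVector(da_p, &rhs);
    DMCreateGlobalVector(da_p, &p);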

Regards,
Håkon


On 08. okt. 2014 22:07, Jed Brown wrote:
> Håkon Strandenes <haakon at hakostra.net> writes:
>
>> Hi,
>>
>> I am making a new, simple incompressible CFD code. I have created a grid
>> as a 3D DMDA, with ndof=4, one dof for each velocity component (u,v,w)
>> and one for the pressure (p). Currently I am creating a forward Euler
>> step, where the velocities are updated based on the velocities and
>> pressure from the previous time step. That works great. Now I need to
>> solve a Poisson equation for the pressure correction term, enforcing
>> continuity onto my solution, and this is where I have a few questions
>> for you:
>>
>> 1)
>> How do I set up my equation system in the most efficient way? The
>> vectors from the DMDA are not contiguous in a global sense, i.e. if I map
>> the local (i,j,k) indices to a global index with
>>
>> globId = (info.mx*info.my)*k + (info.mx)*j + i;
>
> This is a "natural index", used in some cases for IO, but not for
> computation.
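>
> If you ever do need the natural ordering (say, for IO), PETSc can
> translate between it and the PETSc ordering for you; a sketch
> (assuming globId is a PetscInt):
>
>    AO ao;
>    DMDAGetAO(da, &ao);
>    AOApplicationToPetsc(ao, 1, &globId);  /* natural -> PETSc index */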
>
>> globId is not contiguous on any process. How do I create my coefficient
>> matrix, r.h.s. vector and solution vector?
>
> DMCreateGlobalVector()
>
> it will use what is described in the Users Manual as "PETSc ordering".
> Please read this section of the manual; I think it will answer your
> questions.
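>
> In particular, with a DMDA you can assemble using (i,j,k) stencil
> indices and never form global indices yourself. A minimal sketch of a
> 7-point Laplacian on a scalar DMDA (untested; unit grid spacing,
> boundary handling reduced to dropping missing neighbours):
>
>    DMDALocalInfo info;
>    DMDAGetLocalInfo(da_p, &info);
>    for (PetscInt k = info.zs; k < info.zs+info.zm; k++)
>      for (PetscInt j = info.ys; j < info.ys+info.ym; j++)
>        for (PetscInt i = info.xs; i < info.xs+info.xm; i++) {
>          MatStencil  row = {0}, col[7] = {{0}};
>          PetscScalar v[7];
>          PetscInt    n = 0;
>          row.i = i; row.j = j; row.k = k;
>          if (i > 0)         { col[n].i = i-1; col[n].j = j; col[n].k = k; v[n++] = 1.0; }
>          if (i < info.mx-1) { col[n].i = i+1; col[n].j = j; col[n].k = k; v[n++] = 1.0; }
>          if (j > 0)         { col[n].i = i; col[n].j = j-1; col[n].k = k; v[n++] = 1.0; }
>          if (j < info.my-1) { col[n].i = i; col[n].j = j+1; col[n].k = k; v[n++] = 1.0; }
>          if (k > 0)         { col[n].i = i; col[n].j = j; col[n].k = k-1; v[n++] = 1.0; }
>          if (k < info.mz-1) { col[n].i = i; col[n].j = j; col[n].k = k+1; v[n++] = 1.0; }
>          /* diagonal = -(number of neighbours) */
>          col[n].i = i; col[n].j = j; col[n].k = k; v[n] = -(PetscScalar)n; n++;
>          MatSetValuesStencil(A, 1, &row, n, col, v, INSERT_VALUES);
>        }
>    MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
>    MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);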
>
>> Is there some way I can make the local parts of the matrix and vectors
>> mimic the decomposition from the DMDA? Or should I push all values
>> into a different decomposition scheme, where each process holds one
>> contiguous slab of the equation system using the global IDs calculated
>> above? Have I missed some really basic stuff here?
>>
>> 2)
>> I know that my coefficient matrix from the Poisson equation is
>> symmetric. Currently I am still creating and inserting values on both
>> the upper and lower diagonal. Is this really necessary? Or is there a
>> "smart" way I can save half the storage by only creating the upper or
>> lower part?
>
> -dm_mat_type sbaij
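>
> Equivalently in code (a sketch): ask the DM for a symmetric format
> before creating the matrix. Note that SBAIJ stores only the upper
> triangle, so either insert only entries with column >= row, or tell
> the matrix to ignore the lower-triangular insertions you already do:
>
>    DMSetMatType(da_p, MATSBAIJ);
>    DMCreateMatrix(da_p, &A);
>    MatSetOption(A, MAT_IGNORE_LOWER_TRIANGULAR, PETSC_TRUE);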
>
>> 3)
>> I solve a transient problem, but my coefficient matrix is the same for
>> all time steps and I only recompute the r.h.s. vector for each time
>> step. I have got the impression that KSPSolve "finds out" whether the
>> A matrix has changed since the last invocation, recalculating the
>> preconditioner as necessary.
>
> If you just call KSPSolve, it will reuse everything.  Call
> KSPSetOperators (assuming you are manually doing the time
> discretization) if the matrix changes.
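>
> Concretely, something like this (sketch) sets the operator once, so
> the preconditioner is built on the first KSPSolve and reused for
> every later step:
>
>    KSP ksp;
>    PetscInt nsteps = 100;              /* placeholder */
>    KSPCreate(PETSC_COMM_WORLD, &ksp);
>    KSPSetOperators(ksp, A, A);         /* once, before the time loop */
>    KSPSetFromOptions(ksp);
>    for (PetscInt step = 0; step < nsteps; step++) {
>      /* ... explicit velocity update, recompute rhs ... */
>      KSPSolve(ksp, rhs, p);            /* reuses the preconditioner */
>    }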
>
>> But still I feel that I should utilize this property more. Do you have
>> any tips on how I can use this in the most efficient way possible? For
>> example really, really efficient, but expensive, preconditioners?
>>
>> Thanks in advance.
>>
>> Regards,
>> Håkon Strandenes

