[petsc-users] interpolate staggered grid values in parallel

Bishesh Khanal bisheshkh at gmail.com
Wed Nov 26 15:13:29 CST 2014


On Tue, Nov 25, 2014 at 6:40 PM, Matthew Knepley <knepley at gmail.com> wrote:

> On Tue, Nov 25, 2014 at 11:36 AM, Bishesh Khanal <bisheshkh at gmail.com>
> wrote:
>
>> Dear all,
>> I'm solving a system using PETSc and obtain a global Vec, say X. It uses a
>> DMDA with 4 dofs (3 velocity + 1 pressure). X contains velocities at the
>> cell faces since I solve on a staggered grid.
>> Now I'd like to create one array for velocity with values at cell centers
>> and another array for pressure (not Vecs, so that I can pass pointers to
>> the arrays to parts of the code that do not use PETSc).
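>>
>> (For context, da is created along these lines; this is only a sketch, and
>> the global sizes mx, my, mz and the boundary/stencil choices here are
>> placeholders:)
>>
>> DM da;
>> DMDACreate3d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
>>              DM_BOUNDARY_NONE, DMDA_STENCIL_BOX, mx, my, mz,
>>              PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
>>              4 /* dof: vx,vy,vz,p */, 1 /* stencil width */,
>>              NULL, NULL, NULL, &da);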
>>
>> Currently what I do is:
>> ------ Vec               X, X_local;
>> ------ PetscScalar  *X_array;
>>
>> // Scatter X to X_local and then use:
>>
>> ------ VecGetArray(X_local, &X_array);
>>
>> And then have a function, say
>> getVelocityAt(x, y, z, component) {
>> // interpolate velocity at the (x, y, z) cell center using X_array
>> }
>>
>> The function getVelocityAt() gets called from outside PETSc in a loop
>> over all (x, y, z) positions.
>> This is not done in parallel.
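>>
>> (A sketch of that function, assuming X_local gathers the whole grid in
>> natural ordering with the 4 dofs interleaved, mx and my are the global
>> grid sizes, and using the same i-direction face average as in the code
>> further below:)
>>
>> PetscScalar getVelocityAt(PetscInt i, PetscInt j, PetscInt k,
>>                           PetscInt component) {
>>     // dof 'component' at grid point (i,j,k) in the interleaved array:
>>     PetscInt here = ((k*my + j)*mx + i)*4 + component;
>>     // the same dof one cell back in x, for the face-to-center average:
>>     PetscInt left = ((k*my + j)*mx + (i-1))*4 + component;
>>     return 0.5*(X_array[here] + X_array[left]);
>> }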
>>
>> Now, how do I instead use PETSc to interpolate the cell-center velocities
>> in parallel and store them in an array, say
>> PetscScalar *X_array_cellCenter;
>> ?
>> This array would need to be one cell smaller along each axis than the
>> original DMDA size.
>> This way I intend to return X_array_cellCenter to the code outside PETSc.
>>
>
> SNES ex30 is an example of a staggered-grid code using DMDA. It does this
> kind of interpolation and puts the result in a Vec.
>

After looking at the example, I tried the following, but I have a problem:

The idea I take from the example is to use a DMDALocalInfo (info) to get the
local grid corner indices, and to use for loops to iterate through the local
grid values available in the Field ***x array to do the interpolation.

Here is what I tried in my case, where I get the solution from a KSP
attached to a DMDA:

// Get the solution Vec x from the ksp.
// ksp is attached to the DMDA da, which has 4 dofs: vx,vy,vz,p.
Vec x;
KSPGetSolution(ksp, &x);
// x has the velocity solution on the faces of the cells.

// Get the array from x:
Field ***xArray;   // Field is a struct with 4 members: vx, vy, vz and p.
DMDAVecGetArray(da, x, &xArray);

// Now I want to have a new array to store the cell-center velocity values
// obtained by interpolating from xArray.

Vec xC;
DMCreateGlobalVector(daV, &xC);  // daV has the same size as da but dof=3.
FieldVelocity ***xCArray;  // FieldVelocity is a struct with 3 members:
                           // vx, vy and vz.
DMDAVecGetArray(daV, xC, &xCArray);

// Do the interpolation:
DMDALocalInfo info;
DMDAGetLocalInfo(da, &info);
for (PetscInt k = info.zs; k < info.zs+info.zm; ++k) {
  for (PetscInt j = info.ys; j < info.ys+info.ym; ++j) {
    for (PetscInt i = info.xs; i < info.xs+info.xm; ++i) {
      // Doing the interpolation requires access such as:
      if (i == 0 || j == 0 || k == 0) {   // boundary condition
        xCArray[k][j][i].vx = 0;
        xCArray[k][j][i].vy = 0;
        xCArray[k][j][i].vz = 0;
      } else {
        xCArray[k][j][i].vx = 0.5*(xArray[k][j][i].vx + xArray[k][j][i-1].vx);
        xCArray[k][j][i].vy = 0.5*(xArray[k][j][i].vy + xArray[k][j][i-1].vy);
        xCArray[k][j][i].vz = 0.5*(xArray[k][j][i].vz + xArray[k][j][i-1].vz);
      }
    }
  }
}

DMDAVecRestoreArray(da, x, &xArray);
DMDAVecRestoreArray(daV, xC, &xCArray);

I get a runtime error within this loop.
What am I doing wrong here?
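
(In case it matters, daV is created roughly as follows; a sketch that reads
the global sizes back from da and uses dof=3:)

PetscInt M, N, P;
DMDAGetInfo(da, NULL, &M, &N, &P, NULL, NULL, NULL, NULL,
            NULL, NULL, NULL, NULL, NULL);
DM daV;
DMDACreate3d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
             DM_BOUNDARY_NONE, DMDA_STENCIL_BOX, M, N, P,
             PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
             3 /* dof: vx,vy,vz */, 1, NULL, NULL, NULL, &daV);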



>
>   Thanks,
>
>      Matt
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>