[petsc-users] interpolate staggered grid values in parallel

Dave May dave.mayhem23 at gmail.com
Mon Dec 1 14:56:55 CST 2014


It's clear as it is. The function name indicates it returns a vector which
is the solution associated with the Krylov method. The local part of the
vector will not be a solution.
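
In other words, the vector returned by KSPGetSolution has no ghost points, so
neighbouring entries such as [i-1] cannot be read from it directly; you first
scatter it into a local (ghosted) vector obtained from the DMDA. A minimal
sketch of that pattern, using the ksp, da and Field struct from the code below
(error checking omitted):

    Vec     xGlobal, xLocal;
    Field ***xArray;

    KSPGetSolution(ksp, &xGlobal);              /* global vector: owned entries only */
    DMGetLocalVector(da, &xLocal);              /* ghosted work vector from the DMDA */
    DMGlobalToLocalBegin(da, xGlobal, INSERT_VALUES, xLocal);
    DMGlobalToLocalEnd(da, xGlobal, INSERT_VALUES, xLocal);
    DMDAVecGetArray(da, xLocal, &xArray);       /* ghost entries such as [i-1] are now valid */
    /* ... interpolate ... */
    DMDAVecRestoreArray(da, xLocal, &xArray);
    DMRestoreLocalVector(da, &xLocal);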



On 1 December 2014 at 21:42, Bishesh Khanal <bisheshkh at gmail.com> wrote:

>
>
> On Wed, Nov 26, 2014 at 10:13 PM, Bishesh Khanal <bisheshkh at gmail.com>
> wrote:
>
>>
>>
>> On Tue, Nov 25, 2014 at 6:40 PM, Matthew Knepley <knepley at gmail.com>
>> wrote:
>>
>>> On Tue, Nov 25, 2014 at 11:36 AM, Bishesh Khanal <bisheshkh at gmail.com>
>>> wrote:
>>>
>>>> Dear all,
>>>> I'm solving a system using PETSc and I get a global Vec, say X. It
>>>> uses a DMDA with 4 dofs (3 velocity + 1 pressure). X contains the velocity
>>>> at the cell faces since I solve on a staggered grid.
>>>> Now I'd like to create one array for velocity with values at cell
>>>> centers and another array for pressure (not the Vec, so that I can send the
>>>> pointer to the array to parts of the code that do not use PETSc).
>>>>
>>>> Currently what I do is:
>>>> ------ Vec               X, X_local;
>>>> ------ PetscScalar  *X_array;
>>>>
>>>> // Scatter X to X_local and then use:
>>>>
>>>> ------- VecGetArray(X_local, &X_array);
>>>>
>>>> And then have a function, say
>>>> getVelocityAt(x, y, z, component) {
>>>> // interpolate velocity at (x,y,z) cell center using X_array
>>>> }
>>>>
>>>> The function getVelocityAt() gets called from outside PETSc in a loop
>>>> over all (x,y,z) positions.
>>>> This is not done in parallel.
>>>>
>>>> Now, how do I instead use PETSc to interpolate the cell-center
>>>> velocities in parallel and store them in an array, say
>>>> PetscScalar *X_array_cellCenter;
>>>> ?
>>>> This array would need to be one point smaller along each axis than the
>>>> original DMDA size.
>>>> This way I intend to return X_array_cellCenter to the code outside
>>>> PETSc.
>>>>
>>>
>>> SNES ex30 is an example of a staggered grid code using DMDA. It does
>>> this kind of interpolation,
>>> and puts the result in a Vec.
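
For instance, one way to hold the cell-centred result is a second DMDA with the
same grid and parallel layout as da but only the 3 velocity fields. A sketch
(the names da, daV and xC match the code below; error checking omitted):

    DM  daV;
    Vec xC;

    /* same layout as da, but 3 fields (vx,vy,vz) instead of 4 */
    DMDAGetReducedDMDA(da, 3, &daV);
    DMCreateGlobalVector(daV, &xC);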
>>>
>>
>> After looking at the example, I tried the following but I have a problem:
>>
>> The idea I take from the example is to use DMDALocalInfo *info to get the
>> local grid extents and use a for loop to iterate through the local grid
>> values that are available in the Field **x array to do the interpolation.
>>
>> Here is what I tried in my case, where I get the solution from a KSP
>> attached to a DMDA:
>>
>> // Get the solution Vec x from ksp.
>> // ksp is attached to the DMDA da which has 4 dofs: vx,vy,vz,p;
>> KSPGetSolution(ksp, &x);
>> // x holds the velocity solution at the cell faces.
>>
>> // Get the array from x:
>> Field ***xArray;   // Field is a struct with 4 members: vx,vy,vz and p.
>> DMDAVecGetArray(da,x,&xArray);
>>
>> //Now I want to have a new array to store the cell center velocity values
>> //by interpolating from xArray.
>>
>> DMCreateGlobalVector(daV, &xC); //daV has same size as da but with dof=3
>> FieldVelocity ***xCArray;  // FieldVelocity is a struct with 3 members: vx,vy and vz.
>> DMDAVecGetArray(daV,xC,&xCArray);
>>
>> // Do the interpolation
>> DMDAGetLocalInfo(da,&info);
>> for (PetscInt k=info.zs; k<info.zs+info.zm; ++k) {
>>   for (PetscInt j=info.ys; j<info.ys+info.ym; ++j) {
>>     for (PetscInt i=info.xs; i<info.xs+info.xm; ++i) {
>>       // do interpolation, which requires access such as:
>>       if (i==0 || j==0 || k==0) {
>>         xCArray[k][j][i].vx = 0; // boundary condition
>>         xCArray[k][j][i].vy = 0;
>>         xCArray[k][j][i].vz = 0;
>>       } else {
>>         xCArray[k][j][i].vx = 0.5*(xArray[k][j][i].vx + xArray[k][j][i-1].vx);
>>         xCArray[k][j][i].vy = 0.5*(xArray[k][j][i].vy + xArray[k][j][i-1].vy);
>>         xCArray[k][j][i].vz = 0.5*(xArray[k][j][i].vz + xArray[k][j][i-1].vz);
>>       }
>>     }
>>   }
>> }
>>
>> I get a runtime error within this loop.
>> What am I doing wrong here?
>>
>>
> Ok, I now see that KSPGetSolution(ksp, &x) gives a GLOBAL vector in x and
> not a local vector; hence the runtime memory error above, since I can't
> access ghost values at indices such as [i-1] with the global vec.
> Was there any reason not to make this information explicit in the docs, to
> prevent the confusion I had? Or should it have been obvious to me, and I'm
> missing something here?
>
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/KSP/KSPGetSolution.html
> Thanks
>
>
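
Putting it together with the code above: scatter the KSP solution into a
ghosted local vector before getting the array, and the [i-1] accesses in the
loop become valid. A sketch using the same da, daV, xC, Field and
FieldVelocity as above (error checking omitted):

    Vec              xLocal;
    Field         ***xArray;
    FieldVelocity ***xCArray;
    DMDALocalInfo    info;

    KSPGetSolution(ksp, &x);
    DMGetLocalVector(da, &xLocal);
    DMGlobalToLocalBegin(da, x, INSERT_VALUES, xLocal);   /* fill ghost points */
    DMGlobalToLocalEnd(da, x, INSERT_VALUES, xLocal);
    DMDAVecGetArray(da, xLocal, &xArray);
    DMDAVecGetArray(daV, xC, &xCArray);
    DMDAGetLocalInfo(da, &info);

    /* ... interpolation loop exactly as above ... */

    DMDAVecRestoreArray(daV, xC, &xCArray);
    DMDAVecRestoreArray(da, xLocal, &xArray);
    DMRestoreLocalVector(da, &xLocal);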
>>
>>>
>>>   Thanks,
>>>
>>>      Matt
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>
>>
>