[petsc-users] interpolate staggered grid values in parallel
Matthew Knepley
knepley at gmail.com
Tue Nov 25 11:40:03 CST 2014
On Tue, Nov 25, 2014 at 11:36 AM, Bishesh Khanal <bisheshkh at gmail.com>
wrote:
> Dear all,
> I'm solving a system using PETSc and I get a global Vec, say X. It uses
> DMDA with 4 dofs (3 velocity + 1 pressure). X contains velocities at the
> cell faces since I solve on a staggered grid.
> Now I'd like to create one array for velocity with values at cell centers
> and another array for pressure (not the Vec, so that I can send a pointer
> to the array to another part of the code that does not use PETSc).
>
> Currently what I do is:
> ------ Vec X, X_local;
> ------ PetscScalar *X_array;
>
> // Scatter X to X_local and then use:
>
> ------ VecGetArray(X_local, &X_array);
>
> And then I have a function, say
> getVelocityAt(x, y, z, component) {
> // interpolate velocity at (x,y,z) cell center using X_array
> }
>
> The function getVelocityAt() gets called from outside PETSc in a loop over
> all (x,y,z) positions.
> This is not done in parallel.
>
> Now, how do I use PETSc to instead interpolate the cell-center velocities
> in parallel and store them in an array, say
> PetscScalar *X_array_cellCenter;
> ?
> This array would need to be one point smaller along each axis than the
> original DMDA size.
> This way I intend to return X_array_cellCenter to the code outside PETSc.
>
SNES ex30 is an example of a staggered grid code using DMDA. It does this
kind of interpolation and puts the result in a Vec.
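If it helps, here is a rough sketch of how that averaging might look. This is
not the ex30 code; I am assuming dofs 0, 1, 2 are the velocity components on
the x-, y-, and z-faces of node (i,j,k), dof 3 is pressure, the DMDA was
created with a box stencil of width at least 1, and the helper name
InterpolateToCellCenters is just made up for illustration:

#include <petscdmda.h>

/* Average face velocities to cell centers for the locally owned cells.
   The caller frees *cc_array with PetscFree(). */
PetscErrorCode InterpolateToCellCenters(DM da, Vec X, PetscScalar **cc_array, PetscInt *ncells)
{
  PetscErrorCode ierr;
  Vec            Xloc;
  PetscScalar    ****x;               /* x[k][j][i][dof], includes ghosts */
  PetscScalar    *cc;
  PetscInt       i, j, k, xs, ys, zs, xm, ym, zm, M, N, P, cnt = 0;

  PetscFunctionBeginUser;
  ierr = DMDAGetInfo(da, NULL, &M, &N, &P, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL);CHKERRQ(ierr);

  /* Ghosted local copy so the i+1/j+1/k+1 faces owned by neighbours are available */
  ierr = DMCreateLocalVector(da, &Xloc);CHKERRQ(ierr);
  ierr = DMGlobalToLocalBegin(da, X, INSERT_VALUES, Xloc);CHKERRQ(ierr);
  ierr = DMGlobalToLocalEnd(da, X, INSERT_VALUES, Xloc);CHKERRQ(ierr);
  ierr = DMDAVecGetArrayDOF(da, Xloc, &x);CHKERRQ(ierr);

  ierr = DMDAGetCorners(da, &xs, &ys, &zs, &xm, &ym, &zm);CHKERRQ(ierr);

  /* 3 interleaved components per cell; upper bound, only 3*(*ncells) entries get filled */
  ierr = PetscMalloc1(3*xm*ym*zm, &cc);CHKERRQ(ierr);

  for (k = zs; k < zs+zm; k++) {
    for (j = ys; j < ys+ym; j++) {
      for (i = xs; i < xs+xm; i++) {
        if (i == M-1 || j == N-1 || k == P-1) continue;   /* last face has no cell */
        cc[3*cnt+0] = 0.5*(x[k][j][i][0] + x[k][j][i+1][0]);  /* u: average x-faces */
        cc[3*cnt+1] = 0.5*(x[k][j][i][1] + x[k][j+1][i][1]);  /* v: average y-faces */
        cc[3*cnt+2] = 0.5*(x[k][j][i][2] + x[k+1][j][i][2]);  /* w: average z-faces */
        cnt++;
      }
    }
  }
  *ncells   = cnt;
  *cc_array = cc;

  ierr = DMDAVecRestoreArrayDOF(da, Xloc, &x);CHKERRQ(ierr);
  ierr = VecDestroy(&Xloc);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

Each rank fills only the cells it owns, so the array stays distributed; if the
non-PETSc part of the code is serial you would still need to gather the pieces
(e.g. with VecScatterCreateToZero on a cell-centered Vec). Pressure (dof 3) can
be copied into its own array inside the same loop.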
Thanks,
Matt
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener