[petsc-users] Calculating the PETSc index of a DMComposite vector
Anush Krishnan
anush at bu.edu
Mon Nov 11 19:57:28 CST 2013
On 11 November 2013 19:59, Jed Brown <jedbrown at mcs.anl.gov> wrote:
> That direction is fragile and/or expensive unless it is within the
> subdomain overlap/ghost region. Where are you getting this index from?
>
Let me give a brief description of the problem I'm working on:
I use a structured staggered grid to store a fluid velocity field, and I
need to interpolate this field onto a set of points that represent an
immersed body present in the flow. These points can be placed arbitrarily
in the domain and need not conform to the grid.
Around each body point, I choose a small rectangular region of velocity
grid points and interpolate from their values to obtain the velocity at
the body point. This can be represented as:
[C][u] = [b]
where [C] contains the interpolation coefficients, [u] contains the velocity
values from the small rectangular region, and [b] is the velocity at the
body point. This system can be expanded to include the entire velocity
field and all the body points.
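For concreteness, for a single body point the interpolation might look
something like this (a rough sketch, assuming bilinear interpolation on a
uniform grid with spacing dx, dy and origin x0, y0, all placeholder names;
my actual stencil and weights may differ):

/* Rough sketch: bilinear weights for one body point (xb, yb), assuming a
 * uniform grid with spacing dx, dy and origin x0, y0 (placeholder names). */
#include <math.h>

void bilinear_weights(double xb, double yb,
                      double x0, double y0, double dx, double dy,
                      int *i, int *j, double w[4])
{
  double s, t;
  *i = (int)floor((xb - x0)/dx);     /* lower-left velocity node of the cell */
  *j = (int)floor((yb - y0)/dy);
  s  = (xb - (x0 + (*i)*dx))/dx;     /* normalized offsets in [0, 1]         */
  t  = (yb - (y0 + (*j)*dy))/dy;
  w[0] = (1 - s)*(1 - t);            /* weight of node (i  , j  )            */
  w[1] = s*(1 - t);                  /* weight of node (i+1, j  )            */
  w[2] = (1 - s)*t;                  /* weight of node (i  , j+1)            */
  w[3] = s*t;                        /* weight of node (i+1, j+1)            */
  /* velocity at the body point: b = sum over the four nodes of w[k]*u */
}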
Currently, I store a copy of the body on every process. Since my grid is
structured and Cartesian, it is easy to calculate the indices of the
velocity grid points around each body point; this is how I obtain the
natural ordering index.
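For example, for grid point (i, j) the natural ordering index is just
(a sketch; nxu and nxv are placeholder names for the number of U- and
V-nodes per grid row):

/* Sketch: natural ordering indices on the staggered grid (placeholder
 * names: nxu, nxv = number of U- and V-nodes in the x-direction). */
PetscInt rowU = j*nxu + i;    /* natural index of U-velocity node (i, j) */
PetscInt rowV = j*nxv + i;    /* natural index of V-velocity node (i, j) */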
I created the vector [b] using VecCreate, with PETSC_DECIDE for the local
size on each process. The length of [b] is 2*nb, where nb is the number of
body points; the first half of the vector stores the x-components and the
second half the y-components of velocity. [u] is the global vector of a
DMComposite containing the U and V components of velocity.
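Roughly, the setup looks like this (a sketch with placeholder names; uda
and vda are the DMDAs for the two velocity components, created elsewhere,
and nb is the number of body points):

#include <petscdmcomposite.h>

DM  uda, vda;                              /* DMDAs for U and V, created elsewhere */
DM  pack;                                  /* DMComposite of uda and vda           */
Vec u, b;

DMCompositeCreate(PETSC_COMM_WORLD, &pack);
DMCompositeAddDM(pack, uda);
DMCompositeAddDM(pack, vda);
DMCreateGlobalVector(pack, &u);            /* [u]: composite velocity vector    */

VecCreate(PETSC_COMM_WORLD, &b);           /* [b]: velocity at the body points  */
VecSetSizes(b, PETSC_DECIDE, 2*nb);        /* x-components first, then y        */
VecSetFromOptions(b);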
Currently, my fix is to loop over all the velocity grid points, check
whether each one falls within the influence region of any body point, and
set the corresponding interpolation coefficient in the matrix. But this
procedure requires a double loop over the grid points and the body points,
as sketched below.
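In rough form, this is what I do for the x-component (a sketch with
placeholder names: uda is the U-velocity DMDA, C is the coefficient matrix,
and influence(), weight(), and col() stand in for my influence test,
interpolation weight, and column-index calculation):

/* Sketch of the current fix for the U-component (placeholder names). */
PetscInt i, j, k, xs, ys, xm, ym;
DMDAGetCorners(uda, &xs, &ys, NULL, &xm, &ym, NULL);   /* local grid patch */
for (j = ys; j < ys + ym; j++) {
  for (i = xs; i < xs + xm; i++) {                     /* every local U-node */
    for (k = 0; k < nb; k++) {                         /* every body point   */
      if (influence(i, j, k)) {                        /* node near point k? */
        /* row k of [C] (x-component), column = index of U-node (i, j) */
        MatSetValue(C, k, col(i, j), weight(i, j, k), INSERT_VALUES);
      }
    }
  }
}
MatAssemblyBegin(C, MAT_FINAL_ASSEMBLY);
MatAssemblyEnd(C, MAT_FINAL_ASSEMBLY);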
I haven't thought about parallelising the body storage yet.