[petsc-users] Incrementing individual entries in vectors

Derek Gaston friedmud at gmail.com
Thu Jan 23 19:39:40 CST 2014


Of course I'm on a plane right now and in the middle of this algorithm and
didn't think about the obvious use of VecSetValue() with ADD_VALUES
and a 1... duh.

I was actually just looking for a VecIncrementValue() or some such.

Please ignore unless you see an even more efficient way to do this...
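For anyone finding this thread later, a minimal sketch of that idiom, using the increments from the example quoted below. This is an assumption-laden sketch, not code from the thread: it assumes exactly two MPI ranks and uses the usual ierr/CHKERRQ error-checking pattern.

```c
#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec            v;
  PetscMPIInt    rank;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);

  /* 6 global entries, 3 owned by each of the two ranks */
  ierr = VecCreateMPI(PETSC_COMM_WORLD, 3, 6, &v);CHKERRQ(ierr);
  ierr = VecSet(v, 0.0);CHKERRQ(ierr);

  if (rank == 0) {
    ierr = VecSetValue(v, 1, 1.0, ADD_VALUES);CHKERRQ(ierr); /* v[1]++ */
    ierr = VecSetValue(v, 2, 1.0, ADD_VALUES);CHKERRQ(ierr); /* v[2]++ */
    ierr = VecSetValue(v, 1, 1.0, ADD_VALUES);CHKERRQ(ierr); /* v[1]++ */
  } else {
    ierr = VecSetValue(v, 0, 1.0, ADD_VALUES);CHKERRQ(ierr); /* v[0]++ (off-process) */
    ierr = VecSetValue(v, 1, 1.0, ADD_VALUES);CHKERRQ(ierr); /* v[1]++ (off-process) */
    ierr = VecSetValue(v, 5, 1.0, ADD_VALUES);CHKERRQ(ierr); /* v[5]++ (local) */
    ierr = VecSetValue(v, 1, 1.0, ADD_VALUES);CHKERRQ(ierr); /* v[1]++ (off-process) */
  }

  /* off-process contributions are communicated and summed here */
  ierr = VecAssemblyBegin(v);CHKERRQ(ierr);
  ierr = VecAssemblyEnd(v);CHKERRQ(ierr);

  ierr = VecView(v, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
  ierr = VecDestroy(&v);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}
```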

Derek



On Thu, Jan 23, 2014 at 6:32 PM, Derek Gaston <friedmud at gmail.com> wrote:

> I have a funky algorithm I'm working on... and in it I need to increment
> individual entries in vectors in parallel and then finalize the vector,
> summing to get the final value.  So, something like this:
>
> If I have a vector "v" with 6 values in it (all starting at 0) spread
> across two processors (so that the first 3 entries are on processor 0 and
> the last three on processor 1).
>
> On processor 0 I need to be able to do:
>
> v[1]++;
> v[2]++;
> v[1]++;
>
> Then on processor 1 I'd like to do:
>
> v[0]++;
> v[1]++;
> v[5]++;
> v[1]++;
>
> Then, after finalizing the vector I'd like to end up with a vector that
> has these values on processor 0:
>
> v[0] -> 1
> v[1] -> 4
> v[2] -> 1
>
> and on processor 1:
>
> v[3] -> 0
> v[4] -> 0
> v[5] -> 1
>
> Is there anything in PETSc that would help here??  I realize that I can
> roll my own solution using arrays and MPI for this... but it would be handy
> if I could do this with a PETSc Vec because the next step is to use
> VecPointwiseDivide() to divide another vector by this funky one pointwise.
>  Both of these vectors will have the same parallel distribution.  Also,
> note that these vectors might be really large (like 300 million to 1
> billion entries) so it would be really handy to have the automatic
> parallelization of Vec...
>
> Thanks for any help...
>
> Derek
>
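For the archive, the follow-on step mentioned above might look like the sketch below. The names (`DivideByCounts`, `w`, `x`, `counts`) are illustrative, not from the thread, and VecPointwiseDivide() assumes all three vectors share the same parallel layout, which the message says they do.

```c
#include <petscvec.h>

/* Compute w[i] = x[i] / counts[i] entrywise across all ranks.
   Illustrative helper, not from the thread. */
PetscErrorCode DivideByCounts(Vec w, Vec x, Vec counts)
{
  PetscErrorCode ierr;
  /* Caveat: any zero entry in counts produces inf/nan; real code
     would need to clamp or skip those entries first. */
  ierr = VecPointwiseDivide(w, x, counts);CHKERRQ(ierr);
  return 0;
}
```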
