[petsc-users] Scattering a vector to/from a subset of processors

Sreeram R Venkat srvenkat at utexas.edu
Thu Oct 5 12:57:00 CDT 2023


Thank you. This works for me.

Sreeram

On Wed, Oct 4, 2023 at 6:41 PM Junchao Zhang <junchao.zhang at gmail.com>
wrote:

> Hi, Sreeram,
> You can try the code below. Since x and y are both MPI vectors, we just
> need to say we want to scatter x[0:N] to y[0:N]. With your example, the 12
> index sets on the 12 processes would be [0..8], [9..17], [18..26],
> [27..35], [], ..., [].  Actually, you can split them arbitrarily, say, into
> the 12 index sets [0..17], [18..35], [], ..., [].  PETSc will figure out
> how to do the communication.
>
> PetscInt   rstart, rend, N;
> IS         ix;
> VecScatter vscat;
> Vec        y;
> MPI_Comm   comm;
> VecType    type;
>
> /* Query the communicator, type, global size, and owned range of x */
> PetscObjectGetComm((PetscObject)x, &comm);
> VecGetType(x, &type);
> VecGetSize(x, &N);
> VecGetOwnershipRange(x, &rstart, &rend);
>
> /* Create y with the same global size but PETSc's default (even) layout */
> VecCreate(comm, &y);
> VecSetSizes(y, PETSC_DECIDE, N);
> VecSetType(y, type);
>
> /* Scatter this rank's owned range x[rstart, rend) to the same global indices of y */
> ISCreateStride(comm, rend - rstart, rstart, 1, &ix);
> VecScatterCreate(x, ix, y, ix, &vscat);
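>
> To actually move the data and release the objects afterwards, a typical
> follow-up would be something like (a minimal sketch using the standard
> VecScatterBegin/End and destroy calls):
>
> /* Perform the forward scatter x -> y */
> VecScatterBegin(vscat, x, y, INSERT_VALUES, SCATTER_FORWARD);
> VecScatterEnd(vscat, x, y, INSERT_VALUES, SCATTER_FORWARD);
>
> /* Clean up the index set and scatter context */
> ISDestroy(&ix);
> VecScatterDestroy(&vscat);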
>
> --Junchao Zhang
>
>
> On Wed, Oct 4, 2023 at 6:03 PM Sreeram R Venkat <srvenkat at utexas.edu>
> wrote:
>
>> Suppose I am running on 12 processors and I have a vector "v" of size 36
>> partitioned over the first 4. v still lives on PETSC_COMM_WORLD, so it has
>> a layout of (9, 9, 9, 9, 0, 0, ..., 0); a sketch of setting up this layout
>> is given after the table below. Now, I would like to repartition it over
>> all 12 processors so that the layout becomes (3, 3, 3, ..., 3). I've been
>> trying to use VecScatter to do this, but I'm not sure what index sets to
>> use for the sender and receiver.
>>
>> The result I am trying to achieve is this:
>>
>> Assume the vector is v = <0, 1, 2, ..., 35>
>>
>>          Start                   Finish
>>    Proc | Entries           Proc | Entries
>>       0 | 0, ..., 8            0 | 0, 1, 2
>>       1 | 9, ..., 17           1 | 3, 4, 5
>>       2 | 18, ..., 26          2 | 6, 7, 8
>>       3 | 27, ..., 35          3 | 9, 10, 11
>>       4 | None                 4 | 12, 13, 14
>>       5 | None                 5 | 15, 16, 17
>>       6 | None                 6 | 18, 19, 20
>>       7 | None                 7 | 21, 22, 23
>>       8 | None                 8 | 24, 25, 26
>>       9 | None                 9 | 27, 28, 29
>>      10 | None                10 | 30, 31, 32
>>      11 | None                11 | 33, 34, 35
>>
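>> A layout like the starting one above can be set up with explicit local
>> sizes; a minimal sketch (assuming 12 ranks, with local size 9 on ranks
>> 0-3 and 0 elsewhere) would be:
>>
>> PetscMPIInt rank;
>> Vec         v;
>>
>> MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
>> VecCreate(PETSC_COMM_WORLD, &v);
>> VecSetSizes(v, rank < 4 ? 9 : 0, 36);  /* local size 9 on ranks 0-3, 0 elsewhere; global size 36 */
>> VecSetFromOptions(v);
>>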
>> Appreciate any help you can provide on this.
>>
>> Thanks,
>> Sreeram
>>
>