[petsc-users] Scattering a vector to/from a subset of processors

Sreeram R Venkat srvenkat at utexas.edu
Tue Dec 5 17:15:37 CST 2023


Hi, I have a follow-up question on this.

Now, I'm trying to do a scatter and permutation of the vector. Under the
same setup as the original example, here are the new Start and Finish
states I want to achieve:
       Start                             Finish
Proc | Entries              Proc | Entries
   0 | 0,...,8                 0 | 0, 12, 24
   1 | 9,...,17                1 | 1, 13, 25
   2 | 18,...,26               2 | 2, 14, 26
   3 | 27,...,35               3 | 3, 15, 27
   4 | None                    4 | 4, 16, 28
   5 | None                    5 | 5, 17, 29
   6 | None                    6 | 6, 18, 30
   7 | None                    7 | 7, 19, 31
   8 | None                    8 | 8, 20, 32
   9 | None                    9 | 9, 21, 33
  10 | None                   10 | 10, 22, 34
  11 | None                   11 | 11, 23, 35

So far, I've tried using ISCreateGeneral(), with each process giving an idx
array corresponding to the indices it wants (i.e., idx on P0 looks like
[0, 12, 24], on P1 it is [1, 13, 25], and so on).
Then I use this index set to create the VecScatter with VecScatterCreate(x,
is, y, NULL, &scatter).
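
For reference, here is a minimal sketch of what I'm doing (names and the
creation of the target vector y are simplified from my actual code; I'm
assuming 12 ranks and a global size of 36):

PetscMPIInt rank;
PetscInt    N = 36, nlocal = 3, idx[3];
IS          is;
VecScatter  scatter;
Vec         y;

MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

/* y holds 3 entries per rank (36 entries over 12 ranks) */
VecCreate(PETSC_COMM_WORLD, &y);
VecSetSizes(y, nlocal, N);
VecSetType(y, VECMPI);

/* each rank requests entries rank, rank+12, rank+24 of x */
for (PetscInt i = 0; i < nlocal; i++) idx[i] = rank + 12 * i;
ISCreateGeneral(PETSC_COMM_WORLD, nlocal, idx, PETSC_COPY_VALUES, &is);

VecScatterCreate(x, is, y, NULL, &scatter);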

However, when I try to do the scatter, I get some illegal memory access
errors.

Is there something wrong with how I define the index sets?

Thanks,
Sreeram

On Thu, Oct 5, 2023 at 12:57 PM Sreeram R Venkat <srvenkat at utexas.edu>
wrote:

> Thank you. This works for me.
>
> Sreeram
>
> On Wed, Oct 4, 2023 at 6:41 PM Junchao Zhang <junchao.zhang at gmail.com>
> wrote:
>
>> Hi, Sreeram,
>> You can try this code. Since x, y are both MPI vectors, we just need to
>> say we want to scatter x[0:N] to y[0:N]. The 12 index sets with your
>> example on the 12 processes would be [0..8], [9..17], [18..26], [27..35],
>> [], ..., [].  Actually, you can do it arbitrarily, say, with 12 index sets
>> [0..17], [18..35], ..., [].  PETSc will figure out how to do the
>> communication.
>>
>> PetscInt rstart, rend, N;
>> IS ix;
>> VecScatter vscat;
>> Vec y;
>> MPI_Comm comm;
>> VecType type;
>>
>> /* query x for its communicator, type, global size, and owned range */
>> PetscObjectGetComm((PetscObject)x, &comm);
>> VecGetType(x, &type);
>> VecGetSize(x, &N);
>> VecGetOwnershipRange(x, &rstart, &rend);
>>
>> /* y has the same global size as x; PETSc decides the (even) local sizes */
>> VecCreate(comm, &y);
>> VecSetSizes(y, PETSC_DECIDE, N);
>> VecSetType(y, type);
>>
>> /* each rank lists the range of x it owns; the same indices are used on y,
>>    so x[i] goes to y[i] and PETSc works out the needed communication */
>> ISCreateStride(PetscObjectComm((PetscObject)x), rend - rstart, rstart, 1, &ix);
>> VecScatterCreate(x, ix, y, ix, &vscat);
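>>
>> To apply the scatter, you would then use the usual begin/end pair, e.g. (a
>> sketch, error checking omitted):
>>
>> VecScatterBegin(vscat, x, y, INSERT_VALUES, SCATTER_FORWARD);
>> VecScatterEnd(vscat, x, y, INSERT_VALUES, SCATTER_FORWARD);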
>>
>> --Junchao Zhang
>>
>>
>> On Wed, Oct 4, 2023 at 6:03 PM Sreeram R Venkat <srvenkat at utexas.edu>
>> wrote:
>>
>>> Suppose I am running on 12 processors, and I have a vector "v" of size
>>> 36 partitioned over the first 4. v still uses PETSC_COMM_WORLD, so it
>>> has a layout of (9, 9, 9, 9, 0, 0, ..., 0). Now, I would like to
>>> repartition it over all 12 processors, so that the layout becomes (3, 3, 3,
>>> ..., 3). I've been trying to use VecScatter to do this, but I'm not sure
>>> which index sets to use for the sender and receiver.
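>>>
>>> (For context, a minimal sketch of how a vector with this starting layout
>>> might be created; my actual code differs:)
>>>
>>> PetscMPIInt rank;
>>> Vec         v;
>>>
>>> MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
>>> /* ranks 0-3 own 9 entries each; ranks 4-11 own none */
>>> VecCreateMPI(PETSC_COMM_WORLD, rank < 4 ? 9 : 0, 36, &v);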
>>>
>>> The result I am trying to achieve is this:
>>>
>>> Assume the vector is v = <0, 1, 2, ..., 35>
>>>
>>>        Start                             Finish
>>> Proc | Entries              Proc | Entries
>>>    0 | 0,...,8                 0 | 0, 1, 2
>>>    1 | 9,...,17                1 | 3, 4, 5
>>>    2 | 18,...,26               2 | 6, 7, 8
>>>    3 | 27,...,35               3 | 9, 10, 11
>>>    4 | None                    4 | 12, 13, 14
>>>    5 | None                    5 | 15, 16, 17
>>>    6 | None                    6 | 18, 19, 20
>>>    7 | None                    7 | 21, 22, 23
>>>    8 | None                    8 | 24, 25, 26
>>>    9 | None                    9 | 27, 28, 29
>>>   10 | None                   10 | 30, 31, 32
>>>   11 | None                   11 | 33, 34, 35
>>>
>>> Appreciate any help you can provide on this.
>>>
>>> Thanks,
>>> Sreeram
>>>
>>