[petsc-users] doubts on VecScatterCreate

Matthew Knepley knepley at gmail.com
Fri Nov 8 04:23:34 CST 2019


On Fri, Nov 8, 2019 at 2:23 AM Emmanuel Ayala via petsc-users <
petsc-users at mcs.anl.gov> wrote:

> Hi,
>
> Thank you very much for the help and the quick answer.
>
> After checking my code, I found that the problem occurred before the
> scattering. *The scatter routines work perfectly.*
>
> Just to understand, in the line:
>
> ISCreateStride(MPI_COMM_WORLD,(end-start),start,1,&is_fromA);
>
> if I use MPI_COMM_WORLD, does it mean that all processes have a copy of
> the current local process's index set?
> If I use MPI_COMM_SELF, does it mean that only the local process has
> information about the index set?
>

No, each process has whatever you give it in this call. The comm determines
what happens with collective calls, like


https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/IS/ISGetTotalIndices.html
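A minimal sketch of the point (the 5-entries-per-rank stride is arbitrary,
chosen only for illustration): each rank stores only the indices it creates,
but collective queries such as ISGetTotalIndices operate over the IS's comm.

#include <petscis.h>

int main(int argc, char **argv)
{
  IS              is;
  PetscMPIInt     rank;
  PetscInt        nglobal;
  const PetscInt *all;
  PetscErrorCode  ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank); CHKERRQ(ierr);

  /* Each rank stores only the 5 indices it is given here, whatever the comm */
  ierr = ISCreateStride(PETSC_COMM_WORLD, 5, 5 * rank, 1, &is); CHKERRQ(ierr);

  /* Collective on the IS's comm: with PETSC_COMM_WORLD this gathers the
     indices from every rank; with PETSC_COMM_SELF it would return only the
     local 5 indices */
  ierr = ISGetTotalIndices(is, &all); CHKERRQ(ierr);
  ierr = ISGetSize(is, &nglobal); CHKERRQ(ierr); /* 5 * (number of ranks) */
  ierr = ISRestoreTotalIndices(is, &all); CHKERRQ(ierr);

  ierr = ISDestroy(&is); CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}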

  Thanks,

    Matt


> Kind regards
>
>
>
> On Mon, Nov 4, 2019 at 08:47, Smith, Barry F. (
> bsmith at mcs.anl.gov) wrote:
>
>>
>>    It works for me. Please send a complete code that fails.
>>
>>
>>
>>
>> > On Nov 3, 2019, at 11:41 PM, Emmanuel Ayala via petsc-users <
>> petsc-users at mcs.anl.gov> wrote:
>> >
>> > Hi everyone, thanks in advance.
>> >
>> > I have three parallel vectors: A, B and C. A and B have different
>> sizes, and C must contain these two vectors (MATLAB notation C = [A; B]).
>> I need to do some operations on C, then put the proper portions of C back
>> into A and B, then do some computations on A and B and put them back into
>> C, and the loop repeats.
>> >
>> > For this purpose I use scatters:
>> >
>> > C is created as a parallel vector of size (sizeA + sizeB), with
>> PETSC_DECIDE for the parallel layout. The vectors are distributed over the
>> same number of processes.
>> >
>> > For the specific case with order [A;B]
>> >
>> > VecGetOwnershipRange(A,&start,&end);
>> > ISCreateStride(MPI_COMM_WORLD,(end-start),start,1,&is_fromA);
>> > ISCreateStride(MPI_COMM_WORLD,(end-start),start,1,&is_toC1); // this is redundant
>> > VecScatterCreate(A,is_fromA,C,is_toC1,&scatter1);
>> >
>> > VecGetSize(A,&sizeA);
>> > VecGetOwnershipRange(B,&start,&end);
>> > ISCreateStride(MPI_COMM_WORLD,(end-start),start,1,&is_fromB);
>> > ISCreateStride(MPI_COMM_WORLD,(end-start),(start+sizeA),1,&is_toC2); // shifts the index location
>> > VecScatterCreate(B,is_fromB,C,is_toC2,&scatter2);
>> >
>> > Then I can use
>> > VecScatterBegin(scatter1,A,C,INSERT_VALUES,SCATTER_FORWARD);
>> > VecScatterEnd(scatter1,A,C,INSERT_VALUES,SCATTER_FORWARD);
>> >
>> > and
>> > VecScatterBegin(scatter1,C,A,INSERT_VALUES,SCATTER_REVERSE);
>> > VecScatterEnd(scatter1,C,A,INSERT_VALUES,SCATTER_REVERSE);
>> >
>> > and the same with B.
>> > I used MPI_COMM_SELF and got the same results.
>> >
>> > The situation is: my results look correct for the portion of B, but not
>> for the portion of A. Is there something I'm doing wrong with the
>> scattering?
>> >
>> > Best regards.
>>
>>
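
A minimal self-contained sketch of the C = [A; B] pattern described above
(the global sizes 8 and 12 are arbitrary, chosen only for illustration;
error checking is omitted for brevity, as in the quoted snippet):

#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec        A, B, C;
  IS         is_fromA, is_toC1, is_fromB, is_toC2;
  VecScatter scatter1, scatter2;
  PetscInt   start, end, sizeA, sizeB;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* A and B distributed with PETSC_DECIDE; C sized to hold both */
  VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, 8, &A);
  VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, 12, &B);
  VecGetSize(A, &sizeA);
  VecGetSize(B, &sizeB);
  VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, sizeA + sizeB, &C);

  /* A goes to the first sizeA entries of C: source and target IS coincide */
  VecGetOwnershipRange(A, &start, &end);
  ISCreateStride(PETSC_COMM_WORLD, end - start, start, 1, &is_fromA);
  ISCreateStride(PETSC_COMM_WORLD, end - start, start, 1, &is_toC1);
  VecScatterCreate(A, is_fromA, C, is_toC1, &scatter1);

  /* B goes to entries sizeA .. sizeA+sizeB-1 of C: shift the target by sizeA */
  VecGetOwnershipRange(B, &start, &end);
  ISCreateStride(PETSC_COMM_WORLD, end - start, start, 1, &is_fromB);
  ISCreateStride(PETSC_COMM_WORLD, end - start, start + sizeA, 1, &is_toC2);
  VecScatterCreate(B, is_fromB, C, is_toC2, &scatter2);

  /* Forward: gather A and B into C = [A; B] */
  VecScatterBegin(scatter1, A, C, INSERT_VALUES, SCATTER_FORWARD);
  VecScatterEnd(scatter1, A, C, INSERT_VALUES, SCATTER_FORWARD);
  VecScatterBegin(scatter2, B, C, INSERT_VALUES, SCATTER_FORWARD);
  VecScatterEnd(scatter2, B, C, INSERT_VALUES, SCATTER_FORWARD);

  /* ... operate on C ... */

  /* Reverse: put the pieces of C back into A and B */
  VecScatterBegin(scatter1, C, A, INSERT_VALUES, SCATTER_REVERSE);
  VecScatterEnd(scatter1, C, A, INSERT_VALUES, SCATTER_REVERSE);
  VecScatterBegin(scatter2, C, B, INSERT_VALUES, SCATTER_REVERSE);
  VecScatterEnd(scatter2, C, B, INSERT_VALUES, SCATTER_REVERSE);

  ISDestroy(&is_fromA); ISDestroy(&is_toC1);
  ISDestroy(&is_fromB); ISDestroy(&is_toC2);
  VecScatterDestroy(&scatter1); VecScatterDestroy(&scatter2);
  VecDestroy(&A); VecDestroy(&B); VecDestroy(&C);
  PetscFinalize();
  return 0;
}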

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/