[petsc-users] Custom vector ownership ranges

Mohammad Bahaa m.bahaa.eldein at gmail.com
Wed Mar 26 08:28:45 CDT 2014


Actually I tried your suggestion, and it works fine, but it's slightly
different from what I need: each process must be able to read values owned
by the other processes, since there is some interaction between them. So I
need process 0 (for instance) to be capable of reading values outside its
ownership range.
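For exactly this need, the usual PETSc idiom is VecScatterCreateToAll(), which gathers the distributed vector into a sequential copy on every rank, so any process can read any entry. A minimal sketch (the uneven local sizes below are made-up placeholders standing in for the 51/49/52/48 split discussed later in the thread):

```c
#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec                x, xall;   /* distributed vector and its full per-rank copy */
  VecScatter         scat;
  const PetscScalar *a;
  PetscInt           n;
  PetscMPIInt        rank;
  PetscErrorCode     ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);

  /* hypothetical uneven local sizes, e.g. 51 or 49 depending on rank */
  n = (rank % 2) ? 49 : 51;

  ierr = VecCreate(PETSC_COMM_WORLD, &x);CHKERRQ(ierr);
  ierr = VecSetSizes(x, n, PETSC_DETERMINE);CHKERRQ(ierr);
  ierr = VecSetFromOptions(x);CHKERRQ(ierr);
  ierr = VecSet(x, (PetscScalar)rank);CHKERRQ(ierr);

  /* gather the whole parallel vector into a sequential copy on every rank */
  ierr = VecScatterCreateToAll(x, &scat, &xall);CHKERRQ(ierr);
  ierr = VecScatterBegin(scat, x, xall, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterEnd(scat, x, xall, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);

  /* every rank can now read any entry, including off-rank ones */
  ierr = VecGetArrayRead(xall, &a);CHKERRQ(ierr);
  /* ... read a[0] .. a[N-1] as needed ... */
  ierr = VecRestoreArrayRead(xall, &a);CHKERRQ(ierr);

  ierr = VecScatterDestroy(&scat);CHKERRQ(ierr);
  ierr = VecDestroy(&xall);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}
```

Note that the gathered copy is a snapshot: if the parallel vector changes, the scatter must be re-run to refresh it.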


On Wed, Mar 19, 2014 at 1:16 AM, Matthew Knepley <knepley at gmail.com> wrote:

> On Tue, Mar 18, 2014 at 11:00 AM, Mohammad Bahaa <m.bahaa.eldein at gmail.com
> > wrote:
>
>> I used
>> call VecCreateMPIWithArray(PETSC_COMM_WORLD,1,nc,ncall,myx,xall,ierr)
>>
>> however, when I use process 0 to write a file containing the combined
>> values (the xall vector), the values from some processes appear not to be
>> updated, even though I use PetscBarrier; in other words, values locally
>> owned by processes 0 and 2 are OK, but those owned by processes 1 and 3
>> aren't!
>>
>
> For collective writes, use VecView() or -vec_view.
>
>    Matt
>
>
>> On Tue, Mar 18, 2014 at 3:43 PM, Mohammad Bahaa <m.bahaa.eldein at gmail.com
>> > wrote:
>>
>>> the second approach of the MPI vector did it for me, thanks
>>>
>>>
>>> On Tue, Mar 18, 2014 at 3:20 PM, Mohammad Bahaa <
>>> m.bahaa.eldein at gmail.com> wrote:
>>>
>>>> Forgive me, as my expression "sum up" was misleading: I didn't mean to
>>>> literally sum the values in the vectors. I meant I want to put all
>>>> values from each local vector into one global vector that can be
>>>> accessed by all processes, on the "COMM_WORLD" communicator for instance.
>>>>
>>>>
>>>> On Tue, Mar 18, 2014 at 3:09 PM, Matthew Knepley <knepley at gmail.com>wrote:
>>>>
>>>>> On Tue, Mar 18, 2014 at 7:53 AM, Mohammad Bahaa <
>>>>> m.bahaa.eldein at gmail.com> wrote:
>>>>>
>>>>>> I'm using the "PETSC_COMM_SELF" communicator to run n independent
>>>>>> serial processes. I need to combine a certain vector from the n
>>>>>> processes into one vector; however, the vectors involved vary in size
>>>>>> across processes, and I couldn't find any function to define custom
>>>>>> ownership ranges. So, assuming I have a 4-process run with each
>>>>>> process computing an "x" vector as follows:
>>>>>>
>>>>>> 1. process (1) with x of length 51
>>>>>> 2. process (2) with x of length 49
>>>>>> 3. process (3) with x of length 52
>>>>>> 4. process (4) with x of length 48
>>>>>>
>>>>>
>>>>> Let your local length be n, so that on proc 3, n == 52. Then:
>>>>>
>>>>>   VecCreate(comm, &v);
>>>>>   VecSetSizes(v, n, PETSC_DETERMINE);
>>>>>   VecSetFromOptions(v);
>>>>>   <fill up v>
>>>>>   VecSum(v, &sum);
>>>>>
>>>>> You could also make a parallel Vec from your Seq vecs:
>>>>>
>>>>>   VecGetArray(lv, &array);
>>>>>   VecCreateMPIWithArray(comm, 1, n, PETSC_DETERMINE, array, &v);
>>>>>
>>>>>   Thanks,
>>>>>
>>>>>      Matt
>>>>>
>>>>>
>>>>>> The local lengths sum to 100 elements. When I define a vector "x_all"
>>>>>> of size 100 on the "PETSC_COMM_WORLD" communicator, the ownership
>>>>>> ranges come out equal, which isn't the case here. How do I customize
>>>>>> them?
>>>>>>
>>>>>> --
>>>>>> Mohammad Bahaa ElDin
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> What most experimenters take for granted before they begin their
>>>>> experiments is infinitely more interesting than any results to which their
>>>>> experiments lead.
>>>>> -- Norbert Wiener
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> Mohammad Bahaa ElDin
>>>>
>>>
>>>
>>>
>>> --
>>> Mohammad Bahaa ElDin
>>>
>>
>>
>>
>> --
>> Mohammad Bahaa ElDin
>>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
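Pulling Matt's two suggestions above together: wrap each rank's sequential data in a parallel Vec with VecCreateMPIWithArray() (which honors the uneven local sizes), then let VecView() do the collective write instead of funneling everything through rank 0. A minimal sketch, where the filename and the fixed local length are just illustrations:

```c
#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec            lv, v;   /* per-process sequential Vec and its parallel wrapper */
  PetscScalar   *array;
  PetscViewer    viewer;
  PetscInt       n = 51;  /* local length; differs per rank in practice */
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);

  ierr = VecCreateSeq(PETSC_COMM_SELF, n, &lv);CHKERRQ(ierr);
  ierr = VecSet(lv, 1.0);CHKERRQ(ierr);

  /* share the sequential Vec's storage with a parallel Vec; no copy is made */
  ierr = VecGetArray(lv, &array);CHKERRQ(ierr);
  ierr = VecCreateMPIWithArray(PETSC_COMM_WORLD, 1, n, PETSC_DETERMINE,
                               array, &v);CHKERRQ(ierr);

  /* collective write: no barriers or rank-0 bookkeeping needed */
  ierr = PetscViewerASCIIOpen(PETSC_COMM_WORLD, "x_all.txt", &viewer);CHKERRQ(ierr);
  ierr = VecView(v, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  ierr = VecDestroy(&v);CHKERRQ(ierr);
  ierr = VecRestoreArray(lv, &array);CHKERRQ(ierr);
  ierr = VecDestroy(&lv);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}
```

Because VecCreateMPIWithArray() takes each rank's local length directly, the global ownership ranges are exactly the 51/49/52/48 split from the original question, with no need to "customize" anything after the fact.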



-- 
Mohammad Bahaa ElDin
