use of VecPlaceArray in parallel with Fortran

Barry Smith bsmith at mcs.anl.gov
Mon Nov 9 15:00:53 CST 2009


On Nov 9, 2009, at 10:30 AM, Wienand Drenth wrote:

> Hello Barry,
>
> Thank you for that.
>
> Just another question. As I wrote in my first email, in the current
> code we use our own non-PETSc arrays, and with VecPlaceArray we
> "give" these arrays to PETSc vectors to do the KSPSolve. Afterwards, we

     We are having some difficulty understanding your question and
what exactly you want to do.

     In PETSc we use the term "local part" to mean the part of a
vector owned and stored on a particular process. A parallel (global)
vector in PETSc thus stores a piece of itself on each process.
If on each process you have an array that holds the "local part" of a
parallel vector, for example double precision v(nlocal), then you can
create a parallel vector with VecCreateMPIWithArray() or
VecPlaceArray(), passing in v.
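
     For example, here is a minimal sketch of the VecPlaceArray() variant
(assuming the modern PETSc Fortran module interface; the names v, nlocal,
and x are only illustrative and error checking is omitted):

      program main
#include <petsc/finclude/petscvec.h>
      use petscvec
      implicit none

      Vec              x
      PetscErrorCode   ierr
      PetscInt         nlocal
      PetscScalar, allocatable :: v(:)

      call PetscInitialize(PETSC_NULL_CHARACTER, ierr)

      nlocal = 5                      ! entries owned by this process
      allocate(v(nlocal))
      v = 0.0                         ! fill with your own data here

      ! create a parallel vector with the matching local size, then hand
      ! it the local piece of the Fortran array
      call VecCreateMPI(PETSC_COMM_WORLD, nlocal, PETSC_DETERMINE, x, ierr)
      call VecPlaceArray(x, v, ierr)

      ! ... use x, e.g. in KSPSolve(); v holds this process's local part ...

      call VecResetArray(x, ierr)
      call VecDestroy(x, ierr)
      deallocate(v)
      call PetscFinalize(ierr)
      end program main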

    If you have the entire Fortran array stored on one process and you
want it as a parallel PETSc vector, then you can use
VecScatterCreateToZero() to create a scatter and apply it in reverse to
spread the values from that process to all the processes.
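
    Roughly, such a reverse scatter could look like the sketch below
(again assuming the modern Fortran module interface; the names x, xzero,
and the size nglobal are only illustrative):

      program main
#include <petsc/finclude/petscvec.h>
      use petscvec
      implicit none

      Vec              x, xzero
      VecScatter       ctx
      PetscErrorCode   ierr
      PetscMPIInt      rank
      PetscInt         i, nglobal
      PetscScalar, pointer :: a(:)

      call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
      call MPI_Comm_rank(PETSC_COMM_WORLD, rank, ierr)

      nglobal = 10
      call VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, nglobal, x, ierr)

      ! xzero has length nglobal on process 0 and length 0 elsewhere
      call VecScatterCreateToZero(x, ctx, xzero, ierr)

      if (rank == 0) then
         call VecGetArrayF90(xzero, a, ierr)
         do i = 1, nglobal
            a(i) = i              ! copy your Fortran array in here
         end do
         call VecRestoreArrayF90(xzero, a, ierr)
      end if

      ! reverse scatter: process 0's sequential vector -> parallel x
      call VecScatterBegin(ctx, xzero, x, INSERT_VALUES, SCATTER_REVERSE, ierr)
      call VecScatterEnd(ctx, xzero, x, INSERT_VALUES, SCATTER_REVERSE, ierr)

      call VecView(x, PETSC_VIEWER_STDOUT_WORLD, ierr)

      call VecScatterDestroy(ctx, ierr)
      call VecDestroy(xzero, ierr)
      call VecDestroy(x, ierr)
      call PetscFinalize(ierr)
      end program main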

    If you have parts stored on each process and you want ghost points  
filled in on each process then you need to set up a scatter with  
VecScatterCreate().
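
    As a hypothetical illustration of that last case (the vectors, sizes,
and two-entry index lists below are all made up; in practice the needed
global indices come from your problem's connectivity):

      program main
#include <petsc/finclude/petscvec.h>
      use petscvec
      implicit none

      Vec              xglobal, xlocal
      VecScatter       ctx
      IS               isfrom, isto
      PetscErrorCode   ierr
      PetscInt         nown, nneed, nglobal
      PetscInt         needed(2), localpos(2)
      PetscScalar      one

      call PetscInitialize(PETSC_NULL_CHARACTER, ierr)

      ! a parallel vector with nown entries owned by each process
      nown = 5
      call VecCreateMPI(PETSC_COMM_WORLD, nown, PETSC_DETERMINE, xglobal, ierr)
      one = 1.0
      call VecSet(xglobal, one, ierr)
      call VecGetSize(xglobal, nglobal, ierr)

      ! a sequential work vector on each process to receive ghost values
      nneed = 2
      call VecCreateSeq(PETSC_COMM_SELF, nneed, xlocal, ierr)

      ! global indices this process needs and where they go locally
      ! (both 0-based); here: the first and last global entries
      needed(1)   = 0
      needed(2)   = nglobal - 1
      localpos(1) = 0
      localpos(2) = 1
      call ISCreateGeneral(PETSC_COMM_SELF, nneed, needed, PETSC_COPY_VALUES, isfrom, ierr)
      call ISCreateGeneral(PETSC_COMM_SELF, nneed, localpos, PETSC_COPY_VALUES, isto, ierr)

      call VecScatterCreate(xglobal, isfrom, xlocal, isto, ctx, ierr)
      call VecScatterBegin(ctx, xglobal, xlocal, INSERT_VALUES, SCATTER_FORWARD, ierr)
      call VecScatterEnd(ctx, xglobal, xlocal, INSERT_VALUES, SCATTER_FORWARD, ierr)

      ! xlocal now holds the requested ghost values on every process

      call VecScatterDestroy(ctx, ierr)
      call ISDestroy(isfrom, ierr)
      call ISDestroy(isto, ierr)
      call VecDestroy(xlocal, ierr)
      call VecDestroy(xglobal, ierr)
      call PetscFinalize(ierr)
      end program main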

    Barry



> can just continue with our local non-PETSc arrays. If I understand you
> correctly, this approach will not be possible in
> a parallel setting?
>
> When I run with, for example, two processors, and the local array is
> blocal = 1, 2, ..., 10,
> then on the zeroth processor I also have values 1, 2, ..., 10 and
> not just half (i.e., 1,2,3,4,5,0,0,0,0,0).
> On the first processor I have only part of the values, but they start
> with the first entry of my array, and not half-way:
> 0,0,0,0,0, 1,2,3,4,5 instead of 0,0,0,0,0, 6,7,8,9,10.
>
>
> Regards,
>
> Wienand
>
> On Fri, Nov 6, 2009 at 7:48 PM, Barry Smith <bsmith at mcs.anl.gov>  
> wrote:
>>
>> VecPlaceArray() gives the vector its local (on-process) part of the
>> array, not the whole array (and requires no communication). If you
>> want the entire array of the vector on one or all processes, you can
>> use VecScatterCreateToAll() or VecScatterCreateToZero() and then use
>> the VecScatter created to move the values to where you want them.
>>
>>  Barry
>>
>> On Nov 6, 2009, at 12:00 PM, Wienand Drenth wrote:
>>
>>> Hello all,
>>>
>>> In my research code I solve a linear system of equations, and (of
>>> course) I use PETSc routines for that. However, in the code we have
>>> our own data arrays for the right-hand-side vector B and the solution
>>> vector X. Only just prior to the call to KSPSolve, we use the routine
>>> VecPlaceArray to synchronize the Fortran arrays B and X with their
>>> PETSc counterparts (for example M_B and M_X, respectively).
>>>
>>> I was wondering if this would work in parallel as well? I have
>>> adapted one of the tutorial examples (ex2f from the ksp tutorials)
>>> to utilize the VecPlaceArray mechanism. I encountered no problems,
>>> except when I want to run the program in parallel.
>>>
>>> When I do that, and print my own vector X afterwards, different
>>> processors show different parts of the solution. For example, for a
>>> vector of length 10, and with two processors, processor one will
>>> have values for the first five elements (remainder is zero), and
>>> processor two will have values for the last five elements in the array.
>>>
>>> From the same ksp tutorials, I have tried ex13 as well, the C program.
>>>
>>> Here I do not get partial outputs for different processors.
>>>
>>> I wonder whether one cannot use VecPlaceArray in a parallel setting
>>> in Fortran, except by doing extra bookkeeping? I hope someone can
>>> enlighten me and indicate where I missed something in my programming
>>> or otherwise.
>>>
>>> Thanks in advance,
>>>
>>> Wienand Drenth
>>>
>>>
>>>
>>> --
>>> Wienand Drenth PhD
>>> Eindhoven, the Netherlands
>>
>>
>
>
>
> -- 
> Wienand Drenth PhD
> Eindhoven, the Netherlands


