Use of VecPlaceArray in parallel with Fortran

Barry Smith bsmith at mcs.anl.gov
Fri Nov 6 12:48:00 CST 2009


   VecPlaceArray() supplies the vector with only its local (on-process)
part of the array, not the whole array, and involves no communication.
If you want the entire vector on one process or on all processes, you
can use VecScatterCreateToAll() or VecScatterCreateToZero() and then
use the resulting VecScatter to move the values to where you want them.
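    To make the distinction concrete, here is a minimal sketch (mine, not
part of this exchange) of that pattern using the C API of recent PETSc
releases; the Fortran calls are analogous, with a trailing error-code
argument. The array name myarray, the global length N = 10, and the
printed diagnostic are illustrative choices only, not anything from
Wienand's code.

/* Sketch: an application-owned array backs the LOCAL part of a parallel
 * Vec via VecPlaceArray(); VecScatterCreateToAll() then gathers the
 * complete vector onto every process. */
#include <petscvec.h>

int main(int argc, char **argv)
{
  PetscErrorCode     ierr;
  Vec                x, xall;
  VecScatter         ctx;
  PetscInt           N = 10, nlocal, rstart, rend, i;
  PetscScalar        *myarray;   /* application-owned storage */
  const PetscScalar  *full;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

  /* Parallel vector of global length N; PETSc chooses the local sizes */
  ierr = VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, N, &x); CHKERRQ(ierr);
  ierr = VecGetLocalSize(x, &nlocal); CHKERRQ(ierr);
  ierr = VecGetOwnershipRange(x, &rstart, &rend); CHKERRQ(ierr);

  /* The application array holds ONLY this process's nlocal entries */
  ierr = PetscMalloc1(nlocal, &myarray); CHKERRQ(ierr);
  for (i = 0; i < nlocal; i++) myarray[i] = (PetscScalar)(rstart + i);

  /* Let the vector use the application storage; nothing is copied or
     communicated */
  ierr = VecPlaceArray(x, myarray); CHKERRQ(ierr);

  /* ... a KSPSolve() writing into x would update myarray in place ... */

  /* Gather the WHOLE vector onto every process */
  ierr = VecScatterCreateToAll(x, &ctx, &xall); CHKERRQ(ierr);
  ierr = VecScatterBegin(ctx, x, xall, INSERT_VALUES, SCATTER_FORWARD); CHKERRQ(ierr);
  ierr = VecScatterEnd(ctx, x, xall, INSERT_VALUES, SCATTER_FORWARD); CHKERRQ(ierr);

  /* xall now contains all N entries on each process */
  ierr = VecGetArrayRead(xall, &full); CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_SELF, "first entry on this rank: %g\n",
                     (double)PetscRealPart(full[0])); CHKERRQ(ierr);
  ierr = VecRestoreArrayRead(xall, &full); CHKERRQ(ierr);

  ierr = VecResetArray(x); CHKERRQ(ierr);  /* detach the application array */
  ierr = VecScatterDestroy(&ctx); CHKERRQ(ierr);
  ierr = VecDestroy(&xall); CHKERRQ(ierr);
  ierr = VecDestroy(&x); CHKERRQ(ierr);
  ierr = PetscFree(myarray); CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}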

    Barry

On Nov 6, 2009, at 12:00 PM, Wienand Drenth wrote:

> Hello all,
>
> In my research code I solve a linear system of equations, and (of
> course) I use PETSc routines for that. However, in the code we have
> our own data arrays for the right-hand-side vector B and the solution
> vector X. Just prior to the call to KSPSolve, we use the routine
> VecPlaceArray to synchronize the Fortran arrays B and X with their
> PETSc counterparts (M_B and M_X, respectively).
>
> I was wondering whether this works in parallel as well. I have adapted
> one of the tutorial examples (ex2f from the ksp tutorials) to use
> the VecPlaceArray mechanism. I encountered no problems, except when I
> run the program in parallel.
>
> When I do that and print my own vector X afterwards, different
> processors show different parts of the solution. For example, for a
> vector of length 10 and two processors, processor one has values
> for the first five elements (the remainder is zero), and processor
> two has values for the last five elements of the array.
>
> From the same ksp tutorials, I have tried ex13 as well, the C program.
> Here I do not get partial outputs for different processors.
>
> I wonder whether VecPlaceArray can be used in a parallel setting in
> Fortran at all without extra bookkeeping. I hope someone can
> enlighten me and indicate where I have missed something in my
> programming or otherwise.
>
> Thanks in advance,
>
> Wienand Drenth
>
>
>
> -- 
> Wienand Drenth PhD
> Eindhoven, the Netherlands
