use of VecPlaceArray in parallel with fortran

Jed Brown jed at 59A2.org
Tue Nov 10 05:40:39 CST 2009


Wienand Drenth wrote:
> Would the following procedure lead to a correct and working solution:
>
> Suppose I have a Fortran array X, and I create on processor zero a
> sequential PETSc vector  MS_X and place the array X into MS_X using
> VecPlaceArray. With VecScatterCreateToZero, and SCATTER_REVERSE as
> scatter mode I can spread it onto the global (parallel) vector M_X.
> 
> After my calculations, I can do the same to scatter the parallel
> solution onto my sequential vector MS_X (now with SCATTER_FORWARD),
> and continue afterwards with X.

With this last part, you are responsible for broadcasting X before your
code can continue.  VecScatterCreateToAll() would get PETSc to do the
broadcast for you, *but* both variants may be too restrictive for what
you want.
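For concreteness, here is a minimal sketch of the round trip you
describe (untested, in C rather than Fortran, using 2009-era PETSc
calls; the array X, its length n, and all error checking are
placeholders for what your code already has):

```c
#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec         M_X, MS_X;
  VecScatter  ctx;
  PetscMPIInt rank;
  PetscInt    n = 8;
  PetscScalar X[8] = {0,1,2,3,4,5,6,7};  /* stand-in for your Fortran array */

  PetscInitialize(&argc, &argv, 0, 0);
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

  /* Parallel vector; PETSc chooses the local sizes. */
  VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, n, &M_X);

  /* MS_X has full length on rank 0 and length 0 elsewhere. */
  VecScatterCreateToZero(M_X, &ctx, &MS_X);
  if (!rank) VecPlaceArray(MS_X, X);    /* wrap X without copying */

  /* REVERSE: sequential on rank 0 -> parallel */
  VecScatterBegin(ctx, MS_X, M_X, INSERT_VALUES, SCATTER_REVERSE);
  VecScatterEnd(ctx, MS_X, M_X, INSERT_VALUES, SCATTER_REVERSE);

  /* ... compute with M_X ... */

  /* FORWARD: parallel -> sequential on rank 0; X then holds the result */
  VecScatterBegin(ctx, M_X, MS_X, INSERT_VALUES, SCATTER_FORWARD);
  VecScatterEnd(ctx, M_X, MS_X, INSERT_VALUES, SCATTER_FORWARD);
  if (!rank) VecResetArray(MS_X);

  VecScatterDestroy(ctx);  /* 2009-era signatures; newer PETSc takes &ctx */
  VecDestroy(MS_X);
  VecDestroy(M_X);
  PetscFinalize();
  return 0;
}
```

Note that after the forward scatter only rank 0 has the solution in X,
which is exactly the broadcast issue above.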

These scatters will only work if the local portions are contiguous (this
is an issue of natural versus "PETSc" ordering, see Figure 9 of the
user's manual).  Presumably your code uses the natural ordering, but
solvers will perform better if they can use the PETSc ordering.
Therefore you will probably have to make your own scatter.
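If you do build your own scatter, the usual pattern is to have rank 0
alone request the parallel entries in natural order; SCATTER_FORWARD
then gathers to rank 0 in natural order and SCATTER_REVERSE distributes.
A hedged fragment (the mapping array natural_to_petsc and the vectors
M_X/MS_X from before are assumed to exist; an AO, e.g. via
AOApplicationToPetsc(), is one way to obtain the mapping):

```c
IS         is_petsc;
VecScatter ctx;

/* natural_to_petsc[i] = global PETSc-ordering index of natural entry i */
if (!rank) ISCreateGeneral(PETSC_COMM_SELF, n, natural_to_petsc, &is_petsc);
else       ISCreateGeneral(PETSC_COMM_SELF, 0, PETSC_NULL, &is_petsc);

/* A NULL destination IS means "0..n-1 in order" on the sequential side. */
VecScatterCreate(M_X, is_petsc, MS_X, PETSC_NULL, &ctx);

/* Gather the parallel vector to rank 0 in natural order. */
VecScatterBegin(ctx, M_X, MS_X, INSERT_VALUES, SCATTER_FORWARD);
VecScatterEnd(ctx, M_X, MS_X, INSERT_VALUES, SCATTER_FORWARD);
```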

Assembling the matrix is trickier: it will be a major bottleneck if
process 0 has to do all of it (unless you solve many problems with the
same matrix), and it is expensive to assemble it on the wrong process
(i.e. assemble in the natural ordering and let PETSc send the entries to
the correct process).

I don't know how your code is organized, but I highly recommend using a
decomposition like the one provided by the DA (and preferably also use
the DA itself, even if it means you have to do more copies -- cheap
compared to the shenanigans we are talking about here).  This should
involve *less* modification to your existing serial code, and will offer
much better scalability.
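For example (untested sketch, 2009-era DA calls; the grid sizes Mx and
My are assumptions), the DA gives you the natural <-> PETSc ordering
conversion for free on a structured 2d grid:

```c
DA  da;
Vec global, natural;

DACreate2d(PETSC_COMM_WORLD, DA_NONPERIODIC, DA_STENCIL_STAR,
           Mx, My, PETSC_DECIDE, PETSC_DECIDE, 1, 1,
           PETSC_NULL, PETSC_NULL, &da);

DACreateGlobalVector(da, &global);    /* PETSc ordering, for the solver  */
DACreateNaturalVector(da, &natural);  /* natural ordering, like your X   */

/* Fill 'natural' (e.g. scatter your array from rank 0), then convert: */
DANaturalToGlobalBegin(da, natural, INSERT_VALUES, global);
DANaturalToGlobalEnd(da, natural, INSERT_VALUES, global);
```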

Jed
