DMComposite and VecGhost
Jed Brown
jed at 59A2.org
Wed May 13 07:01:10 CDT 2009
This does not work correctly because the ghost values belong to a
different vector after VecPlaceArray. A design that seems attractive is
to have the DMComposite vector use VecGhost with layout
[field1_owned, field2_owned | field1_ghost, field2_ghost]
You could then update ghosts for all fields at once, effectively
hoisting the communication out of the loop in DMCompositeScatter. The
global-to-local scatter for each component would be replaced by a
sequential scatter from the local form of the composite vector to the
local form of each component vector. For fields using VecGhost, this
local scatter just copies two contiguous blocks into the local
representation, and for an Array field it is a single sequential copy.
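
To make the layout concrete, here is a minimal sketch, assuming two
fields whose owned sizes (n1,n2) and ghost index lists (ghosts1,ghosts2)
are already known on each process; this is not existing DMComposite
code, it just builds the proposed layout directly with VecCreateGhost:

    #include <petscvec.h>

    /* Sketch only (not existing DMComposite code): build the proposed layout
       [field1_owned, field2_owned | field1_ghost, field2_ghost] by hand.
       n1,n2 are the owned sizes of the two fields; ghosts1[],ghosts2[] hold the
       global indices of each field's ghost points and are placeholders here. */
    static PetscErrorCode BuildCompositeGhostVec(PetscInt n1,PetscInt nghost1,const PetscInt ghosts1[],
                                                 PetscInt n2,PetscInt nghost2,const PetscInt ghosts2[],Vec *X)
    {
      PetscErrorCode ierr;
      PetscInt       nghost = nghost1 + nghost2,*ghosts;

      PetscFunctionBegin;
      ierr = PetscMalloc(nghost*sizeof(PetscInt),&ghosts);CHKERRQ(ierr);
      ierr = PetscMemcpy(ghosts,ghosts1,nghost1*sizeof(PetscInt));CHKERRQ(ierr);
      ierr = PetscMemcpy(ghosts+nghost1,ghosts2,nghost2*sizeof(PetscInt));CHKERRQ(ierr);
      /* Owned block is [field1_owned, field2_owned]; the ghost block follows it. */
      ierr = VecCreateGhost(PETSC_COMM_WORLD,n1+n2,PETSC_DECIDE,nghost,ghosts,X);CHKERRQ(ierr);
      ierr = PetscFree(ghosts);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }

    /* Usage: one parallel update fills ghost values for all fields at once;
       the local form then exposes [owned | ghost] for sequential per-field copies. */
    ierr = VecGhostUpdateBegin(X,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
    ierr = VecGhostUpdateEnd(X,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
    ierr = VecGhostGetLocalForm(X,&Xlocal);CHKERRQ(ierr);
    /* ... copy contiguous blocks into each field's local vector ... */
    ierr = VecGhostRestoreLocalForm(X,&Xlocal);CHKERRQ(ierr);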
I suspect that cutting down on comm by only having one parallel scatter
for all fields could be significant for large processor counts with
matrix-free methods.
This can be done in a way that is compatible with the current design:
DA global vecs have no ghost values, so VecGhostUpdate on the composite
vector would be a no-op (though there would still be parallel
communication for each field in DAGlobalToLocal). A modest API addition
could allow a purely local scatter from the DMComposite's local form to
the DA local vec.
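
For illustration, such an addition might look roughly like the
following; DMCompositeScatterLocal is a hypothetical name, not an
existing PETSc function, and the DA case would still do its own parallel
communication in DAGlobalToLocal as noted above:

    /* Hypothetical API sketch: a purely sequential counterpart of DMCompositeScatter
       that unpacks the local form of the composite vector into per-field local vectors. */
    PetscErrorCode DMCompositeScatterLocal(DMComposite packer,Vec Xlocal,...);

    /* Possible usage, after a single VecGhostUpdate on the composite vector X: */
    ierr = VecGhostGetLocalForm(X,&Xlocal);CHKERRQ(ierr);
    ierr = DMCompositeScatterLocal(packer,Xlocal,da_local_vec,array_local_vec);CHKERRQ(ierr); /* no parallel comm */
    ierr = VecGhostRestoreLocalForm(X,&Xlocal);CHKERRQ(ierr);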
Note that this design also permits a single-comm adjoint to
DMCompositeScatter. DMCompositeGather is not sufficient since it only
updates the owned portion, but in the FEM context elements usually
appear on only one process, so we need an analogue of

  VecGhostUpdateBegin(g,ADD_VALUES,SCATTER_REVERSE)
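
i.e., as a rough sketch, the reverse/additive counterpart of the
forward update, assuming g is the ghosted composite vector and the
element contributions have already been added into its ghost region:

    /* Accumulate ghost contributions back onto their owning processes. */
    ierr = VecGhostUpdateBegin(g,ADD_VALUES,SCATTER_REVERSE);CHKERRQ(ierr);
    ierr = VecGhostUpdateEnd(g,ADD_VALUES,SCATTER_REVERSE);CHKERRQ(ierr);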
Are there reasons this is a bad idea? Would you prefer a different way
of using VecGhost with DMComposite?
Jed