[petsc-users] Efficient interaction of local and distributed Mat/Vec

Jed Brown jedbrown at mcs.anl.gov
Mon Nov 21 08:12:05 CST 2011


On Mon, Nov 21, 2011 at 07:31, Thomas Witkowski <Thomas.Witkowski at tu-dresden.de> wrote:

> In my FETI-DP code, there are many interactions between local and
> distributed data structures (matrices, vectors). Mainly, each rank has a
> matrix mat_bb (MATSEQAIJ) representing the local subdomain problem
> (without the primal nodes). In my implementation, the corresponding vector
> f_b is a distributed VECMPI, so on each rank the local part of f_b
> corresponds to the size of the local matrix mat_bb. For each solve with
> mat_bb and the right-hand side f_b, my code creates a temporary vector
> f_b_seq (VECSEQ), creates two IS (for the global indices in f_b and the
> local ones in f_b_seq), and copies the values from f_b to f_b_seq with a
> VecScatter. After the solve with mat_bb, the same is done the other way
> round.
>
> This works fine. My question: Is this the best/most efficient way to do
> it with PETSc? I'm not really sure. It's a lot of code, and I do not like
> the idea of copying the same values from one data structure to another
> just to make them "compatible" in some way.
>
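
For reference, a minimal sketch of the scatter-based approach described
above, assuming the local size of mat_bb equals the locally owned range of
f_b; ksp_bb (the local subdomain solver) and u_b (the distributed solution
vector) are hypothetical names, and error checking is omitted:

PetscInt   nlocal, rstart, rend;
IS         is_global, is_local;
Vec        f_b_seq, u_b_seq, u_b;
VecScatter scatter;

VecGetOwnershipRange(f_b, &rstart, &rend);
nlocal = rend - rstart;

/* sequential work vectors matching the local block handled by mat_bb */
VecCreateSeq(PETSC_COMM_SELF, nlocal, &f_b_seq);
VecDuplicate(f_b_seq, &u_b_seq);
VecDuplicate(f_b, &u_b);

/* global indices of the locally owned part of f_b, local indices in f_b_seq */
ISCreateStride(PETSC_COMM_SELF, nlocal, rstart, 1, &is_global);
ISCreateStride(PETSC_COMM_SELF, nlocal, 0, 1, &is_local);

/* copy the local part of f_b into f_b_seq */
VecScatterCreate(f_b, is_global, f_b_seq, is_local, &scatter);
VecScatterBegin(scatter, f_b, f_b_seq, INSERT_VALUES, SCATTER_FORWARD);
VecScatterEnd(scatter, f_b, f_b_seq, INSERT_VALUES, SCATTER_FORWARD);

/* local subdomain solve, e.g. KSPSolve(ksp_bb, f_b_seq, u_b_seq); */

/* copy the sequential solution back into the distributed vector */
VecScatterBegin(scatter, u_b_seq, u_b, INSERT_VALUES, SCATTER_REVERSE);
VecScatterEnd(scatter, u_b_seq, u_b, INSERT_VALUES, SCATTER_REVERSE);

VecScatterDestroy(&scatter);
ISDestroy(&is_global); ISDestroy(&is_local);
VecDestroy(&f_b_seq); VecDestroy(&u_b_seq);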

It is very unlikely that these copies (which end up calling memcpy()) have
a measurable effect on performance. There is a new alternative that is
slightly less code and avoids the copy in some cases. Call

VecGetSubVector(f_b, is_f_b, &f_b_seq); // is_f_b created on MPI_COMM_SELF
// use f_b_seq
VecRestoreSubVector(f_b, is_f_b, &f_b_seq);

The index set is_f_b should have been created on MPI_COMM_SELF (the
communicator that you want f_b_seq to reside on) and contain the global
indices from f_b that you want.

http://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/Vec/VecGetSubVector.html
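
For example, a minimal sketch of this route, assuming the indices you want
are simply the locally owned range of f_b (your FETI-DP code would build
is_f_b from its own index sets; error checking omitted):

PetscInt rstart, rend;
IS       is_f_b;
Vec      f_b_seq;

VecGetOwnershipRange(f_b, &rstart, &rend);
/* global indices of f_b, created on the communicator f_b_seq should live on */
ISCreateStride(PETSC_COMM_SELF, rend - rstart, rstart, 1, &is_f_b);

VecGetSubVector(f_b, is_f_b, &f_b_seq);
/* use f_b_seq, e.g. as the right-hand side of the local solve with mat_bb */
VecRestoreSubVector(f_b, is_f_b, &f_b_seq);
ISDestroy(&is_f_b);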

It is also possible to use VecPlaceArray(), but I find that much uglier
than VecGetSubVector().
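
For completeness, a sketch of the VecPlaceArray() variant, assuming f_b_seq
should alias the whole locally owned part of f_b (no copy is made; names
and error handling are simplified):

PetscScalar *array;
PetscInt    nlocal;
Vec         f_b_seq;

VecGetLocalSize(f_b, &nlocal);
VecCreateSeq(PETSC_COMM_SELF, nlocal, &f_b_seq);

/* make f_b_seq use the local storage of f_b directly */
VecGetArray(f_b, &array);
VecPlaceArray(f_b_seq, array);
/* use f_b_seq */
VecResetArray(f_b_seq);
VecRestoreArray(f_b, &array);
VecDestroy(&f_b_seq);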