[petsc-users] Question about parallel Vectors and communicators

Jed Brown jed at jedbrown.org
Tue May 7 11:43:15 CDT 2019


The standard approach would be to communicate via the parent comm.  So
you split COMM_WORLD into part0 and part1 and use a VecScatter with
Vecs on the world communicator (the source can have zero local entries
on part1's ranks and the destination zero local entries on part0's
ranks) to exchange your data.  You can use VecPlaceArray or
VecCreate*WithArray to avoid an extra copy.
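
For concreteness, a minimal petsc4py sketch of that pattern.  It assumes
an even number of ranks so the two parts (and hence the two Vecs) have
the same global length, mpi4py for the communicator split, and made-up
names and sizes (part0/part1, nloc); a real code would build index sets
describing the actual mapping instead of scattering the whole vector.

    # Sketch only: run with an even number of ranks, e.g.
    #   mpiexec -n 4 python scatter_parts.py
    import numpy as np
    from petsc4py import PETSc

    world = PETSc.COMM_WORLD
    rank, size = world.getRank(), world.getSize()

    # Hypothetical split: lower half of the ranks form part0, the rest part1.
    color = 0 if rank < size // 2 else 1
    part = PETSc.Comm(world.tompi4py().Split(color, rank))  # per-part work lives here

    nloc = 5  # local entries per active rank (made up for the sketch)

    # Source Vec lives on COMM_WORLD but has nonzero local length only on
    # part0's ranks; wrapping an existing NumPy buffer is the
    # VecCreate*WithArray trick that avoids an extra copy.
    if color == 0:
        src_buf = np.arange(nloc, dtype=PETSc.ScalarType) + rank * nloc
    else:
        src_buf = np.empty(0, dtype=PETSc.ScalarType)
    src = PETSc.Vec().createWithArray(src_buf, comm=world)

    # Destination Vec: nonzero local length only on part1's ranks.
    dst_buf = np.empty(nloc if color == 1 else 0, dtype=PETSc.ScalarType)
    dst = PETSc.Vec().createWithArray(dst_buf, comm=world)

    # With an even rank count both Vecs have the same global length, so a
    # whole-vector scatter (NULL index sets) moves part0's data to part1.
    scatter = PETSc.Scatter().create(src, None, dst, None)
    scatter.scatter(src, dst)  # defaults: insert values, forward mode

    if color == 1:
        print(f"rank {rank} received {dst_buf}")

Because both Vecs wrap user-owned NumPy buffers, the received values
appear directly in dst_buf on part1's ranks, with no intermediate copy.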

GIRET Jean-Christophe via petsc-users <petsc-users at mcs.anl.gov> writes:

> Dear PETSc users,
>
> I would like to use Petsc4Py for a project extension, which consists mainly of:
>
> - Storing data and matrices across several ranks/nodes when they cannot fit on a single node.
>
> - Performing some linear algebra in parallel (solving sparse linear systems, for instance).
>
> - Exchanging those data structures (parallel vectors) between non-overlapping MPI communicators, created for instance by splitting MPI_COMM_WORLD.
>
> While the first two items seem to be well addressed by PETSc, I am wondering about the last one.
>
> Is it possible to access the data of a vector defined on one communicator from another, non-overlapping communicator? From what I have seen in the documentation and several threads on the user mailing list, I would say no. But maybe I am missing something? If not, is it possible to transfer a vector defined on a given communicator to a communicator that is a subset of the previous one?
>
> Best regards,
> Jean-Christophe
