[petsc-users] Question about parallel Vectors and communicators

Mark Adams mfadams at lbl.gov
Tue May 7 14:39:27 CDT 2019


On Tue, May 7, 2019 at 11:38 AM GIRET Jean-Christophe via petsc-users <
petsc-users at mcs.anl.gov> wrote:

> Dear PETSc users,
>
>
>
> I would like to use Petsc4Py for a project extension, which consists
> mainly of:
>
> - Storing data and matrices across several ranks/nodes when they
> cannot fit on a single node.
>
> - Performing linear algebra in parallel (solving a sparse linear
> system, for instance).
>
> - Exchanging those data structures (parallel vectors) between
> non-overlapping MPI communicators, created for instance by splitting
> MPI_COMM_WORLD.
>
>
>
> While the first two items seem to be well addressed by PETSc, I am
> wondering about the last one.
>
>
>
> Is it possible to access the data of a vector, defined on one
> communicator, from another, non-overlapping communicator? From what I
> have seen in the documentation and the several threads on the user
> mailing list, I would say no. But maybe I am missing something? If not,
> is it possible to transfer a vector defined on a given communicator to
> a communicator which is a subset of the previous one?
>

If you are sending to a subset of processes, then VecGetSubVector plus
Jed's tricks might work.

https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Vec/VecGetSubVector.html


>
>
> Best regards,
>
> Jean-Christophe
>
>
>
>
>

