[petsc-users] Question about parallel Vectors and communicators

Matthew Knepley knepley at gmail.com
Thu May 9 12:18:24 CDT 2019


On Thu, May 9, 2019 at 12:34 PM GIRET Jean-Christophe via petsc-users <
petsc-users at mcs.anl.gov> wrote:

> Hello,
>
>
>
> Thanks Mark and Jed for your quick answers.
>
>
>
> So the idea is to define all the Vecs on the world communicator, and
> perform the communications using traditional scatter objects? The data
> would still be accessible on the two sub-communicators as they are both
> subsets of the MPI_COMM_WORLD communicator, but they would be used while
> creating the Vecs or the IS for the scatter. Is that right?
>
>
>
> I’m currently trying, without success, to perform a scatter from an MPI
> Vec defined on a subcomm to another Vec defined on the world comm, and
> vice-versa. But I don’t know if it’s possible.
>

You cannot do that. What you want to do is:

1) Create two Vecs on COMM_WORLD. Make the second Vec have local size 0 on
processes not in the subcomm.

2) Create a scatter between the two Vecs.

3) Scatter the data.

4) Use VecGetArray() to get the pointer to the data of the second Vec, and
wrap it with VecCreateMPIWithArray() ONLY on the subcomm,
    or, if you do not mind copies, just use a copy.
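
For instance, a minimal petsc4py sketch of steps 1)-4); the global size (100
entries), the choice of subcomm (first half of the world ranks), and all
variable names below are assumptions made only for illustration.

import sys
import petsc4py
petsc4py.init(sys.argv)
from petsc4py import PETSc
from mpi4py import MPI

world = MPI.COMM_WORLD
in_sub = world.rank < world.size // 2           # assumed subcomm membership
subcomm = world.Split(0 if in_sub else MPI.UNDEFINED, world.rank)

n_global = 100                                  # assumed global size

# 1) Two Vecs on COMM_WORLD; the second has local size 0 outside the subcomm
x = PETSc.Vec().createMPI(n_global, comm=PETSc.COMM_WORLD)
x.set(1.0)

if in_sub:
    nsub, rsub = subcomm.size, subcomm.rank
    nloc = n_global // nsub + (1 if rsub < n_global % nsub else 0)
else:
    nloc = 0
y = PETSc.Vec().createMPI((nloc, n_global), comm=PETSc.COMM_WORLD)

# 2) A scatter between the two Vecs (identity map in global numbering)
start, end = y.getOwnershipRange()
iset = PETSc.IS().createStride(end - start, first=start, step=1,
                               comm=PETSc.COMM_WORLD)
scatter = PETSc.Scatter().create(x, iset, y, iset)

# 3) Move the data
scatter.scatter(x, y, addv=PETSc.InsertMode.INSERT_VALUES,
                mode=PETSc.ScatterMode.FORWARD)

# 4) On the subcomm only, wrap y's local array in a Vec living on the subcomm
if in_sub:
    y_sub = PETSc.Vec().createWithArray(y.getArray(), comm=subcomm)
    # ... use y_sub with objects defined on the subcomm ...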

  Thanks,

    Matt


>
>
> I can imagine that trying to do that seems a bit strange. However, I’m
> dealing with code coupling (and linear algebra for the main part of the
> code), and my idea was to try to use the Vec data structures to perform
> data exchange between parts of the software which would have their own
> communicators. It would eliminate the need to re-implement an ad hoc
> solution.
>
>
>
> One option would be to stick with the world communicator for the whole
> PETSc part, but I could face situations where my Vecs are small while the
> coupled part requires running the whole simulation on a large number of
> cores. I imagine that may not really serve the linear-system-solving part
> in terms of performance. Another option would be to perform all the PETSc
> operations on a sub-communicator and use “raw” MPI communications between
> the communicators to perform the data exchange for the coupling part.
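
A rough sketch of that second option (PETSc objects on a sub-communicator,
plain MPI between the two groups for the coupling data). The 50/50 split of
ranks, the even number of world ranks, the vector size and the rank pairing
are all assumptions made only for this illustration.

import sys
import petsc4py
petsc4py.init(sys.argv)
from petsc4py import PETSc
from mpi4py import MPI

world = MPI.COMM_WORLD                       # assumes an even number of ranks
color = 0 if world.rank < world.size // 2 else 1
subcomm = world.Split(color, world.rank)     # one communicator per coupled code

if color == 0:
    # PETSc work (assembly, solves, ...) stays on this sub-communicator
    v = PETSc.Vec().createMPI(50, comm=subcomm)    # assumed size
    v.set(float(world.rank))
    # ship the local block to the paired rank of the other code with raw MPI
    partner = world.rank + world.size // 2
    world.send(v.getArray().copy(), dest=partner, tag=0)
else:
    partner = world.rank - world.size // 2
    local_block = world.recv(source=partner, tag=0)   # a numpy array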
>
>
>
> Thanks again for your support,
>
> Best regards,
>
> Jean-Christophe
>
>
>
> *From:* Mark Adams [mailto:mfadams at lbl.gov]
> *Sent:* Tuesday, May 7, 2019 21:39
> *To:* GIRET Jean-Christophe
> *Cc:* petsc-users at mcs.anl.gov
> *Subject:* Re: [petsc-users] Question about parallel Vectors and
> communicators
>
>
>
>
>
>
>
> On Tue, May 7, 2019 at 11:38 AM GIRET Jean-Christophe via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
>
> Dear PETSc users,
>
>
>
> I would like to use Petsc4Py for a project extension, which consists
> mainly of:
>
> -          Storing data and matrices on several ranks/nodes which could
> not fit on a single node.
>
> -          Performing some linear algebra in a parallel fashion (solving
> sparse linear systems, for instance).
>
> -          Exchanging those data structures (parallel vectors) between
> non-overlapping MPI communicators, created for instance by splitting
> MPI_COMM_WORLD.
>
>
>
> While the first two items seem to be well addressed by PETSc, I am
> wondering about the last one.
>
>
>
> Is it possible to access the data of a vector defined on one communicator
> from another, non-overlapping communicator? From what I have seen in the
> documentation and the several threads on the user mailing list, I would say
> no. But maybe I am missing something? If not, is it possible to transfer a
> vector defined on a given communicator to a communicator which is a subset
> of the previous one?
>
>
>
> If you are sending to a subset of processes then VecGetSubVector + Jed's
> tricks might work.
>
>
>
>
> https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Vec/VecGetSubVector.html
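
A minimal petsc4py sketch of the VecGetSubVector call linked above; the
global size and the choice of indices are assumptions for illustration.

import sys
import petsc4py
petsc4py.init(sys.argv)
from petsc4py import PETSc

x = PETSc.Vec().createMPI(100, comm=PETSc.COMM_WORLD)   # assumed size
x.set(2.0)

# select, say, the first half of each rank's owned range as the sub-block
start, end = x.getOwnershipRange()
iset = PETSc.IS().createStride((end - start) // 2, first=start, step=1,
                               comm=PETSc.COMM_WORLD)

sub = x.getSubVector(iset)        # a Vec exposing the selected entries
# ... read or update 'sub' here ...
x.restoreSubVector(iset, sub)     # give it back (copies back if needed)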
>
>
>
>
>
> Best regards,
>
> Jean-Christophe
>
>
>
>
>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

