[petsc-users] Question about parallel Vectors and communicators
Zhang, Junchao
jczhang at mcs.anl.gov
Fri May 10 15:01:14 CDT 2019
Jean-Christophe,
I added a PETSc example at https://bitbucket.org/petsc/petsc/pull-requests/1652/add-an-example-to-show-transfer-vectors/diff#chg-src/vec/vscat/examples/ex9.c
It shows how to transfer vectors from a parent communicator to vectors on a child communicator. It also shows how to transfer vectors from a subcomm to vectors on another subcomm. The two subcomms are not required to cover all processes in PETSC_COMM_WORLD.
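For readers who cannot open the pull request, below is a minimal sketch of one way to do the parent-to-child transfer; it is not the actual ex9.c, and the communicator split, vector size, and index sets are arbitrary assumptions. The idea is to wrap the child-communicator Vec's storage in a world-communicator "bridge" Vec with VecCreateMPIWithArray and then scatter between the two world-communicator Vecs:

#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec            xworld, xsub = NULL, xbridge;
  VecScatter     scat;
  IS             ix, iy;
  MPI_Comm       subcomm;
  PetscMPIInt    rank, size;
  PetscInt       N = 20, nsub = 0, rstart, rend;
  PetscScalar   *array = NULL;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  MPI_Comm_size(PETSC_COMM_WORLD, &size);

  /* Source vector, distributed over all processes of PETSC_COMM_WORLD */
  ierr = VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, N, &xworld);CHKERRQ(ierr);
  ierr = VecSet(xworld, 1.0);CHKERRQ(ierr);

  /* Child communicator: here the first half (rounded up) of the world ranks */
  MPI_Comm_split(PETSC_COMM_WORLD, rank < (size + 1)/2 ? 0 : MPI_UNDEFINED, rank, &subcomm);

  /* Destination vector, defined only on the child communicator */
  if (subcomm != MPI_COMM_NULL) {
    ierr = VecCreateMPI(subcomm, PETSC_DECIDE, N, &xsub);CHKERRQ(ierr);
    ierr = VecGetLocalSize(xsub, &nsub);CHKERRQ(ierr);
    ierr = VecGetArray(xsub, &array);CHKERRQ(ierr);
  }

  /* World "bridge" vector that shares xsub's storage; ranks outside the
     child communicator contribute zero local entries */
  ierr = VecCreateMPIWithArray(PETSC_COMM_WORLD, 1, nsub, N, array, &xbridge);CHKERRQ(ierr);

  /* Scatter entry i of xworld into entry i of the bridge (hence into xsub) */
  ierr = VecGetOwnershipRange(xbridge, &rstart, &rend);CHKERRQ(ierr);
  ierr = ISCreateStride(PETSC_COMM_WORLD, rend - rstart, rstart, 1, &ix);CHKERRQ(ierr);
  ierr = ISCreateStride(PETSC_COMM_WORLD, rend - rstart, rstart, 1, &iy);CHKERRQ(ierr);
  ierr = VecScatterCreate(xworld, ix, xbridge, iy, &scat);CHKERRQ(ierr);
  ierr = VecScatterBegin(scat, xworld, xbridge, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterEnd(scat, xworld, xbridge, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);

  if (subcomm != MPI_COMM_NULL) {ierr = VecRestoreArray(xsub, &array);CHKERRQ(ierr);}

  ierr = VecScatterDestroy(&scat);CHKERRQ(ierr);
  ierr = ISDestroy(&ix);CHKERRQ(ierr);
  ierr = ISDestroy(&iy);CHKERRQ(ierr);
  ierr = VecDestroy(&xbridge);CHKERRQ(ierr);
  ierr = VecDestroy(&xworld);CHKERRQ(ierr);
  if (subcomm != MPI_COMM_NULL) {
    ierr = VecDestroy(&xsub);CHKERRQ(ierr);
    MPI_Comm_free(&subcomm);
  }
  ierr = PetscFinalize();
  return ierr;
}

The same scatter context can be run with SCATTER_REVERSE (with the two vectors swapped in VecScatterBegin/End) to move data back from the child communicator to the world Vec.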
Hope it helps you better understand Vec and VecScatter.
--Junchao Zhang
On Thu, May 9, 2019 at 11:34 AM GIRET Jean-Christophe via petsc-users <petsc-users at mcs.anl.gov> wrote:
Hello,
Thanks Mark and Jed for your quick answers.
So the idea is to define all the Vecs on the world communicator and perform the communications using traditional scatter objects? The data would still be accessible from the two sub-communicators, since they are both subsets of MPI_COMM_WORLD, and the sub-communicators would only come into play when creating the Vecs or the index sets (IS) for the scatter. Is that right?
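For illustration, a minimal sketch of that layout, assuming (arbitrarily) that only the first two world ranks actually hold entries of a Vec that nevertheless lives on PETSC_COMM_WORLD:

#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec            v;
  PetscMPIInt    rank;
  PetscInt       nlocal;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

  /* The Vec lives on PETSC_COMM_WORLD, so every process participates, but
     only the first two ranks (an arbitrary choice) hold any entries. */
  nlocal = (rank < 2) ? 10 : 0;
  ierr = VecCreateMPI(PETSC_COMM_WORLD, nlocal, PETSC_DETERMINE, &v);CHKERRQ(ierr);
  ierr = VecSet(v, (PetscScalar)rank);CHKERRQ(ierr);
  ierr = VecView(v, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);

  ierr = VecDestroy(&v);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

A VecScatter between two such vectors then stays entirely on the world communicator, even though the data effectively belongs to two different groups of ranks.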
I'm currently trying, without success, to perform a scatter from an MPI Vec defined on a subcomm to another Vec defined on the world comm, and vice versa, but I don't know whether this is even possible.
I realize that this may seem a bit strange. However, I'm dealing with code coupling (and with linear algebra for the main part of the code), and my idea was to use the Vec data structures to exchange data between parts of the software that each have their own communicator. It would avoid having to implement an ad hoc solution.
One option would be to stick to the world communicator for the whole PETSc part, but I could face situations where my Vecs are small while the coupled simulation as a whole has to run on a large number of cores; I imagine that would not help the performance of the linear solves. Another option would be to perform all the PETSc operations on a sub-communicator and use "raw" MPI communication between the communicators for the data exchange in the coupling part.
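For the second option, here is a minimal sketch of what such a "raw" MPI exchange between the two groups could look like; the pairing of ranks (each even "solver" rank sends its local array to the next odd "coupling" rank), the local size, and the message tag are all assumptions:

#include <petscvec.h>

int main(int argc, char **argv)
{
  MPI_Comm           subcomm;
  Vec                x;
  const PetscScalar *xarr;
  PetscScalar        buf[10];
  PetscInt           nlocal = 10;
  PetscMPIInt        rank, size, color;
  PetscErrorCode     ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  MPI_Comm_size(PETSC_COMM_WORLD, &size);

  /* Assumption: even world ranks form the "solver" group, odd ranks the
     "coupling" group; each even rank pairs with the next odd rank. */
  color = rank % 2;
  MPI_Comm_split(PETSC_COMM_WORLD, color, rank, &subcomm);

  if (color == 0) {                 /* solver group: PETSc lives here */
    ierr = VecCreateMPI(subcomm, nlocal, PETSC_DETERMINE, &x);CHKERRQ(ierr);
    ierr = VecSet(x, (PetscScalar)rank);CHKERRQ(ierr);
    ierr = VecGetArrayRead(x, &xarr);CHKERRQ(ierr);
    if (rank + 1 < size) {          /* ship the local block to the partner rank */
      MPI_Send((void*)xarr, (int)nlocal, MPIU_SCALAR, rank + 1, 0, PETSC_COMM_WORLD);
    }
    ierr = VecRestoreArrayRead(x, &xarr);CHKERRQ(ierr);
    ierr = VecDestroy(&x);CHKERRQ(ierr);
  } else {                          /* coupling group: plain MPI receive */
    MPI_Recv(buf, (int)nlocal, MPIU_SCALAR, rank - 1, 0, PETSC_COMM_WORLD, MPI_STATUS_IGNORE);
  }

  MPI_Comm_free(&subcomm);
  ierr = PetscFinalize();
  return ierr;
}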
Thanks again for your support,
Best regards,
Jean-Christophe
From: Mark Adams [mailto:mfadams at lbl.gov]
Sent: Tuesday, May 7, 2019 21:39
To: GIRET Jean-Christophe
Cc: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] Question about parallel Vectors and communicators
On Tue, May 7, 2019 at 11:38 AM GIRET Jean-Christophe via petsc-users <petsc-users at mcs.anl.gov> wrote:
Dear PETSc users,
I would like to use Petsc4Py for a project extension, which consists mainly of:
- Storing data and matrices across several ranks/nodes when they cannot fit on a single node.
- Performing some linear algebra in parallel (solving sparse linear systems, for instance).
- Exchanging those data structures (parallel vectors) between non-overlapping MPI communicators, created for instance by splitting MPI_COMM_WORLD.
While the first two items seem to be well addressed by PETSc, I am wondering about the last one.
Is it possible to access the data of a vector defined on one communicator from another, non-overlapping communicator? From what I have seen in the documentation and in several threads on the user mailing list, I would say no, but maybe I am missing something? If not, is it possible to transfer a vector defined on a given communicator to a communicator which is a subset of the former?
If you are sending to a subset of processes then VecGetSubVector + Jed's tricks might work.
https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Vec/VecGetSubVector.html
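For reference, a minimal sketch of the VecGetSubVector call mentioned above; the vector size and the stride used to build the index set are arbitrary:

#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec            x, y;
  IS             is;
  PetscInt       rstart, rend, nloc;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

  ierr = VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, 40, &x);CHKERRQ(ierr);
  ierr = VecSet(x, 2.0);CHKERRQ(ierr);

  /* Take every second locally owned entry of x as the sub-vector */
  ierr = VecGetOwnershipRange(x, &rstart, &rend);CHKERRQ(ierr);
  nloc = (rend - rstart)/2;
  ierr = ISCreateStride(PETSC_COMM_WORLD, nloc, rstart, 2, &is);CHKERRQ(ierr);

  ierr = VecGetSubVector(x, is, &y);CHKERRQ(ierr);
  /* ... read or modify y here; it holds the entries of x selected by is ... */
  ierr = VecRestoreSubVector(x, is, &y);CHKERRQ(ierr);

  ierr = ISDestroy(&is);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}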
Best regards,
Jean-Christophe