[petsc-users] Question about parallel Vectors and communicators

Zhang, Junchao jczhang at mcs.anl.gov
Mon May 13 10:14:20 CDT 2019


 The index sets provide the i, j pairs in the scatter "y[j] = x[i]". Each process provides a portion of the i and j of the whole scatter. The only requirement of VecScatterCreate is that on each process the local sizes of ix and iy are equal (a process may provide empty ix and iy). A process's i and j can point anywhere in the respective vectors; they are not constrained to the vector's local part.
 The interpretation of ix and iy does not depend on their communicator; it depends on their associated vector. Let P and S stand for parallel and sequential vectors respectively; there are four combinations of VecScatters: PtoP, PtoS, StoP and StoS. The convention is: if x is parallel, then ix contains global indices of x; if x is sequential, ix contains local indices of x. Similarly for y and iy.
 So index sets created with PETSC_COMM_SELF can perfectly well contain global indices. That is why I always use PETSC_COMM_SELF to create index sets for VecScatter; it makes things easier to understand.
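 For example, here is a minimal sketch (not from an actual PETSc example; error checking omitted, the names and the size N are made up) of a PtoS scatter that gathers the first N global entries of a parallel Vec x into a sequential Vec y on every process. Both index sets live on PETSC_COMM_SELF, yet ix holds global indices because x is parallel:

  Vec         x, y;
  IS          ix, iy;
  VecScatter  scat;
  PetscInt    N = 10;                                     /* number of entries to gather */

  VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, 100, &x);  /* parallel source, global size 100 */
  VecCreateSeq(PETSC_COMM_SELF, N, &y);                   /* sequential destination, one copy per process */

  ISCreateStride(PETSC_COMM_SELF, N, 0, 1, &ix);          /* GLOBAL indices of x (x is parallel)   */
  ISCreateStride(PETSC_COMM_SELF, N, 0, 1, &iy);          /* LOCAL indices of y (y is sequential)  */

  VecScatterCreate(x, ix, y, iy, &scat);
  VecScatterBegin(scat, x, y, INSERT_VALUES, SCATTER_FORWARD);
  VecScatterEnd(scat, x, y, INSERT_VALUES, SCATTER_FORWARD);

  ISDestroy(&ix); ISDestroy(&iy); VecScatterDestroy(&scat);
  VecDestroy(&x); VecDestroy(&y);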
 The quote you gave is also confusing to me. If you use PETSC_COMM_SELF, it means only the current process uses the IS. That sounds fine, since other processes cannot get a reference to this IS.
 Maybe other PETSc developers can explain when parallel communicators are useful for index sets. My feeling is that they are useless, at least for VecScatter.

--Junchao Zhang


On Mon, May 13, 2019 at 9:07 AM GIRET Jean-Christophe <jean-christophe.giret at irt-saintexupery.com> wrote:
Hello,

Thank you all for your answers and examples, it’s now very clear: the trick is to alias a Vec on a subcomm with a Vec on the parent comm, and to perform the communication through a Scatter on the parent comm. I have also been able to implement it with petsc4py.

Junchao, thank you for your example. It is indeed very clear. Although I understand how the exchanges are made through the Vecs defined on the parent comm, I am wondering why ISCreateStride is called with the communicator PETSC_COMM_SELF and not with the parent communicator spanning the Vecs used for the Scatter operations.

When I read the documentation, I see: “The communicator, comm, should consist of all processes that will be using the IS.” I would say that in this case it should be the same communicator as the one used for the ‘exchange’ vectors.

I am surely misunderstanding something here, but I didn’t find any answer while googling. Any hint on that?

Again, thank you all for your great support,
Best,
JC



From: Zhang, Junchao [mailto:jczhang at mcs.anl.gov]
Sent: Friday, May 10, 2019 10:01 PM
To: GIRET Jean-Christophe
Cc: Mark Adams; petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] Question about parallel Vectors and communicators

Jean-Christophe,
  I added a petsc example at https://bitbucket.org/petsc/petsc/pull-requests/1652/add-an-example-to-show-transfer-vectors/diff#chg-src/vec/vscat/examples/ex9.c
  It shows how to transfer vectors from a parent communicator to vectors on a child communicator. It also shows how to transfer vectors from a subcomm to vectors on another subcomm. The two subcomms are not required to cover all processes in PETSC_COMM_WORLD.
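  The core idea, sketched below (this is not the actual ex9.c, just an illustration with made-up names and an arbitrary split; error checking omitted): a Vec is created on the subcomm, its array is wrapped by a second Vec created on the parent comm with VecCreateMPIWithArray so the two share memory, and processes outside the subcomm of interest contribute zero local length. The VecScatter is then built between Vecs that both live on the parent comm:

  MPI_Comm     subcomm;
  Vec          xsub = NULL, xalias;
  PetscScalar *arr  = NULL;
  PetscInt     nlocal = 0;
  PetscMPIInt  rank;
  int          color;

  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  color = (rank < 2) ? 0 : 1;                        /* illustrative split: ranks 0-1 form the subcomm of interest */
  MPI_Comm_split(PETSC_COMM_WORLD, color, 0, &subcomm);

  if (color == 0) {                                  /* processes belonging to the subcomm of interest */
    VecCreateMPI(subcomm, 5, PETSC_DECIDE, &xsub);   /* 5 local entries, for example */
    VecGetLocalSize(xsub, &nlocal);
    VecGetArray(xsub, &arr);
  }

  /* alias: a Vec on the parent comm that shares the subcomm Vec's memory;
     processes outside the subcomm pass nlocal = 0 and arr = NULL */
  VecCreateMPIWithArray(PETSC_COMM_WORLD, 1, nlocal, PETSC_DETERMINE, arr, &xalias);

  /* ... create a VecScatter between xalias and another Vec on PETSC_COMM_WORLD,
     then VecScatterBegin/End as usual ... */

  VecDestroy(&xalias);
  if (xsub) { VecRestoreArray(xsub, &arr); VecDestroy(&xsub); }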
  Hope it helps you better understand Vec and VecScatter.
--Junchao Zhang


On Thu, May 9, 2019 at 11:34 AM GIRET Jean-Christophe via petsc-users <petsc-users at mcs.anl.gov> wrote:
Hello,

Thanks Mark and Jed for your quick answers.

So the idea is to define all the Vecs on the world communicator and perform the communications using traditional scatter objects? The data would still be accessible on the two sub-communicators, as they are both subsets of the MPI_COMM_WORLD communicator, and the sub-communicators would only come into play when creating the Vecs or the IS for the scatter. Is that right?

I’m currently trying, without success, to perform a Scatter from an MPI Vec defined on a subcomm to another Vec defined on the world comm, and vice versa. But I don’t know if it’s possible.

I can imagine that trying to do that seems a bit strange. However, I’m dealing with code coupling (and linear algebra for the main part of the code), and my idea was to use the Vec data structures to perform data exchange between parts of the software that have their own communicators. It would eliminate the need to re-implement an ad hoc solution.

One option would be to stick to the world communicator for the whole PETSc part, but I could face situations where my Vecs are small while the whole simulation has to run on a large number of cores because of the coupled part. I imagine that this may not really help the linear-system-solving part in terms of performance. Another option would be to perform all the PETSc operations on a sub-communicator and use “raw” MPI communication between the communicators for the data exchange in the coupling part.

Thanks again for your support,
Best regards,
Jean-Christophe

From: Mark Adams [mailto:mfadams at lbl.gov]
Sent: Tuesday, May 7, 2019 9:39 PM
To: GIRET Jean-Christophe
Cc: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] Question about parallel Vectors and communicators



On Tue, May 7, 2019 at 11:38 AM GIRET Jean-Christophe via petsc-users <petsc-users at mcs.anl.gov> wrote:
Dear PETSc users,

I would like to use Petsc4Py for a project extension, which mainly consists of:

- Storing data and matrices across several ranks/nodes because they cannot fit on a single node.

- Performing some linear algebra in parallel (solving sparse linear systems, for instance).

- Exchanging those data structures (parallel vectors) between non-overlapping MPI communicators, created for instance by splitting MPI_COMM_WORLD.

While the first two items seem to be well addressed by PETSc, I am wondering about the last one.

Is it possible to access the data of a vector defined on one communicator from another, non-overlapping communicator? From what I have seen in the documentation and the several threads on the user mailing list, I would say no. But maybe I am missing something? If not, is it possible to transfer a vector defined on a given communicator to a communicator that is a subset of the previous one?

If you are sending to a subset of processes then VecGetSubVector + Jed's tricks might work.

https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Vec/VecGetSubVector.html
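Something along these lines might be a starting point (an untested sketch; X and is are placeholders for your parent-comm Vec and an index set selecting the entries the sub-group needs):

  Vec Y;
  VecGetSubVector(X, is, &Y);      /* Y addresses only the entries selected by 'is' */
  /* ... work with Y ... */
  VecRestoreSubVector(X, is, &Y);  /* give the entries back to X */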


Best regards,
Jean-Christophe

