[petsc-users] Question about parallel Vectors and communicators

Dave May dave.mayhem23 at gmail.com
Thu May 9 12:34:44 CDT 2019


On Thu, 9 May 2019 at 18:19, Matthew Knepley via petsc-users <
petsc-users at mcs.anl.gov> wrote:

> On Thu, May 9, 2019 at 12:34 PM GIRET Jean-Christophe via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
>
>> Hello,
>>
>>
>>
>> Thanks Mark and Jed for your quick answers.
>>
>>
>>
>> So the idea is to define all the Vecs on the world communicator, and
>> perform the communications using traditional scatter objects? The data
>> would still be accessible on the two sub-communicators, as they are both
>> subsets of the MPI_COMM_WORLD communicator; the sub-communicators would
>> only be used when creating the Vecs or the ISs for the scatter. Is that
>> right?
>>
>>
>>
>> I’m currently trying, without success, to perform a Scatter from an MPI
>> Vec defined on a subcomm to another Vec defined on the world comm, and
>> vice versa. But I don’t know if it’s possible.
>>
>
> You cannot do that. What you want to do is:
>
> 1) Create two Vecs on COMM_WORLD. Make the second Vec have local size 0 on
> processes not in the subcomm.
>
> 2) Create a scatter between the two Vec
>
> 3) Scatter data
>
> 4) Use VecGetArray() to get the pointer to the data on the second Vec, and
> use VecCreateMPIWithArray() ONLY on the subcomm,
>     or, if you do not mind copies, just use a copy.
>
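To make those steps concrete, here is a minimal, self-contained sketch. The
function name, the even split over the sub-communicator, and the omission of
error checking are illustrative choices, not taken from any PETSc example:

  #include <petscvec.h>

  /* Redistribute the entries of x (a Vec on PETSC_COMM_WORLD) into *xtmp, a Vec
     whose entries live only on the ranks of "subcomm". Pass MPI_COMM_NULL for
     subcomm on ranks that are not members of the sub-communicator. */
  static void RedistributeToSubcomm(Vec x,MPI_Comm subcomm,Vec *xtmp)
  {
    VecScatter  scat;
    PetscInt    N,nlocal = 0;
    PetscMPIInt nsub,subrank;

    VecGetSize(x,&N);
    if (subcomm != MPI_COMM_NULL) {
      MPI_Comm_size(subcomm,&nsub);
      MPI_Comm_rank(subcomm,&subrank);
      nlocal = N/nsub + ((N % nsub) > subrank ? 1 : 0); /* even split over the subcomm */
    }

    /* Step 1: second Vec on PETSC_COMM_WORLD, zero local length outside the subcomm */
    VecCreateMPI(PETSC_COMM_WORLD,nlocal,N,xtmp);

    /* Step 2: NULL index sets mean "all entries, in order", i.e. a pure redistribution */
    VecScatterCreate(x,NULL,*xtmp,NULL,&scat);

    /* Step 3: move the data */
    VecScatterBegin(scat,x,*xtmp,INSERT_VALUES,SCATTER_FORWARD);
    VecScatterEnd(scat,x,*xtmp,INSERT_VALUES,SCATTER_FORWARD);

    VecScatterDestroy(&scat);
  }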

You can find a concrete example of doing exactly what Matt described above
within PCTELESCOPE.

See
  src/ksp/pc/impls/telescope/telescope.c
or go here

https://www.mcs.anl.gov/petsc/petsc-current/src/ksp/pc/impls/telescope/telescope.c.html#PCTELESCOPE

Specifically you want to examine the functions
  PCTelescopeSetUp_default()
  PCApply_Telescope()

To explain in more detail, within PCTelescopeSetUp_default()
* I create two vectors, xtmp (living on comm_1 with some ranks owning zero
entries) and xred (defined on a sub-communicator of comm_1, say sub_comm_1).
* I create the scatter between some input x and xtmp.
These points implement Matt's steps 1 & 2.
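In spirit (a sketch, not the literal telescope code), the xred part amounts to
a vector that only exists on the ranks of sub_comm_1, sized to match the local
entries of xtmp there:

  Vec      xred = NULL;   /* stays NULL on ranks outside sub_comm_1 */
  PetscInt nlocal;

  if (sub_comm_1 != MPI_COMM_NULL) {
    VecGetLocalSize(xtmp,&nlocal);
    VecCreateMPI(sub_comm_1,nlocal,PETSC_DETERMINE,&xred);
  }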

In PCApply_Telescope(), I perform the scatter between the input arg x and
the vector xtmp (both defined on comm_1) [Matt's step 3].
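Concretely, the scatter is just a forward VecScatterBegin/End pair, along the
lines of (paraphrased, with error checking stripped):

  VecScatterBegin(scatter,x,xtmp,INSERT_VALUES,SCATTER_FORWARD);
  VecScatterEnd(scatter,x,xtmp,INSERT_VALUES,SCATTER_FORWARD);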

Then you'll see this

if (xred) {
  PetscInt    i,st,ed;
  PetscScalar *LA_xred;

  /* x_array holds the local entries of xtmp (fetched earlier in the function) */
  VecGetOwnershipRange(xred,&st,&ed);
  VecGetArray(xred,&LA_xred);
  for (i=0; i<ed-st; i++) {
    LA_xred[i] = x_array[i];
  }
  VecRestoreArray(xred,&LA_xred);
}

This is Matt's step 4. Note that xred was created such that it is NULL on the
ranks of comm_1 which are not part of sub_comm_1. Hence the test if (xred)
checks whether we are on a rank within sub_comm_1. If so, I copy ALL entries
from xtmp.

In PCTELESCOPE I did not bother with the optimization of using
VecCreateMPIWithArray() - I just do the copy.
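If you did want that optimization, a sketch of it (not taken from PCTELESCOPE;
names are illustrative and error checking is omitted) would look like:

  PetscScalar *a;
  PetscInt    nlocal;
  Vec         xred;

  VecGetLocalSize(xtmp,&nlocal);
  VecGetArray(xtmp,&a);
  if (sub_comm_1 != MPI_COMM_NULL) {
    /* wrap xtmp's local memory in a Vec living on the sub-communicator:
       no copy is made, and the array must stay valid while xred is in use */
    VecCreateMPIWithArray(sub_comm_1,1,nlocal,PETSC_DETERMINE,a,&xred);
    /* ... use xred on sub_comm_1 ... */
    VecDestroy(&xred);
  }
  VecRestoreArray(xtmp,&a);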


Cheers
Dave




>
>   Thanks,
>
>     Matt
>
>
>>
>>
>> I can imagine that trying to do that seems a bit strange. However, I’m
>> dealing with code coupling (and linear algebra for the main part of the
>> code), and my idea was to try to use the Vec data structures to perform
>> data exchange between parts of the software that would each have their own
>> communicator. It would eliminate the need to re-implement an ad-hoc
>> solution.
>>
>>
>>
>> One option would be to stick to the world communicator for all the PETSc
>> part, but I could face situations where my Vecs would be small while I
>> would have to run the whole simulation on a large number of cores for
>> the coupled part. I imagine that it may not really serve the linear system
>> solving part in terms of performance. Another option would be to perform
>> all the PETSc operations on a sub-communicator and use “raw” MPI
>> communications between the communicators to perform the data exchange for
>> the coupling part.
>>
>>
>>
>> Thanks again for your support,
>>
>> Best regards,
>>
>> Jean-Christophe
>>
>>
>>
>> *From:* Mark Adams [mailto:mfadams at lbl.gov]
>> *Sent:* Tuesday, May 7, 2019 21:39
>> *To:* GIRET Jean-Christophe
>> *Cc:* petsc-users at mcs.anl.gov
>> *Subject:* Re: [petsc-users] Question about parallel Vectors and
>> communicators
>>
>>
>>
>>
>>
>>
>>
>> On Tue, May 7, 2019 at 11:38 AM GIRET Jean-Christophe via petsc-users <
>> petsc-users at mcs.anl.gov> wrote:
>>
>> Dear PETSc users,
>>
>>
>>
>> I would like to use Petsc4Py for a project extension, which consists
>> mainly of:
>>
>> - Storing data and matrices across several ranks/nodes when they could
>> not fit on a single node.
>>
>> - Performing some linear algebra in a parallel fashion (solving sparse
>> linear systems, for instance).
>>
>> - Exchanging those data structures (parallel vectors) between
>> non-overlapping MPI communicators, created for instance by splitting
>> MPI_COMM_WORLD.
>>
>>
>>
>> While the first two items seem to be well addressed by PETSc, I am
>> wondering about the last one.
>>
>>
>>
>> Is it possible to access the data of a vector defined on one communicator
>> from another, non-overlapping communicator? From what I have seen in the
>> documentation and the several threads on the user mailing list, I would say
>> no. But maybe I am missing something? If not, is it possible to transfer a
>> vector defined on a given communicator to a communicator which is a subset
>> of the previous one?
>>
>>
>>
>> If you are sending to a subset of processes then VecGetSubVector + Jed's
>> tricks might work.
>>
>>
>>
>>
>> https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Vec/VecGetSubVector.html
>>
>>
>>
>>
>>
>> Best regards,
>>
>> Jean-Christophe
>>
>>
>>
>>
>>
>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> <http://www.cse.buffalo.edu/~knepley/>
>

