[petsc-dev] mpi_comm_create_group()
Matthew Knepley
knepley at gmail.com
Fri Dec 3 07:27:05 CST 2021
On Thu, Dec 2, 2021 at 11:56 PM Adrian Croucher <a.croucher at auckland.ac.nz>
wrote:
>
> On 12/3/21 5:53 PM, Jed Brown wrote:
> >> Ah, interesting. I guess mpi_comm_split() might also have the advantage
> >> that each rank only needs to know if it is in the group or not, rather
> >> than needing an array of all participating ranks, as mpi_comm_create()
> >> and mpi_comm_create_group() do.
> > That's exactly why it's often more convenient. But it is collective on
> > the parent communicator.
>
> Yes, I was a bit concerned about that after reading this (from
> https://cvw.cac.cornell.edu/MPIAdvTopics/subdividing):
>
> " ...the MPI specification states that the call to MPI_Comm_create must
> be executed by all processes in the input communicator (in our case,
> MPI_COMM_WORLD), and that all processes must pass the same value for the
> group argument (grp), even if they do not belong to the new group. This
> can be a dire problem with a very large number of processes, such as are
> found in petascale systems, so MPI_COMM_CREATE_GROUP was introduced in
> MPI-3 to alleviate this problem."
>
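For concreteness, here is a rough sketch (illustrative code, not from PETSc or
the thread) of the two approaches being compared, assuming each rank already
knows whether it belongs in the sub-communicator:

#include <mpi.h>

/* With MPI_Comm_split, every rank of the parent communicator must call,
   but each rank only needs to know its own membership (its "color"). */
MPI_Comm split_way(MPI_Comm parent, int am_in_group)
{
  MPI_Comm sub;
  int color = am_in_group ? 0 : MPI_UNDEFINED; /* non-members get MPI_COMM_NULL */
  MPI_Comm_split(parent, color, 0, &sub);
  return sub;
}

/* With MPI_Comm_create_group (MPI-3), only the members call, but they must
   all supply the same explicit list of participating ranks. */
MPI_Comm group_way(MPI_Comm parent, const int *ranks, int nranks)
{
  MPI_Group world_group, sub_group;
  MPI_Comm  sub;
  MPI_Comm_group(parent, &world_group);
  MPI_Group_incl(world_group, nranks, ranks, &sub_group);
  MPI_Comm_create_group(parent, sub_group, /* tag = */ 0, &sub);
  MPI_Group_free(&sub_group);
  MPI_Group_free(&world_group);
  return sub;
}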
That warning seems like hyperbole to me. At worst you need one collective
operation to figure out who is in. This is no more expensive than one dot
product, and probably less. Unless you plan to do this millions of times in
the simulation, I do not see the dire problem.
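Something along these lines (again just a sketch, with hypothetical names) is
all that one collective needs to be:

#include <mpi.h>
#include <stdlib.h>

MPI_Comm gather_and_create(MPI_Comm parent, int am_in_group)
{
  int size, nranks = 0;
  MPI_Comm_size(parent, &size);

  int *flags = malloc(size * sizeof(int));
  int *ranks = malloc(size * sizeof(int));

  /* The single collective: gather everyone's membership flag. */
  MPI_Allgather(&am_in_group, 1, MPI_INT, flags, 1, MPI_INT, parent);
  for (int r = 0; r < size; r++)
    if (flags[r]) ranks[nranks++] = r;

  /* Only the members call MPI_Comm_create_group. */
  MPI_Comm sub = MPI_COMM_NULL;
  if (am_in_group) {
    MPI_Group world_group, sub_group;
    MPI_Comm_group(parent, &world_group);
    MPI_Group_incl(world_group, nranks, ranks, &sub_group);
    MPI_Comm_create_group(parent, sub_group, 0, &sub);
    MPI_Group_free(&sub_group);
    MPI_Group_free(&world_group);
  }
  free(flags);
  free(ranks);
  return sub;
}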
Thanks,
Matt
> - Adrian
>
> --
> Dr Adrian Croucher
> Senior Research Fellow
> Department of Engineering Science
> University of Auckland, New Zealand
> email: a.croucher at auckland.ac.nz
> tel: +64 (0)9 923 4611
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/