[mpich-discuss] New communicator from connect/accept primitives
Francisco Javier García Blas
fjblas at arcos.inf.uc3m.es
Mon Jan 18 10:26:08 CST 2010
Hello again,
First of all, thanks to Rajeev and Jayesh for their responses. Following
Rajeev's instructions, I implemented a basic example using the
connect/accept and intercomm_create/merge primitives. I must be doing
something wrong, because all the processes block when
MPI_Intercomm_create is invoked. I cannot find the error; perhaps it is
a bad numbering of the local and remote leader ranks, but I have tried
all the combinations.
I am using MPICH2 1.0.5.
I attach the source code and a makefile.
Best regards
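(Since the attached source is not reproduced inline, here is a minimal sketch of the usual connect/accept pattern that produces one intercommunicator per client. This is not the poster's actual code; the "server"/"client" roles and the out-of-band exchange of the port name are illustrative assumptions.)

```c
/* Sketch: establish an intercommunicator via MPI_Comm_accept /
 * MPI_Comm_connect. Run one instance as "server" and one as
 * "client <port>", where <port> is the string the server printed. */
#include <mpi.h>
#include <stdio.h>
#include <string.h>

int main(int argc, char **argv)
{
    char port[MPI_MAX_PORT_NAME];
    MPI_Comm inter = MPI_COMM_NULL;

    MPI_Init(&argc, &argv);

    if (argc > 1 && strcmp(argv[1], "server") == 0) {
        /* Server side: open a port and wait for one client. */
        MPI_Open_port(MPI_INFO_NULL, port);
        printf("port: %s\n", port);   /* hand this string to the client */
        MPI_Comm_accept(port, MPI_INFO_NULL, 0, MPI_COMM_WORLD, &inter);
        MPI_Close_port(port);
    } else if (argc > 2 && strcmp(argv[1], "client") == 0) {
        /* Client side: connect using the port name from the server. */
        MPI_Comm_connect(argv[2], MPI_INFO_NULL, 0, MPI_COMM_WORLD, &inter);
    }

    if (inter != MPI_COMM_NULL) {
        /* ... communicate over 'inter' here ... */
        MPI_Comm_disconnect(&inter);
    }
    MPI_Finalize();
    return 0;
}
```

Repeating the accept on the server for each new client yields the pool of intercommunicators (AB_inter, AC_inter, ...) that the merge procedure below starts from.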
Rajeev Thakur wrote:
> You will need to use intercomm_merge but you have to merge them one
> pair at a time. Example below from an old mail.
>
> Rajeev
>
>
> If you have 3 intercommunicators AB_inter, AC_inter, and AD_inter, you
> can merge them all into a single
> intracommunicator as follows:
>
> * begin by doing an MPI_Intercomm_merge on AB_inter, resulting in an
> intracommunicator AB_intra.
>
> * then create an intercommunicator between AB on one side and C on the
> other
> by using MPI_Intercomm_create. Pass AB_intra as the local_comm on A and B,
> MPI_COMM_WORLD as the local_comm on C, and AC_inter as the peer_comm. This
> results in the intercommunicator AB_C_inter.
>
> * then call MPI_Intercomm_merge on it to create the intracommunicator
> ABC_intra.
>
> * then call MPI_Intercomm_create to create an intercommunicator
> between ABC
> and D just as you did with AB and C above.
>
> * Again do an intercomm_merge. This will give you an intracommunicator
> containing A, B, C, D.
>
> * If you want an intercommunicator with A in one group and B,C,D in the
> other, as you would get with a single spawn of 3 processes, you have
> to call
> MPI_Comm_split to split this single communicator into two
> intracommunicators, one containing A and the other containing B,C,D. Then
> call MPI_Intercomm_create to create the intercommunicator.
>
> ------------------------------------------------------------------------
> *From:* mpich-discuss-bounces at mcs.anl.gov
> [mailto:mpich-discuss-bounces at mcs.anl.gov] *On Behalf Of
> *Francisco Javier García Blas
> *Sent:* Friday, January 15, 2010 11:09 AM
> *To:* mpich-discuss at mcs.anl.gov
> *Subject:* [mpich-discuss] New communicator from connect/accept
> primitives
>
> Hello all,
>
> I am wondering about the possibility of getting a new
> intercommunicator from N communicators that result from different
> calls to MPI_Comm_connect or MPI_Comm_accept.
>
> My initial solution was, first, to get the group of each
> intercommunicator with MPI_Comm_group; second, to join all the
> groups into one larger group; and finally, to create a new
> communicator from that group with the MPI_Comm_create primitive.
>
> Currently I am handling a pool of intercommunicators in order
> to keep the functionality. However, this approach is not suitable
> for collectives or MPI_ANY_SOURCE sends/receives.
>
> Is there another way to join all the intercommunicators into one?
>
> Any suggestion?
>
> Best regards.
>
>
>
>
> --------------------------------------------------
> Francisco Javier García Blas
> Computer Architecture, Communications and Systems Area.
> Computer Science Department. UNIVERSIDAD CARLOS III DE MADRID
> Avda. de la Universidad, 30
> 28911 Leganés (Madrid), SPAIN
> e-mail: fjblas at arcos.inf.uc3m.es
> fjblas at inf.uc3m.es
> Phone:(+34) 916249118
> FAX: (+34) 916249129
> --------------------------------------------------
>
> ------------------------------------------------------------------------
>
> _______________________________________________
> mpich-discuss mailing list
> mpich-discuss at mcs.anl.gov
> https://lists.mcs.anl.gov/mailman/listinfo/mpich-discuss
>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: test_inter.tgz
Type: application/octet-stream
Size: 1410 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/mpich-discuss/attachments/20100118/5185eb55/attachment.obj>