[MPICH] MPI_Comm_create for inter-communicator problem?
zws
zws at mail.tsinghua.edu.cn
Sat Aug 4 21:25:15 CDT 2007
MPI-2 extended MPI-1's MPI_Comm_create so that it can also be applied to an
inter-communicator: each side passes a subgroup of its local group, and the
call returns a new inter-communicator connecting the two subgroups.
With the sample code below I always get
"MPI_Comm_create(121): Too many communicators" errors.
I am wondering whether I have misunderstood the API.
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size, color;
    MPI_Comm intercomm, myComm, crtintercomm;
    MPI_Group localgrp, sub;
    int grpleftexcl[2]  = {0, 2};  /* local ranks to drop on the even side */
    int grprightincl[2] = {1, 3};  /* local ranks to keep on the odd side  */

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Split MPI_COMM_WORLD into even- and odd-ranked halves. */
    color = rank % 2;
    MPI_Comm_split(MPI_COMM_WORLD, color, rank, &myComm);

    /* Join the two halves into an inter-communicator; the remote leader
       is given as a rank in the peer communicator MPI_COMM_WORLD. */
    if (color == 0) {
        MPI_Intercomm_create(myComm, 0, MPI_COMM_WORLD, 1, 1, &intercomm);
        MPI_Comm_group(intercomm, &localgrp);   /* local group: even ranks */
        MPI_Group_excl(localgrp, 2, grpleftexcl, &sub);
    } else {
        MPI_Intercomm_create(myComm, 0, MPI_COMM_WORLD, 0, 1, &intercomm);
        MPI_Comm_group(intercomm, &localgrp);   /* local group: odd ranks */
        MPI_Group_incl(localgrp, 2, grprightincl, &sub);
    }

    /* MPI-2 semantics: on an inter-communicator each side passes a
       subgroup of its local group and gets back a smaller
       inter-communicator, or MPI_COMM_NULL if it is not a member. */
    MPI_Comm_create(intercomm, sub, &crtintercomm);

    if (crtintercomm != MPI_COMM_NULL)  /* non-members must not free */
        MPI_Comm_free(&crtintercomm);
    MPI_Comm_free(&intercomm);
    MPI_Comm_free(&myComm);
    MPI_Group_free(&sub);
    MPI_Group_free(&localgrp);
    MPI_Finalize();
    return 0;
}
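In case it helps to isolate whether the problem is specific to the MPI-2
inter-communicator extension, here is a sketch that builds the same
sub-inter-communicator with MPI-1 calls only: a sub-communicator is created
on each side of the split, and the two are joined with MPI_Intercomm_create.
It is meant to replace the MPI_Comm_create(intercomm, ...) call above; the
leader ranks and the tag value 2 are my own arbitrary choices.

    /* In myComm, local ranks {1,3} are world ranks {2,6} on the even
     * side and {3,7} on the odd side, i.e. exactly the groups selected
     * by the MPI_Group_excl/MPI_Group_incl calls in the reproducer. */
    int subranks[2] = {1, 3};
    MPI_Comm subComm, subinter;
    MPI_Group mygrp, subgrp;

    MPI_Comm_group(myComm, &mygrp);
    MPI_Group_incl(mygrp, 2, subranks, &subgrp);
    MPI_Comm_create(myComm, subgrp, &subComm);  /* intra-comm create, MPI-1 */

    if (subComm != MPI_COMM_NULL) {
        /* Remote leader given as a rank in the peer comm MPI_COMM_WORLD:
           world rank 2 leads the even subgroup, world rank 3 the odd one. */
        int remote_leader = (color == 0) ? 3 : 2;
        MPI_Intercomm_create(subComm, 0, MPI_COMM_WORLD,
                             remote_leader, 2, &subinter);
        MPI_Comm_free(&subinter);
        MPI_Comm_free(&subComm);
    }
    MPI_Group_free(&subgrp);
    MPI_Group_free(&mygrp);

This variant avoids calling MPI_Comm_create on the inter-communicator
entirely, so if it runs cleanly, the failure would seem to be in the
MPI-2 extended path.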
When I run this example, the output is:
[zws at cn116-ib c]$ mpiexec -n 8 src/intercommcreate
rank 7 in job 113 cn116-ib_33326 caused collective abort of all ranks
exit status of rank 7: return code 1
[cli_3]: aborting job:
Fatal error in MPI_Comm_create: Other MPI error, error stack:
MPI_Comm_create(219): MPI_Comm_create(comm=0x84000001, group=0x88000002, new_comm=0x60000fffffff9374) failed
MPI_Comm_create(121): Too many communicators
[cli_2]: aborting job:
Fatal error in MPI_Comm_create: Other MPI error, error stack:
MPI_Comm_create(219): MPI_Comm_create(comm=0x84000001, group=0x88000002, new_comm=0x60000fffffffb074) failed
MPI_Comm_create(121): Too many communicators
[cli_6]: aborting job:
Fatal error in MPI_Comm_create: Other MPI error, error stack:
MPI_Comm_create(219): MPI_Comm_create(comm=0x84000001, group=0x88000002, new_comm=0x60000fffffffb074) failed
MPI_Comm_create(121): Too many communicators
[cli_5]: aborting job:
Fatal error in MPI_Comm_create: Other MPI error, error stack:
MPI_Comm_create(219): MPI_Comm_create(comm=0x84000001, group=0x88000002, new_comm=0x60000fffffff93f4) failed
MPI_Comm_create(121): Too many communicators
rank 6 in job 113 cn116-ib_33326 caused collective abort of all ranks
exit status of rank 6: return code 1
[cli_7]: aborting job:
Fatal error in MPI_Comm_create: Other MPI error, error stack:
MPI_Comm_create(219): MPI_Comm_create(comm=0x84000001, group=0x88000002, new_comm=0x60000fffffff9374) failed
MPI_Comm_create(121): Too many communicators
[cli_1]: aborting job:
Fatal error in MPI_Comm_create: Other MPI error, error stack:
MPI_Comm_create(219): MPI_Comm_create(comm=0x84000001, group=0x88000002, new_comm=0x60000fffffffb1f4) failed
MPI_Comm_create(121): Too many communicators
rank 3 in job 113 cn116-ib_33326 caused collective abort of all ranks
exit status of rank 3: return code 1
rank 2 in job 113 cn116-ib_33326 caused collective abort of all ranks
exit status of rank 2: return code 1