[mpich-discuss] Problems with mpi spawn multiple
    fernando_luz
    fernando_luz at tpn.usp.br
    Mon Nov  3 14:35:11 CST 2008

Hello,

I have problems when I try to use spawn multiple in MPICH2 (version 1.0.7).
I receive this error when I execute my code:
[0]Fatal error in MPI_Comm_spawn_multiple: Other MPI error, error stack:
[0]MPI_Comm_spawn_multiple(152)..: MPI_Comm_spawn_multiple(count=2,
cmds=0x816b080, argvs=(nil), maxprocs=0x816afc0, infos=0x81a1600, root=-3,
MPI_COMM_WORLD, intercomm=0xbfa96f24, errors=(nil)) failed
[0]MPID_Comm_spawn_multiple(56)..: 
[0]MPIDI_Comm_spawn_multiple(203): 
[0]MPID_Comm_accept(149).........: 
[0]MPIDI_Comm_accept(974)........: Unable to allocate -45393952 bytes of
memory for remote_translation (probably out of memory)
job aborted:
rank: node: exit code[: error message]
0: a53: -2: Fatal error in MPI_Comm_spawn_multiple: Other MPI error, error
stack:
MPI_Comm_spawn_multiple(152)..: MPI_Comm_spawn_multiple(count=2,
cmds=0x816b080, argvs=(nil), maxprocs=0x816afc0, infos=0x81a1600, root=-3,
MPI_COMM_WORLD, intercomm=0xbfa96f24, errors=(nil)) failed
MPID_Comm_spawn_multiple(56)..: 
MPIDI_Comm_spawn_multiple(203): 
MPID_Comm_accept(149).........: 
MPIDI_Comm_accept(974)........: Unable to allocate -45393952 bytes of
memory for remote_translation (probably out of memory)
The part of my code where I suppose the problem is:
universe_size = 3;
for (int i = 0; i < universe_size-1; i++){
  strcpy(program_name[i], "worker_02");
  information[i] = information[i].Create();
  information[i].Set("wdir", "/home/fernando_luz/");
  information[i].Set("path", "/home/fernando_luz/SVN/TPN3/casos_testes/02/worker_02/");
  information[i].Set("host", "10.2.7.53");
  n_proc[i] = 1;
}

everyone = MPI::COMM_WORLD.Spawn_multiple(universe_size-1,
    (const char**) program_name, MPI::ARGVS_NULL,
    n_proc, information, MPI::ROOT);
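
For context, a stripped-down, self-contained version of what I am trying to do looks
roughly like this (this is a sketch, not my actual code: the declarations of
program_name, n_proc and information, the worker executable name, and the use of 0
as the root rank are reconstructed, not copied from my program):

#include <mpi.h>
#include <cstring>

int main(int argc, char* argv[])
{
  MPI::Init(argc, argv);

  const int n_workers = 2;            // universe_size - 1 in my code
  char  program_storage[n_workers][64];
  char* program_name[n_workers];
  int   n_proc[n_workers];
  MPI::Info information[n_workers];

  for (int i = 0; i < n_workers; i++) {
    program_name[i] = program_storage[i];
    std::strcpy(program_name[i], "worker_02");     // placeholder worker executable
    information[i] = MPI::Info::Create();
    information[i].Set("wdir", "/home/fernando_luz/");
    information[i].Set("host", "10.2.7.53");
    n_proc[i] = 1;                    // one process per command
  }

  // root = 0: the rank in COMM_WORLD whose spawn arguments are used
  MPI::Intercomm everyone = MPI::COMM_WORLD.Spawn_multiple(
      n_workers, (const char**) program_name, MPI::ARGVS_NULL,
      n_proc, information, 0);

  everyone.Disconnect();
  MPI::Finalize();
  return 0;
}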
Does anybody have ideas?

Thanks,
Fernando Luz