[mpich-discuss] MPI_Comm_spawn / MPI_Info question
Christoph Sprenger
csprenger at wetafx.co.nz
Sun Sep 16 19:18:01 CDT 2012
Doh, accidentally pressed ctrl-enter.
It seems that hydra passes envall by default to spawned nodes, which
pretty much solves my previous issue.
I am trying to figure out whether it's possible to do the following:
MPI_Info_set(info, "hosts", "node1,node2");
With this, they all seem to get launched on the original node where mpiexec was run.
MPI_Info_set(info, "host", "node2");
This works fine and spawns as expected on the other machine.
Preferably I wanted to avoid generating a hostfile, so I was curious
whether it's possible to supply multiple machines directly in the MPI_Info.
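For reference, a minimal sketch of the single-host form that does spawn on the other machine for me (the "./worker" binary name and the count of 2 are just placeholders; whether a comma-separated list under the "hosts" key is honored is exactly the open question):

```c
#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Comm intercomm;
    MPI_Info info;

    MPI_Init(&argc, &argv);

    MPI_Info_create(&info);
    /* "host" (singular) spawns on node2 as expected;
       "hosts" with "node1,node2" did not distribute for me. */
    MPI_Info_set(info, "host", "node2");

    /* Spawn 2 copies of ./worker on node2; MPI_Comm_spawn returns an
       intercommunicator connecting the parent and the children. */
    MPI_Comm_spawn("./worker", MPI_ARGV_NULL, 2, info, 0,
                   MPI_COMM_WORLD, &intercomm, MPI_ERRCODES_IGNORE);

    MPI_Info_free(&info);
    MPI_Comm_free(&intercomm);
    MPI_Finalize();
    return 0;
}
```

Run with mpiexec as usual; node2 has to be reachable the same way it would be from a hostfile.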
Cheers,
Christoph
On 17/09/12 12:06, Christoph Sprenger wrote:
>
> Thanks for that Pavan.
> Just installed and upgraded, and I'm super happy with how smooth that went.
>
>
>
>
> On 16/09/12 17:19, Pavan Balaji wrote:
>>
>> Can you upgrade to the latest version of mpich2? 1.2.1p1 is too old;
>> it uses a different process manager which is not supported anymore.
>>
>> -- Pavan
>>
>> On 09/15/2012 09:23 PM, Christoph Sprenger wrote:
>>> Hi,
>>>
>>> I've been trying to get MPI_Info to work with the Spawn interface, but
>>> it seems the env vars aren't supplied to the new processes.
>>> I've been searching the docs and source, but can't find the list of
>>> hints for mpich2, so I am not sure whether "envlist" is the right key
>>> for the mpd.
>>> Not sure if I missed the obvious. Is there an overview somewhere of all
>>> the reserved keys (e.g. envall, ...)?
>>>
>>>
>>>
>>> MPI_Info info;
>>> MPI_Info_create(&info);
>>> MPI_Info_set(info, "envlist", "CUSTOM_VAR");
>>>
>>> ...
>>> MPI_Comm_spawn(argv[0], &argv[1], n, info, 0, MPI_COMM_WORLD,
>>> &intercomm, MPI_ERRCODES_IGNORE);
>>>
>>>
>>> I currently use mpich2-1.2.1.
>>>
>>>
>>> Cheers,
>>> Christoph
>>>
>>>
>>> _______________________________________________
>>> mpich-discuss mailing list mpich-discuss at mcs.anl.gov
>>> To manage subscription options or unsubscribe:
>>> https://lists.mcs.anl.gov/mailman/listinfo/mpich-discuss
>>>
>>
>