[MPICH] MPICH2 on Windows XP Pro Network
AGPX
agpxnet at yahoo.it
Thu Nov 1 17:06:34 CDT 2007
Hi again,
I have done another test (on pcamd3000):
mpiexec -verbose -n 1 -host pcamd3000 cpi.exe
and it works.
Then I tried (on the same machine):
mpiexec -verbose -n 1 -host pcamd2600 cpi.exe
Here is the result:
..\smpd_add_host_to_default_list
...\smpd_add_extended_host_to_default_list
.../smpd_add_extended_host_to_default_list
../smpd_add_host_to_default_list
..\smpd_get_full_path_name
...fixing up exe name: 'cpi.exe' -> '(null)'
...path not found. leaving as is in case the path exists on the remote machine.
../smpd_get_full_path_name
..handling executable:
cpi.exe
..\smpd_get_next_host
...\smpd_get_host_id
.../smpd_get_host_id
../smpd_get_next_host
..\smpd_create_cliques
...\next_launch_node
.../next_launch_node
...\next_launch_node
.../next_launch_node
../smpd_create_cliques
..\smpd_fix_up_host_tree
../smpd_fix_up_host_tree
./mp_parse_command_args
.host tree:
. host: pcamd2600, parent: 0, id: 1
.launch nodes:
. iproc: 0, id: 1, exe: cpi.exe
.\smpd_create_context
..\smpd_init_context
...\smpd_init_command
.../smpd_init_command
../smpd_init_context
./smpd_create_context
.\smpd_make_socket_loop
..\smpd_get_hostname
../smpd_get_hostname
./smpd_make_socket_loop
.\smpd_create_context
..\smpd_init_context
...\smpd_init_command
.../smpd_init_command
../smpd_init_context
./smpd_create_context
.\smpd_enter_at_state
..sock_waiting for the next event.
..SOCK_OP_CONNECT
..\smpd_handle_op_connect
...connect succeeded, posting read of the challenge string
../smpd_handle_op_connect
..sock_waiting for the next event.
..SOCK_OP_READ
..\smpd_handle_op_read
...\smpd_state_reading_challenge_string
....read challenge string: '1.0.6 6334'
....\smpd_verify_version
..../smpd_verify_version
....\smpd_hash
..../smpd_hash
.../smpd_state_reading_challenge_string
../smpd_handle_op_read
..sock_waiting for the next event.
..SOCK_OP_WRITE
..\smpd_handle_op_write
...\smpd_state_writing_challenge_response
....wrote challenge response: 'a265b67cda2a8efcab8aa9934228d262'
.../smpd_state_writing_challenge_response
../smpd_handle_op_write
..sock_waiting for the next event.
..SOCK_OP_READ
..\smpd_handle_op_read
...\smpd_state_reading_connect_result
....read connect result: 'FAIL'
....connection rejected, server returned - FAIL
....\smpd_post_abort_command
.....\smpd_create_command
......\smpd_init_command
....../smpd_init_command
...../smpd_create_command
.....\smpd_add_command_arg
...../smpd_add_command_arg
.....\smpd_command_destination
......0 -> 0 : returning NULL context
...../smpd_command_destination
Aborting: unable to connect to pcamd2600
..../smpd_post_abort_command
....\smpd_exit
.....\smpd_kill_all_processes
...../smpd_kill_all_processes
.....\smpd_finalize_drive_maps
...../smpd_finalize_drive_maps
.....\smpd_dbs_finalize
...../smpd_dbs_finalize
What are the challenge_string / challenge_response? It seems that pcamd2600 returns 'FAIL' as the response (whereas when the host is pcamd3000 itself, it returns 'SUCCESS').
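(Judging from the trace above, the challenge string is the smpd version plus a random number, and the response is a hash of that challenge combined with the shared smpd passphrase, the smpd_hash calls in the trace; the server then answers SUCCESS or FAIL depending on whether the hash matches its own. Below is a minimal sketch of that scheme. It assumes MD5 via OpenSSL and a challenge-then-passphrase concatenation order; neither detail is confirmed by the trace, so treat it as an illustration, not the actual MPICH code.)

/* Hedged sketch of an smpd-style challenge/response, NOT the MPICH source.
 * Assumes response = MD5 hex digest of challenge + passphrase; the exact
 * concatenation order is a guess.  Build with: cc sketch.c -lcrypto */
#include <openssl/md5.h>
#include <stdio.h>
#include <string.h>

/* Fill out[] (33 bytes) with the hex response for challenge + passphrase. */
static void challenge_response(const char *challenge, const char *phrase,
                               char *out)
{
    unsigned char digest[MD5_DIGEST_LENGTH];
    char buf[512];
    int i;

    snprintf(buf, sizeof(buf), "%s%s", challenge, phrase);
    MD5((const unsigned char *)buf, strlen(buf), digest);
    for (i = 0; i < MD5_DIGEST_LENGTH; i++)
        sprintf(out + 2 * i, "%02x", digest[i]);
}

int main(void)
{
    char response[2 * MD5_DIGEST_LENGTH + 1];

    /* '1.0.6 6334' is the challenge from the log above; 'behappy' is a
     * placeholder passphrase, not the one used on these machines. */
    challenge_response("1.0.6 6334", "behappy", response);
    printf("challenge response: '%s'\n", response);
    return 0;
}

A FAIL right after the challenge response is written usually means the two sides computed different digests, i.e. a passphrase mismatch between mpiexec and the smpd on pcamd2600.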
----- Original Message -----
From: AGPX <agpxnet at yahoo.it>
To: Jayesh Krishna <jayesh at mcs.anl.gov>
Cc: mpich-discuss at mcs.anl.gov
Sent: Thursday, November 1, 2007, 23:00:06
Subject: Re: [MPICH] MPICH2 on Windows XP Pro Network
Yes, you are right. That was a mistake in my previous post: I actually specified "WORKGROUP\AGPX" for both machines.
Best regards,
Gianluca Arcidiacono
----- Original Message -----
From: Jayesh Krishna <jayesh at mcs.anl.gov>
To: AGPX <agpxnet at yahoo.it>
Cc: mpich-discuss at mcs.anl.gov
Sent: Thursday, November 1, 2007, 22:54:34
Subject: RE: [MPICH] MPICH2 on Windows XP Pro Network
Hi,
You mentioned in your previous email that when you registered the username/password in mpiexec you specified "WORKGROUP\pcamd3000" as the username. Shouldn't the username be "WORKGROUP\AGPX"?
Regards,
Jayesh
From: AGPX [mailto:agpxnet at yahoo.it]
Sent: Thursday, November 01, 2007 4:47 PM
To: Jayesh Krishna
Cc: mpich-discuss at mcs.anl.gov
Subject: Re: [MPICH] MPICH2 on Windows XP Pro Network
Hi,
thanks for the reply. What exactly did you mean by "from the other using username AGPX & the password"? I can share directories on both machines with the AGPX account without problems (each machine can see the directories shared by the other).
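(A quick way to test exactly that from a command prompt is something like the following, where the share name is hypothetical:
net use \\pcamd2600\someshare /user:WORKGROUP\AGPX *
The trailing * makes net use prompt for the password and attempt the mapping with those explicit credentials.)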
Best regards,
Gianluca Arcidiacono
----- Original Message -----
From: Jayesh Krishna <jayesh at mcs.anl.gov>
To: AGPX <agpxnet at yahoo.it>
Cc: mpich-discuss at mcs.anl.gov
Sent: Thursday, November 1, 2007, 19:52:52
Subject: RE: [MPICH] MPICH2 on Windows XP Pro Network
Hi,
Can you share a directory on one machine from the other using username AGPX & the password?
Regards,
Jayesh
From: owner-mpich-discuss at mcs.anl.gov [mailto:owner-mpich-discuss at mcs.anl.gov] On Behalf Of AGPX
Sent: Thursday, November 01, 2007 1:09 PM
To: mpich-discuss at mcs.anl.gov
Subject: [MPICH] MPICH2 on Windows XP Pro Network
Hi all,
I have installed the latest version of MPICH2 on two machines on my local network.
Both machines have Windows XP Pro SP2 installed, and no firewall is active.
Both machines have the same user account (with Administrator privileges): AGPX, with the same password.
Both machines have the same passphrase for smpd.
Both machines are in the workgroup named WORKGROUP.
The machine names are pcamd3000 and pcamd2600.
On pcamd3000 I did the following:
mpiexec -register
I specified WORKGROUP\pcamd3000 and the password used for the user account AGPX.
On pcamd2600 I did the following:
mpiexec -register
I specified WORKGROUP\pcamd2600 and the password used for the user account AGPX.
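(As Jayesh's reply above points out, the account given to mpiexec -register should be the actual Windows logon account rather than the machine name, i.e. on both machines one would run
mpiexec -register
and enter WORKGROUP\AGPX and the AGPX password at the prompts. If this MPICH2 build supports it, mpiexec -validate can then be used to confirm that the stored credentials are accepted.)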
Then I tried:
On pcamd3000:
smpd -status
smpd running on pcamd3000
On pcamd2600:
smpd -status
smpd running on pcamd2600
Well done. Next step (from c:\program files\mpich2\examples):
On pcamd3000:
mpiexec -hosts 1 pcamd3000 cpi.exe
Enter the number of intervals: (0 quits) 0
On pcamd2600:
mpiexec -hosts 1 pcamd2600 cpi.exe
Enter the number of intervals: (0 quits) 0
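cpi is the pi-computation example shipped with MPICH2: rank 0 reads an interval count, broadcasts it, every rank integrates 4/(1+x^2) over its stride of [0,1], and the partial sums are reduced back to rank 0. Here is a minimal sketch in that spirit, not the actual cpi.c:

/* Minimal cpi-style MPI test program; a sketch, not MPICH2's bundled cpi.c. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, size, i, n = 0;
    double h, sum = 0.0, mypi, pi;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    if (rank == 0) {
        printf("Enter the number of intervals: (0 quits) ");
        fflush(stdout);
        scanf("%d", &n);
    }
    /* Broadcast the interval count entered on rank 0 to every process. */
    MPI_Bcast(&n, 1, MPI_INT, 0, MPI_COMM_WORLD);
    if (n > 0) {
        h = 1.0 / (double)n;
        /* Midpoint rule for the integral of 4/(1+x^2) on [0,1], strided by rank. */
        for (i = rank; i < n; i += size) {
            double x = h * ((double)i + 0.5);
            sum += 4.0 / (1.0 + x * x);
        }
        mypi = h * sum;
        MPI_Reduce(&mypi, &pi, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);
        if (rank == 0)
            printf("pi is approximately %.16f\n", pi);
    }
    MPI_Finalize();
    return 0;
}

Running it with -hosts 1 <machine> as above and entering 0 simply exits, which matches what the single-host tests show.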
But if I try, on pcamd3000:
mpiexec -hosts 2 pcamd3000 pcamd2600 cpi.exe
I obtain:
abort: unable to connect to pcamd2600
And on pcamd2600:
mpiexec -hosts 2 pcamd3000 pcamd2600 cpi.exe
I obtain:
abort: unable to connect to pcamd3000
Both machines see each other (via ping). To verify further, I did (on pcamd3000):
telnet pcamd2600 8676
The response was:
1.0.6 292
so basically, I can connect to the smpd on pcamd2600.
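For a scripted version of that telnet check, a small client can connect to the default smpd port 8676 and print the banner. This sketch uses POSIX sockets for brevity; an actual Windows XP build would use the Winsock equivalents:

/* Hedged sketch: connect to an smpd and print its greeting, mirroring the
 * 'telnet pcamd2600 8676' check above. POSIX sockets; not Windows-ready. */
#include <netdb.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <sys/types.h>
#include <unistd.h>

int main(int argc, char *argv[])
{
    const char *host = (argc > 1) ? argv[1] : "pcamd2600";
    struct addrinfo hints, *res;
    char buf[256];
    ssize_t n;
    int fd;

    memset(&hints, 0, sizeof(hints));
    hints.ai_family = AF_INET;
    hints.ai_socktype = SOCK_STREAM;
    /* 8676 is the default smpd listener port, as in the telnet test above. */
    if (getaddrinfo(host, "8676", &hints, &res) != 0) {
        fprintf(stderr, "cannot resolve %s\n", host);
        return 1;
    }
    fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
    if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) < 0) {
        perror("connect");
        return 1;
    }
    n = read(fd, buf, sizeof(buf) - 1);   /* smpd greets with e.g. "1.0.6 292" */
    if (n > 0) {
        buf[n] = '\0';
        printf("smpd banner: %s\n", buf);
    }
    close(fd);
    freeaddrinfo(res);
    return 0;
}

The "1.0.6 292" banner is the same version-plus-challenge string that shows up as 'read challenge string' in the verbose traces.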
I tried, on pcamd3000:
mpiexec -verbose -hosts 2 pcamd3000 pcamd2600 cpi.exe
Here is the full response:
..\smpd_add_host_to_default_list
...\smpd_add_extended_host_to_default_list
.../smpd_add_extended_host_to_default_list
../smpd_add_host_to_default_list
..\smpd_add_host_to_default_list
...\smpd_add_extended_host_to_default_list
.../smpd_add_extended_host_to_default_list
../smpd_add_host_to_default_list
..\smpd_get_full_path_name
...fixing up exe name: 'cpi.exe' -> '(null)'
../smpd_get_full_path_name
..handling executable:
cpi.exe
..\smpd_get_next_host
...\smpd_get_host_id
.../smpd_get_host_id
../smpd_get_next_host
..\smpd_get_next_host
...\smpd_get_host_id
.../smpd_get_host_id
../smpd_get_next_host
..\smpd_create_cliques
...\prev_launch_node
.../prev_launch_node
...\prev_launch_node
.../prev_launch_node
...\prev_launch_node
.../prev_launch_node
...\prev_launch_node
.../prev_launch_node
../smpd_create_cliques
..\smpd_fix_up_host_tree
../smpd_fix_up_host_tree
./mp_parse_command_args
.host tree:
. host: pcamd3000, parent: 0, id: 1
. host: pcamd2600, parent: 1, id: 2
.launch nodes:
. iproc: 1, id: 2, exe: cpi.exe
. iproc: 0, id: 1, exe: cpi.exe
.\smpd_create_context
..\smpd_init_context
...\smpd_init_command
.../smpd_init_command
../smpd_init_context
./smpd_create_context
.\smpd_make_socket_loop
..\smpd_get_hostname
../smpd_get_hostname
./smpd_make_socket_loop
.\smpd_create_context
..\smpd_init_context
...\smpd_init_command
.../smpd_init_command
../smpd_init_context
./smpd_create_context
.\smpd_enter_at_state
..sock_waiting for the next event.
..SOCK_OP_CONNECT
..\smpd_handle_op_connect
...connect succeeded, posting read of the challenge string
../smpd_handle_op_connect
..sock_waiting for the next event.
..SOCK_OP_READ
..\smpd_handle_op_read
...\smpd_state_reading_challenge_string
....read challenge string: '1.0.6 9961'
....\smpd_verify_version
..../smpd_verify_version
....\smpd_hash
..../smpd_hash
.../smpd_state_reading_challenge_string
../smpd_handle_op_read
..sock_waiting for the next event.
..SOCK_OP_WRITE
..\smpd_handle_op_write
...\smpd_state_writing_challenge_response
....wrote challenge response: 'a0256a5646a163e279c8db9db9042b15'
.../smpd_state_writing_challenge_response
../smpd_handle_op_write
..sock_waiting for the next event.
..SOCK_OP_READ
..\smpd_handle_op_read
...\smpd_state_reading_connect_result
....read connect result: 'SUCCESS'
.../smpd_state_reading_connect_result
../smpd_handle_op_read
..sock_waiting for the next event.
..SOCK_OP_WRITE
..\smpd_handle_op_write
...\smpd_state_writing_process_session_request
....wrote process session request: 'process'
.../smpd_state_writing_process_session_request
../smpd_handle_op_write
..sock_waiting for the next event.
..SOCK_OP_READ
..\smpd_handle_op_read
...\smpd_state_reading_cred_request
....read cred request: 'credentials'
....\smpd_hide_string_arg
.....\first_token
...../first_token
.....\compare_token
...../compare_token
.....\next_token
......\first_token
....../first_token
......\first_token
....../first_token
...../next_token
..../smpd_hide_string_arg
..../smpd_hide_string_arg
....\smpd_hide_string_arg
.....\first_token
...../first_token
.....\compare_token
...../compare_token
.....\next_token
......\first_token
....../first_token
......\first_token
....../first_token
...../next_token
..../smpd_hide_string_arg
..../smpd_hide_string_arg
....\smpd_hide_string_arg
.....\first_token
...../first_token
.....\compare_token
...../compare_token
.....\next_token
......\first_token
....../first_token
......\first_token
....../first_token
...../next_token
..../smpd_hide_string_arg
..../smpd_hide_string_arg
.....\smpd_option_on
......\smpd_get_smpd_data
.......\smpd_get_smpd_data_from_environment
......./smpd_get_smpd_data_from_environment
.......\smpd_get_smpd_data_default
......./smpd_get_smpd_data_default
.......Unable to get the data for the key 'nocache'
....../smpd_get_smpd_data
...../smpd_option_on
....\smpd_hide_string_arg
.....\first_token
...../first_token
.....\compare_token
...../compare_token
.....\next_token
......\first_token
....../first_token
......\first_token
....../first_token
...../next_token
..../smpd_hide_string_arg
..../smpd_hide_string_arg
.../smpd_handle_op_read
...sock_waiting for the next event.
...SOCK_OP_WRITE
...\smpd_handle_op_write
....\smpd_state_writing_cred_ack_yes
.....wrote cred request yes ack.
..../smpd_state_writing_cred_ack_yes
.../smpd_handle_op_write
...sock_waiting for the next event.
...SOCK_OP_WRITE
...\smpd_handle_op_write
....\smpd_state_writing_account
.....wrote account: 'WORKGROUP\AGPX'
.....\smpd_encrypt_data
...../smpd_encrypt_data
..../smpd_state_writing_account
.../smpd_handle_op_write
...sock_waiting for the next event.
...SOCK_OP_WRITE
...\smpd_handle_op_write
....\smpd_hide_string_arg
.....\first_token
...../first_token
.....\compare_token
...../compare_token
.....\next_token
......\first_token
....../first_token
......\first_token
....../first_token
...../next_token
..../smpd_hide_string_arg
..../smpd_hide_string_arg
.....\smpd_hide_string_arg
......\first_token
....../first_token
......\compare_token
....../compare_token
......\next_token
.......\first_token
......./first_token
.......\first_token
......./first_token
....../next_token
...../smpd_hide_string_arg
...../smpd_hide_string_arg
....\smpd_hide_string_arg
.....\first_token
...../first_token
.....\compare_token
...../compare_token
.....\next_token
......\first_token
....../first_token
......\first_token
....../first_token
...../next_token
..../smpd_hide_string_arg
..../smpd_hide_string_arg
.../smpd_handle_op_write
...sock_waiting for the next event.
...SOCK_OP_READ
...\smpd_handle_op_read
....\smpd_state_reading_process_result
.....read process session result: 'SUCCESS'
..../smpd_state_reading_process_result
.../smpd_handle_op_read
...sock_waiting for the next event.
...SOCK_OP_READ
...\smpd_handle_op_read
....\smpd_state_reading_reconnect_request
.....read re-connect request: '1037'
.....closing the old socket in the left context.
.....MPIDU_Sock_post_close(1724)
.....connecting a new socket.
.....\smpd_create_context
......\smpd_init_context
.......\smpd_init_command
......./smpd_init_command
....../smpd_init_context
...../smpd_create_context
.....posting a re-connect to pcamd3000:1037 in left context.
..../smpd_state_reading_reconnect_request
.../smpd_handle_op_read
...sock_waiting for the next event.
...SOCK_OP_CONNECT
...\smpd_handle_op_connect
....\smpd_generate_session_header
.....session header: (id=1 parent=0 level=0)
..../smpd_generate_session_header
.../smpd_handle_op_connect
...sock_waiting for the next event.
...SOCK_OP_WRITE
...\smpd_handle_op_write
....\smpd_state_writing_session_header
.....wrote session header: 'id=1 parent=0 level=0'
.....\smpd_post_read_command
......posting a read for a command header on the left context, sock 1660
...../smpd_post_read_command
.....creating connect command for left node
.....creating connect command to 'pcamd2600'
.....\smpd_create_command
......\smpd_init_command
....../smpd_init_command
...../smpd_create_command
.....\smpd_add_command_arg
...../smpd_add_command_arg
.....\smpd_add_command_int_arg
...../smpd_add_command_int_arg
.....\smpd_post_write_command
......\smpd_package_command
....../smpd_package_command
......smpd_post_write_command on the left context sock 1660: 65 bytes for command: "cmd=connect src=0 dest=1 tag=0 host=pcamd2600 id=2 "
...../smpd_post_write_command
.....not connected yet: pcamd2600 not connected
..../smpd_state_writing_session_header
.../smpd_handle_op_write
...sock_waiting for the next event.
...SOCK_OP_CLOSE
...\smpd_handle_op_close
....\smpd_get_state_string
..../smpd_get_state_string
....op_close received - SMPD_CLOSING state.
....Unaffiliated left context closing.
....\smpd_free_context
.....freeing left context.
.....\smpd_init_context
......\smpd_init_command
....../smpd_init_command
...../smpd_init_context
..../smpd_free_context
.../smpd_handle_op_close
...sock_waiting for the next event.
...SOCK_OP_WRITE
...\smpd_handle_op_write
....\smpd_state_writing_cmd
.....wrote command
.....command written to left: "cmd=connect src=0 dest=1 tag=0 host=pcamd2600 id=2 "
.....moving 'connect' command to the wait_list.
..../smpd_state_writing_cmd
.../smpd_handle_op_write
...sock_waiting for the next event.
...SOCK_OP_READ
...\smpd_handle_op_read
....\smpd_state_reading_cmd_header
.....read command header
.....command header read, posting read for data: 69 bytes
..../smpd_state_reading_cmd_header
.../smpd_handle_op_read
...sock_waiting for the next event.
...SOCK_OP_READ
...\smpd_handle_op_read
....\smpd_state_reading_cmd
.....read command
.....\smpd_parse_command
...../smpd_parse_command
.....read command: "cmd=abort src=1 dest=0 tag=0 error="unable to connect to pcamd2600" "
.....\smpd_handle_command
......handling command:
...... src = 1
...... dest = 0
...... cmd = abort
...... tag = 0
...... ctx = left
...... len = 69
...... str = cmd=abort src=1 dest=0 tag=0 error="unable to connect to pcamd2600"
......\smpd_command_destination
.......0 -> 0 : returning NULL context
....../smpd_command_destination
......\smpd_handle_abort_command
.......abort: unable to connect to pcamd2600
....../smpd_handle_abort_command
...../smpd_handle_command
.....\smpd_post_read_command
......posting a read for a command header on the left context, sock 1660
...../smpd_post_read_command
.....\smpd_create_command
......\smpd_init_command
....../smpd_init_command
...../smpd_create_command
.....\smpd_post_write_command
......\smpd_package_command
....../smpd_package_command
......smpd_post_write_command on the left context sock 1660: 43 bytes for command: "cmd=close src=0 dest=1 tag=1 "
...../smpd_post_write_command
..../smpd_state_reading_cmd
.../smpd_handle_op_read
...sock_waiting for the next event.
...SOCK_OP_WRITE
...\smpd_handle_op_write
....\smpd_state_writing_cmd
.....wrote command
.....command written to left: "cmd=close src=0 dest=1 tag=1 "
.....\smpd_free_command
......\smpd_init_command
....../smpd_init_command
...../smpd_free_command
..../smpd_state_writing_cmd
.../smpd_handle_op_write
...sock_waiting for the next event.
...SOCK_OP_READ
...\smpd_handle_op_read
....\smpd_state_reading_cmd_header
.....read command header
.....command header read, posting read for data: 31 bytes
..../smpd_state_reading_cmd_header
.../smpd_handle_op_read
...sock_waiting for the next event.
...SOCK_OP_READ
...\smpd_handle_op_read
....\smpd_state_reading_cmd
.....read command
.....\smpd_parse_command
...../smpd_parse_command
.....read command: "cmd=closed src=1 dest=0 tag=1 "
.....\smpd_handle_command
......handling command:
...... src = 1
...... dest = 0
...... cmd = closed
...... tag = 1
...... ctx = left
...... len = 31
...... str = cmd=closed src=1 dest=0 tag=1
......\smpd_command_destination
.......0 -> 0 : returning NULL context
....../smpd_command_destination
......\smpd_handle_closed_command
.......closed command received from left child, closing sock.
.......MPIDU_Sock_post_close(1660)
.......received a closed at node with no parent context, assuming root, returning SMPD_EXITING.
....../smpd_handle_closed_command
...../smpd_handle_command
.....not posting read for another command because SMPD_EXITING returned
..../smpd_state_reading_cmd
.../smpd_handle_op_read
...sock_waiting for the next event.
...SOCK_OP_CLOSE
...\smpd_handle_op_close
....\smpd_get_state_string
..../smpd_get_state_string
....op_close received - SMPD_EXITING state.
....\smpd_free_context
.....freeing left context.
.....\smpd_init_context
......\smpd_init_command
....../smpd_init_command
...../smpd_init_context
..../smpd_free_context
.../smpd_handle_op_close
../smpd_enter_at_state
./main
.\smpd_exit
..\smpd_kill_all_processes
../smpd_kill_all_processes
..\smpd_finalize_drive_maps
../smpd_finalize_drive_maps
..\smpd_dbs_finalize
../smpd_dbs_finalize
What's wrong? It seems that authentication doesn't work.
I haven't shared any directory. On pcamd3000, Windows is installed in C:\Windows; on pcamd2600, it is in E:\Windows. Can this be a problem?
I also tried, on pcamd3000:
smpd -status pcamd2600
Aborting: unable to connect to pcamd2600
All these tests give the same results when launched on pcamd2600.
I found no clue in the previous messages of this mailing list. I really don't understand what is going wrong. Please help me!
Thanks,
Gianluca Arcidiacono