<html><head><style type="text/css"><!-- DIV {margin:0px;} --></style></head><body><div style="font-family:times new roman,new york,times,serif;font-size:12pt"><div>Hi all,<br><br>I have installed the latest version of MPICH2 on two machines on my local network.<br>Both machines run Windows XP Pro SP2 and no firewall is active on either of them.<br>Both machines have the same user account (with Administrator privileges), AGPX, with the same password.<br>Both machines use the same smpd passphrase.<br>Both machines are in the workgroup named WORKGROUP.<br>The machine names are pcamd3000 and pcamd2600.<br><br>On pcamd3000 I did the following:<br><span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">mpiexec -register</span><br>I specified WORKGROUP\pcamd3000 and the password of the user account AGPX.<br><br>On pcamd2600 I did the following:<br>
<span style="color: rgb(0, 0, 255); font-family: courier,monaco,monospace,sans-serif;">mpiexec -register</span><br>
I specified WORKGROUP\pcamd2600 and the password of the user account AGPX.<br>
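<br>(A side note: if this mpiexec supports the <span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">-validate</span> option, which I believe the Windows MPICH2 builds do but I am not completely sure of, then running <span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">mpiexec -validate</span> on each machine should report whether the credentials stored by <span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">-register</span> are accepted by the local smpd.)<br>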
<br>Then I tried:<br><br>On pcamd3000:<br><span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">smpd -status</span><br style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">
<span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">smpd running on pcamd3000</span><br><br>On pcamd2600:<br><span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">
smpd -status</span><br style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">
<span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">smpd running on pcamd2600</span><br><br>So far, so good. Next step (from c:\program files\mpich2\examples):<br><br>On pcamd3000:<br><span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">mpiexec -hosts 1 pcamd3000 cpi.exe</span><br style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);"><span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">Enter the number of intervals: (0 quits) 0</span><br><br>On pcamd2600:<br>
<span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">mpiexec -hosts 1 pcamd2600 cpi.exe</span><br style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);"><span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">
Enter the number of intervals: (0 quits) 0</span><br>
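<br>(As an aside, there is nothing special about cpi.exe in this test: any MPI program goes through the same mpiexec/smpd start-up. A minimal program along the lines of the sketch below, built against the MPICH2 headers and import library, would serve equally well; hello.c is just a name chosen for the example.)<br><br><pre style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">/* hello.c - minimal MPI test program (a sketch, not one of the shipped examples) */
#include &lt;stdio.h&gt;
#include "mpi.h"

int main(int argc, char *argv[])
{
    int rank, size, namelen;
    char host[MPI_MAX_PROCESSOR_NAME];

    MPI_Init(&amp;argc, &amp;argv);                 /* start the MPI runtime */
    MPI_Comm_rank(MPI_COMM_WORLD, &amp;rank);   /* rank of this process */
    MPI_Comm_size(MPI_COMM_WORLD, &amp;size);   /* total number of processes */
    MPI_Get_processor_name(host, &amp;namelen); /* host this process runs on */

    printf("rank %d of %d running on %s\n", rank, size, host);

    MPI_Finalize();
    return 0;
}</pre>The resulting hello.exe could then be launched exactly like cpi.exe above, e.g. <span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">mpiexec -hosts 1 pcamd3000 hello.exe</span>.<br>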
<br>But if I try:<br><br>On pcamd3000:<br><span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">
mpiexec -hosts 2 pcamd3000 pcamd2600 cpi.exe</span><br>I obtain:<br><span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">abort: unable to connect to pcamd2600</span><br><br>On pcamd2600:<br><span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">
mpiexec -hosts 2 pcamd3000 pcamd2600 cpi.exe</span><br>
I obtain:<br><span style="color: rgb(0, 0, 255); font-family: courier,monaco,monospace,sans-serif;">
abort: unable to connect to pcamd3000</span><br>
<br>
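(For reference, the same two-process launch could also be expressed with a machine file, assuming this mpiexec accepts the <span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">-machinefile</span> option described in the MPICH2 documentation: a plain text file, say hosts.txt, listing pcamd3000 and pcamd2600 one per line, launched with <span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">mpiexec -machinefile hosts.txt -n 2 cpi.exe</span>.)<br><br>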
Both machines can see each other via ping. To check this further, on pcamd3000 I did:<br><br><span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">telnet pcamd2600 8676</span><br><br>The response was:<br><br><span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">1.0.6 292</span><br><br>so I can indeed connect to the smpd on pcamd2600.<br><br>Then I tried, on pcamd3000:<br><br><span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">mpiexec -verbose -hosts 2 pcamd3000 pcamd2600 cpi.exe</span><br><br>Here is the full
response:<br><br>..\smpd_add_host_to_default_list<br>...\smpd_add_extended_host_to_default_list<br>.../smpd_add_extended_host_to_default_list<br>../smpd_add_host_to_default_list<br>..\smpd_add_host_to_default_list<br>...\smpd_add_extended_host_to_default_list<br>.../smpd_add_extended_host_to_default_list<br>../smpd_add_host_to_default_list<br>..\smpd_get_full_path_name<br>...fixing up exe name: 'cpi.exe' -> '(null)'<br>../smpd_get_full_path_name<br>..handling
executable:<br>cpi.exe<br>..\smpd_get_next_host<br>...\smpd_get_host_id<br>.../smpd_get_host_id<br>../smpd_get_next_host<br>..\smpd_get_next_host<br>...\smpd_get_host_id<br>.../smpd_get_host_id<br>../smpd_get_next_host<br>..\smpd_create_cliques<br>...\prev_launch_node<br>.../prev_launch_node<br>...\prev_launch_node<br>.../prev_launch_node<br>...\prev_launch_node<br>.../prev_launch_node<br>...\prev_launch_node<br>.../prev_launch_node<br>../smpd_create_cliques<br>..\smpd_fix_up_host_tree<br>../smpd_fix_up_host_tree<br>./mp_parse_command_args<br>.host tree:<br>. host: pcamd3000, parent: 0, id: 1<br>. host: pcamd2600, parent: 1, id: 2<br>.launch nodes:<br>. iproc: 1, id: 2, exe: cpi.exe<br>. iproc: 0, id: 1, exe:
cpi.exe<br>.\smpd_create_context<br>..\smpd_init_context<br>...\smpd_init_command<br>.../smpd_init_command<br>../smpd_init_context<br>./smpd_create_context<br>.\smpd_make_socket_loop<br>..\smpd_get_hostname<br>../smpd_get_hostname<br>./smpd_make_socket_loop<br>.\smpd_create_context<br>..\smpd_init_context<br>...\smpd_init_command<br>.../smpd_init_command<br>../smpd_init_context<br>./smpd_create_context<br>.\smpd_enter_at_state<br>..sock_waiting for the next event.<br>..SOCK_OP_CONNECT<br>..\smpd_handle_op_connect<br>...connect succeeded, posting read of the challenge string<br>../smpd_handle_op_connect<br>..sock_waiting for the next event.<br>..SOCK_OP_READ<br>..\smpd_handle_op_read<br>...\smpd_state_reading_challenge_string<br>....read challenge string: '1.0.6 9961'<br>....\smpd_verify_version<br>..../smpd_verify_version<br>....\smpd_hash<br>..../smpd_hash<br>.../smpd_state_reading_challenge_string<br>../smpd_handle_op_read<br>..sock_waiting for the
next event.<br>..SOCK_OP_WRITE<br>..\smpd_handle_op_write<br>...\smpd_state_writing_challenge_response<br>....wrote challenge response: 'a0256a5646a163e279c8db9db9042b15'<br>.../smpd_state_writing_challenge_response<br>../smpd_handle_op_write<br>..sock_waiting for the next event.<br>..SOCK_OP_READ<br>..\smpd_handle_op_read<br>...\smpd_state_reading_connect_result<br>....read connect result: 'SUCCESS'<br>.../smpd_state_reading_connect_result<br>../smpd_handle_op_read<br>..sock_waiting for the next event.<br>..SOCK_OP_WRITE<br>..\smpd_handle_op_write<br>...\smpd_state_writing_process_session_request<br>....wrote process session request: 'process'<br>.../smpd_state_writing_process_session_request<br>../smpd_handle_op_write<br>..sock_waiting for the next event.<br>..SOCK_OP_READ<br>..\smpd_handle_op_read<br>...\smpd_state_reading_cred_request<br>....read cred request:
'credentials'<br>....\smpd_hide_string_arg<br>.....\first_token<br>...../first_token<br>.....\compare_token<br>...../compare_token<br>.....\next_token<br>......\first_token<br>....../first_token<br>......\first_token<br>....../first_token<br>...../next_token<br>..../smpd_hide_string_arg<br>..../smpd_hide_string_arg<br>....\smpd_hide_string_arg<br>.....\first_token<br>...../first_token<br>.....\compare_token<br>...../compare_token<br>.....\next_token<br>......\first_token<br>....../first_token<br>......\first_token<br>....../first_token<br>...../next_token<br>..../smpd_hide_string_arg<br>..../smpd_hide_string_arg<br>....\smpd_hide_string_arg<br>.....\first_token<br>...../first_token<br>.....\compare_token<br>...../compare_token<br>.....\next_token<br>......\first_token<br>....../first_token<br>......\first_token<br>....../first_token<br>...../next_token<br>..../smpd_hide_string_arg<br>..../smpd_hide_string_arg<br>.....\smpd_option_on<br>......\smpd_get_
smpd_data<br>.......\smpd_get_smpd_data_from_environment<br>......./smpd_get_smpd_data_from_environment<br>.......\smpd_get_smpd_data_default<br>......./smpd_get_smpd_data_default<br>.......Unable to get the data for the key 'nocache'<br>....../smpd_get_smpd_data<br>...../smpd_option_on<br>....\smpd_hide_string_arg<br>.....\first_token<br>...../first_token<br>.....\compare_token<br>...../compare_token<br>.....\next_token<br>......\first_token<br>....../first_token<br>......\first_token<br>....../first_token<br>...../next_token<br>..../smpd_hide_string_arg<br>..../smpd_hide_string_arg<br>.../smpd_handle_op_read<br>...sock_waiting for the next event.<br>...SOCK_OP_WRITE<br>...\smpd_handle_op_write<br>....\smpd_state_writing_cred_ack_yes<br>.....wrote cred request yes ack.<br>..../smpd_state_writing_cred_ack_yes<br>.../smpd_handle_op_write<br>...sock_waiting for the next
event.<br>...SOCK_OP_WRITE<br>...\smpd_handle_op_write<br>....\smpd_state_writing_account<br>.....wrote account: 'WORKGROUP\AGPX'<br>.....\smpd_encrypt_data<br>...../smpd_encrypt_data<br>..../smpd_state_writing_account<br>.../smpd_handle_op_write<br>...sock_waiting for the next
event.<br>...SOCK_OP_WRITE<br>...\smpd_handle_op_write<br>....\smpd_hide_string_arg<br>.....\first_token<br>...../first_token<br>.....\compare_token<br>...../compare_token<br>.....\next_token<br>......\first_token<br>....../first_token<br>......\first_token<br>....../first_token<br>...../next_token<br>..../smpd_hide_string_arg<br>..../smpd_hide_string_arg<br>.....\smpd_hide_string_arg<br>......\first_token<br>....../first_token<br>......\compare_token<br>....../compare_token<br>......\next_token<br>.......\first_token<br>......./first_token<br>.......\first_token<br>......./first_token<br>....../next_token<br>...../smpd_hide_string_arg<br>...../smpd_hide_string_arg<br>....\smpd_hide_string_arg<br>.....\first_token<br>...../first_token<br>.....\compare_token<br>...../compare_token<br>.....\next_token<br>......\first_token<br>....../first_token<br>......\first_token<br>....../first_token<br>...../next_token<br>..../smpd_hide_string_arg<br>..../smpd_hide_
string_arg<br>.../smpd_handle_op_write<br>...sock_waiting for the next event.<br>...SOCK_OP_READ<br>...\smpd_handle_op_read<br>....\smpd_state_reading_process_result<br>.....read process session result: 'SUCCESS'<br>..../smpd_state_reading_process_result<br>.../smpd_handle_op_read<br>...sock_waiting for the next event.<br>...SOCK_OP_READ<br>...\smpd_handle_op_read<br>....\smpd_state_reading_reconnect_request<br>.....read re-connect request: '1037'<br>.....closing the old socket in the left context.<br>.....MPIDU_Sock_post_close(1724)<br>.....connecting a new socket.<br>.....\smpd_create_context<br>......\smpd_init_context<br>.......\smpd_init_command<br>......./smpd_init_command<br>....../smpd_init_context<br>...../smpd_create_context<br>.....posting a re-connect to pcamd3000:1037 in left context.<br>..../smpd_state_reading_reconnect_request<br>.../smpd_handle_op_read<br>...sock_waiting for the next
event.<br>...SOCK_OP_CONNECT<br>...\smpd_handle_op_connect<br>....\smpd_generate_session_header<br>.....session header: (id=1 parent=0 level=0)<br>..../smpd_generate_session_header<br>.../smpd_handle_op_connect<br>...sock_waiting for the next event.<br>...SOCK_OP_WRITE<br>...\smpd_handle_op_write<br>....\smpd_state_writing_session_header<br>.....wrote session header: 'id=1 parent=0 level=0'<br>.....\smpd_post_read_command<br>......posting a read for a command header on the left context, sock 1660<br>...../smpd_post_read_command<br>.....creating connect command for left node<br>.....creating connect command to
'pcamd2600'<br>.....\smpd_create_command<br>......\smpd_init_command<br>....../smpd_init_command<br>...../smpd_create_command<br>.....\smpd_add_command_arg<br>...../smpd_add_command_arg<br>.....\smpd_add_command_int_arg<br>...../smpd_add_command_int_arg<br>.....\smpd_post_write_command<br>......\smpd_package_command<br>....../smpd_package_command<br>......smpd_post_write_command on the left context sock 1660: 65 bytes for command: "cmd=connect src=0 dest=1 tag=0 host=pcamd2600 id=2 "<br>...../smpd_post_write_command<br>.....not connected yet: pcamd2600 not connected<br>..../smpd_state_writing_session_header<br>.../smpd_handle_op_write<br>...sock_waiting for the next event.<br>...SOCK_OP_CLOSE<br>...\smpd_handle_op_close<br>....\smpd_get_state_string<br>..../smpd_get_state_string<br>....op_close received - SMPD_CLOSING state.<br>....Unaffiliated left context closing.<br>....\smpd_free_context<br>.....freeing left
context.<br>.....\smpd_init_context<br>......\smpd_init_command<br>....../smpd_init_command<br>...../smpd_init_context<br>..../smpd_free_context<br>.../smpd_handle_op_close<br>...sock_waiting for the next event.<br>...SOCK_OP_WRITE<br>...\smpd_handle_op_write<br>....\smpd_state_writing_cmd<br>.....wrote command<br>.....command written to left: "cmd=connect src=0 dest=1 tag=0 host=pcamd2600 id=2 "<br>.....moving 'connect' command to the wait_list.<br>..../smpd_state_writing_cmd<br>.../smpd_handle_op_write<br>...sock_waiting for the next event.<br>...SOCK_OP_READ<br>...\smpd_handle_op_read<br>....\smpd_state_reading_cmd_header<br>.....read command header<br>.....command header read, posting read for data: 69 bytes<br>..../smpd_state_reading_cmd_header<br>.../smpd_handle_op_read<br>...sock_waiting for the next event.<br>...SOCK_OP_READ<br>...\smpd_handle_op_read<br>....\smpd_state_reading_cmd<br>.....read
command<br>.....\smpd_parse_command<br>...../smpd_parse_command<br>.....read command: "cmd=abort src=1 dest=0 tag=0 error="unable to connect to pcamd2600" "<br>.....\smpd_handle_command<br>......handling command:<br>...... src = 1<br>...... dest = 0<br>...... cmd = abort<br>...... tag = 0<br>...... ctx = left<br>...... len = 69<br>...... str = cmd=abort src=1 dest=0 tag=0 error="unable to connect to pcamd2600" <br>......\smpd_command_destination<br>.......0 -> 0 : returning NULL context<br>....../smpd_command_destination<br>......\smpd_handle_abort_command<br>.......abort: unable to connect to pcamd2600<br>....../smpd_handle_abort_command<br>...../smpd_handle_command<br>.....\smpd_post_read_command<br>......posting a read for a command header on the left context, sock
1660<br>...../smpd_post_read_command<br>.....\smpd_create_command<br>......\smpd_init_command<br>....../smpd_init_command<br>...../smpd_create_command<br>.....\smpd_post_write_command<br>......\smpd_package_command<br>....../smpd_package_command<br>......smpd_post_write_command on the left context sock 1660: 43 bytes for command: "cmd=close src=0 dest=1 tag=1 "<br>...../smpd_post_write_command<br>..../smpd_state_reading_cmd<br>.../smpd_handle_op_read<br>...sock_waiting for the next event.<br>...SOCK_OP_WRITE<br>...\smpd_handle_op_write<br>....\smpd_state_writing_cmd<br>.....wrote command<br>.....command written to left: "cmd=close src=0 dest=1 tag=1 "<br>.....\smpd_free_command<br>......\smpd_init_command<br>....../smpd_init_command<br>...../smpd_free_command<br>..../smpd_state_writing_cmd<br>.../smpd_handle_op_write<br>...sock_waiting for the next event.<br>...SOCK_OP_READ<br>...\smpd_handle_op_read<br>....\smpd_state_reading_cmd_header<br>.....read
command header<br>.....command header read, posting read for data: 31 bytes<br>..../smpd_state_reading_cmd_header<br>.../smpd_handle_op_read<br>...sock_waiting for the next event.<br>...SOCK_OP_READ<br>...\smpd_handle_op_read<br>....\smpd_state_reading_cmd<br>.....read command<br>.....\smpd_parse_command<br>...../smpd_parse_command<br>.....read command: "cmd=closed src=1 dest=0 tag=1 "<br>.....\smpd_handle_command<br>......handling command:<br>...... src = 1<br>...... dest = 0<br>...... cmd = closed<br>...... tag = 1<br>...... ctx = left<br>...... len = 31<br>...... str = cmd=closed src=1 dest=0 tag=1 <br>......\smpd_command_destination<br>.......0 -> 0 : returning NULL context<br>....../smpd_command_destination<br>......\smpd_handle_closed_command<br>.......closed command received from left child, closing sock.<br>.......MPIDU_Sock_post_close(1660)<br>.......received a closed at node with no parent context,
assuming root, returning SMPD_EXITING.<br>....../smpd_handle_closed_command<br>...../smpd_handle_command<br>.....not posting read for another command because SMPD_EXITING returned<br>..../smpd_state_reading_cmd<br>.../smpd_handle_op_read<br>...sock_waiting for the next event.<br>...SOCK_OP_CLOSE<br>...\smpd_handle_op_close<br>....\smpd_get_state_string<br>..../smpd_get_state_string<br>....op_close received - SMPD_EXITING state.<br>....\smpd_free_context<br>.....freeing left context.<br>.....\smpd_init_context<br>......\smpd_init_command<br>....../smpd_init_command<br>...../smpd_init_context<br>..../smpd_free_context<br>.../smpd_handle_op_close<br>../smpd_enter_at_state<br>./main<br>.\smpd_exit<br>..\smpd_kill_all_processes<br>../smpd_kill_all_processes<br>..\smpd_finalize_drive_maps<br>../smpd_finalize_drive_maps<br>..\smpd_dbs_finalize<br>../smpd_dbs_finalize<br><br>What's wrong? It seems that authentication doesn't work.<br><br>I haven't shared any directories, and Windows is installed in C:\Windows on pcamd3000 but in E:\Windows on pcamd2600. Can this be a problem?<br><br>I also tried, on pcamd3000:<br><br><span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">smpd -status pcamd2600</span><br style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);"><span style="font-family: courier,monaco,monospace,sans-serif; color: rgb(0, 0, 255);">Aborting: unable to connect to pcamd2600</span><br><br>All of these tests give the same results when launched from pcamd2600.<br><br>I found no clue in earlier messages on this mailing list, and I really don't understand what is going wrong. Please help me!<br><br>Thanks,<br><br>Gianluca Arcidiacono<br></div></div><br>
</body></html>