Hi,

Can you share a directory on one machine and access it from the other machine, using the username AGPX and that password?
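For example (just a sketch: the share name "mpitest" and the folder path below are made up, and this assumes only the standard Windows net commands), you could create a test share on pcamd2600 and then try to map it from pcamd3000 with the AGPX account:

    rem On pcamd2600, logged in as AGPX:
    mkdir C:\mpitest
    net share mpitest=C:\mpitest

    rem On pcamd3000, logged in as AGPX (the * prompts for the password):
    net use \\pcamd2600\mpitest * /user:pcamd2600\AGPX

If the mapping succeeds, Windows is accepting the AGPX credentials across the network; if it fails with a logon or access error, the MPICH2 failure is most likely the same authentication problem.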

Regards,
Jayesh

________________________________

From: owner-mpich-discuss@mcs.anl.gov [mailto:owner-mpich-discuss@mcs.anl.gov] On Behalf Of AGPX
Sent: Thursday, November 01, 2007 1:09 PM
To: mpich-discuss@mcs.anl.gov
Subject: [MPICH] MPICH2 on Windows XP Pro Network

Hi all,

I have installed the latest version of MPICH2 on two machines on my local network.
Both machines run Windows XP Pro SP2 and no firewall is active (see the quick check below).
Both machines have the same user account (with Administrator privileges), AGPX, with the same password.
Both machines use the same passphrase for smpd.
Both machines are in the workgroup named WORKGROUP.
The machine names are pcamd3000 and pcamd2600.
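To double-check the firewall point above, the XP SP2 firewall state can be queried from a command prompt (this is the standard Windows netsh firewall context, nothing MPICH2-specific):

    netsh firewall show state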

On pcamd3000 I do the following:

    mpiexec -register

I specified WORKGROUP\pcamd3000 and the password used for the user account AGPX.

On pcamd2600 I do the following:

    mpiexec -register

I specified WORKGROUP\pcamd2600 and the password used for the user account AGPX.
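As an aside, the registered credentials can apparently be checked without launching a job. Assuming the Windows mpiexec in this MPICH2 build supports the -validate option (I am not certain of the exact flags), something like the following, run on pcamd3000, should report whether the smpd on pcamd2600 accepts the stored account and password:

    mpiexec -validate -host pcamd2600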

Then I tried the following.

On pcamd3000:

    smpd -status
    smpd running on pcamd3000

On pcamd2600:

    smpd -status
    smpd running on pcamd2600

So far so good. Next step (from C:\Program Files\MPICH2\examples):

On pcamd3000:

    mpiexec -hosts 1 pcamd3000 cpi.exe
    Enter the number of intervals:  (0 quits) 0

On pcamd2600:

    mpiexec -hosts 1 pcamd2600 cpi.exe
    Enter the number of intervals:  (0 quits) 0

But if I try, on pcamd3000:

    mpiexec -hosts 2 pcamd3000 pcamd2600 cpi.exe

I obtain:

    abort: unable to connect to pcamd2600

and on pcamd2600:

    mpiexec -hosts 2 pcamd3000 pcamd2600 cpi.exe

I obtain:

    abort: unable to connect to pcamd3000

Both machines can see each other (via ping). To check further, on pcamd3000 I ran:

    telnet pcamd2600 8676

and the response was:

    1.0.6 292

So basically, I can connect to the smpd on pcamd2600.
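Another check that might help isolate things, since it exercises the smpd connection and authentication without involving MPI itself, is to launch an ordinary Windows command on the remote machine (hostname.exe here is just an arbitrary program that is on the PATH on XP):

    mpiexec -hosts 1 pcamd2600 hostname

If that also ends in "abort: unable to connect to pcamd2600", the problem is clearly in the smpd connection/authentication step rather than in cpi.exe.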

I then tried, on pcamd3000:

    mpiexec -verbose -hosts 2 pcamd3000 pcamd2600 cpi.exe

Here is the full output:

..\smpd_add_host_to_default_list
...\smpd_add_extended_host_to_default_list
.../smpd_add_extended_host_to_default_list
../smpd_add_host_to_default_list
..\smpd_add_host_to_default_list
...\smpd_add_extended_host_to_default_list
.../smpd_add_extended_host_to_default_list
../smpd_add_host_to_default_list
..\smpd_get_full_path_name
...fixing up exe name: 'cpi.exe' -> '(null)'
../smpd_get_full_path_name
..handling executable:
cpi.exe
..\smpd_get_next_host
...\smpd_get_host_id
.../smpd_get_host_id
../smpd_get_next_host
..\smpd_get_next_host
...\smpd_get_host_id
.../smpd_get_host_id
../smpd_get_next_host
..\smpd_create_cliques
...\prev_launch_node
.../prev_launch_node
...\prev_launch_node
.../prev_launch_node
...\prev_launch_node
.../prev_launch_node
...\prev_launch_node
.../prev_launch_node
../smpd_create_cliques
..\smpd_fix_up_host_tree
../smpd_fix_up_host_tree
./mp_parse_command_args
.host tree:
. host: pcamd3000, parent: 0, id: 1
. host: pcamd2600, parent: 1, id: 2
.launch nodes:
. iproc: 1, id: 2, exe: cpi.exe
. iproc: 0, id: 1, exe: cpi.exe
.\smpd_create_context
..\smpd_init_context
...\smpd_init_command
.../smpd_init_command
../smpd_init_context
./smpd_create_context
.\smpd_make_socket_loop
..\smpd_get_hostname
../smpd_get_hostname
./smpd_make_socket_loop
.\smpd_create_context
..\smpd_init_context
...\smpd_init_command
.../smpd_init_command
../smpd_init_context
./smpd_create_context
.\smpd_enter_at_state
..sock_waiting for the next event.
..SOCK_OP_CONNECT
..\smpd_handle_op_connect
...connect succeeded, posting read of the challenge string
../smpd_handle_op_connect
..sock_waiting for the next event.
..SOCK_OP_READ
..\smpd_handle_op_read
...\smpd_state_reading_challenge_string
....read challenge string: '1.0.6 9961'
....\smpd_verify_version
..../smpd_verify_version
....\smpd_hash
..../smpd_hash
.../smpd_state_reading_challenge_string
../smpd_handle_op_read
..sock_waiting for the next event.
..SOCK_OP_WRITE
..\smpd_handle_op_write
...\smpd_state_writing_challenge_response
....wrote challenge response: 'a0256a5646a163e279c8db9db9042b15'
.../smpd_state_writing_challenge_response
../smpd_handle_op_write
..sock_waiting for the next event.
..SOCK_OP_READ
..\smpd_handle_op_read
...\smpd_state_reading_connect_result
....read connect result: 'SUCCESS'
.../smpd_state_reading_connect_result
../smpd_handle_op_read
..sock_waiting for the next event.
..SOCK_OP_WRITE
..\smpd_handle_op_write
...\smpd_state_writing_process_session_request
....wrote process session request: 'process'
.../smpd_state_writing_process_session_request
../smpd_handle_op_write
..sock_waiting for the next event.
..SOCK_OP_READ
..\smpd_handle_op_read
...\smpd_state_reading_cred_request
....read cred request: 'credentials'
....\smpd_hide_string_arg
.....\first_token
...../first_token
.....\compare_token
...../compare_token
.....\next_token
......\first_token
....../first_token
......\first_token
....../first_token
...../next_token
..../smpd_hide_string_arg
..../smpd_hide_string_arg
....\smpd_hide_string_arg
.....\first_token
...../first_token
.....\compare_token
...../compare_token
.....\next_token
......\first_token
....../first_token
......\first_token
....../first_token
...../next_token
..../smpd_hide_string_arg
..../smpd_hide_string_arg
....\smpd_hide_string_arg
.....\first_token
...../first_token
.....\compare_token
...../compare_token
.....\next_token
......\first_token
....../first_token
......\first_token
....../first_token
...../next_token
..../smpd_hide_string_arg
..../smpd_hide_string_arg
.....\smpd_option_on
......\smpd_get_smpd_data
.......\smpd_get_smpd_data_from_environment
......./smpd_get_smpd_data_from_environment
.......\smpd_get_smpd_data_default
......./smpd_get_smpd_data_default
.......Unable to get the data for the key 'nocache'
....../smpd_get_smpd_data
...../smpd_option_on
....\smpd_hide_string_arg
.....\first_token
...../first_token
.....\compare_token
...../compare_token
.....\next_token
......\first_token
....../first_token
......\first_token
....../first_token
...../next_token
..../smpd_hide_string_arg
..../smpd_hide_string_arg
.../smpd_handle_op_read
...sock_waiting for the next event.
...SOCK_OP_WRITE
...\smpd_handle_op_write
....\smpd_state_writing_cred_ack_yes
.....wrote cred request yes ack.
..../smpd_state_writing_cred_ack_yes
.../smpd_handle_op_write
...sock_waiting for the next event.
...SOCK_OP_WRITE
...\smpd_handle_op_write
....\smpd_state_writing_account
.....wrote account: 'WORKGROUP\AGPX'
.....\smpd_encrypt_data
...../smpd_encrypt_data
..../smpd_state_writing_account
.../smpd_handle_op_write
...sock_waiting for the next event.
...SOCK_OP_WRITE
...\smpd_handle_op_write
....\smpd_hide_string_arg
.....\first_token
...../first_token
.....\compare_token
...../compare_token
.....\next_token
......\first_token
....../first_token
......\first_token
....../first_token
...../next_token
..../smpd_hide_string_arg
..../smpd_hide_string_arg
.....\smpd_hide_string_arg
......\first_token
....../first_token
......\compare_token
....../compare_token
......\next_token
.......\first_token
......./first_token
.......\first_token
......./first_token
....../next_token
...../smpd_hide_string_arg
...../smpd_hide_string_arg
....\smpd_hide_string_arg
.....\first_token
...../first_token
.....\compare_token
...../compare_token
.....\next_token
......\first_token
....../first_token
......\first_token
....../first_token
...../next_token
..../smpd_hide_string_arg
..../smpd_hide_string_arg
.../smpd_handle_op_write
...sock_waiting for the next event.
...SOCK_OP_READ
...\smpd_handle_op_read
....\smpd_state_reading_process_result
.....read process session result: 'SUCCESS'
..../smpd_state_reading_process_result
.../smpd_handle_op_read
...sock_waiting for the next event.
...SOCK_OP_READ
...\smpd_handle_op_read
....\smpd_state_reading_reconnect_request
.....read re-connect request: '1037'
.....closing the old socket in the left context.
.....MPIDU_Sock_post_close(1724)
.....connecting a new socket.
.....\smpd_create_context
......\smpd_init_context
.......\smpd_init_command
......./smpd_init_command
....../smpd_init_context
...../smpd_create_context
.....posting a re-connect to pcamd3000:1037 in left context.
..../smpd_state_reading_reconnect_request
.../smpd_handle_op_read
...sock_waiting for the next event.
...SOCK_OP_CONNECT
...\smpd_handle_op_connect
....\smpd_generate_session_header
.....session header: (id=1 parent=0 level=0)
..../smpd_generate_session_header
.../smpd_handle_op_connect
...sock_waiting for the next event.
...SOCK_OP_WRITE
...\smpd_handle_op_write
....\smpd_state_writing_session_header
.....wrote session header: 'id=1 parent=0 level=0'
.....\smpd_post_read_command
......posting a read for a command header on the left context, sock 1660
...../smpd_post_read_command
.....creating connect command for left node
.....creating connect command to 'pcamd2600'
.....\smpd_create_command
......\smpd_init_command
....../smpd_init_command
...../smpd_create_command
.....\smpd_add_command_arg
...../smpd_add_command_arg
.....\smpd_add_command_int_arg
...../smpd_add_command_int_arg
.....\smpd_post_write_command
......\smpd_package_command
....../smpd_package_command
......smpd_post_write_command on the left context sock 1660: 65 bytes for command: "cmd=connect src=0 dest=1 tag=0 host=pcamd2600 id=2 "
...../smpd_post_write_command
.....not connected yet: pcamd2600 not connected
..../smpd_state_writing_session_header
.../smpd_handle_op_write
...sock_waiting for the next event.
...SOCK_OP_CLOSE
...\smpd_handle_op_close
....\smpd_get_state_string
..../smpd_get_state_string
....op_close received - SMPD_CLOSING state.
....Unaffiliated left context closing.
....\smpd_free_context
.....freeing left context.
.....\smpd_init_context
......\smpd_init_command
....../smpd_init_command
...../smpd_init_context
..../smpd_free_context
.../smpd_handle_op_close
...sock_waiting for the next event.
...SOCK_OP_WRITE
...\smpd_handle_op_write
....\smpd_state_writing_cmd
.....wrote command
.....command written to left: "cmd=connect src=0 dest=1 tag=0 host=pcamd2600 id=2 "
.....moving 'connect' command to the wait_list.
..../smpd_state_writing_cmd
.../smpd_handle_op_write
...sock_waiting for the next event.
...SOCK_OP_READ
...\smpd_handle_op_read
....\smpd_state_reading_cmd_header
.....read command header
.....command header read, posting read for data: 69 bytes
..../smpd_state_reading_cmd_header
.../smpd_handle_op_read
...sock_waiting for the next event.
...SOCK_OP_READ
...\smpd_handle_op_read
....\smpd_state_reading_cmd
.....read command
.....\smpd_parse_command
...../smpd_parse_command
.....read command: "cmd=abort src=1 dest=0 tag=0 error="unable to connect to pcamd2600" "
.....\smpd_handle_command
......handling command:
...... src  = 1
...... dest = 0
...... cmd  = abort
...... tag  = 0
...... ctx  = left
...... len  = 69
...... str  = cmd=abort src=1 dest=0 tag=0 error="unable to connect to pcamd2600"
......\smpd_command_destination
.......0 -> 0 : returning NULL context
....../smpd_command_destination
......\smpd_handle_abort_command
.......abort: unable to connect to pcamd2600
....../smpd_handle_abort_command
...../smpd_handle_command
.....\smpd_post_read_command
......posting a read for a command header on the left context, sock 1660
...../smpd_post_read_command
.....\smpd_create_command
......\smpd_init_command
....../smpd_init_command
...../smpd_create_command
.....\smpd_post_write_command
......\smpd_package_command
....../smpd_package_command
......smpd_post_write_command on the left context sock 1660: 43 bytes for command: "cmd=close src=0 dest=1 tag=1 "
...../smpd_post_write_command
..../smpd_state_reading_cmd
.../smpd_handle_op_read
...sock_waiting for the next event.
...SOCK_OP_WRITE
...\smpd_handle_op_write
....\smpd_state_writing_cmd
.....wrote command
.....command written to left: "cmd=close src=0 dest=1 tag=1 "
.....\smpd_free_command
......\smpd_init_command
....../smpd_init_command
...../smpd_free_command
..../smpd_state_writing_cmd
.../smpd_handle_op_write
...sock_waiting for the next event.
...SOCK_OP_READ
...\smpd_handle_op_read
....\smpd_state_reading_cmd_header
.....read command header
.....command header read, posting read for data: 31 bytes
..../smpd_state_reading_cmd_header
.../smpd_handle_op_read
...sock_waiting for the next event.
...SOCK_OP_READ
...\smpd_handle_op_read
....\smpd_state_reading_cmd
.....read command
.....\smpd_parse_command
...../smpd_parse_command
.....read command: "cmd=closed src=1 dest=0 tag=1 "
.....\smpd_handle_command
......handling command:
...... src  = 1
...... dest = 0
...... cmd  = closed
...... tag  = 1
...... ctx  = left
...... len  = 31
...... str  = cmd=closed src=1 dest=0 tag=1
......\smpd_command_destination
.......0 -> 0 : returning NULL context
....../smpd_command_destination
......\smpd_handle_closed_command
.......closed command received from left child, closing sock.
.......MPIDU_Sock_post_close(1660)
.......received a closed at node with no parent context, assuming root, returning SMPD_EXITING.
....../smpd_handle_closed_command
...../smpd_handle_command
.....not posting read for another command because SMPD_EXITING returned
..../smpd_state_reading_cmd
.../smpd_handle_op_read
...sock_waiting for the next event.
...SOCK_OP_CLOSE
...\smpd_handle_op_close
....\smpd_get_state_string
..../smpd_get_state_string
....op_close received - SMPD_EXITING state.
....\smpd_free_context
.....freeing left context.
.....\smpd_init_context
......\smpd_init_command
....../smpd_init_command
...../smpd_init_context
..../smpd_free_context
.../smpd_handle_op_close
../smpd_enter_at_state
./main
.\smpd_exit
..\smpd_kill_all_processes
../smpd_kill_all_processes
..\smpd_finalize_drive_maps
../smpd_finalize_drive_maps
..\smpd_dbs_finalize
../smpd_dbs_finalize

What's wrong? It seems that authentication is not working.

I haven't shared any directory, and Windows is installed in C:\Windows on pcamd3000 but in E:\Windows on pcamd2600. Can this be a problem?
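One more thing that might be worth ruling out, since smpd refuses connections when its passphrases differ: that the smpd passphrase really is identical on both machines. Assuming smpd supports reinstalling the service with an explicit passphrase (the -install -phrase form is taken from other MPICH2-on-Windows reports, and I am not certain of the exact flags), it could be reset consistently on both machines with:

    smpd -remove
    smpd -install -phrase <the same passphrase on both machines>

(the actual passphrase is deliberately not shown here).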

I also tried, on pcamd3000:

    smpd -status pcamd2600
    Aborting: unable to connect to pcamd2600

All of these tests give the same results when launched from pcamd2600.

I found no clue in the earlier messages on this mailing list, and I really don't understand what is going wrong. Please help me!

Thanks,

Gianluca Arcidiacono