Problems running MPICH2
Rajeev Thakur
thakur at mcs.anl.gov
Tue May 31 12:44:20 CDT 2005
That's round-off error: floating-point addition is not strictly associative, so
running with a different number of processes groups the partial sums differently
and changes the last few bits of the result. Since the difference is in the 13th
decimal place, you can ignore it.
Rajeev
> -----Original Message-----
> From: owner-mpich-discuss at mcs.anl.gov
> [mailto:owner-mpich-discuss at mcs.anl.gov] On Behalf Of Marcos Bahia
> Sent: Tuesday, May 31, 2005 12:12 PM
> To: mpich-discuss at mcs.anl.gov
> Subject: Problems running MPICH2
>
> I'm very new to MPI and have been having some problems; I
> hope you can help me.
> I have installed MPICH2 on a network of PCs. The processors are
> Pentium 4, 2.60 GHz, and the operating system is Windows 2000
> Professional Edition. I had no problems installing MPICH2, but I
> got some strange results while running cpi.exe.
> I used the command "mpiexec -localonly 1 cpi.exe" and got the
> following results for n = 10000000:
>
> Pi is approximately 3.1415926535897309
> Error is 0.0000000000000622
> Wall clock time = 0.146069
>
> So I began to increase the number of PCs involved in the computation,
> and the time, as expected, decreased accordingly, scaling almost
> linearly. I kept only one process running on each machine.
> The problem is that, keeping n = 10000000, the reported error was
> different each time I ran the computation with an additional PC.
> The results are:
>
> 2 PCs -> Error = 0.0000000000001918, wall clock time = 0.073221;
> 3 PCs -> Error = 0.0000000000001190, wall clock time = 0.049889;
> 4 PCs -> Error = 0.0000000000001066, wall clock time = 0.037724;
> 5 PCs -> Error = 0.0000000000000786, wall clock time = 0.030447;
> 6 PCs -> Error = 0.0000000000000226, wall clock time = 0.025617;
> 7 PCs -> Error = 0.0000000000000027, wall clock time = 0.022449;
> 8 PCs -> Error = 0.0000000000000138, wall clock time = 0.020213.
>
> The network is connected through a 3Com 16-port switch.
> Any idea what might be happening?
>
>