[MPICH] MPICH2 performance tuning and characterising
Anthony Chan
chan at mcs.anl.gov
Tue Mar 20 10:54:01 CDT 2007
On Tue, 20 Mar 2007, stephen mulcahy wrote:
> smulcahy at titan:~$ ./cpilog
> Process 0 running on titan
> pi is approximately 3.1415926535897643, Error is 0.0000000000000289
> wall clock time = 0.063363
> Writing logfile....
> Enabling the Default clock synchronization...
> Finished writing logfile ./cpilog.clog2.
>
> smulcahy at titan:~$ ./fpilog
> Process 0 of 1 is alive
> event IDs are 600 601 , 602 603 ,
> 5000 5001 , 604 605
> The number of intervals = 1000000
> pi is approximately: 3.1415926535897640 Error is: 0.0000000000000289
> pi is approximately: 3.1415926535897640 Error is: 0.0000000000000289
> pi is approximately: 3.1415926535897640 Error is: 0.0000000000000289
> pi is approximately: 3.1415926535897640 Error is: 0.0000000000000289
> pi is approximately: 3.1415926535897640 Error is: 0.0000000000000289
> Writing logfile....
> Enabling the Default clock synchronization...
> Finished writing logfile Unknown.clog2.
>
> So logging does seem to be compiled in - but for some reason the MPI
> program I'm using does not seem to use it. I have verified that we're
> using the mpirun/mpiexec commands from the latest MPICH2 install, so
> logging should be enabled there.
>
Everything looks OK with your installation of MPICH2+MPE.
Another way to check whether MPE logging is linked in is to grep
the executable: "nm <your_executable> | grep -i MPE_Log". If you
see a bunch of MPE_Log symbols listed, the MPE logging library is
linked in.
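As a sketch, the symbol check might look like this (assuming the
executable is named cpilog, as in the session above):

```shell
# Inspect the executable's symbol table for MPE logging symbols.
# "cpilog" is the example program from the session above; substitute
# your own binary's name.
nm cpilog | grep -i MPE_Log
# If the MPE logging library is linked in, this should list symbols
# with names like MPE_Log_event; no output suggests it was not linked.
```

Note that grep's -i flag makes the match case-insensitive, so the
check works regardless of how the symbols are capitalized.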
Does your app use MPI_THREAD_MULTIPLE support?
A.Chan