[mpich-discuss] mpich runs

Luis Del Castillo luis.delcastillo at cathalac.org
Wed Oct 20 12:20:48 CDT 2010


Thanks, Gus. I will do it.

-----Original Message-----
From: Gus Correa [mailto:gus at ldeo.columbia.edu] 
Sent: Wednesday, October 20, 2010 11:16 a.m.
To: Mpich Discuss
Subject: Re: [mpich-discuss] mpich runs

That is very true for any MPI program, even for a serial one.

Luis: You could try to reduce the "startup effect" in cpi.c
by increasing the "default # of rectangles" in the code (n=10000 now).
Another possibility is to repeat the calculation many times:
enclose the calculation in a loop,
and determine starttime and endtime outside the loop.

Also, changing the printf format from %f to %e or %g may help with
these tiny time values.
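
For example, a rough sketch of that loop (untested; the integrand is
inlined from cpi.c, NREPS is an arbitrary repetition count, and myid,
numprocs, n, mypi, and pi are assumed to be the variables cpi.c
already declares):

    /* Time NREPS repetitions of the whole calculation, so the
       measured interval is much larger than the timer resolution
       and the per-job startup jitter. */
    #define NREPS 1000
    int rep;
    double startwtime = 0.0, endwtime;

    if (myid == 0)
        startwtime = MPI_Wtime();
    for (rep = 0; rep < NREPS; rep++) {
        double h = 1.0 / (double) n, sum = 0.0, x;
        int i;
        for (i = myid + 1; i <= n; i += numprocs) {
            x = h * ((double) i - 0.5);
            sum += 4.0 / (1.0 + x * x);   /* f(x) from cpi.c */
        }
        mypi = h * sum;
        MPI_Reduce(&mypi, &pi, 1, MPI_DOUBLE, MPI_SUM, 0,
                   MPI_COMM_WORLD);
    }
    if (myid == 0) {
        endwtime = MPI_Wtime();
        /* %e keeps tiny values visible where %f prints 0.000000 */
        printf("average wall clock time = %e\n",
               (endwtime - startwtime) / NREPS);
    }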

Gus Correa

Dave Goodell wrote:
> Poorly synchronized timings of extremely short 
> computations like cpi will tend to vary wildly, 
> since the amount of time that the computation 
> takes is on the same order of magnitude as the 
> spawning time for the job. 
> You shouldn't read too much into the timing value that cpi prints out; 
> it's mostly just a demonstration that calling MPI_Wtime doesn't cause a crash :)
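> 
> A standard way to reduce that launch skew (a hedged sketch, not code
> from cpi.c; startwtime and endwtime are assumed to be the timing
> variables cpi.c uses) is to synchronize all ranks just before
> reading the clock:
> 
>     /* Every rank waits here, so the timed region begins only
>        after the whole job has finished spawning. */
>     MPI_Barrier(MPI_COMM_WORLD);
>     startwtime = MPI_Wtime();
>     /* ... the pi computation ... */
>     MPI_Barrier(MPI_COMM_WORLD);   /* wait for the slowest rank */
>     endwtime = MPI_Wtime();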
> 
> -Dave
> 
> On Oct 20, 2010, at 9:48 AM CDT, Gus Correa wrote:
> 
>> Hi Luis, Reuti, Rajeev, list
>>
>> I wonder if the different wall times really indicate an error.
>>
>> Cpi.c uses MPI_Wtime(), right?
>> There was a recent discussion on the Open MPI list about
>> how good a measure of time this produces, particularly
>> on multi-core, multi-socket systems.
>> See this thread:
>> http://www.open-mpi.org/community/lists/users/2010/10/14456.php
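>>
>> For reference, MPI_Wtick() reports the resolution of MPI_Wtime in
>> seconds. A minimal, self-contained check (a sketch using only
>> standard MPI calls) might look like this:
>>
>>     #include <mpi.h>
>>     #include <stdio.h>
>>
>>     /* Print the tick of MPI_Wtime on this system.  A run that
>>        finishes in less than one tick can legitimately report a
>>        wall clock time of 0.000000. */
>>     int main(int argc, char *argv[])
>>     {
>>         MPI_Init(&argc, &argv);
>>         printf("MPI_Wtime resolution = %e seconds\n", MPI_Wtick());
>>         MPI_Finalize();
>>         return 0;
>>     }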
>>
>> My two cents,
>> Gus Correa
>>
>>
>> Luis Del Castillo wrote:
>>> Every time I run mpirun with -np 5 or -np 8, the result for pi is the same, but the wall clock time is different.
>>> -----Original Message-----
>>> From: Reuti [mailto:reuti at staff.uni-marburg.de]
>>> Sent: Wednesday, October 20, 2010 08:31 a.m.
>>> To: mpich-discuss at mcs.anl.gov
>>> Subject: Re: [mpich-discuss] mpich runs
>>> On 20.10.2010 at 00:30, Luis Del Castillo wrote:
>>>> Hi fellows, my name is Luis Alejandro Del Castillo, and I am
>>>> having trouble figuring out why running mpirun twice with the
>>>> same configuration gives me two different answers.
>>>> wrf at cl2master-> ./mpirun -np 5 ../examples/cpi
>>>> Process 0 on cl2master
>>>> Process 1 on node1
>>>> Process 3 on node3
>>>> Process 4 on node4
>>>> Process 2 on node2
>>>> pi is approximately 3.1416009869231245, Error is 0.0000083333333314
>>>> wall clock time = 0.007812
>>>> wrf at cl2master-> ./mpirun -np 5 ../examples/cpi
>>>> Process 0 on cl2master
>>>> Process 3 on node3
>>>> Process 1 on node1
>>>> Process 2 on node2
>>>> Process 4 on node4
>>> Do you mean the different order in stdout? AFAIK this cannot be predicted.
>>> -- Reuti
>>>> pi is approximately 3.1416009869231245, Error is 0.0000083333333314
>>>> wall clock time = 0.000000
>>>> wrf at cl2master->
>>>> I have a quad-core Beowulf cluster with 11 nodes (1 master, 10 slaves), all with the same configuration.
>>>> I want to do this because I am trying to illustrate the performance with some graphs.
>>>> Thanks
>>>>  Luis Alejandro Del Castillo Riley
>>>> luis.delcastillo at cathalac.org
>>>> IT Support, CATHALAC
>>>> www.cathalac.org
>>>> Tel: (507) 317-3235 (direct)
>>>> Tel: (507) 317-3200 (switchboard)
>>>> Fax: (507) 317-3299
>>>> 111 Ciudad del Saber, Clayton, Panamá
>>>> Apartado 0843-03102, Panamá
> 

_______________________________________________
mpich-discuss mailing list
mpich-discuss at mcs.anl.gov
https://lists.mcs.anl.gov/mailman/listinfo/mpich-discuss

