[mpich-discuss] mpich runs

Rajeev Thakur thakur at mcs.anl.gov
Tue Oct 19 19:26:24 CDT 2010


The answer, that is, the computed value of pi, is the same in both runs. It is odd, though, that the wall clock time printed in the second run is 0.

Rajeev

On Oct 19, 2010, at 5:30 PM, Luis Del Castillo wrote:

> Hi fellows, my name is Luis Alejandro Del Castillo, and I am having trouble figuring out why running mpirun twice with the same configuration gives me two different answers.
>  
> wrf at cl2master-> ./mpirun -np 5 ../examples/cpi
> Process 0 on cl2master
> Process 1 on node1
> Process 3 on node3
> Process 4 on node4
> Process 2 on node2
> pi is approximately 3.1416009869231245, Error is 0.0000083333333314
> wall clock time = 0.007812
> wrf at cl2master-> ./mpirun -np 5 ../examples/cpi
> Process 0 on cl2master
> Process 3 on node3
> Process 1 on node1
> Process 2 on node2
> Process 4 on node4
> pi is approximately 3.1416009869231245, Error is 0.0000083333333314
> wall clock time = 0.000000
> wrf at cl2master->
>  
> I have a Quad Core Beowulf cluster with 11 nodes (1 master, 10 slaves), all with the same configuration.
> I am doing this because I am trying to explain the performance with some graphs.
>  
> Thanks
>  
>  
> Luis Alejandro Del Castillo Riley
> luis.delcastillo at cathalac.org
> Soporte de Tecnología / IT Support  
> CATHALAC
> www.cathalac.org
> Tel: (507)317-3235 (direct)
> Tel: (507)317-3200 (switchboard)
> Fax: (507)317-3299
> 111 Ciudad del Saber, Clayton, Panamá
> Apartado 0843-03102, Panamá
>  
> _______________________________________________
> mpich-discuss mailing list
> mpich-discuss at mcs.anl.gov
> https://lists.mcs.anl.gov/mailman/listinfo/mpich-discuss
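
Since the stated goal is performance graphs, one way to collect data is to repeat each run several times and record the wall-clock time of each, then plot averages rather than single samples. A minimal sketch, assuming bash, GNU date, and bc; the placeholder command is hypothetical and stands in for the `./mpirun -np 5 ../examples/cpi` invocation from the post:

```shell
#!/bin/bash
# Repeat a command several times and record wall-clock seconds per run.
CMD="sleep 0.1"   # placeholder; replace with: ./mpirun -np 5 ../examples/cpi
for i in 1 2 3; do
  start=$(date +%s.%N)             # nanosecond-resolution timestamp (GNU date)
  $CMD
  end=$(date +%s.%N)
  echo "run $i: $(echo "$end - $start" | bc) s"
done
```

Timing the whole mpirun invocation from outside like this also includes process launch overhead, so it is a coarser measurement than the wall clock time that cpi prints internally, but it is less likely to print 0 and gives multiple samples per node count for graphing.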
