[petsc-users] Streams scaling.log
Manuel Valera
mvalera-w at sdsu.edu
Wed Apr 25 14:34:24 CDT 2018
I get it, thanks. That's a strong argument; I will tell my advisor about it.
Have a great day,
On Wed, Apr 25, 2018 at 12:30 PM, Smith, Barry F. <bsmith at mcs.anl.gov>
wrote:
>
>
> > On Apr 25, 2018, at 2:12 PM, Manuel Valera <mvalera-w at sdsu.edu> wrote:
> >
> > Hi and thanks for the quick answer,
> >
> > Yes, it looks like I am using MPICH for my configure instead of the
> > system installation of OpenMPI. In the past I had a better experience
> > using MPICH, but maybe this is causing a conflict; should I
> > reconfigure using the system MPI installation?
> >
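> > (A minimal sketch of what that reconfigure could look like -- the
> > OpenMPI install path here is an assumption and will differ on your
> > system:
> >
> >   ./configure --with-mpi-dir=/usr/lib64/openmpi
> >   make all
> >   make streams NPMAX=20
> >
> > The important part is that the mpirun used to launch jobs comes from
> > the same MPI installation that PETSc was configured against.)
> >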
> > I solved the problem in a different way by logging into the nodes I
> > wanted to use and running the make streams test there, but I get the
> > following:
> >
> > np speedup
> > 1 1.0
> > 2 1.51
> > 3 2.17
> > 4 2.66
> > 5 2.87
> > 6 3.06
> > 7 3.44
> > 8 3.84
> > 9 3.81
> > 10 3.17
> > 11 3.69
> > 12 3.81
> > 13 3.26
> > 14 3.51
> > 15 3.61
> > 16 3.81
> > 17 3.8
> > 18 3.64
> > 19 3.48
> > 20 4.01
> >
> > So, very modest scaling. This is about the same as I get with my
> > application; how can I make it run faster?
>
> You can't: the memory bandwidth is the limiting factor on this machine
> (not the number of cores), and there is nothing to be done about it.
> When buying machines, make sure that memory bandwidth is an important
> factor in the decision.
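>
> (As a concrete illustration -- a minimal sketch, purely for exposition
> and not the actual benchmark source -- this is the STREAM triad kernel
> that the streams test times. Each iteration moves 24 bytes of data for
> only 2 flops, so once the memory bus is saturated, adding cores cannot
> make it faster:
>
>   #include <stdlib.h>
>
>   #define N 20000000
>
>   int main(void) {
>     double *a = malloc(N * sizeof(double));
>     double *b = malloc(N * sizeof(double));
>     double *c = malloc(N * sizeof(double));
>     double scalar = 3.0;
>     for (long i = 0; i < N; i++) { b[i] = 1.0; c[i] = 2.0; }
>     /* Triad: two loads + one store = 24 bytes per iteration, 2 flops */
>     for (long i = 0; i < N; i++)
>       a[i] = b[i] + scalar * c[i];
>     free(a); free(b); free(c);
>     return 0;
>   }
>
> That is also why the speedup table above climbs to roughly 4x and then
> flattens: the memory system tops out, not the code.)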
>
> Barry
>
>
> > I am already using the --map-by and --machinefile arguments for
> > mpirun; maybe this is also a conflict between the different MPI
> > installations?
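> >
> > (For example -- hostfile name, process count, and install path are
> > assumptions:
> >
> >   /path/to/openmpi/bin/mpirun --machinefile hosts.txt --map-by core -np 8 ./app
> >
> > If that mpirun belongs to a different MPI installation than the one
> > PETSc was built against, each rank will initialize as a standalone
> > single-process job.)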
> >
> > Thanks,
> >
> >
> >
> > On Wed, Apr 25, 2018 at 11:51 AM, Karl Rupp <rupp at iue.tuwien.ac.at>
> wrote:
> > Hi Manuel,
> >
> > it looks like the wrong MPI is being used. You should see an
> > increasing number of processes, e.g.:
> >
> > Number of MPI processes 1 Processor names node37
> > Triad: 6052.3571 Rate (MB/s)
> > Number of MPI processes 2 Processor names node37 node37
> > Triad: 9138.9376 Rate (MB/s)
> > Number of MPI processes 3 Processor names node37 node37 node37
> > Triad: 11077.5905 Rate (MB/s)
> > Number of MPI processes 4 Processor names node37 node37 node37 node37
> > Triad: 12055.9123 Rate (MB/s)
> >
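> > (A minimal sketch, not the actual PETSc benchmark source, of how such
> > a line is produced -- when the launcher and the MPI library do not
> > match, MPI_Comm_size returns 1 in every process, which is exactly the
> > repeated "Number of MPI processes 1" in the log below:
> >
> >   #include <mpi.h>
> >   #include <stdio.h>
> >
> >   int main(int argc, char **argv) {
> >     int size, len;
> >     char name[MPI_MAX_PROCESSOR_NAME];
> >     MPI_Init(&argc, &argv);
> >     MPI_Comm_size(MPI_COMM_WORLD, &size);
> >     MPI_Get_processor_name(name, &len);
> >     printf("Number of MPI processes %d Processor names %s\n", size, name);
> >     MPI_Finalize();
> >     return 0;
> >   }
> > )
> >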
> > Best regards,
> > Karli
> >
> >
> >
> >
> > On 04/25/2018 08:26 PM, Manuel Valera wrote:
> > Hi,
> >
> > I'm running scaling tests on my system to find out why my scaling is
> > so poor, and after following the MPIVersion guidelines my scaling.log
> > output looks like this:
> >
> > Number of MPI processes 1 Processor names node37
> > Triad: 12856.9252 Rate (MB/s)
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Triad: 9138.3320 Rate (MB/s)
> > Triad: 9945.0006 Rate (MB/s)
> > Triad: 10480.8471 Rate (MB/s)
> > Triad: 12055.4846 Rate (MB/s)
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Triad: 7394.1014 Rate (MB/s)
> > Triad: 5528.9757 Rate (MB/s)
> > Triad: 6052.7506 Rate (MB/s)
> > Triad: 6188.5710 Rate (MB/s)
> > Triad: 6944.4515 Rate (MB/s)
> > Triad: 7407.1594 Rate (MB/s)
> > Triad: 9508.1984 Rate (MB/s)
> > Triad: 10699.7551 Rate (MB/s)
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Number of MPI processes 1 Processor names node37
> > Triad: 6682.3749 Rate (MB/s)
> > Triad: 6825.3243 Rate (MB/s)
> > Triad: 7217.8178 Rate (MB/s)
> > Triad: 7525.1025 Rate (MB/s)
> > Triad: 7882.1781 Rate (MB/s)
> > Triad: 8071.1430 Rate (MB/s)
> > Triad: 10341.9424 Rate (MB/s)
> > Triad: 10418.4740 Rate (MB/s)
> >
> >
> > Is this normal? It feels different from what I get from a usual
> > streams test; how can I get it to work properly?
> >
> > Thanks,
> >
> >
> >
> >
> >
>
>