[petsc-users] Under what conditions is an MPI-based solution useful?
Jose E. Roman
jroman at dsic.upv.es
Wed Jan 13 02:50:57 CST 2010
On 13/01/2010, Takuya Sekikawa wrote:
> Dear SLEPc/PETSc team,
>
> I tried to run SLEPc's sample program ex1 in an MPI-based multi-PC
> environment (1 PC vs. 2 PCs).
>
> (example)
> [running ex1 on only 1 PC]
> ----------------------------------------------------------
> $ time -p mpiexec -n 1 ./ex1 -n 4000 -eps_max_it 10000
> ...
> real 76.2
> ----------------------------------------------------------
> That is, ex1 took 76.2 seconds.
>
> Next, I ran the same sample in a 2-PC environment.
>
> [running ex1 on 2 PCs]
> ----------------------------------------------------------
> $ time -p mpiexec -n 2 ./ex1 -n 4000 -eps_max_it 10000
> ...
> real 265.54
> ----------------------------------------------------------
> I got 265.54 seconds (slower than on a single PC).
>
> [Q1]
> Can the ex1 sample speed up with MPI? If so, generally under what conditions?
Yes. The same example on my desktop computer (Intel Core 2 Duo):
With -n 1 --> real 33.99
With -n 2 --> real 21.90
If you simply have two PCs connected via a slow network, then you cannot expect good speedup. Try a cluster with a fast interconnect.
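One way to check where the time goes (assuming PETSc's built-in profiling, enabled with the -log_summary option in PETSc versions of that era) is to rerun with profiling, e.g.:
----------------------------------------------------------
$ mpiexec -n 2 ./ex1 -n 4000 -eps_max_it 10000 -log_summary
----------------------------------------------------------
Among other things, the summary reports the fraction of time spent in MPI communication, which makes a network bottleneck easy to spot.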
On the other hand, a better way to measure the parallel execution time is to edit the source file and put PetscGetTime calls around the EPSSolve call.
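For example, the timing could look like this (a minimal sketch, assuming the eps and ierr variables already declared in ex1):
----------------------------------------------------------
  PetscLogDouble t1, t2;

  ierr = PetscGetTime(&t1);CHKERRQ(ierr);   /* start wall-clock timer */
  ierr = EPSSolve(eps);CHKERRQ(ierr);       /* the actual solve */
  ierr = PetscGetTime(&t2);CHKERRQ(ierr);   /* stop timer */
  ierr = PetscPrintf(PETSC_COMM_WORLD,"EPSSolve time: %g s\n",
                     t2-t1);CHKERRQ(ierr);  /* elapsed solve time */
----------------------------------------------------------
This excludes startup, matrix assembly, and MPI initialization, which can otherwise dominate a small run.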
>
> [Q2]
> Generally, is MPI only useful for very large matrices?
> I now have to solve an eigenvalue problem with a 1M x 1M matrix;
> should I use an MPI-based system?
For a matrix of dimension 1 million, I would suggest running in parallel on an MPI cluster. However, a single computer might be enough if the matrix is very sparse, you need very few eigenvalues, and/or the system has enough memory (but in that case, be prepared for very long response times, depending on how fast your problem converges).
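As a rough back-of-envelope illustration (the 10 nonzeros per row is a made-up assumption, not a property of your problem): a 1M x 1M matrix with ~10 nonzeros per row stored in PETSc's AIJ format needs about 1e6 x 10 x 12 bytes ~ 120 MB (an 8-byte value plus a 4-byte column index per entry), and each basis vector of the eigensolver adds 1e6 x 8 bytes = 8 MB, so a few dozen basis vectors still fit in a single machine's memory. A denser matrix, or a larger subspace, changes this picture quickly.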
Jose
>
> Thanks,
> Takuya
> ---------------------------------------------------------------
> Takuya Sekikawa
> Mathematical Systems, Inc
> sekikawa at msi.co.jp
> ---------------------------------------------------------------
>