[petsc-users] [SLEPc] Number of iterations changes with MPI processes in Lanczos
Matthew Knepley
knepley at gmail.com
Tue Oct 23 06:53:27 CDT 2018
On Tue, Oct 23, 2018 at 6:24 AM Ale Foggia <amfoggia at gmail.com> wrote:
> Hello,
>
> I'm currently using the Lanczos solver (EPSLANCZOS) to get the smallest real
> eigenvalue (EPS_SMALLEST_REAL) of a Hermitian problem (EPS_HEP). Those are
> the only options I set for the solver. My aim is to be able to
> predict/estimate the time-to-solution. To do so, I was doing a scaling study
> of the code for different sizes of matrices and for different numbers of MPI
> processes. As I was not observing good scaling, I checked the number of
> iterations of the solver (given by EPSGetIterationNumber). I've encountered
> that for the **same size** of matrix (that is, the same problem), when
> I change the number of MPI processes, the number of iterations changes, and
> the behaviour is not monotonic. These are the numbers I've got:
>
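For reference, the setup described above corresponds roughly to the following
SLEPc sketch. The solver settings (EPSLANCZOS, EPS_HEP, EPS_SMALLEST_REAL) and
the call to EPSGetIterationNumber are the ones mentioned in the post; the toy
1-D Laplacian used as the operator H, its size n, and the error-handling style
are stand-ins for the actual Hamiltonian assembly:

#include <slepceps.h>

int main(int argc, char **argv)
{
  Mat            H;                 /* stand-in for the actual Hermitian operator */
  EPS            eps;
  PetscInt       n = 100, i, Istart, Iend, its;
  PetscErrorCode ierr;

  ierr = SlepcInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;

  /* Assemble a toy 1-D Laplacian in place of the real Hamiltonian */
  ierr = MatCreate(PETSC_COMM_WORLD, &H);CHKERRQ(ierr);
  ierr = MatSetSizes(H, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
  ierr = MatSetFromOptions(H);CHKERRQ(ierr);
  ierr = MatSetUp(H);CHKERRQ(ierr);
  ierr = MatGetOwnershipRange(H, &Istart, &Iend);CHKERRQ(ierr);
  for (i = Istart; i < Iend; i++) {
    if (i > 0)     { ierr = MatSetValue(H, i, i-1, -1.0, INSERT_VALUES);CHKERRQ(ierr); }
    if (i < n - 1) { ierr = MatSetValue(H, i, i+1, -1.0, INSERT_VALUES);CHKERRQ(ierr); }
    ierr = MatSetValue(H, i, i, 2.0, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(H, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(H, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  /* Solver configuration as described: Lanczos, Hermitian, smallest real eigenvalue */
  ierr = EPSCreate(PETSC_COMM_WORLD, &eps);CHKERRQ(ierr);
  ierr = EPSSetOperators(eps, H, NULL);CHKERRQ(ierr);
  ierr = EPSSetProblemType(eps, EPS_HEP);CHKERRQ(ierr);
  ierr = EPSSetType(eps, EPSLANCZOS);CHKERRQ(ierr);
  ierr = EPSSetWhichEigenpairs(eps, EPS_SMALLEST_REAL);CHKERRQ(ierr);
  ierr = EPSSetFromOptions(eps);CHKERRQ(ierr);

  ierr = EPSSolve(eps);CHKERRQ(ierr);
  ierr = EPSGetIterationNumber(eps, &its);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD, "iterations: %d\n", (int)its);CHKERRQ(ierr);

  ierr = EPSDestroy(&eps);CHKERRQ(ierr);
  ierr = MatDestroy(&H);CHKERRQ(ierr);
  ierr = SlepcFinalize();
  return ierr;
}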
I am sure you know this, but this test is strong scaling, and it will top out
when the per-process problem size becomes too small (we see this at several
thousand unknowns per process).
Thanks,
Matt
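To make that concrete with a hypothetical size (the actual matrix dimension is
not given above): a problem with 2,000,000 unknowns spread over 1024 ranks
leaves only about 2,000 rows per rank, which is already in the range where
communication and synchronization costs rival the local work, so the speedup
flattens out regardless of what the iteration count does.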
>
> # procs   # iters
>     960       157
>     992       189
>    1024       338
>    1056       190
>    1120       174
>    2048       136
>
> I've checked the mailing list for a similar situation and found
> another person with the same problem but with another solver ("[SLEPc] GD is
> not deterministic when using different number of cores", Nov 19 2015), but
> I think the solution that person found (removing the "-eps_harmonic" option)
> does not apply to my problem.
>
> Can you give me any hint as to the reason for this behaviour? Is there a
> way to prevent it? It's not possible to estimate/predict the time
> consumption for bigger problems if the number of iterations varies this
> much.
>
> Ale
>
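One knob that is sometimes tried to make runs with different process counts
more directly comparable is replacing SLEPc's default random starting vector
with a deterministic one via EPSSetInitialSpace; this removes one source of
dependence on the parallel layout, though it is not guaranteed to make the
iteration counts identical. A minimal sketch, reusing the eps and H names from
the fragment above:

  Vec v0;

  ierr = MatCreateVecs(H, &v0, NULL);CHKERRQ(ierr);     /* vector with the same parallel layout as H */
  ierr = VecSet(v0, 1.0);CHKERRQ(ierr);                 /* deterministic, layout-independent entries */
  ierr = EPSSetInitialSpace(eps, 1, &v0);CHKERRQ(ierr); /* use it as the starting vector             */
  ierr = VecDestroy(&v0);CHKERRQ(ierr);                 /* safe here: the EPS keeps its own reference */
  /* ... then EPSSolve(eps) as before ... */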
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/