[petsc-users] [SLEPc] GD is not deterministic when using different number of cores

Denis Davydov davydden at gmail.com
Wed Nov 25 03:10:37 CST 2015


> On 19 Nov 2015, at 11:19, Jose E. Roman <jroman at dsic.upv.es> wrote:
> 
>> 
>> On 19 Nov 2015, at 10:49, Denis Davydov <davydden at gmail.com> wrote:
>> 
>> Dear all,
>> 
>> I was trying to get some scaling results for the GD eigensolver as applied to density functional theory.
>> Interestingly enough, the number of self-consistent iterations (solution of the coupled eigenvalue problem and Poisson equations)
>> depends on the number of MPI cores used. In my case the number of iterations ranges from 19 to 24 for MPI core counts between 2 and 160.
>> That makes the whole scaling check useless, as the eigenproblem is solved a different number of times.
>> 
>> That is **not** the case when I use the Krylov-Schur eigensolver with zero shift, which makes me believe that I am missing some setting in GD to make it fully deterministic. The only non-deterministic part I am currently aware of is the initial subspace for the first SC iteration. But that is the case for both KS and GD. For subsequent iterations I provide the previously obtained eigenvectors as the initial subspace.
>> 
>> Certainly there will be some round-off error due to the different partitioning of DoFs for different numbers of MPI cores,
>> but I don't expect it to have such a strong influence, especially given the fact that I don't see this problem with KS.
>> 
>> Below is the output of -eps_view for GD with -eps_type gd -eps_harmonic -st_pc_type bjacobi -eps_gd_krylov_start -eps_target -10.0
>> I would appreciate any suggestions on how to address the issue.
> 
> The block Jacobi preconditioner differs when you change the number of processes. This will probably make GD iterate more when you use more processes.
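
Indeed, by default block Jacobi uses one block per MPI process, so the preconditioner applied inside the ST changes whenever the process count changes. A minimal sketch (illustrative names, not tested here) of how the total number of blocks could be pinned so that the preconditioner no longer depends on the MPI size, equivalent to passing -st_pc_bjacobi_blocks <n>:

  #include <slepceps.h>

  /* Sketch: fix the number of block-Jacobi blocks used by the spectral
     transformation's KSP, independently of the number of MPI processes. */
  static PetscErrorCode FixBJacobiBlocks(EPS eps, PetscInt nblocks)
  {
    ST             st;
    KSP            ksp;
    PC             pc;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = EPSGetST(eps,&st);CHKERRQ(ierr);
    ierr = STGetKSP(st,&ksp);CHKERRQ(ierr);
    ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
    ierr = PCSetType(pc,PCBJACOBI);CHKERRQ(ierr);
    ierr = PCBJacobiSetTotalBlocks(pc,nblocks,NULL);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }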

I figured out what else was causing different solutions for different numbers of MPI cores: -eps_harmonic.
As soon as I remove it from GD and JD, I get the same number of eigenproblem solves until convergence for all MPI core counts (1, 2, 4, 10, 20)
and for all methods (KS/GD/JD).
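
For reference, a rough sketch (with illustrative names; the problem type and operator setup are assumptions, not my actual code) of the GD configuration being compared, with the previously converged eigenvectors passed back as the initial subspace and the harmonic-extraction call left out:

  #include <slepceps.h>

  /* Sketch of the GD solve inside one self-consistency step.  A is the
     (assumed standard Hermitian) operator; prev[] holds the eigenvectors
     obtained in the previous SC iteration. */
  static PetscErrorCode SolveWithGD(Mat A, PetscInt nprev, Vec prev[])
  {
    EPS            eps;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = EPSCreate(PetscObjectComm((PetscObject)A),&eps);CHKERRQ(ierr);
    ierr = EPSSetOperators(eps,A,NULL);CHKERRQ(ierr);
    ierr = EPSSetProblemType(eps,EPS_HEP);CHKERRQ(ierr);       /* assumed for the sketch */
    ierr = EPSSetType(eps,EPSGD);CHKERRQ(ierr);                /* -eps_type gd */
    ierr = EPSGDSetKrylovStart(eps,PETSC_TRUE);CHKERRQ(ierr);  /* -eps_gd_krylov_start */
    ierr = EPSSetTarget(eps,-10.0);CHKERRQ(ierr);              /* -eps_target -10.0 */
    /* EPSSetExtraction(eps,EPS_HARMONIC) would correspond to -eps_harmonic;
       leaving it out gives the same iteration counts on 1/2/4/10/20 cores. */
    if (nprev > 0) {
      ierr = EPSSetInitialSpace(eps,nprev,prev);CHKERRQ(ierr); /* previous eigenvectors */
    }
    ierr = EPSSetFromOptions(eps);CHKERRQ(ierr);               /* picks up -st_pc_type bjacobi etc. */
    ierr = EPSSolve(eps);CHKERRQ(ierr);
    ierr = EPSDestroy(&eps);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }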

Regards,
Denis.
