[petsc-users] sources of floating point randomness in JFNK in serial

Matthew Knepley knepley at gmail.com
Thu May 4 07:25:06 CDT 2023


On Thu, May 4, 2023 at 8:21 AM Mark Lohry <mlohry at gmail.com> wrote:

> Do they start very similarly and then slowly drift further apart?
>
>
> Yes, this. I take it this sounds familiar?
>
> See these two examples with 20 fixed iterations pasted at the end. The
> difference for one solve is slight (the final SNES norm is identical to 5
> digits), but in the context I'm using it in (repeated applications to
> solve a steady-state multigrid problem, though here just one level) the
> differences add up such that I might reach global convergence in 35
> iterations on one run and 38 on another. It's not the end of the world,
> but I was expecting that with -np 1 these would be identical, and I'm not
> sure where the root cause would be.
>

The initial KSP residual is different, so it's the PC. Please send the
output of -snes_view. If your ASM is using a direct factorization, then it
could be randomness in whatever LU you are using.
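
For reference, the subdomain factorization in use can also be checked
programmatically once the solver is set up; a rough sketch, assuming a
recent PETSc (PetscCall) and that `snes` is the solver described here:

#include <petscsnes.h>

/* Sketch only: after SNESSetUp()/SNESSolve(), report which factorization
   package the ASM subdomain solves are using. */
static PetscErrorCode ReportSubdomainFactorization(SNES snes)
{
  KSP           ksp, *subksp;
  PC            pc, subpc;
  PetscInt      nlocal, first;
  MatSolverType stype;

  PetscFunctionBeginUser;
  PetscCall(SNESGetKSP(snes, &ksp));
  PetscCall(KSPGetPC(ksp, &pc));                           /* the ASM preconditioner */
  PetscCall(PCASMGetSubKSP(pc, &nlocal, &first, &subksp)); /* valid only after setup */
  PetscCall(KSPGetPC(subksp[0], &subpc));                  /* the ILU/LU sub-PC */
  PetscCall(PCFactorGetMatSolverType(subpc, &stype));
  PetscCall(PetscPrintf(PETSC_COMM_SELF, "subdomain factorization: %s\n", stype));
  PetscFunctionReturn(PETSC_SUCCESS);
}

Running with -snes_view shows the same information as part of the full
solver description.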

  Thanks,

    Matt


>   0 SNES Function norm 2.801842107848e+04
>     0 KSP Residual norm 4.045639499595e+01
>     1 KSP Residual norm 1.917999809040e+01
>     2 KSP Residual norm 1.616048521958e+01
> [...]
>    19 KSP Residual norm 8.788043518111e-01
>    20 KSP Residual norm 6.570851270214e-01
>   Linear solve converged due to CONVERGED_ITS iterations 20
>   1 SNES Function norm 1.801309983345e+03
> Nonlinear solve converged due to CONVERGED_ITS iterations 1
>
>
> Same system, with an identical initial SNES norm (iteration 0), but the
> initial KSP residual is slightly different:
>
>   0 SNES Function norm 2.801842107848e+04
>     0 KSP Residual norm 4.045639473002e+01
>     1 KSP Residual norm 1.917999883034e+01
>     2 KSP Residual norm 1.616048572016e+01
> [...]
>    19 KSP Residual norm 8.788046348957e-01
>    20 KSP Residual norm 6.570859588610e-01
>   Linear solve converged due to CONVERGED_ITS iterations 20
>   1 SNES Function norm 1.801311320322e+03
> Nonlinear solve converged due to CONVERGED_ITS iterations 1
>
> On Wed, May 3, 2023 at 11:05 PM Barry Smith <bsmith at petsc.dev> wrote:
>
>>
>>   Do they start very similarly and then slowly drift further apart? That
>> is, for the first couple of KSP iterations they are almost identical, but
>> then each iteration drifts a bit further apart. Similarly for the SNES
>> iterations: they start close and then, over more iterations and more
>> solves, move further apart. Or do they suddenly jump to being very
>> different? You can run with -snes_monitor -ksp_monitor to check.
>>
>> On May 3, 2023, at 9:07 PM, Mark Lohry <mlohry at gmail.com> wrote:
>>
>> This is on a single MPI rank. I haven't checked the coloring; I was just
>> guessing there. But the solutions/residuals are slightly different from
>> run to run.
>>
>> Is it fair to say that for serial JFNK / ASM ILU(0) / GMRES we should
>> expect bitwise identical results?
>>
>>
>> On Wed, May 3, 2023, 8:50 PM Barry Smith <bsmith at petsc.dev> wrote:
>>
>>>
>>>   No, the coloring should be identical every time. Do you see
>>> differences with 1 MPI rank? (Or much smaller ones?).
>>>
>>>
>>>
>>> > On May 3, 2023, at 8:42 PM, Mark Lohry <mlohry at gmail.com> wrote:
>>> >
>>> > I'm running multiple iterations of newtonls with an MFFD/JFNK
>>> nonlinear solver where I give it the sparsity. PC asm, KSP gmres, with
>>> SNESSetLagJacobian -2 (compute the Jacobian once and then keep it frozen).
>>> >
>>> > I'm seeing slight (<1%) but nonzero differences in residuals from run
>>> to run. I'm wondering where randomness might enter here -- does the
>>> Jacobian coloring use a random seed?
>>>
>>>
>>
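
In case it helps with reproducing, the configuration described above
(matrix-free operator, assembled preconditioner matrix from colored finite
differences on the supplied sparsity, ASM with the default ILU(0) subdomain
solve, GMRES, Jacobian frozen after the first build) might be wired up
roughly as below. The names, the context argument, and the use of
SNESComputeJacobianDefaultColor are illustrative rather than taken from the
code in this thread, and a recent PETSc (PetscCall) is assumed:

#include <petscsnes.h>

/* Sketch only: JFNK with a frozen, colored-FD preconditioner matrix. */
static PetscErrorCode ConfigureSolver(SNES snes, Vec r, Mat J,
                                      PetscErrorCode (*FormFunction)(SNES, Vec, Vec, void *),
                                      void *ctx)
{
  KSP ksp;
  PC  pc;

  PetscFunctionBeginUser;
  PetscCall(SNESSetFunction(snes, r, FormFunction, ctx));
  /* J carries the user-supplied sparsity and serves as the preconditioning
     matrix; -snes_mf_operator supplies the matrix-free (MFFD) operator. */
  PetscCall(SNESSetJacobian(snes, J, J, SNESComputeJacobianDefaultColor, NULL));
  PetscCall(SNESSetLagJacobian(snes, -2));  /* compute once, then keep frozen */
  PetscCall(SNESGetKSP(snes, &ksp));
  PetscCall(KSPSetType(ksp, KSPGMRES));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCASM));          /* serial subdomain default is ILU(0) */
  PetscCall(SNESSetFromOptions(snes));      /* picks up -snes_mf_operator, monitors */
  PetscFunctionReturn(PETSC_SUCCESS);
}

Most of this can also be selected from the command line with
-snes_mf_operator -ksp_type gmres -pc_type asm -snes_lag_jacobian -2.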

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/