[petsc-users] pc_redistribute issue
Mark Adams
mfadams at lbl.gov
Sat Jan 27 14:37:55 CST 2024
Note, pc_redistribute is a great idea, but you lose the block size, which is
obvious once you realize it, but it is error prone.
Maybe it would be better to throw an error if bs > 1 and add a
-pc_redistribute_ignore_block_size option, or something similar, for users who
want to press on.
Thanks,
Mark
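For reference, the run under discussion can be assembled from the option table
quoted at the bottom of this thread. A sketch (the executable name "./ex" is a
stand-in; the thread does not give the actual program, and "-f S.bin" is the
input matrix from the option table):

```shell
# Outer solve: preonly with PCREDISTRIBUTE; inner (redistributed) solve:
# GMRES + block Jacobi with a MUMPS LU sub-solve. The redistribute_ prefix
# routes options to the inner KSP/PC that PCREDISTRIBUTE creates.
./ex -f S.bin \
  -ksp_type preonly -pc_type redistribute \
  -mat_block_size 36 \
  -redistribute_ksp_type gmres \
  -redistribute_ksp_rtol 1e-12 \
  -redistribute_ksp_converged_reason \
  -redistribute_ksp_monitor_true_residual \
  -redistribute_pc_type bjacobi \
  -redistribute_sub_pc_type lu \
  -redistribute_sub_pc_factor_mat_solver_type mumps \
  -ksp_monitor -mat_view ascii::ascii_info -options_left
```

-redistribute_ksp_monitor_true_residual is the diagnostic suggested below for
comparing the preconditioned norm (used by the convergence test here) against
the true residual norm.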
On Sat, Jan 27, 2024 at 1:26 PM Mark Adams <mfadams at lbl.gov> wrote:
> Well, that puts the reason after the iterations, which is progress.
>
> Oh, I see: the preconditioned norm goes down a lot, but the reported true
> residual that you would think is used for the convergence test (see first
> post) does not go down 12 digits.
> This matrix is very ill-conditioned; LU only gets about 7 digits.
>
> Thanks,
> Mark
>
> Residual norms for redistribute_ solve.
> 0 KSP preconditioned resid norm 3.988887683909e+16 true resid norm
> 6.646245659859e+06 ||r(i)||/||b|| 1.000000000000e+00
> 1 KSP preconditioned resid norm 3.257912040767e+02 true resid norm
> 1.741027565497e-04 ||r(i)||/||b|| 2.619565472898e-11
> Linear redistribute_ solve converged due to CONVERGED_RTOL iterations 1
> KSP Object: (redistribute_) 1 MPI process
> type: gmres
> restart=30, using Classical (unmodified) Gram-Schmidt
> Orthogonalization with no iterative refinement
> happy breakdown tolerance 1e-30
> maximum iterations=10000, initial guess is zero
> tolerances: relative=1e-12, absolute=1e-50, divergence=10000.
> left preconditioning
> using PRECONDITIONED norm type for convergence test
> PC Object: (redistribute_) 1 MPI process
> type: bjacobi
> number of blocks = 1
> Local solver information for first block is in the following KSP and
> PC objects on rank 0:
> Use -redistribute_ksp_view ::ascii_info_detail to display information
> for all blocks
> KSP Object: (redistribute_sub_) 1 MPI process
> type: preonly
> maximum iterations=10000, initial guess is zero
> tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
> left preconditioning
> using NONE norm type for convergence test
> PC Object: (redistribute_sub_) 1 MPI process
> type: lu
> out-of-place factorization
> tolerance for zero pivot 2.22045e-14
> matrix ordering: external
> factor fill ratio given 0., needed 0.
> Factored matrix follows:
> Mat Object: (redistribute_sub_) 1 MPI process
> type: mumps
> rows=44378, cols=44378
> package used to perform factorization: mumps
> total: nonzeros=50309372, allocated nonzeros=50309372
> MUMPS run parameters:
>
> On Sat, Jan 27, 2024 at 12:51 PM Matthew Knepley <knepley at gmail.com>
> wrote:
>
>> Okay, so the tolerance is right. It must be using ||b|| instead of
>> ||r0||. Run with
>>
>> -redistribute_ksp_monitor_true_residual
>>
>> You might have to force r0.
>>
>> Thanks,
>>
>> Matt
>>
>> On Sat, Jan 27, 2024 at 11:44 AM Mark Adams <mfadams at lbl.gov> wrote:
>>
>>> KSP Object: (redistribute_) 1 MPI process
>>> type: gmres
>>> restart=30, using Classical (unmodified) Gram-Schmidt
>>> Orthogonalization with no iterative refinement
>>> happy breakdown tolerance 1e-30
>>> maximum iterations=10000, initial guess is zero
>>>
>>> tolerances: relative=1e-12, absolute=1e-50, divergence=10000.
>>> left preconditioning
>>> using PRECONDITIONED norm type for convergence test
>>> PC Object: (redistribute_) 1 MPI process
>>> type: bjacobi
>>> number of blocks = 1
>>> Local solver information for first block is in the following KSP and
>>> PC objects on rank 0:
>>> Use -redistribute_ksp_view ::ascii_info_detail to display
>>> information for all blocks
>>> KSP Object: (redistribute_sub_) 1 MPI process
>>> type: preonly
>>> maximum iterations=10000, initial guess is zero
>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
>>> left preconditioning
>>> using NONE norm type for convergence test
>>> PC Object: (redistribute_sub_) 1 MPI process
>>> type: lu
>>> out-of-place factorization
>>> tolerance for zero pivot 2.22045e-14
>>>
>>> On Sat, Jan 27, 2024 at 10:24 AM Matthew Knepley <knepley at gmail.com>
>>> wrote:
>>>
>>>> View the solver.
>>>>
>>>> Matt
>>>>
>>>> On Sat, Jan 27, 2024 at 9:43 AM Mark Adams <mfadams at lbl.gov> wrote:
>>>>
>>>>> Am I not getting ksp_rtol 1e-12 into pc_redistribute correctly?
>>>>>
>>>>>
>>>>>
>>>>> Linear redistribute_ solve converged due to CONVERGED_RTOL iterations 1
>>>>> 0 KSP Residual norm 2.182384017537e+02
>>>>> 1 KSP Residual norm 1.889764161573e-04
>>>>> Number of iterations = 1 N = 47628
>>>>> Residual norm 8.65917e-07
>>>>> #PETSc Option Table entries:
>>>>> -f S.bin # (source: command line)
>>>>> -ksp_monitor # (source: command line)
>>>>> -ksp_type preonly # (source: command line)
>>>>> -mat_block_size 36 # (source: command line)
>>>>> -mat_view ascii::ascii_info # (source: command line)
>>>>> -options_left # (source: command line)
>>>>> -pc_type redistribute # (source: command line)
>>>>> -redistribute_ksp_converged_reason # (source: command line)
>>>>>
>>>>> -redistribute_ksp_rtol 1e-12 # (source: command line)
>>>>> -redistribute_ksp_type gmres # (source: command line)
>>>>> -redistribute_pc_type bjacobi # (source: command line)
>>>>> -redistribute_sub_pc_factor_mat_solver_type mumps # (source: command
>>>>> line)
>>>>> -redistribute_sub_pc_type lu # (source: command line)
>>>>> #End of PETSc Option Table entries
>>>>> There are no unused options.
>>>>>
>>>>
>>>>
>>>> --
>>>> What most experimenters take for granted before they begin their
>>>> experiments is infinitely more interesting than any results to which their
>>>> experiments lead.
>>>> -- Norbert Wiener
>>>>
>>>> https://www.cse.buffalo.edu/~knepley/
>>>> <http://www.cse.buffalo.edu/~knepley/>
>>>>
>>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>> <http://www.cse.buffalo.edu/~knepley/>
>>
>