[petsc-users] ASM vs GASM
Boyce Griffith
boyceg at gmail.com
Tue Apr 30 11:36:51 CDT 2019
> On Apr 30, 2019, at 12:31 PM, Mark Adams <mfadams at lbl.gov> wrote:
>
> When I said it was singular I was looking at "preconditioned residual norm to an rtol of 1e-12. If I look at the true residual norm, however, it stagnates around 1e-4."
>
> This is not what I am seeing in this output.
Whoops, I switched the outer KSP from GMRES to FGMRES.
With GMRES, the preconditioned residual norm drops nicely (but, of course, the solution is no good):
$ ./main -d 3 -n 10 -ksp_type gmres -ksp_monitor_true_residual
Running ./main -d 3 -n 10 -ksp_type gmres -ksp_monitor_true_residual
0 KSP preconditioned resid norm 7.954859454640e-01 true resid norm 5.249964981356e+02 ||r(i)||/||b|| 1.000000000000e+00
1 KSP preconditioned resid norm 1.791745669837e-01 true resid norm 6.420515097608e+01 ||r(i)||/||b|| 1.222963414120e-01
2 KSP preconditioned resid norm 1.018932536518e-02 true resid norm 1.538149013353e+00 ||r(i)||/||b|| 2.929827187068e-03
3 KSP preconditioned resid norm 1.247250041620e-03 true resid norm 1.231074134137e-01 ||r(i)||/||b|| 2.344918753761e-04
4 KSP preconditioned resid norm 1.090687825399e-04 true resid norm 9.214204251786e-02 ||r(i)||/||b|| 1.755098231037e-04
5 KSP preconditioned resid norm 5.773017036638e-06 true resid norm 9.199655161085e-02 ||r(i)||/||b|| 1.752326957181e-04
6 KSP preconditioned resid norm 4.880868222010e-07 true resid norm 9.199488147685e-02 ||r(i)||/||b|| 1.752295144892e-04
7 KSP preconditioned resid norm 3.528569945380e-08 true resid norm 9.199485972669e-02 ||r(i)||/||b|| 1.752294730601e-04
8 KSP preconditioned resid norm 1.875782938387e-09 true resid norm 9.199486015879e-02 ||r(i)||/||b|| 1.752294738832e-04
9 KSP preconditioned resid norm 8.952213054230e-11 true resid norm 9.199486012037e-02 ||r(i)||/||b|| 1.752294738100e-04
10 KSP preconditioned resid norm 3.450175063457e-12 true resid norm 9.199486011997e-02 ||r(i)||/||b|| 1.752294738092e-04
11 KSP preconditioned resid norm 2.186653062508e-13 true resid norm 9.199486012016e-02 ||r(i)||/||b|| 1.752294738096e-04
> It is just a poor PC. The big drop in the residual at the beginning is suspicious. When you solve this problem well, have you checked that you are getting a good solution?
Yes indeed, the solution appears to be no good with this preconditioner.
Note that all the PC is doing is applying GASM to split the problem across two subdomains.
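For reference, that two-subdomain GASM setup can be requested purely from the command line. This is a minimal sketch, assuming the `./main` driver picks up the preconditioner from options (`-pc_gasm_total_subdomains` is the options form of `PCGASMSetTotalSubdomains`):

```shell
# Sketch: let GASM construct the "outer" subdomains itself from two
# non-overlapping "inner" subdomains, with "exact" subdomain solvers.
./main -d 3 -n 10 -pc_type gasm -pc_gasm_total_subdomains 2 \
       -sub_ksp_type fgmres -sub_pc_type ilu -sub_ksp_rtol 1.0e-12
```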
> That is, have you checked that your model is not messed up, e.g., that the mesh is not bad?
This is a uniform HEX mesh automatically generated by libMesh.
> On Tue, Apr 30, 2019 at 11:35 AM Boyce Griffith via petsc-users <petsc-users at mcs.anl.gov> wrote:
>
>
>> On Apr 30, 2019, at 11:23 AM, Fande Kong <fdkong.jd at gmail.com> wrote:
>>
>>
>>
>> On Tue, Apr 30, 2019 at 7:40 AM Boyce Griffith via petsc-users <petsc-users at mcs.anl.gov> wrote:
>>
>>
>>> On Apr 30, 2019, at 9:06 AM, Mark Adams <mfadams at lbl.gov> wrote:
>>>
>>> Allowing GASM to construct the "outer" subdomains from the non-overlapping "inner" subdomains, and using "exact" subdomain solvers (subdomain KSPs are using FGMRES+ILU with an rtol of 1e-12), I get convergence in ~2 iterations in the preconditioned residual norm to an rtol of 1e-12. If I look at the true residual norm, however, it stagnates around 1e-4.
>>>
>>>
>>> That PC is singular.
>>
>>
>> Yes. I am confused about why GASM is giving a singular PC but ASM is not with the same subdomains.
>>
>> We cannot tell much without the details of the solver setup. Dmitry and I implemented the overlapping function a couple of years ago, and it is tricky when a subdomain is shared by multiple cores. Would you mind starting from our example, or setting up an example that demonstrates your issue?
>>
>> At the very least, please use "-ksp_view" (linear solver) or "-snes_view" (nonlinear solver) to print more information; that will help us a bit.
>
> Here you go:
>
> $ ./main -d 3 -n 10 -ksp_type fgmres -sub_ksp_type fgmres -sub_pc_type ilu -sub_ksp_rtol 1.0e-12 -ksp_converged_reason -ksp_monitor_true_residual -ksp_max_it 10 -ksp_view
> make: `main' is up to date.
> Running ./main -d 3 -n 10 -ksp_type fgmres -sub_ksp_type fgmres -sub_pc_type ilu -sub_ksp_rtol 1.0e-12 -ksp_converged_reason -ksp_monitor_true_residual -ksp_max_it 10 -ksp_view
>
> 0 KSP unpreconditioned resid norm 5.249964981356e+02 true resid norm 5.249964981356e+02 ||r(i)||/||b|| 1.000000000000e+00
> 1 KSP unpreconditioned resid norm 9.316296223724e-02 true resid norm 9.316296223724e-02 ||r(i)||/||b|| 1.774544450641e-04
> 2 KSP unpreconditioned resid norm 9.314881028141e-02 true resid norm 9.314881028141e-02 ||r(i)||/||b|| 1.774274887779e-04
> 3 KSP unpreconditioned resid norm 9.299990517422e-02 true resid norm 9.299918556770e-02 ||r(i)||/||b|| 1.771424874222e-04
> 4 KSP unpreconditioned resid norm 9.224468272306e-02 true resid norm 9.393543403858e-02 ||r(i)||/||b|| 1.789258297382e-04
> 5 KSP unpreconditioned resid norm 9.150828598034e-02 true resid norm 9.511673987375e-02 ||r(i)||/||b|| 1.811759510997e-04
> 6 KSP unpreconditioned resid norm 9.078924839691e-02 true resid norm 1.013093335976e-01 ||r(i)||/||b|| 1.929714463951e-04
> 7 KSP unpreconditioned resid norm 9.008689850931e-02 true resid norm 1.011099594157e-01 ||r(i)||/||b|| 1.925916835155e-04
> 8 KSP unpreconditioned resid norm 8.940060065590e-02 true resid norm 1.090779251949e-01 ||r(i)||/||b|| 2.077688624253e-04
> 9 KSP unpreconditioned resid norm 8.872975256529e-02 true resid norm 1.102873098599e-01 ||r(i)||/||b|| 2.100724676289e-04
> 10 KSP unpreconditioned resid norm 8.807378313465e-02 true resid norm 1.071996745064e-01 ||r(i)||/||b|| 2.041912182026e-04
> Linear solve did not converge due to DIVERGED_ITS iterations 10
> KSP Object: 1 MPI processes
> type: fgmres
> restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
> happy breakdown tolerance 1e-30
> maximum iterations=10, nonzero initial guess
> tolerances: relative=1e-12, absolute=1e-50, divergence=10000.
> right preconditioning
> using UNPRECONDITIONED norm type for convergence test
> PC Object: 1 MPI processes
> type: gasm
> Restriction/interpolation type: RESTRICT
> requested amount of overlap = 0
> total number of subdomains = 2
> max number of local subdomains = 2
> [0|1] number of locally-supported subdomains = 2
> Subdomain solver info is as follows:
> - - - - - - - - - - - - - - - - - -
> [0|1] (subcomm [0|1]) local subdomain number 0, local size = 484
> KSP Object: (sub_) 1 MPI processes
> type: fgmres
> restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
> happy breakdown tolerance 1e-30
> maximum iterations=10000, initial guess is zero
> tolerances: relative=1e-12, absolute=1e-50, divergence=10000.
> right preconditioning
> using UNPRECONDITIONED norm type for convergence test
> PC Object: (sub_) 1 MPI processes
> type: ilu
> out-of-place factorization
> 0 levels of fill
> tolerance for zero pivot 2.22045e-14
> matrix ordering: natural
> factor fill ratio given 1., needed 1.
> Factored matrix follows:
> Mat Object: 1 MPI processes
> type: seqaij
> rows=484, cols=484
> package used to perform factorization: petsc
> total: nonzeros=15376, allocated nonzeros=15376
> total number of mallocs used during MatSetValues calls =0
> not using I-node routines
> linear system matrix = precond matrix:
> Mat Object: () 1 MPI processes
> type: seqaij
> rows=484, cols=484
> total: nonzeros=15376, allocated nonzeros=15376
> total number of mallocs used during MatSetValues calls =0
> not using I-node routines
> - - - - - - - - - - - - - - - - - -
> [0|1] (subcomm [0|1]) local subdomain number 1, local size = 121
> KSP Object: (sub_) 1 MPI processes
> type: fgmres
> restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
> happy breakdown tolerance 1e-30
> maximum iterations=10000, initial guess is zero
> tolerances: relative=1e-12, absolute=1e-50, divergence=10000.
> right preconditioning
> using UNPRECONDITIONED norm type for convergence test
> PC Object: (sub_) 1 MPI processes
> type: ilu
> out-of-place factorization
> 0 levels of fill
> tolerance for zero pivot 2.22045e-14
> matrix ordering: natural
> factor fill ratio given 1., needed 1.
> Factored matrix follows:
> Mat Object: 1 MPI processes
> type: seqaij
> rows=121, cols=121
> package used to perform factorization: petsc
> total: nonzeros=961, allocated nonzeros=961
> total number of mallocs used during MatSetValues calls =0
> not using I-node routines
> linear system matrix = precond matrix:
> Mat Object: () 1 MPI processes
> type: seqaij
> rows=121, cols=121
> total: nonzeros=961, allocated nonzeros=961
> total number of mallocs used during MatSetValues calls =0
> not using I-node routines
> - - - - - - - - - - - - - - - - - -
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Petsc has generated inconsistent data
> [0]PETSC ERROR: Called more times than PetscViewerASCIIPushSynchronized()
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.11.1, unknown
> [0]PETSC ERROR: ./main on a darwin-dbg named boycesimacwork.dhcp.unc.edu by boyceg Tue Apr 30 11:29:02 2019
> [0]PETSC ERROR: Configure options --CC=mpicc --CXX=mpicxx --FC=mpif90 --PETSC_ARCH=darwin-dbg --with-debugging=1 --with-c++-support=1 --with-hypre=1 --download-hypre=1 --with-hdf5=1 --with-hdf5-dir=/usr/local
> [0]PETSC ERROR: #1 PetscViewerASCIIPopSynchronized() line 438 in /Users/boyceg/sfw/petsc/petsc-maint/src/sys/classes/viewer/impls/ascii/filev.c
> [0]PETSC ERROR: #2 PCView_GASM() line 251 in /Users/boyceg/sfw/petsc/petsc-maint/src/ksp/pc/impls/gasm/gasm.c
> [0]PETSC ERROR: #3 PCView() line 1651 in /Users/boyceg/sfw/petsc/petsc-maint/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: #4 KSPView() line 213 in /Users/boyceg/sfw/petsc/petsc-maint/src/ksp/ksp/interface/itcreate.c
> [0]PETSC ERROR: #5 PetscObjectView() line 100 in /Users/boyceg/sfw/petsc/petsc-maint/src/sys/objects/destroy.c
> [0]PETSC ERROR: #6 ObjectView() line 14 in /Users/boyceg/sfw/petsc/petsc-maint/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #7 KSPSolve() line 831 in /Users/boyceg/sfw/petsc/petsc-maint/src/ksp/ksp/interface/itfunc.c
>
>> One difference between them is that the default GASM overlap is 0 (https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCGASMSetOverlap.html), but the default ASM overlap is 1 (https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCASMSetOverlap.html).
>>
>> For your particular application, I am not sure you need any overlap, since there are two different PDEs on the two subdomains. It may be fine to run the PC without overlapping subdomains. It is definitely an interesting application for GASM.
>>
>> Fande,
>>
>>
>> However, changing the GASM overlap does not make any difference in the convergence history.
>>
>> -- Boyce
>
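Given the differing overlap defaults noted above, one way to take overlap out of the ASM-vs-GASM comparison is to pin it explicitly on both preconditioners. This sketch assumes the same `./main` driver and uses the documented `-pc_gasm_overlap` / `-pc_asm_overlap` options:

```shell
# Compare ASM and GASM with the overlap pinned to the same value,
# monitoring the true (unpreconditioned) residual in both runs.
./main -d 3 -n 10 -pc_type gasm -pc_gasm_overlap 1 -ksp_monitor_true_residual
./main -d 3 -n 10 -pc_type asm  -pc_asm_overlap 1  -ksp_monitor_true_residual
```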