From zhaowenbo.npic at gmail.com  Sun Oct  1 05:49:07 2017
From: zhaowenbo.npic at gmail.com (Wenbo Zhao)
Date: Sun, 1 Oct 2017 18:49:07 +0800
Subject: [petsc-users] Issue of mg_coarse_ksp not converge
Message-ID:

Hi,

I ran into some problems when using PETSc/SLEPc to solve the two-group
neutron diffusion equations with a finite difference method. The grid is
3*3*3 and the DOF on each point is 2, so the matrix size is 54*54.
It is a generalized eigenvalue problem Ax = \lambda Bx, where B is a
diagonally dominant matrix but not symmetric.
The EPS is set up as below:
    ierr = EPSSetProblemType(eps,EPS_GNHEP);CHKERRQ(ierr);
    ierr = EPSSetWhichEigenpairs(eps,EPS_LARGEST_REAL);CHKERRQ(ierr);

Krylov-Schur is used as the EPS solver and GAMG as the PC.
I tried different agg_nsmooths and mg_coarse_ksp_type settings; only
nsmooths 0 together with preonly works.

Test 1
$ make NCORE=1 runkr_nonsmooth
mpirun -n 1 ./step-41 \
  -st_ksp_type gmres \
  -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
  -st_ksp_view -mata AMAT.dat -matb BMAT.dat \
  -st_mg_coarse_ksp_type preonly -st_mg_coarse_ksp_monitor \
  -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_nonsmooth 2>&1

Test 2
$ make NCORE=1 runkr_smooth
mpirun -n 1 ./step-41 \
  -st_ksp_type gmres \
  -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
  -st_ksp_view -mata AMAT.dat -matb BMAT.dat \
  -st_mg_coarse_ksp_type preonly -st_mg_coarse_ksp_monitor \
  -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_smooth 2>&1
makefile:43: recipe for target 'runkr_smooth' failed
make: *** [runkr_smooth] Error 91

Test 3
$ make NCORE=1 runkr_gmres
mpirun -n 1 ./step-41 \
  -st_ksp_type gmres \
  -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
  -st_ksp_view -mata AMAT.dat -matb BMAT.dat \
  -st_mg_coarse_ksp_type gmres -st_mg_coarse_ksp_monitor \
  -st_mg_coarse_ksp_rtol 1.0e-6 \
  -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_gmres 2>&1
makefile:59: recipe for target 'runkr_gmres' failed
make: *** [runkr_gmres] Error 91

The log files are attached. The matrix files are also attached as
AMAT.dat and BMAT.dat.

Is this correct, or is something wrong with my code or command line?

Thanks!

Wenbo
-------------- next part --------------
A non-text attachment was scrubbed...
Name: log.tgz
Type: application/x-gzip
Size: 123841 bytes
Desc: not available

From nataf at ann.jussieu.fr  Sun Oct  1 10:41:33 2017
From: nataf at ann.jussieu.fr (Frederic Nataf)
Date: Sun, 1 Oct 2017 17:41:33 +0200
Subject: [petsc-users] Best slepc solvers to compute the eigenvectors corresponding to the m smallest eigenvalues
Message-ID:

Hi to everyone,

I'd like to compute a basis for the vector space of a given dimension m
(say m=10) that corresponds to the m smallest eigenvalues. You might as
well say I want to compute the eigenvectors corresponding to the m
smallest eigenvalues, but actually I am only interested in the vector
space they span. I'd also like to be able to initialize the computation
with some approximate guess of that vector space. My matrix is real
symmetric positive definite and of moderate size, so I do not need a
parallel version.

What are the best and most reliable methods to do this?

Many thanks,

Frédéric Nataf

-----------------------------------------
Frederic Nataf
Laboratoire Jacques Louis Lions, CNRS UMR 7598
Université Pierre et Marie Curie
Equipe Alpines, INRIA-Paris
HPDDM Open source project: https://github.com/hpddm
An Introduction to Domain Decomposition Methods: Algorithms, Theory and
Parallel Implementation: http://bookstore.siam.org/ot144/
Coordonnées: 4 place Jussieu 75252 Paris Cedex 05, France
http://www.upmc.fr/fr/universite/campus_et_sites/a_paris_et_en_idf/jussieu.html , bureau 319 Tour 15/25
http://www.ljll.math.upmc.fr/nataf/
Tel: 33 (0)1 44 27 93 03  Mobile: 07 60 01 07 96  Fax: 33 (0)1 44 27 72 00
---------------------------------------------------------------------------

From jroman at dsic.upv.es  Sun Oct  1 11:22:38 2017
From: jroman at dsic.upv.es (Jose E. Roman)
Date: Sun, 1 Oct 2017 18:22:38 +0200
Subject: [petsc-users] Best slepc solvers to compute the eigenvectors corresponding to the m smallest eigenvalues
In-Reply-To:
References:
Message-ID:

> On 1 Oct 2017, at 17:41, Frederic Nataf wrote:
>
> I'd like to compute a basis for the vector space of a given dimension m
> (say m=10) that corresponds to the m smallest eigenvalues. [...] I'd
> also like to be able to initialize the computation with some
> approximate guess of that vector space. My matrix is real symmetric
> positive definite and of moderate size, so I do not need a parallel
> version.
>
> What are the best and most reliable methods to do this?

For SPD matrices the smallest eigenvalues are the leftmost ones, so the
default solver (Krylov-Schur) with -eps_smallest_real should give you the
answer. However, this solver cannot exploit the knowledge of the initial
guess subspace. In this case it may be more effective to use GD or
LOBPCG, especially if a good preconditioner is available. If the problem
has multiple eigenvalues, I would suggest LOBPCG.

Jose
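[Editor's note, not part of Jose's message: a minimal sketch of the setup
he describes, using SLEPc's C API. Untested; the matrix A and the guess
vectors V0[] are placeholders assumed to be built elsewhere.]

    EPS eps;
    Vec V0[10];   /* approximate guess basis, built by the caller */

    ierr = EPSCreate(PETSC_COMM_SELF,&eps);CHKERRQ(ierr);
    ierr = EPSSetOperators(eps,A,NULL);CHKERRQ(ierr);      /* Ax = \lambda x */
    ierr = EPSSetProblemType(eps,EPS_HEP);CHKERRQ(ierr);   /* real symmetric */
    ierr = EPSSetType(eps,EPSLOBPCG);CHKERRQ(ierr);        /* or EPSGD; the default
                                                              Krylov-Schur ignores the guess */
    ierr = EPSSetWhichEigenpairs(eps,EPS_SMALLEST_REAL);CHKERRQ(ierr);
    ierr = EPSSetDimensions(eps,10,PETSC_DEFAULT,PETSC_DEFAULT);CHKERRQ(ierr);  /* nev = m */
    ierr = EPSSetInitialSpace(eps,10,V0);CHKERRQ(ierr);    /* the approximate subspace */
    ierr = EPSSolve(eps);CHKERRQ(ierr);

The eigenvectors can then be retrieved with EPSGetConverged() and
EPSGetEigenpair(); their span is the subspace of interest.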
From knepley at gmail.com  Sun Oct  1 11:34:34 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Sun, 1 Oct 2017 12:34:34 -0400
Subject: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To:
References:
Message-ID:

On Sun, Oct 1, 2017 at 6:49 AM, Wenbo Zhao wrote:

> Krylov-Schur is used as the EPS solver and GAMG as the PC.
> I tried different agg_nsmooths and mg_coarse_ksp_type settings; only
> nsmooths 0 together with preonly works.

Why are you setting the coarse solver? This makes no sense.

  Thanks,

     Matt

> [... rest of the quoted message trimmed ...]

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

From k_burkart at yahoo.com  Sun Oct  1 15:20:21 2017
From: k_burkart at yahoo.com (Klaus Burkart)
Date: Sun, 1 Oct 2017 20:20:21 +0000 (UTC)
Subject: [petsc-users] How to link petsc?
References: <2058352604.729724.1506889221347.ref@mail.yahoo.com>
Message-ID: <2058352604.729724.1506889221347@mail.yahoo.com>

I installed PETSc on Ubuntu 16.04 and the tests were successful.

PETSC_DIR points to the install directory in my home directory and is
set in my .bashrc: export PETSC_DIR=/home/klaus/petsc-3.7.6

Now I want to link against it with the standard Linux compiler and use
its functionality in my application, but I run into several issues that
lead to compiler errors when I try to compile my own application with
PETSc functionality.

I must use absolute paths to the PETSc header files in my source code;
Makefile entries pointing to the install directories are not recognized.
Even then, header files referenced in petsc.h are not found, leading to
the following error:

  petsc-3.7.6/include/petsc.h:5:22: fatal error: petscbag.h: file or
  directory not found

even though the file is in the same directory. Adding the other header
files with absolute paths as well doesn't help; compilation never gets
beyond petsc.h and the error remains.

What's the correct setup to be able to compile applications that use
PETSc functionality?

From jed at jedbrown.org  Sun Oct  1 17:26:20 2017
From: jed at jedbrown.org (Jed Brown)
Date: Sun, 01 Oct 2017 16:26:20 -0600
Subject: [petsc-users] How to link petsc?
In-Reply-To: <2058352604.729724.1506889221347@mail.yahoo.com>
References: <2058352604.729724.1506889221347.ref@mail.yahoo.com>
 <2058352604.729724.1506889221347@mail.yahoo.com>
Message-ID: <87o9pqijxv.fsf@jedbrown.org>

Either use the PETSc makefiles or use pkg-config.

This is a pretty comprehensive makefile you can use:

PETSc.pc := $(PETSC_DIR)/$(PETSC_ARCH)/lib/pkgconfig/PETSc.pc

CC       := $(shell pkg-config --variable=ccompiler $(PETSc.pc))
CXX      := $(shell pkg-config --variable=cxxcompiler $(PETSc.pc))
FC       := $(shell pkg-config --variable=fcompiler $(PETSc.pc))
CFLAGS   := $(shell pkg-config --variable=cflags_extra $(PETSc.pc)) $(shell pkg-config --cflags-only-other $(PETSc.pc))
CPPFLAGS := $(shell pkg-config --cflags-only-I $(PETSc.pc))
LDFLAGS  := $(shell pkg-config --libs-only-L --libs-only-other $(PETSc.pc))
LDFLAGS  += $(patsubst -L%, $(shell pkg-config --variable=ldflag_rpath $(PETSc.pc))%, $(shell pkg-config --libs-only-L $(PETSc.pc)))
LDLIBS   := $(shell pkg-config --libs-only-l $(PETSc.pc)) -lm

print:
	@echo CC=$(CC)
	@echo CFLAGS=$(CFLAGS)
	@echo CPPFLAGS=$(CPPFLAGS)
	@echo LDFLAGS=$(LDFLAGS)
	@echo LDLIBS=$(LDLIBS)

Klaus Burkart writes:
> [... quoted message trimmed ...]
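[Editor's note, not part of Jed's message: with GNU make's built-in %.o
rule, compiling and linking a program against PETSc then only needs one
extra rule in the same makefile. The name app.c is a placeholder; the
recipe line must start with a tab.]

app: app.o
	$(CC) -o $@ $^ $(LDFLAGS) $(LDLIBS)

Since `print` is the first target, build explicitly with `make app`;
`make print` shows the flags that pkg-config resolved, which is a quick
way to verify PETSC_DIR and PETSC_ARCH are set correctly.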
From zhaowenbo.npic at gmail.com  Sun Oct  1 20:53:54 2017
From: zhaowenbo.npic at gmail.com (Wenbo Zhao)
Date: Mon, 2 Oct 2017 09:53:54 +0800
Subject: Re: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To:
References:
Message-ID:

Matt,

Thanks for your reply. It DOES make no sense for this problem.
But I am not clear about the 'preonly' option. Which solver is used with
preonly? I wonder whether 'preonly' is suitable for a large-scale
problem with, say, 400,000,000 unknowns.
So I tried the 'gmres' option and found these error messages.
Could you give me some suggestions?

Thanks.

Wenbo

On Mon, Oct 2, 2017 at 12:34 AM, Matthew Knepley wrote:
> Why are you setting the coarse solver? This makes no sense.
> [... rest of the quoted message trimmed ...]

From vitse at lmt.ens-cachan.fr  Mon Oct  2 03:12:52 2017
From: vitse at lmt.ens-cachan.fr (Matthieu Vitse)
Date: Mon, 2 Oct 2017 10:12:52 +0200
Subject: Re: [petsc-users] Load distributed matrices from directory
In-Reply-To:
References: <56F37582-36CD-4862-8E63-E14A90704517@lmt.ens-cachan.fr>
Message-ID: <87E15445-7E6D-4DC4-AA1D-3CBA9FCC3DC3@lmt.ens-cachan.fr>

> On 29 Sept 2017, at 17:43, Barry Smith wrote:
>
> Or is your matrix generator code sequential and cannot generate the
> full matrix, so you want to generate chunks at a time, save them to
> disk, and then load them? Better for you to refactor your code to
> generate the whole thing in parallel (since you can already generate
> parts, the refactoring shouldn't be terribly difficult).

Thanks for your answer.

The matrix is already generated in parallel, but we want to keep control
of the decomposition, which conflicts with using PCASM directly. That's
why we would really like to work only with the distributed matrices. Are
there any issues that would prevent me from doing that? Moreover, ASM is
a first step; we would then like to use those matrices for
multi-preconditioning our problem and to take MPCs into account (as a
consequence we really need to know the decomposition).

Thanks,

-- 
Matt

From knepley at gmail.com  Mon Oct  2 04:09:14 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 2 Oct 2017 05:09:14 -0400
Subject: Re: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To:
References:
Message-ID:

On Sun, Oct 1, 2017 at 9:53 PM, Wenbo Zhao wrote:

> But I am not clear about the 'preonly' option. Which solver is used
> with preonly? I wonder whether 'preonly' is suitable for a large-scale
> problem with, say, 400,000,000 unknowns.
> So I tried the 'gmres' option and found these error messages.

I mean, why are you setting this at all? Just do not set the coarse
solver. The default should work fine.

  Thanks,

     Matt

> [... rest of the quoted message trimmed ...]

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
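[Editor's note: for the record, Matt's advice applied to the commands
from the first message would read as follows. This is a paraphrase,
untested; it is the same run with every -st_mg_coarse_* option removed.]

mpirun -n 1 ./step-41 \
  -st_ksp_type gmres \
  -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
  -st_ksp_view -mata AMAT.dat -matb BMAT.dat \
  -eps_nev 1 -eps_ncv 10 -eps_monitor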
From knepley at gmail.com  Mon Oct  2 04:34:21 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 2 Oct 2017 05:34:21 -0400
Subject: Re: [petsc-users] Load distributed matrices from directory
In-Reply-To: <87E15445-7E6D-4DC4-AA1D-3CBA9FCC3DC3@lmt.ens-cachan.fr>
References: <56F37582-36CD-4862-8E63-E14A90704517@lmt.ens-cachan.fr>
 <87E15445-7E6D-4DC4-AA1D-3CBA9FCC3DC3@lmt.ens-cachan.fr>
Message-ID:

On Mon, Oct 2, 2017 at 4:12 AM, Matthieu Vitse wrote:

> The matrix is already generated in parallel, but we want to keep
> control of the decomposition, which conflicts with using PCASM
> directly.

Please explain this statement with an example. When using MatLoad(), you
are in control of the partitions, although not of the row order. Also, I
am confused by your use of the word "distributed". We use it to mean an
object, like a Mat, that exists on several processes in a coordinated
way.

  Thanks,

     Matt

> [... rest of the quoted message trimmed ...]

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
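[Editor's note: an illustration of the control Matt mentions, as a
sketch (untested). The file name "matrix.dat" and the per-rank row count
nlocal are placeholders; presetting the sizes before MatLoad() makes
PETSc use your layout instead of the default equal split.]

    Mat         A;
    PetscViewer viewer;
    PetscInt    nlocal = /* rows this rank should own */;

    ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"matrix.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
    ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
    ierr = MatSetType(A,MATAIJ);CHKERRQ(ierr);
    ierr = MatSetSizes(A,nlocal,nlocal,PETSC_DETERMINE,PETSC_DETERMINE);CHKERRQ(ierr);
    ierr = MatLoad(A,viewer);CHKERRQ(ierr);   /* rows distributed per the sizes above */
    ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);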
From mailinglists at xgm.de  Mon Oct  2 05:21:25 2017
From: mailinglists at xgm.de (Florian Lindner)
Date: Mon, 2 Oct 2017 18:21:25 +0800
Subject: [petsc-users] Mat/Vec with empty ranks
Message-ID:

Hello,

I have a matrix and a vector that live on 4 ranks, but only two of the
ranks have values:

e.g.

Vec Object: 4 MPI processes
  type: mpi
Process [0]
Process [1]
1.1
2.5
3.
4.
Process [2]
5.
6.
7.
8.
Process [3]

Doing a simple LSQR solve does not converge. However, when the values
are distributed equally, it converges within 3 iterations.

What can I do about that?

I have attached a simple program that creates the matrix and vector or
loads them from a file.

Thanks,
Florian
-------------- next part --------------
Non-text attachments were scrubbed...
Name: in           Type: application/octet-stream  Size: 72 bytes
Name: in.info      Type: application/x-info        Size: 21 bytes
Name: matrixQ      Type: application/octet-stream  Size: 208 bytes
Name: petscQR.cpp  Type: text/x-c++src             Size: 2603 bytes
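[Editor's note: Florian's petscQR.cpp was scrubbed from the archive, so
the following is only a guess at the layout it builds, not his code. A
vector whose entries all live on two of four ranks can be set up by
giving the other ranks a zero local size.]

    Vec         v;
    PetscMPIInt rank;
    PetscInt    nloc;

    ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);
    nloc = (rank == 1 || rank == 2) ? 4 : 0;   /* two ranks hold all 8 entries */
    ierr = VecCreateMPI(PETSC_COMM_WORLD,nloc,PETSC_DETERMINE,&v);CHKERRQ(ierr);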
From zhaowenbo.npic at gmail.com  Mon Oct  2 07:30:18 2017
From: zhaowenbo.npic at gmail.com (Wenbo Zhao)
Date: Mon, 2 Oct 2017 20:30:18 +0800
Subject: Re: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To:
References:
Message-ID:

Matt,

Because I am not clear about what will happen when using 'preonly' for a
large-scale problem. It seems to use a direct solver, from
http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/KSP/KSPPREONLY.html

Thanks!
Wenbo

On Mon, Oct 2, 2017 at 5:09 PM, Matthew Knepley wrote:
> I mean, why are you setting this at all? Just do not set the coarse
> solver. The default should work fine.
> [... rest of the quoted message trimmed ...]

From knepley at gmail.com  Mon Oct  2 08:04:35 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 2 Oct 2017 09:04:35 -0400
Subject: Re: [petsc-users] Mat/Vec with empty ranks
In-Reply-To:
References:
Message-ID:

On Mon, Oct 2, 2017 at 6:21 AM, Florian Lindner wrote:

> Doing a simple LSQR solve does not converge. However, when the values
> are distributed equally, it converges within 3 iterations.
>
> What can I do about that?

There are a few problems with this program. I am attaching a cleaned-up
version. However, convergence still differs starting at iteration 2. It
appears that LSQR has a problem with this system, or we have a bug that
I cannot see.

  Thanks,

     Matt

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
A non-text attachment was scrubbed...
Name: qr.c
Type: text/x-csrc
Size: 4065 bytes
Desc: not available

From knepley at gmail.com  Mon Oct  2 08:08:07 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 2 Oct 2017 09:08:07 -0400
Subject: Re: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To:
References:
Message-ID:

On Mon, Oct 2, 2017 at 8:30 AM, Wenbo Zhao wrote:

> Because I am not clear about what will happen when using 'preonly' for
> a large-scale problem.

The size of the problem has nothing to do with 'preonly'. All it means
is to apply a preconditioner without a Krylov solver.

> It seems to use a direct solver, from
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/KSP/KSPPREONLY.html

However, I still cannot understand why you would change the default?

  Matt

> [... rest of the quoted message trimmed ...]

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
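[Editor's note: 'preonly' in code form, as a generic sketch of the KSP
API rather than anything from Wenbo's program. KSPSolve() then performs
exactly one application of the preconditioner, with no Krylov iteration;
only when the PC is a full factorization does that amount to a direct
solve.]

    ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr);  /* KSPSolve = one PCApply */
    ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
    ierr = PCSetType(pc,PCLU);CHKERRQ(ierr);          /* with LU, the single
                                                         application is a direct solve */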
From bsmith at mcs.anl.gov  Mon Oct  2 07:50:18 2017
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Mon, 2 Oct 2017 14:50:18 +0200
Subject: Re: [petsc-users] Load distributed matrices from directory
In-Reply-To: <87E15445-7E6D-4DC4-AA1D-3CBA9FCC3DC3@lmt.ens-cachan.fr>
References: <56F37582-36CD-4862-8E63-E14A90704517@lmt.ens-cachan.fr>
 <87E15445-7E6D-4DC4-AA1D-3CBA9FCC3DC3@lmt.ens-cachan.fr>
Message-ID:

   MPCs?

   If you have a collection of "overlapping matrices" on disk, then you
will be responsible even for providing the matrix-vector product for the
operator, which you absolutely need if you are going to use any
Krylov-based overlapping Schwarz method. How do you plan to perform the
matrix-vector products?

   With regard to applying the preconditioner, that is more
straightforward. Each process gets its "overlapped" matrix, and all you
need to do is provide the VecScatter from a global vector to the
"overlapped" vector, do the local MatMult() with your "overlapped"
matrix, and then scatter-add back. I question whether this entire
process is even worth your time.

   Note: I am not a fan of "custom application code" for a "custom"
domain decomposition method (obviously, or I would never have designed
PETSc ;). I believe in a general-purpose library that you can then
"customize" for your unique problem once you've determined by profiling
that the customization/optimization is even worth it. For example, I
would just start with PCASM; then, if you determine that it is not
selecting good subdomains, you can add customization to provide more
information about how you want the subdomains to be defined, etc.

   Some possibly useful routines for customization:

PETSC_EXTERN PetscErrorCode PCASMSetLocalSubdomains(PC,PetscInt,IS[],IS[]);
PETSC_EXTERN PetscErrorCode PCASMSetTotalSubdomains(PC,PetscInt,IS[],IS[]);
PETSC_EXTERN PetscErrorCode PCASMSetOverlap(PC,PetscInt);
PETSC_EXTERN PetscErrorCode PCASMSetDMSubdomains(PC,PetscBool);
PETSC_EXTERN PetscErrorCode PCASMGetDMSubdomains(PC,PetscBool*);
PETSC_EXTERN PetscErrorCode PCASMSetSortIndices(PC,PetscBool);
PETSC_EXTERN PetscErrorCode PCASMSetType(PC,PCASMType);
PETSC_EXTERN PetscErrorCode PCASMGetType(PC,PCASMType*);
PETSC_EXTERN PetscErrorCode PCASMSetLocalType(PC,PCCompositeType);
PETSC_EXTERN PetscErrorCode PCASMGetLocalType(PC,PCCompositeType*);
PETSC_EXTERN PetscErrorCode PCASMCreateSubdomains(Mat,PetscInt,IS*[]);
PETSC_EXTERN PetscErrorCode PCASMDestroySubdomains(PetscInt,IS[],IS[]);
PETSC_EXTERN PetscErrorCode PCASMCreateSubdomains2D(PetscInt,PetscInt,PetscInt,PetscInt,PetscInt,PetscInt,PetscInt*,IS**,IS**);
PETSC_EXTERN PetscErrorCode PCASMGetLocalSubdomains(PC,PetscInt*,IS*[],IS*[]);
PETSC_EXTERN PetscErrorCode PCASMGetLocalSubmatrices(PC,PetscInt*,Mat*[]);
PETSC_EXTERN PetscErrorCode PCASMGetSubMatType(PC,MatType*);
PETSC_EXTERN PetscErrorCode PCASMSetSubMatType(PC,MatType);

   If these are not useful, you could tell us what kind of customization
you want to have within KSP/PCASM and, depending on how generally useful
it might be, we could possibly add more hooks for you.

   Barry

> On Oct 2, 2017, at 10:12 AM, Matthieu Vitse wrote:
> [... quoted message trimmed ...]
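[Editor's note: to make Barry's "customize the subdomains" suggestion
concrete, a sketch (untested) using the first routine on his list. The
names nmyrows/myrows are hypothetical: an array of the global row
indices this rank's single overlapped subdomain should contain.]

    IS  is;
    PC  pc;

    ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
    ierr = PCSetType(pc,PCASM);CHKERRQ(ierr);
    ierr = ISCreateGeneral(PETSC_COMM_SELF,nmyrows,myrows,PETSC_COPY_VALUES,&is);CHKERRQ(ierr);
    ierr = PCASMSetLocalSubdomains(pc,1,&is,NULL);CHKERRQ(ierr);  /* one subdomain on this rank */
    ierr = PCASMSetOverlap(pc,0);CHKERRQ(ierr);   /* the index set above already
                                                     includes the desired overlap */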
From mfadams at lbl.gov  Mon Oct  2 08:34:04 2017
From: mfadams at lbl.gov (Mark Adams)
Date: Mon, 2 Oct 2017 09:34:04 -0400
Subject: Re: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To:
References:
Message-ID:

GAMG will coarsen the problem until it is small and fast to solve with a
direct solver (LU). You can use preonly if you have a perfect
preconditioner.

On Mon, Oct 2, 2017 at 9:08 AM, Matthew Knepley wrote:
> The size of the problem has nothing to do with 'preonly'. All it means
> is to apply a preconditioner without a Krylov solver.
> [... rest of the quoted message trimmed ...]

From zhaowenbo.npic at gmail.com  Mon Oct  2 08:39:52 2017
From: zhaowenbo.npic at gmail.com (Wenbo Zhao)
Date: Mon, 2 Oct 2017 21:39:52 +0800
Subject: Re: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To:
References:
Message-ID:
> > Matt > > >> >> Thanks! >> Wenbo >> >> On Mon, Oct 2, 2017 at 5:09 PM, Matthew Knepley >> wrote: >> >>> On Sun, Oct 1, 2017 at 9:53 PM, Wenbo Zhao >>> wrote: >>> >>>> Matt, >>>> Thanks for your reply. >>>> It DOES make no sense for this problem. >>>> But I am not clear about the 'preonly' option. Which solver is used in >>>> preonly? I wonder if 'preonly' is suitable for large scale problem such as >>>> 400,000,000 unknowns. >>>> So I tried 'gmres' option and found these error messages. >>>> >>> >>> I mean, why are you setting this at all. Just do not set the coarse >>> solver. The default should work fine. >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> Could you give me some suggestions? >>>> >>>> Thanks. >>>> >>>> Wenbo >>>> >>>> >>>> On Mon, Oct 2, 2017 at 12:34 AM, Matthew Knepley >>>> wrote: >>>> >>>>> On Sun, Oct 1, 2017 at 6:49 AM, Wenbo Zhao >>>>> wrote: >>>>> >>>>>> Hi, >>>>>> >>>>>> I met some questions when I use PETSC/SLEPC to solve two-group >>>>>> neutron diffusion equations with finite difference method. The grid is >>>>>> 3*3*3, when DOF on each points is 2. So the matrix size is 54*54. >>>>>> It is generalized eigenvalue problem Ax=\lamda Bx, where B is >>>>>> diagonally dominant matrix but not symmetry. >>>>>> EPS is set as below, >>>>>> ierr = EPSSetProblemType(eps,EPS_GNHEP);CHKERRQ(ierr);? >>>>>> ierr = EPSSetWhichEigenpairs(eps,EPS_LARGEST_REAL);CHKERRQ(ierr);? >>>>>> >>>>>> Krylovschur is used as eps sovler. GAMG is used as PC. >>>>>> I tried agg_nsmooths and mg_coarse_ksp_type. Only non-smooths and >>>>>> preonly is OK. >>>>>> >>>>> >>>>> Why are you setting the coarse solver. This makes no sense. >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> >>>>>> Test 1 >>>>>> $ make NCORE=1 runkr_nonsmooth >>>>>> mpirun -n 1 ./step-41 \ >>>>>> -st_ksp_type gmres \ >>>>>> -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \ >>>>>> -st_ksp_view -mata AMAT.dat -matb BMAT.dat \ >>>>>> -st_mg_coarse_ksp_type preonly -st_mg_coarse_ksp_monitor \ >>>>>> -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_nonsmooth >>>>>> 2>&1 >>>>>> >>>>>> Test 2 >>>>>> $ make NCORE=1 runkr_smooth >>>>>> mpirun -n 1 ./step-41 \ >>>>>> -st_ksp_type gmres \ >>>>>> -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \ >>>>>> -st_ksp_view -mata AMAT.dat -matb BMAT.dat \ >>>>>> -st_mg_coarse_ksp_type preonly -st_mg_coarse_ksp_monitor \ >>>>>> -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_smooth 2>&1 >>>>>> makefile:43: recipe for target 'runkr_smooth' failed >>>>>> make: *** [runkr_smooth] Error 91 >>>>>> >>>>>> Test 3 >>>>>> $ make NCORE=1 runkr_gmres >>>>>> mpirun -n 1 ./step-41 \ >>>>>> -st_ksp_type gmres \ >>>>>> -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \ >>>>>> -st_ksp_view -mata AMAT.dat -matb BMAT.dat \ >>>>>> -st_mg_coarse_ksp_type gmres -st_mg_coarse_ksp_monitor >>>>>> -st_mg_coarse_ksp_rtol 1.0e-6 \ >>>>>> -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_gmres 2>&1 >>>>>> makefile:59: recipe for target 'runkr_gmres' failed >>>>>> make: *** [runkr_gmres] Error 91 >>>>>> >>>>>> Log files were attched. >>>>>> The matrix file were also attched as AMAT.dat and BMAT.dat. >>>>>> >>>>>> Is it correct? Or something wrong with my code or commad-line? >>>>>> >>>>>> Thanks! >>>>>> >>>>>> Wenbo >>>>>> >>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. 
>>>>> -- Norbert Wiener >>>>> >>>>> https://www.cse.buffalo.edu/~knepley/ >>>>> >>>>> >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Mon Oct 2 08:51:32 2017 From: mfadams at lbl.gov (Mark Adams) Date: Mon, 2 Oct 2017 09:51:32 -0400 Subject: [petsc-users] Issue of mg_coarse_ksp not converge In-Reply-To: References: Message-ID: Please send the output with -st_ksp_view and -st_ksp_monitor and we can start to debug it. You mentioned that B is not symmetric. I assume it is elliptic (diffusion). Where does the asymmetry come from? On Mon, Oct 2, 2017 at 9:39 AM, Wenbo Zhao wrote: > Matt, > Thanks for your reply. > For the defalt option doesnt work firstly( -st_ksp_type gmres -st_pc_type > gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1), I tried to test > those options. > > Wenbo > > On Mon, Oct 2, 2017 at 9:08 PM, Matthew Knepley wrote: > >> On Mon, Oct 2, 2017 at 8:30 AM, Wenbo Zhao >> wrote: >> >>> Matt >>> >>> Because I am not clear about what will happen using 'preonly' for large >>> scale problem. >>> >> >> The size of the problem has nothing to do with 'preonly'. All it means is >> to apply a preconditioner without a Krylov solver. >> >> >>> It seems to use a direct solver from below, >>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/ >>> KSP/KSPPREONLY.html >>> >> >> However, I still cannot understand why you would change the default? >> >> Matt >> >> >>> >>> Thanks! >>> Wenbo >>> >>> On Mon, Oct 2, 2017 at 5:09 PM, Matthew Knepley >>> wrote: >>> >>>> On Sun, Oct 1, 2017 at 9:53 PM, Wenbo Zhao >>>> wrote: >>>> >>>>> Matt, >>>>> Thanks for your reply. >>>>> It DOES make no sense for this problem. >>>>> But I am not clear about the 'preonly' option. Which solver is used in >>>>> preonly? I wonder if 'preonly' is suitable for large scale problem such as >>>>> 400,000,000 unknowns. >>>>> So I tried 'gmres' option and found these error messages. >>>>> >>>> >>>> I mean, why are you setting this at all. Just do not set the coarse >>>> solver. The default should work fine. >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> Could you give me some suggestions? >>>>> >>>>> Thanks. >>>>> >>>>> Wenbo >>>>> >>>>> >>>>> On Mon, Oct 2, 2017 at 12:34 AM, Matthew Knepley >>>>> wrote: >>>>> >>>>>> On Sun, Oct 1, 2017 at 6:49 AM, Wenbo Zhao >>>>>> wrote: >>>>>> >>>>>>> Hi, >>>>>>> >>>>>>> I met some questions when I use PETSC/SLEPC to solve two-group >>>>>>> neutron diffusion equations with finite difference method. The grid is >>>>>>> 3*3*3, when DOF on each points is 2. So the matrix size is 54*54. >>>>>>> It is generalized eigenvalue problem Ax=\lamda Bx, where B is >>>>>>> diagonally dominant matrix but not symmetry. >>>>>>> EPS is set as below, >>>>>>> ierr = EPSSetProblemType(eps,EPS_GNHEP);CHKERRQ(ierr);? >>>>>>> ierr = EPSSetWhichEigenpairs(eps,EPS_LARGEST_REAL);CHKERRQ(ierr);? >>>>>>> >>>>>>> Krylovschur is used as eps sovler. GAMG is used as PC. >>>>>>> I tried agg_nsmooths and mg_coarse_ksp_type. 
Only non-smooths and >>>>>>> preonly is OK. >>>>>>> >>>>>> >>>>>> Why are you setting the coarse solver. This makes no sense. >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> >>>>>>> Test 1 >>>>>>> $ make NCORE=1 runkr_nonsmooth >>>>>>> mpirun -n 1 ./step-41 \ >>>>>>> -st_ksp_type gmres \ >>>>>>> -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \ >>>>>>> -st_ksp_view -mata AMAT.dat -matb BMAT.dat \ >>>>>>> -st_mg_coarse_ksp_type preonly -st_mg_coarse_ksp_monitor \ >>>>>>> -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_nonsmooth >>>>>>> 2>&1 >>>>>>> >>>>>>> Test 2 >>>>>>> $ make NCORE=1 runkr_smooth >>>>>>> mpirun -n 1 ./step-41 \ >>>>>>> -st_ksp_type gmres \ >>>>>>> -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \ >>>>>>> -st_ksp_view -mata AMAT.dat -matb BMAT.dat \ >>>>>>> -st_mg_coarse_ksp_type preonly -st_mg_coarse_ksp_monitor \ >>>>>>> -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_smooth 2>&1 >>>>>>> makefile:43: recipe for target 'runkr_smooth' failed >>>>>>> make: *** [runkr_smooth] Error 91 >>>>>>> >>>>>>> Test 3 >>>>>>> $ make NCORE=1 runkr_gmres >>>>>>> mpirun -n 1 ./step-41 \ >>>>>>> -st_ksp_type gmres \ >>>>>>> -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \ >>>>>>> -st_ksp_view -mata AMAT.dat -matb BMAT.dat \ >>>>>>> -st_mg_coarse_ksp_type gmres -st_mg_coarse_ksp_monitor >>>>>>> -st_mg_coarse_ksp_rtol 1.0e-6 \ >>>>>>> -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_gmres 2>&1 >>>>>>> makefile:59: recipe for target 'runkr_gmres' failed >>>>>>> make: *** [runkr_gmres] Error 91 >>>>>>> >>>>>>> Log files were attched. >>>>>>> The matrix file were also attched as AMAT.dat and BMAT.dat. >>>>>>> >>>>>>> Is it correct? Or something wrong with my code or commad-line? >>>>>>> >>>>>>> Thanks! >>>>>>> >>>>>>> Wenbo >>>>>>> >>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>>> https://www.cse.buffalo.edu/~knepley/ >>>>>> >>>>>> >>>>> >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>>> https://www.cse.buffalo.edu/~knepley/ >>>> >>>> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From zhaowenbo.npic at gmail.com Mon Oct 2 09:43:22 2017 From: zhaowenbo.npic at gmail.com (Wenbo Zhao) Date: Mon, 2 Oct 2017 22:43:22 +0800 Subject: [petsc-users] Issue of mg_coarse_ksp not converge In-Reply-To: References: Message-ID: Mark, Thanks for your reply. On Mon, Oct 2, 2017 at 9:51 PM, Mark Adams wrote: > Please send the output with -st_ksp_view and -st_ksp_monitor and we can > start to debug it. 
> > Test 1 with nonsmooth and preonly is OK zhaowenbo at ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_nonsmooth mpirun -n 1 ./step-41 \ -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \ -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \ -st_ksp_view -mata AMAT.dat -matb BMAT.dat \ -st_mg_coarse_ksp_type preonly -st_mg_coarse_ksp_monitor \ -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_nonsmooth 2>&1 Test 2 smooth and preonly is not OK zhaowenbo at ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_smooth mpirun -n 1 ./step-41 \ -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \ -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \ -st_ksp_view -mata AMAT.dat -matb BMAT.dat \ -st_mg_coarse_ksp_type preonly -st_mg_coarse_ksp_monitor \ -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_smooth 2>&1 makefile:43: recipe for target 'runkr_smooth' failed make: *** [runkr_smooth] Error 91 Test 3 nonsmooth and gmres is not OK zhaowenbo at ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_gmres mpirun -n 1 ./step-41 \ -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \ -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \ -st_ksp_view -mata AMAT.dat -matb BMAT.dat \ -st_mg_coarse_ksp_type gmres -st_mg_coarse_ksp_monitor -st_mg_coarse_ksp_rtol 1.0e-6 \ -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_gmres 2>&1 makefile:59: recipe for target 'runkr_gmres' failed make: *** [runkr_gmres] Error 91 log-files is attached. You mentioned that B is not symmetric. I assume it is elliptic (diffusion). > Where does the asymmetry come from? > > It is a two-group diffusion equations, where group denotes neutron enegry discretisation. Matrix B consists of neutron diffusion/leakage term, removal term and minus neutron scatter source term between different energies, when matrix A denotes neutron fission source. Diffusion term(Laplace operator) is elliptic and symmetric. Removal term is diagonal only. However scatter term is asymmetry since scatter term from high energy to low energy is far greater than the term from low to high. Wenbo > On Mon, Oct 2, 2017 at 9:39 AM, Wenbo Zhao > wrote: > >> Matt, >> Thanks for your reply. >> For the defalt option doesnt work firstly( -st_ksp_type gmres -st_pc_type >> gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1), I tried to test >> those options. >> >> Wenbo >> >> On Mon, Oct 2, 2017 at 9:08 PM, Matthew Knepley >> wrote: >> >>> On Mon, Oct 2, 2017 at 8:30 AM, Wenbo Zhao >>> wrote: >>> >>>> Matt >>>> >>>> Because I am not clear about what will happen using 'preonly' for large >>>> scale problem. >>>> >>> >>> The size of the problem has nothing to do with 'preonly'. All it means >>> is to apply a preconditioner without a Krylov solver. >>> >>> >>>> It seems to use a direct solver from below, >>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/ >>>> KSP/KSPPREONLY.html >>>> >>> >>> However, I still cannot understand why you would change the default? >>> >>> Matt >>> >>> >>>> >>>> Thanks! >>>> Wenbo >>>> >>>> On Mon, Oct 2, 2017 at 5:09 PM, Matthew Knepley >>>> wrote: >>>> >>>>> On Sun, Oct 1, 2017 at 9:53 PM, Wenbo Zhao >>>>> wrote: >>>>> >>>>>> Matt, >>>>>> Thanks for your reply. >>>>>> It DOES make no sense for this problem. >>>>>> But I am not clear about the 'preonly' option. Which solver is used >>>>>> in preonly? I wonder if 'preonly' is suitable for large scale problem such >>>>>> as 400,000,000 unknowns. >>>>>> So I tried 'gmres' option and found these error messages. 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: log_new.tgz
Type: application/x-gzip
Size: 124778 bytes
Desc: not available
URL: 

From knepley at gmail.com  Mon Oct 2 09:48:48 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 2 Oct 2017 10:48:48 -0400
Subject: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To: References: Message-ID: 

On Mon, Oct 2, 2017 at 10:43 AM, Wenbo Zhao wrote:
> Test 3 with nonsmooth and gmres is not OK:
> zhaowenbo at ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_gmres
> mpirun -n 1 ./step-41 \
> -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
> -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
> -st_ksp_view -mata AMAT.dat -matb BMAT.dat \
> -st_mg_coarse_ksp_type gmres -st_mg_coarse_ksp_monitor -st_mg_coarse_ksp_rtol 1.0e-6 \

DO NOT DO THIS. Please send the output where you do NOTHING to the coarse
solver.

Thanks,

Matt
From zhaowenbo.npic at gmail.com  Mon Oct 2 10:06:05 2017
From: zhaowenbo.npic at gmail.com (Wenbo Zhao)
Date: Mon, 2 Oct 2017 23:06:05 +0800
Subject: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To: References: Message-ID: 

Matt,

Test 1 nonsmooth:
zhaowenbo at ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_nonsmooth
mpirun -n 1 ./step-41 \
-st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
-mata AMAT.dat -matb BMAT.dat \
-eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_nonsmooth 2>&1

Test 2 smooth:
zhaowenbo at ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_smooth
mpirun -n 1 ./step-41 \
-st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
-mata AMAT.dat -matb BMAT.dat \
-eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_smooth 2>&1
makefile:43: recipe for target 'runkr_smooth' failed
make: *** [runkr_smooth] Error 91

Thanks,

Wenbo

On Mon, Oct 2, 2017 at 10:48 PM, Matthew Knepley wrote:
> DO NOT DO THIS. Please send the output where you do NOTHING to the coarse
> solver.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: log_new1.tgz
Type: application/x-gzip
Size: 9722 bytes
Desc: not available
URL: 

From knepley at gmail.com  Mon Oct 2 10:11:51 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 2 Oct 2017 11:11:51 -0400
Subject: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To: References: Message-ID: 

On Mon, Oct 2, 2017 at 11:06 AM, Wenbo Zhao wrote:
> Test 2 smooth:
> makefile:43: recipe for target 'runkr_smooth' failed
> make: *** [runkr_smooth] Error 91

Thanks Wenbo.

Mark, the solve is not failing, it's the construction of the interpolator,
I think. Check out this stack:

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: 
[0]PETSC ERROR: KSPSolve has not converged
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.8.0, unknown
[0]PETSC ERROR: ./step-41 on a arch-linux2-c-debug named ubuntu by zhaowenbo Mon Oct 2 08:00:58 2017
[0]PETSC ERROR: Configure options --with-mpi=1 --with-shared-libraries=1 --with-64-bit-indices=1 --with-debugging=1
[0]PETSC ERROR: #1 KSPSolve() line 855 in /home/zhaowenbo/research/petsc/petsc_git/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #2 PCGAMGOptProlongator_AGG() line 1186 in /home/zhaowenbo/research/petsc/petsc_git/src/ksp/pc/impls/gamg/agg.c
[0]PETSC ERROR: #3 PCSetUp_GAMG() line 528 in /home/zhaowenbo/research/petsc/petsc_git/src/ksp/pc/impls/gamg/gamg.c
[0]PETSC ERROR: #4 PCSetUp() line 924 in /home/zhaowenbo/research/petsc/petsc_git/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #5 KSPSetUp() line 378 in /home/zhaowenbo/research/petsc/petsc_git/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #6 STSetUp_Shift() line 129 in /home/zhaowenbo/research/slepc/slepc_git/src/sys/classes/st/impls/shift/shift.c
[0]PETSC ERROR: #7 STSetUp() line 281 in /home/zhaowenbo/research/slepc/slepc_git/src/sys/classes/st/interface/stsolve.c
[0]PETSC ERROR: #8 EPSSetUp() line 273 in /home/zhaowenbo/research/slepc/slepc_git/src/eps/interface/epssetup.c
[0]PETSC ERROR: #9 solve_diffusion_3d() line 1029 in src/diffu.c
[0]PETSC ERROR: #10 main() line 25 in src/main.c
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -eps_monitor
[0]PETSC ERROR: -eps_ncv 10
[0]PETSC ERROR: -eps_nev 1
[0]PETSC ERROR: -log_view
[0]PETSC ERROR: -mata AMAT.dat
[0]PETSC ERROR: -matb BMAT.dat
[0]PETSC ERROR: -st_ksp_monitor
[0]PETSC ERROR: -st_ksp_type gmres
[0]PETSC ERROR: -st_ksp_view
[0]PETSC ERROR: -st_pc_gamg_agg_nsmooths 1
[0]PETSC ERROR: -st_pc_gamg_type agg
[0]PETSC ERROR: -st_pc_type gamg
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator
MPI_COMM_WORLD with errorcode 91.
--------------------------------------------------------------------------

Thanks,

Matt
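As an aside, the configuration that does work above (-st_pc_gamg_agg_nsmooths
0) can also be set from code. A minimal sketch using the standard SLEPc/PETSc
API (the helper name SetUnsmoothedGAMG and the surrounding setup are
assumptions for illustration; only the library calls themselves are real):

  #include <slepceps.h>

  /* Sketch: make the spectral transform's linear solver use GMRES with
   * unsmoothed aggregation GAMG, the command-line equivalent of
   *   -st_ksp_type gmres -st_pc_type gamg -st_pc_gamg_type agg
   *   -st_pc_gamg_agg_nsmooths 0
   * Assumes eps was created with EPSCreate() and given A and B already. */
  static PetscErrorCode SetUnsmoothedGAMG(EPS eps)
  {
    ST             st;
    KSP            ksp;
    PC             pc;
    PetscErrorCode ierr;

    ierr = EPSGetST(eps,&st);CHKERRQ(ierr);       /* ST owns the inner KSP */
    ierr = STGetKSP(st,&ksp);CHKERRQ(ierr);
    ierr = KSPSetType(ksp,KSPGMRES);CHKERRQ(ierr);
    ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
    ierr = PCSetType(pc,PCGAMG);CHKERRQ(ierr);
    ierr = PCGAMGSetType(pc,PCGAMGAGG);CHKERRQ(ierr);
    ierr = PCGAMGSetNSmooths(pc,0);CHKERRQ(ierr);  /* skip prolongator smoothing */
    return 0;
  }

Command-line options of course override these defaults, so the failing
smoothed case can still be reproduced with -st_pc_gamg_agg_nsmooths 1.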
From mfadams at lbl.gov  Mon Oct 2 10:15:29 2017
From: mfadams at lbl.gov (Mark Adams)
Date: Mon, 2 Oct 2017 11:15:29 -0400
Subject: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To: References: Message-ID: 

Non-smoothed aggregation is converging very fast. Smoothed fails in the
eigen estimator.

Run this again with -st_gamg_est_ksp_view and -st_gamg_est_ksp_monitor,
and see if you get more output (I'm not 100% sure about these args).

On Mon, Oct 2, 2017 at 11:06 AM, Wenbo Zhao wrote:
> Test 2 smooth:
> makefile:43: recipe for target 'runkr_smooth' failed
> make: *** [runkr_smooth] Error 91
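For context on why the eigen estimate only matters in the failing case:
smoothed aggregation improves the tentative prolongator with one Jacobi-like
step. A sketch in standard smoothed-aggregation notation (generic, not taken
from the PETSc source):

  P = \left(I - \frac{\omega}{\hat\lambda}\, D^{-1}A\right) P_{tent},
  \qquad \hat\lambda \approx \lambda_{\max}(D^{-1}A),

where P_{tent} is the unsmoothed aggregation prolongator, \omega is an O(1)
damping factor, and \hat\lambda is estimated with a few Krylov iterations --
the KSPSolve inside PCGAMGOptProlongator_AGG in the stack above. With
-st_pc_gamg_agg_nsmooths 0 this smoothing step (and hence the estimate) is
skipped entirely, which is why the non-smoothed runs never reach that code
path.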
From mfadams at lbl.gov  Mon Oct 2 10:21:08 2017
From: mfadams at lbl.gov (Mark Adams)
Date: Mon, 2 Oct 2017 11:21:08 -0400
Subject: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To: References: Message-ID: 

Yea, it fails in the eigen estimator, but the Cheby eigen estimator works
in the solve that works:

  eigenvalue estimates used: min = 0.100004, max = 1.10004
  eigenvalues estimate via gmres min 0.0118548, max 1.00004

Why would it just give "KSPSolve has not converged"? It is not supposed to
converge ...

On Mon, Oct 2, 2017 at 11:11 AM, Matthew Knepley wrote:
> Mark, the solve is not failing, it's the construction of the interpolator,
> I think. Check out this stack:
> [0]PETSC ERROR: KSPSolve has not converged
> [0]PETSC ERROR: #1 KSPSolve() line 855 in /home/zhaowenbo/research/petsc/petsc_git/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #2 PCGAMGOptProlongator_AGG() line 1186 in /home/zhaowenbo/research/petsc/petsc_git/src/ksp/pc/impls/gamg/agg.c
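To decode those numbers: the Chebyshev smoother is handed a target interval
scaled from the estimated largest eigenvalue \hat\lambda of the diagonally
preconditioned operator. Assuming the usual GAMG scaling factors of 0.1 and
1.1 (an assumption from standard GAMG defaults, not shown in the log), the
reported bounds are consistent:

  [\,0.1\,\hat\lambda,\ 1.1\,\hat\lambda\,] = [\,0.100004,\ 1.10004\,]
  \quad\text{for}\quad \hat\lambda = 1.00004.

The estimator itself runs only a fixed, small number of GMRES iterations and
is not expected to reach any tolerance, which is Mark's point: "KSPSolve has
not converged" should not be treated as an error for that particular solve.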
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From zhaowenbo.npic at gmail.com  Mon Oct  2 10:23:25 2017
From: zhaowenbo.npic at gmail.com (Wenbo Zhao)
Date: Mon, 2 Oct 2017 23:23:25 +0800
Subject: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To: 
References: 
Message-ID: 

I get more output

zhaowenbo at ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_smooth
mpirun -n 1 ./step-41 \
 -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
 -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
 -mata AMAT.dat -matb BMAT.dat \
 -st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
 -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_smooth 2>&1
makefile:43: recipe for target 'runkr_smooth' failed
make: *** [runkr_smooth] Error 91

zhaowenbo at ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_nonsmooth
mpirun -n 1 ./step-41 \
 -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
 -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
 -mata AMAT.dat -matb BMAT.dat \
 -st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
 -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_nonsmooth 2>&1

On Mon, Oct 2, 2017 at 11:15 PM, Mark Adams wrote:

> non-smoothed aggregation is converging very fast. smoothed fails in the
> eigen estimator.
>
> Run this again with -st_gamg_est_ksp_view and -st_gamg_est_ksp_monitor,
> and see if you get more output (I'm not 100% sure about these args).
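For readers following the option names: everything prefixed -st_ acts on the
linear solve inside SLEPc's spectral transformation, and -st_gamg_est_ksp_*
reaches the inner KSP that GAMG's smoothed aggregation uses to estimate
eigenvalues. A minimal sketch of how that outer KSP is reached in code, using
standard SLEPc calls; the variable names are assumed, not taken from Wenbo's
program:

    /* inside a function where eps and ierr already exist */
    ST  st;
    KSP ksp;
    ierr = EPSGetST(eps,&st);CHKERRQ(ierr);  /* the spectral transformation */
    ierr = STGetKSP(st,&ksp);CHKERRQ(ierr);  /* the KSP with prefix "st_"   */
    /* options such as -st_ksp_type gmres and -st_pc_type gamg configure
       this ksp; the GAMG eigen estimator then composes the longer prefix
       st_gamg_est_ used in the runs above */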
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: log_new2.tgz
Type: application/x-gzip
Size: 7663 bytes
Desc: not available
URL: 

From knepley at gmail.com  Mon Oct  2 10:29:56 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 2 Oct 2017 11:29:56 -0400
Subject: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To: 
References: 
Message-ID: 

On Mon, Oct 2, 2017 at 11:21 AM, Mark Adams wrote:

> Yea, it fails in the eigen estimator, but the Cheby eigen estimator works
> in the solve that works:
>
>   eigenvalue estimates used:  min = 0.100004, max = 1.10004
>   eigenvalues estimate via gmres min 0.0118548, max 1.00004
>
> Why would it just give "KSPSolve has not converged"? It is not supposed
> to converge ...

This sounds like a mistake with
http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/KSP/KSPSetErrorIfNotConverged.html
somewhere.

   Matt

> On Mon, Oct 2, 2017 at 11:11 AM, Matthew Knepley wrote:
>
>> Mark, the solve is not failing, it's the construction of the
>> interpolator, I think. Check out this stack:
>>
>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>> [0]PETSC ERROR:
>> [0]PETSC ERROR: KSPSolve has not converged
>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> for trouble shooting.
>> [0]PETSC ERROR: Petsc Release Version 3.8.0, unknown
>> [0]PETSC ERROR: ./step-41 on a arch-linux2-c-debug named ubuntu by
>> zhaowenbo Mon Oct  2 08:00:58 2017
>> [0]PETSC ERROR: Configure options --with-mpi=1 --with-shared-libraries=1
>> --with-64-bit-indices=1 --with-debugging=1
>> [0]PETSC ERROR: #1 KSPSolve() line 855 in /home/zhaowenbo/research/petsc/petsc_git/src/ksp/ksp/interface/itfunc.c
>> [0]PETSC ERROR: #2 PCGAMGOptProlongator_AGG() line 1186 in /home/zhaowenbo/research/petsc/petsc_git/src/ksp/pc/impls/gamg/agg.c
>> [0]PETSC ERROR: #3 PCSetUp_GAMG() line 528 in /home/zhaowenbo/research/petsc/petsc_git/src/ksp/pc/impls/gamg/gamg.c
>> [0]PETSC ERROR: #4 PCSetUp() line 924 in /home/zhaowenbo/research/petsc/petsc_git/src/ksp/pc/interface/precon.c
>> [0]PETSC ERROR: #5 KSPSetUp() line 378 in /home/zhaowenbo/research/petsc/petsc_git/src/ksp/ksp/interface/itfunc.c
>> [0]PETSC ERROR: #6 STSetUp_Shift() line 129 in /home/zhaowenbo/research/slepc/slepc_git/src/sys/classes/st/impls/shift/shift.c
>> [0]PETSC ERROR: #7 STSetUp() line 281 in /home/zhaowenbo/research/slepc/slepc_git/src/sys/classes/st/interface/stsolve.c
>> [0]PETSC ERROR: #8 EPSSetUp() line 273 in /home/zhaowenbo/research/slepc/slepc_git/src/eps/interface/epssetup.c
>> [0]PETSC ERROR: #9 solve_diffusion_3d() line 1029 in src/diffu.c
>> [0]PETSC ERROR: #10 main() line 25 in src/main.c
>> [0]PETSC ERROR: PETSc Option Table entries:
>> [0]PETSC ERROR: -eps_monitor
>> [0]PETSC ERROR: -eps_ncv 10
>> [0]PETSC ERROR: -eps_nev 1
>> [0]PETSC ERROR: -log_view
>> [0]PETSC ERROR: -mata AMAT.dat
>> [0]PETSC ERROR: -matb BMAT.dat
>> [0]PETSC ERROR: -st_ksp_monitor
>> [0]PETSC ERROR: -st_ksp_type gmres
>> [0]PETSC ERROR: -st_ksp_view
>> [0]PETSC ERROR: -st_pc_gamg_agg_nsmooths 1
>> [0]PETSC ERROR: -st_pc_gamg_type agg
>> [0]PETSC ERROR: -st_pc_type gamg
>> [0]PETSC ERROR: ----------------End of Error Message -------send entire
>> error message to petsc-maint at mcs.anl.gov----------
>> --------------------------------------------------------------------------
>> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
>> with errorcode 91.
>>
>>   Thanks,
>>
>>      Matt
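For reference, the flag that link describes turns a diverged solve into a
hard error. A minimal caller-side sketch of the two behaviors, assuming a
generic PETSc code where ierr, ksp, b, and x are already set up:

    KSPConvergedReason reason;
    /* PETSC_FALSE is the default: KSPSolve() returns and reports a reason */
    ierr = KSPSetErrorIfNotConverged(ksp,PETSC_FALSE);CHKERRQ(ierr);
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
    ierr = KSPGetConvergedReason(ksp,&reason);CHKERRQ(ierr);
    if (reason < 0) {
      /* handle DIVERGED_* here; with the flag set to PETSC_TRUE, KSPSolve()
         would instead have raised the "KSPSolve has not converged" error
         seen in the stack above */
    }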
-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From mfadams at lbl.gov  Mon Oct  2 10:30:20 2017
From: mfadams at lbl.gov (Mark Adams)
Date: Mon, 2 Oct 2017 11:30:20 -0400
Subject: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To: 
References: 
Message-ID: 

It seems to solve fine but then fails on this:

    ierr = PetscObjectSAWsBlock((PetscObject)ksp);CHKERRQ(ierr);
    if (ksp->errorifnotconverged && ksp->reason < 0)
      SETERRQ(comm,PETSC_ERR_NOT_CONVERGED,"KSPSolve has not converged");

It looks like somehow ksp->errorifnotconverged got set.

On Mon, Oct 2, 2017 at 11:23 AM, Wenbo Zhao wrote:

> I get more output
>
> zhaowenbo at ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_smooth
> mpirun -n 1 ./step-41 \
>  -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
>  -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>  -mata AMAT.dat -matb BMAT.dat \
>  -st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
>  -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_smooth 2>&1
> makefile:43: recipe for target 'runkr_smooth' failed
> make: *** [runkr_smooth] Error 91
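If that flag is indeed being set on the estimator's KSP, one way to test the
hypothesis (assuming the estimator honors the generic
-ksp_error_if_not_converged option through its st_gamg_est_ prefix; the
thread does not confirm this) would be to clear it from the command line:

    mpirun -n 1 ./step-41 \
     -st_ksp_type gmres \
     -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
     -st_gamg_est_ksp_error_if_not_converged 0 \
     -mata AMAT.dat -matb BMAT.dat \
     -eps_nev 1 -eps_ncv 10 -eps_monitor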
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com  Mon Oct  2 10:30:26 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 2 Oct 2017 11:30:26 -0400
Subject: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To: 
References: 
Message-ID: 

On Mon, Oct 2, 2017 at 11:15 AM, Mark Adams wrote:

> non-smoothed aggregation is converging very fast. smoothed fails in the
> eigen estimator.
>
> Run this again with -st_gamg_est_ksp_view and -st_gamg_est_ksp_monitor,
> and see if you get more output (I'm not 100% sure about these args).

I also want -st_gamg_est_ksp_converged_reason

  Thanks,

     Matt
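-ksp_converged_reason makes each solve print a one-line verdict; for the
estimator solve it is the kind of line quoted later in this thread:

    Linear st_gamg_est_ solve did not converge due to DIVERGED_ITS iterations 10

which is expected behavior for an eigenvalue estimator that runs a fixed,
small number of iterations rather than iterating to a tolerance.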
-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From zhaowenbo.npic at gmail.com  Mon Oct  2 10:39:23 2017
From: zhaowenbo.npic at gmail.com (Wenbo Zhao)
Date: Mon, 2 Oct 2017 23:39:23 +0800
Subject: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To: 
References: 
Message-ID: 

On Mon, Oct 2, 2017 at 11:30 PM, Matthew Knepley wrote:

> I also want -st_gamg_est_ksp_converged_reason
>
>   Thanks,
>
>      Matt

$ make NCORE=1 runkr_smooth
mpirun -n 1 ./step-41 \
 -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
 -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
 -mata AMAT.dat -matb BMAT.dat \
 -st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
 -st_gamg_est_ksp_converged_reason \
 -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_smooth 2>&1
makefile:43: recipe for target 'runkr_smooth' failed
make: *** [runkr_smooth] Error 91

Thanks
Wenbo
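For context on why only -st_pc_gamg_agg_nsmooths 1 exercises this inner
solve: smoothed aggregation post-multiplies the tentative prolongator by a
damped Jacobi step, in the textbook form (standard SA-AMG, not taken from
this thread):

    P = (I - \omega D^{-1} A) P_tent,   \omega \approx 4 / (3 \lambda_max(D^{-1} A))

so GAMG needs an estimate of \lambda_max, which is exactly what the
st_gamg_est_ GMRES run computes. With nsmooths 0 the tentative prolongator
is used directly and no estimate is needed, which is why the non-smoothed
run never hits this code path.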
>>>>>> >>>>>> Test 1 with nonsmooth and preonly is OK >>>>> zhaowenbo at ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 >>>>> runkr_nonsmooth >>>>> mpirun -n 1 ./step-41 \ >>>>> -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \ >>>>> -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \ >>>>> -st_ksp_view -mata AMAT.dat -matb BMAT.dat \ >>>>> -st_mg_coarse_ksp_type preonly -st_mg_coarse_ksp_monitor \ >>>>> -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_nonsmooth 2>&1 >>>>> >>>>> Test 2 smooth and preonly is not OK >>>>> zhaowenbo at ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_smooth >>>>> mpirun -n 1 ./step-41 \ >>>>> -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \ >>>>> -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \ >>>>> -st_ksp_view -mata AMAT.dat -matb BMAT.dat \ >>>>> -st_mg_coarse_ksp_type preonly -st_mg_coarse_ksp_monitor \ >>>>> -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_smooth 2>&1 >>>>> makefile:43: recipe for target 'runkr_smooth' failed >>>>> make: *** [runkr_smooth] Error 91 >>>>> >>>>> Test 3 nonsmooth and gmres is not OK >>>>> zhaowenbo at ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_gmres >>>>> mpirun -n 1 ./step-41 \ >>>>> -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \ >>>>> -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \ >>>>> -st_ksp_view -mata AMAT.dat -matb BMAT.dat \ >>>>> -st_mg_coarse_ksp_type gmres -st_mg_coarse_ksp_monitor >>>>> -st_mg_coarse_ksp_rtol 1.0e-6 \ >>>>> >>>> >>>> DO NOT DO THIS. Please send the output where you do NOTHING to the >>>> coarse solver. >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_gmres 2>&1 >>>>> makefile:59: recipe for target 'runkr_gmres' failed >>>>> make: *** [runkr_gmres] Error 91 >>>>> >>>>> log-files is attached. >>>>> >>>>> >>>>> You mentioned that B is not symmetric. I assume it is elliptic >>>>>> (diffusion). Where does the asymmetry come from? >>>>>> >>>>>> >>>>> It is a two-group diffusion equations, where group denotes neutron >>>>> enegry discretisation. >>>>> Matrix B consists of neutron diffusion/leakage term, removal term and >>>>> minus neutron scatter source term between different energies, when matrix A >>>>> denotes neutron fission source. >>>>> >>>>> Diffusion term(Laplace operator) is elliptic and symmetric. Removal >>>>> term is diagonal only. However scatter term is asymmetry since scatter term >>>>> from high energy to low energy is far greater than the term from low to >>>>> high. >>>>> >>>>> >>>>> Wenbo >>>>> >>>>> >>>>>> On Mon, Oct 2, 2017 at 9:39 AM, Wenbo Zhao >>>>>> wrote: >>>>>> >>>>>>> Matt, >>>>>>> Thanks for your reply. >>>>>>> For the defalt option doesnt work firstly( -st_ksp_type gmres >>>>>>> -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1), I tried >>>>>>> to test those options. >>>>>>> >>>>>>> Wenbo >>>>>>> >>>>>>> On Mon, Oct 2, 2017 at 9:08 PM, Matthew Knepley >>>>>>> wrote: >>>>>>> >>>>>>>> On Mon, Oct 2, 2017 at 8:30 AM, Wenbo Zhao < >>>>>>>> zhaowenbo.npic at gmail.com> wrote: >>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> Because I am not clear about what will happen using 'preonly' for >>>>>>>>> large scale problem. >>>>>>>>> >>>>>>>> >>>>>>>> The size of the problem has nothing to do with 'preonly'. All it >>>>>>>> means is to apply a preconditioner without a Krylov solver. 
>>>>>>>> >>>>>>>> >>>>>>>>> It seems to use a direct solver from below, >>>>>>>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/ >>>>>>>>> KSP/KSPPREONLY.html >>>>>>>>> >>>>>>>> >>>>>>>> However, I still cannot understand why you would change the default? >>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> >>>>>>>>> Thanks! >>>>>>>>> Wenbo >>>>>>>>> >>>>>>>>> On Mon, Oct 2, 2017 at 5:09 PM, Matthew Knepley >>>>>>>> > wrote: >>>>>>>>> >>>>>>>>>> On Sun, Oct 1, 2017 at 9:53 PM, Wenbo Zhao < >>>>>>>>>> zhaowenbo.npic at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> Matt, >>>>>>>>>>> Thanks for your reply. >>>>>>>>>>> It DOES make no sense for this problem. >>>>>>>>>>> But I am not clear about the 'preonly' option. Which solver is >>>>>>>>>>> used in preonly? I wonder if 'preonly' is suitable for large scale problem >>>>>>>>>>> such as 400,000,000 unknowns. >>>>>>>>>>> So I tried 'gmres' option and found these error messages. >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> I mean, why are you setting this at all. Just do not set the >>>>>>>>>> coarse solver. The default should work fine. >>>>>>>>>> >>>>>>>>>> Thanks, >>>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> Could you give me some suggestions? >>>>>>>>>>> >>>>>>>>>>> Thanks. >>>>>>>>>>> >>>>>>>>>>> Wenbo >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On Mon, Oct 2, 2017 at 12:34 AM, Matthew Knepley < >>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> On Sun, Oct 1, 2017 at 6:49 AM, Wenbo Zhao < >>>>>>>>>>>> zhaowenbo.npic at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> Hi, >>>>>>>>>>>>> >>>>>>>>>>>>> I met some questions when I use PETSC/SLEPC to solve two-group >>>>>>>>>>>>> neutron diffusion equations with finite difference method. The grid is >>>>>>>>>>>>> 3*3*3, when DOF on each points is 2. So the matrix size is 54*54. >>>>>>>>>>>>> It is generalized eigenvalue problem Ax=\lamda Bx, where B is >>>>>>>>>>>>> diagonally dominant matrix but not symmetry. >>>>>>>>>>>>> EPS is set as below, >>>>>>>>>>>>> ierr = EPSSetProblemType(eps,EPS_GNHEP);CHKERRQ(ierr);? >>>>>>>>>>>>> ierr = EPSSetWhichEigenpairs(eps,EPS_ >>>>>>>>>>>>> LARGEST_REAL);CHKERRQ(ierr);? >>>>>>>>>>>>> >>>>>>>>>>>>> Krylovschur is used as eps sovler. GAMG is used as PC. >>>>>>>>>>>>> I tried agg_nsmooths and mg_coarse_ksp_type. Only non-smooths >>>>>>>>>>>>> and preonly is OK. >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Why are you setting the coarse solver. This makes no sense. 
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: log_new3.tgz
Type: application/x-gzip
Size: 7717 bytes
Desc: not available
URL: 
From mfadams at lbl.gov  Mon Oct  2 10:45:45 2017
From: mfadams at lbl.gov (Mark Adams)
Date: Mon, 2 Oct 2017 11:45:45 -0400
Subject: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To: References: Message-ID: 

This is normal:

  Linear st_gamg_est_ solve did not converge due to DIVERGED_ITS iterations 10

It looks like ksp->errorifnotconverged got set somehow. If the default changed in KSP then (SAGG) GAMG would not ever work.

I assume you don't have a .petscrc file with more (crazy) options in it ...

On Mon, Oct 2, 2017 at 11:39 AM, Wenbo Zhao wrote:

> On Mon, Oct 2, 2017 at 11:30 PM, Matthew Knepley wrote:
>
>> On Mon, Oct 2, 2017 at 11:15 AM, Mark Adams wrote:
>>
>>> non-smoothed aggregation is converging very fast. smoothed fails in the eigen estimator.
>>>
>>> Run this again with -st_gamg_est_ksp_view and -st_gamg_est_ksp_monitor, and see if you get more output (I'm not 100% sure about these args).
>>>
>> I also want -st_gamg_est_ksp_converged_reason
>>
>> Thanks,
>>
>>    Matt
>>
> $make NCORE=1 runkr_smooth
> mpirun -n 1 ./step-41 \
>   -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
>   -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>   -mata AMAT.dat -matb BMAT.dat \
>   -st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
>   -st_gamg_est_ksp_converged_reason \
>   -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_smooth 2>&1
> makefile:43: recipe for target 'runkr_smooth' failed
> make: *** [runkr_smooth] Error 91
>
> Thanks
> Wenbo
>
>>> On Mon, Oct 2, 2017 at 11:06 AM, Wenbo Zhao wrote:
>>>
>>>> Matt,
>>>>
>>>> Test 1 nonsmooth
>>>> zhaowenbo at ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_nonsmooth
>>>> mpirun -n 1 ./step-41 \
>>>>   -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
>>>>   -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>>>   -mata AMAT.dat -matb BMAT.dat \
>>>>   -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_nonsmooth 2>&1
>>>>
>>>> Test 2 smooth
>>>> zhaowenbo at ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_smooth
>>>> mpirun -n 1 ./step-41 \
>>>>   -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
>>>>   -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>>>>   -mata AMAT.dat -matb BMAT.dat \
>>>>   -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_smooth 2>&1
>>>> makefile:43: recipe for target 'runkr_smooth' failed
>>>> make: *** [runkr_smooth] Error 91
>>>>
>>>> Thanks,
>>>>
>>>> Wenbo
>>>>
>>>> On Mon, Oct 2, 2017 at 10:48 PM, Matthew Knepley wrote:
>>>> [...]
>>>>> On Mon, Oct 2, 2017 at 9:51 PM, Mark Adams wrote:
>>>>>
>>>>>> Please send the output with -st_ksp_view and -st_ksp_monitor and we can start to debug it.
>>>>>> [...]
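Some background on what that estimator is, roughly: with
-st_pc_gamg_agg_nsmooths 1, GAMG smooths the tentative prolongator,

  P = (I - omega*D^{-1}*A)*Ptent,  omega ~ 4/(3*lambda_max),

and it runs a small fixed number of KSP iterations just to estimate
lambda_max(D^{-1}*A). That is the st_gamg_est_ solve in your log. It is meant
to stop after those few iterations, so DIVERGED_ITS there is harmless by
design; the problem is only that it is being treated as a hard error.
(Constants from memory; the exact damping factor is not the point here.)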
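And on the earlier 400,000,000-unknown worry: GAMG keeps coarsening until the
coarsest level is tiny (tens of equations by default; I think
-st_pc_gamg_coarse_eq_limit is the knob), so the default coarse solve,
preonly with a direct factorization, costs essentially nothing no matter how
large the fine grid is. There is no scalability reason to replace it with
gmres.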
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
From mfadams at lbl.gov  Mon Oct  2 10:49:20 2017
From: mfadams at lbl.gov (Mark Adams)
Date: Mon, 2 Oct 2017 11:49:20 -0400
Subject: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To: References: Message-ID: 

Wenbo, do you build your PETSc?

On Mon, Oct 2, 2017 at 11:45 AM, Mark Adams wrote:

> This is normal:
>
>   Linear st_gamg_est_ solve did not converge due to DIVERGED_ITS iterations 10
>
> It looks like ksp->errorifnotconverged got set somehow. If the default
> changed in KSP then (SAGG) GAMG would not ever work.
>
> I assume you don't have a .petscrc file with more (crazy) options in it ...
>
> [...]
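You can rule that out quickly with something like

  $ cat ~/.petscrc
  $ echo $PETSC_OPTIONS

(no such file and an empty variable means no hidden options), and adding
-options_left to a run will print any options that were set but never used.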
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
From zhaowenbo.npic at gmail.com  Mon Oct  2 10:56:13 2017
From: zhaowenbo.npic at gmail.com (Wenbo Zhao)
Date: Mon, 2 Oct 2017 23:56:13 +0800
Subject: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To: References: Message-ID: 

On Mon, Oct 2, 2017 at 11:49 PM, Mark Adams wrote:

> Wenbo, do you build your PETSc?

Yes. My configure options are listed below:

./configure --with-mpi=1 --with-shared-libraries=1 \
  --with-64-bit-indices=1 --with-debugging=1

And I set PETSC_DIR, PETSC_ARCH and SLEPC_DIR in my ~/.bashrc.

The Makefile for my problem is listed below:

PETSC_ARCH = arch-linux2-c-debug
PETSC_DIR = /home/zhaowenbo/research/petsc/petsc_git
SLEPC_DIR = /home/zhaowenbo/research/slepc/slepc_git
#PETSC_DIR = /home/zhaowenbo/research/petsc/petsc-3.7.4
#SLEPC_DIR = /home/zhaowenbo/research/slepc/slepc-3.7.3
HYPRE_DIR = /usr/local/hypre
#
DEBUG_OPT = -g
COMP_FLAGS = -fPIC -Wall \
  -I${SLEPC_DIR}/include -I${SLEPC_DIR}/${PETSC_ARCH}/include \
  -I${PETSC_DIR}/include -I${PETSC_DIR}/${PETSC_ARCH}/include \
  -Isrc

LINK_FLAGS = -fPIC -Wall \
  -Wl,-rpath,${SLEPC_DIR}/${PETSC_ARCH}/lib -L${SLEPC_DIR}/${PETSC_ARCH}/lib -lslepc \
  -Wl,-rpath,${PETSC_DIR}/${PETSC_ARCH}/lib -L${PETSC_DIR}/${PETSC_ARCH}/lib -lpetsc \
  -llapack -lblas -lhwloc -lm -lgfortran -lquadmath

step-41: src/main.o src/readinp.o src/base.o src/sp3.o src/diffu.o
	mpicxx -o step-41 $^ ${LINK_FLAGS} ${DEBUG_OPT}

src/main.o: src/main.c
	mpicxx -o src/main.o -c $^ ${COMP_FLAGS} ${DEBUG_OPT}

src/readinp.o: src/readinp.c
	mpicxx -o src/readinp.o -c $^ ${COMP_FLAGS} ${DEBUG_OPT}

src/sp3.o: src/sp3.c
	mpicxx -o src/sp3.o -c $^ ${COMP_FLAGS} ${DEBUG_OPT}

src/diffu.o: src/diffu.c
	mpicxx -o src/diffu.o -c $^ ${COMP_FLAGS} ${DEBUG_OPT}

src/base.o: src/base.c
	mpicxx -o src/base.o -c $^ ${COMP_FLAGS} ${DEBUG_OPT}

clean:
	rm step-41 src/main.o src/readinp.o src/sp3.o src/diffu.o src/base.o

runkr_smooth:
	mpirun -n ${NCORE} ./step-41 \
	  -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
	  -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
	  -mata AMAT.dat -matb BMAT.dat \
	  -st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
	  -st_gamg_est_ksp_converged_reason \
	  -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_smooth 2>&1

runkr_nonsmooth:
	mpirun -n ${NCORE} ./step-41 \
	  -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
	  -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
	  -mata AMAT.dat -matb BMAT.dat \
	  -st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
	  -st_gamg_est_ksp_converged_reason \
	  -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_nonsmooth 2>&1

Thanks,
Wenbo

> [...]
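By the way, I know this hand-written Makefile is fragile. I believe the stock
SLEPc makefile includes could replace most of it, something like (path and
variable names from memory, not tested):

  include ${SLEPC_DIR}/lib/slepc/conf/slepc_common

  step-41: src/main.o src/readinp.o src/base.o src/sp3.o src/diffu.o
  	${CLINKER} -o step-41 $^ ${SLEPC_EPS_LIB}

but I have not switched to that yet.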
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
From knepley at gmail.com  Mon Oct  2 11:23:39 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 2 Oct 2017 12:23:39 -0400
Subject: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To: References: Message-ID: 

On Mon, Oct 2, 2017 at 11:56 AM, Wenbo Zhao wrote:

> [...]
>
> runkr_smooth:
> 	mpirun -n ${NCORE} ./step-41 \
> 	  -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
> 	  -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
> 	  -mata AMAT.dat -matb BMAT.dat \
> 	  -st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
> 	  -st_gamg_est_ksp_converged_reason \

Add -st_gamg_est_ksp_error_if_not_converged 0

Thanks,

   Matt

> 	  -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_smooth 2>&1
>
> [...]
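To be concrete, your smooth target would become something like this
(untested; it is just your command plus the one flag):

  mpirun -n 1 ./step-41 \
    -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
    -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
    -st_gamg_est_ksp_error_if_not_converged 0 \
    -mata AMAT.dat -matb BMAT.dat \
    -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_smooth 2>&1

The flag only tells the eigenvalue-estimate KSP not to abort the whole run
when it stops at its fixed iteration count.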
-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

From zhaowenbo.npic at gmail.com  Mon Oct  2 11:32:42 2017
From: zhaowenbo.npic at gmail.com (Wenbo Zhao)
Date: Tue, 3 Oct 2017 00:32:42 +0800
Subject: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To: 
References: 
Message-ID: 

On Tue, Oct 3, 2017 at 12:23 AM, Matthew Knepley wrote:

> Add -st_gamg_est_ksp_error_if_not_converged 0
>
> Thanks,
>
> Matt
It works after adding -st_gamg_est_ksp_error_if_not_converged 0.

Thanks,
Wenbo

> On Mon, Oct 2, 2017 at 11:49 PM, Mark Adams wrote:
>> Wenbo, do you build your PETSc?
>
> Yes. My configure options are listed below:
>
>   ./configure --with-mpi=1 --with-shared-libraries=1 \
>       --with-64-bit-indices=1 --with-debugging=1
>
> And I set PETSC_DIR, PETSC_ARCH and SLEPC_DIR in my ~/.bashrc.
>
> The Makefile for my problem is listed below:
>
>   PETSC_ARCH = arch-linux2-c-debug
>   PETSC_DIR  = /home/zhaowenbo/research/petsc/petsc_git
>   SLEPC_DIR  = /home/zhaowenbo/research/slepc/slepc_git
>   HYPRE_DIR  = /usr/local/hypre
>   DEBUG_OPT  = -g
>   COMP_FLAGS = -fPIC -Wall \
>       -I${SLEPC_DIR}/include -I${SLEPC_DIR}/${PETSC_ARCH}/include \
>       -I${PETSC_DIR}/include -I${PETSC_DIR}/${PETSC_ARCH}/include \
>       -Isrc
>   LINK_FLAGS = -fPIC -Wall \
>       -Wl,-rpath,${SLEPC_DIR}/${PETSC_ARCH}/lib -L${SLEPC_DIR}/${PETSC_ARCH}/lib -lslepc \
>       -Wl,-rpath,${PETSC_DIR}/${PETSC_ARCH}/lib -L${PETSC_DIR}/${PETSC_ARCH}/lib -lpetsc \
>       -llapack -lblas -lhwloc -lm -lgfortran -lquadmath
>
>   step-41: src/main.o src/readinp.o src/base.o src/sp3.o src/diffu.o
>   	mpicxx -o step-41 $^ ${LINK_FLAGS} ${DEBUG_OPT}
>
>   src/%.o: src/%.c
>   	mpicxx -o $@ -c $^ ${COMP_FLAGS} ${DEBUG_OPT}
>
>   clean:
>   	rm step-41 src/main.o src/readinp.o src/sp3.o src/diffu.o src/base.o
>
>   runkr_smooth:
>   	mpirun -n ${NCORE} ./step-41 \
>   	  -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
>   	  -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>   	  -mata AMAT.dat -matb BMAT.dat \
>   	  -st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
>   	  -st_gamg_est_ksp_converged_reason \
>   	  -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_smooth 2>&1
>
>   runkr_nonsmooth:
>   	mpirun -n ${NCORE} ./step-41 \
>   	  -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
>   	  -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>   	  -mata AMAT.dat -matb BMAT.dat \
>   	  -st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
>   	  -st_gamg_est_ksp_converged_reason \
>   	  -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_nonsmooth 2>&1
>
> Earlier, Mark Adams wrote:
>> non-smoothed aggregation is converging very fast. smoothed fails in the
>> eigen estimator. Run this again with -st_gamg_est_ksp_view and
>> -st_gamg_est_ksp_monitor, and see if you get more output (I'm not 100%
>> sure about these args).
>
> and Matthew Knepley wrote:
>> I also want -st_gamg_est_ksp_converged_reason
>
> With those flags, the smooth run still failed:
>
>   $ make NCORE=1 runkr_smooth
>   makefile:43: recipe for target 'runkr_smooth' failed
>   make: *** [runkr_smooth] Error 91
>
> Mark Adams wrote:
>> This is normal:
>>
>>   Linear st_gamg_est_ solve did not converge due to DIVERGED_ITS iterations 10
>>
>> It looks like ksp->errorifnotconverged got set somehow. If the default
>> changed in KSP then (SAGG) GAMG would never work. I assume you don't
>> have a .petscrc file with more (crazy) options in it ...
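The working invocation is then the runkr_smooth target with the suggested
option appended; a sketch assembled from the flags already shown in this
thread:

runkr_smooth:
	mpirun -n ${NCORE} ./step-41 \
	  -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
	  -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
	  -st_gamg_est_ksp_error_if_not_converged 0 \
	  -mata AMAT.dat -matb BMAT.dat \
	  -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_smooth 2>&1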
From mfadams at lbl.gov  Mon Oct  2 12:49:04 2017
From: mfadams at lbl.gov (Mark Adams)
Date: Mon, 2 Oct 2017 13:49:04 -0400
Subject: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To: 
References: 
Message-ID: 

Well that is strange; the PETSc tests work.

Wenbo, could you please:

> git clone -b gamg-fix-eig-err https://bitbucket.org/petsc/petsc petsc2
> cd petsc2

and reconfigure, make, and then run your test without the
-st_gamg_est_ksp_error_if_not_converged 0 fix, and see if this fixes the
problem. Don't forget to set PETSC_DIR=..../petsc2

If you have time and this works, you could do a 'git checkout master' and
remake, and retest. You should not have to reconfigure. I have tested
master on the PETSc tests. I don't understand how this happened.

Thanks,
Mark
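A minimal sketch of that rebuild-and-retest sequence (assuming the same
configure options Wenbo posted earlier; note that SLEPc would also have to
be rebuilt against the new PETSC_DIR):

git clone -b gamg-fix-eig-err https://bitbucket.org/petsc/petsc petsc2
cd petsc2
./configure --with-mpi=1 --with-shared-libraries=1 \
    --with-64-bit-indices=1 --with-debugging=1
make
export PETSC_DIR=$PWD     # point builds at the test branch
# rebuild SLEPc against the new PETSC_DIR, then rerun runkr_smooth
# without -st_gamg_est_ksp_error_if_not_converged 0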
From mailinglists at xgm.de  Mon Oct  2 21:11:28 2017
From: mailinglists at xgm.de (Florian Lindner)
Date: Tue, 3 Oct 2017 10:11:28 +0800
Subject: [petsc-users] Mat/Vec with empty ranks
In-Reply-To: 
References: 
Message-ID: 

On 02.10.2017 at 21:04, Matthew Knepley wrote:
> On Mon, Oct 2, 2017 at 6:21 AM, Florian Lindner wrote:
>> Hello,
>>
>> I have a matrix and vector that live on 4 ranks, but only ranks 2 and 3
>> have values. A simple LSQR solve does not converge. However, when the
>> values are distributed equally, it converges within 3 iterations.
>>
>> What can I do about that? I have attached a simple program that creates
>> the matrix and vector or loads them from a file.
>
> There are a few problems with this program. I am attaching a cleaned-up
> version. However, convergence still differs starting at iteration 2. It
> appears that LSQR has a problem with this system, or we have a bug that I
> cannot see.

Thanks for having a look at it! And good to hear it's not by design. If I
can be of any more help tracking this down, please let me know.

In the meantime, what could be a good way to work around this? This is
admittedly a very malformed example. Is there a way to force solving on a
single CPU and then distribute the results, resp. the KSP object, to the
original parallel layout? Of course, we would first try to solve in
parallel, but we have little influence on the actual parallel layout,
since we are just a library and other solvers give us the data.

Best,
Florian
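A minimal command-line sketch of the workaround Florian describes,
assuming PETSc's PCREDUNDANT fits the use case (it copies the whole system
onto each process and solves it locally, so the solution already ends up
in the original parallel layout); the redundant_ inner-solver prefix is an
assumption to verify against the PCREDUNDANT man page:

# ./solver stands in for the real executable
mpirun -n 4 ./solver \
    -ksp_type preonly -pc_type redundant \
    -redundant_ksp_type lsqr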
From zakaryah at gmail.com  Mon Oct  2 22:54:29 2017
From: zakaryah at gmail.com (zakaryah .)
Date: Mon, 2 Oct 2017 23:54:29 -0400
Subject: [petsc-users] A number of questions about DMDA with SNES and
	Quasi-Newton methods
In-Reply-To: <87o9qp5kou.fsf@jedbrown.org>
References: <87o9qp5kou.fsf@jedbrown.org>
Message-ID: 

I'm still working on this. I've made some progress, and it looks like the
issue is with the KSP, at least for now. The Jacobian may be
ill-conditioned. Is it possible to use -snes_test_display during an
intermediate step of the analysis? I would like to inspect the Jacobian
after several solves have already completed, just to make sure there are
no mistakes there. I tried

  ierr = SNESSetType(mp->PETSc_snes, SNESTEST);CHKERRQ(ierr);
  ierr = PetscOptionsSetValue(NULL, "-snes_test_display", "");CHKERRQ(ierr);

and the first line works, of course, but the second line doesn't seem to
activate the printing of the Jacobian. I also tried it with "true" in the
last argument, and that didn't work either.
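A minimal sketch of one possible fix, on the assumption that the test
implementation only reads -snes_test_display when SNESSetFromOptions is
called, so the option has to be in the database before that call; snes and
x are hypothetical stand-ins for the application's objects:

/* Set the option first, then let the SNES read it. */
ierr = PetscOptionsSetValue(NULL, "-snes_test_display", "true");CHKERRQ(ierr);
ierr = SNESSetType(snes, SNESTEST);CHKERRQ(ierr);
ierr = SNESSetFromOptions(snes);CHKERRQ(ierr);  /* option is consulted here */
ierr = SNESSolve(snes, NULL, x);CHKERRQ(ierr);  /* compares hand-coded and FD Jacobians */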
On Tue, Sep 5, 2017 at 9:39 AM, Jed Brown wrote:

> "zakaryah ." writes:
>> OK - I've checked the Jacobian and function very thoroughly and I am
>> confident there are no errors.
>
> Does Newton converge quadratically when you have a good enough initial
> guess?
>
> Globalization of large deformation elasticity is a persistent
> engineering challenge. The standard approach is to use continuation,
> often in the form of load increments.
>
> Regarding the trust region documentation, the man page says
>
>   The basic algorithm is taken from "The Minpack Project", by More',
>   Sorensen, Garbow, Hillstrom, pages 88-111 of "Sources and Development
>   of Mathematical Software", Wayne Cowell, editor.
>
> You should be able to make sense of it reading from any other source on
> trust region methods.
>
>> I suspect that I am having problems with a bad starting point, and the
>> SNES cannot find the global minimum from there. I know that the global
>> minimum (with residual zero) exists in all cases, but I would like the
>> methods for finding it to be more robust to the starting value.
>>
>> The problem comes from the physics of finite deformations of elastic
>> materials. In short, I have a functional of smooth maps on a 3D domain
>> to itself. The functional contains two terms. The first term represents
>> forces which come from external data, and in the Lagrangian this term
>> only involves the values of the map at the point in question. The
>> second term penalizes fluctuations in the map, and can take various
>> forms. The simplest form is just the Dirichlet energy, but I'm also
>> interested in the infinitesimal strain energy and the finite strain
>> energy. The first two have terms in the Lagrangian which are first
>> order in the second spatial derivatives of the map, while the third
>> (finite strain energy) has terms which are up to third order in the
>> first and second spatial derivatives of the map. It is the finite
>> strain energy term which has been problematic.
>>
>> The Euler-Lagrange equations are discretized on a cubic grid, with
>> equal interval spacing in each dimension. The map is the dependent
>> variable, i.e. the x in F(x) = 0. I prefer Neumann boundary conditions.
>> Because the spatial derivatives of the map are usually small, the
>> Jacobian typically has large values in 3x3 blocks along the diagonal
>> (which depend on the map and the external data), and up to 54 values
>> which are functions of the spatial derivatives of the map and tend to
>> be smaller.
>>
>> Do you have any advice on diagnosing and improving situations in which
>> Newton's method finds a stationary point that is not the state with
>> globally minimal residual? One hint is that -snes_type newtonls does
>> not find as good a solution as -snes_type newtontr, but I don't know
>> much about these trust region methods, or how to customize and assess
>> them. I'm grateful for any advice.
>>
>> On Mon, Sep 4, 2017 at 5:33 PM, Barry Smith wrote:
>>>> OK that is super helpful. Just to be sure - for MxNxP, the row r in
>>>> the Jacobian is at r.i*P*N*3 + r.j*P*3 + r.k*3 + r.c?
>>>
>>> It is that way, or the other way around: r.k*M*N*3 + r.j*N*3 + r.i*3 +
>>> r.c. What is displayed is always the natural ordering (internally it
>>> is not the natural ordering), and all the degrees of freedom for a
>>> single point are next to each other in the vector/matrix. For -n 1,
>>> the orderings are the same.
>>
>> Yes, it looks like it IS the other way around, and I think the row is
>> r.c + r.i*3 + r.j*3*M + r.k*3*M*N, where r.i is in [0,M-1], r.j is in
>> [0,N-1], and r.k is in [0,P-1]. That matches the boundary conditions in
>> the displayed Jacobian.
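A minimal sketch of the load-increment continuation Jed mentions (assumed
code, not from the thread; it presumes the external-force term in the
residual is scaled by a loadScale field in a user context, and that snes
and x are the application's existing objects):

/* Ramp the load from 0 to full strength, reusing each converged
   state as the initial guess for the next increment. */
AppCtx             *user;      /* hypothetical context read by the residual */
SNESConvergedReason reason;
PetscInt            i, nsteps = 10;
PetscErrorCode      ierr;

for (i = 1; i <= nsteps; ++i) {
  user->loadScale = (PetscReal)i / (PetscReal)nsteps;
  ierr = SNESSolve(snes, NULL, x);CHKERRQ(ierr);   /* x carries over */
  ierr = SNESGetConvergedReason(snes, &reason);CHKERRQ(ierr);
  if (reason < 0) break;       /* in practice: cut the increment and retry */
}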
From zhaowenbo.npic at gmail.com  Tue Oct  3 02:55:24 2017
From: zhaowenbo.npic at gmail.com (Wenbo Zhao)
Date: Tue, 3 Oct 2017 15:55:24 +0800
Subject: [petsc-users] Issue of mg_coarse_ksp not converge
In-Reply-To: 
References: 
Message-ID: 

On Tue, Oct 3, 2017 at 1:49 AM, Mark Adams wrote:

> Well that is strange; the PETSc tests work. Wenbo, could you please:
>
>   git clone -b gamg-fix-eig-err https://bitbucket.org/petsc/petsc petsc2
>   cd petsc2
>
> and reconfigure, make, and then run your test without the
> -st_gamg_est_ksp_error_if_not_converged 0 fix, and see if this fixes the
> problem. Don't forget to set PETSC_DIR=..../petsc2

$ git clone -b mark/gamg-fix-eig-err https://bitbucket.org/petsc/petsc petsc2

It works well without -st_gamg_est_ksp_error_if_not_converged 0:

mpirun -n 1 ./step-41 \
    -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
    -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
    -mata AMAT.dat -matb BMAT.dat \
    -st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
    -st_gamg_est_ksp_converged_reason \
    -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_smooth 2>&1

Thanks,
Wenbo

> If you have time and this works, you could do a 'git checkout master'
> and remake, and retest. You should not have to reconfigure. I have
> tested master on the PETSc tests. I don't understand how this happened.
>
> Thanks,
> Mark
>>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> https://www.cse.buffalo.edu/~knepley/ >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> https://www.cse.buffalo.edu/~knepley/ >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> https://www.cse.buffalo.edu/~knepley/ >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> -- >>>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>>> experiments lead. >>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>> >>>>>>>>>>> https://www.cse.buffalo.edu/~knepley/ >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> -- >>>>>>>> What most experimenters take for granted before they begin their >>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>> experiments lead. >>>>>>>> -- Norbert Wiener >>>>>>>> >>>>>>>> https://www.cse.buffalo.edu/~knepley/ >>>>>>>> >>>>>>>> >>>>>>> >>>>>>> >>>>>> >>>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: log_new5.tgz Type: application/x-gzip Size: 12481 bytes Desc: not available URL: From knepley at gmail.com Tue Oct 3 03:05:35 2017 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 3 Oct 2017 04:05:35 -0400 Subject: [petsc-users] Mat/Vec with empty ranks In-Reply-To: References: Message-ID: On Mon, Oct 2, 2017 at 10:11 PM, Florian Lindner wrote: > > > Am 02.10.2017 um 21:04 schrieb Matthew Knepley: > > On Mon, Oct 2, 2017 at 6:21 AM, Florian Lindner > wrote: > > > > Hello, > > > > I have a matrix and vector that live on 4 ranks, but only rank 2 and > 3 have values: > > > > Doing a simple LSQR solve does not converge. However, when the > values are distributed equally, it converges within 3 > > iterations. > > > > What can I do about that? > > > > I have attached a simple program and creates the matrix and vector > or loads them from a file. > > > > > > There are a few problems with this program. I am attaching a cleaned up > version. However, convergence still differs starting > > at iteration 2. It appears that LSQR has a problem with this system, or > we have a bug that I cannot see. > > Thanks for having a look at it! > > And good to hear it's not by design. If I can be of any more help tracking > this down, pleae let me know. 
> > In the meantime, what could be a good way to work around this? This is > admittedly a very malformed example. Is there a > way to force solving on a single CPU and then distribute the results resp. > KSP object to the original parallel layout? > Of course, we would first try to solve in parallel, but we have little > influence about the actual parallel layout, since > we are just a library and other solvers give us the data. > I need to be more clear. I do not think convergence has anything to do with being on 1 process. I think this is an ill-conditioned example and convergence is an accident in one case. Unless you see this in a bunch of cases, I would not worry about gathering to a single process. However, that can be done using PCREDUNDANT if this is really a problem. Thanks, Matt > Best, > Florian > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From k_burkart at yahoo.com Tue Oct 3 06:40:35 2017 From: k_burkart at yahoo.com (Klaus Burkart) Date: Tue, 3 Oct 2017 11:40:35 +0000 (UTC) Subject: [petsc-users] Loading a PETsc matrix with matrix data in CSR format? In-Reply-To: References: <876053565.1090563.1505658351610.ref@mail.yahoo.com> <876053565.1090563.1505658351610@mail.yahoo.com> Message-ID: <519865087.343382.1507030835073@mail.yahoo.com> Hello, I am still struggling to load my matrix in a PETSc matrix. I tried the following code which is part of a c++ function which converts the original matrix to CSR fromat and should load a PETSc matrix. ??? Mat M; ??? MatSetFromOptions(M); ??? // fill PETSc matrix ??? MatCreateSeqAIJWithArrays(PETSC_COMM_WORLD, n, n, rows, cols, vals, M) but I get the following error message when compiling the code: ?error: cannot convert ?Mat {aka _p_Mat*}? to ?_p_Mat**? for argument ?7? to ?PetscErrorCode MatCreateSeqAIJWithArrays(MPI_Comm, PetscInt, PetscInt, PetscInt*, PetscInt*, PetscScalar*, _p_Mat**)? ???? MatCreateSeqAIJWithArrays(PETSC_COMM_WORLD, n, n, rows, cols, vals, M) What's wrong with this? Klaus Barry Smith schrieb am 16:38 Sonntag, 17.September 2017: 1)? MatSetValues() has absolutely nothing to do with the format used internally by PETSc to store the matrix. MatSetValues() is used to convey blocks of values into the matrix. ? 2) If you want to load a matrix onto multiple processes from a file NEVER write your own parallel ASCII matrix reader. Instead in either C, C++, Python, or Matlab write a routine that reads in the matrix SEQUENTIALLY and then saves it with MatView() and a binary viewer. You can then load the matrix easily and efficiently in PETSc in parallel with MatLoad ? 3) If you have matrix already in CSR format you can use MatCreateSeqAIJWithArrays() ? Barry > On Sep 17, 2017, at 9:25 AM, Klaus Burkart wrote: > > The matrix import function looks like this: > > void csr2pet > ( >? ? const Foam::lduMatrix & matrix, >? ? petsc_declaration & petsc_matrix? // How to declare the PETsc matrix to be filled? > ) > { >? ? int n = matrix.diag().size(); // small case n = 40800 >? ? int nnz = matrix.lower().size() + matrix.upper().size() + matrix.diag().size(); // small case nnz = 203800 > >? ? // allocate memory for CSR sparse matrix using calloc >? ? ScalarType * vals = (ScalarType *)calloc(nnz, sizeof(ScalarType)); >? ? uint * cols = (uint *)calloc(nnz, sizeof(uint)); >? ? 
uint * rows = (uint *)calloc(n, sizeof(uint)); > >? ? // call function to convert original LDU matrix to CSR format >? ? exPet::ldu2csr(matrix,rows,cols,vals); > >? ? // fill PETsc matrix >? ? MatSetValues(petsc_matrix, ?, ?, ?, ?, ?, INSERT_VALUES); > >? ? // free and release the matrix memory >? ? free(rows); free(cols); free(vals);? // calloc() > } > > > Questions: > > 1: How to declare the petsc_matrix to be filled by the function with the content of the original matrix? > > 2: MatSetValues(petsc_matrix, ?, ?, ?, ?, ?, INSERT_VALUES); is used to actually fill the petsc_matrix and I was of the opinion that PETsc uses the CSR format but I can't work out how CSR format is described by: > >? ? v? ? ? ? - a logically two-dimensional array of values >? ? m, idxm? ? - the number of rows and their global indices >? ? n, idxn? ? - the number of columns and their global indices > > My original matrix is converted to CSR format, i.e. three arrays cols (column_indices), rows (row_start_indices) and vals (values). > > How can I load my matrix into a PETsc matrix for parallel processing? MatSetValues(petsc_matrix, ?, ?, ?, ?, ?, INSERT_VALUES); > > Klaus -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Tue Oct 3 07:58:52 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 3 Oct 2017 14:58:52 +0200 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: References: <08AA1273-062D-4FDD-8709-40AA1FE8AA92@mcs.anl.gov> <06932E72-E8F3-4EA8-889F-A570AD660E32@mcs.anl.gov> <98D45588-381F-44FB-9D20-65D1638E357F@mcs.anl.gov> <876F0ACF-62C0-4DC9-B203-2893B25E3938@mcs.anl.gov> <1D94CC99-2395-4BBE-A0F3-80D23D85FA45@mcs.anl.gov> <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> Message-ID: <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> > On Oct 3, 2017, at 5:54 AM, zakaryah . wrote: > > I'm still working on this. I've made some progress, and it looks like the issue is with the KSP, at least for now. The Jacobian may be ill-conditioned. Is it possible to use -snes_test_display during an intermediate step of the analysis? I would like to inspect the Jacobian after several solves have already completed, No, our currently code for testing Jacobians is poor quality and poorly organized. Needs a major refactoring to do things properly. Sorry Barry > just to make sure there are no mistakes there. I tried > > ierr = SNESSetType(mp->PETSc_snes, SNESTEST);CHKERRQ(ierr); > > ierr = PetscOptionsSetValue(NULL, "-snes_test_display", ""); CHKERRQ(ierr); > > and the first line works, of course, but the second line doesn't seem to activate the printing of the Jacobian. I also tried it with "true" in the last argument and that didn't work either. > > > On Tue, Sep 5, 2017 at 9:39 AM, Jed Brown wrote: > "zakaryah ." writes: > > > OK - I've checked the Jacobian and function very thoroughly and I am > > confident there are no errors. > > Does Newton converge quadratically when you have a good enough initial guess? > > Globalization of large deformation elasticity is a persistent > engineering challenge. The standard approach is to use a continuation, > often in the form of load increments. > > Regarding trust region documentation, the man page says > > The basic algorithm is taken from "The Minpack Project", by More', > Sorensen, Garbow, Hillstrom, pages 88-111 of "Sources and Development > of Mathematical Software", Wayne Cowell, editor. 
> > You should be able to make sense of it reading from any other source on > trust region methods. > > > I suspect that I am having problems with a bad starting point, and the SNES > > cannot find the global minimum from there. I know that the global minimum > > (with residual zero) exists in all cases but I would like the methods for > > finding it to be more robust to the starting value. > > > > The problem comes from the physics of finite deformations of elastic > > materials. In short, I have a functional of smooth maps on a 3D domain to > > itself. The functional contains two terms. The first term represents > > forces which come from external data, and in the Lagrangian this term only > > involves the values of the map at the point in question. The second term > > penalizes fluctuations in the map, and can take various forms. The > > simplest form is just the Dirichlet energy, but I'm also interested in the > > infinitesimal strain energy and the finite strain energy. The first two > > have terms in the Lagrangian which are first order in the second spatial > > derivatives of the map, while the third (finite strain energy) has terms > > which are up to third order in the first and second spatial derivatives of > > the map. It is the finite strain energy term which has been problematic. > > > > The Euler-Lagrange equations are discretized on a cubic grid, with equal > > interval spacing in each dimension. The map is the dependent variable, > > i.e. the x in F(x) = 0. I prefer Neumann boundary conditions. Because the > > spatial derivatives of the map are usually small, the Jacobian typically > > has large values in 3x3 blocks along the diagonal (which depend on the map > > and the external data), and up to 54 values which are functions of the > > spatial derivatives of the map and tend to be smaller. > > > > Do you have any advice on diagnosing and improving situations in which > > Newton's method finds a stationary point that is not the state with > > globally minimal residual? One hint is that -snes_type newtonls does not > > find as good a solution as -snes_type newtontr but I don't know much about > > these trust region methods, or how to customize and assess them. I'm > > grateful for any advice. > > > > On Mon, Sep 4, 2017 at 5:44 PM, zakaryah . wrote: > > > >> Yes, it looks like it IS the other way around, and I think the row is > >> > >> r.c + r.i*3 + r.j*3*M + r.k*3*M*N, where r.i is in [0,M-1], r.j is in > >> [0,N-1], and r.k is in [0,P-1]. > >> > >> That matches the boundary conditions in the displayed Jacobian. > >> > >> On Mon, Sep 4, 2017 at 5:33 PM, Barry Smith wrote: > >> > >>> > >>> > On Sep 4, 2017, at 4:09 PM, zakaryah . wrote: > >>> > > >>> > OK that is super helpful. Just to be sure - for MxNxP, the row r in > >>> the Jacobian is at r.i*P*N*3 + r.j*P*3 + r.k*3 + r.c? > >>> > >>> It is that way, or the other way around r.k*M*N*3 + r.j*N*3 + r.k*3 + > >>> r.c > >>> > > >>> > > >>> > On Mon, Sep 4, 2017 at 4:58 PM, Barry Smith wrote: > >>> > > >>> > > On Sep 4, 2017, at 3:48 PM, zakaryah . wrote: > >>> > > > >>> > > One piece of information that would be useful is what ordering PETSc > >>> uses for the Jacobian in the snes_test_display. Is it a natural ordering, > >>> or the PETSc ordering? For debugging the Jacobian manually, the natural > >>> ordering is much easier to work with. > >>> > > >>> > What is displayed is always the natural ordering (internally it is > >>> not the natural ordering). > >>> > > >>> > > For -n 1, are the orderings the same? 
> >>> > > >>> > yes > >>> > > >>> > > >>> > > >>> > > > >>> > > If I use a MatStencil r to represent a field with 3 degrees of > >>> freedom, and the dimensions of my 3D DMDA are MxNxP, which row of the > >>> Jacobian corresponds to r.i=x, r.j=y, r.k=z, r.c=f? > >>> > > >>> > Internally it is complicated but for any viewing it is just the natural > >>> ordering and all the degrees of freedom for a single point are next to each > >>> other in the vector/matrix. > >>> > > >>> > > >>> > >>> > >> > From bsmith at mcs.anl.gov Tue Oct 3 08:07:09 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 3 Oct 2017 15:07:09 +0200 Subject: [petsc-users] Loading a PETsc matrix with matrix data in CSR format? In-Reply-To: <519865087.343382.1507030835073@mail.yahoo.com> References: <876053565.1090563.1505658351610.ref@mail.yahoo.com> <876053565.1090563.1505658351610@mail.yahoo.com> <519865087.343382.1507030835073@mail.yahoo.com> Message-ID: <91B67743-1079-4F81-9E1A-082D0AC161B6@mcs.anl.gov> > On Oct 3, 2017, at 1:40 PM, Klaus Burkart wrote: > > Hello, > > I am still struggling to load my matrix in a PETSc matrix. > > I tried the following code which is part of a c++ function which converts the original matrix to CSR fromat and should load a PETSc matrix. > > Mat M; > Cannot have the following line > MatSetFromOptions(M); > // fill PETSc matrix Should be &M at the end not M > MatCreateSeqAIJWithArrays(PETSC_COMM_WORLD, n, n, rows, cols, vals, M) > > but I get the following error message when compiling the code: > > error: cannot convert ?Mat {aka _p_Mat*}? to ?_p_Mat**? for argument ?7? to ?PetscErrorCode MatCreateSeqAIJWithArrays(MPI_Comm, PetscInt, PetscInt, PetscInt*, PetscInt*, PetscScalar*, _p_Mat**)? > MatCreateSeqAIJWithArrays(PETSC_COMM_WORLD, n, n, rows, cols, vals, M) > > What's wrong with this? > > Klaus > > > Barry Smith schrieb am 16:38 Sonntag, 17.September 2017: > > > > 1) MatSetValues() has absolutely nothing to do with the format used internally by PETSc to store the matrix. MatSetValues() is used to convey blocks of values into the matrix. > > 2) If you want to load a matrix onto multiple processes from a file NEVER write your own parallel ASCII matrix reader. Instead in either C, C++, Python, or Matlab write a routine that reads in the matrix SEQUENTIALLY and then saves it with MatView() and a binary viewer. You can then load the matrix easily and efficiently in PETSc in parallel with MatLoad > > 3) If you have matrix already in CSR format you can use MatCreateSeqAIJWithArrays() > > Barry > > > > > On Sep 17, 2017, at 9:25 AM, Klaus Burkart wrote: > > > > The matrix import function looks like this: > > > > void csr2pet > > ( > > const Foam::lduMatrix & matrix, > > petsc_declaration & petsc_matrix // How to declare the PETsc matrix to be filled? 
> > ) > > { > > int n = matrix.diag().size(); // small case n = 40800 > > int nnz = matrix.lower().size() + matrix.upper().size() + matrix.diag().size(); // small case nnz = 203800 > > > > // allocate memory for CSR sparse matrix using calloc > > ScalarType * vals = (ScalarType *)calloc(nnz, sizeof(ScalarType)); > > uint * cols = (uint *)calloc(nnz, sizeof(uint)); > > uint * rows = (uint *)calloc(n, sizeof(uint)); > > > > // call function to convert original LDU matrix to CSR format > > exPet::ldu2csr(matrix,rows,cols,vals); > > > > // fill PETsc matrix > > MatSetValues(petsc_matrix, ?, ?, ?, ?, ?, INSERT_VALUES); > > > > // free and release the matrix memory > > free(rows); free(cols); free(vals); // calloc() > > } > > > > > > Questions: > > > > 1: How to declare the petsc_matrix to be filled by the function with the content of the original matrix? > > > > 2: MatSetValues(petsc_matrix, ?, ?, ?, ?, ?, INSERT_VALUES); is used to actually fill the petsc_matrix and I was of the opinion that PETsc uses the CSR format but I can't work out how CSR format is described by: > > > > v - a logically two-dimensional array of values > > m, idxm - the number of rows and their global indices > > n, idxn - the number of columns and their global indices > > > > My original matrix is converted to CSR format, i.e. three arrays cols (column_indices), rows (row_start_indices) and vals (values). > > > > How can I load my matrix into a PETsc matrix for parallel processing? MatSetValues(petsc_matrix, ?, ?, ?, ?, ?, INSERT_VALUES); > > > > Klaus > > From knepley at gmail.com Tue Oct 3 08:42:42 2017 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 3 Oct 2017 09:42:42 -0400 Subject: [petsc-users] Loading a PETsc matrix with matrix data in CSR format? In-Reply-To: <519865087.343382.1507030835073@mail.yahoo.com> References: <876053565.1090563.1505658351610.ref@mail.yahoo.com> <876053565.1090563.1505658351610@mail.yahoo.com> <519865087.343382.1507030835073@mail.yahoo.com> Message-ID: On Tue, Oct 3, 2017 at 7:40 AM, Klaus Burkart wrote: > Hello, > > I am still struggling to load my matrix in a PETSc matrix. > > I tried the following code which is part of a c++ function which converts > the original matrix to CSR fromat and should load a PETSc matrix. > > Mat M; > MatSetFromOptions(M); > // fill PETSc matrix > MatCreateSeqAIJWithArrays(PETSC_COMM_WORLD, n, n, rows, cols, vals, M) > It should be &M at the end, according to the function definition. Matt > > but I get the following error message when compiling the code: > > error: cannot convert ?Mat {aka _p_Mat*}? to ?_p_Mat**? for argument ?7? > to ?PetscErrorCode MatCreateSeqAIJWithArrays(MPI_Comm, PetscInt, > PetscInt, PetscInt*, PetscInt*, PetscScalar*, _p_Mat**)? > MatCreateSeqAIJWithArrays(PETSC_COMM_WORLD, n, n, rows, cols, vals, > M) > > What's wrong with this? > > Klaus > > > Barry Smith schrieb am 16:38 Sonntag, 17.September > 2017: > > > > 1) MatSetValues() has absolutely nothing to do with the format used > internally by PETSc to store the matrix. MatSetValues() is used to convey > blocks of values into the matrix. > > 2) If you want to load a matrix onto multiple processes from a file > NEVER write your own parallel ASCII matrix reader. Instead in either C, > C++, Python, or Matlab write a routine that reads in the matrix > SEQUENTIALLY and then saves it with MatView() and a binary viewer. 
You can > then load the matrix easily and efficiently in PETSc in parallel with > MatLoad > > 3) If you have matrix already in CSR format you can use > MatCreateSeqAIJWithArrays() > > Barry > > > > > On Sep 17, 2017, at 9:25 AM, Klaus Burkart wrote: > > > > The matrix import function looks like this: > > > > void csr2pet > > ( > > const Foam::lduMatrix & matrix, > > petsc_declaration & petsc_matrix // How to declare the PETsc > matrix to be filled? > > ) > > { > > int n = matrix.diag().size(); // small case n = 40800 > > int nnz = matrix.lower().size() + matrix.upper().size() + > matrix.diag().size(); // small case nnz = 203800 > > > > // allocate memory for CSR sparse matrix using calloc > > ScalarType * vals = (ScalarType *)calloc(nnz, sizeof(ScalarType)); > > uint * cols = (uint *)calloc(nnz, sizeof(uint)); > > uint * rows = (uint *)calloc(n, sizeof(uint)); > > > > // call function to convert original LDU matrix to CSR format > > exPet::ldu2csr(matrix,rows,cols,vals); > > > > // fill PETsc matrix > > MatSetValues(petsc_matrix, ?, ?, ?, ?, ?, INSERT_VALUES); > > > > // free and release the matrix memory > > free(rows); free(cols); free(vals); // calloc() > > } > > > > > > Questions: > > > > 1: How to declare the petsc_matrix to be filled by the function with the > content of the original matrix? > > > > 2: MatSetValues(petsc_matrix, ?, ?, ?, ?, ?, INSERT_VALUES); is used to > actually fill the petsc_matrix and I was of the opinion that PETsc uses the > CSR format but I can't work out how CSR format is described by: > > > > v - a logically two-dimensional array of values > > m, idxm - the number of rows and their global indices > > n, idxn - the number of columns and their global indices > > > > My original matrix is converted to CSR format, i.e. three arrays cols > (column_indices), rows (row_start_indices) and vals (values). > > > > How can I load my matrix into a PETsc matrix for parallel processing? > MatSetValues(petsc_matrix, ?, ?, ?, ?, ?, INSERT_VALUES); > > > > Klaus > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Tue Oct 3 09:08:31 2017 From: jed at jedbrown.org (Jed Brown) Date: Tue, 03 Oct 2017 08:08:31 -0600 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> References: <98D45588-381F-44FB-9D20-65D1638E357F@mcs.anl.gov> <876F0ACF-62C0-4DC9-B203-2893B25E3938@mcs.anl.gov> <1D94CC99-2395-4BBE-A0F3-80D23D85FA45@mcs.anl.gov> <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> Message-ID: <87bmlo9vds.fsf@jedbrown.org> Barry Smith writes: >> On Oct 3, 2017, at 5:54 AM, zakaryah . wrote: >> >> I'm still working on this. I've made some progress, and it looks like the issue is with the KSP, at least for now. The Jacobian may be ill-conditioned. Is it possible to use -snes_test_display during an intermediate step of the analysis? I would like to inspect the Jacobian after several solves have already completed, > > No, our currently code for testing Jacobians is poor quality and poorly organized. Needs a major refactoring to do things properly. 
Sorry You can use -snes_compare_explicit or -snes_compare_coloring to output differences on each Newton step. From zakaryah at gmail.com Tue Oct 3 10:21:26 2017 From: zakaryah at gmail.com (zakaryah .) Date: Tue, 3 Oct 2017 11:21:26 -0400 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: <87bmlo9vds.fsf@jedbrown.org> References: <98D45588-381F-44FB-9D20-65D1638E357F@mcs.anl.gov> <876F0ACF-62C0-4DC9-B203-2893B25E3938@mcs.anl.gov> <1D94CC99-2395-4BBE-A0F3-80D23D85FA45@mcs.anl.gov> <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> <87bmlo9vds.fsf@jedbrown.org> Message-ID: I tried -snes_compare_explicit, and got the following error: [0]PETSC ERROR: Invalid argument [0]PETSC ERROR: Matrix not generated from a DMDA What am I doing wrong? On Tue, Oct 3, 2017 at 10:08 AM, Jed Brown wrote: > Barry Smith writes: > > >> On Oct 3, 2017, at 5:54 AM, zakaryah . wrote: > >> > >> I'm still working on this. I've made some progress, and it looks like > the issue is with the KSP, at least for now. The Jacobian may be > ill-conditioned. Is it possible to use -snes_test_display during an > intermediate step of the analysis? I would like to inspect the Jacobian > after several solves have already completed, > > > > No, our currently code for testing Jacobians is poor quality and > poorly organized. Needs a major refactoring to do things properly. Sorry > > You can use -snes_compare_explicit or -snes_compare_coloring to output > differences on each Newton step. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From hzhang at mcs.anl.gov Tue Oct 3 10:34:45 2017 From: hzhang at mcs.anl.gov (Hong) Date: Tue, 3 Oct 2017 10:34:45 -0500 Subject: [petsc-users] Using factored complex matrices from MUMPS as a preconditioner in PETSC In-Reply-To: References: Message-ID: Evan, ICNTL(35) and CNTL(7) are added to petsc-mumps interface in branch hzhang/update-mumps-5.1.1-cntl You may give it a try. Once it passes our regression tests, I'll merge it to petsc master branch. Hong On Sun, Sep 24, 2017 at 8:08 PM, Hong wrote: > I'll check it. > Hong > > On Sun, Sep 24, 2017 at 3:42 PM, Evan Um wrote: > >> Hi Barry, >> >> Thanks for your comments. To activate block low rank (BLR) approximation >> in MUMPS version 5.1.1, a user needs to turn on the functionality (i.e. >> ICNTL(35)=1) and specify the tolerance value (e.g. CNTL(7)=1e-4). In PETSC, >> I think that we can set up ICNTL and CNTL parameters for MUMPS. I was >> wondering if we can still use BLR approximation for a preconditioner for >> Krylov solvers. >> >> Best, >> Evan >> >> >> On Sat, Sep 23, 2017 at 6:45 PM, Barry Smith wrote: >> >>> >>> > On Sep 23, 2017, at 8:38 PM, Evan Um wrote: >>> > >>> > Dear PETSC Users, >>> > >>> > My system matrix comes from finite element modeling and is complex and >>> unstructured. Its typical size is a few millions-by a few millions. I >>> wondering if I can use MUMPS parallel direct solver as a preconditioner in >>> PETSC. For example, I want to pass factored matrices to Krylov iterative >>> solvers such as QMR. Is there any PETSC+MUMPS example code for the purpose? >>> >>> You don't pass factored matrices you just pass the original matrix and >>> use -pc_type lu -pc_factor_mat_solver_package mumps >>> >>> > Can PETSC call the latest MUMPS that supports block low rank >>> approximation? 
>>> >>> No, send us info on it and we'll see if we can add an interface >>> >>> >>> > >>> > In advance, thank you very much for your comments. >>> > >>> > Best, >>> > Evan >>> > >>> > >>> > >>> > >>> > >>> >>> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From evanum at gmail.com Tue Oct 3 11:12:09 2017 From: evanum at gmail.com (Evan Um) Date: Tue, 3 Oct 2017 09:12:09 -0700 Subject: [petsc-users] Using factored complex matrices from MUMPS as a preconditioner in PETSC In-Reply-To: References: Message-ID: Hi Hong, Thank you very much for your kind support. If I have any question, I will let you know. Regards, Evan On Tue, Oct 3, 2017 at 8:34 AM, Hong wrote: > Evan, > ICNTL(35) and CNTL(7) are added to petsc-mumps interface in branch > hzhang/update-mumps-5.1.1-cntl > > You may give it a try. Once it passes our regression tests, I'll merge it > to petsc master branch. > > Hong > > > On Sun, Sep 24, 2017 at 8:08 PM, Hong wrote: > >> I'll check it. >> Hong >> >> On Sun, Sep 24, 2017 at 3:42 PM, Evan Um wrote: >> >>> Hi Barry, >>> >>> Thanks for your comments. To activate block low rank (BLR) approximation >>> in MUMPS version 5.1.1, a user needs to turn on the functionality (i.e. >>> ICNTL(35)=1) and specify the tolerance value (e.g. CNTL(7)=1e-4). In PETSC, >>> I think that we can set up ICNTL and CNTL parameters for MUMPS. I was >>> wondering if we can still use BLR approximation for a preconditioner for >>> Krylov solvers. >>> >>> Best, >>> Evan >>> >>> >>> On Sat, Sep 23, 2017 at 6:45 PM, Barry Smith wrote: >>> >>>> >>>> > On Sep 23, 2017, at 8:38 PM, Evan Um wrote: >>>> > >>>> > Dear PETSC Users, >>>> > >>>> > My system matrix comes from finite element modeling and is complex >>>> and unstructured. Its typical size is a few millions-by a few millions. I >>>> wondering if I can use MUMPS parallel direct solver as a preconditioner in >>>> PETSC. For example, I want to pass factored matrices to Krylov iterative >>>> solvers such as QMR. Is there any PETSC+MUMPS example code for the purpose? >>>> >>>> You don't pass factored matrices you just pass the original matrix >>>> and use -pc_type lu -pc_factor_mat_solver_package mumps >>>> >>>> > Can PETSC call the latest MUMPS that supports block low rank >>>> approximation? >>>> >>>> No, send us info on it and we'll see if we can add an interface >>>> >>>> >>>> > >>>> > In advance, thank you very much for your comments. >>>> > >>>> > Best, >>>> > Evan >>>> > >>>> > >>>> > >>>> > >>>> > >>>> >>>> >>> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From dylanb at Princeton.EDU Tue Oct 3 15:16:52 2017 From: dylanb at Princeton.EDU (Dylan P. Brennan) Date: Tue, 3 Oct 2017 20:16:52 +0000 Subject: [petsc-users] Configure problem Message-ID: <2F7168C0-E7DC-47FB-A7BC-90652BC3D12F@princeton.edu> Hello, I?m having problems configuring, any ideas? Dylan -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log Type: application/octet-stream Size: 1038871 bytes Desc: configure.log URL: From balay at mcs.anl.gov Tue Oct 3 15:22:46 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 3 Oct 2017 15:22:46 -0500 Subject: [petsc-users] Configure problem In-Reply-To: <2F7168C0-E7DC-47FB-A7BC-90652BC3D12F@princeton.edu> References: <2F7168C0-E7DC-47FB-A7BC-90652BC3D12F@princeton.edu> Message-ID: >>>>> checking size of bool... 
0./configure: line 12435: printf %s\n: command not found ./configure: line 12436: printf %s\n: command not found configure: WARNING: The C++ compiler g++ cannot compile a program containing the header - this may indicate a problem with the C++ installation. Consider configuing with --disable-cxx configure: WARNING: Structures containing long doubles may be aligned differently from structures with floats or longs. MPICH does not handle this case automatically and you should avoid assumed extents for structures containing float types. configure: error: unable to determine matching C type for C++ bool <<<<<< For some reason MPICH configure is failing. balay at asterix /home/balay/petsc (maint=) $ which printf /usr/bin/printf balay at asterix /home/balay/petsc (maint=) $ rpm -qf /usr/bin/printf coreutils-8.27-16.fc27.x86_64 balay at asterix /home/balay/petsc (maint=) $ Your machine does not have this basic coreuitls package installed? Satish On Tue, 3 Oct 2017, Dylan P. Brennan wrote: > > Hello, > > I?m having problems configuring, any ideas? > > Dylan > > From mfadams at lbl.gov Tue Oct 3 15:39:50 2017 From: mfadams at lbl.gov (Mark Adams) Date: Tue, 3 Oct 2017 16:39:50 -0400 Subject: [petsc-users] Configure problem In-Reply-To: References: <2F7168C0-E7DC-47FB-A7BC-90652BC3D12F@princeton.edu> Message-ID: On Tue, Oct 3, 2017 at 4:22 PM, Satish Balay wrote: > >>>>> > checking size of bool... 0./configure: line 12435: printf %s\n: command > not found > ./configure: line 12436: printf %s\n: command not found > configure: WARNING: The C++ compiler g++ cannot compile a program > containing the header - this may indicate a problem with the C++ > installation. Consider configuing with --disable-cxx > configure: WARNING: Structures containing long doubles may be aligned > differently from structures with floats or longs. MPICH does not handle > this case automatically and you should avoid assumed extents for structures > containing float types. > configure: error: unable to determine matching C type for C++ bool > <<<<<< > > For some reason MPICH configure is failing. > > > balay at asterix /home/balay/petsc (maint=) > $ which printf > /usr/bin/printf > balay at asterix /home/balay/petsc (maint=) > $ rpm -qf /usr/bin/printf > coreutils-8.27-16.fc27.x86_64 > balay at asterix /home/balay/petsc (maint=) > $ > > Your machine does not have this basic coreuitls package installed? > > We are probably missing a module, > Satish > > On Tue, 3 Oct 2017, Dylan P. Brennan wrote: > > > > > Hello, > > > > I?m having problems configuring, any ideas? > > > > Dylan > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Tue Oct 3 16:00:21 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 3 Oct 2017 16:00:21 -0500 Subject: [petsc-users] Configure problem In-Reply-To: References: <2F7168C0-E7DC-47FB-A7BC-90652BC3D12F@princeton.edu> Message-ID: On Tue, 3 Oct 2017, Mark Adams wrote: > > $ rpm -qf /usr/bin/printf > > coreutils-8.27-16.fc27.x86_64 > > Your machine does not have this basic coreuitls package installed? > > > > > We are probably missing a module, 2.6.32-696.1.1.el6.x86_64 Its a RHEL6 (or clone) box - and this package should be part of the basic OS - and not a module. It would be very strange to have RHEL box without this package installed. 
Satish From mfadams at lbl.gov Tue Oct 3 16:55:47 2017 From: mfadams at lbl.gov (Mark Adams) Date: Tue, 3 Oct 2017 17:55:47 -0400 Subject: [petsc-users] Configure problem In-Reply-To: References: <2F7168C0-E7DC-47FB-A7BC-90652BC3D12F@princeton.edu> Message-ID: I think we figured it out. We did not have the modules set and got some local help. Thanks On Tue, Oct 3, 2017 at 5:00 PM, Satish Balay wrote: > On Tue, 3 Oct 2017, Mark Adams wrote: > > > > $ rpm -qf /usr/bin/printf > > > coreutils-8.27-16.fc27.x86_64 > > > > Your machine does not have this basic coreuitls package installed? > > > > > > > > We are probably missing a module, > > 2.6.32-696.1.1.el6.x86_64 > > Its a RHEL6 (or clone) box - and this package should be part of the > basic OS - and not a module. > > It would be very strange to have RHEL box without this package installed. > > Satish > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Tue Oct 3 16:37:17 2017 From: jed at jedbrown.org (Jed Brown) Date: Tue, 03 Oct 2017 15:37:17 -0600 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: References: <98D45588-381F-44FB-9D20-65D1638E357F@mcs.anl.gov> <876F0ACF-62C0-4DC9-B203-2893B25E3938@mcs.anl.gov> <1D94CC99-2395-4BBE-A0F3-80D23D85FA45@mcs.anl.gov> <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> <87bmlo9vds.fsf@jedbrown.org> Message-ID: <87zi979alu.fsf@jedbrown.org> Always always always send the whole error message. "zakaryah ." writes: > I tried -snes_compare_explicit, and got the following error: > > [0]PETSC ERROR: Invalid argument > > [0]PETSC ERROR: Matrix not generated from a DMDA > > What am I doing wrong? > > On Tue, Oct 3, 2017 at 10:08 AM, Jed Brown wrote: > >> Barry Smith writes: >> >> >> On Oct 3, 2017, at 5:54 AM, zakaryah . wrote: >> >> >> >> I'm still working on this. I've made some progress, and it looks like >> the issue is with the KSP, at least for now. The Jacobian may be >> ill-conditioned. Is it possible to use -snes_test_display during an >> intermediate step of the analysis? I would like to inspect the Jacobian >> after several solves have already completed, >> > >> > No, our currently code for testing Jacobians is poor quality and >> poorly organized. Needs a major refactoring to do things properly. Sorry >> >> You can use -snes_compare_explicit or -snes_compare_coloring to output >> differences on each Newton step. >> From mailinglists at xgm.de Tue Oct 3 21:27:41 2017 From: mailinglists at xgm.de (Florian Lindner) Date: Wed, 4 Oct 2017 10:27:41 +0800 Subject: [petsc-users] Mat/Vec with empty ranks In-Reply-To: References: Message-ID: Am 03.10.2017 um 16:05 schrieb Matthew Knepley: > On Mon, Oct 2, 2017 at 10:11 PM, Florian Lindner > wrote: > > > > Am 02.10.2017 um 21:04 schrieb Matthew Knepley: > > On Mon, Oct 2, 2017 at 6:21 AM, Florian Lindner > >> wrote: > > > >? ? ?Hello, > > > >? ? ?I have a matrix and vector that live on 4 ranks, but only rank 2 and 3 have values: > > > >? ? ?Doing a simple LSQR solve does not converge. However, when the values are distributed equally, it converges > within 3 > >? ? ?iterations. > > > >? ? ?What can I do about that? > > > >? ? ?I have attached a simple program and creates the matrix and vector or loads them from a file. > > > > > > There are a few problems with this program. I am attaching a cleaned up version. 
However, convergence still > differs starting > > at iteration 2. It appears that LSQR has a problem with this system, or we have a bug that I cannot see. > > Thanks for having a look at it! > > And good to hear it's not by design. If I can be of any more help tracking this down, pleae let me know. > > In the meantime, what could be a good way to work around this? This is admittedly a very malformed example. Is there a > way to force solving on a single CPU and then distribute the results resp. KSP object to the original parallel layout? > Of course, we would first try to solve in parallel, but we have little influence about the actual parallel layout, since > we are just a library and other solvers give us the data. > > > I need to be more clear. I do not think convergence has anything to do with being on 1 process. I think this is an > ill-conditioned > example and convergence is an accident in one case. Unless you see this in a bunch of cases, I would not worry about > gathering > to a single process. However, that can be done using PCREDUNDANT if this is really a problem. Why do think it is ill-conditioned? The condition number is around 5, The singular values are [5.93710645, 1.85088733, 1.15107911] (both according to NumPy) and QR decomposition with NumPy works fine. I believe you're right about that, giving your proficiency in the subject, just want to learn and maybe fix my input data, if it's possible. Thanks, Florian From knepley at gmail.com Wed Oct 4 05:08:55 2017 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 4 Oct 2017 06:08:55 -0400 Subject: [petsc-users] Mat/Vec with empty ranks In-Reply-To: References: Message-ID: On Tue, Oct 3, 2017 at 10:27 PM, Florian Lindner wrote: > Am 03.10.2017 um 16:05 schrieb Matthew Knepley: > > On Mon, Oct 2, 2017 at 10:11 PM, Florian Lindner > wrote: > > > > > > > > Am 02.10.2017 um 21:04 schrieb Matthew Knepley: > > > On Mon, Oct 2, 2017 at 6:21 AM, Florian Lindner < > mailinglists at xgm.de > > >> wrote: > > > > > > Hello, > > > > > > I have a matrix and vector that live on 4 ranks, but only rank > 2 and 3 have values: > > > > > > Doing a simple LSQR solve does not converge. However, when the > values are distributed equally, it converges > > within 3 > > > iterations. > > > > > > What can I do about that? > > > > > > I have attached a simple program and creates the matrix and > vector or loads them from a file. > > > > > > > > > There are a few problems with this program. I am attaching a > cleaned up version. However, convergence still > > differs starting > > > at iteration 2. It appears that LSQR has a problem with this > system, or we have a bug that I cannot see. > > > > Thanks for having a look at it! > > > > And good to hear it's not by design. If I can be of any more help > tracking this down, pleae let me know. > > > > In the meantime, what could be a good way to work around this? This > is admittedly a very malformed example. Is there a > > way to force solving on a single CPU and then distribute the results > resp. KSP object to the original parallel layout? > > Of course, we would first try to solve in parallel, but we have > little influence about the actual parallel layout, since > > we are just a library and other solvers give us the data. > > > > > > I need to be more clear. I do not think convergence has anything to do > with being on 1 process. I think this is an > > ill-conditioned > > example and convergence is an accident in one case. 
Unless you see this > in a bunch of cases, I would not worry about > > gathering > > to a single process. However, that can be done using PCREDUNDANT if this > is really a problem. > > Why do think it is ill-conditioned? The condition number is around 5, The > singular values are [5.93710645, 1.85088733, > 1.15107911] (both according to NumPy) and QR decomposition with NumPy > works fine. > > I believe you're right about that, giving your proficiency in the subject, > just want to learn and maybe fix my input > data, if it's possible. > I don't know if that is right. However, the sequential and parallel algorithms agree on both the initial residual (so that parallel matrix and rhs appear correct) and the first iterate. Divergence of the second iterate could still be a bug in our code, but it was harder for me to see how. The real thing to do, which should not be that much work but I don't have time for now unfortunately, is to step through the algorithm in serial and parallel and see what number changes. The algorithm only has 20 or so steps per iterate, so this would probably take one day to do right. Thanks, Matt > Thanks, > Florian > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From k_burkart at yahoo.com Wed Oct 4 09:23:23 2017 From: k_burkart at yahoo.com (Klaus Burkart) Date: Wed, 4 Oct 2017 14:23:23 +0000 (UTC) Subject: [petsc-users] How to interface with PETSc from another application? References: <1533387941.1259180.1507127003520.ref@mail.yahoo.com> Message-ID: <1533387941.1259180.1507127003520@mail.yahoo.com> What's the concept to interface with PETSc from another application to solve a linear system with PETSc? The standard procedure of the job: 1: The application provides a matrix which needs to be converted and be loaded into PETSc 2: The application provides the rhs vector (containing pointers!) which needs to be loaded into PETSc 3: The linear system is to be solved using PETSc 4: The application provides the result vector x, the PETSc result needs to be copied back to the application into vector x (also expecting pointers) The problem - maybe a completely wrong approach when it comes to using PETSc: With other linear algebra libraries, I included the library functionality in the code of a new solver accessing the functionality usually via header files and created a plugin which can be called from the application when running a simulation. Even so the mixed code including PETSc code can be compiled, the bit of the plugin, interfacing with the application is broken as soon as I include more than a PETSc declaration in the mixed code. How to interface with PETSc from a software application?? (I am using c++ and Ubuntu) Klaus -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Wed Oct 4 10:17:21 2017 From: jed at jedbrown.org (Jed Brown) Date: Wed, 04 Oct 2017 09:17:21 -0600 Subject: [petsc-users] How to interface with PETSc from another application? 
In-Reply-To: <1533387941.1259180.1507127003520@mail.yahoo.com> References: <1533387941.1259180.1507127003520.ref@mail.yahoo.com> <1533387941.1259180.1507127003520@mail.yahoo.com> Message-ID: <87k20bgcxq.fsf@jedbrown.org> Klaus Burkart writes: > What's the concept to interface with PETSc from another application to solve a linear system with PETSc? > > The standard procedure of the job: > > 1: The application provides a matrix which needs to be converted and be loaded into PETSc > > 2: The application provides the rhs vector (containing pointers!) which needs to be loaded into PETSc > > 3: The linear system is to be solved using PETSc > > 4: The application provides the result vector x, the PETSc result needs to be copied back to the application into vector x (also expecting pointers) > > > The problem - maybe a completely wrong approach when it comes to using PETSc: > > With other linear algebra libraries, I included the library functionality in the code of a new solver accessing the functionality usually via header files and created a plugin which can be called from the application when running a simulation. > > Even so the mixed code including PETSc code can be compiled, the bit of the plugin, interfacing with the application is broken as soon as I include more than a PETSc declaration in the mixed code. Sounds like maybe you haven't correctly linked to the PETSc library. Sending us the commands run and output/errors would be helpful to debug. > How to interface with PETSc from a software application?? (I am using c++ and Ubuntu) > > Klaus From k_burkart at yahoo.com Wed Oct 4 11:15:35 2017 From: k_burkart at yahoo.com (Klaus Burkart) Date: Wed, 4 Oct 2017 16:15:35 +0000 (UTC) Subject: [petsc-users] How to interface with PETSc from another application? In-Reply-To: <87k20bgcxq.fsf@jedbrown.org> References: <1533387941.1259180.1507127003520.ref@mail.yahoo.com> <1533387941.1259180.1507127003520@mail.yahoo.com> <87k20bgcxq.fsf@jedbrown.org> Message-ID: <1867850615.1376253.1507133735389@mail.yahoo.com> My setup: .bashrc export PETSC_DIR=/home/klaus/OpenFOAM/klaus-5.0/petsc-3.7.6 export PETSC_ARCH=arch-linux2-c-debug export PETSC_CONFIGDIR=${PETSC_DIR}/lib/petsc make options ??? -I$(PETSC_CONFIGDIR)/conf \ ??? -I$(PETSC_DIR)/include \ ??? -I$(PETSC_DIR)/arch-linux2-c-debug/include Installation and tests worked fine. The output using:???? PetscInitialize(0,0,NULL,NULL); at the beginning and PetscFinalize(); at the end of the code section including PETSc (solver section) is: simpleFoam: symbol lookup error: /home/klaus/OpenFOAM/klaus-5.0/platforms/linux64GccDPInt32Opt/lib/libpetFoam.so: undefined symbol: PetscInitialize No simulation is triggered When I just declare Mat M; and call a function with M as a parameter which outputs "Petsc - Hello" and sets the matrix M to symmetric (using: MatSetOption(A, MAT_SYMMETRIC, PETSC_TRUE);), the execution of a simulation is triggered but MatSetOption is causing a problem, I assume because PetscInitialize is missing Petsc - Hello nonepetGMRES:? Solving for Ux, Initial residual = 1, Final residual = 1, No Iterations 0 Petsc - Hello nonepetGMRES:? Solving for Uz, Initial residual = 1, Final residual = 1, No Iterations 0 simpleFoam: symbol lookup error: /home/klaus/OpenFOAM/klaus-5.0/platforms/linux64GccDPInt32Opt/lib/libpetFoam.so: undefined symbol: MatSetOption Maybe important to know, there's no way to enter commmand line input in the terminal while a simulation is running because the application displays continuously the intermediate simulation results. 
That's why I use PetscInitialize(0,0,NULL,NULL); There's now way to provide command line input. I came back to this simple test after writing "the complete code" which showed these problems and stripped it down, step-by-step, to figure out what causes the problem i.e. everything but a declaration. Jed Brown schrieb am 17:17 Mittwoch, 4.Oktober 2017: Klaus Burkart writes: > What's the concept to interface with PETSc from another application to solve a linear system with PETSc? > > The standard procedure of the job: > > 1: The application provides a matrix which needs to be converted and be loaded into PETSc > > 2: The application provides the rhs vector (containing pointers!) which needs to be loaded into PETSc > > 3: The linear system is to be solved using PETSc > > 4: The application provides the result vector x, the PETSc result needs to be copied back to the application into vector x (also expecting pointers) > > > The problem - maybe a completely wrong approach when it comes to using PETSc: > > With other linear algebra libraries, I included the library functionality in the code of a new solver accessing the functionality usually via header files and created a plugin which can be called from the application when running a simulation. > > Even so the mixed code including PETSc code can be compiled, the bit of the plugin, interfacing with the application is broken as soon as I include more than a PETSc declaration in the mixed code. Sounds like maybe you haven't correctly linked to the PETSc library. Sending us the commands run and output/errors would be helpful to debug. > How to interface with PETSc from a software application?? (I am using c++ and Ubuntu) > > Klaus -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Wed Oct 4 11:45:55 2017 From: jed at jedbrown.org (Jed Brown) Date: Wed, 04 Oct 2017 10:45:55 -0600 Subject: [petsc-users] How to interface with PETSc from another application? In-Reply-To: <1867850615.1376253.1507133735389@mail.yahoo.com> References: <1533387941.1259180.1507127003520.ref@mail.yahoo.com> <1533387941.1259180.1507127003520@mail.yahoo.com> <87k20bgcxq.fsf@jedbrown.org> <1867850615.1376253.1507133735389@mail.yahoo.com> Message-ID: <87fuayhnek.fsf@jedbrown.org> Klaus Burkart writes: > My setup: > > .bashrc > export PETSC_DIR=/home/klaus/OpenFOAM/klaus-5.0/petsc-3.7.6 > export PETSC_ARCH=arch-linux2-c-debug > export PETSC_CONFIGDIR=${PETSC_DIR}/lib/petsc > > > make options > ??? -I$(PETSC_CONFIGDIR)/conf \ The above should not be needed. > ??? -I$(PETSC_DIR)/include \ > ??? -I$(PETSC_DIR)/arch-linux2-c-debug/include > > Installation and tests worked fine. > > The output using:???? PetscInitialize(0,0,NULL,NULL); at the beginning and PetscFinalize(); at the end of the code section including PETSc (solver section) is: > > simpleFoam: symbol lookup error: /home/klaus/OpenFOAM/klaus-5.0/platforms/linux64GccDPInt32Opt/lib/libpetFoam.so: undefined symbol: PetscInitialize How have you linked libpetFoam.so to libpetsc.so and did you use RPATH? That seems to be the problem. > No simulation is triggered > > > When I just declare Mat M; and call a function with M as a parameter which outputs "Petsc - Hello" and sets the matrix M to symmetric (using: MatSetOption(A, MAT_SYMMETRIC, PETSC_TRUE);), the execution of a simulation is triggered but MatSetOption is causing a problem, I assume because PetscInitialize is missing > > Petsc - Hello nonepetGMRES:? 
> No simulation is triggered.
>
> When I just declare Mat M; and call a function with M as a parameter which outputs "Petsc - Hello" and sets the matrix M to symmetric (using MatSetOption(A, MAT_SYMMETRIC, PETSC_TRUE);), the execution of a simulation is triggered, but MatSetOption is causing a problem, I assume because PetscInitialize is missing:
>
> Petsc - Hello nonepetGMRES: Solving for Ux, Initial residual = 1, Final residual = 1, No Iterations 0
> Petsc - Hello nonepetGMRES: Solving for Uz, Initial residual = 1, Final residual = 1, No Iterations 0
> simpleFoam: symbol lookup error: /home/klaus/OpenFOAM/klaus-5.0/platforms/linux64GccDPInt32Opt/lib/libpetFoam.so: undefined symbol: MatSetOption
>
> Maybe important to know: there's no way to enter command line input in the terminal while a simulation is running, because the application continuously displays the intermediate simulation results. That's why I use PetscInitialize(0,0,NULL,NULL); there's no way to provide command line input.

That's fine.  You can use the PETSC_OPTIONS environment variable or a
configuration file to get run-time options to PETSc.

> I came back to this simple test after writing "the complete code", which showed these problems, and stripped it down, step by step, to figure out what causes the problem, i.e. everything but a declaration.
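Concretely, the options mechanism mentioned above can be driven from the environment or from a file; a small sketch (the option names here are just examples):

    # Sketch: pass run-time options to an embedded PETSc without a command line
    export PETSC_OPTIONS="-ksp_monitor -log_view"

    # Equivalent: put the same options, one per line, in ~/.petscrc
    printf '%s\n' '-ksp_monitor' '-log_view' > ~/.petscrc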
From k_burkart at yahoo.com  Wed Oct  4 13:43:25 2017
From: k_burkart at yahoo.com (Klaus Burkart)
Date: Wed, 4 Oct 2017 18:43:25 +0000 (UTC)
Subject: [petsc-users] How to interface with PETSc from another application?
In-Reply-To: <87fuayhnek.fsf@jedbrown.org>
References: <1533387941.1259180.1507127003520.ref@mail.yahoo.com> <1533387941.1259180.1507127003520@mail.yahoo.com> <87k20bgcxq.fsf@jedbrown.org> <1867850615.1376253.1507133735389@mail.yahoo.com> <87fuayhnek.fsf@jedbrown.org>
Message-ID: <718363980.1460567.1507142605486@mail.yahoo.com>

When I link the petsc library, the application-side code is not properly compiled and the solver is not available for selection:

--> FOAM FATAL IO ERROR:
Unknown asymmetric matrix solver petGMRES

Valid asymmetric matrix solvers are :

4
(
GAMG
PBiCG
PBiCGStab
smoothSolver
)

I added the following to my makefile to link the petsc library:

    -L$(PETSC_DIR)/arch-linux2-c-debug/lib -lpetsc
From jed at jedbrown.org  Wed Oct  4 14:39:45 2017
From: jed at jedbrown.org (Jed Brown)
Date: Wed, 04 Oct 2017 13:39:45 -0600
Subject: [petsc-users] How to interface with PETSc from another application?
In-Reply-To: <718363980.1460567.1507142605486@mail.yahoo.com>
References: <1533387941.1259180.1507127003520.ref@mail.yahoo.com> <1533387941.1259180.1507127003520@mail.yahoo.com> <87k20bgcxq.fsf@jedbrown.org> <1867850615.1376253.1507133735389@mail.yahoo.com> <87fuayhnek.fsf@jedbrown.org> <718363980.1460567.1507142605486@mail.yahoo.com>
Message-ID: <87efqisnwe.fsf@jedbrown.org>

Klaus Burkart writes:

> When I link the petsc library, the application-side code is not properly compiled and the solver is not available for selection:
>
> --> FOAM FATAL IO ERROR:
> Unknown asymmetric matrix solver petGMRES

There must be some earlier error message.

> I added the following to my makefile to link the petsc library:
>
>     -L$(PETSC_DIR)/arch-linux2-c-debug/lib -lpetsc

In this case, you'd need to add that path to LD_LIBRARY_PATH so the
loader can find it.  None of these are PETSc issues, just linking
dynamic libraries.
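A sketch of that setting, using the PETSC_DIR and PETSC_ARCH from earlier in the thread:

    # Sketch: let the run-time loader search the PETSc lib directory
    export LD_LIBRARY_PATH="$PETSC_DIR/arch-linux2-c-debug/lib:$LD_LIBRARY_PATH"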
From k_burkart at yahoo.com  Wed Oct  4 15:38:56 2017
From: k_burkart at yahoo.com (Klaus Burkart)
Date: Wed, 4 Oct 2017 20:38:56 +0000 (UTC)
Subject: [petsc-users] How to interface with PETSc from another application?
In-Reply-To: <87efqisnwe.fsf@jedbrown.org>
References: <1533387941.1259180.1507127003520.ref@mail.yahoo.com> <1533387941.1259180.1507127003520@mail.yahoo.com> <87k20bgcxq.fsf@jedbrown.org> <1867850615.1376253.1507133735389@mail.yahoo.com> <87fuayhnek.fsf@jedbrown.org> <718363980.1460567.1507142605486@mail.yahoo.com> <87efqisnwe.fsf@jedbrown.org>
Message-ID: <1947060922.1545491.1507149536804@mail.yahoo.com>

Adding the path to LD_LIBRARY_PATH doesn't help; the problems remain the same. It has no effect.
From jed at jedbrown.org  Wed Oct  4 15:51:28 2017
From: jed at jedbrown.org (Jed Brown)
Date: Wed, 04 Oct 2017 14:51:28 -0600
Subject: [petsc-users] How to interface with PETSc from another application?
In-Reply-To: <1947060922.1545491.1507149536804@mail.yahoo.com>
References: <1533387941.1259180.1507127003520.ref@mail.yahoo.com> <1533387941.1259180.1507127003520@mail.yahoo.com> <87k20bgcxq.fsf@jedbrown.org> <1867850615.1376253.1507133735389@mail.yahoo.com> <87fuayhnek.fsf@jedbrown.org> <718363980.1460567.1507142605486@mail.yahoo.com> <87efqisnwe.fsf@jedbrown.org> <1947060922.1545491.1507149536804@mail.yahoo.com>
Message-ID: <8760buskkv.fsf@jedbrown.org>

Klaus Burkart writes:

> Adding the path to LD_LIBRARY_PATH doesn't help; the problems remain the same. It has no effect.

You're having a problem linking correctly from the plugin, and maybe
some error messages are being swallowed, but this is not a PETSc issue.
You can use ldd to check whether the plugin was linked correctly (it
should find libpetsc.so).
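For example (a sketch; the plugin path is taken from the errors quoted earlier, and the resolved line shown in the comment is what a correctly linked plugin would be expected to report):

    # Sketch: inspect which shared libraries the plugin resolves at load time
    ldd /home/klaus/OpenFOAM/klaus-5.0/platforms/linux64GccDPInt32Opt/lib/libpetFoam.so | grep petsc
    # expected when linking works, something like:
    #   libpetsc.so.3.7 => /home/klaus/OpenFOAM/klaus-5.0/petsc-3.7.6/arch-linux2-c-debug/lib/libpetsc.so.3.7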
From k_burkart at yahoo.com  Wed Oct  4 17:52:40 2017
From: k_burkart at yahoo.com (Klaus Burkart)
Date: Wed, 4 Oct 2017 22:52:40 +0000 (UTC)
Subject: [petsc-users] How to interface with PETSc from another application?
In-Reply-To: <8760buskkv.fsf@jedbrown.org>
References: <1533387941.1259180.1507127003520.ref@mail.yahoo.com> <1533387941.1259180.1507127003520@mail.yahoo.com> <87k20bgcxq.fsf@jedbrown.org> <1867850615.1376253.1507133735389@mail.yahoo.com> <87fuayhnek.fsf@jedbrown.org> <718363980.1460567.1507142605486@mail.yahoo.com> <87efqisnwe.fsf@jedbrown.org> <1947060922.1545491.1507149536804@mail.yahoo.com> <8760buskkv.fsf@jedbrown.org>
Message-ID: <698530422.1658909.1507157560169@mail.yahoo.com>

Thank you for the hint; the problem becomes clearer. It's looking for libpetsc.so.3.7, which is not found, but it is located in the same directory as libpetsc.so and doesn't come with the .so ending.

ldd:

    libpetsc.so.3.7 => not found
From knepley at gmail.com  Wed Oct  4 18:08:18 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Wed, 4 Oct 2017 19:08:18 -0400
Subject: [petsc-users] How to interface with PETSc from another application?
In-Reply-To: <698530422.1658909.1507157560169@mail.yahoo.com>
Message-ID:

On Wed, Oct 4, 2017 at 6:52 PM, Klaus Burkart wrote:

> Thank you for the hint; the problem becomes clearer. It's looking for libpetsc.so.3.7, which is not found, but it is located in the same directory as libpetsc.so and doesn't come with the .so ending.
>
> ldd
>
>     libpetsc.so.3.7 => not found

The build system you have is adding a spurious suffix to the library. You can either fix it, or make a link to that name in the lib directory.

   Matt

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/
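If the versioned name really were absent, the link could be created by hand as sketched below (the 3.7.6 version number is inferred from the install path quoted earlier). Note, though, that the next reply points out a normal PETSc build already provides these links.

    # Sketch: recreate the versioned soname link, only if it is genuinely missing
    cd "$PETSC_DIR/arch-linux2-c-debug/lib"
    ln -s libpetsc.so.3.7.6 libpetsc.so.3.7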
From jed at jedbrown.org  Wed Oct  4 18:14:43 2017
From: jed at jedbrown.org (Jed Brown)
Date: Wed, 04 Oct 2017 17:14:43 -0600
Subject: [petsc-users] How to interface with PETSc from another application?
Message-ID: <87376ysdy4.fsf@jedbrown.org>

Matthew Knepley writes:

> The build system you have is adding a spurious suffix to the library. You can either fix it, or make a link to that name in the lib directory.

WUT?  That file should exist in $PETSC_DIR/$PETSC_ARCH/lib/ and should
be a symlink to the real library (libpetsc.so.3.7.5 or whatever).
(libpetsc.so is also a symlink to the same place.)  Your
LD_LIBRARY_PATH is probably still not set correctly -- when you get it
right, ldd will resolve.

From mailinglists at xgm.de  Wed Oct  4 23:39:31 2017
From: mailinglists at xgm.de (Florian Lindner)
Date: Thu, 5 Oct 2017 12:39:31 +0800
Subject: [petsc-users] Mat/Vec with empty ranks
Message-ID:

On 04.10.2017 at 18:08, Matthew Knepley wrote:

> I don't know if that is right.
> However, the sequential and parallel algorithms agree on both the initial residual (so that the parallel matrix and rhs appear correct) and the first iterate. Divergence of the second iterate could still be a bug in our code, but it was harder for me to see how.
>
> The real thing to do, which should not be that much work but I don't have time for now unfortunately, is to step through the algorithm in serial and parallel and see what number changes. The algorithm only has 20 or so steps per iterate, so this would probably take one day to do right.

OK, I'll try to dig a bit into PETSc.

I worked on the cleaned-up code you gave me, ran it on 4 MPI ranks, and compared output with and without using -load.

Other options were:

-ksp_max_it 10 -ksp_view -ksp_monitor_true_residual -ksp_lsqr_monitor -ksp_view_pre -vecscatter_view

All on the maint branch.

Starting from lsqr.c, I identified values that start to differ after KSP_MatMultTranspose(ksp,Amat,U1,V1);

With -load (converging), V1 has the value:

Vec Object: 4 MPI processes
  type: mpi
Process [0]
-0.544245
Process [1]
1.11245
Process [2]
-1.25846
Process [3]

Without -load:

Vec Object: 4 MPI processes
  type: mpi
Process [0]
0.316288
Process [1]
2.85233
Process [2]
-0.776467
Process [3]

The other input values are the same.

I tracked it further down to MatMultTranspose_MPIDense in mpidense.c, where the value of yy starts to differ after the VecScatterBegin/End. At this place a->lvec, the scatter source, also differs, whereas Mat A is identical (judging by MatView output).

However, I have no idea where a->lvec (which is A->data->lvec) is filled.

I hope that helps a bit.

Best,
Florian

From l.verzeroli at studenti.unibg.it  Thu Oct  5 01:59:28 2017
From: l.verzeroli at studenti.unibg.it (Luca Verzeroli)
Date: Thu, 5 Oct 2017 08:59:28 +0200
Subject: [petsc-users] New nonzero caused a malloc
Message-ID:

Good morning,

I'm new to PETSc and I'm wondering about this problem.

When I run my code with 2 processes I have no problem. When I use more than 2 processes I receive this message:

[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: Argument out of range
[1]PETSC ERROR: New nonzero at (0,43) caused a malloc
Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check

Does this mean that a process is trying to write in cell (0,43)?

Then I tried with MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) and checked with MatView whether there's an element at (0,43), but nothing is written there. I have also checked the indices I use to insert elements into the matrix, but no (0,43) is present.

Could you give me some advice about the possible solution of this problem?

Luca

From bsmith at mcs.anl.gov  Thu Oct  5 02:11:19 2017
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Thu, 5 Oct 2017 09:11:19 +0200
Subject: [petsc-users] Mat/Vec with empty ranks
Message-ID: <87CA14CB-342E-407E-AF92-EE960514EC24@mcs.anl.gov>

Florian,

Thanks for reporting the problem. It is a serious bug in PETSc with dense matrices. Here is my proposed fix:

https://bitbucket.org/petsc/petsc/pull-requests/764/fix-bug-in-sequential-dense-multiply-and/diff

Barry

> On Oct 5, 2017, at 6:39 AM, Florian Lindner wrote:
>
> I tracked it further down to MatMultTranspose_MPIDense in mpidense.c, where the value of yy starts to differ after the VecScatterBegin/End.
From bsmith at mcs.anl.gov  Thu Oct  5 03:47:11 2017
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Thu, 5 Oct 2017 10:47:11 +0200
Subject: [petsc-users] New nonzero caused a malloc
Message-ID: <80673F68-2084-4423-897F-99D89F4EB160@mcs.anl.gov>

Please always, always, always include the entire error message, not just part of it.

> On Oct 5, 2017, at 8:59 AM, Luca Verzeroli wrote:
>
> Does this mean that a process is trying to write in cell (0,43)?

It may not be at the global location (0,43); if we had the full error message we would know more. But at some location it is trying to put in a value, and not enough space has been preallocated.

> Could you give me some advice about the possible solution of this problem?

Please read up on the whole business of preallocation:
http://www.mcs.anl.gov/petsc/documentation/faq.html#efficient-assembly
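As a rough illustration of what that FAQ entry asks for, preallocation of an AIJ matrix looks something like the sketch below. The global size and per-row counts are placeholders (not taken from Luca's code), and error checking is omitted for brevity.

    #include <petscmat.h>

    /* Sketch: preallocate enough nonzeros per row so MatSetValues never mallocs */
    Mat      A;
    PetscInt n = 100;                 /* global size, placeholder */

    MatCreate(PETSC_COMM_WORLD,&A);
    MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,n,n);
    MatSetFromOptions(A);
    MatSeqAIJSetPreallocation(A,7,NULL);         /* uniprocess: up to 7 nonzeros/row */
    MatMPIAIJSetPreallocation(A,7,NULL,3,NULL);  /* parallel: diagonal/off-diagonal blocks */
    /* ... MatSetValues(...); MatAssemblyBegin/End(...); ... */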
From knepley at gmail.com  Thu Oct  5 03:47:38 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Thu, 5 Oct 2017 04:47:38 -0400
Subject: [petsc-users] New nonzero caused a malloc
Message-ID:

On Thu, Oct 5, 2017 at 2:59 AM, Luca Verzeroli wrote:

> [1]PETSC ERROR: New nonzero at (0,43) caused a malloc
> [...]
> I have also checked the indices I use to insert elements into the matrix, but no (0,43) is present.

Unfortunately, the error reporting here is substandard and we should fix it. You have exceeded the preallocation on process 1, but it is caught in the code for the serial matrix for the diagonal block, so this means the 1st row on proc 1, and column 43 relative to the first column.

  Thanks,

     Matt

From stormweiner at berkeley.edu  Thu Oct  5 15:56:18 2017
From: stormweiner at berkeley.edu (Storm Weiner)
Date: Thu, 5 Oct 2017 13:56:18 -0700
Subject: [petsc-users] Reading in a matrix from ASCII file
Message-ID:

Hey there,

I'm working through some basic examples, and I want to read in a test matrix I have stored in a (row, column, value) ASCII file. How can I read this in using PETSc?

I found the routines PetscViewerASCIIOpen and PetscViewerASCIIRead, but I'm not sure how to tell them that what I have is actually a matrix. In this example, I know the matrix, so I can specify how many entries to read, but is there a way to just read "the whole file"?

What's the preferred PETSc way to do this?

I'm using F90, by the way.

-Storm

From jed at jedbrown.org  Thu Oct  5 16:10:02 2017
From: jed at jedbrown.org (Jed Brown)
Date: Thu, 05 Oct 2017 15:10:02 -0600
Subject: [petsc-users] Reading in a matrix from ASCII file
Message-ID: <87h8vdjo7p.fsf@jedbrown.org>

http://www.mcs.anl.gov/petsc/documentation/faq.html#sparse-matrix-ascii-format

Better, read it using Python or MATLAB and write it (using the provided
PETSc scripts) in PETSc binary format.  Parallel IO with ASCII files is
a dead end.
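To make that suggestion concrete, the sketch below hand-writes the documented PETSc binary sparse-matrix layout (classid 1211216, then rows/cols/nnz, nonzeros per row, column indices, values, all big-endian) from a 0-based (row, column, value) text file. The file names are made up for illustration; the supported route is the PetscBinaryIO.py script shipped with PETSc.

    import numpy as np

    # Sketch: convert "row col value" ASCII (0-based indices) to PETSc binary,
    # so the result can be read with MatLoad.
    rows, cols, vals = np.loadtxt("testmat.txt", unpack=True)
    rows = rows.astype(np.int32); cols = cols.astype(np.int32)
    m, n = rows.max() + 1, cols.max() + 1

    order = np.lexsort((cols, rows))          # CSR wants row-major ordering
    rows, cols, vals = rows[order], cols[order], vals[order]
    nnz_per_row = np.bincount(rows, minlength=m)

    with open("testmat.dat", "wb") as f:      # PETSc binary files are big-endian
        np.array([1211216, m, n, vals.size], dtype=">i4").tofile(f)  # classid, m, n, nnz
        nnz_per_row.astype(">i4").tofile(f)
        cols.astype(">i4").tofile(f)
        vals.astype(">f8").tofile(f)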
From rchurchi at pppl.gov  Thu Oct  5 16:48:16 2017
From: rchurchi at pppl.gov (Randy Michael Churchill)
Date: Thu, 5 Oct 2017 17:48:16 -0400
Subject: [petsc-users] TAO setup with modules in Fortran 90
Message-ID:

A simple setup question with TAO: if I were to convert the rosenbrock1f.F90 example to use a module instead of common structures, how would I set up the include statements? I've tried various combinations (using petscXXXdef.h, petscXXX.h, petscXXX.h90, along with use petscXXX), but seem to get errors with all.

file rosenbrock1f.h:

module commondat
  PetscReal :: alpha
  PetscInt :: n
end module commondat

file rosenbrock1f.F90:

program rosenbrock1f
!!include statements??? which and where???!!!
use commondat
...

subroutine FormFunctionGradient(tao, X, f, G, dummy, ierr)
use commondat
implicit none
...

(https://www.mcs.anl.gov/petsc/petsc-dev/src/tao/unconstrained/examples/tutorials/rosenbrock1f.F90.html)

From jed at jedbrown.org  Thu Oct  5 17:21:17 2017
From: jed at jedbrown.org (Jed Brown)
Date: Thu, 05 Oct 2017 16:21:17 -0600
Subject: [petsc-users] Reading in a matrix from ASCII file
In-Reply-To: <87h8vdjo7p.fsf@jedbrown.org>
References: <87h8vdjo7p.fsf@jedbrown.org>
Message-ID: <874lrdjkwy.fsf@jedbrown.org>

Please always use "reply-all" so that your messages go to the list.
This is standard mailing list etiquette.  It is important to preserve
threading for people who find this discussion later and so that we do
not waste our time re-answering the same questions that have already
been answered in private side-conversations.  You'll likely get an
answer faster that way too.

Storm Weiner writes:

> Thanks,
>
> I decided to use one of the example matrices provided on the ftp to start with, but now I have another issue. I tried to port /mat/ex1.c to f90, but when I do
>
> CALL PetscViewerBinaryOpen(PETSC_COMM_WORLD,DataFile,fd,ierr)
>
> I get a bad file descriptor error.
>
> A simple open()/close() check immediately before PetscViewerBinaryOpen() passes, so I know I'm using an appropriate file path.
>
> Furthermore, the unaltered ex1.c works fine with the file I'm attempting. Is there something special about petsc+f90 file descriptors, or did I mess up somewhere along the way?

No, but Fortran doesn't warn when you pass the wrong number of
parameters.  You need this.

diff --git i/ex1.F90 w/ex1.F90
index 049e6ae..311d6f0 100644
--- i/ex1.F90
+++ w/ex1.F90
@@ -25,7 +25,7 @@
       close(10)

       write(*,*) "before viewer create"
-      CALL PetscViewerBinaryOpen(PETSC_COMM_WORLD,DataFile,fd,ierr);CHKERRA(ierr)
+      CALL PetscViewerBinaryOpen(PETSC_COMM_WORLD,DataFile,FILE_MODE_READ,fd,ierr);CHKERRA(ierr)


 !! Load the matrix; then destroy the viewer.
> The complete error message is:
>
> $ ./ex1 -f $PWD/testmat
> before initialize
> DataFile= /usr/workspace/wsa/weiner6/MPI_sandbox/Minimal/testmat
>
> before test open/close
> before viewer create
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [0]PETSC ERROR:       INSTEAD the line number of the start of the function is given.
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Signal received
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.8.0, unknown
> [0]PETSC ERROR: ./ex1 on a arch-linux2-c-debug named borax2 by weiner6 Thu Oct  5 14:48:33 2017
> [0]PETSC ERROR: Configure options --with-scalar-type=complex --with-valgrind-dir=/usr/bin/
> [0]PETSC ERROR: #1 User provided function() line 0 in unknown file
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
> [unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=59
> :
> system msg for write_line failure : Bad file descriptor
>
> -Storm
>
> P.S. Should I be using reply-all in these email threads?

> Program Main
>   use PETSCMAT
>   implicit NONE
> #include <petsc/finclude/petscmat.h>
>
>   Mat             :: A
>   PetscViewer     :: fd
>   character(100)  :: DataFile    ! input file name
>   IS              :: isrow,iscol ! row and column permutations
>   PetscErrorCode  :: ierr
>   MatOrderingType :: rtype = MATORDERINGRCM
>   PetscBool       :: flg,PetscPreLoad = PETSC_FALSE
>
>   write(*,*) "before initialize"
>   call PETSCinitialize(PETSC_NULL_CHARACTER,ierr)
>
>   call PetscOptionsGetString(petsc_NULL_OPTIONS,PETSC_NULL_CHARACTER,"-f",DataFile,flg,ierr)
>   CHKERRA(ierr)
>   write(*,*) "DataFile=",DataFile
>   !! Open ascii file. Note that we use FILE_MODE_READ to indicate
>   !! reading from this file.
>   write(*,*) "before test open/close"
>   open(10, FILE=DataFile)
>   close(10)
>
>   write(*,*) "before viewer create"
>   CALL PetscViewerBinaryOpen(PETSC_COMM_WORLD,DataFile,fd,ierr);CHKERRA(ierr)
>
>   !! Load the matrix; then destroy the viewer.
>   write(*,*) "before matcreate"
>   call MatCreate(PETSC_COMM_WORLD,A,ierr);CHKERRA(ierr);
>   write(*,*) "before matsettype"
>   call MatSetType(A,MATSEQAIJ,ierr);CHKERRA(ierr);
>   write(*,*) "before matload"
>   call MatLoad(A,fd,ierr);CHKERRA(ierr);
>   write(*,*) "before destroyviewer"
>   call PetscViewerDestroy(fd,ierr);CHKERRA(ierr);
>
>   call MatGetOrdering(A,rtype,isrow,iscol,ierr);CHKERRA(ierr);
>
>   !! All PETSc objects should be destroyed when they are no longer needed.
>   call MatDestroy(A,ierr);CHKERRA(ierr);
>   call ISDestroy(isrow,ierr);CHKERRA(ierr);
>   call ISDestroy(iscol,ierr);CHKERRA(ierr);
>   write(*,*) isrow,iscol
>   call PetscFinalize(ierr)
>   stop
> end program Main

From stormweiner at berkeley.edu  Thu Oct  5 17:43:54 2017
From: stormweiner at berkeley.edu (Storm Weiner)
Date: Thu, 5 Oct 2017 15:43:54 -0700
Subject: [petsc-users] Reading in a matrix from ASCII file
In-Reply-To: <874lrdjkwy.fsf@jedbrown.org>
References: <87h8vdjo7p.fsf@jedbrown.org> <874lrdjkwy.fsf@jedbrown.org>
Message-ID:

Ahh, of course. Thanks! I had switched to PetscViewerASCIIOpen and then back to PetscViewerBinaryOpen and forgot to include the file mode.

I appreciate the rapid and useful response!

-Storm
> > > > The complete error message is: > > > > > > $ ./ex1 -f $PWD/testmat > > before initialize > > DataFile= > > /usr/workspace/wsa/weiner6/MPI_sandbox/Minimal/testmat > > > > > > before test open/close > > before viewer create > > [0]PETSC ERROR: > > ------------------------------------------------------------------------ > > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, > > probably memory access out of range > > [0]PETSC ERROR: Try option -start_in_debugger or > -on_error_attach_debugger > > [0]PETSC ERROR: or see > > http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind > > [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac > OS X > > to find memory corruption errors > > [0]PETSC ERROR: likely location of problem given in stack below > > [0]PETSC ERROR: --------------------- Stack Frames > > ------------------------------------ > > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not > available, > > [0]PETSC ERROR: INSTEAD the line number of the start of the > function > > [0]PETSC ERROR: is given. > > [0]PETSC ERROR: --------------------- Error Message > > -------------------------------------------------------------- > > [0]PETSC ERROR: Signal received > > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html > for > > trouble shooting. > > [0]PETSC ERROR: Petsc Release Version 3.8.0, unknown > > [0]PETSC ERROR: ./ex1 on a arch-linux2-c-debug named borax2 by weiner6 > Thu > > Oct 5 14:48:33 2017 > > [0]PETSC ERROR: Configure options --with-scalar-type=complex > > --with-valgrind-dir=/usr/bin/ > > [0]PETSC ERROR: #1 User provided function() line 0 in unknown file > > application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 > > [unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=59 > > : > > system msg for write_line failure : Bad file descriptor > > > > -Storm > > > > P.S. Should I be using reply-all to in these email threads? > > > > On Thu, Oct 5, 2017 at 2:10 PM, Jed Brown wrote: > > > >> http://www.mcs.anl.gov/petsc/documentation/faq.html#sparse- > >> matrix-ascii-format > >> > >> Better, read it using Python on MATLAB and write it (using the provided > >> PETSc scripts) in PETSc binary format. Parallel IO with ASCII files is > >> a dead end. > >> > >> Storm Weiner writes: > >> > >> > Hey there, > >> > > >> > I'm working through some basic examples, and I want to read in a test > >> > matrix I have stored in a (row, column, value) ASCII file. How can I > >> read > >> > this in using PETSc? > >> > > >> > > >> > I found the routine PetscViewerASCIIOpen and PetscViewerASCIIRead but > >> I'm > >> > not sure how to tell it that what I have is actually a matrix. In > this > >> > example, I know the matrix, so I can specify how many entries to read, > >> but > >> > is there a way to just read "the whole file"? > >> > > >> > What's the preferred PETSc way to do this? > >> > > >> > I'm using F90 by the way. > >> > > >> > > >> > -Storm > >> > > Program Main > > use PETSCMAT > > implicit NONE > > #include > > > > > > Mat :: A > > PetscViewer :: fd > > character(100) :: DataFile ! input file name > > IS :: isrow,iscol ! 
row and column > permutations > > PetscErrorCode :: ierr > > MatOrderingType :: rtype = MATORDERINGRCM > > PetscBool :: flg,PetscPreLoad = PETSC_FALSE > > > > write(*,*) "before initialize" > > call PETSCinitialize(PETSC_NULL_CHARACTER,ierr) > > > > call PetscOptionsGetString(petsc_NULL_OPTIONS,PETSC_NULL_ > CHARACTER,"-f",DataFile,flg,ierr) > > CHKERRA(ierr) > > write(*,*) "DataFile=",DataFile > > !!Open ascii file. Note that we use FILE_MODE_READ to indicate > > !!reading from this file. > > write(*,*) "before test open/close" > > open(10, FILE=DataFile) > > close(10) > > > > write(*,*) "before viewer create" > > CALL PetscViewerBinaryOpen(PETSC_COMM_WORLD,DataFile,fd,ierr); > CHKERRA(ierr) > > > > > > !! Load the matrix; then destroy the viewer. > > write(*,*) "before matcreate" > > call MatCreate(PETSC_COMM_WORLD,A,ierr);CHKERRA(ierr); > > write(*,*) "before matsettype" > > call MatSetType(A,MATSEQAIJ,ierr);CHKERRA(ierr); > > write(*,*) "before matload" > > call MatLoad(A,fd,ierr);CHKERRA(ierr); > > write(*,*) "before destroyviewer" > > call PetscViewerDestroy(fd,ierr);CHKERRA(ierr); > > > > > > > > call MatGetOrdering(A,rtype,isrow,iscol,ierr);CHKERRA(ierr); > > > > > > !! All PETSc objects should be destroyed when they are no longer > needed. > > call MatDestroy(A,ierr);CHKERRA(ierr); > > call ISDestroy(isrow,ierr);CHKERRA(ierr); > > call ISDestroy(iscol,ierr);CHKERRA(ierr); > > write(*,*) isrow,iscol > > call PetscFinalize(ierr) > > stop > > end program Main > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Fri Oct 6 06:36:58 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 6 Oct 2017 13:36:58 +0200 Subject: [petsc-users] TAO setup with modules in Fortran 90 In-Reply-To: References: Message-ID: <88546AF0-385D-44DA-965D-169A6DBAA523@mcs.anl.gov> Randy, First you absolutely must use version 3.8 or the master development copy. We improved and simplified dramatically how Fortran (90) is utilized from PETSc. Note that there is only one simple set of include files and modules for Fortran; see the newest documentation. Barry > On Oct 5, 2017, at 11:48 PM, Randy Michael Churchill wrote: > > A simple setup question with TAO: if I were to convert the rosenbrock1f.F90 example to use a module instead of common structures, how would I setup the include statements? I've tried various combinations (using petscXXXdef.h, petscXXX.h, petscXXX.h90, along with use petscXXX), but seem to get errors with all. > > file:rosenbrock1f.h: > module commondat > PetscReal :: alpha > PetscInt :: n > end module commondat > > file:rosenbrock1f.90: > program rosenbrock1f > !!include statements??? which and where???!!! > use commondat > ... > > subroutine FormFunctionGradient(tao, X, f, G, dummy, ierr) > use commondat > implicit none > ... > > (https://www.mcs.anl.gov/petsc/petsc-dev/src/tao/unconstrained/examples/tutorials/rosenbrock1f.F90.html) From knepley at gmail.com Fri Oct 6 06:54:51 2017 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 6 Oct 2017 07:54:51 -0400 Subject: [petsc-users] TAO setup with modules in Fortran 90 In-Reply-To: <88546AF0-385D-44DA-965D-169A6DBAA523@mcs.anl.gov> References: <88546AF0-385D-44DA-965D-169A6DBAA523@mcs.anl.gov> Message-ID: On Fri, Oct 6, 2017 at 7:36 AM, Barry Smith wrote: > > Randy, > > First you absolutely must use version 3.8 or the master development > copy. We improved and simplified dramatically how Fortran (90) is utilized > from PETSc. 
> > Note that there is only one simple set of include files and modules > for Fortran; see the newest documentation. > > http://www.mcs.anl.gov/petsc/petsc-master/docs/manualpages/Sys/UsingFortran.html Matt > > Barry > > > > On Oct 5, 2017, at 11:48 PM, Randy Michael Churchill > wrote: > > > > A simple setup question with TAO: if I were to convert the > rosenbrock1f.F90 example to use a module instead of common structures, how > would I setup the include statements? I've tried various combinations > (using petscXXXdef.h, petscXXX.h, petscXXX.h90, along with use petscXXX), > but seem to get errors with all. > > > > file:rosenbrock1f.h: > > module commondat > > PetscReal :: alpha > > PetscInt :: n > > end module commondat > > > > file:rosenbrock1f.90: > > program rosenbrock1f > > !!include statements??? which and where???!!! > > use commondat > > ... > > > > subroutine FormFunctionGradient(tao, X, f, G, dummy, ierr) > > use commondat > > implicit none > > ... > > > > (https://www.mcs.anl.gov/petsc/petsc-dev/src/tao/unconstrained/examples/tutorials/rosenbrock1f.F90.html) > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed... URL:
From t.appel17 at imperial.ac.uk Fri Oct 6 09:08:35 2017 From: t.appel17 at imperial.ac.uk (Thibaut Appel) Date: Fri, 6 Oct 2017 15:08:35 +0100 Subject: [petsc-users] Preallocation (dnz, onz arrays) in sparse parallel matrix Message-ID:
Dear PETSc users,
I am trying to assemble a sparse matrix in parallel where my main objective is efficiency and scalability. Precisely, I am using MatMPIAIJSetPreallocation with diagonal entries (dnz) and off-diagonal entries (onz) _arrays_ (non zero elements for each rows) to allocate the memory needed.
Prior to the insertion of the elements in the matrix, I am doing a primary loop to determine those arrays dnz and onz for each processor owning its own set of rows. Ideally, this loop would look like
for irow=istart, iend-1, i++ ----> count dnz(irow) and onz(irow)
But it seems that you cannot call MatGetOwnershipRange(Mat,istart,iend,ierr) before MatMPIAIJSetPreallocation to get istart and iend. Why is that? Which optimal approach should be followed to count your non-zero elements for each processor? I saw two conversations where Barry Smith suggested the use of MatPreallocateInitialize/Finalize or PetscSplitOwnership, which means you have to determine yourself the rows owned by each processor? Is that not contrary to the "PETSc spirit"?
Thanks for your help and have a nice weekend
Thibaut
-------------- next part --------------
An HTML attachment was scrubbed... URL:
From lvella at gmail.com Fri Oct 6 11:20:18 2017 From: lvella at gmail.com (Lucas Clemente Vella) Date: Fri, 6 Oct 2017 13:20:18 -0300 Subject: [petsc-users] A really huge hash is being requested. Message-ID:
Hi. I am trying to assemble a big matrix, but PETSc fail with the following message (this was executed with 5 processes):
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Argument out of range
[0]PETSC ERROR: A really huge hash is being requested.. cannot process: 3418625
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.7.5, Jan, 01, 2017 [0]PETSC ERROR: /home/lvella/src/cyberex/cyberex-3d on a x86_64 named r1i1n4 by lvella Thu Oct 5 19:31:29 2017 [0]PETSC ERROR: Configure options --prefix=/opt/sw/petsc/3.7.5-gcc4.9.1-openmpi1.8.4 --PETSC_ARCH=x86_64 --with-mpi=1 --with-mpi-dir=/opt/sw/openmpi/1.8.4-gcc4.9.1 --with-hwloc=1 --with-hwloc-dir=/opt/sw/hwloc/1.6.1 --with-hdf5=1 --with-h df5-dir=/opt/sw/hdf5/1.8.14-gcc4.9.1-openmpi1.8.4 F77=mpif77 F90=mpif90 --with-shared-libraries=0 --download-f2cblaslapack=1 --download-hdf5=0 --with-clanguage=C --with-c++-support --with-x=0 --with-debubbing=yes --download-hypre=1 [0]PETSC ERROR: #1 PetscTableCreateHashSize() line 28 in /home/rsaramago/build-pkgs/petsc-3.7.5/src/sys/utils/ctable.c [0]PETSC ERROR: ------------------------------------------------------------------------ What caused the problem and how to handle it? Thanks, -- Lucas Clemente Vella lvella at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Fri Oct 6 11:22:40 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Fri, 6 Oct 2017 11:22:40 -0500 Subject: [petsc-users] A really huge hash is being requested. In-Reply-To: References: Message-ID: upgrade to petsc version 3.7.7 or 3.8 Satish On Fri, 6 Oct 2017, Lucas Clemente Vella wrote: > Hi. > > I am trying to assemble a big matrix, but PETSc fail with the following > message (this was executed with 5 processes): > > [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > [0]PETSC ERROR: Argument out of range > [0]PETSC ERROR: A really huge hash is being requested.. cannot process: > 3418625 > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for > trouble shooting. > [0]PETSC ERROR: Petsc Release Version 3.7.5, Jan, 01, 2017 > [0]PETSC ERROR: /home/lvella/src/cyberex/cyberex-3d on a x86_64 named > r1i1n4 by lvella Thu Oct 5 19:31:29 2017 > [0]PETSC ERROR: Configure options > --prefix=/opt/sw/petsc/3.7.5-gcc4.9.1-openmpi1.8.4 --PETSC_ARCH=x86_64 > --with-mpi=1 --with-mpi-dir=/opt/sw/openmpi/1.8.4-gcc4.9.1 --with-hwloc=1 > --with-hwloc-dir=/opt/sw/hwloc/1.6.1 --with-hdf5=1 --with-h > df5-dir=/opt/sw/hdf5/1.8.14-gcc4.9.1-openmpi1.8.4 F77=mpif77 F90=mpif90 > --with-shared-libraries=0 --download-f2cblaslapack=1 --download-hdf5=0 > --with-clanguage=C --with-c++-support --with-x=0 --with-debubbing=yes > --download-hypre=1 > [0]PETSC ERROR: #1 PetscTableCreateHashSize() line 28 in > /home/rsaramago/build-pkgs/petsc-3.7.5/src/sys/utils/ctable.c > [0]PETSC ERROR: > ------------------------------------------------------------------------ > > What caused the problem and how to handle it? > > Thanks, > > -- > Lucas Clemente Vella > lvella at gmail.com > From bsmith at mcs.anl.gov Fri Oct 6 14:57:13 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 6 Oct 2017 21:57:13 +0200 Subject: [petsc-users] Preallocation (dnz, onz arrays) in sparse parallel matrix In-Reply-To: References: Message-ID: > On Oct 6, 2017, at 4:08 PM, Thibaut Appel wrote: > > Dear PETSc users, > > I am trying to assemble a sparse matrix in parallel where my main objective is efficiency and scalability. > > Precisely, I am using MatMPIAIJSetPreallocation with diagonal entries (dnz) and off-diagonal entries (onz) arrays (non zero elements for each rows) to allocate the memory needed. 
> > Prior to the insertion of the elements in the matrix, I am doing a primary loop to determine those arrays dnz and onz for each processor owning its own set of rows. Ideally, this loop would look like > > for irow=istart, iend-1, i++ ----> count dnz(irow) and onz(irow) > But it seems that you cannot call MatGetOwnershipRange(Mat,istart,iend,ierr) before MatMPIAIJSetPreallocation to get istart and iend. Why is that? > Which optimal approach should be followed to count your non-zero elements for each processor? I saw two conversations where Barry Smith suggested the use of MatPreallocateInitialize/Finalize or PetscSplitOwnership, which means you have to determine yourself the rows owned by each processor? Is that not contrary to the "PETSc spirit"?

Use PetscSplitOwnership() to determine the ownerships.

The reason for not being able to use MatGetOwnershipRange() before setting the preallocation is just because of the design of the data constructor for the matrix class. You are correct that it might be possible to refactor the code to have more steps in the constructor allowing the call to MatGetOwnershipRange().

We welcome pull requests but are unlikely to make the change ourselves, the reason is that normally one is working with a mesh data structure (often a DM) that provides (based on its decomposition) the ownership ranges (rather than just having the matrix decide on it) and hence one does not normally call the MatSetSizes() allowing the matrix to determine the ownership ranges.

Barry

> Thanks for your help and have a nice weekend > > Thibaut
From rchurchi at pppl.gov Fri Oct 6 16:18:03 2017 From: rchurchi at pppl.gov (Randy Michael Churchill) Date: Fri, 6 Oct 2017 17:18:03 -0400 Subject: [petsc-users] TAO setup with modules in Fortran 90 In-Reply-To: References: <88546AF0-385D-44DA-965D-169A6DBAA523@mcs.anl.gov> Message-ID:
So if I'm limited to petsc 3.7.6 for reasons of eventually using within an existing, larger codebase that depends on 3.7.6, is it possible to use TAO with a user-defined module in Fortran90 using 3.7.6?
I had tried the various forms of includes listed in the documentation, e.g. see below. I think I now realize this is an issue with the petsc installation on Edison, it does not seem to have the petsctao module in the library file (confirmed using nm -D on the library file). If I do the same include and use statement but with, for example, petscmat, it compiles fine.
I built v3.8 from source, and the petsctao module is in the library file, and now the make works.
commondat.F90 module commondat #include use petsc PetscReal :: alpha PetscInt :: n end module commondat
program rosenbrock1f #include use petsctao use commondat
On Fri, Oct 6, 2017 at 7:54 AM, Matthew Knepley wrote: > On Fri, Oct 6, 2017 at 7:36 AM, Barry Smith wrote: > >> >> Randy, >> >> First you absolutely must use version 3.8 or the master development >> copy. We improved and simplified dramatically how Fortran (90) is utilized >> from PETSc. >> >> Note that there is only one simple set of include files and modules >> for Fortran; see the newest documentation. >> >> > http://www.mcs.anl.gov/petsc/petsc-master/docs/manualpages/ > Sys/UsingFortran.html > > Matt > > >> >> Barry >> >> >> > On Oct 5, 2017, at 11:48 PM, Randy Michael Churchill >> wrote: >> > >> > A simple setup question with TAO: if I were to convert the >> rosenbrock1f.F90 example to use a module instead of common structures, how >> would I setup the include statements? 
I've tried various combinations >> (using petscXXXdef.h, petscXXX.h, petscXXX.h90, along with use petscXXX), >> but seem to get errors with all. >> > >> > file:rosenbrock1f.h: >> > module commondat >> > PetscReal :: alpha >> > PetscInt :: n >> > end module commondat >> > >> > file:rosenbrock1f.90: >> > program rosenbrock1f >> > !!include statements??? which and where???!!! >> > use commondat >> > ... >> > >> > subroutine FormFunctionGradient(tao, X, f, G, dummy, ierr) >> > use commondat >> > implicit none >> > ... >> > >> > (https://www.mcs.anl.gov/petsc/petsc-dev/src/tao/unconstrain >> ed/examples/tutorials/rosenbrock1f.F90.html) >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > -- R. Michael Churchill -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Fri Oct 6 16:33:28 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Fri, 6 Oct 2017 16:33:28 -0500 Subject: [petsc-users] TAO setup with modules in Fortran 90 In-Reply-To: References: <88546AF0-385D-44DA-965D-169A6DBAA523@mcs.anl.gov> Message-ID: Here is the change to petsc-3.7 for it to work without 'use petsctao' -i.e only using include files. [this works as long as you only need stuff from petscdef.h in your module - and not the parameters defined in petsc.h] Satish ------- balay at asterix /home/balay/tmp/petsc/src/tao/unconstrained/examples/tutorials (maint-3.7 *=) $ git diff diff --git a/src/tao/unconstrained/examples/tutorials/rosenbrock1f.F b/src/tao/unconstrained/examples/tutorials/rosenbrock1f.F index 59c467714d..d160d540c0 100644 --- a/src/tao/unconstrained/examples/tutorials/rosenbrock1f.F +++ b/src/tao/unconstrained/examples/tutorials/rosenbrock1f.F @@ -22,9 +22,16 @@ ! ---------------------------------------------------------------------- ! +#include "petsc/finclude/petscdef.h" + module commondat + PetscReal :: alpha + PetscInt :: n + end module commondat + + program main + use commondat implicit none - -#include "rosenbrock1f.h" +#include "petsc/finclude/petsc.h" ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - ! Variable declarations @@ -144,10 +151,9 @@ ! f - function value subroutine FormFunctionGradient(tao, X, f, G, dummy, ierr) + use commondat implicit none - -! n,alpha defined in rosenbrock1f.h -#include "rosenbrock1f.h" +#include "petsc/finclude/petsc.h" Tao tao Vec X,G @@ -216,9 +222,9 @@ ! require this matrix. subroutine FormHessian(tao,X,H,PrecH,dummy,ierr) + use commondat implicit none - -#include "rosenbrock1f.h" +#include "petsc/finclude/petsc.h" ! 
Input/output variables: Tao tao balay at asterix /home/balay/tmp/petsc/src/tao/unconstrained/examples/tutorials (maint-3.7 *=) $ make rosenbrock1 mpicc -o rosenbrock1.o -c -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 -I/home/balay/tmp/petsc/include -I/home/balay/tmp/petsc/arch-linux2-c-debug/include `pwd`/rosenbrock1.c mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 -o rosenbrock1 rosenbrock1.o -Wl,-rpath,/home/balay/tmp/petsc/arch-linux2-c-debug/lib -L/home/balay/tmp/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/balay/soft/mpich-3.3a2/lib -L/home/balay/soft/mpich-3.3a2/lib -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/7 -L/usr/lib/gcc/x86_64-redhat-linux/7 -lpetsc -llapack -lblas -lX11 -lpthread -lm -lmpifort -lgfortran -lm -lgfortran -lm -lquadmath -lmpicxx -lstdc++ -lm -Wl,-rpath,/home/balay/soft/mpich-3.3a2/lib -L/home/balay/soft/mpich-3.3a2/lib -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/7 -L/usr/lib/gcc/x86_64-redhat-linux/7 -ldl -Wl,-rpath,/home/balay/soft/mpich-3.3a2/lib -lmpi -lgcc_s -ldl /usr/bin/rm -f rosenbrock1.o balay at asterix /home/balay/tmp/petsc/src/tao/unconstrained/examples/tutorials (maint-3.7 *=) $ make runrosenbrock1 balay at asterix /home/balay/tmp/petsc/src/tao/unconstrained/examples/tutorials (maint-3.7 *=) $ On Fri, 6 Oct 2017, Randy Michael Churchill wrote: > So if I'm limited to petsc 3.7.6 for reasons of eventually using within an > existing, larger codebase that depends on 3.7.6, is it possible to use TAO > with a user-defined module in Fortran90 using 3.7.6? > > I had tried the various forms of includes listed in the documentation, e.g. > see below. I think I now realize this is an issue with the petsc > installation on Edison, it does not seem to have the petsctao module in the > library file (confirmed using nm -D on the library file). If I do the same > include and use statement but with, for example, petscmat, it compiles > fine. > > I built v3.8 from source, and the petsctao module is in the library file, > and now the make works. > > commondat.F90 > module commondat > #include > use petsc > PetscReal :: alpha > PetscInt :: n > end module commondat > > program rosenbrock1f > #include > use petsctao > use commondat > > > > > On Fri, Oct 6, 2017 at 7:54 AM, Matthew Knepley wrote: > > > On Fri, Oct 6, 2017 at 7:36 AM, Barry Smith wrote: > > > >> > >> Randy, > >> > >> First you absolutely must use version 3.8 or the master development > >> copy. We improved and simplified dramatically how Fortran (90) is utilized > >> from PETSc. > >> > >> Note that there is only one simple set of include files and modules > >> for Fortran; see the newest documentation. > >> > >> > > http://www.mcs.anl.gov/petsc/petsc-master/docs/manualpages/ > > Sys/UsingFortran.html > > > > Matt > > > > > >> > >> Barry > >> > >> > >> > On Oct 5, 2017, at 11:48 PM, Randy Michael Churchill > >> wrote: > >> > > >> > A simple setup question with TAO: if I were to convert the > >> rosenbrock1f.F90 example to use a module instead of common structures, how > >> would I setup the include statements? I've tried various combinations > >> (using petscXXXdef.h, petscXXX.h, petscXXX.h90, along with use petscXXX), > >> but seem to get errors with all. > >> > > >> > file:rosenbrock1f.h: > >> > module commondat > >> > PetscReal :: alpha > >> > PetscInt :: n > >> > end module commondat > >> > > >> > file:rosenbrock1f.90: > >> > program rosenbrock1f > >> > !!include statements??? which and where???!!! 
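To spell out the pattern from the diff (a minimal sketch - "commondat", "alpha" and "n" are just the names from the rosenbrock1f example):

#include "petsc/finclude/petscdef.h"
      module commondat
      PetscReal :: alpha
      PetscInt :: n
      end module commondat

Each program unit that needs PETSc then does 'use commondat', 'implicit none', and #include "petsc/finclude/petsc.h", in that order, as in the diff above.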
> >> > use commondat > >> > ... > >> > > >> > subroutine FormFunctionGradient(tao, X, f, G, dummy, ierr) > >> > use commondat > >> > implicit none > >> > ... > >> > > >> > (https://www.mcs.anl.gov/petsc/petsc-dev/src/tao/unconstrain > >> ed/examples/tutorials/rosenbrock1f.F90.html) > >> > >> > > > > > > -- > > What most experimenters take for granted before they begin their > > experiments is infinitely more interesting than any results to which their > > experiments lead. > > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > > > > > From bsmith at mcs.anl.gov Fri Oct 6 16:59:30 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 6 Oct 2017 23:59:30 +0200 Subject: [petsc-users] TAO setup with modules in Fortran 90 In-Reply-To: References: <88546AF0-385D-44DA-965D-169A6DBAA523@mcs.anl.gov> Message-ID: > On Oct 6, 2017, at 11:18 PM, Randy Michael Churchill wrote: > > So if I'm limited to petsc 3.7.6 for reasons of eventually using within an existing, larger codebase that depends on 3.7.6, is it possible to use TAO with a user-defined module in Fortran90 using 3.7.6? You should really pus this "existing, larger codebase" to transition to PETSc 3.8 sooner, rather than later. Especially for developments in Fortran it will make life better for everyone. We are always willing to help users, once they have read the changes information, with information to make transitioning to the latest PETSc version easy. For any parts of the code in C, transitioning from 3.7 to 3.8 should be really simple. Barry > > I had tried the various forms of includes listed in the documentation, e.g. see below. I think I now realize this is an issue with the petsc installation on Edison, it does not seem to have the petsctao module in the library file (confirmed using nm -D on the library file). If I do the same include and use statement but with, for example, petscmat, it compiles fine. > > I built v3.8 from source, and the petsctao module is in the library file, and now the make works. > > commondat.F90 > module commondat > #include > use petsc > PetscReal :: alpha > PetscInt :: n > end module commondat > > program rosenbrock1f > #include > use petsctao > use commondat > > > > > On Fri, Oct 6, 2017 at 7:54 AM, Matthew Knepley wrote: > On Fri, Oct 6, 2017 at 7:36 AM, Barry Smith wrote: > > Randy, > > First you absolutely must use version 3.8 or the master development copy. We improved and simplified dramatically how Fortran (90) is utilized from PETSc. > > Note that there is only one simple set of include files and modules for Fortran; see the newest documentation. > > > http://www.mcs.anl.gov/petsc/petsc-master/docs/manualpages/Sys/UsingFortran.html > > Matt > > > Barry > > > > On Oct 5, 2017, at 11:48 PM, Randy Michael Churchill wrote: > > > > A simple setup question with TAO: if I were to convert the rosenbrock1f.F90 example to use a module instead of common structures, how would I setup the include statements? I've tried various combinations (using petscXXXdef.h, petscXXX.h, petscXXX.h90, along with use petscXXX), but seem to get errors with all. > > > > file:rosenbrock1f.h: > > module commondat > > PetscReal :: alpha > > PetscInt :: n > > end module commondat > > > > file:rosenbrock1f.90: > > program rosenbrock1f > > !!include statements??? which and where???!!! > > use commondat > > ... > > > > subroutine FormFunctionGradient(tao, X, f, G, dummy, ierr) > > use commondat > > implicit none > > ... 
> > > > (https://www.mcs.anl.gov/petsc/petsc-dev/src/tao/unconstrained/examples/tutorials/rosenbrock1f.F90.html) > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > > -- > R. Michael Churchill From zakaryah at gmail.com Sat Oct 7 12:49:34 2017 From: zakaryah at gmail.com (zakaryah .) Date: Sat, 7 Oct 2017 13:49:34 -0400 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: <87zi979alu.fsf@jedbrown.org> References: <98D45588-381F-44FB-9D20-65D1638E357F@mcs.anl.gov> <876F0ACF-62C0-4DC9-B203-2893B25E3938@mcs.anl.gov> <1D94CC99-2395-4BBE-A0F3-80D23D85FA45@mcs.anl.gov> <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org> Message-ID: OK - I ran with -snes_monitor -snes_converged_reason -snes_linesearch_monitor -ksp_monitor -ksp_monitor_true_residual -ksp_converged_reason -pc_type svd -pc_svd_monitor -snes_type newtonls -snes_compare_explicit and here is the full error message, output immediately after Finite difference Jacobian Mat Object: 24 MPI processes type: mpiaij [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Invalid argument [0]PETSC ERROR: Matrix not generated from a DMDA [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. [0]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017 [0]PETSC ERROR: ./CalculateOpticalFlow on a arch-linux2-c-opt named node046.hpc.rockefeller.internal by zfrentz Sat Oct 7 13:44:44 2017 [0]PETSC ERROR: Configure options --prefix=/ru-auth/local/home/zfrentz/PETSc-3.7.6 --download-fblaslapack -with-debugging=0 [0]PETSC ERROR: #1 MatView_MPI_DA() line 551 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/dm/impls/da/fdda.c [0]PETSC ERROR: #2 MatView() line 901 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/mat/interface/matrix.c [0]PETSC ERROR: #3 SNESComputeJacobian() line 2371 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/interface/snes.c [0]PETSC ERROR: #4 SNESSolve_NEWTONLS() line 228 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/impls/ls/ls.c [0]PETSC ERROR: #5 SNESSolve() line 4005 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/interface/snes.c [0]PETSC ERROR: #6 solveWarp3D() line 659 in /ru-auth/local/home/zfrentz/Code/OpticalFlow/working/October6_2017/mshs.c On Tue, Oct 3, 2017 at 5:37 PM, Jed Brown wrote: > Always always always send the whole error message. > > "zakaryah ." writes: > > > I tried -snes_compare_explicit, and got the following error: > > > > [0]PETSC ERROR: Invalid argument > > > > [0]PETSC ERROR: Matrix not generated from a DMDA > > > > What am I doing wrong? > > > > On Tue, Oct 3, 2017 at 10:08 AM, Jed Brown wrote: > > > >> Barry Smith writes: > >> > >> >> On Oct 3, 2017, at 5:54 AM, zakaryah . wrote: > >> >> > >> >> I'm still working on this. I've made some progress, and it looks > like > >> the issue is with the KSP, at least for now. The Jacobian may be > >> ill-conditioned. Is it possible to use -snes_test_display during an > >> intermediate step of the analysis? 
I would like to inspect the Jacobian > >> after several solves have already completed, > >> > > >> > No, our currently code for testing Jacobians is poor quality and > >> poorly organized. Needs a major refactoring to do things properly. Sorry > >> > >> You can use -snes_compare_explicit or -snes_compare_coloring to output > >> differences on each Newton step. > >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From zakaryah at gmail.com Sun Oct 8 01:05:58 2017 From: zakaryah at gmail.com (zakaryah .) Date: Sun, 8 Oct 2017 02:05:58 -0400 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: References: <98D45588-381F-44FB-9D20-65D1638E357F@mcs.anl.gov> <876F0ACF-62C0-4DC9-B203-2893B25E3938@mcs.anl.gov> <1D94CC99-2395-4BBE-A0F3-80D23D85FA45@mcs.anl.gov> <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org> Message-ID: I'm more confused than ever. I don't understand the output of -snes_type test -snes_test_display. For the user-defined state of the vector (where I'd like to test the Jacobian), the finite difference Jacobian at row 0 evaluates as: row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, 287.797) (5, 744.695) (9, -1454.66) (10, 6.08793) (11, 148.172) (12, 13.1089) (13, -36.5783) (14, -9.99399) (27, -3423.49) (28, -2175.34) (29, 548.662) (30, 145.753) (31, 17.6603) (32, -15.1079) (36, 76.8575) (37, 16.325) (38, 4.83918) But the hand-coded Jacobian at row 0 evaluates as: row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, 287.797) (5, 744.695) (33, -1454.66) (34, 6.08792) (35, 148.172) (36, 13.1089) (37, -36.5783) (38, -9.99399) (231, -3423.49) (232, -2175.34) (233, 548.662) (234, 145.753) (235, 17.6603) (236, -15.1079) (264, 76.8575) (265, 16.325) (266, 4.83917) (267, 0.) (268, 0.) (269, 0.) and the difference between the Jacobians at row 0 evaluates as: row 0: (0, 0.000189908) (1, 7.17315e-05) (2, 9.31778e-05) (3, 0.000514947) (4, 0.000178659) (5, 0.000178217) (9, -2.25457e-05) (10, -6.34278e-06) (11, -5.93241e-07) (12, 9.48544e-06) (13, 4.79709e-06) (14, 2.40016e-06) (27, -0.000335696) (28, -0.000106734) (29, -0.000106653) (30, 2.73119e-06) (31, -7.93382e-07) (32, 1.24048e-07) (36, -4.0302e-06) (37, 3.67494e-06) (38, -2.70115e-06) (39, 0.) (40, 0.) (41, 0.) The difference between the column numbering between the finite difference and the hand-coded Jacobians looks like a serious problem to me, but I'm probably missing something. I am trying to use a 3D DMDA with 3 dof, a box stencil of width 1, and for this test problem the grid dimensions are 11x7x6. For a grid point x,y,z, and dof c, is the index calculated as c + 3*x + 3*11*y + 3*11*7*z? If so, then the column numbers of the hand-coded Jacobian match those of the 27 point stencil I have in mind. However, I am then at a loss to explain the column numbers in the finite difference Jacobian. On Sat, Oct 7, 2017 at 1:49 PM, zakaryah . 
wrote: > OK - I ran with -snes_monitor -snes_converged_reason > -snes_linesearch_monitor -ksp_monitor -ksp_monitor_true_residual > -ksp_converged_reason -pc_type svd -pc_svd_monitor -snes_type newtonls > -snes_compare_explicit > > and here is the full error message, output immediately after > > Finite difference Jacobian > Mat Object: 24 MPI processes > type: mpiaij > > [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > > [0]PETSC ERROR: Invalid argument > > [0]PETSC ERROR: Matrix not generated from a DMDA > > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html > for trouble shooting. > > [0]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017 > > [0]PETSC ERROR: ./CalculateOpticalFlow on a arch-linux2-c-opt named > node046.hpc.rockefeller.internal by zfrentz Sat Oct 7 13:44:44 2017 > > [0]PETSC ERROR: Configure options --prefix=/ru-auth/local/home/zfrentz/PETSc-3.7.6 > --download-fblaslapack -with-debugging=0 > > [0]PETSC ERROR: #1 MatView_MPI_DA() line 551 in /rugpfs/fs0/home/zfrentz/ > PETSc/build/petsc-3.7.6/src/dm/impls/da/fdda.c > > [0]PETSC ERROR: #2 MatView() line 901 in /rugpfs/fs0/home/zfrentz/ > PETSc/build/petsc-3.7.6/src/mat/interface/matrix.c > > [0]PETSC ERROR: #3 SNESComputeJacobian() line 2371 in > /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/interface/snes.c > > [0]PETSC ERROR: #4 SNESSolve_NEWTONLS() line 228 in > /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/impls/ls/ls.c > > [0]PETSC ERROR: #5 SNESSolve() line 4005 in /rugpfs/fs0/home/zfrentz/ > PETSc/build/petsc-3.7.6/src/snes/interface/snes.c > > [0]PETSC ERROR: #6 solveWarp3D() line 659 in /ru-auth/local/home/zfrentz/ > Code/OpticalFlow/working/October6_2017/mshs.c > > On Tue, Oct 3, 2017 at 5:37 PM, Jed Brown wrote: > >> Always always always send the whole error message. >> >> "zakaryah ." writes: >> >> > I tried -snes_compare_explicit, and got the following error: >> > >> > [0]PETSC ERROR: Invalid argument >> > >> > [0]PETSC ERROR: Matrix not generated from a DMDA >> > >> > What am I doing wrong? >> > >> > On Tue, Oct 3, 2017 at 10:08 AM, Jed Brown wrote: >> > >> >> Barry Smith writes: >> >> >> >> >> On Oct 3, 2017, at 5:54 AM, zakaryah . wrote: >> >> >> >> >> >> I'm still working on this. I've made some progress, and it looks >> like >> >> the issue is with the KSP, at least for now. The Jacobian may be >> >> ill-conditioned. Is it possible to use -snes_test_display during an >> >> intermediate step of the analysis? I would like to inspect the >> Jacobian >> >> after several solves have already completed, >> >> > >> >> > No, our currently code for testing Jacobians is poor quality and >> >> poorly organized. Needs a major refactoring to do things properly. >> Sorry >> >> >> >> You can use -snes_compare_explicit or -snes_compare_coloring to output >> >> differences on each Newton step. >> >> >> > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL:
From bsmith at mcs.anl.gov Sun Oct 8 04:57:46 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sun, 8 Oct 2017 11:57:46 +0200 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: References: <98D45588-381F-44FB-9D20-65D1638E357F@mcs.anl.gov> <876F0ACF-62C0-4DC9-B203-2893B25E3938@mcs.anl.gov> <1D94CC99-2395-4BBE-A0F3-80D23D85FA45@mcs.anl.gov> <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org> Message-ID: <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov>

There is apparently confusion in understanding the ordering. Is this all on one process that you get funny results?

Are you using MatSetValuesStencil() to provide the matrix (it is generally easier than providing it yourself)?

In parallel MatView() always maps the rows and columns to the natural ordering before printing, if you use a matrix created from the DMDA. If you create the matrix yourself it has a different MatView in parallel that is in the PETSc ordering.
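With MatSetValuesStencil() you address entries by their (i,j,k,c) grid coordinates and the DMDA performs the mapping to matrix indices for you. A minimal sketch for one entry (the variable names are illustrative only, and J is assumed to come from DMCreateMatrix() so the DMDA mapping is attached):

  MatStencil  row,col;
  PetscScalar v = 1.0;                          /* value of the Jacobian entry */
  row.i = x; row.j = y; row.k = z; row.c = c;   /* row: grid point (x,y,z), dof c */
  col.i = x+1; col.j = y; col.k = z; col.c = c; /* column: x-neighbor, same dof */
  ierr = MatSetValuesStencil(J,1,&row,1,&col,&v,INSERT_VALUES);CHKERRQ(ierr);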
Barry

> On Oct 8, 2017, at 8:05 AM, zakaryah . wrote: > > I'm more confused than ever. I don't understand the output of -snes_type test -snes_test_display. > > For the user-defined state of the vector (where I'd like to test the Jacobian), the finite difference Jacobian at row 0 evaluates as: > > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, 287.797) (5, 744.695) (9, -1454.66) (10, 6.08793) (11, 148.172) (12, 13.1089) (13, -36.5783) (14, -9.99399) (27, -3423.49) (28, -2175.34) (29, 548.662) (30, 145.753) (31, 17.6603) (32, -15.1079) (36, 76.8575) (37, 16.325) (38, 4.83918) > > But the hand-coded Jacobian at row 0 evaluates as: > > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, 287.797) (5, 744.695) (33, -1454.66) (34, 6.08792) (35, 148.172) (36, 13.1089) (37, -36.5783) (38, -9.99399) (231, -3423.49) (232, -2175.34) (233, 548.662) (234, 145.753) (235, 17.6603) (236, -15.1079) (264, 76.8575) (265, 16.325) (266, 4.83917) (267, 0.) (268, 0.) (269, 0.) > and the difference between the Jacobians at row 0 evaluates as: > > row 0: (0, 0.000189908) (1, 7.17315e-05) (2, 9.31778e-05) (3, 0.000514947) (4, 0.000178659) (5, 0.000178217) (9, -2.25457e-05) (10, -6.34278e-06) (11, -5.93241e-07) (12, 9.48544e-06) (13, 4.79709e-06) (14, 2.40016e-06) (27, -0.000335696) (28, -0.000106734) (29, -0.000106653) (30, 2.73119e-06) (31, -7.93382e-07) (32, 1.24048e-07) (36, -4.0302e-06) (37, 3.67494e-06) (38, -2.70115e-06) (39, 0.) (40, 0.) (41, 0.) > > The difference between the column numbering between the finite difference and the hand-coded Jacobians looks like a serious problem to me, but I'm probably missing something. > > I am trying to use a 3D DMDA with 3 dof, a box stencil of width 1, and for this test problem the grid dimensions are 11x7x6. For a grid point x,y,z, and dof c, is the index calculated as c + 3*x + 3*11*y + 3*11*7*z? If so, then the column numbers of the hand-coded Jacobian match those of the 27 point stencil I have in mind. However, I am then at a loss to explain the column numbers in the finite difference Jacobian. > > > > > On Sat, Oct 7, 2017 at 1:49 PM, zakaryah . wrote: > OK - I ran with -snes_monitor -snes_converged_reason -snes_linesearch_monitor -ksp_monitor -ksp_monitor_true_residual -ksp_converged_reason -pc_type svd -pc_svd_monitor -snes_type newtonls -snes_compare_explicit > > and here is the full error message, output immediately after > > Finite difference Jacobian > Mat Object: 24 MPI processes > type: mpiaij > > [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > > [0]PETSC ERROR: Invalid argument > > [0]PETSC ERROR: Matrix not generated from a DMDA > > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. > > [0]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017 > > [0]PETSC ERROR: ./CalculateOpticalFlow on a arch-linux2-c-opt named node046.hpc.rockefeller.internal by zfrentz Sat Oct 7 13:44:44 2017 > > [0]PETSC ERROR: Configure options --prefix=/ru-auth/local/home/zfrentz/PETSc-3.7.6 --download-fblaslapack -with-debugging=0 > > [0]PETSC ERROR: #1 MatView_MPI_DA() line 551 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/dm/impls/da/fdda.c > > [0]PETSC ERROR: #2 MatView() line 901 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/mat/interface/matrix.c > > [0]PETSC ERROR: #3 SNESComputeJacobian() line 2371 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/interface/snes.c > > [0]PETSC ERROR: #4 SNESSolve_NEWTONLS() line 228 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/impls/ls/ls.c > > [0]PETSC ERROR: #5 SNESSolve() line 4005 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/interface/snes.c > > [0]PETSC ERROR: #6 solveWarp3D() line 659 in /ru-auth/local/home/zfrentz/Code/OpticalFlow/working/October6_2017/mshs.c > > > On Tue, Oct 3, 2017 at 5:37 PM, Jed Brown wrote: > Always always always send the whole error message. > > "zakaryah ." writes: > > > I tried -snes_compare_explicit, and got the following error: > > > > [0]PETSC ERROR: Invalid argument > > > > [0]PETSC ERROR: Matrix not generated from a DMDA > > > > What am I doing wrong? > > > > On Tue, Oct 3, 2017 at 10:08 AM, Jed Brown wrote: > > > >> Barry Smith writes: > >> > >> >> On Oct 3, 2017, at 5:54 AM, zakaryah . wrote: > >> >> > >> >> I'm still working on this. I've made some progress, and it looks like > >> the issue is with the KSP, at least for now. The Jacobian may be > >> ill-conditioned. Is it possible to use -snes_test_display during an > >> intermediate step of the analysis? I would like to inspect the Jacobian > >> after several solves have already completed, > >> > > >> > No, our currently code for testing Jacobians is poor quality and > >> poorly organized. Needs a major refactoring to do things properly. Sorry > >> > >> You can use -snes_compare_explicit or -snes_compare_coloring to output > >> differences on each Newton step. > >> >
From fuentesdt at gmail.com Mon Oct 9 09:02:58 2017 From: fuentesdt at gmail.com (David Fuentes) Date: Mon, 9 Oct 2017 09:02:58 -0500 Subject: [petsc-users] ex12 poisson solver Message-ID:
Hi, I'm trying to use petsc 3.8.0 with ex12.c example to setup a poisson solver: http://www.mcs.anl.gov/petsc/petsc-current/src/snes/examples/tutorials/ex12.c.html
I seem to be getting zeros in my jacobian with this example? I attached a debugger and the assembly routines seems ok... but am getting zero jacobian somewhere along the way to the MatAssembly... Am I missing something with the command line arguments ? 
$ ./ex12 -snes_type ksponly -snes_monitor -snes_converged_reason -ksp_converged_reason -ksp_monitor -ksp_rtol 1.e-6 -pc_type jacobi -info -info_exclude null,pc,vec -------------------------------------------------------------------------- [[55310,1],0]: A high-performance Open MPI point-to-point messaging module was unable to find any relevant network interfaces: Module: OpenFabrics (openib) Host: SCRGP2 Another transport will be used instead, although this may result in lower performance. -------------------------------------------------------------------------- [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8 X 8; storage space: 0 unneeded,8 used [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 [0] MatCheckCompressedRow(): Found the ratio (num_zerorows 0)/(num_localrows 8) < 0.6. Do not use CompressedRow routines. [0] MatSeqAIJCheckInode(): Found 8 nodes out of 8 rows. Not using Inode routines [0] DMGetDMSNES(): Creating new DMSNES [0] DMGetDMKSP(): Creating new DMKSP 0 SNES Function norm 1.414213562373e+00 [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8 X 8; storage space: 0 unneeded,8 used [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 [0] MatCheckCompressedRow(): Found the ratio (num_zerorows 0)/(num_localrows 8) < 0.6. Do not use CompressedRow routines. [0] SNESComputeJacobian(): Rebuilding preconditioner 0 KSP Residual norm 1.414213562373e+00 1 KSP Residual norm 1.414213562373e+00 [0] KSPGMRESBuildSoln(): Likely your matrix or preconditioner is singular. HH(it,it) is identically zero; it = 0 GRS(it) = 1.41421 Linear solve did not converge due to DIVERGED_BREAKDOWN iterations 1 [0] SNESSolve_KSPONLY(): iter=0, number linear solve failures 1 greater than current SNES allowed, stopping solve Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0 Number of SNES iterations = 0 L_2 Error: 0.751285 configured with: ./config/configure.py --with-shared-libraries --with-clanguage=c++ --CFLAGS='-g -O0' --CXXFLAGS='-g -O0' --download-ctetgen --download-triangle --with-debugging=yes --with-exodusii-lib=[/usr/lib/x86_64-linux-gnu/libexoIIv2.so] --with-netcdf-lib=[/usr/lib/libnetcdf.so] --with-netcdf-include=/usr/include --with-hdf5-include=/usr/include/hdf5/serial/ --with-hdf5-lib=[/usr/lib/x86_64-linux-gnu/hdf5/serial/libhdf5.so] --with-c2html=0 --with-exodusii-include=/usr/include thanks! David -------------- next part -------------- An HTML attachment was scrubbed... URL: From jychang48 at gmail.com Mon Oct 9 11:40:25 2017 From: jychang48 at gmail.com (Justin Chang) Date: Mon, 9 Oct 2017 11:40:25 -0500 Subject: [petsc-users] ex12 poisson solver In-Reply-To: References: Message-ID: You need this additional command line argument: -petscspace_order 1 On Mon, Oct 9, 2017 at 9:02 AM, David Fuentes wrote: > > Hi, > > I'm trying to use petsc 3.8.0 with ex12.c example to setup a poisson > solver: http://www.mcs.anl.gov/petsc/petsc-current/src/snes/ > examples/tutorials/ex12.c.html > > I seem to be getting zeros in my jacobian with this example? > I attached a debugger and the assembly routines seems ok... but am getting > zero jacobian somewhere along the way to the MatAssembly... > Am I missing something with the command line arguments ? 
> > > > $ ./ex12 -snes_type ksponly -snes_monitor -snes_converged_reason > -ksp_converged_reason -ksp_monitor -ksp_rtol 1.e-6 -pc_type jacobi -info > -info_exclude null,pc,vec > -------------------------------------------------------------------------- > [[55310,1],0]: A high-performance Open MPI point-to-point messaging module > was unable to find any relevant network interfaces: > > Module: OpenFabrics (openib) > Host: SCRGP2 > > Another transport will be used instead, although this may result in > lower performance. > -------------------------------------------------------------------------- > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8 X 8; storage space: 0 > unneeded,8 used > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 > [0] MatCheckCompressedRow(): Found the ratio (num_zerorows > 0)/(num_localrows 8) < 0.6. Do not use CompressedRow routines. > [0] MatSeqAIJCheckInode(): Found 8 nodes out of 8 rows. Not using Inode > routines > [0] DMGetDMSNES(): Creating new DMSNES > [0] DMGetDMKSP(): Creating new DMKSP > 0 SNES Function norm 1.414213562373e+00 > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8 X 8; storage space: 0 > unneeded,8 used > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 > [0] MatCheckCompressedRow(): Found the ratio (num_zerorows > 0)/(num_localrows 8) < 0.6. Do not use CompressedRow routines. > [0] SNESComputeJacobian(): Rebuilding preconditioner > 0 KSP Residual norm 1.414213562373e+00 > 1 KSP Residual norm 1.414213562373e+00 > [0] KSPGMRESBuildSoln(): Likely your matrix or preconditioner is singular. > HH(it,it) is identically zero; it = 0 GRS(it) = 1.41421 > Linear solve did not converge due to DIVERGED_BREAKDOWN iterations 1 > [0] SNESSolve_KSPONLY(): iter=0, number linear solve failures 1 greater > than current SNES allowed, stopping solve > Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0 > Number of SNES iterations = 0 > L_2 Error: 0.751285 > > > configured with: ./config/configure.py --with-shared-libraries > --with-clanguage=c++ --CFLAGS='-g -O0' --CXXFLAGS='-g -O0' > --download-ctetgen --download-triangle --with-debugging=yes > --with-exodusii-lib=[/usr/lib/x86_64-linux-gnu/libexoIIv2.so] > --with-netcdf-lib=[/usr/lib/libnetcdf.so] --with-netcdf-include=/usr/include > --with-hdf5-include=/usr/include/hdf5/serial/ > --with-hdf5-lib=[/usr/lib/x86_64-linux-gnu/hdf5/serial/libhdf5.so] > --with-c2html=0 --with-exodusii-include=/usr/include > > > thanks! > David > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Mon Oct 9 11:56:56 2017 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 9 Oct 2017 12:56:56 -0400 Subject: [petsc-users] ex12 poisson solver In-Reply-To: References: Message-ID: On Mon, Oct 9, 2017 at 12:40 PM, Justin Chang wrote: > You need this additional command line argument: -petscspace_order 1 > Justin is correct. By default, it is P0, which would be a purely discontinuous solution to Laplace, without jump terms. Thanks, Matt > On Mon, Oct 9, 2017 at 9:02 AM, David Fuentes wrote: > >> >> Hi, >> >> I'm trying to use petsc 3.8.0 with ex12.c example to setup a poisson >> solver: http://www.mcs.anl.gov/petsc/petsc-current/src/snes/examples >> /tutorials/ex12.c.html >> >> I seem to be getting zeros in my jacobian with this example? 
>> I attached a debugger and the assembly routines seems ok... but am >> getting zero jacobian somewhere along the way to the MatAssembly... >> Am I missing something with the command line arguments ? >> >> >> >> $ ./ex12 -snes_type ksponly -snes_monitor -snes_converged_reason >> -ksp_converged_reason -ksp_monitor -ksp_rtol 1.e-6 -pc_type jacobi -info >> -info_exclude null,pc,vec >> ------------------------------------------------------------ >> -------------- >> [[55310,1],0]: A high-performance Open MPI point-to-point messaging module >> was unable to find any relevant network interfaces: >> >> Module: OpenFabrics (openib) >> Host: SCRGP2 >> >> Another transport will be used instead, although this may result in >> lower performance. >> ------------------------------------------------------------ >> -------------- >> [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8 X 8; storage space: 0 >> unneeded,8 used >> [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 >> [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 >> [0] MatCheckCompressedRow(): Found the ratio (num_zerorows >> 0)/(num_localrows 8) < 0.6. Do not use CompressedRow routines. >> [0] MatSeqAIJCheckInode(): Found 8 nodes out of 8 rows. Not using Inode >> routines >> [0] DMGetDMSNES(): Creating new DMSNES >> [0] DMGetDMKSP(): Creating new DMKSP >> 0 SNES Function norm 1.414213562373e+00 >> [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8 X 8; storage space: 0 >> unneeded,8 used >> [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 >> [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 >> [0] MatCheckCompressedRow(): Found the ratio (num_zerorows >> 0)/(num_localrows 8) < 0.6. Do not use CompressedRow routines. >> [0] SNESComputeJacobian(): Rebuilding preconditioner >> 0 KSP Residual norm 1.414213562373e+00 >> 1 KSP Residual norm 1.414213562373e+00 >> [0] KSPGMRESBuildSoln(): Likely your matrix or preconditioner is >> singular. HH(it,it) is identically zero; it = 0 GRS(it) = 1.41421 >> Linear solve did not converge due to DIVERGED_BREAKDOWN iterations 1 >> [0] SNESSolve_KSPONLY(): iter=0, number linear solve failures 1 greater >> than current SNES allowed, stopping solve >> Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0 >> Number of SNES iterations = 0 >> L_2 Error: 0.751285 >> >> >> configured with: ./config/configure.py --with-shared-libraries >> --with-clanguage=c++ --CFLAGS='-g -O0' --CXXFLAGS='-g -O0' >> --download-ctetgen --download-triangle --with-debugging=yes >> --with-exodusii-lib=[/usr/lib/x86_64-linux-gnu/libexoIIv2.so] >> --with-netcdf-lib=[/usr/lib/libnetcdf.so] --with-netcdf-include=/usr/include >> --with-hdf5-include=/usr/include/hdf5/serial/ >> --with-hdf5-lib=[/usr/lib/x86_64-linux-gnu/hdf5/serial/libhdf5.so] >> --with-c2html=0 --with-exodusii-include=/usr/include >> >> >> thanks! >> David >> >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Mon Oct 9 11:57:34 2017 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 9 Oct 2017 12:57:34 -0400 Subject: [petsc-users] ex12 poisson solver In-Reply-To: References: Message-ID: On Mon, Oct 9, 2017 at 12:56 PM, Matthew Knepley wrote: > On Mon, Oct 9, 2017 at 12:40 PM, Justin Chang wrote: > >> You need this additional command line argument: -petscspace_order 1 >> > > Justin is correct. By default, it is P0, which would be a purely > discontinuous solution to Laplace, without > jump terms. > I would also note that all the tests are at the bottom of the file. Matt > Thanks, > > Matt > > >> On Mon, Oct 9, 2017 at 9:02 AM, David Fuentes >> wrote: >> >>> >>> Hi, >>> >>> I'm trying to use petsc 3.8.0 with ex12.c example to setup a poisson >>> solver: http://www.mcs.anl.gov/petsc/petsc-current/src/snes/examples >>> /tutorials/ex12.c.html >>> >>> I seem to be getting zeros in my jacobian with this example? >>> I attached a debugger and the assembly routines seems ok... but am >>> getting zero jacobian somewhere along the way to the MatAssembly... >>> Am I missing something with the command line arguments ? >>> >>> >>> >>> $ ./ex12 -snes_type ksponly -snes_monitor -snes_converged_reason >>> -ksp_converged_reason -ksp_monitor -ksp_rtol 1.e-6 -pc_type jacobi -info >>> -info_exclude null,pc,vec >>> ------------------------------------------------------------ >>> -------------- >>> [[55310,1],0]: A high-performance Open MPI point-to-point messaging >>> module >>> was unable to find any relevant network interfaces: >>> >>> Module: OpenFabrics (openib) >>> Host: SCRGP2 >>> >>> Another transport will be used instead, although this may result in >>> lower performance. >>> ------------------------------------------------------------ >>> -------------- >>> [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8 X 8; storage space: 0 >>> unneeded,8 used >>> [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 >>> [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 >>> [0] MatCheckCompressedRow(): Found the ratio (num_zerorows >>> 0)/(num_localrows 8) < 0.6. Do not use CompressedRow routines. >>> [0] MatSeqAIJCheckInode(): Found 8 nodes out of 8 rows. Not using Inode >>> routines >>> [0] DMGetDMSNES(): Creating new DMSNES >>> [0] DMGetDMKSP(): Creating new DMKSP >>> 0 SNES Function norm 1.414213562373e+00 >>> [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8 X 8; storage space: 0 >>> unneeded,8 used >>> [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 >>> [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 >>> [0] MatCheckCompressedRow(): Found the ratio (num_zerorows >>> 0)/(num_localrows 8) < 0.6. Do not use CompressedRow routines. >>> [0] SNESComputeJacobian(): Rebuilding preconditioner >>> 0 KSP Residual norm 1.414213562373e+00 >>> 1 KSP Residual norm 1.414213562373e+00 >>> [0] KSPGMRESBuildSoln(): Likely your matrix or preconditioner is >>> singular. 
HH(it,it) is identically zero; it = 0 GRS(it) = 1.41421 >>> Linear solve did not converge due to DIVERGED_BREAKDOWN iterations 1 >>> [0] SNESSolve_KSPONLY(): iter=0, number linear solve failures 1 greater >>> than current SNES allowed, stopping solve >>> Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations >>> 0 >>> Number of SNES iterations = 0 >>> L_2 Error: 0.751285 >>> >>> >>> configured with: ./config/configure.py --with-shared-libraries >>> --with-clanguage=c++ --CFLAGS='-g -O0' --CXXFLAGS='-g -O0' >>> --download-ctetgen --download-triangle --with-debugging=yes >>> --with-exodusii-lib=[/usr/lib/x86_64-linux-gnu/libexoIIv2.so] >>> --with-netcdf-lib=[/usr/lib/libnetcdf.so] --with-netcdf-include=/usr/include >>> --with-hdf5-include=/usr/include/hdf5/serial/ >>> --with-hdf5-lib=[/usr/lib/x86_64-linux-gnu/hdf5/serial/libhdf5.so] >>> --with-c2html=0 --with-exodusii-include=/usr/include >>> >>> >>> thanks! >>> David >>> >>> >>> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed... URL:
From t.appel17 at imperial.ac.uk Mon Oct 9 12:58:39 2017 From: t.appel17 at imperial.ac.uk (Appel, Thibaut) Date: Mon, 9 Oct 2017 17:58:39 +0000 Subject: [petsc-users] Preallocation (dnz, onz arrays) in sparse parallel matrix In-Reply-To: References: , Message-ID: <535FC702-A4D2-4901-9B58-92899FF60181@imperial.ac.uk>
Hi Barry,
PetscSplitOwnership works indeed even though it needs some further work to determine the local ownership istart and iend (only gives the ownership size).
I found out you can "cheat" and call MatMPIAIJSetPreallocation(Mat,0,PETSC_NULL_INTEGER,0,PETSC_NULL_INTEGER), then MatGetOwnershipRange, compute your dnz and onz arrays and call the preallocation routine once again with the right parameters. Unless there's something wrong in doing this...
Thanks for your reactivity and support, much appreciated.
Thibaut
> On 6 Oct 2017, at 20:57, Barry Smith wrote: > > >> On Oct 6, 2017, at 4:08 PM, Thibaut Appel wrote: >> >> Dear PETSc users, >> >> I am trying to assemble a sparse matrix in parallel where my main objective is efficiency and scalability. >> >> Precisely, I am using MatMPIAIJSetPreallocation with diagonal entries (dnz) and off-diagonal entries (onz) arrays (non zero elements for each rows) to allocate the memory needed. >> >> Prior to the insertion of the elements in the matrix, I am doing a primary loop to determine those arrays dnz and onz for each processor owning its own set of rows. Ideally, this loop would look like >> >> for irow=istart, iend-1, i++ ----> count dnz(irow) and onz(irow) >> But it seems that you cannot call MatGetOwnershipRange(Mat,istart,iend,ierr) before MatMPIAIJSetPreallocation to get istart and iend. Why is that? >> Which optimal approach should be followed to count your non-zero elements for each processor? I saw two conversations where Barry Smith suggested the use of MatPreallocateInitialize/Finalize or PetscSplitOwnership, which means you have to determine yourself the rows owned by each processor? Is that not contrary to the "PETSc spirit"? 
> On 6 Oct 2017, at 20:57, Barry Smith wrote: > > >> On Oct 6, 2017, at 4:08 PM, Thibaut Appel wrote: >> >> Dear PETSc users, >> >> I am trying to assemble a sparse matrix in parallel where my main objective is efficiency and scalability. >> >> Precisely, I am using MatMPIAIJSetPreallocation with diagonal entries (dnz) and off-diagonal entries (onz) arrays (non-zero elements for each row) to allocate the memory needed. >> >> Prior to the insertion of the elements in the matrix, I am doing a primary loop to determine those arrays dnz and onz for each processor owning its own set of rows. Ideally, this loop would look like >> >> for irow=istart, iend-1, i++ ----> count dnz(irow) and onz(irow) >> But it seems that you cannot call MatGetOwnershipRange(Mat,istart,iend,ierr) before MatMPIAIJSetPreallocation to get istart and iend. Why is that? >> Which optimal approach should be followed to count your non-zero elements for each processor? I saw two conversations where Barry Smith suggested the use of MatPreallocateInitialize/Finalize or PetscSplitOwnership, which means you have to determine yourself the rows owned by each processor? Is that not contrary to the "PETSc spirit"? > > Use PetscSplitOwnership() to determine the ownerships. > > The reason for not being able to use MatGetOwnershipRange() before setting the preallocation is just because of the design of the data constructor for the matrix class. You are correct that it might be possible to refactor the code to have more steps in the constructor allowing the call to MatGetOwnershipRange(). > > We welcome pull requests but are unlikely to make the change ourselves; the reason is that normally one is working with a mesh data structure (often a DM) that provides (based on its decomposition) the ownership ranges (rather than just having the matrix decide on it), and hence one does not normally call MatSetSizes() allowing the matrix to determine the ownership ranges. > > Barry > > >> Thanks for your help and have a nice weekend >> >> Thibaut > From bsmith at mcs.anl.gov Tue Oct 10 04:09:51 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 10 Oct 2017 11:09:51 +0200 Subject: [petsc-users] ex12 poisson solver In-Reply-To: References: Message-ID: <5F4455FF-69AF-4996-B947-7A16C4EF19F1@mcs.anl.gov> > On Oct 9, 2017, at 6:56 PM, Matthew Knepley wrote: > > On Mon, Oct 9, 2017 at 12:40 PM, Justin Chang wrote: > You need this additional command line argument: -petscspace_order 1 > > Justin is correct. By default, it is P0, which would be a purely discontinuous solution to Laplace, without > jump terms. Pretty bad default. Maybe better to have P1 be the default. > > Thanks, > > Matt > > On Mon, Oct 9, 2017 at 9:02 AM, David Fuentes wrote: > > Hi, > > I'm trying to use petsc 3.8.0 with ex12.c example to setup a poisson solver: http://www.mcs.anl.gov/petsc/petsc-current/src/snes/examples/tutorials/ex12.c.html > > I seem to be getting zeros in my jacobian with this example? > I attached a debugger and the assembly routines seems ok... but am getting zero jacobian somewhere along the way to the MatAssembly... > Am I missing something with the command line arguments ? > > > > $ ./ex12 -snes_type ksponly -snes_monitor -snes_converged_reason -ksp_converged_reason -ksp_monitor -ksp_rtol 1.e-6 -pc_type jacobi -info -info_exclude null,pc,vec > -------------------------------------------------------------------------- > [[55310,1],0]: A high-performance Open MPI point-to-point messaging module > was unable to find any relevant network interfaces: > > Module: OpenFabrics (openib) > Host: SCRGP2 > > Another transport will be used instead, although this may result in > lower performance.
> [0] SNESComputeJacobian(): Rebuilding preconditioner > 0 KSP Residual norm 1.414213562373e+00 > 1 KSP Residual norm 1.414213562373e+00 > [0] KSPGMRESBuildSoln(): Likely your matrix or preconditioner is singular. HH(it,it) is identically zero; it = 0 GRS(it) = 1.41421 > Linear solve did not converge due to DIVERGED_BREAKDOWN iterations 1 > [0] SNESSolve_KSPONLY(): iter=0, number linear solve failures 1 greater than current SNES allowed, stopping solve > Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0 > Number of SNES iterations = 0 > L_2 Error: 0.751285 > > > configured with: ./config/configure.py --with-shared-libraries --with-clanguage=c++ --CFLAGS='-g -O0' --CXXFLAGS='-g -O0' --download-ctetgen --download-triangle --with-debugging=yes --with-exodusii-lib=[/usr/lib/x86_64-linux-gnu/libexoIIv2.so] --with-netcdf-lib=[/usr/lib/libnetcdf.so] --with-netcdf-include=/usr/include --with-hdf5-include=/usr/include/hdf5/serial/ --with-hdf5-lib=[/usr/lib/x86_64-linux-gnu/hdf5/serial/libhdf5.so] --with-c2html=0 --with-exodusii-include=/usr/include > > > thanks! > David > > > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ From knepley at gmail.com Tue Oct 10 07:40:53 2017 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 10 Oct 2017 08:40:53 -0400 Subject: [petsc-users] ex12 poisson solver In-Reply-To: <5F4455FF-69AF-4996-B947-7A16C4EF19F1@mcs.anl.gov> References: <5F4455FF-69AF-4996-B947-7A16C4EF19F1@mcs.anl.gov> Message-ID: On Tue, Oct 10, 2017 at 5:09 AM, Barry Smith wrote: > > > On Oct 9, 2017, at 6:56 PM, Matthew Knepley wrote: > > > > On Mon, Oct 9, 2017 at 12:40 PM, Justin Chang > wrote: > > You need this additional command line argument: -petscspace_order 1 > > > > Justin is correct. By default, it is P0, which would be a purely > discontinuous solution to Laplace, without > > jump terms. > > Pretty bad default. Maybe better to have P1 be the default. > I will look into it. I recall something being hard about setting everything up. It would be nice to have a better system for default values for options database keys. Matt > > > > Thanks, > > > > Matt > > > > On Mon, Oct 9, 2017 at 9:02 AM, David Fuentes > wrote: > > > > Hi, > > > > I'm trying to use petsc 3.8.0 with ex12.c example to setup a poisson > solver: http://www.mcs.anl.gov/petsc/petsc-current/src/snes/ > examples/tutorials/ex12.c.html > > > > I seem to be getting zeros in my jacobian with this example? > > I attached a debugger and the assembly routines seems ok... but am > getting zero jacobian somewhere along the way to the MatAssembly... > > Am I missing something with the command line arguments ? > > > > > > > > $ ./ex12 -snes_type ksponly -snes_monitor -snes_converged_reason > -ksp_converged_reason -ksp_monitor -ksp_rtol 1.e-6 -pc_type jacobi -info > -info_exclude null,pc,vec > > ------------------------------------------------------------ > -------------- > > [[55310,1],0]: A high-performance Open MPI point-to-point messaging > module > > was unable to find any relevant network interfaces: > > > > Module: OpenFabrics (openib) > > Host: SCRGP2 > > > > Another transport will be used instead, although this may result in > > lower performance. 
> > -------------------------------------------------------------------------- > > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8 X 8; storage space: 0 > unneeded,8 used > > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 > > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 > > [0] MatCheckCompressedRow(): Found the ratio (num_zerorows > 0)/(num_localrows 8) < 0.6. Do not use CompressedRow routines. > > [0] MatSeqAIJCheckInode(): Found 8 nodes out of 8 rows. Not using Inode > routines > > [0] DMGetDMSNES(): Creating new DMSNES > > [0] DMGetDMKSP(): Creating new DMKSP > > 0 SNES Function norm 1.414213562373e+00 > > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8 X 8; storage space: 0 > unneeded,8 used > > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 > > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 > > [0] MatCheckCompressedRow(): Found the ratio (num_zerorows > 0)/(num_localrows 8) < 0.6. Do not use CompressedRow routines. > > [0] SNESComputeJacobian(): Rebuilding preconditioner > > 0 KSP Residual norm 1.414213562373e+00 > > 1 KSP Residual norm 1.414213562373e+00 > > [0] KSPGMRESBuildSoln(): Likely your matrix or preconditioner is > singular. HH(it,it) is identically zero; it = 0 GRS(it) = 1.41421 > > Linear solve did not converge due to DIVERGED_BREAKDOWN iterations 1 > > [0] SNESSolve_KSPONLY(): iter=0, number linear solve failures 1 greater > than current SNES allowed, stopping solve > > Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations > 0 > > Number of SNES iterations = 0 > > L_2 Error: 0.751285 > > > > > > configured with: ./config/configure.py --with-shared-libraries > --with-clanguage=c++ --CFLAGS='-g -O0' --CXXFLAGS='-g -O0' > --download-ctetgen --download-triangle --with-debugging=yes > --with-exodusii-lib=[/usr/lib/x86_64-linux-gnu/libexoIIv2.so] > --with-netcdf-lib=[/usr/lib/libnetcdf.so] > --with-netcdf-include=/usr/include > --with-hdf5-include=/usr/include/hdf5/serial/ > --with-hdf5-lib=[/usr/lib/x86_64-linux-gnu/hdf5/serial/libhdf5.so] > --with-c2html=0 --with-exodusii-include=/usr/include > > > > > > thanks! > > David > > > > > > > > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL:
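(To make the fix in the thread above concrete: David's original run, repeated with the option Justin and Matt point to, would look like the following. This is a sketch of the invocation only, reusing David's own flags, not output from a verified run.)

$ ./ex12 -petscspace_order 1 -snes_type ksponly -snes_monitor -snes_converged_reason \
    -ksp_converged_reason -ksp_monitor -ksp_rtol 1.e-6 -pc_type jacobi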
From aliberkkahraman at yahoo.com Tue Oct 10 07:56:24 2017 From: aliberkkahraman at yahoo.com (Ali Berk Kahraman) Date: Tue, 10 Oct 2017 15:56:24 +0300 Subject: [petsc-users] TSSetMaxSteps gives undefined reference error Message-ID: <4619f363-2f64-a9c5-d06f-4df472ffa290@yahoo.com> Hello All, When I try to use TSSetMaxSteps function in my code, the compiler gives me "undefined reference to TSSetMaxSteps" error. I have petscts.h included, and my makefile is also operational for ts. Any ideas why this might be? I use petsc 3.7.3. The code sample is as follows,

#include <petscts.h>
.
.
.
.
TS ts;
ierr = TSCreate(PETSC_COMM_WORLD,&ts);CHKERRQ(ierr);
ierr = TSSetProblemType(ts,TS_LINEAR);CHKERRQ(ierr);
ierr = TSSetSolution(ts, dummyvec);CHKERRQ(ierr);
ierr = TSSetType(ts,TSRK);CHKERRQ(ierr);
ierr = TSSetTime(ts,time);CHKERRQ(ierr);
ierr = TSSetTimeStep(ts,timestep);CHKERRQ(ierr);
ierr = TSSetExactFinalTime(ts,TS_EXACTFINALTIME_STEPOVER);CHKERRQ(ierr);
ierr = TSSetMaxSteps(ts,maxsteps);CHKERRQ(ierr);
ierr = TSSetRHSFunction(ts,residual,FormRHSFunction,&mycontext);CHKERRQ(ierr);
ierr = TSSolve(ts,uJi);CHKERRQ(ierr);

From hongzhang at anl.gov Tue Oct 10 09:14:23 2017 From: hongzhang at anl.gov (Zhang, Hong) Date: Tue, 10 Oct 2017 14:14:23 +0000 Subject: [petsc-users] TSSetMaxSteps gives undefined reference error In-Reply-To: <4619f363-2f64-a9c5-d06f-4df472ffa290@yahoo.com> References: <4619f363-2f64-a9c5-d06f-4df472ffa290@yahoo.com> Message-ID: <5662200F-7E2A-4E43-BF98-B8C4F3FF7A08@anl.gov> TSSetMaxSteps() was added in PETSc 3.8. You can either update PETSc or use TSSetDuration() in older versions. Hong (Mr.) > On Oct 10, 2017, at 7:56 AM, Ali Berk Kahraman wrote: > > Hello All, > > When I try to use TSSetMaxSteps function in my code, the compiler gives me "undefined reference to TSSetMaxSteps" error. I have petscts.h included, and my makefile is also operational for ts. Any ideas why this might be? I use petsc 3.7.3. The code sample is as follows, > > > #include > > . > > . > > . > > . > > TS ts; > ierr= TSCreate(PETSC_COMM_WORLD,&ts); > CHKERRQ(ierr); > ierr= TSSetProblemType(ts,TS_LINEAR); > CHKERRQ(ierr); > ierr= TSSetSolution(ts, dummyvec); > CHKERRQ(ierr); > ierr= TSSetType(ts,TSRK); > CHKERRQ(ierr); > ierr= TSSetTime(ts,time); > CHKERRQ(ierr); > ierr= TSSetTimeStep(ts,timestep); > CHKERRQ(ierr); > ierr=TSSetExactFinalTime(ts,TS_EXACTFINALTIME_STEPOVER); > CHKERRQ(ierr); > ierr=TSSetMaxSteps(ts,maxsteps);CHKERRQ(ierr); > TSSetRHSFunction(ts,residual, > FormRHSFunction,&mycontext); > ierr= TSSolve(ts,uJi); > CHKERRQ(ierr); > From zakaryah at gmail.com Tue Oct 10 11:08:11 2017 From: zakaryah at gmail.com (zakaryah .) Date: Tue, 10 Oct 2017 12:08:11 -0400 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> References: <98D45588-381F-44FB-9D20-65D1638E357F@mcs.anl.gov> <876F0ACF-62C0-4DC9-B203-2893B25E3938@mcs.anl.gov> <1D94CC99-2395-4BBE-A0F3-80D23D85FA45@mcs.anl.gov> <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> Message-ID: Thanks for clearing that up. I'd appreciate any further help. Here's a summary: My ultimate goal is to find a vector field which minimizes an action. The action is a (nonlinear) function of the field and its first spatial derivatives. My current approach is to derive the (continuous) Euler-Lagrange equations, which results in a nonlinear PDE that the minimizing field must satisfy. These Euler-Lagrange equations are then discretized, and I'm trying to use an SNES to solve them. The problem is that the solver seems to reach a point at which the Jacobian (this corresponds to the second variation of the action, which is like a Hessian of the energy) becomes nearly singular, but where the residual (RHS of PDE) is not close to zero. The residual does not decrease over additional SNES iterations, and the line search results in tiny step sizes.
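(Matt suggests further down in this thread trying a nonlinear solver that does not depend on that particular point, e.g. NGMRES or NRichardson. A minimal sketch of that composition follows; the SNES setup shown is illustrative, not taken from the poster's code:)

/* Sketch: use NGMRES with a Newton line-search solver as its
   nonlinear preconditioner.  The equivalent command-line options
   would be: -snes_type ngmres -npc_snes_type newtonls */
SNES           snes, npc;
PetscErrorCode ierr;
ierr = SNESCreate(PETSC_COMM_WORLD,&snes);CHKERRQ(ierr);
ierr = SNESSetType(snes,SNESNGMRES);CHKERRQ(ierr);
ierr = SNESGetNPC(snes,&npc);CHKERRQ(ierr);
ierr = SNESSetType(npc,SNESNEWTONLS);CHKERRQ(ierr);
ierr = SNESSetFromOptions(snes);CHKERRQ(ierr);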
My interpretation is that this point of stagnation is a critical point. I have checked the hand-coded Jacobian very carefully and I am confident that it is correct. I am guessing that such a situation is well-known in the field, but I don't know the lingo or literature. If anyone has suggestions I'd be thrilled. Are there documentation/methodologies within PETSc for this type of situation? Is there any advantage to discretizing the action itself and using the optimization routines? With minor modifications I'll have the gradient and Hessian calculations coded. Are the optimization routines likely to stagnate in the same way as the nonlinear solver, or can they take advantage of the structure of the problem to overcome this? Thanks a lot in advance for any help. On Sun, Oct 8, 2017 at 5:57 AM, Barry Smith wrote: > > There is apparently confusing in understanding the ordering. Is this all > on one process that you get funny results? Are you using > MatSetValuesStencil() to provide the matrix (it is generally easier than > providing it yourself). In parallel MatView() always maps the rows and > columns to the natural ordering before printing, if you use a matrix > created from the DMDA. If you create the matrix yourself it has a different > MatView in parallel that is in in thePETSc ordering.\ > > > Barry > > > > > On Oct 8, 2017, at 8:05 AM, zakaryah . wrote: > > > > I'm more confused than ever. I don't understand the output of > -snes_type test -snes_test_display. > > > > For the user-defined state of the vector (where I'd like to test the > Jacobian), the finite difference Jacobian at row 0 evaluates as: > > > > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, > 287.797) (5, 744.695) (9, -1454.66) (10, 6.08793) (11, 148.172) (12, > 13.1089) (13, -36.5783) (14, -9.99399) (27, -3423.49) (28, -2175.34) > (29, 548.662) (30, 145.753) (31, 17.6603) (32, -15.1079) (36, 76.8575) > (37, 16.325) (38, 4.83918) > > > > But the hand-coded Jacobian at row 0 evaluates as: > > > > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, > 287.797) (5, 744.695) (33, -1454.66) (34, 6.08792) (35, 148.172) (36, > 13.1089) (37, -36.5783) (38, -9.99399) (231, -3423.49) (232, -2175.34) > (233, 548.662) (234, 145.753) (235, 17.6603) (236, -15.1079) (264, > 76.8575) (265, 16.325) (266, 4.83917) (267, 0.) (268, 0.) (269, 0.) > > and the difference between the Jacobians at row 0 evaluates as: > > > > row 0: (0, 0.000189908) (1, 7.17315e-05) (2, 9.31778e-05) (3, > 0.000514947) (4, 0.000178659) (5, 0.000178217) (9, -2.25457e-05) (10, > -6.34278e-06) (11, -5.93241e-07) (12, 9.48544e-06) (13, 4.79709e-06) > (14, 2.40016e-06) (27, -0.000335696) (28, -0.000106734) (29, > -0.000106653) (30, 2.73119e-06) (31, -7.93382e-07) (32, 1.24048e-07) > (36, -4.0302e-06) (37, 3.67494e-06) (38, -2.70115e-06) (39, 0.) (40, > 0.) (41, 0.) > > > > The difference between the column numbering between the finite > difference and the hand-coded Jacobians looks like a serious problem to me, > but I'm probably missing something. > > > > I am trying to use a 3D DMDA with 3 dof, a box stencil of width 1, and > for this test problem the grid dimensions are 11x7x6. For a grid point > x,y,z, and dof c, is the index calculated as c + 3*x + 3*11*y + 3*11*7*z? > If so, then the column numbers of the hand-coded Jacobian match those of > the 27 point stencil I have in mind. However, I am then at a loss to > explain the column numbers in the finite difference Jacobian. > > > > > > > > > > On Sat, Oct 7, 2017 at 1:49 PM, zakaryah . 
wrote: > > OK - I ran with -snes_monitor -snes_converged_reason > -snes_linesearch_monitor -ksp_monitor -ksp_monitor_true_residual > -ksp_converged_reason -pc_type svd -pc_svd_monitor -snes_type newtonls > -snes_compare_explicit > > > > and here is the full error message, output immediately after > > > > Finite difference Jacobian > > Mat Object: 24 MPI processes > > type: mpiaij > > > > [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > > > > [0]PETSC ERROR: Invalid argument > > > > [0]PETSC ERROR: Matrix not generated from a DMDA > > > > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html > for trouble shooting. > > > > [0]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017 > > > > [0]PETSC ERROR: ./CalculateOpticalFlow on a arch-linux2-c-opt named > node046.hpc.rockefeller.internal by zfrentz Sat Oct 7 13:44:44 2017 > > > > [0]PETSC ERROR: Configure options --prefix=/ru-auth/local/home/zfrentz/PETSc-3.7.6 > --download-fblaslapack -with-debugging=0 > > > > [0]PETSC ERROR: #1 MatView_MPI_DA() line 551 in /rugpfs/fs0/home/zfrentz/ > PETSc/build/petsc-3.7.6/src/dm/impls/da/fdda.c > > > > [0]PETSC ERROR: #2 MatView() line 901 in /rugpfs/fs0/home/zfrentz/ > PETSc/build/petsc-3.7.6/src/mat/interface/matrix.c > > > > [0]PETSC ERROR: #3 SNESComputeJacobian() line 2371 in > /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/interface/snes.c > > > > [0]PETSC ERROR: #4 SNESSolve_NEWTONLS() line 228 in > /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/impls/ls/ls.c > > > > [0]PETSC ERROR: #5 SNESSolve() line 4005 in /rugpfs/fs0/home/zfrentz/ > PETSc/build/petsc-3.7.6/src/snes/interface/snes.c > > > > [0]PETSC ERROR: #6 solveWarp3D() line 659 in /ru-auth/local/home/zfrentz/ > Code/OpticalFlow/working/October6_2017/mshs.c > > > > > > On Tue, Oct 3, 2017 at 5:37 PM, Jed Brown wrote: > > Always always always send the whole error message. > > > > "zakaryah ." writes: > > > > > I tried -snes_compare_explicit, and got the following error: > > > > > > [0]PETSC ERROR: Invalid argument > > > > > > [0]PETSC ERROR: Matrix not generated from a DMDA > > > > > > What am I doing wrong? > > > > > > On Tue, Oct 3, 2017 at 10:08 AM, Jed Brown wrote: > > > > > >> Barry Smith writes: > > >> > > >> >> On Oct 3, 2017, at 5:54 AM, zakaryah . wrote: > > >> >> > > >> >> I'm still working on this. I've made some progress, and it looks > like > > >> the issue is with the KSP, at least for now. The Jacobian may be > > >> ill-conditioned. Is it possible to use -snes_test_display during an > > >> intermediate step of the analysis? I would like to inspect the > Jacobian > > >> after several solves have already completed, > > >> > > > >> > No, our currently code for testing Jacobians is poor quality and > > >> poorly organized. Needs a major refactoring to do things properly. > Sorry > > >> > > >> You can use -snes_compare_explicit or -snes_compare_coloring to output > > >> differences on each Newton step. > > >> > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From rchurchi at pppl.gov Tue Oct 10 11:59:54 2017 From: rchurchi at pppl.gov (Randy Michael Churchill) Date: Tue, 10 Oct 2017 12:59:54 -0400 Subject: [petsc-users] TAO setup with modules in Fortran 90 In-Reply-To: References: <88546AF0-385D-44DA-965D-169A6DBAA523@mcs.anl.gov> Message-ID: Thanks, I have the rosenbrock1f.F90 example compiling and running now with both 3.7 and 3.8. 
Transferring this setup knowledge to my codebase, I ran into a problem, since I set up my module a little differently than the rosenbrock1f example. In the specification part of the module, I put a Tao object, e.g., using the rosenbrock1f as an example I do:

#include
module commondat
PetscReal :: alpha
PetscInt :: n
Tao :: taotest
end module commondat

The compiler throws errors on this, however:
commondat.F90(22): error #5082: Syntax error, found '::' when expecting one of: => = . [ % ( :
Tao :: taotest
----------^
commondat.F90(22): error #6274: This statement must not appear in the specification part of a module.
Tao :: taotest
------^
commondat.F90(22): error #6793: The POINTER attribute is required. [TAO]
Tao :: taotest
------^

What is the right way to use this? Do I have to define a type within the module for the Tao object to be in? Or is this an issue in 3.7 when using only the petscdef.h instead of petsc.h? And thanks for the advice on moving to Petsc 3.8; we work with some Petsc people who know our code base, so we will discuss working towards that with them. On Fri, Oct 6, 2017 at 5:59 PM, Barry Smith wrote: > > > On Oct 6, 2017, at 11:18 PM, Randy Michael Churchill > wrote: > > > > So if I'm limited to petsc 3.7.6 for reasons of eventually using within > an existing, larger codebase that depends on 3.7.6, is it possible to use > TAO with a user-defined module in Fortran90 using 3.7.6? > > You should really push this "existing, larger codebase" to transition to > PETSc 3.8 sooner, rather than later. Especially for developments in Fortran > it will make life better for everyone. We are always willing to help users, > once they have read the changes information, with information to make > transitioning to the latest PETSc version easy. For any parts of the code > in C, transitioning from 3.7 to 3.8 should be really simple. > > Barry > > > > > I had tried the various forms of includes listed in the documentation, > e.g. see below. I think I now realize this is an issue with the petsc > installation on Edison, it does not seem to have the petsctao module in the > library file (confirmed using nm -D on the library file). If I do the same > include and use statement but with, for example, petscmat, it compiles fine. > > > > I built v3.8 from source, and the petsctao module is in the library > file, and now the make works. > > > > commondat.F90 > > module commondat > > #include > > use petsc > > PetscReal :: alpha > > PetscInt :: n > > end module commondat > > > > program rosenbrock1f > > #include > > use petsctao > > use commondat > > > > > > > > > > On Fri, Oct 6, 2017 at 7:54 AM, Matthew Knepley > wrote: > > On Fri, Oct 6, 2017 at 7:36 AM, Barry Smith wrote: > > > > Randy, > > > > First you absolutely must use version 3.8 or the master development > copy. We improved and simplified dramatically how Fortran (90) is utilized > from PETSc. > > > > Note that there is only one simple set of include files and modules > for Fortran; see the newest documentation. > > > > > > http://www.mcs.anl.gov/petsc/petsc-master/docs/manualpages/Sys/UsingFortran.html > > > > Matt > > > > > > Barry > > > > > > > On Oct 5, 2017, at 11:48 PM, Randy Michael Churchill < > rchurchi at pppl.gov> wrote: > > > > > > A simple setup question with TAO: if I were to convert the > rosenbrock1f.F90 example to use a module instead of common structures, how > would I setup the include statements?
I've tried various combinations > (using petscXXXdef.h, petscXXX.h, petscXXX.h90, along with use petscXXX), > but seem to get errors with all. > > > > > > file:rosenbrock1f.h: > > > module commondat > > > PetscReal :: alpha > > > PetscInt :: n > > > end module commondat > > > > > > file:rosenbrock1f.90: > > > program rosenbrock1f > > > !!include statements??? which and where???!!! > > > use commondat > > > ... > > > > > > subroutine FormFunctionGradient(tao, X, f, G, dummy, ierr) > > > use commondat > > > implicit none > > > ... > > > > > > (https://www.mcs.anl.gov/petsc/petsc-dev/src/tao/ > unconstrained/examples/tutorials/rosenbrock1f.F90.html) > > > > > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > > > > > > > -- > > R. Michael Churchill > > -- R. Michael Churchill -------------- next part -------------- An HTML attachment was scrubbed... URL: From Lukasz.Kaczmarczyk at glasgow.ac.uk Tue Oct 10 12:06:29 2017 From: Lukasz.Kaczmarczyk at glasgow.ac.uk (Lukasz Kaczmarczyk) Date: Tue, 10 Oct 2017 17:06:29 +0000 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: References: <98D45588-381F-44FB-9D20-65D1638E357F@mcs.anl.gov> <876F0ACF-62C0-4DC9-B203-2893B25E3938@mcs.anl.gov> <1D94CC99-2395-4BBE-A0F3-80D23D85FA45@mcs.anl.gov> <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> Message-ID: <98C511CF-59AB-40D8-9B79-AE5358697C28@glasgow.ac.uk> On 10 Oct 2017, at 17:08, zakaryah . > wrote: Thanks for clearing that up. I'd appreciate any further help. Here's a summary: My ultimate goal is to find a vector field which minimizes an action. The action is a (nonlinear) function of the field and its first spatial derivatives. My current approach is to derive the (continuous) Euler-Lagrange equations, which results in a nonlinear PDE that the minimizing field must satisfy. These Euler-Lagrange equations are then discretized, and I'm trying to use an SNES to solve them. The problem is that the solver seems to reach a point at which the Jacobian (this corresponds to the second variation of the action, which is like a Hessian of the energy) becomes nearly singular, but where the residual (RHS of PDE) is not close to zero. The residual does not decrease over additional SNES iterations, and the line search results in tiny step sizes. My interpretation is that this point of stagnation is a critical point. I have checked the hand-coded Jacobian very carefully and I am confident that it is correct. I am guessing that such a situation is well-known in the field, but I don't know the lingo or literature. If anyone has suggestions I'd be thrilled. Are there documentation/methodologies within PETSc for this type of situation? Is there any advantage to discretizing the action itself and using the optimization routines? With minor modifications I'll have the gradient and Hessian calculations coded. Are the optimization routines likely to stagnate in the same way as the nonlinear solver, or can they take advantage of the structure of the problem to overcome this? Thanks a lot in advance for any help. Hello, The problem similar to yours, i.e. 
a singular tangent matrix, is well known in structural and solid mechanics, where it arises when the structure becomes unstable and buckles. There are good and bad methods to solve this. One is dynamic relaxation, which is not consistent. A good method is to use a control equation, which controls the external forces/boundary conditions. Search for spherical arc-length control and the many derivatives of this method. The control equation is a scalar equation which adds "moustaches" to the matrix; you can make a shell preconditioner which solves such a system efficiently with iterative solvers. I have been using this for some time with petsc with great success. Regards, Lukasz On Sun, Oct 8, 2017 at 5:57 AM, Barry Smith > wrote: There is apparently confusion in understanding the ordering. Is this all on one process that you get funny results? Are you using MatSetValuesStencil() to provide the matrix (it is generally easier than providing it yourself). In parallel MatView() always maps the rows and columns to the natural ordering before printing, if you use a matrix created from the DMDA. If you create the matrix yourself it has a different MatView in parallel that is in the PETSc ordering. Barry > On Oct 8, 2017, at 8:05 AM, zakaryah . > wrote: > > I'm more confused than ever. I don't understand the output of -snes_type test -snes_test_display. > > For the user-defined state of the vector (where I'd like to test the Jacobian), the finite difference Jacobian at row 0 evaluates as: > > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, 287.797) (5, 744.695) (9, -1454.66) (10, 6.08793) (11, 148.172) (12, 13.1089) (13, -36.5783) (14, -9.99399) (27, -3423.49) (28, -2175.34) (29, 548.662) (30, 145.753) (31, 17.6603) (32, -15.1079) (36, 76.8575) (37, 16.325) (38, 4.83918) > > But the hand-coded Jacobian at row 0 evaluates as: > > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, 287.797) (5, 744.695) (33, -1454.66) (34, 6.08792) (35, 148.172) (36, 13.1089) (37, -36.5783) (38, -9.99399) (231, -3423.49) (232, -2175.34) (233, 548.662) (234, 145.753) (235, 17.6603) (236, -15.1079) (264, 76.8575) (265, 16.325) (266, 4.83917) (267, 0.) (268, 0.) (269, 0.) > and the difference between the Jacobians at row 0 evaluates as: > > row 0: (0, 0.000189908) (1, 7.17315e-05) (2, 9.31778e-05) (3, 0.000514947) (4, 0.000178659) (5, 0.000178217) (9, -2.25457e-05) (10, -6.34278e-06) (11, -5.93241e-07) (12, 9.48544e-06) (13, 4.79709e-06) (14, 2.40016e-06) (27, -0.000335696) (28, -0.000106734) (29, -0.000106653) (30, 2.73119e-06) (31, -7.93382e-07) (32, 1.24048e-07) (36, -4.0302e-06) (37, 3.67494e-06) (38, -2.70115e-06) (39, 0.) (40, 0.) (41, 0.) > > The difference between the column numbering between the finite difference and the hand-coded Jacobians looks like a serious problem to me, but I'm probably missing something. > > I am trying to use a 3D DMDA with 3 dof, a box stencil of width 1, and for this test problem the grid dimensions are 11x7x6. For a grid point x,y,z, and dof c, is the index calculated as c + 3*x + 3*11*y + 3*11*7*z? If so, then the column numbers of the hand-coded Jacobian match those of the 27 point stencil I have in mind. However, I am then at a loss to explain the column numbers in the finite difference Jacobian. > > > > > On Sat, Oct 7, 2017 at 1:49 PM, zakaryah .
> wrote: > OK - I ran with -snes_monitor -snes_converged_reason -snes_linesearch_monitor -ksp_monitor -ksp_monitor_true_residual -ksp_converged_reason -pc_type svd -pc_svd_monitor -snes_type newtonls -snes_compare_explicit > > and here is the full error message, output immediately after > > Finite difference Jacobian > Mat Object: 24 MPI processes > type: mpiaij > > [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > > [0]PETSC ERROR: Invalid argument > > [0]PETSC ERROR: Matrix not generated from a DMDA > > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. > > [0]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017 > > [0]PETSC ERROR: ./CalculateOpticalFlow on a arch-linux2-c-opt named node046.hpc.rockefeller.internal by zfrentz Sat Oct 7 13:44:44 2017 > > [0]PETSC ERROR: Configure options --prefix=/ru-auth/local/home/zfrentz/PETSc-3.7.6 --download-fblaslapack -with-debugging=0 > > [0]PETSC ERROR: #1 MatView_MPI_DA() line 551 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/dm/impls/da/fdda.c > > [0]PETSC ERROR: #2 MatView() line 901 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/mat/interface/matrix.c > > [0]PETSC ERROR: #3 SNESComputeJacobian() line 2371 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/interface/snes.c > > [0]PETSC ERROR: #4 SNESSolve_NEWTONLS() line 228 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/impls/ls/ls.c > > [0]PETSC ERROR: #5 SNESSolve() line 4005 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/interface/snes.c > > [0]PETSC ERROR: #6 solveWarp3D() line 659 in /ru-auth/local/home/zfrentz/Code/OpticalFlow/working/October6_2017/mshs.c > > > On Tue, Oct 3, 2017 at 5:37 PM, Jed Brown > wrote: > Always always always send the whole error message. > > "zakaryah ." > writes: > > > I tried -snes_compare_explicit, and got the following error: > > > > [0]PETSC ERROR: Invalid argument > > > > [0]PETSC ERROR: Matrix not generated from a DMDA > > > > What am I doing wrong? > > > > On Tue, Oct 3, 2017 at 10:08 AM, Jed Brown > wrote: > > > >> Barry Smith > writes: > >> > >> >> On Oct 3, 2017, at 5:54 AM, zakaryah . > wrote: > >> >> > >> >> I'm still working on this. I've made some progress, and it looks like > >> the issue is with the KSP, at least for now. The Jacobian may be > >> ill-conditioned. Is it possible to use -snes_test_display during an > >> intermediate step of the analysis? I would like to inspect the Jacobian > >> after several solves have already completed, > >> > > >> > No, our currently code for testing Jacobians is poor quality and > >> poorly organized. Needs a major refactoring to do things properly. Sorry > >> > >> You can use -snes_compare_explicit or -snes_compare_coloring to output > >> differences on each Newton step. > >> > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From zakaryah at gmail.com Tue Oct 10 12:47:24 2017 From: zakaryah at gmail.com (zakaryah .) 
Date: Tue, 10 Oct 2017 13:47:24 -0400 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: <98C511CF-59AB-40D8-9B79-AE5358697C28@glasgow.ac.uk> References: <98D45588-381F-44FB-9D20-65D1638E357F@mcs.anl.gov> <876F0ACF-62C0-4DC9-B203-2893B25E3938@mcs.anl.gov> <1D94CC99-2395-4BBE-A0F3-80D23D85FA45@mcs.anl.gov> <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> <98C511CF-59AB-40D8-9B79-AE5358697C28@glasgow.ac.uk> Message-ID: Thanks Lukasz, this is extremely helpful. By dynamic relaxation, do you mean converting the elliptic PDEs (Euler-Lagrange eqs.) to parabolic PDEs by introducing time, i.e. instead of first variation = 0, solve dx/dt = -nu first variation, where nu is like a viscosity? It seems to me that this should approach an extremum of the Lagrangian. What do you mean by not consistent? I will look into this and the other ideas you mentioned - thanks again! Cheers, Zak On Tue, Oct 10, 2017 at 1:06 PM, Lukasz Kaczmarczyk < Lukasz.Kaczmarczyk at glasgow.ac.uk> wrote: > > > > On 10 Oct 2017, at 17:08, zakaryah . wrote: > > Thanks for clearing that up. > > I'd appreciate any further help. Here's a summary: > > My ultimate goal is to find a vector field which minimizes an action. The > action is a (nonlinear) function of the field and its first spatial > derivatives. > > My current approach is to derive the (continuous) Euler-Lagrange > equations, which results in a nonlinear PDE that the minimizing field must > satisfy. These Euler-Lagrange equations are then discretized, and I'm > trying to use an SNES to solve them. > > The problem is that the solver seems to reach a point at which the > Jacobian (this corresponds to the second variation of the action, which is > like a Hessian of the energy) becomes nearly singular, but where the > residual (RHS of PDE) is not close to zero. The residual does not decrease > over additional SNES iterations, and the line search results in tiny step > sizes. My interpretation is that this point of stagnation is a critical > point. > > I have checked the hand-coded Jacobian very carefully and I am confident > that it is correct. > > I am guessing that such a situation is well-known in the field, but I > don't know the lingo or literature. If anyone has suggestions I'd be > thrilled. Are there documentation/methodologies within PETSc for this type > of situation? > > Is there any advantage to discretizing the action itself and using the > optimization routines? With minor modifications I'll have the gradient and > Hessian calculations coded. Are the optimization routines likely to > stagnate in the same way as the nonlinear solver, or can they take > advantage of the structure of the problem to overcome this? > > Thanks a lot in advance for any help. > > > > Hello, > > The problem similar to yours, i.e. singular tangent matrix is well known > in structural and solid mechanics when the structure becomes unstable and > buckling. There are good and bad methods to solve this. > > One is dynamic relaxation, which is not consistent. A good method is to > use control equation, which controls external forces/boundary conditions. > Search for spherical arc-length control and many derivatives of this method. 
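(To make the arc-length idea concrete, in generic textbook notation rather than Lukasz's formulation: writing the discrete equilibrium residual as r(u, lambda) = f_int(u) - lambda*f_ext, each step solves the bordered system

    K du - f_ext dlambda = -r
    g(du, dlambda) = 0,   with e.g.  g = du.du + psi^2 dlambda^2 - ds^2

for the spherical variant, where K is the tangent matrix, lambda scales the external load, psi is a scaling parameter, and ds is the prescribed arc-length increment. The extra row and column are the "moustaches" mentioned in the next paragraph; at a simple limit point the bordered matrix remains nonsingular even though K itself is singular, which is exactly the stagnation symptom described above.)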
> > Control equation is a scalar equation, and add "moustaches" to the matrix, > you can make shell preconditioner which solves such system efficiently with > iterative solvers. I using this for some time with petsc with great > success. > > Regards, > Lukasz > > > > On Sun, Oct 8, 2017 at 5:57 AM, Barry Smith wrote: > >> >> There is apparently confusing in understanding the ordering. Is this >> all on one process that you get funny results? Are you using >> MatSetValuesStencil() to provide the matrix (it is generally easier than >> providing it yourself). In parallel MatView() always maps the rows and >> columns to the natural ordering before printing, if you use a matrix >> created from the DMDA. If you create the matrix yourself it has a different >> MatView in parallel that is in in thePETSc ordering.\ >> >> >> Barry >> >> >> >> > On Oct 8, 2017, at 8:05 AM, zakaryah . wrote: >> > >> > I'm more confused than ever. I don't understand the output of >> -snes_type test -snes_test_display. >> > >> > For the user-defined state of the vector (where I'd like to test the >> Jacobian), the finite difference Jacobian at row 0 evaluates as: >> > >> > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, >> 287.797) (5, 744.695) (9, -1454.66) (10, 6.08793) (11, 148.172) (12, >> 13.1089) (13, -36.5783) (14, -9.99399) (27, -3423.49) (28, -2175.34) >> (29, 548.662) (30, 145.753) (31, 17.6603) (32, -15.1079) (36, 76.8575) >> (37, 16.325) (38, 4.83918) >> > >> > But the hand-coded Jacobian at row 0 evaluates as: >> > >> > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, >> 287.797) (5, 744.695) (33, -1454.66) (34, 6.08792) (35, 148.172) (36, >> 13.1089) (37, -36.5783) (38, -9.99399) (231, -3423.49) (232, -2175.34) >> (233, 548.662) (234, 145.753) (235, 17.6603) (236, -15.1079) (264, >> 76.8575) (265, 16.325) (266, 4.83917) (267, 0.) (268, 0.) (269, 0.) >> > and the difference between the Jacobians at row 0 evaluates as: >> > >> > row 0: (0, 0.000189908) (1, 7.17315e-05) (2, 9.31778e-05) (3, >> 0.000514947) (4, 0.000178659) (5, 0.000178217) (9, -2.25457e-05) (10, >> -6.34278e-06) (11, -5.93241e-07) (12, 9.48544e-06) (13, 4.79709e-06) >> (14, 2.40016e-06) (27, -0.000335696) (28, -0.000106734) (29, >> -0.000106653) (30, 2.73119e-06) (31, -7.93382e-07) (32, 1.24048e-07) >> (36, -4.0302e-06) (37, 3.67494e-06) (38, -2.70115e-06) (39, 0.) (40, >> 0.) (41, 0.) >> > >> > The difference between the column numbering between the finite >> difference and the hand-coded Jacobians looks like a serious problem to me, >> but I'm probably missing something. >> > >> > I am trying to use a 3D DMDA with 3 dof, a box stencil of width 1, and >> for this test problem the grid dimensions are 11x7x6. For a grid point >> x,y,z, and dof c, is the index calculated as c + 3*x + 3*11*y + 3*11*7*z? >> If so, then the column numbers of the hand-coded Jacobian match those of >> the 27 point stencil I have in mind. However, I am then at a loss to >> explain the column numbers in the finite difference Jacobian. >> > >> > >> > >> > >> > On Sat, Oct 7, 2017 at 1:49 PM, zakaryah . 
wrote: >> > OK - I ran with -snes_monitor -snes_converged_reason >> -snes_linesearch_monitor -ksp_monitor -ksp_monitor_true_residual >> -ksp_converged_reason -pc_type svd -pc_svd_monitor -snes_type newtonls >> -snes_compare_explicit >> > >> > and here is the full error message, output immediately after >> > >> > Finite difference Jacobian >> > Mat Object: 24 MPI processes >> > type: mpiaij >> > >> > [0]PETSC ERROR: --------------------- Error Message >> -------------------------------------------------------------- >> > >> > [0]PETSC ERROR: Invalid argument >> > >> > [0]PETSC ERROR: Matrix not generated from a DMDA >> > >> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html >> for trouble shooting. >> > >> > [0]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017 >> > >> > [0]PETSC ERROR: ./CalculateOpticalFlow on a arch-linux2-c-opt named >> node046.hpc.rockefeller.internal by zfrentz Sat Oct 7 13:44:44 2017 >> > >> > [0]PETSC ERROR: Configure options --prefix=/ru-auth/local/home/zfrentz/PETSc-3.7.6 >> --download-fblaslapack -with-debugging=0 >> > >> > [0]PETSC ERROR: #1 MatView_MPI_DA() line 551 in >> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/dm/impls/da/fdda.c >> > >> > [0]PETSC ERROR: #2 MatView() line 901 in /rugpfs/fs0/home/zfrentz/PETSc >> /build/petsc-3.7.6/src/mat/interface/matrix.c >> > >> > [0]PETSC ERROR: #3 SNESComputeJacobian() line 2371 in >> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/ >> interface/snes.c >> > >> > [0]PETSC ERROR: #4 SNESSolve_NEWTONLS() line 228 in >> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/impls/ls/ls.c >> > >> > [0]PETSC ERROR: #5 SNESSolve() line 4005 in >> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/ >> interface/snes.c >> > >> > [0]PETSC ERROR: #6 solveWarp3D() line 659 in >> /ru-auth/local/home/zfrentz/Code/OpticalFlow/working/October6_2017/mshs.c >> > >> > >> > On Tue, Oct 3, 2017 at 5:37 PM, Jed Brown wrote: >> > Always always always send the whole error message. >> > >> > "zakaryah ." writes: >> > >> > > I tried -snes_compare_explicit, and got the following error: >> > > >> > > [0]PETSC ERROR: Invalid argument >> > > >> > > [0]PETSC ERROR: Matrix not generated from a DMDA >> > > >> > > What am I doing wrong? >> > > >> > > On Tue, Oct 3, 2017 at 10:08 AM, Jed Brown wrote: >> > > >> > >> Barry Smith writes: >> > >> >> > >> >> On Oct 3, 2017, at 5:54 AM, zakaryah . >> wrote: >> > >> >> >> > >> >> I'm still working on this. I've made some progress, and it looks >> like >> > >> the issue is with the KSP, at least for now. The Jacobian may be >> > >> ill-conditioned. Is it possible to use -snes_test_display during an >> > >> intermediate step of the analysis? I would like to inspect the >> Jacobian >> > >> after several solves have already completed, >> > >> > >> > >> > No, our currently code for testing Jacobians is poor quality and >> > >> poorly organized. Needs a major refactoring to do things properly. >> Sorry >> > >> >> > >> You can use -snes_compare_explicit or -snes_compare_coloring to >> output >> > >> differences on each Newton step. >> > >> >> > >> > >> >> > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From aliberkkahraman at yahoo.com Tue Oct 10 12:53:16 2017 From: aliberkkahraman at yahoo.com (Ali Berk Kahraman) Date: Tue, 10 Oct 2017 20:53:16 +0300 Subject: [petsc-users] TSSetMaxSteps gives undefined reference error In-Reply-To: <5662200F-7E2A-4E43-BF98-B8C4F3FF7A08@anl.gov> References: <4619f363-2f64-a9c5-d06f-4df472ffa290@yahoo.com> <5662200F-7E2A-4E43-BF98-B8C4F3FF7A08@anl.gov> Message-ID: This solves the problem. Thank you. On 10-10-2017 17:14, Zhang, Hong wrote: > TSSetMaxSteps() was added in PETSc 3.8. You can either update PETSc or use TSSetDuration() in older versions. > > Hong (Mr.) > > >> On Oct 10, 2017, at 7:56 AM, Ali Berk Kahraman wrote: >> >> Hello All, >> >> When I try to use TSSetMaxSteps function in my code, the compiler gives me "undefined reference to TSSetMaxSteps" error. I have petscts.h included, and my makefile is also operational for ts. Any ideas why this might be? I use petsc 3.7.3. The code sample is as follows, >> >> >> #include >> >> . >> >> . >> >> . >> >> . >> >> TS ts; >> ierr= TSCreate(PETSC_COMM_WORLD,&ts); >> CHKERRQ(ierr); >> ierr= TSSetProblemType(ts,TS_LINEAR); >> CHKERRQ(ierr); >> ierr= TSSetSolution(ts, dummyvec); >> CHKERRQ(ierr); >> ierr= TSSetType(ts,TSRK); >> CHKERRQ(ierr); >> ierr= TSSetTime(ts,time); >> CHKERRQ(ierr); >> ierr= TSSetTimeStep(ts,timestep); >> CHKERRQ(ierr); >> ierr=TSSetExactFinalTime(ts,TS_EXACTFINALTIME_STEPOVER); >> CHKERRQ(ierr); >> ierr=TSSetMaxSteps(ts,maxsteps);CHKERRQ(ierr); >> TSSetRHSFunction(ts,residual, >> FormRHSFunction,&mycontext); >> ierr= TSSolve(ts,uJi); >> CHKERRQ(ierr); >> From bsmith at mcs.anl.gov Tue Oct 10 14:08:19 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 10 Oct 2017 21:08:19 +0200 Subject: [petsc-users] TAO setup with modules in Fortran 90 In-Reply-To: References: <88546AF0-385D-44DA-965D-169A6DBAA523@mcs.anl.gov> Message-ID: <3D63D457-5400-4EE8-ADDB-2248A53446F4@mcs.anl.gov> I don't understand. Are you trying to have the same code that works with 3.7 and 3.8? If so, DO NOT! You will need to write code for 3.8 and just use it (by all means compare the numerical runs with code that works with 3.7 to make sure nothing is broken but only that). If you are having trouble getting something working properly with 3.8 then send a sample code that you think should work and doesn't and we'll help you. Barry > On Oct 10, 2017, at 6:59 PM, Randy Michael Churchill wrote: > > Thanks, I have the rosenbrock1f.F90 example compiling and running now with both 3.7 and 3.8. Transferring this setup knowledge to my codebase, I ran into a problem, since I setup my module a little differently than the rosenbrock1f example. In the specification part of the module, I put a Tao object, e.g using the rosebrock1f as an example I do: > > #include > module commondat > PetscReal :: alpha > PetscInt :: n > Tao :: taotest > end module commondat > > The compiler throws errors on this however. : > commondat.F90(22): error #5082: Syntax error, found '::' when expecting one of: => = . [ % ( : > Tao :: taotest > ----------^ > commondat.F90(22): error #6274: This statement must not appear in the specification part of a module. > Tao :: taotest > ------^ > commondat.F90(22): error #6793: The POINTER attribute is required. [TAO] > Tao :: taotest > ------^ > > What is the right way to use this? Do I have to define a type within the module for the Tao object to be in? Or is this an issue in 3.7 when using only the petscdef.h instead of petsc.h?
> > And thanks for the advice on moving to Petsc 3.8, we work with some Petsc people, who know our code base, so will discuss with them on working towards that. > > On Fri, Oct 6, 2017 at 5:59 PM, Barry Smith wrote: > > > On Oct 6, 2017, at 11:18 PM, Randy Michael Churchill wrote: > > > > So if I'm limited to petsc 3.7.6 for reasons of eventually using within an existing, larger codebase that depends on 3.7.6, is it possible to use TAO with a user-defined module in Fortran90 using 3.7.6? > > You should really pus this "existing, larger codebase" to transition to PETSc 3.8 sooner, rather than later. Especially for developments in Fortran it will make life better for everyone. We are always willing to help users, once they have read the changes information, with information to make transitioning to the latest PETSc version easy. For any parts of the code in C, transitioning from 3.7 to 3.8 should be really simple. > > Barry > > > > > I had tried the various forms of includes listed in the documentation, e.g. see below. I think I now realize this is an issue with the petsc installation on Edison, it does not seem to have the petsctao module in the library file (confirmed using nm -D on the library file). If I do the same include and use statement but with, for example, petscmat, it compiles fine. > > > > I built v3.8 from source, and the petsctao module is in the library file, and now the make works. > > > > commondat.F90 > > module commondat > > #include > > use petsc > > PetscReal :: alpha > > PetscInt :: n > > end module commondat > > > > program rosenbrock1f > > #include > > use petsctao > > use commondat > > > > > > > > > > On Fri, Oct 6, 2017 at 7:54 AM, Matthew Knepley wrote: > > On Fri, Oct 6, 2017 at 7:36 AM, Barry Smith wrote: > > > > Randy, > > > > First you absolutely must use version 3.8 or the master development copy. We improved and simplified dramatically how Fortran (90) is utilized from PETSc. > > > > Note that there is only one simple set of include files and modules for Fortran; see the newest documentation. > > > > > > http://www.mcs.anl.gov/petsc/petsc-master/docs/manualpages/Sys/UsingFortran.html > > > > Matt > > > > > > Barry > > > > > > > On Oct 5, 2017, at 11:48 PM, Randy Michael Churchill wrote: > > > > > > A simple setup question with TAO: if I were to convert the rosenbrock1f.F90 example to use a module instead of common structures, how would I setup the include statements? I've tried various combinations (using petscXXXdef.h, petscXXX.h, petscXXX.h90, along with use petscXXX), but seem to get errors with all. > > > > > > file:rosenbrock1f.h: > > > module commondat > > > PetscReal :: alpha > > > PetscInt :: n > > > end module commondat > > > > > > file:rosenbrock1f.90: > > > program rosenbrock1f > > > !!include statements??? which and where???!!! > > > use commondat > > > ... > > > > > > subroutine FormFunctionGradient(tao, X, f, G, dummy, ierr) > > > use commondat > > > implicit none > > > ... > > > > > > (https://www.mcs.anl.gov/petsc/petsc-dev/src/tao/unconstrained/examples/tutorials/rosenbrock1f.F90.html) > > > > > > > > > > -- > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > > > > > > > -- > > R. Michael Churchill > > > > > -- > R. 
Michael Churchill From evanum at gmail.com Tue Oct 10 14:23:17 2017 From: evanum at gmail.com (Evan Um) Date: Tue, 10 Oct 2017 12:23:17 -0700 Subject: [petsc-users] Using factored complex matrices from MUMPS as a preconditioner in PETSC In-Reply-To: References: Message-ID: Hi Hong, Thanks for your help. I am testing if I can use MUMPS block low rank factors as a preconditioner for QMR in MUMPS framework. I would like to ask one more question. STRUMPACK ( http://portal.nersc.gov/project/sparse/strumpack/) also supports low rank approximation. Can PETSC also allow users to use the approximate factors as a preconditioner in PETSC-QMR? Best, Evan On Tue, Oct 3, 2017 at 8:34 AM, Hong wrote: > Evan, > ICNTL(35) and CNTL(7) are added to petsc-mumps interface in branch > hzhang/update-mumps-5.1.1-cntl > > You may give it a try. Once it passes our regression tests, I'll merge it > to petsc master branch. > > Hong > > > On Sun, Sep 24, 2017 at 8:08 PM, Hong wrote: > >> I'll check it. >> Hong >> >> On Sun, Sep 24, 2017 at 3:42 PM, Evan Um wrote: >> >>> Hi Barry, >>> >>> Thanks for your comments. To activate block low rank (BLR) approximation >>> in MUMPS version 5.1.1, a user needs to turn on the functionality (i.e. >>> ICNTL(35)=1) and specify the tolerance value (e.g. CNTL(7)=1e-4). In PETSC, >>> I think that we can set up ICNTL and CNTL parameters for MUMPS. I was >>> wondering if we can still use BLR approximation for a preconditioner for >>> Krylov solvers. >>> >>> Best, >>> Evan >>> >>> >>> On Sat, Sep 23, 2017 at 6:45 PM, Barry Smith wrote: >>> >>>> >>>> > On Sep 23, 2017, at 8:38 PM, Evan Um wrote: >>>> > >>>> > Dear PETSC Users, >>>> > >>>> > My system matrix comes from finite element modeling and is complex >>>> and unstructured. Its typical size is a few millions-by a few millions. I >>>> wondering if I can use MUMPS parallel direct solver as a preconditioner in >>>> PETSC. For example, I want to pass factored matrices to Krylov iterative >>>> solvers such as QMR. Is there any PETSC+MUMPS example code for the purpose? >>>> >>>> You don't pass factored matrices you just pass the original matrix >>>> and use -pc_type lu -pc_factor_mat_solver_package mumps >>>> >>>> > Can PETSC call the latest MUMPS that supports block low rank >>>> approximation? >>>> >>>> No, send us info on it and we'll see if we can add an interface >>>> >>>> >>>> > >>>> > In advance, thank you very much for your comments. >>>> > >>>> > Best, >>>> > Evan >>>> > >>>> > >>>> > >>>> > >>>> > >>>> >>>> >>> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From hzhang at mcs.anl.gov Tue Oct 10 15:00:47 2017 From: hzhang at mcs.anl.gov (Hong) Date: Tue, 10 Oct 2017 15:00:47 -0500 Subject: [petsc-users] Using factored complex matrices from MUMPS as a preconditioner in PETSC In-Reply-To: References: Message-ID: Hmm, I have never used petsc-strumpack interface. I'll take a look and try to add this support. Hong On Tue, Oct 10, 2017 at 2:23 PM, Evan Um wrote: > Hi Hong, > > Thanks for your help. I am testing if I can use MUMPS block low rank > factors as a preconditioner for QMR in MUMPS framework. I would like to ask > one more question. STRUMPACK (http://portal.nersc.gov/ > project/sparse/strumpack/) also supports low rank approximation. Can > PETSC also allow users to use the approximate factors as a preconditioner > in PETSC-QMR? 
> > Best, > Evan > > > On Tue, Oct 3, 2017 at 8:34 AM, Hong wrote: > >> Evan, >> ICNTL(35) and CNTL(7) are added to petsc-mumps interface in branch >> hzhang/update-mumps-5.1.1-cntl >> >> You may give it a try. Once it passes our regression tests, I'll merge it >> to petsc master branch. >> >> Hong >> >> >> On Sun, Sep 24, 2017 at 8:08 PM, Hong wrote: >> >>> I'll check it. >>> Hong >>> >>> On Sun, Sep 24, 2017 at 3:42 PM, Evan Um wrote: >>> >>>> Hi Barry, >>>> >>>> Thanks for your comments. To activate block low rank (BLR) >>>> approximation in MUMPS version 5.1.1, a user needs to turn on the >>>> functionality (i.e. ICNTL(35)=1) and specify the tolerance value (e.g. >>>> CNTL(7)=1e-4). In PETSC, I think that we can set up ICNTL and CNTL >>>> parameters for MUMPS. I was wondering if we can still use BLR approximation >>>> for a preconditioner for Krylov solvers. >>>> >>>> Best, >>>> Evan >>>> >>>> >>>> On Sat, Sep 23, 2017 at 6:45 PM, Barry Smith >>>> wrote: >>>> >>>>> >>>>> > On Sep 23, 2017, at 8:38 PM, Evan Um wrote: >>>>> > >>>>> > Dear PETSC Users, >>>>> > >>>>> > My system matrix comes from finite element modeling and is complex >>>>> and unstructured. Its typical size is a few millions-by a few millions. I >>>>> wondering if I can use MUMPS parallel direct solver as a preconditioner in >>>>> PETSC. For example, I want to pass factored matrices to Krylov iterative >>>>> solvers such as QMR. Is there any PETSC+MUMPS example code for the purpose? >>>>> >>>>> You don't pass factored matrices you just pass the original matrix >>>>> and use -pc_type lu -pc_factor_mat_solver_package mumps >>>>> >>>>> > Can PETSC call the latest MUMPS that supports block low rank >>>>> approximation? >>>>> >>>>> No, send us info on it and we'll see if we can add an interface >>>>> >>>>> >>>>> > >>>>> > In advance, thank you very much for your comments. >>>>> > >>>>> > Best, >>>>> > Evan >>>>> > >>>>> > >>>>> > >>>>> > >>>>> > >>>>> >>>>> >>>> >>> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Oct 10 15:08:01 2017 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 10 Oct 2017 16:08:01 -0400 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: References: <98D45588-381F-44FB-9D20-65D1638E357F@mcs.anl.gov> <876F0ACF-62C0-4DC9-B203-2893B25E3938@mcs.anl.gov> <1D94CC99-2395-4BBE-A0F3-80D23D85FA45@mcs.anl.gov> <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> Message-ID: On Tue, Oct 10, 2017 at 12:08 PM, zakaryah . wrote: > Thanks for clearing that up. > > I'd appreciate any further help. Here's a summary: > > My ultimate goal is to find a vector field which minimizes an action. The > action is a (nonlinear) function of the field and its first spatial > derivatives. > > My current approach is to derive the (continuous) Euler-Lagrange > equations, which results in a nonlinear PDE that the minimizing field must > satisfy. These Euler-Lagrange equations are then discretized, and I'm > trying to use an SNES to solve them. > > The problem is that the solver seems to reach a point at which the > Jacobian (this corresponds to the second variation of the action, which is > like a Hessian of the energy) becomes nearly singular, but where the > residual (RHS of PDE) is not close to zero. 
The residual does not decrease > over additional SNES iterations, and the line search results in tiny step > sizes. My interpretation is that this point of stagnation is a critical > point. > The normal thing to do here (I think) is to engage solvers which do not depend on that particular point. So using NRichardson, or maybe NGMRES, to get past that. I would be interested to see if this is successful. Matt > I have checked the hand-coded Jacobian very carefully and I am confident > that it is correct. > > I am guessing that such a situation is well-known in the field, but I > don't know the lingo or literature. If anyone has suggestions I'd be > thrilled. Are there documentation/methodologies within PETSc for this type > of situation? > > Is there any advantage to discretizing the action itself and using the > optimization routines? With minor modifications I'll have the gradient and > Hessian calculations coded. Are the optimization routines likely to > stagnate in the same way as the nonlinear solver, or can they take > advantage of the structure of the problem to overcome this? > > Thanks a lot in advance for any help. > > On Sun, Oct 8, 2017 at 5:57 AM, Barry Smith wrote: > >> >> There is apparently confusing in understanding the ordering. Is this >> all on one process that you get funny results? Are you using >> MatSetValuesStencil() to provide the matrix (it is generally easier than >> providing it yourself). In parallel MatView() always maps the rows and >> columns to the natural ordering before printing, if you use a matrix >> created from the DMDA. If you create the matrix yourself it has a different >> MatView in parallel that is in in thePETSc ordering.\ >> >> >> Barry >> >> >> >> > On Oct 8, 2017, at 8:05 AM, zakaryah . wrote: >> > >> > I'm more confused than ever. I don't understand the output of >> -snes_type test -snes_test_display. >> > >> > For the user-defined state of the vector (where I'd like to test the >> Jacobian), the finite difference Jacobian at row 0 evaluates as: >> > >> > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, >> 287.797) (5, 744.695) (9, -1454.66) (10, 6.08793) (11, 148.172) (12, >> 13.1089) (13, -36.5783) (14, -9.99399) (27, -3423.49) (28, -2175.34) >> (29, 548.662) (30, 145.753) (31, 17.6603) (32, -15.1079) (36, 76.8575) >> (37, 16.325) (38, 4.83918) >> > >> > But the hand-coded Jacobian at row 0 evaluates as: >> > >> > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, >> 287.797) (5, 744.695) (33, -1454.66) (34, 6.08792) (35, 148.172) (36, >> 13.1089) (37, -36.5783) (38, -9.99399) (231, -3423.49) (232, -2175.34) >> (233, 548.662) (234, 145.753) (235, 17.6603) (236, -15.1079) (264, >> 76.8575) (265, 16.325) (266, 4.83917) (267, 0.) (268, 0.) (269, 0.) >> > and the difference between the Jacobians at row 0 evaluates as: >> > >> > row 0: (0, 0.000189908) (1, 7.17315e-05) (2, 9.31778e-05) (3, >> 0.000514947) (4, 0.000178659) (5, 0.000178217) (9, -2.25457e-05) (10, >> -6.34278e-06) (11, -5.93241e-07) (12, 9.48544e-06) (13, 4.79709e-06) >> (14, 2.40016e-06) (27, -0.000335696) (28, -0.000106734) (29, >> -0.000106653) (30, 2.73119e-06) (31, -7.93382e-07) (32, 1.24048e-07) >> (36, -4.0302e-06) (37, 3.67494e-06) (38, -2.70115e-06) (39, 0.) (40, >> 0.) (41, 0.) >> > >> > The difference between the column numbering between the finite >> difference and the hand-coded Jacobians looks like a serious problem to me, >> but I'm probably missing something. 
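
As a concrete illustration of Barry's MatSetValuesStencil() suggestion above: with a stencil-based insert, rows and columns are named by global grid coordinates (i,j,k) plus a component index c, so the natural-versus-PETSc ordering never has to be computed by hand. A minimal sketch for a 3-dof DMDA follows; the neighbor offset and the numerical values are placeholders, not the actual Jacobian entries.

#include <petscdmda.h>

/* Minimal sketch: set two entries of the Jacobian row for component 0 at
   global grid point (i,j,k) of a 3-dof DMDA. J must come from
   DMCreateMatrix(da,&J) so the stencil-to-index mapping is attached. */
PetscErrorCode SetRowEntries(Mat J, PetscInt i, PetscInt j, PetscInt k)
{
  MatStencil     row, col[2];
  PetscScalar    v[2];
  PetscErrorCode ierr;

  PetscFunctionBegin;
  row.i = i; row.j = j; row.k = k; row.c = 0;                /* row: point (i,j,k), dof 0 */
  col[0] = row;                                              /* diagonal coupling */
  col[1].i = i+1; col[1].j = j; col[1].k = k; col[1].c = 2;  /* x-neighbor, dof 2 (placeholder) */
  v[0] = 1.0; v[1] = -0.25;                                  /* placeholder values */
  ierr = MatSetValuesStencil(J,1,&row,2,col,v,INSERT_VALUES);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

The usual MatAssemblyBegin/MatAssemblyEnd pair with MAT_FINAL_ASSEMBLY then finishes the matrix; and, as Barry notes above, MatView() on a DMDA-created matrix prints in the natural ordering, which is what the finite-difference output shows.
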
>> > >> > I am trying to use a 3D DMDA with 3 dof, a box stencil of width 1, and >> for this test problem the grid dimensions are 11x7x6. For a grid point >> x,y,z, and dof c, is the index calculated as c + 3*x + 3*11*y + 3*11*7*z? >> If so, then the column numbers of the hand-coded Jacobian match those of >> the 27 point stencil I have in mind. However, I am then at a loss to >> explain the column numbers in the finite difference Jacobian. >> > >> > >> > >> > >> > On Sat, Oct 7, 2017 at 1:49 PM, zakaryah . wrote: >> > OK - I ran with -snes_monitor -snes_converged_reason >> -snes_linesearch_monitor -ksp_monitor -ksp_monitor_true_residual >> -ksp_converged_reason -pc_type svd -pc_svd_monitor -snes_type newtonls >> -snes_compare_explicit >> > >> > and here is the full error message, output immediately after >> > >> > Finite difference Jacobian >> > Mat Object: 24 MPI processes >> > type: mpiaij >> > >> > [0]PETSC ERROR: --------------------- Error Message >> -------------------------------------------------------------- >> > >> > [0]PETSC ERROR: Invalid argument >> > >> > [0]PETSC ERROR: Matrix not generated from a DMDA >> > >> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html >> for trouble shooting. >> > >> > [0]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017 >> > >> > [0]PETSC ERROR: ./CalculateOpticalFlow on a arch-linux2-c-opt named >> node046.hpc.rockefeller.internal by zfrentz Sat Oct 7 13:44:44 2017 >> > >> > [0]PETSC ERROR: Configure options --prefix=/ru-auth/local/home/zfrentz/PETSc-3.7.6 >> --download-fblaslapack -with-debugging=0 >> > >> > [0]PETSC ERROR: #1 MatView_MPI_DA() line 551 in >> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/dm/impls/da/fdda.c >> > >> > [0]PETSC ERROR: #2 MatView() line 901 in /rugpfs/fs0/home/zfrentz/PETSc >> /build/petsc-3.7.6/src/mat/interface/matrix.c >> > >> > [0]PETSC ERROR: #3 SNESComputeJacobian() line 2371 in >> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/ >> interface/snes.c >> > >> > [0]PETSC ERROR: #4 SNESSolve_NEWTONLS() line 228 in >> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/impls/ls/ls.c >> > >> > [0]PETSC ERROR: #5 SNESSolve() line 4005 in >> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/ >> interface/snes.c >> > >> > [0]PETSC ERROR: #6 solveWarp3D() line 659 in >> /ru-auth/local/home/zfrentz/Code/OpticalFlow/working/October6_2017/mshs.c >> > >> > >> > On Tue, Oct 3, 2017 at 5:37 PM, Jed Brown wrote: >> > Always always always send the whole error message. >> > >> > "zakaryah ." writes: >> > >> > > I tried -snes_compare_explicit, and got the following error: >> > > >> > > [0]PETSC ERROR: Invalid argument >> > > >> > > [0]PETSC ERROR: Matrix not generated from a DMDA >> > > >> > > What am I doing wrong? >> > > >> > > On Tue, Oct 3, 2017 at 10:08 AM, Jed Brown wrote: >> > > >> > >> Barry Smith writes: >> > >> >> > >> >> On Oct 3, 2017, at 5:54 AM, zakaryah . >> wrote: >> > >> >> >> > >> >> I'm still working on this. I've made some progress, and it looks >> like >> > >> the issue is with the KSP, at least for now. The Jacobian may be >> > >> ill-conditioned. Is it possible to use -snes_test_display during an >> > >> intermediate step of the analysis? I would like to inspect the >> Jacobian >> > >> after several solves have already completed, >> > >> > >> > >> > No, our currently code for testing Jacobians is poor quality and >> > >> poorly organized. Needs a major refactoring to do things properly. 
>> Sorry
>>
>> You can use -snes_compare_explicit or -snes_compare_coloring to output
>> differences on each Newton step.

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

From evanum at gmail.com  Tue Oct 10 21:30:56 2017
From: evanum at gmail.com (Evan Um)
Date: Tue, 10 Oct 2017 19:30:56 -0700
Subject: [petsc-users] Using factored complex matrices from MUMPS as a preconditioner in PETSC
In-Reply-To: References:
Message-ID:

Dear Hong,

I just tried to check the PETSC developer website but couldn't find the updated dev version with the name hzang/update-mumps-5.1.1-cntl. Could you please let me know the location of the updated dev version? Where do I need to visit to check out the dev version with the new control switches? Thank you very much for your help.

Best,
Evan

On Tue, Oct 3, 2017 at 8:34 AM, Hong wrote:
> Evan,
> ICNTL(35) and CNTL(7) are added to petsc-mumps interface in branch
> hzhang/update-mumps-5.1.1-cntl
>
> You may give it a try. Once it passes our regression tests, I'll merge it
> to petsc master branch.
>
> Hong
>
> On Sun, Sep 24, 2017 at 8:08 PM, Hong wrote:
>> I'll check it.
>> Hong
>>
>> On Sun, Sep 24, 2017 at 3:42 PM, Evan Um wrote:
>>> Hi Barry,
>>>
>>> Thanks for your comments. To activate block low rank (BLR) approximation
>>> in MUMPS version 5.1.1, a user needs to turn on the functionality (i.e.
>>> ICNTL(35)=1) and specify the tolerance value (e.g. CNTL(7)=1e-4). In PETSC,
>>> I think that we can set up ICNTL and CNTL parameters for MUMPS. I was
>>> wondering if we can still use BLR approximation for a preconditioner for
>>> Krylov solvers.
>>>
>>> Best,
>>> Evan
>>>
>>> On Sat, Sep 23, 2017 at 6:45 PM, Barry Smith wrote:
>>>> > On Sep 23, 2017, at 8:38 PM, Evan Um wrote:
>>>> >
>>>> > Dear PETSC Users,
>>>> >
>>>> > My system matrix comes from finite element modeling and is complex
>>>> and unstructured. Its typical size is a few millions-by a few millions. I
>>>> wondering if I can use MUMPS parallel direct solver as a preconditioner in
>>>> PETSC. For example, I want to pass factored matrices to Krylov iterative
>>>> solvers such as QMR. Is there any PETSC+MUMPS example code for the purpose?
>>>>
>>>> You don't pass factored matrices you just pass the original matrix
>>>> and use -pc_type lu -pc_factor_mat_solver_package mumps
>>>>
>>>> > Can PETSC call the latest MUMPS that supports block low rank
>>>> approximation?
>>>>
>>>> No, send us info on it and we'll see if we can add an interface
>>>>
>>>> > In advance, thank you very much for your comments.
>>>> >
>>>> > Best,
>>>> > Evan

From l.verzeroli at studenti.unibg.it  Wed Oct 11 00:17:32 2017
From: l.verzeroli at studenti.unibg.it (Luca Verzeroli)
Date: Wed, 11 Oct 2017 07:17:32 +0200
Subject: [petsc-users] Call MatsetValues in a openMP loop
Message-ID: <59dda96d.84881c0a.fb681.321c@mx.google.com>

Good morning,
Is it possible to put MatSetValues or VecSetValues in an OpenMP loop? Now I'm creating values in a loop and maybe it could be sped up with a multithreaded implementation. Are these routines thread safe or not?
Luca -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Oct 11 03:54:31 2017 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 11 Oct 2017 04:54:31 -0400 Subject: [petsc-users] Using factored complex matrices from MUMPS as a preconditioner in PETSC In-Reply-To: References: Message-ID: On Tue, Oct 10, 2017 at 10:30 PM, Evan Um wrote: > Dear Hong, > > I just tried to check PETSC develeoper website but couldn't find the > updated dev version with name of hzang/update-mumps-5.1.1-cntl. > Hong means a branch. You get the dev repository and checkout that branch git checkout hzang/update-mumps-5.1.1-cntl Thanks, Matt > . Could you please let me know the location of the updated dev version? > Where do I need to visit to check out the dev version with the new control > switches? Thank you very much for your help. > > Best, > Evan > > > > > > > On Tue, Oct 3, 2017 at 8:34 AM, Hong wrote: > >> Evan, >> ICNTL(35) and CNTL(7) are added to petsc-mumps interface in branch >> hzhang/update-mumps-5.1.1-cntl >> >> You may give it a try. Once it passes our regression tests, I'll merge it >> to petsc master branch. >> >> Hong >> >> >> On Sun, Sep 24, 2017 at 8:08 PM, Hong wrote: >> >>> I'll check it. >>> Hong >>> >>> On Sun, Sep 24, 2017 at 3:42 PM, Evan Um wrote: >>> >>>> Hi Barry, >>>> >>>> Thanks for your comments. To activate block low rank (BLR) >>>> approximation in MUMPS version 5.1.1, a user needs to turn on the >>>> functionality (i.e. ICNTL(35)=1) and specify the tolerance value (e.g. >>>> CNTL(7)=1e-4). In PETSC, I think that we can set up ICNTL and CNTL >>>> parameters for MUMPS. I was wondering if we can still use BLR approximation >>>> for a preconditioner for Krylov solvers. >>>> >>>> Best, >>>> Evan >>>> >>>> >>>> On Sat, Sep 23, 2017 at 6:45 PM, Barry Smith >>>> wrote: >>>> >>>>> >>>>> > On Sep 23, 2017, at 8:38 PM, Evan Um wrote: >>>>> > >>>>> > Dear PETSC Users, >>>>> > >>>>> > My system matrix comes from finite element modeling and is complex >>>>> and unstructured. Its typical size is a few millions-by a few millions. I >>>>> wondering if I can use MUMPS parallel direct solver as a preconditioner in >>>>> PETSC. For example, I want to pass factored matrices to Krylov iterative >>>>> solvers such as QMR. Is there any PETSC+MUMPS example code for the purpose? >>>>> >>>>> You don't pass factored matrices you just pass the original matrix >>>>> and use -pc_type lu -pc_factor_mat_solver_package mumps >>>>> >>>>> > Can PETSC call the latest MUMPS that supports block low rank >>>>> approximation? >>>>> >>>>> No, send us info on it and we'll see if we can add an interface >>>>> >>>>> >>>>> > >>>>> > In advance, thank you very much for your comments. >>>>> > >>>>> > Best, >>>>> > Evan >>>>> > >>>>> > >>>>> > >>>>> > >>>>> > >>>>> >>>>> >>>> >>> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From bsmith at mcs.anl.gov Wed Oct 11 04:52:54 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 11 Oct 2017 11:52:54 +0200 Subject: [petsc-users] Call MatsetValues in a openMP loop In-Reply-To: <59dda96d.84881c0a.fb681.321c@mx.google.com> References: <59dda96d.84881c0a.fb681.321c@mx.google.com> Message-ID: <43BB3270-0727-4AC4-B682-846CA3EFD05F@mcs.anl.gov> > On Oct 11, 2017, at 7:17 AM, Luca Verzeroli wrote: > > Goodmornig, > Is it possible to put MatSetValues or VetSetValues in a openMP loop? Now I'm creating values in a loop and maybe it could be speed up with a multithreads implementation. Are these routines thread safe or not? Absolutely not, putting locks around each set values call would be grossly expensive. We don't recommend using OpenMP for HPC computing (that is mixed with MPI) we have yet to see any real evidence that MPI + OpenMP performs better than plan MPI. We are aware that many people "say" you need to have MPI + OpenMP but when you look for real publications that demonstrate more than a trivial improvement of MPI + OpenMP over MPI you will find they are seriously lacking. Barry > > Luca From evanum at gmail.com Wed Oct 11 09:32:57 2017 From: evanum at gmail.com (Evan Um) Date: Wed, 11 Oct 2017 07:32:57 -0700 Subject: [petsc-users] Using factored complex matrices from MUMPS as a preconditioner in PETSC In-Reply-To: References: Message-ID: Hi Matt, Still unclear to me. I go to https://bitbucket.org/petsc/petsc/addon/pipelines/home#!/ and then https://bitbucket.org/petsc/petsc/branches/. I don't see any tarred file or directory named "update-mumps-5.1.1-cntl". Could you explain how to download the modified version from the site a little bit in detail? Thank you very much for your help. Evan On Wed, Oct 11, 2017 at 1:54 AM, Matthew Knepley wrote: > On Tue, Oct 10, 2017 at 10:30 PM, Evan Um wrote: > >> Dear Hong, >> >> I just tried to check PETSC develeoper website but couldn't find the >> updated dev version with name of hzang/update-mumps-5.1.1-cntl. >> > > Hong means a branch. You get the dev repository and checkout that branch > > git checkout hzang/update-mumps-5.1.1-cntl > > Thanks, > > Matt > > >> . Could you please let me know the location of the updated dev version? >> Where do I need to visit to check out the dev version with the new control >> switches? Thank you very much for your help. >> >> Best, >> Evan >> >> >> >> >> >> >> On Tue, Oct 3, 2017 at 8:34 AM, Hong wrote: >> >>> Evan, >>> ICNTL(35) and CNTL(7) are added to petsc-mumps interface in branch >>> hzhang/update-mumps-5.1.1-cntl >>> >>> You may give it a try. Once it passes our regression tests, I'll merge >>> it to petsc master branch. >>> >>> Hong >>> >>> >>> On Sun, Sep 24, 2017 at 8:08 PM, Hong wrote: >>> >>>> I'll check it. >>>> Hong >>>> >>>> On Sun, Sep 24, 2017 at 3:42 PM, Evan Um wrote: >>>> >>>>> Hi Barry, >>>>> >>>>> Thanks for your comments. To activate block low rank (BLR) >>>>> approximation in MUMPS version 5.1.1, a user needs to turn on the >>>>> functionality (i.e. ICNTL(35)=1) and specify the tolerance value (e.g. >>>>> CNTL(7)=1e-4). In PETSC, I think that we can set up ICNTL and CNTL >>>>> parameters for MUMPS. I was wondering if we can still use BLR approximation >>>>> for a preconditioner for Krylov solvers. 
>>>>> >>>>> Best, >>>>> Evan >>>>> >>>>> >>>>> On Sat, Sep 23, 2017 at 6:45 PM, Barry Smith >>>>> wrote: >>>>> >>>>>> >>>>>> > On Sep 23, 2017, at 8:38 PM, Evan Um wrote: >>>>>> > >>>>>> > Dear PETSC Users, >>>>>> > >>>>>> > My system matrix comes from finite element modeling and is complex >>>>>> and unstructured. Its typical size is a few millions-by a few millions. I >>>>>> wondering if I can use MUMPS parallel direct solver as a preconditioner in >>>>>> PETSC. For example, I want to pass factored matrices to Krylov iterative >>>>>> solvers such as QMR. Is there any PETSC+MUMPS example code for the purpose? >>>>>> >>>>>> You don't pass factored matrices you just pass the original matrix >>>>>> and use -pc_type lu -pc_factor_mat_solver_package mumps >>>>>> >>>>>> > Can PETSC call the latest MUMPS that supports block low rank >>>>>> approximation? >>>>>> >>>>>> No, send us info on it and we'll see if we can add an interface >>>>>> >>>>>> >>>>>> > >>>>>> > In advance, thank you very much for your comments. >>>>>> > >>>>>> > Best, >>>>>> > Evan >>>>>> > >>>>>> > >>>>>> > >>>>>> > >>>>>> > >>>>>> >>>>>> >>>>> >>>> >>> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > -------------- next part -------------- An HTML attachment was scrubbed... URL: From hzhang at mcs.anl.gov Wed Oct 11 09:56:47 2017 From: hzhang at mcs.anl.gov (Hong) Date: Wed, 11 Oct 2017 09:56:47 -0500 Subject: [petsc-users] Using factored complex matrices from MUMPS as a preconditioner in PETSC In-Reply-To: References: Message-ID: Evan: The branch is already merged to master. Just use petsc master branch. Hong Hi Matt, > > Still unclear to me. I go to https://bitbucket.org/ > petsc/petsc/addon/pipelines/home#!/ and then https://bitbucket.org/ > petsc/petsc/branches/. I don't see any tarred file or directory named "update-mumps-5.1.1-cntl". > Could you explain how to download the modified version from the site a > little bit in detail? Thank you very much for your help. > > Evan > > > > > On Wed, Oct 11, 2017 at 1:54 AM, Matthew Knepley > wrote: > >> On Tue, Oct 10, 2017 at 10:30 PM, Evan Um wrote: >> >>> Dear Hong, >>> >>> I just tried to check PETSC develeoper website but couldn't find the >>> updated dev version with name of hzang/update-mumps-5.1.1-cntl. >>> >> >> Hong means a branch. You get the dev repository and checkout that branch >> >> git checkout hzang/update-mumps-5.1.1-cntl >> >> Thanks, >> >> Matt >> >> >>> . Could you please let me know the location of the updated dev version? >>> Where do I need to visit to check out the dev version with the new control >>> switches? Thank you very much for your help. >>> >>> Best, >>> Evan >>> >>> >>> >>> >>> >>> >>> On Tue, Oct 3, 2017 at 8:34 AM, Hong wrote: >>> >>>> Evan, >>>> ICNTL(35) and CNTL(7) are added to petsc-mumps interface in branch >>>> hzhang/update-mumps-5.1.1-cntl >>>> >>>> You may give it a try. Once it passes our regression tests, I'll merge >>>> it to petsc master branch. >>>> >>>> Hong >>>> >>>> >>>> On Sun, Sep 24, 2017 at 8:08 PM, Hong wrote: >>>> >>>>> I'll check it. >>>>> Hong >>>>> >>>>> On Sun, Sep 24, 2017 at 3:42 PM, Evan Um wrote: >>>>> >>>>>> Hi Barry, >>>>>> >>>>>> Thanks for your comments. To activate block low rank (BLR) >>>>>> approximation in MUMPS version 5.1.1, a user needs to turn on the >>>>>> functionality (i.e. 
ICNTL(35)=1) and specify the tolerance value (e.g.
>>>>>> CNTL(7)=1e-4). In PETSC, I think that we can set up ICNTL and CNTL
>>>>>> parameters for MUMPS. I was wondering if we can still use BLR approximation
>>>>>> for a preconditioner for Krylov solvers.
>>>>>>
>>>>>> Best,
>>>>>> Evan
>>>>>>
>>>>>> On Sat, Sep 23, 2017 at 6:45 PM, Barry Smith wrote:
>>>>>>> > On Sep 23, 2017, at 8:38 PM, Evan Um wrote:
>>>>>>> >
>>>>>>> > Dear PETSC Users,
>>>>>>> >
>>>>>>> > My system matrix comes from finite element modeling and is complex
>>>>>>> and unstructured. Its typical size is a few millions-by a few millions. I
>>>>>>> wondering if I can use MUMPS parallel direct solver as a preconditioner in
>>>>>>> PETSC. For example, I want to pass factored matrices to Krylov iterative
>>>>>>> solvers such as QMR. Is there any PETSC+MUMPS example code for the purpose?
>>>>>>>
>>>>>>> You don't pass factored matrices you just pass the original matrix
>>>>>>> and use -pc_type lu -pc_factor_mat_solver_package mumps
>>>>>>>
>>>>>>> > Can PETSC call the latest MUMPS that supports block low rank
>>>>>>> approximation?
>>>>>>>
>>>>>>> No, send us info on it and we'll see if we can add an interface
>>>>>>>
>>>>>>> > In advance, thank you very much for your comments.
>>>>>>> >
>>>>>>> > Best,
>>>>>>> > Evan

From zakaryah at gmail.com  Wed Oct 11 10:33:48 2017
From: zakaryah at gmail.com (zakaryah .)
Date: Wed, 11 Oct 2017 11:33:48 -0400
Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods
In-Reply-To: References: <98D45588-381F-44FB-9D20-65D1638E357F@mcs.anl.gov> <876F0ACF-62C0-4DC9-B203-2893B25E3938@mcs.anl.gov> <1D94CC99-2395-4BBE-A0F3-80D23D85FA45@mcs.anl.gov> <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov>
Message-ID:

Many thanks for the suggestions, Matt.

I tried putting the solvers in a loop, like this:

do {
  NewtonLS
  check convergence
  if (converged) break
  NRichardson or NGMRES
} while (!converged)

The results were interesting, to me at least. With NRichardson, there was indeed improvement in the residual norm, followed by improvement with NewtonLS, and so on for a few iterations of this loop. In each case, after a few iterations the NewtonLS appeared to be stuck in the same way as after the first iteration. Eventually neither method was able to reduce the residual norm, which was still significant, so this was not a total success. With NGMRES, the initial behavior was similar, but eventually the NGMRES progress became erratic. The minimal residual norm was a bit better using NGMRES than NRichardson, but neither combination of methods fully converged. For both NRichardson and NGMRES, I simply used the defaults, as I have no knowledge of how to tune the options for my problem.

On Tue, Oct 10, 2017 at 4:08 PM, Matthew Knepley wrote:
> On Tue, Oct 10, 2017 at 12:08 PM, zakaryah . wrote:
>> Thanks for clearing that up.
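
For reference, the alternation tried in the loop above is roughly what SNES nonlinear preconditioning automates: an outer NGMRES (or NRichardson) solve with a Newton line search as the inner solver, selectable at run time with -snes_type ngmres -npc_snes_type newtonls. A minimal sketch of the programmatic form, under the assumption that the usual FormFunction/FormJacobian callbacks are then attached to the outer SNES (the helper name is illustrative):

#include <petscsnes.h>

/* Minimal sketch: create an NGMRES solver with NEWTONLS as its inner
   nonlinear preconditioner, the built-in analogue of alternating the
   two solvers by hand. */
PetscErrorCode CreateComposedSNES(MPI_Comm comm, SNES *outer)
{
  SNES           inner;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = SNESCreate(comm,outer);CHKERRQ(ierr);
  ierr = SNESSetType(*outer,SNESNGMRES);CHKERRQ(ierr);
  ierr = SNESGetNPC(*outer,&inner);CHKERRQ(ierr);      /* the inner (preconditioning) solver */
  ierr = SNESSetType(inner,SNESNEWTONLS);CHKERRQ(ierr);
  ierr = SNESSetFromOptions(*outer);CHKERRQ(ierr);     /* -npc_snes_* options reach the inner solver */
  PetscFunctionReturn(0);
}
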
>> >> I'd appreciate any further help. Here's a summary: >> >> My ultimate goal is to find a vector field which minimizes an action. >> The action is a (nonlinear) function of the field and its first spatial >> derivatives. >> >> My current approach is to derive the (continuous) Euler-Lagrange >> equations, which results in a nonlinear PDE that the minimizing field must >> satisfy. These Euler-Lagrange equations are then discretized, and I'm >> trying to use an SNES to solve them. >> >> The problem is that the solver seems to reach a point at which the >> Jacobian (this corresponds to the second variation of the action, which is >> like a Hessian of the energy) becomes nearly singular, but where the >> residual (RHS of PDE) is not close to zero. The residual does not decrease >> over additional SNES iterations, and the line search results in tiny step >> sizes. My interpretation is that this point of stagnation is a critical >> point. >> > > The normal thing to do here (I think) is to engage solvers which do not > depend on that particular point. So using > NRichardson, or maybe NGMRES, to get past that. I would be interested to > see if this is successful. > > Matt > > >> I have checked the hand-coded Jacobian very carefully and I am confident >> that it is correct. >> >> I am guessing that such a situation is well-known in the field, but I >> don't know the lingo or literature. If anyone has suggestions I'd be >> thrilled. Are there documentation/methodologies within PETSc for this type >> of situation? >> >> Is there any advantage to discretizing the action itself and using the >> optimization routines? With minor modifications I'll have the gradient and >> Hessian calculations coded. Are the optimization routines likely to >> stagnate in the same way as the nonlinear solver, or can they take >> advantage of the structure of the problem to overcome this? >> >> Thanks a lot in advance for any help. >> >> On Sun, Oct 8, 2017 at 5:57 AM, Barry Smith wrote: >> >>> >>> There is apparently confusing in understanding the ordering. Is this >>> all on one process that you get funny results? Are you using >>> MatSetValuesStencil() to provide the matrix (it is generally easier than >>> providing it yourself). In parallel MatView() always maps the rows and >>> columns to the natural ordering before printing, if you use a matrix >>> created from the DMDA. If you create the matrix yourself it has a different >>> MatView in parallel that is in in thePETSc ordering.\ >>> >>> >>> Barry >>> >>> >>> >>> > On Oct 8, 2017, at 8:05 AM, zakaryah . wrote: >>> > >>> > I'm more confused than ever. I don't understand the output of >>> -snes_type test -snes_test_display. 
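
Collected in one place, the Jacobian-checking options mentioned at various points in this thread (PETSc 3.7-era option names; ./myapp stands in for the application):

$ ./myapp -snes_type test -snes_test_display    # finite-difference check of the hand-coded Jacobian
$ ./myapp -snes_compare_explicit                # compare against an explicit Jacobian on each Newton step
$ ./myapp -snes_compare_coloring                # compare against a coloring-based finite-difference Jacobian

As seen earlier in the thread, -snes_compare_explicit raised "Matrix not generated from a DMDA" in this configuration, so the matrix must come from the DMDA for that option to work.
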
>>> > >>> > For the user-defined state of the vector (where I'd like to test the >>> Jacobian), the finite difference Jacobian at row 0 evaluates as: >>> > >>> > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, >>> 287.797) (5, 744.695) (9, -1454.66) (10, 6.08793) (11, 148.172) (12, >>> 13.1089) (13, -36.5783) (14, -9.99399) (27, -3423.49) (28, -2175.34) >>> (29, 548.662) (30, 145.753) (31, 17.6603) (32, -15.1079) (36, 76.8575) >>> (37, 16.325) (38, 4.83918) >>> > >>> > But the hand-coded Jacobian at row 0 evaluates as: >>> > >>> > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, >>> 287.797) (5, 744.695) (33, -1454.66) (34, 6.08792) (35, 148.172) (36, >>> 13.1089) (37, -36.5783) (38, -9.99399) (231, -3423.49) (232, -2175.34) >>> (233, 548.662) (234, 145.753) (235, 17.6603) (236, -15.1079) (264, >>> 76.8575) (265, 16.325) (266, 4.83917) (267, 0.) (268, 0.) (269, 0.) >>> > and the difference between the Jacobians at row 0 evaluates as: >>> > >>> > row 0: (0, 0.000189908) (1, 7.17315e-05) (2, 9.31778e-05) (3, >>> 0.000514947) (4, 0.000178659) (5, 0.000178217) (9, -2.25457e-05) (10, >>> -6.34278e-06) (11, -5.93241e-07) (12, 9.48544e-06) (13, 4.79709e-06) >>> (14, 2.40016e-06) (27, -0.000335696) (28, -0.000106734) (29, >>> -0.000106653) (30, 2.73119e-06) (31, -7.93382e-07) (32, 1.24048e-07) >>> (36, -4.0302e-06) (37, 3.67494e-06) (38, -2.70115e-06) (39, 0.) (40, >>> 0.) (41, 0.) >>> > >>> > The difference between the column numbering between the finite >>> difference and the hand-coded Jacobians looks like a serious problem to me, >>> but I'm probably missing something. >>> > >>> > I am trying to use a 3D DMDA with 3 dof, a box stencil of width 1, and >>> for this test problem the grid dimensions are 11x7x6. For a grid point >>> x,y,z, and dof c, is the index calculated as c + 3*x + 3*11*y + 3*11*7*z? >>> If so, then the column numbers of the hand-coded Jacobian match those of >>> the 27 point stencil I have in mind. However, I am then at a loss to >>> explain the column numbers in the finite difference Jacobian. >>> > >>> > >>> > >>> > >>> > On Sat, Oct 7, 2017 at 1:49 PM, zakaryah . wrote: >>> > OK - I ran with -snes_monitor -snes_converged_reason >>> -snes_linesearch_monitor -ksp_monitor -ksp_monitor_true_residual >>> -ksp_converged_reason -pc_type svd -pc_svd_monitor -snes_type newtonls >>> -snes_compare_explicit >>> > >>> > and here is the full error message, output immediately after >>> > >>> > Finite difference Jacobian >>> > Mat Object: 24 MPI processes >>> > type: mpiaij >>> > >>> > [0]PETSC ERROR: --------------------- Error Message >>> -------------------------------------------------------------- >>> > >>> > [0]PETSC ERROR: Invalid argument >>> > >>> > [0]PETSC ERROR: Matrix not generated from a DMDA >>> > >>> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/d >>> ocumentation/faq.html for trouble shooting. 
>>> > >>> > [0]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017 >>> > >>> > [0]PETSC ERROR: ./CalculateOpticalFlow on a arch-linux2-c-opt named >>> node046.hpc.rockefeller.internal by zfrentz Sat Oct 7 13:44:44 2017 >>> > >>> > [0]PETSC ERROR: Configure options --prefix=/ru-auth/local/home/zfrentz/PETSc-3.7.6 >>> --download-fblaslapack -with-debugging=0 >>> > >>> > [0]PETSC ERROR: #1 MatView_MPI_DA() line 551 in >>> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/dm/impls/da/fdda.c >>> > >>> > [0]PETSC ERROR: #2 MatView() line 901 in /rugpfs/fs0/home/zfrentz/PETSc >>> /build/petsc-3.7.6/src/mat/interface/matrix.c >>> > >>> > [0]PETSC ERROR: #3 SNESComputeJacobian() line 2371 in >>> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/in >>> terface/snes.c >>> > >>> > [0]PETSC ERROR: #4 SNESSolve_NEWTONLS() line 228 in >>> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/impls/ls/ls.c >>> > >>> > [0]PETSC ERROR: #5 SNESSolve() line 4005 in >>> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/in >>> terface/snes.c >>> > >>> > [0]PETSC ERROR: #6 solveWarp3D() line 659 in >>> /ru-auth/local/home/zfrentz/Code/OpticalFlow/working/October >>> 6_2017/mshs.c >>> > >>> > >>> > On Tue, Oct 3, 2017 at 5:37 PM, Jed Brown wrote: >>> > Always always always send the whole error message. >>> > >>> > "zakaryah ." writes: >>> > >>> > > I tried -snes_compare_explicit, and got the following error: >>> > > >>> > > [0]PETSC ERROR: Invalid argument >>> > > >>> > > [0]PETSC ERROR: Matrix not generated from a DMDA >>> > > >>> > > What am I doing wrong? >>> > > >>> > > On Tue, Oct 3, 2017 at 10:08 AM, Jed Brown wrote: >>> > > >>> > >> Barry Smith writes: >>> > >> >>> > >> >> On Oct 3, 2017, at 5:54 AM, zakaryah . >>> wrote: >>> > >> >> >>> > >> >> I'm still working on this. I've made some progress, and it >>> looks like >>> > >> the issue is with the KSP, at least for now. The Jacobian may be >>> > >> ill-conditioned. Is it possible to use -snes_test_display during an >>> > >> intermediate step of the analysis? I would like to inspect the >>> Jacobian >>> > >> after several solves have already completed, >>> > >> > >>> > >> > No, our currently code for testing Jacobians is poor quality >>> and >>> > >> poorly organized. Needs a major refactoring to do things properly. >>> Sorry >>> > >> >>> > >> You can use -snes_compare_explicit or -snes_compare_coloring to >>> output >>> > >> differences on each Newton step. >>> > >> >>> > >>> > >>> >>> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > -------------- next part -------------- An HTML attachment was scrubbed... URL: From evanum at gmail.com Wed Oct 11 11:14:03 2017 From: evanum at gmail.com (Evan Um) Date: Wed, 11 Oct 2017 09:14:03 -0700 Subject: [petsc-users] Using factored complex matrices from MUMPS as a preconditioner in PETSC In-Reply-To: References: Message-ID: Hi Hong, Thanks for your kind email. I write another email to make sure I understand it correctly. Does the zipped file from https://www.mcs.anl.gov/petsc/developers/index.html#browsing have the updated feature? Best, Evan - https://bitbucket.org/petsc/petsc/get/master.tar.gz On Wed, Oct 11, 2017 at 7:56 AM, Hong wrote: > Evan: > The branch is already merged to master. Just use petsc master branch. > Hong > > Hi Matt, >> >> Still unclear to me. 
I go to https://bitbucket.org/petsc >> /petsc/addon/pipelines/home#!/ and then https://bitbucket.org/pet >> sc/petsc/branches/. I don't see any tarred file or directory named "update-mumps-5.1.1-cntl". >> Could you explain how to download the modified version from the site a >> little bit in detail? Thank you very much for your help. >> >> Evan >> >> >> >> >> On Wed, Oct 11, 2017 at 1:54 AM, Matthew Knepley >> wrote: >> >>> On Tue, Oct 10, 2017 at 10:30 PM, Evan Um wrote: >>> >>>> Dear Hong, >>>> >>>> I just tried to check PETSC develeoper website but couldn't find the >>>> updated dev version with name of hzang/update-mumps-5.1.1-cntl. >>>> >>> >>> Hong means a branch. You get the dev repository and checkout that branch >>> >>> git checkout hzang/update-mumps-5.1.1-cntl >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> . Could you please let me know the location of the updated dev version? >>>> Where do I need to visit to check out the dev version with the new control >>>> switches? Thank you very much for your help. >>>> >>>> Best, >>>> Evan >>>> >>>> >>>> >>>> >>>> >>>> >>>> On Tue, Oct 3, 2017 at 8:34 AM, Hong wrote: >>>> >>>>> Evan, >>>>> ICNTL(35) and CNTL(7) are added to petsc-mumps interface in branch >>>>> hzhang/update-mumps-5.1.1-cntl >>>>> >>>>> You may give it a try. Once it passes our regression tests, I'll merge >>>>> it to petsc master branch. >>>>> >>>>> Hong >>>>> >>>>> >>>>> On Sun, Sep 24, 2017 at 8:08 PM, Hong wrote: >>>>> >>>>>> I'll check it. >>>>>> Hong >>>>>> >>>>>> On Sun, Sep 24, 2017 at 3:42 PM, Evan Um wrote: >>>>>> >>>>>>> Hi Barry, >>>>>>> >>>>>>> Thanks for your comments. To activate block low rank (BLR) >>>>>>> approximation in MUMPS version 5.1.1, a user needs to turn on the >>>>>>> functionality (i.e. ICNTL(35)=1) and specify the tolerance value (e.g. >>>>>>> CNTL(7)=1e-4). In PETSC, I think that we can set up ICNTL and CNTL >>>>>>> parameters for MUMPS. I was wondering if we can still use BLR approximation >>>>>>> for a preconditioner for Krylov solvers. >>>>>>> >>>>>>> Best, >>>>>>> Evan >>>>>>> >>>>>>> >>>>>>> On Sat, Sep 23, 2017 at 6:45 PM, Barry Smith >>>>>>> wrote: >>>>>>> >>>>>>>> >>>>>>>> > On Sep 23, 2017, at 8:38 PM, Evan Um wrote: >>>>>>>> > >>>>>>>> > Dear PETSC Users, >>>>>>>> > >>>>>>>> > My system matrix comes from finite element modeling and is >>>>>>>> complex and unstructured. Its typical size is a few millions-by a few >>>>>>>> millions. I wondering if I can use MUMPS parallel direct solver as a >>>>>>>> preconditioner in PETSC. For example, I want to pass factored matrices to >>>>>>>> Krylov iterative solvers such as QMR. Is there any PETSC+MUMPS example code >>>>>>>> for the purpose? >>>>>>>> >>>>>>>> You don't pass factored matrices you just pass the original >>>>>>>> matrix and use -pc_type lu -pc_factor_mat_solver_package mumps >>>>>>>> >>>>>>>> > Can PETSC call the latest MUMPS that supports block low rank >>>>>>>> approximation? >>>>>>>> >>>>>>>> No, send us info on it and we'll see if we can add an interface >>>>>>>> >>>>>>>> >>>>>>>> > >>>>>>>> > In advance, thank you very much for your comments. >>>>>>>> > >>>>>>>> > Best, >>>>>>>> > Evan >>>>>>>> > >>>>>>>> > >>>>>>>> > >>>>>>>> > >>>>>>>> > >>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. 
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/

From hzhang at mcs.anl.gov  Wed Oct 11 11:16:29 2017
From: hzhang at mcs.anl.gov (Hong)
Date: Wed, 11 Oct 2017 11:16:29 -0500
Subject: [petsc-users] Using factored complex matrices from MUMPS as a preconditioner in PETSC
In-Reply-To: References:
Message-ID:

Evan,
I have started looking at the petsc/STRUMPACK-sparse interface. Currently, the interface supports two preconditioners: lu and ilu. First, you configure petsc with '--download-strumpack --with-cxx-dialect=C++11' (some external packages are also required). After building petsc, I tested petsc/src/ksp/ksp/examples/tutorials/ex2.c:

mpiexec -n 4 ./ex2 -pc_type ilu -pc_factor_mat_solver_package strumpack -mat_strumpack_hssminsize 2 -ksp_monitor
# WARNING STRUMPACK: There were unrecognized options.
  0 KSP Residual norm 7.459478705273e+00
  1 KSP Residual norm 1.525843261155e-02
  2 KSP Residual norm 6.672464933691e-05
Norm of error 6.61922e-05 iterations 2

Using the option '-help', I see strumpack supports the two preconditioners, lu and ilu, with the following options:

$ mpiexec -n 4 ./ex2 -pc_type ilu -pc_factor_mat_solver_package strumpack -mat_strumpack_hssminsize 2 -ksp_monitor -h |grep strumpack
  -mat_strumpack_verbose: Print STRUMPACK information (None)
  -mat_strumpack_rctol <0.01>: Relative compression tolerance (None)
  -mat_strumpack_colperm: Find a col perm to get nonzero diagonal (None)
  -mat_strumpack_hssminsize <2500>: Minimum size of dense block for HSS compression

My guess is that HSS is what you mentioned as 'low rank approximation' and '-mat_strumpack_hssminsize' is the block size (excuse me for being ignorant of HSS). If this is true, your requested feature is already in our interface. Otherwise, please send me more detailed info about your request. The STRUMPACK interface was contributed by a user, so I have to learn it in order to support it.

Hong

On Tue, Oct 10, 2017 at 2:23 PM, Evan Um wrote:
> Hi Hong,
>
> Thanks for your help. I am testing if I can use MUMPS block low rank
> factors as a preconditioner for QMR in MUMPS framework. I would like to ask
> one more question. STRUMPACK (http://portal.nersc.gov/project/sparse/strumpack/)
> also supports low rank approximation. Can PETSC also allow users to use the
> approximate factors as a preconditioner in PETSC-QMR?
>
> Best,
> Evan
>
> On Tue, Oct 3, 2017 at 8:34 AM, Hong wrote:
>> Evan,
>> ICNTL(35) and CNTL(7) are added to petsc-mumps interface in branch
>> hzhang/update-mumps-5.1.1-cntl
>>
>> You may give it a try. Once it passes our regression tests, I'll merge it
>> to petsc master branch.
>>
>> Hong
>>
>> On Sun, Sep 24, 2017 at 8:08 PM, Hong wrote:
>>> I'll check it.
>>> Hong
>>>
>>> On Sun, Sep 24, 2017 at 3:42 PM, Evan Um wrote:
>>>> Hi Barry,
>>>>
>>>> Thanks for your comments. To activate block low rank (BLR)
>>>> approximation in MUMPS version 5.1.1, a user needs to turn on the
>>>> functionality (i.e. ICNTL(35)=1) and specify the tolerance value (e.g.
>>>> CNTL(7)=1e-4). In PETSC, I think that we can set up ICNTL and CNTL
>>>> parameters for MUMPS. I was wondering if we can still use BLR approximation
>>>> for a preconditioner for Krylov solvers.
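
Side by side, the two low-rank preconditioner variants discussed in this thread would then look roughly like this at the command line. The strumpack option names are exactly those printed by -help above; the MUMPS option names assume the icntl_35/cntl_7 controls added in Hong's branch, and ./myapp and the tolerances are placeholders:

# MUMPS LU with BLR compression as the preconditioner
mpiexec -n 4 ./myapp -ksp_type tfqmr -pc_type lu \
    -pc_factor_mat_solver_package mumps \
    -mat_mumps_icntl_35 1 -mat_mumps_cntl_7 1e-4

# STRUMPACK ILU with HSS compression as the preconditioner
mpiexec -n 4 ./myapp -ksp_type tfqmr -pc_type ilu \
    -pc_factor_mat_solver_package strumpack \
    -mat_strumpack_rctol 1e-4 -mat_strumpack_hssminsize 2500
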
>>>> >>>> Best, >>>> Evan >>>> >>>> >>>> On Sat, Sep 23, 2017 at 6:45 PM, Barry Smith >>>> wrote: >>>> >>>>> >>>>> > On Sep 23, 2017, at 8:38 PM, Evan Um wrote: >>>>> > >>>>> > Dear PETSC Users, >>>>> > >>>>> > My system matrix comes from finite element modeling and is complex >>>>> and unstructured. Its typical size is a few millions-by a few millions. I >>>>> wondering if I can use MUMPS parallel direct solver as a preconditioner in >>>>> PETSC. For example, I want to pass factored matrices to Krylov iterative >>>>> solvers such as QMR. Is there any PETSC+MUMPS example code for the purpose? >>>>> >>>>> You don't pass factored matrices you just pass the original matrix >>>>> and use -pc_type lu -pc_factor_mat_solver_package mumps >>>>> >>>>> > Can PETSC call the latest MUMPS that supports block low rank >>>>> approximation? >>>>> >>>>> No, send us info on it and we'll see if we can add an interface >>>>> >>>>> >>>>> > >>>>> > In advance, thank you very much for your comments. >>>>> > >>>>> > Best, >>>>> > Evan >>>>> > >>>>> > >>>>> > >>>>> > >>>>> > >>>>> >>>>> >>>> >>> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From hzhang at mcs.anl.gov Wed Oct 11 11:26:33 2017 From: hzhang at mcs.anl.gov (Hong) Date: Wed, 11 Oct 2017 11:26:33 -0500 Subject: [petsc-users] Using factored complex matrices from MUMPS as a preconditioner in PETSC In-Reply-To: References: Message-ID: On Wed, Oct 11, 2017 at 11:14 AM, Evan Um wrote: > Hi Hong, > > Thanks for your kind email. I write another email to make sure I > understand it correctly. Does the zipped file from > https://www.mcs.anl.gov/petsc/developers/index.html#browsing have the > updated feature? > > Best, > Evan > > > - https://bitbucket.org/petsc/petsc/get/master.tar.gz > > I do not see it :-( Get petsc-dev by - git clone https://bitbucket.org/petsc/petsc Then do following: git checkout master git pull Check file src/mat/impls/aij/mpi/mumps/mumps.c to see if ICNTL(35) is there or not. Hong > > > > On Wed, Oct 11, 2017 at 7:56 AM, Hong wrote: > >> Evan: >> The branch is already merged to master. Just use petsc master branch. >> Hong >> >> Hi Matt, >>> >>> Still unclear to me. I go to https://bitbucket.org/petsc >>> /petsc/addon/pipelines/home#!/ and then https://bitbucket.org/pet >>> sc/petsc/branches/. I don't see any tarred file or directory named "update-mumps-5.1.1-cntl". >>> Could you explain how to download the modified version from the site a >>> little bit in detail? Thank you very much for your help. >>> >>> Evan >>> >>> >>> >>> >>> On Wed, Oct 11, 2017 at 1:54 AM, Matthew Knepley >>> wrote: >>> >>>> On Tue, Oct 10, 2017 at 10:30 PM, Evan Um wrote: >>>> >>>>> Dear Hong, >>>>> >>>>> I just tried to check PETSC develeoper website but couldn't find the >>>>> updated dev version with name of hzang/update-mumps-5.1.1-cntl. >>>>> >>>> >>>> Hong means a branch. You get the dev repository and checkout that branch >>>> >>>> git checkout hzang/update-mumps-5.1.1-cntl >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> . Could you please let me know the location of the updated dev >>>>> version? Where do I need to visit to check out the dev version with the new >>>>> control switches? Thank you very much for your help. >>>>> >>>>> Best, >>>>> Evan >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> On Tue, Oct 3, 2017 at 8:34 AM, Hong wrote: >>>>> >>>>>> Evan, >>>>>> ICNTL(35) and CNTL(7) are added to petsc-mumps interface in branch >>>>>> hzhang/update-mumps-5.1.1-cntl >>>>>> >>>>>> You may give it a try. 
Once it passes our regression tests, I'll >>>>>> merge it to petsc master branch. >>>>>> >>>>>> Hong >>>>>> >>>>>> >>>>>> On Sun, Sep 24, 2017 at 8:08 PM, Hong wrote: >>>>>> >>>>>>> I'll check it. >>>>>>> Hong >>>>>>> >>>>>>> On Sun, Sep 24, 2017 at 3:42 PM, Evan Um wrote: >>>>>>> >>>>>>>> Hi Barry, >>>>>>>> >>>>>>>> Thanks for your comments. To activate block low rank (BLR) >>>>>>>> approximation in MUMPS version 5.1.1, a user needs to turn on the >>>>>>>> functionality (i.e. ICNTL(35)=1) and specify the tolerance value (e.g. >>>>>>>> CNTL(7)=1e-4). In PETSC, I think that we can set up ICNTL and CNTL >>>>>>>> parameters for MUMPS. I was wondering if we can still use BLR approximation >>>>>>>> for a preconditioner for Krylov solvers. >>>>>>>> >>>>>>>> Best, >>>>>>>> Evan >>>>>>>> >>>>>>>> >>>>>>>> On Sat, Sep 23, 2017 at 6:45 PM, Barry Smith >>>>>>>> wrote: >>>>>>>> >>>>>>>>> >>>>>>>>> > On Sep 23, 2017, at 8:38 PM, Evan Um wrote: >>>>>>>>> > >>>>>>>>> > Dear PETSC Users, >>>>>>>>> > >>>>>>>>> > My system matrix comes from finite element modeling and is >>>>>>>>> complex and unstructured. Its typical size is a few millions-by a few >>>>>>>>> millions. I wondering if I can use MUMPS parallel direct solver as a >>>>>>>>> preconditioner in PETSC. For example, I want to pass factored matrices to >>>>>>>>> Krylov iterative solvers such as QMR. Is there any PETSC+MUMPS example code >>>>>>>>> for the purpose? >>>>>>>>> >>>>>>>>> You don't pass factored matrices you just pass the original >>>>>>>>> matrix and use -pc_type lu -pc_factor_mat_solver_package mumps >>>>>>>>> >>>>>>>>> > Can PETSC call the latest MUMPS that supports block low rank >>>>>>>>> approximation? >>>>>>>>> >>>>>>>>> No, send us info on it and we'll see if we can add an interface >>>>>>>>> >>>>>>>>> >>>>>>>>> > >>>>>>>>> > In advance, thank you very much for your comments. >>>>>>>>> > >>>>>>>>> > Best, >>>>>>>>> > Evan >>>>>>>>> > >>>>>>>>> > >>>>>>>>> > >>>>>>>>> > >>>>>>>>> > >>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>>> https://www.cse.buffalo.edu/~knepley/ >>>> >>> >>> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From evanum at gmail.com Wed Oct 11 11:37:05 2017 From: evanum at gmail.com (Evan Um) Date: Wed, 11 Oct 2017 09:37:05 -0700 Subject: [petsc-users] Using factored complex matrices from MUMPS as a preconditioner in PETSC In-Reply-To: References: Message-ID: Hi Hong, I was able to check out the dev version. Now, I see ICNTL(35) inside mumps.c I am going to test the functionality. If I see any issue, I will report it to you. Thank you very much for your kind help. Best, Evan On Wed, Oct 11, 2017 at 9:26 AM, Hong wrote: > > > On Wed, Oct 11, 2017 at 11:14 AM, Evan Um wrote: > >> Hi Hong, >> >> Thanks for your kind email. I write another email to make sure I >> understand it correctly. Does the zipped file from >> https://www.mcs.anl.gov/petsc/developers/index.html#browsing have the >> updated feature? >> >> Best, >> Evan >> >> >> - https://bitbucket.org/petsc/petsc/get/master.tar.gz >> >> I do not see it :-( > Get petsc-dev by > > - git clone https://bitbucket.org/petsc/petsc > > Then do following: > git checkout master > git pull > > Check file > src/mat/impls/aij/mpi/mumps/mumps.c to see if ICNTL(35) is there or not. 
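
Assembled into a single sequence, the checkout Hong describes looks like this (repository URL and file path as given above; the configure line is only an example and must match the local setup):

$ git clone https://bitbucket.org/petsc/petsc
$ cd petsc
$ git checkout master
$ git pull
$ grep "ICNTL(35)" src/mat/impls/aij/mpi/mumps/mumps.c    # confirm the new control is present
$ ./configure --download-mumps --download-scalapack       # then rebuild as usual
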
> > Hong > >> >> >> >> On Wed, Oct 11, 2017 at 7:56 AM, Hong wrote: >> >>> Evan: >>> The branch is already merged to master. Just use petsc master branch. >>> Hong >>> >>> Hi Matt, >>>> >>>> Still unclear to me. I go to https://bitbucket.org/petsc >>>> /petsc/addon/pipelines/home#!/ and then https://bitbucket.org/pet >>>> sc/petsc/branches/. I don't see any tarred file or directory named "update-mumps-5.1.1-cntl". >>>> Could you explain how to download the modified version from the site a >>>> little bit in detail? Thank you very much for your help. >>>> >>>> Evan >>>> >>>> >>>> >>>> >>>> On Wed, Oct 11, 2017 at 1:54 AM, Matthew Knepley >>>> wrote: >>>> >>>>> On Tue, Oct 10, 2017 at 10:30 PM, Evan Um wrote: >>>>> >>>>>> Dear Hong, >>>>>> >>>>>> I just tried to check PETSC develeoper website but couldn't find the >>>>>> updated dev version with name of hzang/update-mumps-5.1.1-cntl. >>>>>> >>>>> >>>>> Hong means a branch. You get the dev repository and checkout that >>>>> branch >>>>> >>>>> git checkout hzang/update-mumps-5.1.1-cntl >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> . Could you please let me know the location of the updated dev >>>>>> version? Where do I need to visit to check out the dev version with the new >>>>>> control switches? Thank you very much for your help. >>>>>> >>>>>> Best, >>>>>> Evan >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> On Tue, Oct 3, 2017 at 8:34 AM, Hong wrote: >>>>>> >>>>>>> Evan, >>>>>>> ICNTL(35) and CNTL(7) are added to petsc-mumps interface in branch >>>>>>> hzhang/update-mumps-5.1.1-cntl >>>>>>> >>>>>>> You may give it a try. Once it passes our regression tests, I'll >>>>>>> merge it to petsc master branch. >>>>>>> >>>>>>> Hong >>>>>>> >>>>>>> >>>>>>> On Sun, Sep 24, 2017 at 8:08 PM, Hong wrote: >>>>>>> >>>>>>>> I'll check it. >>>>>>>> Hong >>>>>>>> >>>>>>>> On Sun, Sep 24, 2017 at 3:42 PM, Evan Um wrote: >>>>>>>> >>>>>>>>> Hi Barry, >>>>>>>>> >>>>>>>>> Thanks for your comments. To activate block low rank (BLR) >>>>>>>>> approximation in MUMPS version 5.1.1, a user needs to turn on the >>>>>>>>> functionality (i.e. ICNTL(35)=1) and specify the tolerance value (e.g. >>>>>>>>> CNTL(7)=1e-4). In PETSC, I think that we can set up ICNTL and CNTL >>>>>>>>> parameters for MUMPS. I was wondering if we can still use BLR approximation >>>>>>>>> for a preconditioner for Krylov solvers. >>>>>>>>> >>>>>>>>> Best, >>>>>>>>> Evan >>>>>>>>> >>>>>>>>> >>>>>>>>> On Sat, Sep 23, 2017 at 6:45 PM, Barry Smith >>>>>>>>> wrote: >>>>>>>>> >>>>>>>>>> >>>>>>>>>> > On Sep 23, 2017, at 8:38 PM, Evan Um wrote: >>>>>>>>>> > >>>>>>>>>> > Dear PETSC Users, >>>>>>>>>> > >>>>>>>>>> > My system matrix comes from finite element modeling and is >>>>>>>>>> complex and unstructured. Its typical size is a few millions-by a few >>>>>>>>>> millions. I wondering if I can use MUMPS parallel direct solver as a >>>>>>>>>> preconditioner in PETSC. For example, I want to pass factored matrices to >>>>>>>>>> Krylov iterative solvers such as QMR. Is there any PETSC+MUMPS example code >>>>>>>>>> for the purpose? >>>>>>>>>> >>>>>>>>>> You don't pass factored matrices you just pass the original >>>>>>>>>> matrix and use -pc_type lu -pc_factor_mat_solver_package mumps >>>>>>>>>> >>>>>>>>>> > Can PETSC call the latest MUMPS that supports block low rank >>>>>>>>>> approximation? >>>>>>>>>> >>>>>>>>>> No, send us info on it and we'll see if we can add an interface >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> > >>>>>>>>>> > In advance, thank you very much for your comments. 
>>>>>>>>>> > Best,
>>>>>>>>>> > Evan

From aliberkkahraman at yahoo.com  Wed Oct 11 12:12:50 2017
From: aliberkkahraman at yahoo.com (Ali Berk Kahraman)
Date: Wed, 11 Oct 2017 20:12:50 +0300
Subject: [petsc-users] TSSetTimeStep or -ts_dt not working
Message-ID:

Hello All,

I am trying to use timesteppers, however I have a problem. Whenever I want to set a dt for the timestepping integration, I cannot succeed. Neither the function TSSetTimeStep nor the command line option seems to work. Using either of them (but not at the same time) and setting it to 1, when I call TSGetTimeStep to learn the value of dt, it gives me the answer 0.002871. Perhaps this is the default value, however I cannot work with this, since the equation I have is slowly evolving.

Any ideas what might be causing this?

The part of the code I use,

PetscReal time=0, timestep=1;
int maxsteps=30;
float maxtime=300;

TS ts;
ierr= TSCreate(PETSC_COMM_WORLD,&ts);
CHKERRQ(ierr);
ierr= TSSetProblemType(ts,TS_LINEAR);
CHKERRQ(ierr);
ierr= TSSetSolution(ts, dummyvec);
CHKERRQ(ierr);
ierr= TSSetType(ts,TSRK);
CHKERRQ(ierr);
ierr= TSSetTime(ts,time);
CHKERRQ(ierr);
ierr= TSSetTimeStep(ts,timestep);
CHKERRQ(ierr);
ierr=TSSetExactFinalTime(ts,TS_EXACTFINALTIME_STEPOVER);
CHKERRQ(ierr);
TSSetMaxSteps(ts,maxsteps);
TSSetMaxTime(ts,maxtime);

ierr=TSSetFromOptions(ts);CHKERRQ(ierr);
TSSetRHSFunction(ts,residual,
                 FormRHSFunction,&mycontext);
ierr= TSSolve(ts,uJi);
CHKERRQ(ierr);
TSView(ts,PETSC_VIEWER_STDOUT_SELF);

TSConvergedReason reason;
ierr=TSGetConvergedReason(ts,&reason);CHKERRQ(ierr);
printf("Why Converged: %d\n",reason);
PetscReal usedtimestep;
TSGetTimeStep(ts,&usedtimestep);
printf("Used timestep: %f\n",usedtimestep);

From mfadams at lbl.gov  Wed Oct 11 12:18:40 2017
From: mfadams at lbl.gov (Mark Adams)
Date: Wed, 11 Oct 2017 13:18:40 -0400
Subject: [petsc-users] TSSetTimeStep or -ts_dt not working
In-Reply-To: References:
Message-ID:

Try running with -ts_view and -options_left and send all of the output.

On Wed, Oct 11, 2017 at 1:12 PM, Ali Berk Kahraman <aliberkkahraman at yahoo.com> wrote:
> Hello All,
>
> I am trying to use timesteppers, however I have a problem. Whenever I want
> to set a dt for the timstepping integration, I cannot succeed. Neither the
> function TSSetTimeStep nor the command line option seems to work. Using
> either of them(but not at the same time) and setting it to 1, when I call
> TSGetTimeStep to learn the value of dt, it gives me the answer 0.002871.
> Perhaps this is the default value, however I cannot work with this, since
> the equation I have is slowly evolving.
>
> Any ideas what might be causing this?
> > > The part of the code I use, > > > PetscReal time=0, timestep=1; > int maxsteps=30; > float maxtime=300; > > > TS ts; > ierr= TSCreate(PETSC_COMM_WORLD,&ts); > CHKERRQ(ierr); > ierr= TSSetProblemType(ts,TS_LINEAR); > CHKERRQ(ierr); > ierr= TSSetSolution(ts, dummyvec); > CHKERRQ(ierr); > ierr= TSSetType(ts,TSRK); > CHKERRQ(ierr); > ierr= TSSetTime(ts,time); > CHKERRQ(ierr); > ierr= TSSetTimeStep(ts,timestep); > CHKERRQ(ierr); > ierr=TSSetExactFinalTime(ts,TS_EXACTFINALTIME_STEPOVER); > CHKERRQ(ierr); > TSSetMaxSteps(ts,maxsteps); > TSSetMaxTime(ts,maxtime); > > ierr=TSSetFromOptions(ts);CHKERRQ(ierr); > TSSetRHSFunction(ts,residual, > FormRHSFunction,&mycontext); > ierr= TSSolve(ts,uJi); > CHKERRQ(ierr); > TSView(ts,PETSC_VIEWER_STDOUT_SELF); > > TSConvergedReason reason; > ierr=TSGetConvergedReason(ts,&reason);CHKERRQ(ierr); > printf("Why Converged: %d\n",reason); > PetscReal usedtimestep; > TSGetTimeStep(ts,&usedtimestep); > printf("Used timestep: %f\n",usedtimestep); > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From hongzhang at anl.gov Wed Oct 11 12:49:04 2017 From: hongzhang at anl.gov (Zhang, Hong) Date: Wed, 11 Oct 2017 17:49:04 +0000 Subject: [petsc-users] TSSetTimeStep or -ts_dt not working In-Reply-To: References: Message-ID: By default TSRK uses adaptive time stepping. TSSetTimeStep sets the initial time step, and the time step will be adapted automatically during the integration. Since you call TSGetTimeStep after TSSolve, you actually get the step size for the last time step. To view the step size at each step, run with -ts_monitor. If you want to use a fixed step size, you can do -ts_adapt_type none -ts_dt . Hong (Mr.) > On Oct 11, 2017, at 12:12 PM, Ali Berk Kahraman wrote: > > Hello All, > > > I am trying to use timesteppers, however I have a problem. Whenever I want to set a dt for the timstepping integration, I cannot succeed. Neither the function TSSetTimeStep nor the command line option seems to work. Using either of them(but not at the same time) and setting it to 1, when I call TSGetTimeStep to learn the value of dt, it gives me the answer 0.002871. Perhaps this is the default value, however I cannot work with this, since the equation I have is slowly evolving. > > > Any ideas what might be causing this? 
> > > The part of the code I use, > > > PetscReal time=0, timestep=1; > int maxsteps=30; > float maxtime=300; > > > TS ts; > ierr= TSCreate(PETSC_COMM_WORLD,&ts); > CHKERRQ(ierr); > ierr= TSSetProblemType(ts,TS_LINEAR); > CHKERRQ(ierr); > ierr= TSSetSolution(ts, dummyvec); > CHKERRQ(ierr); > ierr= TSSetType(ts,TSRK); > CHKERRQ(ierr); > ierr= TSSetTime(ts,time); > CHKERRQ(ierr); > ierr= TSSetTimeStep(ts,timestep); > CHKERRQ(ierr); > ierr=TSSetExactFinalTime(ts,TS_EXACTFINALTIME_STEPOVER); > CHKERRQ(ierr); > TSSetMaxSteps(ts,maxsteps); > TSSetMaxTime(ts,maxtime); > > ierr=TSSetFromOptions(ts);CHKERRQ(ierr); > TSSetRHSFunction(ts,residual, > FormRHSFunction,&mycontext); > ierr= TSSolve(ts,uJi); > CHKERRQ(ierr); > TSView(ts,PETSC_VIEWER_STDOUT_SELF); > > TSConvergedReason reason; > ierr=TSGetConvergedReason(ts,&reason);CHKERRQ(ierr); > printf("Why Converged: %d\n",reason); > PetscReal usedtimestep; > TSGetTimeStep(ts,&usedtimestep); > printf("Used timestep: %f\n",usedtimestep); > > From aliberkkahraman at yahoo.com Wed Oct 11 14:24:27 2017 From: aliberkkahraman at yahoo.com (Ali Berk Kahraman) Date: Wed, 11 Oct 2017 22:24:27 +0300 Subject: [petsc-users] TSSetTimeStep or -ts_dt not working In-Reply-To: References: Message-ID: <2a0f74d9-09b3-0e3f-2de1-58ea808214a4@yahoo.com> Dr. Zhang, Thank you for your reply. My problem is solved using a slight variation of your suggestions. To execute your suggestion inside the code, I have used TSGetAdapt and TSAdaptSetType and set the type to TSADAPTNONE. This solved my problem. On 11-10-2017 20:49, Zhang, Hong wrote: > By default TSRK uses adaptive time stepping. TSSetTimeStep sets the initial time step, and the time step will be adapted automatically during the integration. Since you call TSGetTimeStep after TSSolve, you actually get the step size for the last time step. To view the step size at each step, run with -ts_monitor. > > If you want to use a fixed step size, you can do -ts_adapt_type none -ts_dt . > > Hong (Mr.) > >> On Oct 11, 2017, at 12:12 PM, Ali Berk Kahraman wrote: >> >> Hello All, >> >> >> I am trying to use timesteppers, however I have a problem. Whenever I want to set a dt for the timstepping integration, I cannot succeed. Neither the function TSSetTimeStep nor the command line option seems to work. Using either of them(but not at the same time) and setting it to 1, when I call TSGetTimeStep to learn the value of dt, it gives me the answer 0.002871. Perhaps this is the default value, however I cannot work with this, since the equation I have is slowly evolving. >> >> >> Any ideas what might be causing this? 
>>
>>
>> The part of the code I use,
>>
>>
>> PetscReal time=0, timestep=1;
>> int maxsteps=30;
>> float maxtime=300;
>>
>>
>> TS ts;
>> ierr= TSCreate(PETSC_COMM_WORLD,&ts);
>> CHKERRQ(ierr);
>> ierr= TSSetProblemType(ts,TS_LINEAR);
>> CHKERRQ(ierr);
>> ierr= TSSetSolution(ts, dummyvec);
>> CHKERRQ(ierr);
>> ierr= TSSetType(ts,TSRK);
>> CHKERRQ(ierr);
>> ierr= TSSetTime(ts,time);
>> CHKERRQ(ierr);
>> ierr= TSSetTimeStep(ts,timestep);
>> CHKERRQ(ierr);
>> ierr=TSSetExactFinalTime(ts,TS_EXACTFINALTIME_STEPOVER);
>> CHKERRQ(ierr);
>> TSSetMaxSteps(ts,maxsteps);
>> TSSetMaxTime(ts,maxtime);
>>
>> ierr=TSSetFromOptions(ts);CHKERRQ(ierr);
>> TSSetRHSFunction(ts,residual,
>>                  FormRHSFunction,&mycontext);
>> ierr= TSSolve(ts,uJi);
>> CHKERRQ(ierr);
>> TSView(ts,PETSC_VIEWER_STDOUT_SELF);
>>
>> TSConvergedReason reason;
>> ierr=TSGetConvergedReason(ts,&reason);CHKERRQ(ierr);
>> printf("Why Converged: %d\n",reason);
>> PetscReal usedtimestep;
>> TSGetTimeStep(ts,&usedtimestep);
>> printf("Used timestep: %f\n",usedtimestep);
>>

From aliberkkahraman at yahoo.com  Wed Oct 11 14:25:17 2017
From: aliberkkahraman at yahoo.com (Ali Berk Kahraman)
Date: Wed, 11 Oct 2017 22:25:17 +0300
Subject: Re: [petsc-users] TSSetTimeStep or -ts_dt not working
In-Reply-To: 
References: 
Message-ID: <68e11fe9-868e-f2d7-c10f-abf17db36d1c@yahoo.com>

Dr. Adams, thank you for your reply. I have solved the problem with the suggestions of Dr. Zhang, who answered right after you.

On 11-10-2017 20:18, Mark Adams wrote:
> Try running with -ts_view and -options_left and send all of the output.
>
> On Wed, Oct 11, 2017 at 1:12 PM, Ali Berk Kahraman wrote:
>
>     Hello All,
>
>     I am trying to use timesteppers, however I have a problem.
>     Whenever I want to set a dt for the timestepping integration, I
>     cannot succeed. Neither the function TSSetTimeStep nor the command-line
>     option seems to work. Using either of them (but not both at the
>     same time) and setting it to 1, when I call TSGetTimeStep to learn
>     the value of dt, it gives me the answer 0.002871. Perhaps this is
>     the default value; however, I cannot work with this, since the
>     equation I have is slowly evolving.
>
>     Any ideas what might be causing this?
>
>     The part of the code I use,
>
>     PetscReal time=0, timestep=1;
>     int maxsteps=30;
>     float maxtime=300;
>
>     TS ts;
>     ierr= TSCreate(PETSC_COMM_WORLD,&ts);
>     CHKERRQ(ierr);
>     ierr= TSSetProblemType(ts,TS_LINEAR);
>     CHKERRQ(ierr);
>     ierr= TSSetSolution(ts, dummyvec);
>     CHKERRQ(ierr);
>     ierr= TSSetType(ts,TSRK);
>     CHKERRQ(ierr);
>     ierr= TSSetTime(ts,time);
>     CHKERRQ(ierr);
>     ierr= TSSetTimeStep(ts,timestep);
>     CHKERRQ(ierr);
>     ierr=TSSetExactFinalTime(ts,TS_EXACTFINALTIME_STEPOVER);
>     CHKERRQ(ierr);
>     TSSetMaxSteps(ts,maxsteps);
>     TSSetMaxTime(ts,maxtime);
>
>     ierr=TSSetFromOptions(ts);CHKERRQ(ierr);
>     TSSetRHSFunction(ts,residual,
>                      FormRHSFunction,&mycontext);
>     ierr= TSSolve(ts,uJi);
>     CHKERRQ(ierr);
>     TSView(ts,PETSC_VIEWER_STDOUT_SELF);
>
>     TSConvergedReason reason;
>     ierr=TSGetConvergedReason(ts,&reason);CHKERRQ(ierr);
>     printf("Why Converged: %d\n",reason);
>     PetscReal usedtimestep;
>     TSGetTimeStep(ts,&usedtimestep);
>     printf("Used timestep: %f\n",usedtimestep);
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
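For reference, the fix Ali describes corresponds to the following TS calls -- a minimal sketch only, assuming the PETSc 3.8-era API used in the code above; the variable names (ts, adapt, dt) are illustrative, not taken from his program:

    TSAdapt   adapt;
    PetscReal dt = 1.0;                      /* the fixed step we want TSRK to take */

    ierr = TSSetType(ts,TSRK);CHKERRQ(ierr);
    /* in-code equivalent of -ts_adapt_type none: disable the step adaptor */
    ierr = TSGetAdapt(ts,&adapt);CHKERRQ(ierr);
    ierr = TSAdaptSetType(adapt,TSADAPTNONE);CHKERRQ(ierr);
    /* with the adaptor disabled, this is the step used for every step, not just the first */
    ierr = TSSetTimeStep(ts,dt);CHKERRQ(ierr);

Running with -ts_monitor should then show the same dt at every step.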
From w.t.jones at nasa.gov  Wed Oct 11 16:29:47 2017
From: w.t.jones at nasa.gov (William T Jones)
Date: Wed, 11 Oct 2017 17:29:47 -0400
Subject: [petsc-users] Trouble installing petsc4py in Anaconda environment
Message-ID: <839806b2-d462-a261-be74-80948ddb658b@nasa.gov>

I have created an Anaconda Python 2.7 environment on an SGI-ICE machine and included cython, numpy=1.12, scipy, and mpi4py (based on SGI-MPT). While petsc installs fine with:

% PETSC_CONFIGURE_OPTIONS="--download-fblaslapack=1" pip install https://bitbucket.org/petsc/petsc/get/maint.tar.gz

I cannot get petsc4py to build/install. I am attempting with:

% export PETSC_DIR=${PREFIX}/envs/myenv/lib/python2.7/site-packages/petsc
% pip install --no-dependencies petsc4py

Note, I am using "--no-dependencies" because I want to leave numpy at 1.12 and do not want it to be upgraded. Either way I get the output below. It appears that the link command has been corrupted with the addition of the "gcc" command in the middle of the link command.

Any help is appreciated,

% pip install --no-dependencies petsc4py
Collecting petsc4py
  Using cached petsc4py-3.8.0.tar.gz
Building wheels for collected packages: petsc4py
  Running setup.py bdist_wheel for petsc4py ... error
  Complete output from command /home/login/anaconda2/envs/myenv/bin/python -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-v6lDIk/petsc4py/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/tmpD2KqLjpip-wheel- --python-tag cp27:
  running bdist_wheel
  running build
  running build_src
  running build_py
  creating build
  creating build/lib.linux-x86_64-2.7
  creating build/lib.linux-x86_64-2.7/petsc4py
  copying src/PETSc.py -> build/lib.linux-x86_64-2.7/petsc4py
  copying src/__init__.py -> build/lib.linux-x86_64-2.7/petsc4py
  copying src/__main__.py -> build/lib.linux-x86_64-2.7/petsc4py
  creating build/lib.linux-x86_64-2.7/petsc4py/lib
  copying src/lib/__init__.py -> build/lib.linux-x86_64-2.7/petsc4py/lib
  creating build/lib.linux-x86_64-2.7/petsc4py/include
  creating build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
  copying src/include/petsc4py/petsc4py.PETSc.h -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
  copying src/include/petsc4py/numpy.h -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
  copying src/include/petsc4py/petsc4py.PETSc_api.h -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
  copying src/include/petsc4py/petsc4py.h -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
  copying src/include/petsc4py/petsc4py.i -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
  copying src/include/petsc4py/__init__.pxd -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
  copying src/include/petsc4py/PETSc.pxd -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
  copying src/include/petsc4py/__init__.pyx -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
  copying src/PETSc.pxd -> build/lib.linux-x86_64-2.7/petsc4py
  copying src/lib/petsc.cfg -> build/lib.linux-x86_64-2.7/petsc4py/lib
  running build_ext
  PETSC_DIR: /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc
  PETSC_ARCH:
  version:      3.8.0 release
  integer-size: 32-bit
  scalar-type:  real
  precision:    double
  language:     CONLY
  compiler:     /vendor/sgi/mpt/2.14r19/bin/mpicc
  linker:       /vendor/sgi/mpt/2.14r19/bin/mpicc
  building 'PETSc' extension
  creating build/temp.linux-x86_64-2.7
  creating build/temp.linux-x86_64-2.7/src
  /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B
/home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -g -O -fPIC -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -DPETSC_DIR=/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/include -Isrc/include -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include -I/home/login/anaconda2/envs/myenv/include/python2.7 -c src/PETSc.c -o build/temp.linux-x86_64-2.7/src/PETSc.o
  In file included from /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1788:0,
                   from /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:18,
                   from /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/arrayobject.h:4,
                   from src/include/petsc4py/numpy.h:11,
                   from src/petsc4py.PETSc.c:519,
                   from src/PETSc.c:3:
  /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/npy_1_7_deprecated_api.h:15:2: warning: #warning "Using deprecated NumPy API, disable it by " "#defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION" [-Wcpp]
   #warning "Using deprecated NumPy API, disable it by " \
    ^
  /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -g -O -fPIC -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -DPETSC_DIR=/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/include -Isrc/include -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include -I/home/login/anaconda2/envs/myenv/include/python2.7 -c src/libpetsc4py.c -o build/temp.linux-x86_64-2.7/src/libpetsc4py.o
  /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -g -O gcc -pthread -shared -B /home/login/anaconda2/envs/myenv/compiler_compat -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -L/home/login/anaconda2/envs/myenv/lib -Wl,-rpath=/home/login/anaconda2/envs/myenv/lib -Wl,--no-as-needed -Wl,--sysroot=/ build/temp.linux-x86_64-2.7/src/PETSc.o build/temp.linux-x86_64-2.7/src/libpetsc4py.o -L/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/lib -L/home/login/anaconda2/envs/myenv/lib -Wl,-rpath,/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/lib -Wl,-rpath,/home/login/anaconda2/envs/myenv/lib -lpetsc -lpython2.7 -o build/lib.linux-x86_64-2.7/petsc4py/lib/PETSc.so
  gcc: error: gcc: No such file or directory
  error: command '/vendor/sgi/mpt/2.14r19/bin/mpicc' failed with exit status 1

  ----------------------------------------
  Failed building wheel for petsc4py
  Running setup.py clean for petsc4py
Failed to build petsc4py
Installing collected packages: petsc4py
  Running setup.py install for petsc4py ...
error
    Complete output from command /home/login/anaconda2/envs/myenv/bin/python -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-v6lDIk/petsc4py/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-d6NYW8-record/install-record.txt --single-version-externally-managed --compile:
    running install
    running build
    running build_src
    running build_py
    creating build
    creating build/lib.linux-x86_64-2.7
    creating build/lib.linux-x86_64-2.7/petsc4py
    copying src/PETSc.py -> build/lib.linux-x86_64-2.7/petsc4py
    copying src/__init__.py -> build/lib.linux-x86_64-2.7/petsc4py
    copying src/__main__.py -> build/lib.linux-x86_64-2.7/petsc4py
    creating build/lib.linux-x86_64-2.7/petsc4py/lib
    copying src/lib/__init__.py -> build/lib.linux-x86_64-2.7/petsc4py/lib
    creating build/lib.linux-x86_64-2.7/petsc4py/include
    creating build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
    copying src/include/petsc4py/petsc4py.PETSc.h -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
    copying src/include/petsc4py/numpy.h -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
    copying src/include/petsc4py/petsc4py.PETSc_api.h -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
    copying src/include/petsc4py/petsc4py.h -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
    copying src/include/petsc4py/petsc4py.i -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
    copying src/include/petsc4py/__init__.pxd -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
    copying src/include/petsc4py/PETSc.pxd -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
    copying src/include/petsc4py/__init__.pyx -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
    copying src/PETSc.pxd -> build/lib.linux-x86_64-2.7/petsc4py
    copying src/lib/petsc.cfg -> build/lib.linux-x86_64-2.7/petsc4py/lib
    running build_ext
    PETSC_DIR: /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc
    PETSC_ARCH:
    version:      3.8.0 release
    integer-size: 32-bit
    scalar-type:  real
    precision:    double
    language:     CONLY
    compiler:     /vendor/sgi/mpt/2.14r19/bin/mpicc
    linker:       /vendor/sgi/mpt/2.14r19/bin/mpicc
    building 'PETSc' extension
    creating build/temp.linux-x86_64-2.7
    creating build/temp.linux-x86_64-2.7/src
    /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -g -O -fPIC -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -DPETSC_DIR=/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/include -Isrc/include -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include -I/home/login/anaconda2/envs/myenv/include/python2.7 -c src/PETSc.c -o build/temp.linux-x86_64-2.7/src/PETSc.o
    In file included from /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1788:0,
                     from /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:18,
                     from /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/arrayobject.h:4,
                     from src/include/petsc4py/numpy.h:11,
                     from src/petsc4py.PETSc.c:519,
                     from src/PETSc.c:3:
    /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/npy_1_7_deprecated_api.h:15:2: warning: #warning "Using deprecated NumPy API, disable it by " "#defining
NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION" [-Wcpp]
     #warning "Using deprecated NumPy API, disable it by " \
      ^
    /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -g -O -fPIC -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -DPETSC_DIR=/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/include -Isrc/include -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include -I/home/login/anaconda2/envs/myenv/include/python2.7 -c src/libpetsc4py.c -o build/temp.linux-x86_64-2.7/src/libpetsc4py.o
    /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -g -O gcc -pthread -shared -B /home/login/anaconda2/envs/myenv/compiler_compat -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -L/home/login/anaconda2/envs/myenv/lib -Wl,-rpath=/home/login/anaconda2/envs/myenv/lib -Wl,--no-as-needed -Wl,--sysroot=/ build/temp.linux-x86_64-2.7/src/PETSc.o build/temp.linux-x86_64-2.7/src/libpetsc4py.o -L/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/lib -L/home/login/anaconda2/envs/myenv/lib -Wl,-rpath,/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/lib -Wl,-rpath,/home/login/anaconda2/envs/myenv/lib -lpetsc -lpython2.7 -o build/lib.linux-x86_64-2.7/petsc4py/lib/PETSc.so
    gcc: error: gcc: No such file or directory
    error: command '/vendor/sgi/mpt/2.14r19/bin/mpicc' failed with exit status 1

    ----------------------------------------
Command "/home/login/anaconda2/envs/myenv/bin/python -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-v6lDIk/petsc4py/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-d6NYW8-record/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /tmp/pip-build-v6lDIk/petsc4py/

--
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

    Bill Jones                                   W.T.JONES at NASA.GOV
    Mail Stop 128                     Computational AeroSciences Branch
    15 Langley Boulevard                           Research Directorate
    NASA Langley Research Center               Building 1268, Room 1044
    Hampton, VA  23681-2199                       Phone +1 757 864-5318
                                                    Fax +1 757 864-8816
    http://fun3d.larc.nasa.gov
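The failing link line above shows mpicc's compile flags followed by what looks like the environment's entire original LDSHARED value ("gcc -pthread -shared -B ..."), stray "gcc" included. One workaround worth trying -- an untested sketch, relying only on the fact that distutils honors an LDSHARED environment override when building extensions -- is to hand the build a link command that already points at the MPI wrapper before invoking pip:

% export LDSHARED='/vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -shared'
% pip install --no-dependencies petsc4py

If that makes the embedded "gcc" disappear from the link line, the remaining flags all come from the wrapper itself.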
From knepley at gmail.com  Wed Oct 11 18:06:58 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Wed, 11 Oct 2017 19:06:58 -0400
Subject: Re: [petsc-users] Trouble installing petsc4py in Anaconda environment
In-Reply-To: <839806b2-d462-a261-be74-80948ddb658b@nasa.gov>
References: <839806b2-d462-a261-be74-80948ddb658b@nasa.gov>
Message-ID: 

On Wed, Oct 11, 2017 at 5:29 PM, William T Jones wrote:

> I have created an Anaconda Python 2.7 environment on an SGI-ICE machine
> and included cython, numpy=1.12, scipy, and mpi4py (based on SGI-MPT). While
> petsc installs fine with:
>
> % PETSC_CONFIGURE_OPTIONS="--download-fblaslapack=1" pip install
> https://bitbucket.org/petsc/petsc/get/maint.tar.gz
>
> I cannot get petsc4py to build/install. I am attempting with:
>
> % export PETSC_DIR=${PREFIX}/envs/myenv/lib/python2.7/site-packages/petsc
> % pip install --no-dependencies petsc4py
>
> Note, I am using "--no-dependencies" because I want to leave numpy at 1.12
> and do not want it to be upgraded. Either way I get the output below. It
> appears that the link command has been corrupted with the addition of the
> "gcc" command in the middle of the link command.
>

/vendor/sgi/mpt/2.14r19/bin/mpicc is being called. Does this actually work? I would suspect it of calling 'gcc'

   Matt

> Any help is appreciated,
>
> % pip install --no-dependencies petsc4py
> Collecting petsc4py
>   Using cached petsc4py-3.8.0.tar.gz
> Building wheels for collected packages: petsc4py
>   Running setup.py bdist_wheel for petsc4py ... error
>   Complete output from command /home/login/anaconda2/envs/myenv/bin/python -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-v6lDIk/petsc4py/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/tmpD2KqLjpip-wheel- --python-tag cp27:
>   running bdist_wheel
>   running build
>   running build_src
>   running build_py
>   creating build
>   creating build/lib.linux-x86_64-2.7
>   creating build/lib.linux-x86_64-2.7/petsc4py
>   copying src/PETSc.py -> build/lib.linux-x86_64-2.7/petsc4py
>   copying src/__init__.py -> build/lib.linux-x86_64-2.7/petsc4py
>   copying src/__main__.py -> build/lib.linux-x86_64-2.7/petsc4py
>   creating build/lib.linux-x86_64-2.7/petsc4py/lib
>   copying src/lib/__init__.py -> build/lib.linux-x86_64-2.7/petsc4py/lib
>   creating build/lib.linux-x86_64-2.7/petsc4py/include
>   creating build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
>   copying src/include/petsc4py/petsc4py.PETSc.h -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
>   copying src/include/petsc4py/numpy.h -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
>   copying src/include/petsc4py/petsc4py.PETSc_api.h -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
>   copying src/include/petsc4py/petsc4py.h -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
>   copying src/include/petsc4py/petsc4py.i -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
>   copying src/include/petsc4py/__init__.pxd -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
>   copying src/include/petsc4py/PETSc.pxd -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
>   copying src/include/petsc4py/__init__.pyx -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py
>   copying src/PETSc.pxd -> build/lib.linux-x86_64-2.7/petsc4py
>   copying src/lib/petsc.cfg -> build/lib.linux-x86_64-2.7/petsc4py/lib
>   running build_ext
>   PETSC_DIR: /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc
>   PETSC_ARCH:
>   version:      3.8.0 release
>   integer-size: 32-bit
>   scalar-type:  real
>   precision:    double
>   language:     CONLY
>   compiler:     /vendor/sgi/mpt/2.14r19/bin/mpicc
>   linker:       /vendor/sgi/mpt/2.14r19/bin/mpicc
>   building 'PETSc' extension
>   creating build/temp.linux-x86_64-2.7
>   creating build/temp.linux-x86_64-2.7/src
>   /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -g -O -fPIC -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -DPETSC_DIR=/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/include -Isrc/include -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include -I/home/login/anaconda2/envs/myenv/include/python2.7 -c src/PETSc.c -o build/temp.linux-x86_64-2.7/src/PETSc.o
>   In file included from /home/login/anaconda2/envs/mye
nv/lib/python2.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1788:0, > > from /home/login/anaconda2/envs/mye > nv/lib/python2.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:18, > > from /home/login/anaconda2/envs/mye > nv/lib/python2.7/site-packages/numpy/core/include/numpy/arrayobject.h:4, > from src/include/petsc4py/numpy.h:11, > from src/petsc4py.PETSc.c:519, > from src/PETSc.c:3: > > /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages > /numpy/core/include/numpy/npy_1_7_deprecated_api.h:15:2: warning: > #warning "Using deprecated NumPy API, disable it by " "#defining > NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION" [-Wcpp] > #warning "Using deprecated NumPy API, disable it by " \ > ^ > /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B > /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall > -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector > -g -O -fPIC -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall > -DPETSC_DIR=/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc > -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/include > -Isrc/include -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include > -I/home/login/anaconda2/envs/myenv/include/python2.7 -c src/libpetsc4py.c > -o build/temp.linux-x86_64-2.7/src/libpetsc4py.o > /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B > /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall > -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector > -g -O gcc -pthread -shared -B /home/login/anaconda2/envs/myenv/compiler_compat > -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall > -L/home/login/anaconda2/envs/myenv/lib -Wl,-rpath=/home/login/anaconda2/envs/myenv/lib > -Wl,--no-as-needed -Wl,--sysroot=/ build/temp.linux-x86_64-2.7/src/PETSc.o > build/temp.linux-x86_64-2.7/src/libpetsc4py.o > -L/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/lib > -L/home/login/anaconda2/envs/myenv/lib -Wl,-rpath,/home/login/anacond > a2/envs/myenv/lib/python2.7/site-packages/petsc/lib > -Wl,-rpath,/home/login/anaconda2/envs/myenv/lib -lpetsc -lpython2.7 -o > build/lib.linux-x86_64-2.7/petsc4py/lib/PETSc.so > gcc: error: gcc: No such file or directory > error: command '/vendor/sgi/mpt/2.14r19/bin/mpicc' failed with exit > status 1 > > > ---------------------------------------- > Failed building wheel for petsc4py > Running setup.py clean for petsc4py > Failed to build petsc4py > Installing collected packages: petsc4py > Running setup.py install for petsc4py ... 
error > Complete output from command /home/login/anaconda2/envs/myenv/bin/python > -u -c "import setuptools, tokenize;__file__='/tmp/pip-bu > ild-v6lDIk/petsc4py/setup.py';f=getattr(tokenize, 'open', > open)(__file__);code=f.read().replace('\r\n', > '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record > /tmp/pip-d6NYW8-record/install-record.txt --single-version-externally-managed > --compile: > running install > running build > running build_src > running build_py > creating build > creating build/lib.linux-x86_64-2.7 > creating build/lib.linux-x86_64-2.7/petsc4py > copying src/PETSc.py -> build/lib.linux-x86_64-2.7/petsc4py > copying src/__init__.py -> build/lib.linux-x86_64-2.7/petsc4py > copying src/__main__.py -> build/lib.linux-x86_64-2.7/petsc4py > creating build/lib.linux-x86_64-2.7/petsc4py/lib > copying src/lib/__init__.py -> build/lib.linux-x86_64-2.7/petsc4py/lib > creating build/lib.linux-x86_64-2.7/petsc4py/include > creating build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/petsc4py.PETSc.h -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/numpy.h -> build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > > copying src/include/petsc4py/petsc4py.PETSc_api.h -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/petsc4py.h -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/petsc4py.i -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/__init__.pxd -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/PETSc.pxd -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/__init__.pyx -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/PETSc.pxd -> build/lib.linux-x86_64-2.7/petsc4py > copying src/lib/petsc.cfg -> build/lib.linux-x86_64-2.7/petsc4py/lib > running build_ext > PETSC_DIR: /home/login/anaconda2/envs/mye > nv/lib/python2.7/site-packages/petsc > PETSC_ARCH: > version: 3.8.0 release > integer-size: 32-bit > scalar-type: real > precision: double > language: CONLY > compiler: /vendor/sgi/mpt/2.14r19/bin/mpicc > linker: /vendor/sgi/mpt/2.14r19/bin/mpicc > building 'PETSc' extension > creating build/temp.linux-x86_64-2.7 > creating build/temp.linux-x86_64-2.7/src > /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B > /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall > -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector > -g -O -fPIC -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall > -DPETSC_DIR=/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc > -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/include > -Isrc/include -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include > -I/home/login/anaconda2/envs/myenv/include/python2.7 -c src/PETSc.c -o > build/temp.linux-x86_64-2.7/src/PETSc.o > In file included from /home/login/anaconda2/envs/mye > nv/lib/python2.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1788:0, > > from /home/login/anaconda2/envs/mye > nv/lib/python2.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:18, > > from /home/login/anaconda2/envs/mye > nv/lib/python2.7/site-packages/numpy/core/include/numpy/arrayobject.h:4, > from src/include/petsc4py/numpy.h:11, > from src/petsc4py.PETSc.c:519, > from src/PETSc.c:3: > > 
/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages > /numpy/core/include/numpy/npy_1_7_deprecated_api.h:15:2: warning: > #warning "Using deprecated NumPy API, disable it by " "#defining > NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION" [-Wcpp] > #warning "Using deprecated NumPy API, disable it by " \ > ^ > /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B > /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall > -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector > -g -O -fPIC -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall > -DPETSC_DIR=/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc > -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/include > -Isrc/include -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include > -I/home/login/anaconda2/envs/myenv/include/python2.7 -c src/libpetsc4py.c > -o build/temp.linux-x86_64-2.7/src/libpetsc4py.o > /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B > /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall > -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector > -g -O gcc -pthread -shared -B /home/login/anaconda2/envs/myenv/compiler_compat > -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall > -L/home/login/anaconda2/envs/myenv/lib -Wl,-rpath=/home/login/anaconda2/envs/myenv/lib > -Wl,--no-as-needed -Wl,--sysroot=/ build/temp.linux-x86_64-2.7/src/PETSc.o > build/temp.linux-x86_64-2.7/src/libpetsc4py.o > -L/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/lib > -L/home/login/anaconda2/envs/myenv/lib -Wl,-rpath,/home/login/anacond > a2/envs/myenv/lib/python2.7/site-packages/petsc/lib > -Wl,-rpath,/home/login/anaconda2/envs/myenv/lib -lpetsc -lpython2.7 -o > build/lib.linux-x86_64-2.7/petsc4py/lib/PETSc.so > gcc: error: gcc: No such file or directory > error: command '/vendor/sgi/mpt/2.14r19/bin/mpicc' failed with exit > status 1 > > ---------------------------------------- > Command "/home/login/anaconda2/envs/myenv/bin/python -u -c "import > setuptools, tokenize;__file__='/tmp/pip-build-v6lDIk/petsc4py/setup.py';f=getattr(tokenize, > 'open', open)(__file__);code=f.read().replace('\r\n', > '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record > /tmp/pip-d6NYW8-record/install-record.txt --single-version-externally-managed > --compile" failed with error code 1 in /tmp/pip-build-v6lDIk/petsc4py/ > > > -- > =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=- > > Bill Jones W.T.JONES at NASA.GOV > Mail Stop 128 Computational AeroSciences Branch > 15 Langley Boulevard Research Directorate > NASA Langley Research Center Building 1268, Room 1044 > Hampton, VA 23681-2199 Phone +1 757 864-5318 > Fax +1 757 864-8816 > http://fun3d.larc.nasa.gov > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 

From knepley at gmail.com  Wed Oct 11 19:53:23 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Wed, 11 Oct 2017 20:53:23 -0400
Subject: Re: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods
In-Reply-To: 
References: <98D45588-381F-44FB-9D20-65D1638E357F@mcs.anl.gov> <876F0ACF-62C0-4DC9-B203-2893B25E3938@mcs.anl.gov> <1D94CC99-2395-4BBE-A0F3-80D23D85FA45@mcs.anl.gov> <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov>
Message-ID: 

On Wed, Oct 11, 2017 at 11:33 AM, zakaryah . wrote:

> Many thanks for the suggestions, Matt.
>
> I tried putting the solvers in a loop, like this:
>
> do {
>   NewtonLS
>   check convergence
>   if (converged) break
>   NRichardson or NGMRES
> } while (!converged)
>
> The results were interesting, to me at least. With NRichardson, there was
> indeed improvement in the residual norm, followed by improvement with
> NewtonLS, and so on for a few iterations of this loop. In each case, after
> a few iterations the NewtonLS appeared to be stuck in the same way as after
> the first iteration. Eventually neither method was able to reduce the
> residual norm, which was still significant, so this was not a total
> success. With NGMRES, the initial behavior was similar, but eventually the
> NGMRES progress became erratic. The minimal residual norm was a bit better
> using NGMRES than NRichardson, but neither combination of methods fully
> converged. For both NRichardson and NGMRES, I simply used the defaults, as
> I have no knowledge of how to tune the options for my problem.
>

Are you certain that the equations have a solution? I become a little
concerned when Richardson stops converging. It is still possible that you have
really hard-to-solve equations; it just becomes less likely. And even if
they truly are hard to solve, then there should be physical reasons for
this. For example, it could be that discretizing the minimizing PDE is just
the wrong thing to do. I believe this is the case in fracture, where you
attack the minimization problem directly.

   Matt

> On Tue, Oct 10, 2017 at 4:08 PM, Matthew Knepley wrote:
>
>> On Tue, Oct 10, 2017 at 12:08 PM, zakaryah . wrote:
>>
>>> Thanks for clearing that up.
>>>
>>> I'd appreciate any further help. Here's a summary:
>>>
>>> My ultimate goal is to find a vector field which minimizes an action.
>>> The action is a (nonlinear) function of the field and its first spatial
>>> derivatives.
>>>
>>> My current approach is to derive the (continuous) Euler-Lagrange
>>> equations, which results in a nonlinear PDE that the minimizing field must
>>> satisfy. These Euler-Lagrange equations are then discretized, and I'm
>>> trying to use an SNES to solve them.
>>>
>>> The problem is that the solver seems to reach a point at which the
>>> Jacobian (this corresponds to the second variation of the action, which is
>>> like a Hessian of the energy) becomes nearly singular, but where the
>>> residual (RHS of PDE) is not close to zero. The residual does not decrease
>>> over additional SNES iterations, and the line search results in tiny step
>>> sizes. My interpretation is that this point of stagnation is a critical
>>> point.
>>>
>>
>> The normal thing to do here (I think) is to engage solvers which do not
>> depend on that particular point. So using
>> NRichardson, or maybe NGMRES, to get past that.
>> I would be interested to
>> see if this is successful.
>>
>>    Matt
>>
>>> I have checked the hand-coded Jacobian very carefully and I am confident
>>> that it is correct.
>>>
>>> I am guessing that such a situation is well known in the field, but I
>>> don't know the lingo or literature. If anyone has suggestions I'd be
>>> thrilled. Are there documentation/methodologies within PETSc for this type
>>> of situation?
>>>
>>> Is there any advantage to discretizing the action itself and using the
>>> optimization routines? With minor modifications I'll have the gradient and
>>> Hessian calculations coded. Are the optimization routines likely to
>>> stagnate in the same way as the nonlinear solver, or can they take
>>> advantage of the structure of the problem to overcome this?
>>>
>>> Thanks a lot in advance for any help.
>>>
>>> On Sun, Oct 8, 2017 at 5:57 AM, Barry Smith wrote:
>>>
>>>>    There is apparently confusion in understanding the ordering. Is this
>>>> all on one process that you get funny results? Are you using
>>>> MatSetValuesStencil() to provide the matrix (it is generally easier than
>>>> providing it yourself)? In parallel MatView() always maps the rows and
>>>> columns to the natural ordering before printing, if you use a matrix
>>>> created from the DMDA. If you create the matrix yourself it has a different
>>>> MatView in parallel that is in the PETSc ordering.
>>>>
>>>>    Barry
>>>>
>>>> > On Oct 8, 2017, at 8:05 AM, zakaryah . wrote:
>>>> >
>>>> > I'm more confused than ever. I don't understand the output of
>>>> > -snes_type test -snes_test_display.
>>>> >
>>>> > For the user-defined state of the vector (where I'd like to test the
>>>> > Jacobian), the finite difference Jacobian at row 0 evaluates as:
>>>> >
>>>> > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, 287.797) (5, 744.695) (9, -1454.66) (10, 6.08793) (11, 148.172) (12, 13.1089) (13, -36.5783) (14, -9.99399) (27, -3423.49) (28, -2175.34) (29, 548.662) (30, 145.753) (31, 17.6603) (32, -15.1079) (36, 76.8575) (37, 16.325) (38, 4.83918)
>>>> >
>>>> > But the hand-coded Jacobian at row 0 evaluates as:
>>>> >
>>>> > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, 287.797) (5, 744.695) (33, -1454.66) (34, 6.08792) (35, 148.172) (36, 13.1089) (37, -36.5783) (38, -9.99399) (231, -3423.49) (232, -2175.34) (233, 548.662) (234, 145.753) (235, 17.6603) (236, -15.1079) (264, 76.8575) (265, 16.325) (266, 4.83917) (267, 0.) (268, 0.) (269, 0.)
>>>> >
>>>> > and the difference between the Jacobians at row 0 evaluates as:
>>>> >
>>>> > row 0: (0, 0.000189908) (1, 7.17315e-05) (2, 9.31778e-05) (3, 0.000514947) (4, 0.000178659) (5, 0.000178217) (9, -2.25457e-05) (10, -6.34278e-06) (11, -5.93241e-07) (12, 9.48544e-06) (13, 4.79709e-06) (14, 2.40016e-06) (27, -0.000335696) (28, -0.000106734) (29, -0.000106653) (30, 2.73119e-06) (31, -7.93382e-07) (32, 1.24048e-07) (36, -4.0302e-06) (37, 3.67494e-06) (38, -2.70115e-06) (39, 0.) (40, 0.) (41, 0.)
>>>> >
>>>> > The difference between the column numbering between the finite
>>>> > difference and the hand-coded Jacobians looks like a serious problem to me,
>>>> > but I'm probably missing something.
>>>> >
>>>> > I am trying to use a 3D DMDA with 3 dof, a box stencil of width 1,
>>>> > and for this test problem the grid dimensions are 11x7x6. For a grid point
>>>> > x,y,z, and dof c, is the index calculated as c + 3*x + 3*11*y + 3*11*7*z?
>>>> > If so, then the column numbers of the hand-coded Jacobian match those of
>>>> > the 27-point stencil I have in mind. However, I am then at a loss to
>>>> > explain the column numbers in the finite difference Jacobian.
>>>> >
>>>> > On Sat, Oct 7, 2017 at 1:49 PM, zakaryah . wrote:
>>>> > OK - I ran with -snes_monitor -snes_converged_reason
>>>> > -snes_linesearch_monitor -ksp_monitor -ksp_monitor_true_residual
>>>> > -ksp_converged_reason -pc_type svd -pc_svd_monitor -snes_type newtonls
>>>> > -snes_compare_explicit
>>>> >
>>>> > and here is the full error message, output immediately after
>>>> >
>>>> > Finite difference Jacobian
>>>> > Mat Object: 24 MPI processes
>>>> >   type: mpiaij
>>>> >
>>>> > [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>>> > [0]PETSC ERROR: Invalid argument
>>>> > [0]PETSC ERROR: Matrix not generated from a DMDA
>>>> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>>> > [0]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017
>>>> > [0]PETSC ERROR: ./CalculateOpticalFlow on a arch-linux2-c-opt named node046.hpc.rockefeller.internal by zfrentz Sat Oct 7 13:44:44 2017
>>>> > [0]PETSC ERROR: Configure options --prefix=/ru-auth/local/home/zfrentz/PETSc-3.7.6 --download-fblaslapack -with-debugging=0
>>>> > [0]PETSC ERROR: #1 MatView_MPI_DA() line 551 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/dm/impls/da/fdda.c
>>>> > [0]PETSC ERROR: #2 MatView() line 901 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/mat/interface/matrix.c
>>>> > [0]PETSC ERROR: #3 SNESComputeJacobian() line 2371 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/interface/snes.c
>>>> > [0]PETSC ERROR: #4 SNESSolve_NEWTONLS() line 228 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/impls/ls/ls.c
>>>> > [0]PETSC ERROR: #5 SNESSolve() line 4005 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/interface/snes.c
>>>> > [0]PETSC ERROR: #6 solveWarp3D() line 659 in /ru-auth/local/home/zfrentz/Code/OpticalFlow/working/October6_2017/mshs.c
>>>> >
>>>> > On Tue, Oct 3, 2017 at 5:37 PM, Jed Brown wrote:
>>>> > Always always always send the whole error message.
>>>> >
>>>> > "zakaryah ." writes:
>>>> >
>>>> > > I tried -snes_compare_explicit, and got the following error:
>>>> > >
>>>> > > [0]PETSC ERROR: Invalid argument
>>>> > >
>>>> > > [0]PETSC ERROR: Matrix not generated from a DMDA
>>>> > >
>>>> > > What am I doing wrong?
>>>> > >
>>>> > > On Tue, Oct 3, 2017 at 10:08 AM, Jed Brown wrote:
>>>> > >
>>>> > >> Barry Smith writes:
>>>> > >>
>>>> > >> >> On Oct 3, 2017, at 5:54 AM, zakaryah . wrote:
>>>> > >> >>
>>>> > >> >> I'm still working on this. I've made some progress, and it looks like
>>>> > >> the issue is with the KSP, at least for now. The Jacobian may be
>>>> > >> ill-conditioned. Is it possible to use -snes_test_display during an
>>>> > >> intermediate step of the analysis? I would like to inspect the Jacobian
>>>> > >> after several solves have already completed,
>>>> > >> >
>>>> > >> >    No, our current code for testing Jacobians is of poor quality and
>>>> > >> poorly organized. Needs a major refactoring to do things properly. Sorry
>>>> > >>
>>>> > >> You can use -snes_compare_explicit or -snes_compare_coloring to output
>>>> > >> differences on each Newton step.
>>>> > >>
>>>> >
>>>>
>>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
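The alternating-solver loop sketched at the top of this thread can be written against the SNES API roughly as follows -- a sketch only; FormFunction, FormJacobian, J, u, and user are placeholders, not names from the poster's code:

    SNES                newton, fallback;
    SNESConvergedReason reason = SNES_CONVERGED_ITERATING;
    PetscInt            cycle;

    ierr = SNESCreate(PETSC_COMM_WORLD,&newton);CHKERRQ(ierr);
    ierr = SNESSetType(newton,SNESNEWTONLS);CHKERRQ(ierr);
    ierr = SNESSetFunction(newton,NULL,FormFunction,&user);CHKERRQ(ierr);
    ierr = SNESSetJacobian(newton,J,J,FormJacobian,&user);CHKERRQ(ierr);

    ierr = SNESCreate(PETSC_COMM_WORLD,&fallback);CHKERRQ(ierr);
    ierr = SNESSetType(fallback,SNESNRICHARDSON);CHKERRQ(ierr);      /* or SNESNGMRES */
    ierr = SNESSetFunction(fallback,NULL,FormFunction,&user);CHKERRQ(ierr);

    for (cycle = 0; cycle < 10; cycle++) {
      ierr = SNESSolve(newton,NULL,u);CHKERRQ(ierr);
      ierr = SNESGetConvergedReason(newton,&reason);CHKERRQ(ierr);
      if (reason > 0) break;                       /* Newton converged; we are done */
      /* otherwise take some inexpensive nonlinear steps to move off the stagnation point */
      ierr = SNESSolve(fallback,NULL,u);CHKERRQ(ierr);
    }

PETSc can also compose the two solvers itself; something along the lines of -snes_type composite -snes_composite_sneses newtonls,ngmres should behave similarly, though the exact option names may vary between releases.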
From w.t.jones at nasa.gov  Wed Oct 11 21:13:20 2017
From: w.t.jones at nasa.gov (William T. Jones)
Date: Wed, 11 Oct 2017 22:13:20 -0400
Subject: Re: [petsc-users] Trouble installing petsc4py in Anaconda environment
In-Reply-To: 
References: <839806b2-d462-a261-be74-80948ddb658b@nasa.gov>
Message-ID: 

On 10/11/17 7:06 PM, Matthew Knepley wrote:
> On Wed, Oct 11, 2017 at 5:29 PM, William T Jones wrote:
>
>     I have created an Anaconda Python 2.7 environment on an SGI-ICE
>     machine and included cython, numpy=1.12, scipy, and mpi4py (based
>     on SGI-MPT). While petsc installs fine with:
>
>     % PETSC_CONFIGURE_OPTIONS="--download-fblaslapack=1" pip install
>     https://bitbucket.org/petsc/petsc/get/maint.tar.gz
>
>     I cannot get petsc4py to build/install.  I am attempting with:
>
>     % export PETSC_DIR=${PREFIX}/envs/myenv/lib/python2.7/site-packages/petsc
>     % pip install --no-dependencies petsc4py
>
>     Note, I am using "--no-dependencies" because I want to leave numpy
>     at 1.12 and do not want it to be upgraded.  Either way I get the
>     output below.  It appears that the link command has been corrupted
>     with the addition of the "gcc" command in the middle of the link
>     command.
>
> /vendor/sgi/mpt/2.14r19/bin/mpicc is being called. Does this actually
> work? I would suspect it of calling 'gcc'
>

Matt, Thanks for the response.

Yes, `/vendor/sgi/mpt/2.14r19/bin/mpicc` is a compiler wrapper that
ultimately calls gcc. The problem is that in the middle of the link line
there is an extraneous "...-fstack-protector -g -O gcc -pthread -shared
-B...". From the error message, it is this embedded "gcc" that is the
problem.

I cannot see where petsc4py gets its compiler and linker commands. It
would appear that they come from petsc. Again, it appears that the link
line is being constructed inappropriately.

> ? ?Matt > > Any help is appreciated, > > > % pip install --no-dependencies petsc4py > Collecting petsc4py > ? Using cached petsc4py-3.8.0.tar.gz > Building wheels for collected packages: petsc4py > ? Running setup.py bdist_wheel for petsc4py ... error > ? Complete output from command > /home/login/anaconda2/envs/myenv/bin/python -u -c "import > setuptools, > tokenize;__file__='/tmp/pip-build-v6lDIk/petsc4py/setup.py';f=getattr(tokenize, > 'open', open)(__file__);code=f.read().replace('\r\n', > '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel > -d /tmp/tmpD2KqLjpip-wheel- --python-tag cp27: > ? running bdist_wheel > ? running build > ? running build_src > ? running build_py > ? creating build > ? creating build/lib.linux-x86_64-2.7 > ? creating build/lib.linux-x86_64-2.7/petsc4py > ? copying src/PETSc.py -> build/lib.linux-x86_64-2.7/petsc4py > ? copying src/__init__.py -> build/lib.linux-x86_64-2.7/petsc4py > ?
copying src/__main__.py -> build/lib.linux-x86_64-2.7/petsc4py > ? creating build/lib.linux-x86_64-2.7/petsc4py/lib > ? copying src/lib/__init__.py -> > build/lib.linux-x86_64-2.7/petsc4py/lib > ? creating build/lib.linux-x86_64-2.7/petsc4py/include > ? creating build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > ? copying src/include/petsc4py/petsc4py.PETSc.h -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > ? copying src/include/petsc4py/numpy.h -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > ? copying src/include/petsc4py/petsc4py.PETSc_api.h -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > ? copying src/include/petsc4py/petsc4py.h -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > ? copying src/include/petsc4py/petsc4py.i -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > ? copying src/include/petsc4py/__init__.pxd -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > ? copying src/include/petsc4py/PETSc.pxd -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > ? copying src/include/petsc4py/__init__.pyx -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > ? copying src/PETSc.pxd -> build/lib.linux-x86_64-2.7/petsc4py > ? copying src/lib/petsc.cfg -> build/lib.linux-x86_64-2.7/petsc4py/lib > ? running build_ext > ? PETSC_DIR: > /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc > ? PETSC_ARCH: > ? version:? ? ? 3.8.0 release > ? integer-size: 32-bit > ? scalar-type:? real > ? precision:? ? double > ? language:? ? ?CONLY > ? compiler:? ? ?/vendor/sgi/mpt/2.14r19/bin/mpicc > ? linker:? ? ? ?/vendor/sgi/mpt/2.14r19/bin/mpicc > ? building 'PETSc' extension > ? creating build/temp.linux-x86_64-2.7 > ? creating build/temp.linux-x86_64-2.7/src > ? /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B > /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall > -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas > -fstack-protector -g -O -fPIC -fno-strict-aliasing -g -O2 -DNDEBUG > -g -fwrapv -O3 -Wall > -DPETSC_DIR=/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc > -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/include > -Isrc/include > -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include > -I/home/login/anaconda2/envs/myenv/include/python2.7 -c src/PETSc.c > -o build/temp.linux-x86_64-2.7/src/PETSc.o > ? In file included from > /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1788:0, > > ? ? ? ? ? ? ? ? ? ?from > /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:18, > > ? ? ? ? ? ? ? ? ? ?from > /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/arrayobject.h:4, > > ? ? ? ? ? ? ? ? ? ?from src/include/petsc4py/numpy.h:11, > ? ? ? ? ? ? ? ? ? ?from src/petsc4py.PETSc.c:519, > ? ? ? ? ? ? ? ? ? ?from src/PETSc.c:3: > > /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/npy_1_7_deprecated_api.h:15:2: > warning: #warning "Using deprecated NumPy API, disable it by " > "#defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION" [-Wcpp] > ? ?#warning "Using deprecated NumPy API, disable it by " \ > ? ? ^ > ? 
/vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B > /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall > -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas > -fstack-protector -g -O -fPIC -fno-strict-aliasing -g -O2 -DNDEBUG > -g -fwrapv -O3 -Wall > -DPETSC_DIR=/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc > -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/include > -Isrc/include > -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include > -I/home/login/anaconda2/envs/myenv/include/python2.7 -c > src/libpetsc4py.c -o build/temp.linux-x86_64-2.7/src/libpetsc4py.o > ? /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B > /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall > -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas > -fstack-protector -g -O gcc -pthread -shared -B > /home/login/anaconda2/envs/myenv/compiler_compat > -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall > -L/home/login/anaconda2/envs/myenv/lib > -Wl,-rpath=/home/login/anaconda2/envs/myenv/lib -Wl,--no-as-needed > -Wl,--sysroot=/ build/temp.linux-x86_64-2.7/src/PETSc.o > build/temp.linux-x86_64-2.7/src/libpetsc4py.o > -L/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/lib > -L/home/login/anaconda2/envs/myenv/lib > -Wl,-rpath,/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/lib > -Wl,-rpath,/home/login/anaconda2/envs/myenv/lib -lpetsc -lpython2.7 > -o build/lib.linux-x86_64-2.7/petsc4py/lib/PETSc.so > ? gcc: error: gcc: No such file or directory > ? error: command '/vendor/sgi/mpt/2.14r19/bin/mpicc' failed with > exit status 1 > > > ? ---------------------------------------- > ? Failed building wheel for petsc4py > ? Running setup.py clean for petsc4py > Failed to build petsc4py > Installing collected packages: petsc4py > ? Running setup.py install for petsc4py ... error > ? ? Complete output from command > /home/login/anaconda2/envs/myenv/bin/python -u -c "import > setuptools, > tokenize;__file__='/tmp/pip-build-v6lDIk/petsc4py/setup.py';f=getattr(tokenize, > 'open', open)(__file__);code=f.read().replace('\r\n', > '\n');f.close();exec(compile(code, __file__, 'exec'))" install > --record /tmp/pip-d6NYW8-record/install-record.txt > --single-version-externally-managed --compile: > ? ? running install > ? ? running build > ? ? running build_src > ? ? running build_py > ? ? creating build > ? ? creating build/lib.linux-x86_64-2.7 > ? ? creating build/lib.linux-x86_64-2.7/petsc4py > ? ? copying src/PETSc.py -> build/lib.linux-x86_64-2.7/petsc4py > ? ? copying src/__init__.py -> build/lib.linux-x86_64-2.7/petsc4py > ? ? copying src/__main__.py -> build/lib.linux-x86_64-2.7/petsc4py > ? ? creating build/lib.linux-x86_64-2.7/petsc4py/lib > ? ? copying src/lib/__init__.py -> > build/lib.linux-x86_64-2.7/petsc4py/lib > ? ? creating build/lib.linux-x86_64-2.7/petsc4py/include > ? ? creating build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > ? ? copying src/include/petsc4py/petsc4py.PETSc.h -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > ? ? copying src/include/petsc4py/numpy.h -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > ? ? copying src/include/petsc4py/petsc4py.PETSc_api.h -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > ? ? copying src/include/petsc4py/petsc4py.h -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > ? ? copying src/include/petsc4py/petsc4py.i -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > ? ? 
copying src/include/petsc4py/__init__.pxd -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > ? ? copying src/include/petsc4py/PETSc.pxd -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > ? ? copying src/include/petsc4py/__init__.pyx -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > ? ? copying src/PETSc.pxd -> build/lib.linux-x86_64-2.7/petsc4py > ? ? copying src/lib/petsc.cfg -> > build/lib.linux-x86_64-2.7/petsc4py/lib > ? ? running build_ext > ? ? PETSC_DIR: > /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc > ? ? PETSC_ARCH: > ? ? version:? ? ? 3.8.0 release > ? ? integer-size: 32-bit > ? ? scalar-type:? real > ? ? precision:? ? double > ? ? language:? ? ?CONLY > ? ? compiler:? ? ?/vendor/sgi/mpt/2.14r19/bin/mpicc > ? ? linker:? ? ? ?/vendor/sgi/mpt/2.14r19/bin/mpicc > ? ? building 'PETSc' extension > ? ? creating build/temp.linux-x86_64-2.7 > ? ? creating build/temp.linux-x86_64-2.7/src > ? ? /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B > /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall > -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas > -fstack-protector -g -O -fPIC -fno-strict-aliasing -g -O2 -DNDEBUG > -g -fwrapv -O3 -Wall > -DPETSC_DIR=/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc > -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/include > -Isrc/include > -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include > -I/home/login/anaconda2/envs/myenv/include/python2.7 -c src/PETSc.c > -o build/temp.linux-x86_64-2.7/src/PETSc.o > ? ? In file included from > /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1788:0, > > ? ? ? ? ? ? ? ? ? ? ?from > /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:18, > > ? ? ? ? ? ? ? ? ? ? ?from > /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/arrayobject.h:4, > > ? ? ? ? ? ? ? ? ? ? ?from src/include/petsc4py/numpy.h:11, > ? ? ? ? ? ? ? ? ? ? ?from src/petsc4py.PETSc.c:519, > ? ? ? ? ? ? ? ? ? ? ?from src/PETSc.c:3: > > /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/npy_1_7_deprecated_api.h:15:2: > warning: #warning "Using deprecated NumPy API, disable it by " > "#defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION" [-Wcpp] > ? ? ?#warning "Using deprecated NumPy API, disable it by " \ > ? ? ? ^ > ? ? /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B > /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall > -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas > -fstack-protector -g -O -fPIC -fno-strict-aliasing -g -O2 -DNDEBUG > -g -fwrapv -O3 -Wall > -DPETSC_DIR=/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc > -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/include > -Isrc/include > -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include > -I/home/login/anaconda2/envs/myenv/include/python2.7 -c > src/libpetsc4py.c -o build/temp.linux-x86_64-2.7/src/libpetsc4py.o > ? ? 
/vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B > /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall > -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas > -fstack-protector -g -O gcc -pthread -shared -B > /home/login/anaconda2/envs/myenv/compiler_compat > -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall > -L/home/login/anaconda2/envs/myenv/lib > -Wl,-rpath=/home/login/anaconda2/envs/myenv/lib -Wl,--no-as-needed > -Wl,--sysroot=/ build/temp.linux-x86_64-2.7/src/PETSc.o > build/temp.linux-x86_64-2.7/src/libpetsc4py.o > -L/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/lib > -L/home/login/anaconda2/envs/myenv/lib > -Wl,-rpath,/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/lib > -Wl,-rpath,/home/login/anaconda2/envs/myenv/lib -lpetsc -lpython2.7 > -o build/lib.linux-x86_64-2.7/petsc4py/lib/PETSc.so > ? ? gcc: error: gcc: No such file or directory > ? ? error: command '/vendor/sgi/mpt/2.14r19/bin/mpicc' failed with > exit status 1 > > ? ? ---------------------------------------- > Command "/home/login/anaconda2/envs/myenv/bin/python -u -c "import > setuptools, > tokenize;__file__='/tmp/pip-build-v6lDIk/petsc4py/setup.py';f=getattr(tokenize, > 'open', open)(__file__);code=f.read().replace('\r\n', > '\n');f.close();exec(compile(code, __file__, 'exec'))" install > --record /tmp/pip-d6NYW8-record/install-record.txt > --single-version-externally-managed --compile" failed with error > code 1 in /tmp/pip-build-v6lDIk/petsc4py/ > > > -- > =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=- > > ? ? Bill Jones W.T.JONES at NASA.GOV > ? ? Mail Stop 128? ? ? ? ? ? ? ? ? ? ?Computational AeroSciences Branch > ? ? 15 Langley Boulevard? ? ? ? ? ? ? ? ? ? ? ? ? ?Research Directorate > ? ? NASA Langley Research Center? ? ? ? ? ? ? ?Building 1268, Room 1044 > ? ? Hampton, VA? 23681-2199? ? ? ? ? ? ? ? ? ? ? ?Phone +1 757 > 864-5318 > ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? Fax +1 757 > 864-8816 > http://fun3d.larc.nasa.gov > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ -- =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=- Bill Jones W.T.JONES at NASA.GOV Mail Stop 128 Computational AeroSciences Branch 15 Langley Boulevard Research Directorate NASA Langley Research Center Building 1268, Room 1060 Hampton, VA 23681-2199 Phone +1 757 864-5318 Fax +1 757 864-8816 http://fun3d.larc.nasa.gov From knepley at gmail.com Wed Oct 11 21:19:39 2017 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 11 Oct 2017 22:19:39 -0400 Subject: [petsc-users] Trouble installing petsc4py in Anaconda environment In-Reply-To: References: <839806b2-d462-a261-be74-80948ddb658b@nasa.gov> Message-ID: On Wed, Oct 11, 2017 at 10:13 PM, William T. Jones wrote: > On 10/11/17 7:06 PM, Matthew Knepley wrote: > >> On Wed, Oct 11, 2017 at 5:29 PM, William T Jones > > wrote: >> >> I have created an Anaconda Pytyoh 2.7 environment on an SGI-ICE >> machine and included cython, numpy=1.12, scipy, and mpi4py (based >> SGI-MPT). While petsc installs fine with: >> >> % PETSC_CONFIGURE_OPTIONS="--download-fblaslapack=1" pip install >> https://bitbucket.org/petsc/petsc/get/maint.tar.gz >> >> >> I cannot get petsc4py to build/install. 
>> I am attempting with:
>>
>> % export PETSC_DIR=${PREFIX}/envs/myenv/lib/python2.7/site-packages/petsc
>> % pip install --no-dependencies petsc4py
>>
>> Note, I am using "--no-dependencies" because I want to leave numpy
>> at 1.12 and do not want it to be upgraded. Either way I get the
>> output below. It appears that the link command has been corrupted
>> with the addition of the "gcc" command in the middle of the link
>> command.
>>
>> /vendor/sgi/mpt/2.14r19/bin/mpicc is being called. Does this actually
>> work? I would suspect it of calling 'gcc'
>
> Matt, Thanks for the response.
>
> Yes, `/vendor/sgi/mpt/2.14r19/bin/mpicc` is a compiler wrapper that
> ultimately calls gcc. The problem is that in the middle of the link line
> there is an extraneous "...-fstack-protector -g -O gcc -pthread -shared
> -B...". From the error message, it is this embedded "gcc" that is the
> problem.
>
> I cannot see where petsc4py gets its compiler and linker commands. It
> would appear that they come from petsc.

I would not have that interpretation. This is because the build outputs
the PETSc information underneath PETSC_ARCH and it shows the compiler and
linker given to it. Neither has 'gcc' in it. You could grep the files in
$PETSC_DIR/$PETSC_ARCH/lib/petsc/conf for 'gcc', but I suspect it will not
be there.

  Thanks,

     Matt

> Again, it appears that the link line is being constructed
> inappropriately.
>
> Matt
>
>> Any help is appreciated,
>>
>> % pip install --no-dependencies petsc4py
>> Collecting petsc4py
>>   Using cached petsc4py-3.8.0.tar.gz
>> Building wheels for collected packages: petsc4py
>>   Running setup.py bdist_wheel for petsc4py ... error
>> [the quoted build transcript is identical to the log shown above and is
>> trimmed here; both the bdist_wheel and the install step fail with]
>> gcc: error: gcc: No such file or directory
>> error: command '/vendor/sgi/mpt/2.14r19/bin/mpicc' failed with exit status 1
>>
>> Failed building wheel for petsc4py
>> Running setup.py clean for petsc4py
>> Failed to build petsc4py
>> Installing collected packages: petsc4py
>>   Running setup.py install for petsc4py ... error
>> [second, identical transcript trimmed]
>> Command "[...]" failed with error code 1 in /tmp/pip-build-v6lDIk/petsc4py/
>>
>> --
>> =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
>> Bill Jones W.T.JONES at NASA.GOV
>> Mail Stop 128                      Computational AeroSciences Branch
>> 15 Langley Boulevard                           Research Directorate
>> NASA Langley Research Center               Building 1268, Room 1044
>> Hampton, VA 23681-2199                        Phone +1 757 864-5318
>>                                                 Fax +1 757 864-8816
>> http://fun3d.larc.nasa.gov
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results
>> to which their experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>
> --
> =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
> Bill Jones W.T.JONES at NASA.GOV
> Mail Stop 128                       Computational AeroSciences Branch
> 15 Langley Boulevard                            Research Directorate
> NASA Langley Research Center                Building 1268, Room 1060
> Hampton, VA 23681-2199                         Phone +1 757 864-5318
>                                                  Fax +1 757 864-8816
> http://fun3d.larc.nasa.gov

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From dalcinl at gmail.com Thu Oct 12 02:43:39 2017
From: dalcinl at gmail.com (Lisandro Dalcin)
Date: Thu, 12 Oct 2017 10:43:39 +0300
Subject: [petsc-users] Trouble installing petsc4py in Anaconda environment
In-Reply-To: <839806b2-d462-a261-be74-80948ddb658b@nasa.gov>
References: <839806b2-d462-a261-be74-80948ddb658b@nasa.gov>
Message-ID: 

Please install the maint branch (see issue
https://bitbucket.org/petsc/petsc4py/issues/75/)

$ conda install cython  # required to build the C wrapper source code
$ pip install --no-dependencies
https://bitbucket.org/petsc/petsc4py/get/maint.tar.gz

On 12 October 2017 at 00:29, William T Jones wrote:
> I have created an Anaconda Python 2.7 environment on an SGI-ICE machine and
> included cython, numpy=1.12, scipy, and mpi4py (based on SGI-MPT). While petsc
> installs fine with:
>
> % PETSC_CONFIGURE_OPTIONS="--download-fblaslapack=1" pip install
> https://bitbucket.org/petsc/petsc/get/maint.tar.gz
>
> I cannot get petsc4py to build/install. I am attempting with:
>
> % export PETSC_DIR=${PREFIX}/envs/myenv/lib/python2.7/site-packages/petsc
> % pip install --no-dependencies petsc4py
>
> Note, I am using "--no-dependencies" because I want to leave numpy at 1.12
> and do not want it to be upgraded. Either way I get the output below. It
> appears that the link command has been corrupted with the addition of the
> "gcc" command in the middle of the link command.
>
> Any help is appreciated,
>
> % pip install --no-dependencies petsc4py
> Collecting petsc4py
>   Using cached petsc4py-3.8.0.tar.gz
> Building wheels for collected packages: petsc4py
>   Running setup.py bdist_wheel for petsc4py ...
error > Complete output from command /home/login/anaconda2/envs/myenv/bin/python > -u -c "import setuptools, > tokenize;__file__='/tmp/pip-build-v6lDIk/petsc4py/setup.py';f=getattr(tokenize, > 'open', open)(__file__);code=f.read().replace('\r\n', > '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d > /tmp/tmpD2KqLjpip-wheel- --python-tag cp27: > running bdist_wheel > running build > running build_src > running build_py > creating build > creating build/lib.linux-x86_64-2.7 > creating build/lib.linux-x86_64-2.7/petsc4py > copying src/PETSc.py -> build/lib.linux-x86_64-2.7/petsc4py > copying src/__init__.py -> build/lib.linux-x86_64-2.7/petsc4py > copying src/__main__.py -> build/lib.linux-x86_64-2.7/petsc4py > creating build/lib.linux-x86_64-2.7/petsc4py/lib > copying src/lib/__init__.py -> build/lib.linux-x86_64-2.7/petsc4py/lib > creating build/lib.linux-x86_64-2.7/petsc4py/include > creating build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/petsc4py.PETSc.h -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/numpy.h -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/petsc4py.PETSc_api.h -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/petsc4py.h -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/petsc4py.i -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/__init__.pxd -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/PETSc.pxd -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/__init__.pyx -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/PETSc.pxd -> build/lib.linux-x86_64-2.7/petsc4py > copying src/lib/petsc.cfg -> build/lib.linux-x86_64-2.7/petsc4py/lib > running build_ext > PETSC_DIR: > /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc > PETSC_ARCH: > version: 3.8.0 release > integer-size: 32-bit > scalar-type: real > precision: double > language: CONLY > compiler: /vendor/sgi/mpt/2.14r19/bin/mpicc > linker: /vendor/sgi/mpt/2.14r19/bin/mpicc > building 'PETSc' extension > creating build/temp.linux-x86_64-2.7 > creating build/temp.linux-x86_64-2.7/src > /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B > /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall -Wwrite-strings > -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -g -O -fPIC > -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall > -DPETSC_DIR=/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc > -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/include > -Isrc/include > -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include > -I/home/login/anaconda2/envs/myenv/include/python2.7 -c src/PETSc.c -o > build/temp.linux-x86_64-2.7/src/PETSc.o > In file included from > /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1788:0, > from > /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:18, > from > /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/arrayobject.h:4, > from src/include/petsc4py/numpy.h:11, > from src/petsc4py.PETSc.c:519, > from src/PETSc.c:3: > > 
/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/npy_1_7_deprecated_api.h:15:2: > warning: #warning "Using deprecated NumPy API, disable it by " "#defining > NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION" [-Wcpp] > #warning "Using deprecated NumPy API, disable it by " \ > ^ > /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B > /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall -Wwrite-strings > -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -g -O -fPIC > -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall > -DPETSC_DIR=/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc > -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/include > -Isrc/include > -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include > -I/home/login/anaconda2/envs/myenv/include/python2.7 -c src/libpetsc4py.c -o > build/temp.linux-x86_64-2.7/src/libpetsc4py.o > /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B > /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall -Wwrite-strings > -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -g -O gcc > -pthread -shared -B /home/login/anaconda2/envs/myenv/compiler_compat > -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall > -L/home/login/anaconda2/envs/myenv/lib > -Wl,-rpath=/home/login/anaconda2/envs/myenv/lib -Wl,--no-as-needed > -Wl,--sysroot=/ build/temp.linux-x86_64-2.7/src/PETSc.o > build/temp.linux-x86_64-2.7/src/libpetsc4py.o > -L/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/lib > -L/home/login/anaconda2/envs/myenv/lib > -Wl,-rpath,/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/lib > -Wl,-rpath,/home/login/anaconda2/envs/myenv/lib -lpetsc -lpython2.7 -o > build/lib.linux-x86_64-2.7/petsc4py/lib/PETSc.so > gcc: error: gcc: No such file or directory > error: command '/vendor/sgi/mpt/2.14r19/bin/mpicc' failed with exit status > 1 > > > ---------------------------------------- > Failed building wheel for petsc4py > Running setup.py clean for petsc4py > Failed to build petsc4py > Installing collected packages: petsc4py > Running setup.py install for petsc4py ... 
error > Complete output from command /home/login/anaconda2/envs/myenv/bin/python > -u -c "import setuptools, > tokenize;__file__='/tmp/pip-build-v6lDIk/petsc4py/setup.py';f=getattr(tokenize, > 'open', open)(__file__);code=f.read().replace('\r\n', > '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record > /tmp/pip-d6NYW8-record/install-record.txt > --single-version-externally-managed --compile: > running install > running build > running build_src > running build_py > creating build > creating build/lib.linux-x86_64-2.7 > creating build/lib.linux-x86_64-2.7/petsc4py > copying src/PETSc.py -> build/lib.linux-x86_64-2.7/petsc4py > copying src/__init__.py -> build/lib.linux-x86_64-2.7/petsc4py > copying src/__main__.py -> build/lib.linux-x86_64-2.7/petsc4py > creating build/lib.linux-x86_64-2.7/petsc4py/lib > copying src/lib/__init__.py -> build/lib.linux-x86_64-2.7/petsc4py/lib > creating build/lib.linux-x86_64-2.7/petsc4py/include > creating build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/petsc4py.PETSc.h -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/numpy.h -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/petsc4py.PETSc_api.h -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/petsc4py.h -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/petsc4py.i -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/__init__.pxd -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/PETSc.pxd -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/include/petsc4py/__init__.pyx -> > build/lib.linux-x86_64-2.7/petsc4py/include/petsc4py > copying src/PETSc.pxd -> build/lib.linux-x86_64-2.7/petsc4py > copying src/lib/petsc.cfg -> build/lib.linux-x86_64-2.7/petsc4py/lib > running build_ext > PETSC_DIR: > /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc > PETSC_ARCH: > version: 3.8.0 release > integer-size: 32-bit > scalar-type: real > precision: double > language: CONLY > compiler: /vendor/sgi/mpt/2.14r19/bin/mpicc > linker: /vendor/sgi/mpt/2.14r19/bin/mpicc > building 'PETSc' extension > creating build/temp.linux-x86_64-2.7 > creating build/temp.linux-x86_64-2.7/src > /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B > /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall -Wwrite-strings > -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -g -O -fPIC > -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall > -DPETSC_DIR=/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc > -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/include > -Isrc/include > -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include > -I/home/login/anaconda2/envs/myenv/include/python2.7 -c src/PETSc.c -o > build/temp.linux-x86_64-2.7/src/PETSc.o > In file included from > /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1788:0, > from > /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:18, > from > /home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/arrayobject.h:4, > from src/include/petsc4py/numpy.h:11, > from src/petsc4py.PETSc.c:519, > from src/PETSc.c:3: > > 
/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include/numpy/npy_1_7_deprecated_api.h:15:2: > warning: #warning "Using deprecated NumPy API, disable it by " "#defining > NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION" [-Wcpp] > #warning "Using deprecated NumPy API, disable it by " \ > ^ > /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B > /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall -Wwrite-strings > -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -g -O -fPIC > -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall > -DPETSC_DIR=/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc > -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/include > -Isrc/include > -I/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/numpy/core/include > -I/home/login/anaconda2/envs/myenv/include/python2.7 -c src/libpetsc4py.c -o > build/temp.linux-x86_64-2.7/src/libpetsc4py.o > /vendor/sgi/mpt/2.14r19/bin/mpicc -pthread -B > /home/login/anaconda2/envs/myenv/compiler_compat -fPIC -Wall -Wwrite-strings > -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -g -O gcc > -pthread -shared -B /home/login/anaconda2/envs/myenv/compiler_compat > -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall > -L/home/login/anaconda2/envs/myenv/lib > -Wl,-rpath=/home/login/anaconda2/envs/myenv/lib -Wl,--no-as-needed > -Wl,--sysroot=/ build/temp.linux-x86_64-2.7/src/PETSc.o > build/temp.linux-x86_64-2.7/src/libpetsc4py.o > -L/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/lib > -L/home/login/anaconda2/envs/myenv/lib > -Wl,-rpath,/home/login/anaconda2/envs/myenv/lib/python2.7/site-packages/petsc/lib > -Wl,-rpath,/home/login/anaconda2/envs/myenv/lib -lpetsc -lpython2.7 -o > build/lib.linux-x86_64-2.7/petsc4py/lib/PETSc.so > gcc: error: gcc: No such file or directory > error: command '/vendor/sgi/mpt/2.14r19/bin/mpicc' failed with exit > status 1 > > ---------------------------------------- > Command "/home/login/anaconda2/envs/myenv/bin/python -u -c "import > setuptools, > tokenize;__file__='/tmp/pip-build-v6lDIk/petsc4py/setup.py';f=getattr(tokenize, > 'open', open)(__file__);code=f.read().replace('\r\n', > '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record > /tmp/pip-d6NYW8-record/install-record.txt > --single-version-externally-managed --compile" failed with error code 1 in > /tmp/pip-build-v6lDIk/petsc4py/ > > > -- > =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=- > > Bill Jones W.T.JONES at NASA.GOV > Mail Stop 128 Computational AeroSciences Branch > 15 Langley Boulevard Research Directorate > NASA Langley Research Center Building 1268, Room 1044 > Hampton, VA 23681-2199 Phone +1 757 864-5318 > Fax +1 757 864-8816 > http://fun3d.larc.nasa.gov -- Lisandro Dalcin ============ Research Scientist Computer, Electrical and Mathematical Sciences & Engineering (CEMSE) Extreme Computing Research Center (ECRC) King Abdullah University of Science and Technology (KAUST) http://ecrc.kaust.edu.sa/ 4700 King Abdullah University of Science and Technology al-Khawarizmi Bldg (Bldg 1), Office # 0109 Thuwal 23955-6900, Kingdom of Saudi Arabia http://www.kaust.edu.sa Office Phone: +966 12 808-0459 From lawrence.mitchell at imperial.ac.uk Thu Oct 12 04:49:50 2017 From: lawrence.mitchell at imperial.ac.uk (Lawrence Mitchell) Date: Thu, 12 Oct 2017 10:49:50 +0100 Subject: [petsc-users] PETSc user meeting 2018, June 4-6, London Message-ID: 
<51a0570b-ba74-07a2-1db0-34cc1daa2475@imperial.ac.uk>

Dear PETSc-ites,

we would like to invite you to join us at the 2018 PETSc user meeting, to
be held at Imperial College London on June 4-6, 2018.

The meeting webpage:

https://www.mcs.anl.gov/petsc/meetings/2018/

The first day consists of tutorials on various aspects and features of
PETSc. The second and third days will be devoted to exchange, discussions,
and a refinement of strategies for the future with our users. We encourage
you to present work illustrating your own use of PETSc, for example in
applications or in libraries built on top of PETSc.

Details on registration, travel grants, and abstract submission will
follow in early 2018.

We look forward to seeing you in London's famous London.

Please contact petsc2018 at mcs.anl.gov if you have any questions,
suggestions, or comments.

Lawrence Mitchell
(for the organising committee)

From w.t.jones at nasa.gov Thu Oct 12 07:35:42 2017
From: w.t.jones at nasa.gov (William T. Jones)
Date: Thu, 12 Oct 2017 08:35:42 -0400
Subject: [petsc-users] Trouble installing petsc4py in Anaconda environment
In-Reply-To: 
References: <839806b2-d462-a261-be74-80948ddb658b@nasa.gov>
Message-ID: <3fe3591d-54cc-67ea-cfbd-b4334e385c68@nasa.gov>

On 10/12/17 3:43 AM, Lisandro Dalcin wrote:
> Please install the maint branch (see issue
> https://bitbucket.org/petsc/petsc4py/issues/75/)
>
> $ conda install cython  # required to build the C wrapper source code
> $ pip install --no-dependencies
> https://bitbucket.org/petsc/petsc4py/get/maint.tar.gz

Thank You. Thank You! Thank You!!

I spent a day trying to fix this before emailing this list. Works like a
charm.

> On 12 October 2017 at 00:29, William T Jones wrote:
>> I have created an Anaconda Python 2.7 environment on an SGI-ICE machine
>> and included cython, numpy=1.12, scipy, and mpi4py (based on SGI-MPT).
>> While petsc installs fine with:
>>
>> % PETSC_CONFIGURE_OPTIONS="--download-fblaslapack=1" pip install
>> https://bitbucket.org/petsc/petsc/get/maint.tar.gz
>>
>> I cannot get petsc4py to build/install. I am attempting with:
>>
>> % export PETSC_DIR=${PREFIX}/envs/myenv/lib/python2.7/site-packages/petsc
>> % pip install --no-dependencies petsc4py
>>
>> Note, I am using "--no-dependencies" because I want to leave numpy at 1.12
>> and do not want it to be upgraded. Either way I get the output below. It
>> appears that the link command has been corrupted with the addition of the
>> "gcc" command in the middle of the link command.
>>
>> Any help is appreciated,
>>
>> % pip install --no-dependencies petsc4py
>> Collecting petsc4py
>>   Using cached petsc4py-3.8.0.tar.gz
>> Building wheels for collected packages: petsc4py
>>   Running setup.py bdist_wheel for petsc4py ... error
>> [the full build transcript quoted here is identical to the log shown
>> earlier in the thread and is trimmed; both the bdist_wheel and the
>> install step fail the same way]
>> gcc: error: gcc: No such file or directory
>> error: command '/vendor/sgi/mpt/2.14r19/bin/mpicc' failed with exit status 1
>>
>> ----------------------------------------
>> Command "[...]" failed with error code 1 in /tmp/pip-build-v6lDIk/petsc4py/
>>
>> --
>> =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
>> Bill Jones W.T.JONES at NASA.GOV
>> Mail Stop 128                      Computational AeroSciences Branch
>> 15 Langley Boulevard                           Research Directorate
>> NASA Langley Research Center               Building 1268, Room 1044
>> Hampton, VA 23681-2199                        Phone +1 757 864-5318
>>                                                 Fax +1 757 864-8816
>> http://fun3d.larc.nasa.gov

--
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
Bill Jones W.T.JONES at NASA.GOV
Mail Stop 128                           Computational AeroSciences Branch
15 Langley Boulevard                                Research Directorate
NASA Langley Research Center                    Building 1268, Room 1060
Hampton, VA 23681-2199                             Phone +1 757 864-5318
                                                     Fax +1 757 864-8816
http://fun3d.larc.nasa.gov

From guillaume.emond at polymtl.ca Thu Oct 12 08:40:13 2017
From: guillaume.emond at polymtl.ca (Guillaume Emond)
Date: Thu, 12 Oct 2017 13:40:13 +0000
Subject: [petsc-users] MatSetValues with openMP and no data races
Message-ID: <20171012134013.Horde.8CkrpuHOZoa7X2lFvKlFtc5@www.imp.polymtl.ca>

Good morning,
I would like to clarify a point about the insertion of values with
MatSetValues in an OpenMP loop. I know these routines are not thread safe.
But, in our situation, we used a graph coloring algorithm on our mesh to
make sure no adjacent elements are inserted at the same time, so no data
races occur when inserting values. Could these routines be used with
OpenMP then, or are there some internal variables that would not be thread
safe?

Guillaume


From mfadams at lbl.gov Thu Oct 12 08:44:26 2017
From: mfadams at lbl.gov (Mark Adams)
Date: Thu, 12 Oct 2017 09:44:26 -0400
Subject: [petsc-users] MatSetValues with openMP and no data races
In-Reply-To: <20171012134013.Horde.8CkrpuHOZoa7X2lFvKlFtc5@www.imp.polymtl.ca>
References: <20171012134013.Horde.8CkrpuHOZoa7X2lFvKlFtc5@www.imp.polymtl.ca>
Message-ID: 

Coloring works for OMP assembly.

On Thu, Oct 12, 2017 at 9:40 AM, Guillaume Emond wrote:

> Good morning,
>
> I would like to clarify a point about the insertion of values with
> MatSetValues in an OpenMP loop. I know these routines are not thread safe.
> But, in our situation, we used a graph coloring algorithm on our mesh to
> make sure no adjacent elements are inserted at the same time, so no data
> races occur when inserting values. Could these routines be used with
> OpenMP then, or are there some internal variables that would not be
> thread safe?
>
> Guillaume
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From lawrence.mitchell at imperial.ac.uk Thu Oct 12 08:45:20 2017
From: lawrence.mitchell at imperial.ac.uk (Lawrence Mitchell)
Date: Thu, 12 Oct 2017 14:45:20 +0100
Subject: [petsc-users] MatSetValues with openMP and no data races
In-Reply-To: <20171012134013.Horde.8CkrpuHOZoa7X2lFvKlFtc5@www.imp.polymtl.ca>
References: <20171012134013.Horde.8CkrpuHOZoa7X2lFvKlFtc5@www.imp.polymtl.ca>
Message-ID: <5e4b0be0-c0d8-3f10-f6fc-67d5401b421d@imperial.ac.uk>

On 12/10/17 14:40, Guillaume Emond wrote:
> Good morning,
>
> I would like to clarify a point about the insertion of values with
> MatSetValues in an OpenMP loop. [...]
> Could these routines be used with OpenMP then, or are there some
> internal variables that would not be thread safe?

The MatStash used for saving and then later communicating off-process
entries during AssemblyEnd is, I believe, not thread safe.

Cheers,

Lawrence
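A minimal sketch of the colored assembly pattern discussed in this thread
is given below. It assumes the elements have already been partitioned into
colors so that no two elements of the same color touch the same matrix
rows, and that every insertion targets locally owned rows (per Lawrence's
caveat, off-process entries would pass through the non-thread-safe
MatStash). Here ncolors, nelem, elems, NODES_PER_ELEM, and
ComputeElementMatrix() are hypothetical user-side names, not PETSc API;
only MatSetValues() and the assembly calls are PETSc, and error checking
is omitted for brevity.

#include <petscmat.h>

/* Assemble one color at a time.  Within a color no two elements share a
   row, so concurrent MatSetValues() calls cannot race on the same entry. */
for (PetscInt color = 0; color < ncolors; ++color) {
  #pragma omp parallel for
  for (PetscInt e = 0; e < nelem[color]; ++e) {
    PetscInt    idx[NODES_PER_ELEM];
    PetscScalar Ke[NODES_PER_ELEM * NODES_PER_ELEM];
    ComputeElementMatrix(elems[color][e], idx, Ke); /* hypothetical user routine */
    MatSetValues(A, NODES_PER_ELEM, idx, NODES_PER_ELEM, idx, Ke, ADD_VALUES);
  }
  /* implicit barrier here: the next color starts only after all inserts
     for this color have completed */
}
MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); /* serial, after all threads are done */
MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);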
From aliberkkahraman at yahoo.com Thu Oct 12 11:45:01 2017
From: aliberkkahraman at yahoo.com (Ali Berk Kahraman)
Date: Thu, 12 Oct 2017 19:45:01 +0300
Subject: [petsc-users] TSTHETA not working without explicit declaration with TSSetRHSJacobian
Message-ID: <47137a4c-c897-39cb-e0d0-d19179635c19@yahoo.com>

Hello All,

I am trying to use the TS solver without declaring an RHS Jacobian,
because I do not have it analytically. I'm getting the error posted at
the end of the e-mail. As I understand it, the SNES solver within the TS
solver sees that I have not declared a Jacobian, so it calls the function
SNESComputeJacobianDefaultColor to get a finite difference approximation
of the Jacobian, as I am too lazy to make this approximation myself, and
SNESComputeJacobianDefaultColor calls MatFDColoringCreate(), and this
function finally gives the error "Matrix is in wrong state, Matrix must
be assembled by calls to MatAssemblyBegin/End();".

I am not sure if this is a bug, or it is something I'm doing wrong. It
looks like a bug to me, since the error is generated when the code
understands that I haven't provided a Jacobian and is consequently trying
to compute it for me. However, I cannot be sure because I'm still pretty
inexperienced using PETSc, so I'm writing this here and not to
petsc-maint. Any ideas?

Best Regards,

Ali

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Matrix must be assembled by calls to MatAssemblyBegin/End();
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.8.0, unknown
[0]PETSC ERROR: ./FastWavelet1DTransientHeat on a arch-linux2-c-debug named abk-CFDLab by abk Thu Oct 12 19:39:21 2017
[0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mpich --download-fblaslapack
[0]PETSC ERROR: #1 MatFDColoringCreate() line 464 in /home/abk/petsc/src/mat/matfd/fdmatrix.c
[0]PETSC ERROR: #2 SNESComputeJacobianDefaultColor() line 83 in /home/abk/petsc/src/snes/interface/snesj2.c
[0]PETSC ERROR: #3 SNESComputeJacobian() line 2358 in /home/abk/petsc/src/snes/interface/snes.c
[0]PETSC ERROR: #4 SNESSolve_KSPONLY() line 36 in /home/abk/petsc/src/snes/impls/ksponly/ksponly.c
[0]PETSC ERROR: #5 SNESSolve() line 4106 in /home/abk/petsc/src/snes/interface/snes.c
[0]PETSC ERROR: #6 TS_SNESSolve() line 176 in /home/abk/petsc/src/ts/impls/implicit/theta/theta.c
[0]PETSC ERROR: #7 TSStep_Theta() line 216 in /home/abk/petsc/src/ts/impls/implicit/theta/theta.c
[0]PETSC ERROR: #8 TSStep() line 4120 in /home/abk/petsc/src/ts/interface/ts.c
[0]PETSC ERROR: #9 TSSolve() line 4374 in /home/abk/petsc/src/ts/interface/ts.c
[0]PETSC ERROR: #10 main() line 886 in /home/abk/Dropbox/MyWorkspace/WaveletCollocation/FastWaveletCollocation1D/FastWavelet1DTransientHeat.c
[0]PETSC ERROR: No PETSc Option Table entries
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
application called MPI_Abort(MPI_COMM_WORLD, 73) - process 0
[unset]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, 73) - process 0

------------------
(program exited with code: 73)
Press return to continue


From aliberkkahraman at yahoo.com Thu Oct 12 12:08:52 2017
From: aliberkkahraman at yahoo.com (Ali Berk Kahraman)
Date: Thu, 12 Oct 2017 20:08:52 +0300
Subject: [petsc-users] TSTHETA not working without explicit declaration with TSSetRHSJacobian
In-Reply-To: <47137a4c-c897-39cb-e0d0-d19179635c19@yahoo.com>
References: <47137a4c-c897-39cb-e0d0-d19179635c19@yahoo.com>
Message-ID: 

My apologies for missing this earlier, but I have found the lines in the
documentation that provide the solution to my problem:

"To use a fully implicit method like TSTHETA or TSGL, either provide the
Jacobian of F() (and G() if G() is provided) or use a DM that provides a
coloring so the Jacobian can be computed efficiently via finite
differences."

However, I'm still confused, since the Jacobian computation is attempted
anyway instead of failing up front with an error such as "no Jacobian was
provided, aborting".

On 12-10-2017 19:45, Ali Berk Kahraman wrote:
> Hello All,
>
> I am trying to use the TS solver without declaring an RHS Jacobian,
> because I do not have it analytically. I'm getting the error posted at
> the end of the e-mail.
> As I understand it, the SNES solver within the TS solver sees that I
> have not declared a Jacobian, so it calls the function
> SNESComputeJacobianDefaultColor to get a finite difference
> approximation of the Jacobian, as I am too lazy to make this
> approximation myself, and SNESComputeJacobianDefaultColor calls
> MatFDColoringCreate(), and this function finally gives the error
> "Matrix is in wrong state, Matrix must be assembled by calls to
> MatAssemblyBegin/End();".
>
> I am not sure if this is a bug, or it is something I'm doing wrong. It
> looks like a bug to me, since the error is generated when the code
> understands that I haven't provided a Jacobian and is consequently
> trying to compute it for me. However, I cannot be sure because I'm
> still pretty inexperienced using PETSc, so I'm writing this here and
> not to petsc-maint. Any ideas?
>
> Best Regards,
>
> Ali
>
> [the quoted error log is identical to the one above and is trimmed]
>
> ------------------
> (program exited with code: 73)
> Press return to continue
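A minimal sketch of the "use a DM that provides a coloring" route quoted
from the documentation is given below. It assumes a 1D structured grid;
ts and nx are placeholders for the user's existing setup, and error
checking is omitted:

#include <petscts.h>

/* Sketch: attach a DMDA to the TS so the implicit solvers can build the
   Jacobian by colored finite differences, even though no
   TSSetRHSJacobian() call is made. */
DM da;
DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE,
             nx,   /* global grid points (placeholder) */
             1,    /* degrees of freedom per point */
             1,    /* stencil width */
             NULL, &da);
DMSetFromOptions(da);
DMSetUp(da);
TSSetDM(ts, da);

For small problems, the runtime option -snes_fd is another way to let the
inner SNES assemble a finite-difference Jacobian without a coloring, at
the cost of a dense, slow approximation.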
From zakaryah at gmail.com Thu Oct 12 13:02:59 2017
From: zakaryah at gmail.com (zakaryah .)
Date: Thu, 12 Oct 2017 14:02:59 -0400
Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods
In-Reply-To: 
References: <98D45588-381F-44FB-9D20-65D1638E357F@mcs.anl.gov>
 <876F0ACF-62C0-4DC9-B203-2893B25E3938@mcs.anl.gov>
 <1D94CC99-2395-4BBE-A0F3-80D23D85FA45@mcs.anl.gov>
 <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov>
 <87o9qp5kou.fsf@jedbrown.org>
 <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov>
 <87bmlo9vds.fsf@jedbrown.org>
 <87zi979alu.fsf@jedbrown.org>
 <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov>
Message-ID: 

Thanks for the response, Matt - these are excellent questions.

On theoretical grounds, I am certain that the solution to the continuous
PDE exists. Without any serious treatment, I think this means the
discretized system should have a solution up to discretization error, but
perhaps this is indeed a bad approach.

I am not sure whether the equations are "really hard to solve". At each
point, the equations are third order polynomials of the state variable at
that point and at nearby points (i.e. in the stencil). One possible
complication is that the external forces which are applied to the interior
of the material can be fairly complex - they are smooth, but they can have
many inflection points.

I don't have a great test case for which I know a good solution. To my
thinking, there is no way that time-stepping the parabolic version of the
same PDE can fail to yield a solution at infinite time. So, I'm going to
try starting there. Converting the problem to a minimization is a bit
trickier, because the discretization has to be performed one step earlier
in the calculation, and therefore the gradient and Hessian would need to
be recalculated.

Even if there are some problems with time-stepping (speed of
convergence?), maybe I can use the solutions as better test cases for the
elliptic PDE solved via SNES.

Can you give me any additional lingo or references for the fracture
problem?

Thanks, Zak

On Wed, Oct 11, 2017 at 8:53 PM, Matthew Knepley wrote:

> On Wed, Oct 11, 2017 at 11:33 AM, zakaryah . wrote:
>
>> Many thanks for the suggestions, Matt.
>>
>> I tried putting the solvers in a loop, like this:
>>
>> do {
>>   NewtonLS
>>   check convergence
>>   if (converged) break
>>   NRichardson or NGMRES
>> } while (!converged)
>>
>> The results were interesting, to me at least. With NRichardson, there
>> was indeed improvement in the residual norm, followed by improvement with
>> NewtonLS, and so on for a few iterations of this loop. In each case, after
>> a few iterations the NewtonLS appeared to be stuck in the same way as after
>> the first iteration. Eventually neither method was able to reduce the
>> residual norm, which was still significant, so this was not a total
>> success. With NGMRES, the initial behavior was similar, but eventually the
>> NGMRES progress became erratic. The minimal residual norm was a bit better
>> using NGMRES than NRichardson, but neither combination of methods fully
>> converged. For both NRichardson and NGMRES, I simply used the defaults, as
>> I have no knowledge of how to tune the options for my problem.
>
> Are you certain that the equations have a solution? I become a little
> concerned when Richardson stops converging. It's still possible you have
> really hard to solve equations, it just becomes less likely. And even if
> they truly are hard to solve, then there should be physical reasons for
> this. For example, it could be that discretizing the minimizing PDE is
> just the wrong thing to do. I believe this is the case in fracture, where
> you attack the minimization problem directly.
>
>    Matt
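For reference, the hand-written alternation loop above is essentially what
PETSc's nonlinear composition and nonlinear preconditioning machinery
automates. A hedged sketch of the relevant runtime options follows (the
option names are as documented for SNESCOMPOSITE and SNESGetNPC; the
particular combinations are illustrative, not tuned recommendations):

# NGMRES with a single NewtonLS sweep per iteration supplied as a
# nonlinear preconditioner
-snes_type ngmres -npc_snes_type newtonls -npc_snes_max_it 1

# or compose the two solvers directly instead of a hand-written loop
-snes_type composite -snes_composite_type multiplicative \
    -snes_composite_sneses newtonls,nrichardson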
I believe this is the case in fracture, where you > attack the minimization problem directly. > > Matt > > >> On Tue, Oct 10, 2017 at 4:08 PM, Matthew Knepley >> wrote: >> >>> On Tue, Oct 10, 2017 at 12:08 PM, zakaryah . wrote: >>> >>>> Thanks for clearing that up. >>>> >>>> I'd appreciate any further help. Here's a summary: >>>> >>>> My ultimate goal is to find a vector field which minimizes an action. >>>> The action is a (nonlinear) function of the field and its first spatial >>>> derivatives. >>>> >>>> My current approach is to derive the (continuous) Euler-Lagrange >>>> equations, which results in a nonlinear PDE that the minimizing field must >>>> satisfy. These Euler-Lagrange equations are then discretized, and I'm >>>> trying to use an SNES to solve them. >>>> >>>> The problem is that the solver seems to reach a point at which the >>>> Jacobian (this corresponds to the second variation of the action, which is >>>> like a Hessian of the energy) becomes nearly singular, but where the >>>> residual (RHS of PDE) is not close to zero. The residual does not decrease >>>> over additional SNES iterations, and the line search results in tiny step >>>> sizes. My interpretation is that this point of stagnation is a critical >>>> point. >>>> >>> >>> The normal thing to do here (I think) is to engage solvers which do not >>> depend on that particular point. So using >>> NRichardson, or maybe NGMRES, to get past that. I would be interested to >>> see if this is successful. >>> >>> Matt >>> >>> >>>> I have checked the hand-coded Jacobian very carefully and I am >>>> confident that it is correct. >>>> >>>> I am guessing that such a situation is well-known in the field, but I >>>> don't know the lingo or literature. If anyone has suggestions I'd be >>>> thrilled. Are there documentation/methodologies within PETSc for this type >>>> of situation? >>>> >>>> Is there any advantage to discretizing the action itself and using the >>>> optimization routines? With minor modifications I'll have the gradient and >>>> Hessian calculations coded. Are the optimization routines likely to >>>> stagnate in the same way as the nonlinear solver, or can they take >>>> advantage of the structure of the problem to overcome this? >>>> >>>> Thanks a lot in advance for any help. >>>> >>>> On Sun, Oct 8, 2017 at 5:57 AM, Barry Smith wrote: >>>> >>>>> >>>>> There is apparently confusion in understanding the ordering. Is this >>>>> all on one process that you get funny results? Are you using >>>>> MatSetValuesStencil() to provide the matrix (it is generally easier than >>>>> providing it yourself)? In parallel MatView() always maps the rows and >>>>> columns to the natural ordering before printing, if you use a matrix >>>>> created from the DMDA. If you create the matrix yourself it has a different >>>>> MatView in parallel that is in the PETSc ordering. >>>>> >>>>> >>>>> Barry >>>>> >>>>> >>>>> >>>>> > On Oct 8, 2017, at 8:05 AM, zakaryah . wrote: >>>>> > >>>>> > I'm more confused than ever. I don't understand the output of >>>>> -snes_type test -snes_test_display.
>>>>> > >>>>> > For the user-defined state of the vector (where I'd like to test the >>>>> Jacobian), the finite difference Jacobian at row 0 evaluates as: >>>>> > >>>>> > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, >>>>> 287.797) (5, 744.695) (9, -1454.66) (10, 6.08793) (11, 148.172) (12, >>>>> 13.1089) (13, -36.5783) (14, -9.99399) (27, -3423.49) (28, -2175.34) >>>>> (29, 548.662) (30, 145.753) (31, 17.6603) (32, -15.1079) (36, 76.8575) >>>>> (37, 16.325) (38, 4.83918) >>>>> > >>>>> > But the hand-coded Jacobian at row 0 evaluates as: >>>>> > >>>>> > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, >>>>> 287.797) (5, 744.695) (33, -1454.66) (34, 6.08792) (35, 148.172) (36, >>>>> 13.1089) (37, -36.5783) (38, -9.99399) (231, -3423.49) (232, -2175.34) >>>>> (233, 548.662) (234, 145.753) (235, 17.6603) (236, -15.1079) (264, >>>>> 76.8575) (265, 16.325) (266, 4.83917) (267, 0.) (268, 0.) (269, 0.) >>>>> > and the difference between the Jacobians at row 0 evaluates as: >>>>> > >>>>> > row 0: (0, 0.000189908) (1, 7.17315e-05) (2, 9.31778e-05) (3, >>>>> 0.000514947) (4, 0.000178659) (5, 0.000178217) (9, -2.25457e-05) (10, >>>>> -6.34278e-06) (11, -5.93241e-07) (12, 9.48544e-06) (13, 4.79709e-06) >>>>> (14, 2.40016e-06) (27, -0.000335696) (28, -0.000106734) (29, >>>>> -0.000106653) (30, 2.73119e-06) (31, -7.93382e-07) (32, 1.24048e-07) >>>>> (36, -4.0302e-06) (37, 3.67494e-06) (38, -2.70115e-06) (39, 0.) (40, >>>>> 0.) (41, 0.) >>>>> > >>>>> > The difference between the column numbering between the finite >>>>> difference and the hand-coded Jacobians looks like a serious problem to me, >>>>> but I'm probably missing something. >>>>> > >>>>> > I am trying to use a 3D DMDA with 3 dof, a box stencil of width 1, >>>>> and for this test problem the grid dimensions are 11x7x6. For a grid point >>>>> x,y,z, and dof c, is the index calculated as c + 3*x + 3*11*y + 3*11*7*z? >>>>> If so, then the column numbers of the hand-coded Jacobian match those of >>>>> the 27 point stencil I have in mind. However, I am then at a loss to >>>>> explain the column numbers in the finite difference Jacobian. >>>>> > >>>>> > >>>>> > >>>>> > >>>>> > On Sat, Oct 7, 2017 at 1:49 PM, zakaryah . >>>>> wrote: >>>>> > OK - I ran with -snes_monitor -snes_converged_reason >>>>> -snes_linesearch_monitor -ksp_monitor -ksp_monitor_true_residual >>>>> -ksp_converged_reason -pc_type svd -pc_svd_monitor -snes_type newtonls >>>>> -snes_compare_explicit >>>>> > >>>>> > and here is the full error message, output immediately after >>>>> > >>>>> > Finite difference Jacobian >>>>> > Mat Object: 24 MPI processes >>>>> > type: mpiaij >>>>> > >>>>> > [0]PETSC ERROR: --------------------- Error Message >>>>> -------------------------------------------------------------- >>>>> > >>>>> > [0]PETSC ERROR: Invalid argument >>>>> > >>>>> > [0]PETSC ERROR: Matrix not generated from a DMDA >>>>> > >>>>> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/d >>>>> ocumentation/faq.html for trouble shooting. 
>>>>> > >>>>> > [0]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017 >>>>> > >>>>> > [0]PETSC ERROR: ./CalculateOpticalFlow on a arch-linux2-c-opt named >>>>> node046.hpc.rockefeller.internal by zfrentz Sat Oct 7 13:44:44 2017 >>>>> > >>>>> > [0]PETSC ERROR: Configure options --prefix=/ru-auth/local/home/zfrentz/PETSc-3.7.6 >>>>> --download-fblaslapack -with-debugging=0 >>>>> > >>>>> > [0]PETSC ERROR: #1 MatView_MPI_DA() line 551 in >>>>> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/dm/impls/da/fdda.c >>>>> > >>>>> > [0]PETSC ERROR: #2 MatView() line 901 in >>>>> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/mat/interface/matrix.c >>>>> > >>>>> > [0]PETSC ERROR: #3 SNESComputeJacobian() line 2371 in >>>>> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/interface/snes.c >>>>> > >>>>> > [0]PETSC ERROR: #4 SNESSolve_NEWTONLS() line 228 in >>>>> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/impls/ls/ls.c >>>>> > >>>>> > [0]PETSC ERROR: #5 SNESSolve() line 4005 in >>>>> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/interface/snes.c >>>>> > >>>>> > [0]PETSC ERROR: #6 solveWarp3D() line 659 in >>>>> /ru-auth/local/home/zfrentz/Code/OpticalFlow/working/October6_2017/mshs.c >>>>> > >>>>> > >>>>> > On Tue, Oct 3, 2017 at 5:37 PM, Jed Brown wrote: >>>>> > Always always always send the whole error message. >>>>> > >>>>> > "zakaryah ." writes: >>>>> > >>>>> > > I tried -snes_compare_explicit, and got the following error: >>>>> > > >>>>> > > [0]PETSC ERROR: Invalid argument >>>>> > > >>>>> > > [0]PETSC ERROR: Matrix not generated from a DMDA >>>>> > > >>>>> > > What am I doing wrong? >>>>> > > >>>>> > > On Tue, Oct 3, 2017 at 10:08 AM, Jed Brown >>>>> wrote: >>>>> > > >>>>> > >> Barry Smith writes: >>>>> > >> >>>>> > >> >> On Oct 3, 2017, at 5:54 AM, zakaryah . >>>>> wrote: >>>>> > >> >> >>>>> > >> >> I'm still working on this. I've made some progress, and it >>>>> looks like >>>>> > >> the issue is with the KSP, at least for now. The Jacobian may be >>>>> > >> ill-conditioned. Is it possible to use -snes_test_display during >>>>> an >>>>> > >> intermediate step of the analysis? I would like to inspect the >>>>> Jacobian >>>>> > >> after several solves have already completed, >>>>> > >> > >>>>> > >> > No, our current code for testing Jacobians is poor quality >>>>> and >>>>> > >> poorly organized. Needs a major refactoring to do things >>>>> properly. Sorry >>>>> > >> >>>>> > >> You can use -snes_compare_explicit or -snes_compare_coloring to >>>>> output >>>>> > >> differences on each Newton step. >>>>> > >> >>>>> > >>>>> > >>>>> >>>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ >
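Regarding Barry's MatSetValuesStencil() suggestion earlier in this thread: a minimal sketch of filling one row of a matrix obtained from DMCreateMatrix() on a DMDA, using grid-based (i,j,k,c) indexing so that PETSc handles the natural-versus-PETSc ordering itself (the variable names and the two-entry stencil are illustrative only):

    MatStencil  row, col[2];
    PetscScalar v[2];
    row.i = i; row.j = j; row.k = k; row.c = c;      /* grid point and dof  */
    col[0] = row;                    v[0] = vdiag;   /* diagonal entry      */
    col[1] = row; col[1].j = j + 1;  v[1] = voff;    /* +y neighbor, same c */
    ierr = MatSetValuesStencil(J,1,&row,2,col,v,INSERT_VALUES);CHKERRQ(ierr);

With this interface no hand-computed global index like c + 3*x + 3*11*y + 3*11*7*z is needed, which avoids exactly the ordering confusion discussed above.

-------------- next part -------------- An HTML attachment was scrubbed...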
URL: From jed at jedbrown.org Thu Oct 12 13:14:26 2017 From: jed at jedbrown.org (Jed Brown) Date: Thu, 12 Oct 2017 12:14:26 -0600 Subject: [petsc-users] MatSetValues with openMP and no data races In-Reply-To: <5e4b0be0-c0d8-3f10-f6fc-67d5401b421d@imperial.ac.uk> References: <20171012134013.Horde.8CkrpuHOZoa7X2lFvKlFtc5@www.imp.polymtl.ca> <5e4b0be0-c0d8-3f10-f6fc-67d5401b421d@imperial.ac.uk> Message-ID: <87o9pcz119.fsf@jedbrown.org> Lawrence Mitchell writes: > On 12/10/17 14:40, Guillaume Emond wrote: >> Good morning, >> >> I would like to clarify a point about the insertion of values with >> MatSetValues in an OpenMP loop. I know these routines are not thread >> safe. But, in our situation, we used a graph coloring algorithm on our >> mesh to make sure no adjacent element is inserted at the same time so >> no data races occur when inserting values. Could these routines be >> used with OpenMP then, or are there some internal variables that would >> not be thread safe? > > The MatStash used for saving and then later communicating off process > entries during AssemblyEnd is, I believe, not thread safe. Yeah, we could make a thread-safe MatSetValues (I would suggest per-row or block-row locking), but it is not now, even with coloring. From knepley at gmail.com Fri Oct 13 09:13:34 2017 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 13 Oct 2017 10:13:34 -0400 Subject: [petsc-users] Good recommendation on meshing package? In-Reply-To: References: Message-ID: On Fri, Sep 29, 2017 at 11:06 AM, Zou, Ling wrote: > Hi all, > > I know this is a bit off topic on the PETSc email list. > I would like to try some finite volume type of CFD algorithm with PETSc, > but I found it quite troublesome to manage the mesh by myself. > I wonder if there is any good existing meshing package that works well > with PETSc. > It's possible you could use the DMPlex support in PETSc. > My expectation on such a package would be: > 1) I create the mesh with some tool. > We support at least GMsh, ExodusII, PLY, Triangle, and TetGen. > 2) Read this mesh with the meshing package, so I have things like node > set, edge set, cell set, etc. to play with > 3) discretize my PDE with the mesh > 4) solve it > > I also understand many people here use PETSc to solve their CFD problem. > I would appreciate it if you could also point me to some good examples. > There are a bunch of tests, like src/dm/impls/plex/examples/tests/ex1 which reads in a mesh and views it, and also some examples of solving PDEs, all elliptic, such as SNES ex12, ex62, and ex77 and TS ex45, ex46, and ex47. Thanks, Matt > Best, > > Ling > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From evanum at gmail.com Fri Oct 13 16:59:53 2017 From: evanum at gmail.com (Evan Um) Date: Fri, 13 Oct 2017 14:59:53 -0700 Subject: [petsc-users] Controlling MUMPS parameters inside PETSC Message-ID: Dear PETSC-users, I use parallel direct solver MUMPS inside PETSC and need to control some MUMPS parameters inside PETSC. For example, I want to set up MUMPS parameters as shown below. ZMUMPS_STRUC_C id; id.job=-1; /* Initialize mumps instance*/ id.par=1; /* 0: host is not involved in solution processes */ id.sym=2; /* 0: unsym, 2: general symmetric matrix */ How can I access a structure similar to id in PETSc?
For example, I control my icntl parameters like this. PCFactorGetMatrix(pc, &F); MatMumpsSetIcntl(F, icntl[i], ival[i]); However, it is not clear how to control id.par and id.sym. If you have any experience in controlling id.par and id.sym inside PETSC, I request your help. In advance, thank you very much for your help. Best, Evan -------------- next part -------------- An HTML attachment was scrubbed... URL: From hzhang at mcs.anl.gov Fri Oct 13 18:16:58 2017 From: hzhang at mcs.anl.gov (Hong) Date: Fri, 13 Oct 2017 18:16:58 -0500 Subject: [petsc-users] Controlling MUMPS parameters inside PETSC In-Reply-To: References: Message-ID: Evan: > Dear PETSC-users, > > I use parallel direct solver MUMPS inside PETSC and need to control some > MUMPS parameters inside PETSC. For example, I want to set up MUMPS > parameters as shown below. > > ZMUMPS_STRUC_C id; > id.job=-1; /* Initialize mumps instance*/ > This is set in the petsc/mumps interface. See line 1197, mumps.c > id.par=1; /* 0: host is not involved in solution processes */ > id.par=1; is used in the interface. See line 1198, id.par=1; id.sym=2; /* 0: unsym, 2: general symmetric matrix */ > id.sym is set in the interface based on the input matrix type, see lines 2154-2171 > > How can I access a structure similar to id in PETSc? > Why do you want to access id? > For example, I control my icntl parameters like this. > > PCFactorGetMatrix(pc, &F); > MatMumpsSetIcntl(F, icntl[i], ival[i]); > > However, it is not clear how to control id.par and id.sym. If you > have any experience in controlling id.par and id.sym inside PETSC, I > request your help. In advance, thank you very much for your help. > The petsc/mumps interface is developed for petsc users to call mumps direct solvers under the petsc environment. The parameters id.par, id.sym ... are set in the interface according to the user's matrix and solver type, thus the user does not need to know about them. Hong -------------- next part -------------- An HTML attachment was scrubbed... URL: From evanum at gmail.com Fri Oct 13 19:39:02 2017 From: evanum at gmail.com (Evan Um) Date: Fri, 13 Oct 2017 17:39:02 -0700 Subject: [petsc-users] Controlling MUMPS parameters inside PETSC In-Reply-To: References: Message-ID: Hi Hong, Thanks for your reply. When I use standalone MUMPS for a symmetric matrix, I pass an upper/lower triangular part to MUMPS. To do this, I use id.sym=2. Under the PETSc environment, KSPCreate(PETSC_COMM_WORLD, &ksp); KSPSetOperators(ksp, A, A); KSPSetType (ksp, KSPPREONLY); KSPGetPC(ksp, &pc); MatSetOption(A, MAT_SYMMETRIC, PETSC_TRUE); PCSetType(pc, PCLU); PCFactorSetMatSolverPackage(pc, MATSOLVERMUMPS); PCFactorSetUpMatSolverPackage(pc); Can the 5th line be an alternative to id.sym=2? Best, Evan On Fri, Oct 13, 2017 at 4:16 PM, Hong wrote: > Evan: > >> Dear PETSC-users, >> >> I use parallel direct solver MUMPS inside PETSC and need to control some >> MUMPS parameters inside PETSC. For example, I want to set up MUMPS >> parameters as shown below. >> >> ZMUMPS_STRUC_C id; >> id.job=-1; /* Initialize mumps instance*/ >> > This is set in the petsc/mumps interface. See line 1197, mumps.c > > >> id.par=1; /* 0: host is not involved in solution processes */ >> > id.par=1; is used in the interface. See line 1198, id.par=1; > > id.sym=2; /* 0: unsym, 2: general symmetric matrix */ >> > id.sym is set in the interface based on the input matrix type, see lines > 2154-2171 > >> >> How can I access a structure similar to id in PETSc? >> > Why do you want to access id?
> > >> For example, I control my icntl parameters like this. >> >> PCFactorGetMatrix(pc, &F); >> MatMumpsSetIcntl(F, icntl[i], ival[i]); >> >> However, it is not clear how to control id.par and id.sym. If you >> have any experience in controlling id.par and id.sym inside PETSC, I >> request your help. In advance, thank you very much for your help. >> > > The petsc/mumps interface is developed for petsc users to call mumps direct > solvers under the petsc environment. The parameters id.par, id.sym ... are set > in the interface according to the user's matrix and solver type, thus the user > does not need to know about them. > > Hong > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From hzhang at mcs.anl.gov Fri Oct 13 21:00:49 2017 From: hzhang at mcs.anl.gov (Hong) Date: Fri, 13 Oct 2017 21:00:49 -0500 Subject: [petsc-users] Controlling MUMPS parameters inside PETSC In-Reply-To: References: Message-ID: Evan: When PCLU is requested, the matrix is treated as unsymmetric by mumps and petsc, thus the setting of id.sym=0 is actually used. If the user requests PCCholesky, then mumps needs to be set with id.sym=1 (spd) or 2 (symmetric). In the mumps interface, we do #if defined(PETSC_USE_COMPLEX) mumps->sym = 2; #else if (A->spd_set && A->spd) mumps->sym = 1; else mumps->sym = 2; #endif Hong Hi Hong, > > Thanks for your reply. When I use standalone MUMPS for a symmetric matrix, > I pass an upper/lower triangular part to MUMPS. To do this, I use id.sym=2. > Under the PETSc environment, > > KSPCreate(PETSC_COMM_WORLD, &ksp); > KSPSetOperators(ksp, A, A); > KSPSetType (ksp, KSPPREONLY); > KSPGetPC(ksp, &pc); > MatSetOption(A, MAT_SYMMETRIC, PETSC_TRUE); > PCSetType(pc, PCLU); > PCFactorSetMatSolverPackage(pc, MATSOLVERMUMPS); > PCFactorSetUpMatSolverPackage(pc); > > Can the 5th line be an alternative to id.sym=2? > > Best, > Evan > > > > > On Fri, Oct 13, 2017 at 4:16 PM, Hong wrote: > >> Evan: >> >>> Dear PETSC-users, >>> >>> I use parallel direct solver MUMPS inside PETSC and need to control some >>> MUMPS parameters inside PETSC. For example, I want to set up MUMPS >>> parameters as shown below. >>> >>> ZMUMPS_STRUC_C id; >>> id.job=-1; /* Initialize mumps instance*/ >>> >> This is set in the petsc/mumps interface. See line 1197, mumps.c >> >> >>> id.par=1; /* 0: host is not involved in solution processes */ >>> >> id.par=1; is used in the interface. See line 1198, id.par=1; >> >> id.sym=2; /* 0: unsym, 2: general symmetric matrix */ >>> >> id.sym is set in the interface based on the input matrix type, see lines >> 2154-2171 >> >>> >>> How can I access a structure similar to id in PETSc? >>> >> Why do you want to access id? >> >> >>> For example, I control my icntl parameters like this. >>> >>> PCFactorGetMatrix(pc, &F); >>> MatMumpsSetIcntl(F, icntl[i], ival[i]); >>> >>> However, it is not clear how to control id.par and id.sym. If you >>> have any experience in controlling id.par and id.sym inside PETSC, I >>> request your help. In advance, thank you very much for your help. >>> >> >> The petsc/mumps interface is developed for petsc users to call mumps direct >> solvers under the petsc environment. The parameters id.par, id.sym ... are set >> in the interface according to the user's matrix and solver type, thus the user >> does not need to know about them. >> >> Hong >> >> >
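To actually reach the id.sym=2 code path from PETSc, the factorization type has to be Cholesky rather than LU. A minimal sketch (assuming the matrix is symmetric; depending on the PETSc version it may need to be stored as MATSBAIJ, i.e. upper triangle only, for a MUMPS Cholesky factorization):

    KSPSetType(ksp, KSPPREONLY);
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCCHOLESKY);                        /* instead of PCLU */
    PCFactorSetMatSolverPackage(pc, MATSOLVERMUMPS);
    /* without this option the interface picks id.sym=2 (general
       symmetric); with it, id.sym=1 (SPD): */
    MatSetOption(A, MAT_SPD, PETSC_TRUE);

Note that MatSetOption(A, MAT_SYMMETRIC, PETSC_TRUE) alone does not change the factorization: with PCLU the interface still runs MUMPS with id.sym=0, as Hong explains above.

-------------- next part -------------- An HTML attachment was scrubbed...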
URL: From tinap89 at yahoo.com Sat Oct 14 02:31:40 2017 From: tinap89 at yahoo.com (Tina Patel) Date: Sat, 14 Oct 2017 07:31:40 +0000 (UTC) Subject: [petsc-users] Issue with -log_view References: <1670287803.1693376.1507966300567.ref@mail.yahoo.com> Message-ID: <1670287803.1693376.1507966300567@mail.yahoo.com> Hi, I'm using the -log_view option from the command line, but it gives me "corrupt argument" and "invalid argument". However, PETSc doesn't throw errors when running without -log_view. Am I using it correctly? Or does this hint at another problem? I'm using petsc-master 3.7.6. Thanks for your time, Tina -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Sat Oct 14 08:10:50 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sat, 14 Oct 2017 08:10:50 -0500 Subject: [petsc-users] Issue with -log_view In-Reply-To: <1670287803.1693376.1507966300567@mail.yahoo.com> References: <1670287803.1693376.1507966300567.ref@mail.yahoo.com> <1670287803.1693376.1507966300567@mail.yahoo.com> Message-ID: Please cut and paste all the output and send it to petsc-maint at mcs.anl.gov Barry > On Oct 14, 2017, at 2:31 AM, Tina Patel wrote: > > Hi, > > I'm using the -log_view option from the command line, but it gives me "corrupt argument" and "invalid argument". However, PETSc doesn't throw errors when running without -log_view. > Am I using it correctly? Or does this hint at another problem? I'm using petsc-master 3.7.6. > > Thanks for your time, > Tina From stefano.zampini at gmail.com Sat Oct 14 08:36:52 2017 From: stefano.zampini at gmail.com (Stefano Zampini) Date: Sat, 14 Oct 2017 16:36:52 +0300 Subject: [petsc-users] Issue with -log_view In-Reply-To: References: <1670287803.1693376.1507966300567.ref@mail.yahoo.com> <1670287803.1693376.1507966300567@mail.yahoo.com> Message-ID: Cutting and pasting a message I sent a couple of days ago on the mailing list. I suspect you have a memory leak on some PetscViewer object. Try running with -malloc -malloc_dump -malloc_debug and without -log_view and see if PETSc reports a memory leak. You can also try running under valgrind with the --leak-check=full option ---------------------------------------------------------------------------------------------- Instead of reporting a leak, the below code, when run with -log_view, triggers an error #include <petsc.h> static char help[] = ""; int main(int argc,char **args) { PetscErrorCode ierr; PetscViewer view; ierr = PetscInitialize(&argc,&args,(char*)0,help);CHKERRQ(ierr); ierr = PetscViewerASCIIGetStdout(PETSC_COMM_WORLD,&view);CHKERRQ(ierr); ierr = PetscViewerCreate(PETSC_COMM_WORLD,&view);CHKERRQ(ierr); ierr = PetscFinalize(); return ierr; } [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Corrupt argument: http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind [0]PETSC ERROR: Invalid type of object: Parameter # 1 [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.7.6-4792-gbbfd41f GIT Date: 2017-07-30 13:35:30 +0300 [0]PETSC ERROR: ./ex1 on a arch-debug named localhost.localdomain by szampini Thu Oct 12 15:24:19 2017 [0]PETSC ERROR: Configure options --download-chaco --download-ctetgen --download-hypre --download-metis --download-mumps --download-p4est --download-parmetis --download-suitesparse --download-triangle --with-scalapack CFLAGS="-Wall -g -O0" CXXFLAGS="-Wall -g -O0" FCFLAGS="-g -O0" PETSC_ARCH=arch-debug [0]PETSC ERROR: #1 PetscObjectReference() line 510 in /home/szampini/src/petsc/src/sys/objects/inherit.c [0]PETSC ERROR: #2 PetscOptionsGetViewer() line 259 in /home/szampini/src/petsc/src/sys/classes/viewer/interface/viewreg.c [0]PETSC ERROR: #3 PetscLogViewFromOptions() line 1753 in /home/szampini/src/petsc/src/sys/logging/plog.c [0]PETSC ERROR: #4 PetscFinalize() line 1227 in /home/szampini/src/petsc/src/sys/objects/pinit.c The problem is with the MPI attribute Petsc_Viewer_Stdout_keyval attached to PETSC_COMM_WORLD. PETSC_VIEWER_STDOUT_WORLD gets destroyed in the first call to PetscObjectRegisterDestroyAll(); Then PetscLogViewFromOptions() calls PetscViewerASCIIGetStdout(), which checks for the presence of the attribute on the communicator, which is still there, since we never called MPI_Comm_free on that communicator. What would be a solution for this issue? At least, we should print a nice error message in PetscViewerASCIIGetStdout. 2017-10-14 16:10 GMT+03:00 Barry Smith : > > Please cut and paste all the output and send it to > petsc-maint at mcs.anl.gov > > Barry > > > On Oct 14, 2017, at 2:31 AM, Tina Patel wrote: > > Hi, > > I'm using the -log_view option from the command line, but it gives me > "corrupt argument" and "invalid argument". However, PETSc doesn't throw > errors when running without -log_view.
return ierr;} 0]PETSC ERROR: --------------------- Error Message ------------------------------ ------------------------------ --[0]PETSC ERROR: Corrupt argument:?http://www.mcs.anl.gov/petsc/ documentation/faq.html# valgrind[0]PETSC ERROR: Invalid type of object: Parameter # 1[0]PETSC ERROR: See?http://www.mcs.anl.gov/petsc/ documentation/faq.html?for trouble shooting.[0]PETSC ERROR: Petsc Development GIT revision: v3.7.6-4792-gbbfd41f? GIT Date: 2017-07-30 13:35:30 +0300[0]PETSC ERROR: ./ex1 on a arch-debug named localhost.localdomain by szampini Thu Oct 12 15:24:19 2017[0]PETSC ERROR: Configure options --download-chaco --download-ctetgen --download-hypre --download-metis --download-mumps --download-p4est --download-parmetis --download-suitesparse --download-triangle --with-scalapack CFLAGS="-Wall -g -O0" CXXFLAGS="-Wall -g -O0" FCFLAGS="-g -O0" PETSC_ARCH=arch-debug[0]PETSC ERROR: #1 PetscObjectReference() line 510 in /home/szampini/src/petsc/src/ sys/objects/inherit.c[0]PETSC ERROR: #2 PetscOptionsGetViewer() line 259 in /home/szampini/src/petsc/src/ sys/classes/viewer/interface/ viewreg.c[0]PETSC ERROR: #3 PetscLogViewFromOptions() line 1753 in /home/szampini/src/petsc/src/ sys/logging/plog.c[0]PETSC ERROR: #4 PetscFinalize() line 1227 in /home/szampini/src/petsc/src/ sys/objects/pinit.c The problem is with the MPIAttribute Petsc_Viewer_Stdout_keyval attached to PETSC_COMM_WORLD. PETSC_VIEWER_STDOUT_WORLD gets destroyed in the first call to? ? PetscObjectRegisterDestroyAll( ); Then?PetscLogViewFromOptions() call PetscViewerASCIIGetStdout that checks for the presence of the attribute on the communicator, which is still there, since we never called MPI_Comm_free on that communicator. What would be a solution for this issue? At least, we should print a nice error message in PetscViewerASCIIGetStdout.? 2017-10-14 16:10 GMT+03:00 Barry Smith : ? ?Please cut and paste all the output and send it to petsc-maint at mcs.anl.gov ? ?Barry > On Oct 14, 2017, at 2:31 AM, Tina Patel wrote: > > Hi, > > I'm using -log_view option from the command line, but it gives me "corrupt argument" and "invalid argument". However, PETSc doesn't throw errors when running without -log_view. > Am I using it correctly? Or does this hint at another problem? I'm using petsc-master 3.7.6. > > Thanks for your time, > Tina -- Stefano -------------- next part -------------- An HTML attachment was scrubbed... URL: From zakaryah at gmail.com Sun Oct 15 18:20:21 2017 From: zakaryah at gmail.com (zakaryah .) Date: Sun, 15 Oct 2017 19:20:21 -0400 Subject: [petsc-users] DMDA and boundary conditions Message-ID: For finite difference methods, using a DMDA with DMDA_BOUNDARY_GHOSTED is great for Dirichlet boundary conditions, because I can just set the Dirichlet values on the ghost points, correct? It seems to me that the analogous method for Neumann conditions would be to use a reflecting boundary condition, and if the Neumann values are nonzero, these could be added to the constant vector. Is this correct? Is DMDA_BOUNDARY_MIRROR supposed to do this? Is it implemented yet, or are there plans to implement it soon? Otherwise, is there any way to efficiently implement Neumann conditions besides branching with if statements? Thanks in advance! -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From bsmith at mcs.anl.gov Sun Oct 15 20:06:12 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sun, 15 Oct 2017 20:06:12 -0500 Subject: [petsc-users] DMDA and boundary conditions In-Reply-To: References: Message-ID: <04B153E4-59FF-48F2-8DA8-BD8A980BD5BF@mcs.anl.gov> > On Oct 15, 2017, at 6:20 PM, zakaryah . wrote: > > For finite difference methods, using a DMDA with DMDA_BOUNDARY_GHOSTED is great for Dirichlet boundary conditions, because I can just set the Dirichlet values on the ghost points, correct? Yes, > It seems to me that the analogous method for Neumann conditions would be to use a reflecting boundary condition, and if the Neumann values are nonzero, these could be added to the constant vector. I am not sure what you mean by "added to the constant vector", what constant vector? > Is this correct? Is DMDA_BOUNDARY_MIRROR supposed to do this? Yes see http://scicomp.stackexchange.com/questions/5355/writing-the-poisson-equation-finite-difference-matrix-with-neumann-boundary-cond > Is it implemented yet, or are there plans to implement it soon? It is implemented for 2d; if you need 3d you can try to implement it yourself or ask us and we'll look and see how difficult it might be. > Otherwise, is there any way to efficiently implement Neumann conditions besides branching with if statements? Yes, this stuff is here to some degree to eliminate all the horrible if checks in user code. Barry > Thanks in advance! From zakaryah at gmail.com Sun Oct 15 20:52:38 2017 From: zakaryah at gmail.com (zakaryah .) Date: Sun, 15 Oct 2017 21:52:38 -0400 Subject: [petsc-users] DMDA and boundary conditions In-Reply-To: <04B153E4-59FF-48F2-8DA8-BD8A980BD5BF@mcs.anl.gov> References: <04B153E4-59FF-48F2-8DA8-BD8A980BD5BF@mcs.anl.gov> Message-ID: Thanks for all the answers, Barry! By constant vector, I just meant the part of the function or Jacobian which doesn't depend on the state variable. I am working in 3D - I will have a look at the code. To implement, would everything be in da3.c, analogous to the implementation in da2.c? In other words - would there be changes in other source files as well or would they be limited to da3.c? Thanks again. On Sun, Oct 15, 2017 at 9:06 PM, Barry Smith wrote: > > > On Oct 15, 2017, at 6:20 PM, zakaryah . wrote: > > > > For finite difference methods, using a DMDA with DMDA_BOUNDARY_GHOSTED > is great for Dirichlet boundary conditions, because I can just set the > Dirichlet values on the ghost points, correct? > > Yes, > > It seems to me that the analogous method for Neumann conditions would > be to use a reflecting boundary condition, and if the Neumann values are > nonzero, these could be added to the constant vector. > > I am not sure what you mean by "added to the constant vector", what > constant vector? > > > > Is this correct? Is DMDA_BOUNDARY_MIRROR supposed to do this? > > Yes > > see http://scicomp.stackexchange.com/questions/5355/writing-the-poisson-equation-finite-difference-matrix-with-neumann-boundary-cond > > > > Is it implemented yet, or are there plans to implement it soon? > > It is implemented for 2d; if you need 3d you can try to implement it > yourself or ask us and we'll look and see how difficult it might be. > > > Otherwise, is there any way to efficiently implement Neumann conditions > besides branching with if statements? > > Yes, this stuff is here to some degree to eliminate all the horrible if > checks in user code. > > Barry > > > Thanks in advance!
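A minimal sketch of the ghost-point approach discussed above, assuming a scalar 2d DMDA (recent releases spell the boundary type DM_BOUNDARY_GHOSTED), a global solution vector xglobal, grid spacing h, and a Neumann value g on the bottom boundary; all names here are placeholders:

    DM          da;
    Vec         xlocal;
    PetscScalar **x;
    PetscInt    i, xs, ys, xm, ym;
    DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_GHOSTED,
                 DMDA_STENCIL_STAR, mx, my, PETSC_DECIDE, PETSC_DECIDE,
                 1, 1, NULL, NULL, &da);
    DMSetFromOptions(da); DMSetUp(da);
    DMGetLocalVector(da, &xlocal);
    DMGlobalToLocalBegin(da, xglobal, INSERT_VALUES, xlocal);
    DMGlobalToLocalEnd(da, xglobal, INSERT_VALUES, xlocal);
    DMDAGetCorners(da, &xs, &ys, NULL, &xm, &ym, NULL);
    DMDAVecGetArray(da, xlocal, &x);
    if (ys == 0) {                  /* ranks touching the bottom boundary */
      for (i = xs; i < xs + xm; i++) {
        /* fill the ghost row so that (x[1][i]-x[-1][i])/(2h) = g;
           g = 0 recovers the plain reflecting (MIRROR) condition */
        x[-1][i] = x[1][i] - 2.0*h*g;
      }
    }
    DMDAVecRestoreArray(da, xlocal, &x);
    DMRestoreLocalVector(da, &xlocal);

For Dirichlet data the same ghost entries would simply be set to the boundary values, with no branching inside the residual loop itself.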
> > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Sun Oct 15 22:04:28 2017 From: jed at jedbrown.org (Jed Brown) Date: Sun, 15 Oct 2017 21:04:28 -0600 Subject: [petsc-users] DMDA and boundary conditions In-Reply-To: References: <04B153E4-59FF-48F2-8DA8-BD8A980BD5BF@mcs.anl.gov> Message-ID: <871sm3g5dv.fsf@jedbrown.org> "zakaryah ." writes: > Thanks for all the answers, Barry! > > By constant vector, I just meant the part of the function or Jacobian which > doesn't depend on the state variable. If you need inhomogeneous Neumann, you'll have to modify the ghost value. MIRROR is really a cheap hack -- for anything more general, you can use GHOSTED and use whatever algorithm you like to fill in those ghost values. Also, rather than think of the Neumann boundary condition as being a centered first derivative, I like to think about it as extending the domain of the PDE after imposing some (biased) symmetry. This gives you a consistent scaling for the equation and doesn't require a new "stencil". > I am working in 3D - I will have a look at the code. To implement, > would everything be in da3.c, analogous to the implementation in > da2.c? In other words - would there be changes in other source files > as well or would they be limited to da3.c? It looks like it is implemented for 3D, at least in the sense that there are the requisite number of conditionals on DM_BOUNDARY_MIRROR. You might just need to remove the stale line at the top that sets an error claiming that DM_BOUNDARY_MIRROR is not implemented. Let us know if that works. From michael.werner at dlr.de Mon Oct 16 02:26:57 2017 From: michael.werner at dlr.de (Michael Werner) Date: Mon, 16 Oct 2017 09:26:57 +0200 Subject: [petsc-users] Parallelizing a matrix-free code Message-ID: <71b0677c-9615-02f3-1e60-964fdbd91aee@dlr.de> Hello, I'm having trouble with parallelizing a matrix-free code with PETSc. In this code, I use an external CFD code to provide the matrix-vector product for an iterative solver in PETSc. To increase convergence rate, I'm using an explicitly stored Jacobian matrix to precondition the solver. This works fine for serial runs. However, when I try to use multiple processes, I face the problem that PETSc decomposes the preconditioner matrix, and probably also the shell matrix, in a different way than the external CFD code decomposes the grid. The Jacobian matrix is built in a way, that its rows and columns correspond to the global IDs of the individual points in my CFD mesh The CFD code decomposes the domain based on the proximity of points to each other, so that the resulting subgrids are coherent. However, since its an unstructured grid, those subgrids are not necessarily made up of points with successive global IDs. This is a problem, since PETSc seems to partition the matrix in? coherent slices. I'm not sure what the best approach to this problem might be. Is it maybe possible to exactly tell PETSc, which rows/columns it should assign to the individual processes? 
From praveen at tifrbng.res.in Mon Oct 16 02:59:54 2017 From: praveen at tifrbng.res.in (Praveen C) Date: Mon, 16 Oct 2017 13:29:54 +0530 Subject: [petsc-users] Parallelizing a matrix-free code In-Reply-To: <71b0677c-9615-02f3-1e60-964fdbd91aee@dlr.de> References: <71b0677c-9615-02f3-1e60-964fdbd91aee@dlr.de> Message-ID: <4EFF96FC-B907-47FB-892A-A96F06FF33B0@tifrbng.res.in> > On 16-Oct-2017, at 12:56 PM, Michael Werner wrote: > > However, since its an unstructured grid, those subgrids are not necessarily made up of points with successive global IDs. It should be easy to renumber the points so that each partition has contiguously numbered point ids. This is what we do in our CFD code during the stage where we partition the mesh with Metis. Best praveen -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefano.zampini at gmail.com Mon Oct 16 03:32:08 2017 From: stefano.zampini at gmail.com (Stefano Zampini) Date: Mon, 16 Oct 2017 11:32:08 +0300 Subject: [petsc-users] Parallelizing a matrix-free code In-Reply-To: <71b0677c-9615-02f3-1e60-964fdbd91aee@dlr.de> References: <71b0677c-9615-02f3-1e60-964fdbd91aee@dlr.de> Message-ID: 2017-10-16 10:26 GMT+03:00 Michael Werner : > Hello, > > I'm having trouble with parallelizing a matrix-free code with PETSc. In > this code, I use an external CFD code to provide the matrix-vector product > for an iterative solver in PETSc. To increase convergence rate, I'm using > an explicitly stored Jacobian matrix to precondition the solver. This works > fine for serial runs. However, when I try to use multiple processes, I face > the problem that PETSc decomposes the preconditioner matrix, and probably > also the shell matrix, in a different way than the external CFD code > decomposes the grid. > > The Jacobian matrix is built in a way, that its rows and columns > correspond to the global IDs of the individual points in my CFD mesh > > The CFD code decomposes the domain based on the proximity of points to > each other, so that the resulting subgrids are coherent. However, since its > an unstructured grid, those subgrids are not necessarily made up of points > with successive global IDs. This is a problem, since PETSc seems to > partition the matrix in coherent slices. > > I'm not sure what the best approach to this problem might be. Is it maybe > possible to exactly tell PETSc, which rows/columns it should assign to the > individual processes? > > If you are explicitly setting the values in your Jacobians via MatSetValues(), you can create a ISLocalToGlobalMapping http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/IS/ISLocalToGlobalMappingCreate.html that maps the numbering you use for the Jacobians to their counterpart in the CFD ordering, then call MatSetLocalToGlobalMapping and then use MatSetValuesLocal with the same arguments you are calling MatSetValues now. Otherwise, you can play with the application ordering http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/AO/index.html -- Stefano -------------- next part -------------- An HTML attachment was scrubbed... URL: From ling.zou at inl.gov Mon Oct 16 09:40:55 2017 From: ling.zou at inl.gov (Zou, Ling) Date: Mon, 16 Oct 2017 08:40:55 -0600 Subject: [petsc-users] Good recommendation on meshing package? In-Reply-To: References: Message-ID: Matt, thanks for your input. This is very helpful. 
Ling On Fri, Oct 13, 2017 at 8:13 AM, Matthew Knepley wrote: > On Fri, Sep 29, 2017 at 11:06 AM, Zou, Ling wrote: > >> Hi all, >> >> I know this is a bit off topic on PETSc email list. >> I would like to try some finite volume type of CFD algorithm with PETSc, >> but I found it quite troublesome to manage mesh by myself. >> I wonder if there is any good existing meshing package that works well >> with PTESc. >> > > It possible you could use the DMPlex support in PETSc. > > >> My expectation on such a package would be: >> 1) I create the mesh with some tool. >> > > We support at least GMsh, ExodusII, PLY, Triangle, and TetGen. > > >> 2) Read this mesh with the meshing package, so I have things like node >> set, edge set, cell set, etc. to play with >> 3) discretize my PDE with the mesh >> 4) solve it >> >> I also understand many people here use PETSc solve their CFD problem. >> I would appreciate it if you could also point me to some good examples. >> > > There are a bunch of tests, like src/dm/impls/plex/examples/tests/ex1 > which reads in a mesh and views it, and also > some examples of solving PDEs, all elliptic, such as SNES ex12, ex62, and > ex77 and TS ex45, ex46, and ex47. > > Thanks, > > Matt > > >> Best, >> >> Ling >> > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From cpraveen at gmail.com Mon Oct 16 10:25:39 2017 From: cpraveen at gmail.com (Praveen C) Date: Mon, 16 Oct 2017 20:55:39 +0530 Subject: [petsc-users] Parallelizing a matrix-free code In-Reply-To: References: <71b0677c-9615-02f3-1e60-964fdbd91aee@dlr.de> Message-ID: I am interested to learn more about how this works. How are the vectors created if the ids are not contiguous in a partition ? Thanks praveen On Mon, Oct 16, 2017 at 2:02 PM, Stefano Zampini wrote: > > > 2017-10-16 10:26 GMT+03:00 Michael Werner : > >> Hello, >> >> I'm having trouble with parallelizing a matrix-free code with PETSc. In >> this code, I use an external CFD code to provide the matrix-vector product >> for an iterative solver in PETSc. To increase convergence rate, I'm using >> an explicitly stored Jacobian matrix to precondition the solver. This works >> fine for serial runs. However, when I try to use multiple processes, I face >> the problem that PETSc decomposes the preconditioner matrix, and probably >> also the shell matrix, in a different way than the external CFD code >> decomposes the grid. >> >> The Jacobian matrix is built in a way, that its rows and columns >> correspond to the global IDs of the individual points in my CFD mesh >> >> The CFD code decomposes the domain based on the proximity of points to >> each other, so that the resulting subgrids are coherent. However, since its >> an unstructured grid, those subgrids are not necessarily made up of points >> with successive global IDs. This is a problem, since PETSc seems to >> partition the matrix in coherent slices. >> >> I'm not sure what the best approach to this problem might be. Is it maybe >> possible to exactly tell PETSc, which rows/columns it should assign to the >> individual processes? 
>> >> > If you are explicitly setting the values in your Jacobians via > MatSetValues(), you can create a ISLocalToGlobalMapping > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/IS/ > ISLocalToGlobalMappingCreate.html > > that maps the numbering you use for the Jacobians to their counterpart in > the CFD ordering, then call MatSetLocalToGlobalMapping and then use > MatSetValuesLocal with the same arguments you are calling MatSetValues now. > > Otherwise, you can play with the application ordering > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/AO/index.html > > > > > -- > Stefano > -------------- next part -------------- An HTML attachment was scrubbed... URL: From fande.kong at inl.gov Mon Oct 16 11:07:23 2017 From: fande.kong at inl.gov (Kong, Fande) Date: Mon, 16 Oct 2017 10:07:23 -0600 Subject: [petsc-users] Can not configure PETSc-master with clang-3.9 Message-ID: Hi All, I just upgraded MAC OS, and also updated all other related packages. Now I can not configure PETSc-master any more. See the attachment for more details. Fande, -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log.zip Type: application/zip Size: 15583 bytes Desc: not available URL: From knepley at gmail.com Mon Oct 16 11:30:25 2017 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 16 Oct 2017 12:30:25 -0400 Subject: [petsc-users] Can not configure PETSc-master with clang-3.9 In-Reply-To: References: Message-ID: On Mon, Oct 16, 2017 at 12:07 PM, Kong, Fande wrote: > Hi All, > > I just upgraded MAC OS, and also updated all other related packages. Now > I can not configure PETSc-master any more. > > See the attachment for more details. > Something is really wrong with your compilers Source: #include "confdefs.h" #include "conffix.h" #include Preprocess stderr before filtering:/var/folders/6q/y12qpzw12dg5qx5x96dd5_bhtzr4_y/T/petsc-mFgio7/config.setCompilers/conftest.c:3:10: fatal error: 'stdlib.h' file not found #include ^ 1 error generated. : Matt > > > Fande, > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Mon Oct 16 11:52:57 2017 From: jed at jedbrown.org (Jed Brown) Date: Mon, 16 Oct 2017 10:52:57 -0600 Subject: [petsc-users] Can not configure PETSc-master with clang-3.9 In-Reply-To: References: Message-ID: <87k1zvdogm.fsf@jedbrown.org> "Kong, Fande" writes: > Hi All, > > I just upgraded MAC OS, and also updated all other related packages. Now > I can not configure PETSc-master any more. Your compiler paths are broken. /var/folders/6q/y12qpzw12dg5qx5x96dd5_bhtzr4_y/T/petsc-mFgio7/config.setCompilers/conftest.c:3:10: fatal error: 'stdlib.h' file not found #include ^ 1 error generated. From rtmills at anl.gov Mon Oct 16 11:58:45 2017 From: rtmills at anl.gov (Richard Tran Mills) Date: Mon, 16 Oct 2017 09:58:45 -0700 Subject: [petsc-users] Can not configure PETSc-master with clang-3.9 In-Reply-To: <9babb1c86e5e408582452b7a3696df88@CY1PR09MB0741.namprd09.prod.outlook.com> References: <9babb1c86e5e408582452b7a3696df88@CY1PR09MB0741.namprd09.prod.outlook.com> Message-ID: Fande, Did you remember to agree to the XCode license after your upgrade, if you did an XCode upgrade? 
You have to do the license agreement again, otherwise the compilers don't work at all. Apologies if this seems like a silly thing to ask, but this has caused me a few minutes of confusion before. --Richard On Mon, Oct 16, 2017 at 9:52 AM, Jed Brown wrote: > "Kong, Fande" writes: > > > Hi All, > > > > I just upgraded MAC OS, and also updated all other related packages. > Now > > I can not configure PETSc-master any more. > > Your compiler paths are broken. > > /var/folders/6q/y12qpzw12dg5qx5x96dd5_bhtzr4_y/T/petsc-mFgio7/config.setCompilers/conftest.c:3:10: > fatal error: 'stdlib.h' file not found > #include > ^ > 1 error generated. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From fande.kong at inl.gov Mon Oct 16 12:02:43 2017 From: fande.kong at inl.gov (Kong, Fande) Date: Mon, 16 Oct 2017 11:02:43 -0600 Subject: [petsc-users] Can not configure PETSc-master with clang-3.9 In-Reply-To: References: <9babb1c86e5e408582452b7a3696df88@CY1PR09MB0741.namprd09.prod.outlook.com> Message-ID: Now it is working. It turns out I need to do something like "xcode-select --install" after upgrading OS, and of course we need to agree the license. Fande, On Mon, Oct 16, 2017 at 10:58 AM, Richard Tran Mills wrote: > Fande, > > Did you remember to agree to the XCode license after your upgrade, if you > did an XCode upgrade? You have to do the license agreement again, otherwise > the compilers don't work at all. Apologies if this seems like a silly thing > to ask, but this has caused me a few minutes of confusion before. > > --Richard > > On Mon, Oct 16, 2017 at 9:52 AM, Jed Brown wrote: > >> "Kong, Fande" writes: >> >> > Hi All, >> > >> > I just upgraded MAC OS, and also updated all other related packages. >> Now >> > I can not configure PETSc-master any more. >> >> Your compiler paths are broken. >> >> /var/folders/6q/y12qpzw12dg5qx5x96dd5_bhtzr4_y/T/petsc- >> mFgio7/config.setCompilers/conftest.c:3:10: fatal error: 'stdlib.h' file >> not found >> #include >> ^ >> 1 error generated. >> > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Mon Oct 16 13:05:18 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 16 Oct 2017 13:05:18 -0500 Subject: [petsc-users] Can not configure PETSc-master with clang-3.9 In-Reply-To: References: <9babb1c86e5e408582452b7a3696df88@CY1PR09MB0741.namprd09.prod.outlook.com> Message-ID: Thats weird. >From what I can recall - some tools (like pgi compilers) need this - but the xcode compilers do not. Basically xcode clang can pick up includes from the xcode specific location - but other tools look for includes in /usr/incldue And 'xcode-select --install' adds the /usr/include etc links. Satish On Mon, 16 Oct 2017, Kong, Fande wrote: > Now it is working. It turns out I need to do something like "xcode-select > --install" after upgrading OS, and of course we need to agree the license. > > > Fande, > > On Mon, Oct 16, 2017 at 10:58 AM, Richard Tran Mills > wrote: > > > Fande, > > > > Did you remember to agree to the XCode license after your upgrade, if you > > did an XCode upgrade? You have to do the license agreement again, otherwise > > the compilers don't work at all. Apologies if this seems like a silly thing > > to ask, but this has caused me a few minutes of confusion before. > > > > --Richard > > > > On Mon, Oct 16, 2017 at 9:52 AM, Jed Brown wrote: > > > >> "Kong, Fande" writes: > >> > >> > Hi All, > >> > > >> > I just upgraded MAC OS, and also updated all other related packages. 
> >> Now > >> > I can not configure PETSc-master any more. > >> > >> Your compiler paths are broken. > >> > >> /var/folders/6q/y12qpzw12dg5qx5x96dd5_bhtzr4_y/T/petsc- > >> mFgio7/config.setCompilers/conftest.c:3:10: fatal error: 'stdlib.h' file > >> not found > >> #include > >> ^ > >> 1 error generated. > >> > > > > > From balay at mcs.anl.gov Mon Oct 16 13:07:09 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 16 Oct 2017 13:07:09 -0500 Subject: [petsc-users] Can not configure PETSc-master with clang-3.9 In-Reply-To: References: <9babb1c86e5e408582452b7a3696df88@CY1PR09MB0741.namprd09.prod.outlook.com> Message-ID: BTW: Which clang are you using? mpicc -show mpicc --version Satish On Mon, 16 Oct 2017, Satish Balay wrote: > Thats weird. > > From what I can recall - some tools (like pgi compilers) need this - > but the xcode compilers do not. > > Basically xcode clang can pick up includes from the xcode specific > location - but other tools look for includes in /usr/incldue > > And 'xcode-select --install' adds the /usr/include etc links. > > Satish > > > On Mon, 16 Oct 2017, Kong, Fande wrote: > > > Now it is working. It turns out I need to do something like "xcode-select > > --install" after upgrading OS, and of course we need to agree the license. > > > > > > Fande, > > > > On Mon, Oct 16, 2017 at 10:58 AM, Richard Tran Mills > > wrote: > > > > > Fande, > > > > > > Did you remember to agree to the XCode license after your upgrade, if you > > > did an XCode upgrade? You have to do the license agreement again, otherwise > > > the compilers don't work at all. Apologies if this seems like a silly thing > > > to ask, but this has caused me a few minutes of confusion before. > > > > > > --Richard > > > > > > On Mon, Oct 16, 2017 at 9:52 AM, Jed Brown wrote: > > > > > >> "Kong, Fande" writes: > > >> > > >> > Hi All, > > >> > > > >> > I just upgraded MAC OS, and also updated all other related packages. > > >> Now > > >> > I can not configure PETSc-master any more. > > >> > > >> Your compiler paths are broken. > > >> > > >> /var/folders/6q/y12qpzw12dg5qx5x96dd5_bhtzr4_y/T/petsc- > > >> mFgio7/config.setCompilers/conftest.c:3:10: fatal error: 'stdlib.h' file > > >> not found > > >> #include > > >> ^ > > >> 1 error generated. > > >> > > > > > > > > > > From fande.kong at inl.gov Mon Oct 16 13:12:50 2017 From: fande.kong at inl.gov (Kong, Fande) Date: Mon, 16 Oct 2017 12:12:50 -0600 Subject: [petsc-users] Can not configure PETSc-master with clang-3.9 In-Reply-To: References: <9babb1c86e5e408582452b7a3696df88@CY1PR09MB0741.namprd09.prod.outlook.com> Message-ID: On Mon, Oct 16, 2017 at 12:07 PM, Satish Balay wrote: > BTW: Which clang are you using? > > mpicc -show > mpicc -show clang -Wl,-commons,use_dylibs -I/opt/moose/mpich/mpich-3.2/clang-opt/include -L/opt/moose/mpich/mpich-3.2/clang-opt/lib -lmpi -lpmpi > mpicc --version > mpicc -v mpicc for MPICH version 3.2 clang version 3.9.0 (tags/RELEASE_390/final) Target: x86_64-apple-darwin16.7.0 Thread model: posix InstalledDir: /opt/moose/llvm-3.9.0/bin clang-3.9: warning: argument unused during compilation: '-I /opt/moose/mpich/mpich-3.2/clang-opt/include' I guess because we are using a customize installation of clang. Fande, > > Satish > > On Mon, 16 Oct 2017, Satish Balay wrote: > > > Thats weird. > > > > From what I can recall - some tools (like pgi compilers) need this - > > but the xcode compilers do not. 
> > > > Basically xcode clang can pick up includes from the xcode specific > > location - but other tools look for includes in /usr/incldue > > > > And 'xcode-select --install' adds the /usr/include etc links. > > > > Satish > > > > > > On Mon, 16 Oct 2017, Kong, Fande wrote: > > > > > Now it is working. It turns out I need to do something like > "xcode-select > > > --install" after upgrading OS, and of course we need to agree the > license. > > > > > > > > > Fande, > > > > > > On Mon, Oct 16, 2017 at 10:58 AM, Richard Tran Mills > > > wrote: > > > > > > > Fande, > > > > > > > > Did you remember to agree to the XCode license after your upgrade, > if you > > > > did an XCode upgrade? You have to do the license agreement again, > otherwise > > > > the compilers don't work at all. Apologies if this seems like a > silly thing > > > > to ask, but this has caused me a few minutes of confusion before. > > > > > > > > --Richard > > > > > > > > On Mon, Oct 16, 2017 at 9:52 AM, Jed Brown wrote: > > > > > > > >> "Kong, Fande" writes: > > > >> > > > >> > Hi All, > > > >> > > > > >> > I just upgraded MAC OS, and also updated all other related > packages. > > > >> Now > > > >> > I can not configure PETSc-master any more. > > > >> > > > >> Your compiler paths are broken. > > > >> > > > >> /var/folders/6q/y12qpzw12dg5qx5x96dd5_bhtzr4_y/T/petsc- > > > >> mFgio7/config.setCompilers/conftest.c:3:10: fatal error: > 'stdlib.h' file > > > >> not found > > > >> #include > > > >> ^ > > > >> 1 error generated. > > > >> > > > > > > > > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Mon Oct 16 13:15:01 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 16 Oct 2017 13:15:01 -0500 Subject: [petsc-users] Can not configure PETSc-master with clang-3.9 In-Reply-To: References: <9babb1c86e5e408582452b7a3696df88@CY1PR09MB0741.namprd09.prod.outlook.com> Message-ID: Ok - this is not clang from apple. So I guess it needs that extra 'xcode-select --install' I don't think gcc from brew needed this. [And I don't remember if I checked clang from brew] Satish On Mon, 16 Oct 2017, Kong, Fande wrote: > On Mon, Oct 16, 2017 at 12:07 PM, Satish Balay wrote: > > > BTW: Which clang are you using? > > > > mpicc -show > > > > > > mpicc -show > > clang -Wl,-commons,use_dylibs > -I/opt/moose/mpich/mpich-3.2/clang-opt/include > -L/opt/moose/mpich/mpich-3.2/clang-opt/lib -lmpi -lpmpi > > > > > mpicc --version > > > > mpicc -v > > mpicc for MPICH version 3.2 > clang version 3.9.0 (tags/RELEASE_390/final) > Target: x86_64-apple-darwin16.7.0 > Thread model: posix > InstalledDir: /opt/moose/llvm-3.9.0/bin > clang-3.9: warning: argument unused during compilation: '-I > /opt/moose/mpich/mpich-3.2/clang-opt/include' > > > I guess because we are using a customize installation of clang. > > > Fande, > > > > > > Satish > > > > On Mon, 16 Oct 2017, Satish Balay wrote: > > > > > Thats weird. > > > > > > From what I can recall - some tools (like pgi compilers) need this - > > > but the xcode compilers do not. > > > > > > Basically xcode clang can pick up includes from the xcode specific > > > location - but other tools look for includes in /usr/incldue > > > > > > And 'xcode-select --install' adds the /usr/include etc links. > > > > > > Satish > > > > > > > > > On Mon, 16 Oct 2017, Kong, Fande wrote: > > > > > > > Now it is working. 
It turns out I need to do something like > > "xcode-select > > > > --install" after upgrading OS, and of course we need to agree the > > license. > > > > > > > > > > > > Fande, > > > > > > > > On Mon, Oct 16, 2017 at 10:58 AM, Richard Tran Mills > > > > wrote: > > > > > > > > > Fande, > > > > > > > > > > Did you remember to agree to the XCode license after your upgrade, > > if you > > > > > did an XCode upgrade? You have to do the license agreement again, > > otherwise > > > > > the compilers don't work at all. Apologies if this seems like a > > silly thing > > > > > to ask, but this has caused me a few minutes of confusion before. > > > > > > > > > > --Richard > > > > > > > > > > On Mon, Oct 16, 2017 at 9:52 AM, Jed Brown wrote: > > > > > > > > > >> "Kong, Fande" writes: > > > > >> > > > > >> > Hi All, > > > > >> > > > > > >> > I just upgraded MAC OS, and also updated all other related > > packages. > > > > >> Now > > > > >> > I can not configure PETSc-master any more. > > > > >> > > > > >> Your compiler paths are broken. > > > > >> > > > > >> /var/folders/6q/y12qpzw12dg5qx5x96dd5_bhtzr4_y/T/petsc- > > > > >> mFgio7/config.setCompilers/conftest.c:3:10: fatal error: > > 'stdlib.h' file > > > > >> not found > > > > >> #include > > > > >> ^ > > > > >> 1 error generated. > > > > >> > > > > > > > > > > > > > > > > > > > > > > > > > From michael.werner at dlr.de Tue Oct 17 03:21:27 2017 From: michael.werner at dlr.de (Michael Werner) Date: Tue, 17 Oct 2017 10:21:27 +0200 Subject: [petsc-users] Parallelizing a matrix-free code In-Reply-To: References: <71b0677c-9615-02f3-1e60-964fdbd91aee@dlr.de> Message-ID: That's something I'm still struggling with. In the serial case, I can simply extract the values from the original grid, and since the ordering of the Jacobian is the same there is no problem. In the parallel case this is still a more or less open question. That's why I thought about reordering the Jacobian. As long as the position of the individual IDs is the same for both, I don't have to care about their absolute position. I also wanted to thank you for your previous answer, it seems that the application ordering might be what I'm looking for. However, in the meantime I stumbled about another problem, that I have to solve first. My new problem is, that I call the external code within the shell matrix' multiply call. But in a parallel case, this call obviously gets called once per process. So right now I'm trying to circumvent this, so it might take a while before I'm able to come back to the original problem... Kind regards, Michael Am 16.10.2017 um 17:25 schrieb Praveen C: > I am interested to learn more about how this works. How are the > vectors created if the ids are not contiguous in a partition ? > > Thanks > praveen > > On Mon, Oct 16, 2017 at 2:02 PM, Stefano Zampini > > wrote: > > > > 2017-10-16 10:26 GMT+03:00 Michael Werner >: > > Hello, > > I'm having trouble with parallelizing a matrix-free code with > PETSc. In this code, I use an external CFD code to provide the > matrix-vector product for an iterative solver in PETSc. To > increase convergence rate, I'm using an explicitly stored > Jacobian matrix to precondition the solver. This works fine > for serial runs. However, when I try to use multiple > processes, I face the problem that PETSc decomposes the > preconditioner matrix, and probably also the shell matrix, in > a different way than the external CFD code decomposes the grid. 
> > The Jacobian matrix is built in a way that its rows and > columns correspond to the global IDs of the individual points > in my CFD mesh. > > The CFD code decomposes the domain based on the proximity of > points to each other, so that the resulting subgrids are > coherent. However, since it's an unstructured grid, those > subgrids are not necessarily made up of points with successive > global IDs. This is a problem, since PETSc seems to partition > the matrix in coherent slices. > > I'm not sure what the best approach to this problem might be. > Is it maybe possible to exactly tell PETSc which rows/columns > it should assign to the individual processes? > > > If you are explicitly setting the values in your Jacobians via > MatSetValues(), you can create an ISLocalToGlobalMapping > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/IS/ISLocalToGlobalMappingCreate.html > > > that maps the numbering you use for the Jacobians to their > counterpart in the CFD ordering, then call > MatSetLocalToGlobalMapping and then use MatSetValuesLocal with the > same arguments you are calling MatSetValues now. > > Otherwise, you can play with the application ordering > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/AO/index.html > > > > > -- > Stefano > > -- ____________________________________________________ Deutsches Zentrum für Luft- und Raumfahrt e.V. (DLR) Institut für Aerodynamik und Strömungstechnik | Bunsenstr. 10 | 37073 Göttingen Michael Werner Telefon 0551 709-2627 | Telefax 0551 709-2811 | Michael.Werner at dlr.de DLR.de -------------- next part -------------- An HTML attachment was scrubbed... URL:
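A minimal sketch of the mapping approach Stefano suggests above. Every name and size here (nlocal, cfd_ids, i, j, v) is an illustrative assumption standing in for data from the CFD code's own partitioning; this is not code from the thread:

/* Assumed: cfd_ids[0..nlocal-1] holds the global ID, in the CFD code's
   own numbering, of each grid point owned by this rank. */
Mat                    J;
ISLocalToGlobalMapping l2g;
PetscErrorCode         ierr;
PetscInt               nlocal;     /* number of locally owned points   */
const PetscInt         *cfd_ids;   /* their IDs in the CFD ordering    */
PetscInt               i, j;       /* local indices, 0..nlocal-1       */
PetscScalar            v;          /* one entry from the CFD assembly  */

ierr = MatCreate(PETSC_COMM_WORLD,&J);CHKERRQ(ierr);
ierr = MatSetSizes(J,nlocal,nlocal,PETSC_DETERMINE,PETSC_DETERMINE);CHKERRQ(ierr);
ierr = MatSetFromOptions(J);CHKERRQ(ierr);
ierr = MatSetUp(J);CHKERRQ(ierr);
ierr = ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD,1,nlocal,cfd_ids,PETSC_COPY_VALUES,&l2g);CHKERRQ(ierr);
ierr = MatSetLocalToGlobalMapping(J,l2g,l2g);CHKERRQ(ierr);
/* Entries are now inserted with local indices and PETSc translates
   them through l2g, so the CFD numbering need not be contiguous. */
ierr = MatSetValuesLocal(J,1,&i,1,&j,&v,INSERT_VALUES);CHKERRQ(ierr);
ierr = MatAssemblyBegin(J,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
ierr = MatAssemblyEnd(J,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

The application-ordering alternative mentioned above would instead build an AO from the two numberings with AOCreateBasic() and translate global indices through AOApplicationToPetsc() before calling MatSetValues().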
From knepley at gmail.com Tue Oct 17 04:11:06 2017 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 17 Oct 2017 05:11:06 -0400 Subject: [petsc-users] Parallelizing a matrix-free code In-Reply-To: References: <71b0677c-9615-02f3-1e60-964fdbd91aee@dlr.de> Message-ID: On Tue, Oct 17, 2017 at 4:21 AM, Michael Werner wrote: > That's something I'm still struggling with. In the serial case, I can > simply extract the values from the original grid, and since the ordering of > the Jacobian is the same there is no problem. In the parallel case this is > still a more or less open question. That's why I thought about reordering > the Jacobian. As long as the position of the individual IDs is the same for > both, I don't have to care about their absolute position. > > I also wanted to thank you for your previous answer; it seems that the > application ordering might be what I'm looking for. However, in the > meantime I stumbled upon another problem that I have to solve first. My > new problem is that I call the external code within the shell matrix' > multiply call. But in a parallel case, this call obviously gets called once > per process. So right now I'm trying to circumvent this, so it might take a > while before I'm able to come back to the original problem... > I am not understanding. Is your original code parallel? Thanks, Matt > Kind regards, > Michael > > Am 16.10.2017 um 17:25 schrieb Praveen C: > > I am interested to learn more about how this works. How are the vectors > created if the ids are not contiguous in a partition ? > > Thanks > praveen > > On Mon, Oct 16, 2017 at 2:02 PM, Stefano Zampini < > stefano.zampini at gmail.com> wrote: > >> >> >> 2017-10-16 10:26 GMT+03:00 Michael Werner : >> >>> Hello, >>> >>> I'm having trouble with parallelizing a matrix-free code with PETSc. In >>> this code, I use an external CFD code to provide the matrix-vector product >>> for an iterative solver in PETSc. To increase convergence rate, I'm using >>> an explicitly stored Jacobian matrix to precondition the solver. This works >>> fine for serial runs. However, when I try to use multiple processes, I face >>> the problem that PETSc decomposes the preconditioner matrix, and probably >>> also the shell matrix, in a different way than the external CFD code >>> decomposes the grid. >>> >>> The Jacobian matrix is built in a way that its rows and columns >>> correspond to the global IDs of the individual points in my CFD mesh. >>> >>> The CFD code decomposes the domain based on the proximity of points to >>> each other, so that the resulting subgrids are coherent. However, since it's >>> an unstructured grid, those subgrids are not necessarily made up of points >>> with successive global IDs. This is a problem, since PETSc seems to >>> partition the matrix in coherent slices. >>> >>> I'm not sure what the best approach to this problem might be. Is it >>> maybe possible to exactly tell PETSc which rows/columns it should assign >>> to the individual processes? >>> >>> >> If you are explicitly setting the values in your Jacobians via >> MatSetValues(), you can create an ISLocalToGlobalMapping >> >> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/ >> IS/ISLocalToGlobalMappingCreate.html >> >> that maps the numbering you use for the Jacobians to their counterpart in >> the CFD ordering, then call MatSetLocalToGlobalMapping and then use >> MatSetValuesLocal with the same arguments you are calling MatSetValues now. >> >> Otherwise, you can play with the application ordering >> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/AO/index.html >> >> >> >> >> -- >> Stefano >> > > > -- > > ____________________________________________________ > > Deutsches Zentrum für Luft- und Raumfahrt e.V. (DLR) > Institut für Aerodynamik und Strömungstechnik | Bunsenstr. 10 | 37073 Göttingen > > Michael Werner > Telefon 0551 709-2627 | Telefax 0551 709-2811 | Michael.Werner at dlr.de > DLR.de > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From michael.werner at dlr.de Tue Oct 17 04:46:44 2017 From: michael.werner at dlr.de (Michael Werner) Date: Tue, 17 Oct 2017 11:46:44 +0200 Subject: [petsc-users] Parallelizing a matrix-free code In-Reply-To: References: <71b0677c-9615-02f3-1e60-964fdbd91aee@dlr.de> Message-ID: <77cc6a05-18fe-fff1-31a9-ea48fa83001c@dlr.de> I'm not sure what you mean with this question? The external CFD code, if that was what you referred to, can be run in parallel. Am 17.10.2017 um 11:11 schrieb Matthew Knepley: > On Tue, Oct 17, 2017 at 4:21 AM, Michael Werner > wrote: > > That's something I'm still struggling with.
> > I also wanted to thank you for your previous answer, it seems that > the application ordering might be what I'm looking for. However, > in the meantime I stumbled about another problem, that I have to > solve first. My new problem is, that I call the external code > within the shell matrix' multiply call. But in a parallel case, > this call obviously gets called once per process. So right now I'm > trying to circumvent this, so it might take a while before I'm > able to come back to the original problem... > > > I am not understanding. Is your original code parallel? > > ? Thanks, > > ? ? ?Matt > > Kind regards, > Michael > > Am 16.10.2017 um 17:25 schrieb Praveen C: >> I am interested to learn more about how this works. How are the >> vectors created if the ids are not contiguous in a partition ? >> >> Thanks >> praveen >> >> On Mon, Oct 16, 2017 at 2:02 PM, Stefano Zampini >> > wrote: >> >> >> >> 2017-10-16 10:26 GMT+03:00 Michael Werner >> >: >> >> Hello, >> >> I'm having trouble with parallelizing a matrix-free code >> with PETSc. In this code, I use an external CFD code to >> provide the matrix-vector product for an iterative solver >> in PETSc. To increase convergence rate, I'm using an >> explicitly stored Jacobian matrix to precondition the >> solver. This works fine for serial runs. However, when I >> try to use multiple processes, I face the problem that >> PETSc decomposes the preconditioner matrix, and probably >> also the shell matrix, in a different way than the >> external CFD code decomposes the grid. >> >> The Jacobian matrix is built in a way, that its rows and >> columns correspond to the global IDs of the individual >> points in my CFD mesh >> >> The CFD code decomposes the domain based on the proximity >> of points to each other, so that the resulting subgrids >> are coherent. However, since its an unstructured grid, >> those subgrids are not necessarily made up of points with >> successive global IDs. This is a problem, since PETSc >> seems to partition the matrix in? coherent slices. >> >> I'm not sure what the best approach to this problem might >> be. Is it maybe possible to exactly tell PETSc, which >> rows/columns it should assign to the individual processes? >> >> >> If you are explicitly setting the values in your Jacobians >> via MatSetValues(), you can create a ISLocalToGlobalMapping >> >> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/IS/ISLocalToGlobalMappingCreate.html >> >> >> that maps the numbering you use for the Jacobians to their >> counterpart in the CFD ordering, then call >> MatSetLocalToGlobalMapping and then use MatSetValuesLocal >> with the same arguments you are calling MatSetValues now. >> >> Otherwise, you can play with the application ordering >> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/AO/index.html >> >> >> >> >> -- >> Stefano >> >> > > -- > > ____________________________________________________ > > Deutsches Zentrum f?r Luft- und Raumfahrt e.V. (DLR) > Institut f?r Aerodynamik und Str?mungstechnik |Bunsenstr. 10 | 37073 G?ttingen > > > Michael Werner > Telefon 0551 709-2627 | Telefax 0551 709-2811 |Michael.Werner at dlr.de > DLR.de > > > > > > > > > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ -- ____________________________________________________ Deutsches Zentrum f?r Luft- und Raumfahrt e.V. 
(DLR) Institut f?r Aerodynamik und Str?mungstechnik | Bunsenstr. 10 | 37073 G?ttingen Michael Werner Telefon 0551 709-2627 | Telefax 0551 709-2811 | Michael.Werner at dlr.de DLR.de -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Oct 17 04:50:58 2017 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 17 Oct 2017 05:50:58 -0400 Subject: [petsc-users] Parallelizing a matrix-free code In-Reply-To: <77cc6a05-18fe-fff1-31a9-ea48fa83001c@dlr.de> References: <71b0677c-9615-02f3-1e60-964fdbd91aee@dlr.de> <77cc6a05-18fe-fff1-31a9-ea48fa83001c@dlr.de> Message-ID: On Tue, Oct 17, 2017 at 5:46 AM, Michael Werner wrote: > I'm not sure what you mean with this question? > The external CFD code, if that was what you referred to, can be run in > parallel. > Then why is it a problem that "in a parallel case, this call obviously gets called once per process"? Matt > Am 17.10.2017 um 11:11 schrieb Matthew Knepley: > > On Tue, Oct 17, 2017 at 4:21 AM, Michael Werner > wrote: > >> That's something I'm still struggling with. In the serial case, I can >> simply extract the values from the original grid, and since the ordering of >> the Jacobian is the same there is no problem. In the parallel case this is >> still a more or less open question. That's why I thought about reordering >> the Jacobian. As long as the position of the individual IDs is the same for >> both, I don't have to care about their absolute position. >> >> I also wanted to thank you for your previous answer, it seems that the >> application ordering might be what I'm looking for. However, in the >> meantime I stumbled about another problem, that I have to solve first. My >> new problem is, that I call the external code within the shell matrix' >> multiply call. But in a parallel case, this call obviously gets called once >> per process. So right now I'm trying to circumvent this, so it might take a >> while before I'm able to come back to the original problem... >> > > I am not understanding. Is your original code parallel? > > Thanks, > > Matt > > >> Kind regards, >> Michael >> >> Am 16.10.2017 um 17:25 schrieb Praveen C: >> >> I am interested to learn more about how this works. How are the vectors >> created if the ids are not contiguous in a partition ? >> >> Thanks >> praveen >> >> On Mon, Oct 16, 2017 at 2:02 PM, Stefano Zampini < >> stefano.zampini at gmail.com> wrote: >> >>> >>> >>> 2017-10-16 10:26 GMT+03:00 Michael Werner : >>> >>>> Hello, >>>> >>>> I'm having trouble with parallelizing a matrix-free code with PETSc. In >>>> this code, I use an external CFD code to provide the matrix-vector product >>>> for an iterative solver in PETSc. To increase convergence rate, I'm using >>>> an explicitly stored Jacobian matrix to precondition the solver. This works >>>> fine for serial runs. However, when I try to use multiple processes, I face >>>> the problem that PETSc decomposes the preconditioner matrix, and probably >>>> also the shell matrix, in a different way than the external CFD code >>>> decomposes the grid. >>>> >>>> The Jacobian matrix is built in a way, that its rows and columns >>>> correspond to the global IDs of the individual points in my CFD mesh >>>> >>>> The CFD code decomposes the domain based on the proximity of points to >>>> each other, so that the resulting subgrids are coherent. However, since its >>>> an unstructured grid, those subgrids are not necessarily made up of points >>>> with successive global IDs. 
This is a problem, since PETSc seems to >>>> partition the matrix in coherent slices. >>>> >>>> I'm not sure what the best approach to this problem might be. Is it >>>> maybe possible to exactly tell PETSc, which rows/columns it should assign >>>> to the individual processes? >>>> >>>> >>> If you are explicitly setting the values in your Jacobians via >>> MatSetValues(), you can create a ISLocalToGlobalMapping >>> >>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/ >>> IS/ISLocalToGlobalMappingCreate.html >>> >>> that maps the numbering you use for the Jacobians to their counterpart >>> in the CFD ordering, then call MatSetLocalToGlobalMapping and then use >>> MatSetValuesLocal with the same arguments you are calling MatSetValues now. >>> >>> Otherwise, you can play with the application ordering >>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/ >>> AO/index.html >>> >>> >>> >>> >>> -- >>> Stefano >>> >> >> >> -- >> >> ____________________________________________________ >> >> Deutsches Zentrum f?r Luft- und Raumfahrt e.V. (DLR) >> Institut f?r Aerodynamik und Str?mungstechnik | Bunsenstr. 10 | 37073 G?ttingen >> >> Michael Werner >> Telefon 0551 709-2627 | Telefax 0551 709-2811 | Michael.Werner at dlr.de >> DLR.de >> >> >> >> >> >> >> >> >> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > -- > > ____________________________________________________ > > Deutsches Zentrum f?r Luft- und Raumfahrt e.V. (DLR) > Institut f?r Aerodynamik und Str?mungstechnik | Bunsenstr. 10 | 37073 G?ttingen > > Michael Werner > Telefon 0551 709-2627 | Telefax 0551 709-2811 | Michael.Werner at dlr.de > DLR.de > > > > > > > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From michael.werner at dlr.de Tue Oct 17 05:08:16 2017 From: michael.werner at dlr.de (Michael Werner) Date: Tue, 17 Oct 2017 12:08:16 +0200 Subject: [petsc-users] Parallelizing a matrix-free code In-Reply-To: References: <71b0677c-9615-02f3-1e60-964fdbd91aee@dlr.de> <77cc6a05-18fe-fff1-31a9-ea48fa83001c@dlr.de> Message-ID: <66de01c7-a335-76de-8a65-faca0d677d1c@dlr.de> Because usally this code is called just once. It runs one multiple processes, but there it's still always processing the whole domain. I can't run it on only one subdomain. As I understand it now, when I call it from PETSc, this call is issued once per process, so I would end up running several contesting instances of the computation on the whole domain. But maybe that's only because I haven't completly understood how MPI really works in such cases... Kind regards, Michael Am 17.10.2017 um 11:50 schrieb Matthew Knepley: > On Tue, Oct 17, 2017 at 5:46 AM, Michael Werner > wrote: > > I'm not sure what you mean with this question? > The external CFD code, if that was what you referred to, can be > run in parallel. > > > Then why is it a problem that "in a parallel case, this call obviously > gets called once per process"? > > ? ?Matt > > Am 17.10.2017 um 11:11 schrieb Matthew Knepley: >> On Tue, Oct 17, 2017 at 4:21 AM, Michael Werner >> > wrote: >> >> That's something I'm still struggling with. 
In the serial >> case, I can simply extract the values from the original grid, >> and since the ordering of the Jacobian is the same there is >> no problem. In the parallel case this is still a more or less >> open question. That's why I thought about reordering the >> Jacobian. As long as the position of the individual IDs is >> the same for both, I don't have to care about their absolute >> position. >> >> I also wanted to thank you for your previous answer, it seems >> that the application ordering might be what I'm looking for. >> However, in the meantime I stumbled about another problem, >> that I have to solve first. My new problem is, that I call >> the external code within the shell matrix' multiply call. But >> in a parallel case, this call obviously gets called once per >> process. So right now I'm trying to circumvent this, so it >> might take a while before I'm able to come back to the >> original problem... >> >> >> I am not understanding. Is your original code parallel? >> >> ? Thanks, >> >> ? ? ?Matt >> >> Kind regards, >> Michael >> >> Am 16.10.2017 um 17:25 schrieb Praveen C: >>> I am interested to learn more about how this works. How are >>> the vectors created if the ids are not contiguous in a >>> partition ? >>> >>> Thanks >>> praveen >>> >>> On Mon, Oct 16, 2017 at 2:02 PM, Stefano Zampini >>> >> > wrote: >>> >>> >>> >>> 2017-10-16 10:26 GMT+03:00 Michael Werner >>> >: >>> >>> Hello, >>> >>> I'm having trouble with parallelizing a matrix-free >>> code with PETSc. In this code, I use an external CFD >>> code to provide the matrix-vector product for an >>> iterative solver in PETSc. To increase convergence >>> rate, I'm using an explicitly stored Jacobian matrix >>> to precondition the solver. This works fine for >>> serial runs. However, when I try to use multiple >>> processes, I face the problem that PETSc decomposes >>> the preconditioner matrix, and probably also the >>> shell matrix, in a different way than the external >>> CFD code decomposes the grid. >>> >>> The Jacobian matrix is built in a way, that its rows >>> and columns correspond to the global IDs of the >>> individual points in my CFD mesh >>> >>> The CFD code decomposes the domain based on the >>> proximity of points to each other, so that the >>> resulting subgrids are coherent. However, since its >>> an unstructured grid, those subgrids are not >>> necessarily made up of points with successive global >>> IDs. This is a problem, since PETSc seems to >>> partition the matrix in? coherent slices. >>> >>> I'm not sure what the best approach to this problem >>> might be. Is it maybe possible to exactly tell >>> PETSc, which rows/columns it should assign to the >>> individual processes? >>> >>> >>> If you are explicitly setting the values in your >>> Jacobians via MatSetValues(), you can create a >>> ISLocalToGlobalMapping >>> >>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/IS/ISLocalToGlobalMappingCreate.html >>> >>> >>> that maps the numbering you use for the Jacobians to >>> their counterpart in the CFD ordering, then call >>> MatSetLocalToGlobalMapping and then use >>> MatSetValuesLocal with the same arguments you are >>> calling MatSetValues now. >>> >>> Otherwise, you can play with the application ordering >>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/AO/index.html >>> >>> >>> >>> >>> -- >>> Stefano >>> >>> >> >> -- >> >> ____________________________________________________ >> >> Deutsches Zentrum f?r Luft- und Raumfahrt e.V. 
(DLR) >> Institut f?r Aerodynamik und Str?mungstechnik |Bunsenstr. 10 | 37073 G?ttingen >> >> >> Michael Werner >> Telefon 0551 709-2627 | Telefax 0551 709-2811 |Michael.Werner at dlr.de >> DLR.de >> >> >> >> >> >> >> >> >> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to >> which their experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> > > -- > > ____________________________________________________ > > Deutsches Zentrum f?r Luft- und Raumfahrt e.V. (DLR) > Institut f?r Aerodynamik und Str?mungstechnik |Bunsenstr. 10 | 37073 G?ttingen > > > Michael Werner > Telefon 0551 709-2627 | Telefax 0551 709-2811 |Michael.Werner at dlr.de > DLR.de > > > > > > > > > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ -- ____________________________________________________ Deutsches Zentrum f?r Luft- und Raumfahrt e.V. (DLR) Institut f?r Aerodynamik und Str?mungstechnik | Bunsenstr. 10 | 37073 G?ttingen Michael Werner Telefon 0551 709-2627 | Telefax 0551 709-2811 | Michael.Werner at dlr.de DLR.de -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Oct 17 05:31:51 2017 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 17 Oct 2017 06:31:51 -0400 Subject: [petsc-users] Parallelizing a matrix-free code In-Reply-To: <66de01c7-a335-76de-8a65-faca0d677d1c@dlr.de> References: <71b0677c-9615-02f3-1e60-964fdbd91aee@dlr.de> <77cc6a05-18fe-fff1-31a9-ea48fa83001c@dlr.de> <66de01c7-a335-76de-8a65-faca0d677d1c@dlr.de> Message-ID: On Tue, Oct 17, 2017 at 6:08 AM, Michael Werner wrote: > Because usally this code is called just once. It runs one multiple > processes, but there it's still always processing the whole domain. I can't > run it on only one subdomain. As I understand it now, when I call it from > PETSc, this call is issued once per process, so I would end up running > several contesting instances of the computation on the whole domain. > > But maybe that's only because I haven't completly understood how MPI > really works in such cases... > No, it makes one call in which all processes participate. So you would call your external CFD routine once from all processes, passing in the MPI communicator. Matt > Kind regards, > Michael > > Am 17.10.2017 um 11:50 schrieb Matthew Knepley: > > On Tue, Oct 17, 2017 at 5:46 AM, Michael Werner > wrote: > >> I'm not sure what you mean with this question? >> The external CFD code, if that was what you referred to, can be run in >> parallel. >> > > Then why is it a problem that "in a parallel case, this call obviously > gets called once per process"? > > Matt > > >> Am 17.10.2017 um 11:11 schrieb Matthew Knepley: >> >> On Tue, Oct 17, 2017 at 4:21 AM, Michael Werner >> wrote: >> >>> That's something I'm still struggling with. In the serial case, I can >>> simply extract the values from the original grid, and since the ordering of >>> the Jacobian is the same there is no problem. In the parallel case this is >>> still a more or less open question. That's why I thought about reordering >>> the Jacobian. As long as the position of the individual IDs is the same for >>> both, I don't have to care about their absolute position. 
>>> >>> I also wanted to thank you for your previous answer, it seems that the >>> application ordering might be what I'm looking for. However, in the >>> meantime I stumbled about another problem, that I have to solve first. My >>> new problem is, that I call the external code within the shell matrix' >>> multiply call. But in a parallel case, this call obviously gets called once >>> per process. So right now I'm trying to circumvent this, so it might take a >>> while before I'm able to come back to the original problem... >>> >> >> I am not understanding. Is your original code parallel? >> >> Thanks, >> >> Matt >> >> >>> Kind regards, >>> Michael >>> >>> Am 16.10.2017 um 17:25 schrieb Praveen C: >>> >>> I am interested to learn more about how this works. How are the vectors >>> created if the ids are not contiguous in a partition ? >>> >>> Thanks >>> praveen >>> >>> On Mon, Oct 16, 2017 at 2:02 PM, Stefano Zampini < >>> stefano.zampini at gmail.com> wrote: >>> >>>> >>>> >>>> 2017-10-16 10:26 GMT+03:00 Michael Werner : >>>> >>>>> Hello, >>>>> >>>>> I'm having trouble with parallelizing a matrix-free code with PETSc. >>>>> In this code, I use an external CFD code to provide the matrix-vector >>>>> product for an iterative solver in PETSc. To increase convergence rate, I'm >>>>> using an explicitly stored Jacobian matrix to precondition the solver. This >>>>> works fine for serial runs. However, when I try to use multiple processes, >>>>> I face the problem that PETSc decomposes the preconditioner matrix, and >>>>> probably also the shell matrix, in a different way than the external CFD >>>>> code decomposes the grid. >>>>> >>>>> The Jacobian matrix is built in a way, that its rows and columns >>>>> correspond to the global IDs of the individual points in my CFD mesh >>>>> >>>>> The CFD code decomposes the domain based on the proximity of points to >>>>> each other, so that the resulting subgrids are coherent. However, since its >>>>> an unstructured grid, those subgrids are not necessarily made up of points >>>>> with successive global IDs. This is a problem, since PETSc seems to >>>>> partition the matrix in coherent slices. >>>>> >>>>> I'm not sure what the best approach to this problem might be. Is it >>>>> maybe possible to exactly tell PETSc, which rows/columns it should assign >>>>> to the individual processes? >>>>> >>>>> >>>> If you are explicitly setting the values in your Jacobians via >>>> MatSetValues(), you can create a ISLocalToGlobalMapping >>>> >>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/ >>>> IS/ISLocalToGlobalMappingCreate.html >>>> >>>> that maps the numbering you use for the Jacobians to their counterpart >>>> in the CFD ordering, then call MatSetLocalToGlobalMapping and then use >>>> MatSetValuesLocal with the same arguments you are calling MatSetValues now. >>>> >>>> Otherwise, you can play with the application ordering >>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/ >>>> AO/index.html >>>> >>>> >>>> >>>> >>>> -- >>>> Stefano >>>> >>> >>> >>> -- >>> >>> ____________________________________________________ >>> >>> Deutsches Zentrum f?r Luft- und Raumfahrt e.V. (DLR) >>> Institut f?r Aerodynamik und Str?mungstechnik | Bunsenstr. 
10 | 37073 G?ttingen >>> >>> Michael Werner >>> Telefon 0551 709-2627 | Telefax 0551 709-2811 | Michael.Werner at dlr.de >>> DLR.de >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> >> -- >> >> ____________________________________________________ >> >> Deutsches Zentrum f?r Luft- und Raumfahrt e.V. (DLR) >> Institut f?r Aerodynamik und Str?mungstechnik | Bunsenstr. 10 | 37073 G?ttingen >> >> Michael Werner >> Telefon 0551 709-2627 | Telefax 0551 709-2811 | Michael.Werner at dlr.de >> DLR.de >> >> >> >> >> >> >> >> >> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > -- > > ____________________________________________________ > > Deutsches Zentrum f?r Luft- und Raumfahrt e.V. (DLR) > Institut f?r Aerodynamik und Str?mungstechnik | Bunsenstr. 10 | 37073 G?ttingen > > Michael Werner > Telefon 0551 709-2627 | Telefax 0551 709-2811 | Michael.Werner at dlr.de > DLR.de > > > > > > > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Tue Oct 17 11:55:06 2017 From: jed at jedbrown.org (Jed Brown) Date: Tue, 17 Oct 2017 10:55:06 -0600 Subject: [petsc-users] Issue with -log_view In-Reply-To: References: <1670287803.1693376.1507966300567.ref@mail.yahoo.com> <1670287803.1693376.1507966300567@mail.yahoo.com> Message-ID: <87lgk9af4l.fsf@jedbrown.org> Thanks for the test case. Fixed here (now in 'next' and will merge to 'maint' tomorrow). https://bitbucket.org/petsc/petsc/commits/aa139df67e726a84ab5cd3b5e98c800722a9f20a Stefano Zampini writes: > cutting and paste a message I sent a couple of days ago on the mailing > list. I suspect you have a memory leak on some PetscViewer object. Try > running with -malloc -malloc_dump -malloc_debug and without -log_view and > see if you PETSc reports a memory leak. You can also try running under > valgrind with the --leak-check=full option > > ---------------------------------------------------------------------------------------------- > Instead of reporting a leak, the below code, when run with -log_view, > triggers an error > > #include > > int main(int argc,char **args) > { > PetscErrorCode ierr; > PetscViewer view; > > ierr = PetscInitialize(&argc,&args,(char*)0,help);CHKERRQ(ierr); Maybe you missed the first line of your file because "help" wasn't in this example. > ierr = PetscViewerASCIIGetStdout(PETSC_COMM_WORLD,&view);CHKERRQ(ierr); > ierr = PetscViewerCreate(PETSC_COMM_WORLD,&view);CHKERRQ(ierr); > ierr = PetscFinalize(); > return ierr; > } > > 0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > [0]PETSC ERROR: Corrupt argument: http://www.mcs.anl.gov/petsc/ > documentation/faq.html#valgrind > [0]PETSC ERROR: Invalid type of object: Parameter # 1 > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for > trouble shooting. 
> [0]PETSC ERROR: Petsc Development GIT revision: v3.7.6-4792-gbbfd41f GIT > Date: 2017-07-30 13:35:30 +0300 > [0]PETSC ERROR: ./ex1 on a arch-debug named localhost.localdomain by > szampini Thu Oct 12 15:24:19 2017 > [0]PETSC ERROR: Configure options --download-chaco --download-ctetgen > --download-hypre --download-metis --download-mumps --download-p4est > --download-parmetis --download-suitesparse --download-triangle > --with-scalapack CFLAGS="-Wall -g -O0" CXXFLAGS="-Wall -g -O0" FCFLAGS="-g > -O0" PETSC_ARCH=arch-debug > [0]PETSC ERROR: #1 PetscObjectReference() line 510 in > /home/szampini/src/petsc/src/sys/objects/inherit.c > [0]PETSC ERROR: #2 PetscOptionsGetViewer() line 259 in > /home/szampini/src/petsc/src/sys/classes/viewer/interface/viewreg.c > [0]PETSC ERROR: #3 PetscLogViewFromOptions() line 1753 in > /home/szampini/src/petsc/src/sys/logging/plog.c > [0]PETSC ERROR: #4 PetscFinalize() line 1227 in > /home/szampini/src/petsc/src/sys/objects/pinit.c > > The problem is with the MPIAttribute Petsc_Viewer_Stdout_keyval attached to > PETSC_COMM_WORLD. PETSC_VIEWER_STDOUT_WORLD gets destroyed in the first > call to PetscObjectRegisterDestroyAll(); Then PetscLogViewFromOptions() > call PetscViewerASCIIGetStdout that checks for the presence of the > attribute on the communicator, which is still there, since we never called > MPI_Comm_free on that communicator. > > What would be a solution for this issue? At least, we should print a nice > error message in PetscViewerASCIIGetStdout. > > 2017-10-14 16:10 GMT+03:00 Barry Smith : > >> >> Please cut and paste all the output and send it to >> petsc-maint at mcs.anl.gov >> >> Barry >> >> > On Oct 14, 2017, at 2:31 AM, Tina Patel wrote: >> > >> > Hi, >> > >> > I'm using -log_view option from the command line, but it gives me >> "corrupt argument" and "invalid argument". However, PETSc doesn't throw >> errors when running without -log_view. >> > Am I using it correctly? Or does this hint at another problem? I'm using >> petsc-master 3.7.6. >> > >> > Thanks for your time, >> > Tina >> >> > > > -- > Stefano From hittinger1 at llnl.gov Tue Oct 17 13:15:51 2017 From: hittinger1 at llnl.gov (Hittinger, Jeffrey A. F.) Date: Tue, 17 Oct 2017 18:15:51 +0000 Subject: [petsc-users] High-dimensional DMDA Message-ID: Quick question: is there a version of the DMDA structured grid interface that supports dimensions higher than 3? j- -.-- -.-- --.. Jeffrey A. F. Hittinger Center for Applied Scientific Computing Lawrence Livermore National Laboratory Office: (925) 422-0993 FAX: (925) 423-2993 -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Tue Oct 17 13:32:23 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 17 Oct 2017 13:32:23 -0500 Subject: [petsc-users] High-dimensional DMDA In-Reply-To: References: Message-ID: <76C825CB-47A2-434E-8713-E2647537C29B@mcs.anl.gov> No and it is highly unlikely to appear (the 3d code is already too complicated and we tried to write a dimension independent version but that failed) But note that by using a dof argument > 1 one can handle some "4d" problems so long as one does no parallelize in the 4th dimension. Barry > On Oct 17, 2017, at 1:15 PM, Hittinger, Jeffrey A. F. wrote: > > Quick question: is there a version of the DMDA structured grid interface that supports dimensions higher than 3? > > j- > -.-- -.-- --.. > Jeffrey A. F. 
Hittinger Center for Applied Scientific Computing Lawrence Livermore National Laboratory Office: (925) 422-0993 FAX: (925) 423-2993 -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Tue Oct 17 13:32:23 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 17 Oct 2017 13:32:23 -0500 Subject: [petsc-users] High-dimensional DMDA In-Reply-To: References: Message-ID: <76C825CB-47A2-434E-8713-E2647537C29B@mcs.anl.gov> No, and it is highly unlikely to appear (the 3d code is already too complicated, and we tried to write a dimension-independent version but that failed). But note that by using a dof argument > 1 one can handle some "4d" problems so long as one does not parallelize in the 4th dimension. Barry
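A rough illustration of the dof trick Barry describes (all sizes are invented; the 4th direction of extent n4 is folded into the dof slot, so it lives entirely on-process and is never parallelized):

/* A "4d" 32x32x32 x n4 grid expressed as a 3d DMDA with n4 unknowns
   per grid point. Every size here is hypothetical. */
DM             da;
PetscErrorCode ierr;
PetscInt       n4 = 16;   /* extent of the unparallelized 4th dimension */

ierr = DMDACreate3d(PETSC_COMM_WORLD,
                    DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,
                    DMDA_STENCIL_STAR,
                    32,32,32,                            /* global grid  */
                    PETSC_DECIDE,PETSC_DECIDE,PETSC_DECIDE,
                    n4,                                  /* dof per point */
                    1,NULL,NULL,NULL,&da);CHKERRQ(ierr);
ierr = DMSetUp(da);CHKERRQ(ierr);   /* required in recent PETSc releases */

> On Oct 17, 2017, at 1:15 PM, Hittinger, Jeffrey A. F. wrote: > > Quick question: is there a version of the DMDA structured grid interface that supports dimensions higher than 3? > > j- > -.-- -.-- --.. > Jeffrey A. F.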
Hittinger > Center for Applied Scientific Computing > Lawrence Livermore National Laboratory > Office: (925) 422-0993 > FAX: (925) 423-2993 -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Oct 17 13:54:50 2017 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 17 Oct 2017 14:54:50 -0400 Subject: [petsc-users] High-dimensional DMDA In-Reply-To: <2C626E39-6FF8-4619-85B1-CB78D141969C@llnl.gov> References: <76C825CB-47A2-434E-8713-E2647537C29B@mcs.anl.gov> <2C626E39-6FF8-4619-85B1-CB78D141969C@llnl.gov> Message-ID: On Tue, Oct 17, 2017 at 2:51 PM, Hittinger, Jeffrey A. F. < hittinger1 at llnl.gov> wrote: > Bummer. > > > > Matt - Never is a very strong word. Don?t underestimate the power of > mappings and/or AMR. Also, sparse grid techniques haven?t (yet) proven to > be particularly useful for kinetic problems. > None of the above is wrong, but I really meant "regular grids in high dimension". DMDA will never do AMR or sparse grids since its designed to be the simplest thing possible. For AMR we are using p4est, and that could definitely work in higher dimensions. I have read the Irene Gamba stuff on Boltzmann Transport using higher-D regular grids, but its just tremendously expensive, and they really play up spectral convergence, which relies on regularity which is often not there in practical problems. Thanks, Matt > > > Thanks for the quick response. > > > > j- > > -.-- -.-- --.. > > Jeffrey A. F. Hittinger > > Center for Applied Scientific Computing > > Lawrence Livermore National Laboratory > > Office: (925) 422-0993 > > FAX: (925) 423-2993 > > > > *From: *Matthew Knepley > *Date: *Tuesday, October 17, 2017 at 11:42 AM > *To: *Barry Smith > *Cc: *Undisclosed recipients , " > petsc-users at mcs.anl.gov" > *Subject: *Re: [petsc-users] High-dimensional DMDA > > > > On Tue, Oct 17, 2017 at 2:32 PM, Barry Smith wrote: > > > No and it is highly unlikely to appear (the 3d code is already too > complicated and we tried to write a dimension independent version but that > failed) > > But note that by using a dof argument > 1 one can handle some "4d" > problems so long as one does no parallelize in the 4th dimension. > > > > History: We did have a version for arbitrary dimension called ADDA written > by a student of David Keyes, that exists > > in the bowels of Git. We are unlikely to replicate it because regular > grids in high dimension never seem like the right > > thing to do. > > > > Thanks, > > > > Matt > > > > > Barry > > > > On Oct 17, 2017, at 1:15 PM, Hittinger, Jeffrey A. F. < > hittinger1 at llnl.gov> wrote: > > > > Quick question: is there a version of the DMDA structured grid interface > that supports dimensions higher than 3? > > > > j- > > -.-- -.-- --.. > > Jeffrey A. F. Hittinger > > Center for Applied Scientific Computing > > Lawrence Livermore National Laboratory > > Office: (925) 422-0993 > > FAX: (925) 423-2993 > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. 
> -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Tue Oct 17 14:40:10 2017 From: mfadams at lbl.gov (Mark Adams) Date: Tue, 17 Oct 2017 15:40:10 -0400 Subject: [petsc-users] High-dimensional DMDA In-Reply-To: References: <76C825CB-47A2-434E-8713-E2647537C29B@mcs.anl.gov> <2C626E39-6FF8-4619-85B1-CB78D141969C@llnl.gov> Message-ID: Let me just add that we (me and Toby (p4est)) think of tensor grids for kinetic problems. A (phase space) grid at every spatial grid point. THis allows us to compose our existing 3D grids to get 6D, for instance. This work well/easily for Valsov-Maxwell because there are only grad_x and grad_v terms. I have explored this with a Vlasov code and worked some of it out. We have a pretty good idea of what to do but it needs work. So we are moving with the new Plex abstraction and DMDA not in our plans. I like Plex, it abstracts the grid from the numerics nicely. The only bad thing about it is that stencil methods are not natural (at all). So you, for instance, compute fluxes and then divergences, instead of composing them on paper to create a stencil. Mark On Tue, Oct 17, 2017 at 2:54 PM, Matthew Knepley wrote: > On Tue, Oct 17, 2017 at 2:51 PM, Hittinger, Jeffrey A. F. < > hittinger1 at llnl.gov> wrote: > >> Bummer. >> >> >> >> Matt - Never is a very strong word. Don?t underestimate the power of >> mappings and/or AMR. Also, sparse grid techniques haven?t (yet) proven to >> be particularly useful for kinetic problems. >> > > None of the above is wrong, but I really meant "regular grids in high > dimension". DMDA will never do AMR or sparse grids since its designed > to be the simplest thing possible. For AMR we are using p4est, and that > could definitely work in higher dimensions. I have read the Irene Gamba > stuff on Boltzmann Transport using higher-D regular grids, but its just > tremendously expensive, and they really play up spectral convergence, > which relies on regularity which is often not there in practical problems. > > Thanks, > > Matt > > >> >> >> Thanks for the quick response. >> >> >> >> j- >> >> -.-- -.-- --.. >> >> Jeffrey A. F. Hittinger >> >> Center for Applied Scientific Computing >> >> Lawrence Livermore National Laboratory >> >> Office: (925) 422-0993 >> >> FAX: (925) 423-2993 >> >> >> >> *From: *Matthew Knepley >> *Date: *Tuesday, October 17, 2017 at 11:42 AM >> *To: *Barry Smith >> *Cc: *Undisclosed recipients , " >> petsc-users at mcs.anl.gov" >> *Subject: *Re: [petsc-users] High-dimensional DMDA >> >> >> >> On Tue, Oct 17, 2017 at 2:32 PM, Barry Smith wrote: >> >> >> No and it is highly unlikely to appear (the 3d code is already too >> complicated and we tried to write a dimension independent version but that >> failed) >> >> But note that by using a dof argument > 1 one can handle some "4d" >> problems so long as one does no parallelize in the 4th dimension. >> >> >> >> History: We did have a version for arbitrary dimension called ADDA >> written by a student of David Keyes, that exists >> >> in the bowels of Git. We are unlikely to replicate it because regular >> grids in high dimension never seem like the right >> >> thing to do. 
>> >> >> >> Thanks, >> >> >> >> Matt >> >> >> >> >> Barry >> >> >> > On Oct 17, 2017, at 1:15 PM, Hittinger, Jeffrey A. F. < >> hittinger1 at llnl.gov> wrote: >> > >> > Quick question: is there a version of the DMDA structured grid >> interface that supports dimensions higher than 3? >> > >> > j- >> > -.-- -.-- --.. >> > Jeffrey A. F. Hittinger >> > Center for Applied Scientific Computing >> > Lawrence Livermore National Laboratory >> > Office: (925) 422-0993 >> > FAX: (925) 423-2993 >> >> >> >> >> >> -- >> >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> >> >> https://www.cse.buffalo.edu/~knepley/ >> > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > -------------- next part -------------- An HTML attachment was scrubbed... URL: From tisaac at cc.gatech.edu Tue Oct 17 15:02:54 2017 From: tisaac at cc.gatech.edu (Tobin Isaac) Date: Tue, 17 Oct 2017 16:02:54 -0400 Subject: [petsc-users] High-dimensional DMDA In-Reply-To: References: <76C825CB-47A2-434E-8713-E2647537C29B@mcs.anl.gov> <2C626E39-6FF8-4619-85B1-CB78D141969C@llnl.gov> Message-ID: <20171017200254.ls2wvzerh2elxikz@gatech.edu> On Tue, Oct 17, 2017 at 03:40:10PM -0400, Mark Adams wrote: > Let me just add that we (me and Toby (p4est)) think of tensor grids for > kinetic problems. A (phase space) grid at every spatial grid point. THis > allows us to compose our existing 3D grids to get 6D, for instance. This > work well/easily for Valsov-Maxwell because there are only grad_x and > grad_v terms. > > I have explored this with a Vlasov code and worked some of it out. We have > a pretty good idea of what to do but it needs work. > > So we are moving with the new Plex abstraction and DMDA not in our plans. I > like Plex, it abstracts the grid from the numerics nicely. The only bad > thing about it is that stencil methods are not natural (at all). So you, > for instance, compute fluxes and then divergences, instead of composing > them on paper to create a stencil. Which is something so rote that creating DTFD to convert the pointwise residuals/jacobians into stencils would go along way towards unifying DMDA and DMPlex in a single useful interface. Cheers, Toby > > Mark > > On Tue, Oct 17, 2017 at 2:54 PM, Matthew Knepley wrote: > > > On Tue, Oct 17, 2017 at 2:51 PM, Hittinger, Jeffrey A. F. < > > hittinger1 at llnl.gov> wrote: > > > >> Bummer. > >> > >> > >> > >> Matt - Never is a very strong word. Don?t underestimate the power of > >> mappings and/or AMR. Also, sparse grid techniques haven?t (yet) proven to > >> be particularly useful for kinetic problems. > >> > > > > None of the above is wrong, but I really meant "regular grids in high > > dimension". DMDA will never do AMR or sparse grids since its designed > > to be the simplest thing possible. For AMR we are using p4est, and that > > could definitely work in higher dimensions. I have read the Irene Gamba > > stuff on Boltzmann Transport using higher-D regular grids, but its just > > tremendously expensive, and they really play up spectral convergence, > > which relies on regularity which is often not there in practical problems. > > > > Thanks, > > > > Matt > > > > > >> > >> > >> Thanks for the quick response. > >> > >> > >> > >> j- > >> > >> -.-- -.-- --.. 
> >> > >> Jeffrey A. F. Hittinger > >> > >> Center for Applied Scientific Computing > >> > >> Lawrence Livermore National Laboratory > >> > >> Office: (925) 422-0993 > >> > >> FAX: (925) 423-2993 > >> > >> > >> > >> *From: *Matthew Knepley > >> *Date: *Tuesday, October 17, 2017 at 11:42 AM > >> *To: *Barry Smith > >> *Cc: *Undisclosed recipients , " > >> petsc-users at mcs.anl.gov" > >> *Subject: *Re: [petsc-users] High-dimensional DMDA > >> > >> > >> > >> On Tue, Oct 17, 2017 at 2:32 PM, Barry Smith wrote: > >> > >> > >> No and it is highly unlikely to appear (the 3d code is already too > >> complicated and we tried to write a dimension independent version but that > >> failed) > >> > >> But note that by using a dof argument > 1 one can handle some "4d" > >> problems so long as one does no parallelize in the 4th dimension. > >> > >> > >> > >> History: We did have a version for arbitrary dimension called ADDA > >> written by a student of David Keyes, that exists > >> > >> in the bowels of Git. We are unlikely to replicate it because regular > >> grids in high dimension never seem like the right > >> > >> thing to do. > >> > >> > >> > >> Thanks, > >> > >> > >> > >> Matt > >> > >> > >> > >> > >> Barry > >> > >> > >> > On Oct 17, 2017, at 1:15 PM, Hittinger, Jeffrey A. F. < > >> hittinger1 at llnl.gov> wrote: > >> > > >> > Quick question: is there a version of the DMDA structured grid > >> interface that supports dimensions higher than 3? > >> > > >> > j- > >> > -.-- -.-- --.. > >> > Jeffrey A. F. Hittinger > >> > Center for Applied Scientific Computing > >> > Lawrence Livermore National Laboratory > >> > Office: (925) 422-0993 > >> > FAX: (925) 423-2993 > >> > >> > >> > >> > >> > >> -- > >> > >> What most experimenters take for granted before they begin their > >> experiments is infinitely more interesting than any results to which their > >> experiments lead. > >> -- Norbert Wiener > >> > >> > >> > >> https://www.cse.buffalo.edu/~knepley/ > >> > > > > > > > > -- > > What most experimenters take for granted before they begin their > > experiments is infinitely more interesting than any results to which their > > experiments lead. > > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 455 bytes Desc: not available URL: From jed at jedbrown.org Tue Oct 17 16:01:20 2017 From: jed at jedbrown.org (Jed Brown) Date: Tue, 17 Oct 2017 15:01:20 -0600 Subject: [petsc-users] High-dimensional DMDA In-Reply-To: <20171017200254.ls2wvzerh2elxikz@gatech.edu> References: <76C825CB-47A2-434E-8713-E2647537C29B@mcs.anl.gov> <2C626E39-6FF8-4619-85B1-CB78D141969C@llnl.gov> <20171017200254.ls2wvzerh2elxikz@gatech.edu> Message-ID: <87a80pa3q7.fsf@jedbrown.org> Tobin Isaac writes: > Which is something so rote that creating DTFD to convert the pointwise > residuals/jacobians into stencils would go along way towards unifying > DMDA and DMPlex in a single useful interface. On a related topic, I have reason to add Fornberg's algorithm for computing finite difference weights to DMDT. (It's more stable than the usual Vandermonde approach.) I'd imagine some users would appreciate that to take some of the mystery/effort out of the design of higher order FD methods. 
From knepley at gmail.com Tue Oct 17 16:06:15 2017 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 17 Oct 2017 17:06:15 -0400 Subject: [petsc-users] High-dimensional DMDA In-Reply-To: <87a80pa3q7.fsf@jedbrown.org> References: <76C825CB-47A2-434E-8713-E2647537C29B@mcs.anl.gov> <2C626E39-6FF8-4619-85B1-CB78D141969C@llnl.gov> <20171017200254.ls2wvzerh2elxikz@gatech.edu> <87a80pa3q7.fsf@jedbrown.org> Message-ID: On Tue, Oct 17, 2017 at 5:01 PM, Jed Brown wrote: > Tobin Isaac writes: > > > Which is something so rote that creating DTFD to convert the pointwise > > residuals/jacobians into stencils would go along way towards unifying > > DMDA and DMPlex in a single useful interface. > > On a related topic, I have reason to add Fornberg's algorithm for > computing finite difference weights to DMDT. (It's more stable than the > usual Vandermonde approach.) I'd imagine some users would appreciate > that to take some of the mystery/effort out of the design of higher > order FD methods. > That would be good. Is there a reason not to do all DMDA computation with p4est? We could prescribe the refinement level and have the forest make the decomposition. Matt -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Tue Oct 17 16:39:18 2017 From: jed at jedbrown.org (Jed Brown) Date: Tue, 17 Oct 2017 15:39:18 -0600 Subject: [petsc-users] High-dimensional DMDA In-Reply-To: References: <76C825CB-47A2-434E-8713-E2647537C29B@mcs.anl.gov> <2C626E39-6FF8-4619-85B1-CB78D141969C@llnl.gov> <20171017200254.ls2wvzerh2elxikz@gatech.edu> <87a80pa3q7.fsf@jedbrown.org> Message-ID: <877evta1yx.fsf@jedbrown.org> Matthew Knepley writes: > On Tue, Oct 17, 2017 at 5:01 PM, Jed Brown wrote: > >> Tobin Isaac writes: >> >> > Which is something so rote that creating DTFD to convert the pointwise >> > residuals/jacobians into stencils would go along way towards unifying >> > DMDA and DMPlex in a single useful interface. >> >> On a related topic, I have reason to add Fornberg's algorithm for >> computing finite difference weights to DMDT. (It's more stable than the >> usual Vandermonde approach.) I'd imagine some users would appreciate >> that to take some of the mystery/effort out of the design of higher >> order FD methods. >> > > That would be good. > > Is there a reason not to do all DMDA computation with p4est? We could > prescribe the refinement level and have the forest make the > decomposition. GPL? Are you setting up indexing like DMDA so the same code will work for boundary conditions? From Nicholas.Stegmeier at sdstate.edu Tue Oct 17 23:23:26 2017 From: Nicholas.Stegmeier at sdstate.edu (Stegmeier, Nicholas) Date: Wed, 18 Oct 2017 04:23:26 +0000 Subject: [petsc-users] Installing petsc4py Message-ID: Hello all, I was hoping to get some advice on installing petsc4py. Hopefully this is an appropriate question as it's my first time emailing the list. I failed numerous times installing petsc4py on my personal laptop, so I decided to use a virtual machine with Ubuntu to try installing from a clean slate. I followed the steps listed on this webpage: https://gist.github.com/mrosemeier/088115b2e34f319b913a 1. Install openmpi 2. Install anaconda-python 3. Install petsc 4. 
Install petsc4py After doing all this, I entered the petsc4py/demo folder and used make to see if all the tests passed. But several of the tests promptly failed. Below are some of the errors I received, but I also attached the full error output. Maybe I have forgotten to assign some environment variable? Thank you, Nicholas Stegmeier inside .bashrc: export LD_LIBRARY_PATH=/home/nick/local/openmpi-1.10.5/lib:$LD_LIBRARY_PATH export PATH=/home/nick/local/openmpi-1.10.5/bin:$PATH export PATH="/home/nick/local/anaconda2/bin:$PATH" export PETSC_DIR=/home/nick/local/petsc.git export PETSC_ARCH=arch-python-linux-i686 nick at nick-VirtualBox:~/local/petsc.git/bin$ echo $PATH /home/nick/local/anaconda2/bin:/home/nick/local/openmpi-1.10.5/bin:/home/nick/local/anaconda2/bin:/home/nick/local/openmpi-1.10.5/bin:/home/nick/bin:/home/nick/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin nick at nick-VirtualBox:~/local/petsc.git/bin$ echo $PATH /home/nick/local/anaconda2/bin:/home/nick/local/openmpi-1.10.5/bin:/home/nick/local/anaconda2/bin:/home/nick/local/openmpi-1.10.5/bin:/home/nick/bin:/home/nick/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin sample error output: make -f makefile.petsc \ PETSC_DIR=/home/nick/local/petsc.git PETSC_ARCH=arch-python-linux-i686 make[2]: Entering directory '/home/nick/local/petsc4py.git/demo/perftest' c App.f90 make[2]: c: Command not found makefile.petsc:15: recipe for target 'driver.exe' failed make[2]: [driver.exe] Error 127 (ignored) /home/nick/local/openmpi-1.10.5/bin/mpicc -c -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -fvisibility=hidden -g -O -I/home/nick/local/petsc.git/include -I/home/nick/local/petsc.git/arch-python-linux-i686/include `pwd`/driver.c /home/nick/local/openmpi-1.10.5/bin/mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -fvisibility=hidden -g -O -o driver.exe -Wl,-rpath,/home/nick/local/petsc.git/arch-python-linux-i686/lib -L/home/nick/local/petsc.git/arch-python-linux-i686/lib -lpetsc -llapack -lblas -lm -ldl driver.o App.o gcc: error: App.o: No such file or directory makefile.petsc:15: recipe for target 'driver.exe' failed make[2]: [driver.exe] Error 1 (ignored) ... ./Bratu2D.F90:20:0: #include "petsc/finclude/petscdef.h" ^ Fatal Error: petsc/finclude/petscdef.h: No such file or directory compilation terminated. ./Bratu2D.F90:20:0: #include "petsc/finclude/petscdef.h" ^ Fatal Error: petsc/finclude/petscdef.h: No such file or directory compilation terminated. error: Command "/usr/bin/gfortran -Wall -g -fno-second-underscore -DF2PY_REPORT_ON_ARRAY_COPY=1 -I/home/nick/local/petsc.git/arch-python-linux-i686/include -I/home/nick/local/petsc.git/include -I/home/nick/local/anaconda2/lib/python2.7/site-packages/petsc4py/include -I. -Ibuild/src.linux-i686-2.7/. -I/home/nick/local/anaconda2/lib/python2.7/site-packages/numpy/core/include -I/home/nick/local/anaconda2/include/python2.7 -c -c ./Bratu2D.F90 -o build/temp.linux-i686-2.7/Bratu2D.o -Jbuild/temp.linux-i686-2.7/ -Ibuild/temp.linux-i686-2.7/" failed with exit status 1 makefile:20: recipe for target 'Bratu2D.so' failed make[1]: *** [Bratu2D.so] Error 1 make[1]: Leaving directory '/home/nick/local/petsc4py.git/demo/wrap-f2py' makefile:3: recipe for target 'all' failed make: [all] Error 2 (ignored) Thank you, Nicholas Stegmeier -------------- next part -------------- An HTML attachment was scrubbed... 
URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: demo_output.txt URL: From srikrishna.jaganathan at fau.de Wed Oct 18 04:14:57 2017 From: srikrishna.jaganathan at fau.de (Jaganathan, Srikrishna) Date: Wed, 18 Oct 2017 11:14:57 +0200 Subject: [petsc-users] Distributing already assembled stiffness matrix Message-ID: Hello, I have been trying to distribute an already existing stiffness matrix in my FEM code to a PETSc parallel matrix object, but I am unable to find any documentation regarding it. It was quite straightforward to create a sequential PETSc matrix object, and everything was working as intended. I have read some of the user comments in the mailing lists regarding similar situations, and most of the time the suggested solution is to create the stiffness matrix from the mesh in distributed format. Since it is a little difficult in my case to pass the mesh data into the code, is there any way to distribute an already existing stiffness matrix? Thanks and Regards Srikrishna Jaganathan From michael.werner at dlr.de Wed Oct 18 05:01:05 2017 From: michael.werner at dlr.de (Michael Werner) Date: Wed, 18 Oct 2017 12:01:05 +0200 Subject: [petsc-users] Parallelizing a matrix-free code In-Reply-To: References: <71b0677c-9615-02f3-1e60-964fdbd91aee@dlr.de> <77cc6a05-18fe-fff1-31a9-ea48fa83001c@dlr.de> <66de01c7-a335-76de-8a65-faca0d677d1c@dlr.de> Message-ID: Thank you for this explanation, it makes sense. And after I updated my code, the external CFD code runs without problems in parallel. However, now I'm back to the problem with the creation of the vectors/domains. By using the application ordering, I can assign the correct points from PETSc to the corresponding points in my external code. At least, as long as both use the same subdomain size. But sometimes they differ, and then the KSP breaks down, because the solution vector it receives has a different size than what it expects. An example: I have an unstructured grid with 800,000 datapoints. If I decompose this to run on 2 processors, PETSc delegates exactly 400,000 points to each process. However, the external code might
So you would > call your external CFD routine once from all processes, passing in the > MPI communicator. > > ? ?Matt > > Kind regards, > Michael > > Am 17.10.2017 um 11:50 schrieb Matthew Knepley: >> On Tue, Oct 17, 2017 at 5:46 AM, Michael Werner >> > wrote: >> >> I'm not sure what you mean with this question? >> The external CFD code, if that was what you referred to, can >> be run in parallel. >> >> >> Then why is it a problem that "in a parallel case, this call >> obviously gets called once per process"? >> >> ? ?Matt >> >> Am 17.10.2017 um 11:11 schrieb Matthew Knepley: >>> On Tue, Oct 17, 2017 at 4:21 AM, Michael Werner >>> > wrote: >>> >>> That's something I'm still struggling with. In the >>> serial case, I can simply extract the values from the >>> original grid, and since the ordering of the Jacobian is >>> the same there is no problem. In the parallel case this >>> is still a more or less open question. That's why I >>> thought about reordering the Jacobian. As long as the >>> position of the individual IDs is the same for both, I >>> don't have to care about their absolute position. >>> >>> I also wanted to thank you for your previous answer, it >>> seems that the application ordering might be what I'm >>> looking for. However, in the meantime I stumbled about >>> another problem, that I have to solve first. My new >>> problem is, that I call the external code within the >>> shell matrix' multiply call. But in a parallel case, >>> this call obviously gets called once per process. So >>> right now I'm trying to circumvent this, so it might >>> take a while before I'm able to come back to the >>> original problem... >>> >>> >>> I am not understanding. Is your original code parallel? >>> >>> ? Thanks, >>> >>> ? ? ?Matt >>> >>> Kind regards, >>> Michael >>> >>> Am 16.10.2017 um 17:25 schrieb Praveen C: >>>> I am interested to learn more about how this works. How >>>> are the vectors created if the ids are not contiguous >>>> in a partition ? >>>> >>>> Thanks >>>> praveen >>>> >>>> On Mon, Oct 16, 2017 at 2:02 PM, Stefano Zampini >>>> >>> > wrote: >>>> >>>> >>>> >>>> 2017-10-16 10:26 GMT+03:00 Michael Werner >>>> >: >>>> >>>> Hello, >>>> >>>> I'm having trouble with parallelizing a >>>> matrix-free code with PETSc. In this code, I >>>> use an external CFD code to provide the >>>> matrix-vector product for an iterative solver >>>> in PETSc. To increase convergence rate, I'm >>>> using an explicitly stored Jacobian matrix to >>>> precondition the solver. This works fine for >>>> serial runs. However, when I try to use >>>> multiple processes, I face the problem that >>>> PETSc decomposes the preconditioner matrix, and >>>> probably also the shell matrix, in a different >>>> way than the external CFD code decomposes the grid. >>>> >>>> The Jacobian matrix is built in a way, that its >>>> rows and columns correspond to the global IDs >>>> of the individual points in my CFD mesh >>>> >>>> The CFD code decomposes the domain based on the >>>> proximity of points to each other, so that the >>>> resulting subgrids are coherent. However, since >>>> its an unstructured grid, those subgrids are >>>> not necessarily made up of points with >>>> successive global IDs. This is a problem, since >>>> PETSc seems to partition the matrix in coherent >>>> slices. >>>> >>>> I'm not sure what the best approach to this >>>> problem might be. Is it maybe possible to >>>> exactly tell PETSc, which rows/columns it >>>> should assign to the individual processes? 
>>>> >>>> >>>> If you are explicitly setting the values in your >>>> Jacobians via MatSetValues(), you can create a >>>> ISLocalToGlobalMapping >>>> >>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/IS/ISLocalToGlobalMappingCreate.html >>>> >>>> >>>> that maps the numbering you use for the Jacobians >>>> to their counterpart in the CFD ordering, then call >>>> MatSetLocalToGlobalMapping and then use >>>> MatSetValuesLocal with the same arguments you are >>>> calling MatSetValues now. >>>> >>>> Otherwise, you can play with the application >>>> ordering >>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/AO/index.html >>>> >>>> >>>> >>>> >>>> -- >>>> Stefano >>>> >>>> >>> >>> -- >>> >>> ____________________________________________________ >>> >>> Deutsches Zentrum f?r Luft- und Raumfahrt e.V. (DLR) >>> Institut f?r Aerodynamik und Str?mungstechnik |Bunsenstr. 10 | 37073 G?ttingen >>> >>> >>> Michael Werner >>> Telefon 0551 709-2627 | Telefax 0551 709-2811 |Michael.Werner at dlr.de >>> DLR.de >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin >>> their experiments is infinitely more interesting than any >>> results to which their experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >> >> -- >> >> ____________________________________________________ >> >> Deutsches Zentrum f?r Luft- und Raumfahrt e.V. (DLR) >> Institut f?r Aerodynamik und Str?mungstechnik |Bunsenstr. 10 | 37073 G?ttingen >> >> >> Michael Werner >> Telefon 0551 709-2627 | Telefax 0551 709-2811 |Michael.Werner at dlr.de >> DLR.de >> >> >> >> >> >> >> >> >> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to >> which their experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> > > -- > > ____________________________________________________ > > Deutsches Zentrum f?r Luft- und Raumfahrt e.V. (DLR) > Institut f?r Aerodynamik und Str?mungstechnik |Bunsenstr. 10 | 37073 G?ttingen > > > Michael Werner > Telefon 0551 709-2627 | Telefax 0551 709-2811 |Michael.Werner at dlr.de > DLR.de > > > > > > > > > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Wed Oct 18 06:33:05 2017 From: jed at jedbrown.org (Jed Brown) Date: Wed, 18 Oct 2017 05:33:05 -0600 Subject: [petsc-users] Distributing already assembled stiffness matrix In-Reply-To: References: Message-ID: <871sm0adxq.fsf@jedbrown.org> Easiest is to assemble into a distributed matrix from rank 0. So instead of calling MatCreate using PETSC_COMM_SELF, use a parallel communicator (like PETSC_COMM_WORLD). It is fine if only rank 0 calls MatSetValues, but all processes must call MatAssemblyBegin/End. "Jaganathan, Srikrishna" writes: > Hello, > > > I have been trying to distribute a already existing stiffness matrix in > my FEM code to petsc parallel matrix object , but I am unable to find > any documentation regarding it. 
It was quite straightforward to create a > sequential petsc matrix object and everything was working as > intended.I > have read some of the user comments in the mailing lists regarding > similar situation and most of the times the solution suggested is to > create stiffness matrix from the the mesh in distributed format. Since > its a little difficult in my case to pass the mesh data in the code , is > there anyway to distribute already existing stiffness matrix ? > > Thanks and Regards > > Srikrishna Jaganathan From dalcinl at gmail.com Wed Oct 18 07:06:08 2017 From: dalcinl at gmail.com (Lisandro Dalcin) Date: Wed, 18 Oct 2017 15:06:08 +0300 Subject: [petsc-users] Installing petsc4py In-Reply-To: References: Message-ID: Dear Nicholas, The errors you reported were just outdated Fortran code and makefiles in demo/; these are just examples. I pushed a fix to the maint branch of the repository. BTW, do you know that conda-forge provides prebuilt packages for petsc and petsc4py? Just create a new conda environment, activate it, and then "conda install -c conda-forge petsc4py" On 18 October 2017 at 07:23, Stegmeier, Nicholas wrote: > Hello all, > > > I was hoping to get some advice on installing petsc4py. Hopefully this is an > appropriate question as it's my first time emailing the list. > > > I failed numerous times installing petsc4py on my personal laptop, so I > decided to use a virtual machine with Ubuntu to try installing from a clean > slate. > > > I followed the steps listed on this webpage: > https://gist.github.com/mrosemeier/088115b2e34f319b913a > > Install openmpi > Install anaconda-python > Install petsc > Install petsc4py > > > After doing all this, I entered the petsc4py/demo folder and used make to > see if all the tests passed. But several of the tests promptly failed. Below > are some of the errors I received, but I also attached the full error > output. Maybe I have forgotten to assign some environment variable?
> > Thank you, > Nicholas Stegmeier > > inside .bashrc: > export LD_LIBRARY_PATH=/home/nick/local/openmpi-1.10.5/lib:$LD_LIBRARY_PATH > export PATH=/home/nick/local/openmpi-1.10.5/bin:$PATH > export PATH="/home/nick/local/anaconda2/bin:$PATH" > export PETSC_DIR=/home/nick/local/petsc.git > export PETSC_ARCH=arch-python-linux-i686 > > nick at nick-VirtualBox:~/local/petsc.git/bin$ echo $PATH > /home/nick/local/anaconda2/bin:/home/nick/local/openmpi-1.10.5/bin:/home/nick/local/anaconda2/bin:/home/nick/local/openmpi-1.10.5/bin:/home/nick/bin:/home/nick/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin > > nick at nick-VirtualBox:~/local/petsc.git/bin$ echo $PATH > /home/nick/local/anaconda2/bin:/home/nick/local/openmpi-1.10.5/bin:/home/nick/local/anaconda2/bin:/home/nick/local/openmpi-1.10.5/bin:/home/nick/bin:/home/nick/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin > > sample error output: > > make -f makefile.petsc \ > PETSC_DIR=/home/nick/local/petsc.git > PETSC_ARCH=arch-python-linux-i686 > make[2]: Entering directory '/home/nick/local/petsc4py.git/demo/perftest' > c App.f90 > make[2]: c: Command not found > makefile.petsc:15: recipe for target 'driver.exe' failed > make[2]: [driver.exe] Error 127 (ignored) > /home/nick/local/openmpi-1.10.5/bin/mpicc -c -Wall -Wwrite-strings > -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector > -fvisibility=hidden -g -O -I/home/nick/local/petsc.git/include > -I/home/nick/local/petsc.git/arch-python-linux-i686/include > `pwd`/driver.c > /home/nick/local/openmpi-1.10.5/bin/mpicc -Wall -Wwrite-strings > -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector > -fvisibility=hidden -g -O -o driver.exe > -Wl,-rpath,/home/nick/local/petsc.git/arch-python-linux-i686/lib > -L/home/nick/local/petsc.git/arch-python-linux-i686/lib -lpetsc -llapack > -lblas -lm -ldl driver.o App.o > gcc: error: App.o: No such file or directory > makefile.petsc:15: recipe for target 'driver.exe' failed > make[2]: [driver.exe] Error 1 (ignored) > > ... > > ./Bratu2D.F90:20:0: > > #include "petsc/finclude/petscdef.h" > ^ > Fatal Error: petsc/finclude/petscdef.h: No such file or directory > compilation terminated. > ./Bratu2D.F90:20:0: > > #include "petsc/finclude/petscdef.h" > ^ > Fatal Error: petsc/finclude/petscdef.h: No such file or directory > compilation terminated. > error: Command "/usr/bin/gfortran -Wall -g -fno-second-underscore > -DF2PY_REPORT_ON_ARRAY_COPY=1 > -I/home/nick/local/petsc.git/arch-python-linux-i686/include > -I/home/nick/local/petsc.git/include > -I/home/nick/local/anaconda2/lib/python2.7/site-packages/petsc4py/include > -I. -Ibuild/src.linux-i686-2.7/. 
> -I/home/nick/local/anaconda2/lib/python2.7/site-packages/numpy/core/include > -I/home/nick/local/anaconda2/include/python2.7 -c -c ./Bratu2D.F90 -o > build/temp.linux-i686-2.7/Bratu2D.o -Jbuild/temp.linux-i686-2.7/ > -Ibuild/temp.linux-i686-2.7/" failed with exit status 1 > makefile:20: recipe for target 'Bratu2D.so' failed > make[1]: *** [Bratu2D.so] Error 1 > make[1]: Leaving directory '/home/nick/local/petsc4py.git/demo/wrap-f2py' > makefile:3: recipe for target 'all' failed > make: [all] Error 2 (ignored) > > > Thank you, > Nicholas Stegmeier -- Lisandro Dalcin ============ Research Scientist Computer, Electrical and Mathematical Sciences & Engineering (CEMSE) Extreme Computing Research Center (ECRC) King Abdullah University of Science and Technology (KAUST) http://ecrc.kaust.edu.sa/ 4700 King Abdullah University of Science and Technology al-Khawarizmi Bldg (Bldg 1), Office # 0109 Thuwal 23955-6900, Kingdom of Saudi Arabia http://www.kaust.edu.sa Office Phone: +966 12 808-0459 From srikrishna.jaganathan at fau.de Wed Oct 18 07:18:26 2017 From: srikrishna.jaganathan at fau.de (Jaganathan, Srikrishna) Date: Wed, 18 Oct 2017 14:18:26 +0200 Subject: [petsc-users] Distributing already assembled stiffness matrix In-Reply-To: <871sm0adxq.fsf@jedbrown.org> References: <871sm0adxq.fsf@jedbrown.org> Message-ID: <058b9bebf3910c7858ee99e29bc2c5d0@fau.de> Thanks for your response, its helpful. I do have few more questions, most of my matrices are of compressed row storage format. 1)So when I was creating sequentially , I just used MatCreateSeqAIJWithArrays , but the same for MPI version is quite confusing to use. I don't understand how to decide on the local rows(it would be really helpful if there is an example) . 2)When I also tried using MatSetValues it doesn't seem to use the same indexing as compressed row storage format.What type of indexing should be used when MatSetValues are used and called from rank 0 for CRS Matrices? On 2017-10-18 13:33, Jed Brown wrote: > Easiest is to assemble into a distributed matrix from rank 0. So > instead of calling MatCreate using PETSC_COMM_SELF, use a parallel > communicator (like PETSC_COMM_WORLD). It is fine if only rank 0 calls > MatSetValues, but all processes must call MatAssemblyBegin/End. > > "Jaganathan, Srikrishna" writes: > >> Hello, >> >> >> I have been trying to distribute a already existing stiffness matrix >> in >> my FEM code to petsc parallel matrix object , but I am unable to find >> any documentation regarding it. It was quite straightforward to create >> a >> sequential petsc matrix object and everything was working as >> intended.I >> have read some of the user comments in the mailing lists regarding >> similar situation and most of the times the solution suggested is to >> create stiffness matrix from the the mesh in distributed format. Since >> its a little difficult in my case to pass the mesh data in the code , >> is >> there anyway to distribute already existing stiffness matrix ? >> >> Thanks and Regards >> >> Srikrishna Jaganathan From jed at jedbrown.org Wed Oct 18 07:38:35 2017 From: jed at jedbrown.org (Jed Brown) Date: Wed, 18 Oct 2017 06:38:35 -0600 Subject: [petsc-users] Distributing already assembled stiffness matrix In-Reply-To: <058b9bebf3910c7858ee99e29bc2c5d0@fau.de> References: <871sm0adxq.fsf@jedbrown.org> <058b9bebf3910c7858ee99e29bc2c5d0@fau.de> Message-ID: <87y3o88wc4.fsf@jedbrown.org> "Jaganathan, Srikrishna" writes: > Thanks for your response, its helpful. 
> > I do have few more questions, most of my matrices are of compressed row > storage format. > > 1)So when I was creating sequentially , I just used > MatCreateSeqAIJWithArrays , but the same for MPI version is quite > confusing to use. I don't understand how to decide on the local rows(it > would be really helpful if there is an example) . Just call MatSetValues once per row. The MatCreateMPIAIJWith* interfaces are not for you. > 2)When I also tried using MatSetValues it doesn't seem to use the same > indexing as compressed row storage format.What type of indexing should > be used when MatSetValues are used and called from rank 0 for CRS > Matrices? > > On 2017-10-18 13:33, Jed Brown wrote: >> Easiest is to assemble into a distributed matrix from rank 0. So >> instead of calling MatCreate using PETSC_COMM_SELF, use a parallel >> communicator (like PETSC_COMM_WORLD). It is fine if only rank 0 calls >> MatSetValues, but all processes must call MatAssemblyBegin/End. >> >> "Jaganathan, Srikrishna" writes: >> >>> Hello, >>> >>> >>> I have been trying to distribute a already existing stiffness matrix >>> in >>> my FEM code to petsc parallel matrix object , but I am unable to find >>> any documentation regarding it. It was quite straightforward to create >>> a >>> sequential petsc matrix object and everything was working as >>> intended.I >>> have read some of the user comments in the mailing lists regarding >>> similar situation and most of the times the solution suggested is to >>> create stiffness matrix from the the mesh in distributed format. Since >>> its a little difficult in my case to pass the mesh data in the code , >>> is >>> there anyway to distribute already existing stiffness matrix ? >>> >>> Thanks and Regards >>> >>> Srikrishna Jaganathan From michael.werner at dlr.de Wed Oct 18 11:01:43 2017 From: michael.werner at dlr.de (Michael Werner) Date: Wed, 18 Oct 2017 18:01:43 +0200 Subject: [petsc-users] Parallelizing a matrix-free code In-Reply-To: References: <71b0677c-9615-02f3-1e60-964fdbd91aee@dlr.de> <77cc6a05-18fe-fff1-31a9-ea48fa83001c@dlr.de> <66de01c7-a335-76de-8a65-faca0d677d1c@dlr.de> Message-ID: <3d4e1deb-7137-ab0b-229b-f9b1bd6cc200@dlr.de> Ah, never mind. I only misunderstood the creation of subvectors by scattering/gathering. I thought it was necessary to collect the complete vector on each of my processes in order to extract a subvector. After rereading the corresponding section in the manual I learned, that this isn't necessary. So now its possible to simply gather the correct values by their global ID, pass them to the external code and then scatter the result back to the parallel vector. Now my code is working as intended. Thanks for your help! Kind regards, Michael Werner Am 18.10.2017 um 12:01 schrieb Michael Werner: > Thank your for this explanation, it makes sense. And after I updated > my code, the external CFD code runs without problems in parallel. > > However, now I'm back to the problem with the creation of the > vectors/domains. By using the application ordering, I can assign the > correct points from PETSc to the corresponding points in my external > code. At least, as long as both use the same subdomain size. But > sometimes they differ, and then the KSP breaks down, because the > solution Vector it receives has a different size than what it expects. > > An example: > I have an unstructured grid with 800,000 datapoints. > > If I decompose this to run? on 2 processors, PETSc delegates exactly > 400,000 points to each process. 
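A minimal sketch of that gather/scatter pattern, for reference (assumed names only: x is the distributed PETSc vector, nloc_ext the number of points in this rank's external partition, and gids their global IDs):

    #include <petscvec.h>

    IS         is;
    Vec        xloc;
    VecScatter scat;

    /* Index set listing, in external-code order, the global IDs owned
       by this rank's CFD partition (gids[] of length nloc_ext assumed) */
    ierr = ISCreateGeneral(PETSC_COMM_WORLD,nloc_ext,gids,PETSC_COPY_VALUES,&is);CHKERRQ(ierr);
    ierr = VecCreateSeq(PETSC_COMM_SELF,nloc_ext,&xloc);CHKERRQ(ierr);
    ierr = VecScatterCreate(x,is,xloc,NULL,&scat);CHKERRQ(ierr);

    /* gather the required entries of the distributed x by global ID ... */
    ierr = VecScatterBegin(scat,x,xloc,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
    ierr = VecScatterEnd(scat,x,xloc,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
    /* ... run the external code on xloc, then push the result back */
    ierr = VecScatterBegin(scat,xloc,x,INSERT_VALUES,SCATTER_REVERSE);CHKERRQ(ierr);
    ierr = VecScatterEnd(scat,xloc,x,INSERT_VALUES,SCATTER_REVERSE);CHKERRQ(ierr);

The index set and scatter context only need to be built once and can then be reused in every matrix-vector product.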
However, the external code might > assign 400,100 points to the first and 399,900 process. As a result, > PETSc expects a solution vector of size 400,000 on each process, but > receives one of 400,100 and one of 399,900, leading to a breakdown. > > I suppose I could use VecScatterCreateToAll to collect all the values > from the solution vectors of my external code, and then create from > those a temporary vector that only contains the expected 400,000 > values to hand over to the KSP. But this would create a lot of > communication between the different processes and seems quite clunky. > > Is there a more elegant way? Is it maybe possible to manually assign > the size of the PETSc subdomains? > > Kind regards, > Michael Werner > > Am 17.10.2017 um 12:31 schrieb Matthew Knepley: >> On Tue, Oct 17, 2017 at 6:08 AM, Michael Werner >> > wrote: >> >> Because usally this code is called just once. It runs one >> multiple processes, but there it's still always processing the >> whole domain. I can't run it on only one subdomain. As I >> understand it now, when I call it from PETSc, this call is issued >> once per process, so I would end up running several contesting >> instances of the computation on the whole domain. >> >> But maybe that's only because I haven't completly understood how >> MPI really works in such cases... >> >> >> No, it makes one call in which all processes participate. So you >> would call your external CFD routine once from all processes, passing >> in the MPI communicator. >> >> ? ?Matt >> >> Kind regards, >> Michael >> >> Am 17.10.2017 um 11:50 schrieb Matthew Knepley: >>> On Tue, Oct 17, 2017 at 5:46 AM, Michael Werner >>> > wrote: >>> >>> I'm not sure what you mean with this question? >>> The external CFD code, if that was what you referred to, can >>> be run in parallel. >>> >>> >>> Then why is it a problem that "in a parallel case, this call >>> obviously gets called once per process"? >>> >>> ? ?Matt >>> >>> Am 17.10.2017 um 11:11 schrieb Matthew Knepley: >>>> On Tue, Oct 17, 2017 at 4:21 AM, Michael Werner >>>> > wrote: >>>> >>>> That's something I'm still struggling with. In the >>>> serial case, I can simply extract the values from the >>>> original grid, and since the ordering of the Jacobian >>>> is the same there is no problem. In the parallel case >>>> this is still a more or less open question. That's why >>>> I thought about reordering the Jacobian. As long as the >>>> position of the individual IDs is the same for both, I >>>> don't have to care about their absolute position. >>>> >>>> I also wanted to thank you for your previous answer, it >>>> seems that the application ordering might be what I'm >>>> looking for. However, in the meantime I stumbled about >>>> another problem, that I have to solve first. My new >>>> problem is, that I call the external code within the >>>> shell matrix' multiply call. But in a parallel case, >>>> this call obviously gets called once per process. So >>>> right now I'm trying to circumvent this, so it might >>>> take a while before I'm able to come back to the >>>> original problem... >>>> >>>> >>>> I am not understanding. Is your original code parallel? >>>> >>>> ? Thanks, >>>> >>>> ? ? ?Matt >>>> >>>> Kind regards, >>>> Michael >>>> >>>> Am 16.10.2017 um 17:25 schrieb Praveen C: >>>>> I am interested to learn more about how this works. >>>>> How are the vectors created if the ids are not >>>>> contiguous in a partition ? 
>>>>> >>>>> Thanks >>>>> praveen >>>>> >>>>> On Mon, Oct 16, 2017 at 2:02 PM, Stefano Zampini >>>>> >>>> > wrote: >>>>> >>>>> >>>>> >>>>> 2017-10-16 10:26 GMT+03:00 Michael Werner >>>>> >>>> >: >>>>> >>>>> Hello, >>>>> >>>>> I'm having trouble with parallelizing a >>>>> matrix-free code with PETSc. In this code, I >>>>> use an external CFD code to provide the >>>>> matrix-vector product for an iterative solver >>>>> in PETSc. To increase convergence rate, I'm >>>>> using an explicitly stored Jacobian matrix to >>>>> precondition the solver. This works fine for >>>>> serial runs. However, when I try to use >>>>> multiple processes, I face the problem that >>>>> PETSc decomposes the preconditioner matrix, >>>>> and probably also the shell matrix, in a >>>>> different way than the external CFD code >>>>> decomposes the grid. >>>>> >>>>> The Jacobian matrix is built in a way, that >>>>> its rows and columns correspond to the global >>>>> IDs of the individual points in my CFD mesh >>>>> >>>>> The CFD code decomposes the domain based on >>>>> the proximity of points to each other, so that >>>>> the resulting subgrids are coherent. However, >>>>> since its an unstructured grid, those subgrids >>>>> are not necessarily made up of points with >>>>> successive global IDs. This is a problem, >>>>> since PETSc seems to partition the matrix in >>>>> coherent slices. >>>>> >>>>> I'm not sure what the best approach to this >>>>> problem might be. Is it maybe possible to >>>>> exactly tell PETSc, which rows/columns it >>>>> should assign to the individual processes? >>>>> >>>>> >>>>> If you are explicitly setting the values in your >>>>> Jacobians via MatSetValues(), you can create a >>>>> ISLocalToGlobalMapping >>>>> >>>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/IS/ISLocalToGlobalMappingCreate.html >>>>> >>>>> >>>>> that maps the numbering you use for the Jacobians >>>>> to their counterpart in the CFD ordering, then >>>>> call MatSetLocalToGlobalMapping and then use >>>>> MatSetValuesLocal with the same arguments you are >>>>> calling MatSetValues now. >>>>> >>>>> Otherwise, you can play with the application >>>>> ordering >>>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/AO/index.html >>>>> >>>>> >>>>> >>>>> >>>>> -- >>>>> Stefano >>>>> >>>>> >>>> >>>> -- >>>> >>>> ____________________________________________________ >>>> >>>> Deutsches Zentrum f?r Luft- und Raumfahrt e.V. (DLR) >>>> Institut f?r Aerodynamik und Str?mungstechnik |Bunsenstr. 10 | 37073 G?ttingen >>>> >>>> >>>> Michael Werner >>>> Telefon 0551 709-2627 | Telefax 0551 709-2811 |Michael.Werner at dlr.de >>>> DLR.de >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin >>>> their experiments is infinitely more interesting than any >>>> results to which their experiments lead. >>>> -- Norbert Wiener >>>> >>>> https://www.cse.buffalo.edu/~knepley/ >>>> >>> >>> -- >>> >>> ____________________________________________________ >>> >>> Deutsches Zentrum f?r Luft- und Raumfahrt e.V. (DLR) >>> Institut f?r Aerodynamik und Str?mungstechnik |Bunsenstr. 10 | 37073 G?ttingen >>> >>> >>> Michael Werner >>> Telefon 0551 709-2627 | Telefax 0551 709-2811 |Michael.Werner at dlr.de >>> DLR.de >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to >>> which their experiments lead. 
>>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >> >> -- >> >> ____________________________________________________ >> >> Deutsches Zentrum f?r Luft- und Raumfahrt e.V. (DLR) >> Institut f?r Aerodynamik und Str?mungstechnik |Bunsenstr. 10 | 37073 G?ttingen >> >> >> Michael Werner >> Telefon 0551 709-2627 | Telefax 0551 709-2811 |Michael.Werner at dlr.de >> DLR.de >> >> >> >> >> >> >> >> >> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which >> their experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Oct 18 12:09:20 2017 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 18 Oct 2017 13:09:20 -0400 Subject: [petsc-users] Parallelizing a matrix-free code In-Reply-To: References: <71b0677c-9615-02f3-1e60-964fdbd91aee@dlr.de> <77cc6a05-18fe-fff1-31a9-ea48fa83001c@dlr.de> <66de01c7-a335-76de-8a65-faca0d677d1c@dlr.de> Message-ID: On Wed, Oct 18, 2017 at 6:01 AM, Michael Werner wrote: > Thank your for this explanation, it makes sense. And after I updated my > code, the external CFD code runs without problems in parallel. > > However, now I'm back to the problem with the creation of the > vectors/domains. By using the application ordering, I can assign the > correct points from PETSc to the corresponding points in my external code. > At least, as long as both use the same subdomain size. But sometimes they > differ, and then the KSP breaks down, because the solution Vector it > receives has a different size than what it expects. > > An example: > I have an unstructured grid with 800,000 datapoints. > > If I decompose this to run on 2 processors, PETSc delegates exactly > 400,000 points to each process. However, the external code might assign > 400,100 points to the first and 399,900 process. As a result, PETSc expects > a solution vector of size 400,000 on each process, but receives one of > 400,100 and one of 399,900, leading to a breakdown. > > I suppose I could use VecScatterCreateToAll to collect all the values from > the solution vectors of my external code, and then create from those a > temporary vector that only contains the expected 400,000 values to hand > over to the KSP. But this would create a lot of communication between the > different processes and seems quite clunky. > > Is there a more elegant way? Is it maybe possible to manually assign the > size of the PETSc subdomains? > Yes, you can always assign the domain size in PETSc. For example, http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Vec/VecSetSizes.html Matt > Kind regards, > Michael Werner > > Am 17.10.2017 um 12:31 schrieb Matthew Knepley: > > On Tue, Oct 17, 2017 at 6:08 AM, Michael Werner > wrote: > >> Because usally this code is called just once. It runs one multiple >> processes, but there it's still always processing the whole domain. I can't >> run it on only one subdomain. As I understand it now, when I call it from >> PETSc, this call is issued once per process, so I would end up running >> several contesting instances of the computation on the whole domain. >> >> But maybe that's only because I haven't completly understood how MPI >> really works in such cases... >> > > No, it makes one call in which all processes participate. So you would > call your external CFD routine once from all processes, passing in the MPI > communicator. 
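To make the VecSetSizes pointer above concrete, a minimal sketch (nloc_ext is an assumed name for the number of points the external code placed on this rank, e.g. 400,100 on one rank and 399,900 on the other):

    Vec x;

    /* Sketch: fix the local size to match the external partition instead
       of PETSC_DECIDE; PETSC_DETERMINE lets PETSc sum the global size. */
    ierr = VecCreate(PETSC_COMM_WORLD,&x);CHKERRQ(ierr);
    ierr = VecSetSizes(x,nloc_ext,PETSC_DETERMINE);CHKERRQ(ierr);
    ierr = VecSetFromOptions(x);CHKERRQ(ierr);

MatSetSizes accepts the analogous local row/column sizes for the preconditioner matrix, so the PETSc layout then coincides with the CFD partition.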
> > Matt > > >> Kind regards, >> Michael >> >> Am 17.10.2017 um 11:50 schrieb Matthew Knepley: >> >> On Tue, Oct 17, 2017 at 5:46 AM, Michael Werner >> wrote: >> >>> I'm not sure what you mean with this question? >>> The external CFD code, if that was what you referred to, can be run in >>> parallel. >>> >> >> Then why is it a problem that "in a parallel case, this call obviously >> gets called once per process"? >> >> Matt >> >> >>> Am 17.10.2017 um 11:11 schrieb Matthew Knepley: >>> >>> On Tue, Oct 17, 2017 at 4:21 AM, Michael Werner >>> wrote: >>> >>>> That's something I'm still struggling with. In the serial case, I can >>>> simply extract the values from the original grid, and since the ordering of >>>> the Jacobian is the same there is no problem. In the parallel case this is >>>> still a more or less open question. That's why I thought about reordering >>>> the Jacobian. As long as the position of the individual IDs is the same for >>>> both, I don't have to care about their absolute position. >>>> >>>> I also wanted to thank you for your previous answer, it seems that the >>>> application ordering might be what I'm looking for. However, in the >>>> meantime I stumbled about another problem, that I have to solve first. My >>>> new problem is, that I call the external code within the shell matrix' >>>> multiply call. But in a parallel case, this call obviously gets called once >>>> per process. So right now I'm trying to circumvent this, so it might take a >>>> while before I'm able to come back to the original problem... >>>> >>> >>> I am not understanding. Is your original code parallel? >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> Kind regards, >>>> Michael >>>> >>>> Am 16.10.2017 um 17:25 schrieb Praveen C: >>>> >>>> I am interested to learn more about how this works. How are the vectors >>>> created if the ids are not contiguous in a partition ? >>>> >>>> Thanks >>>> praveen >>>> >>>> On Mon, Oct 16, 2017 at 2:02 PM, Stefano Zampini < >>>> stefano.zampini at gmail.com> wrote: >>>> >>>>> >>>>> >>>>> 2017-10-16 10:26 GMT+03:00 Michael Werner : >>>>> >>>>>> Hello, >>>>>> >>>>>> I'm having trouble with parallelizing a matrix-free code with PETSc. >>>>>> In this code, I use an external CFD code to provide the matrix-vector >>>>>> product for an iterative solver in PETSc. To increase convergence rate, I'm >>>>>> using an explicitly stored Jacobian matrix to precondition the solver. This >>>>>> works fine for serial runs. However, when I try to use multiple processes, >>>>>> I face the problem that PETSc decomposes the preconditioner matrix, and >>>>>> probably also the shell matrix, in a different way than the external CFD >>>>>> code decomposes the grid. >>>>>> >>>>>> The Jacobian matrix is built in a way, that its rows and columns >>>>>> correspond to the global IDs of the individual points in my CFD mesh >>>>>> >>>>>> The CFD code decomposes the domain based on the proximity of points >>>>>> to each other, so that the resulting subgrids are coherent. However, since >>>>>> its an unstructured grid, those subgrids are not necessarily made up of >>>>>> points with successive global IDs. This is a problem, since PETSc seems to >>>>>> partition the matrix in coherent slices. >>>>>> >>>>>> I'm not sure what the best approach to this problem might be. Is it >>>>>> maybe possible to exactly tell PETSc, which rows/columns it should assign >>>>>> to the individual processes? 
>>>>>> >>>>>> >>>>> If you are explicitly setting the values in your Jacobians via >>>>> MatSetValues(), you can create a ISLocalToGlobalMapping >>>>> >>>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/ >>>>> IS/ISLocalToGlobalMappingCreate.html >>>>> >>>>> that maps the numbering you use for the Jacobians to their counterpart >>>>> in the CFD ordering, then call MatSetLocalToGlobalMapping and then use >>>>> MatSetValuesLocal with the same arguments you are calling MatSetValues now. >>>>> >>>>> Otherwise, you can play with the application ordering >>>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/ >>>>> AO/index.html >>>>> >>>>> >>>>> >>>>> >>>>> -- >>>>> Stefano >>>>> >>>> >>>> >>>> -- >>>> >>>> ____________________________________________________ >>>> >>>> Deutsches Zentrum f?r Luft- und Raumfahrt e.V. (DLR) >>>> Institut f?r Aerodynamik und Str?mungstechnik | Bunsenstr. 10 | 37073 G?ttingen >>>> >>>> Michael Werner >>>> Telefon 0551 709-2627 | Telefax 0551 709-2811 | Michael.Werner at dlr.de >>>> DLR.de >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >>> >>> -- >>> >>> ____________________________________________________ >>> >>> Deutsches Zentrum f?r Luft- und Raumfahrt e.V. (DLR) >>> Institut f?r Aerodynamik und Str?mungstechnik | Bunsenstr. 10 | 37073 G?ttingen >>> >>> Michael Werner >>> Telefon 0551 709-2627 | Telefax 0551 709-2811 | Michael.Werner at dlr.de >>> DLR.de >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> >> -- >> >> ____________________________________________________ >> >> Deutsches Zentrum f?r Luft- und Raumfahrt e.V. (DLR) >> Institut f?r Aerodynamik und Str?mungstechnik | Bunsenstr. 10 | 37073 G?ttingen >> >> Michael Werner >> Telefon 0551 709-2627 | Telefax 0551 709-2811 | Michael.Werner at dlr.de >> DLR.de >> >> >> >> >> >> >> >> >> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Thu Oct 19 07:41:16 2017 From: mfadams at lbl.gov (Mark Adams) Date: Thu, 19 Oct 2017 08:41:16 -0400 Subject: [petsc-users] Distributing already assembled stiffness matrix In-Reply-To: <058b9bebf3910c7858ee99e29bc2c5d0@fau.de> References: <871sm0adxq.fsf@jedbrown.org> <058b9bebf3910c7858ee99e29bc2c5d0@fau.de> Message-ID: On Wed, Oct 18, 2017 at 8:18 AM, Jaganathan, Srikrishna < srikrishna.jaganathan at fau.de> wrote: > Thanks for your response, its helpful. > > I do have few more questions, most of my matrices are of compressed row > storage format. 
> > 1)So when I was creating sequentially , I just used > MatCreateSeqAIJWithArrays , but the same for MPI version is quite confusing > to use. I don't understand how to decide on the local rows(it would be > really helpful if there is an example) . > You don't use local row indices (you can, but you don't want to). The code does not change. As Jed says, don't use MatCreateSeqAIJWithArrays. Just use MatCreate. This is from ksp ex56.c:

  /* create stiffness matrix */
  ierr = MatCreate(comm,&Amat);CHKERRQ(ierr);
  ierr = MatSetSizes(Amat,m,m,M,M);CHKERRQ(ierr);
  if (!test_late_bs) {
    ierr = MatSetBlockSize(Amat,3);CHKERRQ(ierr);
  }
  ierr = MatSetType(Amat,MATAIJ);CHKERRQ(ierr);
  ierr = MatSeqAIJSetPreallocation(Amat,0,d_nnz);CHKERRQ(ierr);
  ierr = MatMPIAIJSetPreallocation(Amat,0,d_nnz,0,o_nnz);CHKERRQ(ierr);

You can preallocate with an estimate (upper bound). See the documentation for MatMPIAIJSetPreallocation (Google it). You then run your code on one processor and PETSc will distribute it. Now you just add values with (i,j,value) with MatSetValues (Google it). You will find that it is very simple. Now, this simple way will just chop your domain up in a dumb way. If you have a regular grid then you will get a 1D partitioning, which will work for a while. Otherwise you can partition this matrix. But that is another story. You want to start with this simple way anyway. > > 2)When I also tried using MatSetValues it doesn't seem to use the same > indexing as compressed row storage format.What type of indexing should be > used when MatSetValues are used and called from rank 0 for CRS Matrices? > > On 2017-10-18 13:33, Jed Brown wrote: >> Easiest is to assemble into a distributed matrix from rank 0. So >> instead of calling MatCreate using PETSC_COMM_SELF, use a parallel >> communicator (like PETSC_COMM_WORLD). It is fine if only rank 0 calls >> MatSetValues, but all processes must call MatAssemblyBegin/End. >> >> "Jaganathan, Srikrishna" writes: >> >> Hello, >>> >>> >>> I have been trying to distribute a already existing stiffness matrix in >>> my FEM code to petsc parallel matrix object , but I am unable to find >>> any documentation regarding it. It was quite straightforward to create a >>> sequential petsc matrix object and everything was working as intended.I >>> have read some of the user comments in the mailing lists regarding >>> similar situation and most of the times the solution suggested is to >>> create stiffness matrix from the the mesh in distributed format. Since >>> its a little difficult in my case to pass the mesh data in the code , is >>> there anyway to distribute already existing stiffness matrix ?
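To make the recipe concrete, a minimal end-to-end sketch (an illustration, not taken from PETSc's examples): the CSR arrays ia/ja/va and the helper name AssembleFromRank0 are assumptions, with ia/ja 0-based and present on rank 0 only.

    #include <petscmat.h>

    /* Sketch: insert a rank-0 CSR matrix (assumed arrays ia/ja/va, 0-based;
       they may be NULL on the other ranks) into a distributed MATAIJ of
       global size N.  All ranks create and assemble; only rank 0 inserts. */
    PetscErrorCode AssembleFromRank0(MPI_Comm comm,PetscInt N,const PetscInt *ia,
                                     const PetscInt *ja,const PetscScalar *va,Mat *A)
    {
      PetscErrorCode ierr;
      PetscMPIInt    rank;
      PetscInt       i;

      PetscFunctionBegin;
      ierr = MPI_Comm_rank(comm,&rank);CHKERRQ(ierr);
      ierr = MatCreate(comm,A);CHKERRQ(ierr);
      ierr = MatSetSizes(*A,PETSC_DECIDE,PETSC_DECIDE,N,N);CHKERRQ(ierr);
      ierr = MatSetType(*A,MATAIJ);CHKERRQ(ierr);
      ierr = MatSetUp(*A);CHKERRQ(ierr);              /* better: preallocate */
      if (!rank) {
        for (i = 0; i < N; i++) {                     /* one MatSetValues per row */
          ierr = MatSetValues(*A,1,&i,ia[i+1]-ia[i],ja+ia[i],va+ia[i],INSERT_VALUES);CHKERRQ(ierr);
        }
      }
      ierr = MatAssemblyBegin(*A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);  /* collective */
      ierr = MatAssemblyEnd(*A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }

With real preallocation in place of MatSetUp(), the insertion loop runs at full speed; without it, each new nonzero can trigger a malloc, which matches the slowness reported in the follow-up below.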
On 2017-10-19 14:41, Mark Adams wrote: > On Wed, Oct 18, 2017 at 8:18 AM, Jaganathan, Srikrishna < > srikrishna.jaganathan at fau.de> wrote: > >> Thanks for your response, its helpful. >> >> I do have few more questions, most of my matrices are of compressed >> row >> storage format. >> >> 1)So when I was creating sequentially , I just used >> MatCreateSeqAIJWithArrays , but the same for MPI version is quite >> confusing >> to use. I don't understand how to decide on the local rows(it would be >> really helpful if there is an example) . >> > > You don't use local row indices (you but you don't want to). The code > does > not change. As Jed says don't use MatCreateSeqAIJWithArrays. Just use > MatCreate. This is from ksp ex56.c: > > /* create stiffness matrix */ > ierr = MatCreate(comm,&Amat);CHKERRQ(ierr); > ierr = MatSetSizes(Amat,m,m,M,M);CHKERRQ(ierr); > if (!test_late_bs) { > ierr = MatSetBlockSize(Amat,3);CHKERRQ(ierr); > } > ierr = MatSetType(Amat,MATAIJ);CHKERRQ(ierr); > ierr = MatSeqAIJSetPreallocation(Amat,0,d_nnz);CHKERRQ(ierr); > ierr = > MatMPIAIJSetPreallocation(Amat,0,d_nnz,0,o_nnz);CHKERRQ(ierr); > > You can preallocate with an estimate (upper bound). See the > documentation > for MatMPIAIJSetPreallocation (Google it). > > You then run your code on one processor and PETSc will distribute it. > Now > you just add values with (i,j,value) with MatSetValues (Google it). You > will find that it is very simple. > > Now, this simple way will just chop your domain up in a dumb way. If > you > have a regular grid then you will get a 1D partitioning, which will > work > for a while. Otherwise you can partition this matrix. Bat that is > another > story. You want to start with this simple way anyway. > > >> >> 2)When I also tried using MatSetValues it doesn't seem to use the same >> indexing as compressed row storage format.What type of indexing should >> be >> used when MatSetValues are used and called from rank 0 for CRS >> Matrices? >> >> >> On 2017-10-18 13:33, Jed Brown wrote: >> >>> Easiest is to assemble into a distributed matrix from rank 0. So >>> instead of calling MatCreate using PETSC_COMM_SELF, use a parallel >>> communicator (like PETSC_COMM_WORLD). It is fine if only rank 0 >>> calls >>> MatSetValues, but all processes must call MatAssemblyBegin/End. >>> >>> "Jaganathan, Srikrishna" writes: >>> >>> Hello, >>>> >>>> >>>> I have been trying to distribute a already existing stiffness matrix >>>> in >>>> my FEM code to petsc parallel matrix object , but I am unable to >>>> find >>>> any documentation regarding it. It was quite straightforward to >>>> create a >>>> sequential petsc matrix object and everything was working as >>>> intended.I >>>> have read some of the user comments in the mailing lists regarding >>>> similar situation and most of the times the solution suggested is to >>>> create stiffness matrix from the the mesh in distributed format. >>>> Since >>>> its a little difficult in my case to pass the mesh data in the code >>>> , is >>>> there anyway to distribute already existing stiffness matrix ? 
>>>> >>>> Thanks and Regards >>>> >>>> Srikrishna Jaganathan >>>> >>> From knepley at gmail.com Thu Oct 19 09:56:27 2017 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 19 Oct 2017 10:56:27 -0400 Subject: [petsc-users] Distributing already assembled stiffness matrix In-Reply-To: References: <871sm0adxq.fsf@jedbrown.org> <058b9bebf3910c7858ee99e29bc2c5d0@fau.de> Message-ID: On Thu, Oct 19, 2017 at 10:52 AM, Jaganathan, Srikrishna < srikrishna.jaganathan at fau.de> wrote: > Thanks a lot for your detailed reply, it was very slow to insert values > using MatSetValues , I guess its due to lack of preallocation. I still > couldn't figure out how to calculate the parameters required to preallocate > MatMPIAIJ. So I have the nnz array for serial preallocation and this is > accurate. So is there any rule of thumb to arrive at decent numbers for > d_nnz and o_nnz from nnz array ? > Yes. The d_nnz array wants nonzero that occur in the diagonal block. Thus, if your process owns rows [rStart, rEnd) then d_nnz is the number of nonzeros in a row that lie in columns [rStart, rEnd). The rest of the nonzeros in that row are counted in o_nnz. Thanks, Matt > > On 2017-10-19 14:41, Mark Adams wrote: > >> On Wed, Oct 18, 2017 at 8:18 AM, Jaganathan, Srikrishna < >> srikrishna.jaganathan at fau.de> wrote: >> >> Thanks for your response, its helpful. >>> >>> I do have few more questions, most of my matrices are of compressed row >>> storage format. >>> >>> 1)So when I was creating sequentially , I just used >>> MatCreateSeqAIJWithArrays , but the same for MPI version is quite >>> confusing >>> to use. I don't understand how to decide on the local rows(it would be >>> really helpful if there is an example) . >>> >>> >> You don't use local row indices (you but you don't want to). The code does >> not change. As Jed says don't use MatCreateSeqAIJWithArrays. Just use >> MatCreate. This is from ksp ex56.c: >> >> /* create stiffness matrix */ >> ierr = MatCreate(comm,&Amat);CHKERRQ(ierr); >> ierr = MatSetSizes(Amat,m,m,M,M);CHKERRQ(ierr); >> if (!test_late_bs) { >> ierr = MatSetBlockSize(Amat,3);CHKERRQ(ierr); >> } >> ierr = MatSetType(Amat,MATAIJ);CHKERRQ(ierr); >> ierr = MatSeqAIJSetPreallocation(Amat,0,d_nnz);CHKERRQ(ierr); >> ierr = MatMPIAIJSetPreallocation(Amat,0,d_nnz,0,o_nnz);CHKERRQ(ierr); >> >> You can preallocate with an estimate (upper bound). See the documentation >> for MatMPIAIJSetPreallocation (Google it). >> >> You then run your code on one processor and PETSc will distribute it. Now >> you just add values with (i,j,value) with MatSetValues (Google it). You >> will find that it is very simple. >> >> Now, this simple way will just chop your domain up in a dumb way. If you >> have a regular grid then you will get a 1D partitioning, which will work >> for a while. Otherwise you can partition this matrix. Bat that is another >> story. You want to start with this simple way anyway. >> >> >> >>> 2)When I also tried using MatSetValues it doesn't seem to use the same >>> indexing as compressed row storage format.What type of indexing should be >>> used when MatSetValues are used and called from rank 0 for CRS Matrices? >>> >>> >>> On 2017-10-18 13:33, Jed Brown wrote: >>> >>> Easiest is to assemble into a distributed matrix from rank 0. So >>>> instead of calling MatCreate using PETSC_COMM_SELF, use a parallel >>>> communicator (like PETSC_COMM_WORLD). It is fine if only rank 0 calls >>>> MatSetValues, but all processes must call MatAssemblyBegin/End. 
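As a sketch of this counting rule (assuming, for illustration, that the 0-based CSR arrays ia/ja of the serial matrix are readable on every rank, and that A will later be created with the same PETSC_DECIDE row split):

    /* Sketch: count diagonal- and off-diagonal-block nonzeros per locally
       owned row for MatMPIAIJSetPreallocation.  Columns in [rStart,rEnd)
       form the "diagonal" block for this rank; N is the global row count. */
    PetscInt  nloc = PETSC_DECIDE, rStart, rEnd, i, k;
    PetscInt *d_nnz, *o_nnz;

    /* reproduce the split PETSc would choose for PETSC_DECIDE local sizes */
    ierr = PetscSplitOwnership(PETSC_COMM_WORLD,&nloc,&N);CHKERRQ(ierr);
    ierr = MPI_Scan(&nloc,&rEnd,1,MPIU_INT,MPI_SUM,PETSC_COMM_WORLD);CHKERRQ(ierr);
    rStart = rEnd - nloc;
    ierr = PetscMalloc2(nloc,&d_nnz,nloc,&o_nnz);CHKERRQ(ierr);
    for (i = rStart; i < rEnd; i++) {
      d_nnz[i-rStart] = o_nnz[i-rStart] = 0;
      for (k = ia[i]; k < ia[i+1]; k++) {
        if (ja[k] >= rStart && ja[k] < rEnd) d_nnz[i-rStart]++;  /* diagonal block */
        else                                 o_nnz[i-rStart]++;  /* off-diagonal   */
      }
    }
    ierr = MatMPIAIJSetPreallocation(A,0,d_nnz,0,o_nnz);CHKERRQ(ierr);
    ierr = PetscFree2(d_nnz,o_nnz);CHKERRQ(ierr);

If the CSR structure lives only on rank 0, the counts for every rank can be computed there and distributed (e.g. with MPI_Scatterv) before the preallocation call.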
>>>> >>>> "Jaganathan, Srikrishna" writes: >>>> >>>> Hello, >>>> >>>>> >>>>> >>>>> I have been trying to distribute a already existing stiffness matrix in >>>>> my FEM code to petsc parallel matrix object , but I am unable to find >>>>> any documentation regarding it. It was quite straightforward to create >>>>> a >>>>> sequential petsc matrix object and everything was working as intended.I >>>>> have read some of the user comments in the mailing lists regarding >>>>> similar situation and most of the times the solution suggested is to >>>>> create stiffness matrix from the the mesh in distributed format. Since >>>>> its a little difficult in my case to pass the mesh data in the code , >>>>> is >>>>> there anyway to distribute already existing stiffness matrix ? >>>>> >>>>> Thanks and Regards >>>>> >>>>> Srikrishna Jaganathan >>>>> >>>>> >>>> -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From l.verzeroli at studenti.unibg.it Fri Oct 20 00:47:12 2017 From: l.verzeroli at studenti.unibg.it (Luca Verzeroli) Date: Fri, 20 Oct 2017 07:47:12 +0200 Subject: [petsc-users] Good performance For small problem on GALILEO Message-ID: <59e98de3.c792df0a.e2d4f.07d9@mx.google.com> Good morning, For my thesis I'm dealing with GALILEO, one of the clusters owned by Cineca. http://www.hpc.cineca.it/hardware/galileo The first question is: What is the best configuration to run petsc on this kind of cluster? My code is only a MPI program and I would like to know if it's better to use more nodes or more CPUs with mpirun. This question comes from the speed up of my code using that cluster. I have a small problem. The global matrices are 600x600. Are they too small to see a speed up with more mpiprocess? I notice that a single core simulation and a multi cores one take a similar time (multi core a second more). The real problem comes when I have to run multiple simulation of the same code changing some parameters. So I would like to speed up the single simulation. Any advices? Luca Verzeroli -------------- next part -------------- An HTML attachment was scrubbed... URL: From jychang48 at gmail.com Fri Oct 20 01:55:09 2017 From: jychang48 at gmail.com (Justin Chang) Date: Fri, 20 Oct 2017 06:55:09 +0000 Subject: [petsc-users] Good performance For small problem on GALILEO In-Reply-To: <59e98de3.c792df0a.e2d4f.07d9@mx.google.com> References: <59e98de3.c792df0a.e2d4f.07d9@mx.google.com> Message-ID: 600 unknowns is way too small to parallelize. Need at least 10,000 unknowns per MPI process: https://www.mcs.anl.gov/petsc/documentation/faq.html#slowerparallel What problem are you solving? Sounds like you either compiled PETSc with debugging mode on or you just have a really terrible solver. Show us the output of -log_view. On Fri, Oct 20, 2017 at 12:47 AM Luca Verzeroli < l.verzeroli at studenti.unibg.it> wrote: > Good morning, > For my thesis I'm dealing with GALILEO, one of the clusters owned by > Cineca. http://www.hpc.cineca.it/hardware/galileo > The first question is: What is the best configuration to run petsc on this > kind of cluster? My code is only a MPI program and I would like to know if > it's better to use more nodes or more CPUs with mpirun. > This question comes from the speed up of my code using that cluster. I > have a small problem. 
The global matrices are 600x600. Are they too small > to see a speed up with more mpiprocess? I notice that a single core > simulation and a multi cores one take a similar time (multi core a second > more). The real problem comes when I have to run multiple simulation of the > same code changing some parameters. So I would like to speed up the single > simulation. > Any advices? > > > Luca Verzeroli > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Oct 20 05:31:52 2017 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 20 Oct 2017 06:31:52 -0400 Subject: [petsc-users] Good performance For small problem on GALILEO In-Reply-To: References: <59e98de3.c792df0a.e2d4f.07d9@mx.google.com> Message-ID: Justin is right that parallelism will be of limited value for such small systems. This looks like a serial optimization job. Moreover, in this case, a better numerical method usually trumps any kind of machine optimization. Matt On Fri, Oct 20, 2017 at 2:55 AM, Justin Chang wrote: > 600 unknowns is way too small to parallelize. Need at least 10,000 > unknowns per MPI process: https://www.mcs.anl.gov/petsc/documentation/faq. > html#slowerparallel > > What problem are you solving? Sounds like you either compiled PETSc with > debugging mode on or you just have a really terrible solver. Show us the > output of -log_view. > > On Fri, Oct 20, 2017 at 12:47 AM Luca Verzeroli < > l.verzeroli at studenti.unibg.it> wrote: > >> Good morning, >> For my thesis I'm dealing with GALILEO, one of the clusters owned by >> Cineca. http://www.hpc.cineca.it/hardware/galileo >> The first question is: What is the best configuration to run petsc on >> this kind of cluster? My code is only a MPI program and I would like to >> know if it's better to use more nodes or more CPUs with mpirun. >> This question comes from the speed up of my code using that cluster. I >> have a small problem. The global matrices are 600x600. Are they too small >> to see a speed up with more mpiprocess? I notice that a single core >> simulation and a multi cores one take a similar time (multi core a second >> more). The real problem comes when I have to run multiple simulation of the >> same code changing some parameters. So I would like to speed up the single >> simulation. >> Any advices? >> >> >> Luca Verzeroli >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From dnolte at dim.uchile.cl Fri Oct 20 13:32:21 2017 From: dnolte at dim.uchile.cl (David Nolte) Date: Fri, 20 Oct 2017 15:32:21 -0300 Subject: [petsc-users] GAMG advice Message-ID: <47a47b6b-ce8c-10f6-0ded-bf87e9af1bbd@dim.uchile.cl> Dear all, I have some problems using GAMG as a preconditioner for (F)GMRES. Background: I am solving the incompressible, unsteady Navier-Stokes equations with a coupled mixed FEM approach, using P1/P1 elements for velocity and pressure on an unstructured tetrahedron mesh with about 2mio DOFs (and up to 15mio). The method is stabilized with SUPG/PSPG, hence, no zeros on the diagonal of the pressure block. Time discretization with semi-implicit backward Euler. The flow is a convection dominated flow through a nozzle. 
From dnolte at dim.uchile.cl  Fri Oct 20 13:32:21 2017
From: dnolte at dim.uchile.cl (David Nolte)
Date: Fri, 20 Oct 2017 15:32:21 -0300
Subject: [petsc-users] GAMG advice
Message-ID: <47a47b6b-ce8c-10f6-0ded-bf87e9af1bbd@dim.uchile.cl>

Dear all,

I have some problems using GAMG as a preconditioner for (F)GMRES.
Background: I am solving the incompressible, unsteady Navier-Stokes
equations with a coupled mixed FEM approach, using P1/P1 elements for
velocity and pressure on an unstructured tetrahedron mesh with about
2 million DOFs (and up to 15 million). The method is stabilized with
SUPG/PSPG, hence there are no zeros on the diagonal of the pressure
block. Time discretization is semi-implicit backward Euler. The flow is
a convection-dominated flow through a nozzle.

So far, for this setup, I have been quite happy with a simple FGMRES/ML
solver for the full system (rather brute force, I admit, but much faster
than any block/Schur preconditioners I tried):

    -ksp_converged_reason
    -ksp_monitor_true_residual
    -ksp_type fgmres
    -ksp_rtol 1.0e-6
    -ksp_initial_guess_nonzero

    -pc_type ml
    -pc_ml_Threshold 0.03
    -pc_ml_maxNlevels 3

This setup converges in ~100 iterations (see below the ksp_view output) to rtol:

119 KSP unpreconditioned resid norm 4.004030812027e-05 true resid norm 4.004030812037e-05 ||r(i)||/||b|| 1.621791251517e-06
120 KSP unpreconditioned resid norm 3.256863709982e-05 true resid norm 3.256863709982e-05 ||r(i)||/||b|| 1.319158947617e-06
121 KSP unpreconditioned resid norm 2.751959681502e-05 true resid norm 2.751959681503e-05 ||r(i)||/||b|| 1.114652795021e-06
122 KSP unpreconditioned resid norm 2.420611122789e-05 true resid norm 2.420611122788e-05 ||r(i)||/||b|| 9.804434897105e-07

Now I'd like to try GAMG instead of ML. However, I don't know how to set
it up to get similar performance. The obvious/naive

    -pc_type gamg
    -pc_gamg_type agg

    # with and without
    -pc_gamg_threshold 0.03
    -pc_mg_levels 3

converges very slowly on 1 proc and much worse on 8 (~200k DOFs per proc), for instance:

np = 1:
980 KSP unpreconditioned resid norm 1.065009356215e-02 true resid norm 1.065009356215e-02 ||r(i)||/||b|| 4.532259705508e-04
981 KSP unpreconditioned resid norm 1.064978578182e-02 true resid norm 1.064978578182e-02 ||r(i)||/||b|| 4.532128726342e-04
982 KSP unpreconditioned resid norm 1.064956706598e-02 true resid norm 1.064956706598e-02 ||r(i)||/||b|| 4.532035649508e-04

np = 8:
980 KSP unpreconditioned resid norm 3.179946748495e-02 true resid norm 3.179946748495e-02 ||r(i)||/||b|| 1.353259896710e-03
981 KSP unpreconditioned resid norm 3.179946748317e-02 true resid norm 3.179946748317e-02 ||r(i)||/||b|| 1.353259896634e-03
982 KSP unpreconditioned resid norm 3.179946748317e-02 true resid norm 3.179946748317e-02 ||r(i)||/||b|| 1.353259896634e-03

A very high threshold seems to improve the GAMG PC; for instance, with
0.75 I get convergence to rtol=1e-6 after 744 iterations.
What else should I try?

I would very much appreciate any advice on configuring GAMG and on the
differences w.r.t. ML to be taken into account (not a multigrid expert,
though).

Thanks, best wishes
David

------
ksp_view for -pc_type gamg -pc_gamg_threshold 0.75 -pc_mg_levels 3

KSP Object: 1 MPI processes
  type: fgmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=10000
  tolerances:  relative=1e-06, absolute=1e-50, divergence=10000.
  right preconditioning
  using nonzero initial guess
  using UNPRECONDITIONED norm type for convergence test
PC Object: 1 MPI processes
  type: gamg
    MG: type is MULTIPLICATIVE, levels=1 cycles=v
      Cycles per PCApply=1
      Using Galerkin computed coarse grid matrices
      GAMG specific options
        Threshold for dropping small values from graph 0.75
        AGG specific options
          Symmetric graph false
  Coarse grid solver -- level -------------------------------
    KSP Object: (mg_levels_0_) 1 MPI processes
      type: preonly
      maximum iterations=2, initial guess is zero
      tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_levels_0_) 1 MPI processes
      type: sor
        SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
      linear system matrix = precond matrix:
      Mat Object: 1 MPI processes
        type: seqaij
        rows=1745224, cols=1745224
        total: nonzeros=99452608, allocated nonzeros=99452608
        total number of mallocs used during MatSetValues calls =0
          using I-node routines: found 1037847 nodes, limit used is 5
  linear system matrix = precond matrix:
  Mat Object: 1 MPI processes
    type: seqaij
    rows=1745224, cols=1745224
    total: nonzeros=99452608, allocated nonzeros=99452608
    total number of mallocs used during MatSetValues calls =0
      using I-node routines: found 1037847 nodes, limit used is 5

------
ksp_view for -pc_type ml:

KSP Object: 8 MPI processes
  type: fgmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=10000
  tolerances:  relative=1e-06, absolute=1e-50, divergence=10000.
  right preconditioning
  using nonzero initial guess
  using UNPRECONDITIONED norm type for convergence test
PC Object: 8 MPI processes
  type: ml
    MG: type is MULTIPLICATIVE, levels=3 cycles=v
      Cycles per PCApply=1
      Using Galerkin computed coarse grid matrices
  Coarse grid solver -- level -------------------------------
    KSP Object: (mg_coarse_) 8 MPI processes
      type: preonly
      maximum iterations=10000, initial guess is zero
      tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_coarse_) 8 MPI processes
      type: redundant
        Redundant preconditioner: First (color=0) of 8 PCs follows
        KSP Object: (mg_coarse_redundant_) 1 MPI processes
          type: preonly
          maximum iterations=10000, initial guess is zero
          tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
          left preconditioning
          using NONE norm type for convergence test
        PC Object: (mg_coarse_redundant_) 1 MPI processes
          type: lu
            LU: out-of-place factorization
            tolerance for zero pivot 2.22045e-14
            using diagonal shift on blocks to prevent zero pivot [INBLOCKS]
            matrix ordering: nd
            factor fill ratio given 5., needed 10.4795
              Factored matrix follows:
                Mat Object: 1 MPI processes
                  type: seqaij
                  rows=6822, cols=6822
                  package used to perform factorization: petsc
                  total: nonzeros=9575688, allocated nonzeros=9575688
                  total number of mallocs used during MatSetValues calls =0
                    not using I-node routines
          linear system matrix = precond matrix:
          Mat Object: 1 MPI processes
            type: seqaij
            rows=6822, cols=6822
            total: nonzeros=913758, allocated nonzeros=913758
            total number of mallocs used during MatSetValues calls =0
              not using I-node routines
      linear system matrix = precond matrix:
      Mat Object: 8 MPI processes
        type: mpiaij
        rows=6822, cols=6822
        total: nonzeros=913758, allocated nonzeros=913758
        total number of mallocs used during MatSetValues calls =0
          not using I-node (on process 0) routines
  Down solver (pre-smoother) on level 1 -------------------------------
    KSP Object: (mg_levels_1_) 8 MPI processes
      type: richardson
        Richardson: damping factor=1.
      maximum iterations=2
      tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using nonzero initial guess
      using NONE norm type for convergence test
    PC Object: (mg_levels_1_) 8 MPI processes
      type: sor
        SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
      linear system matrix = precond matrix:
      Mat Object: 8 MPI processes
        type: mpiaij
        rows=67087, cols=67087
        total: nonzeros=9722749, allocated nonzeros=9722749
        total number of mallocs used during MatSetValues calls =0
          not using I-node (on process 0) routines
  Up solver (post-smoother) same as down solver (pre-smoother)
  Down solver (pre-smoother) on level 2 -------------------------------
    KSP Object: (mg_levels_2_) 8 MPI processes
      type: richardson
        Richardson: damping factor=1.
      maximum iterations=2
      tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using nonzero initial guess
      using NONE norm type for convergence test
    PC Object: (mg_levels_2_) 8 MPI processes
      type: sor
        SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
      linear system matrix = precond matrix:
      Mat Object: 8 MPI processes
        type: mpiaij
        rows=1745224, cols=1745224
        total: nonzeros=99452608, allocated nonzeros=99452608
        total number of mallocs used during MatSetValues calls =0
          using I-node (on process 0) routines: found 126690 nodes, limit used is 5
  Up solver (post-smoother) same as down solver (pre-smoother)
  linear system matrix = precond matrix:
  Mat Object: 8 MPI processes
    type: mpiaij
    rows=1745224, cols=1745224
    total: nonzeros=99452608, allocated nonzeros=99452608
    total number of mallocs used during MatSetValues calls =0
      using I-node (on process 0) routines: found 126690 nodes, limit used is 5

From dnolte at dim.uchile.cl  Fri Oct 20 14:06:34 2017
From: dnolte at dim.uchile.cl (David Nolte)
Date: Fri, 20 Oct 2017 16:06:34 -0300
Subject: Re: [petsc-users] GAMG advice
In-Reply-To: <47a47b6b-ce8c-10f6-0ded-bf87e9af1bbd@dim.uchile.cl>
References: <47a47b6b-ce8c-10f6-0ded-bf87e9af1bbd@dim.uchile.cl>
Message-ID: <991cd7c4-bb92-ed2c-193d-7232c1ff6199@dim.uchile.cl>

PS: I didn't realize it at first, but it looks as if the -pc_mg_levels 3
option was not taken into account:
  type: gamg
    MG: type is MULTIPLICATIVE, levels=1 cycles=v

On 10/20/2017 03:32 PM, David Nolte wrote:
> [...]
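As a hedged starting point for the GAMG experiments David describes (the specific values below are illustrative only, not recommendations taken from this thread), smoothed aggregation, which is the variant ML implements, is selected in GAMG with a nonzero nsmooths, and the per-level smoother can be made explicit:

    -pc_type gamg -pc_gamg_type agg -pc_gamg_agg_nsmooths 1 \
    -pc_gamg_threshold 0.03 -pc_gamg_square_graph 1 \
    -mg_levels_ksp_type chebyshev -mg_levels_pc_type sor

All of these are standard PETSc options in the 3.7/3.8 series; the mg_levels_* pair fixes the smoother instead of relying on defaults.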
From bsmith at mcs.anl.gov  Fri Oct 20 17:42:03 2017
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Fri, 20 Oct 2017 17:42:03 -0500
Subject: Re: [petsc-users] Distributing already assembled stiffness matrix
In-Reply-To:
References:
Message-ID:

> On Oct 18, 2017, at 4:14 AM, Jaganathan, Srikrishna wrote:
>
> Hello,
>
> I have been trying to distribute an already existing stiffness matrix
> in my FEM code to a PETSc parallel matrix object, but I am unable to
> find any documentation regarding it.

   I really, really don't recommend doing this. If your code is
sequential and you want to speed up the linear solves, then use some
linear solver library that uses OpenMP for parallelism and be done with
it. Mixing sequential finite element assembly with MPI-parallel solvers
is just not worth going within 100 meters of.
   Barry

> It was quite straightforward to create a sequential PETSc matrix object
> and everything was working as intended. I have read some of the user
> comments in the mailing lists regarding similar situations, and most of
> the time the suggested solution is to create the stiffness matrix from
> the mesh in distributed format. Since it is a little difficult in my
> case to pass the mesh data into the code, is there any way to distribute
> an already existing stiffness matrix?
>
> Thanks and Regards
>
> Srikrishna Jaganathan

From mfadams at lbl.gov  Fri Oct 20 18:21:38 2017
From: mfadams at lbl.gov (Mark Adams)
Date: Fri, 20 Oct 2017 19:21:38 -0400
Subject: Re: [petsc-users] Distributing already assembled stiffness matrix
In-Reply-To:
References:
Message-ID:

On Fri, Oct 20, 2017 at 6:42 PM, Barry Smith wrote:

> [...]

I agree with Barry. And, now that it has come up, let me drop another
approach: run your serial code on every processor and have PETSc ignore
off-process entries. This saves time because you have no communication at
the end of the assembly, but it uses more "power" because every process is
busy doing (mostly useless) work the whole time. Now you can start
trimming the useless work away: first mark the elements that do not touch
local vertices and skip them; next make a list of the active elements and
iterate over those; etc. At some point you want to partition your grid
intelligently. It is not clear to me where you should do this, but there
is code in GAMG that takes an existing distributed matrix, partitions it
in parallel, and does a scatter/gather to reconstitute the matrix with the
good partitioning. This is all very scalable. That code is a bit involved
and you would have to adapt it. Next you will want to distribute your
metadata ...
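For reference, a minimal sketch of the redundant-assembly idea Mark outlines, in PETSc C. MAT_IGNORE_OFF_PROC_ENTRIES is a real MatOption; ElementStiffness and the 4-node element layout are hypothetical stand-ins for the existing serial FEM kernel:

/* Every rank runs the identical serial element loop; values destined for
 * rows owned by another rank are silently dropped, so each row is
 * assembled exactly once, by its owner, with no communication. */
#include <petscmat.h>

extern void ElementStiffness(PetscInt e, PetscInt idx[4], PetscScalar Ke[16]); /* hypothetical serial kernel */

PetscErrorCode AssembleRedundant(Mat A, PetscInt nelem)
{
  PetscErrorCode ierr;
  PetscInt       e, idx[4];
  PetscScalar    Ke[16];

  PetscFunctionBeginUser;
  /* Drop (rather than communicate) any value whose row lives elsewhere */
  ierr = MatSetOption(A, MAT_IGNORE_OFF_PROC_ENTRIES, PETSC_TRUE);CHKERRQ(ierr);
  for (e = 0; e < nelem; ++e) {
    ElementStiffness(e, idx, Ke);                             /* serial code, run redundantly */
    ierr = MatSetValues(A, 4, idx, 4, idx, Ke, ADD_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr); /* nothing left to communicate */
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

Trimming the loop to elements touching locally owned vertices, as Mark suggests, then removes the redundant work without changing the result.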
From fande.kong at inl.gov  Fri Oct 20 18:43:51 2017
From: fande.kong at inl.gov (Kong, Fande)
Date: Fri, 20 Oct 2017 17:43:51 -0600
Subject: [petsc-users] "Must select a target sorting criterion if using shift-and-invert"
Message-ID:

Hi All,

I am trying to solve a generalized eigenvalue problem (using SLEPc) with
"-eps_type krylovschur -st_type sinvert". I got an error message: "Must
select a target sorting criterion if using shift-and-invert". I am not
sure how to proceed; I do not quite understand this sentence.

Fande,

From knepley at gmail.com  Fri Oct 20 18:51:23 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Fri, 20 Oct 2017 19:51:23 -0400
Subject: Re: [petsc-users] "Must select a target sorting criterion if using shift-and-invert"
In-Reply-To:
References:
Message-ID:

On Fri, Oct 20, 2017 at 7:43 PM, Kong, Fande wrote:

> [...]

You need to know how to choose the shift. For instance, you may want the
smallest eigenvalues, or the ones closest to zero, etc. I don't know the
options, but they are in the manual.

   Matt

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead. -- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

From bsmith at mcs.anl.gov  Fri Oct 20 22:10:02 2017
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Fri, 20 Oct 2017 22:10:02 -0500
Subject: Re: [petsc-users] GAMG advice
In-Reply-To: <991cd7c4-bb92-ed2c-193d-7232c1ff6199@dim.uchile.cl>
References: <47a47b6b-ce8c-10f6-0ded-bf87e9af1bbd@dim.uchile.cl>
 <991cd7c4-bb92-ed2c-193d-7232c1ff6199@dim.uchile.cl>
Message-ID: <6169118C-34FE-491C-BCB4-A86BECCFBAA9@mcs.anl.gov>

  David,

   GAMG picks the number of levels based on how the coarsening process
etc. proceeds; you cannot hardwire it to a particular value. You can run
with -info to get more information on the decisions GAMG is making.

  Barry

> On Oct 20, 2017, at 2:06 PM, David Nolte wrote:
>
> PS: I didn't realize it at first, but it looks as if the -pc_mg_levels 3
> option was not taken into account:
>   type: gamg
>     MG: type is MULTIPLICATIVE, levels=1 cycles=v
>
> On 10/20/2017 03:32 PM, David Nolte wrote:
>> [...]
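For reference, a hedged sketch of what Barry's -info suggestion could look like on the command line; ./app stands in for the actual executable, and the grep filter is just a convenience:

    mpirun -n 8 ./app -pc_type gamg -pc_gamg_type agg -pc_gamg_threshold 0.03 -info | grep -i gamg

The filtered setup lines report, level by level, how many equations and nonzeros survive each coarsening step, which would show where GAMG decided to stop and why only one level may remain.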
From jroman at dsic.upv.es  Sat Oct 21 01:20:46 2017
From: jroman at dsic.upv.es (Jose E. Roman)
Date: Sat, 21 Oct 2017 08:20:46 +0200
Subject: Re: [petsc-users] "Must select a target sorting criterion if using shift-and-invert"
In-Reply-To:
References:
Message-ID:

This was added in 3.8 to catch the common case where people incorrectly
set shift-and-invert with EPS_SMALLEST_MAGNITUDE. To compute the smallest
eigenvalues with shift-and-invert, the correct way is to set target=0 and
which=EPS_TARGET_MAGNITUDE. See for instance
http://slepc.upv.es/documentation/current/src/eps/examples/tutorials/ex13.c.html

Jose

> El 21 oct 2017, a las 1:51, Matthew Knepley escribió:
> [...]
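For reference, a minimal C sketch of the setting Jose describes, assuming an existing EPS object named eps; the calls are the standard SLEPc ones:

    ST st;
    ierr = EPSSetWhichEigenpairs(eps, EPS_TARGET_MAGNITUDE);CHKERRQ(ierr); /* sort by closeness to the target */
    ierr = EPSSetTarget(eps, 0.0);CHKERRQ(ierr);                           /* smallest eigenvalues <-> target 0 */
    ierr = EPSGetST(eps, &st);CHKERRQ(ierr);
    ierr = STSetType(st, STSINVERT);CHKERRQ(ierr);                         /* shift-and-invert around the target */

or, equivalently, on the command line: -st_type sinvert -eps_target 0 -eps_target_magnitude.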
From srikrishna.jaganathan at fau.de  Sat Oct 21 05:18:10 2017
From: srikrishna.jaganathan at fau.de (Jaganathan, Srikrishna)
Date: Sat, 21 Oct 2017 12:18:10 +0200
Subject: Re: [petsc-users] Distributing already assembled stiffness matrix
In-Reply-To:
References:
Message-ID: <7a27a95085cf4676e48882acdf2c62e4@fau.de>

On 2017-10-21 00:42, Barry Smith wrote:
>> [...]
>
> I really, really don't recommend doing this. If your code is
> sequential and you want to speed up the linear solves, then use some
> linear solver library that uses OpenMP for parallelism and be done
> with it. Mixing sequential finite element assembly with MPI-parallel
> solvers is just not worth going within 100 meters of.
>
> Barry

Yeah, it makes sense not to mix both. We are trying to have an
MPI-parallel finite element assembly, but it is quite a big overhaul of
the existing code, so as a first step we are trying to incorporate the
solver and then proceed from there.

>> [...]

From knepley at gmail.com  Sat Oct 21 07:27:09 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Sat, 21 Oct 2017 08:27:09 -0400
Subject: Re: [petsc-users] "Must select a target sorting criterion if using shift-and-invert"
In-Reply-To:
References:
Message-ID:

On Sat, Oct 21, 2017 at 2:20 AM, Jose E. Roman wrote:

> This was added in 3.8 to catch the common case where people incorrectly
> set shift-and-invert with EPS_SMALLEST_MAGNITUDE. To compute the smallest
> eigenvalues with shift-and-invert, the correct way is to set target=0 and
> which=EPS_TARGET_MAGNITUDE. See for instance
> http://slepc.upv.es/documentation/current/src/eps/examples/tutorials/ex13.c.html

Jose, one thing we are trying to do in PETSc now is to give the options to
fix a problem (or at least representative options) directly in the error
message, or maybe a pointer to the relevant manual or tutorial section.
This gives users a hand up.

  Thanks,

    Matt

> [...]
-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead. -- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

From bsmith at mcs.anl.gov  Sat Oct 21 08:58:12 2017
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Sat, 21 Oct 2017 08:58:12 -0500
Subject: Re: [petsc-users] Distributing already assembled stiffness matrix
In-Reply-To: <7a27a95085cf4676e48882acdf2c62e4@fau.de>
References: <7a27a95085cf4676e48882acdf2c62e4@fau.de>
Message-ID:

> On Oct 21, 2017, at 5:18 AM, Jaganathan, Srikrishna wrote:
>
> [...]
>
> Yeah, it makes sense not to mix both. We are trying to have an
> MPI-parallel finite element assembly, but it is quite a big overhaul of
> the existing code, so as a first step we are trying to incorporate the
> solver and then proceed from there.

   I don't recommend that route; other routes are better.

From hbcbh1999 at gmail.com  Sat Oct 21 16:04:36 2017
From: hbcbh1999 at gmail.com (Hao Zhang)
Date: Sat, 21 Oct 2017 17:04:36 -0400
Subject: [petsc-users] HYPRE hanging or slow? from observation
Message-ID:

Hi,

I implemented a HYPRE preconditioner for my study because, without a
preconditioner, the PETSc solver takes thousands of iterations to converge
on fine-grid simulations.

With HYPRE, depending on the parallel partition, the solver takes forever
to do anything; what I observe in the output file is that the simulation
hangs, producing no output.

Any idea what happened? I will post a snippet of the code.

-- 
Hao Zhang
Dept. of Applied Mathematics and Statistics,
Stony Brook University,
Stony Brook, New York, 11790

From knepley at gmail.com  Sat Oct 21 16:16:04 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Sat, 21 Oct 2017 17:16:04 -0400
Subject: Re: [petsc-users] HYPRE hanging or slow?
from observation
In-Reply-To:
References:
Message-ID:

On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang wrote:

> Hi,
>
> I implemented a HYPRE preconditioner for my study because, without a
> preconditioner, the PETSc solver takes thousands of iterations to
> converge on fine-grid simulations.
>
> With HYPRE, depending on the parallel partition, the solver takes
> forever to do anything; the simulation hangs with no output.
>
> Any idea what happened? I will post a snippet of the code.

1) For any question about convergence, we need to see the output of

  -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason

2) Hypre has many preconditioners; which one are you talking about?

3) PETSc has some preconditioners in common with Hypre, like AMG.

  Thanks,

    Matt

> -- 
> Hao Zhang
> Dept. of Applied Mathematics and Statistics,
> Stony Brook University,
> Stony Brook, New York, 11790

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead. -- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

From hbcbh1999 at gmail.com  Sat Oct 21 16:21:20 2017
From: hbcbh1999 at gmail.com (Hao Zhang)
Date: Sat, 21 Oct 2017 17:21:20 -0400
Subject: Re: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To:
References:
Message-ID:

ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);

ierr = VecAssemblyBegin(x);
ierr = VecAssemblyEnd(x);

ierr = VecAssemblyBegin(b);
ierr = VecAssemblyEnd(b);

ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp);
ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8

// KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
KSPSetOperators(ksp,A,A);

KSPSetType(ksp,KSPBCGS);

KSPSetComputeSingularValues(ksp, PETSC_TRUE);
#if defined(__HYPRE__)
KSPGetPC(ksp, &pc);
PCSetType(pc, PCHYPRE);
PCHYPRESetType(pc,"boomeramg");
#else
KSPSetType(ksp,KSPBCGSL);
KSPBCGSLSetEll(ksp,2);
#endif /* defined(__HYPRE__) */

KSPSetFromOptions(ksp);
KSPSetUp(ksp);

ierr = KSPSolve(ksp,b,x);

command line

On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley wrote:
> [...]

-- 
Hao Zhang
Dept. of Applied Mathematics and Statistics,
Stony Brook University,
Stony Brook, New York, 11790
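Matt's reply below asks whether the right-hand side is consistent with this null space. For reference, a minimal sketch of enforcing that, reusing the nullsp object from the snippet above; MatNullSpaceRemove is the standard PETSc call in the 3.8 series:

    /* Project the constant null-space component out of b, so that b
     * lies in the range of A before the solve. */
    ierr = MatNullSpaceRemove(nullsp, b);CHKERRQ(ierr);
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);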
From knepley at gmail.com  Sat Oct 21 16:25:02 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Sat, 21 Oct 2017 17:25:02 -0400
Subject: Re: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To:
References:
Message-ID:

On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang wrote:

> ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
> ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);
>
> ierr = VecAssemblyBegin(x);
> ierr = VecAssemblyEnd(x);

This is probably unnecessary.

> ierr = VecAssemblyBegin(b);
> ierr = VecAssemblyEnd(b);

This is probably unnecessary.

> ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp);
> ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8

Is your rhs consistent with this nullspace?

> // KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
> KSPSetOperators(ksp,A,A);
>
> KSPSetType(ksp,KSPBCGS);
>
> KSPSetComputeSingularValues(ksp, PETSC_TRUE);
> #if defined(__HYPRE__)
> KSPGetPC(ksp, &pc);
> PCSetType(pc, PCHYPRE);
> PCHYPRESetType(pc,"boomeramg");

This is terribly unnecessary. You just use

  -pc_type hypre -pc_hypre_type boomeramg

or

  -pc_type gamg

> #else
> KSPSetType(ksp,KSPBCGSL);
> KSPBCGSLSetEll(ksp,2);
> #endif /* defined(__HYPRE__) */
>
> KSPSetFromOptions(ksp);
> KSPSetUp(ksp);
>
> ierr = KSPSolve(ksp,b,x);
>
> command line

You did not provide any of what I asked for in the previous mail.

   Matt

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead. -- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
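For reference, a minimal sketch of the options-driven setup Matt recommends above: the hard-coded PC calls disappear and everything is selected at run time (option names are the standard PETSc ones; ./solver is a placeholder):

    ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); /* KSP type, PC, tolerances all come from the command line */
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

run as, for instance:

    mpirun -n 1 ./solver -ksp_type bcgs -pc_type hypre -pc_hypre_type boomeramg \
        -ksp_monitor_true_residual -ksp_converged_reason -ksp_view

This keeps the source identical while switching between HYPRE, GAMG, or no preconditioner at all from the command line.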
From hbcbh1999 at gmail.com  Sat Oct 21 16:30:06 2017
From: hbcbh1999 at gmail.com (Hao Zhang)
Date: Sat, 21 Oct 2017 17:30:06 -0400
Subject: Re: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To:
References:
Message-ID:

This is a serial run, still dumping output; the parallel case looks more
or less the same.

KSP Object: 1 MPI processes
  type: bcgs
  maximum iterations=40000, initial guess is zero
  tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: 1 MPI processes
  type: hypre
    HYPRE BoomerAMG preconditioning
    Cycle type V
    Maximum number of levels 25
    Maximum number of iterations PER hypre call 1
    Convergence tolerance PER hypre call 0.
    Threshold for strong coupling 0.25
    Interpolation truncation factor 0.
    Interpolation: max elements per row 0
    Number of levels of aggressive coarsening 0
    Number of paths for aggressive coarsening 1
    Maximum row sums 0.9
    Sweeps down         1
    Sweeps up           1
    Sweeps on coarse    1
    Relax down          symmetric-SOR/Jacobi
    Relax up            symmetric-SOR/Jacobi
    Relax on coarse     Gaussian-elimination
    Relax weight  (all)      1.
    Outer relax weight (all) 1.
    Using CF-relaxation
    Not using more complex smoothers.
    Measure type        local
    Coarsen type        Falgout
    Interpolation type  classical
  linear system matrix = precond matrix:
  Mat Object: A 1 MPI processes
    type: seqaij
    rows=497664, cols=497664
    total: nonzeros=3363552, allocated nonzeros=3483648
    total number of mallocs used during MatSetValues calls =0
      has attached null space
      not using I-node routines
  0 KSP preconditioned resid norm 1.630377897956e+01 true resid norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00
  1 KSP preconditioned resid norm 3.815819909205e-02 true resid norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01
  2 KSP preconditioned resid norm 4.312671277701e-04 true resid norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03
  3 KSP preconditioned resid norm 3.011875330569e-04 true resid norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05
  4 KSP preconditioned resid norm 3.011783821295e-04 true resid norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05
  5 KSP preconditioned resid norm 3.011783818372e-04 true resid norm 2.504053673004e-04 ||r(i)||/||b|| 3.399594279422e-05
  6 KSP preconditioned resid norm 3.011783818375e-04 true resid norm 2.503984078890e-04 ||r(i)||/||b|| 3.399499795925e-05
  7 KSP preconditioned resid norm 3.011783887442e-04 true resid norm 2.504121501772e-04 ||r(i)||/||b|| 3.399686366224e-05
  8 KSP preconditioned resid norm 3.010913654181e-04 true resid norm 2.504150259031e-04 ||r(i)||/||b|| 3.399725408124e-05
  9 KSP preconditioned resid norm 3.006520688232e-04 true resid norm 2.504061607382e-04 ||r(i)||/||b|| 3.399605051423e-05
 10 KSP preconditioned resid norm 3.007309991942e-04 true resid norm 2.503843638523e-04 ||r(i)||/||b|| 3.399309128978e-05
 11 KSP preconditioned resid norm 3.015946168077e-04 true resid norm 2.503644677844e-04 ||r(i)||/||b|| 3.399039012728e-05
 12 KSP preconditioned resid norm 2.956643907377e-04 true resid norm 2.503863046509e-04 ||r(i)||/||b|| 3.399335477965e-05
 13 KSP preconditioned resid norm 2.997992358936e-04 true resid norm 2.504336903093e-04 ||r(i)||/||b|| 3.399978802886e-05
 14 KSP preconditioned resid norm 2.481415420420e-05 true resid norm 2.491591201250e-04 ||r(i)||/||b|| 3.382674774806e-05
 15 KSP preconditioned resid norm 2.615494786181e-05 true resid norm 2.490353237273e-04 ||r(i)||/||b|| 3.380994069915e-05
 16 KSP preconditioned resid norm 2.645126692130e-05 true resid norm 2.490535523344e-04 ||r(i)||/||b|| 3.381241548111e-05
 17 KSP preconditioned resid norm 2.667223026209e-05 true resid norm 2.490482602536e-04 ||r(i)||/||b|| 3.381169700898e-05
 18 KSP preconditioned resid norm 2.650813432116e-05 true resid norm 2.490473169014e-04 ||r(i)||/||b|| 3.381156893606e-05
 19 KSP
preconditioned resid norm 2.613309555449e-05 true resid norm 2.490465633690e-04 ||r(i)||/||b|| 3.381146663375e-05
[iterations 20 through 344 omitted: the preconditioned residual oscillates between roughly 4.9e-07 and 2.8e-05 while the true residual stays pinned near 2.490e-04, i.e. ||r(i)||/||b|| ~ 3.38e-05]
 345 KSP preconditioned resid norm 1.981005134253e-06 true resid norm 2.490748677339e-04 ||r(i)||/||b|| 3.381530933721e-05
 346 KSP preconditioned resid norm 1.959802663114e-06 true resid norm 2.490773752317e-04 ||r(i)||/||b|| 3.381564976423e-05

On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley wrote:

> On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang wrote:
>
>> ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
>> ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);
>>
>> ierr = VecAssemblyBegin(x);
>> ierr = VecAssemblyEnd(x);
>>
> This is probably unnecessary
>
>> ierr = VecAssemblyBegin(b);
>> ierr = VecAssemblyEnd(b);
>>
> This is probably unnecessary
>
>> ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp);
>> ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8
>>
> Is your rhs consistent with this nullspace?
>
>> // KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
>> KSPSetOperators(ksp,A,A);
>>
>> KSPSetType(ksp,KSPBCGS);
>>
>> KSPSetComputeSingularValues(ksp, PETSC_TRUE);
>> #if defined(__HYPRE__)
>> KSPGetPC(ksp, &pc);
>> PCSetType(pc, PCHYPRE);
>> PCHYPRESetType(pc,"boomeramg");
>>
> This is terribly unnecessary. You just use
>
>    -pc_type hypre -pc_hypre_type boomeramg
>
> or
>
>    -pc_type gamg
>
>> #else
>> KSPSetType(ksp,KSPBCGSL);
>> KSPBCGSLSetEll(ksp,2);
>> #endif /* defined(__HYPRE__) */
>>
>> KSPSetFromOptions(ksp);
>> KSPSetUp(ksp);
>>
>> ierr = KSPSolve(ksp,b,x);
>>
>> command line
>>
> You did not provide any of what I asked for in the previous mail.
>
>    Matt
>
>> On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley wrote:
>>
>>> On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang wrote:
>>>
>>>> hi,
>>>>
>>>> I implemented the HYPRE preconditioner for my study because, without a
>>>> preconditioner, the PETSc solver takes thousands of iterations to
>>>> converge on fine-grid simulations.
>>>>
>>>> With HYPRE, depending on the parallel partition, it takes HYPRE forever
>>>> to do anything; all I observe in the output file is the simulation
>>>> hanging with no output.
>>>>
>>>> Any idea what happened? I will post a snippet of code.
>>>>
>>> 1) For any question about convergence, we need to see the output of
>>>
>>>    -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason
>>>
>>> 2) Hypre has many preconditioners, which one are you talking about
>>>
>>> 3) PETSc has some preconditioners in common with Hypre, like AMG
>>>
>>>   Thanks,
>>>
>>>      Matt
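Matt's two suggestions, picking the preconditioner on the command line and making the right-hand side consistent with the attached null space, combine into a much shorter setup. The sketch below is illustrative rather than Hao's actual code: the function name SolveWithNullSpace is hypothetical, A and b are assumed already assembled, and the constant null space matches the quoted snippet. If b keeps a component in the null space, the system is inconsistent and the true residual cannot fall below the size of that component, which is consistent with the stagnation near 2.49e-04 in the log above; MatNullSpaceRemove() projects that component out.

#include <petscksp.h>

/* Sketch (PETSc 3.8 API): attach the constant null space, project it out
 * of the right-hand side, and leave every solver choice to the options
 * database, e.g.
 *   -ksp_type bcgs  -pc_type hypre -pc_hypre_type boomeramg
 *   -ksp_type gmres -pc_type gamg
 */
PetscErrorCode SolveWithNullSpace(Mat A, Vec b, Vec x)
{
  KSP            ksp;
  MatNullSpace   nullsp;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,NULL,&nullsp);CHKERRQ(ierr);
  ierr = MatSetNullSpace(A,nullsp);CHKERRQ(ierr);
  ierr = MatNullSpaceRemove(nullsp,b);CHKERRQ(ierr);   /* make b consistent with the null space */
  ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);   /* the Mat keeps its own reference */

  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);         /* replaces the hard-coded PC block */
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}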
--
Hao Zhang
Dept. of Applied Mathematics and Statistics,
Stony Brook University,
Stony Brook, New York, 11790

From hbcbh1999 at gmail.com  Sat Oct 21 16:34:18 2017
From: hbcbh1999 at gmail.com (Hao Zhang)
Date: Sat, 21 Oct 2017 17:34:18 -0400
Subject: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To:
References:
Message-ID:

If this solver doesn't converge, I have a fall-back solution that uses the
GMRES solver, and that setup is fine with me. I just want to know whether
HYPRE is a reliable option for me, or whether I will have to go without a
preconditioner. Thanks!
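Both setups Hao mentions can be selected from the command line, so comparing the fall-back against the HYPRE run needs no code change; the option sets below are illustrative:

    -ksp_type gmres -pc_type none                            # fall-back: unpreconditioned GMRES
    -ksp_type bcgs  -pc_type hypre -pc_hypre_type boomeramg  # the setup under test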
On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang wrote:

> This is the serial run, still dumping output; parallel is more or less
> the same.
>
> KSP Object: 1 MPI processes
>   type: bcgs
>   maximum iterations=40000, initial guess is zero
>   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
>   left preconditioning
>   using PRECONDITIONED norm type for convergence test
> PC Object: 1 MPI processes
>   type: hypre
>     HYPRE BoomerAMG preconditioning
>       Cycle type V
>       Maximum number of levels 25
>       Maximum number of iterations PER hypre call 1
>       Convergence tolerance PER hypre call 0.
>       Threshold for strong coupling 0.25
>       Interpolation truncation factor 0.
>       Interpolation: max elements per row 0
>       Number of levels of aggressive coarsening 0
>       Number of paths for aggressive coarsening 1
>       Maximum row sums 0.9
>       Sweeps down         1
>       Sweeps up           1
>       Sweeps on coarse    1
>       Relax down          symmetric-SOR/Jacobi
>       Relax up            symmetric-SOR/Jacobi
>       Relax on coarse     Gaussian-elimination
>       Relax weight  (all)      1.
>       Outer relax weight (all) 1.
>       Using CF-relaxation
>       Not using more complex smoothers.
>       Measure type        local
>       Coarsen type        Falgout
>       Interpolation type  classical
>   linear system matrix = precond matrix:
>   Mat Object: A 1 MPI processes
>     type: seqaij
>     rows=497664, cols=497664
>     total: nonzeros=3363552, allocated nonzeros=3483648
>     total number of mallocs used during MatSetValues calls =0
>       has attached null space
>       not using I-node routines
>
>   0 KSP preconditioned resid norm 1.630377897956e+01 true resid norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00
>   1 KSP preconditioned resid norm 3.815819909205e-02 true resid norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01
>   2 KSP preconditioned resid norm 4.312671277701e-04 true resid norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03
>   3 KSP preconditioned resid norm 3.011875330569e-04 true resid norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05
>   4 KSP preconditioned resid norm 3.011783821295e-04 true resid norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05
>   5 KSP preconditioned resid norm 3.011783818372e-04 true resid norm 2.504053673004e-04 ||r(i)||/||b|| 3.399594279422e-05
> [iterations 6 through 223 omitted: from here the quoted monitor output duplicates the stagnating sequence of the 5:30 PM run shown above]
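For reference, the configuration in the quoted -ksp_view corresponds roughly to the invocation below; the executable name ./solver is a placeholder, the tolerances and solver types are read off the view, and the last line adds the diagnostics requested earlier in the thread:

    mpirun -n 1 ./solver \
        -ksp_type bcgs -ksp_rtol 1e-14 -ksp_atol 1e-14 -ksp_max_it 40000 \
        -pc_type hypre -pc_hypre_type boomeramg \
        -ksp_view -ksp_monitor_true_residual -ksp_converged_reason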
true resid norm > 2.490665661715e-04 ||r(i)||/||b|| 3.381418228693e-05 > 224 KSP preconditioned resid norm 4.973231831266e-07 true resid norm > 2.490725000995e-04 ||r(i)||/||b|| 3.381498789855e-05 > 225 KSP preconditioned resid norm 5.086864036771e-07 true resid norm > 2.490664132954e-04 ||r(i)||/||b|| 3.381416153192e-05 > 226 KSP preconditioned resid norm 5.046954570561e-07 true resid norm > 2.490698772594e-04 ||r(i)||/||b|| 3.381463181226e-05 > 227 KSP preconditioned resid norm 5.086852920874e-07 true resid norm > 2.490703544723e-04 ||r(i)||/||b|| 3.381469660041e-05 > 228 KSP preconditioned resid norm 5.182381756169e-07 true resid norm > 2.490665200032e-04 ||r(i)||/||b|| 3.381417601896e-05 > 229 KSP preconditioned resid norm 5.261455182896e-07 true resid norm > 2.490697169472e-04 ||r(i)||/||b|| 3.381461004770e-05 > 230 KSP preconditioned resid norm 5.265262522400e-07 true resid norm > 2.490726890541e-04 ||r(i)||/||b|| 3.381501355172e-05 > 231 KSP preconditioned resid norm 5.220652263946e-07 true resid norm > 2.490689325236e-04 ||r(i)||/||b|| 3.381450355149e-05 > 232 KSP preconditioned resid norm 5.256466259888e-07 true resid norm > 2.490694033989e-04 ||r(i)||/||b|| 3.381456747923e-05 > 233 KSP preconditioned resid norm 5.443022648374e-07 true resid norm > 2.490650183144e-04 ||r(i)||/||b|| 3.381397214423e-05 > 234 KSP preconditioned resid norm 5.562619006436e-07 true resid norm > 2.490764576883e-04 ||r(i)||/||b|| 3.381552519520e-05 > 235 KSP preconditioned resid norm 5.998148629545e-07 true resid norm > 2.490714032716e-04 ||r(i)||/||b|| 3.381483898922e-05 > 236 KSP preconditioned resid norm 6.498977322955e-07 true resid norm > 2.490650270144e-04 ||r(i)||/||b|| 3.381397332537e-05 > 237 KSP preconditioned resid norm 6.503686003429e-07 true resid norm > 2.490706976108e-04 ||r(i)||/||b|| 3.381474318615e-05 > 238 KSP preconditioned resid norm 6.566719023119e-07 true resid norm > 2.490664107559e-04 ||r(i)||/||b|| 3.381416118714e-05 > 239 KSP preconditioned resid norm 6.549737473208e-07 true resid norm > 2.490721547909e-04 ||r(i)||/||b|| 3.381494101821e-05 > 240 KSP preconditioned resid norm 6.616898981418e-07 true resid norm > 2.490679659838e-04 ||r(i)||/||b|| 3.381437233053e-05 > 241 KSP preconditioned resid norm 6.829917691021e-07 true resid norm > 2.490728328614e-04 ||r(i)||/||b|| 3.381503307553e-05 > 242 KSP preconditioned resid norm 7.030239869389e-07 true resid norm > 2.490706345115e-04 ||r(i)||/||b|| 3.381473461955e-05 > 243 KSP preconditioned resid norm 7.018435683340e-07 true resid norm > 2.490650978460e-04 ||r(i)||/||b|| 3.381398294172e-05 > 244 KSP preconditioned resid norm 7.058047080376e-07 true resid norm > 2.490685975642e-04 ||r(i)||/||b|| 3.381445807618e-05 > 245 KSP preconditioned resid norm 6.896300385099e-07 true resid norm > 2.490708566380e-04 ||r(i)||/||b|| 3.381476477625e-05 > 246 KSP preconditioned resid norm 7.093960074437e-07 true resid norm > 2.490667427871e-04 ||r(i)||/||b|| 3.381420626490e-05 > 247 KSP preconditioned resid norm 7.817121711853e-07 true resid norm > 2.490692299030e-04 ||r(i)||/||b|| 3.381454392480e-05 > 248 KSP preconditioned resid norm 7.976109778309e-07 true resid norm > 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > 249 KSP preconditioned resid norm 7.855322750445e-07 true resid norm > 2.490720861966e-04 ||r(i)||/||b|| 3.381493170560e-05 > 250 KSP preconditioned resid norm 7.778531114042e-07 true resid norm > 2.490673235034e-04 ||r(i)||/||b|| 3.381428510506e-05 > 251 KSP preconditioned resid norm 7.848682182070e-07 true resid norm > 
2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > 252 KSP preconditioned resid norm 7.967291867330e-07 true resid norm > 2.490724820229e-04 ||r(i)||/||b|| 3.381498544442e-05 > 253 KSP preconditioned resid norm 7.865012959525e-07 true resid norm > 2.490666662028e-04 ||r(i)||/||b|| 3.381419586754e-05 > 254 KSP preconditioned resid norm 7.656025385804e-07 true resid norm > 2.490686283214e-04 ||r(i)||/||b|| 3.381446225190e-05 > 255 KSP preconditioned resid norm 7.757018653468e-07 true resid norm > 2.490655983763e-04 ||r(i)||/||b|| 3.381405089553e-05 > 256 KSP preconditioned resid norm 6.686490372981e-07 true resid norm > 2.490715698964e-04 ||r(i)||/||b|| 3.381486161081e-05 > 257 KSP preconditioned resid norm 6.596005109428e-07 true resid norm > 2.490666403003e-04 ||r(i)||/||b|| 3.381419235092e-05 > 258 KSP preconditioned resid norm 6.681742296333e-07 true resid norm > 2.490683725835e-04 ||r(i)||/||b|| 3.381442753198e-05 > 259 KSP preconditioned resid norm 1.089245482033e-06 true resid norm > 2.490688086568e-04 ||r(i)||/||b|| 3.381448673488e-05 > 260 KSP preconditioned resid norm 1.099844873189e-06 true resid norm > 2.490690703265e-04 ||r(i)||/||b|| 3.381452226011e-05 > 261 KSP preconditioned resid norm 1.112925540869e-06 true resid norm > 2.490664481058e-04 ||r(i)||/||b|| 3.381416625790e-05 > 262 KSP preconditioned resid norm 1.113056910480e-06 true resid norm > 2.490658753273e-04 ||r(i)||/||b|| 3.381408849541e-05 > 263 KSP preconditioned resid norm 1.104801535149e-06 true resid norm > 2.490736510776e-04 ||r(i)||/||b|| 3.381514415953e-05 > 264 KSP preconditioned resid norm 1.158709147873e-06 true resid norm > 2.490607531152e-04 ||r(i)||/||b|| 3.381339308528e-05 > 265 KSP preconditioned resid norm 1.178985740182e-06 true resid norm > 2.490727895619e-04 ||r(i)||/||b|| 3.381502719703e-05 > 266 KSP preconditioned resid norm 1.165130533478e-06 true resid norm > 2.490639076693e-04 ||r(i)||/||b|| 3.381382135901e-05 > 267 KSP preconditioned resid norm 1.181364114499e-06 true resid norm > 2.490667871436e-04 ||r(i)||/||b|| 3.381421228690e-05 > 268 KSP preconditioned resid norm 1.170295348543e-06 true resid norm > 2.490662613306e-04 ||r(i)||/||b|| 3.381414090063e-05 > 269 KSP preconditioned resid norm 1.213243016230e-06 true resid norm > 2.490666173719e-04 ||r(i)||/||b|| 3.381418923808e-05 > 270 KSP preconditioned resid norm 1.239691953997e-06 true resid norm > 2.490678323197e-04 ||r(i)||/||b|| 3.381435418381e-05 > 271 KSP preconditioned resid norm 1.219891740100e-06 true resid norm > 2.490625009256e-04 ||r(i)||/||b|| 3.381363037437e-05 > 272 KSP preconditioned resid norm 1.231321334346e-06 true resid norm > 2.490659733696e-04 ||r(i)||/||b|| 3.381410180599e-05 > 273 KSP preconditioned resid norm 1.208183234158e-06 true resid norm > 2.490685987255e-04 ||r(i)||/||b|| 3.381445823385e-05 > 274 KSP preconditioned resid norm 1.211768545589e-06 true resid norm > 2.490671548953e-04 ||r(i)||/||b|| 3.381426221421e-05 > 275 KSP preconditioned resid norm 1.209433459842e-06 true resid norm > 2.490669016096e-04 ||r(i)||/||b|| 3.381422782722e-05 > 276 KSP preconditioned resid norm 1.223729184405e-06 true resid norm > 2.490658128014e-04 ||r(i)||/||b|| 3.381408000666e-05 > 277 KSP preconditioned resid norm 1.243915201868e-06 true resid norm > 2.490693375756e-04 ||r(i)||/||b|| 3.381455854282e-05 > 278 KSP preconditioned resid norm 1.231994655529e-06 true resid norm > 2.490682988311e-04 ||r(i)||/||b|| 3.381441751910e-05 > 279 KSP preconditioned resid norm 1.227930683777e-06 true resid norm > 2.490667825866e-04 
||r(i)||/||b|| 3.381421166823e-05 > 280 KSP preconditioned resid norm 1.193458846469e-06 true resid norm > 2.490687366117e-04 ||r(i)||/||b|| 3.381447695378e-05 > 281 KSP preconditioned resid norm 1.217089059805e-06 true resid norm > 2.490674797371e-04 ||r(i)||/||b|| 3.381430631591e-05 > 282 KSP preconditioned resid norm 1.249318287709e-06 true resid norm > 2.490662866951e-04 ||r(i)||/||b|| 3.381414434420e-05 > 283 KSP preconditioned resid norm 1.183320029547e-06 true resid norm > 2.490645783630e-04 ||r(i)||/||b|| 3.381391241482e-05 > 284 KSP preconditioned resid norm 1.174730603102e-06 true resid norm > 2.490686881647e-04 ||r(i)||/||b|| 3.381447037643e-05 > 285 KSP preconditioned resid norm 1.175838261923e-06 true resid norm > 2.490665969300e-04 ||r(i)||/||b|| 3.381418646281e-05 > 286 KSP preconditioned resid norm 1.188946188368e-06 true resid norm > 2.490661974622e-04 ||r(i)||/||b|| 3.381413222961e-05 > 287 KSP preconditioned resid norm 1.177848565707e-06 true resid norm > 2.490660236206e-04 ||r(i)||/||b|| 3.381410862824e-05 > 288 KSP preconditioned resid norm 1.200075508281e-06 true resid norm > 2.490645353536e-04 ||r(i)||/||b|| 3.381390657571e-05 > 289 KSP preconditioned resid norm 1.184589570618e-06 true resid norm > 2.490664920355e-04 ||r(i)||/||b|| 3.381417222195e-05 > 290 KSP preconditioned resid norm 1.221114703873e-06 true resid norm > 2.490670597538e-04 ||r(i)||/||b|| 3.381424929746e-05 > 291 KSP preconditioned resid norm 1.249479658256e-06 true resid norm > 2.490641582876e-04 ||r(i)||/||b|| 3.381385538385e-05 > 292 KSP preconditioned resid norm 1.245768496850e-06 true resid norm > 2.490704480588e-04 ||r(i)||/||b|| 3.381470930606e-05 > 293 KSP preconditioned resid norm 1.243742607953e-06 true resid norm > 2.490649690604e-04 ||r(i)||/||b|| 3.381396545733e-05 > 294 KSP preconditioned resid norm 1.342758483339e-06 true resid norm > 2.490676207432e-04 ||r(i)||/||b|| 3.381432545942e-05 > 295 KSP preconditioned resid norm 1.353816099600e-06 true resid norm > 2.490695263153e-04 ||r(i)||/||b|| 3.381458416681e-05 > 296 KSP preconditioned resid norm 1.343886763293e-06 true resid norm > 2.490673674307e-04 ||r(i)||/||b|| 3.381429106879e-05 > 297 KSP preconditioned resid norm 1.355511022815e-06 true resid norm > 2.490686565995e-04 ||r(i)||/||b|| 3.381446609103e-05 > 298 KSP preconditioned resid norm 1.347247627243e-06 true resid norm > 2.490696287707e-04 ||r(i)||/||b|| 3.381459807652e-05 > 299 KSP preconditioned resid norm 1.414742595618e-06 true resid norm > 2.490749815091e-04 ||r(i)||/||b|| 3.381532478374e-05 > 300 KSP preconditioned resid norm 1.418560683189e-06 true resid norm > 2.490721501153e-04 ||r(i)||/||b|| 3.381494038343e-05 > 301 KSP preconditioned resid norm 1.416276404923e-06 true resid norm > 2.490689576447e-04 ||r(i)||/||b|| 3.381450696203e-05 > 302 KSP preconditioned resid norm 1.431448272112e-06 true resid norm > 2.490688812701e-04 ||r(i)||/||b|| 3.381449659312e-05 > 303 KSP preconditioned resid norm 1.446154958969e-06 true resid norm > 2.490727536322e-04 ||r(i)||/||b|| 3.381502231909e-05 > 304 KSP preconditioned resid norm 1.468860617921e-06 true resid norm > 2.490692363788e-04 ||r(i)||/||b|| 3.381454480397e-05 > 305 KSP preconditioned resid norm 1.627595214971e-06 true resid norm > 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 > 306 KSP preconditioned resid norm 1.614384672893e-06 true resid norm > 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 > 307 KSP preconditioned resid norm 1.605568020532e-06 true resid norm > 2.490699757693e-04 ||r(i)||/||b|| 
3.381464518632e-05 > 308 KSP preconditioned resid norm 1.617069685075e-06 true resid norm > 2.490649282923e-04 ||r(i)||/||b|| 3.381395992249e-05 > 309 KSP preconditioned resid norm 1.654297792738e-06 true resid norm > 2.490644766626e-04 ||r(i)||/||b|| 3.381389860760e-05 > 310 KSP preconditioned resid norm 1.587528143215e-06 true resid norm > 2.490696752096e-04 ||r(i)||/||b|| 3.381460438124e-05 > 311 KSP preconditioned resid norm 1.662782022388e-06 true resid norm > 2.490699317737e-04 ||r(i)||/||b|| 3.381463921332e-05 > 312 KSP preconditioned resid norm 1.618211471748e-06 true resid norm > 2.490735831308e-04 ||r(i)||/||b|| 3.381513493483e-05 > 313 KSP preconditioned resid norm 1.609074961921e-06 true resid norm > 2.490679566436e-04 ||r(i)||/||b|| 3.381437106247e-05 > 314 KSP preconditioned resid norm 1.548068942878e-06 true resid norm > 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 > 315 KSP preconditioned resid norm 1.526718322150e-06 true resid norm > 2.490619832967e-04 ||r(i)||/||b|| 3.381356009919e-05 > 316 KSP preconditioned resid norm 1.553150959105e-06 true resid norm > 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 > 317 KSP preconditioned resid norm 1.615015320906e-06 true resid norm > 2.490672348079e-04 ||r(i)||/||b|| 3.381427306343e-05 > 318 KSP preconditioned resid norm 1.602904469797e-06 true resid norm > 2.490696731006e-04 ||r(i)||/||b|| 3.381460409491e-05 > 319 KSP preconditioned resid norm 1.538140323073e-06 true resid norm > 2.490722982494e-04 ||r(i)||/||b|| 3.381496049466e-05 > 320 KSP preconditioned resid norm 1.534779679430e-06 true resid norm > 2.490778499789e-04 ||r(i)||/||b|| 3.381571421763e-05 > 321 KSP preconditioned resid norm 1.547155843355e-06 true resid norm > 2.490767612985e-04 ||r(i)||/||b|| 3.381556641442e-05 > 322 KSP preconditioned resid norm 1.422137008870e-06 true resid norm > 2.490737676309e-04 ||r(i)||/||b|| 3.381515998323e-05 > 323 KSP preconditioned resid norm 1.403072558954e-06 true resid norm > 2.490741361870e-04 ||r(i)||/||b|| 3.381521001975e-05 > 324 KSP preconditioned resid norm 1.373070436118e-06 true resid norm > 2.490742214990e-04 ||r(i)||/||b|| 3.381522160202e-05 > 325 KSP preconditioned resid norm 1.359547585233e-06 true resid norm > 2.490792987570e-04 ||r(i)||/||b|| 3.381591090902e-05 > 326 KSP preconditioned resid norm 1.370351913612e-06 true resid norm > 2.490727161158e-04 ||r(i)||/||b|| 3.381501722573e-05 > 327 KSP preconditioned resid norm 1.365238666187e-06 true resid norm > 2.490716949642e-04 ||r(i)||/||b|| 3.381487859046e-05 > 328 KSP preconditioned resid norm 1.369073373042e-06 true resid norm > 2.490807360288e-04 ||r(i)||/||b|| 3.381610603826e-05 > 329 KSP preconditioned resid norm 1.426698981572e-06 true resid norm > 2.490791479521e-04 ||r(i)||/||b|| 3.381589043520e-05 > 330 KSP preconditioned resid norm 1.445542403570e-06 true resid norm > 2.490775981409e-04 ||r(i)||/||b|| 3.381568002720e-05 > 331 KSP preconditioned resid norm 1.464506963984e-06 true resid norm > 2.490740562430e-04 ||r(i)||/||b|| 3.381519916626e-05 > 332 KSP preconditioned resid norm 1.461462964401e-06 true resid norm > 2.490768016856e-04 ||r(i)||/||b|| 3.381557189753e-05 > 333 KSP preconditioned resid norm 1.476680847971e-06 true resid norm > 2.490744321516e-04 ||r(i)||/||b|| 3.381525020097e-05 > 334 KSP preconditioned resid norm 1.459640372198e-06 true resid norm > 2.490788817993e-04 ||r(i)||/||b|| 3.381585430132e-05 > 335 KSP preconditioned resid norm 1.790770882365e-06 true resid norm > 2.490771711471e-04 ||r(i)||/||b|| 3.381562205697e-05 > 336 
> 346 KSP preconditioned resid norm 1.959802663114e-06 true resid norm
> 2.490773752317e-04 ||r(i)||/||b|| 3.381564976423e-05
>
> On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley wrote:
>
>> On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang wrote:
>>
>>> ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
>>> ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);
>>>
>>> ierr = VecAssemblyBegin(x);
>>> ierr = VecAssemblyEnd(x);
>>>
>> This is probably unnecessary.
>>
>>> ierr = VecAssemblyBegin(b);
>>> ierr = VecAssemblyEnd(b);
>>>
>> This is probably unnecessary.
>>
>>> ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp);
>>> ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8
>>>
>> Is your rhs consistent with this null space?
>>
>>> // KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
>>> KSPSetOperators(ksp,A,A);
>>>
>>> KSPSetType(ksp,KSPBCGS);
>>>
>>> KSPSetComputeSingularValues(ksp, PETSC_TRUE);
>>> #if defined(__HYPRE__)
>>> KSPGetPC(ksp, &pc);
>>> PCSetType(pc, PCHYPRE);
>>> PCHYPRESetType(pc,"boomeramg");
>>>
>> This is terribly unnecessary. You just use
>>
>>   -pc_type hypre -pc_hypre_type boomeramg
>>
>> or
>>
>>   -pc_type gamg
>>
>>> #else
>>> KSPSetType(ksp,KSPBCGSL);
>>> KSPBCGSLSetEll(ksp,2);
>>> #endif /* defined(__HYPRE__) */
>>>
>>> KSPSetFromOptions(ksp);
>>> KSPSetUp(ksp);
>>>
>>> ierr = KSPSolve(ksp,b,x);
>>>
>>> command line
>>>
>> You did not provide any of what I asked for in the previous email.
>>
>>   Matt
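(Put together, Matt's suggestions reduce the whole #if block above to a purely
options-driven setup. A minimal sketch, reusing A, b, x, and ierr from the
snippet above, with the solver and preconditioner left entirely to the options
database:

    KSP ksp;
    ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); /* picks up -ksp_type, -pc_type, ... at run time */
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
    ierr = KSPDestroy(&ksp);CHKERRQ(ierr);

The solver/preconditioner pair is then chosen on the command line, e.g.
-ksp_type gmres -pc_type hypre -pc_hypre_type boomeramg, or -pc_type gamg.)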
>>
>>> On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley wrote:
>>>
>>>> On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang wrote:
>>>>
>>>>> hi,
>>>>>
>>>>> I implemented a HYPRE preconditioner for my study because, without a
>>>>> preconditioner, the PETSc solver takes thousands of iterations to
>>>>> converge on fine-grid simulations.
>>>>>
>>>>> With HYPRE, depending on the parallel partition, it will take HYPRE
>>>>> forever to do anything. The observation from the output file is that
>>>>> the simulation hangs with no output.
>>>>>
>>>>> Any idea what happened? Will post a snippet of code.
>>>>>
>>>> 1) For any question about convergence, we need to see the output of
>>>>
>>>>      -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason
>>>>
>>>> 2) Hypre has many preconditioners; which one are you talking about?
>>>>
>>>> 3) PETSc has some preconditioners in common with Hypre, like AMG.
>>>>
>>>>   Thanks,
>>>>
>>>>     Matt
>>>>
>>>> --
>>>> What most experimenters take for granted before they begin their
>>>> experiments is infinitely more interesting than any results to which their
>>>> experiments lead.
>>>> -- Norbert Wiener
>>>>
>>>> https://www.cse.buffalo.edu/~knepley/

--
Hao Zhang
Dept. of Applied Mathematics and Statistics,
Stony Brook University,
Stony Brook, New York, 11790

From bsmith at mcs.anl.gov  Sat Oct 21 17:42:36 2017
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Sat, 21 Oct 2017 17:42:36 -0500
Subject: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To: 
References: 
Message-ID: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov>

   Run with -ksp_view_mat binary -ksp_view_rhs binary and send the
resulting output file called binaryoutput to petsc-maint at mcs.anl.gov

   Note you can also use -ksp_type gmres with hypre; there is unlikely
to be a reason to use bcgs.

   BTW: tolerances: relative=1e-14 is absurd.

   My guess is your null space is incorrect.
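(A quick way to test that guess, sketched with the constant null space from
the snippet earlier in the thread; PETSc 3.8 calls, variable names as in that
snippet:

    MatNullSpace nullsp;
    PetscBool    isNull;
    ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,NULL,&nullsp);CHKERRQ(ierr);
    ierr = MatNullSpaceTest(nullsp,A,&isNull);CHKERRQ(ierr); /* does A really annihilate constants? */
    if (!isNull) { ierr = PetscPrintf(PETSC_COMM_WORLD,"constants are NOT a null space of A\n");CHKERRQ(ierr); }
    ierr = MatSetNullSpace(A,nullsp);CHKERRQ(ierr);
    ierr = MatNullSpaceRemove(nullsp,b);CHKERRQ(ierr);       /* project the null-space component out of the rhs */
    ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);

If the test fails, the attached null space is wrong; if it passes but b has a
nonzero component in the null space, removing that component is what makes
the system consistent.)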
> On Oct 21, 2017, at 4:34 PM, Hao Zhang wrote:
>
> If this solver doesn't converge, I have a fallback solution that uses a
> GMRES solver; that setup is fine with me. I just want to know whether
> HYPRE is a reliable solution for me, or whether I will have to go
> without a preconditioner.
>
> Thanks!
>
> On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang wrote:
> This is a serial run, still dumping output; parallel is more or less the same.
>
> KSP Object: 1 MPI processes
>   type: bcgs
>   maximum iterations=40000, initial guess is zero
>   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
>   left preconditioning
>   using PRECONDITIONED norm type for convergence test
> PC Object: 1 MPI processes
>   type: hypre
>     HYPRE BoomerAMG preconditioning
>       Cycle type V
>       Maximum number of levels 25
>       Maximum number of iterations PER hypre call 1
>       Convergence tolerance PER hypre call 0.
>       Threshold for strong coupling 0.25
>       Interpolation truncation factor 0.
>       Interpolation: max elements per row 0
>       Number of levels of aggressive coarsening 0
>       Number of paths for aggressive coarsening 1
>       Maximum row sums 0.9
>       Sweeps down         1
>       Sweeps up           1
>       Sweeps on coarse    1
>       Relax down          symmetric-SOR/Jacobi
>       Relax up            symmetric-SOR/Jacobi
>       Relax on coarse     Gaussian-elimination
>       Relax weight  (all)      1.
>       Outer relax weight (all) 1.
>       Using CF-relaxation
>       Not using more complex smoothers.
>       Measure type        local
>       Coarsen type        Falgout
>       Interpolation type  classical
>   linear system matrix = precond matrix:
>   Mat Object: A 1 MPI processes
>     type: seqaij
>     rows=497664, cols=497664
>     total: nonzeros=3363552, allocated nonzeros=3483648
>     total number of mallocs used during MatSetValues calls =0
>       has attached null space
>       not using I-node routines
>   0 KSP preconditioned resid norm 1.630377897956e+01 true resid norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00
>   1 KSP preconditioned resid norm 3.815819909205e-02 true resid norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01
>   2 KSP preconditioned resid norm 4.312671277701e-04 true resid norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03
>   3 KSP preconditioned resid norm 3.011875330569e-04 true resid norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05
>   4 KSP preconditioned resid norm 3.011783821295e-04 true resid norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05
>   5 KSP preconditioned resid norm 3.011783818372e-04 true resid norm 2.504053673004e-04 ||r(i)||/||b|| 3.399594279422e-05
> [... iterations 6-310 trimmed: after iteration 4 the true residual never
> drops below ~2.49e-04; the preconditioned residual decays with long
> plateaus from ~3.0e-04 to the 1e-06 range while the true residual stays
> pinned near 2.4907e-04, matching the stagnation excerpted earlier in
> this thread ...]
> 311 KSP preconditioned resid norm 1.662782022388e-06 true resid norm 2.490699317737e-04 ||r(i)||/||b|| 3.381463921332e-05
> 312 KSP preconditioned resid norm 1.618211471748e-06 true resid norm 2.490735831308e-04
preconditioned resid norm 1.174730603102e-06 true resid norm 2.490686881647e-04 ||r(i)||/||b|| 3.381447037643e-05 > 285 KSP preconditioned resid norm 1.175838261923e-06 true resid norm 2.490665969300e-04 ||r(i)||/||b|| 3.381418646281e-05 > 286 KSP preconditioned resid norm 1.188946188368e-06 true resid norm 2.490661974622e-04 ||r(i)||/||b|| 3.381413222961e-05 > 287 KSP preconditioned resid norm 1.177848565707e-06 true resid norm 2.490660236206e-04 ||r(i)||/||b|| 3.381410862824e-05 > 288 KSP preconditioned resid norm 1.200075508281e-06 true resid norm 2.490645353536e-04 ||r(i)||/||b|| 3.381390657571e-05 > 289 KSP preconditioned resid norm 1.184589570618e-06 true resid norm 2.490664920355e-04 ||r(i)||/||b|| 3.381417222195e-05 > 290 KSP preconditioned resid norm 1.221114703873e-06 true resid norm 2.490670597538e-04 ||r(i)||/||b|| 3.381424929746e-05 > 291 KSP preconditioned resid norm 1.249479658256e-06 true resid norm 2.490641582876e-04 ||r(i)||/||b|| 3.381385538385e-05 > 292 KSP preconditioned resid norm 1.245768496850e-06 true resid norm 2.490704480588e-04 ||r(i)||/||b|| 3.381470930606e-05 > 293 KSP preconditioned resid norm 1.243742607953e-06 true resid norm 2.490649690604e-04 ||r(i)||/||b|| 3.381396545733e-05 > 294 KSP preconditioned resid norm 1.342758483339e-06 true resid norm 2.490676207432e-04 ||r(i)||/||b|| 3.381432545942e-05 > 295 KSP preconditioned resid norm 1.353816099600e-06 true resid norm 2.490695263153e-04 ||r(i)||/||b|| 3.381458416681e-05 > 296 KSP preconditioned resid norm 1.343886763293e-06 true resid norm 2.490673674307e-04 ||r(i)||/||b|| 3.381429106879e-05 > 297 KSP preconditioned resid norm 1.355511022815e-06 true resid norm 2.490686565995e-04 ||r(i)||/||b|| 3.381446609103e-05 > 298 KSP preconditioned resid norm 1.347247627243e-06 true resid norm 2.490696287707e-04 ||r(i)||/||b|| 3.381459807652e-05 > 299 KSP preconditioned resid norm 1.414742595618e-06 true resid norm 2.490749815091e-04 ||r(i)||/||b|| 3.381532478374e-05 > 300 KSP preconditioned resid norm 1.418560683189e-06 true resid norm 2.490721501153e-04 ||r(i)||/||b|| 3.381494038343e-05 > 301 KSP preconditioned resid norm 1.416276404923e-06 true resid norm 2.490689576447e-04 ||r(i)||/||b|| 3.381450696203e-05 > 302 KSP preconditioned resid norm 1.431448272112e-06 true resid norm 2.490688812701e-04 ||r(i)||/||b|| 3.381449659312e-05 > 303 KSP preconditioned resid norm 1.446154958969e-06 true resid norm 2.490727536322e-04 ||r(i)||/||b|| 3.381502231909e-05 > 304 KSP preconditioned resid norm 1.468860617921e-06 true resid norm 2.490692363788e-04 ||r(i)||/||b|| 3.381454480397e-05 > 305 KSP preconditioned resid norm 1.627595214971e-06 true resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 > 306 KSP preconditioned resid norm 1.614384672893e-06 true resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 > 307 KSP preconditioned resid norm 1.605568020532e-06 true resid norm 2.490699757693e-04 ||r(i)||/||b|| 3.381464518632e-05 > 308 KSP preconditioned resid norm 1.617069685075e-06 true resid norm 2.490649282923e-04 ||r(i)||/||b|| 3.381395992249e-05 > 309 KSP preconditioned resid norm 1.654297792738e-06 true resid norm 2.490644766626e-04 ||r(i)||/||b|| 3.381389860760e-05 > 310 KSP preconditioned resid norm 1.587528143215e-06 true resid norm 2.490696752096e-04 ||r(i)||/||b|| 3.381460438124e-05 > 311 KSP preconditioned resid norm 1.662782022388e-06 true resid norm 2.490699317737e-04 ||r(i)||/||b|| 3.381463921332e-05 > 312 KSP preconditioned resid norm 1.618211471748e-06 true resid norm 2.490735831308e-04 
||r(i)||/||b|| 3.381513493483e-05 > 313 KSP preconditioned resid norm 1.609074961921e-06 true resid norm 2.490679566436e-04 ||r(i)||/||b|| 3.381437106247e-05 > 314 KSP preconditioned resid norm 1.548068942878e-06 true resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 > 315 KSP preconditioned resid norm 1.526718322150e-06 true resid norm 2.490619832967e-04 ||r(i)||/||b|| 3.381356009919e-05 > 316 KSP preconditioned resid norm 1.553150959105e-06 true resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 > 317 KSP preconditioned resid norm 1.615015320906e-06 true resid norm 2.490672348079e-04 ||r(i)||/||b|| 3.381427306343e-05 > 318 KSP preconditioned resid norm 1.602904469797e-06 true resid norm 2.490696731006e-04 ||r(i)||/||b|| 3.381460409491e-05 > 319 KSP preconditioned resid norm 1.538140323073e-06 true resid norm 2.490722982494e-04 ||r(i)||/||b|| 3.381496049466e-05 > 320 KSP preconditioned resid norm 1.534779679430e-06 true resid norm 2.490778499789e-04 ||r(i)||/||b|| 3.381571421763e-05 > 321 KSP preconditioned resid norm 1.547155843355e-06 true resid norm 2.490767612985e-04 ||r(i)||/||b|| 3.381556641442e-05 > 322 KSP preconditioned resid norm 1.422137008870e-06 true resid norm 2.490737676309e-04 ||r(i)||/||b|| 3.381515998323e-05 > 323 KSP preconditioned resid norm 1.403072558954e-06 true resid norm 2.490741361870e-04 ||r(i)||/||b|| 3.381521001975e-05 > 324 KSP preconditioned resid norm 1.373070436118e-06 true resid norm 2.490742214990e-04 ||r(i)||/||b|| 3.381522160202e-05 > 325 KSP preconditioned resid norm 1.359547585233e-06 true resid norm 2.490792987570e-04 ||r(i)||/||b|| 3.381591090902e-05 > 326 KSP preconditioned resid norm 1.370351913612e-06 true resid norm 2.490727161158e-04 ||r(i)||/||b|| 3.381501722573e-05 > 327 KSP preconditioned resid norm 1.365238666187e-06 true resid norm 2.490716949642e-04 ||r(i)||/||b|| 3.381487859046e-05 > 328 KSP preconditioned resid norm 1.369073373042e-06 true resid norm 2.490807360288e-04 ||r(i)||/||b|| 3.381610603826e-05 > 329 KSP preconditioned resid norm 1.426698981572e-06 true resid norm 2.490791479521e-04 ||r(i)||/||b|| 3.381589043520e-05 > 330 KSP preconditioned resid norm 1.445542403570e-06 true resid norm 2.490775981409e-04 ||r(i)||/||b|| 3.381568002720e-05 > 331 KSP preconditioned resid norm 1.464506963984e-06 true resid norm 2.490740562430e-04 ||r(i)||/||b|| 3.381519916626e-05 > 332 KSP preconditioned resid norm 1.461462964401e-06 true resid norm 2.490768016856e-04 ||r(i)||/||b|| 3.381557189753e-05 > 333 KSP preconditioned resid norm 1.476680847971e-06 true resid norm 2.490744321516e-04 ||r(i)||/||b|| 3.381525020097e-05 > 334 KSP preconditioned resid norm 1.459640372198e-06 true resid norm 2.490788817993e-04 ||r(i)||/||b|| 3.381585430132e-05 > 335 KSP preconditioned resid norm 1.790770882365e-06 true resid norm 2.490771711471e-04 ||r(i)||/||b|| 3.381562205697e-05 > 336 KSP preconditioned resid norm 1.803770155018e-06 true resid norm 2.490768953858e-04 ||r(i)||/||b|| 3.381558461860e-05 > 337 KSP preconditioned resid norm 1.787821255995e-06 true resid norm 2.490767985676e-04 ||r(i)||/||b|| 3.381557147421e-05 > 338 KSP preconditioned resid norm 1.749912220831e-06 true resid norm 2.490760198704e-04 ||r(i)||/||b|| 3.381546575545e-05 > 339 KSP preconditioned resid norm 1.802915839010e-06 true resid norm 2.490815556273e-04 ||r(i)||/||b|| 3.381621730993e-05 > 340 KSP preconditioned resid norm 1.800777670709e-06 true resid norm 2.490823909286e-04 ||r(i)||/||b|| 3.381633071347e-05 > 341 KSP preconditioned resid norm 
1.962516327690e-06 true resid norm 2.490773477410e-04 ||r(i)||/||b|| 3.381564603199e-05
> 342 KSP preconditioned resid norm 1.981726465132e-06 true resid norm 2.490769116884e-04 ||r(i)||/||b|| 3.381558683191e-05
> 343 KSP preconditioned resid norm 1.963419167052e-06 true resid norm 2.490764009914e-04 ||r(i)||/||b|| 3.381551749783e-05
> 344 KSP preconditioned resid norm 1.992082169278e-06 true resid norm 2.490806829883e-04 ||r(i)||/||b|| 3.381609883728e-05
> 345 KSP preconditioned resid norm 1.981005134253e-06 true resid norm 2.490748677339e-04 ||r(i)||/||b|| 3.381530933721e-05
> 346 KSP preconditioned resid norm 1.959802663114e-06 true resid norm 2.490773752317e-04 ||r(i)||/||b|| 3.381564976423e-05
>
> On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley wrote:
> On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang wrote:
> ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
> ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);
>
> ierr = VecAssemblyBegin(x);
> ierr = VecAssemblyEnd(x);
> This is probably unnecessary
>
> ierr = VecAssemblyBegin(b);
> ierr = VecAssemblyEnd(b);
> This is probably unnecessary
>
> ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp);
> ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8
> Is your rhs consistent with this nullspace?
>
> // KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
> KSPSetOperators(ksp,A,A);
>
> KSPSetType(ksp,KSPBCGS);
>
> KSPSetComputeSingularValues(ksp, PETSC_TRUE);
> #if defined(__HYPRE__)
> KSPGetPC(ksp, &pc);
> PCSetType(pc, PCHYPRE);
> PCHYPRESetType(pc,"boomeramg");
> This is terribly unnecessary. You just use
>
> -pc_type hypre -pc_hypre_type boomeramg
>
> or
>
> -pc_type gamg
>
> #else
> KSPSetType(ksp,KSPBCGSL);
> KSPBCGSLSetEll(ksp,2);
> #endif /* defined(__HYPRE__) */
>
> KSPSetFromOptions(ksp);
> KSPSetUp(ksp);
>
> ierr = KSPSolve(ksp,b,x);
>
> command line
>
> You did not provide any of what I asked for in the previous mail.
>
> Matt
>
> On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley wrote:
> On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang wrote:
> hi,
>
> I implemented the HYPRE preconditioner for my study because, without a preconditioner, the PETSc solver takes thousands of iterations to converge for fine-grid simulations.
>
> With HYPRE, depending on the parallel partition, it takes forever to do anything; all I can observe from the output file is that the simulation hangs with no output.
>
> Any idea what happened? I will post a snippet of code.
>
> 1) For any question about convergence, we need to see the output of
>
> -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason
>
> 2) Hypre has many preconditioners; which one are you talking about?
>
> 3) PETSc has some preconditioners in common with Hypre, like AMG
>
> Thanks,
>
> Matt
>
> --
> Hao Zhang
> Dept. of Applied Mathematics and Statistics,
> Stony Brook University,
> Stony Brook, New York, 11790
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>
> --
> Hao Zhang
> Dept. of Applied Mathematics and Statistics,
> Stony Brook University,
> Stony Brook, New York, 11790
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>
> --
> Hao Zhang
> Dept. of Applied Mathematics and Statistics,
> Stony Brook University,
> Stony Brook, New York, 11790
>
> --
> Hao Zhang
> Dept. of Applied Mathematics and Statistics,
> Stony Brook University,
> Stony Brook, New York, 11790

From hbcbh1999 at gmail.com  Sat Oct 21 17:47:40 2017
From: hbcbh1999 at gmail.com (Hao Zhang)
Date: Sat, 21 Oct 2017 22:47:40 +0000
Subject: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov>
References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov>
Message-ID:

hi, Barry:
what do you mean by absurd in setting tolerance = 1e-14?

On Sat, Oct 21, 2017 at 18:42 Barry Smith wrote:

>    Run with -ksp_view_mat binary -ksp_view_rhs binary and send the resulting output file called binaryoutput to petsc-maint at mcs.anl.gov
>
>    Note you can also use -ksp_type gmres with hypre; there is unlikely to be a reason to use bcgs
>
>    BTW: tolerances: relative=1e-14, is absurd
>
>    My guess is your null space is incorrect.
>
> > On Oct 21, 2017, at 4:34 PM, Hao Zhang wrote:
> >
> > If this solver doesn't converge, I have a fall-back solution which uses the GMRES solver. That setup is fine with me. I just want to know whether HYPRE is a reliable solution for me, or whether I will have to go without a preconditioner.
> >
> > Thanks!
> >
> > On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang wrote:
> > this is a serial run, still dumping output; parallel is more or less the same.
> >
> > KSP Object: 1 MPI processes
> >   type: bcgs
> >   maximum iterations=40000, initial guess is zero
> >   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
> >   left preconditioning
> >   using PRECONDITIONED norm type for convergence test
> > PC Object: 1 MPI processes
> >   type: hypre
> >     HYPRE BoomerAMG preconditioning
> >     Cycle type V
> >     Maximum number of levels 25
> >     Maximum number of iterations PER hypre call 1
> >     Convergence tolerance PER hypre call 0.
> >     Threshold for strong coupling 0.25
> >     Interpolation truncation factor 0.
> >     Interpolation: max elements per row 0
> >     Number of levels of aggressive coarsening 0
> >     Number of paths for aggressive coarsening 1
> >     Maximum row sums 0.9
> >     Sweeps down 1
> >     Sweeps up 1
> >     Sweeps on coarse 1
> >     Relax down symmetric-SOR/Jacobi
> >     Relax up symmetric-SOR/Jacobi
> >     Relax on coarse Gaussian-elimination
> >     Relax weight (all) 1.
> >     Outer relax weight (all) 1.
> >     Using CF-relaxation
> >     Not using more complex smoothers.
> > Measure type local > > Coarsen type Falgout > > Interpolation type classical > > linear system matrix = precond matrix: > > Mat Object: A 1 MPI processes > > type: seqaij > > rows=497664, cols=497664 > > total: nonzeros=3363552, allocated nonzeros=3483648 > > total number of mallocs used during MatSetValues calls =0 > > has attached null space > > not using I-node routines > > 0 KSP preconditioned resid norm 1.630377897956e+01 true resid norm > 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00 > > 1 KSP preconditioned resid norm 3.815819909205e-02 true resid norm > 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01 > > 2 KSP preconditioned resid norm 4.312671277701e-04 true resid norm > 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03 > > 3 KSP preconditioned resid norm 3.011875330569e-04 true resid norm > 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05 > > 4 KSP preconditioned resid norm 3.011783821295e-04 true resid norm > 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05 > > 5 KSP preconditioned resid norm 3.011783818372e-04 true resid norm > 2.504053673004e-04 ||r(i)||/||b|| 3.399594279422e-05 > > 6 KSP preconditioned resid norm 3.011783818375e-04 true resid norm > 2.503984078890e-04 ||r(i)||/||b|| 3.399499795925e-05 > > 7 KSP preconditioned resid norm 3.011783887442e-04 true resid norm > 2.504121501772e-04 ||r(i)||/||b|| 3.399686366224e-05 > > 8 KSP preconditioned resid norm 3.010913654181e-04 true resid norm > 2.504150259031e-04 ||r(i)||/||b|| 3.399725408124e-05 > > 9 KSP preconditioned resid norm 3.006520688232e-04 true resid norm > 2.504061607382e-04 ||r(i)||/||b|| 3.399605051423e-05 > > 10 KSP preconditioned resid norm 3.007309991942e-04 true resid norm > 2.503843638523e-04 ||r(i)||/||b|| 3.399309128978e-05 > > 11 KSP preconditioned resid norm 3.015946168077e-04 true resid norm > 2.503644677844e-04 ||r(i)||/||b|| 3.399039012728e-05 > > 12 KSP preconditioned resid norm 2.956643907377e-04 true resid norm > 2.503863046509e-04 ||r(i)||/||b|| 3.399335477965e-05 > > 13 KSP preconditioned resid norm 2.997992358936e-04 true resid norm > 2.504336903093e-04 ||r(i)||/||b|| 3.399978802886e-05 > > 14 KSP preconditioned resid norm 2.481415420420e-05 true resid norm > 2.491591201250e-04 ||r(i)||/||b|| 3.382674774806e-05 > > 15 KSP preconditioned resid norm 2.615494786181e-05 true resid norm > 2.490353237273e-04 ||r(i)||/||b|| 3.380994069915e-05 > > 16 KSP preconditioned resid norm 2.645126692130e-05 true resid norm > 2.490535523344e-04 ||r(i)||/||b|| 3.381241548111e-05 > > 17 KSP preconditioned resid norm 2.667223026209e-05 true resid norm > 2.490482602536e-04 ||r(i)||/||b|| 3.381169700898e-05 > > 18 KSP preconditioned resid norm 2.650813432116e-05 true resid norm > 2.490473169014e-04 ||r(i)||/||b|| 3.381156893606e-05 > > 19 KSP preconditioned resid norm 2.613309555449e-05 true resid norm > 2.490465633690e-04 ||r(i)||/||b|| 3.381146663375e-05 > > 20 KSP preconditioned resid norm 2.644160446804e-05 true resid norm > 2.490532739949e-04 ||r(i)||/||b|| 3.381237769272e-05 > > 21 KSP preconditioned resid norm 2.635987608975e-05 true resid norm > 2.490499548926e-04 ||r(i)||/||b|| 3.381192707933e-05 > > 22 KSP preconditioned resid norm 2.640527129095e-05 true resid norm > 2.490594066529e-04 ||r(i)||/||b|| 3.381321028466e-05 > > 23 KSP preconditioned resid norm 2.627505117691e-05 true resid norm > 2.490550162585e-04 ||r(i)||/||b|| 3.381261422875e-05 > > 24 KSP preconditioned resid norm 2.642659196388e-05 true resid norm > 2.490504347640e-04 ||r(i)||/||b|| 3.381199222842e-05 > > 25 
KSP preconditioned resid norm 2.659432190695e-05 true resid norm > 2.490510775152e-04 ||r(i)||/||b|| 3.381207949065e-05 > > 26 KSP preconditioned resid norm 2.687918062951e-05 true resid norm > 2.490518882015e-04 ||r(i)||/||b|| 3.381218955237e-05 > > 27 KSP preconditioned resid norm 2.662909048432e-05 true resid norm > 2.490446263285e-04 ||r(i)||/||b|| 3.381120365409e-05 > > 28 KSP preconditioned resid norm 2.085466483199e-05 true resid norm > 2.490131612366e-04 ||r(i)||/||b|| 3.380693183886e-05 > > 29 KSP preconditioned resid norm 2.098541330282e-05 true resid norm > 2.490126933398e-04 ||r(i)||/||b|| 3.380686831549e-05 > > 30 KSP preconditioned resid norm 2.175345180286e-05 true resid norm > 2.490098852429e-04 ||r(i)||/||b|| 3.380648707805e-05 > > 31 KSP preconditioned resid norm 2.182182437676e-05 true resid norm > 2.490028301020e-04 ||r(i)||/||b|| 3.380552924648e-05 > > 32 KSP preconditioned resid norm 2.152970404369e-05 true resid norm > 2.490089939838e-04 ||r(i)||/||b|| 3.380636607747e-05 > > 33 KSP preconditioned resid norm 2.187932450016e-05 true resid norm > 2.490085293931e-04 ||r(i)||/||b|| 3.380630300295e-05 > > 34 KSP preconditioned resid norm 2.207255875067e-05 true resid norm > 2.490039036092e-04 ||r(i)||/||b|| 3.380567498971e-05 > > 35 KSP preconditioned resid norm 2.205060279701e-05 true resid norm > 2.490101636150e-04 ||r(i)||/||b|| 3.380652487086e-05 > > 36 KSP preconditioned resid norm 2.168654200416e-05 true resid norm > 2.490091609876e-04 ||r(i)||/||b|| 3.380638875052e-05 > > 37 KSP preconditioned resid norm 2.164521042361e-05 true resid norm > 2.490083143913e-04 ||r(i)||/||b|| 3.380627381352e-05 > > 38 KSP preconditioned resid norm 2.154429063973e-05 true resid norm > 2.490075485470e-04 ||r(i)||/||b|| 3.380616983972e-05 > > 39 KSP preconditioned resid norm 2.165962086228e-05 true resid norm > 2.490099695056e-04 ||r(i)||/||b|| 3.380649851786e-05 > > 40 KSP preconditioned resid norm 2.153877616091e-05 true resid norm > 2.490090652619e-04 ||r(i)||/||b|| 3.380637575444e-05 > > 41 KSP preconditioned resid norm 2.347651187611e-05 true resid norm > 2.490233544624e-04 ||r(i)||/||b|| 3.380831570825e-05 > > 42 KSP preconditioned resid norm 2.352860162514e-05 true resid norm > 2.490191394202e-04 ||r(i)||/||b|| 3.380774345879e-05 > > 43 KSP preconditioned resid norm 2.312377506928e-05 true resid norm > 2.490209491359e-04 ||r(i)||/||b|| 3.380798915237e-05 > > 44 KSP preconditioned resid norm 2.295770973533e-05 true resid norm > 2.490178136759e-04 ||r(i)||/||b|| 3.380756347093e-05 > > 45 KSP preconditioned resid norm 2.833646456041e-05 true resid norm > 2.489991602651e-04 ||r(i)||/||b|| 3.380503101608e-05 > > 46 KSP preconditioned resid norm 2.760296424494e-05 true resid norm > 2.490104320666e-04 ||r(i)||/||b|| 3.380656131682e-05 > > 47 KSP preconditioned resid norm 2.451504295239e-05 true resid norm > 2.490241388672e-04 ||r(i)||/||b|| 3.380842220189e-05 > > 48 KSP preconditioned resid norm 2.512391514098e-05 true resid norm > 2.490245923753e-04 ||r(i)||/||b|| 3.380848377180e-05 > > 49 KSP preconditioned resid norm 2.483419450528e-05 true resid norm > 2.490273364402e-04 ||r(i)||/||b|| 3.380885631602e-05 > > 50 KSP preconditioned resid norm 2.507460538466e-05 true resid norm > 2.490309488780e-04 ||r(i)||/||b|| 3.380934675371e-05 > > 51 KSP preconditioned resid norm 2.499708772881e-05 true resid norm > 2.490300908170e-04 ||r(i)||/||b|| 3.380923026022e-05 > > 52 KSP preconditioned resid norm 1.059778259446e-05 true resid norm > 2.489352833521e-04 ||r(i)||/||b|| 3.379635885420e-05 > > 
53 KSP preconditioned resid norm 1.074975117060e-05 true resid norm > 2.489294722901e-04 ||r(i)||/||b|| 3.379556992330e-05 > > 54 KSP preconditioned resid norm 1.095242219559e-05 true resid norm > 2.489295454212e-04 ||r(i)||/||b|| 3.379557985184e-05 > > 55 KSP preconditioned resid norm 8.359999674720e-06 true resid norm > 2.489673581944e-04 ||r(i)||/||b|| 3.380071345137e-05 > > 56 KSP preconditioned resid norm 8.368232998470e-06 true resid norm > 2.489700421343e-04 ||r(i)||/||b|| 3.380107783281e-05 > > 57 KSP preconditioned resid norm 8.443378041101e-06 true resid norm > 2.489702900875e-04 ||r(i)||/||b|| 3.380111149584e-05 > > 58 KSP preconditioned resid norm 8.647159584302e-06 true resid norm > 2.489640805831e-04 ||r(i)||/||b|| 3.380026847095e-05 > > 59 KSP preconditioned resid norm 1.024742790737e-05 true resid norm > 2.489447846660e-04 ||r(i)||/||b|| 3.379764878711e-05 > > 60 KSP preconditioned resid norm 1.033394118910e-05 true resid norm > 2.489441404923e-04 ||r(i)||/||b|| 3.379756133175e-05 > > 61 KSP preconditioned resid norm 1.030066336446e-05 true resid norm > 2.489399918556e-04 ||r(i)||/||b|| 3.379699809776e-05 > > 62 KSP preconditioned resid norm 1.029956398963e-05 true resid norm > 2.489445295139e-04 ||r(i)||/||b|| 3.379761414674e-05 > > 63 KSP preconditioned resid norm 1.028190129002e-05 true resid norm > 2.489456200527e-04 ||r(i)||/||b|| 3.379776220225e-05 > > 64 KSP preconditioned resid norm 9.878799185773e-06 true resid norm > 2.489488742330e-04 ||r(i)||/||b|| 3.379820400160e-05 > > 65 KSP preconditioned resid norm 9.917711104174e-06 true resid norm > 2.489478066593e-04 ||r(i)||/||b|| 3.379805906391e-05 > > 66 KSP preconditioned resid norm 1.003572019576e-05 true resid norm > 2.489441995703e-04 ||r(i)||/||b|| 3.379756935240e-05 > > 67 KSP preconditioned resid norm 9.924487278236e-06 true resid norm > 2.489475403451e-04 ||r(i)||/||b|| 3.379802290812e-05 > > 68 KSP preconditioned resid norm 9.804213483359e-06 true resid norm > 2.489457781760e-04 ||r(i)||/||b|| 3.379778366964e-05 > > 69 KSP preconditioned resid norm 9.748922705476e-06 true resid norm > 2.489408473578e-04 ||r(i)||/||b|| 3.379711424383e-05 > > 70 KSP preconditioned resid norm 9.886044523689e-06 true resid norm > 2.489514438395e-04 ||r(i)||/||b|| 3.379855286071e-05 > > 71 KSP preconditioned resid norm 1.083888478689e-05 true resid norm > 2.489420898851e-04 ||r(i)||/||b|| 3.379728293386e-05 > > 72 KSP preconditioned resid norm 1.106561823757e-05 true resid norm > 2.489364778104e-04 ||r(i)||/||b|| 3.379652101821e-05 > > 73 KSP preconditioned resid norm 1.132091515426e-05 true resid norm > 2.489456804535e-04 ||r(i)||/||b|| 3.379777040248e-05 > > 74 KSP preconditioned resid norm 1.330905328963e-05 true resid norm > 2.489317458981e-04 ||r(i)||/||b|| 3.379587859660e-05 > > 75 KSP preconditioned resid norm 1.305555302619e-05 true resid norm > 2.489320939810e-04 ||r(i)||/||b|| 3.379592585359e-05 > > 76 KSP preconditioned resid norm 1.308083397399e-05 true resid norm > 2.489299951581e-04 ||r(i)||/||b|| 3.379564090977e-05 > > 77 KSP preconditioned resid norm 1.320098861853e-05 true resid norm > 2.489323669317e-04 ||r(i)||/||b|| 3.379596291036e-05 > > 78 KSP preconditioned resid norm 1.300160788274e-05 true resid norm > 2.489306393356e-04 ||r(i)||/||b|| 3.379572836564e-05 > > 79 KSP preconditioned resid norm 1.317651537793e-05 true resid norm > 2.489381364970e-04 ||r(i)||/||b|| 3.379674620752e-05 > > 80 KSP preconditioned resid norm 1.309769805765e-05 true resid norm > 2.489285056062e-04 ||r(i)||/||b|| 3.379543868279e-05 > 
> 81 KSP preconditioned resid norm 1.293686496271e-05 true resid norm > 2.489347818072e-04 ||r(i)||/||b|| 3.379629076264e-05 > > 82 KSP preconditioned resid norm 1.311788285799e-05 true resid norm > 2.489320040215e-04 ||r(i)||/||b|| 3.379591364037e-05 > > 83 KSP preconditioned resid norm 1.313667378798e-05 true resid norm > 2.489329437217e-04 ||r(i)||/||b|| 3.379604121748e-05 > > 84 KSP preconditioned resid norm 1.416138205017e-05 true resid norm > 2.489266908838e-04 ||r(i)||/||b|| 3.379519230948e-05 > > 85 KSP preconditioned resid norm 1.452253464774e-05 true resid norm > 2.489285688375e-04 ||r(i)||/||b|| 3.379544726729e-05 > > 86 KSP preconditioned resid norm 1.426709413370e-05 true resid norm > 2.489362313402e-04 ||r(i)||/||b|| 3.379648755651e-05 > > 87 KSP preconditioned resid norm 1.427480849552e-05 true resid norm > 2.489378183000e-04 ||r(i)||/||b|| 3.379670300795e-05 > > 88 KSP preconditioned resid norm 1.413870980147e-05 true resid norm > 2.489325756118e-04 ||r(i)||/||b|| 3.379599124153e-05 > > 89 KSP preconditioned resid norm 1.353259857657e-05 true resid norm > 2.489318968308e-04 ||r(i)||/||b|| 3.379589908776e-05 > > 90 KSP preconditioned resid norm 1.347676448611e-05 true resid norm > 2.489332074417e-04 ||r(i)||/||b|| 3.379607702106e-05 > > 91 KSP preconditioned resid norm 1.362825902909e-05 true resid norm > 2.489344974971e-04 ||r(i)||/||b|| 3.379625216367e-05 > > 92 KSP preconditioned resid norm 1.346280901052e-05 true resid norm > 2.489302570131e-04 ||r(i)||/||b|| 3.379567646016e-05 > > 93 KSP preconditioned resid norm 1.328052169696e-05 true resid norm > 2.489346601224e-04 ||r(i)||/||b|| 3.379627424228e-05 > > 94 KSP preconditioned resid norm 1.554682082515e-05 true resid norm > 2.489309078759e-04 ||r(i)||/||b|| 3.379576482365e-05 > > 95 KSP preconditioned resid norm 1.557128675775e-05 true resid norm > 2.489317143582e-04 ||r(i)||/||b|| 3.379587431462e-05 > > 96 KSP preconditioned resid norm 1.542571813923e-05 true resid norm > 2.489319910303e-04 ||r(i)||/||b|| 3.379591187663e-05 > > 97 KSP preconditioned resid norm 1.570516684444e-05 true resid norm > 2.489321980894e-04 ||r(i)||/||b|| 3.379593998772e-05 > > 98 KSP preconditioned resid norm 1.600431789899e-05 true resid norm > 2.489297450311e-04 ||r(i)||/||b|| 3.379560695162e-05 > > 99 KSP preconditioned resid norm 1.587495554658e-05 true resid norm > 2.489339000570e-04 ||r(i)||/||b|| 3.379617105303e-05 > > 100 KSP preconditioned resid norm 1.621163002878e-05 true resid norm > 2.489299953360e-04 ||r(i)||/||b|| 3.379564093392e-05 > > 101 KSP preconditioned resid norm 1.627060872574e-05 true resid norm > 2.489301570161e-04 ||r(i)||/||b|| 3.379566288419e-05 > > 102 KSP preconditioned resid norm 1.622931647243e-05 true resid norm > 2.489277930910e-04 ||r(i)||/||b|| 3.379534194913e-05 > > 103 KSP preconditioned resid norm 1.612544300282e-05 true resid norm > 2.489317483299e-04 ||r(i)||/||b|| 3.379587892674e-05 > > 104 KSP preconditioned resid norm 1.880131646630e-05 true resid norm > 2.489335862583e-04 ||r(i)||/||b|| 3.379612845059e-05 > > 105 KSP preconditioned resid norm 1.880563295793e-05 true resid norm > 2.489365017923e-04 ||r(i)||/||b|| 3.379652427408e-05 > > 106 KSP preconditioned resid norm 1.860619184027e-05 true resid norm > 2.489362382373e-04 ||r(i)||/||b|| 3.379648849288e-05 > > 107 KSP preconditioned resid norm 1.877134148719e-05 true resid norm > 2.489425484523e-04 ||r(i)||/||b|| 3.379734519061e-05 > > 108 KSP preconditioned resid norm 1.914810713538e-05 true resid norm > 2.489347415573e-04 ||r(i)||/||b|| 
3.379628529818e-05 > > 109 KSP preconditioned resid norm 1.220673255622e-05 true resid norm > 2.490196186357e-04 ||r(i)||/||b|| 3.380780851884e-05 > > 110 KSP preconditioned resid norm 1.215819132910e-05 true resid norm > 2.490233518072e-04 ||r(i)||/||b|| 3.380831534776e-05 > > 111 KSP preconditioned resid norm 1.196565427400e-05 true resid norm > 2.490279438906e-04 ||r(i)||/||b|| 3.380893878570e-05 > > 112 KSP preconditioned resid norm 1.171748185197e-05 true resid norm > 2.490242578983e-04 ||r(i)||/||b|| 3.380843836198e-05 > > 113 KSP preconditioned resid norm 1.162855824118e-05 true resid norm > 2.490229018536e-04 ||r(i)||/||b|| 3.380825426043e-05 > > 114 KSP preconditioned resid norm 1.175594685689e-05 true resid norm > 2.490274328440e-04 ||r(i)||/||b|| 3.380886940415e-05 > > 115 KSP preconditioned resid norm 1.167979454122e-05 true resid norm > 2.490271161036e-04 ||r(i)||/||b|| 3.380882640232e-05 > > 116 KSP preconditioned resid norm 1.181010893019e-05 true resid norm > 2.490235420657e-04 ||r(i)||/||b|| 3.380834117795e-05 > > 117 KSP preconditioned resid norm 1.175206638194e-05 true resid norm > 2.490263165345e-04 ||r(i)||/||b|| 3.380871784992e-05 > > 118 KSP preconditioned resid norm 1.183804125791e-05 true resid norm > 2.490221353083e-04 ||r(i)||/||b|| 3.380815019145e-05 > > 119 KSP preconditioned resid norm 1.186426973727e-05 true resid norm > 2.490227115336e-04 ||r(i)||/||b|| 3.380822842189e-05 > > 120 KSP preconditioned resid norm 1.181986776689e-05 true resid norm > 2.490257884230e-04 ||r(i)||/||b|| 3.380864615159e-05 > > 121 KSP preconditioned resid norm 1.131443277370e-05 true resid norm > 2.490259110230e-04 ||r(i)||/||b|| 3.380866279620e-05 > > 122 KSP preconditioned resid norm 1.114920075859e-05 true resid norm > 2.490249829382e-04 ||r(i)||/||b|| 3.380853679603e-05 > > 123 KSP preconditioned resid norm 1.082073321672e-05 true resid norm > 2.490314084868e-04 ||r(i)||/||b|| 3.380940915187e-05 > > 124 KSP preconditioned resid norm 3.307785860990e-06 true resid norm > 2.490613501549e-04 ||r(i)||/||b|| 3.381347414155e-05 > > 125 KSP preconditioned resid norm 3.287051720572e-06 true resid norm > 2.490584648195e-04 ||r(i)||/||b|| 3.381308241794e-05 > > 126 KSP preconditioned resid norm 3.286797046069e-06 true resid norm > 2.490654396386e-04 ||r(i)||/||b|| 3.381402934473e-05 > > 127 KSP preconditioned resid norm 3.311592899411e-06 true resid norm > 2.490627973588e-04 ||r(i)||/||b|| 3.381367061922e-05 > > 128 KSP preconditioned resid norm 3.560993694635e-06 true resid norm > 2.490571732816e-04 ||r(i)||/||b|| 3.381290707406e-05 > > 129 KSP preconditioned resid norm 3.411994617661e-06 true resid norm > 2.490652122141e-04 ||r(i)||/||b|| 3.381399846875e-05 > > 130 KSP preconditioned resid norm 3.412383310721e-06 true resid norm > 2.490633454022e-04 ||r(i)||/||b|| 3.381374502359e-05 > > 131 KSP preconditioned resid norm 3.288320044878e-06 true resid norm > 2.490639470096e-04 ||r(i)||/||b|| 3.381382669999e-05 > > 132 KSP preconditioned resid norm 3.273215756565e-06 true resid norm > 2.490640390847e-04 ||r(i)||/||b|| 3.381383920043e-05 > > 133 KSP preconditioned resid norm 3.236969051459e-06 true resid norm > 2.490678216102e-04 ||r(i)||/||b|| 3.381435272985e-05 > > 134 KSP preconditioned resid norm 3.203260913942e-06 true resid norm > 2.490640965346e-04 ||r(i)||/||b|| 3.381384700005e-05 > > 135 KSP preconditioned resid norm 3.224117152353e-06 true resid norm > 2.490655026376e-04 ||r(i)||/||b|| 3.381403789770e-05 > > 136 KSP preconditioned resid norm 3.221577997984e-06 true resid norm > 
2.490684737611e-04 ||r(i)||/||b|| 3.381444126823e-05 > > 137 KSP preconditioned resid norm 3.195936222128e-06 true resid norm > 2.490673982333e-04 ||r(i)||/||b|| 3.381429525066e-05 > > 138 KSP preconditioned resid norm 3.207528137426e-06 true resid norm > 2.490641247196e-04 ||r(i)||/||b|| 3.381385082655e-05 > > 139 KSP preconditioned resid norm 3.240134271963e-06 true resid norm > 2.490615861251e-04 ||r(i)||/||b|| 3.381350617773e-05 > > 140 KSP preconditioned resid norm 2.698833607230e-06 true resid norm > 2.490638954889e-04 ||r(i)||/||b|| 3.381381970535e-05 > > 141 KSP preconditioned resid norm 2.599151209137e-06 true resid norm > 2.490657106698e-04 ||r(i)||/||b|| 3.381406614091e-05 > > 142 KSP preconditioned resid norm 2.633939920994e-06 true resid norm > 2.490707754695e-04 ||r(i)||/||b|| 3.381475375652e-05 > > 143 KSP preconditioned resid norm 2.519609221376e-06 true resid norm > 2.490639100480e-04 ||r(i)||/||b|| 3.381382168195e-05 > > 144 KSP preconditioned resid norm 3.768526937684e-06 true resid norm > 2.490654096698e-04 ||r(i)||/||b|| 3.381402527606e-05 > > 145 KSP preconditioned resid norm 3.707841943289e-06 true resid norm > 2.490630207923e-04 ||r(i)||/||b|| 3.381370095336e-05 > > 146 KSP preconditioned resid norm 3.698827503486e-06 true resid norm > 2.490646071561e-04 ||r(i)||/||b|| 3.381391632387e-05 > > 147 KSP preconditioned resid norm 3.642747039615e-06 true resid norm > 2.490610990161e-04 ||r(i)||/||b|| 3.381344004604e-05 > > 148 KSP preconditioned resid norm 3.613100087842e-06 true resid norm > 2.490617159023e-04 ||r(i)||/||b|| 3.381352379676e-05 > > 149 KSP preconditioned resid norm 3.637646399299e-06 true resid norm > 2.490648063023e-04 ||r(i)||/||b|| 3.381394336069e-05 > > 150 KSP preconditioned resid norm 3.640235367864e-06 true resid norm > 2.490648516718e-04 ||r(i)||/||b|| 3.381394952022e-05 > > 151 KSP preconditioned resid norm 3.724708848977e-06 true resid norm > 2.490622201040e-04 ||r(i)||/||b|| 3.381359224901e-05 > > 152 KSP preconditioned resid norm 3.665185002770e-06 true resid norm > 2.490664302790e-04 ||r(i)||/||b|| 3.381416383766e-05 > > 153 KSP preconditioned resid norm 3.348992579120e-06 true resid norm > 2.490655722697e-04 ||r(i)||/||b|| 3.381404735121e-05 > > 154 KSP preconditioned resid norm 3.309431137943e-06 true resid norm > 2.490727563300e-04 ||r(i)||/||b|| 3.381502268535e-05 > > 155 KSP preconditioned resid norm 3.299031245428e-06 true resid norm > 2.490688392843e-04 ||r(i)||/||b|| 3.381449089298e-05 > > 156 KSP preconditioned resid norm 3.297127463503e-06 true resid norm > 2.490642207769e-04 ||r(i)||/||b|| 3.381386386763e-05 > > 157 KSP preconditioned resid norm 3.297370198641e-06 true resid norm > 2.490666651723e-04 ||r(i)||/||b|| 3.381419572764e-05 > > 158 KSP preconditioned resid norm 3.290873165210e-06 true resid norm > 2.490679189538e-04 ||r(i)||/||b|| 3.381436594557e-05 > > 159 KSP preconditioned resid norm 3.346705292419e-06 true resid norm > 2.490617329776e-04 ||r(i)||/||b|| 3.381352611496e-05 > > 160 KSP preconditioned resid norm 3.429583550890e-06 true resid norm > 2.490675116236e-04 ||r(i)||/||b|| 3.381431064494e-05 > > 161 KSP preconditioned resid norm 3.425238504679e-06 true resid norm > 2.490648199058e-04 ||r(i)||/||b|| 3.381394520756e-05 > > 162 KSP preconditioned resid norm 3.423484857849e-06 true resid norm > 2.490723208298e-04 ||r(i)||/||b|| 3.381496356025e-05 > > 163 KSP preconditioned resid norm 3.383655922943e-06 true resid norm > 2.490659981249e-04 ||r(i)||/||b|| 3.381410516686e-05 > > 164 KSP preconditioned resid norm 
3.477197358452e-06 true resid norm > 2.490665979073e-04 ||r(i)||/||b|| 3.381418659549e-05 > > 165 KSP preconditioned resid norm 3.454672202601e-06 true resid norm > 2.490651358644e-04 ||r(i)||/||b|| 3.381398810323e-05 > > 166 KSP preconditioned resid norm 3.399075522566e-06 true resid norm > 2.490678159511e-04 ||r(i)||/||b|| 3.381435196154e-05 > > 167 KSP preconditioned resid norm 3.305455787400e-06 true resid norm > 2.490651924523e-04 ||r(i)||/||b|| 3.381399578581e-05 > > 168 KSP preconditioned resid norm 3.368445533284e-06 true resid norm > 2.490688061735e-04 ||r(i)||/||b|| 3.381448639774e-05 > > 169 KSP preconditioned resid norm 2.981519724814e-06 true resid norm > 2.490676378334e-04 ||r(i)||/||b|| 3.381432777964e-05 > > 170 KSP preconditioned resid norm 3.034423065539e-06 true resid norm > 2.490694458885e-04 ||r(i)||/||b|| 3.381457324777e-05 > > 171 KSP preconditioned resid norm 2.885972780503e-06 true resid norm > 2.490688033729e-04 ||r(i)||/||b|| 3.381448601752e-05 > > 172 KSP preconditioned resid norm 2.892491075033e-06 true resid norm > 2.490692993765e-04 ||r(i)||/||b|| 3.381455335678e-05 > > 173 KSP preconditioned resid norm 2.921316177611e-06 true resid norm > 2.490697629787e-04 ||r(i)||/||b|| 3.381461629709e-05 > > 174 KSP preconditioned resid norm 2.999889222269e-06 true resid norm > 2.490707272626e-04 ||r(i)||/||b|| 3.381474721178e-05 > > 175 KSP preconditioned resid norm 2.975590207575e-06 true resid norm > 2.490685439925e-04 ||r(i)||/||b|| 3.381445080310e-05 > > 176 KSP preconditioned resid norm 2.983065843597e-06 true resid norm > 2.490701883671e-04 ||r(i)||/||b|| 3.381467404937e-05 > > 177 KSP preconditioned resid norm 2.965959610245e-06 true resid norm > 2.490711538630e-04 ||r(i)||/||b|| 3.381480512861e-05 > > 178 KSP preconditioned resid norm 3.005389788827e-06 true resid norm > 2.490751808095e-04 ||r(i)||/||b|| 3.381535184150e-05 > > 179 KSP preconditioned resid norm 2.956581668772e-06 true resid norm > 2.490653125636e-04 ||r(i)||/||b|| 3.381401209257e-05 > > 180 KSP preconditioned resid norm 2.937498883661e-06 true resid norm > 2.490666056653e-04 ||r(i)||/||b|| 3.381418764874e-05 > > 181 KSP preconditioned resid norm 2.913227475431e-06 true resid norm > 2.490682436979e-04 ||r(i)||/||b|| 3.381441003402e-05 > > 182 KSP preconditioned resid norm 3.048172862254e-06 true resid norm > 2.490719669872e-04 ||r(i)||/||b|| 3.381491552130e-05 > > 183 KSP preconditioned resid norm 3.023868104933e-06 true resid norm > 2.490648745555e-04 ||r(i)||/||b|| 3.381395262699e-05 > > 184 KSP preconditioned resid norm 2.985947506400e-06 true resid norm > 2.490638818852e-04 ||r(i)||/||b|| 3.381381785846e-05 > > 185 KSP preconditioned resid norm 2.840032055776e-06 true resid norm > 2.490701112392e-04 ||r(i)||/||b|| 3.381466357820e-05 > > 186 KSP preconditioned resid norm 2.229279683815e-06 true resid norm > 2.490609220680e-04 ||r(i)||/||b|| 3.381341602292e-05 > > 187 KSP preconditioned resid norm 2.441513276379e-06 true resid norm > 2.490674056899e-04 ||r(i)||/||b|| 3.381429626300e-05 > > 188 KSP preconditioned resid norm 2.467046864016e-06 true resid norm > 2.490691622632e-04 ||r(i)||/||b|| 3.381453474178e-05 > > 189 KSP preconditioned resid norm 2.482124586361e-06 true resid norm > 2.490664992339e-04 ||r(i)||/||b|| 3.381417319923e-05 > > 190 KSP preconditioned resid norm 2.470564926502e-06 true resid norm > 2.490617019713e-04 ||r(i)||/||b|| 3.381352190543e-05 > > 191 KSP preconditioned resid norm 2.457947086578e-06 true resid norm > 2.490628644250e-04 ||r(i)||/||b|| 3.381367972437e-05 > > 192 
KSP preconditioned resid norm 2.469444741724e-06 true resid norm > 2.490639416335e-04 ||r(i)||/||b|| 3.381382597011e-05 > > 193 KSP preconditioned resid norm 2.469951525219e-06 true resid norm > 2.490599769764e-04 ||r(i)||/||b|| 3.381328771385e-05 > > 194 KSP preconditioned resid norm 2.467486786643e-06 true resid norm > 2.490630178622e-04 ||r(i)||/||b|| 3.381370055556e-05 > > 195 KSP preconditioned resid norm 2.409684391404e-06 true resid norm > 2.490640302606e-04 ||r(i)||/||b|| 3.381383800245e-05 > > 196 KSP preconditioned resid norm 2.456046691135e-06 true resid norm > 2.490637730235e-04 ||r(i)||/||b|| 3.381380307900e-05 > > 197 KSP preconditioned resid norm 2.300015653805e-06 true resid norm > 2.490615406913e-04 ||r(i)||/||b|| 3.381350000947e-05 > > 198 KSP preconditioned resid norm 2.238328275301e-06 true resid norm > 2.490647641246e-04 ||r(i)||/||b|| 3.381393763449e-05 > > 199 KSP preconditioned resid norm 2.317293820319e-06 true resid norm > 2.490641611282e-04 ||r(i)||/||b|| 3.381385576951e-05 > > 200 KSP preconditioned resid norm 2.359590971314e-06 true resid norm > 2.490685242974e-04 ||r(i)||/||b|| 3.381444812922e-05 > > 201 KSP preconditioned resid norm 2.311199691596e-06 true resid norm > 2.490656791753e-04 ||r(i)||/||b|| 3.381406186510e-05 > > 202 KSP preconditioned resid norm 2.328772904196e-06 true resid norm > 2.490651045523e-04 ||r(i)||/||b|| 3.381398385220e-05 > > 203 KSP preconditioned resid norm 2.332731604717e-06 true resid norm > 2.490649960574e-04 ||r(i)||/||b|| 3.381396912253e-05 > > 204 KSP preconditioned resid norm 2.357629383490e-06 true resid norm > 2.490686317727e-04 ||r(i)||/||b|| 3.381446272046e-05 > > 205 KSP preconditioned resid norm 2.374856180299e-06 true resid norm > 2.490645897176e-04 ||r(i)||/||b|| 3.381391395637e-05 > > 206 KSP preconditioned resid norm 2.340395514404e-06 true resid norm > 2.490618341127e-04 ||r(i)||/||b|| 3.381353984542e-05 > > 207 KSP preconditioned resid norm 2.314963680954e-06 true resid norm > 2.490676153984e-04 ||r(i)||/||b|| 3.381432473379e-05 > > 208 KSP preconditioned resid norm 2.448070953106e-06 true resid norm > 2.490644606776e-04 ||r(i)||/||b|| 3.381389643743e-05 > > 209 KSP preconditioned resid norm 2.428805110632e-06 true resid norm > 2.490635817597e-04 ||r(i)||/||b|| 3.381377711234e-05 > > 210 KSP preconditioned resid norm 2.537929937808e-06 true resid norm > 2.490680589404e-04 ||r(i)||/||b|| 3.381438495066e-05 > > 211 KSP preconditioned resid norm 2.515909029682e-06 true resid norm > 2.490687803038e-04 ||r(i)||/||b|| 3.381448288557e-05 > > 212 KSP preconditioned resid norm 2.497907513266e-06 true resid norm > 2.490618016885e-04 ||r(i)||/||b|| 3.381353544340e-05 > > 213 KSP preconditioned resid norm 1.783501869502e-06 true resid norm > 2.490632647470e-04 ||r(i)||/||b|| 3.381373407354e-05 > > 214 KSP preconditioned resid norm 1.767420653144e-06 true resid norm > 2.490685569328e-04 ||r(i)||/||b|| 3.381445255992e-05 > > 215 KSP preconditioned resid norm 1.854926068272e-06 true resid norm > 2.490609365464e-04 ||r(i)||/||b|| 3.381341798856e-05 > > 216 KSP preconditioned resid norm 1.818308539774e-06 true resid norm > 2.490639142283e-04 ||r(i)||/||b|| 3.381382224948e-05 > > 217 KSP preconditioned resid norm 1.809431578070e-06 true resid norm > 2.490605125049e-04 ||r(i)||/||b|| 3.381336041915e-05 > > 218 KSP preconditioned resid norm 1.789862735999e-06 true resid norm > 2.490564024901e-04 ||r(i)||/||b|| 3.381280242859e-05 > > 219 KSP preconditioned resid norm 1.769239890163e-06 true resid norm > 2.490647825316e-04 
||r(i)||/||b|| 3.381394013349e-05 > > 220 KSP preconditioned resid norm 1.780760773109e-06 true resid norm > 2.490622606663e-04 ||r(i)||/||b|| 3.381359775589e-05 > > 221 KSP preconditioned resid norm 5.009024913368e-07 true resid norm > 2.490659101637e-04 ||r(i)||/||b|| 3.381409322492e-05 > > 222 KSP preconditioned resid norm 4.974450322799e-07 true resid norm > 2.490714287402e-04 ||r(i)||/||b|| 3.381484244693e-05 > > 223 KSP preconditioned resid norm 4.938819481519e-07 true resid norm > 2.490665661715e-04 ||r(i)||/||b|| 3.381418228693e-05 > > 224 KSP preconditioned resid norm 4.973231831266e-07 true resid norm > 2.490725000995e-04 ||r(i)||/||b|| 3.381498789855e-05 > > 225 KSP preconditioned resid norm 5.086864036771e-07 true resid norm > 2.490664132954e-04 ||r(i)||/||b|| 3.381416153192e-05 > > 226 KSP preconditioned resid norm 5.046954570561e-07 true resid norm > 2.490698772594e-04 ||r(i)||/||b|| 3.381463181226e-05 > > 227 KSP preconditioned resid norm 5.086852920874e-07 true resid norm > 2.490703544723e-04 ||r(i)||/||b|| 3.381469660041e-05 > > 228 KSP preconditioned resid norm 5.182381756169e-07 true resid norm > 2.490665200032e-04 ||r(i)||/||b|| 3.381417601896e-05 > > 229 KSP preconditioned resid norm 5.261455182896e-07 true resid norm > 2.490697169472e-04 ||r(i)||/||b|| 3.381461004770e-05 > > 230 KSP preconditioned resid norm 5.265262522400e-07 true resid norm > 2.490726890541e-04 ||r(i)||/||b|| 3.381501355172e-05 > > 231 KSP preconditioned resid norm 5.220652263946e-07 true resid norm > 2.490689325236e-04 ||r(i)||/||b|| 3.381450355149e-05 > > 232 KSP preconditioned resid norm 5.256466259888e-07 true resid norm > 2.490694033989e-04 ||r(i)||/||b|| 3.381456747923e-05 > > 233 KSP preconditioned resid norm 5.443022648374e-07 true resid norm > 2.490650183144e-04 ||r(i)||/||b|| 3.381397214423e-05 > > 234 KSP preconditioned resid norm 5.562619006436e-07 true resid norm > 2.490764576883e-04 ||r(i)||/||b|| 3.381552519520e-05 > > 235 KSP preconditioned resid norm 5.998148629545e-07 true resid norm > 2.490714032716e-04 ||r(i)||/||b|| 3.381483898922e-05 > > 236 KSP preconditioned resid norm 6.498977322955e-07 true resid norm > 2.490650270144e-04 ||r(i)||/||b|| 3.381397332537e-05 > > 237 KSP preconditioned resid norm 6.503686003429e-07 true resid norm > 2.490706976108e-04 ||r(i)||/||b|| 3.381474318615e-05 > > 238 KSP preconditioned resid norm 6.566719023119e-07 true resid norm > 2.490664107559e-04 ||r(i)||/||b|| 3.381416118714e-05 > > 239 KSP preconditioned resid norm 6.549737473208e-07 true resid norm > 2.490721547909e-04 ||r(i)||/||b|| 3.381494101821e-05 > > 240 KSP preconditioned resid norm 6.616898981418e-07 true resid norm > 2.490679659838e-04 ||r(i)||/||b|| 3.381437233053e-05 > > 241 KSP preconditioned resid norm 6.829917691021e-07 true resid norm > 2.490728328614e-04 ||r(i)||/||b|| 3.381503307553e-05 > > 242 KSP preconditioned resid norm 7.030239869389e-07 true resid norm > 2.490706345115e-04 ||r(i)||/||b|| 3.381473461955e-05 > > 243 KSP preconditioned resid norm 7.018435683340e-07 true resid norm > 2.490650978460e-04 ||r(i)||/||b|| 3.381398294172e-05 > > 244 KSP preconditioned resid norm 7.058047080376e-07 true resid norm > 2.490685975642e-04 ||r(i)||/||b|| 3.381445807618e-05 > > 245 KSP preconditioned resid norm 6.896300385099e-07 true resid norm > 2.490708566380e-04 ||r(i)||/||b|| 3.381476477625e-05 > > 246 KSP preconditioned resid norm 7.093960074437e-07 true resid norm > 2.490667427871e-04 ||r(i)||/||b|| 3.381420626490e-05 > > 247 KSP preconditioned resid norm 7.817121711853e-07 true 
resid norm > 2.490692299030e-04 ||r(i)||/||b|| 3.381454392480e-05 > > 248 KSP preconditioned resid norm 7.976109778309e-07 true resid norm > 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > 249 KSP preconditioned resid norm 7.855322750445e-07 true resid norm > 2.490720861966e-04 ||r(i)||/||b|| 3.381493170560e-05 > > 250 KSP preconditioned resid norm 7.778531114042e-07 true resid norm > 2.490673235034e-04 ||r(i)||/||b|| 3.381428510506e-05 > > 251 KSP preconditioned resid norm 7.848682182070e-07 true resid norm > 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > 252 KSP preconditioned resid norm 7.967291867330e-07 true resid norm > 2.490724820229e-04 ||r(i)||/||b|| 3.381498544442e-05 > > 253 KSP preconditioned resid norm 7.865012959525e-07 true resid norm > 2.490666662028e-04 ||r(i)||/||b|| 3.381419586754e-05 > > 254 KSP preconditioned resid norm 7.656025385804e-07 true resid norm > 2.490686283214e-04 ||r(i)||/||b|| 3.381446225190e-05 > > 255 KSP preconditioned resid norm 7.757018653468e-07 true resid norm > 2.490655983763e-04 ||r(i)||/||b|| 3.381405089553e-05 > > 256 KSP preconditioned resid norm 6.686490372981e-07 true resid norm > 2.490715698964e-04 ||r(i)||/||b|| 3.381486161081e-05 > > 257 KSP preconditioned resid norm 6.596005109428e-07 true resid norm > 2.490666403003e-04 ||r(i)||/||b|| 3.381419235092e-05 > > 258 KSP preconditioned resid norm 6.681742296333e-07 true resid norm > 2.490683725835e-04 ||r(i)||/||b|| 3.381442753198e-05 > > 259 KSP preconditioned resid norm 1.089245482033e-06 true resid norm > 2.490688086568e-04 ||r(i)||/||b|| 3.381448673488e-05 > > 260 KSP preconditioned resid norm 1.099844873189e-06 true resid norm > 2.490690703265e-04 ||r(i)||/||b|| 3.381452226011e-05 > > 261 KSP preconditioned resid norm 1.112925540869e-06 true resid norm > 2.490664481058e-04 ||r(i)||/||b|| 3.381416625790e-05 > > 262 KSP preconditioned resid norm 1.113056910480e-06 true resid norm > 2.490658753273e-04 ||r(i)||/||b|| 3.381408849541e-05 > > 263 KSP preconditioned resid norm 1.104801535149e-06 true resid norm > 2.490736510776e-04 ||r(i)||/||b|| 3.381514415953e-05 > > 264 KSP preconditioned resid norm 1.158709147873e-06 true resid norm > 2.490607531152e-04 ||r(i)||/||b|| 3.381339308528e-05 > > 265 KSP preconditioned resid norm 1.178985740182e-06 true resid norm > 2.490727895619e-04 ||r(i)||/||b|| 3.381502719703e-05 > > 266 KSP preconditioned resid norm 1.165130533478e-06 true resid norm > 2.490639076693e-04 ||r(i)||/||b|| 3.381382135901e-05 > > 267 KSP preconditioned resid norm 1.181364114499e-06 true resid norm > 2.490667871436e-04 ||r(i)||/||b|| 3.381421228690e-05 > > 268 KSP preconditioned resid norm 1.170295348543e-06 true resid norm > 2.490662613306e-04 ||r(i)||/||b|| 3.381414090063e-05 > > 269 KSP preconditioned resid norm 1.213243016230e-06 true resid norm > 2.490666173719e-04 ||r(i)||/||b|| 3.381418923808e-05 > > 270 KSP preconditioned resid norm 1.239691953997e-06 true resid norm > 2.490678323197e-04 ||r(i)||/||b|| 3.381435418381e-05 > > 271 KSP preconditioned resid norm 1.219891740100e-06 true resid norm > 2.490625009256e-04 ||r(i)||/||b|| 3.381363037437e-05 > > 272 KSP preconditioned resid norm 1.231321334346e-06 true resid norm > 2.490659733696e-04 ||r(i)||/||b|| 3.381410180599e-05 > > 273 KSP preconditioned resid norm 1.208183234158e-06 true resid norm > 2.490685987255e-04 ||r(i)||/||b|| 3.381445823385e-05 > > 274 KSP preconditioned resid norm 1.211768545589e-06 true resid norm > 2.490671548953e-04 ||r(i)||/||b|| 3.381426221421e-05 > > 275 KSP preconditioned resid 
norm 1.209433459842e-06 true resid norm 2.490669016096e-04 ||r(i)||/||b|| 3.381422782722e-05
> > 276 KSP preconditioned resid norm 1.223729184405e-06 true resid norm 2.490658128014e-04 ||r(i)||/||b|| 3.381408000666e-05
> > 277 KSP preconditioned resid norm 1.243915201868e-06 true resid norm 2.490693375756e-04 ||r(i)||/||b|| 3.381455854282e-05
> > 278 KSP preconditioned resid norm 1.231994655529e-06 true resid norm 2.490682988311e-04 ||r(i)||/||b|| 3.381441751910e-05
> > 279 KSP preconditioned resid norm 1.227930683777e-06 true resid norm 2.490667825866e-04 ||r(i)||/||b|| 3.381421166823e-05
> > 280 KSP preconditioned resid norm 1.193458846469e-06 true resid norm 2.490687366117e-04 ||r(i)||/||b|| 3.381447695378e-05
> > 281 KSP preconditioned resid norm 1.217089059805e-06 true resid norm 2.490674797371e-04 ||r(i)||/||b|| 3.381430631591e-05
> > 282 KSP preconditioned resid norm 1.249318287709e-06 true resid norm 2.490662866951e-04 ||r(i)||/||b|| 3.381414434420e-05
> > 283 KSP preconditioned resid norm 1.183320029547e-06 true resid norm 2.490645783630e-04 ||r(i)||/||b|| 3.381391241482e-05
> > 284 KSP preconditioned resid norm 1.174730603102e-06 true resid norm 2.490686881647e-04 ||r(i)||/||b|| 3.381447037643e-05
> > 285 KSP preconditioned resid norm 1.175838261923e-06 true resid norm 2.490665969300e-04 ||r(i)||/||b|| 3.381418646281e-05
> > 286 KSP preconditioned resid norm 1.188946188368e-06 true resid norm 2.490661974622e-04 ||r(i)||/||b|| 3.381413222961e-05
> > 287 KSP preconditioned resid norm 1.177848565707e-06 true resid norm 2.490660236206e-04 ||r(i)||/||b|| 3.381410862824e-05
> > 288 KSP preconditioned resid norm 1.200075508281e-06 true resid norm 2.490645353536e-04 ||r(i)||/||b|| 3.381390657571e-05
> > 289 KSP preconditioned resid norm 1.184589570618e-06 true resid norm 2.490664920355e-04 ||r(i)||/||b|| 3.381417222195e-05
> > 290 KSP preconditioned resid norm 1.221114703873e-06 true resid norm 2.490670597538e-04 ||r(i)||/||b|| 3.381424929746e-05
> > 291 KSP preconditioned resid norm 1.249479658256e-06 true resid norm 2.490641582876e-04 ||r(i)||/||b|| 3.381385538385e-05
> > 292 KSP preconditioned resid norm 1.245768496850e-06 true resid norm 2.490704480588e-04 ||r(i)||/||b|| 3.381470930606e-05
> > 293 KSP preconditioned resid norm 1.243742607953e-06 true resid norm 2.490649690604e-04 ||r(i)||/||b|| 3.381396545733e-05
> > 294 KSP preconditioned resid norm 1.342758483339e-06 true resid norm 2.490676207432e-04 ||r(i)||/||b|| 3.381432545942e-05
> > 295 KSP preconditioned resid norm 1.353816099600e-06 true resid norm 2.490695263153e-04 ||r(i)||/||b|| 3.381458416681e-05
> > 296 KSP preconditioned resid norm 1.343886763293e-06 true resid norm 2.490673674307e-04 ||r(i)||/||b|| 3.381429106879e-05
> > 297 KSP preconditioned resid norm 1.355511022815e-06 true resid norm 2.490686565995e-04 ||r(i)||/||b|| 3.381446609103e-05
> > 298 KSP preconditioned resid norm 1.347247627243e-06 true resid norm 2.490696287707e-04 ||r(i)||/||b|| 3.381459807652e-05
> > 299 KSP preconditioned resid norm 1.414742595618e-06 true resid norm 2.490749815091e-04 ||r(i)||/||b|| 3.381532478374e-05
> > 300 KSP preconditioned resid norm 1.418560683189e-06 true resid norm 2.490721501153e-04 ||r(i)||/||b|| 3.381494038343e-05
> > 301 KSP preconditioned resid norm 1.416276404923e-06 true resid norm 2.490689576447e-04 ||r(i)||/||b|| 3.381450696203e-05
> > 302 KSP preconditioned resid norm 1.431448272112e-06 true resid norm 2.490688812701e-04 ||r(i)||/||b|| 3.381449659312e-05
> > 303 KSP preconditioned resid norm 1.446154958969e-06 true resid norm 2.490727536322e-04 ||r(i)||/||b|| 3.381502231909e-05
> > 304 KSP preconditioned resid norm 1.468860617921e-06 true resid norm 2.490692363788e-04 ||r(i)||/||b|| 3.381454480397e-05
> > 305 KSP preconditioned resid norm 1.627595214971e-06 true resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05
> > 306 KSP preconditioned resid norm 1.614384672893e-06 true resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05
> > 307 KSP preconditioned resid norm 1.605568020532e-06 true resid norm 2.490699757693e-04 ||r(i)||/||b|| 3.381464518632e-05
> > 308 KSP preconditioned resid norm 1.617069685075e-06 true resid norm 2.490649282923e-04 ||r(i)||/||b|| 3.381395992249e-05
> > 309 KSP preconditioned resid norm 1.654297792738e-06 true resid norm 2.490644766626e-04 ||r(i)||/||b|| 3.381389860760e-05
> > 310 KSP preconditioned resid norm 1.587528143215e-06 true resid norm 2.490696752096e-04 ||r(i)||/||b|| 3.381460438124e-05
> > 311 KSP preconditioned resid norm 1.662782022388e-06 true resid norm 2.490699317737e-04 ||r(i)||/||b|| 3.381463921332e-05
> > 312 KSP preconditioned resid norm 1.618211471748e-06 true resid norm 2.490735831308e-04 ||r(i)||/||b|| 3.381513493483e-05
> > 313 KSP preconditioned resid norm 1.609074961921e-06 true resid norm 2.490679566436e-04 ||r(i)||/||b|| 3.381437106247e-05
> > 314 KSP preconditioned resid norm 1.548068942878e-06 true resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05
> > 315 KSP preconditioned resid norm 1.526718322150e-06 true resid norm 2.490619832967e-04 ||r(i)||/||b|| 3.381356009919e-05
> > 316 KSP preconditioned resid norm 1.553150959105e-06 true resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05
> > 317 KSP preconditioned resid norm 1.615015320906e-06 true resid norm 2.490672348079e-04 ||r(i)||/||b|| 3.381427306343e-05
> > 318 KSP preconditioned resid norm 1.602904469797e-06 true resid norm 2.490696731006e-04 ||r(i)||/||b|| 3.381460409491e-05
> > 319 KSP preconditioned resid norm 1.538140323073e-06 true resid norm 2.490722982494e-04 ||r(i)||/||b|| 3.381496049466e-05
> > 320 KSP preconditioned resid norm 1.534779679430e-06 true resid norm 2.490778499789e-04 ||r(i)||/||b|| 3.381571421763e-05
> > 321 KSP preconditioned resid norm 1.547155843355e-06 true resid norm 2.490767612985e-04 ||r(i)||/||b|| 3.381556641442e-05
> > 322 KSP preconditioned resid norm 1.422137008870e-06 true resid norm 2.490737676309e-04 ||r(i)||/||b|| 3.381515998323e-05
> > 323 KSP preconditioned resid norm 1.403072558954e-06 true resid norm 2.490741361870e-04 ||r(i)||/||b|| 3.381521001975e-05
> > 324 KSP preconditioned resid norm 1.373070436118e-06 true resid norm 2.490742214990e-04 ||r(i)||/||b|| 3.381522160202e-05
> > 325 KSP preconditioned resid norm 1.359547585233e-06 true resid norm 2.490792987570e-04 ||r(i)||/||b|| 3.381591090902e-05
> > 326 KSP preconditioned resid norm 1.370351913612e-06 true resid norm 2.490727161158e-04 ||r(i)||/||b|| 3.381501722573e-05
> > 327 KSP preconditioned resid norm 1.365238666187e-06 true resid norm 2.490716949642e-04 ||r(i)||/||b|| 3.381487859046e-05
> > 328 KSP preconditioned resid norm 1.369073373042e-06 true resid norm 2.490807360288e-04 ||r(i)||/||b|| 3.381610603826e-05
> > 329 KSP preconditioned resid norm 1.426698981572e-06 true resid norm 2.490791479521e-04 ||r(i)||/||b|| 3.381589043520e-05
> > 330 KSP preconditioned resid norm 1.445542403570e-06 true resid norm 2.490775981409e-04 ||r(i)||/||b|| 3.381568002720e-05
> > 331 KSP preconditioned resid norm 1.464506963984e-06 true resid norm 2.490740562430e-04 ||r(i)||/||b|| 3.381519916626e-05
> > 332 KSP preconditioned resid norm 1.461462964401e-06 true resid norm 2.490768016856e-04 ||r(i)||/||b|| 3.381557189753e-05
> > 333 KSP preconditioned resid norm 1.476680847971e-06 true resid norm 2.490744321516e-04 ||r(i)||/||b|| 3.381525020097e-05
> > 334 KSP preconditioned resid norm 1.459640372198e-06 true resid norm 2.490788817993e-04 ||r(i)||/||b|| 3.381585430132e-05
> > 335 KSP preconditioned resid norm 1.790770882365e-06 true resid norm 2.490771711471e-04 ||r(i)||/||b|| 3.381562205697e-05
> > 336 KSP preconditioned resid norm 1.803770155018e-06 true resid norm 2.490768953858e-04 ||r(i)||/||b|| 3.381558461860e-05
> > 337 KSP preconditioned resid norm 1.787821255995e-06 true resid norm 2.490767985676e-04 ||r(i)||/||b|| 3.381557147421e-05
> > 338 KSP preconditioned resid norm 1.749912220831e-06 true resid norm 2.490760198704e-04 ||r(i)||/||b|| 3.381546575545e-05
> > 339 KSP preconditioned resid norm 1.802915839010e-06 true resid norm 2.490815556273e-04 ||r(i)||/||b|| 3.381621730993e-05
> > 340 KSP preconditioned resid norm 1.800777670709e-06 true resid norm 2.490823909286e-04 ||r(i)||/||b|| 3.381633071347e-05
> > 341 KSP preconditioned resid norm 1.962516327690e-06 true resid norm 2.490773477410e-04 ||r(i)||/||b|| 3.381564603199e-05
> > 342 KSP preconditioned resid norm 1.981726465132e-06 true resid norm 2.490769116884e-04 ||r(i)||/||b|| 3.381558683191e-05
> > 343 KSP preconditioned resid norm 1.963419167052e-06 true resid norm 2.490764009914e-04 ||r(i)||/||b|| 3.381551749783e-05
> > 344 KSP preconditioned resid norm 1.992082169278e-06 true resid norm 2.490806829883e-04 ||r(i)||/||b|| 3.381609883728e-05
> > 345 KSP preconditioned resid norm 1.981005134253e-06 true resid norm 2.490748677339e-04 ||r(i)||/||b|| 3.381530933721e-05
> > 346 KSP preconditioned resid norm 1.959802663114e-06 true resid norm 2.490773752317e-04 ||r(i)||/||b|| 3.381564976423e-05
> >
> > On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley wrote:
> > On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang wrote:
> >
> > ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
> > ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);
> >
> > ierr = VecAssemblyBegin(x);
> > ierr = VecAssemblyEnd(x);
> > This is probably unnecessary
> >
> > ierr = VecAssemblyBegin(b);
> > ierr = VecAssemblyEnd(b);
> > This is probably unnecessary
> >
> > ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp);
> > ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8
> > Is your rhs consistent with this nullspace?
> >
> > // KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
> > KSPSetOperators(ksp,A,A);
> >
> > KSPSetType(ksp,KSPBCGS);
> >
> > KSPSetComputeSingularValues(ksp, PETSC_TRUE);
> > #if defined(__HYPRE__)
> > KSPGetPC(ksp, &pc);
> > PCSetType(pc, PCHYPRE);
> > PCHYPRESetType(pc,"boomeramg");
> > This is terribly unnecessary. You just use
> >
> >   -pc_type hypre -pc_hypre_type boomeramg
> >
> > or
> >
> >   -pc_type gamg
> >
> > #else
> > KSPSetType(ksp,KSPBCGSL);
> > KSPBCGSLSetEll(ksp,2);
> > #endif /* defined(__HYPRE__) */
> >
> > KSPSetFromOptions(ksp);
> > KSPSetUp(ksp);
> >
> > ierr = KSPSolve(ksp,b,x);
> >
> > command line
> >
> > You did not provide any of what I asked for in the previous mail.
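> >
> > (To make the rhs consistent with that null space, one option is to project the null space component out of b before the solve. A minimal sketch, not from your code, using the nullsp and b from the snippet above and the two-argument form of the call that PETSc 3.8 uses:)
> >
> >   ierr = MatNullSpaceRemove(nullsp,b);CHKERRQ(ierr);  /* removes the constant component from b */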
> > Matt
> >
> > On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley wrote:
> > On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang wrote:
> > hi,
> >
> > I implemented the HYPRE preconditioner for my study due to the fact that without a preconditioner, the PETSc solver will take thousands of iterations to converge for a fine grid simulation.
> >
> > with HYPRE, depending on the parallel partition, it will take HYPRE forever to do anything. observation of the output file is that the simulation is hanging with no output.
> >
> > Any idea what happened? will post snippet of code.
> >
> > 1) For any question about convergence, we need to see the output of
> >
> >   -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason
> >
> > 2) Hypre has many preconditioners, which one are you talking about
> >
> > 3) PETSc has some preconditioners in common with Hypre, like AMG
> >
> > Thanks,
> >
> >   Matt
> >
> > --
> > Hao Zhang
> > Dept. of Applied Mathematics and Statistics,
> > Stony Brook University,
> > Stony Brook, New York, 11790
> >
> > --
> > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> > -- Norbert Wiener
> >
> > https://www.cse.buffalo.edu/~knepley/

From bsmith at mcs.anl.gov  Sat Oct 21 17:53:44 2017
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Sat, 21 Oct 2017 17:53:44 -0500
Subject: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To:
References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov>
Message-ID:

> On Oct 21, 2017, at 5:47 PM, Hao Zhang wrote:
>
> hi, Barry:
> what do you mean absurd by setting tolerance =1e-14?

   Trying to decrease the initial residual norm by a factor of 1e-14 with an iterative method (or even a direct method) is unrealistic (usually unachievable) and almost never necessary. You are requiring || r_n || < 1.e-14 || r_0 || when with double precision numbers you only have roughly 14 decimal digits total to compute with. Round-off alone will lead to differences far larger than 1e-14.

   If you are using the solver in the context of a nonlinear problem (i.e. inside Newton's method) then 1.e-6 is generally more than plenty to get quadratic convergence of Newton's method. If you are solving a linear problem then it is extremely likely that errors due to discretization (from the finite element method etc.) and the model are much, much larger than even 1.e-8.

   So, in summary: 1.e-14 is probably unachievable, and 1.e-14 is almost for sure not needed.
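   If you do want to set the tolerance explicitly in code rather than through the options database, something on this order is more realistic (a minimal sketch; ksp here stands for whatever KSP object the code above configures):

      ierr = KSPSetTolerances(ksp,1.e-6,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);CHKERRQ(ierr);  /* rtol = 1e-6; keep default abstol, dtol, maxits */

   or, equivalently, -ksp_rtol 1.e-6 on the command line.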
   Barry

> On Sat, Oct 21, 2017 at 18:42 Barry Smith wrote:
>
>   Run with -ksp_view_mat binary -ksp_view_rhs binary and send the resulting output file called binaryoutput to petsc-maint at mcs.anl.gov
>
>   Note you can also use -ksp_type gmres with hypre; there is unlikely to be a reason to use bcgs.
>
>   BTW: tolerances: relative=1e-14, is absurd
>
>   My guess is your null space is incorrect.
>
> > On Oct 21, 2017, at 4:34 PM, Hao Zhang wrote:
> >
> > if this solver doesn't converge, I have a fall-back solution, which uses the GMRES solver. this setup is fine with me. I just want to know if HYPRE is a reliable solution for me, or whether I will have to go without a preconditioner.
> >
> > Thanks!
> >
> > On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang wrote:
> > this is a serial run, still dumping output. parallel is more or less the same.
> >
> > KSP Object: 1 MPI processes
> >   type: bcgs
> >   maximum iterations=40000, initial guess is zero
> >   tolerances: relative=1e-14, absolute=1e-14, divergence=10000.
> >   left preconditioning
> >   using PRECONDITIONED norm type for convergence test
> > PC Object: 1 MPI processes
> >   type: hypre
> >     HYPRE BoomerAMG preconditioning
> >       Cycle type V
> >       Maximum number of levels 25
> >       Maximum number of iterations PER hypre call 1
> >       Convergence tolerance PER hypre call 0.
> >       Threshold for strong coupling 0.25
> >       Interpolation truncation factor 0.
> >       Interpolation: max elements per row 0
> >       Number of levels of aggressive coarsening 0
> >       Number of paths for aggressive coarsening 1
> >       Maximum row sums 0.9
> >       Sweeps down       1
> >       Sweeps up         1
> >       Sweeps on coarse  1
> >       Relax down        symmetric-SOR/Jacobi
> >       Relax up          symmetric-SOR/Jacobi
> >       Relax on coarse   Gaussian-elimination
> >       Relax weight (all)       1.
> >       Outer relax weight (all) 1.
> >       Using CF-relaxation
> >       Not using more complex smoothers.
> >       Measure type       local
> >       Coarsen type       Falgout
> >       Interpolation type classical
> >   linear system matrix = precond matrix:
> >   Mat Object: A 1 MPI processes
> >     type: seqaij
> >     rows=497664, cols=497664
> >     total: nonzeros=3363552, allocated nonzeros=3483648
> >     total number of mallocs used during MatSetValues calls =0
> >       has attached null space
> >       not using I-node routines
> > [the 0-346 iteration residual history and the rest of the quoted thread, identical to what appears above, are omitted here]
From zakaryah at gmail.com  Sat Oct 21 21:16:09 2017
From: zakaryah at gmail.com (zakaryah .)
Date: Sat, 21 Oct 2017 22:16:09 -0400
Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods
In-Reply-To:
References: <98D45588-381F-44FB-9D20-65D1638E357F@mcs.anl.gov> <876F0ACF-62C0-4DC9-B203-2893B25E3938@mcs.anl.gov> <1D94CC99-2395-4BBE-A0F3-80D23D85FA45@mcs.anl.gov> <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov>
Message-ID:

OK, it turns out Lukasz was exactly correct. With whatever method I try, the solver or stepper approaches a critical point, which is associated with some kind of snap-through. I have looked into the control techniques and they are pretty ingenious, and I think they should work for my problem, in that I hope to continue through the critical point.

I have a technical question about the implementation, though. Following Riks (1979), for example, the control parameter is the approximate arc-length in the phase space of loading intensity and displacements. It represents one additional variable in the system, and there is one additional equation in the system (in Riks, this is eq. 3.9).

In my implementation, the displacements are implemented as a DMDA with 3 dof, since I'm working in 3D. I'm not sure about the best way to add the single additional variable and equation. The way I see it, I either give up on using the DMDA, in which case I'm not sure how to efficiently implement the stencil I need to calculate spatial derivatives of the displacements, or I have to add a rather large number of extra variables. For example, if my DMDA is WxHxD, I would have to make it (W+1)xHxD, and each of the extra HxD variables will have 3 dof.
With that padding, 3xHxD-1 variables are in the nullspace (because they don't represent anything, so I would have to add a bunch of zeros to the function and the Jacobian), while the remaining variable is used as the control parameter. I'm aware of other methods, e.g. Crisfield 1983, but I'm interested in whether there is a straightforward way to implement Riks' method in PETSc. I'm sure I'm missing something so hopefully someone can give me some hints. Thanks for all the help! On Thu, Oct 12, 2017 at 2:02 PM, zakaryah . wrote: > Thanks for the response, Matt - these are excellent questions. > > On theoretical grounds, I am certain that the solution to the continuous > PDE exists. Without any serious treatment, I think this means the > discretized system should have a solution up to discretization error, but > perhaps this is indeed a bad approach. > > I am not sure whether the equations are "really hard to solve". At each > point, the equations are third order polynomials of the state variable at > that point and at nearby points (i.e. in the stencil). One possible > complication is that the external forces which are applied to the interior > of the material can be fairly complex - they are smooth, but they can have > many inflection points. > > I don't have a great test case for which I know a good solution. To my > thinking, there is no way that time-stepping the parabolic version of the > same PDE can fail to yield a solution at infinite time. So, I'm going to > try starting there. Converting the problem to a minimization is a bit > trickier, because the discretization has to be performed one step earlier > in the calculation, and therefore the gradient and Hessian would need to be > recalculated. > > Even if there are some problems with time-stepping (speed of > convergence?), maybe I can use the solutions as better test cases for the > elliptic PDE solved via SNES. > > Can you give me any additional lingo or references for the fracture > problem? > > Thanks, Zak > > On Wed, Oct 11, 2017 at 8:53 PM, Matthew Knepley > wrote: > >> On Wed, Oct 11, 2017 at 11:33 AM, zakaryah . wrote: >> >>> Many thanks for the suggestions, Matt. >>> >>> I tried putting the solvers in a loop, like this: >>> >>> do { >>> NewtonLS >>> check convergence >>> if (converged) break >>> NRichardson or NGMRES >>> } while (!converged) >>> >>> The results were interesting, to me at least. With NRichardson, there >>> was indeed improvement in the residual norm, followed by improvement with >>> NewtonLS, and so on for a few iterations of this loop. In each case, after >>> a few iterations the NewtonLS appeared to be stuck in the same way as after >>> the first iteration. Eventually neither method was able to reduce the >>> residual norm, which was still significant, so this was not a total >>> success. With NGMRES, the initial behavior was similar, but eventually the >>> NGMRES progress became erratic. The minimal residual norm was a bit better >>> using NGMRES than NRichardson, but neither combination of methods fully >>> converged. For both NRichardson and NGMRES, I simply used the defaults, as >>> I have no knowledge of how to tune the options for my problem. >>> >> >> Are you certain that the equations have a solution? I become a little >> concerned when richardson stops converging. It's >> still possible you have really hard to solve equations, it just becomes >> less likely. And even if they truly are hard to solve, >> then there should be physical reasons for this.
For example, it could be >> that discretizing the minimizing PDE is just the >> wrong thing to do. I believe this is the case in fracture, where you >> attack the minimization problem directly. >> >> Matt >> >> >>> On Tue, Oct 10, 2017 at 4:08 PM, Matthew Knepley >>> wrote: >>> >>>> On Tue, Oct 10, 2017 at 12:08 PM, zakaryah . >>>> wrote: >>>> >>>>> Thanks for clearing that up. >>>>> >>>>> I'd appreciate any further help. Here's a summary: >>>>> >>>>> My ultimate goal is to find a vector field which minimizes an action. >>>>> The action is a (nonlinear) function of the field and its first spatial >>>>> derivatives. >>>>> >>>>> My current approach is to derive the (continuous) Euler-Lagrange >>>>> equations, which results in a nonlinear PDE that the minimizing field must >>>>> satisfy. These Euler-Lagrange equations are then discretized, and I'm >>>>> trying to use an SNES to solve them. >>>>> >>>>> The problem is that the solver seems to reach a point at which the >>>>> Jacobian (this corresponds to the second variation of the action, which is >>>>> like a Hessian of the energy) becomes nearly singular, but where the >>>>> residual (RHS of PDE) is not close to zero. The residual does not decrease >>>>> over additional SNES iterations, and the line search results in tiny step >>>>> sizes. My interpretation is that this point of stagnation is a critical >>>>> point. >>>>> >>>> >>>> The normal thing to do here (I think) is to engage solvers which do not >>>> depend on that particular point. So using >>>> NRichardson, or maybe NGMRES, to get past that. I would be interested >>>> to see if this is successful. >>>> >>>> Matt >>>> >>>> >>>>> I have checked the hand-coded Jacobian very carefully and I am >>>>> confident that it is correct. >>>>> >>>>> I am guessing that such a situation is well-known in the field, but I >>>>> don't know the lingo or literature. If anyone has suggestions I'd be >>>>> thrilled. Are there documentation/methodologies within PETSc for this type >>>>> of situation? >>>>> >>>>> Is there any advantage to discretizing the action itself and using the >>>>> optimization routines? With minor modifications I'll have the gradient and >>>>> Hessian calculations coded. Are the optimization routines likely to >>>>> stagnate in the same way as the nonlinear solver, or can they take >>>>> advantage of the structure of the problem to overcome this? >>>>> >>>>> Thanks a lot in advance for any help. >>>>> >>>>> On Sun, Oct 8, 2017 at 5:57 AM, Barry Smith >>>>> wrote: >>>>>> >>>>>> There is apparently confusion in understanding the ordering. Is >>>>>> this all on one process that you get funny results? Are you using >>>>>> MatSetValuesStencil() to provide the matrix (it is generally easier than >>>>>> providing it yourself)? In parallel MatView() always maps the rows and >>>>>> columns to the natural ordering before printing, if you use a matrix >>>>>> created from the DMDA. If you create the matrix yourself it has a different >>>>>> MatView in parallel that is in the PETSc ordering. >>>>>> >>>>>> >>>>>> Barry >>>>>> >>>>>> >>>>>> >>>>>> > On Oct 8, 2017, at 8:05 AM, zakaryah . wrote: >>>>>> > >>>>>> > I'm more confused than ever. I don't understand the output of >>>>>> -snes_type test -snes_test_display.
>>>>>> > >>>>>> > For the user-defined state of the vector (where I'd like to test >>>>>> the Jacobian), the finite difference Jacobian at row 0 evaluates as: >>>>>> > >>>>>> > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) >>>>>> (4, 287.797) (5, 744.695) (9, -1454.66) (10, 6.08793) (11, 148.172) >>>>>> (12, 13.1089) (13, -36.5783) (14, -9.99399) (27, -3423.49) (28, >>>>>> -2175.34) (29, 548.662) (30, 145.753) (31, 17.6603) (32, -15.1079) >>>>>> (36, 76.8575) (37, 16.325) (38, 4.83918) >>>>>> > >>>>>> > But the hand-coded Jacobian at row 0 evaluates as: >>>>>> > >>>>>> > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) >>>>>> (4, 287.797) (5, 744.695) (33, -1454.66) (34, 6.08792) (35, 148.172) >>>>>> (36, 13.1089) (37, -36.5783) (38, -9.99399) (231, -3423.49) (232, >>>>>> -2175.34) (233, 548.662) (234, 145.753) (235, 17.6603) (236, -15.1079) >>>>>> (264, 76.8575) (265, 16.325) (266, 4.83917) (267, 0.) (268, 0.) (269, >>>>>> 0.) >>>>>> > and the difference between the Jacobians at row 0 evaluates as: >>>>>> > >>>>>> > row 0: (0, 0.000189908) (1, 7.17315e-05) (2, 9.31778e-05) (3, >>>>>> 0.000514947) (4, 0.000178659) (5, 0.000178217) (9, -2.25457e-05) (10, >>>>>> -6.34278e-06) (11, -5.93241e-07) (12, 9.48544e-06) (13, 4.79709e-06) >>>>>> (14, 2.40016e-06) (27, -0.000335696) (28, -0.000106734) (29, >>>>>> -0.000106653) (30, 2.73119e-06) (31, -7.93382e-07) (32, 1.24048e-07) >>>>>> (36, -4.0302e-06) (37, 3.67494e-06) (38, -2.70115e-06) (39, 0.) (40, >>>>>> 0.) (41, 0.) >>>>>> > >>>>>> > The difference between the column numbering between the finite >>>>>> difference and the hand-coded Jacobians looks like a serious problem to me, >>>>>> but I'm probably missing something. >>>>>> > >>>>>> > I am trying to use a 3D DMDA with 3 dof, a box stencil of width 1, >>>>>> and for this test problem the grid dimensions are 11x7x6. For a grid point >>>>>> x,y,z, and dof c, is the index calculated as c + 3*x + 3*11*y + 3*11*7*z? >>>>>> If so, then the column numbers of the hand-coded Jacobian match those of >>>>>> the 27 point stencil I have in mind. However, I am then at a loss to >>>>>> explain the column numbers in the finite difference Jacobian. >>>>>> > >>>>>> > >>>>>> > >>>>>> > >>>>>> > On Sat, Oct 7, 2017 at 1:49 PM, zakaryah . >>>>>> wrote: >>>>>> > OK - I ran with -snes_monitor -snes_converged_reason >>>>>> -snes_linesearch_monitor -ksp_monitor -ksp_monitor_true_residual >>>>>> -ksp_converged_reason -pc_type svd -pc_svd_monitor -snes_type newtonls >>>>>> -snes_compare_explicit >>>>>> > >>>>>> > and here is the full error message, output immediately after >>>>>> > >>>>>> > Finite difference Jacobian >>>>>> > Mat Object: 24 MPI processes >>>>>> > type: mpiaij >>>>>> > >>>>>> > [0]PETSC ERROR: --------------------- Error Message >>>>>> -------------------------------------------------------------- >>>>>> > >>>>>> > [0]PETSC ERROR: Invalid argument >>>>>> > >>>>>> > [0]PETSC ERROR: Matrix not generated from a DMDA >>>>>> > >>>>>> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/d >>>>>> ocumentation/faq.html for trouble shooting. 
>>>>>> > >>>>>> > [0]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017 >>>>>> > >>>>>> > [0]PETSC ERROR: ./CalculateOpticalFlow on a arch-linux2-c-opt named >>>>>> node046.hpc.rockefeller.internal by zfrentz Sat Oct 7 13:44:44 2017 >>>>>> > >>>>>> > [0]PETSC ERROR: Configure options --prefix=/ru-auth/local/home/zfrentz/PETSc-3.7.6 >>>>>> --download-fblaslapack -with-debugging=0 >>>>>> > >>>>>> > [0]PETSC ERROR: #1 MatView_MPI_DA() line 551 in >>>>>> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/dm/impl >>>>>> s/da/fdda.c >>>>>> > >>>>>> > [0]PETSC ERROR: #2 MatView() line 901 in >>>>>> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/mat/int >>>>>> erface/matrix.c >>>>>> > >>>>>> > [0]PETSC ERROR: #3 SNESComputeJacobian() line 2371 in >>>>>> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/in >>>>>> terface/snes.c >>>>>> > >>>>>> > [0]PETSC ERROR: #4 SNESSolve_NEWTONLS() line 228 in >>>>>> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/im >>>>>> pls/ls/ls.c >>>>>> > >>>>>> > [0]PETSC ERROR: #5 SNESSolve() line 4005 in >>>>>> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/in >>>>>> terface/snes.c >>>>>> > >>>>>> > [0]PETSC ERROR: #6 solveWarp3D() line 659 in >>>>>> /ru-auth/local/home/zfrentz/Code/OpticalFlow/working/October >>>>>> 6_2017/mshs.c >>>>>> > >>>>>> > >>>>>> > On Tue, Oct 3, 2017 at 5:37 PM, Jed Brown wrote: >>>>>> > Always always always send the whole error message. >>>>>> > >>>>>> > "zakaryah ." writes: >>>>>> > >>>>>> > > I tried -snes_compare_explicit, and got the following error: >>>>>> > > >>>>>> > > [0]PETSC ERROR: Invalid argument >>>>>> > > >>>>>> > > [0]PETSC ERROR: Matrix not generated from a DMDA >>>>>> > > >>>>>> > > What am I doing wrong? >>>>>> > > >>>>>> > > On Tue, Oct 3, 2017 at 10:08 AM, Jed Brown >>>>>> wrote: >>>>>> > > >>>>>> > >> Barry Smith writes: >>>>>> > >> >>>>>> > >> >> On Oct 3, 2017, at 5:54 AM, zakaryah . >>>>>> wrote: >>>>>> > >> >> >>>>>> > >> >> I'm still working on this. I've made some progress, and it >>>>>> looks like >>>>>> > >> the issue is with the KSP, at least for now. The Jacobian may be >>>>>> > >> ill-conditioned. Is it possible to use -snes_test_display >>>>>> during an >>>>>> > >> intermediate step of the analysis? I would like to inspect the >>>>>> Jacobian >>>>>> > >> after several solves have already completed, >>>>>> > >> > >>>>>> > >> > No, our currently code for testing Jacobians is poor >>>>>> quality and >>>>>> > >> poorly organized. Needs a major refactoring to do things >>>>>> properly. Sorry >>>>>> > >> >>>>>> > >> You can use -snes_compare_explicit or -snes_compare_coloring to >>>>>> output >>>>>> > >> differences on each Newton step. >>>>>> > >> >>>>>> > >>>>>> > >>>>>> >>>>>> >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>>> https://www.cse.buffalo.edu/~knepley/ >>>> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From hbcbh1999 at gmail.com Sat Oct 21 21:29:08 2017 From: hbcbh1999 at gmail.com (Hao Zhang) Date: Sat, 21 Oct 2017 22:29:08 -0400 Subject: [petsc-users] HYPRE hanging or slow? 
from observation In-Reply-To: References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov> Message-ID: Barry, Please advise what you make of this. This is a Poisson solver with all-Neumann BCs, 3D case; a finite difference scheme was used. Thanks! I'm in learning mode. KSP Object: 24 MPI processes type: bcgs maximum iterations=40000, initial guess is zero tolerances: relative=1e-14, absolute=1e-14, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test PC Object: 24 MPI processes type: hypre HYPRE BoomerAMG preconditioning Cycle type V Maximum number of levels 25 Maximum number of iterations PER hypre call 1 Convergence tolerance PER hypre call 0. Threshold for strong coupling 0.25 Interpolation truncation factor 0. Interpolation: max elements per row 0 Number of levels of aggressive coarsening 0 Number of paths for aggressive coarsening 1 Maximum row sums 0.9 Sweeps down 1 Sweeps up 1 Sweeps on coarse 1 Relax down symmetric-SOR/Jacobi Relax up symmetric-SOR/Jacobi Relax on coarse Gaussian-elimination Relax weight (all) 1. Outer relax weight (all) 1. Using CF-relaxation Not using more complex smoothers. Measure type local Coarsen type Falgout Interpolation type classical linear system matrix = precond matrix: Mat Object: A 24 MPI processes type: mpiaij rows=497664, cols=497664 total: nonzeros=3363552, allocated nonzeros=6967296 total number of mallocs used during MatSetValues calls =0 has attached null space not using I-node (on process 0) routines 0 KSP preconditioned resid norm 2.697270170623e-02 true resid norm 9.637159071207e+00 ||r(i)||/||b|| 1.000000000000e+00 1 KSP preconditioned resid norm 4.828857674609e-04 true resid norm 6.294379664645e-01 ||r(i)||/||b|| 6.531364293291e-02 2 KSP preconditioned resid norm 4.533649595815e-06 true resid norm 1.135508605857e-02 ||r(i)||/||b|| 1.178260727531e-03 3 KSP preconditioned resid norm 1.131704082606e-07 true resid norm 1.196393029874e-04 ||r(i)||/||b|| 1.241437462051e-05 4 KSP preconditioned resid norm 3.866281379569e-10 true resid norm 5.121520801846e-07 ||r(i)||/||b|| 5.314347064320e-08 5 KSP preconditioned resid norm 1.070114785241e-11 true resid norm 1.203693733135e-08 ||r(i)||/||b|| 1.249013038221e-09 6 KSP preconditioned resid norm 2.578780418765e-14 true resid norm 6.020297525927e-11 ||r(i)||/||b|| 6.246962908306e-12 7 KSP preconditioned resid norm 8.691764190203e-16 true resid norm 1.866088098154e-12 ||r(i)||/||b|| 1.936346680973e-13 Linear solve converged due to CONVERGED_ATOL iterations 7 On Sat, Oct 21, 2017 at 6:53 PM, Barry Smith wrote: > > > On Oct 21, 2017, at 5:47 PM, Hao Zhang wrote: > > hi, Barry: > > what do you mean by absurd in setting tolerance = 1e-14? > > Trying to decrease the initial residual norm by a factor of 1e-14 > with an iterative method (or even direct method) is unrealistic, usually > unachievable, and almost never necessary. You are requiring || r_n || < > 1.e-14 || r_0|| when with double precision numbers you only have roughly 14 > decimal digits total to compute with. Round off alone will lead to > differences far larger than 1e-14. > > If you are using the solver in the context of a nonlinear problem (i.e. > inside Newton's method) then 1.e-6 is generally more than plenty to get > quadratic convergence of Newton's method. > > If you are solving a linear problem then it is extremely likely that > errors due to discretization errors (from finite element method etc) and > the model are much much larger than even 1.e-8.
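For instance, a more realistic request than the rtol=1e-14 shown in the KSP view above might look like this sketch (the executable name is made up; the option names are real PETSc options):

    mpirun -n 24 ./solver -ksp_type gmres -pc_type hypre -pc_hypre_type boomeramg \
        -ksp_rtol 1e-8 -ksp_monitor_true_residual -ksp_converged_reason

or, in code, ierr = KSPSetTolerances(ksp,1.e-8,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);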
> > So, in summary > > 1.e-14 is probably unachievable > > 1.e-14 is almost for sure not needed. > > Barry > > > > > > On Sat, Oct 21, 2017 at 18:42 Barry Smith wrote: > > Run with -ksp_view_mat binary -ksp_view_rhs binary and send the > resulting output file called binaryoutput to petsc-maint at mcs.anl.gov > > Note you can also use -ksp_type gmres with hypre; there is unlikely to be a > reason to use bcgs > > BTW: tolerances: relative=1e-14, is absurd > > My guess is your null space is incorrect. > > > > > > > > On Oct 21, 2017, at 4:34 PM, Hao Zhang wrote: > > > If this solver doesn't converge, I have a fall-back solution which uses the GMRES solver; this setup is fine with me. I just want to know whether HYPRE is a reliable solution for me, or whether I will have to go without a preconditioner. > > > > > > Thanks! > > > > > > On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang > wrote: > > > This is a serial run, still dumping output; parallel is more or less the > same. > > > > > > KSP Object: 1 MPI processes > > > type: bcgs > > > maximum iterations=40000, initial guess is zero > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. > > > left preconditioning > > > using PRECONDITIONED norm type for convergence test > > > PC Object: 1 MPI processes > > > type: hypre > > > HYPRE BoomerAMG preconditioning > > > Cycle type V > > > Maximum number of levels 25 > > > Maximum number of iterations PER hypre call 1 > > > Convergence tolerance PER hypre call 0. > > > Threshold for strong coupling 0.25 > > > Interpolation truncation factor 0. > > > Interpolation: max elements per row 0 > > > Number of levels of aggressive coarsening 0 > > > Number of paths for aggressive coarsening 1 > > > Maximum row sums 0.9 > > > Sweeps down 1 > > > Sweeps up 1 > > > Sweeps on coarse 1 > > > Relax down symmetric-SOR/Jacobi > > > Relax up symmetric-SOR/Jacobi > > > Relax on coarse Gaussian-elimination > > > Relax weight (all) 1. > > > Outer relax weight (all) 1. > > > Using CF-relaxation > > > Not using more complex smoothers.
> > > Measure type local > > > Coarsen type Falgout > > > Interpolation type classical > > > linear system matrix = precond matrix: > > > Mat Object: A 1 MPI processes > > > type: seqaij > > > rows=497664, cols=497664 > > > total: nonzeros=3363552, allocated nonzeros=3483648 > > > total number of mallocs used during MatSetValues calls =0 > > > has attached null space > > > not using I-node routines > > > 0 KSP preconditioned resid norm 1.630377897956e+01 true resid norm > 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00 > > > 1 KSP preconditioned resid norm 3.815819909205e-02 true resid norm > 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01 > > > 2 KSP preconditioned resid norm 4.312671277701e-04 true resid norm > 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03 > > > 3 KSP preconditioned resid norm 3.011875330569e-04 true resid norm > 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05 > > > 4 KSP preconditioned resid norm 3.011783821295e-04 true resid norm > 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05 > > > 5 KSP preconditioned resid norm 3.011783818372e-04 true resid norm > 2.504053673004e-04 ||r(i)||/||b|| 3.399594279422e-05 > > > 6 KSP preconditioned resid norm 3.011783818375e-04 true resid norm > 2.503984078890e-04 ||r(i)||/||b|| 3.399499795925e-05 > > > 7 KSP preconditioned resid norm 3.011783887442e-04 true resid norm > 2.504121501772e-04 ||r(i)||/||b|| 3.399686366224e-05 > > > 8 KSP preconditioned resid norm 3.010913654181e-04 true resid norm > 2.504150259031e-04 ||r(i)||/||b|| 3.399725408124e-05 > > > 9 KSP preconditioned resid norm 3.006520688232e-04 true resid norm > 2.504061607382e-04 ||r(i)||/||b|| 3.399605051423e-05 > > > 10 KSP preconditioned resid norm 3.007309991942e-04 true resid norm > 2.503843638523e-04 ||r(i)||/||b|| 3.399309128978e-05 > > > 11 KSP preconditioned resid norm 3.015946168077e-04 true resid norm > 2.503644677844e-04 ||r(i)||/||b|| 3.399039012728e-05 > > > 12 KSP preconditioned resid norm 2.956643907377e-04 true resid norm > 2.503863046509e-04 ||r(i)||/||b|| 3.399335477965e-05 > > > 13 KSP preconditioned resid norm 2.997992358936e-04 true resid norm > 2.504336903093e-04 ||r(i)||/||b|| 3.399978802886e-05 > > > 14 KSP preconditioned resid norm 2.481415420420e-05 true resid norm > 2.491591201250e-04 ||r(i)||/||b|| 3.382674774806e-05 > > > 15 KSP preconditioned resid norm 2.615494786181e-05 true resid norm > 2.490353237273e-04 ||r(i)||/||b|| 3.380994069915e-05 > > > 16 KSP preconditioned resid norm 2.645126692130e-05 true resid norm > 2.490535523344e-04 ||r(i)||/||b|| 3.381241548111e-05 > > > 17 KSP preconditioned resid norm 2.667223026209e-05 true resid norm > 2.490482602536e-04 ||r(i)||/||b|| 3.381169700898e-05 > > > 18 KSP preconditioned resid norm 2.650813432116e-05 true resid norm > 2.490473169014e-04 ||r(i)||/||b|| 3.381156893606e-05 > > > 19 KSP preconditioned resid norm 2.613309555449e-05 true resid norm > 2.490465633690e-04 ||r(i)||/||b|| 3.381146663375e-05 > > > 20 KSP preconditioned resid norm 2.644160446804e-05 true resid norm > 2.490532739949e-04 ||r(i)||/||b|| 3.381237769272e-05 > > > 21 KSP preconditioned resid norm 2.635987608975e-05 true resid norm > 2.490499548926e-04 ||r(i)||/||b|| 3.381192707933e-05 > > > 22 KSP preconditioned resid norm 2.640527129095e-05 true resid norm > 2.490594066529e-04 ||r(i)||/||b|| 3.381321028466e-05 > > > 23 KSP preconditioned resid norm 2.627505117691e-05 true resid norm > 2.490550162585e-04 ||r(i)||/||b|| 3.381261422875e-05 > > > 24 KSP preconditioned resid norm 2.642659196388e-05 true 
resid norm > 2.490504347640e-04 ||r(i)||/||b|| 3.381199222842e-05 > > > 25 KSP preconditioned resid norm 2.659432190695e-05 true resid norm > 2.490510775152e-04 ||r(i)||/||b|| 3.381207949065e-05 > > > 26 KSP preconditioned resid norm 2.687918062951e-05 true resid norm > 2.490518882015e-04 ||r(i)||/||b|| 3.381218955237e-05 > > > 27 KSP preconditioned resid norm 2.662909048432e-05 true resid norm > 2.490446263285e-04 ||r(i)||/||b|| 3.381120365409e-05 > > > 28 KSP preconditioned resid norm 2.085466483199e-05 true resid norm > 2.490131612366e-04 ||r(i)||/||b|| 3.380693183886e-05 > > > 29 KSP preconditioned resid norm 2.098541330282e-05 true resid norm > 2.490126933398e-04 ||r(i)||/||b|| 3.380686831549e-05 > > > 30 KSP preconditioned resid norm 2.175345180286e-05 true resid norm > 2.490098852429e-04 ||r(i)||/||b|| 3.380648707805e-05 > > > 31 KSP preconditioned resid norm 2.182182437676e-05 true resid norm > 2.490028301020e-04 ||r(i)||/||b|| 3.380552924648e-05 > > > 32 KSP preconditioned resid norm 2.152970404369e-05 true resid norm > 2.490089939838e-04 ||r(i)||/||b|| 3.380636607747e-05 > > > 33 KSP preconditioned resid norm 2.187932450016e-05 true resid norm > 2.490085293931e-04 ||r(i)||/||b|| 3.380630300295e-05 > > > 34 KSP preconditioned resid norm 2.207255875067e-05 true resid norm > 2.490039036092e-04 ||r(i)||/||b|| 3.380567498971e-05 > > > 35 KSP preconditioned resid norm 2.205060279701e-05 true resid norm > 2.490101636150e-04 ||r(i)||/||b|| 3.380652487086e-05 > > > 36 KSP preconditioned resid norm 2.168654200416e-05 true resid norm > 2.490091609876e-04 ||r(i)||/||b|| 3.380638875052e-05 > > > 37 KSP preconditioned resid norm 2.164521042361e-05 true resid norm > 2.490083143913e-04 ||r(i)||/||b|| 3.380627381352e-05 > > > 38 KSP preconditioned resid norm 2.154429063973e-05 true resid norm > 2.490075485470e-04 ||r(i)||/||b|| 3.380616983972e-05 > > > 39 KSP preconditioned resid norm 2.165962086228e-05 true resid norm > 2.490099695056e-04 ||r(i)||/||b|| 3.380649851786e-05 > > > 40 KSP preconditioned resid norm 2.153877616091e-05 true resid norm > 2.490090652619e-04 ||r(i)||/||b|| 3.380637575444e-05 > > > 41 KSP preconditioned resid norm 2.347651187611e-05 true resid norm > 2.490233544624e-04 ||r(i)||/||b|| 3.380831570825e-05 > > > 42 KSP preconditioned resid norm 2.352860162514e-05 true resid norm > 2.490191394202e-04 ||r(i)||/||b|| 3.380774345879e-05 > > > 43 KSP preconditioned resid norm 2.312377506928e-05 true resid norm > 2.490209491359e-04 ||r(i)||/||b|| 3.380798915237e-05 > > > 44 KSP preconditioned resid norm 2.295770973533e-05 true resid norm > 2.490178136759e-04 ||r(i)||/||b|| 3.380756347093e-05 > > > 45 KSP preconditioned resid norm 2.833646456041e-05 true resid norm > 2.489991602651e-04 ||r(i)||/||b|| 3.380503101608e-05 > > > 46 KSP preconditioned resid norm 2.760296424494e-05 true resid norm > 2.490104320666e-04 ||r(i)||/||b|| 3.380656131682e-05 > > > 47 KSP preconditioned resid norm 2.451504295239e-05 true resid norm > 2.490241388672e-04 ||r(i)||/||b|| 3.380842220189e-05 > > > 48 KSP preconditioned resid norm 2.512391514098e-05 true resid norm > 2.490245923753e-04 ||r(i)||/||b|| 3.380848377180e-05 > > > 49 KSP preconditioned resid norm 2.483419450528e-05 true resid norm > 2.490273364402e-04 ||r(i)||/||b|| 3.380885631602e-05 > > > 50 KSP preconditioned resid norm 2.507460538466e-05 true resid norm > 2.490309488780e-04 ||r(i)||/||b|| 3.380934675371e-05 > > > 51 KSP preconditioned resid norm 2.499708772881e-05 true resid norm > 2.490300908170e-04 ||r(i)||/||b|| 3.380923026022e-05 > > > 
52 KSP preconditioned resid norm 1.059778259446e-05 true resid norm > 2.489352833521e-04 ||r(i)||/||b|| 3.379635885420e-05 > > > 53 KSP preconditioned resid norm 1.074975117060e-05 true resid norm > 2.489294722901e-04 ||r(i)||/||b|| 3.379556992330e-05 > > > 54 KSP preconditioned resid norm 1.095242219559e-05 true resid norm > 2.489295454212e-04 ||r(i)||/||b|| 3.379557985184e-05 > > > 55 KSP preconditioned resid norm 8.359999674720e-06 true resid norm > 2.489673581944e-04 ||r(i)||/||b|| 3.380071345137e-05 > > > 56 KSP preconditioned resid norm 8.368232998470e-06 true resid norm > 2.489700421343e-04 ||r(i)||/||b|| 3.380107783281e-05 > > > 57 KSP preconditioned resid norm 8.443378041101e-06 true resid norm > 2.489702900875e-04 ||r(i)||/||b|| 3.380111149584e-05 > > > 58 KSP preconditioned resid norm 8.647159584302e-06 true resid norm > 2.489640805831e-04 ||r(i)||/||b|| 3.380026847095e-05 > > > 59 KSP preconditioned resid norm 1.024742790737e-05 true resid norm > 2.489447846660e-04 ||r(i)||/||b|| 3.379764878711e-05 > > > 60 KSP preconditioned resid norm 1.033394118910e-05 true resid norm > 2.489441404923e-04 ||r(i)||/||b|| 3.379756133175e-05 > > > 61 KSP preconditioned resid norm 1.030066336446e-05 true resid norm > 2.489399918556e-04 ||r(i)||/||b|| 3.379699809776e-05 > > > 62 KSP preconditioned resid norm 1.029956398963e-05 true resid norm > 2.489445295139e-04 ||r(i)||/||b|| 3.379761414674e-05 > > > 63 KSP preconditioned resid norm 1.028190129002e-05 true resid norm > 2.489456200527e-04 ||r(i)||/||b|| 3.379776220225e-05 > > > 64 KSP preconditioned resid norm 9.878799185773e-06 true resid norm > 2.489488742330e-04 ||r(i)||/||b|| 3.379820400160e-05 > > > 65 KSP preconditioned resid norm 9.917711104174e-06 true resid norm > 2.489478066593e-04 ||r(i)||/||b|| 3.379805906391e-05 > > > 66 KSP preconditioned resid norm 1.003572019576e-05 true resid norm > 2.489441995703e-04 ||r(i)||/||b|| 3.379756935240e-05 > > > 67 KSP preconditioned resid norm 9.924487278236e-06 true resid norm > 2.489475403451e-04 ||r(i)||/||b|| 3.379802290812e-05 > > > 68 KSP preconditioned resid norm 9.804213483359e-06 true resid norm > 2.489457781760e-04 ||r(i)||/||b|| 3.379778366964e-05 > > > 69 KSP preconditioned resid norm 9.748922705476e-06 true resid norm > 2.489408473578e-04 ||r(i)||/||b|| 3.379711424383e-05 > > > 70 KSP preconditioned resid norm 9.886044523689e-06 true resid norm > 2.489514438395e-04 ||r(i)||/||b|| 3.379855286071e-05 > > > 71 KSP preconditioned resid norm 1.083888478689e-05 true resid norm > 2.489420898851e-04 ||r(i)||/||b|| 3.379728293386e-05 > > > 72 KSP preconditioned resid norm 1.106561823757e-05 true resid norm > 2.489364778104e-04 ||r(i)||/||b|| 3.379652101821e-05 > > > 73 KSP preconditioned resid norm 1.132091515426e-05 true resid norm > 2.489456804535e-04 ||r(i)||/||b|| 3.379777040248e-05 > > > 74 KSP preconditioned resid norm 1.330905328963e-05 true resid norm > 2.489317458981e-04 ||r(i)||/||b|| 3.379587859660e-05 > > > 75 KSP preconditioned resid norm 1.305555302619e-05 true resid norm > 2.489320939810e-04 ||r(i)||/||b|| 3.379592585359e-05 > > > 76 KSP preconditioned resid norm 1.308083397399e-05 true resid norm > 2.489299951581e-04 ||r(i)||/||b|| 3.379564090977e-05 > > > 77 KSP preconditioned resid norm 1.320098861853e-05 true resid norm > 2.489323669317e-04 ||r(i)||/||b|| 3.379596291036e-05 > > > 78 KSP preconditioned resid norm 1.300160788274e-05 true resid norm > 2.489306393356e-04 ||r(i)||/||b|| 3.379572836564e-05 > > > 79 KSP preconditioned resid norm 1.317651537793e-05 true resid norm > 
2.489381364970e-04 ||r(i)||/||b|| 3.379674620752e-05 > > > 80 KSP preconditioned resid norm 1.309769805765e-05 true resid norm > 2.489285056062e-04 ||r(i)||/||b|| 3.379543868279e-05 > > > 81 KSP preconditioned resid norm 1.293686496271e-05 true resid norm > 2.489347818072e-04 ||r(i)||/||b|| 3.379629076264e-05 > > > 82 KSP preconditioned resid norm 1.311788285799e-05 true resid norm > 2.489320040215e-04 ||r(i)||/||b|| 3.379591364037e-05 > > > 83 KSP preconditioned resid norm 1.313667378798e-05 true resid norm > 2.489329437217e-04 ||r(i)||/||b|| 3.379604121748e-05 > > > 84 KSP preconditioned resid norm 1.416138205017e-05 true resid norm > 2.489266908838e-04 ||r(i)||/||b|| 3.379519230948e-05 > > > 85 KSP preconditioned resid norm 1.452253464774e-05 true resid norm > 2.489285688375e-04 ||r(i)||/||b|| 3.379544726729e-05 > > > 86 KSP preconditioned resid norm 1.426709413370e-05 true resid norm > 2.489362313402e-04 ||r(i)||/||b|| 3.379648755651e-05 > > > 87 KSP preconditioned resid norm 1.427480849552e-05 true resid norm > 2.489378183000e-04 ||r(i)||/||b|| 3.379670300795e-05 > > > 88 KSP preconditioned resid norm 1.413870980147e-05 true resid norm > 2.489325756118e-04 ||r(i)||/||b|| 3.379599124153e-05 > > > 89 KSP preconditioned resid norm 1.353259857657e-05 true resid norm > 2.489318968308e-04 ||r(i)||/||b|| 3.379589908776e-05 > > > 90 KSP preconditioned resid norm 1.347676448611e-05 true resid norm > 2.489332074417e-04 ||r(i)||/||b|| 3.379607702106e-05 > > > 91 KSP preconditioned resid norm 1.362825902909e-05 true resid norm > 2.489344974971e-04 ||r(i)||/||b|| 3.379625216367e-05 > > > 92 KSP preconditioned resid norm 1.346280901052e-05 true resid norm > 2.489302570131e-04 ||r(i)||/||b|| 3.379567646016e-05 > > > 93 KSP preconditioned resid norm 1.328052169696e-05 true resid norm > 2.489346601224e-04 ||r(i)||/||b|| 3.379627424228e-05 > > > 94 KSP preconditioned resid norm 1.554682082515e-05 true resid norm > 2.489309078759e-04 ||r(i)||/||b|| 3.379576482365e-05 > > > 95 KSP preconditioned resid norm 1.557128675775e-05 true resid norm > 2.489317143582e-04 ||r(i)||/||b|| 3.379587431462e-05 > > > 96 KSP preconditioned resid norm 1.542571813923e-05 true resid norm > 2.489319910303e-04 ||r(i)||/||b|| 3.379591187663e-05 > > > 97 KSP preconditioned resid norm 1.570516684444e-05 true resid norm > 2.489321980894e-04 ||r(i)||/||b|| 3.379593998772e-05 > > > 98 KSP preconditioned resid norm 1.600431789899e-05 true resid norm > 2.489297450311e-04 ||r(i)||/||b|| 3.379560695162e-05 > > > 99 KSP preconditioned resid norm 1.587495554658e-05 true resid norm > 2.489339000570e-04 ||r(i)||/||b|| 3.379617105303e-05 > > > 100 KSP preconditioned resid norm 1.621163002878e-05 true resid norm > 2.489299953360e-04 ||r(i)||/||b|| 3.379564093392e-05 > > > 101 KSP preconditioned resid norm 1.627060872574e-05 true resid norm > 2.489301570161e-04 ||r(i)||/||b|| 3.379566288419e-05 > > > 102 KSP preconditioned resid norm 1.622931647243e-05 true resid norm > 2.489277930910e-04 ||r(i)||/||b|| 3.379534194913e-05 > > > 103 KSP preconditioned resid norm 1.612544300282e-05 true resid norm > 2.489317483299e-04 ||r(i)||/||b|| 3.379587892674e-05 > > > 104 KSP preconditioned resid norm 1.880131646630e-05 true resid norm > 2.489335862583e-04 ||r(i)||/||b|| 3.379612845059e-05 > > > 105 KSP preconditioned resid norm 1.880563295793e-05 true resid norm > 2.489365017923e-04 ||r(i)||/||b|| 3.379652427408e-05 > > > 106 KSP preconditioned resid norm 1.860619184027e-05 true resid norm > 2.489362382373e-04 ||r(i)||/||b|| 3.379648849288e-05 > > > 107 
KSP preconditioned resid norm 1.877134148719e-05 true resid norm > 2.489425484523e-04 ||r(i)||/||b|| 3.379734519061e-05 > > > 108 KSP preconditioned resid norm 1.914810713538e-05 true resid norm > 2.489347415573e-04 ||r(i)||/||b|| 3.379628529818e-05 > > > 109 KSP preconditioned resid norm 1.220673255622e-05 true resid norm > 2.490196186357e-04 ||r(i)||/||b|| 3.380780851884e-05 > > > 110 KSP preconditioned resid norm 1.215819132910e-05 true resid norm > 2.490233518072e-04 ||r(i)||/||b|| 3.380831534776e-05 > > > 111 KSP preconditioned resid norm 1.196565427400e-05 true resid norm > 2.490279438906e-04 ||r(i)||/||b|| 3.380893878570e-05 > > > 112 KSP preconditioned resid norm 1.171748185197e-05 true resid norm > 2.490242578983e-04 ||r(i)||/||b|| 3.380843836198e-05 > > > 113 KSP preconditioned resid norm 1.162855824118e-05 true resid norm > 2.490229018536e-04 ||r(i)||/||b|| 3.380825426043e-05 > > > 114 KSP preconditioned resid norm 1.175594685689e-05 true resid norm > 2.490274328440e-04 ||r(i)||/||b|| 3.380886940415e-05 > > > 115 KSP preconditioned resid norm 1.167979454122e-05 true resid norm > 2.490271161036e-04 ||r(i)||/||b|| 3.380882640232e-05 > > > 116 KSP preconditioned resid norm 1.181010893019e-05 true resid norm > 2.490235420657e-04 ||r(i)||/||b|| 3.380834117795e-05 > > > 117 KSP preconditioned resid norm 1.175206638194e-05 true resid norm > 2.490263165345e-04 ||r(i)||/||b|| 3.380871784992e-05 > > > 118 KSP preconditioned resid norm 1.183804125791e-05 true resid norm > 2.490221353083e-04 ||r(i)||/||b|| 3.380815019145e-05 > > > 119 KSP preconditioned resid norm 1.186426973727e-05 true resid norm > 2.490227115336e-04 ||r(i)||/||b|| 3.380822842189e-05 > > > 120 KSP preconditioned resid norm 1.181986776689e-05 true resid norm > 2.490257884230e-04 ||r(i)||/||b|| 3.380864615159e-05 > > > 121 KSP preconditioned resid norm 1.131443277370e-05 true resid norm > 2.490259110230e-04 ||r(i)||/||b|| 3.380866279620e-05 > > > 122 KSP preconditioned resid norm 1.114920075859e-05 true resid norm > 2.490249829382e-04 ||r(i)||/||b|| 3.380853679603e-05 > > > 123 KSP preconditioned resid norm 1.082073321672e-05 true resid norm > 2.490314084868e-04 ||r(i)||/||b|| 3.380940915187e-05 > > > 124 KSP preconditioned resid norm 3.307785860990e-06 true resid norm > 2.490613501549e-04 ||r(i)||/||b|| 3.381347414155e-05 > > > 125 KSP preconditioned resid norm 3.287051720572e-06 true resid norm > 2.490584648195e-04 ||r(i)||/||b|| 3.381308241794e-05 > > > 126 KSP preconditioned resid norm 3.286797046069e-06 true resid norm > 2.490654396386e-04 ||r(i)||/||b|| 3.381402934473e-05 > > > 127 KSP preconditioned resid norm 3.311592899411e-06 true resid norm > 2.490627973588e-04 ||r(i)||/||b|| 3.381367061922e-05 > > > 128 KSP preconditioned resid norm 3.560993694635e-06 true resid norm > 2.490571732816e-04 ||r(i)||/||b|| 3.381290707406e-05 > > > 129 KSP preconditioned resid norm 3.411994617661e-06 true resid norm > 2.490652122141e-04 ||r(i)||/||b|| 3.381399846875e-05 > > > 130 KSP preconditioned resid norm 3.412383310721e-06 true resid norm > 2.490633454022e-04 ||r(i)||/||b|| 3.381374502359e-05 > > > 131 KSP preconditioned resid norm 3.288320044878e-06 true resid norm > 2.490639470096e-04 ||r(i)||/||b|| 3.381382669999e-05 > > > 132 KSP preconditioned resid norm 3.273215756565e-06 true resid norm > 2.490640390847e-04 ||r(i)||/||b|| 3.381383920043e-05 > > > 133 KSP preconditioned resid norm 3.236969051459e-06 true resid norm > 2.490678216102e-04 ||r(i)||/||b|| 3.381435272985e-05 > > > 134 KSP preconditioned resid norm 
3.203260913942e-06 true resid norm > 2.490640965346e-04 ||r(i)||/||b|| 3.381384700005e-05 > > > 135 KSP preconditioned resid norm 3.224117152353e-06 true resid norm > 2.490655026376e-04 ||r(i)||/||b|| 3.381403789770e-05 > > > 136 KSP preconditioned resid norm 3.221577997984e-06 true resid norm > 2.490684737611e-04 ||r(i)||/||b|| 3.381444126823e-05 > > > 137 KSP preconditioned resid norm 3.195936222128e-06 true resid norm > 2.490673982333e-04 ||r(i)||/||b|| 3.381429525066e-05 > > > 138 KSP preconditioned resid norm 3.207528137426e-06 true resid norm > 2.490641247196e-04 ||r(i)||/||b|| 3.381385082655e-05 > > > 139 KSP preconditioned resid norm 3.240134271963e-06 true resid norm > 2.490615861251e-04 ||r(i)||/||b|| 3.381350617773e-05 > > > 140 KSP preconditioned resid norm 2.698833607230e-06 true resid norm > 2.490638954889e-04 ||r(i)||/||b|| 3.381381970535e-05 > > > 141 KSP preconditioned resid norm 2.599151209137e-06 true resid norm > 2.490657106698e-04 ||r(i)||/||b|| 3.381406614091e-05 > > > 142 KSP preconditioned resid norm 2.633939920994e-06 true resid norm > 2.490707754695e-04 ||r(i)||/||b|| 3.381475375652e-05 > > > 143 KSP preconditioned resid norm 2.519609221376e-06 true resid norm > 2.490639100480e-04 ||r(i)||/||b|| 3.381382168195e-05 > > > 144 KSP preconditioned resid norm 3.768526937684e-06 true resid norm > 2.490654096698e-04 ||r(i)||/||b|| 3.381402527606e-05 > > > 145 KSP preconditioned resid norm 3.707841943289e-06 true resid norm > 2.490630207923e-04 ||r(i)||/||b|| 3.381370095336e-05 > > > 146 KSP preconditioned resid norm 3.698827503486e-06 true resid norm > 2.490646071561e-04 ||r(i)||/||b|| 3.381391632387e-05 > > > 147 KSP preconditioned resid norm 3.642747039615e-06 true resid norm > 2.490610990161e-04 ||r(i)||/||b|| 3.381344004604e-05 > > > 148 KSP preconditioned resid norm 3.613100087842e-06 true resid norm > 2.490617159023e-04 ||r(i)||/||b|| 3.381352379676e-05 > > > 149 KSP preconditioned resid norm 3.637646399299e-06 true resid norm > 2.490648063023e-04 ||r(i)||/||b|| 3.381394336069e-05 > > > 150 KSP preconditioned resid norm 3.640235367864e-06 true resid norm > 2.490648516718e-04 ||r(i)||/||b|| 3.381394952022e-05 > > > 151 KSP preconditioned resid norm 3.724708848977e-06 true resid norm > 2.490622201040e-04 ||r(i)||/||b|| 3.381359224901e-05 > > > 152 KSP preconditioned resid norm 3.665185002770e-06 true resid norm > 2.490664302790e-04 ||r(i)||/||b|| 3.381416383766e-05 > > > 153 KSP preconditioned resid norm 3.348992579120e-06 true resid norm > 2.490655722697e-04 ||r(i)||/||b|| 3.381404735121e-05 > > > 154 KSP preconditioned resid norm 3.309431137943e-06 true resid norm > 2.490727563300e-04 ||r(i)||/||b|| 3.381502268535e-05 > > > 155 KSP preconditioned resid norm 3.299031245428e-06 true resid norm > 2.490688392843e-04 ||r(i)||/||b|| 3.381449089298e-05 > > > 156 KSP preconditioned resid norm 3.297127463503e-06 true resid norm > 2.490642207769e-04 ||r(i)||/||b|| 3.381386386763e-05 > > > 157 KSP preconditioned resid norm 3.297370198641e-06 true resid norm > 2.490666651723e-04 ||r(i)||/||b|| 3.381419572764e-05 > > > 158 KSP preconditioned resid norm 3.290873165210e-06 true resid norm > 2.490679189538e-04 ||r(i)||/||b|| 3.381436594557e-05 > > > 159 KSP preconditioned resid norm 3.346705292419e-06 true resid norm > 2.490617329776e-04 ||r(i)||/||b|| 3.381352611496e-05 > > > 160 KSP preconditioned resid norm 3.429583550890e-06 true resid norm > 2.490675116236e-04 ||r(i)||/||b|| 3.381431064494e-05 > > > 161 KSP preconditioned resid norm 3.425238504679e-06 true resid norm > 
2.490648199058e-04 ||r(i)||/||b|| 3.381394520756e-05 > > > 162 KSP preconditioned resid norm 3.423484857849e-06 true resid norm > 2.490723208298e-04 ||r(i)||/||b|| 3.381496356025e-05 > > > 163 KSP preconditioned resid norm 3.383655922943e-06 true resid norm > 2.490659981249e-04 ||r(i)||/||b|| 3.381410516686e-05 > > > 164 KSP preconditioned resid norm 3.477197358452e-06 true resid norm > 2.490665979073e-04 ||r(i)||/||b|| 3.381418659549e-05 > > > 165 KSP preconditioned resid norm 3.454672202601e-06 true resid norm > 2.490651358644e-04 ||r(i)||/||b|| 3.381398810323e-05 > > > 166 KSP preconditioned resid norm 3.399075522566e-06 true resid norm > 2.490678159511e-04 ||r(i)||/||b|| 3.381435196154e-05 > > > 167 KSP preconditioned resid norm 3.305455787400e-06 true resid norm > 2.490651924523e-04 ||r(i)||/||b|| 3.381399578581e-05 > > > 168 KSP preconditioned resid norm 3.368445533284e-06 true resid norm > 2.490688061735e-04 ||r(i)||/||b|| 3.381448639774e-05 > > > 169 KSP preconditioned resid norm 2.981519724814e-06 true resid norm > 2.490676378334e-04 ||r(i)||/||b|| 3.381432777964e-05 > > > 170 KSP preconditioned resid norm 3.034423065539e-06 true resid norm > 2.490694458885e-04 ||r(i)||/||b|| 3.381457324777e-05 > > > 171 KSP preconditioned resid norm 2.885972780503e-06 true resid norm > 2.490688033729e-04 ||r(i)||/||b|| 3.381448601752e-05 > > > 172 KSP preconditioned resid norm 2.892491075033e-06 true resid norm > 2.490692993765e-04 ||r(i)||/||b|| 3.381455335678e-05 > > > 173 KSP preconditioned resid norm 2.921316177611e-06 true resid norm > 2.490697629787e-04 ||r(i)||/||b|| 3.381461629709e-05 > > > 174 KSP preconditioned resid norm 2.999889222269e-06 true resid norm > 2.490707272626e-04 ||r(i)||/||b|| 3.381474721178e-05 > > > 175 KSP preconditioned resid norm 2.975590207575e-06 true resid norm > 2.490685439925e-04 ||r(i)||/||b|| 3.381445080310e-05 > > > 176 KSP preconditioned resid norm 2.983065843597e-06 true resid norm > 2.490701883671e-04 ||r(i)||/||b|| 3.381467404937e-05 > > > 177 KSP preconditioned resid norm 2.965959610245e-06 true resid norm > 2.490711538630e-04 ||r(i)||/||b|| 3.381480512861e-05 > > > 178 KSP preconditioned resid norm 3.005389788827e-06 true resid norm > 2.490751808095e-04 ||r(i)||/||b|| 3.381535184150e-05 > > > 179 KSP preconditioned resid norm 2.956581668772e-06 true resid norm > 2.490653125636e-04 ||r(i)||/||b|| 3.381401209257e-05 > > > 180 KSP preconditioned resid norm 2.937498883661e-06 true resid norm > 2.490666056653e-04 ||r(i)||/||b|| 3.381418764874e-05 > > > 181 KSP preconditioned resid norm 2.913227475431e-06 true resid norm > 2.490682436979e-04 ||r(i)||/||b|| 3.381441003402e-05 > > > 182 KSP preconditioned resid norm 3.048172862254e-06 true resid norm > 2.490719669872e-04 ||r(i)||/||b|| 3.381491552130e-05 > > > 183 KSP preconditioned resid norm 3.023868104933e-06 true resid norm > 2.490648745555e-04 ||r(i)||/||b|| 3.381395262699e-05 > > > 184 KSP preconditioned resid norm 2.985947506400e-06 true resid norm > 2.490638818852e-04 ||r(i)||/||b|| 3.381381785846e-05 > > > 185 KSP preconditioned resid norm 2.840032055776e-06 true resid norm > 2.490701112392e-04 ||r(i)||/||b|| 3.381466357820e-05 > > > 186 KSP preconditioned resid norm 2.229279683815e-06 true resid norm > 2.490609220680e-04 ||r(i)||/||b|| 3.381341602292e-05 > > > 187 KSP preconditioned resid norm 2.441513276379e-06 true resid norm > 2.490674056899e-04 ||r(i)||/||b|| 3.381429626300e-05 > > > 188 KSP preconditioned resid norm 2.467046864016e-06 true resid norm > 2.490691622632e-04 ||r(i)||/||b|| 
3.381453474178e-05 > > > 189 KSP preconditioned resid norm 2.482124586361e-06 true resid norm > 2.490664992339e-04 ||r(i)||/||b|| 3.381417319923e-05 > > > 190 KSP preconditioned resid norm 2.470564926502e-06 true resid norm > 2.490617019713e-04 ||r(i)||/||b|| 3.381352190543e-05 > > > 191 KSP preconditioned resid norm 2.457947086578e-06 true resid norm > 2.490628644250e-04 ||r(i)||/||b|| 3.381367972437e-05 > > > 192 KSP preconditioned resid norm 2.469444741724e-06 true resid norm > 2.490639416335e-04 ||r(i)||/||b|| 3.381382597011e-05 > > > 193 KSP preconditioned resid norm 2.469951525219e-06 true resid norm > 2.490599769764e-04 ||r(i)||/||b|| 3.381328771385e-05 > > > 194 KSP preconditioned resid norm 2.467486786643e-06 true resid norm > 2.490630178622e-04 ||r(i)||/||b|| 3.381370055556e-05 > > > 195 KSP preconditioned resid norm 2.409684391404e-06 true resid norm > 2.490640302606e-04 ||r(i)||/||b|| 3.381383800245e-05 > > > 196 KSP preconditioned resid norm 2.456046691135e-06 true resid norm > 2.490637730235e-04 ||r(i)||/||b|| 3.381380307900e-05 > > > 197 KSP preconditioned resid norm 2.300015653805e-06 true resid norm > 2.490615406913e-04 ||r(i)||/||b|| 3.381350000947e-05 > > > 198 KSP preconditioned resid norm 2.238328275301e-06 true resid norm > 2.490647641246e-04 ||r(i)||/||b|| 3.381393763449e-05 > > > 199 KSP preconditioned resid norm 2.317293820319e-06 true resid norm > 2.490641611282e-04 ||r(i)||/||b|| 3.381385576951e-05 > > > 200 KSP preconditioned resid norm 2.359590971314e-06 true resid norm > 2.490685242974e-04 ||r(i)||/||b|| 3.381444812922e-05 > > > 201 KSP preconditioned resid norm 2.311199691596e-06 true resid norm > 2.490656791753e-04 ||r(i)||/||b|| 3.381406186510e-05 > > > 202 KSP preconditioned resid norm 2.328772904196e-06 true resid norm > 2.490651045523e-04 ||r(i)||/||b|| 3.381398385220e-05 > > > 203 KSP preconditioned resid norm 2.332731604717e-06 true resid norm > 2.490649960574e-04 ||r(i)||/||b|| 3.381396912253e-05 > > > 204 KSP preconditioned resid norm 2.357629383490e-06 true resid norm > 2.490686317727e-04 ||r(i)||/||b|| 3.381446272046e-05 > > > 205 KSP preconditioned resid norm 2.374856180299e-06 true resid norm > 2.490645897176e-04 ||r(i)||/||b|| 3.381391395637e-05 > > > 206 KSP preconditioned resid norm 2.340395514404e-06 true resid norm > 2.490618341127e-04 ||r(i)||/||b|| 3.381353984542e-05 > > > 207 KSP preconditioned resid norm 2.314963680954e-06 true resid norm > 2.490676153984e-04 ||r(i)||/||b|| 3.381432473379e-05 > > > 208 KSP preconditioned resid norm 2.448070953106e-06 true resid norm > 2.490644606776e-04 ||r(i)||/||b|| 3.381389643743e-05 > > > 209 KSP preconditioned resid norm 2.428805110632e-06 true resid norm > 2.490635817597e-04 ||r(i)||/||b|| 3.381377711234e-05 > > > 210 KSP preconditioned resid norm 2.537929937808e-06 true resid norm > 2.490680589404e-04 ||r(i)||/||b|| 3.381438495066e-05 > > > 211 KSP preconditioned resid norm 2.515909029682e-06 true resid norm > 2.490687803038e-04 ||r(i)||/||b|| 3.381448288557e-05 > > > 212 KSP preconditioned resid norm 2.497907513266e-06 true resid norm > 2.490618016885e-04 ||r(i)||/||b|| 3.381353544340e-05 > > > 213 KSP preconditioned resid norm 1.783501869502e-06 true resid norm > 2.490632647470e-04 ||r(i)||/||b|| 3.381373407354e-05 > > > 214 KSP preconditioned resid norm 1.767420653144e-06 true resid norm > 2.490685569328e-04 ||r(i)||/||b|| 3.381445255992e-05 > > > 215 KSP preconditioned resid norm 1.854926068272e-06 true resid norm > 2.490609365464e-04 ||r(i)||/||b|| 3.381341798856e-05 > > > 216 KSP 
preconditioned resid norm 1.818308539774e-06 true resid norm > 2.490639142283e-04 ||r(i)||/||b|| 3.381382224948e-05 > > > 217 KSP preconditioned resid norm 1.809431578070e-06 true resid norm > 2.490605125049e-04 ||r(i)||/||b|| 3.381336041915e-05 > > > 218 KSP preconditioned resid norm 1.789862735999e-06 true resid norm > 2.490564024901e-04 ||r(i)||/||b|| 3.381280242859e-05 > > > 219 KSP preconditioned resid norm 1.769239890163e-06 true resid norm > 2.490647825316e-04 ||r(i)||/||b|| 3.381394013349e-05 > > > 220 KSP preconditioned resid norm 1.780760773109e-06 true resid norm > 2.490622606663e-04 ||r(i)||/||b|| 3.381359775589e-05 > > > 221 KSP preconditioned resid norm 5.009024913368e-07 true resid norm > 2.490659101637e-04 ||r(i)||/||b|| 3.381409322492e-05 > > > 222 KSP preconditioned resid norm 4.974450322799e-07 true resid norm > 2.490714287402e-04 ||r(i)||/||b|| 3.381484244693e-05 > > > 223 KSP preconditioned resid norm 4.938819481519e-07 true resid norm > 2.490665661715e-04 ||r(i)||/||b|| 3.381418228693e-05 > > > 224 KSP preconditioned resid norm 4.973231831266e-07 true resid norm > 2.490725000995e-04 ||r(i)||/||b|| 3.381498789855e-05 > > > 225 KSP preconditioned resid norm 5.086864036771e-07 true resid norm > 2.490664132954e-04 ||r(i)||/||b|| 3.381416153192e-05 > > > 226 KSP preconditioned resid norm 5.046954570561e-07 true resid norm > 2.490698772594e-04 ||r(i)||/||b|| 3.381463181226e-05 > > > 227 KSP preconditioned resid norm 5.086852920874e-07 true resid norm > 2.490703544723e-04 ||r(i)||/||b|| 3.381469660041e-05 > > > 228 KSP preconditioned resid norm 5.182381756169e-07 true resid norm > 2.490665200032e-04 ||r(i)||/||b|| 3.381417601896e-05 > > > 229 KSP preconditioned resid norm 5.261455182896e-07 true resid norm > 2.490697169472e-04 ||r(i)||/||b|| 3.381461004770e-05 > > > 230 KSP preconditioned resid norm 5.265262522400e-07 true resid norm > 2.490726890541e-04 ||r(i)||/||b|| 3.381501355172e-05 > > > 231 KSP preconditioned resid norm 5.220652263946e-07 true resid norm > 2.490689325236e-04 ||r(i)||/||b|| 3.381450355149e-05 > > > 232 KSP preconditioned resid norm 5.256466259888e-07 true resid norm > 2.490694033989e-04 ||r(i)||/||b|| 3.381456747923e-05 > > > 233 KSP preconditioned resid norm 5.443022648374e-07 true resid norm > 2.490650183144e-04 ||r(i)||/||b|| 3.381397214423e-05 > > > 234 KSP preconditioned resid norm 5.562619006436e-07 true resid norm > 2.490764576883e-04 ||r(i)||/||b|| 3.381552519520e-05 > > > 235 KSP preconditioned resid norm 5.998148629545e-07 true resid norm > 2.490714032716e-04 ||r(i)||/||b|| 3.381483898922e-05 > > > 236 KSP preconditioned resid norm 6.498977322955e-07 true resid norm > 2.490650270144e-04 ||r(i)||/||b|| 3.381397332537e-05 > > > 237 KSP preconditioned resid norm 6.503686003429e-07 true resid norm > 2.490706976108e-04 ||r(i)||/||b|| 3.381474318615e-05 > > > 238 KSP preconditioned resid norm 6.566719023119e-07 true resid norm > 2.490664107559e-04 ||r(i)||/||b|| 3.381416118714e-05 > > > 239 KSP preconditioned resid norm 6.549737473208e-07 true resid norm > 2.490721547909e-04 ||r(i)||/||b|| 3.381494101821e-05 > > > 240 KSP preconditioned resid norm 6.616898981418e-07 true resid norm > 2.490679659838e-04 ||r(i)||/||b|| 3.381437233053e-05 > > > 241 KSP preconditioned resid norm 6.829917691021e-07 true resid norm > 2.490728328614e-04 ||r(i)||/||b|| 3.381503307553e-05 > > > 242 KSP preconditioned resid norm 7.030239869389e-07 true resid norm > 2.490706345115e-04 ||r(i)||/||b|| 3.381473461955e-05 > > > 243 KSP preconditioned resid norm 7.018435683340e-07 
true resid norm > 2.490650978460e-04 ||r(i)||/||b|| 3.381398294172e-05 > > > 244 KSP preconditioned resid norm 7.058047080376e-07 true resid norm > 2.490685975642e-04 ||r(i)||/||b|| 3.381445807618e-05 > > > 245 KSP preconditioned resid norm 6.896300385099e-07 true resid norm > 2.490708566380e-04 ||r(i)||/||b|| 3.381476477625e-05 > > > 246 KSP preconditioned resid norm 7.093960074437e-07 true resid norm > 2.490667427871e-04 ||r(i)||/||b|| 3.381420626490e-05 > > > 247 KSP preconditioned resid norm 7.817121711853e-07 true resid norm > 2.490692299030e-04 ||r(i)||/||b|| 3.381454392480e-05 > > > 248 KSP preconditioned resid norm 7.976109778309e-07 true resid norm > 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > 249 KSP preconditioned resid norm 7.855322750445e-07 true resid norm > 2.490720861966e-04 ||r(i)||/||b|| 3.381493170560e-05 > > > 250 KSP preconditioned resid norm 7.778531114042e-07 true resid norm > 2.490673235034e-04 ||r(i)||/||b|| 3.381428510506e-05 > > > 251 KSP preconditioned resid norm 7.848682182070e-07 true resid norm > 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > 252 KSP preconditioned resid norm 7.967291867330e-07 true resid norm > 2.490724820229e-04 ||r(i)||/||b|| 3.381498544442e-05 > > > 253 KSP preconditioned resid norm 7.865012959525e-07 true resid norm > 2.490666662028e-04 ||r(i)||/||b|| 3.381419586754e-05 > > > 254 KSP preconditioned resid norm 7.656025385804e-07 true resid norm > 2.490686283214e-04 ||r(i)||/||b|| 3.381446225190e-05 > > > 255 KSP preconditioned resid norm 7.757018653468e-07 true resid norm > 2.490655983763e-04 ||r(i)||/||b|| 3.381405089553e-05 > > > 256 KSP preconditioned resid norm 6.686490372981e-07 true resid norm > 2.490715698964e-04 ||r(i)||/||b|| 3.381486161081e-05 > > > 257 KSP preconditioned resid norm 6.596005109428e-07 true resid norm > 2.490666403003e-04 ||r(i)||/||b|| 3.381419235092e-05 > > > 258 KSP preconditioned resid norm 6.681742296333e-07 true resid norm > 2.490683725835e-04 ||r(i)||/||b|| 3.381442753198e-05 > > > 259 KSP preconditioned resid norm 1.089245482033e-06 true resid norm > 2.490688086568e-04 ||r(i)||/||b|| 3.381448673488e-05 > > > 260 KSP preconditioned resid norm 1.099844873189e-06 true resid norm > 2.490690703265e-04 ||r(i)||/||b|| 3.381452226011e-05 > > > 261 KSP preconditioned resid norm 1.112925540869e-06 true resid norm > 2.490664481058e-04 ||r(i)||/||b|| 3.381416625790e-05 > > > 262 KSP preconditioned resid norm 1.113056910480e-06 true resid norm > 2.490658753273e-04 ||r(i)||/||b|| 3.381408849541e-05 > > > 263 KSP preconditioned resid norm 1.104801535149e-06 true resid norm > 2.490736510776e-04 ||r(i)||/||b|| 3.381514415953e-05 > > > 264 KSP preconditioned resid norm 1.158709147873e-06 true resid norm > 2.490607531152e-04 ||r(i)||/||b|| 3.381339308528e-05 > > > 265 KSP preconditioned resid norm 1.178985740182e-06 true resid norm > 2.490727895619e-04 ||r(i)||/||b|| 3.381502719703e-05 > > > 266 KSP preconditioned resid norm 1.165130533478e-06 true resid norm > 2.490639076693e-04 ||r(i)||/||b|| 3.381382135901e-05 > > > 267 KSP preconditioned resid norm 1.181364114499e-06 true resid norm > 2.490667871436e-04 ||r(i)||/||b|| 3.381421228690e-05 > > > 268 KSP preconditioned resid norm 1.170295348543e-06 true resid norm > 2.490662613306e-04 ||r(i)||/||b|| 3.381414090063e-05 > > > 269 KSP preconditioned resid norm 1.213243016230e-06 true resid norm > 2.490666173719e-04 ||r(i)||/||b|| 3.381418923808e-05 > > > 270 KSP preconditioned resid norm 1.239691953997e-06 true resid norm > 2.490678323197e-04 
||r(i)||/||b|| 3.381435418381e-05
[iterations 271-346 and the rest of this quoted thread are omitted here; the same material appears verbatim inside Barry Smith's reply below. The true residual norm stays near 2.4907e-04 (||r(i)||/||b|| ~ 3.381e-05) throughout.]

From bsmith at mcs.anl.gov  Sat Oct 21 22:41:56 2017
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Sat, 21 Oct 2017 22:41:56 -0500
Subject: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To: 
References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov>
Message-ID: <64D7FC90-EBE8-409C-B300-04A118155A98@mcs.anl.gov>

   This is good. You get more than a 12-digit reduction in the true residual norm. This is good AMG convergence, expected when everything goes well. What is different in this case from the previous case that does not converge reasonably?

   Barry

> On Oct 21, 2017, at 9:29 PM, Hao Zhang wrote:
>
> Barry, please advise what you make of this. This is the Poisson solver with all Neumann BCs, 3d case; a finite difference scheme was used.
> Thanks! I'm in learning mode.
>
> KSP Object: 24 MPI processes
>   type: bcgs
>   maximum iterations=40000, initial guess is zero
>   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
>   left preconditioning
>   using PRECONDITIONED norm type for convergence test
> PC Object: 24 MPI processes
>   type: hypre
>     HYPRE BoomerAMG preconditioning
>       Cycle type V
>       Maximum number of levels 25
>       Maximum number of iterations PER hypre call 1
>       Convergence tolerance PER hypre call 0.
>       Threshold for strong coupling 0.25
>       Interpolation truncation factor 0.
>       Interpolation: max elements per row 0
>       Number of levels of aggressive coarsening 0
>       Number of paths for aggressive coarsening 1
>       Maximum row sums 0.9
>       Sweeps down         1
>       Sweeps up           1
>       Sweeps on coarse    1
>       Relax down          symmetric-SOR/Jacobi
>       Relax up            symmetric-SOR/Jacobi
>       Relax on coarse     Gaussian-elimination
>       Relax weight  (all)      1.
>       Outer relax weight (all) 1.
>       Using CF-relaxation
>       Not using more complex smoothers.
>       Measure type        local
>       Coarsen type        Falgout
>       Interpolation type  classical
>   linear system matrix = precond matrix:
>   Mat Object: A 24 MPI processes
>     type: mpiaij
>     rows=497664, cols=497664
>     total: nonzeros=3363552, allocated nonzeros=6967296
>     total number of mallocs used during MatSetValues calls =0
>       has attached null space
>       not using I-node (on process 0) routines
>   0 KSP preconditioned resid norm 2.697270170623e-02 true resid norm 9.637159071207e+00 ||r(i)||/||b|| 1.000000000000e+00
>   1 KSP preconditioned resid norm 4.828857674609e-04 true resid norm 6.294379664645e-01 ||r(i)||/||b|| 6.531364293291e-02
>   2 KSP preconditioned resid norm 4.533649595815e-06 true resid norm 1.135508605857e-02 ||r(i)||/||b|| 1.178260727531e-03
>   3 KSP preconditioned resid norm 1.131704082606e-07 true resid norm 1.196393029874e-04 ||r(i)||/||b|| 1.241437462051e-05
>   4 KSP preconditioned resid norm 3.866281379569e-10 true resid norm 5.121520801846e-07 ||r(i)||/||b|| 5.314347064320e-08
>   5 KSP preconditioned resid norm 1.070114785241e-11 true resid norm 1.203693733135e-08 ||r(i)||/||b|| 1.249013038221e-09
>   6 KSP preconditioned resid norm 2.578780418765e-14 true resid norm 6.020297525927e-11 ||r(i)||/||b|| 6.246962908306e-12
>   7 KSP preconditioned resid norm 8.691764190203e-16 true resid norm 1.866088098154e-12 ||r(i)||/||b|| 1.936346680973e-13
> Linear solve converged due to CONVERGED_ATOL iterations 7
>
> On Sat, Oct 21, 2017 at 6:53 PM, Barry Smith wrote:
>
> > On Oct 21, 2017, at 5:47 PM, Hao Zhang wrote:
> >
> > hi, Barry:
> > what do you mean absurd by setting tolerance =1e-14?
>
>    Trying to decrease the initial residual norm down by a factor of 1e-14 with an iterative method (or even a direct method) is unrealistic (usually unachievable) and almost never necessary. You are requiring || r_n || < 1.e-14 || r_0 || when with double precision numbers you only have roughly 14 decimal digits total to compute with.
> Round off alone will lead to differences far larger than 1e-14.
>
>    If you are using the solver in the context of a nonlinear problem (i.e. inside Newton's method) then 1.e-6 is generally more than plenty to get quadratic convergence of Newton's method.
>
>    If you are solving a linear problem then it is extremely likely that errors due to discretization (from the finite element method etc.) and the model are much, much larger than even 1.e-8.
>
>    So, in summary:
>
>    1.e-14 is probably unachievable.
>
>    1.e-14 is almost for sure not needed.
>
>    Barry
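[To make the double-precision limit Barry describes concrete, here is a minimal, self-contained C check; it is an illustrative sketch, independent of any PETSc code in this thread:]

    #include <stdio.h>
    #include <float.h>

    int main(void)
    {
      /* DBL_EPSILON is the smallest eps with 1.0 + eps != 1.0 in IEEE double
         precision, roughly 2.2e-16, i.e. about 16 significant decimal digits. */
      printf("DBL_EPSILON = %g\n", DBL_EPSILON);

      /* Contributions below epsilon simply vanish, so a relative tolerance of
         1e-14 leaves only about two decimal digits of headroom above the
         rounding noise accumulated by every mat-vec and PC application. */
      printf("1.0 + 1e-17 == 1.0 ? %s\n", (1.0 + 1e-17 == 1.0) ? "yes" : "no");
      return 0;
    }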
> > On Sat, Oct 21, 2017 at 18:42 Barry Smith wrote:
> >
> >    Run with -ksp_view_mat binary -ksp_view_rhs binary and send the resulting output file called binaryoutput to petsc-maint at mcs.anl.gov
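[For reference, the system saved by those options can be read back for offline inspection along these lines; a sketch only, assuming the binaryoutput file produced above, with error checking omitted:]

    #include <petscmat.h>

    /* Inside a program that has called PetscInitialize().  Objects are read
       back from "binaryoutput" in the order they were written: matrix, then
       right-hand side. */
    PetscViewer viewer;
    Mat         A;
    Vec         b;

    PetscViewerBinaryOpen(PETSC_COMM_WORLD, "binaryoutput", FILE_MODE_READ, &viewer);
    MatCreate(PETSC_COMM_WORLD, &A);
    MatLoad(A, viewer);
    VecCreate(PETSC_COMM_WORLD, &b);
    VecLoad(b, viewer);
    PetscViewerDestroy(&viewer);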
> >    Note you can also use -ksp_type gmres with hypre; unlikely to be a reason to use bcgs.
> >
> >    BTW: tolerances: relative=1e-14, is absurd
> >
> >    My guess is your null space is incorrect.
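[One way to test Barry's guess for an all-Neumann pressure solve is sketched below. It assumes the null space is the constant vector and that A and b are already assembled; error checking is omitted:]

    #include <petscmat.h>

    /* Inside a program that has called PetscInitialize(), with A and b built. */
    MatNullSpace nullsp;
    PetscBool    isNull;

    MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nullsp);

    /* Check 1: does A really annihilate the constant vector?  MatNullSpaceTest
       applies A to the null-space basis and reports whether the result is ~0.
       If this fails, the discrete operator does not match the intended BCs. */
    MatNullSpaceTest(nullsp, A, &isNull);
    if (!isNull) PetscPrintf(PETSC_COMM_WORLD, "constant vector is NOT a null vector of A\n");

    /* Check 2: is the right-hand side consistent, i.e. orthogonal to the null
       space?  For the constant vector this means sum_i b_i should be ~0;
       projecting the null-space component out of b makes the singular system
       solvable. */
    MatNullSpaceRemove(nullsp, b);

    MatSetNullSpace(A, nullsp);
    MatNullSpaceDestroy(&nullsp);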
> > > On Oct 21, 2017, at 4:34 PM, Hao Zhang wrote:
> > >
> > > If this solver doesn't converge, I have a fall-back solution which uses the GMRES solver; that setup is fine with me. I just want to know whether HYPRE is a reliable solution for me, or whether I will have to go without a preconditioner.
> > >
> > > Thanks!
> > >
> > > On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang wrote:
> > > this is a serial run, still dumping output. parallel is more or less the same.
> > >
> > > KSP Object: 1 MPI processes
> > >   type: bcgs
> > >   maximum iterations=40000, initial guess is zero
> > >   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
> > >   left preconditioning
> > >   using PRECONDITIONED norm type for convergence test
> > > PC Object: 1 MPI processes
> > >   type: hypre
> > >     HYPRE BoomerAMG preconditioning
> > >       Cycle type V
> > >       Maximum number of levels 25
> > >       Maximum number of iterations PER hypre call 1
> > >       Convergence tolerance PER hypre call 0.
> > >       Threshold for strong coupling 0.25
> > >       Interpolation truncation factor 0.
> > >       Interpolation: max elements per row 0
> > >       Number of levels of aggressive coarsening 0
> > >       Number of paths for aggressive coarsening 1
> > >       Maximum row sums 0.9
> > >       Sweeps down         1
> > >       Sweeps up           1
> > >       Sweeps on coarse    1
> > >       Relax down          symmetric-SOR/Jacobi
> > >       Relax up            symmetric-SOR/Jacobi
> > >       Relax on coarse     Gaussian-elimination
> > >       Relax weight  (all)      1.
> > >       Outer relax weight (all) 1.
> > >       Using CF-relaxation
> > >       Not using more complex smoothers.
> > >       Measure type        local
> > >       Coarsen type        Falgout
> > >       Interpolation type  classical
> > >   linear system matrix = precond matrix:
> > >   Mat Object: A 1 MPI processes
> > >     type: seqaij
> > >     rows=497664, cols=497664
> > >     total: nonzeros=3363552, allocated nonzeros=3483648
> > >     total number of mallocs used during MatSetValues calls =0
> > >       has attached null space
> > >       not using I-node routines
> > >   0 KSP preconditioned resid norm 1.630377897956e+01 true resid norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00
> > >   1 KSP preconditioned resid norm 3.815819909205e-02 true resid norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01
> > >   2 KSP preconditioned resid norm 4.312671277701e-04 true resid norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03
> > >   3 KSP preconditioned resid norm 3.011875330569e-04 true resid norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05
> > >   4 KSP preconditioned resid norm 3.011783821295e-04 true resid norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05
> > >   5 KSP preconditioned resid norm 3.011783818372e-04 true resid norm 2.504053673004e-04 ||r(i)||/||b|| 3.399594279422e-05
> > >   [iterations 6-346 omitted: the preconditioned residual wanders between roughly 5e-7 and 3e-5 while the true residual norm stays pinned near 2.4907e-04, ||r(i)||/||b|| ~ 3.381e-05 -- the solve stagnates and never converges]
resid norm 2.490775981409e-04 ||r(i)||/||b|| 3.381568002720e-05
> > > 331 KSP preconditioned resid norm 1.464506963984e-06 true resid norm 2.490740562430e-04 ||r(i)||/||b|| 3.381519916626e-05
> > > 332 KSP preconditioned resid norm 1.461462964401e-06 true resid norm 2.490768016856e-04 ||r(i)||/||b|| 3.381557189753e-05
> > > 333 KSP preconditioned resid norm 1.476680847971e-06 true resid norm 2.490744321516e-04 ||r(i)||/||b|| 3.381525020097e-05
> > > 334 KSP preconditioned resid norm 1.459640372198e-06 true resid norm 2.490788817993e-04 ||r(i)||/||b|| 3.381585430132e-05
> > > 335 KSP preconditioned resid norm 1.790770882365e-06 true resid norm 2.490771711471e-04 ||r(i)||/||b|| 3.381562205697e-05
> > > 336 KSP preconditioned resid norm 1.803770155018e-06 true resid norm 2.490768953858e-04 ||r(i)||/||b|| 3.381558461860e-05
> > > 337 KSP preconditioned resid norm 1.787821255995e-06 true resid norm 2.490767985676e-04 ||r(i)||/||b|| 3.381557147421e-05
> > > 338 KSP preconditioned resid norm 1.749912220831e-06 true resid norm 2.490760198704e-04 ||r(i)||/||b|| 3.381546575545e-05
> > > 339 KSP preconditioned resid norm 1.802915839010e-06 true resid norm 2.490815556273e-04 ||r(i)||/||b|| 3.381621730993e-05
> > > 340 KSP preconditioned resid norm 1.800777670709e-06 true resid norm 2.490823909286e-04 ||r(i)||/||b|| 3.381633071347e-05
> > > 341 KSP preconditioned resid norm 1.962516327690e-06 true resid norm 2.490773477410e-04 ||r(i)||/||b|| 3.381564603199e-05
> > > 342 KSP preconditioned resid norm 1.981726465132e-06 true resid norm 2.490769116884e-04 ||r(i)||/||b|| 3.381558683191e-05
> > > 343 KSP preconditioned resid norm 1.963419167052e-06 true resid norm 2.490764009914e-04 ||r(i)||/||b|| 3.381551749783e-05
> > > 344 KSP preconditioned resid norm 1.992082169278e-06 true resid norm 2.490806829883e-04 ||r(i)||/||b|| 3.381609883728e-05
> > > 345 KSP preconditioned resid norm 1.981005134253e-06 true resid norm 2.490748677339e-04 ||r(i)||/||b|| 3.381530933721e-05
> > > 346 KSP preconditioned resid norm 1.959802663114e-06 true resid norm 2.490773752317e-04 ||r(i)||/||b|| 3.381564976423e-05
> > >
> > > On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley wrote:
> > > On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang wrote:
> > > ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
> > > ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);
> > >
> > > ierr = VecAssemblyBegin(x);
> > > ierr = VecAssemblyEnd(x);
> > > This is probably unnecessary
> > >
> > > ierr = VecAssemblyBegin(b);
> > > ierr = VecAssemblyEnd(b);
> > > This is probably unnecessary
> > >
> > > ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp);
> > > ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8
> > > Is your rhs consistent with this nullspace?
> > >
> > > // KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
> > > KSPSetOperators(ksp,A,A);
> > >
> > > KSPSetType(ksp,KSPBCGS);
> > >
> > > KSPSetComputeSingularValues(ksp, PETSC_TRUE);
> > > #if defined(__HYPRE__)
> > > KSPGetPC(ksp, &pc);
> > > PCSetType(pc, PCHYPRE);
> > > PCHYPRESetType(pc,"boomeramg");
> > > This is terribly unnecessary. You just use
> > >
> > > -pc_type hypre -pc_hypre_type boomeramg
> > >
> > > or
> > >
> > > -pc_type gamg
> > >
> > > #else
> > > KSPSetType(ksp,KSPBCGSL);
> > > KSPBCGSLSetEll(ksp,2);
> > > #endif /* defined(__HYPRE__) */
> > >
> > > KSPSetFromOptions(ksp);
> > > KSPSetUp(ksp);
> > >
> > > ierr = KSPSolve(ksp,b,x);
> > >
> > > command line
> > >
> > > You did not provide any of what I asked for in the previous mail.
> > >
> > > Matt
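[A minimal sketch of what Knepley's consistency question implies in code, assuming the Mat A, Vec b, and PETSc 3.8-era calls from the snippet above; the helper name and its placement are illustrative, not part of the original code:]

#include <petscksp.h>

/* Illustrative helper (not from the original code): attach the constant
   null space of an all-Neumann operator to A, then project that component
   out of b so that Ax = b is actually solvable. */
static PetscErrorCode AttachConstantNullSpace(Mat A, Vec b)
{
  MatNullSpace   nullsp;
  PetscErrorCode ierr;

  ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,NULL,&nullsp);CHKERRQ(ierr);
  ierr = MatSetNullSpace(A,nullsp);CHKERRQ(ierr);
  ierr = MatNullSpaceRemove(nullsp,b);CHKERRQ(ierr); /* subtracts the mean of b */
  ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);
  return 0;
}

[If b keeps a component in the null space, a symmetric operator can never remove it, so the true residual stalls at roughly the norm of that component; that is one possible reading of the ~3.38e-05 plateau in the log above.]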
> > > On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley wrote:
> > > On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang wrote:
> > > hi,
> > >
> > > I implemented a HYPRE preconditioner for my study because, without a preconditioner, the PETSc solver takes thousands of iterations to converge for a fine-grid simulation.
> > >
> > > With HYPRE, depending on the parallel partition, it takes HYPRE forever to do anything; the observation from the output file is that the simulation hangs with no output.
> > >
> > > Any idea what happened? I will post a snippet of code.
> > >
> > > 1) For any question about convergence, we need to see the output of
> > >
> > > -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason
> > >
> > > 2) Hypre has many preconditioners; which one are you talking about?
> > >
> > > 3) PETSc has some preconditioners in common with Hypre, like AMG
> > >
> > > Thanks,
> > >
> > > Matt
> > >
> > > --
> > > Hao Zhang
> > > Dept. of Applied Mathematics and Statistics,
> > > Stony Brook University,
> > > Stony Brook, New York, 11790
> > >
> > > --
> > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> > > -- Norbert Wiener
> > >
> > > https://www.cse.buffalo.edu/~knepley/

--
Hao Zhang
Dept. of Applied Mathematics and Statistics,
Stony Brook University,
Stony Brook, New York, 11790

From hbcbh1999 at gmail.com Sat Oct 21 22:50:42 2017
From: hbcbh1999 at gmail.com (Hao Zhang)
Date: Sat, 21 Oct 2017 23:50:42 -0400
Subject: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To: <64D7FC90-EBE8-409C-B300-04A118155A98@mcs.anl.gov>
References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov> <64D7FC90-EBE8-409C-B300-04A118155A98@mcs.anl.gov>
Message-ID:

The incompressible NS solver algorithm calls the PETSc solver at several stages of each time step. The solve you called "This is good. 12 digit reduction" is the initial pressure solve, for which HYPRE usually does not give good convergence, so the fall-back GMRES solver is called afterwards.

Barry, you mentioned that I could have a wrong null space. That particular solver is meant to give an initial pressure profile for a 3D incompressible NS simulation with all-Neumann boundary conditions. Could you give some insight into how to test whether I have a wrong null space?

Thanks!

On Sat, Oct 21, 2017 at 11:41 PM, Barry Smith wrote:
> >
> > This is good. You get more than 12 digit reduction in the true residual norm. This is good AMG convergence. Expected when everything goes well.
> >
> > What is different in this case from the previous case that does not converge reasonably?
> >
> > Barry
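[On the question of how to test for a wrong null space, a hedged sketch using the standard PETSc check, assuming the Mat A and MatNullSpace nullsp from the snippet reviewed above; the wrapper function and its error message are illustrative:]

#include <petscmat.h>

/* Verify that A really annihilates every vector in nullsp, i.e. that
   ||A z|| is numerically zero for each null-space basis vector z. */
static PetscErrorCode CheckAttachedNullSpace(Mat A, MatNullSpace nullsp)
{
  PetscBool      isNull;
  PetscErrorCode ierr;

  ierr = MatNullSpaceTest(nullsp,A,&isNull);CHKERRQ(ierr);
  if (!isNull) SETERRQ(PetscObjectComm((PetscObject)A),PETSC_ERR_PLIB,"Attached null space is not a null space of A");
  return 0;
}

[The same check is available from the command line as -ksp_test_null_space, which Barry suggests later in the thread.]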
> >
> > > On Oct 21, 2017, at 9:29 PM, Hao Zhang wrote:
> > >
> > > Barry, please advise what you make of this? This is a Poisson solver for a 3D case with all-Neumann BCs; a finite-difference scheme was used.
> > > Thanks! I'm in learning mode.
> >
> > KSP Object: 24 MPI processes
> > type: bcgs
> > maximum iterations=40000, initial guess is zero
> > tolerances: relative=1e-14, absolute=1e-14, divergence=10000.
> > left preconditioning
> > using PRECONDITIONED norm type for convergence test
> > PC Object: 24 MPI processes
> > type: hypre
> > HYPRE BoomerAMG preconditioning
> > Cycle type V
> > Maximum number of levels 25
> > Maximum number of iterations PER hypre call 1
> > Convergence tolerance PER hypre call 0.
> > Threshold for strong coupling 0.25
> > Interpolation truncation factor 0.
> > Interpolation: max elements per row 0
> > Number of levels of aggressive coarsening 0
> > Number of paths for aggressive coarsening 1
> > Maximum row sums 0.9
> > Sweeps down 1
> > Sweeps up 1
> > Sweeps on coarse 1
> > Relax down symmetric-SOR/Jacobi
> > Relax up symmetric-SOR/Jacobi
> > Relax on coarse Gaussian-elimination
> > Relax weight (all) 1.
> > Outer relax weight (all) 1.
> > Using CF-relaxation
> > Not using more complex smoothers.
> > Measure type local
> > Coarsen type Falgout
> > Interpolation type classical
> > linear system matrix = precond matrix:
> > Mat Object: A 24 MPI processes
> > type: mpiaij
> > rows=497664, cols=497664
> > total: nonzeros=3363552, allocated nonzeros=6967296
> > total number of mallocs used during MatSetValues calls =0
> > has attached null space
> > not using I-node (on process 0) routines
> > 0 KSP preconditioned resid norm 2.697270170623e-02 true resid norm 9.637159071207e+00 ||r(i)||/||b|| 1.000000000000e+00
> > 1 KSP preconditioned resid norm 4.828857674609e-04 true resid norm 6.294379664645e-01 ||r(i)||/||b|| 6.531364293291e-02
> > 2 KSP preconditioned resid norm 4.533649595815e-06 true resid norm 1.135508605857e-02 ||r(i)||/||b|| 1.178260727531e-03
> > 3 KSP preconditioned resid norm 1.131704082606e-07 true resid norm 1.196393029874e-04 ||r(i)||/||b|| 1.241437462051e-05
> > 4 KSP preconditioned resid norm 3.866281379569e-10 true resid norm 5.121520801846e-07 ||r(i)||/||b|| 5.314347064320e-08
> > 5 KSP preconditioned resid norm 1.070114785241e-11 true resid norm 1.203693733135e-08 ||r(i)||/||b|| 1.249013038221e-09
> > 6 KSP preconditioned resid norm 2.578780418765e-14 true resid norm 6.020297525927e-11 ||r(i)||/||b|| 6.246962908306e-12
> > 7 KSP preconditioned resid norm 8.691764190203e-16 true resid norm 1.866088098154e-12 ||r(i)||/||b|| 1.936346680973e-13
> > Linear solve converged due to CONVERGED_ATOL iterations 7
> >
> > On Sat, Oct 21, 2017 at 6:53 PM, Barry Smith wrote:
> >
> > > On Oct 21, 2017, at 5:47 PM, Hao Zhang wrote:
> > >
> > > hi, Barry:
> > > what do you mean by absurd in setting tolerance = 1e-14?
> >
> > Trying to decrease the initial residual norm down by a factor of 1e-14 with an iterative method (or even a direct method) is unrealistic (usually unachievable) and almost never necessary. You are requiring || r_n || < 1.e-14 || r_0 || when with double-precision numbers you only have roughly 14 decimal digits total to compute with. Round-off alone will lead to differences far larger than 1e-14.
> >
> > If you are using the solver in the context of a nonlinear problem (i.e. inside Newton's method) then 1.e-6 is generally more than plenty to get quadratic convergence of Newton's method.
> >
> > If you are solving a linear problem then it is extremely likely that errors due to discretization (from the finite element method etc.) and the model are much, much larger than even 1.e-8.
> >
> > So, in summary:
> >
> > 1.e-14 is probably unachievable.
> >
> > 1.e-14 is almost for sure not needed.
> >
> > Barry
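[A hedged sketch of a stopping criterion along the lines Barry suggests, assuming the ksp from the earlier snippet; the exact values are illustrative:]

/* rtol 1e-6 is usually plenty for a linear solve inside a time stepper or
   Newton loop; leave abstol and dtol at their defaults and cap the
   iteration count instead of allowing 40000 iterations. */
ierr = KSPSetTolerances(ksp,1.0e-6,PETSC_DEFAULT,PETSC_DEFAULT,1000);CHKERRQ(ierr);

[The equivalent command-line options would be -ksp_rtol 1.0e-6 -ksp_max_it 1000.]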
> > On Sat, Oct 21, 2017 at 18:42 Barry Smith wrote:
> > >
> > > Run with -ksp_view_mat binary -ksp_view_rhs binary and send the resulting output file called binaryoutput to petsc-maint at mcs.anl.gov
> > >
> > > Note you can also use -ksp_type gmres with hypre; there is unlikely to be a reason to use bcgs.
> > >
> > > BTW: tolerances: relative=1e-14 is absurd.
> > >
> > > My guess is your null space is incorrect.
> > >
> > > > On Oct 21, 2017, at 4:34 PM, Hao Zhang wrote:
> > > >
> > > > If this solver doesn't converge, I have a fall-back solution that uses a GMRES solver; this setup is fine with me. I just want to know whether HYPRE is a reliable solution for me, or whether I will have to go without a preconditioner.
> > > >
> > > > Thanks!
> > > >
> > > > On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang wrote:
> > > > This is a serial run, still dumping output; parallel is more or less the same.
> > > >
> > > > KSP Object: 1 MPI processes
> > > > type: bcgs
> > > > maximum iterations=40000, initial guess is zero
> > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000.
> > > > left preconditioning
> > > > using PRECONDITIONED norm type for convergence test
> > > > PC Object: 1 MPI processes
> > > > type: hypre
> > > > HYPRE BoomerAMG preconditioning
> > > > Cycle type V
> > > > Maximum number of levels 25
> > > > Maximum number of iterations PER hypre call 1
> > > > Convergence tolerance PER hypre call 0.
> > > > Threshold for strong coupling 0.25
> > > > Interpolation truncation factor 0.
> > > > Interpolation: max elements per row 0
> > > > Number of levels of aggressive coarsening 0
> > > > Number of paths for aggressive coarsening 1
> > > > Maximum row sums 0.9
> > > > Sweeps down 1
> > > > Sweeps up 1
> > > > Sweeps on coarse 1
> > > > Relax down symmetric-SOR/Jacobi
> > > > Relax up symmetric-SOR/Jacobi
> > > > Relax on coarse Gaussian-elimination
> > > > Relax weight (all) 1.
> > > > Outer relax weight (all) 1.
> > > > Using CF-relaxation
> > > > Not using more complex smoothers.
> > > > Measure type local > > > > Coarsen type Falgout > > > > Interpolation type classical > > > > linear system matrix = precond matrix: > > > > Mat Object: A 1 MPI processes > > > > type: seqaij > > > > rows=497664, cols=497664 > > > > total: nonzeros=3363552, allocated nonzeros=3483648 > > > > total number of mallocs used during MatSetValues calls =0 > > > > has attached null space > > > > not using I-node routines > > > > 0 KSP preconditioned resid norm 1.630377897956e+01 true resid norm > 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00 > > > > 1 KSP preconditioned resid norm 3.815819909205e-02 true resid norm > 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01 > > > > 2 KSP preconditioned resid norm 4.312671277701e-04 true resid norm > 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03 > > > > 3 KSP preconditioned resid norm 3.011875330569e-04 true resid norm > 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05 > > > > 4 KSP preconditioned resid norm 3.011783821295e-04 true resid norm > 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05 > > > > 5 KSP preconditioned resid norm 3.011783818372e-04 true resid norm > 2.504053673004e-04 ||r(i)||/||b|| 3.399594279422e-05 > > > > 6 KSP preconditioned resid norm 3.011783818375e-04 true resid norm > 2.503984078890e-04 ||r(i)||/||b|| 3.399499795925e-05 > > > > 7 KSP preconditioned resid norm 3.011783887442e-04 true resid norm > 2.504121501772e-04 ||r(i)||/||b|| 3.399686366224e-05 > > > > 8 KSP preconditioned resid norm 3.010913654181e-04 true resid norm > 2.504150259031e-04 ||r(i)||/||b|| 3.399725408124e-05 > > > > 9 KSP preconditioned resid norm 3.006520688232e-04 true resid norm > 2.504061607382e-04 ||r(i)||/||b|| 3.399605051423e-05 > > > > 10 KSP preconditioned resid norm 3.007309991942e-04 true resid norm > 2.503843638523e-04 ||r(i)||/||b|| 3.399309128978e-05 > > > > 11 KSP preconditioned resid norm 3.015946168077e-04 true resid norm > 2.503644677844e-04 ||r(i)||/||b|| 3.399039012728e-05 > > > > 12 KSP preconditioned resid norm 2.956643907377e-04 true resid norm > 2.503863046509e-04 ||r(i)||/||b|| 3.399335477965e-05 > > > > 13 KSP preconditioned resid norm 2.997992358936e-04 true resid norm > 2.504336903093e-04 ||r(i)||/||b|| 3.399978802886e-05 > > > > 14 KSP preconditioned resid norm 2.481415420420e-05 true resid norm > 2.491591201250e-04 ||r(i)||/||b|| 3.382674774806e-05 > > > > 15 KSP preconditioned resid norm 2.615494786181e-05 true resid norm > 2.490353237273e-04 ||r(i)||/||b|| 3.380994069915e-05 > > > > 16 KSP preconditioned resid norm 2.645126692130e-05 true resid norm > 2.490535523344e-04 ||r(i)||/||b|| 3.381241548111e-05 > > > > 17 KSP preconditioned resid norm 2.667223026209e-05 true resid norm > 2.490482602536e-04 ||r(i)||/||b|| 3.381169700898e-05 > > > > 18 KSP preconditioned resid norm 2.650813432116e-05 true resid norm > 2.490473169014e-04 ||r(i)||/||b|| 3.381156893606e-05 > > > > 19 KSP preconditioned resid norm 2.613309555449e-05 true resid norm > 2.490465633690e-04 ||r(i)||/||b|| 3.381146663375e-05 > > > > 20 KSP preconditioned resid norm 2.644160446804e-05 true resid norm > 2.490532739949e-04 ||r(i)||/||b|| 3.381237769272e-05 > > > > 21 KSP preconditioned resid norm 2.635987608975e-05 true resid norm > 2.490499548926e-04 ||r(i)||/||b|| 3.381192707933e-05 > > > > 22 KSP preconditioned resid norm 2.640527129095e-05 true resid norm > 2.490594066529e-04 ||r(i)||/||b|| 3.381321028466e-05 > > > > 23 KSP preconditioned resid norm 2.627505117691e-05 true resid norm > 2.490550162585e-04 ||r(i)||/||b|| 
3.381261422875e-05 > > > > 24 KSP preconditioned resid norm 2.642659196388e-05 true resid norm > 2.490504347640e-04 ||r(i)||/||b|| 3.381199222842e-05 > > > > 25 KSP preconditioned resid norm 2.659432190695e-05 true resid norm > 2.490510775152e-04 ||r(i)||/||b|| 3.381207949065e-05 > > > > 26 KSP preconditioned resid norm 2.687918062951e-05 true resid norm > 2.490518882015e-04 ||r(i)||/||b|| 3.381218955237e-05 > > > > 27 KSP preconditioned resid norm 2.662909048432e-05 true resid norm > 2.490446263285e-04 ||r(i)||/||b|| 3.381120365409e-05 > > > > 28 KSP preconditioned resid norm 2.085466483199e-05 true resid norm > 2.490131612366e-04 ||r(i)||/||b|| 3.380693183886e-05 > > > > 29 KSP preconditioned resid norm 2.098541330282e-05 true resid norm > 2.490126933398e-04 ||r(i)||/||b|| 3.380686831549e-05 > > > > 30 KSP preconditioned resid norm 2.175345180286e-05 true resid norm > 2.490098852429e-04 ||r(i)||/||b|| 3.380648707805e-05 > > > > 31 KSP preconditioned resid norm 2.182182437676e-05 true resid norm > 2.490028301020e-04 ||r(i)||/||b|| 3.380552924648e-05 > > > > 32 KSP preconditioned resid norm 2.152970404369e-05 true resid norm > 2.490089939838e-04 ||r(i)||/||b|| 3.380636607747e-05 > > > > 33 KSP preconditioned resid norm 2.187932450016e-05 true resid norm > 2.490085293931e-04 ||r(i)||/||b|| 3.380630300295e-05 > > > > 34 KSP preconditioned resid norm 2.207255875067e-05 true resid norm > 2.490039036092e-04 ||r(i)||/||b|| 3.380567498971e-05 > > > > 35 KSP preconditioned resid norm 2.205060279701e-05 true resid norm > 2.490101636150e-04 ||r(i)||/||b|| 3.380652487086e-05 > > > > 36 KSP preconditioned resid norm 2.168654200416e-05 true resid norm > 2.490091609876e-04 ||r(i)||/||b|| 3.380638875052e-05 > > > > 37 KSP preconditioned resid norm 2.164521042361e-05 true resid norm > 2.490083143913e-04 ||r(i)||/||b|| 3.380627381352e-05 > > > > 38 KSP preconditioned resid norm 2.154429063973e-05 true resid norm > 2.490075485470e-04 ||r(i)||/||b|| 3.380616983972e-05 > > > > 39 KSP preconditioned resid norm 2.165962086228e-05 true resid norm > 2.490099695056e-04 ||r(i)||/||b|| 3.380649851786e-05 > > > > 40 KSP preconditioned resid norm 2.153877616091e-05 true resid norm > 2.490090652619e-04 ||r(i)||/||b|| 3.380637575444e-05 > > > > 41 KSP preconditioned resid norm 2.347651187611e-05 true resid norm > 2.490233544624e-04 ||r(i)||/||b|| 3.380831570825e-05 > > > > 42 KSP preconditioned resid norm 2.352860162514e-05 true resid norm > 2.490191394202e-04 ||r(i)||/||b|| 3.380774345879e-05 > > > > 43 KSP preconditioned resid norm 2.312377506928e-05 true resid norm > 2.490209491359e-04 ||r(i)||/||b|| 3.380798915237e-05 > > > > 44 KSP preconditioned resid norm 2.295770973533e-05 true resid norm > 2.490178136759e-04 ||r(i)||/||b|| 3.380756347093e-05 > > > > 45 KSP preconditioned resid norm 2.833646456041e-05 true resid norm > 2.489991602651e-04 ||r(i)||/||b|| 3.380503101608e-05 > > > > 46 KSP preconditioned resid norm 2.760296424494e-05 true resid norm > 2.490104320666e-04 ||r(i)||/||b|| 3.380656131682e-05 > > > > 47 KSP preconditioned resid norm 2.451504295239e-05 true resid norm > 2.490241388672e-04 ||r(i)||/||b|| 3.380842220189e-05 > > > > 48 KSP preconditioned resid norm 2.512391514098e-05 true resid norm > 2.490245923753e-04 ||r(i)||/||b|| 3.380848377180e-05 > > > > 49 KSP preconditioned resid norm 2.483419450528e-05 true resid norm > 2.490273364402e-04 ||r(i)||/||b|| 3.380885631602e-05 > > > > 50 KSP preconditioned resid norm 2.507460538466e-05 true resid norm > 2.490309488780e-04 ||r(i)||/||b|| 
3.380934675371e-05 > > > > 51 KSP preconditioned resid norm 2.499708772881e-05 true resid norm > 2.490300908170e-04 ||r(i)||/||b|| 3.380923026022e-05 > > > > 52 KSP preconditioned resid norm 1.059778259446e-05 true resid norm > 2.489352833521e-04 ||r(i)||/||b|| 3.379635885420e-05 > > > > 53 KSP preconditioned resid norm 1.074975117060e-05 true resid norm > 2.489294722901e-04 ||r(i)||/||b|| 3.379556992330e-05 > > > > 54 KSP preconditioned resid norm 1.095242219559e-05 true resid norm > 2.489295454212e-04 ||r(i)||/||b|| 3.379557985184e-05 > > > > 55 KSP preconditioned resid norm 8.359999674720e-06 true resid norm > 2.489673581944e-04 ||r(i)||/||b|| 3.380071345137e-05 > > > > 56 KSP preconditioned resid norm 8.368232998470e-06 true resid norm > 2.489700421343e-04 ||r(i)||/||b|| 3.380107783281e-05 > > > > 57 KSP preconditioned resid norm 8.443378041101e-06 true resid norm > 2.489702900875e-04 ||r(i)||/||b|| 3.380111149584e-05 > > > > 58 KSP preconditioned resid norm 8.647159584302e-06 true resid norm > 2.489640805831e-04 ||r(i)||/||b|| 3.380026847095e-05 > > > > 59 KSP preconditioned resid norm 1.024742790737e-05 true resid norm > 2.489447846660e-04 ||r(i)||/||b|| 3.379764878711e-05 > > > > 60 KSP preconditioned resid norm 1.033394118910e-05 true resid norm > 2.489441404923e-04 ||r(i)||/||b|| 3.379756133175e-05 > > > > 61 KSP preconditioned resid norm 1.030066336446e-05 true resid norm > 2.489399918556e-04 ||r(i)||/||b|| 3.379699809776e-05 > > > > 62 KSP preconditioned resid norm 1.029956398963e-05 true resid norm > 2.489445295139e-04 ||r(i)||/||b|| 3.379761414674e-05 > > > > 63 KSP preconditioned resid norm 1.028190129002e-05 true resid norm > 2.489456200527e-04 ||r(i)||/||b|| 3.379776220225e-05 > > > > 64 KSP preconditioned resid norm 9.878799185773e-06 true resid norm > 2.489488742330e-04 ||r(i)||/||b|| 3.379820400160e-05 > > > > 65 KSP preconditioned resid norm 9.917711104174e-06 true resid norm > 2.489478066593e-04 ||r(i)||/||b|| 3.379805906391e-05 > > > > 66 KSP preconditioned resid norm 1.003572019576e-05 true resid norm > 2.489441995703e-04 ||r(i)||/||b|| 3.379756935240e-05 > > > > 67 KSP preconditioned resid norm 9.924487278236e-06 true resid norm > 2.489475403451e-04 ||r(i)||/||b|| 3.379802290812e-05 > > > > 68 KSP preconditioned resid norm 9.804213483359e-06 true resid norm > 2.489457781760e-04 ||r(i)||/||b|| 3.379778366964e-05 > > > > 69 KSP preconditioned resid norm 9.748922705476e-06 true resid norm > 2.489408473578e-04 ||r(i)||/||b|| 3.379711424383e-05 > > > > 70 KSP preconditioned resid norm 9.886044523689e-06 true resid norm > 2.489514438395e-04 ||r(i)||/||b|| 3.379855286071e-05 > > > > 71 KSP preconditioned resid norm 1.083888478689e-05 true resid norm > 2.489420898851e-04 ||r(i)||/||b|| 3.379728293386e-05 > > > > 72 KSP preconditioned resid norm 1.106561823757e-05 true resid norm > 2.489364778104e-04 ||r(i)||/||b|| 3.379652101821e-05 > > > > 73 KSP preconditioned resid norm 1.132091515426e-05 true resid norm > 2.489456804535e-04 ||r(i)||/||b|| 3.379777040248e-05 > > > > 74 KSP preconditioned resid norm 1.330905328963e-05 true resid norm > 2.489317458981e-04 ||r(i)||/||b|| 3.379587859660e-05 > > > > 75 KSP preconditioned resid norm 1.305555302619e-05 true resid norm > 2.489320939810e-04 ||r(i)||/||b|| 3.379592585359e-05 > > > > 76 KSP preconditioned resid norm 1.308083397399e-05 true resid norm > 2.489299951581e-04 ||r(i)||/||b|| 3.379564090977e-05 > > > > 77 KSP preconditioned resid norm 1.320098861853e-05 true resid norm > 2.489323669317e-04 ||r(i)||/||b|| 
3.379596291036e-05 > > > > 78 KSP preconditioned resid norm 1.300160788274e-05 true resid norm > 2.489306393356e-04 ||r(i)||/||b|| 3.379572836564e-05 > > > > 79 KSP preconditioned resid norm 1.317651537793e-05 true resid norm > 2.489381364970e-04 ||r(i)||/||b|| 3.379674620752e-05 > > > > 80 KSP preconditioned resid norm 1.309769805765e-05 true resid norm > 2.489285056062e-04 ||r(i)||/||b|| 3.379543868279e-05 > > > > 81 KSP preconditioned resid norm 1.293686496271e-05 true resid norm > 2.489347818072e-04 ||r(i)||/||b|| 3.379629076264e-05 > > > > 82 KSP preconditioned resid norm 1.311788285799e-05 true resid norm > 2.489320040215e-04 ||r(i)||/||b|| 3.379591364037e-05 > > > > 83 KSP preconditioned resid norm 1.313667378798e-05 true resid norm > 2.489329437217e-04 ||r(i)||/||b|| 3.379604121748e-05 > > > > 84 KSP preconditioned resid norm 1.416138205017e-05 true resid norm > 2.489266908838e-04 ||r(i)||/||b|| 3.379519230948e-05 > > > > 85 KSP preconditioned resid norm 1.452253464774e-05 true resid norm > 2.489285688375e-04 ||r(i)||/||b|| 3.379544726729e-05 > > > > 86 KSP preconditioned resid norm 1.426709413370e-05 true resid norm > 2.489362313402e-04 ||r(i)||/||b|| 3.379648755651e-05 > > > > 87 KSP preconditioned resid norm 1.427480849552e-05 true resid norm > 2.489378183000e-04 ||r(i)||/||b|| 3.379670300795e-05 > > > > 88 KSP preconditioned resid norm 1.413870980147e-05 true resid norm > 2.489325756118e-04 ||r(i)||/||b|| 3.379599124153e-05 > > > > 89 KSP preconditioned resid norm 1.353259857657e-05 true resid norm > 2.489318968308e-04 ||r(i)||/||b|| 3.379589908776e-05 > > > > 90 KSP preconditioned resid norm 1.347676448611e-05 true resid norm > 2.489332074417e-04 ||r(i)||/||b|| 3.379607702106e-05 > > > > 91 KSP preconditioned resid norm 1.362825902909e-05 true resid norm > 2.489344974971e-04 ||r(i)||/||b|| 3.379625216367e-05 > > > > 92 KSP preconditioned resid norm 1.346280901052e-05 true resid norm > 2.489302570131e-04 ||r(i)||/||b|| 3.379567646016e-05 > > > > 93 KSP preconditioned resid norm 1.328052169696e-05 true resid norm > 2.489346601224e-04 ||r(i)||/||b|| 3.379627424228e-05 > > > > 94 KSP preconditioned resid norm 1.554682082515e-05 true resid norm > 2.489309078759e-04 ||r(i)||/||b|| 3.379576482365e-05 > > > > 95 KSP preconditioned resid norm 1.557128675775e-05 true resid norm > 2.489317143582e-04 ||r(i)||/||b|| 3.379587431462e-05 > > > > 96 KSP preconditioned resid norm 1.542571813923e-05 true resid norm > 2.489319910303e-04 ||r(i)||/||b|| 3.379591187663e-05 > > > > 97 KSP preconditioned resid norm 1.570516684444e-05 true resid norm > 2.489321980894e-04 ||r(i)||/||b|| 3.379593998772e-05 > > > > 98 KSP preconditioned resid norm 1.600431789899e-05 true resid norm > 2.489297450311e-04 ||r(i)||/||b|| 3.379560695162e-05 > > > > 99 KSP preconditioned resid norm 1.587495554658e-05 true resid norm > 2.489339000570e-04 ||r(i)||/||b|| 3.379617105303e-05 > > > > 100 KSP preconditioned resid norm 1.621163002878e-05 true resid norm > 2.489299953360e-04 ||r(i)||/||b|| 3.379564093392e-05 > > > > 101 KSP preconditioned resid norm 1.627060872574e-05 true resid norm > 2.489301570161e-04 ||r(i)||/||b|| 3.379566288419e-05 > > > > 102 KSP preconditioned resid norm 1.622931647243e-05 true resid norm > 2.489277930910e-04 ||r(i)||/||b|| 3.379534194913e-05 > > > > 103 KSP preconditioned resid norm 1.612544300282e-05 true resid norm > 2.489317483299e-04 ||r(i)||/||b|| 3.379587892674e-05 > > > > 104 KSP preconditioned resid norm 1.880131646630e-05 true resid norm > 2.489335862583e-04 ||r(i)||/||b|| 
3.379612845059e-05 > > > > 105 KSP preconditioned resid norm 1.880563295793e-05 true resid norm > 2.489365017923e-04 ||r(i)||/||b|| 3.379652427408e-05 > > > > 106 KSP preconditioned resid norm 1.860619184027e-05 true resid norm > 2.489362382373e-04 ||r(i)||/||b|| 3.379648849288e-05 > > > > 107 KSP preconditioned resid norm 1.877134148719e-05 true resid norm > 2.489425484523e-04 ||r(i)||/||b|| 3.379734519061e-05 > > > > 108 KSP preconditioned resid norm 1.914810713538e-05 true resid norm > 2.489347415573e-04 ||r(i)||/||b|| 3.379628529818e-05 > > > > 109 KSP preconditioned resid norm 1.220673255622e-05 true resid norm > 2.490196186357e-04 ||r(i)||/||b|| 3.380780851884e-05 > > > > 110 KSP preconditioned resid norm 1.215819132910e-05 true resid norm > 2.490233518072e-04 ||r(i)||/||b|| 3.380831534776e-05 > > > > 111 KSP preconditioned resid norm 1.196565427400e-05 true resid norm > 2.490279438906e-04 ||r(i)||/||b|| 3.380893878570e-05 > > > > 112 KSP preconditioned resid norm 1.171748185197e-05 true resid norm > 2.490242578983e-04 ||r(i)||/||b|| 3.380843836198e-05 > > > > 113 KSP preconditioned resid norm 1.162855824118e-05 true resid norm > 2.490229018536e-04 ||r(i)||/||b|| 3.380825426043e-05 > > > > 114 KSP preconditioned resid norm 1.175594685689e-05 true resid norm > 2.490274328440e-04 ||r(i)||/||b|| 3.380886940415e-05 > > > > 115 KSP preconditioned resid norm 1.167979454122e-05 true resid norm > 2.490271161036e-04 ||r(i)||/||b|| 3.380882640232e-05 > > > > 116 KSP preconditioned resid norm 1.181010893019e-05 true resid norm > 2.490235420657e-04 ||r(i)||/||b|| 3.380834117795e-05 > > > > 117 KSP preconditioned resid norm 1.175206638194e-05 true resid norm > 2.490263165345e-04 ||r(i)||/||b|| 3.380871784992e-05 > > > > 118 KSP preconditioned resid norm 1.183804125791e-05 true resid norm > 2.490221353083e-04 ||r(i)||/||b|| 3.380815019145e-05 > > > > 119 KSP preconditioned resid norm 1.186426973727e-05 true resid norm > 2.490227115336e-04 ||r(i)||/||b|| 3.380822842189e-05 > > > > 120 KSP preconditioned resid norm 1.181986776689e-05 true resid norm > 2.490257884230e-04 ||r(i)||/||b|| 3.380864615159e-05 > > > > 121 KSP preconditioned resid norm 1.131443277370e-05 true resid norm > 2.490259110230e-04 ||r(i)||/||b|| 3.380866279620e-05 > > > > 122 KSP preconditioned resid norm 1.114920075859e-05 true resid norm > 2.490249829382e-04 ||r(i)||/||b|| 3.380853679603e-05 > > > > 123 KSP preconditioned resid norm 1.082073321672e-05 true resid norm > 2.490314084868e-04 ||r(i)||/||b|| 3.380940915187e-05 > > > > 124 KSP preconditioned resid norm 3.307785860990e-06 true resid norm > 2.490613501549e-04 ||r(i)||/||b|| 3.381347414155e-05 > > > > 125 KSP preconditioned resid norm 3.287051720572e-06 true resid norm > 2.490584648195e-04 ||r(i)||/||b|| 3.381308241794e-05 > > > > 126 KSP preconditioned resid norm 3.286797046069e-06 true resid norm > 2.490654396386e-04 ||r(i)||/||b|| 3.381402934473e-05 > > > > 127 KSP preconditioned resid norm 3.311592899411e-06 true resid norm > 2.490627973588e-04 ||r(i)||/||b|| 3.381367061922e-05 > > > > 128 KSP preconditioned resid norm 3.560993694635e-06 true resid norm > 2.490571732816e-04 ||r(i)||/||b|| 3.381290707406e-05 > > > > 129 KSP preconditioned resid norm 3.411994617661e-06 true resid norm > 2.490652122141e-04 ||r(i)||/||b|| 3.381399846875e-05 > > > > 130 KSP preconditioned resid norm 3.412383310721e-06 true resid norm > 2.490633454022e-04 ||r(i)||/||b|| 3.381374502359e-05 > > > > 131 KSP preconditioned resid norm 3.288320044878e-06 true resid norm > 2.490639470096e-04 
||r(i)||/||b|| 3.381382669999e-05 > > > > 132 KSP preconditioned resid norm 3.273215756565e-06 true resid norm > 2.490640390847e-04 ||r(i)||/||b|| 3.381383920043e-05 > > > > 133 KSP preconditioned resid norm 3.236969051459e-06 true resid norm > 2.490678216102e-04 ||r(i)||/||b|| 3.381435272985e-05 > > > > 134 KSP preconditioned resid norm 3.203260913942e-06 true resid norm > 2.490640965346e-04 ||r(i)||/||b|| 3.381384700005e-05 > > > > 135 KSP preconditioned resid norm 3.224117152353e-06 true resid norm > 2.490655026376e-04 ||r(i)||/||b|| 3.381403789770e-05 > > > > 136 KSP preconditioned resid norm 3.221577997984e-06 true resid norm > 2.490684737611e-04 ||r(i)||/||b|| 3.381444126823e-05 > > > > 137 KSP preconditioned resid norm 3.195936222128e-06 true resid norm > 2.490673982333e-04 ||r(i)||/||b|| 3.381429525066e-05 > > > > 138 KSP preconditioned resid norm 3.207528137426e-06 true resid norm > 2.490641247196e-04 ||r(i)||/||b|| 3.381385082655e-05 > > > > 139 KSP preconditioned resid norm 3.240134271963e-06 true resid norm > 2.490615861251e-04 ||r(i)||/||b|| 3.381350617773e-05 > > > > 140 KSP preconditioned resid norm 2.698833607230e-06 true resid norm > 2.490638954889e-04 ||r(i)||/||b|| 3.381381970535e-05 > > > > 141 KSP preconditioned resid norm 2.599151209137e-06 true resid norm > 2.490657106698e-04 ||r(i)||/||b|| 3.381406614091e-05 > > > > 142 KSP preconditioned resid norm 2.633939920994e-06 true resid norm > 2.490707754695e-04 ||r(i)||/||b|| 3.381475375652e-05 > > > > 143 KSP preconditioned resid norm 2.519609221376e-06 true resid norm > 2.490639100480e-04 ||r(i)||/||b|| 3.381382168195e-05 > > > > 144 KSP preconditioned resid norm 3.768526937684e-06 true resid norm > 2.490654096698e-04 ||r(i)||/||b|| 3.381402527606e-05 > > > > 145 KSP preconditioned resid norm 3.707841943289e-06 true resid norm > 2.490630207923e-04 ||r(i)||/||b|| 3.381370095336e-05 > > > > 146 KSP preconditioned resid norm 3.698827503486e-06 true resid norm > 2.490646071561e-04 ||r(i)||/||b|| 3.381391632387e-05 > > > > 147 KSP preconditioned resid norm 3.642747039615e-06 true resid norm > 2.490610990161e-04 ||r(i)||/||b|| 3.381344004604e-05 > > > > 148 KSP preconditioned resid norm 3.613100087842e-06 true resid norm > 2.490617159023e-04 ||r(i)||/||b|| 3.381352379676e-05 > > > > 149 KSP preconditioned resid norm 3.637646399299e-06 true resid norm > 2.490648063023e-04 ||r(i)||/||b|| 3.381394336069e-05 > > > > 150 KSP preconditioned resid norm 3.640235367864e-06 true resid norm > 2.490648516718e-04 ||r(i)||/||b|| 3.381394952022e-05 > > > > 151 KSP preconditioned resid norm 3.724708848977e-06 true resid norm > 2.490622201040e-04 ||r(i)||/||b|| 3.381359224901e-05 > > > > 152 KSP preconditioned resid norm 3.665185002770e-06 true resid norm > 2.490664302790e-04 ||r(i)||/||b|| 3.381416383766e-05 > > > > 153 KSP preconditioned resid norm 3.348992579120e-06 true resid norm > 2.490655722697e-04 ||r(i)||/||b|| 3.381404735121e-05 > > > > 154 KSP preconditioned resid norm 3.309431137943e-06 true resid norm > 2.490727563300e-04 ||r(i)||/||b|| 3.381502268535e-05 > > > > 155 KSP preconditioned resid norm 3.299031245428e-06 true resid norm > 2.490688392843e-04 ||r(i)||/||b|| 3.381449089298e-05 > > > > 156 KSP preconditioned resid norm 3.297127463503e-06 true resid norm > 2.490642207769e-04 ||r(i)||/||b|| 3.381386386763e-05 > > > > 157 KSP preconditioned resid norm 3.297370198641e-06 true resid norm > 2.490666651723e-04 ||r(i)||/||b|| 3.381419572764e-05 > > > > 158 KSP preconditioned resid norm 3.290873165210e-06 true resid norm > 
2.490679189538e-04 ||r(i)||/||b|| 3.381436594557e-05 > > > > 159 KSP preconditioned resid norm 3.346705292419e-06 true resid norm > 2.490617329776e-04 ||r(i)||/||b|| 3.381352611496e-05 > > > > 160 KSP preconditioned resid norm 3.429583550890e-06 true resid norm > 2.490675116236e-04 ||r(i)||/||b|| 3.381431064494e-05 > > > > 161 KSP preconditioned resid norm 3.425238504679e-06 true resid norm > 2.490648199058e-04 ||r(i)||/||b|| 3.381394520756e-05 > > > > 162 KSP preconditioned resid norm 3.423484857849e-06 true resid norm > 2.490723208298e-04 ||r(i)||/||b|| 3.381496356025e-05 > > > > 163 KSP preconditioned resid norm 3.383655922943e-06 true resid norm > 2.490659981249e-04 ||r(i)||/||b|| 3.381410516686e-05 > > > > 164 KSP preconditioned resid norm 3.477197358452e-06 true resid norm > 2.490665979073e-04 ||r(i)||/||b|| 3.381418659549e-05 > > > > 165 KSP preconditioned resid norm 3.454672202601e-06 true resid norm > 2.490651358644e-04 ||r(i)||/||b|| 3.381398810323e-05 > > > > 166 KSP preconditioned resid norm 3.399075522566e-06 true resid norm > 2.490678159511e-04 ||r(i)||/||b|| 3.381435196154e-05 > > > > 167 KSP preconditioned resid norm 3.305455787400e-06 true resid norm > 2.490651924523e-04 ||r(i)||/||b|| 3.381399578581e-05 > > > > 168 KSP preconditioned resid norm 3.368445533284e-06 true resid norm > 2.490688061735e-04 ||r(i)||/||b|| 3.381448639774e-05 > > > > 169 KSP preconditioned resid norm 2.981519724814e-06 true resid norm > 2.490676378334e-04 ||r(i)||/||b|| 3.381432777964e-05 > > > > 170 KSP preconditioned resid norm 3.034423065539e-06 true resid norm > 2.490694458885e-04 ||r(i)||/||b|| 3.381457324777e-05 > > > > 171 KSP preconditioned resid norm 2.885972780503e-06 true resid norm > 2.490688033729e-04 ||r(i)||/||b|| 3.381448601752e-05 > > > > 172 KSP preconditioned resid norm 2.892491075033e-06 true resid norm > 2.490692993765e-04 ||r(i)||/||b|| 3.381455335678e-05 > > > > 173 KSP preconditioned resid norm 2.921316177611e-06 true resid norm > 2.490697629787e-04 ||r(i)||/||b|| 3.381461629709e-05 > > > > 174 KSP preconditioned resid norm 2.999889222269e-06 true resid norm > 2.490707272626e-04 ||r(i)||/||b|| 3.381474721178e-05 > > > > 175 KSP preconditioned resid norm 2.975590207575e-06 true resid norm > 2.490685439925e-04 ||r(i)||/||b|| 3.381445080310e-05 > > > > 176 KSP preconditioned resid norm 2.983065843597e-06 true resid norm > 2.490701883671e-04 ||r(i)||/||b|| 3.381467404937e-05 > > > > 177 KSP preconditioned resid norm 2.965959610245e-06 true resid norm > 2.490711538630e-04 ||r(i)||/||b|| 3.381480512861e-05 > > > > 178 KSP preconditioned resid norm 3.005389788827e-06 true resid norm > 2.490751808095e-04 ||r(i)||/||b|| 3.381535184150e-05 > > > > 179 KSP preconditioned resid norm 2.956581668772e-06 true resid norm > 2.490653125636e-04 ||r(i)||/||b|| 3.381401209257e-05 > > > > 180 KSP preconditioned resid norm 2.937498883661e-06 true resid norm > 2.490666056653e-04 ||r(i)||/||b|| 3.381418764874e-05 > > > > 181 KSP preconditioned resid norm 2.913227475431e-06 true resid norm > 2.490682436979e-04 ||r(i)||/||b|| 3.381441003402e-05 > > > > 182 KSP preconditioned resid norm 3.048172862254e-06 true resid norm > 2.490719669872e-04 ||r(i)||/||b|| 3.381491552130e-05 > > > > 183 KSP preconditioned resid norm 3.023868104933e-06 true resid norm > 2.490648745555e-04 ||r(i)||/||b|| 3.381395262699e-05 > > > > 184 KSP preconditioned resid norm 2.985947506400e-06 true resid norm > 2.490638818852e-04 ||r(i)||/||b|| 3.381381785846e-05 > > > > 185 KSP preconditioned resid norm 2.840032055776e-06 true 
resid norm > 2.490701112392e-04 ||r(i)||/||b|| 3.381466357820e-05 > > > > 186 KSP preconditioned resid norm 2.229279683815e-06 true resid norm > 2.490609220680e-04 ||r(i)||/||b|| 3.381341602292e-05 > > > > 187 KSP preconditioned resid norm 2.441513276379e-06 true resid norm > 2.490674056899e-04 ||r(i)||/||b|| 3.381429626300e-05 > > > > 188 KSP preconditioned resid norm 2.467046864016e-06 true resid norm > 2.490691622632e-04 ||r(i)||/||b|| 3.381453474178e-05 > > > > 189 KSP preconditioned resid norm 2.482124586361e-06 true resid norm > 2.490664992339e-04 ||r(i)||/||b|| 3.381417319923e-05 > > > > 190 KSP preconditioned resid norm 2.470564926502e-06 true resid norm > 2.490617019713e-04 ||r(i)||/||b|| 3.381352190543e-05 > > > > 191 KSP preconditioned resid norm 2.457947086578e-06 true resid norm > 2.490628644250e-04 ||r(i)||/||b|| 3.381367972437e-05 > > > > 192 KSP preconditioned resid norm 2.469444741724e-06 true resid norm > 2.490639416335e-04 ||r(i)||/||b|| 3.381382597011e-05 > > > > 193 KSP preconditioned resid norm 2.469951525219e-06 true resid norm > 2.490599769764e-04 ||r(i)||/||b|| 3.381328771385e-05 > > > > 194 KSP preconditioned resid norm 2.467486786643e-06 true resid norm > 2.490630178622e-04 ||r(i)||/||b|| 3.381370055556e-05 > > > > 195 KSP preconditioned resid norm 2.409684391404e-06 true resid norm > 2.490640302606e-04 ||r(i)||/||b|| 3.381383800245e-05 > > > > 196 KSP preconditioned resid norm 2.456046691135e-06 true resid norm > 2.490637730235e-04 ||r(i)||/||b|| 3.381380307900e-05 > > > > 197 KSP preconditioned resid norm 2.300015653805e-06 true resid norm > 2.490615406913e-04 ||r(i)||/||b|| 3.381350000947e-05 > > > > 198 KSP preconditioned resid norm 2.238328275301e-06 true resid norm > 2.490647641246e-04 ||r(i)||/||b|| 3.381393763449e-05 > > > > 199 KSP preconditioned resid norm 2.317293820319e-06 true resid norm > 2.490641611282e-04 ||r(i)||/||b|| 3.381385576951e-05 > > > > 200 KSP preconditioned resid norm 2.359590971314e-06 true resid norm > 2.490685242974e-04 ||r(i)||/||b|| 3.381444812922e-05 > > > > 201 KSP preconditioned resid norm 2.311199691596e-06 true resid norm > 2.490656791753e-04 ||r(i)||/||b|| 3.381406186510e-05 > > > > 202 KSP preconditioned resid norm 2.328772904196e-06 true resid norm > 2.490651045523e-04 ||r(i)||/||b|| 3.381398385220e-05 > > > > 203 KSP preconditioned resid norm 2.332731604717e-06 true resid norm > 2.490649960574e-04 ||r(i)||/||b|| 3.381396912253e-05 > > > > 204 KSP preconditioned resid norm 2.357629383490e-06 true resid norm > 2.490686317727e-04 ||r(i)||/||b|| 3.381446272046e-05 > > > > 205 KSP preconditioned resid norm 2.374856180299e-06 true resid norm > 2.490645897176e-04 ||r(i)||/||b|| 3.381391395637e-05 > > > > 206 KSP preconditioned resid norm 2.340395514404e-06 true resid norm > 2.490618341127e-04 ||r(i)||/||b|| 3.381353984542e-05 > > > > 207 KSP preconditioned resid norm 2.314963680954e-06 true resid norm > 2.490676153984e-04 ||r(i)||/||b|| 3.381432473379e-05 > > > > 208 KSP preconditioned resid norm 2.448070953106e-06 true resid norm > 2.490644606776e-04 ||r(i)||/||b|| 3.381389643743e-05 > > > > 209 KSP preconditioned resid norm 2.428805110632e-06 true resid norm > 2.490635817597e-04 ||r(i)||/||b|| 3.381377711234e-05 > > > > 210 KSP preconditioned resid norm 2.537929937808e-06 true resid norm > 2.490680589404e-04 ||r(i)||/||b|| 3.381438495066e-05 > > > > 211 KSP preconditioned resid norm 2.515909029682e-06 true resid norm > 2.490687803038e-04 ||r(i)||/||b|| 3.381448288557e-05 > > > > 212 KSP preconditioned resid norm 
2.497907513266e-06 true resid norm > 2.490618016885e-04 ||r(i)||/||b|| 3.381353544340e-05 > > > > 213 KSP preconditioned resid norm 1.783501869502e-06 true resid norm > 2.490632647470e-04 ||r(i)||/||b|| 3.381373407354e-05 > > > > 214 KSP preconditioned resid norm 1.767420653144e-06 true resid norm > 2.490685569328e-04 ||r(i)||/||b|| 3.381445255992e-05 > > > > 215 KSP preconditioned resid norm 1.854926068272e-06 true resid norm > 2.490609365464e-04 ||r(i)||/||b|| 3.381341798856e-05 > > > > 216 KSP preconditioned resid norm 1.818308539774e-06 true resid norm > 2.490639142283e-04 ||r(i)||/||b|| 3.381382224948e-05 > > > > 217 KSP preconditioned resid norm 1.809431578070e-06 true resid norm > 2.490605125049e-04 ||r(i)||/||b|| 3.381336041915e-05 > > > > 218 KSP preconditioned resid norm 1.789862735999e-06 true resid norm > 2.490564024901e-04 ||r(i)||/||b|| 3.381280242859e-05 > > > > 219 KSP preconditioned resid norm 1.769239890163e-06 true resid norm > 2.490647825316e-04 ||r(i)||/||b|| 3.381394013349e-05 > > > > 220 KSP preconditioned resid norm 1.780760773109e-06 true resid norm > 2.490622606663e-04 ||r(i)||/||b|| 3.381359775589e-05 > > > > 221 KSP preconditioned resid norm 5.009024913368e-07 true resid norm > 2.490659101637e-04 ||r(i)||/||b|| 3.381409322492e-05 > > > > 222 KSP preconditioned resid norm 4.974450322799e-07 true resid norm > 2.490714287402e-04 ||r(i)||/||b|| 3.381484244693e-05 > > > > 223 KSP preconditioned resid norm 4.938819481519e-07 true resid norm > 2.490665661715e-04 ||r(i)||/||b|| 3.381418228693e-05 > > > > 224 KSP preconditioned resid norm 4.973231831266e-07 true resid norm > 2.490725000995e-04 ||r(i)||/||b|| 3.381498789855e-05 > > > > 225 KSP preconditioned resid norm 5.086864036771e-07 true resid norm > 2.490664132954e-04 ||r(i)||/||b|| 3.381416153192e-05 > > > > 226 KSP preconditioned resid norm 5.046954570561e-07 true resid norm > 2.490698772594e-04 ||r(i)||/||b|| 3.381463181226e-05 > > > > 227 KSP preconditioned resid norm 5.086852920874e-07 true resid norm > 2.490703544723e-04 ||r(i)||/||b|| 3.381469660041e-05 > > > > 228 KSP preconditioned resid norm 5.182381756169e-07 true resid norm > 2.490665200032e-04 ||r(i)||/||b|| 3.381417601896e-05 > > > > 229 KSP preconditioned resid norm 5.261455182896e-07 true resid norm > 2.490697169472e-04 ||r(i)||/||b|| 3.381461004770e-05 > > > > 230 KSP preconditioned resid norm 5.265262522400e-07 true resid norm > 2.490726890541e-04 ||r(i)||/||b|| 3.381501355172e-05 > > > > 231 KSP preconditioned resid norm 5.220652263946e-07 true resid norm > 2.490689325236e-04 ||r(i)||/||b|| 3.381450355149e-05 > > > > 232 KSP preconditioned resid norm 5.256466259888e-07 true resid norm > 2.490694033989e-04 ||r(i)||/||b|| 3.381456747923e-05 > > > > 233 KSP preconditioned resid norm 5.443022648374e-07 true resid norm > 2.490650183144e-04 ||r(i)||/||b|| 3.381397214423e-05 > > > > 234 KSP preconditioned resid norm 5.562619006436e-07 true resid norm > 2.490764576883e-04 ||r(i)||/||b|| 3.381552519520e-05 > > > > 235 KSP preconditioned resid norm 5.998148629545e-07 true resid norm > 2.490714032716e-04 ||r(i)||/||b|| 3.381483898922e-05 > > > > 236 KSP preconditioned resid norm 6.498977322955e-07 true resid norm > 2.490650270144e-04 ||r(i)||/||b|| 3.381397332537e-05 > > > > 237 KSP preconditioned resid norm 6.503686003429e-07 true resid norm > 2.490706976108e-04 ||r(i)||/||b|| 3.381474318615e-05 > > > > 238 KSP preconditioned resid norm 6.566719023119e-07 true resid norm > 2.490664107559e-04 ||r(i)||/||b|| 3.381416118714e-05 > > > > 239 KSP preconditioned 
resid norm 6.549737473208e-07 true resid norm > 2.490721547909e-04 ||r(i)||/||b|| 3.381494101821e-05 > > > > 240 KSP preconditioned resid norm 6.616898981418e-07 true resid norm > 2.490679659838e-04 ||r(i)||/||b|| 3.381437233053e-05 > > > > 241 KSP preconditioned resid norm 6.829917691021e-07 true resid norm > 2.490728328614e-04 ||r(i)||/||b|| 3.381503307553e-05 > > > > 242 KSP preconditioned resid norm 7.030239869389e-07 true resid norm > 2.490706345115e-04 ||r(i)||/||b|| 3.381473461955e-05 > > > > 243 KSP preconditioned resid norm 7.018435683340e-07 true resid norm > 2.490650978460e-04 ||r(i)||/||b|| 3.381398294172e-05 > > > > 244 KSP preconditioned resid norm 7.058047080376e-07 true resid norm > 2.490685975642e-04 ||r(i)||/||b|| 3.381445807618e-05 > > > > 245 KSP preconditioned resid norm 6.896300385099e-07 true resid norm > 2.490708566380e-04 ||r(i)||/||b|| 3.381476477625e-05 > > > > 246 KSP preconditioned resid norm 7.093960074437e-07 true resid norm > 2.490667427871e-04 ||r(i)||/||b|| 3.381420626490e-05 > > > > 247 KSP preconditioned resid norm 7.817121711853e-07 true resid norm > 2.490692299030e-04 ||r(i)||/||b|| 3.381454392480e-05 > > > > 248 KSP preconditioned resid norm 7.976109778309e-07 true resid norm > 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > 249 KSP preconditioned resid norm 7.855322750445e-07 true resid norm > 2.490720861966e-04 ||r(i)||/||b|| 3.381493170560e-05 > > > > 250 KSP preconditioned resid norm 7.778531114042e-07 true resid norm > 2.490673235034e-04 ||r(i)||/||b|| 3.381428510506e-05 > > > > 251 KSP preconditioned resid norm 7.848682182070e-07 true resid norm > 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > 252 KSP preconditioned resid norm 7.967291867330e-07 true resid norm > 2.490724820229e-04 ||r(i)||/||b|| 3.381498544442e-05 > > > > 253 KSP preconditioned resid norm 7.865012959525e-07 true resid norm > 2.490666662028e-04 ||r(i)||/||b|| 3.381419586754e-05 > > > > 254 KSP preconditioned resid norm 7.656025385804e-07 true resid norm > 2.490686283214e-04 ||r(i)||/||b|| 3.381446225190e-05 > > > > 255 KSP preconditioned resid norm 7.757018653468e-07 true resid norm > 2.490655983763e-04 ||r(i)||/||b|| 3.381405089553e-05 > > > > 256 KSP preconditioned resid norm 6.686490372981e-07 true resid norm > 2.490715698964e-04 ||r(i)||/||b|| 3.381486161081e-05 > > > > 257 KSP preconditioned resid norm 6.596005109428e-07 true resid norm > 2.490666403003e-04 ||r(i)||/||b|| 3.381419235092e-05 > > > > 258 KSP preconditioned resid norm 6.681742296333e-07 true resid norm > 2.490683725835e-04 ||r(i)||/||b|| 3.381442753198e-05 > > > > 259 KSP preconditioned resid norm 1.089245482033e-06 true resid norm > 2.490688086568e-04 ||r(i)||/||b|| 3.381448673488e-05 > > > > 260 KSP preconditioned resid norm 1.099844873189e-06 true resid norm > 2.490690703265e-04 ||r(i)||/||b|| 3.381452226011e-05 > > > > 261 KSP preconditioned resid norm 1.112925540869e-06 true resid norm > 2.490664481058e-04 ||r(i)||/||b|| 3.381416625790e-05 > > > > 262 KSP preconditioned resid norm 1.113056910480e-06 true resid norm > 2.490658753273e-04 ||r(i)||/||b|| 3.381408849541e-05 > > > > 263 KSP preconditioned resid norm 1.104801535149e-06 true resid norm > 2.490736510776e-04 ||r(i)||/||b|| 3.381514415953e-05 > > > > 264 KSP preconditioned resid norm 1.158709147873e-06 true resid norm > 2.490607531152e-04 ||r(i)||/||b|| 3.381339308528e-05 > > > > 265 KSP preconditioned resid norm 1.178985740182e-06 true resid norm > 2.490727895619e-04 ||r(i)||/||b|| 3.381502719703e-05 > > > > 266 KSP 
preconditioned resid norm 1.165130533478e-06 true resid norm > 2.490639076693e-04 ||r(i)||/||b|| 3.381382135901e-05 > > > > 267 KSP preconditioned resid norm 1.181364114499e-06 true resid norm > 2.490667871436e-04 ||r(i)||/||b|| 3.381421228690e-05 > > > > 268 KSP preconditioned resid norm 1.170295348543e-06 true resid norm > 2.490662613306e-04 ||r(i)||/||b|| 3.381414090063e-05 > > > > 269 KSP preconditioned resid norm 1.213243016230e-06 true resid norm > 2.490666173719e-04 ||r(i)||/||b|| 3.381418923808e-05 > > > > 270 KSP preconditioned resid norm 1.239691953997e-06 true resid norm > 2.490678323197e-04 ||r(i)||/||b|| 3.381435418381e-05 > > > > 271 KSP preconditioned resid norm 1.219891740100e-06 true resid norm > 2.490625009256e-04 ||r(i)||/||b|| 3.381363037437e-05 > > > > 272 KSP preconditioned resid norm 1.231321334346e-06 true resid norm > 2.490659733696e-04 ||r(i)||/||b|| 3.381410180599e-05 > > > > 273 KSP preconditioned resid norm 1.208183234158e-06 true resid norm > 2.490685987255e-04 ||r(i)||/||b|| 3.381445823385e-05 > > > > 274 KSP preconditioned resid norm 1.211768545589e-06 true resid norm > 2.490671548953e-04 ||r(i)||/||b|| 3.381426221421e-05 > > > > 275 KSP preconditioned resid norm 1.209433459842e-06 true resid norm > 2.490669016096e-04 ||r(i)||/||b|| 3.381422782722e-05 > > > > 276 KSP preconditioned resid norm 1.223729184405e-06 true resid norm > 2.490658128014e-04 ||r(i)||/||b|| 3.381408000666e-05 > > > > 277 KSP preconditioned resid norm 1.243915201868e-06 true resid norm > 2.490693375756e-04 ||r(i)||/||b|| 3.381455854282e-05 > > > > 278 KSP preconditioned resid norm 1.231994655529e-06 true resid norm > 2.490682988311e-04 ||r(i)||/||b|| 3.381441751910e-05 > > > > 279 KSP preconditioned resid norm 1.227930683777e-06 true resid norm > 2.490667825866e-04 ||r(i)||/||b|| 3.381421166823e-05 > > > > 280 KSP preconditioned resid norm 1.193458846469e-06 true resid norm > 2.490687366117e-04 ||r(i)||/||b|| 3.381447695378e-05 > > > > 281 KSP preconditioned resid norm 1.217089059805e-06 true resid norm > 2.490674797371e-04 ||r(i)||/||b|| 3.381430631591e-05 > > > > 282 KSP preconditioned resid norm 1.249318287709e-06 true resid norm > 2.490662866951e-04 ||r(i)||/||b|| 3.381414434420e-05 > > > > 283 KSP preconditioned resid norm 1.183320029547e-06 true resid norm > 2.490645783630e-04 ||r(i)||/||b|| 3.381391241482e-05 > > > > 284 KSP preconditioned resid norm 1.174730603102e-06 true resid norm > 2.490686881647e-04 ||r(i)||/||b|| 3.381447037643e-05 > > > > 285 KSP preconditioned resid norm 1.175838261923e-06 true resid norm > 2.490665969300e-04 ||r(i)||/||b|| 3.381418646281e-05 > > > > 286 KSP preconditioned resid norm 1.188946188368e-06 true resid norm > 2.490661974622e-04 ||r(i)||/||b|| 3.381413222961e-05 > > > > 287 KSP preconditioned resid norm 1.177848565707e-06 true resid norm > 2.490660236206e-04 ||r(i)||/||b|| 3.381410862824e-05 > > > > 288 KSP preconditioned resid norm 1.200075508281e-06 true resid norm > 2.490645353536e-04 ||r(i)||/||b|| 3.381390657571e-05 > > > > 289 KSP preconditioned resid norm 1.184589570618e-06 true resid norm > 2.490664920355e-04 ||r(i)||/||b|| 3.381417222195e-05 > > > > 290 KSP preconditioned resid norm 1.221114703873e-06 true resid norm > 2.490670597538e-04 ||r(i)||/||b|| 3.381424929746e-05 > > > > 291 KSP preconditioned resid norm 1.249479658256e-06 true resid norm > 2.490641582876e-04 ||r(i)||/||b|| 3.381385538385e-05 > > > > 292 KSP preconditioned resid norm 1.245768496850e-06 true resid norm > 2.490704480588e-04 ||r(i)||/||b|| 3.381470930606e-05 > > > 
> 293 KSP preconditioned resid norm 1.243742607953e-06 true resid norm > 2.490649690604e-04 ||r(i)||/||b|| 3.381396545733e-05 > > > > 294 KSP preconditioned resid norm 1.342758483339e-06 true resid norm > 2.490676207432e-04 ||r(i)||/||b|| 3.381432545942e-05 > > > > 295 KSP preconditioned resid norm 1.353816099600e-06 true resid norm > 2.490695263153e-04 ||r(i)||/||b|| 3.381458416681e-05 > > > > 296 KSP preconditioned resid norm 1.343886763293e-06 true resid norm > 2.490673674307e-04 ||r(i)||/||b|| 3.381429106879e-05 > > > > 297 KSP preconditioned resid norm 1.355511022815e-06 true resid norm > 2.490686565995e-04 ||r(i)||/||b|| 3.381446609103e-05 > > > > 298 KSP preconditioned resid norm 1.347247627243e-06 true resid norm > 2.490696287707e-04 ||r(i)||/||b|| 3.381459807652e-05 > > > > 299 KSP preconditioned resid norm 1.414742595618e-06 true resid norm > 2.490749815091e-04 ||r(i)||/||b|| 3.381532478374e-05 > > > > 300 KSP preconditioned resid norm 1.418560683189e-06 true resid norm > 2.490721501153e-04 ||r(i)||/||b|| 3.381494038343e-05 > > > > 301 KSP preconditioned resid norm 1.416276404923e-06 true resid norm > 2.490689576447e-04 ||r(i)||/||b|| 3.381450696203e-05 > > > > 302 KSP preconditioned resid norm 1.431448272112e-06 true resid norm > 2.490688812701e-04 ||r(i)||/||b|| 3.381449659312e-05 > > > > 303 KSP preconditioned resid norm 1.446154958969e-06 true resid norm > 2.490727536322e-04 ||r(i)||/||b|| 3.381502231909e-05 > > > > 304 KSP preconditioned resid norm 1.468860617921e-06 true resid norm > 2.490692363788e-04 ||r(i)||/||b|| 3.381454480397e-05 > > > > 305 KSP preconditioned resid norm 1.627595214971e-06 true resid norm > 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 > > > > 306 KSP preconditioned resid norm 1.614384672893e-06 true resid norm > 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 > > > > 307 KSP preconditioned resid norm 1.605568020532e-06 true resid norm > 2.490699757693e-04 ||r(i)||/||b|| 3.381464518632e-05 > > > > 308 KSP preconditioned resid norm 1.617069685075e-06 true resid norm > 2.490649282923e-04 ||r(i)||/||b|| 3.381395992249e-05 > > > > 309 KSP preconditioned resid norm 1.654297792738e-06 true resid norm > 2.490644766626e-04 ||r(i)||/||b|| 3.381389860760e-05 > > > > 310 KSP preconditioned resid norm 1.587528143215e-06 true resid norm > 2.490696752096e-04 ||r(i)||/||b|| 3.381460438124e-05 > > > > 311 KSP preconditioned resid norm 1.662782022388e-06 true resid norm > 2.490699317737e-04 ||r(i)||/||b|| 3.381463921332e-05 > > > > 312 KSP preconditioned resid norm 1.618211471748e-06 true resid norm > 2.490735831308e-04 ||r(i)||/||b|| 3.381513493483e-05 > > > > 313 KSP preconditioned resid norm 1.609074961921e-06 true resid norm > 2.490679566436e-04 ||r(i)||/||b|| 3.381437106247e-05 > > > > 314 KSP preconditioned resid norm 1.548068942878e-06 true resid norm > 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 > > > > 315 KSP preconditioned resid norm 1.526718322150e-06 true resid norm > 2.490619832967e-04 ||r(i)||/||b|| 3.381356009919e-05 > > > > 316 KSP preconditioned resid norm 1.553150959105e-06 true resid norm > 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 > > > > 317 KSP preconditioned resid norm 1.615015320906e-06 true resid norm > 2.490672348079e-04 ||r(i)||/||b|| 3.381427306343e-05 > > > > 318 KSP preconditioned resid norm 1.602904469797e-06 true resid norm > 2.490696731006e-04 ||r(i)||/||b|| 3.381460409491e-05 > > > > 319 KSP preconditioned resid norm 1.538140323073e-06 true resid norm > 2.490722982494e-04 ||r(i)||/||b|| 
3.381496049466e-05 > > > > 320 KSP preconditioned resid norm 1.534779679430e-06 true resid norm > 2.490778499789e-04 ||r(i)||/||b|| 3.381571421763e-05 > > > > 321 KSP preconditioned resid norm 1.547155843355e-06 true resid norm > 2.490767612985e-04 ||r(i)||/||b|| 3.381556641442e-05 > > > > 322 KSP preconditioned resid norm 1.422137008870e-06 true resid norm > 2.490737676309e-04 ||r(i)||/||b|| 3.381515998323e-05 > > > > 323 KSP preconditioned resid norm 1.403072558954e-06 true resid norm > 2.490741361870e-04 ||r(i)||/||b|| 3.381521001975e-05 > > > > 324 KSP preconditioned resid norm 1.373070436118e-06 true resid norm > 2.490742214990e-04 ||r(i)||/||b|| 3.381522160202e-05 > > > > 325 KSP preconditioned resid norm 1.359547585233e-06 true resid norm > 2.490792987570e-04 ||r(i)||/||b|| 3.381591090902e-05 > > > > 326 KSP preconditioned resid norm 1.370351913612e-06 true resid norm > 2.490727161158e-04 ||r(i)||/||b|| 3.381501722573e-05 > > > > 327 KSP preconditioned resid norm 1.365238666187e-06 true resid norm > 2.490716949642e-04 ||r(i)||/||b|| 3.381487859046e-05 > > > > 328 KSP preconditioned resid norm 1.369073373042e-06 true resid norm > 2.490807360288e-04 ||r(i)||/||b|| 3.381610603826e-05 > > > > 329 KSP preconditioned resid norm 1.426698981572e-06 true resid norm > 2.490791479521e-04 ||r(i)||/||b|| 3.381589043520e-05 > > > > 330 KSP preconditioned resid norm 1.445542403570e-06 true resid norm > 2.490775981409e-04 ||r(i)||/||b|| 3.381568002720e-05 > > > > 331 KSP preconditioned resid norm 1.464506963984e-06 true resid norm > 2.490740562430e-04 ||r(i)||/||b|| 3.381519916626e-05 > > > > 332 KSP preconditioned resid norm 1.461462964401e-06 true resid norm > 2.490768016856e-04 ||r(i)||/||b|| 3.381557189753e-05 > > > > 333 KSP preconditioned resid norm 1.476680847971e-06 true resid norm > 2.490744321516e-04 ||r(i)||/||b|| 3.381525020097e-05 > > > > 334 KSP preconditioned resid norm 1.459640372198e-06 true resid norm > 2.490788817993e-04 ||r(i)||/||b|| 3.381585430132e-05 > > > > 335 KSP preconditioned resid norm 1.790770882365e-06 true resid norm > 2.490771711471e-04 ||r(i)||/||b|| 3.381562205697e-05 > > > > 336 KSP preconditioned resid norm 1.803770155018e-06 true resid norm > 2.490768953858e-04 ||r(i)||/||b|| 3.381558461860e-05 > > > > 337 KSP preconditioned resid norm 1.787821255995e-06 true resid norm > 2.490767985676e-04 ||r(i)||/||b|| 3.381557147421e-05 > > > > 338 KSP preconditioned resid norm 1.749912220831e-06 true resid norm > 2.490760198704e-04 ||r(i)||/||b|| 3.381546575545e-05 > > > > 339 KSP preconditioned resid norm 1.802915839010e-06 true resid norm > 2.490815556273e-04 ||r(i)||/||b|| 3.381621730993e-05 > > > > 340 KSP preconditioned resid norm 1.800777670709e-06 true resid norm > 2.490823909286e-04 ||r(i)||/||b|| 3.381633071347e-05 > > > > 341 KSP preconditioned resid norm 1.962516327690e-06 true resid norm > 2.490773477410e-04 ||r(i)||/||b|| 3.381564603199e-05 > > > > 342 KSP preconditioned resid norm 1.981726465132e-06 true resid norm > 2.490769116884e-04 ||r(i)||/||b|| 3.381558683191e-05 > > > > 343 KSP preconditioned resid norm 1.963419167052e-06 true resid norm > 2.490764009914e-04 ||r(i)||/||b|| 3.381551749783e-05 > > > > 344 KSP preconditioned resid norm 1.992082169278e-06 true resid norm > 2.490806829883e-04 ||r(i)||/||b|| 3.381609883728e-05 > > > > 345 KSP preconditioned resid norm 1.981005134253e-06 true resid norm > 2.490748677339e-04 ||r(i)||/||b|| 3.381530933721e-05 > > > > 346 KSP preconditioned resid norm 1.959802663114e-06 true resid norm > 2.490773752317e-04 
||r(i)||/||b|| 3.381564976423e-05
> > > >
> > > > On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley wrote:
> > > > On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang wrote:
> > > > ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
> > > > ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);
> > > >
> > > > ierr = VecAssemblyBegin(x);
> > > > ierr = VecAssemblyEnd(x);
> > > > This is probably unnecessary
> > > >
> > > > ierr = VecAssemblyBegin(b);
> > > > ierr = VecAssemblyEnd(b);
> > > > This is probably unnecessary
> > > >
> > > > ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp);
> > > > ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8
> > > > Is your rhs consistent with this nullspace?
> > > >
> > > > // KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
> > > > KSPSetOperators(ksp,A,A);
> > > >
> > > > KSPSetType(ksp,KSPBCGS);
> > > >
> > > > KSPSetComputeSingularValues(ksp, PETSC_TRUE);
> > > > #if defined(__HYPRE__)
> > > > KSPGetPC(ksp, &pc);
> > > > PCSetType(pc, PCHYPRE);
> > > > PCHYPRESetType(pc,"boomeramg");
> > > > This is terribly unnecessary. You just use
> > > >
> > > > -pc_type hypre -pc_hypre_type boomeramg
> > > >
> > > > or
> > > >
> > > > -pc_type gamg
> > > >
> > > > #else
> > > > KSPSetType(ksp,KSPBCGSL);
> > > > KSPBCGSLSetEll(ksp,2);
> > > > #endif /* defined(__HYPRE__) */
> > > >
> > > > KSPSetFromOptions(ksp);
> > > > KSPSetUp(ksp);
> > > >
> > > > ierr = KSPSolve(ksp,b,x);
> > > >
> > > > command line
> > > >
> > > > You did not provide any of what I asked for in the previous mail.
> > > >
> > > > Matt
> > > >
> > > > On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley wrote:
> > > > On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang wrote:
> > > > hi,
> > > >
> > > > I implemented the HYPRE preconditioner for my study because, without a preconditioner, the PETSc solver takes thousands of iterations to converge for a fine-grid simulation.
> > > >
> > > > with HYPRE, depending on the parallel partition, it will take HYPRE forever to do anything. Observation of the output file is that the simulation is hanging with no output.
> > > >
> > > > Any idea what happened? Will post a snippet of code.
> > > >
> > > > 1) For any question about convergence, we need to see the output of
> > > >
> > > > -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason
> > > >
> > > > 2) Hypre has many preconditioners, which one are you talking about
> > > >
> > > > 3) PETSc has some preconditioners in common with Hypre, like AMG
> > > >
> > > > Thanks,
> > > >
> > > > Matt
> > > >
> > > > --
> > > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> > > > -- Norbert Wiener
> > > >
> > > > https://www.cse.buffalo.edu/~knepley/

--
Hao Zhang
Dept. of Applied Mathematics and Statistics,
Stony Brook University,
Stony Brook, New York, 11790

From bsmith at mcs.anl.gov  Sat Oct 21 23:00:05 2017
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Sat, 21 Oct 2017 23:00:05 -0500
Subject: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To:
References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov>
 <64D7FC90-EBE8-409C-B300-04A118155A98@mcs.anl.gov>
Message-ID:


> On Oct 21, 2017, at 10:50 PM, Hao Zhang wrote:
>
> the incompressible NS solver algorithm calls the PETSc solver at different stages of each time step. The one where you said "This is good. 12 digit reduction" is after the initial pressure solve, in which HYPRE usually does not give good convergence, so the fall-back solver GMRES is called afterwards.

   Hmm, I don't understand. hypre should do well on a pressure solve. In fact, very well.

> > Barry, you were mentioning that I could have a wrong nullspace. That particular solve is meant to give an initial pressure profile for a 3D incompressible NS simulation using all-Neumann boundary conditions. Could you give some insight into how to test whether I have a wrong nullspace?

   -ksp_test_null_space

   But if your null space comes from all-Neumann boundary conditions then it is likely not wrong.

   Barry
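As an illustration of the null-space handling Barry describes above, here is a minimal sketch in PETSc C. This is not code from this thread; it assumes an assembled Mat A and Vec b, and PETSc 3.6+ calling conventions (where MatNullSpaceRemove takes two arguments). It creates the constant null space of an all-Neumann operator, optionally tests it against A, attaches it, and projects the right-hand side so the system is consistent:

    MatNullSpace   nullsp;
    PetscBool      isNull;
    PetscErrorCode ierr;

    /* the null space of an all-Neumann Poisson operator is the constant vector:
       has_cnst = PETSC_TRUE, no additional basis vectors */
    ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,NULL,&nullsp);CHKERRQ(ierr);

    /* sanity check: does A really annihilate the constant vector? */
    ierr = MatNullSpaceTest(nullsp,A,&isNull);CHKERRQ(ierr);
    if (!isNull) {
      ierr = PetscPrintf(PETSC_COMM_WORLD,"Warning: constant vector is not in the null space of A\n");CHKERRQ(ierr);
    }

    /* tell the solver about the null space ... */
    ierr = MatSetNullSpace(A,nullsp);CHKERRQ(ierr);

    /* ... and remove the null-space component from b so the right-hand side is consistent */
    ierr = MatNullSpaceRemove(nullsp,b);CHKERRQ(ierr);

    ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);

If I understand Barry's suggestion correctly, the -ksp_test_null_space option performs a check of the same kind as MatNullSpaceTest on the attached null space at solve time.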
> > Thanks!
> >
> > On Sat, Oct 21, 2017 at 11:41 PM, Barry Smith wrote:
> >
> >    This is good. You get more than 12 digit reduction in the true residual norm. This is good AMG convergence. Expected when everything goes well.
> >
> >    What is different in this case from the previous case that does not converge reasonably?
> >
> >    Barry
> >
> > > On Oct 21, 2017, at 9:29 PM, Hao Zhang wrote:
> > >
> > > Barry, please advise what you make of this? This is the Poisson solver with all-Neumann BCs, 3D case; a finite difference scheme was used.
> > > Thanks! I'm in learning mode.
> >
> > KSP Object: 24 MPI processes
> >   type: bcgs
> >   maximum iterations=40000, initial guess is zero
> >   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
> >   left preconditioning
> >   using PRECONDITIONED norm type for convergence test
> > PC Object: 24 MPI processes
> >   type: hypre
> >     HYPRE BoomerAMG preconditioning
> >       Cycle type V
> >       Maximum number of levels 25
> >       Maximum number of iterations PER hypre call 1
> >       Convergence tolerance PER hypre call 0.
> >       Threshold for strong coupling 0.25
> >       Interpolation truncation factor 0.
> >       Interpolation: max elements per row 0
> >       Number of levels of aggressive coarsening 0
> >       Number of paths for aggressive coarsening 1
> >       Maximum row sums 0.9
> >       Sweeps down 1
> >       Sweeps up 1
> >       Sweeps on coarse 1
> >       Relax down symmetric-SOR/Jacobi
> >       Relax up symmetric-SOR/Jacobi
> >       Relax on coarse Gaussian-elimination
> >       Relax weight (all) 1.
> >       Outer relax weight (all) 1.
> >       Using CF-relaxation
> >       Not using more complex smoothers.
> >       Measure type local
> >       Coarsen type Falgout
> >       Interpolation type classical
> >   linear system matrix = precond matrix:
> >   Mat Object: A 24 MPI processes
> >     type: mpiaij
> >     rows=497664, cols=497664
> >     total: nonzeros=3363552, allocated nonzeros=6967296
> >     total number of mallocs used during MatSetValues calls =0
> >       has attached null space
> >       not using I-node (on process 0) routines
> > 0 KSP preconditioned resid norm 2.697270170623e-02 true resid norm 9.637159071207e+00 ||r(i)||/||b|| 1.000000000000e+00
> > 1 KSP preconditioned resid norm 4.828857674609e-04 true resid norm 6.294379664645e-01 ||r(i)||/||b|| 6.531364293291e-02
> > 2 KSP preconditioned resid norm 4.533649595815e-06 true resid norm 1.135508605857e-02 ||r(i)||/||b|| 1.178260727531e-03
> > 3 KSP preconditioned resid norm 1.131704082606e-07 true resid norm 1.196393029874e-04 ||r(i)||/||b|| 1.241437462051e-05
> > 4 KSP preconditioned resid norm 3.866281379569e-10 true resid norm 5.121520801846e-07 ||r(i)||/||b|| 5.314347064320e-08
> > 5 KSP preconditioned resid norm 1.070114785241e-11 true resid norm 1.203693733135e-08 ||r(i)||/||b|| 1.249013038221e-09
> > 6 KSP preconditioned resid norm 2.578780418765e-14 true resid norm 6.020297525927e-11 ||r(i)||/||b|| 6.246962908306e-12
> > 7 KSP preconditioned resid norm 8.691764190203e-16 true resid norm 1.866088098154e-12 ||r(i)||/||b|| 1.936346680973e-13
> > Linear solve converged due to CONVERGED_ATOL iterations 7
> >
> > On Sat, Oct 21, 2017 at 6:53 PM, Barry Smith wrote:
> >
> > > On Oct 21, 2017, at 5:47 PM, Hao Zhang wrote:
> > >
> > > hi, Barry:
> > > what do you mean absurd by setting tolerance =1e-14?
> >
> >    Trying to decrease the initial residual norm down by a factor of 1e-14 with an iterative method (or even a direct method) is unrealistic (usually unachievable) and almost never necessary. You are requiring || r_n || < 1.e-14 || r_0 || when with double precision numbers you only have roughly 14 decimal digits total to compute with. Round off alone will lead to differences far larger than 1e-14.
> >
> >    If you are using the solver in the context of a nonlinear problem (i.e. inside Newton's method) then 1.e-6 is generally more than plenty to get quadratic convergence of Newton's method.
> >
> >    If you are solving a linear problem then it is extremely likely that errors due to discretization (from the finite element method etc.) and the model are much, much larger than even 1.e-8.
> >
> >    So, in summary
> >
> >    1.e-14 is probably unachievable
> >
> >    1.e-14 is almost for sure not needed.
> >
> >    Barry
> > >
> > > On Sat, Oct 21, 2017 at 18:42 Barry Smith wrote:
> > >
> > >    Run with -ksp_view_mat binary -ksp_view_rhs binary and send the resulting output file called binaryoutput to petsc-maint at mcs.anl.gov
> > >
> > >    Note you can also use -ksp_type gmres with hypre; there is unlikely to be a reason to use bcgs
> > >
> > >    BTW: tolerances: relative=1e-14 is absurd
> > >
> > >    My guess is your null space is incorrect.
> > >
> > > > On Oct 21, 2017, at 4:34 PM, Hao Zhang wrote:
> > > >
> > > > if this solver doesn't converge, I have a fall-back solution, which uses the GMRES solver. This setup is fine with me. I just want to know if HYPRE is a reliable solution for me, or whether I will have to go without a preconditioner.
> > > >
> > > > Thanks!
> > > >
> > > > On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang wrote:
> > > > this is a serial run, still dumping output. Parallel is more or less the same.
> > > > > > > > KSP Object: 1 MPI processes > > > > type: bcgs > > > > maximum iterations=40000, initial guess is zero > > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. > > > > left preconditioning > > > > using PRECONDITIONED norm type for convergence test > > > > PC Object: 1 MPI processes > > > > type: hypre > > > > HYPRE BoomerAMG preconditioning > > > > Cycle type V > > > > Maximum number of levels 25 > > > > Maximum number of iterations PER hypre call 1 > > > > Convergence tolerance PER hypre call 0. > > > > Threshold for strong coupling 0.25 > > > > Interpolation truncation factor 0. > > > > Interpolation: max elements per row 0 > > > > Number of levels of aggressive coarsening 0 > > > > Number of paths for aggressive coarsening 1 > > > > Maximum row sums 0.9 > > > > Sweeps down 1 > > > > Sweeps up 1 > > > > Sweeps on coarse 1 > > > > Relax down symmetric-SOR/Jacobi > > > > Relax up symmetric-SOR/Jacobi > > > > Relax on coarse Gaussian-elimination > > > > Relax weight (all) 1. > > > > Outer relax weight (all) 1. > > > > Using CF-relaxation > > > > Not using more complex smoothers. > > > > Measure type local > > > > Coarsen type Falgout > > > > Interpolation type classical > > > > linear system matrix = precond matrix: > > > > Mat Object: A 1 MPI processes > > > > type: seqaij > > > > rows=497664, cols=497664 > > > > total: nonzeros=3363552, allocated nonzeros=3483648 > > > > total number of mallocs used during MatSetValues calls =0 > > > > has attached null space > > > > not using I-node routines > > > > 0 KSP preconditioned resid norm 1.630377897956e+01 true resid norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00 > > > > 1 KSP preconditioned resid norm 3.815819909205e-02 true resid norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01 > > > > 2 KSP preconditioned resid norm 4.312671277701e-04 true resid norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03 > > > > 3 KSP preconditioned resid norm 3.011875330569e-04 true resid norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05 > > > > 4 KSP preconditioned resid norm 3.011783821295e-04 true resid norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05 > > > > 5 KSP preconditioned resid norm 3.011783818372e-04 true resid norm 2.504053673004e-04 ||r(i)||/||b|| 3.399594279422e-05 > > > > 6 KSP preconditioned resid norm 3.011783818375e-04 true resid norm 2.503984078890e-04 ||r(i)||/||b|| 3.399499795925e-05 > > > > 7 KSP preconditioned resid norm 3.011783887442e-04 true resid norm 2.504121501772e-04 ||r(i)||/||b|| 3.399686366224e-05 > > > > 8 KSP preconditioned resid norm 3.010913654181e-04 true resid norm 2.504150259031e-04 ||r(i)||/||b|| 3.399725408124e-05 > > > > 9 KSP preconditioned resid norm 3.006520688232e-04 true resid norm 2.504061607382e-04 ||r(i)||/||b|| 3.399605051423e-05 > > > > 10 KSP preconditioned resid norm 3.007309991942e-04 true resid norm 2.503843638523e-04 ||r(i)||/||b|| 3.399309128978e-05 > > > > 11 KSP preconditioned resid norm 3.015946168077e-04 true resid norm 2.503644677844e-04 ||r(i)||/||b|| 3.399039012728e-05 > > > > 12 KSP preconditioned resid norm 2.956643907377e-04 true resid norm 2.503863046509e-04 ||r(i)||/||b|| 3.399335477965e-05 > > > > 13 KSP preconditioned resid norm 2.997992358936e-04 true resid norm 2.504336903093e-04 ||r(i)||/||b|| 3.399978802886e-05 > > > > 14 KSP preconditioned resid norm 2.481415420420e-05 true resid norm 2.491591201250e-04 ||r(i)||/||b|| 3.382674774806e-05 > > > > 15 KSP preconditioned resid norm 2.615494786181e-05 true resid norm 
2.490353237273e-04 ||r(i)||/||b|| 3.380994069915e-05 > > > > 16 KSP preconditioned resid norm 2.645126692130e-05 true resid norm 2.490535523344e-04 ||r(i)||/||b|| 3.381241548111e-05 > > > > 17 KSP preconditioned resid norm 2.667223026209e-05 true resid norm 2.490482602536e-04 ||r(i)||/||b|| 3.381169700898e-05 > > > > 18 KSP preconditioned resid norm 2.650813432116e-05 true resid norm 2.490473169014e-04 ||r(i)||/||b|| 3.381156893606e-05 > > > > 19 KSP preconditioned resid norm 2.613309555449e-05 true resid norm 2.490465633690e-04 ||r(i)||/||b|| 3.381146663375e-05 > > > > 20 KSP preconditioned resid norm 2.644160446804e-05 true resid norm 2.490532739949e-04 ||r(i)||/||b|| 3.381237769272e-05 > > > > 21 KSP preconditioned resid norm 2.635987608975e-05 true resid norm 2.490499548926e-04 ||r(i)||/||b|| 3.381192707933e-05 > > > > 22 KSP preconditioned resid norm 2.640527129095e-05 true resid norm 2.490594066529e-04 ||r(i)||/||b|| 3.381321028466e-05 > > > > 23 KSP preconditioned resid norm 2.627505117691e-05 true resid norm 2.490550162585e-04 ||r(i)||/||b|| 3.381261422875e-05 > > > > 24 KSP preconditioned resid norm 2.642659196388e-05 true resid norm 2.490504347640e-04 ||r(i)||/||b|| 3.381199222842e-05 > > > > 25 KSP preconditioned resid norm 2.659432190695e-05 true resid norm 2.490510775152e-04 ||r(i)||/||b|| 3.381207949065e-05 > > > > 26 KSP preconditioned resid norm 2.687918062951e-05 true resid norm 2.490518882015e-04 ||r(i)||/||b|| 3.381218955237e-05 > > > > 27 KSP preconditioned resid norm 2.662909048432e-05 true resid norm 2.490446263285e-04 ||r(i)||/||b|| 3.381120365409e-05 > > > > 28 KSP preconditioned resid norm 2.085466483199e-05 true resid norm 2.490131612366e-04 ||r(i)||/||b|| 3.380693183886e-05 > > > > 29 KSP preconditioned resid norm 2.098541330282e-05 true resid norm 2.490126933398e-04 ||r(i)||/||b|| 3.380686831549e-05 > > > > 30 KSP preconditioned resid norm 2.175345180286e-05 true resid norm 2.490098852429e-04 ||r(i)||/||b|| 3.380648707805e-05 > > > > 31 KSP preconditioned resid norm 2.182182437676e-05 true resid norm 2.490028301020e-04 ||r(i)||/||b|| 3.380552924648e-05 > > > > 32 KSP preconditioned resid norm 2.152970404369e-05 true resid norm 2.490089939838e-04 ||r(i)||/||b|| 3.380636607747e-05 > > > > 33 KSP preconditioned resid norm 2.187932450016e-05 true resid norm 2.490085293931e-04 ||r(i)||/||b|| 3.380630300295e-05 > > > > 34 KSP preconditioned resid norm 2.207255875067e-05 true resid norm 2.490039036092e-04 ||r(i)||/||b|| 3.380567498971e-05 > > > > 35 KSP preconditioned resid norm 2.205060279701e-05 true resid norm 2.490101636150e-04 ||r(i)||/||b|| 3.380652487086e-05 > > > > 36 KSP preconditioned resid norm 2.168654200416e-05 true resid norm 2.490091609876e-04 ||r(i)||/||b|| 3.380638875052e-05 > > > > 37 KSP preconditioned resid norm 2.164521042361e-05 true resid norm 2.490083143913e-04 ||r(i)||/||b|| 3.380627381352e-05 > > > > 38 KSP preconditioned resid norm 2.154429063973e-05 true resid norm 2.490075485470e-04 ||r(i)||/||b|| 3.380616983972e-05 > > > > 39 KSP preconditioned resid norm 2.165962086228e-05 true resid norm 2.490099695056e-04 ||r(i)||/||b|| 3.380649851786e-05 > > > > 40 KSP preconditioned resid norm 2.153877616091e-05 true resid norm 2.490090652619e-04 ||r(i)||/||b|| 3.380637575444e-05 > > > > 41 KSP preconditioned resid norm 2.347651187611e-05 true resid norm 2.490233544624e-04 ||r(i)||/||b|| 3.380831570825e-05 > > > > 42 KSP preconditioned resid norm 2.352860162514e-05 true resid norm 2.490191394202e-04 ||r(i)||/||b|| 3.380774345879e-05 > > > > 43 KSP 
preconditioned resid norm 2.312377506928e-05 true resid norm 2.490209491359e-04 ||r(i)||/||b|| 3.380798915237e-05 > > > > 44 KSP preconditioned resid norm 2.295770973533e-05 true resid norm 2.490178136759e-04 ||r(i)||/||b|| 3.380756347093e-05 > > > > 45 KSP preconditioned resid norm 2.833646456041e-05 true resid norm 2.489991602651e-04 ||r(i)||/||b|| 3.380503101608e-05 > > > > 46 KSP preconditioned resid norm 2.760296424494e-05 true resid norm 2.490104320666e-04 ||r(i)||/||b|| 3.380656131682e-05 > > > > 47 KSP preconditioned resid norm 2.451504295239e-05 true resid norm 2.490241388672e-04 ||r(i)||/||b|| 3.380842220189e-05 > > > > 48 KSP preconditioned resid norm 2.512391514098e-05 true resid norm 2.490245923753e-04 ||r(i)||/||b|| 3.380848377180e-05 > > > > 49 KSP preconditioned resid norm 2.483419450528e-05 true resid norm 2.490273364402e-04 ||r(i)||/||b|| 3.380885631602e-05 > > > > 50 KSP preconditioned resid norm 2.507460538466e-05 true resid norm 2.490309488780e-04 ||r(i)||/||b|| 3.380934675371e-05 > > > > 51 KSP preconditioned resid norm 2.499708772881e-05 true resid norm 2.490300908170e-04 ||r(i)||/||b|| 3.380923026022e-05 > > > > 52 KSP preconditioned resid norm 1.059778259446e-05 true resid norm 2.489352833521e-04 ||r(i)||/||b|| 3.379635885420e-05 > > > > 53 KSP preconditioned resid norm 1.074975117060e-05 true resid norm 2.489294722901e-04 ||r(i)||/||b|| 3.379556992330e-05 > > > > 54 KSP preconditioned resid norm 1.095242219559e-05 true resid norm 2.489295454212e-04 ||r(i)||/||b|| 3.379557985184e-05 > > > > 55 KSP preconditioned resid norm 8.359999674720e-06 true resid norm 2.489673581944e-04 ||r(i)||/||b|| 3.380071345137e-05 > > > > 56 KSP preconditioned resid norm 8.368232998470e-06 true resid norm 2.489700421343e-04 ||r(i)||/||b|| 3.380107783281e-05 > > > > 57 KSP preconditioned resid norm 8.443378041101e-06 true resid norm 2.489702900875e-04 ||r(i)||/||b|| 3.380111149584e-05 > > > > 58 KSP preconditioned resid norm 8.647159584302e-06 true resid norm 2.489640805831e-04 ||r(i)||/||b|| 3.380026847095e-05 > > > > 59 KSP preconditioned resid norm 1.024742790737e-05 true resid norm 2.489447846660e-04 ||r(i)||/||b|| 3.379764878711e-05 > > > > 60 KSP preconditioned resid norm 1.033394118910e-05 true resid norm 2.489441404923e-04 ||r(i)||/||b|| 3.379756133175e-05 > > > > 61 KSP preconditioned resid norm 1.030066336446e-05 true resid norm 2.489399918556e-04 ||r(i)||/||b|| 3.379699809776e-05 > > > > 62 KSP preconditioned resid norm 1.029956398963e-05 true resid norm 2.489445295139e-04 ||r(i)||/||b|| 3.379761414674e-05 > > > > 63 KSP preconditioned resid norm 1.028190129002e-05 true resid norm 2.489456200527e-04 ||r(i)||/||b|| 3.379776220225e-05 > > > > 64 KSP preconditioned resid norm 9.878799185773e-06 true resid norm 2.489488742330e-04 ||r(i)||/||b|| 3.379820400160e-05 > > > > 65 KSP preconditioned resid norm 9.917711104174e-06 true resid norm 2.489478066593e-04 ||r(i)||/||b|| 3.379805906391e-05 > > > > 66 KSP preconditioned resid norm 1.003572019576e-05 true resid norm 2.489441995703e-04 ||r(i)||/||b|| 3.379756935240e-05 > > > > 67 KSP preconditioned resid norm 9.924487278236e-06 true resid norm 2.489475403451e-04 ||r(i)||/||b|| 3.379802290812e-05 > > > > 68 KSP preconditioned resid norm 9.804213483359e-06 true resid norm 2.489457781760e-04 ||r(i)||/||b|| 3.379778366964e-05 > > > > 69 KSP preconditioned resid norm 9.748922705476e-06 true resid norm 2.489408473578e-04 ||r(i)||/||b|| 3.379711424383e-05 > > > > 70 KSP preconditioned resid norm 9.886044523689e-06 true resid norm 
2.489514438395e-04 ||r(i)||/||b|| 3.379855286071e-05 > > > > 71 KSP preconditioned resid norm 1.083888478689e-05 true resid norm 2.489420898851e-04 ||r(i)||/||b|| 3.379728293386e-05 > > > > 72 KSP preconditioned resid norm 1.106561823757e-05 true resid norm 2.489364778104e-04 ||r(i)||/||b|| 3.379652101821e-05 > > > > 73 KSP preconditioned resid norm 1.132091515426e-05 true resid norm 2.489456804535e-04 ||r(i)||/||b|| 3.379777040248e-05 > > > > 74 KSP preconditioned resid norm 1.330905328963e-05 true resid norm 2.489317458981e-04 ||r(i)||/||b|| 3.379587859660e-05 > > > > 75 KSP preconditioned resid norm 1.305555302619e-05 true resid norm 2.489320939810e-04 ||r(i)||/||b|| 3.379592585359e-05 > > > > 76 KSP preconditioned resid norm 1.308083397399e-05 true resid norm 2.489299951581e-04 ||r(i)||/||b|| 3.379564090977e-05 > > > > 77 KSP preconditioned resid norm 1.320098861853e-05 true resid norm 2.489323669317e-04 ||r(i)||/||b|| 3.379596291036e-05 > > > > 78 KSP preconditioned resid norm 1.300160788274e-05 true resid norm 2.489306393356e-04 ||r(i)||/||b|| 3.379572836564e-05 > > > > 79 KSP preconditioned resid norm 1.317651537793e-05 true resid norm 2.489381364970e-04 ||r(i)||/||b|| 3.379674620752e-05 > > > > 80 KSP preconditioned resid norm 1.309769805765e-05 true resid norm 2.489285056062e-04 ||r(i)||/||b|| 3.379543868279e-05 > > > > 81 KSP preconditioned resid norm 1.293686496271e-05 true resid norm 2.489347818072e-04 ||r(i)||/||b|| 3.379629076264e-05 > > > > 82 KSP preconditioned resid norm 1.311788285799e-05 true resid norm 2.489320040215e-04 ||r(i)||/||b|| 3.379591364037e-05 > > > > 83 KSP preconditioned resid norm 1.313667378798e-05 true resid norm 2.489329437217e-04 ||r(i)||/||b|| 3.379604121748e-05 > > > > 84 KSP preconditioned resid norm 1.416138205017e-05 true resid norm 2.489266908838e-04 ||r(i)||/||b|| 3.379519230948e-05 > > > > 85 KSP preconditioned resid norm 1.452253464774e-05 true resid norm 2.489285688375e-04 ||r(i)||/||b|| 3.379544726729e-05 > > > > 86 KSP preconditioned resid norm 1.426709413370e-05 true resid norm 2.489362313402e-04 ||r(i)||/||b|| 3.379648755651e-05 > > > > 87 KSP preconditioned resid norm 1.427480849552e-05 true resid norm 2.489378183000e-04 ||r(i)||/||b|| 3.379670300795e-05 > > > > 88 KSP preconditioned resid norm 1.413870980147e-05 true resid norm 2.489325756118e-04 ||r(i)||/||b|| 3.379599124153e-05 > > > > 89 KSP preconditioned resid norm 1.353259857657e-05 true resid norm 2.489318968308e-04 ||r(i)||/||b|| 3.379589908776e-05 > > > > 90 KSP preconditioned resid norm 1.347676448611e-05 true resid norm 2.489332074417e-04 ||r(i)||/||b|| 3.379607702106e-05 > > > > 91 KSP preconditioned resid norm 1.362825902909e-05 true resid norm 2.489344974971e-04 ||r(i)||/||b|| 3.379625216367e-05 > > > > 92 KSP preconditioned resid norm 1.346280901052e-05 true resid norm 2.489302570131e-04 ||r(i)||/||b|| 3.379567646016e-05 > > > > 93 KSP preconditioned resid norm 1.328052169696e-05 true resid norm 2.489346601224e-04 ||r(i)||/||b|| 3.379627424228e-05 > > > > 94 KSP preconditioned resid norm 1.554682082515e-05 true resid norm 2.489309078759e-04 ||r(i)||/||b|| 3.379576482365e-05 > > > > 95 KSP preconditioned resid norm 1.557128675775e-05 true resid norm 2.489317143582e-04 ||r(i)||/||b|| 3.379587431462e-05 > > > > 96 KSP preconditioned resid norm 1.542571813923e-05 true resid norm 2.489319910303e-04 ||r(i)||/||b|| 3.379591187663e-05 > > > > 97 KSP preconditioned resid norm 1.570516684444e-05 true resid norm 2.489321980894e-04 ||r(i)||/||b|| 3.379593998772e-05 > > > > 98 KSP 
preconditioned resid norm 1.600431789899e-05 true resid norm 2.489297450311e-04 ||r(i)||/||b|| 3.379560695162e-05 > > > > 99 KSP preconditioned resid norm 1.587495554658e-05 true resid norm 2.489339000570e-04 ||r(i)||/||b|| 3.379617105303e-05 > > > > 100 KSP preconditioned resid norm 1.621163002878e-05 true resid norm 2.489299953360e-04 ||r(i)||/||b|| 3.379564093392e-05 > > > > 101 KSP preconditioned resid norm 1.627060872574e-05 true resid norm 2.489301570161e-04 ||r(i)||/||b|| 3.379566288419e-05 > > > > 102 KSP preconditioned resid norm 1.622931647243e-05 true resid norm 2.489277930910e-04 ||r(i)||/||b|| 3.379534194913e-05 > > > > 103 KSP preconditioned resid norm 1.612544300282e-05 true resid norm 2.489317483299e-04 ||r(i)||/||b|| 3.379587892674e-05 > > > > 104 KSP preconditioned resid norm 1.880131646630e-05 true resid norm 2.489335862583e-04 ||r(i)||/||b|| 3.379612845059e-05 > > > > 105 KSP preconditioned resid norm 1.880563295793e-05 true resid norm 2.489365017923e-04 ||r(i)||/||b|| 3.379652427408e-05 > > > > 106 KSP preconditioned resid norm 1.860619184027e-05 true resid norm 2.489362382373e-04 ||r(i)||/||b|| 3.379648849288e-05 > > > > 107 KSP preconditioned resid norm 1.877134148719e-05 true resid norm 2.489425484523e-04 ||r(i)||/||b|| 3.379734519061e-05 > > > > 108 KSP preconditioned resid norm 1.914810713538e-05 true resid norm 2.489347415573e-04 ||r(i)||/||b|| 3.379628529818e-05 > > > > 109 KSP preconditioned resid norm 1.220673255622e-05 true resid norm 2.490196186357e-04 ||r(i)||/||b|| 3.380780851884e-05 > > > > 110 KSP preconditioned resid norm 1.215819132910e-05 true resid norm 2.490233518072e-04 ||r(i)||/||b|| 3.380831534776e-05 > > > > 111 KSP preconditioned resid norm 1.196565427400e-05 true resid norm 2.490279438906e-04 ||r(i)||/||b|| 3.380893878570e-05 > > > > 112 KSP preconditioned resid norm 1.171748185197e-05 true resid norm 2.490242578983e-04 ||r(i)||/||b|| 3.380843836198e-05 > > > > 113 KSP preconditioned resid norm 1.162855824118e-05 true resid norm 2.490229018536e-04 ||r(i)||/||b|| 3.380825426043e-05 > > > > 114 KSP preconditioned resid norm 1.175594685689e-05 true resid norm 2.490274328440e-04 ||r(i)||/||b|| 3.380886940415e-05 > > > > 115 KSP preconditioned resid norm 1.167979454122e-05 true resid norm 2.490271161036e-04 ||r(i)||/||b|| 3.380882640232e-05 > > > > 116 KSP preconditioned resid norm 1.181010893019e-05 true resid norm 2.490235420657e-04 ||r(i)||/||b|| 3.380834117795e-05 > > > > 117 KSP preconditioned resid norm 1.175206638194e-05 true resid norm 2.490263165345e-04 ||r(i)||/||b|| 3.380871784992e-05 > > > > 118 KSP preconditioned resid norm 1.183804125791e-05 true resid norm 2.490221353083e-04 ||r(i)||/||b|| 3.380815019145e-05 > > > > 119 KSP preconditioned resid norm 1.186426973727e-05 true resid norm 2.490227115336e-04 ||r(i)||/||b|| 3.380822842189e-05 > > > > 120 KSP preconditioned resid norm 1.181986776689e-05 true resid norm 2.490257884230e-04 ||r(i)||/||b|| 3.380864615159e-05 > > > > 121 KSP preconditioned resid norm 1.131443277370e-05 true resid norm 2.490259110230e-04 ||r(i)||/||b|| 3.380866279620e-05 > > > > 122 KSP preconditioned resid norm 1.114920075859e-05 true resid norm 2.490249829382e-04 ||r(i)||/||b|| 3.380853679603e-05 > > > > 123 KSP preconditioned resid norm 1.082073321672e-05 true resid norm 2.490314084868e-04 ||r(i)||/||b|| 3.380940915187e-05 > > > > 124 KSP preconditioned resid norm 3.307785860990e-06 true resid norm 2.490613501549e-04 ||r(i)||/||b|| 3.381347414155e-05 > > > > 125 KSP preconditioned resid norm 3.287051720572e-06 
true resid norm 2.490584648195e-04 ||r(i)||/||b|| 3.381308241794e-05 > > > > 126 KSP preconditioned resid norm 3.286797046069e-06 true resid norm 2.490654396386e-04 ||r(i)||/||b|| 3.381402934473e-05 > > > > 127 KSP preconditioned resid norm 3.311592899411e-06 true resid norm 2.490627973588e-04 ||r(i)||/||b|| 3.381367061922e-05 > > > > 128 KSP preconditioned resid norm 3.560993694635e-06 true resid norm 2.490571732816e-04 ||r(i)||/||b|| 3.381290707406e-05 > > > > 129 KSP preconditioned resid norm 3.411994617661e-06 true resid norm 2.490652122141e-04 ||r(i)||/||b|| 3.381399846875e-05 > > > > 130 KSP preconditioned resid norm 3.412383310721e-06 true resid norm 2.490633454022e-04 ||r(i)||/||b|| 3.381374502359e-05 > > > > 131 KSP preconditioned resid norm 3.288320044878e-06 true resid norm 2.490639470096e-04 ||r(i)||/||b|| 3.381382669999e-05 > > > > 132 KSP preconditioned resid norm 3.273215756565e-06 true resid norm 2.490640390847e-04 ||r(i)||/||b|| 3.381383920043e-05 > > > > 133 KSP preconditioned resid norm 3.236969051459e-06 true resid norm 2.490678216102e-04 ||r(i)||/||b|| 3.381435272985e-05 > > > > 134 KSP preconditioned resid norm 3.203260913942e-06 true resid norm 2.490640965346e-04 ||r(i)||/||b|| 3.381384700005e-05 > > > > 135 KSP preconditioned resid norm 3.224117152353e-06 true resid norm 2.490655026376e-04 ||r(i)||/||b|| 3.381403789770e-05 > > > > 136 KSP preconditioned resid norm 3.221577997984e-06 true resid norm 2.490684737611e-04 ||r(i)||/||b|| 3.381444126823e-05 > > > > 137 KSP preconditioned resid norm 3.195936222128e-06 true resid norm 2.490673982333e-04 ||r(i)||/||b|| 3.381429525066e-05 > > > > 138 KSP preconditioned resid norm 3.207528137426e-06 true resid norm 2.490641247196e-04 ||r(i)||/||b|| 3.381385082655e-05 > > > > 139 KSP preconditioned resid norm 3.240134271963e-06 true resid norm 2.490615861251e-04 ||r(i)||/||b|| 3.381350617773e-05 > > > > 140 KSP preconditioned resid norm 2.698833607230e-06 true resid norm 2.490638954889e-04 ||r(i)||/||b|| 3.381381970535e-05 > > > > 141 KSP preconditioned resid norm 2.599151209137e-06 true resid norm 2.490657106698e-04 ||r(i)||/||b|| 3.381406614091e-05 > > > > 142 KSP preconditioned resid norm 2.633939920994e-06 true resid norm 2.490707754695e-04 ||r(i)||/||b|| 3.381475375652e-05 > > > > 143 KSP preconditioned resid norm 2.519609221376e-06 true resid norm 2.490639100480e-04 ||r(i)||/||b|| 3.381382168195e-05 > > > > 144 KSP preconditioned resid norm 3.768526937684e-06 true resid norm 2.490654096698e-04 ||r(i)||/||b|| 3.381402527606e-05 > > > > 145 KSP preconditioned resid norm 3.707841943289e-06 true resid norm 2.490630207923e-04 ||r(i)||/||b|| 3.381370095336e-05 > > > > 146 KSP preconditioned resid norm 3.698827503486e-06 true resid norm 2.490646071561e-04 ||r(i)||/||b|| 3.381391632387e-05 > > > > 147 KSP preconditioned resid norm 3.642747039615e-06 true resid norm 2.490610990161e-04 ||r(i)||/||b|| 3.381344004604e-05 > > > > 148 KSP preconditioned resid norm 3.613100087842e-06 true resid norm 2.490617159023e-04 ||r(i)||/||b|| 3.381352379676e-05 > > > > 149 KSP preconditioned resid norm 3.637646399299e-06 true resid norm 2.490648063023e-04 ||r(i)||/||b|| 3.381394336069e-05 > > > > 150 KSP preconditioned resid norm 3.640235367864e-06 true resid norm 2.490648516718e-04 ||r(i)||/||b|| 3.381394952022e-05 > > > > 151 KSP preconditioned resid norm 3.724708848977e-06 true resid norm 2.490622201040e-04 ||r(i)||/||b|| 3.381359224901e-05 > > > > 152 KSP preconditioned resid norm 3.665185002770e-06 true resid norm 2.490664302790e-04 
||r(i)||/||b|| 3.381416383766e-05 > > > > 153 KSP preconditioned resid norm 3.348992579120e-06 true resid norm 2.490655722697e-04 ||r(i)||/||b|| 3.381404735121e-05 > > > > 154 KSP preconditioned resid norm 3.309431137943e-06 true resid norm 2.490727563300e-04 ||r(i)||/||b|| 3.381502268535e-05 > > > > 155 KSP preconditioned resid norm 3.299031245428e-06 true resid norm 2.490688392843e-04 ||r(i)||/||b|| 3.381449089298e-05 > > > > 156 KSP preconditioned resid norm 3.297127463503e-06 true resid norm 2.490642207769e-04 ||r(i)||/||b|| 3.381386386763e-05 > > > > 157 KSP preconditioned resid norm 3.297370198641e-06 true resid norm 2.490666651723e-04 ||r(i)||/||b|| 3.381419572764e-05 > > > > 158 KSP preconditioned resid norm 3.290873165210e-06 true resid norm 2.490679189538e-04 ||r(i)||/||b|| 3.381436594557e-05 > > > > 159 KSP preconditioned resid norm 3.346705292419e-06 true resid norm 2.490617329776e-04 ||r(i)||/||b|| 3.381352611496e-05 > > > > 160 KSP preconditioned resid norm 3.429583550890e-06 true resid norm 2.490675116236e-04 ||r(i)||/||b|| 3.381431064494e-05 > > > > 161 KSP preconditioned resid norm 3.425238504679e-06 true resid norm 2.490648199058e-04 ||r(i)||/||b|| 3.381394520756e-05 > > > > 162 KSP preconditioned resid norm 3.423484857849e-06 true resid norm 2.490723208298e-04 ||r(i)||/||b|| 3.381496356025e-05 > > > > 163 KSP preconditioned resid norm 3.383655922943e-06 true resid norm 2.490659981249e-04 ||r(i)||/||b|| 3.381410516686e-05 > > > > 164 KSP preconditioned resid norm 3.477197358452e-06 true resid norm 2.490665979073e-04 ||r(i)||/||b|| 3.381418659549e-05 > > > > 165 KSP preconditioned resid norm 3.454672202601e-06 true resid norm 2.490651358644e-04 ||r(i)||/||b|| 3.381398810323e-05 > > > > 166 KSP preconditioned resid norm 3.399075522566e-06 true resid norm 2.490678159511e-04 ||r(i)||/||b|| 3.381435196154e-05 > > > > 167 KSP preconditioned resid norm 3.305455787400e-06 true resid norm 2.490651924523e-04 ||r(i)||/||b|| 3.381399578581e-05 > > > > 168 KSP preconditioned resid norm 3.368445533284e-06 true resid norm 2.490688061735e-04 ||r(i)||/||b|| 3.381448639774e-05 > > > > 169 KSP preconditioned resid norm 2.981519724814e-06 true resid norm 2.490676378334e-04 ||r(i)||/||b|| 3.381432777964e-05 > > > > 170 KSP preconditioned resid norm 3.034423065539e-06 true resid norm 2.490694458885e-04 ||r(i)||/||b|| 3.381457324777e-05 > > > > 171 KSP preconditioned resid norm 2.885972780503e-06 true resid norm 2.490688033729e-04 ||r(i)||/||b|| 3.381448601752e-05 > > > > 172 KSP preconditioned resid norm 2.892491075033e-06 true resid norm 2.490692993765e-04 ||r(i)||/||b|| 3.381455335678e-05 > > > > 173 KSP preconditioned resid norm 2.921316177611e-06 true resid norm 2.490697629787e-04 ||r(i)||/||b|| 3.381461629709e-05 > > > > 174 KSP preconditioned resid norm 2.999889222269e-06 true resid norm 2.490707272626e-04 ||r(i)||/||b|| 3.381474721178e-05 > > > > 175 KSP preconditioned resid norm 2.975590207575e-06 true resid norm 2.490685439925e-04 ||r(i)||/||b|| 3.381445080310e-05 > > > > 176 KSP preconditioned resid norm 2.983065843597e-06 true resid norm 2.490701883671e-04 ||r(i)||/||b|| 3.381467404937e-05 > > > > 177 KSP preconditioned resid norm 2.965959610245e-06 true resid norm 2.490711538630e-04 ||r(i)||/||b|| 3.381480512861e-05 > > > > 178 KSP preconditioned resid norm 3.005389788827e-06 true resid norm 2.490751808095e-04 ||r(i)||/||b|| 3.381535184150e-05 > > > > 179 KSP preconditioned resid norm 2.956581668772e-06 true resid norm 2.490653125636e-04 ||r(i)||/||b|| 3.381401209257e-05 > > > > 
180 KSP preconditioned resid norm 2.937498883661e-06 true resid norm 2.490666056653e-04 ||r(i)||/||b|| 3.381418764874e-05 > > > > 181 KSP preconditioned resid norm 2.913227475431e-06 true resid norm 2.490682436979e-04 ||r(i)||/||b|| 3.381441003402e-05 > > > > 182 KSP preconditioned resid norm 3.048172862254e-06 true resid norm 2.490719669872e-04 ||r(i)||/||b|| 3.381491552130e-05 > > > > 183 KSP preconditioned resid norm 3.023868104933e-06 true resid norm 2.490648745555e-04 ||r(i)||/||b|| 3.381395262699e-05 > > > > 184 KSP preconditioned resid norm 2.985947506400e-06 true resid norm 2.490638818852e-04 ||r(i)||/||b|| 3.381381785846e-05 > > > > 185 KSP preconditioned resid norm 2.840032055776e-06 true resid norm 2.490701112392e-04 ||r(i)||/||b|| 3.381466357820e-05 > > > > 186 KSP preconditioned resid norm 2.229279683815e-06 true resid norm 2.490609220680e-04 ||r(i)||/||b|| 3.381341602292e-05 > > > > 187 KSP preconditioned resid norm 2.441513276379e-06 true resid norm 2.490674056899e-04 ||r(i)||/||b|| 3.381429626300e-05 > > > > 188 KSP preconditioned resid norm 2.467046864016e-06 true resid norm 2.490691622632e-04 ||r(i)||/||b|| 3.381453474178e-05 > > > > 189 KSP preconditioned resid norm 2.482124586361e-06 true resid norm 2.490664992339e-04 ||r(i)||/||b|| 3.381417319923e-05 > > > > 190 KSP preconditioned resid norm 2.470564926502e-06 true resid norm 2.490617019713e-04 ||r(i)||/||b|| 3.381352190543e-05 > > > > 191 KSP preconditioned resid norm 2.457947086578e-06 true resid norm 2.490628644250e-04 ||r(i)||/||b|| 3.381367972437e-05 > > > > 192 KSP preconditioned resid norm 2.469444741724e-06 true resid norm 2.490639416335e-04 ||r(i)||/||b|| 3.381382597011e-05 > > > > 193 KSP preconditioned resid norm 2.469951525219e-06 true resid norm 2.490599769764e-04 ||r(i)||/||b|| 3.381328771385e-05 > > > > 194 KSP preconditioned resid norm 2.467486786643e-06 true resid norm 2.490630178622e-04 ||r(i)||/||b|| 3.381370055556e-05 > > > > 195 KSP preconditioned resid norm 2.409684391404e-06 true resid norm 2.490640302606e-04 ||r(i)||/||b|| 3.381383800245e-05 > > > > 196 KSP preconditioned resid norm 2.456046691135e-06 true resid norm 2.490637730235e-04 ||r(i)||/||b|| 3.381380307900e-05 > > > > 197 KSP preconditioned resid norm 2.300015653805e-06 true resid norm 2.490615406913e-04 ||r(i)||/||b|| 3.381350000947e-05 > > > > 198 KSP preconditioned resid norm 2.238328275301e-06 true resid norm 2.490647641246e-04 ||r(i)||/||b|| 3.381393763449e-05 > > > > 199 KSP preconditioned resid norm 2.317293820319e-06 true resid norm 2.490641611282e-04 ||r(i)||/||b|| 3.381385576951e-05 > > > > 200 KSP preconditioned resid norm 2.359590971314e-06 true resid norm 2.490685242974e-04 ||r(i)||/||b|| 3.381444812922e-05 > > > > 201 KSP preconditioned resid norm 2.311199691596e-06 true resid norm 2.490656791753e-04 ||r(i)||/||b|| 3.381406186510e-05 > > > > 202 KSP preconditioned resid norm 2.328772904196e-06 true resid norm 2.490651045523e-04 ||r(i)||/||b|| 3.381398385220e-05 > > > > 203 KSP preconditioned resid norm 2.332731604717e-06 true resid norm 2.490649960574e-04 ||r(i)||/||b|| 3.381396912253e-05 > > > > 204 KSP preconditioned resid norm 2.357629383490e-06 true resid norm 2.490686317727e-04 ||r(i)||/||b|| 3.381446272046e-05 > > > > 205 KSP preconditioned resid norm 2.374856180299e-06 true resid norm 2.490645897176e-04 ||r(i)||/||b|| 3.381391395637e-05 > > > > 206 KSP preconditioned resid norm 2.340395514404e-06 true resid norm 2.490618341127e-04 ||r(i)||/||b|| 3.381353984542e-05 > > > > 207 KSP preconditioned resid norm 
2.314963680954e-06 true resid norm 2.490676153984e-04 ||r(i)||/||b|| 3.381432473379e-05 > > > > 208 KSP preconditioned resid norm 2.448070953106e-06 true resid norm 2.490644606776e-04 ||r(i)||/||b|| 3.381389643743e-05 > > > > 209 KSP preconditioned resid norm 2.428805110632e-06 true resid norm 2.490635817597e-04 ||r(i)||/||b|| 3.381377711234e-05 > > > > 210 KSP preconditioned resid norm 2.537929937808e-06 true resid norm 2.490680589404e-04 ||r(i)||/||b|| 3.381438495066e-05 > > > > 211 KSP preconditioned resid norm 2.515909029682e-06 true resid norm 2.490687803038e-04 ||r(i)||/||b|| 3.381448288557e-05 > > > > 212 KSP preconditioned resid norm 2.497907513266e-06 true resid norm 2.490618016885e-04 ||r(i)||/||b|| 3.381353544340e-05 > > > > 213 KSP preconditioned resid norm 1.783501869502e-06 true resid norm 2.490632647470e-04 ||r(i)||/||b|| 3.381373407354e-05 > > > > 214 KSP preconditioned resid norm 1.767420653144e-06 true resid norm 2.490685569328e-04 ||r(i)||/||b|| 3.381445255992e-05 > > > > 215 KSP preconditioned resid norm 1.854926068272e-06 true resid norm 2.490609365464e-04 ||r(i)||/||b|| 3.381341798856e-05 > > > > 216 KSP preconditioned resid norm 1.818308539774e-06 true resid norm 2.490639142283e-04 ||r(i)||/||b|| 3.381382224948e-05 > > > > 217 KSP preconditioned resid norm 1.809431578070e-06 true resid norm 2.490605125049e-04 ||r(i)||/||b|| 3.381336041915e-05 > > > > 218 KSP preconditioned resid norm 1.789862735999e-06 true resid norm 2.490564024901e-04 ||r(i)||/||b|| 3.381280242859e-05 > > > > 219 KSP preconditioned resid norm 1.769239890163e-06 true resid norm 2.490647825316e-04 ||r(i)||/||b|| 3.381394013349e-05 > > > > 220 KSP preconditioned resid norm 1.780760773109e-06 true resid norm 2.490622606663e-04 ||r(i)||/||b|| 3.381359775589e-05 > > > > 221 KSP preconditioned resid norm 5.009024913368e-07 true resid norm 2.490659101637e-04 ||r(i)||/||b|| 3.381409322492e-05 > > > > 222 KSP preconditioned resid norm 4.974450322799e-07 true resid norm 2.490714287402e-04 ||r(i)||/||b|| 3.381484244693e-05 > > > > 223 KSP preconditioned resid norm 4.938819481519e-07 true resid norm 2.490665661715e-04 ||r(i)||/||b|| 3.381418228693e-05 > > > > 224 KSP preconditioned resid norm 4.973231831266e-07 true resid norm 2.490725000995e-04 ||r(i)||/||b|| 3.381498789855e-05 > > > > 225 KSP preconditioned resid norm 5.086864036771e-07 true resid norm 2.490664132954e-04 ||r(i)||/||b|| 3.381416153192e-05 > > > > 226 KSP preconditioned resid norm 5.046954570561e-07 true resid norm 2.490698772594e-04 ||r(i)||/||b|| 3.381463181226e-05 > > > > 227 KSP preconditioned resid norm 5.086852920874e-07 true resid norm 2.490703544723e-04 ||r(i)||/||b|| 3.381469660041e-05 > > > > 228 KSP preconditioned resid norm 5.182381756169e-07 true resid norm 2.490665200032e-04 ||r(i)||/||b|| 3.381417601896e-05 > > > > 229 KSP preconditioned resid norm 5.261455182896e-07 true resid norm 2.490697169472e-04 ||r(i)||/||b|| 3.381461004770e-05 > > > > 230 KSP preconditioned resid norm 5.265262522400e-07 true resid norm 2.490726890541e-04 ||r(i)||/||b|| 3.381501355172e-05 > > > > 231 KSP preconditioned resid norm 5.220652263946e-07 true resid norm 2.490689325236e-04 ||r(i)||/||b|| 3.381450355149e-05 > > > > 232 KSP preconditioned resid norm 5.256466259888e-07 true resid norm 2.490694033989e-04 ||r(i)||/||b|| 3.381456747923e-05 > > > > 233 KSP preconditioned resid norm 5.443022648374e-07 true resid norm 2.490650183144e-04 ||r(i)||/||b|| 3.381397214423e-05 > > > > 234 KSP preconditioned resid norm 5.562619006436e-07 true resid norm 
2.490764576883e-04 ||r(i)||/||b|| 3.381552519520e-05 > > > > 235 KSP preconditioned resid norm 5.998148629545e-07 true resid norm 2.490714032716e-04 ||r(i)||/||b|| 3.381483898922e-05 > > > > 236 KSP preconditioned resid norm 6.498977322955e-07 true resid norm 2.490650270144e-04 ||r(i)||/||b|| 3.381397332537e-05 > > > > 237 KSP preconditioned resid norm 6.503686003429e-07 true resid norm 2.490706976108e-04 ||r(i)||/||b|| 3.381474318615e-05 > > > > 238 KSP preconditioned resid norm 6.566719023119e-07 true resid norm 2.490664107559e-04 ||r(i)||/||b|| 3.381416118714e-05 > > > > 239 KSP preconditioned resid norm 6.549737473208e-07 true resid norm 2.490721547909e-04 ||r(i)||/||b|| 3.381494101821e-05 > > > > 240 KSP preconditioned resid norm 6.616898981418e-07 true resid norm 2.490679659838e-04 ||r(i)||/||b|| 3.381437233053e-05 > > > > 241 KSP preconditioned resid norm 6.829917691021e-07 true resid norm 2.490728328614e-04 ||r(i)||/||b|| 3.381503307553e-05 > > > > 242 KSP preconditioned resid norm 7.030239869389e-07 true resid norm 2.490706345115e-04 ||r(i)||/||b|| 3.381473461955e-05 > > > > 243 KSP preconditioned resid norm 7.018435683340e-07 true resid norm 2.490650978460e-04 ||r(i)||/||b|| 3.381398294172e-05 > > > > 244 KSP preconditioned resid norm 7.058047080376e-07 true resid norm 2.490685975642e-04 ||r(i)||/||b|| 3.381445807618e-05 > > > > 245 KSP preconditioned resid norm 6.896300385099e-07 true resid norm 2.490708566380e-04 ||r(i)||/||b|| 3.381476477625e-05 > > > > 246 KSP preconditioned resid norm 7.093960074437e-07 true resid norm 2.490667427871e-04 ||r(i)||/||b|| 3.381420626490e-05 > > > > 247 KSP preconditioned resid norm 7.817121711853e-07 true resid norm 2.490692299030e-04 ||r(i)||/||b|| 3.381454392480e-05 > > > > 248 KSP preconditioned resid norm 7.976109778309e-07 true resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > 249 KSP preconditioned resid norm 7.855322750445e-07 true resid norm 2.490720861966e-04 ||r(i)||/||b|| 3.381493170560e-05 > > > > 250 KSP preconditioned resid norm 7.778531114042e-07 true resid norm 2.490673235034e-04 ||r(i)||/||b|| 3.381428510506e-05 > > > > 251 KSP preconditioned resid norm 7.848682182070e-07 true resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > 252 KSP preconditioned resid norm 7.967291867330e-07 true resid norm 2.490724820229e-04 ||r(i)||/||b|| 3.381498544442e-05 > > > > 253 KSP preconditioned resid norm 7.865012959525e-07 true resid norm 2.490666662028e-04 ||r(i)||/||b|| 3.381419586754e-05 > > > > 254 KSP preconditioned resid norm 7.656025385804e-07 true resid norm 2.490686283214e-04 ||r(i)||/||b|| 3.381446225190e-05 > > > > 255 KSP preconditioned resid norm 7.757018653468e-07 true resid norm 2.490655983763e-04 ||r(i)||/||b|| 3.381405089553e-05 > > > > 256 KSP preconditioned resid norm 6.686490372981e-07 true resid norm 2.490715698964e-04 ||r(i)||/||b|| 3.381486161081e-05 > > > > 257 KSP preconditioned resid norm 6.596005109428e-07 true resid norm 2.490666403003e-04 ||r(i)||/||b|| 3.381419235092e-05 > > > > 258 KSP preconditioned resid norm 6.681742296333e-07 true resid norm 2.490683725835e-04 ||r(i)||/||b|| 3.381442753198e-05 > > > > 259 KSP preconditioned resid norm 1.089245482033e-06 true resid norm 2.490688086568e-04 ||r(i)||/||b|| 3.381448673488e-05 > > > > 260 KSP preconditioned resid norm 1.099844873189e-06 true resid norm 2.490690703265e-04 ||r(i)||/||b|| 3.381452226011e-05 > > > > 261 KSP preconditioned resid norm 1.112925540869e-06 true resid norm 2.490664481058e-04 ||r(i)||/||b|| 
3.381416625790e-05 > > > > 262 KSP preconditioned resid norm 1.113056910480e-06 true resid norm 2.490658753273e-04 ||r(i)||/||b|| 3.381408849541e-05 > > > > 263 KSP preconditioned resid norm 1.104801535149e-06 true resid norm 2.490736510776e-04 ||r(i)||/||b|| 3.381514415953e-05 > > > > 264 KSP preconditioned resid norm 1.158709147873e-06 true resid norm 2.490607531152e-04 ||r(i)||/||b|| 3.381339308528e-05 > > > > 265 KSP preconditioned resid norm 1.178985740182e-06 true resid norm 2.490727895619e-04 ||r(i)||/||b|| 3.381502719703e-05 > > > > 266 KSP preconditioned resid norm 1.165130533478e-06 true resid norm 2.490639076693e-04 ||r(i)||/||b|| 3.381382135901e-05 > > > > 267 KSP preconditioned resid norm 1.181364114499e-06 true resid norm 2.490667871436e-04 ||r(i)||/||b|| 3.381421228690e-05 > > > > 268 KSP preconditioned resid norm 1.170295348543e-06 true resid norm 2.490662613306e-04 ||r(i)||/||b|| 3.381414090063e-05 > > > > 269 KSP preconditioned resid norm 1.213243016230e-06 true resid norm 2.490666173719e-04 ||r(i)||/||b|| 3.381418923808e-05 > > > > 270 KSP preconditioned resid norm 1.239691953997e-06 true resid norm 2.490678323197e-04 ||r(i)||/||b|| 3.381435418381e-05 > > > > 271 KSP preconditioned resid norm 1.219891740100e-06 true resid norm 2.490625009256e-04 ||r(i)||/||b|| 3.381363037437e-05 > > > > 272 KSP preconditioned resid norm 1.231321334346e-06 true resid norm 2.490659733696e-04 ||r(i)||/||b|| 3.381410180599e-05 > > > > 273 KSP preconditioned resid norm 1.208183234158e-06 true resid norm 2.490685987255e-04 ||r(i)||/||b|| 3.381445823385e-05 > > > > 274 KSP preconditioned resid norm 1.211768545589e-06 true resid norm 2.490671548953e-04 ||r(i)||/||b|| 3.381426221421e-05 > > > > 275 KSP preconditioned resid norm 1.209433459842e-06 true resid norm 2.490669016096e-04 ||r(i)||/||b|| 3.381422782722e-05 > > > > 276 KSP preconditioned resid norm 1.223729184405e-06 true resid norm 2.490658128014e-04 ||r(i)||/||b|| 3.381408000666e-05 > > > > 277 KSP preconditioned resid norm 1.243915201868e-06 true resid norm 2.490693375756e-04 ||r(i)||/||b|| 3.381455854282e-05 > > > > 278 KSP preconditioned resid norm 1.231994655529e-06 true resid norm 2.490682988311e-04 ||r(i)||/||b|| 3.381441751910e-05 > > > > 279 KSP preconditioned resid norm 1.227930683777e-06 true resid norm 2.490667825866e-04 ||r(i)||/||b|| 3.381421166823e-05 > > > > 280 KSP preconditioned resid norm 1.193458846469e-06 true resid norm 2.490687366117e-04 ||r(i)||/||b|| 3.381447695378e-05 > > > > 281 KSP preconditioned resid norm 1.217089059805e-06 true resid norm 2.490674797371e-04 ||r(i)||/||b|| 3.381430631591e-05 > > > > 282 KSP preconditioned resid norm 1.249318287709e-06 true resid norm 2.490662866951e-04 ||r(i)||/||b|| 3.381414434420e-05 > > > > 283 KSP preconditioned resid norm 1.183320029547e-06 true resid norm 2.490645783630e-04 ||r(i)||/||b|| 3.381391241482e-05 > > > > 284 KSP preconditioned resid norm 1.174730603102e-06 true resid norm 2.490686881647e-04 ||r(i)||/||b|| 3.381447037643e-05 > > > > 285 KSP preconditioned resid norm 1.175838261923e-06 true resid norm 2.490665969300e-04 ||r(i)||/||b|| 3.381418646281e-05 > > > > 286 KSP preconditioned resid norm 1.188946188368e-06 true resid norm 2.490661974622e-04 ||r(i)||/||b|| 3.381413222961e-05 > > > > 287 KSP preconditioned resid norm 1.177848565707e-06 true resid norm 2.490660236206e-04 ||r(i)||/||b|| 3.381410862824e-05 > > > > 288 KSP preconditioned resid norm 1.200075508281e-06 true resid norm 2.490645353536e-04 ||r(i)||/||b|| 3.381390657571e-05 > > > > 289 KSP 
preconditioned resid norm 1.184589570618e-06 true resid norm 2.490664920355e-04 ||r(i)||/||b|| 3.381417222195e-05
> > > > 290 KSP preconditioned resid norm 1.221114703873e-06 true resid norm 2.490670597538e-04 ||r(i)||/||b|| 3.381424929746e-05
> > > > 291 KSP preconditioned resid norm 1.249479658256e-06 true resid norm 2.490641582876e-04 ||r(i)||/||b|| 3.381385538385e-05
> > > > 292 KSP preconditioned resid norm 1.245768496850e-06 true resid norm 2.490704480588e-04 ||r(i)||/||b|| 3.381470930606e-05
> > > > [iterations 293-346 and the remainder of the quoted thread trimmed; they are identical to the log tail and messages quoted earlier in this exchange]
From hbcbh1999 at gmail.com  Sat Oct 21 23:05:31 2017
From: hbcbh1999 at gmail.com (Hao Zhang)
Date: Sun, 22 Oct 2017 00:05:31 -0400
Subject: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To:
References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov>
 <64D7FC90-EBE8-409C-B300-04A118155A98@mcs.anl.gov>
Message-ID:

This is the initial pressure solver output from PETSc. It failed to converge after 40000 iterations, so GMRES was then used.

39987 KSP preconditioned resid norm 3.853125986269e-08 true resid norm 1.147359212216e-05 ||r(i)||/||b|| 1.557696568706e-06
39988 KSP preconditioned resid norm 3.853126044003e-08 true resid norm 1.147359257282e-05 ||r(i)||/||b|| 1.557696629889e-06
39989 KSP preconditioned resid norm 3.853126052100e-08 true resid norm 1.147359233695e-05 ||r(i)||/||b|| 1.557696597866e-06
39990 KSP preconditioned resid norm 3.853126027357e-08 true resid norm 1.147359219860e-05 ||r(i)||/||b|| 1.557696579083e-06
39991 KSP preconditioned resid norm 3.853126058478e-08 true resid norm 1.147359234281e-05 ||r(i)||/||b|| 1.557696598662e-06
39992 KSP preconditioned resid norm 3.853126064006e-08 true resid norm 1.147359261420e-05 ||r(i)||/||b|| 1.557696635506e-06
39993 KSP preconditioned resid norm 3.853126050203e-08 true resid norm 1.147359235972e-05 ||r(i)||/||b|| 1.557696600957e-06
39994 KSP preconditioned resid norm 3.853126050182e-08 true resid norm 1.147359253713e-05 ||r(i)||/||b|| 1.557696625043e-06
39995 KSP preconditioned resid norm 3.853125976795e-08 true resid norm 1.147359226222e-05 ||r(i)||/||b|| 1.557696587720e-06
39996 KSP preconditioned resid norm 3.853125805127e-08 true resid norm 1.147359262747e-05 ||r(i)||/||b|| 1.557696637308e-06
39997 KSP preconditioned resid norm 3.853125811756e-08 true resid norm 1.147359216008e-05 ||r(i)||/||b|| 1.557696573853e-06
39998 KSP preconditioned resid norm 3.853125827833e-08 true resid norm 1.147359238372e-05 ||r(i)||/||b|| 1.557696604216e-06
39999 KSP preconditioned resid norm 3.853127937068e-08 true resid norm 1.147359264043e-05 ||r(i)||/||b|| 1.557696639067e-06
40000 KSP preconditioned resid norm 3.853122732867e-08 true resid norm 1.147359257776e-05 ||r(i)||/||b|| 1.557696630559e-06
Linear solve did not converge due to DIVERGED_ITS iterations 40000
KSP Object: 24 MPI processes type: bcgs maximum iterations=40000, initial guess is zero tolerances: relative=1e-14, absolute=1e-14,
divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test PC Object: 24 MPI processes type: hypre HYPRE BoomerAMG preconditioning Cycle type V Maximum number of levels 25 Maximum number of iterations PER hypre call 1 Convergence tolerance PER hypre call 0. Threshold for strong coupling 0.25 Interpolation truncation factor 0. Interpolation: max elements per row 0 Number of levels of aggressive coarsening 0 Number of paths for aggressive coarsening 1 Maximum row sums 0.9 Sweeps down 1 Sweeps up 1 Sweeps on coarse 1 Relax down symmetric-SOR/Jacobi Relax up symmetric-SOR/Jacobi Relax on coarse Gaussian-elimination Relax weight (all) 1. Outer relax weight (all) 1. Using CF-relaxation Not using more complex smoothers. Measure type local Coarsen type Falgout Interpolation type classical linear system matrix = precond matrix: Mat Object: A 24 MPI processes type: mpiaij rows=497664, cols=497664 total: nonzeros=3363552, allocated nonzeros=6967296 total number of mallocs used during MatSetValues calls =0 has attached null space not using I-node (on process 0) routines The solution diverges for p0! The residual is 3.853123e-08. Solve again using GMRES! KSP Object: 24 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=40000, initial guess is zero tolerances: relative=1e-14, absolute=1e-14, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test PC Object: 24 MPI processes type: hypre HYPRE BoomerAMG preconditioning Cycle type V Maximum number of levels 25 Maximum number of iterations PER hypre call 1 Convergence tolerance PER hypre call 0. Threshold for strong coupling 0.25 Interpolation truncation factor 0. Interpolation: max elements per row 0 Number of levels of aggressive coarsening 0 Number of paths for aggressive coarsening 1 Maximum row sums 0.9 Sweeps down 1 Sweeps up 1 Sweeps on coarse 1 Relax down symmetric-SOR/Jacobi Relax up symmetric-SOR/Jacobi Relax on coarse Gaussian-elimination Relax weight (all) 1. Outer relax weight (all) 1. Using CF-relaxation Not using more complex smoothers. 
Measure type local Coarsen type Falgout Interpolation type classical linear system matrix = precond matrix: Mat Object: A 24 MPI processes type: mpiaij rows=497664, cols=497664 total: nonzeros=3363552, allocated nonzeros=6967296 total number of mallocs used during MatSetValues calls =0 has attached null space not using I-node (on process 0) routines 0 KSP preconditioned resid norm 1.593802941804e+01 true resid norm 7.365742695119e+00 ||r(i)||/||b|| 1.000000000000e+00 1 KSP preconditioned resid norm 6.338666661133e-01 true resid norm 2.880722209358e+00 ||r(i)||/||b|| 3.910973174867e-01 2 KSP preconditioned resid norm 3.913420828350e-02 true resid norm 9.544089760671e-01 ||r(i)||/||b|| 1.295740315093e-01 3 KSP preconditioned resid norm 2.928070366435e-03 true resid norm 1.874294004628e-01 ||r(i)||/||b|| 2.544609664236e-02 4 KSP preconditioned resid norm 2.165607525823e-04 true resid norm 3.121122463949e-02 ||r(i)||/||b|| 4.237349298146e-03 5 KSP preconditioned resid norm 1.635476480407e-05 true resid norm 3.984315313831e-03 ||r(i)||/||b|| 5.409251284967e-04 6 KSP preconditioned resid norm 1.283358350575e-06 true resid norm 4.566583802915e-04 ||r(i)||/||b|| 6.199760149022e-05 7 KSP preconditioned resid norm 8.479469225747e-08 true resid norm 3.824581791112e-05 ||r(i)||/||b|| 5.192391248810e-06 8 KSP preconditioned resid norm 5.358636504683e-09 true resid norm 2.829730442033e-06 ||r(i)||/||b|| 3.841744898187e-07 9 KSP preconditioned resid norm 3.447874504193e-10 true resid norm 1.756617036538e-07 ||r(i)||/||b|| 2.384847135242e-08 10 KSP preconditioned resid norm 2.480228743414e-11 true resid norm 1.309399823577e-08 ||r(i)||/||b|| 1.777688792258e-09 11 KSP preconditioned resid norm 1.728967759950e-12 true resid norm 9.406967045789e-10 ||r(i)||/||b|| 1.277124036931e-10 12 KSP preconditioned resid norm 1.075458632828e-13 true resid norm 5.994505136212e-11 ||r(i)||/||b|| 8.138358050689e-12 Linear solve converged due to CONVERGED_RTOL iterations 12 KSP Object: 24 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=40000, initial guess is zero tolerances: relative=1e-14, absolute=1e-14, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test PC Object: 24 MPI processes type: hypre HYPRE BoomerAMG preconditioning Cycle type V Maximum number of levels 25 Maximum number of iterations PER hypre call 1 Convergence tolerance PER hypre call 0. Threshold for strong coupling 0.25 Interpolation truncation factor 0. Interpolation: max elements per row 0 Number of levels of aggressive coarsening 0 Number of paths for aggressive coarsening 1 Maximum row sums 0.9 Sweeps down 1 Sweeps up 1 Sweeps on coarse 1 Relax down symmetric-SOR/Jacobi Relax up symmetric-SOR/Jacobi Relax on coarse Gaussian-elimination Relax weight (all) 1. Outer relax weight (all) 1. Using CF-relaxation Not using more complex smoothers. 
Measure type local
Coarsen type Falgout
Interpolation type classical
linear system matrix = precond matrix:
Mat Object: A 24 MPI processes
type: mpiaij
rows=497664, cols=497664
total: nonzeros=3363552, allocated nonzeros=6967296
total number of mallocs used during MatSetValues calls =0
has attached null space
not using I-node (on process 0) routines
The max singular value of A = 1.000872 in poisson_solver3d_P0_vd
The min singular value of A = 0.667688 in poisson_solver3d_P0_vd
The Cond Num of A = 1.499012 in poisson_solver3d_P0_vd
In poisson_solver3d_pressure(): num_iter = 12, rel_residual = 1.075459e-13

The max value of p0 is 0.03115845493408858
The min value of p0 is -0.07156715468428149

On Sun, Oct 22, 2017 at 12:00 AM, Barry Smith wrote:
> > On Oct 21, 2017, at 10:50 PM, Hao Zhang wrote:
> > The incompressible NS solver algorithm calls the PETSc solver at different stages of each time step. The one you were saying "This is good. 12 digit reduction" about is after the initial pressure solve, where HYPRE usually does not give good convergence, so the fall-back GMRES solver is called afterwards.
>
> Hmm, I don't understand. hypre should do well on a pressure solve. In fact, very well.
>
> > Barry, you were mentioning that I could have a wrong nullspace. That particular solver is meant to give an initial pressure profile for a 3D incompressible NS simulation using all-Neumann boundary conditions. Could you give some insight into how to test whether I have a wrong nullspace?
>
> -ksp_test_null_space
>
> But if your null space is consistently from all Neumann boundary conditions then it likely is not wrong.
>
> Barry
>
> > Thanks!
> >
> > On Sat, Oct 21, 2017 at 11:41 PM, Barry Smith wrote:
> > This is good. You get more than a 12 digit reduction in the true residual norm. This is good AMG convergence, expected when everything goes well.
> >
> > What is different in this case from the previous case that does not converge reasonably?
> >
> > Barry
> >
> > > On Oct 21, 2017, at 9:29 PM, Hao Zhang wrote:
> > > Barry, please advise what you make of this: it is the Poisson solver for the 3D case with all-Neumann BCs; a finite difference scheme was used.
> > > Thanks! I'm in learning mode.
> > >
> > > KSP Object: 24 MPI processes
> > > type: bcgs
> > > maximum iterations=40000, initial guess is zero
> > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000.
> > > left preconditioning
> > > using PRECONDITIONED norm type for convergence test
> > > PC Object: 24 MPI processes
> > > type: hypre
> > > HYPRE BoomerAMG preconditioning
> > > Cycle type V
> > > Maximum number of levels 25
> > > Maximum number of iterations PER hypre call 1
> > > Convergence tolerance PER hypre call 0.
> > > Threshold for strong coupling 0.25
> > > Interpolation truncation factor 0.
> > > Interpolation: max elements per row 0
> > > Number of levels of aggressive coarsening 0
> > > Number of paths for aggressive coarsening 1
> > > Maximum row sums 0.9
> > > Sweeps down 1
> > > Sweeps up 1
> > > Sweeps on coarse 1
> > > Relax down symmetric-SOR/Jacobi
> > > Relax up symmetric-SOR/Jacobi
> > > Relax on coarse Gaussian-elimination
> > > Relax weight (all) 1.
> > > Outer relax weight (all) 1.
> > > Using CF-relaxation
> > > Not using more complex smoothers.
> > > Measure type local
> > > Coarsen type Falgout
> > > Interpolation type classical
> > > linear system matrix = precond matrix:
> > > Mat Object: A 24 MPI processes
> > > type: mpiaij
> > > rows=497664, cols=497664
> > > total: nonzeros=3363552, allocated nonzeros=6967296
> > > total number of mallocs used during MatSetValues calls =0
> > > has attached null space
> > > not using I-node (on process 0) routines
> > > 0 KSP preconditioned resid norm 2.697270170623e-02 true resid norm 9.637159071207e+00 ||r(i)||/||b|| 1.000000000000e+00
> > > 1 KSP preconditioned resid norm 4.828857674609e-04 true resid norm 6.294379664645e-01 ||r(i)||/||b|| 6.531364293291e-02
> > > 2 KSP preconditioned resid norm 4.533649595815e-06 true resid norm 1.135508605857e-02 ||r(i)||/||b|| 1.178260727531e-03
> > > 3 KSP preconditioned resid norm 1.131704082606e-07 true resid norm 1.196393029874e-04 ||r(i)||/||b|| 1.241437462051e-05
> > > 4 KSP preconditioned resid norm 3.866281379569e-10 true resid norm 5.121520801846e-07 ||r(i)||/||b|| 5.314347064320e-08
> > > 5 KSP preconditioned resid norm 1.070114785241e-11 true resid norm 1.203693733135e-08 ||r(i)||/||b|| 1.249013038221e-09
> > > 6 KSP preconditioned resid norm 2.578780418765e-14 true resid norm 6.020297525927e-11 ||r(i)||/||b|| 6.246962908306e-12
> > > 7 KSP preconditioned resid norm 8.691764190203e-16 true resid norm 1.866088098154e-12 ||r(i)||/||b|| 1.936346680973e-13
> > > Linear solve converged due to CONVERGED_ATOL iterations 7
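As a rough illustration of Barry's -ksp_test_null_space suggestion above, the same check can also be made programmatically. This is a sketch only, assuming A and b are the assembled all-Neumann Poisson operator and right-hand side from the poster's code:

MatNullSpace nullsp;
PetscBool    isNull;

/* the claimed null space: the constant vector */
ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,NULL,&nullsp);CHKERRQ(ierr);
/* verify that A applied to the constant vector really is (numerically) zero */
ierr = MatNullSpaceTest(nullsp,A,&isNull);CHKERRQ(ierr);
if (!isNull) {
  ierr = PetscPrintf(PETSC_COMM_WORLD,"A does not have the constant null space!\n");CHKERRQ(ierr);
}
ierr = MatSetNullSpace(A,nullsp);CHKERRQ(ierr);
/* project the null-space component out of b so the system is consistent */
ierr = MatNullSpaceRemove(nullsp,b);CHKERRQ(ierr);

If MatNullSpaceTest reports false, the constant vector is not a null space of A and the all-Neumann discretization should be re-examined.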
> > > On Sat, Oct 21, 2017 at 6:53 PM, Barry Smith wrote:
> > > > On Oct 21, 2017, at 5:47 PM, Hao Zhang wrote:
> > > > hi, Barry:
> > > > what do you mean by "setting tolerance = 1e-14 is absurd"?
> > >
> > > Trying to decrease the initial residual norm by a factor of 1e-14 with an iterative method (or even a direct method) is unrealistic (usually unachievable) and almost never necessary. You are requiring ||r_n|| < 1.e-14 ||r_0|| when with double precision numbers you only have roughly 14 decimal digits total to compute with; round-off alone will lead to differences far larger than 1e-14.
> > >
> > > If you are using the solver in the context of a nonlinear problem (i.e. inside Newton's method) then 1.e-6 is generally more than plenty to get quadratic convergence of Newton's method.
> > >
> > > If you are solving a linear problem then it is extremely likely that the errors due to discretization (from the finite element method etc.) and the model are much, much larger than even 1.e-8.
> > >
> > > So, in summary:
> > >
> > > 1.e-14 is probably unachievable.
> > >
> > > 1.e-14 is almost for sure not needed.
> > >
> > > Barry
> > >
> > > > On Sat, Oct 21, 2017 at 18:42 Barry Smith wrote:
> > > > Run with -ksp_view_mat binary -ksp_view_rhs binary and send the resulting output file called binaryoutput to petsc-maint at mcs.anl.gov
> > > >
> > > > Note you can also use -ksp_type gmres with hypre; there is unlikely to be a reason to use bcgs.
> > > >
> > > > BTW: tolerances: relative=1e-14 is absurd.
> > > >
> > > > My guess is your null space is incorrect.
> > > >
> > > > > On Oct 21, 2017, at 4:34 PM, Hao Zhang wrote:
> > > > > If this solver doesn't converge, I have a fall-back solution which uses the GMRES solver. This setup is fine with me. I just want to know if HYPRE is a reliable solution for me, or whether I will have to go without a preconditioner.
> > > > >
> > > > > Thanks!
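Following Barry's point about realistic tolerances, a one-line sketch of what a saner setting might look like in code (assuming ksp is the KSP object from the poster's setup; the options-database equivalent is simply -ksp_rtol 1.e-6):

/* rtol 1.e-6, leave abstol and dtol at their defaults, and cap the
   iteration count well below 40000: with a working preconditioner a
   pressure solve should need far fewer iterations than that */
ierr = KSPSetTolerances(ksp,1.e-6,PETSC_DEFAULT,PETSC_DEFAULT,1000);CHKERRQ(ierr);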
> > > > > > > > > > On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang > wrote: > > > > > this is serial run. still dumping output. parallel more or less > the same. > > > > > > > > > > KSP Object: 1 MPI processes > > > > > type: bcgs > > > > > maximum iterations=40000, initial guess is zero > > > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. > > > > > left preconditioning > > > > > using PRECONDITIONED norm type for convergence test > > > > > PC Object: 1 MPI processes > > > > > type: hypre > > > > > HYPRE BoomerAMG preconditioning > > > > > Cycle type V > > > > > Maximum number of levels 25 > > > > > Maximum number of iterations PER hypre call 1 > > > > > Convergence tolerance PER hypre call 0. > > > > > Threshold for strong coupling 0.25 > > > > > Interpolation truncation factor 0. > > > > > Interpolation: max elements per row 0 > > > > > Number of levels of aggressive coarsening 0 > > > > > Number of paths for aggressive coarsening 1 > > > > > Maximum row sums 0.9 > > > > > Sweeps down 1 > > > > > Sweeps up 1 > > > > > Sweeps on coarse 1 > > > > > Relax down symmetric-SOR/Jacobi > > > > > Relax up symmetric-SOR/Jacobi > > > > > Relax on coarse Gaussian-elimination > > > > > Relax weight (all) 1. > > > > > Outer relax weight (all) 1. > > > > > Using CF-relaxation > > > > > Not using more complex smoothers. > > > > > Measure type local > > > > > Coarsen type Falgout > > > > > Interpolation type classical > > > > > linear system matrix = precond matrix: > > > > > Mat Object: A 1 MPI processes > > > > > type: seqaij > > > > > rows=497664, cols=497664 > > > > > total: nonzeros=3363552, allocated nonzeros=3483648 > > > > > total number of mallocs used during MatSetValues calls =0 > > > > > has attached null space > > > > > not using I-node routines > > > > > 0 KSP preconditioned resid norm 1.630377897956e+01 true resid > norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00 > > > > > 1 KSP preconditioned resid norm 3.815819909205e-02 true resid > norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01 > > > > > 2 KSP preconditioned resid norm 4.312671277701e-04 true resid > norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03 > > > > > 3 KSP preconditioned resid norm 3.011875330569e-04 true resid > norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05 > > > > > 4 KSP preconditioned resid norm 3.011783821295e-04 true resid > norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05 > > > > > 5 KSP preconditioned resid norm 3.011783818372e-04 true resid > norm 2.504053673004e-04 ||r(i)||/||b|| 3.399594279422e-05 > > > > > 6 KSP preconditioned resid norm 3.011783818375e-04 true resid > norm 2.503984078890e-04 ||r(i)||/||b|| 3.399499795925e-05 > > > > > 7 KSP preconditioned resid norm 3.011783887442e-04 true resid > norm 2.504121501772e-04 ||r(i)||/||b|| 3.399686366224e-05 > > > > > 8 KSP preconditioned resid norm 3.010913654181e-04 true resid > norm 2.504150259031e-04 ||r(i)||/||b|| 3.399725408124e-05 > > > > > 9 KSP preconditioned resid norm 3.006520688232e-04 true resid > norm 2.504061607382e-04 ||r(i)||/||b|| 3.399605051423e-05 > > > > > 10 KSP preconditioned resid norm 3.007309991942e-04 true resid > norm 2.503843638523e-04 ||r(i)||/||b|| 3.399309128978e-05 > > > > > 11 KSP preconditioned resid norm 3.015946168077e-04 true resid > norm 2.503644677844e-04 ||r(i)||/||b|| 3.399039012728e-05 > > > > > 12 KSP preconditioned resid norm 2.956643907377e-04 true resid > norm 2.503863046509e-04 ||r(i)||/||b|| 3.399335477965e-05 > > > > > 13 KSP preconditioned resid norm 
2.997992358936e-04 true resid > norm 2.504336903093e-04 ||r(i)||/||b|| 3.399978802886e-05 > > > > > 14 KSP preconditioned resid norm 2.481415420420e-05 true resid > norm 2.491591201250e-04 ||r(i)||/||b|| 3.382674774806e-05 > > > > > 15 KSP preconditioned resid norm 2.615494786181e-05 true resid > norm 2.490353237273e-04 ||r(i)||/||b|| 3.380994069915e-05 > > > > > 16 KSP preconditioned resid norm 2.645126692130e-05 true resid > norm 2.490535523344e-04 ||r(i)||/||b|| 3.381241548111e-05 > > > > > 17 KSP preconditioned resid norm 2.667223026209e-05 true resid > norm 2.490482602536e-04 ||r(i)||/||b|| 3.381169700898e-05 > > > > > 18 KSP preconditioned resid norm 2.650813432116e-05 true resid > norm 2.490473169014e-04 ||r(i)||/||b|| 3.381156893606e-05 > > > > > 19 KSP preconditioned resid norm 2.613309555449e-05 true resid > norm 2.490465633690e-04 ||r(i)||/||b|| 3.381146663375e-05 > > > > > 20 KSP preconditioned resid norm 2.644160446804e-05 true resid > norm 2.490532739949e-04 ||r(i)||/||b|| 3.381237769272e-05 > > > > > 21 KSP preconditioned resid norm 2.635987608975e-05 true resid > norm 2.490499548926e-04 ||r(i)||/||b|| 3.381192707933e-05 > > > > > 22 KSP preconditioned resid norm 2.640527129095e-05 true resid > norm 2.490594066529e-04 ||r(i)||/||b|| 3.381321028466e-05 > > > > > 23 KSP preconditioned resid norm 2.627505117691e-05 true resid > norm 2.490550162585e-04 ||r(i)||/||b|| 3.381261422875e-05 > > > > > 24 KSP preconditioned resid norm 2.642659196388e-05 true resid > norm 2.490504347640e-04 ||r(i)||/||b|| 3.381199222842e-05 > > > > > 25 KSP preconditioned resid norm 2.659432190695e-05 true resid > norm 2.490510775152e-04 ||r(i)||/||b|| 3.381207949065e-05 > > > > > 26 KSP preconditioned resid norm 2.687918062951e-05 true resid > norm 2.490518882015e-04 ||r(i)||/||b|| 3.381218955237e-05 > > > > > 27 KSP preconditioned resid norm 2.662909048432e-05 true resid > norm 2.490446263285e-04 ||r(i)||/||b|| 3.381120365409e-05 > > > > > 28 KSP preconditioned resid norm 2.085466483199e-05 true resid > norm 2.490131612366e-04 ||r(i)||/||b|| 3.380693183886e-05 > > > > > 29 KSP preconditioned resid norm 2.098541330282e-05 true resid > norm 2.490126933398e-04 ||r(i)||/||b|| 3.380686831549e-05 > > > > > 30 KSP preconditioned resid norm 2.175345180286e-05 true resid > norm 2.490098852429e-04 ||r(i)||/||b|| 3.380648707805e-05 > > > > > 31 KSP preconditioned resid norm 2.182182437676e-05 true resid > norm 2.490028301020e-04 ||r(i)||/||b|| 3.380552924648e-05 > > > > > 32 KSP preconditioned resid norm 2.152970404369e-05 true resid > norm 2.490089939838e-04 ||r(i)||/||b|| 3.380636607747e-05 > > > > > 33 KSP preconditioned resid norm 2.187932450016e-05 true resid > norm 2.490085293931e-04 ||r(i)||/||b|| 3.380630300295e-05 > > > > > 34 KSP preconditioned resid norm 2.207255875067e-05 true resid > norm 2.490039036092e-04 ||r(i)||/||b|| 3.380567498971e-05 > > > > > 35 KSP preconditioned resid norm 2.205060279701e-05 true resid > norm 2.490101636150e-04 ||r(i)||/||b|| 3.380652487086e-05 > > > > > 36 KSP preconditioned resid norm 2.168654200416e-05 true resid > norm 2.490091609876e-04 ||r(i)||/||b|| 3.380638875052e-05 > > > > > 37 KSP preconditioned resid norm 2.164521042361e-05 true resid > norm 2.490083143913e-04 ||r(i)||/||b|| 3.380627381352e-05 > > > > > 38 KSP preconditioned resid norm 2.154429063973e-05 true resid > norm 2.490075485470e-04 ||r(i)||/||b|| 3.380616983972e-05 > > > > > 39 KSP preconditioned resid norm 2.165962086228e-05 true resid > norm 2.490099695056e-04 ||r(i)||/||b|| 3.380649851786e-05 > > > 
> > 40 KSP preconditioned resid norm 2.153877616091e-05 true resid > norm 2.490090652619e-04 ||r(i)||/||b|| 3.380637575444e-05 > > > > > 41 KSP preconditioned resid norm 2.347651187611e-05 true resid > norm 2.490233544624e-04 ||r(i)||/||b|| 3.380831570825e-05 > > > > > 42 KSP preconditioned resid norm 2.352860162514e-05 true resid > norm 2.490191394202e-04 ||r(i)||/||b|| 3.380774345879e-05 > > > > > 43 KSP preconditioned resid norm 2.312377506928e-05 true resid > norm 2.490209491359e-04 ||r(i)||/||b|| 3.380798915237e-05 > > > > > 44 KSP preconditioned resid norm 2.295770973533e-05 true resid > norm 2.490178136759e-04 ||r(i)||/||b|| 3.380756347093e-05 > > > > > 45 KSP preconditioned resid norm 2.833646456041e-05 true resid > norm 2.489991602651e-04 ||r(i)||/||b|| 3.380503101608e-05 > > > > > 46 KSP preconditioned resid norm 2.760296424494e-05 true resid > norm 2.490104320666e-04 ||r(i)||/||b|| 3.380656131682e-05 > > > > > 47 KSP preconditioned resid norm 2.451504295239e-05 true resid > norm 2.490241388672e-04 ||r(i)||/||b|| 3.380842220189e-05 > > > > > 48 KSP preconditioned resid norm 2.512391514098e-05 true resid > norm 2.490245923753e-04 ||r(i)||/||b|| 3.380848377180e-05 > > > > > 49 KSP preconditioned resid norm 2.483419450528e-05 true resid > norm 2.490273364402e-04 ||r(i)||/||b|| 3.380885631602e-05 > > > > > 50 KSP preconditioned resid norm 2.507460538466e-05 true resid > norm 2.490309488780e-04 ||r(i)||/||b|| 3.380934675371e-05 > > > > > 51 KSP preconditioned resid norm 2.499708772881e-05 true resid > norm 2.490300908170e-04 ||r(i)||/||b|| 3.380923026022e-05 > > > > > 52 KSP preconditioned resid norm 1.059778259446e-05 true resid > norm 2.489352833521e-04 ||r(i)||/||b|| 3.379635885420e-05 > > > > > 53 KSP preconditioned resid norm 1.074975117060e-05 true resid > norm 2.489294722901e-04 ||r(i)||/||b|| 3.379556992330e-05 > > > > > 54 KSP preconditioned resid norm 1.095242219559e-05 true resid > norm 2.489295454212e-04 ||r(i)||/||b|| 3.379557985184e-05 > > > > > 55 KSP preconditioned resid norm 8.359999674720e-06 true resid > norm 2.489673581944e-04 ||r(i)||/||b|| 3.380071345137e-05 > > > > > 56 KSP preconditioned resid norm 8.368232998470e-06 true resid > norm 2.489700421343e-04 ||r(i)||/||b|| 3.380107783281e-05 > > > > > 57 KSP preconditioned resid norm 8.443378041101e-06 true resid > norm 2.489702900875e-04 ||r(i)||/||b|| 3.380111149584e-05 > > > > > 58 KSP preconditioned resid norm 8.647159584302e-06 true resid > norm 2.489640805831e-04 ||r(i)||/||b|| 3.380026847095e-05 > > > > > 59 KSP preconditioned resid norm 1.024742790737e-05 true resid > norm 2.489447846660e-04 ||r(i)||/||b|| 3.379764878711e-05 > > > > > 60 KSP preconditioned resid norm 1.033394118910e-05 true resid > norm 2.489441404923e-04 ||r(i)||/||b|| 3.379756133175e-05 > > > > > 61 KSP preconditioned resid norm 1.030066336446e-05 true resid > norm 2.489399918556e-04 ||r(i)||/||b|| 3.379699809776e-05 > > > > > 62 KSP preconditioned resid norm 1.029956398963e-05 true resid > norm 2.489445295139e-04 ||r(i)||/||b|| 3.379761414674e-05 > > > > > 63 KSP preconditioned resid norm 1.028190129002e-05 true resid > norm 2.489456200527e-04 ||r(i)||/||b|| 3.379776220225e-05 > > > > > 64 KSP preconditioned resid norm 9.878799185773e-06 true resid > norm 2.489488742330e-04 ||r(i)||/||b|| 3.379820400160e-05 > > > > > 65 KSP preconditioned resid norm 9.917711104174e-06 true resid > norm 2.489478066593e-04 ||r(i)||/||b|| 3.379805906391e-05 > > > > > 66 KSP preconditioned resid norm 1.003572019576e-05 true resid > norm 2.489441995703e-04 
||r(i)||/||b|| 3.379756935240e-05 > > > > > 67 KSP preconditioned resid norm 9.924487278236e-06 true resid > norm 2.489475403451e-04 ||r(i)||/||b|| 3.379802290812e-05 > > > > > 68 KSP preconditioned resid norm 9.804213483359e-06 true resid > norm 2.489457781760e-04 ||r(i)||/||b|| 3.379778366964e-05 > > > > > 69 KSP preconditioned resid norm 9.748922705476e-06 true resid > norm 2.489408473578e-04 ||r(i)||/||b|| 3.379711424383e-05 > > > > > 70 KSP preconditioned resid norm 9.886044523689e-06 true resid > norm 2.489514438395e-04 ||r(i)||/||b|| 3.379855286071e-05 > > > > > 71 KSP preconditioned resid norm 1.083888478689e-05 true resid > norm 2.489420898851e-04 ||r(i)||/||b|| 3.379728293386e-05 > > > > > 72 KSP preconditioned resid norm 1.106561823757e-05 true resid > norm 2.489364778104e-04 ||r(i)||/||b|| 3.379652101821e-05 > > > > > 73 KSP preconditioned resid norm 1.132091515426e-05 true resid > norm 2.489456804535e-04 ||r(i)||/||b|| 3.379777040248e-05 > > > > > 74 KSP preconditioned resid norm 1.330905328963e-05 true resid > norm 2.489317458981e-04 ||r(i)||/||b|| 3.379587859660e-05 > > > > > 75 KSP preconditioned resid norm 1.305555302619e-05 true resid > norm 2.489320939810e-04 ||r(i)||/||b|| 3.379592585359e-05 > > > > > 76 KSP preconditioned resid norm 1.308083397399e-05 true resid > norm 2.489299951581e-04 ||r(i)||/||b|| 3.379564090977e-05 > > > > > 77 KSP preconditioned resid norm 1.320098861853e-05 true resid > norm 2.489323669317e-04 ||r(i)||/||b|| 3.379596291036e-05 > > > > > 78 KSP preconditioned resid norm 1.300160788274e-05 true resid > norm 2.489306393356e-04 ||r(i)||/||b|| 3.379572836564e-05 > > > > > 79 KSP preconditioned resid norm 1.317651537793e-05 true resid > norm 2.489381364970e-04 ||r(i)||/||b|| 3.379674620752e-05 > > > > > 80 KSP preconditioned resid norm 1.309769805765e-05 true resid > norm 2.489285056062e-04 ||r(i)||/||b|| 3.379543868279e-05 > > > > > 81 KSP preconditioned resid norm 1.293686496271e-05 true resid > norm 2.489347818072e-04 ||r(i)||/||b|| 3.379629076264e-05 > > > > > 82 KSP preconditioned resid norm 1.311788285799e-05 true resid > norm 2.489320040215e-04 ||r(i)||/||b|| 3.379591364037e-05 > > > > > 83 KSP preconditioned resid norm 1.313667378798e-05 true resid > norm 2.489329437217e-04 ||r(i)||/||b|| 3.379604121748e-05 > > > > > 84 KSP preconditioned resid norm 1.416138205017e-05 true resid > norm 2.489266908838e-04 ||r(i)||/||b|| 3.379519230948e-05 > > > > > 85 KSP preconditioned resid norm 1.452253464774e-05 true resid > norm 2.489285688375e-04 ||r(i)||/||b|| 3.379544726729e-05 > > > > > 86 KSP preconditioned resid norm 1.426709413370e-05 true resid > norm 2.489362313402e-04 ||r(i)||/||b|| 3.379648755651e-05 > > > > > 87 KSP preconditioned resid norm 1.427480849552e-05 true resid > norm 2.489378183000e-04 ||r(i)||/||b|| 3.379670300795e-05 > > > > > 88 KSP preconditioned resid norm 1.413870980147e-05 true resid > norm 2.489325756118e-04 ||r(i)||/||b|| 3.379599124153e-05 > > > > > 89 KSP preconditioned resid norm 1.353259857657e-05 true resid > norm 2.489318968308e-04 ||r(i)||/||b|| 3.379589908776e-05 > > > > > 90 KSP preconditioned resid norm 1.347676448611e-05 true resid > norm 2.489332074417e-04 ||r(i)||/||b|| 3.379607702106e-05 > > > > > 91 KSP preconditioned resid norm 1.362825902909e-05 true resid > norm 2.489344974971e-04 ||r(i)||/||b|| 3.379625216367e-05 > > > > > 92 KSP preconditioned resid norm 1.346280901052e-05 true resid > norm 2.489302570131e-04 ||r(i)||/||b|| 3.379567646016e-05 > > > > > 93 KSP preconditioned resid norm 1.328052169696e-05 
true resid > norm 2.489346601224e-04 ||r(i)||/||b|| 3.379627424228e-05 > > > > > 94 KSP preconditioned resid norm 1.554682082515e-05 true resid > norm 2.489309078759e-04 ||r(i)||/||b|| 3.379576482365e-05 > > > > > 95 KSP preconditioned resid norm 1.557128675775e-05 true resid > norm 2.489317143582e-04 ||r(i)||/||b|| 3.379587431462e-05 > > > > > 96 KSP preconditioned resid norm 1.542571813923e-05 true resid > norm 2.489319910303e-04 ||r(i)||/||b|| 3.379591187663e-05 > > > > > 97 KSP preconditioned resid norm 1.570516684444e-05 true resid > norm 2.489321980894e-04 ||r(i)||/||b|| 3.379593998772e-05 > > > > > 98 KSP preconditioned resid norm 1.600431789899e-05 true resid > norm 2.489297450311e-04 ||r(i)||/||b|| 3.379560695162e-05 > > > > > 99 KSP preconditioned resid norm 1.587495554658e-05 true resid > norm 2.489339000570e-04 ||r(i)||/||b|| 3.379617105303e-05 > > > > > 100 KSP preconditioned resid norm 1.621163002878e-05 true resid > norm 2.489299953360e-04 ||r(i)||/||b|| 3.379564093392e-05 > > > > > 101 KSP preconditioned resid norm 1.627060872574e-05 true resid > norm 2.489301570161e-04 ||r(i)||/||b|| 3.379566288419e-05 > > > > > 102 KSP preconditioned resid norm 1.622931647243e-05 true resid > norm 2.489277930910e-04 ||r(i)||/||b|| 3.379534194913e-05 > > > > > 103 KSP preconditioned resid norm 1.612544300282e-05 true resid > norm 2.489317483299e-04 ||r(i)||/||b|| 3.379587892674e-05 > > > > > 104 KSP preconditioned resid norm 1.880131646630e-05 true resid > norm 2.489335862583e-04 ||r(i)||/||b|| 3.379612845059e-05 > > > > > 105 KSP preconditioned resid norm 1.880563295793e-05 true resid > norm 2.489365017923e-04 ||r(i)||/||b|| 3.379652427408e-05 > > > > > 106 KSP preconditioned resid norm 1.860619184027e-05 true resid > norm 2.489362382373e-04 ||r(i)||/||b|| 3.379648849288e-05 > > > > > 107 KSP preconditioned resid norm 1.877134148719e-05 true resid > norm 2.489425484523e-04 ||r(i)||/||b|| 3.379734519061e-05 > > > > > 108 KSP preconditioned resid norm 1.914810713538e-05 true resid > norm 2.489347415573e-04 ||r(i)||/||b|| 3.379628529818e-05 > > > > > 109 KSP preconditioned resid norm 1.220673255622e-05 true resid > norm 2.490196186357e-04 ||r(i)||/||b|| 3.380780851884e-05 > > > > > 110 KSP preconditioned resid norm 1.215819132910e-05 true resid > norm 2.490233518072e-04 ||r(i)||/||b|| 3.380831534776e-05 > > > > > 111 KSP preconditioned resid norm 1.196565427400e-05 true resid > norm 2.490279438906e-04 ||r(i)||/||b|| 3.380893878570e-05 > > > > > 112 KSP preconditioned resid norm 1.171748185197e-05 true resid > norm 2.490242578983e-04 ||r(i)||/||b|| 3.380843836198e-05 > > > > > 113 KSP preconditioned resid norm 1.162855824118e-05 true resid > norm 2.490229018536e-04 ||r(i)||/||b|| 3.380825426043e-05 > > > > > 114 KSP preconditioned resid norm 1.175594685689e-05 true resid > norm 2.490274328440e-04 ||r(i)||/||b|| 3.380886940415e-05 > > > > > 115 KSP preconditioned resid norm 1.167979454122e-05 true resid > norm 2.490271161036e-04 ||r(i)||/||b|| 3.380882640232e-05 > > > > > 116 KSP preconditioned resid norm 1.181010893019e-05 true resid > norm 2.490235420657e-04 ||r(i)||/||b|| 3.380834117795e-05 > > > > > 117 KSP preconditioned resid norm 1.175206638194e-05 true resid > norm 2.490263165345e-04 ||r(i)||/||b|| 3.380871784992e-05 > > > > > 118 KSP preconditioned resid norm 1.183804125791e-05 true resid > norm 2.490221353083e-04 ||r(i)||/||b|| 3.380815019145e-05 > > > > > 119 KSP preconditioned resid norm 1.186426973727e-05 true resid > norm 2.490227115336e-04 ||r(i)||/||b|| 3.380822842189e-05 > > > 
> > 120 KSP preconditioned resid norm 1.181986776689e-05 true resid > norm 2.490257884230e-04 ||r(i)||/||b|| 3.380864615159e-05 > > > > > 121 KSP preconditioned resid norm 1.131443277370e-05 true resid > norm 2.490259110230e-04 ||r(i)||/||b|| 3.380866279620e-05 > > > > > 122 KSP preconditioned resid norm 1.114920075859e-05 true resid > norm 2.490249829382e-04 ||r(i)||/||b|| 3.380853679603e-05 > > > > > 123 KSP preconditioned resid norm 1.082073321672e-05 true resid > norm 2.490314084868e-04 ||r(i)||/||b|| 3.380940915187e-05 > > > > > 124 KSP preconditioned resid norm 3.307785860990e-06 true resid > norm 2.490613501549e-04 ||r(i)||/||b|| 3.381347414155e-05 > > > > > 125 KSP preconditioned resid norm 3.287051720572e-06 true resid > norm 2.490584648195e-04 ||r(i)||/||b|| 3.381308241794e-05 > > > > > 126 KSP preconditioned resid norm 3.286797046069e-06 true resid > norm 2.490654396386e-04 ||r(i)||/||b|| 3.381402934473e-05 > > > > > 127 KSP preconditioned resid norm 3.311592899411e-06 true resid > norm 2.490627973588e-04 ||r(i)||/||b|| 3.381367061922e-05 > > > > > 128 KSP preconditioned resid norm 3.560993694635e-06 true resid > norm 2.490571732816e-04 ||r(i)||/||b|| 3.381290707406e-05 > > > > > 129 KSP preconditioned resid norm 3.411994617661e-06 true resid > norm 2.490652122141e-04 ||r(i)||/||b|| 3.381399846875e-05 > > > > > 130 KSP preconditioned resid norm 3.412383310721e-06 true resid > norm 2.490633454022e-04 ||r(i)||/||b|| 3.381374502359e-05 > > > > > 131 KSP preconditioned resid norm 3.288320044878e-06 true resid > norm 2.490639470096e-04 ||r(i)||/||b|| 3.381382669999e-05 > > > > > 132 KSP preconditioned resid norm 3.273215756565e-06 true resid > norm 2.490640390847e-04 ||r(i)||/||b|| 3.381383920043e-05 > > > > > 133 KSP preconditioned resid norm 3.236969051459e-06 true resid > norm 2.490678216102e-04 ||r(i)||/||b|| 3.381435272985e-05 > > > > > 134 KSP preconditioned resid norm 3.203260913942e-06 true resid > norm 2.490640965346e-04 ||r(i)||/||b|| 3.381384700005e-05 > > > > > 135 KSP preconditioned resid norm 3.224117152353e-06 true resid > norm 2.490655026376e-04 ||r(i)||/||b|| 3.381403789770e-05 > > > > > 136 KSP preconditioned resid norm 3.221577997984e-06 true resid > norm 2.490684737611e-04 ||r(i)||/||b|| 3.381444126823e-05 > > > > > 137 KSP preconditioned resid norm 3.195936222128e-06 true resid > norm 2.490673982333e-04 ||r(i)||/||b|| 3.381429525066e-05 > > > > > 138 KSP preconditioned resid norm 3.207528137426e-06 true resid > norm 2.490641247196e-04 ||r(i)||/||b|| 3.381385082655e-05 > > > > > 139 KSP preconditioned resid norm 3.240134271963e-06 true resid > norm 2.490615861251e-04 ||r(i)||/||b|| 3.381350617773e-05 > > > > > 140 KSP preconditioned resid norm 2.698833607230e-06 true resid > norm 2.490638954889e-04 ||r(i)||/||b|| 3.381381970535e-05 > > > > > 141 KSP preconditioned resid norm 2.599151209137e-06 true resid > norm 2.490657106698e-04 ||r(i)||/||b|| 3.381406614091e-05 > > > > > 142 KSP preconditioned resid norm 2.633939920994e-06 true resid > norm 2.490707754695e-04 ||r(i)||/||b|| 3.381475375652e-05 > > > > > 143 KSP preconditioned resid norm 2.519609221376e-06 true resid > norm 2.490639100480e-04 ||r(i)||/||b|| 3.381382168195e-05 > > > > > 144 KSP preconditioned resid norm 3.768526937684e-06 true resid > norm 2.490654096698e-04 ||r(i)||/||b|| 3.381402527606e-05 > > > > > 145 KSP preconditioned resid norm 3.707841943289e-06 true resid > norm 2.490630207923e-04 ||r(i)||/||b|| 3.381370095336e-05 > > > > > 146 KSP preconditioned resid norm 3.698827503486e-06 true resid > 
norm 2.490646071561e-04 ||r(i)||/||b|| 3.381391632387e-05 > > > > > 147 KSP preconditioned resid norm 3.642747039615e-06 true resid > norm 2.490610990161e-04 ||r(i)||/||b|| 3.381344004604e-05 > > > > > 148 KSP preconditioned resid norm 3.613100087842e-06 true resid > norm 2.490617159023e-04 ||r(i)||/||b|| 3.381352379676e-05 > > > > > 149 KSP preconditioned resid norm 3.637646399299e-06 true resid > norm 2.490648063023e-04 ||r(i)||/||b|| 3.381394336069e-05 > > > > > 150 KSP preconditioned resid norm 3.640235367864e-06 true resid > norm 2.490648516718e-04 ||r(i)||/||b|| 3.381394952022e-05 > > > > > 151 KSP preconditioned resid norm 3.724708848977e-06 true resid > norm 2.490622201040e-04 ||r(i)||/||b|| 3.381359224901e-05 > > > > > 152 KSP preconditioned resid norm 3.665185002770e-06 true resid > norm 2.490664302790e-04 ||r(i)||/||b|| 3.381416383766e-05 > > > > > 153 KSP preconditioned resid norm 3.348992579120e-06 true resid > norm 2.490655722697e-04 ||r(i)||/||b|| 3.381404735121e-05 > > > > > 154 KSP preconditioned resid norm 3.309431137943e-06 true resid > norm 2.490727563300e-04 ||r(i)||/||b|| 3.381502268535e-05 > > > > > 155 KSP preconditioned resid norm 3.299031245428e-06 true resid > norm 2.490688392843e-04 ||r(i)||/||b|| 3.381449089298e-05 > > > > > 156 KSP preconditioned resid norm 3.297127463503e-06 true resid > norm 2.490642207769e-04 ||r(i)||/||b|| 3.381386386763e-05 > > > > > 157 KSP preconditioned resid norm 3.297370198641e-06 true resid > norm 2.490666651723e-04 ||r(i)||/||b|| 3.381419572764e-05 > > > > > 158 KSP preconditioned resid norm 3.290873165210e-06 true resid > norm 2.490679189538e-04 ||r(i)||/||b|| 3.381436594557e-05 > > > > > 159 KSP preconditioned resid norm 3.346705292419e-06 true resid > norm 2.490617329776e-04 ||r(i)||/||b|| 3.381352611496e-05 > > > > > 160 KSP preconditioned resid norm 3.429583550890e-06 true resid > norm 2.490675116236e-04 ||r(i)||/||b|| 3.381431064494e-05 > > > > > 161 KSP preconditioned resid norm 3.425238504679e-06 true resid > norm 2.490648199058e-04 ||r(i)||/||b|| 3.381394520756e-05 > > > > > 162 KSP preconditioned resid norm 3.423484857849e-06 true resid > norm 2.490723208298e-04 ||r(i)||/||b|| 3.381496356025e-05 > > > > > 163 KSP preconditioned resid norm 3.383655922943e-06 true resid > norm 2.490659981249e-04 ||r(i)||/||b|| 3.381410516686e-05 > > > > > 164 KSP preconditioned resid norm 3.477197358452e-06 true resid > norm 2.490665979073e-04 ||r(i)||/||b|| 3.381418659549e-05 > > > > > 165 KSP preconditioned resid norm 3.454672202601e-06 true resid > norm 2.490651358644e-04 ||r(i)||/||b|| 3.381398810323e-05 > > > > > 166 KSP preconditioned resid norm 3.399075522566e-06 true resid > norm 2.490678159511e-04 ||r(i)||/||b|| 3.381435196154e-05 > > > > > 167 KSP preconditioned resid norm 3.305455787400e-06 true resid > norm 2.490651924523e-04 ||r(i)||/||b|| 3.381399578581e-05 > > > > > 168 KSP preconditioned resid norm 3.368445533284e-06 true resid > norm 2.490688061735e-04 ||r(i)||/||b|| 3.381448639774e-05 > > > > > 169 KSP preconditioned resid norm 2.981519724814e-06 true resid > norm 2.490676378334e-04 ||r(i)||/||b|| 3.381432777964e-05 > > > > > 170 KSP preconditioned resid norm 3.034423065539e-06 true resid > norm 2.490694458885e-04 ||r(i)||/||b|| 3.381457324777e-05 > > > > > 171 KSP preconditioned resid norm 2.885972780503e-06 true resid > norm 2.490688033729e-04 ||r(i)||/||b|| 3.381448601752e-05 > > > > > 172 KSP preconditioned resid norm 2.892491075033e-06 true resid > norm 2.490692993765e-04 ||r(i)||/||b|| 3.381455335678e-05 > > > > > 
173 KSP preconditioned resid norm 2.921316177611e-06 true resid > norm 2.490697629787e-04 ||r(i)||/||b|| 3.381461629709e-05 > > > > > 174 KSP preconditioned resid norm 2.999889222269e-06 true resid > norm 2.490707272626e-04 ||r(i)||/||b|| 3.381474721178e-05 > > > > > 175 KSP preconditioned resid norm 2.975590207575e-06 true resid > norm 2.490685439925e-04 ||r(i)||/||b|| 3.381445080310e-05 > > > > > 176 KSP preconditioned resid norm 2.983065843597e-06 true resid > norm 2.490701883671e-04 ||r(i)||/||b|| 3.381467404937e-05 > > > > > 177 KSP preconditioned resid norm 2.965959610245e-06 true resid > norm 2.490711538630e-04 ||r(i)||/||b|| 3.381480512861e-05 > > > > > 178 KSP preconditioned resid norm 3.005389788827e-06 true resid > norm 2.490751808095e-04 ||r(i)||/||b|| 3.381535184150e-05 > > > > > 179 KSP preconditioned resid norm 2.956581668772e-06 true resid > norm 2.490653125636e-04 ||r(i)||/||b|| 3.381401209257e-05 > > > > > 180 KSP preconditioned resid norm 2.937498883661e-06 true resid > norm 2.490666056653e-04 ||r(i)||/||b|| 3.381418764874e-05 > > > > > 181 KSP preconditioned resid norm 2.913227475431e-06 true resid > norm 2.490682436979e-04 ||r(i)||/||b|| 3.381441003402e-05 > > > > > 182 KSP preconditioned resid norm 3.048172862254e-06 true resid > norm 2.490719669872e-04 ||r(i)||/||b|| 3.381491552130e-05 > > > > > 183 KSP preconditioned resid norm 3.023868104933e-06 true resid > norm 2.490648745555e-04 ||r(i)||/||b|| 3.381395262699e-05 > > > > > 184 KSP preconditioned resid norm 2.985947506400e-06 true resid > norm 2.490638818852e-04 ||r(i)||/||b|| 3.381381785846e-05 > > > > > 185 KSP preconditioned resid norm 2.840032055776e-06 true resid > norm 2.490701112392e-04 ||r(i)||/||b|| 3.381466357820e-05 > > > > > 186 KSP preconditioned resid norm 2.229279683815e-06 true resid > norm 2.490609220680e-04 ||r(i)||/||b|| 3.381341602292e-05 > > > > > 187 KSP preconditioned resid norm 2.441513276379e-06 true resid > norm 2.490674056899e-04 ||r(i)||/||b|| 3.381429626300e-05 > > > > > 188 KSP preconditioned resid norm 2.467046864016e-06 true resid > norm 2.490691622632e-04 ||r(i)||/||b|| 3.381453474178e-05 > > > > > 189 KSP preconditioned resid norm 2.482124586361e-06 true resid > norm 2.490664992339e-04 ||r(i)||/||b|| 3.381417319923e-05 > > > > > 190 KSP preconditioned resid norm 2.470564926502e-06 true resid > norm 2.490617019713e-04 ||r(i)||/||b|| 3.381352190543e-05 > > > > > 191 KSP preconditioned resid norm 2.457947086578e-06 true resid > norm 2.490628644250e-04 ||r(i)||/||b|| 3.381367972437e-05 > > > > > 192 KSP preconditioned resid norm 2.469444741724e-06 true resid > norm 2.490639416335e-04 ||r(i)||/||b|| 3.381382597011e-05 > > > > > 193 KSP preconditioned resid norm 2.469951525219e-06 true resid > norm 2.490599769764e-04 ||r(i)||/||b|| 3.381328771385e-05 > > > > > 194 KSP preconditioned resid norm 2.467486786643e-06 true resid > norm 2.490630178622e-04 ||r(i)||/||b|| 3.381370055556e-05 > > > > > 195 KSP preconditioned resid norm 2.409684391404e-06 true resid > norm 2.490640302606e-04 ||r(i)||/||b|| 3.381383800245e-05 > > > > > 196 KSP preconditioned resid norm 2.456046691135e-06 true resid > norm 2.490637730235e-04 ||r(i)||/||b|| 3.381380307900e-05 > > > > > 197 KSP preconditioned resid norm 2.300015653805e-06 true resid > norm 2.490615406913e-04 ||r(i)||/||b|| 3.381350000947e-05 > > > > > 198 KSP preconditioned resid norm 2.238328275301e-06 true resid > norm 2.490647641246e-04 ||r(i)||/||b|| 3.381393763449e-05 > > > > > 199 KSP preconditioned resid norm 2.317293820319e-06 true resid > norm 
2.490641611282e-04 ||r(i)||/||b|| 3.381385576951e-05 > > > > > 200 KSP preconditioned resid norm 2.359590971314e-06 true resid > norm 2.490685242974e-04 ||r(i)||/||b|| 3.381444812922e-05 > > > > > 201 KSP preconditioned resid norm 2.311199691596e-06 true resid > norm 2.490656791753e-04 ||r(i)||/||b|| 3.381406186510e-05 > > > > > 202 KSP preconditioned resid norm 2.328772904196e-06 true resid > norm 2.490651045523e-04 ||r(i)||/||b|| 3.381398385220e-05 > > > > > 203 KSP preconditioned resid norm 2.332731604717e-06 true resid > norm 2.490649960574e-04 ||r(i)||/||b|| 3.381396912253e-05 > > > > > 204 KSP preconditioned resid norm 2.357629383490e-06 true resid > norm 2.490686317727e-04 ||r(i)||/||b|| 3.381446272046e-05 > > > > > 205 KSP preconditioned resid norm 2.374856180299e-06 true resid > norm 2.490645897176e-04 ||r(i)||/||b|| 3.381391395637e-05 > > > > > 206 KSP preconditioned resid norm 2.340395514404e-06 true resid > norm 2.490618341127e-04 ||r(i)||/||b|| 3.381353984542e-05 > > > > > 207 KSP preconditioned resid norm 2.314963680954e-06 true resid > norm 2.490676153984e-04 ||r(i)||/||b|| 3.381432473379e-05 > > > > > 208 KSP preconditioned resid norm 2.448070953106e-06 true resid > norm 2.490644606776e-04 ||r(i)||/||b|| 3.381389643743e-05 > > > > > 209 KSP preconditioned resid norm 2.428805110632e-06 true resid > norm 2.490635817597e-04 ||r(i)||/||b|| 3.381377711234e-05 > > > > > 210 KSP preconditioned resid norm 2.537929937808e-06 true resid > norm 2.490680589404e-04 ||r(i)||/||b|| 3.381438495066e-05 > > > > > 211 KSP preconditioned resid norm 2.515909029682e-06 true resid > norm 2.490687803038e-04 ||r(i)||/||b|| 3.381448288557e-05 > > > > > 212 KSP preconditioned resid norm 2.497907513266e-06 true resid > norm 2.490618016885e-04 ||r(i)||/||b|| 3.381353544340e-05 > > > > > 213 KSP preconditioned resid norm 1.783501869502e-06 true resid > norm 2.490632647470e-04 ||r(i)||/||b|| 3.381373407354e-05 > > > > > 214 KSP preconditioned resid norm 1.767420653144e-06 true resid > norm 2.490685569328e-04 ||r(i)||/||b|| 3.381445255992e-05 > > > > > 215 KSP preconditioned resid norm 1.854926068272e-06 true resid > norm 2.490609365464e-04 ||r(i)||/||b|| 3.381341798856e-05 > > > > > 216 KSP preconditioned resid norm 1.818308539774e-06 true resid > norm 2.490639142283e-04 ||r(i)||/||b|| 3.381382224948e-05 > > > > > 217 KSP preconditioned resid norm 1.809431578070e-06 true resid > norm 2.490605125049e-04 ||r(i)||/||b|| 3.381336041915e-05 > > > > > 218 KSP preconditioned resid norm 1.789862735999e-06 true resid > norm 2.490564024901e-04 ||r(i)||/||b|| 3.381280242859e-05 > > > > > 219 KSP preconditioned resid norm 1.769239890163e-06 true resid > norm 2.490647825316e-04 ||r(i)||/||b|| 3.381394013349e-05 > > > > > 220 KSP preconditioned resid norm 1.780760773109e-06 true resid > norm 2.490622606663e-04 ||r(i)||/||b|| 3.381359775589e-05 > > > > > 221 KSP preconditioned resid norm 5.009024913368e-07 true resid > norm 2.490659101637e-04 ||r(i)||/||b|| 3.381409322492e-05 > > > > > 222 KSP preconditioned resid norm 4.974450322799e-07 true resid > norm 2.490714287402e-04 ||r(i)||/||b|| 3.381484244693e-05 > > > > > 223 KSP preconditioned resid norm 4.938819481519e-07 true resid > norm 2.490665661715e-04 ||r(i)||/||b|| 3.381418228693e-05 > > > > > 224 KSP preconditioned resid norm 4.973231831266e-07 true resid > norm 2.490725000995e-04 ||r(i)||/||b|| 3.381498789855e-05 > > > > > 225 KSP preconditioned resid norm 5.086864036771e-07 true resid > norm 2.490664132954e-04 ||r(i)||/||b|| 3.381416153192e-05 > > > > > 226 KSP 
preconditioned resid norm 5.046954570561e-07 true resid > norm 2.490698772594e-04 ||r(i)||/||b|| 3.381463181226e-05 > > > > > 227 KSP preconditioned resid norm 5.086852920874e-07 true resid > norm 2.490703544723e-04 ||r(i)||/||b|| 3.381469660041e-05 > > > > > 228 KSP preconditioned resid norm 5.182381756169e-07 true resid > norm 2.490665200032e-04 ||r(i)||/||b|| 3.381417601896e-05 > > > > > 229 KSP preconditioned resid norm 5.261455182896e-07 true resid > norm 2.490697169472e-04 ||r(i)||/||b|| 3.381461004770e-05 > > > > > 230 KSP preconditioned resid norm 5.265262522400e-07 true resid > norm 2.490726890541e-04 ||r(i)||/||b|| 3.381501355172e-05 > > > > > 231 KSP preconditioned resid norm 5.220652263946e-07 true resid > norm 2.490689325236e-04 ||r(i)||/||b|| 3.381450355149e-05 > > > > > 232 KSP preconditioned resid norm 5.256466259888e-07 true resid > norm 2.490694033989e-04 ||r(i)||/||b|| 3.381456747923e-05 > > > > > 233 KSP preconditioned resid norm 5.443022648374e-07 true resid > norm 2.490650183144e-04 ||r(i)||/||b|| 3.381397214423e-05 > > > > > 234 KSP preconditioned resid norm 5.562619006436e-07 true resid > norm 2.490764576883e-04 ||r(i)||/||b|| 3.381552519520e-05 > > > > > 235 KSP preconditioned resid norm 5.998148629545e-07 true resid > norm 2.490714032716e-04 ||r(i)||/||b|| 3.381483898922e-05 > > > > > 236 KSP preconditioned resid norm 6.498977322955e-07 true resid > norm 2.490650270144e-04 ||r(i)||/||b|| 3.381397332537e-05 > > > > > 237 KSP preconditioned resid norm 6.503686003429e-07 true resid > norm 2.490706976108e-04 ||r(i)||/||b|| 3.381474318615e-05 > > > > > 238 KSP preconditioned resid norm 6.566719023119e-07 true resid > norm 2.490664107559e-04 ||r(i)||/||b|| 3.381416118714e-05 > > > > > 239 KSP preconditioned resid norm 6.549737473208e-07 true resid > norm 2.490721547909e-04 ||r(i)||/||b|| 3.381494101821e-05 > > > > > 240 KSP preconditioned resid norm 6.616898981418e-07 true resid > norm 2.490679659838e-04 ||r(i)||/||b|| 3.381437233053e-05 > > > > > 241 KSP preconditioned resid norm 6.829917691021e-07 true resid > norm 2.490728328614e-04 ||r(i)||/||b|| 3.381503307553e-05 > > > > > 242 KSP preconditioned resid norm 7.030239869389e-07 true resid > norm 2.490706345115e-04 ||r(i)||/||b|| 3.381473461955e-05 > > > > > 243 KSP preconditioned resid norm 7.018435683340e-07 true resid > norm 2.490650978460e-04 ||r(i)||/||b|| 3.381398294172e-05 > > > > > 244 KSP preconditioned resid norm 7.058047080376e-07 true resid > norm 2.490685975642e-04 ||r(i)||/||b|| 3.381445807618e-05 > > > > > 245 KSP preconditioned resid norm 6.896300385099e-07 true resid > norm 2.490708566380e-04 ||r(i)||/||b|| 3.381476477625e-05 > > > > > 246 KSP preconditioned resid norm 7.093960074437e-07 true resid > norm 2.490667427871e-04 ||r(i)||/||b|| 3.381420626490e-05 > > > > > 247 KSP preconditioned resid norm 7.817121711853e-07 true resid > norm 2.490692299030e-04 ||r(i)||/||b|| 3.381454392480e-05 > > > > > 248 KSP preconditioned resid norm 7.976109778309e-07 true resid > norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > > 249 KSP preconditioned resid norm 7.855322750445e-07 true resid > norm 2.490720861966e-04 ||r(i)||/||b|| 3.381493170560e-05 > > > > > 250 KSP preconditioned resid norm 7.778531114042e-07 true resid > norm 2.490673235034e-04 ||r(i)||/||b|| 3.381428510506e-05 > > > > > 251 KSP preconditioned resid norm 7.848682182070e-07 true resid > norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > > 252 KSP preconditioned resid norm 7.967291867330e-07 true resid > norm 
2.490724820229e-04 ||r(i)||/||b|| 3.381498544442e-05
> > > > > [KSP monitor output for iterations 253 through 346 trimmed: the preconditioned residual norm drifts between roughly 6.6e-07 and 2.0e-06 while the true residual norm stays fixed near 2.4907e-04 (||r(i)||/||b|| ~ 3.381e-05); the solve has stagnated. The same log is quoted again further down in this thread.]
> > > > >
> > > > > On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley <knepley at gmail.com> wrote:
> > > > > On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang wrote:
> > > > >
> > > > > ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
> > > > > ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);
> > > > >
> > > > > ierr = VecAssemblyBegin(x);
> > > > > ierr = VecAssemblyEnd(x);
> > > > >
> > > > > This is probably unnecessary
> > > > >
> > > > > ierr = VecAssemblyBegin(b);
> > > > > ierr = VecAssemblyEnd(b);
> > > > >
> > > > > This is probably unnecessary
> > > > >
> > > > > ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp);
> > > > > ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8
> > > > >
> > > > > Is your rhs consistent with this nullspace?
> > > > >
> > > > > // KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
> > > > > KSPSetOperators(ksp,A,A);
> > > > >
> > > > > KSPSetType(ksp,KSPBCGS);
> > > > >
> > > > > KSPSetComputeSingularValues(ksp, PETSC_TRUE);
> > > > > #if defined(__HYPRE__)
> > > > > KSPGetPC(ksp, &pc);
> > > > > PCSetType(pc, PCHYPRE);
> > > > > PCHYPRESetType(pc,"boomeramg");
> > > > >
> > > > > This is terribly unnecessary.
You just use
> > > > >
> > > > > -pc_type hypre -pc_hypre_type boomeramg
> > > > >
> > > > > or
> > > > >
> > > > > -pc_type gamg
> > > > >
> > > > > #else
> > > > > KSPSetType(ksp,KSPBCGSL);
> > > > > KSPBCGSLSetEll(ksp,2);
> > > > > #endif /* defined(__HYPRE__) */
> > > > >
> > > > > KSPSetFromOptions(ksp);
> > > > > KSPSetUp(ksp);
> > > > >
> > > > > ierr = KSPSolve(ksp,b,x);
> > > > >
> > > > > command line
> > > > >
> > > > > You did not provide any of what I asked for in the previous mail.
> > > > >
> > > > > Matt
> > > > >
> > > > > On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley <knepley at gmail.com> wrote:
> > > > > On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang wrote:
> > > > > hi,
> > > > >
> > > > > I implemented the HYPRE preconditioner for my study because, without a preconditioner, the PETSc solver takes thousands of iterations to converge for a fine-grid simulation.
> > > > >
> > > > > with HYPRE, depending on the parallel partition, it takes HYPRE forever to do anything. The observation from the output file is that the simulation hangs with no output.
> > > > >
> > > > > Any idea what happened? will post snippet of code.
> > > > >
> > > > > 1) For any question about convergence, we need to see the output of
> > > > >
> > > > > -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason
> > > > >
> > > > > 2) Hypre has many preconditioners; which one are you talking about?
> > > > >
> > > > > 3) PETSc has some preconditioners in common with Hypre, like AMG
> > > > >
> > > > > Thanks,
> > > > >
> > > > > Matt
> > > > >
> > > > > --
> > > > > Hao Zhang
> > > > > Dept. of Applied Mathematics and Statistics,
> > > > > Stony Brook University,
> > > > > Stony Brook, New York, 11790
> > > > >
> > > > > --
> > > > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> > > > > -- Norbert Wiener
> > > > >
> > > > > https://www.cse.buffalo.edu/~knepley/
> > > > >
> > > > > [repeated quoted signature blocks trimmed]

-- 
Hao Zhang
Dept. of Applied Mathematics and Statistics,
Stony Brook University,
Stony Brook, New York, 11790
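
For reference, a minimal sketch of the options-driven setup Matt is describing; this is an illustration, not code from the thread, and it assumes Mat A and Vec b, x have already been assembled elsewhere:

    #include <petscksp.h>

    KSP ksp;
    ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
    /* No PCSetType()/PCHYPRESetType() calls here: KSPSetFromOptions()
       reads -ksp_type and -pc_type from the command line, so the solver
       and preconditioner can be changed without recompiling. */
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
    ierr = KSPDestroy(&ksp);CHKERRQ(ierr);

The same executable can then be run with, e.g.,

    -ksp_type gmres -pc_type hypre -pc_hypre_type boomeramg

or

    -ksp_type gmres -pc_type gamg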

From bsmith at mcs.anl.gov  Sat Oct 21 23:08:05 2017
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Sat, 21 Oct 2017 23:08:05 -0500
Subject: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To: 
References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov>
 <64D7FC90-EBE8-409C-B300-04A118155A98@mcs.anl.gov>
Message-ID: <5DDD356C-2000-4803-9F3A-02B429898A7A@mcs.anl.gov>

   Oh, you change KSP but not hypre. I did not understand this. Why not just use GMRES all the time? Why mess with BCGS if it is not robust? Not worth the small optimization if it breaks everything.

   Barry

> On Oct 21, 2017, at 11:05 PM, Hao Zhang wrote:
>
> this is the initial pressure solver output regarding use of PETSc. it failed to converge after 40000 iterations, so GMRES is used next.
>
> 39987 KSP preconditioned resid norm 3.853125986269e-08 true resid norm 1.147359212216e-05 ||r(i)||/||b|| 1.557696568706e-06
> [iterations 39988-39999 trimmed: the preconditioned residual norm stays at 3.8531e-08 and the true residual norm at 1.14736e-05 throughout]
> 40000 KSP preconditioned resid norm 3.853122732867e-08 true resid norm 1.147359257776e-05 ||r(i)||/||b|| 1.557696630559e-06
> Linear solve did not converge due to DIVERGED_ITS iterations 40000
> KSP Object: 24 MPI processes
>   type: bcgs
>   maximum iterations=40000, initial guess is zero
>   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
>   left preconditioning
>   using PRECONDITIONED norm type for convergence test
> PC Object: 24 MPI processes
>   type: hypre
>     HYPRE BoomerAMG preconditioning
>       Cycle type V
>       Maximum number of levels 25
>       Maximum number of iterations PER hypre call 1
>       Convergence tolerance PER hypre call 0.
>       Threshold for strong coupling 0.25
>       Interpolation truncation factor 0.
>       Interpolation: max elements per row 0
>       Number of levels of aggressive coarsening 0
>       Number of paths for aggressive coarsening 1
>       Maximum row sums 0.9
>       Sweeps down 1
>       Sweeps up 1
>       Sweeps on coarse 1
>       Relax down symmetric-SOR/Jacobi
>       Relax up symmetric-SOR/Jacobi
>       Relax on coarse Gaussian-elimination
>       Relax weight (all) 1.
>       Outer relax weight (all) 1.
>       Using CF-relaxation
>       Not using more complex smoothers.
>       Measure type local
>       Coarsen type Falgout
>       Interpolation type classical
>   linear system matrix = precond matrix:
>   Mat Object: A 24 MPI processes
>     type: mpiaij
>     rows=497664, cols=497664
>     total: nonzeros=3363552, allocated nonzeros=6967296
>     total number of mallocs used during MatSetValues calls =0
>       has attached null space
>       not using I-node (on process 0) routines
>
> The solution diverges for p0! The residual is 3.853123e-08. Solve again using GMRES!
> KSP Object: 24 MPI processes
>   type: gmres
>     restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
>     happy breakdown tolerance 1e-30
>   maximum iterations=40000, initial guess is zero
>   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
>   left preconditioning
>   using PRECONDITIONED norm type for convergence test
> PC Object: 24 MPI processes
>   type: hypre
>     [HYPRE BoomerAMG settings identical to the view above]
>   linear system matrix = precond matrix:
>   Mat Object: A 24 MPI processes
>     type: mpiaij
>     rows=497664, cols=497664
>     total: nonzeros=3363552, allocated nonzeros=6967296
>     total number of mallocs used during MatSetValues calls =0
>       has attached null space
>       not using I-node (on process 0) routines
>   0 KSP preconditioned resid norm 1.593802941804e+01 true resid norm 7.365742695119e+00 ||r(i)||/||b|| 1.000000000000e+00
>   1 KSP preconditioned resid norm 6.338666661133e-01 true resid norm 2.880722209358e+00 ||r(i)||/||b|| 3.910973174867e-01
>   2 KSP preconditioned resid norm 3.913420828350e-02 true resid norm 9.544089760671e-01 ||r(i)||/||b|| 1.295740315093e-01
>   3 KSP preconditioned resid norm 2.928070366435e-03 true resid norm 1.874294004628e-01 ||r(i)||/||b|| 2.544609664236e-02
>   4 KSP preconditioned resid norm 2.165607525823e-04 true resid norm 3.121122463949e-02 ||r(i)||/||b|| 4.237349298146e-03
>   5 KSP preconditioned resid norm 1.635476480407e-05 true resid norm 3.984315313831e-03 ||r(i)||/||b|| 5.409251284967e-04
>   6 KSP preconditioned resid norm 1.283358350575e-06 true resid norm 4.566583802915e-04 ||r(i)||/||b|| 6.199760149022e-05
>   7 KSP preconditioned resid norm 8.479469225747e-08 true resid norm 3.824581791112e-05 ||r(i)||/||b|| 5.192391248810e-06
>   8 KSP preconditioned resid norm 5.358636504683e-09 true resid norm 2.829730442033e-06 ||r(i)||/||b|| 3.841744898187e-07
>   9 KSP preconditioned resid norm 3.447874504193e-10 true resid norm 1.756617036538e-07 ||r(i)||/||b|| 2.384847135242e-08
>   10 KSP preconditioned resid norm 2.480228743414e-11 true resid norm 1.309399823577e-08 ||r(i)||/||b|| 1.777688792258e-09
>   11 KSP preconditioned resid norm 1.728967759950e-12 true resid norm 9.406967045789e-10 ||r(i)||/||b|| 1.277124036931e-10
>   12 KSP preconditioned resid norm 1.075458632828e-13 true resid norm 5.994505136212e-11 ||r(i)||/||b|| 8.138358050689e-12
> Linear solve converged due to CONVERGED_RTOL iterations 12
> KSP Object: 24 MPI processes
>   type: gmres
>     restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
>     happy breakdown tolerance 1e-30
>   maximum iterations=40000, initial guess is zero
>   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
>   left preconditioning
>   using PRECONDITIONED norm type for convergence test
> PC Object: 24 MPI processes
>   type: hypre
>     [HYPRE BoomerAMG settings identical to the views above]
>   linear system matrix = precond matrix:
>   Mat Object: A 24 MPI processes
>     type: mpiaij
>     rows=497664, cols=497664
>     total: nonzeros=3363552, allocated nonzeros=6967296
>     total number of mallocs used during MatSetValues calls =0
>       has attached null space
>       not using I-node (on process 0) routines
> The max singular value of A = 1.000872 in poisson_solver3d_P0_vd
> The min singular value of A = 0.667688 in poisson_solver3d_P0_vd
> The Cond Num of A = 1.499012 in poisson_solver3d_P0_vd
> In poisson_solver3d_pressure(): num_iter = 12, rel_residual = 1.075459e-13
>
> The max value of p0 is 0.03115845493408858
>
> The min value of p0 is -0.07156715468428149
>
> On Sun, Oct 22, 2017 at 12:00 AM, Barry Smith wrote:
>
> > On Oct 21, 2017, at 10:50 PM, Hao Zhang wrote:
> >
> > the incompressible NS solver algorithm calls the PETSc solver at different stages of each time step. The one you were saying "This is good. 12 digit reduction" about is after the initial pressure solve, for which HYPRE usually doesn't give good convergence, so the fall-back solver GMRES is called afterwards.
>
>    Hmm, I don't understand. hypre should do well on a pressure solve. In fact, very well.
>
> > Barry, you were mentioning that I could have a wrong nullspace. that particular solver is aimed at giving an initial pressure profile for a 3d incompressible NS simulation using all Neumann boundary conditions. could you give some insight on how to test whether I have a wrong nullspace etc?
>
>    -ksp_test_null_space
>
>    But if your null space is consistently from all Neumann boundary conditions then it likely is not wrong.
>
>    Barry
>
> > Thanks!
> >
> > On Sat, Oct 21, 2017 at 11:41 PM, Barry Smith wrote:
> >
> >    This is good. You get more than a 12 digit reduction in the true residual norm. This is good AMG convergence. Expected when everything goes well.
> >
> >    What is different in this case from the previous case that does not converge reasonably?
> >
> >    Barry
> >
> > > On Oct 21, 2017, at 9:29 PM, Hao Zhang wrote:
> > >
> > > Barry, please advise what you make of this. this is the poisson solver with all Neumann BC, 3d case; a finite difference scheme was used.
> > > Thanks! I'm in learning mode.
> > >
> > > KSP Object: 24 MPI processes
> > >   type: bcgs
> > >   maximum iterations=40000, initial guess is zero
> > >   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
> > >   left preconditioning
> > >   using PRECONDITIONED norm type for convergence test
> > > PC Object: 24 MPI processes
> > >   type: hypre
> > >     [HYPRE BoomerAMG settings identical to the views above]
> > >   linear system matrix = precond matrix:
> > >   Mat Object: A 24 MPI processes
> > >     type: mpiaij
> > >     rows=497664, cols=497664
> > >     total: nonzeros=3363552, allocated nonzeros=6967296
> > >     total number of mallocs used during MatSetValues calls =0
> > >       has attached null space
> > >       not using I-node (on process 0) routines
> > >   0 KSP preconditioned resid norm 2.697270170623e-02 true resid norm 9.637159071207e+00 ||r(i)||/||b|| 1.000000000000e+00
> > >   1 KSP preconditioned resid norm 4.828857674609e-04 true resid norm 6.294379664645e-01 ||r(i)||/||b|| 6.531364293291e-02
> > >   2 KSP preconditioned resid norm 4.533649595815e-06 true resid norm 1.135508605857e-02 ||r(i)||/||b|| 1.178260727531e-03
> > >   3 KSP preconditioned resid norm 1.131704082606e-07 true resid norm 1.196393029874e-04 ||r(i)||/||b|| 1.241437462051e-05
> > >   4 KSP preconditioned resid norm 3.866281379569e-10 true resid norm 5.121520801846e-07 ||r(i)||/||b|| 5.314347064320e-08
> > >   5 KSP preconditioned resid norm 1.070114785241e-11 true resid norm 1.203693733135e-08 ||r(i)||/||b|| 1.249013038221e-09
> > >   6 KSP preconditioned resid norm 2.578780418765e-14 true resid norm 6.020297525927e-11 ||r(i)||/||b|| 6.246962908306e-12
> > >   7 KSP preconditioned resid norm 8.691764190203e-16 true resid norm 1.866088098154e-12 ||r(i)||/||b|| 1.936346680973e-13
> > > Linear solve converged due to CONVERGED_ATOL iterations 7
> > >
> > > On Sat, Oct 21, 2017 at 6:53 PM, Barry Smith wrote:
> > >
> > > > On Oct 21, 2017, at 5:47 PM, Hao Zhang wrote:
> > > >
> > > > hi, Barry:
> > > > what do you mean "absurd" by setting tolerance =1e-14?
> > >
> > >    Trying to decrease the initial residual norm by a factor of 1e-14 with an iterative method (or even a direct method) is unrealistic (usually unachievable) and almost never necessary. You are requiring ||r_n|| < 1.e-14 ||r_0|| when with double precision numbers you only have roughly 14 decimal digits total to compute with. Round off alone will lead to differences far larger than 1e-14.
> > >
> > >    If you are using the solver in the context of a nonlinear problem (i.e. inside Newton's method) then 1.e-6 is generally more than plenty to get quadratic convergence of Newton's method.
> > >
> > >    If you are solving a linear problem then it is extremely likely that errors due to discretization (from the finite element method etc.) and the model are much, much larger than even 1.e-8.
> > >
> > >    So, in summary
> > >
> > >    1.e-14 is probably unachievable
> > >
> > >    1.e-14 is almost for sure not needed.
> > >
> > >    Barry
> > >
> > > > On Sat, Oct 21, 2017 at 18:42 Barry Smith wrote:
> > > >
> > > >    Run with -ksp_view_mat binary -ksp_view_rhs binary and send the resulting output file called binaryoutput to petsc-maint at mcs.anl.gov
> > > >
> > > >    Note you can also use -ksp_type gmres with hypre, unlikely to be a reason to use bcgs
> > > >
> > > >    BTW: tolerances: relative=1e-14, is absurd
> > > >
> > > >    My guess is your null space is incorrect.
> > > >
> > > > > On Oct 21, 2017, at 4:34 PM, Hao Zhang wrote:
> > > > >
> > > > > if this solver doesn't converge, I have a fall-back solution which uses the GMRES solver. this setup is fine with me. I just want to know whether HYPRE is a reliable solution for me, or whether I will have to go without a preconditioner.
> > > > >
> > > > > Thanks!
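
Putting Matt's null-space question and Barry's advice on tolerances together, a minimal sketch of this all-Neumann pressure solve might look like the following; this is an illustration, not code from the thread, and it assumes Mat A and Vec b, x are already assembled, with the constant vector in the null space of A:

    #include <petscksp.h>

    KSP          ksp;
    MatNullSpace nullsp;

    /* Attach the constant null space to A ... */
    ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,NULL,&nullsp);CHKERRQ(ierr);
    ierr = MatSetNullSpace(A,nullsp);CHKERRQ(ierr);
    /* ... and project it out of the right-hand side, so that b is
       consistent ("Is your rhs consistent with this nullspace?"). */
    ierr = MatNullSpaceRemove(nullsp,b);CHKERRQ(ierr);

    ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
    /* rtol 1.e-6 rather than 1.e-14, per Barry's summary above;
       absolute/divergence tolerances and maxits stay at the defaults. */
    ierr = KSPSetTolerances(ksp,1.e-6,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);

    ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);
    ierr = KSPDestroy(&ksp);CHKERRQ(ierr);

Running with -ksp_test_null_space, as Barry suggests, checks at run time that the attached null space really is in the kernel of A.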
> > > > > On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang wrote:
> > > > > this is the serial run. still dumping output. parallel is more or less the same.
> > > > >
> > > > > KSP Object: 1 MPI processes
> > > > >   type: bcgs
> > > > >   maximum iterations=40000, initial guess is zero
> > > > >   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
> > > > >   left preconditioning
> > > > >   using PRECONDITIONED norm type for convergence test
> > > > > PC Object: 1 MPI processes
> > > > >   type: hypre
> > > > >     [HYPRE BoomerAMG settings identical to the views above]
> > > > >   linear system matrix = precond matrix:
> > > > >   Mat Object: A 1 MPI processes
> > > > >     type: seqaij
> > > > >     rows=497664, cols=497664
> > > > >     total: nonzeros=3363552, allocated nonzeros=3483648
> > > > >     total number of mallocs used during MatSetValues calls =0
> > > > >       has attached null space
> > > > >       not using I-node routines
> > > > >   0 KSP preconditioned resid norm 1.630377897956e+01 true resid norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00
> > > > >   1 KSP preconditioned resid norm 3.815819909205e-02 true resid norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01
> > > > >   2 KSP preconditioned resid norm 4.312671277701e-04 true resid norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03
> > > > >   3 KSP preconditioned resid norm 3.011875330569e-04 true resid norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05
> > > > >   4 KSP preconditioned resid norm 3.011783821295e-04 true resid norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05
> > > > >   5 KSP preconditioned resid norm 3.011783818372e-04 true resid norm 2.504053673004e-04 ||r(i)||/||b|| 3.399594279422e-05
> > > > >   6 KSP preconditioned resid norm 3.011783818375e-04 true resid norm 2.503984078890e-04 ||r(i)||/||b|| 3.399499795925e-05
> > > > >   7 KSP preconditioned resid norm 3.011783887442e-04 true resid norm 2.504121501772e-04 ||r(i)||/||b|| 3.399686366224e-05
> > > > >   8 KSP preconditioned resid norm 3.010913654181e-04 true resid norm 2.504150259031e-04 ||r(i)||/||b|| 3.399725408124e-05
> > > > >   9 KSP preconditioned resid norm 3.006520688232e-04 true resid norm 2.504061607382e-04 ||r(i)||/||b|| 3.399605051423e-05
> > > > >   10 KSP preconditioned resid norm 3.007309991942e-04 true resid norm 2.503843638523e-04 ||r(i)||/||b|| 3.399309128978e-05
> > > > >   11 KSP preconditioned resid norm 3.015946168077e-04 true resid norm 2.503644677844e-04 ||r(i)||/||b|| 3.399039012728e-05
> > > > >   12 KSP preconditioned resid norm 2.956643907377e-04 true resid norm 2.503863046509e-04 ||r(i)||/||b|| 3.399335477965e-05
> > > > >   13 KSP preconditioned resid norm 2.997992358936e-04 true resid norm 2.504336903093e-04 ||r(i)||/||b|| 3.399978802886e-05
> > > > >   [iterations 14 through 255 trimmed: the preconditioned residual norm wanders between roughly 5e-07 and 3e-05 while the true residual norm stays pinned near 2.490e-04 (||r(i)||/||b|| ~ 3.38e-05); the solve has stagnated.]
> > > >
> 256 KSP preconditioned resid norm 6.686490372981e-07 true resid norm 2.490715698964e-04 ||r(i)||/||b|| 3.381486161081e-05 > > > > > 257 KSP preconditioned resid norm 6.596005109428e-07 true resid norm 2.490666403003e-04 ||r(i)||/||b|| 3.381419235092e-05 > > > > > 258 KSP preconditioned resid norm 6.681742296333e-07 true resid norm 2.490683725835e-04 ||r(i)||/||b|| 3.381442753198e-05 > > > > > 259 KSP preconditioned resid norm 1.089245482033e-06 true resid norm 2.490688086568e-04 ||r(i)||/||b|| 3.381448673488e-05 > > > > > 260 KSP preconditioned resid norm 1.099844873189e-06 true resid norm 2.490690703265e-04 ||r(i)||/||b|| 3.381452226011e-05 > > > > > 261 KSP preconditioned resid norm 1.112925540869e-06 true resid norm 2.490664481058e-04 ||r(i)||/||b|| 3.381416625790e-05 > > > > > 262 KSP preconditioned resid norm 1.113056910480e-06 true resid norm 2.490658753273e-04 ||r(i)||/||b|| 3.381408849541e-05 > > > > > 263 KSP preconditioned resid norm 1.104801535149e-06 true resid norm 2.490736510776e-04 ||r(i)||/||b|| 3.381514415953e-05 > > > > > 264 KSP preconditioned resid norm 1.158709147873e-06 true resid norm 2.490607531152e-04 ||r(i)||/||b|| 3.381339308528e-05 > > > > > 265 KSP preconditioned resid norm 1.178985740182e-06 true resid norm 2.490727895619e-04 ||r(i)||/||b|| 3.381502719703e-05 > > > > > 266 KSP preconditioned resid norm 1.165130533478e-06 true resid norm 2.490639076693e-04 ||r(i)||/||b|| 3.381382135901e-05 > > > > > 267 KSP preconditioned resid norm 1.181364114499e-06 true resid norm 2.490667871436e-04 ||r(i)||/||b|| 3.381421228690e-05 > > > > > 268 KSP preconditioned resid norm 1.170295348543e-06 true resid norm 2.490662613306e-04 ||r(i)||/||b|| 3.381414090063e-05 > > > > > 269 KSP preconditioned resid norm 1.213243016230e-06 true resid norm 2.490666173719e-04 ||r(i)||/||b|| 3.381418923808e-05 > > > > > 270 KSP preconditioned resid norm 1.239691953997e-06 true resid norm 2.490678323197e-04 ||r(i)||/||b|| 3.381435418381e-05 > > > > > 271 KSP preconditioned resid norm 1.219891740100e-06 true resid norm 2.490625009256e-04 ||r(i)||/||b|| 3.381363037437e-05 > > > > > 272 KSP preconditioned resid norm 1.231321334346e-06 true resid norm 2.490659733696e-04 ||r(i)||/||b|| 3.381410180599e-05 > > > > > 273 KSP preconditioned resid norm 1.208183234158e-06 true resid norm 2.490685987255e-04 ||r(i)||/||b|| 3.381445823385e-05 > > > > > 274 KSP preconditioned resid norm 1.211768545589e-06 true resid norm 2.490671548953e-04 ||r(i)||/||b|| 3.381426221421e-05 > > > > > 275 KSP preconditioned resid norm 1.209433459842e-06 true resid norm 2.490669016096e-04 ||r(i)||/||b|| 3.381422782722e-05 > > > > > 276 KSP preconditioned resid norm 1.223729184405e-06 true resid norm 2.490658128014e-04 ||r(i)||/||b|| 3.381408000666e-05 > > > > > 277 KSP preconditioned resid norm 1.243915201868e-06 true resid norm 2.490693375756e-04 ||r(i)||/||b|| 3.381455854282e-05 > > > > > 278 KSP preconditioned resid norm 1.231994655529e-06 true resid norm 2.490682988311e-04 ||r(i)||/||b|| 3.381441751910e-05 > > > > > 279 KSP preconditioned resid norm 1.227930683777e-06 true resid norm 2.490667825866e-04 ||r(i)||/||b|| 3.381421166823e-05 > > > > > 280 KSP preconditioned resid norm 1.193458846469e-06 true resid norm 2.490687366117e-04 ||r(i)||/||b|| 3.381447695378e-05 > > > > > 281 KSP preconditioned resid norm 1.217089059805e-06 true resid norm 2.490674797371e-04 ||r(i)||/||b|| 3.381430631591e-05 > > > > > 282 KSP preconditioned resid norm 1.249318287709e-06 true resid norm 2.490662866951e-04 ||r(i)||/||b|| 
3.381414434420e-05 > > > > > 283 KSP preconditioned resid norm 1.183320029547e-06 true resid norm 2.490645783630e-04 ||r(i)||/||b|| 3.381391241482e-05 > > > > > 284 KSP preconditioned resid norm 1.174730603102e-06 true resid norm 2.490686881647e-04 ||r(i)||/||b|| 3.381447037643e-05 > > > > > 285 KSP preconditioned resid norm 1.175838261923e-06 true resid norm 2.490665969300e-04 ||r(i)||/||b|| 3.381418646281e-05 > > > > > 286 KSP preconditioned resid norm 1.188946188368e-06 true resid norm 2.490661974622e-04 ||r(i)||/||b|| 3.381413222961e-05 > > > > > 287 KSP preconditioned resid norm 1.177848565707e-06 true resid norm 2.490660236206e-04 ||r(i)||/||b|| 3.381410862824e-05 > > > > > 288 KSP preconditioned resid norm 1.200075508281e-06 true resid norm 2.490645353536e-04 ||r(i)||/||b|| 3.381390657571e-05 > > > > > 289 KSP preconditioned resid norm 1.184589570618e-06 true resid norm 2.490664920355e-04 ||r(i)||/||b|| 3.381417222195e-05 > > > > > 290 KSP preconditioned resid norm 1.221114703873e-06 true resid norm 2.490670597538e-04 ||r(i)||/||b|| 3.381424929746e-05 > > > > > 291 KSP preconditioned resid norm 1.249479658256e-06 true resid norm 2.490641582876e-04 ||r(i)||/||b|| 3.381385538385e-05 > > > > > 292 KSP preconditioned resid norm 1.245768496850e-06 true resid norm 2.490704480588e-04 ||r(i)||/||b|| 3.381470930606e-05 > > > > > 293 KSP preconditioned resid norm 1.243742607953e-06 true resid norm 2.490649690604e-04 ||r(i)||/||b|| 3.381396545733e-05 > > > > > 294 KSP preconditioned resid norm 1.342758483339e-06 true resid norm 2.490676207432e-04 ||r(i)||/||b|| 3.381432545942e-05 > > > > > 295 KSP preconditioned resid norm 1.353816099600e-06 true resid norm 2.490695263153e-04 ||r(i)||/||b|| 3.381458416681e-05 > > > > > 296 KSP preconditioned resid norm 1.343886763293e-06 true resid norm 2.490673674307e-04 ||r(i)||/||b|| 3.381429106879e-05 > > > > > 297 KSP preconditioned resid norm 1.355511022815e-06 true resid norm 2.490686565995e-04 ||r(i)||/||b|| 3.381446609103e-05 > > > > > 298 KSP preconditioned resid norm 1.347247627243e-06 true resid norm 2.490696287707e-04 ||r(i)||/||b|| 3.381459807652e-05 > > > > > 299 KSP preconditioned resid norm 1.414742595618e-06 true resid norm 2.490749815091e-04 ||r(i)||/||b|| 3.381532478374e-05 > > > > > 300 KSP preconditioned resid norm 1.418560683189e-06 true resid norm 2.490721501153e-04 ||r(i)||/||b|| 3.381494038343e-05 > > > > > 301 KSP preconditioned resid norm 1.416276404923e-06 true resid norm 2.490689576447e-04 ||r(i)||/||b|| 3.381450696203e-05 > > > > > 302 KSP preconditioned resid norm 1.431448272112e-06 true resid norm 2.490688812701e-04 ||r(i)||/||b|| 3.381449659312e-05 > > > > > 303 KSP preconditioned resid norm 1.446154958969e-06 true resid norm 2.490727536322e-04 ||r(i)||/||b|| 3.381502231909e-05 > > > > > 304 KSP preconditioned resid norm 1.468860617921e-06 true resid norm 2.490692363788e-04 ||r(i)||/||b|| 3.381454480397e-05 > > > > > 305 KSP preconditioned resid norm 1.627595214971e-06 true resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 > > > > > 306 KSP preconditioned resid norm 1.614384672893e-06 true resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 > > > > > 307 KSP preconditioned resid norm 1.605568020532e-06 true resid norm 2.490699757693e-04 ||r(i)||/||b|| 3.381464518632e-05 > > > > > 308 KSP preconditioned resid norm 1.617069685075e-06 true resid norm 2.490649282923e-04 ||r(i)||/||b|| 3.381395992249e-05 > > > > > 309 KSP preconditioned resid norm 1.654297792738e-06 true resid norm 2.490644766626e-04 
||r(i)||/||b|| 3.381389860760e-05 > > > > > 310 KSP preconditioned resid norm 1.587528143215e-06 true resid norm 2.490696752096e-04 ||r(i)||/||b|| 3.381460438124e-05 > > > > > 311 KSP preconditioned resid norm 1.662782022388e-06 true resid norm 2.490699317737e-04 ||r(i)||/||b|| 3.381463921332e-05 > > > > > 312 KSP preconditioned resid norm 1.618211471748e-06 true resid norm 2.490735831308e-04 ||r(i)||/||b|| 3.381513493483e-05 > > > > > 313 KSP preconditioned resid norm 1.609074961921e-06 true resid norm 2.490679566436e-04 ||r(i)||/||b|| 3.381437106247e-05 > > > > > 314 KSP preconditioned resid norm 1.548068942878e-06 true resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 > > > > > 315 KSP preconditioned resid norm 1.526718322150e-06 true resid norm 2.490619832967e-04 ||r(i)||/||b|| 3.381356009919e-05 > > > > > 316 KSP preconditioned resid norm 1.553150959105e-06 true resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 > > > > > 317 KSP preconditioned resid norm 1.615015320906e-06 true resid norm 2.490672348079e-04 ||r(i)||/||b|| 3.381427306343e-05 > > > > > 318 KSP preconditioned resid norm 1.602904469797e-06 true resid norm 2.490696731006e-04 ||r(i)||/||b|| 3.381460409491e-05 > > > > > 319 KSP preconditioned resid norm 1.538140323073e-06 true resid norm 2.490722982494e-04 ||r(i)||/||b|| 3.381496049466e-05 > > > > > 320 KSP preconditioned resid norm 1.534779679430e-06 true resid norm 2.490778499789e-04 ||r(i)||/||b|| 3.381571421763e-05 > > > > > 321 KSP preconditioned resid norm 1.547155843355e-06 true resid norm 2.490767612985e-04 ||r(i)||/||b|| 3.381556641442e-05 > > > > > 322 KSP preconditioned resid norm 1.422137008870e-06 true resid norm 2.490737676309e-04 ||r(i)||/||b|| 3.381515998323e-05 > > > > > 323 KSP preconditioned resid norm 1.403072558954e-06 true resid norm 2.490741361870e-04 ||r(i)||/||b|| 3.381521001975e-05 > > > > > 324 KSP preconditioned resid norm 1.373070436118e-06 true resid norm 2.490742214990e-04 ||r(i)||/||b|| 3.381522160202e-05 > > > > > 325 KSP preconditioned resid norm 1.359547585233e-06 true resid norm 2.490792987570e-04 ||r(i)||/||b|| 3.381591090902e-05 > > > > > 326 KSP preconditioned resid norm 1.370351913612e-06 true resid norm 2.490727161158e-04 ||r(i)||/||b|| 3.381501722573e-05 > > > > > 327 KSP preconditioned resid norm 1.365238666187e-06 true resid norm 2.490716949642e-04 ||r(i)||/||b|| 3.381487859046e-05 > > > > > 328 KSP preconditioned resid norm 1.369073373042e-06 true resid norm 2.490807360288e-04 ||r(i)||/||b|| 3.381610603826e-05 > > > > > 329 KSP preconditioned resid norm 1.426698981572e-06 true resid norm 2.490791479521e-04 ||r(i)||/||b|| 3.381589043520e-05 > > > > > 330 KSP preconditioned resid norm 1.445542403570e-06 true resid norm 2.490775981409e-04 ||r(i)||/||b|| 3.381568002720e-05 > > > > > 331 KSP preconditioned resid norm 1.464506963984e-06 true resid norm 2.490740562430e-04 ||r(i)||/||b|| 3.381519916626e-05 > > > > > 332 KSP preconditioned resid norm 1.461462964401e-06 true resid norm 2.490768016856e-04 ||r(i)||/||b|| 3.381557189753e-05 > > > > > 333 KSP preconditioned resid norm 1.476680847971e-06 true resid norm 2.490744321516e-04 ||r(i)||/||b|| 3.381525020097e-05 > > > > > 334 KSP preconditioned resid norm 1.459640372198e-06 true resid norm 2.490788817993e-04 ||r(i)||/||b|| 3.381585430132e-05 > > > > > 335 KSP preconditioned resid norm 1.790770882365e-06 true resid norm 2.490771711471e-04 ||r(i)||/||b|| 3.381562205697e-05 > > > > > 336 KSP preconditioned resid norm 1.803770155018e-06 true resid norm 
2.490768953858e-04 ||r(i)||/||b|| 3.381558461860e-05 > > > > > 337 KSP preconditioned resid norm 1.787821255995e-06 true resid norm 2.490767985676e-04 ||r(i)||/||b|| 3.381557147421e-05 > > > > > 338 KSP preconditioned resid norm 1.749912220831e-06 true resid norm 2.490760198704e-04 ||r(i)||/||b|| 3.381546575545e-05 > > > > > 339 KSP preconditioned resid norm 1.802915839010e-06 true resid norm 2.490815556273e-04 ||r(i)||/||b|| 3.381621730993e-05 > > > > > 340 KSP preconditioned resid norm 1.800777670709e-06 true resid norm 2.490823909286e-04 ||r(i)||/||b|| 3.381633071347e-05 > > > > > 341 KSP preconditioned resid norm 1.962516327690e-06 true resid norm 2.490773477410e-04 ||r(i)||/||b|| 3.381564603199e-05 > > > > > 342 KSP preconditioned resid norm 1.981726465132e-06 true resid norm 2.490769116884e-04 ||r(i)||/||b|| 3.381558683191e-05 > > > > > 343 KSP preconditioned resid norm 1.963419167052e-06 true resid norm 2.490764009914e-04 ||r(i)||/||b|| 3.381551749783e-05 > > > > > 344 KSP preconditioned resid norm 1.992082169278e-06 true resid norm 2.490806829883e-04 ||r(i)||/||b|| 3.381609883728e-05 > > > > > 345 KSP preconditioned resid norm 1.981005134253e-06 true resid norm 2.490748677339e-04 ||r(i)||/||b|| 3.381530933721e-05 > > > > > 346 KSP preconditioned resid norm 1.959802663114e-06 true resid norm 2.490773752317e-04 ||r(i)||/||b|| 3.381564976423e-05 > > > > > > > > > > On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley wrote: > > > > > On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang wrote: > > > > > ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY); > > > > > ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY); > > > > > > > > > > ierr = VecAssemblyBegin(x); > > > > > ierr = VecAssemblyEnd(x); > > > > > This is probably unnecessary > > > > > > > > > > ierr = VecAssemblyBegin(b); > > > > > ierr = VecAssemblyEnd(b); > > > > > This is probably unnecessary > > > > > > > > > > > > > > > ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp); > > > > > ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8 > > > > > Is your rhs consistent with this nullspace? > > > > > > > > > > // KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN); > > > > > KSPSetOperators(ksp,A,A); > > > > > > > > > > KSPSetType(ksp,KSPBCGS); > > > > > > > > > > KSPSetComputeSingularValues(ksp, PETSC_TRUE); > > > > > #if defined(__HYPRE__) > > > > > KSPGetPC(ksp, &pc); > > > > > PCSetType(pc, PCHYPRE); > > > > > PCHYPRESetType(pc,"boomeramg"); > > > > > This is terribly unnecessary. You just use > > > > > > > > > > -pc_type hypre -pc_hypre_type boomeramg > > > > > > > > > > or > > > > > > > > > > -pc_type gamg > > > > > > > > > > #else > > > > > KSPSetType(ksp,KSPBCGSL); > > > > > KSPBCGSLSetEll(ksp,2); > > > > > #endif /* defined(__HYPRE__) */ > > > > > > > > > > KSPSetFromOptions(ksp); > > > > > KSPSetUp(ksp); > > > > > > > > > > ierr = KSPSolve(ksp,b,x); > > > > > > > > > > > > > > > command line > > > > > > > > > > You did not provide any of what I asked for the in the eprevious mail. > > > > > > > > > > Matt > > > > > > > > > > On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley wrote: > > > > > On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang wrote: > > > > > hi, > > > > > > > > > > I implemented HYPRE preconditioner for my study due to the fact that without preconditioner, PETSc solver will take thousands of iterations to converge for fine grid simulation. > > > > > > > > > > with HYPRE, depending on the parallel partition, it will take HYPRE forever to do anything. 
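For a singular operator with null space N, Ax = b is solvable only when b is orthogonal to N, which is what Matt's consistency question is getting at. A minimal sketch of attaching the null space and projecting it out of the right-hand side, reusing the A, b, ierr, and nullsp names from the snippet above (PETSc 3.8 calling conventions assumed):

  /* Constant null space: with all-Neumann boundary conditions the
     pressure is only determined up to an additive constant. */
  MatNullSpace nullsp;
  ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,NULL,&nullsp);CHKERRQ(ierr);
  ierr = MatSetNullSpace(A,nullsp);CHKERRQ(ierr);
  /* Project the constant component out of b so that b lies in the range
     of A; otherwise the true residual can never be driven to zero. */
  ierr = MatNullSpaceRemove(nullsp,b);CHKERRQ(ierr);
  ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);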
> > > > > On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley wrote:
> > > > > On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang wrote:
> > > > > hi,
> > > > >
> > > > > I implemented the HYPRE preconditioner for my study because, without a preconditioner, the PETSc solver takes thousands of iterations to converge on fine-grid simulations.
> > > > >
> > > > > with HYPRE, depending on the parallel partition, it will take HYPRE forever to do anything. observation of the output file is that the simulation is hanging with no output.
> > > > >
> > > > > Any idea what happened? will post a snippet of code.
> > > > >
> > > > > 1) For any question about convergence, we need to see the output of
> > > > >
> > > > > -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason
> > > > >
> > > > > 2) Hypre has many preconditioners, which one are you talking about
> > > > >
> > > > > 3) PETSc has some preconditioners in common with Hypre, like AMG
> > > > >
> > > > > Thanks,
> > > > >
> > > > > Matt
> > > > >
> > > > > --
> > > > > Hao Zhang
> > > > > Dept. of Applied Mathematics and Statistics,
> > > > > Stony Brook University,
> > > > > Stony Brook, New York, 11790
> > > > >
> > > > > --
> > > > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> > > > > -- Norbert Wiener
> > > > >
> > > > > https://www.cse.buffalo.edu/~knepley/

From hbcbh1999 at gmail.com  Sat Oct 21 23:16:52 2017
From: hbcbh1999 at gmail.com (Hao Zhang)
Date: Sun, 22 Oct 2017 00:16:52 -0400
Subject: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To: <5DDD356C-2000-4803-9F3A-02B429898A7A@mcs.anl.gov>
References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov>
 <64D7FC90-EBE8-409C-B300-04A118155A98@mcs.anl.gov>
 <5DDD356C-2000-4803-9F3A-02B429898A7A@mcs.anl.gov>
Message-ID:

the reason is that when I do finer-grid simulations the matrix becomes stiffer, with a much larger condition number. just to give you a perspective, it takes 6000 iterations to converge, and the solver does converge. I want to reduce the number of iterations while keeping the convergence rate. that is the main motivation for all this heavy lifting. please advise. a snippet will be provided upon request.

Thanks again.
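The diagnostics Matt lists in his earlier reply can all be collected in a single run; the executable name, process count, and log file name here are placeholders:

  mpirun -n 24 ./solver \
    -ksp_view_pre -ksp_view \
    -ksp_monitor_true_residual -ksp_converged_reason \
    -log_view > log_hypre 2>&1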
On Sun, Oct 22, 2017 at 12:08 AM, Barry Smith wrote:
>
>    Oh, you change KSP but not hypre. I did not understand this.
>
>    Why not just use GMRES all the time? Why mess with BCGS if it is not robust? Not worth the small optimization if it breaks everything.
>
>    Barry
>
> > On Oct 21, 2017, at 11:05 PM, Hao Zhang wrote:
> >
> > this is the initial pressure solver output regarding the use of PETSc. it failed to converge after 40000 iterations; GMRES is then used as the fallback.
> >
> > 39987 KSP preconditioned resid norm 3.853125986269e-08 true resid norm 1.147359212216e-05 ||r(i)||/||b|| 1.557696568706e-06
> > [iterations 39988-39999 omitted: both norms are frozen at these values to five significant digits]
> > 40000 KSP preconditioned resid norm 3.853122732867e-08 true resid norm 1.147359257776e-05 ||r(i)||/||b|| 1.557696630559e-06
> > Linear solve did not converge due to DIVERGED_ITS iterations 40000
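The fallback from BCGS to GMRES seen in this output is typically driven by checking the converged reason after the first solve; a minimal sketch under assumed names (ksp, b, x), not the actual application code:

  KSPConvergedReason reason;
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
  ierr = KSPGetConvergedReason(ksp,&reason);CHKERRQ(ierr);
  if (reason < 0) {  /* negative reasons are divergence, e.g. KSP_DIVERGED_ITS */
    ierr = KSPSetType(ksp,KSPGMRES);CHKERRQ(ierr);  /* retry with GMRES */
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
  }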
> > KSP Object: 24 MPI processes
> >   type: bcgs
> >   maximum iterations=40000, initial guess is zero
> >   tolerances: relative=1e-14, absolute=1e-14, divergence=10000.
> >   left preconditioning
> >   using PRECONDITIONED norm type for convergence test
> > PC Object: 24 MPI processes
> >   type: hypre
> >     HYPRE BoomerAMG preconditioning
> >       Cycle type V
> >       Maximum number of levels 25
> >       Maximum number of iterations PER hypre call 1
> >       Convergence tolerance PER hypre call 0.
> >       Threshold for strong coupling 0.25
> >       Interpolation truncation factor 0.
> >       Interpolation: max elements per row 0
> >       Number of levels of aggressive coarsening 0
> >       Number of paths for aggressive coarsening 1
> >       Maximum row sums 0.9
> >       Sweeps down 1
> >       Sweeps up 1
> >       Sweeps on coarse 1
> >       Relax down symmetric-SOR/Jacobi
> >       Relax up symmetric-SOR/Jacobi
> >       Relax on coarse Gaussian-elimination
> >       Relax weight (all) 1.
> >       Outer relax weight (all) 1.
> >       Using CF-relaxation
> >       Not using more complex smoothers.
> >       Measure type local
> >       Coarsen type Falgout
> >       Interpolation type classical
> >   linear system matrix = precond matrix:
> >   Mat Object: A 24 MPI processes
> >     type: mpiaij
> >     rows=497664, cols=497664
> >     total: nonzeros=3363552, allocated nonzeros=6967296
> >     total number of mallocs used during MatSetValues calls =0
> >       has attached null space
> >       not using I-node (on process 0) routines
> >
> > The solution diverges for p0! The residual is 3.853123e-08. Solve again using GMRES!
> > KSP Object: 24 MPI processes
> >   type: gmres
> >     restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
> >     happy breakdown tolerance 1e-30
> >   maximum iterations=40000, initial guess is zero
> >   tolerances: relative=1e-14, absolute=1e-14, divergence=10000.
> >   left preconditioning
> >   using PRECONDITIONED norm type for convergence test
> > PC Object: 24 MPI processes
> >   [same hypre BoomerAMG preconditioner and matrix description as in the listing above]
> > 0 KSP preconditioned resid norm 1.593802941804e+01 true resid norm 7.365742695119e+00 ||r(i)||/||b|| 1.000000000000e+00
> > 1 KSP preconditioned resid norm 6.338666661133e-01 true resid norm 2.880722209358e+00 ||r(i)||/||b|| 3.910973174867e-01
> > 2 KSP preconditioned resid norm 3.913420828350e-02 true resid norm 9.544089760671e-01 ||r(i)||/||b|| 1.295740315093e-01
> > 3 KSP preconditioned resid norm 2.928070366435e-03 true resid norm 1.874294004628e-01 ||r(i)||/||b|| 2.544609664236e-02
> > 4 KSP preconditioned resid norm 2.165607525823e-04 true resid norm 3.121122463949e-02 ||r(i)||/||b|| 4.237349298146e-03
> > 5 KSP preconditioned resid norm 1.635476480407e-05 true resid norm 3.984315313831e-03 ||r(i)||/||b|| 5.409251284967e-04
> > 6 KSP preconditioned resid norm 1.283358350575e-06 true resid norm 4.566583802915e-04 ||r(i)||/||b|| 6.199760149022e-05
> > 7 KSP preconditioned resid norm 8.479469225747e-08 true resid norm 3.824581791112e-05 ||r(i)||/||b|| 5.192391248810e-06
> > 8 KSP preconditioned resid norm 5.358636504683e-09 true resid norm 2.829730442033e-06 ||r(i)||/||b|| 3.841744898187e-07
> > 9 KSP preconditioned resid norm 3.447874504193e-10 true resid norm 1.756617036538e-07 ||r(i)||/||b|| 2.384847135242e-08
> > 10 KSP preconditioned resid norm 2.480228743414e-11 true resid norm 1.309399823577e-08 ||r(i)||/||b|| 1.777688792258e-09
> > 11 KSP preconditioned resid norm 1.728967759950e-12 true resid norm 9.406967045789e-10 ||r(i)||/||b|| 1.277124036931e-10
> > 12 KSP preconditioned resid norm 1.075458632828e-13 true resid norm 5.994505136212e-11 ||r(i)||/||b|| 8.138358050689e-12
> > Linear solve converged due to CONVERGED_RTOL iterations 12
> > KSP Object: 24 MPI processes
> >   [same gmres/hypre BoomerAMG description as in the listing above]
> > The max singular value of A = 1.000872 in poisson_solver3d_P0_vd
> > The min singular value of A = 0.667688 in poisson_solver3d_P0_vd
> > The Cond Num of A = 1.499012 in poisson_solver3d_P0_vd
> > In poisson_solver3d_pressure(): num_iter = 12, rel_residual = 1.075459e-13
> >
> > The max value of p0 is 0.03115845493408858
> >
> > The min value of p0 is -0.07156715468428149
> >
> > On Sun, Oct 22, 2017 at 12:00 AM, Barry Smith wrote:
> >
> > > On Oct 21, 2017, at 10:50 PM, Hao Zhang wrote:
> > >
> > > the incompressible NS solver calls the PETSc solver at different stages of each time step. The one you were saying "This is good. 12 digit reduction" about comes after the initial pressure solve, in which HYPRE usually doesn't give good convergence, so the fallback GMRES solver is called afterwards.
> >
> >    Hmm, I don't understand. hypre should do well on a pressure solve. In fact, very well.
> >
> > > Barry, you were mentioning that I could have a wrong nullspace. that particular solver is meant to give an initial pressure profile for a 3d incompressible NS simulation using all Neumann boundary conditions. could you give some insight into how to test whether I have a wrong nullspace?
> >
> >    -ksp_test_null_space
> >
> >    But if your null space comes consistently from all Neumann boundary conditions then it likely is not wrong.
> >
> >    Barry
> >
> > > Thanks!
> > >
> > > On Sat, Oct 21, 2017 at 11:41 PM, Barry Smith wrote:
> > >
> > >    This is good. You get more than a 12 digit reduction in the true residual norm. This is good AMG convergence. Expected when everything goes well.
> > >
> > >    What is different in this case from the previous case that does not converge reasonably?
> > >
> > >    Barry
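Barry's -ksp_test_null_space has a programmatic counterpart, MatNullSpaceTest, which checks that the matrix really annihilates the claimed null space; a sketch reusing the A and nullsp names from the snippet earlier in the thread:

  PetscBool isNull;
  ierr = MatNullSpaceTest(nullsp,A,&isNull);CHKERRQ(ierr);
  if (!isNull) {
    /* The constant vector is not actually in the null space of A,
       e.g. because some boundary rows are not pure Neumann. */
    ierr = PetscPrintf(PETSC_COMM_WORLD,"Attached null space is wrong!\n");CHKERRQ(ierr);
  }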
> > > On Oct 21, 2017, at 9:29 PM, Hao Zhang wrote:
> > > >
> > > > Barry, Please advise what you make of this? this is the Poisson solver with all-Neumann BCs, 3d case; a finite difference scheme was used.
> > > > Thanks! I'm in learning mode.
> > > >
> > > > KSP Object: 24 MPI processes
> > > >   type: bcgs
> > > >   maximum iterations=40000, initial guess is zero
> > > >   tolerances: relative=1e-14, absolute=1e-14, divergence=10000.
> > > >   [same hypre BoomerAMG preconditioner and matrix description as in the listings above]
> > > > 0 KSP preconditioned resid norm 2.697270170623e-02 true resid norm 9.637159071207e+00 ||r(i)||/||b|| 1.000000000000e+00
> > > > 1 KSP preconditioned resid norm 4.828857674609e-04 true resid norm 6.294379664645e-01 ||r(i)||/||b|| 6.531364293291e-02
> > > > 2 KSP preconditioned resid norm 4.533649595815e-06 true resid norm 1.135508605857e-02 ||r(i)||/||b|| 1.178260727531e-03
> > > > 3 KSP preconditioned resid norm 1.131704082606e-07 true resid norm 1.196393029874e-04 ||r(i)||/||b|| 1.241437462051e-05
> > > > 4 KSP preconditioned resid norm 3.866281379569e-10 true resid norm 5.121520801846e-07 ||r(i)||/||b|| 5.314347064320e-08
> > > > 5 KSP preconditioned resid norm 1.070114785241e-11 true resid norm 1.203693733135e-08 ||r(i)||/||b|| 1.249013038221e-09
> > > > 6 KSP preconditioned resid norm 2.578780418765e-14 true resid norm 6.020297525927e-11 ||r(i)||/||b|| 6.246962908306e-12
> > > > 7 KSP preconditioned resid norm 8.691764190203e-16 true resid norm 1.866088098154e-12 ||r(i)||/||b|| 1.936346680973e-13
> > > > Linear solve converged due to CONVERGED_ATOL iterations 7
> > > >
> > > > On Sat, Oct 21, 2017 at 6:53 PM, Barry Smith wrote:
> > > >
> > > > > On Oct 21, 2017, at 5:47 PM, Hao Zhang wrote:
> > > > >
> > > > > hi, Barry:
> > > > > what do you mean by calling tolerance = 1e-14 absurd?
> > > >
> > > >    Trying to decrease the initial residual norm by a factor of 1e-14 with an iterative method (or even a direct method) is unrealistic (usually unachievable) and almost never necessary. You are requiring ||r_n|| < 1e-14 ||r_0|| when with double precision numbers you only have roughly 14 decimal digits total to compute with.
> > > >    Round off alone will lead to differences far larger than 1e-14.
> > > >
> > > >    If you are using the solver in the context of a nonlinear problem (i.e. inside Newton's method) then 1e-6 is generally more than plenty to get quadratic convergence of Newton's method.
> > > >
> > > >    If you are solving a linear problem then it is extremely likely that errors due to discretization (from the finite element method, etc.) and the model are much, much larger than even 1e-8.
> > > >
> > > >    So, in summary
> > > >
> > > >    1e-14 is probably unachievable
> > > >
> > > >    1e-14 is almost for sure not needed.
> > > >
> > > >    Barry
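In code, Barry's recommendation amounts to something like the following sketch; the 1e-8 value is only a suggestion, to be tuned against the discretization error:

  /* rtol 1e-8 is already tighter than most discretizations warrant;
     leave the absolute/divergence tolerances and iteration cap at their defaults.
     Equivalent command line: -ksp_rtol 1e-8 */
  ierr = KSPSetTolerances(ksp,1.e-8,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);CHKERRQ(ierr);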
> > > > > On Sat, Oct 21, 2017 at 18:42 Barry Smith wrote:
> > > > >
> > > > >    Run with -ksp_view_mat binary -ksp_view_rhs binary and send the resulting output file called binaryoutput to petsc-maint at mcs.anl.gov
> > > > >
> > > > >    Note you can also use -ksp_type gmres with hypre; there is unlikely to be a reason to use bcgs
> > > > >
> > > > >    BTW: tolerances: relative=1e-14 is absurd
> > > > >
> > > > >    My guess is your null space is incorrect.
> > > > >
> > > > > > On Oct 21, 2017, at 4:34 PM, Hao Zhang wrote:
> > > > > >
> > > > > > if this solver doesn't converge I have a fall-back solution, which uses the GMRES solver. this setup is fine with me. I just want to know whether HYPRE is a reliable solution for me, or whether I will have to go without a preconditioner.
> > > > > >
> > > > > > Thanks!
> > > > > >
> > > > > > On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang wrote:
> > > > > > this is a serial run, still dumping output. parallel is more or less the same.
> > > > > >
> > > > > > KSP Object: 1 MPI processes
> > > > > >   type: bcgs
> > > > > >   maximum iterations=40000, initial guess is zero
> > > > > >   tolerances: relative=1e-14, absolute=1e-14, divergence=10000.
> > > > > >   [same hypre BoomerAMG preconditioner as in the listings above; the matrix here is seqaij, rows=497664, cols=497664, nonzeros=3363552, allocated nonzeros=3483648, has attached null space, not using I-node routines]
> > > > > > 0 KSP preconditioned resid norm 1.630377897956e+01 true resid norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00
> > > > > > 1 KSP preconditioned resid norm 3.815819909205e-02 true resid norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01
> > > > > > 2 KSP preconditioned resid norm 4.312671277701e-04 true resid norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03
> > > > > > 3 KSP preconditioned resid norm 3.011875330569e-04 true resid norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05
> > > > > > 4 KSP preconditioned resid norm 3.011783821295e-04 true resid norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05
> > > > > > [iterations 5-201 omitted: the preconditioned residual decays slowly from ~3e-4 to ~2e-6 while the true residual stalls at ~2.490e-04 from iteration 14 onward]
> > > > > > 202 KSP preconditioned resid norm 2.328772904196e-06 true resid norm 2.490651045523e-04 ||r(i)||/||b|| 3.381398385220e-05
> > > > > > 203 KSP preconditioned resid norm 2.332731604717e-06 true resid norm 2.490649960574e-04 ||r(i)||/||b|| 3.381396912253e-05
> > > > > > 204 KSP preconditioned resid norm 2.357629383490e-06 true resid norm 2.490686317727e-04 ||r(i)||/||b|| 3.381446272046e-05
> > > > > > 205 KSP preconditioned resid norm 2.374856180299e-06 true resid norm 2.490645897176e-04 ||r(i)||/||b|| 3.381391395637e-05
> > > > > > 206 KSP preconditioned
resid norm 2.340395514404e-06 true resid > norm 2.490618341127e-04 ||r(i)||/||b|| 3.381353984542e-05 > > > > > > 207 KSP preconditioned resid norm 2.314963680954e-06 true resid > norm 2.490676153984e-04 ||r(i)||/||b|| 3.381432473379e-05 > > > > > > 208 KSP preconditioned resid norm 2.448070953106e-06 true resid > norm 2.490644606776e-04 ||r(i)||/||b|| 3.381389643743e-05 > > > > > > 209 KSP preconditioned resid norm 2.428805110632e-06 true resid > norm 2.490635817597e-04 ||r(i)||/||b|| 3.381377711234e-05 > > > > > > 210 KSP preconditioned resid norm 2.537929937808e-06 true resid > norm 2.490680589404e-04 ||r(i)||/||b|| 3.381438495066e-05 > > > > > > 211 KSP preconditioned resid norm 2.515909029682e-06 true resid > norm 2.490687803038e-04 ||r(i)||/||b|| 3.381448288557e-05 > > > > > > 212 KSP preconditioned resid norm 2.497907513266e-06 true resid > norm 2.490618016885e-04 ||r(i)||/||b|| 3.381353544340e-05 > > > > > > 213 KSP preconditioned resid norm 1.783501869502e-06 true resid > norm 2.490632647470e-04 ||r(i)||/||b|| 3.381373407354e-05 > > > > > > 214 KSP preconditioned resid norm 1.767420653144e-06 true resid > norm 2.490685569328e-04 ||r(i)||/||b|| 3.381445255992e-05 > > > > > > 215 KSP preconditioned resid norm 1.854926068272e-06 true resid > norm 2.490609365464e-04 ||r(i)||/||b|| 3.381341798856e-05 > > > > > > 216 KSP preconditioned resid norm 1.818308539774e-06 true resid > norm 2.490639142283e-04 ||r(i)||/||b|| 3.381382224948e-05 > > > > > > 217 KSP preconditioned resid norm 1.809431578070e-06 true resid > norm 2.490605125049e-04 ||r(i)||/||b|| 3.381336041915e-05 > > > > > > 218 KSP preconditioned resid norm 1.789862735999e-06 true resid > norm 2.490564024901e-04 ||r(i)||/||b|| 3.381280242859e-05 > > > > > > 219 KSP preconditioned resid norm 1.769239890163e-06 true resid > norm 2.490647825316e-04 ||r(i)||/||b|| 3.381394013349e-05 > > > > > > 220 KSP preconditioned resid norm 1.780760773109e-06 true resid > norm 2.490622606663e-04 ||r(i)||/||b|| 3.381359775589e-05 > > > > > > 221 KSP preconditioned resid norm 5.009024913368e-07 true resid > norm 2.490659101637e-04 ||r(i)||/||b|| 3.381409322492e-05 > > > > > > 222 KSP preconditioned resid norm 4.974450322799e-07 true resid > norm 2.490714287402e-04 ||r(i)||/||b|| 3.381484244693e-05 > > > > > > 223 KSP preconditioned resid norm 4.938819481519e-07 true resid > norm 2.490665661715e-04 ||r(i)||/||b|| 3.381418228693e-05 > > > > > > 224 KSP preconditioned resid norm 4.973231831266e-07 true resid > norm 2.490725000995e-04 ||r(i)||/||b|| 3.381498789855e-05 > > > > > > 225 KSP preconditioned resid norm 5.086864036771e-07 true resid > norm 2.490664132954e-04 ||r(i)||/||b|| 3.381416153192e-05 > > > > > > 226 KSP preconditioned resid norm 5.046954570561e-07 true resid > norm 2.490698772594e-04 ||r(i)||/||b|| 3.381463181226e-05 > > > > > > 227 KSP preconditioned resid norm 5.086852920874e-07 true resid > norm 2.490703544723e-04 ||r(i)||/||b|| 3.381469660041e-05 > > > > > > 228 KSP preconditioned resid norm 5.182381756169e-07 true resid > norm 2.490665200032e-04 ||r(i)||/||b|| 3.381417601896e-05 > > > > > > 229 KSP preconditioned resid norm 5.261455182896e-07 true resid > norm 2.490697169472e-04 ||r(i)||/||b|| 3.381461004770e-05 > > > > > > 230 KSP preconditioned resid norm 5.265262522400e-07 true resid > norm 2.490726890541e-04 ||r(i)||/||b|| 3.381501355172e-05 > > > > > > 231 KSP preconditioned resid norm 5.220652263946e-07 true resid > norm 2.490689325236e-04 ||r(i)||/||b|| 3.381450355149e-05 > > > > > > 232 KSP preconditioned resid norm 
5.256466259888e-07 true resid > norm 2.490694033989e-04 ||r(i)||/||b|| 3.381456747923e-05 > > > > > > 233 KSP preconditioned resid norm 5.443022648374e-07 true resid > norm 2.490650183144e-04 ||r(i)||/||b|| 3.381397214423e-05 > > > > > > 234 KSP preconditioned resid norm 5.562619006436e-07 true resid > norm 2.490764576883e-04 ||r(i)||/||b|| 3.381552519520e-05 > > > > > > 235 KSP preconditioned resid norm 5.998148629545e-07 true resid > norm 2.490714032716e-04 ||r(i)||/||b|| 3.381483898922e-05 > > > > > > 236 KSP preconditioned resid norm 6.498977322955e-07 true resid > norm 2.490650270144e-04 ||r(i)||/||b|| 3.381397332537e-05 > > > > > > 237 KSP preconditioned resid norm 6.503686003429e-07 true resid > norm 2.490706976108e-04 ||r(i)||/||b|| 3.381474318615e-05 > > > > > > 238 KSP preconditioned resid norm 6.566719023119e-07 true resid > norm 2.490664107559e-04 ||r(i)||/||b|| 3.381416118714e-05 > > > > > > 239 KSP preconditioned resid norm 6.549737473208e-07 true resid > norm 2.490721547909e-04 ||r(i)||/||b|| 3.381494101821e-05 > > > > > > 240 KSP preconditioned resid norm 6.616898981418e-07 true resid > norm 2.490679659838e-04 ||r(i)||/||b|| 3.381437233053e-05 > > > > > > 241 KSP preconditioned resid norm 6.829917691021e-07 true resid > norm 2.490728328614e-04 ||r(i)||/||b|| 3.381503307553e-05 > > > > > > 242 KSP preconditioned resid norm 7.030239869389e-07 true resid > norm 2.490706345115e-04 ||r(i)||/||b|| 3.381473461955e-05 > > > > > > 243 KSP preconditioned resid norm 7.018435683340e-07 true resid > norm 2.490650978460e-04 ||r(i)||/||b|| 3.381398294172e-05 > > > > > > 244 KSP preconditioned resid norm 7.058047080376e-07 true resid > norm 2.490685975642e-04 ||r(i)||/||b|| 3.381445807618e-05 > > > > > > 245 KSP preconditioned resid norm 6.896300385099e-07 true resid > norm 2.490708566380e-04 ||r(i)||/||b|| 3.381476477625e-05 > > > > > > 246 KSP preconditioned resid norm 7.093960074437e-07 true resid > norm 2.490667427871e-04 ||r(i)||/||b|| 3.381420626490e-05 > > > > > > 247 KSP preconditioned resid norm 7.817121711853e-07 true resid > norm 2.490692299030e-04 ||r(i)||/||b|| 3.381454392480e-05 > > > > > > 248 KSP preconditioned resid norm 7.976109778309e-07 true resid > norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > > > 249 KSP preconditioned resid norm 7.855322750445e-07 true resid > norm 2.490720861966e-04 ||r(i)||/||b|| 3.381493170560e-05 > > > > > > 250 KSP preconditioned resid norm 7.778531114042e-07 true resid > norm 2.490673235034e-04 ||r(i)||/||b|| 3.381428510506e-05 > > > > > > 251 KSP preconditioned resid norm 7.848682182070e-07 true resid > norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > > > 252 KSP preconditioned resid norm 7.967291867330e-07 true resid > norm 2.490724820229e-04 ||r(i)||/||b|| 3.381498544442e-05 > > > > > > 253 KSP preconditioned resid norm 7.865012959525e-07 true resid > norm 2.490666662028e-04 ||r(i)||/||b|| 3.381419586754e-05 > > > > > > 254 KSP preconditioned resid norm 7.656025385804e-07 true resid > norm 2.490686283214e-04 ||r(i)||/||b|| 3.381446225190e-05 > > > > > > 255 KSP preconditioned resid norm 7.757018653468e-07 true resid > norm 2.490655983763e-04 ||r(i)||/||b|| 3.381405089553e-05 > > > > > > 256 KSP preconditioned resid norm 6.686490372981e-07 true resid > norm 2.490715698964e-04 ||r(i)||/||b|| 3.381486161081e-05 > > > > > > 257 KSP preconditioned resid norm 6.596005109428e-07 true resid > norm 2.490666403003e-04 ||r(i)||/||b|| 3.381419235092e-05 > > > > > > 258 KSP preconditioned resid norm 6.681742296333e-07 
true resid > norm 2.490683725835e-04 ||r(i)||/||b|| 3.381442753198e-05 > > > > > > 259 KSP preconditioned resid norm 1.089245482033e-06 true resid > norm 2.490688086568e-04 ||r(i)||/||b|| 3.381448673488e-05 > > > > > > 260 KSP preconditioned resid norm 1.099844873189e-06 true resid > norm 2.490690703265e-04 ||r(i)||/||b|| 3.381452226011e-05 > > > > > > 261 KSP preconditioned resid norm 1.112925540869e-06 true resid > norm 2.490664481058e-04 ||r(i)||/||b|| 3.381416625790e-05 > > > > > > 262 KSP preconditioned resid norm 1.113056910480e-06 true resid > norm 2.490658753273e-04 ||r(i)||/||b|| 3.381408849541e-05 > > > > > > 263 KSP preconditioned resid norm 1.104801535149e-06 true resid > norm 2.490736510776e-04 ||r(i)||/||b|| 3.381514415953e-05 > > > > > > 264 KSP preconditioned resid norm 1.158709147873e-06 true resid > norm 2.490607531152e-04 ||r(i)||/||b|| 3.381339308528e-05 > > > > > > 265 KSP preconditioned resid norm 1.178985740182e-06 true resid > norm 2.490727895619e-04 ||r(i)||/||b|| 3.381502719703e-05 > > > > > > 266 KSP preconditioned resid norm 1.165130533478e-06 true resid > norm 2.490639076693e-04 ||r(i)||/||b|| 3.381382135901e-05 > > > > > > 267 KSP preconditioned resid norm 1.181364114499e-06 true resid > norm 2.490667871436e-04 ||r(i)||/||b|| 3.381421228690e-05 > > > > > > 268 KSP preconditioned resid norm 1.170295348543e-06 true resid > norm 2.490662613306e-04 ||r(i)||/||b|| 3.381414090063e-05 > > > > > > 269 KSP preconditioned resid norm 1.213243016230e-06 true resid > norm 2.490666173719e-04 ||r(i)||/||b|| 3.381418923808e-05 > > > > > > 270 KSP preconditioned resid norm 1.239691953997e-06 true resid > norm 2.490678323197e-04 ||r(i)||/||b|| 3.381435418381e-05 > > > > > > 271 KSP preconditioned resid norm 1.219891740100e-06 true resid > norm 2.490625009256e-04 ||r(i)||/||b|| 3.381363037437e-05 > > > > > > 272 KSP preconditioned resid norm 1.231321334346e-06 true resid > norm 2.490659733696e-04 ||r(i)||/||b|| 3.381410180599e-05 > > > > > > 273 KSP preconditioned resid norm 1.208183234158e-06 true resid > norm 2.490685987255e-04 ||r(i)||/||b|| 3.381445823385e-05 > > > > > > 274 KSP preconditioned resid norm 1.211768545589e-06 true resid > norm 2.490671548953e-04 ||r(i)||/||b|| 3.381426221421e-05 > > > > > > 275 KSP preconditioned resid norm 1.209433459842e-06 true resid > norm 2.490669016096e-04 ||r(i)||/||b|| 3.381422782722e-05 > > > > > > 276 KSP preconditioned resid norm 1.223729184405e-06 true resid > norm 2.490658128014e-04 ||r(i)||/||b|| 3.381408000666e-05 > > > > > > 277 KSP preconditioned resid norm 1.243915201868e-06 true resid > norm 2.490693375756e-04 ||r(i)||/||b|| 3.381455854282e-05 > > > > > > 278 KSP preconditioned resid norm 1.231994655529e-06 true resid > norm 2.490682988311e-04 ||r(i)||/||b|| 3.381441751910e-05 > > > > > > 279 KSP preconditioned resid norm 1.227930683777e-06 true resid > norm 2.490667825866e-04 ||r(i)||/||b|| 3.381421166823e-05 > > > > > > 280 KSP preconditioned resid norm 1.193458846469e-06 true resid > norm 2.490687366117e-04 ||r(i)||/||b|| 3.381447695378e-05 > > > > > > 281 KSP preconditioned resid norm 1.217089059805e-06 true resid > norm 2.490674797371e-04 ||r(i)||/||b|| 3.381430631591e-05 > > > > > > 282 KSP preconditioned resid norm 1.249318287709e-06 true resid > norm 2.490662866951e-04 ||r(i)||/||b|| 3.381414434420e-05 > > > > > > 283 KSP preconditioned resid norm 1.183320029547e-06 true resid > norm 2.490645783630e-04 ||r(i)||/||b|| 3.381391241482e-05 > > > > > > 284 KSP preconditioned resid norm 1.174730603102e-06 true resid > norm 
2.490686881647e-04 ||r(i)||/||b|| 3.381447037643e-05 > > > > > > 285 KSP preconditioned resid norm 1.175838261923e-06 true resid > norm 2.490665969300e-04 ||r(i)||/||b|| 3.381418646281e-05 > > > > > > 286 KSP preconditioned resid norm 1.188946188368e-06 true resid > norm 2.490661974622e-04 ||r(i)||/||b|| 3.381413222961e-05 > > > > > > 287 KSP preconditioned resid norm 1.177848565707e-06 true resid > norm 2.490660236206e-04 ||r(i)||/||b|| 3.381410862824e-05 > > > > > > 288 KSP preconditioned resid norm 1.200075508281e-06 true resid > norm 2.490645353536e-04 ||r(i)||/||b|| 3.381390657571e-05 > > > > > > 289 KSP preconditioned resid norm 1.184589570618e-06 true resid > norm 2.490664920355e-04 ||r(i)||/||b|| 3.381417222195e-05 > > > > > > 290 KSP preconditioned resid norm 1.221114703873e-06 true resid > norm 2.490670597538e-04 ||r(i)||/||b|| 3.381424929746e-05 > > > > > > 291 KSP preconditioned resid norm 1.249479658256e-06 true resid > norm 2.490641582876e-04 ||r(i)||/||b|| 3.381385538385e-05 > > > > > > 292 KSP preconditioned resid norm 1.245768496850e-06 true resid > norm 2.490704480588e-04 ||r(i)||/||b|| 3.381470930606e-05 > > > > > > 293 KSP preconditioned resid norm 1.243742607953e-06 true resid > norm 2.490649690604e-04 ||r(i)||/||b|| 3.381396545733e-05 > > > > > > 294 KSP preconditioned resid norm 1.342758483339e-06 true resid > norm 2.490676207432e-04 ||r(i)||/||b|| 3.381432545942e-05 > > > > > > 295 KSP preconditioned resid norm 1.353816099600e-06 true resid > norm 2.490695263153e-04 ||r(i)||/||b|| 3.381458416681e-05 > > > > > > 296 KSP preconditioned resid norm 1.343886763293e-06 true resid > norm 2.490673674307e-04 ||r(i)||/||b|| 3.381429106879e-05 > > > > > > 297 KSP preconditioned resid norm 1.355511022815e-06 true resid > norm 2.490686565995e-04 ||r(i)||/||b|| 3.381446609103e-05 > > > > > > 298 KSP preconditioned resid norm 1.347247627243e-06 true resid > norm 2.490696287707e-04 ||r(i)||/||b|| 3.381459807652e-05 > > > > > > 299 KSP preconditioned resid norm 1.414742595618e-06 true resid > norm 2.490749815091e-04 ||r(i)||/||b|| 3.381532478374e-05 > > > > > > 300 KSP preconditioned resid norm 1.418560683189e-06 true resid > norm 2.490721501153e-04 ||r(i)||/||b|| 3.381494038343e-05 > > > > > > 301 KSP preconditioned resid norm 1.416276404923e-06 true resid > norm 2.490689576447e-04 ||r(i)||/||b|| 3.381450696203e-05 > > > > > > 302 KSP preconditioned resid norm 1.431448272112e-06 true resid > norm 2.490688812701e-04 ||r(i)||/||b|| 3.381449659312e-05 > > > > > > 303 KSP preconditioned resid norm 1.446154958969e-06 true resid > norm 2.490727536322e-04 ||r(i)||/||b|| 3.381502231909e-05 > > > > > > 304 KSP preconditioned resid norm 1.468860617921e-06 true resid > norm 2.490692363788e-04 ||r(i)||/||b|| 3.381454480397e-05 > > > > > > 305 KSP preconditioned resid norm 1.627595214971e-06 true resid > norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 > > > > > > 306 KSP preconditioned resid norm 1.614384672893e-06 true resid > norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 > > > > > > 307 KSP preconditioned resid norm 1.605568020532e-06 true resid > norm 2.490699757693e-04 ||r(i)||/||b|| 3.381464518632e-05 > > > > > > 308 KSP preconditioned resid norm 1.617069685075e-06 true resid > norm 2.490649282923e-04 ||r(i)||/||b|| 3.381395992249e-05 > > > > > > 309 KSP preconditioned resid norm 1.654297792738e-06 true resid > norm 2.490644766626e-04 ||r(i)||/||b|| 3.381389860760e-05 > > > > > > 310 KSP preconditioned resid norm 1.587528143215e-06 true resid > norm 2.490696752096e-04 
||r(i)||/||b|| 3.381460438124e-05 > > > > > > 311 KSP preconditioned resid norm 1.662782022388e-06 true resid > norm 2.490699317737e-04 ||r(i)||/||b|| 3.381463921332e-05 > > > > > > 312 KSP preconditioned resid norm 1.618211471748e-06 true resid > norm 2.490735831308e-04 ||r(i)||/||b|| 3.381513493483e-05 > > > > > > 313 KSP preconditioned resid norm 1.609074961921e-06 true resid > norm 2.490679566436e-04 ||r(i)||/||b|| 3.381437106247e-05 > > > > > > 314 KSP preconditioned resid norm 1.548068942878e-06 true resid > norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 > > > > > > 315 KSP preconditioned resid norm 1.526718322150e-06 true resid > norm 2.490619832967e-04 ||r(i)||/||b|| 3.381356009919e-05 > > > > > > 316 KSP preconditioned resid norm 1.553150959105e-06 true resid > norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 > > > > > > 317 KSP preconditioned resid norm 1.615015320906e-06 true resid > norm 2.490672348079e-04 ||r(i)||/||b|| 3.381427306343e-05 > > > > > > 318 KSP preconditioned resid norm 1.602904469797e-06 true resid > norm 2.490696731006e-04 ||r(i)||/||b|| 3.381460409491e-05 > > > > > > 319 KSP preconditioned resid norm 1.538140323073e-06 true resid > norm 2.490722982494e-04 ||r(i)||/||b|| 3.381496049466e-05 > > > > > > 320 KSP preconditioned resid norm 1.534779679430e-06 true resid > norm 2.490778499789e-04 ||r(i)||/||b|| 3.381571421763e-05 > > > > > > 321 KSP preconditioned resid norm 1.547155843355e-06 true resid > norm 2.490767612985e-04 ||r(i)||/||b|| 3.381556641442e-05 > > > > > > 322 KSP preconditioned resid norm 1.422137008870e-06 true resid > norm 2.490737676309e-04 ||r(i)||/||b|| 3.381515998323e-05 > > > > > > 323 KSP preconditioned resid norm 1.403072558954e-06 true resid > norm 2.490741361870e-04 ||r(i)||/||b|| 3.381521001975e-05 > > > > > > 324 KSP preconditioned resid norm 1.373070436118e-06 true resid > norm 2.490742214990e-04 ||r(i)||/||b|| 3.381522160202e-05 > > > > > > 325 KSP preconditioned resid norm 1.359547585233e-06 true resid > norm 2.490792987570e-04 ||r(i)||/||b|| 3.381591090902e-05 > > > > > > 326 KSP preconditioned resid norm 1.370351913612e-06 true resid > norm 2.490727161158e-04 ||r(i)||/||b|| 3.381501722573e-05 > > > > > > 327 KSP preconditioned resid norm 1.365238666187e-06 true resid > norm 2.490716949642e-04 ||r(i)||/||b|| 3.381487859046e-05 > > > > > > 328 KSP preconditioned resid norm 1.369073373042e-06 true resid > norm 2.490807360288e-04 ||r(i)||/||b|| 3.381610603826e-05 > > > > > > 329 KSP preconditioned resid norm 1.426698981572e-06 true resid > norm 2.490791479521e-04 ||r(i)||/||b|| 3.381589043520e-05 > > > > > > 330 KSP preconditioned resid norm 1.445542403570e-06 true resid > norm 2.490775981409e-04 ||r(i)||/||b|| 3.381568002720e-05 > > > > > > 331 KSP preconditioned resid norm 1.464506963984e-06 true resid > norm 2.490740562430e-04 ||r(i)||/||b|| 3.381519916626e-05 > > > > > > 332 KSP preconditioned resid norm 1.461462964401e-06 true resid > norm 2.490768016856e-04 ||r(i)||/||b|| 3.381557189753e-05 > > > > > > 333 KSP preconditioned resid norm 1.476680847971e-06 true resid > norm 2.490744321516e-04 ||r(i)||/||b|| 3.381525020097e-05 > > > > > > 334 KSP preconditioned resid norm 1.459640372198e-06 true resid > norm 2.490788817993e-04 ||r(i)||/||b|| 3.381585430132e-05 > > > > > > 335 KSP preconditioned resid norm 1.790770882365e-06 true resid > norm 2.490771711471e-04 ||r(i)||/||b|| 3.381562205697e-05 > > > > > > 336 KSP preconditioned resid norm 1.803770155018e-06 true resid > norm 2.490768953858e-04 ||r(i)||/||b|| 
3.381558461860e-05 > > > > > > 337 KSP preconditioned resid norm 1.787821255995e-06 true resid > norm 2.490767985676e-04 ||r(i)||/||b|| 3.381557147421e-05 > > > > > > 338 KSP preconditioned resid norm 1.749912220831e-06 true resid > norm 2.490760198704e-04 ||r(i)||/||b|| 3.381546575545e-05 > > > > > > 339 KSP preconditioned resid norm 1.802915839010e-06 true resid > norm 2.490815556273e-04 ||r(i)||/||b|| 3.381621730993e-05 > > > > > > 340 KSP preconditioned resid norm 1.800777670709e-06 true resid > norm 2.490823909286e-04 ||r(i)||/||b|| 3.381633071347e-05 > > > > > > 341 KSP preconditioned resid norm 1.962516327690e-06 true resid > norm 2.490773477410e-04 ||r(i)||/||b|| 3.381564603199e-05 > > > > > > 342 KSP preconditioned resid norm 1.981726465132e-06 true resid > norm 2.490769116884e-04 ||r(i)||/||b|| 3.381558683191e-05 > > > > > > 343 KSP preconditioned resid norm 1.963419167052e-06 true resid > norm 2.490764009914e-04 ||r(i)||/||b|| 3.381551749783e-05 > > > > > > 344 KSP preconditioned resid norm 1.992082169278e-06 true resid > norm 2.490806829883e-04 ||r(i)||/||b|| 3.381609883728e-05 > > > > > > 345 KSP preconditioned resid norm 1.981005134253e-06 true resid > norm 2.490748677339e-04 ||r(i)||/||b|| 3.381530933721e-05 > > > > > > 346 KSP preconditioned resid norm 1.959802663114e-06 true resid > norm 2.490773752317e-04 ||r(i)||/||b|| 3.381564976423e-05 > > > > > > > > > > > > On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley < > knepley at gmail.com> wrote: > > > > > > On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang > wrote: > > > > > > ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY); > > > > > > ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY); > > > > > > > > > > > > ierr = VecAssemblyBegin(x); > > > > > > ierr = VecAssemblyEnd(x); > > > > > > This is probably unnecessary > > > > > > > > > > > > ierr = VecAssemblyBegin(b); > > > > > > ierr = VecAssemblyEnd(b); > > > > > > This is probably unnecessary > > > > > > > > > > > > > > > > > > ierr = MatNullSpaceCreate(PETSC_COMM_ > WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp); > > > > > > ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8 > > > > > > Is your rhs consistent with this nullspace? > > > > > > > > > > > > // KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN); > > > > > > KSPSetOperators(ksp,A,A); > > > > > > > > > > > > KSPSetType(ksp,KSPBCGS); > > > > > > > > > > > > KSPSetComputeSingularValues(ksp, PETSC_TRUE); > > > > > > #if defined(__HYPRE__) > > > > > > KSPGetPC(ksp, &pc); > > > > > > PCSetType(pc, PCHYPRE); > > > > > > PCHYPRESetType(pc,"boomeramg"); > > > > > > This is terribly unnecessary. You just use > > > > > > > > > > > > -pc_type hypre -pc_hypre_type boomeramg > > > > > > > > > > > > or > > > > > > > > > > > > -pc_type gamg > > > > > > > > > > > > #else > > > > > > KSPSetType(ksp,KSPBCGSL); > > > > > > KSPBCGSLSetEll(ksp,2); > > > > > > #endif /* defined(__HYPRE__) */ > > > > > > > > > > > > KSPSetFromOptions(ksp); > > > > > > KSPSetUp(ksp); > > > > > > > > > > > > ierr = KSPSolve(ksp,b,x); > > > > > > > > > > > > > > > > > > command line > > > > > > > > > > > > You did not provide any of what I asked for the in the eprevious > mail. 
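To make the advice above concrete, here is a minimal sketch of the
recommended pattern: attach the (constant) null space, project it out of
the right-hand side so the system is consistent, and leave the KSP/PC
choices to the options database. This is not the poster's actual code;
the function and variable names are illustrative.

    #include <petscksp.h>

    /* Sketch only: solve A x = b where A has the constant null space
       typical of an all-Neumann pressure Poisson problem. */
    PetscErrorCode SolvePressure(Mat A, Vec b, Vec x)
    {
      KSP            ksp;
      MatNullSpace   nullsp;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      /* Constant vectors lie in the null space of A (all-Neumann BCs). */
      ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,NULL,&nullsp);CHKERRQ(ierr);
      ierr = MatSetNullSpace(A,nullsp);CHKERRQ(ierr);
      /* Make the rhs consistent with the null space by projecting the
         constant component out of b (Matt's question, by construction). */
      ierr = MatNullSpaceRemove(nullsp,b);CHKERRQ(ierr);

      ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
      ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
      /* No hardwired KSP/PC types; select them at run time, e.g.
         -ksp_type gmres -pc_type hypre -pc_hypre_type boomeramg */
      ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
      ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);

      ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);
      ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }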
> > > > > > On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley <knepley at gmail.com> wrote:
> > > > > >
> > > > > > On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang wrote:
> > > > > >
> > > > > > hi,
> > > > > >
> > > > > > I implemented the HYPRE preconditioner for my study because,
> > > > > > without a preconditioner, the PETSc solver takes thousands of
> > > > > > iterations to converge on fine-grid simulations.
> > > > > >
> > > > > > With HYPRE, depending on the parallel partition, it takes HYPRE
> > > > > > forever to do anything; the observation from the output file is
> > > > > > that the simulation hangs with no output.
> > > > > >
> > > > > > Any idea what happened? A snippet of the code will be posted.
> > > > > >
> > > > > > 1) For any question about convergence, we need to see the output of
> > > > > >
> > > > > >     -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason
> > > > > >
> > > > > > 2) Hypre has many preconditioners, which one are you talking about?
> > > > > >
> > > > > > 3) PETSc has some preconditioners in common with Hypre, like AMG
> > > > > >
> > > > > >   Thanks,
> > > > > >
> > > > > >      Matt
> > > > > >
> > > > > > --
> > > > > > What most experimenters take for granted before they begin their
> > > > > > experiments is infinitely more interesting than any results to
> > > > > > which their experiments lead.
> > > > > > -- Norbert Wiener
> > > > > >
> > > > > > https://www.cse.buffalo.edu/~knepley/

-- 
Hao Zhang
Dept. of Applied Mathematics and Statistics,
Stony Brook University,
Stony Brook, New York, 11790
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From mfadams at lbl.gov  Sat Oct 21 23:56:25 2017
From: mfadams at lbl.gov (Mark Adams)
Date: Sun, 22 Oct 2017 00:56:25 -0400
Subject: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To: 
References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov>
 <64D7FC90-EBE8-409C-B300-04A118155A98@mcs.anl.gov>
 <5DDD356C-2000-4803-9F3A-02B429898A7A@mcs.anl.gov>
Message-ID: 

On Sun, Oct 22, 2017 at 12:16 AM, Hao Zhang wrote:

> The reason is that when I do a finer-grid simulation the matrix becomes
> stiffer, with a much larger condition number. Just to give you some
> perspective: it takes 6000 iterations to converge, and the solver does
> converge. I want to reduce the number of iterations while keeping the
> convergence rate.

I don't understand this: if you reduce the number of iterations you
increase the convergence rate. Anyway, a PETSc "solver" has two parts: a
KSP method and a PC method. They are orthogonal. You are testing BCGS and
GMRES (KSP methods) and hypre (PC). Is that correct? I don't understand
what is wrong with your good results above (GMRES + hypre, right?).

> That is the main motivation for all this heavy lifting. Please advise; a
> snippet will be provided upon request.
>
> Thanks again.
>
> On Sun, Oct 22, 2017 at 12:08 AM, Barry Smith wrote:
>
>>    Oh, you change KSP but not hypre. I did not understand this.
>>
>>    Why not just use GMRES all the time? Why mess with BCGS if it is not
>> robust? Not worth the small optimization if it breaks everything.
>>
>>    Barry
>>
>> > On Oct 21, 2017, at 11:05 PM, Hao Zhang wrote:
>> >
>> > This is the initial pressure solver output regarding the use of PETSc.
>> > It failed to converge after 40000 iterations, so GMRES is used instead.
>> >
>> > [Iterations 39987 through 40000 omitted: the preconditioned residual
>> > norm is stuck at ~3.8531e-08 with true residual norm ~1.1474e-05
>> > (||r(i)||/||b|| ~ 1.5577e-06).]
>> > Linear solve did not converge due to DIVERGED_ITS iterations 40000
>> > KSP Object: 24 MPI processes
>> >   type: bcgs
>> >   maximum iterations=40000, initial guess is zero
>> >   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
>> >   left preconditioning
>> >   using PRECONDITIONED norm type for convergence test
>> > PC Object: 24 MPI processes
>> >   type: hypre
>> >     HYPRE BoomerAMG preconditioning
>> >       Cycle type V
>> >       Maximum number of levels 25
>> >       Maximum number of iterations PER hypre call 1
>> >       Convergence tolerance PER hypre call 0.
>> >       Threshold for strong coupling 0.25
>> >       Interpolation truncation factor 0.
>> >       Interpolation: max elements per row 0
>> >       Number of levels of aggressive coarsening 0
>> >       Number of paths for aggressive coarsening 1
>> >       Maximum row sums 0.9
>> >       Sweeps down 1
>> >       Sweeps up 1
>> >       Sweeps on coarse 1
>> >       Relax down symmetric-SOR/Jacobi
>> >       Relax up symmetric-SOR/Jacobi
>> >       Relax on coarse Gaussian-elimination
>> >       Relax weight (all) 1.
>> >       Outer relax weight (all) 1.
>> >       Using CF-relaxation
>> >       Not using more complex smoothers.
>> >       Measure type local
>> >       Coarsen type Falgout
>> >       Interpolation type classical
>> >   linear system matrix = precond matrix:
>> >   Mat Object: A 24 MPI processes
>> >     type: mpiaij
>> >     rows=497664, cols=497664
>> >     total: nonzeros=3363552, allocated nonzeros=6967296
>> >     total number of mallocs used during MatSetValues calls =0
>> >       has attached null space
>> >       not using I-node (on process 0) routines
>> >
>> > The solution diverges for p0! The residual is 3.853123e-08. Solve
>> > again using GMRES!
>> > KSP Object: 24 MPI processes
>> >   type: gmres
>> >     restart=30, using Classical (unmodified) Gram-Schmidt
>> > Orthogonalization with no iterative refinement
>> >     happy breakdown tolerance 1e-30
>> >   maximum iterations=40000, initial guess is zero
>> >   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
>> >   left preconditioning
>> >   using PRECONDITIONED norm type for convergence test
>> > PC Object: 24 MPI processes
>> >   [same hypre BoomerAMG settings and matrix information as above]
>> >   0 KSP preconditioned resid norm 1.593802941804e+01 true resid norm 7.365742695119e+00 ||r(i)||/||b|| 1.000000000000e+00
>> >   1 KSP preconditioned resid norm 6.338666661133e-01 true resid norm 2.880722209358e+00 ||r(i)||/||b|| 3.910973174867e-01
>> >   2 KSP preconditioned resid norm 3.913420828350e-02 true resid norm 9.544089760671e-01 ||r(i)||/||b|| 1.295740315093e-01
>> >   3 KSP preconditioned resid norm 2.928070366435e-03 true resid norm 1.874294004628e-01 ||r(i)||/||b|| 2.544609664236e-02
>> >   4 KSP preconditioned resid norm 2.165607525823e-04 true resid norm 3.121122463949e-02 ||r(i)||/||b|| 4.237349298146e-03
>> >   5 KSP preconditioned resid norm 1.635476480407e-05 true resid norm 3.984315313831e-03 ||r(i)||/||b|| 5.409251284967e-04
>> >   6 KSP preconditioned resid norm 1.283358350575e-06 true resid norm 4.566583802915e-04 ||r(i)||/||b|| 6.199760149022e-05
>> >   7 KSP preconditioned resid norm 8.479469225747e-08 true resid norm 3.824581791112e-05 ||r(i)||/||b|| 5.192391248810e-06
>> >   8 KSP preconditioned resid norm 5.358636504683e-09 true resid norm 2.829730442033e-06 ||r(i)||/||b|| 3.841744898187e-07
>> >   9 KSP preconditioned resid norm 3.447874504193e-10 true resid norm 1.756617036538e-07 ||r(i)||/||b|| 2.384847135242e-08
>> >   10 KSP preconditioned resid norm 2.480228743414e-11 true resid norm 1.309399823577e-08 ||r(i)||/||b|| 1.777688792258e-09
>> >   11 KSP preconditioned resid norm 1.728967759950e-12 true resid norm 9.406967045789e-10 ||r(i)||/||b|| 1.277124036931e-10
>> >   12 KSP preconditioned resid norm 1.075458632828e-13 true resid norm 5.994505136212e-11 ||r(i)||/||b|| 8.138358050689e-12
>> > Linear solve converged due to CONVERGED_RTOL iterations 12
>> > [final KSP Object / PC Object view identical to the GMRES + BoomerAMG
>> > view above]
>> > The max singular value of A = 1.000872 in poisson_solver3d_P0_vd
>> > The min singular value of A = 0.667688 in poisson_solver3d_P0_vd
>> > The Cond Num of A = 1.499012 in poisson_solver3d_P0_vd
>> > In poisson_solver3d_pressure(): num_iter = 12, rel_residual = 1.075459e-13
>> >
>> > The max value of p0 is 0.03115845493408858
>> >
>> > The min value of p0 is -0.07156715468428149
>> >
>> > On Sun, Oct 22, 2017 at 12:00 AM, Barry Smith wrote:
>> >
>> > > On Oct 21, 2017, at 10:50 PM, Hao Zhang wrote:
>> > >
>> > > The incompressible NS solver algorithm calls the PETSc solver at
>> > > different stages of each time step. The one you called "good, a 12
>> > > digit reduction" comes after the initial pressure solve, where HYPRE
>> > > usually does not give good convergence, so the fall-back GMRES
>> > > solver is called afterwards.
>> >
>> >    Hmm, I don't understand. hypre should do well on a pressure solve.
>> > In fact, very well.
>> >
>> > > Barry, you mentioned that I could have a wrong nullspace. That
>> > > particular solver is meant to give an initial pressure profile for a
>> > > 3D incompressible NS simulation using all-Neumann boundary
>> > > conditions. Could you give some insight into how to test whether my
>> > > nullspace is wrong?
>> >
>> >    -ksp_test_null_space
>> >
>> >    But if your null space consistently comes from all-Neumann boundary
>> > conditions then it likely is not wrong.
>> >
>> >    Barry
>> >
>> > > Thanks!
>> > >
>> > > On Sat, Oct 21, 2017 at 11:41 PM, Barry Smith wrote:
>> > >
>> > >    This is good. You get more than a 12 digit reduction in the true
>> > > residual norm. This is good AMG convergence, expected when
>> > > everything goes well.
>> > >
>> > >    What is different in this case from the previous case that does
>> > > not converge reasonably?
>> > >
>> > >    Barry
>> > >
>> > > > On Oct 21, 2017, at 9:29 PM, Hao Zhang wrote:
>> > > >
>> > > > Barry, please advise what you make of this. This is the Poisson
>> > > > solver for the 3D case with all-Neumann BCs; a finite difference
>> > > > scheme was used.
>> > > > Thanks! I'm in learning mode.
>> > > >
>> > > > [KSP Object / PC Object view identical to the first BCGS +
>> > > > BoomerAMG view above]
>> > > >   0 KSP preconditioned resid norm 2.697270170623e-02 true resid norm 9.637159071207e+00 ||r(i)||/||b|| 1.000000000000e+00
>> > > >   1 KSP preconditioned resid norm 4.828857674609e-04 true resid norm 6.294379664645e-01 ||r(i)||/||b|| 6.531364293291e-02
>> > > >   2 KSP preconditioned resid norm 4.533649595815e-06 true resid norm 1.135508605857e-02 ||r(i)||/||b|| 1.178260727531e-03
>> > > >   3 KSP preconditioned resid norm 1.131704082606e-07 true resid norm 1.196393029874e-04 ||r(i)||/||b|| 1.241437462051e-05
>> > > >   4 KSP preconditioned resid norm 3.866281379569e-10 true resid norm 5.121520801846e-07 ||r(i)||/||b|| 5.314347064320e-08
>> > > >   5 KSP preconditioned resid norm 1.070114785241e-11 true resid norm 1.203693733135e-08 ||r(i)||/||b|| 1.249013038221e-09
>> > > >   6 KSP preconditioned resid norm 2.578780418765e-14 true resid norm 6.020297525927e-11 ||r(i)||/||b|| 6.246962908306e-12
>> > > >   7 KSP preconditioned resid norm 8.691764190203e-16 true resid norm 1.866088098154e-12 ||r(i)||/||b|| 1.936346680973e-13
>> > > > Linear solve converged due to CONVERGED_ATOL iterations 7
>> > > >
>> > > > On Sat, Oct 21, 2017 at 6:53 PM, Barry Smith wrote:
>> > > >
>> > > > > On Oct 21, 2017, at 5:47 PM, Hao Zhang wrote:
>> > > > >
>> > > > > hi, Barry:
>> > > > > What do you mean by "setting tolerance = 1e-14 is absurd"?
>> > > >
>> > > >    Trying to decrease the initial residual norm down by a factor
>> > > > of 1e-14 with an iterative method (or even a direct method) is
>> > > > unrealistic, usually unachievable, and almost never necessary. You
>> > > > are requiring || r_n || < 1.e-14 || r_0 || when with double
>> > > > precision numbers you only have roughly 14 decimal digits total to
>> > > > compute with. Round off alone will lead to differences far larger
>> > > > than 1e-14.
>> > > >
>> > > >    If you are using the solver in the context of a nonlinear
>> > > > problem (i.e. inside Newton's method) then 1.e-6 is generally more
>> > > > than plenty to get quadratic convergence of Newton's method.
>> > > >
>> > > >    If you are solving a linear problem then it is extremely likely
>> > > > that errors due to discretization (from the finite element method
>> > > > etc.) and the model are much much larger than even 1.e-8.
>> > > >
>> > > >    So, in summary:
>> > > >
>> > > >    1.e-14 is probably unachievable
>> > > >
>> > > >    1.e-14 is almost for sure not needed.
>> > > >
>> > > >    Barry
>> > > >
>> > > > > On Sat, Oct 21, 2017 at 18:42 Barry Smith wrote:
>> > > > >
>> > > > >    Run with -ksp_view_mat binary -ksp_view_rhs binary and send
>> > > > > the resulting output file called binaryoutput to
>> > > > > petsc-maint at mcs.anl.gov
>> > > > >
>> > > > >    Note you can also use -ksp_type gmres with hypre; there is
>> > > > > unlikely to be a reason to use bcgs
>> > > > >
>> > > > >    BTW: tolerances: relative=1e-14, is absurd
>> > > > >
>> > > > >    My guess is your null space is incorrect.
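Putting Barry's and Mark's suggestions together, a run along these lines
exercises GMRES + BoomerAMG with a realistic tolerance and full
diagnostics; the executable name and the rtol value here are
illustrative, not taken from the thread:

    mpirun -n 24 ./mysolver \
        -ksp_type gmres \
        -pc_type hypre -pc_hypre_type boomeramg \
        -ksp_rtol 1e-6 \
        -ksp_view -ksp_monitor_true_residual -ksp_converged_reason

Because the KSP and the PC are orthogonal, swapping -ksp_type gmres for
-ksp_type bcgs (or -pc_type hypre for -pc_type gamg) needs no code change
once KSPSetFromOptions() is in place, and -ksp_rtol 1e-6 avoids asking
for the 1e-14 reduction that double precision cannot deliver.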
>> > > > >
>> > > > > > On Oct 21, 2017, at 4:34 PM, Hao Zhang wrote:
>> > > > > >
>> > > > > > If this solver doesn't converge, I have a fall-back solution
>> > > > > > which uses the GMRES solver; that setup is fine with me. I
>> > > > > > just want to know whether HYPRE is a reliable option for me,
>> > > > > > or whether I will have to go without a preconditioner.
>> > > > > >
>> > > > > > Thanks!
>> > > > > >
>> > > > > > On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang wrote:
>> > > > > > This is a serial run, still dumping output; parallel is more
>> > > > > > or less the same.
>> > > > > >
>> > > > > > KSP Object: 1 MPI processes
>> > > > > >   type: bcgs
>> > > > > >   maximum iterations=40000, initial guess is zero
>> > > > > >   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
>> > > > > >   left preconditioning
>> > > > > >   using PRECONDITIONED norm type for convergence test
>> > > > > > PC Object: 1 MPI processes
>> > > > > >   type: hypre
>> > > > > >   [same hypre BoomerAMG settings as the views above]
>> > > > > >   linear system matrix = precond matrix:
>> > > > > >   Mat Object: A 1 MPI processes
>> > > > > >     type: seqaij
>> > > > > >     rows=497664, cols=497664
>> > > > > >     total: nonzeros=3363552, allocated nonzeros=3483648
>> > > > > >     total number of mallocs used during MatSetValues calls =0
>> > > > > >       has attached null space
>> > > > > >       not using I-node routines
>> > > > > >   0 KSP preconditioned resid norm 1.630377897956e+01 true resid norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00
>> > > > > >   1 KSP preconditioned resid norm 3.815819909205e-02 true resid norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01
>> > > > > >   2 KSP preconditioned resid norm 4.312671277701e-04 true resid norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03
>> > > > > >   3 KSP preconditioned resid norm 3.011875330569e-04 true resid norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05
>> > > > > >   4 KSP preconditioned resid norm 3.011783821295e-04 true resid norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05
>> > > > > >   5 KSP preconditioned resid norm 3.011783818372e-04 true resid norm 2.504053673004e-04 ||r(i)||/||b|| 3.399594279422e-05
>> > > > > >   [Iterations 6 through 99 omitted: the preconditioned
>> > > > > >   residual norm drifts between roughly 8.4e-06 and 3.0e-04
>> > > > > >   while the true residual norm stagnates near 2.490e-04
>> > > > > >   (||r(i)||/||b|| ~ 3.38e-05).]
>> > > > > >   100 KSP
preconditioned resid norm 1.621163002878e-05 true resid >> norm 2.489299953360e-04 ||r(i)||/||b|| 3.379564093392e-05 >> > > > > > 101 KSP preconditioned resid norm 1.627060872574e-05 true resid >> norm 2.489301570161e-04 ||r(i)||/||b|| 3.379566288419e-05 >> > > > > > 102 KSP preconditioned resid norm 1.622931647243e-05 true resid >> norm 2.489277930910e-04 ||r(i)||/||b|| 3.379534194913e-05 >> > > > > > 103 KSP preconditioned resid norm 1.612544300282e-05 true resid >> norm 2.489317483299e-04 ||r(i)||/||b|| 3.379587892674e-05 >> > > > > > 104 KSP preconditioned resid norm 1.880131646630e-05 true resid >> norm 2.489335862583e-04 ||r(i)||/||b|| 3.379612845059e-05 >> > > > > > 105 KSP preconditioned resid norm 1.880563295793e-05 true resid >> norm 2.489365017923e-04 ||r(i)||/||b|| 3.379652427408e-05 >> > > > > > 106 KSP preconditioned resid norm 1.860619184027e-05 true resid >> norm 2.489362382373e-04 ||r(i)||/||b|| 3.379648849288e-05 >> > > > > > 107 KSP preconditioned resid norm 1.877134148719e-05 true resid >> norm 2.489425484523e-04 ||r(i)||/||b|| 3.379734519061e-05 >> > > > > > 108 KSP preconditioned resid norm 1.914810713538e-05 true resid >> norm 2.489347415573e-04 ||r(i)||/||b|| 3.379628529818e-05 >> > > > > > 109 KSP preconditioned resid norm 1.220673255622e-05 true resid >> norm 2.490196186357e-04 ||r(i)||/||b|| 3.380780851884e-05 >> > > > > > 110 KSP preconditioned resid norm 1.215819132910e-05 true resid >> norm 2.490233518072e-04 ||r(i)||/||b|| 3.380831534776e-05 >> > > > > > 111 KSP preconditioned resid norm 1.196565427400e-05 true resid >> norm 2.490279438906e-04 ||r(i)||/||b|| 3.380893878570e-05 >> > > > > > 112 KSP preconditioned resid norm 1.171748185197e-05 true resid >> norm 2.490242578983e-04 ||r(i)||/||b|| 3.380843836198e-05 >> > > > > > 113 KSP preconditioned resid norm 1.162855824118e-05 true resid >> norm 2.490229018536e-04 ||r(i)||/||b|| 3.380825426043e-05 >> > > > > > 114 KSP preconditioned resid norm 1.175594685689e-05 true resid >> norm 2.490274328440e-04 ||r(i)||/||b|| 3.380886940415e-05 >> > > > > > 115 KSP preconditioned resid norm 1.167979454122e-05 true resid >> norm 2.490271161036e-04 ||r(i)||/||b|| 3.380882640232e-05 >> > > > > > 116 KSP preconditioned resid norm 1.181010893019e-05 true resid >> norm 2.490235420657e-04 ||r(i)||/||b|| 3.380834117795e-05 >> > > > > > 117 KSP preconditioned resid norm 1.175206638194e-05 true resid >> norm 2.490263165345e-04 ||r(i)||/||b|| 3.380871784992e-05 >> > > > > > 118 KSP preconditioned resid norm 1.183804125791e-05 true resid >> norm 2.490221353083e-04 ||r(i)||/||b|| 3.380815019145e-05 >> > > > > > 119 KSP preconditioned resid norm 1.186426973727e-05 true resid >> norm 2.490227115336e-04 ||r(i)||/||b|| 3.380822842189e-05 >> > > > > > 120 KSP preconditioned resid norm 1.181986776689e-05 true resid >> norm 2.490257884230e-04 ||r(i)||/||b|| 3.380864615159e-05 >> > > > > > 121 KSP preconditioned resid norm 1.131443277370e-05 true resid >> norm 2.490259110230e-04 ||r(i)||/||b|| 3.380866279620e-05 >> > > > > > 122 KSP preconditioned resid norm 1.114920075859e-05 true resid >> norm 2.490249829382e-04 ||r(i)||/||b|| 3.380853679603e-05 >> > > > > > 123 KSP preconditioned resid norm 1.082073321672e-05 true resid >> norm 2.490314084868e-04 ||r(i)||/||b|| 3.380940915187e-05 >> > > > > > 124 KSP preconditioned resid norm 3.307785860990e-06 true resid >> norm 2.490613501549e-04 ||r(i)||/||b|| 3.381347414155e-05 >> > > > > > 125 KSP preconditioned resid norm 3.287051720572e-06 true resid >> norm 2.490584648195e-04 ||r(i)||/||b|| 
3.381308241794e-05 >> > > > > > 126 KSP preconditioned resid norm 3.286797046069e-06 true resid >> norm 2.490654396386e-04 ||r(i)||/||b|| 3.381402934473e-05 >> > > > > > 127 KSP preconditioned resid norm 3.311592899411e-06 true resid >> norm 2.490627973588e-04 ||r(i)||/||b|| 3.381367061922e-05 >> > > > > > 128 KSP preconditioned resid norm 3.560993694635e-06 true resid >> norm 2.490571732816e-04 ||r(i)||/||b|| 3.381290707406e-05 >> > > > > > 129 KSP preconditioned resid norm 3.411994617661e-06 true resid >> norm 2.490652122141e-04 ||r(i)||/||b|| 3.381399846875e-05 >> > > > > > 130 KSP preconditioned resid norm 3.412383310721e-06 true resid >> norm 2.490633454022e-04 ||r(i)||/||b|| 3.381374502359e-05 >> > > > > > 131 KSP preconditioned resid norm 3.288320044878e-06 true resid >> norm 2.490639470096e-04 ||r(i)||/||b|| 3.381382669999e-05 >> > > > > > 132 KSP preconditioned resid norm 3.273215756565e-06 true resid >> norm 2.490640390847e-04 ||r(i)||/||b|| 3.381383920043e-05 >> > > > > > 133 KSP preconditioned resid norm 3.236969051459e-06 true resid >> norm 2.490678216102e-04 ||r(i)||/||b|| 3.381435272985e-05 >> > > > > > 134 KSP preconditioned resid norm 3.203260913942e-06 true resid >> norm 2.490640965346e-04 ||r(i)||/||b|| 3.381384700005e-05 >> > > > > > 135 KSP preconditioned resid norm 3.224117152353e-06 true resid >> norm 2.490655026376e-04 ||r(i)||/||b|| 3.381403789770e-05 >> > > > > > 136 KSP preconditioned resid norm 3.221577997984e-06 true resid >> norm 2.490684737611e-04 ||r(i)||/||b|| 3.381444126823e-05 >> > > > > > 137 KSP preconditioned resid norm 3.195936222128e-06 true resid >> norm 2.490673982333e-04 ||r(i)||/||b|| 3.381429525066e-05 >> > > > > > 138 KSP preconditioned resid norm 3.207528137426e-06 true resid >> norm 2.490641247196e-04 ||r(i)||/||b|| 3.381385082655e-05 >> > > > > > 139 KSP preconditioned resid norm 3.240134271963e-06 true resid >> norm 2.490615861251e-04 ||r(i)||/||b|| 3.381350617773e-05 >> > > > > > 140 KSP preconditioned resid norm 2.698833607230e-06 true resid >> norm 2.490638954889e-04 ||r(i)||/||b|| 3.381381970535e-05 >> > > > > > 141 KSP preconditioned resid norm 2.599151209137e-06 true resid >> norm 2.490657106698e-04 ||r(i)||/||b|| 3.381406614091e-05 >> > > > > > 142 KSP preconditioned resid norm 2.633939920994e-06 true resid >> norm 2.490707754695e-04 ||r(i)||/||b|| 3.381475375652e-05 >> > > > > > 143 KSP preconditioned resid norm 2.519609221376e-06 true resid >> norm 2.490639100480e-04 ||r(i)||/||b|| 3.381382168195e-05 >> > > > > > 144 KSP preconditioned resid norm 3.768526937684e-06 true resid >> norm 2.490654096698e-04 ||r(i)||/||b|| 3.381402527606e-05 >> > > > > > 145 KSP preconditioned resid norm 3.707841943289e-06 true resid >> norm 2.490630207923e-04 ||r(i)||/||b|| 3.381370095336e-05 >> > > > > > 146 KSP preconditioned resid norm 3.698827503486e-06 true resid >> norm 2.490646071561e-04 ||r(i)||/||b|| 3.381391632387e-05 >> > > > > > 147 KSP preconditioned resid norm 3.642747039615e-06 true resid >> norm 2.490610990161e-04 ||r(i)||/||b|| 3.381344004604e-05 >> > > > > > 148 KSP preconditioned resid norm 3.613100087842e-06 true resid >> norm 2.490617159023e-04 ||r(i)||/||b|| 3.381352379676e-05 >> > > > > > 149 KSP preconditioned resid norm 3.637646399299e-06 true resid >> norm 2.490648063023e-04 ||r(i)||/||b|| 3.381394336069e-05 >> > > > > > 150 KSP preconditioned resid norm 3.640235367864e-06 true resid >> norm 2.490648516718e-04 ||r(i)||/||b|| 3.381394952022e-05 >> > > > > > 151 KSP preconditioned resid norm 3.724708848977e-06 true resid >> norm 
2.490622201040e-04 ||r(i)||/||b|| 3.381359224901e-05 >> > > > > > 152 KSP preconditioned resid norm 3.665185002770e-06 true resid >> norm 2.490664302790e-04 ||r(i)||/||b|| 3.381416383766e-05 >> > > > > > 153 KSP preconditioned resid norm 3.348992579120e-06 true resid >> norm 2.490655722697e-04 ||r(i)||/||b|| 3.381404735121e-05 >> > > > > > 154 KSP preconditioned resid norm 3.309431137943e-06 true resid >> norm 2.490727563300e-04 ||r(i)||/||b|| 3.381502268535e-05 >> > > > > > 155 KSP preconditioned resid norm 3.299031245428e-06 true resid >> norm 2.490688392843e-04 ||r(i)||/||b|| 3.381449089298e-05 >> > > > > > 156 KSP preconditioned resid norm 3.297127463503e-06 true resid >> norm 2.490642207769e-04 ||r(i)||/||b|| 3.381386386763e-05 >> > > > > > 157 KSP preconditioned resid norm 3.297370198641e-06 true resid >> norm 2.490666651723e-04 ||r(i)||/||b|| 3.381419572764e-05 >> > > > > > 158 KSP preconditioned resid norm 3.290873165210e-06 true resid >> norm 2.490679189538e-04 ||r(i)||/||b|| 3.381436594557e-05 >> > > > > > 159 KSP preconditioned resid norm 3.346705292419e-06 true resid >> norm 2.490617329776e-04 ||r(i)||/||b|| 3.381352611496e-05 >> > > > > > 160 KSP preconditioned resid norm 3.429583550890e-06 true resid >> norm 2.490675116236e-04 ||r(i)||/||b|| 3.381431064494e-05 >> > > > > > 161 KSP preconditioned resid norm 3.425238504679e-06 true resid >> norm 2.490648199058e-04 ||r(i)||/||b|| 3.381394520756e-05 >> > > > > > 162 KSP preconditioned resid norm 3.423484857849e-06 true resid >> norm 2.490723208298e-04 ||r(i)||/||b|| 3.381496356025e-05 >> > > > > > 163 KSP preconditioned resid norm 3.383655922943e-06 true resid >> norm 2.490659981249e-04 ||r(i)||/||b|| 3.381410516686e-05 >> > > > > > 164 KSP preconditioned resid norm 3.477197358452e-06 true resid >> norm 2.490665979073e-04 ||r(i)||/||b|| 3.381418659549e-05 >> > > > > > 165 KSP preconditioned resid norm 3.454672202601e-06 true resid >> norm 2.490651358644e-04 ||r(i)||/||b|| 3.381398810323e-05 >> > > > > > 166 KSP preconditioned resid norm 3.399075522566e-06 true resid >> norm 2.490678159511e-04 ||r(i)||/||b|| 3.381435196154e-05 >> > > > > > 167 KSP preconditioned resid norm 3.305455787400e-06 true resid >> norm 2.490651924523e-04 ||r(i)||/||b|| 3.381399578581e-05 >> > > > > > 168 KSP preconditioned resid norm 3.368445533284e-06 true resid >> norm 2.490688061735e-04 ||r(i)||/||b|| 3.381448639774e-05 >> > > > > > 169 KSP preconditioned resid norm 2.981519724814e-06 true resid >> norm 2.490676378334e-04 ||r(i)||/||b|| 3.381432777964e-05 >> > > > > > 170 KSP preconditioned resid norm 3.034423065539e-06 true resid >> norm 2.490694458885e-04 ||r(i)||/||b|| 3.381457324777e-05 >> > > > > > 171 KSP preconditioned resid norm 2.885972780503e-06 true resid >> norm 2.490688033729e-04 ||r(i)||/||b|| 3.381448601752e-05 >> > > > > > 172 KSP preconditioned resid norm 2.892491075033e-06 true resid >> norm 2.490692993765e-04 ||r(i)||/||b|| 3.381455335678e-05 >> > > > > > 173 KSP preconditioned resid norm 2.921316177611e-06 true resid >> norm 2.490697629787e-04 ||r(i)||/||b|| 3.381461629709e-05 >> > > > > > 174 KSP preconditioned resid norm 2.999889222269e-06 true resid >> norm 2.490707272626e-04 ||r(i)||/||b|| 3.381474721178e-05 >> > > > > > 175 KSP preconditioned resid norm 2.975590207575e-06 true resid >> norm 2.490685439925e-04 ||r(i)||/||b|| 3.381445080310e-05 >> > > > > > 176 KSP preconditioned resid norm 2.983065843597e-06 true resid >> norm 2.490701883671e-04 ||r(i)||/||b|| 3.381467404937e-05 >> > > > > > 177 KSP preconditioned resid norm 
2.965959610245e-06 true resid >> norm 2.490711538630e-04 ||r(i)||/||b|| 3.381480512861e-05 >> > > > > > 178 KSP preconditioned resid norm 3.005389788827e-06 true resid >> norm 2.490751808095e-04 ||r(i)||/||b|| 3.381535184150e-05 >> > > > > > 179 KSP preconditioned resid norm 2.956581668772e-06 true resid >> norm 2.490653125636e-04 ||r(i)||/||b|| 3.381401209257e-05 >> > > > > > 180 KSP preconditioned resid norm 2.937498883661e-06 true resid >> norm 2.490666056653e-04 ||r(i)||/||b|| 3.381418764874e-05 >> > > > > > 181 KSP preconditioned resid norm 2.913227475431e-06 true resid >> norm 2.490682436979e-04 ||r(i)||/||b|| 3.381441003402e-05 >> > > > > > 182 KSP preconditioned resid norm 3.048172862254e-06 true resid >> norm 2.490719669872e-04 ||r(i)||/||b|| 3.381491552130e-05 >> > > > > > 183 KSP preconditioned resid norm 3.023868104933e-06 true resid >> norm 2.490648745555e-04 ||r(i)||/||b|| 3.381395262699e-05 >> > > > > > 184 KSP preconditioned resid norm 2.985947506400e-06 true resid >> norm 2.490638818852e-04 ||r(i)||/||b|| 3.381381785846e-05 >> > > > > > 185 KSP preconditioned resid norm 2.840032055776e-06 true resid >> norm 2.490701112392e-04 ||r(i)||/||b|| 3.381466357820e-05 >> > > > > > 186 KSP preconditioned resid norm 2.229279683815e-06 true resid >> norm 2.490609220680e-04 ||r(i)||/||b|| 3.381341602292e-05 >> > > > > > 187 KSP preconditioned resid norm 2.441513276379e-06 true resid >> norm 2.490674056899e-04 ||r(i)||/||b|| 3.381429626300e-05 >> > > > > > 188 KSP preconditioned resid norm 2.467046864016e-06 true resid >> norm 2.490691622632e-04 ||r(i)||/||b|| 3.381453474178e-05 >> > > > > > 189 KSP preconditioned resid norm 2.482124586361e-06 true resid >> norm 2.490664992339e-04 ||r(i)||/||b|| 3.381417319923e-05 >> > > > > > 190 KSP preconditioned resid norm 2.470564926502e-06 true resid >> norm 2.490617019713e-04 ||r(i)||/||b|| 3.381352190543e-05 >> > > > > > 191 KSP preconditioned resid norm 2.457947086578e-06 true resid >> norm 2.490628644250e-04 ||r(i)||/||b|| 3.381367972437e-05 >> > > > > > 192 KSP preconditioned resid norm 2.469444741724e-06 true resid >> norm 2.490639416335e-04 ||r(i)||/||b|| 3.381382597011e-05 >> > > > > > 193 KSP preconditioned resid norm 2.469951525219e-06 true resid >> norm 2.490599769764e-04 ||r(i)||/||b|| 3.381328771385e-05 >> > > > > > 194 KSP preconditioned resid norm 2.467486786643e-06 true resid >> norm 2.490630178622e-04 ||r(i)||/||b|| 3.381370055556e-05 >> > > > > > 195 KSP preconditioned resid norm 2.409684391404e-06 true resid >> norm 2.490640302606e-04 ||r(i)||/||b|| 3.381383800245e-05 >> > > > > > 196 KSP preconditioned resid norm 2.456046691135e-06 true resid >> norm 2.490637730235e-04 ||r(i)||/||b|| 3.381380307900e-05 >> > > > > > 197 KSP preconditioned resid norm 2.300015653805e-06 true resid >> norm 2.490615406913e-04 ||r(i)||/||b|| 3.381350000947e-05 >> > > > > > 198 KSP preconditioned resid norm 2.238328275301e-06 true resid >> norm 2.490647641246e-04 ||r(i)||/||b|| 3.381393763449e-05 >> > > > > > 199 KSP preconditioned resid norm 2.317293820319e-06 true resid >> norm 2.490641611282e-04 ||r(i)||/||b|| 3.381385576951e-05 >> > > > > > 200 KSP preconditioned resid norm 2.359590971314e-06 true resid >> norm 2.490685242974e-04 ||r(i)||/||b|| 3.381444812922e-05 >> > > > > > 201 KSP preconditioned resid norm 2.311199691596e-06 true resid >> norm 2.490656791753e-04 ||r(i)||/||b|| 3.381406186510e-05 >> > > > > > 202 KSP preconditioned resid norm 2.328772904196e-06 true resid >> norm 2.490651045523e-04 ||r(i)||/||b|| 3.381398385220e-05 >> > > > > > 
203 KSP preconditioned resid norm 2.332731604717e-06 true resid >> norm 2.490649960574e-04 ||r(i)||/||b|| 3.381396912253e-05 >> > > > > > 204 KSP preconditioned resid norm 2.357629383490e-06 true resid >> norm 2.490686317727e-04 ||r(i)||/||b|| 3.381446272046e-05 >> > > > > > 205 KSP preconditioned resid norm 2.374856180299e-06 true resid >> norm 2.490645897176e-04 ||r(i)||/||b|| 3.381391395637e-05 >> > > > > > 206 KSP preconditioned resid norm 2.340395514404e-06 true resid >> norm 2.490618341127e-04 ||r(i)||/||b|| 3.381353984542e-05 >> > > > > > 207 KSP preconditioned resid norm 2.314963680954e-06 true resid >> norm 2.490676153984e-04 ||r(i)||/||b|| 3.381432473379e-05 >> > > > > > 208 KSP preconditioned resid norm 2.448070953106e-06 true resid >> norm 2.490644606776e-04 ||r(i)||/||b|| 3.381389643743e-05 >> > > > > > 209 KSP preconditioned resid norm 2.428805110632e-06 true resid >> norm 2.490635817597e-04 ||r(i)||/||b|| 3.381377711234e-05 >> > > > > > 210 KSP preconditioned resid norm 2.537929937808e-06 true resid >> norm 2.490680589404e-04 ||r(i)||/||b|| 3.381438495066e-05 >> > > > > > 211 KSP preconditioned resid norm 2.515909029682e-06 true resid >> norm 2.490687803038e-04 ||r(i)||/||b|| 3.381448288557e-05 >> > > > > > 212 KSP preconditioned resid norm 2.497907513266e-06 true resid >> norm 2.490618016885e-04 ||r(i)||/||b|| 3.381353544340e-05 >> > > > > > 213 KSP preconditioned resid norm 1.783501869502e-06 true resid >> norm 2.490632647470e-04 ||r(i)||/||b|| 3.381373407354e-05 >> > > > > > 214 KSP preconditioned resid norm 1.767420653144e-06 true resid >> norm 2.490685569328e-04 ||r(i)||/||b|| 3.381445255992e-05 >> > > > > > 215 KSP preconditioned resid norm 1.854926068272e-06 true resid >> norm 2.490609365464e-04 ||r(i)||/||b|| 3.381341798856e-05 >> > > > > > 216 KSP preconditioned resid norm 1.818308539774e-06 true resid >> norm 2.490639142283e-04 ||r(i)||/||b|| 3.381382224948e-05 >> > > > > > 217 KSP preconditioned resid norm 1.809431578070e-06 true resid >> norm 2.490605125049e-04 ||r(i)||/||b|| 3.381336041915e-05 >> > > > > > 218 KSP preconditioned resid norm 1.789862735999e-06 true resid >> norm 2.490564024901e-04 ||r(i)||/||b|| 3.381280242859e-05 >> > > > > > 219 KSP preconditioned resid norm 1.769239890163e-06 true resid >> norm 2.490647825316e-04 ||r(i)||/||b|| 3.381394013349e-05 >> > > > > > 220 KSP preconditioned resid norm 1.780760773109e-06 true resid >> norm 2.490622606663e-04 ||r(i)||/||b|| 3.381359775589e-05 >> > > > > > 221 KSP preconditioned resid norm 5.009024913368e-07 true resid >> norm 2.490659101637e-04 ||r(i)||/||b|| 3.381409322492e-05 >> > > > > > 222 KSP preconditioned resid norm 4.974450322799e-07 true resid >> norm 2.490714287402e-04 ||r(i)||/||b|| 3.381484244693e-05 >> > > > > > 223 KSP preconditioned resid norm 4.938819481519e-07 true resid >> norm 2.490665661715e-04 ||r(i)||/||b|| 3.381418228693e-05 >> > > > > > 224 KSP preconditioned resid norm 4.973231831266e-07 true resid >> norm 2.490725000995e-04 ||r(i)||/||b|| 3.381498789855e-05 >> > > > > > 225 KSP preconditioned resid norm 5.086864036771e-07 true resid >> norm 2.490664132954e-04 ||r(i)||/||b|| 3.381416153192e-05 >> > > > > > 226 KSP preconditioned resid norm 5.046954570561e-07 true resid >> norm 2.490698772594e-04 ||r(i)||/||b|| 3.381463181226e-05 >> > > > > > 227 KSP preconditioned resid norm 5.086852920874e-07 true resid >> norm 2.490703544723e-04 ||r(i)||/||b|| 3.381469660041e-05 >> > > > > > 228 KSP preconditioned resid norm 5.182381756169e-07 true resid >> norm 2.490665200032e-04 
||r(i)||/||b|| 3.381417601896e-05 >> > > > > > 229 KSP preconditioned resid norm 5.261455182896e-07 true resid >> norm 2.490697169472e-04 ||r(i)||/||b|| 3.381461004770e-05 >> > > > > > 230 KSP preconditioned resid norm 5.265262522400e-07 true resid >> norm 2.490726890541e-04 ||r(i)||/||b|| 3.381501355172e-05 >> > > > > > 231 KSP preconditioned resid norm 5.220652263946e-07 true resid >> norm 2.490689325236e-04 ||r(i)||/||b|| 3.381450355149e-05 >> > > > > > 232 KSP preconditioned resid norm 5.256466259888e-07 true resid >> norm 2.490694033989e-04 ||r(i)||/||b|| 3.381456747923e-05 >> > > > > > 233 KSP preconditioned resid norm 5.443022648374e-07 true resid >> norm 2.490650183144e-04 ||r(i)||/||b|| 3.381397214423e-05 >> > > > > > 234 KSP preconditioned resid norm 5.562619006436e-07 true resid >> norm 2.490764576883e-04 ||r(i)||/||b|| 3.381552519520e-05 >> > > > > > 235 KSP preconditioned resid norm 5.998148629545e-07 true resid >> norm 2.490714032716e-04 ||r(i)||/||b|| 3.381483898922e-05 >> > > > > > 236 KSP preconditioned resid norm 6.498977322955e-07 true resid >> norm 2.490650270144e-04 ||r(i)||/||b|| 3.381397332537e-05 >> > > > > > 237 KSP preconditioned resid norm 6.503686003429e-07 true resid >> norm 2.490706976108e-04 ||r(i)||/||b|| 3.381474318615e-05 >> > > > > > 238 KSP preconditioned resid norm 6.566719023119e-07 true resid >> norm 2.490664107559e-04 ||r(i)||/||b|| 3.381416118714e-05 >> > > > > > 239 KSP preconditioned resid norm 6.549737473208e-07 true resid >> norm 2.490721547909e-04 ||r(i)||/||b|| 3.381494101821e-05 >> > > > > > 240 KSP preconditioned resid norm 6.616898981418e-07 true resid >> norm 2.490679659838e-04 ||r(i)||/||b|| 3.381437233053e-05 >> > > > > > 241 KSP preconditioned resid norm 6.829917691021e-07 true resid >> norm 2.490728328614e-04 ||r(i)||/||b|| 3.381503307553e-05 >> > > > > > 242 KSP preconditioned resid norm 7.030239869389e-07 true resid >> norm 2.490706345115e-04 ||r(i)||/||b|| 3.381473461955e-05 >> > > > > > 243 KSP preconditioned resid norm 7.018435683340e-07 true resid >> norm 2.490650978460e-04 ||r(i)||/||b|| 3.381398294172e-05 >> > > > > > 244 KSP preconditioned resid norm 7.058047080376e-07 true resid >> norm 2.490685975642e-04 ||r(i)||/||b|| 3.381445807618e-05 >> > > > > > 245 KSP preconditioned resid norm 6.896300385099e-07 true resid >> norm 2.490708566380e-04 ||r(i)||/||b|| 3.381476477625e-05 >> > > > > > 246 KSP preconditioned resid norm 7.093960074437e-07 true resid >> norm 2.490667427871e-04 ||r(i)||/||b|| 3.381420626490e-05 >> > > > > > 247 KSP preconditioned resid norm 7.817121711853e-07 true resid >> norm 2.490692299030e-04 ||r(i)||/||b|| 3.381454392480e-05 >> > > > > > 248 KSP preconditioned resid norm 7.976109778309e-07 true resid >> norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 >> > > > > > 249 KSP preconditioned resid norm 7.855322750445e-07 true resid >> norm 2.490720861966e-04 ||r(i)||/||b|| 3.381493170560e-05 >> > > > > > 250 KSP preconditioned resid norm 7.778531114042e-07 true resid >> norm 2.490673235034e-04 ||r(i)||/||b|| 3.381428510506e-05 >> > > > > > 251 KSP preconditioned resid norm 7.848682182070e-07 true resid >> norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 >> > > > > > 252 KSP preconditioned resid norm 7.967291867330e-07 true resid >> norm 2.490724820229e-04 ||r(i)||/||b|| 3.381498544442e-05 >> > > > > > 253 KSP preconditioned resid norm 7.865012959525e-07 true resid >> norm 2.490666662028e-04 ||r(i)||/||b|| 3.381419586754e-05 >> > > > > > 254 KSP preconditioned resid norm 7.656025385804e-07 true 
resid >> norm 2.490686283214e-04 ||r(i)||/||b|| 3.381446225190e-05 >> > > > > > 255 KSP preconditioned resid norm 7.757018653468e-07 true resid >> norm 2.490655983763e-04 ||r(i)||/||b|| 3.381405089553e-05 >> > > > > > 256 KSP preconditioned resid norm 6.686490372981e-07 true resid >> norm 2.490715698964e-04 ||r(i)||/||b|| 3.381486161081e-05 >> > > > > > 257 KSP preconditioned resid norm 6.596005109428e-07 true resid >> norm 2.490666403003e-04 ||r(i)||/||b|| 3.381419235092e-05 >> > > > > > 258 KSP preconditioned resid norm 6.681742296333e-07 true resid >> norm 2.490683725835e-04 ||r(i)||/||b|| 3.381442753198e-05 >> > > > > > 259 KSP preconditioned resid norm 1.089245482033e-06 true resid >> norm 2.490688086568e-04 ||r(i)||/||b|| 3.381448673488e-05 >> > > > > > 260 KSP preconditioned resid norm 1.099844873189e-06 true resid >> norm 2.490690703265e-04 ||r(i)||/||b|| 3.381452226011e-05 >> > > > > > 261 KSP preconditioned resid norm 1.112925540869e-06 true resid >> norm 2.490664481058e-04 ||r(i)||/||b|| 3.381416625790e-05 >> > > > > > 262 KSP preconditioned resid norm 1.113056910480e-06 true resid >> norm 2.490658753273e-04 ||r(i)||/||b|| 3.381408849541e-05 >> > > > > > 263 KSP preconditioned resid norm 1.104801535149e-06 true resid >> norm 2.490736510776e-04 ||r(i)||/||b|| 3.381514415953e-05 >> > > > > > 264 KSP preconditioned resid norm 1.158709147873e-06 true resid >> norm 2.490607531152e-04 ||r(i)||/||b|| 3.381339308528e-05 >> > > > > > 265 KSP preconditioned resid norm 1.178985740182e-06 true resid >> norm 2.490727895619e-04 ||r(i)||/||b|| 3.381502719703e-05 >> > > > > > 266 KSP preconditioned resid norm 1.165130533478e-06 true resid >> norm 2.490639076693e-04 ||r(i)||/||b|| 3.381382135901e-05 >> > > > > > 267 KSP preconditioned resid norm 1.181364114499e-06 true resid >> norm 2.490667871436e-04 ||r(i)||/||b|| 3.381421228690e-05 >> > > > > > 268 KSP preconditioned resid norm 1.170295348543e-06 true resid >> norm 2.490662613306e-04 ||r(i)||/||b|| 3.381414090063e-05 >> > > > > > 269 KSP preconditioned resid norm 1.213243016230e-06 true resid >> norm 2.490666173719e-04 ||r(i)||/||b|| 3.381418923808e-05 >> > > > > > 270 KSP preconditioned resid norm 1.239691953997e-06 true resid >> norm 2.490678323197e-04 ||r(i)||/||b|| 3.381435418381e-05 >> > > > > > 271 KSP preconditioned resid norm 1.219891740100e-06 true resid >> norm 2.490625009256e-04 ||r(i)||/||b|| 3.381363037437e-05 >> > > > > > 272 KSP preconditioned resid norm 1.231321334346e-06 true resid >> norm 2.490659733696e-04 ||r(i)||/||b|| 3.381410180599e-05 >> > > > > > 273 KSP preconditioned resid norm 1.208183234158e-06 true resid >> norm 2.490685987255e-04 ||r(i)||/||b|| 3.381445823385e-05 >> > > > > > 274 KSP preconditioned resid norm 1.211768545589e-06 true resid >> norm 2.490671548953e-04 ||r(i)||/||b|| 3.381426221421e-05 >> > > > > > 275 KSP preconditioned resid norm 1.209433459842e-06 true resid >> norm 2.490669016096e-04 ||r(i)||/||b|| 3.381422782722e-05 >> > > > > > 276 KSP preconditioned resid norm 1.223729184405e-06 true resid >> norm 2.490658128014e-04 ||r(i)||/||b|| 3.381408000666e-05 >> > > > > > 277 KSP preconditioned resid norm 1.243915201868e-06 true resid >> norm 2.490693375756e-04 ||r(i)||/||b|| 3.381455854282e-05 >> > > > > > 278 KSP preconditioned resid norm 1.231994655529e-06 true resid >> norm 2.490682988311e-04 ||r(i)||/||b|| 3.381441751910e-05 >> > > > > > 279 KSP preconditioned resid norm 1.227930683777e-06 true resid >> norm 2.490667825866e-04 ||r(i)||/||b|| 3.381421166823e-05 >> > > > > > 280 KSP preconditioned 
resid norm 1.193458846469e-06 true resid >> norm 2.490687366117e-04 ||r(i)||/||b|| 3.381447695378e-05 >> > > > > > 281 KSP preconditioned resid norm 1.217089059805e-06 true resid >> norm 2.490674797371e-04 ||r(i)||/||b|| 3.381430631591e-05 >> > > > > > 282 KSP preconditioned resid norm 1.249318287709e-06 true resid >> norm 2.490662866951e-04 ||r(i)||/||b|| 3.381414434420e-05 >> > > > > > 283 KSP preconditioned resid norm 1.183320029547e-06 true resid >> norm 2.490645783630e-04 ||r(i)||/||b|| 3.381391241482e-05 >> > > > > > 284 KSP preconditioned resid norm 1.174730603102e-06 true resid >> norm 2.490686881647e-04 ||r(i)||/||b|| 3.381447037643e-05 >> > > > > > 285 KSP preconditioned resid norm 1.175838261923e-06 true resid >> norm 2.490665969300e-04 ||r(i)||/||b|| 3.381418646281e-05 >> > > > > > 286 KSP preconditioned resid norm 1.188946188368e-06 true resid >> norm 2.490661974622e-04 ||r(i)||/||b|| 3.381413222961e-05 >> > > > > > 287 KSP preconditioned resid norm 1.177848565707e-06 true resid >> norm 2.490660236206e-04 ||r(i)||/||b|| 3.381410862824e-05 >> > > > > > 288 KSP preconditioned resid norm 1.200075508281e-06 true resid >> norm 2.490645353536e-04 ||r(i)||/||b|| 3.381390657571e-05 >> > > > > > 289 KSP preconditioned resid norm 1.184589570618e-06 true resid >> norm 2.490664920355e-04 ||r(i)||/||b|| 3.381417222195e-05 >> > > > > > 290 KSP preconditioned resid norm 1.221114703873e-06 true resid >> norm 2.490670597538e-04 ||r(i)||/||b|| 3.381424929746e-05 >> > > > > > 291 KSP preconditioned resid norm 1.249479658256e-06 true resid >> norm 2.490641582876e-04 ||r(i)||/||b|| 3.381385538385e-05 >> > > > > > 292 KSP preconditioned resid norm 1.245768496850e-06 true resid >> norm 2.490704480588e-04 ||r(i)||/||b|| 3.381470930606e-05 >> > > > > > 293 KSP preconditioned resid norm 1.243742607953e-06 true resid >> norm 2.490649690604e-04 ||r(i)||/||b|| 3.381396545733e-05 >> > > > > > 294 KSP preconditioned resid norm 1.342758483339e-06 true resid >> norm 2.490676207432e-04 ||r(i)||/||b|| 3.381432545942e-05 >> > > > > > 295 KSP preconditioned resid norm 1.353816099600e-06 true resid >> norm 2.490695263153e-04 ||r(i)||/||b|| 3.381458416681e-05 >> > > > > > 296 KSP preconditioned resid norm 1.343886763293e-06 true resid >> norm 2.490673674307e-04 ||r(i)||/||b|| 3.381429106879e-05 >> > > > > > 297 KSP preconditioned resid norm 1.355511022815e-06 true resid >> norm 2.490686565995e-04 ||r(i)||/||b|| 3.381446609103e-05 >> > > > > > 298 KSP preconditioned resid norm 1.347247627243e-06 true resid >> norm 2.490696287707e-04 ||r(i)||/||b|| 3.381459807652e-05 >> > > > > > 299 KSP preconditioned resid norm 1.414742595618e-06 true resid >> norm 2.490749815091e-04 ||r(i)||/||b|| 3.381532478374e-05 >> > > > > > 300 KSP preconditioned resid norm 1.418560683189e-06 true resid >> norm 2.490721501153e-04 ||r(i)||/||b|| 3.381494038343e-05 >> > > > > > 301 KSP preconditioned resid norm 1.416276404923e-06 true resid >> norm 2.490689576447e-04 ||r(i)||/||b|| 3.381450696203e-05 >> > > > > > 302 KSP preconditioned resid norm 1.431448272112e-06 true resid >> norm 2.490688812701e-04 ||r(i)||/||b|| 3.381449659312e-05 >> > > > > > 303 KSP preconditioned resid norm 1.446154958969e-06 true resid >> norm 2.490727536322e-04 ||r(i)||/||b|| 3.381502231909e-05 >> > > > > > 304 KSP preconditioned resid norm 1.468860617921e-06 true resid >> norm 2.490692363788e-04 ||r(i)||/||b|| 3.381454480397e-05 >> > > > > > 305 KSP preconditioned resid norm 1.627595214971e-06 true resid >> norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 >> 
> > > > > 306 KSP preconditioned resid norm 1.614384672893e-06 true resid >> norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 >> > > > > > 307 KSP preconditioned resid norm 1.605568020532e-06 true resid >> norm 2.490699757693e-04 ||r(i)||/||b|| 3.381464518632e-05 >> > > > > > 308 KSP preconditioned resid norm 1.617069685075e-06 true resid >> norm 2.490649282923e-04 ||r(i)||/||b|| 3.381395992249e-05 >> > > > > > 309 KSP preconditioned resid norm 1.654297792738e-06 true resid >> norm 2.490644766626e-04 ||r(i)||/||b|| 3.381389860760e-05 >> > > > > > 310 KSP preconditioned resid norm 1.587528143215e-06 true resid >> norm 2.490696752096e-04 ||r(i)||/||b|| 3.381460438124e-05 >> > > > > > 311 KSP preconditioned resid norm 1.662782022388e-06 true resid >> norm 2.490699317737e-04 ||r(i)||/||b|| 3.381463921332e-05 >> > > > > > 312 KSP preconditioned resid norm 1.618211471748e-06 true resid >> norm 2.490735831308e-04 ||r(i)||/||b|| 3.381513493483e-05 >> > > > > > 313 KSP preconditioned resid norm 1.609074961921e-06 true resid >> norm 2.490679566436e-04 ||r(i)||/||b|| 3.381437106247e-05 >> > > > > > 314 KSP preconditioned resid norm 1.548068942878e-06 true resid >> norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 >> > > > > > 315 KSP preconditioned resid norm 1.526718322150e-06 true resid >> norm 2.490619832967e-04 ||r(i)||/||b|| 3.381356009919e-05 >> > > > > > 316 KSP preconditioned resid norm 1.553150959105e-06 true resid >> norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 >> > > > > > 317 KSP preconditioned resid norm 1.615015320906e-06 true resid >> norm 2.490672348079e-04 ||r(i)||/||b|| 3.381427306343e-05 >> > > > > > 318 KSP preconditioned resid norm 1.602904469797e-06 true resid >> norm 2.490696731006e-04 ||r(i)||/||b|| 3.381460409491e-05 >> > > > > > 319 KSP preconditioned resid norm 1.538140323073e-06 true resid >> norm 2.490722982494e-04 ||r(i)||/||b|| 3.381496049466e-05 >> > > > > > 320 KSP preconditioned resid norm 1.534779679430e-06 true resid >> norm 2.490778499789e-04 ||r(i)||/||b|| 3.381571421763e-05 >> > > > > > 321 KSP preconditioned resid norm 1.547155843355e-06 true resid >> norm 2.490767612985e-04 ||r(i)||/||b|| 3.381556641442e-05 >> > > > > > 322 KSP preconditioned resid norm 1.422137008870e-06 true resid >> norm 2.490737676309e-04 ||r(i)||/||b|| 3.381515998323e-05 >> > > > > > 323 KSP preconditioned resid norm 1.403072558954e-06 true resid >> norm 2.490741361870e-04 ||r(i)||/||b|| 3.381521001975e-05 >> > > > > > 324 KSP preconditioned resid norm 1.373070436118e-06 true resid >> norm 2.490742214990e-04 ||r(i)||/||b|| 3.381522160202e-05 >> > > > > > 325 KSP preconditioned resid norm 1.359547585233e-06 true resid >> norm 2.490792987570e-04 ||r(i)||/||b|| 3.381591090902e-05 >> > > > > > 326 KSP preconditioned resid norm 1.370351913612e-06 true resid >> norm 2.490727161158e-04 ||r(i)||/||b|| 3.381501722573e-05 >> > > > > > 327 KSP preconditioned resid norm 1.365238666187e-06 true resid >> norm 2.490716949642e-04 ||r(i)||/||b|| 3.381487859046e-05 >> > > > > > 328 KSP preconditioned resid norm 1.369073373042e-06 true resid >> norm 2.490807360288e-04 ||r(i)||/||b|| 3.381610603826e-05 >> > > > > > 329 KSP preconditioned resid norm 1.426698981572e-06 true resid >> norm 2.490791479521e-04 ||r(i)||/||b|| 3.381589043520e-05 >> > > > > > 330 KSP preconditioned resid norm 1.445542403570e-06 true resid >> norm 2.490775981409e-04 ||r(i)||/||b|| 3.381568002720e-05 >> > > > > > 331 KSP preconditioned resid norm 1.464506963984e-06 true resid >> norm 2.490740562430e-04 
||r(i)||/||b|| 3.381519916626e-05 >> > > > > > 332 KSP preconditioned resid norm 1.461462964401e-06 true resid >> norm 2.490768016856e-04 ||r(i)||/||b|| 3.381557189753e-05 >> > > > > > 333 KSP preconditioned resid norm 1.476680847971e-06 true resid >> norm 2.490744321516e-04 ||r(i)||/||b|| 3.381525020097e-05 >> > > > > > 334 KSP preconditioned resid norm 1.459640372198e-06 true resid >> norm 2.490788817993e-04 ||r(i)||/||b|| 3.381585430132e-05 >> > > > > > 335 KSP preconditioned resid norm 1.790770882365e-06 true resid >> norm 2.490771711471e-04 ||r(i)||/||b|| 3.381562205697e-05 >> > > > > > 336 KSP preconditioned resid norm 1.803770155018e-06 true resid >> norm 2.490768953858e-04 ||r(i)||/||b|| 3.381558461860e-05 >> > > > > > 337 KSP preconditioned resid norm 1.787821255995e-06 true resid >> norm 2.490767985676e-04 ||r(i)||/||b|| 3.381557147421e-05 >> > > > > > 338 KSP preconditioned resid norm 1.749912220831e-06 true resid >> norm 2.490760198704e-04 ||r(i)||/||b|| 3.381546575545e-05 >> > > > > > 339 KSP preconditioned resid norm 1.802915839010e-06 true resid >> norm 2.490815556273e-04 ||r(i)||/||b|| 3.381621730993e-05 >> > > > > > 340 KSP preconditioned resid norm 1.800777670709e-06 true resid >> norm 2.490823909286e-04 ||r(i)||/||b|| 3.381633071347e-05 >> > > > > > 341 KSP preconditioned resid norm 1.962516327690e-06 true resid >> norm 2.490773477410e-04 ||r(i)||/||b|| 3.381564603199e-05 >> > > > > > 342 KSP preconditioned resid norm 1.981726465132e-06 true resid >> norm 2.490769116884e-04 ||r(i)||/||b|| 3.381558683191e-05 >> > > > > > 343 KSP preconditioned resid norm 1.963419167052e-06 true resid >> norm 2.490764009914e-04 ||r(i)||/||b|| 3.381551749783e-05 >> > > > > > 344 KSP preconditioned resid norm 1.992082169278e-06 true resid >> norm 2.490806829883e-04 ||r(i)||/||b|| 3.381609883728e-05 >> > > > > > 345 KSP preconditioned resid norm 1.981005134253e-06 true resid >> norm 2.490748677339e-04 ||r(i)||/||b|| 3.381530933721e-05 >> > > > > > 346 KSP preconditioned resid norm 1.959802663114e-06 true resid >> norm 2.490773752317e-04 ||r(i)||/||b|| 3.381564976423e-05 >> > > > > > >> > > > > > On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley < >> knepley at gmail.com> wrote: >> > > > > > On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang >> wrote: >> > > > > > ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY); >> > > > > > ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY); >> > > > > > >> > > > > > ierr = VecAssemblyBegin(x); >> > > > > > ierr = VecAssemblyEnd(x); >> > > > > > This is probably unnecessary >> > > > > > >> > > > > > ierr = VecAssemblyBegin(b); >> > > > > > ierr = VecAssemblyEnd(b); >> > > > > > This is probably unnecessary >> > > > > > >> > > > > > >> > > > > > ierr = MatNullSpaceCreate(PETSC_COMM_ >> WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp); >> > > > > > ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8 >> > > > > > Is your rhs consistent with this nullspace? >> > > > > > >> > > > > > // KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN); >> > > > > > KSPSetOperators(ksp,A,A); >> > > > > > >> > > > > > KSPSetType(ksp,KSPBCGS); >> > > > > > >> > > > > > KSPSetComputeSingularValues(ksp, PETSC_TRUE); >> > > > > > #if defined(__HYPRE__) >> > > > > > KSPGetPC(ksp, &pc); >> > > > > > PCSetType(pc, PCHYPRE); >> > > > > > PCHYPRESetType(pc,"boomeramg"); >> > > > > > This is terribly unnecessary. 
You just use
>> > > > > >
>> > > > > > -pc_type hypre -pc_hypre_type boomeramg
>> > > > > >
>> > > > > > or
>> > > > > >
>> > > > > > -pc_type gamg
>> > > > > >
>> > > > > > #else
>> > > > > > KSPSetType(ksp,KSPBCGSL);
>> > > > > > KSPBCGSLSetEll(ksp,2);
>> > > > > > #endif /* defined(__HYPRE__) */
>> > > > > >
>> > > > > > KSPSetFromOptions(ksp);
>> > > > > > KSPSetUp(ksp);
>> > > > > >
>> > > > > > ierr = KSPSolve(ksp,b,x);
>> > > > > >
>> > > > > > command line
>> > > > > >
>> > > > > > You did not provide any of what I asked for in the previous mail.
>> > > > > >
>> > > > > >    Matt
>> > > > > >
>> > > > > > On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley <knepley at gmail.com> wrote:
>> > > > > > On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang wrote:
>> > > > > > hi,
>> > > > > >
>> > > > > > I implemented the HYPRE preconditioner for my study because, without a preconditioner, the PETSc solver takes thousands of iterations to converge in fine-grid simulations.
>> > > > > >
>> > > > > > with HYPRE, depending on the parallel partition, it takes HYPRE forever to do anything. The observation from the output file is that the simulation hangs with no output.
>> > > > > >
>> > > > > > Any idea what happened? I will post a snippet of code.
>> > > > > >
>> > > > > > 1) For any question about convergence, we need to see the output of
>> > > > > >
>> > > > > > -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason
>> > > > > >
>> > > > > > 2) Hypre has many preconditioners; which one are you talking about?
>> > > > > >
>> > > > > > 3) PETSc has some preconditioners in common with Hypre, like AMG
>> > > > > >
>> > > > > > Thanks,
>> > > > > >
>> > > > > >    Matt
>> > > > > >
>> > > > > > --
>> > > > > > Hao Zhang
>> > > > > > Dept. of Applied Mathematics and Statistics,
>> > > > > > Stony Brook University,
>> > > > > > Stony Brook, New York, 11790
>> > > > > >
>> > > > > > --
>> > > > > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>> > > > > > -- Norbert Wiener
>> > > > > >
>> > > > > > https://www.cse.buffalo.edu/~knepley/
>
> --
> Hao Zhang
> Dept. of Applied Mathematics and Statistics,
> Stony Brook University,
> Stony Brook, New York, 11790

From knepley at gmail.com Sun Oct 22 08:06:13 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Sun, 22 Oct 2017 09:06:13 -0400
Subject: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To: References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov> <64D7FC90-EBE8-409C-B300-04A118155A98@mcs.anl.gov> <5DDD356C-2000-4803-9F3A-02B429898A7A@mcs.anl.gov>
Message-ID:

On Sun, Oct 22, 2017 at 12:16 AM, Hao Zhang wrote:

> the reason is that when I do a finer-grid simulation, the matrix becomes stiffer, with a much larger condition number. Just to give you a perspective, it takes 6000 iterations to converge, and the solver does converge. I want to reduce the number of iterations while keeping the convergence rate. That's the main motivation for all this heavy lifting. Please advise; a snippet will be provided upon request.

This cannot be right. You have a mistake in the discretization or the matrix assembly. Pressure Poisson solves should take 10-20 iterates unless you have some kind of pathological geometry. We have plenty of examples of Poisson with Neumann conditions using both FD and FEM. 6000 iterations for a Poisson solve means that something is terribly wrong.

   Matt

> Thanks again.
>
> On Sun, Oct 22, 2017 at 12:08 AM, Barry Smith wrote:
>
>> Oh, you change KSP but not hypre. I did not understand this.
>>
>> Why not just use GMRES all the time? Why mess with BCGS if it is not robust? It is not worth the small optimization if it breaks everything.
>>
>>    Barry
>>
>> > On Oct 21, 2017, at 11:05 PM, Hao Zhang wrote:
>> >
>> > this is the initial pressure solver output regarding the use of PETSc. It failed to converge after 40000 iterations; GMRES is then used.
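The fall-back pattern Hao describes -- run BCGS first, then re-solve with GMRES when it diverges -- can be written as a minimal sketch like the following. This is a sketch only, not code from the thread: the ksp, b, and x objects are assumed to be set up as in the snippet quoted earlier, the function name is illustrative, and CHKERRQ error checking is omitted.

#include <petscksp.h>

/* Minimal sketch of the fall-back: run the configured solver (here
   BCGS + hypre), and if it diverges -- e.g. DIVERGED_ITS after 40000
   iterations -- re-solve the same system with GMRES. */
static PetscErrorCode SolveWithGMRESFallback(KSP ksp, Vec b, Vec x)
{
  KSPConvergedReason reason;

  KSPSolve(ksp, b, x);
  KSPGetConvergedReason(ksp, &reason);
  if (reason < 0) {            /* negative reasons are KSP_DIVERGED_* */
    KSPSetType(ksp, KSPGMRES);
    KSPSolve(ksp, b, x);       /* initial guess is zero by default */
  }
  return 0;
}

As Barry notes, simply configuring GMRES from the start (-ksp_type gmres) avoids spending 40000 BCGS iterations before falling back.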
>> > >> > 39987 KSP preconditioned resid norm 3.853125986269e-08 true resid norm >> 1.147359212216e-05 ||r(i)||/||b|| 1.557696568706e-06 >> > 39988 KSP preconditioned resid norm 3.853126044003e-08 true resid norm >> 1.147359257282e-05 ||r(i)||/||b|| 1.557696629889e-06 >> > 39989 KSP preconditioned resid norm 3.853126052100e-08 true resid norm >> 1.147359233695e-05 ||r(i)||/||b|| 1.557696597866e-06 >> > 39990 KSP preconditioned resid norm 3.853126027357e-08 true resid norm >> 1.147359219860e-05 ||r(i)||/||b|| 1.557696579083e-06 >> > 39991 KSP preconditioned resid norm 3.853126058478e-08 true resid norm >> 1.147359234281e-05 ||r(i)||/||b|| 1.557696598662e-06 >> > 39992 KSP preconditioned resid norm 3.853126064006e-08 true resid norm >> 1.147359261420e-05 ||r(i)||/||b|| 1.557696635506e-06 >> > 39993 KSP preconditioned resid norm 3.853126050203e-08 true resid norm >> 1.147359235972e-05 ||r(i)||/||b|| 1.557696600957e-06 >> > 39994 KSP preconditioned resid norm 3.853126050182e-08 true resid norm >> 1.147359253713e-05 ||r(i)||/||b|| 1.557696625043e-06 >> > 39995 KSP preconditioned resid norm 3.853125976795e-08 true resid norm >> 1.147359226222e-05 ||r(i)||/||b|| 1.557696587720e-06 >> > 39996 KSP preconditioned resid norm 3.853125805127e-08 true resid norm >> 1.147359262747e-05 ||r(i)||/||b|| 1.557696637308e-06 >> > 39997 KSP preconditioned resid norm 3.853125811756e-08 true resid norm >> 1.147359216008e-05 ||r(i)||/||b|| 1.557696573853e-06 >> > 39998 KSP preconditioned resid norm 3.853125827833e-08 true resid norm >> 1.147359238372e-05 ||r(i)||/||b|| 1.557696604216e-06 >> > 39999 KSP preconditioned resid norm 3.853127937068e-08 true resid norm >> 1.147359264043e-05 ||r(i)||/||b|| 1.557696639067e-06 >> > 40000 KSP preconditioned resid norm 3.853122732867e-08 true resid norm >> 1.147359257776e-05 ||r(i)||/||b|| 1.557696630559e-06 >> > Linear solve did not converge due to DIVERGED_ITS iterations 40000 >> > KSP Object: 24 MPI processes >> > type: bcgs >> > maximum iterations=40000, initial guess is zero >> > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. >> > left preconditioning >> > using PRECONDITIONED norm type for convergence test >> > PC Object: 24 MPI processes >> > type: hypre >> > HYPRE BoomerAMG preconditioning >> > Cycle type V >> > Maximum number of levels 25 >> > Maximum number of iterations PER hypre call 1 >> > Convergence tolerance PER hypre call 0. >> > Threshold for strong coupling 0.25 >> > Interpolation truncation factor 0. >> > Interpolation: max elements per row 0 >> > Number of levels of aggressive coarsening 0 >> > Number of paths for aggressive coarsening 1 >> > Maximum row sums 0.9 >> > Sweeps down 1 >> > Sweeps up 1 >> > Sweeps on coarse 1 >> > Relax down symmetric-SOR/Jacobi >> > Relax up symmetric-SOR/Jacobi >> > Relax on coarse Gaussian-elimination >> > Relax weight (all) 1. >> > Outer relax weight (all) 1. >> > Using CF-relaxation >> > Not using more complex smoothers. >> > Measure type local >> > Coarsen type Falgout >> > Interpolation type classical >> > linear system matrix = precond matrix: >> > Mat Object: A 24 MPI processes >> > type: mpiaij >> > rows=497664, cols=497664 >> > total: nonzeros=3363552, allocated nonzeros=6967296 >> > total number of mallocs used during MatSetValues calls =0 >> > has attached null space >> > not using I-node (on process 0) routines >> > >> > The solution diverges for p0! The residual is 3.853123e-08. Solve >> again using GMRES! 
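A divergence like the one above, with a null space attached, is exactly where Matt's earlier diagnosis applies, and it can be tested directly: a consistent all-Neumann Poisson discretization annihilates the constant vector, so ||A*1|| should be near zero. A minimal sketch of that check, assuming an assembled Mat A (the function name is illustrative; CHKERRQ error checking is omitted):

#include <petscksp.h>

/* Sanity check for an all-Neumann Poisson matrix: a consistent
   discretization annihilates the constant vector, so ||A*1|| should be
   at rounding-error level.  A large value points at an assembly or
   discretization mistake. */
static PetscErrorCode CheckConstantNullSpace(Mat A)
{
  Vec       ones, r;
  PetscReal nrm;

  MatCreateVecs(A, &ones, &r);   /* work vectors compatible with A */
  VecSet(ones, 1.0);
  MatMult(A, ones, r);           /* r = A * 1 */
  VecNorm(r, NORM_2, &nrm);
  PetscPrintf(PETSC_COMM_WORLD, "||A*1|| = %g (should be ~0)\n", (double)nrm);
  VecDestroy(&ones);
  VecDestroy(&r);
  return 0;
}

If the printed norm is far above rounding error, the assembled operator is not consistent with pure Neumann boundary conditions, which would explain the poor convergence.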
>> > KSP Object: 24 MPI processes >> > type: gmres >> > restart=30, using Classical (unmodified) Gram-Schmidt >> Orthogonalization with no iterative refinement >> > happy breakdown tolerance 1e-30 >> > maximum iterations=40000, initial guess is zero >> > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. >> > left preconditioning >> > using PRECONDITIONED norm type for convergence test >> > PC Object: 24 MPI processes >> > type: hypre >> > HYPRE BoomerAMG preconditioning >> > Cycle type V >> > Maximum number of levels 25 >> > Maximum number of iterations PER hypre call 1 >> > Convergence tolerance PER hypre call 0. >> > Threshold for strong coupling 0.25 >> > Interpolation truncation factor 0. >> > Interpolation: max elements per row 0 >> > Number of levels of aggressive coarsening 0 >> > Number of paths for aggressive coarsening 1 >> > Maximum row sums 0.9 >> > Sweeps down 1 >> > Sweeps up 1 >> > Sweeps on coarse 1 >> > Relax down symmetric-SOR/Jacobi >> > Relax up symmetric-SOR/Jacobi >> > Relax on coarse Gaussian-elimination >> > Relax weight (all) 1. >> > Outer relax weight (all) 1. >> > Using CF-relaxation >> > Not using more complex smoothers. >> > Measure type local >> > Coarsen type Falgout >> > Interpolation type classical >> > linear system matrix = precond matrix: >> > Mat Object: A 24 MPI processes >> > type: mpiaij >> > rows=497664, cols=497664 >> > total: nonzeros=3363552, allocated nonzeros=6967296 >> > total number of mallocs used during MatSetValues calls =0 >> > has attached null space >> > not using I-node (on process 0) routines >> > 0 KSP preconditioned resid norm 1.593802941804e+01 true resid norm >> 7.365742695119e+00 ||r(i)||/||b|| 1.000000000000e+00 >> > 1 KSP preconditioned resid norm 6.338666661133e-01 true resid norm >> 2.880722209358e+00 ||r(i)||/||b|| 3.910973174867e-01 >> > 2 KSP preconditioned resid norm 3.913420828350e-02 true resid norm >> 9.544089760671e-01 ||r(i)||/||b|| 1.295740315093e-01 >> > 3 KSP preconditioned resid norm 2.928070366435e-03 true resid norm >> 1.874294004628e-01 ||r(i)||/||b|| 2.544609664236e-02 >> > 4 KSP preconditioned resid norm 2.165607525823e-04 true resid norm >> 3.121122463949e-02 ||r(i)||/||b|| 4.237349298146e-03 >> > 5 KSP preconditioned resid norm 1.635476480407e-05 true resid norm >> 3.984315313831e-03 ||r(i)||/||b|| 5.409251284967e-04 >> > 6 KSP preconditioned resid norm 1.283358350575e-06 true resid norm >> 4.566583802915e-04 ||r(i)||/||b|| 6.199760149022e-05 >> > 7 KSP preconditioned resid norm 8.479469225747e-08 true resid norm >> 3.824581791112e-05 ||r(i)||/||b|| 5.192391248810e-06 >> > 8 KSP preconditioned resid norm 5.358636504683e-09 true resid norm >> 2.829730442033e-06 ||r(i)||/||b|| 3.841744898187e-07 >> > 9 KSP preconditioned resid norm 3.447874504193e-10 true resid norm >> 1.756617036538e-07 ||r(i)||/||b|| 2.384847135242e-08 >> > 10 KSP preconditioned resid norm 2.480228743414e-11 true resid norm >> 1.309399823577e-08 ||r(i)||/||b|| 1.777688792258e-09 >> > 11 KSP preconditioned resid norm 1.728967759950e-12 true resid norm >> 9.406967045789e-10 ||r(i)||/||b|| 1.277124036931e-10 >> > 12 KSP preconditioned resid norm 1.075458632828e-13 true resid norm >> 5.994505136212e-11 ||r(i)||/||b|| 8.138358050689e-12 >> > Linear solve converged due to CONVERGED_RTOL iterations 12 >> > KSP Object: 24 MPI processes >> > type: gmres >> > restart=30, using Classical (unmodified) Gram-Schmidt >> Orthogonalization with no iterative refinement >> > happy breakdown tolerance 1e-30 >> > maximum iterations=40000, initial 
guess is zero
>> > tolerances: relative=1e-14, absolute=1e-14, divergence=10000.
>> > left preconditioning
>> > using PRECONDITIONED norm type for convergence test
>> > PC Object: 24 MPI processes
>> > type: hypre
>> > HYPRE BoomerAMG preconditioning
>> > Cycle type V
>> > Maximum number of levels 25
>> > Maximum number of iterations PER hypre call 1
>> > Convergence tolerance PER hypre call 0.
>> > Threshold for strong coupling 0.25
>> > Interpolation truncation factor 0.
>> > Interpolation: max elements per row 0
>> > Number of levels of aggressive coarsening 0
>> > Number of paths for aggressive coarsening 1
>> > Maximum row sums 0.9
>> > Sweeps down 1
>> > Sweeps up 1
>> > Sweeps on coarse 1
>> > Relax down symmetric-SOR/Jacobi
>> > Relax up symmetric-SOR/Jacobi
>> > Relax on coarse Gaussian-elimination
>> > Relax weight (all) 1.
>> > Outer relax weight (all) 1.
>> > Using CF-relaxation
>> > Not using more complex smoothers.
>> > Measure type local
>> > Coarsen type Falgout
>> > Interpolation type classical
>> > linear system matrix = precond matrix:
>> > Mat Object: A 24 MPI processes
>> > type: mpiaij
>> > rows=497664, cols=497664
>> > total: nonzeros=3363552, allocated nonzeros=6967296
>> > total number of mallocs used during MatSetValues calls =0
>> > has attached null space
>> > not using I-node (on process 0) routines
>> > The max singular value of A = 1.000872 in poisson_solver3d_P0_vd
>> > The min singular value of A = 0.667688 in poisson_solver3d_P0_vd
>> > The Cond Num of A = 1.499012 in poisson_solver3d_P0_vd
>> > In poisson_solver3d_pressure(): num_iter = 12, rel_residual = 1.075459e-13
>> >
>> > The max value of p0 is 0.03115845493408858
>> >
>> > The min value of p0 is -0.07156715468428149
>> >
>> > On Sun, Oct 22, 2017 at 12:00 AM, Barry Smith wrote:
>> >
>> > > On Oct 21, 2017, at 10:50 PM, Hao Zhang wrote:
>> > >
>> > > the incompressible NS solver algorithm calls the PETSc solver at different stages of each time step. The one where you said "This is good. 12 digit reduction" is from after the initial pressure solve, for which HYPRE usually doesn't give good convergence, so the fall-back GMRES solver is called afterwards.
>> >
>> > Hmm, I don't understand. hypre should do well on a pressure solve. In fact, very well.
>> > >
>> > > Barry, you were mentioning that I could have a wrong null space. That particular solver is meant to give an initial pressure profile for a 3D incompressible NS simulation using all-Neumann boundary conditions. Could you give some insight into how to test whether I have a wrong null space?
>> >
>> > -ksp_test_null_space
>> >
>> > But if your null space is consistently from all Neumann boundary conditions then it likely is not wrong.
>> >
>> >    Barry
>> >
>> > >
>> > > Thanks!
>> > >
>> > > On Sat, Oct 21, 2017 at 11:41 PM, Barry Smith wrote:
>> > >
>> > > This is good. You get more than a 12 digit reduction in the true residual norm. This is good AMG convergence, expected when everything goes well.
>> > >
>> > > What is different in this case from the previous case that does not converge reasonably?
>> > >
>> > >    Barry
>> > >
>> > > > On Oct 21, 2017, at 9:29 PM, Hao Zhang wrote:
>> > > >
>> > > > Barry, please advise what you make of this. This is the Poisson solver for the all-Neumann-BC 3D case; a finite difference scheme was used.
>> > > > Thanks! I'm in learning mode.
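For reference, attaching the constant null space and making the right-hand side consistent with it -- the usual setup for an all-Neumann pressure solve -- looks roughly like the following sketch, assuming the A and b objects from the snippet quoted earlier and the PETSc 3.8-style API (function name illustrative; CHKERRQ error checking omitted):

#include <petscksp.h>

/* Sketch: setup for an all-Neumann pressure Poisson solve.  The
   operator has the constant vector in its null space; the right-hand
   side must be made consistent by projecting that mode out. */
static PetscErrorCode AttachConstantNullSpace(Mat A, Vec b)
{
  MatNullSpace nullsp;

  MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nullsp);
  MatSetNullSpace(A, nullsp);    /* lets the KSP handle the singular mode */
  MatNullSpaceRemove(nullsp, b); /* make the RHS consistent */
  MatNullSpaceDestroy(&nullsp);  /* A keeps its own reference */
  return 0;
}

The attached null space can then be checked at runtime with the -ksp_test_null_space option Barry mentions.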
>> > > > >> > > > >> > > > On Sat, Oct 21, 2017 at 6:53 PM, Barry Smith >> wrote: >> > > > >> > > > > On Oct 21, 2017, at 5:47 PM, Hao Zhang >> wrote: >> > > > > >> > > > > hi, Barry: >> > > > > what do you mean by "setting tolerance = 1e-14 is absurd"? >> > > > >> > > > Trying to decrease the initial residual norm by a factor of >> 1e-14 with an iterative method (or even a direct method) is unrealistic >> (usually unachievable) and almost never necessary. You are requiring ||r_n|| < 1.e-14 >> ||r_0|| when with double precision numbers you only have roughly 14 decimal >> digits in total to compute with. Round-off alone will lead to differences far >> larger than 1e-14. >> > > > >> > > > If you are using the solver in the context of a nonlinear >> problem (i.e. inside Newton's method), then 1.e-6 is generally more than plenty to get quadratic convergence of Newton's method.
>> > > > >> > > > If you are solving a linear problem, then it is extremely likely >> that errors due to discretization (from the finite element method, etc.) >> and the model are much, much larger than even 1.e-8. >> > > > >> > > > So, in summary: >> > > > >> > > > 1.e-14 is probably unachievable. >> > > > >> > > > 1.e-14 is almost for >> sure not needed. >> > > > >> > > > Barry >> > > > >> > > > >> > > > > >> > > > > On Sat, Oct 21, 2017 at 18:42 Barry Smith >> wrote: >> > > > > >> > > > > Run with -ksp_view_mat binary -ksp_view_rhs binary and send the >> resulting output file called binaryoutput to petsc-maint at mcs.anl.gov >> > > > > >> > > > > Note you can also use -ksp_type gmres with hypre; there is unlikely to >> be a reason to use bcgs >> > > > > >> > > > > BTW: tolerances: relative=1e-14 is absurd >> > > > > >> > > > > My guess is your null space is incorrect. >> > > > > >> > > > > >> > > > > >> > > > > >> > > > > > On Oct 21, 2017, at 4:34 PM, Hao Zhang >> wrote: >> > > > > > >> > > > > > If this solver doesn't converge, I have a fall-back solution, >> which uses the GMRES solver. This setup is fine with me. I just want to know whether >> HYPRE is a reliable solution for me, or whether I will have to go without a >> preconditioner. >> > > > > > >> > > > > > Thanks! >> > > > > > >> > > > > > On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang >> wrote: >> > > > > > This is a serial run, still dumping output; parallel is more or less >> the same. >> > > > > > >> > > > > > KSP Object: 1 MPI processes >> > > > > > type: bcgs >> > > > > > maximum iterations=40000, initial guess is zero >> > > > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. >> > > > > > left preconditioning >> > > > > > using PRECONDITIONED norm type for convergence test >> > > > > > PC Object: 1 MPI processes >> > > > > > type: hypre >> > > > > > HYPRE BoomerAMG preconditioning >> > > > > > Cycle type V >> > > > > > Maximum number of levels 25 >> > > > > > Maximum number of iterations PER hypre call 1 >> > > > > > Convergence tolerance PER hypre call 0. >> > > > > > Threshold for strong coupling 0.25 >> > > > > > Interpolation truncation factor 0. >> > > > > > Interpolation: max elements per row 0 >> > > > > > Number of levels of aggressive coarsening 0 >> > > > > > Number of paths for aggressive coarsening 1 >> > > > > > Maximum row sums 0.9 >> > > > > > Sweeps down 1 >> > > > > > Sweeps up 1 >> > > > > > Sweeps on coarse 1 >> > > > > > Relax down symmetric-SOR/Jacobi >> > > > > > Relax up symmetric-SOR/Jacobi >> > > > > > Relax on coarse Gaussian-elimination >> > > > > > Relax weight (all) 1. >> > > > > > Outer relax weight (all) 1. >> > > > > > Using CF-relaxation >> > > > > > Not using more complex smoothers.
>> > > > > > Measure type local >> > > > > > Coarsen type Falgout >> > > > > > Interpolation type classical >> > > > > > linear system matrix = precond matrix: >> > > > > > Mat Object: A 1 MPI processes >> > > > > > type: seqaij >> > > > > > rows=497664, cols=497664 >> > > > > > total: nonzeros=3363552, allocated nonzeros=3483648 >> > > > > > total number of mallocs used during MatSetValues calls =0 >> > > > > > has attached null space >> > > > > > not using I-node routines >> > > > > > 0 KSP preconditioned resid norm 1.630377897956e+01 true resid >> norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00 >> > > > > > 1 KSP preconditioned resid norm 3.815819909205e-02 true resid >> norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01 >> > > > > > 2 KSP preconditioned resid norm 4.312671277701e-04 true resid >> norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03 >> > > > > > 3 KSP preconditioned resid norm 3.011875330569e-04 true resid >> norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05 >> > > > > > 4 KSP preconditioned resid norm 3.011783821295e-04 true resid >> norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05 >> > > > > > 5 KSP preconditioned resid norm 3.011783818372e-04 true resid >> norm 2.504053673004e-04 ||r(i)||/||b|| 3.399594279422e-05 >> > > > > > 6 KSP preconditioned resid norm 3.011783818375e-04 true resid >> norm 2.503984078890e-04 ||r(i)||/||b|| 3.399499795925e-05 >> > > > > > 7 KSP preconditioned resid norm 3.011783887442e-04 true resid >> norm 2.504121501772e-04 ||r(i)||/||b|| 3.399686366224e-05 >> > > > > > 8 KSP preconditioned resid norm 3.010913654181e-04 true resid >> norm 2.504150259031e-04 ||r(i)||/||b|| 3.399725408124e-05 >> > > > > > 9 KSP preconditioned resid norm 3.006520688232e-04 true resid >> norm 2.504061607382e-04 ||r(i)||/||b|| 3.399605051423e-05 >> > > > > > 10 KSP preconditioned resid norm 3.007309991942e-04 true resid >> norm 2.503843638523e-04 ||r(i)||/||b|| 3.399309128978e-05 >> > > > > > 11 KSP preconditioned resid norm 3.015946168077e-04 true resid >> norm 2.503644677844e-04 ||r(i)||/||b|| 3.399039012728e-05 >> > > > > > 12 KSP preconditioned resid norm 2.956643907377e-04 true resid >> norm 2.503863046509e-04 ||r(i)||/||b|| 3.399335477965e-05 >> > > > > > 13 KSP preconditioned resid norm 2.997992358936e-04 true resid >> norm 2.504336903093e-04 ||r(i)||/||b|| 3.399978802886e-05 >> > > > > > 14 KSP preconditioned resid norm 2.481415420420e-05 true resid >> norm 2.491591201250e-04 ||r(i)||/||b|| 3.382674774806e-05 >> > > > > > 15 KSP preconditioned resid norm 2.615494786181e-05 true resid >> norm 2.490353237273e-04 ||r(i)||/||b|| 3.380994069915e-05 >> > > > > > 16 KSP preconditioned resid norm 2.645126692130e-05 true resid >> norm 2.490535523344e-04 ||r(i)||/||b|| 3.381241548111e-05 >> > > > > > 17 KSP preconditioned resid norm 2.667223026209e-05 true resid >> norm 2.490482602536e-04 ||r(i)||/||b|| 3.381169700898e-05 >> > > > > > 18 KSP preconditioned resid norm 2.650813432116e-05 true resid >> norm 2.490473169014e-04 ||r(i)||/||b|| 3.381156893606e-05 >> > > > > > 19 KSP preconditioned resid norm 2.613309555449e-05 true resid >> norm 2.490465633690e-04 ||r(i)||/||b|| 3.381146663375e-05 >> > > > > > 20 KSP preconditioned resid norm 2.644160446804e-05 true resid >> norm 2.490532739949e-04 ||r(i)||/||b|| 3.381237769272e-05 >> > > > > > 21 KSP preconditioned resid norm 2.635987608975e-05 true resid >> norm 2.490499548926e-04 ||r(i)||/||b|| 3.381192707933e-05 >> > > > > > 22 KSP preconditioned resid norm 2.640527129095e-05 true 
resid >> norm 2.490594066529e-04 ||r(i)||/||b|| 3.381321028466e-05 >> > > > > > 23 KSP preconditioned resid norm 2.627505117691e-05 true resid >> norm 2.490550162585e-04 ||r(i)||/||b|| 3.381261422875e-05 >> > > > > > 24 KSP preconditioned resid norm 2.642659196388e-05 true resid >> norm 2.490504347640e-04 ||r(i)||/||b|| 3.381199222842e-05 >> > > > > > 25 KSP preconditioned resid norm 2.659432190695e-05 true resid >> norm 2.490510775152e-04 ||r(i)||/||b|| 3.381207949065e-05 >> > > > > > 26 KSP preconditioned resid norm 2.687918062951e-05 true resid >> norm 2.490518882015e-04 ||r(i)||/||b|| 3.381218955237e-05 >> > > > > > 27 KSP preconditioned resid norm 2.662909048432e-05 true resid >> norm 2.490446263285e-04 ||r(i)||/||b|| 3.381120365409e-05 >> > > > > > 28 KSP preconditioned resid norm 2.085466483199e-05 true resid >> norm 2.490131612366e-04 ||r(i)||/||b|| 3.380693183886e-05 >> > > > > > 29 KSP preconditioned resid norm 2.098541330282e-05 true resid >> norm 2.490126933398e-04 ||r(i)||/||b|| 3.380686831549e-05 >> > > > > > 30 KSP preconditioned resid norm 2.175345180286e-05 true resid >> norm 2.490098852429e-04 ||r(i)||/||b|| 3.380648707805e-05 >> > > > > > 31 KSP preconditioned resid norm 2.182182437676e-05 true resid >> norm 2.490028301020e-04 ||r(i)||/||b|| 3.380552924648e-05 >> > > > > > 32 KSP preconditioned resid norm 2.152970404369e-05 true resid >> norm 2.490089939838e-04 ||r(i)||/||b|| 3.380636607747e-05 >> > > > > > 33 KSP preconditioned resid norm 2.187932450016e-05 true resid >> norm 2.490085293931e-04 ||r(i)||/||b|| 3.380630300295e-05 >> > > > > > 34 KSP preconditioned resid norm 2.207255875067e-05 true resid >> norm 2.490039036092e-04 ||r(i)||/||b|| 3.380567498971e-05 >> > > > > > 35 KSP preconditioned resid norm 2.205060279701e-05 true resid >> norm 2.490101636150e-04 ||r(i)||/||b|| 3.380652487086e-05 >> > > > > > 36 KSP preconditioned resid norm 2.168654200416e-05 true resid >> norm 2.490091609876e-04 ||r(i)||/||b|| 3.380638875052e-05 >> > > > > > 37 KSP preconditioned resid norm 2.164521042361e-05 true resid >> norm 2.490083143913e-04 ||r(i)||/||b|| 3.380627381352e-05 >> > > > > > 38 KSP preconditioned resid norm 2.154429063973e-05 true resid >> norm 2.490075485470e-04 ||r(i)||/||b|| 3.380616983972e-05 >> > > > > > 39 KSP preconditioned resid norm 2.165962086228e-05 true resid >> norm 2.490099695056e-04 ||r(i)||/||b|| 3.380649851786e-05 >> > > > > > 40 KSP preconditioned resid norm 2.153877616091e-05 true resid >> norm 2.490090652619e-04 ||r(i)||/||b|| 3.380637575444e-05 >> > > > > > 41 KSP preconditioned resid norm 2.347651187611e-05 true resid >> norm 2.490233544624e-04 ||r(i)||/||b|| 3.380831570825e-05 >> > > > > > 42 KSP preconditioned resid norm 2.352860162514e-05 true resid >> norm 2.490191394202e-04 ||r(i)||/||b|| 3.380774345879e-05 >> > > > > > 43 KSP preconditioned resid norm 2.312377506928e-05 true resid >> norm 2.490209491359e-04 ||r(i)||/||b|| 3.380798915237e-05 >> > > > > > 44 KSP preconditioned resid norm 2.295770973533e-05 true resid >> norm 2.490178136759e-04 ||r(i)||/||b|| 3.380756347093e-05 >> > > > > > 45 KSP preconditioned resid norm 2.833646456041e-05 true resid >> norm 2.489991602651e-04 ||r(i)||/||b|| 3.380503101608e-05 >> > > > > > 46 KSP preconditioned resid norm 2.760296424494e-05 true resid >> norm 2.490104320666e-04 ||r(i)||/||b|| 3.380656131682e-05 >> > > > > > 47 KSP preconditioned resid norm 2.451504295239e-05 true resid >> norm 2.490241388672e-04 ||r(i)||/||b|| 3.380842220189e-05 >> > > > > > 48 KSP preconditioned resid norm 
2.512391514098e-05 true resid >> norm 2.490245923753e-04 ||r(i)||/||b|| 3.380848377180e-05 >> > > > > > 49 KSP preconditioned resid norm 2.483419450528e-05 true resid >> norm 2.490273364402e-04 ||r(i)||/||b|| 3.380885631602e-05 >> > > > > > 50 KSP preconditioned resid norm 2.507460538466e-05 true resid >> norm 2.490309488780e-04 ||r(i)||/||b|| 3.380934675371e-05 >> > > > > > 51 KSP preconditioned resid norm 2.499708772881e-05 true resid >> norm 2.490300908170e-04 ||r(i)||/||b|| 3.380923026022e-05 >> > > > > > 52 KSP preconditioned resid norm 1.059778259446e-05 true resid >> norm 2.489352833521e-04 ||r(i)||/||b|| 3.379635885420e-05 >> > > > > > 53 KSP preconditioned resid norm 1.074975117060e-05 true resid >> norm 2.489294722901e-04 ||r(i)||/||b|| 3.379556992330e-05 >> > > > > > 54 KSP preconditioned resid norm 1.095242219559e-05 true resid >> norm 2.489295454212e-04 ||r(i)||/||b|| 3.379557985184e-05 >> > > > > > 55 KSP preconditioned resid norm 8.359999674720e-06 true resid >> norm 2.489673581944e-04 ||r(i)||/||b|| 3.380071345137e-05 >> > > > > > 56 KSP preconditioned resid norm 8.368232998470e-06 true resid >> norm 2.489700421343e-04 ||r(i)||/||b|| 3.380107783281e-05 >> > > > > > 57 KSP preconditioned resid norm 8.443378041101e-06 true resid >> norm 2.489702900875e-04 ||r(i)||/||b|| 3.380111149584e-05 >> > > > > > 58 KSP preconditioned resid norm 8.647159584302e-06 true resid >> norm 2.489640805831e-04 ||r(i)||/||b|| 3.380026847095e-05 >> > > > > > 59 KSP preconditioned resid norm 1.024742790737e-05 true resid >> norm 2.489447846660e-04 ||r(i)||/||b|| 3.379764878711e-05 >> > > > > > 60 KSP preconditioned resid norm 1.033394118910e-05 true resid >> norm 2.489441404923e-04 ||r(i)||/||b|| 3.379756133175e-05 >> > > > > > 61 KSP preconditioned resid norm 1.030066336446e-05 true resid >> norm 2.489399918556e-04 ||r(i)||/||b|| 3.379699809776e-05 >> > > > > > 62 KSP preconditioned resid norm 1.029956398963e-05 true resid >> norm 2.489445295139e-04 ||r(i)||/||b|| 3.379761414674e-05 >> > > > > > 63 KSP preconditioned resid norm 1.028190129002e-05 true resid >> norm 2.489456200527e-04 ||r(i)||/||b|| 3.379776220225e-05 >> > > > > > 64 KSP preconditioned resid norm 9.878799185773e-06 true resid >> norm 2.489488742330e-04 ||r(i)||/||b|| 3.379820400160e-05 >> > > > > > 65 KSP preconditioned resid norm 9.917711104174e-06 true resid >> norm 2.489478066593e-04 ||r(i)||/||b|| 3.379805906391e-05 >> > > > > > 66 KSP preconditioned resid norm 1.003572019576e-05 true resid >> norm 2.489441995703e-04 ||r(i)||/||b|| 3.379756935240e-05 >> > > > > > 67 KSP preconditioned resid norm 9.924487278236e-06 true resid >> norm 2.489475403451e-04 ||r(i)||/||b|| 3.379802290812e-05 >> > > > > > 68 KSP preconditioned resid norm 9.804213483359e-06 true resid >> norm 2.489457781760e-04 ||r(i)||/||b|| 3.379778366964e-05 >> > > > > > 69 KSP preconditioned resid norm 9.748922705476e-06 true resid >> norm 2.489408473578e-04 ||r(i)||/||b|| 3.379711424383e-05 >> > > > > > 70 KSP preconditioned resid norm 9.886044523689e-06 true resid >> norm 2.489514438395e-04 ||r(i)||/||b|| 3.379855286071e-05 >> > > > > > 71 KSP preconditioned resid norm 1.083888478689e-05 true resid >> norm 2.489420898851e-04 ||r(i)||/||b|| 3.379728293386e-05 >> > > > > > 72 KSP preconditioned resid norm 1.106561823757e-05 true resid >> norm 2.489364778104e-04 ||r(i)||/||b|| 3.379652101821e-05 >> > > > > > 73 KSP preconditioned resid norm 1.132091515426e-05 true resid >> norm 2.489456804535e-04 ||r(i)||/||b|| 3.379777040248e-05 >> > > > > > 74 KSP preconditioned 
resid norm 1.330905328963e-05 true resid >> norm 2.489317458981e-04 ||r(i)||/||b|| 3.379587859660e-05 >> > > > > > 75 KSP preconditioned resid norm 1.305555302619e-05 true resid >> norm 2.489320939810e-04 ||r(i)||/||b|| 3.379592585359e-05 >> > > > > > 76 KSP preconditioned resid norm 1.308083397399e-05 true resid >> norm 2.489299951581e-04 ||r(i)||/||b|| 3.379564090977e-05 >> > > > > > 77 KSP preconditioned resid norm 1.320098861853e-05 true resid >> norm 2.489323669317e-04 ||r(i)||/||b|| 3.379596291036e-05 >> > > > > > 78 KSP preconditioned resid norm 1.300160788274e-05 true resid >> norm 2.489306393356e-04 ||r(i)||/||b|| 3.379572836564e-05 >> > > > > > 79 KSP preconditioned resid norm 1.317651537793e-05 true resid >> norm 2.489381364970e-04 ||r(i)||/||b|| 3.379674620752e-05 >> > > > > > 80 KSP preconditioned resid norm 1.309769805765e-05 true resid >> norm 2.489285056062e-04 ||r(i)||/||b|| 3.379543868279e-05 >> > > > > > 81 KSP preconditioned resid norm 1.293686496271e-05 true resid >> norm 2.489347818072e-04 ||r(i)||/||b|| 3.379629076264e-05 >> > > > > > 82 KSP preconditioned resid norm 1.311788285799e-05 true resid >> norm 2.489320040215e-04 ||r(i)||/||b|| 3.379591364037e-05 >> > > > > > 83 KSP preconditioned resid norm 1.313667378798e-05 true resid >> norm 2.489329437217e-04 ||r(i)||/||b|| 3.379604121748e-05 >> > > > > > 84 KSP preconditioned resid norm 1.416138205017e-05 true resid >> norm 2.489266908838e-04 ||r(i)||/||b|| 3.379519230948e-05 >> > > > > > 85 KSP preconditioned resid norm 1.452253464774e-05 true resid >> norm 2.489285688375e-04 ||r(i)||/||b|| 3.379544726729e-05 >> > > > > > 86 KSP preconditioned resid norm 1.426709413370e-05 true resid >> norm 2.489362313402e-04 ||r(i)||/||b|| 3.379648755651e-05 >> > > > > > 87 KSP preconditioned resid norm 1.427480849552e-05 true resid >> norm 2.489378183000e-04 ||r(i)||/||b|| 3.379670300795e-05 >> > > > > > 88 KSP preconditioned resid norm 1.413870980147e-05 true resid >> norm 2.489325756118e-04 ||r(i)||/||b|| 3.379599124153e-05 >> > > > > > 89 KSP preconditioned resid norm 1.353259857657e-05 true resid >> norm 2.489318968308e-04 ||r(i)||/||b|| 3.379589908776e-05 >> > > > > > 90 KSP preconditioned resid norm 1.347676448611e-05 true resid >> norm 2.489332074417e-04 ||r(i)||/||b|| 3.379607702106e-05 >> > > > > > 91 KSP preconditioned resid norm 1.362825902909e-05 true resid >> norm 2.489344974971e-04 ||r(i)||/||b|| 3.379625216367e-05 >> > > > > > 92 KSP preconditioned resid norm 1.346280901052e-05 true resid >> norm 2.489302570131e-04 ||r(i)||/||b|| 3.379567646016e-05 >> > > > > > 93 KSP preconditioned resid norm 1.328052169696e-05 true resid >> norm 2.489346601224e-04 ||r(i)||/||b|| 3.379627424228e-05 >> > > > > > 94 KSP preconditioned resid norm 1.554682082515e-05 true resid >> norm 2.489309078759e-04 ||r(i)||/||b|| 3.379576482365e-05 >> > > > > > 95 KSP preconditioned resid norm 1.557128675775e-05 true resid >> norm 2.489317143582e-04 ||r(i)||/||b|| 3.379587431462e-05 >> > > > > > 96 KSP preconditioned resid norm 1.542571813923e-05 true resid >> norm 2.489319910303e-04 ||r(i)||/||b|| 3.379591187663e-05 >> > > > > > 97 KSP preconditioned resid norm 1.570516684444e-05 true resid >> norm 2.489321980894e-04 ||r(i)||/||b|| 3.379593998772e-05 >> > > > > > 98 KSP preconditioned resid norm 1.600431789899e-05 true resid >> norm 2.489297450311e-04 ||r(i)||/||b|| 3.379560695162e-05 >> > > > > > 99 KSP preconditioned resid norm 1.587495554658e-05 true resid >> norm 2.489339000570e-04 ||r(i)||/||b|| 3.379617105303e-05 >> > > > > > 100 KSP 
preconditioned resid norm 1.621163002878e-05 true resid >> norm 2.489299953360e-04 ||r(i)||/||b|| 3.379564093392e-05 >> > > > > > 101 KSP preconditioned resid norm 1.627060872574e-05 true resid >> norm 2.489301570161e-04 ||r(i)||/||b|| 3.379566288419e-05 >> > > > > > 102 KSP preconditioned resid norm 1.622931647243e-05 true resid >> norm 2.489277930910e-04 ||r(i)||/||b|| 3.379534194913e-05 >> > > > > > 103 KSP preconditioned resid norm 1.612544300282e-05 true resid >> norm 2.489317483299e-04 ||r(i)||/||b|| 3.379587892674e-05 >> > > > > > 104 KSP preconditioned resid norm 1.880131646630e-05 true resid >> norm 2.489335862583e-04 ||r(i)||/||b|| 3.379612845059e-05 >> > > > > > 105 KSP preconditioned resid norm 1.880563295793e-05 true resid >> norm 2.489365017923e-04 ||r(i)||/||b|| 3.379652427408e-05 >> > > > > > 106 KSP preconditioned resid norm 1.860619184027e-05 true resid >> norm 2.489362382373e-04 ||r(i)||/||b|| 3.379648849288e-05 >> > > > > > 107 KSP preconditioned resid norm 1.877134148719e-05 true resid >> norm 2.489425484523e-04 ||r(i)||/||b|| 3.379734519061e-05 >> > > > > > 108 KSP preconditioned resid norm 1.914810713538e-05 true resid >> norm 2.489347415573e-04 ||r(i)||/||b|| 3.379628529818e-05 >> > > > > > 109 KSP preconditioned resid norm 1.220673255622e-05 true resid >> norm 2.490196186357e-04 ||r(i)||/||b|| 3.380780851884e-05 >> > > > > > 110 KSP preconditioned resid norm 1.215819132910e-05 true resid >> norm 2.490233518072e-04 ||r(i)||/||b|| 3.380831534776e-05 >> > > > > > 111 KSP preconditioned resid norm 1.196565427400e-05 true resid >> norm 2.490279438906e-04 ||r(i)||/||b|| 3.380893878570e-05 >> > > > > > 112 KSP preconditioned resid norm 1.171748185197e-05 true resid >> norm 2.490242578983e-04 ||r(i)||/||b|| 3.380843836198e-05 >> > > > > > 113 KSP preconditioned resid norm 1.162855824118e-05 true resid >> norm 2.490229018536e-04 ||r(i)||/||b|| 3.380825426043e-05 >> > > > > > 114 KSP preconditioned resid norm 1.175594685689e-05 true resid >> norm 2.490274328440e-04 ||r(i)||/||b|| 3.380886940415e-05 >> > > > > > 115 KSP preconditioned resid norm 1.167979454122e-05 true resid >> norm 2.490271161036e-04 ||r(i)||/||b|| 3.380882640232e-05 >> > > > > > 116 KSP preconditioned resid norm 1.181010893019e-05 true resid >> norm 2.490235420657e-04 ||r(i)||/||b|| 3.380834117795e-05 >> > > > > > 117 KSP preconditioned resid norm 1.175206638194e-05 true resid >> norm 2.490263165345e-04 ||r(i)||/||b|| 3.380871784992e-05 >> > > > > > 118 KSP preconditioned resid norm 1.183804125791e-05 true resid >> norm 2.490221353083e-04 ||r(i)||/||b|| 3.380815019145e-05 >> > > > > > 119 KSP preconditioned resid norm 1.186426973727e-05 true resid >> norm 2.490227115336e-04 ||r(i)||/||b|| 3.380822842189e-05 >> > > > > > 120 KSP preconditioned resid norm 1.181986776689e-05 true resid >> norm 2.490257884230e-04 ||r(i)||/||b|| 3.380864615159e-05 >> > > > > > 121 KSP preconditioned resid norm 1.131443277370e-05 true resid >> norm 2.490259110230e-04 ||r(i)||/||b|| 3.380866279620e-05 >> > > > > > 122 KSP preconditioned resid norm 1.114920075859e-05 true resid >> norm 2.490249829382e-04 ||r(i)||/||b|| 3.380853679603e-05 >> > > > > > 123 KSP preconditioned resid norm 1.082073321672e-05 true resid >> norm 2.490314084868e-04 ||r(i)||/||b|| 3.380940915187e-05 >> > > > > > 124 KSP preconditioned resid norm 3.307785860990e-06 true resid >> norm 2.490613501549e-04 ||r(i)||/||b|| 3.381347414155e-05 >> > > > > > 125 KSP preconditioned resid norm 3.287051720572e-06 true resid >> norm 2.490584648195e-04 ||r(i)||/||b|| 
3.381308241794e-05 >> > > > > > 126 KSP preconditioned resid norm 3.286797046069e-06 true resid >> norm 2.490654396386e-04 ||r(i)||/||b|| 3.381402934473e-05 >> > > > > > 127 KSP preconditioned resid norm 3.311592899411e-06 true resid >> norm 2.490627973588e-04 ||r(i)||/||b|| 3.381367061922e-05 >> > > > > > 128 KSP preconditioned resid norm 3.560993694635e-06 true resid >> norm 2.490571732816e-04 ||r(i)||/||b|| 3.381290707406e-05 >> > > > > > 129 KSP preconditioned resid norm 3.411994617661e-06 true resid >> norm 2.490652122141e-04 ||r(i)||/||b|| 3.381399846875e-05 >> > > > > > 130 KSP preconditioned resid norm 3.412383310721e-06 true resid >> norm 2.490633454022e-04 ||r(i)||/||b|| 3.381374502359e-05 >> > > > > > 131 KSP preconditioned resid norm 3.288320044878e-06 true resid >> norm 2.490639470096e-04 ||r(i)||/||b|| 3.381382669999e-05 >> > > > > > 132 KSP preconditioned resid norm 3.273215756565e-06 true resid >> norm 2.490640390847e-04 ||r(i)||/||b|| 3.381383920043e-05 >> > > > > > 133 KSP preconditioned resid norm 3.236969051459e-06 true resid >> norm 2.490678216102e-04 ||r(i)||/||b|| 3.381435272985e-05 >> > > > > > 134 KSP preconditioned resid norm 3.203260913942e-06 true resid >> norm 2.490640965346e-04 ||r(i)||/||b|| 3.381384700005e-05 >> > > > > > 135 KSP preconditioned resid norm 3.224117152353e-06 true resid >> norm 2.490655026376e-04 ||r(i)||/||b|| 3.381403789770e-05 >> > > > > > 136 KSP preconditioned resid norm 3.221577997984e-06 true resid >> norm 2.490684737611e-04 ||r(i)||/||b|| 3.381444126823e-05 >> > > > > > 137 KSP preconditioned resid norm 3.195936222128e-06 true resid >> norm 2.490673982333e-04 ||r(i)||/||b|| 3.381429525066e-05 >> > > > > > 138 KSP preconditioned resid norm 3.207528137426e-06 true resid >> norm 2.490641247196e-04 ||r(i)||/||b|| 3.381385082655e-05 >> > > > > > 139 KSP preconditioned resid norm 3.240134271963e-06 true resid >> norm 2.490615861251e-04 ||r(i)||/||b|| 3.381350617773e-05 >> > > > > > 140 KSP preconditioned resid norm 2.698833607230e-06 true resid >> norm 2.490638954889e-04 ||r(i)||/||b|| 3.381381970535e-05 >> > > > > > 141 KSP preconditioned resid norm 2.599151209137e-06 true resid >> norm 2.490657106698e-04 ||r(i)||/||b|| 3.381406614091e-05 >> > > > > > 142 KSP preconditioned resid norm 2.633939920994e-06 true resid >> norm 2.490707754695e-04 ||r(i)||/||b|| 3.381475375652e-05 >> > > > > > 143 KSP preconditioned resid norm 2.519609221376e-06 true resid >> norm 2.490639100480e-04 ||r(i)||/||b|| 3.381382168195e-05 >> > > > > > 144 KSP preconditioned resid norm 3.768526937684e-06 true resid >> norm 2.490654096698e-04 ||r(i)||/||b|| 3.381402527606e-05 >> > > > > > 145 KSP preconditioned resid norm 3.707841943289e-06 true resid >> norm 2.490630207923e-04 ||r(i)||/||b|| 3.381370095336e-05 >> > > > > > 146 KSP preconditioned resid norm 3.698827503486e-06 true resid >> norm 2.490646071561e-04 ||r(i)||/||b|| 3.381391632387e-05 >> > > > > > 147 KSP preconditioned resid norm 3.642747039615e-06 true resid >> norm 2.490610990161e-04 ||r(i)||/||b|| 3.381344004604e-05 >> > > > > > 148 KSP preconditioned resid norm 3.613100087842e-06 true resid >> norm 2.490617159023e-04 ||r(i)||/||b|| 3.381352379676e-05 >> > > > > > 149 KSP preconditioned resid norm 3.637646399299e-06 true resid >> norm 2.490648063023e-04 ||r(i)||/||b|| 3.381394336069e-05 >> > > > > > 150 KSP preconditioned resid norm 3.640235367864e-06 true resid >> norm 2.490648516718e-04 ||r(i)||/||b|| 3.381394952022e-05 >> > > > > > 151 KSP preconditioned resid norm 3.724708848977e-06 true resid >> norm 
2.490622201040e-04 ||r(i)||/||b|| 3.381359224901e-05 >> > > > > > 152 KSP preconditioned resid norm 3.665185002770e-06 true resid >> norm 2.490664302790e-04 ||r(i)||/||b|| 3.381416383766e-05 >> > > > > > 153 KSP preconditioned resid norm 3.348992579120e-06 true resid >> norm 2.490655722697e-04 ||r(i)||/||b|| 3.381404735121e-05 >> > > > > > 154 KSP preconditioned resid norm 3.309431137943e-06 true resid >> norm 2.490727563300e-04 ||r(i)||/||b|| 3.381502268535e-05 >> > > > > > 155 KSP preconditioned resid norm 3.299031245428e-06 true resid >> norm 2.490688392843e-04 ||r(i)||/||b|| 3.381449089298e-05 >> > > > > > 156 KSP preconditioned resid norm 3.297127463503e-06 true resid >> norm 2.490642207769e-04 ||r(i)||/||b|| 3.381386386763e-05 >> > > > > > 157 KSP preconditioned resid norm 3.297370198641e-06 true resid >> norm 2.490666651723e-04 ||r(i)||/||b|| 3.381419572764e-05 >> > > > > > 158 KSP preconditioned resid norm 3.290873165210e-06 true resid >> norm 2.490679189538e-04 ||r(i)||/||b|| 3.381436594557e-05 >> > > > > > 159 KSP preconditioned resid norm 3.346705292419e-06 true resid >> norm 2.490617329776e-04 ||r(i)||/||b|| 3.381352611496e-05 >> > > > > > 160 KSP preconditioned resid norm 3.429583550890e-06 true resid >> norm 2.490675116236e-04 ||r(i)||/||b|| 3.381431064494e-05 >> > > > > > 161 KSP preconditioned resid norm 3.425238504679e-06 true resid >> norm 2.490648199058e-04 ||r(i)||/||b|| 3.381394520756e-05 >> > > > > > 162 KSP preconditioned resid norm 3.423484857849e-06 true resid >> norm 2.490723208298e-04 ||r(i)||/||b|| 3.381496356025e-05 >> > > > > > 163 KSP preconditioned resid norm 3.383655922943e-06 true resid >> norm 2.490659981249e-04 ||r(i)||/||b|| 3.381410516686e-05 >> > > > > > 164 KSP preconditioned resid norm 3.477197358452e-06 true resid >> norm 2.490665979073e-04 ||r(i)||/||b|| 3.381418659549e-05 >> > > > > > 165 KSP preconditioned resid norm 3.454672202601e-06 true resid >> norm 2.490651358644e-04 ||r(i)||/||b|| 3.381398810323e-05 >> > > > > > 166 KSP preconditioned resid norm 3.399075522566e-06 true resid >> norm 2.490678159511e-04 ||r(i)||/||b|| 3.381435196154e-05 >> > > > > > 167 KSP preconditioned resid norm 3.305455787400e-06 true resid >> norm 2.490651924523e-04 ||r(i)||/||b|| 3.381399578581e-05 >> > > > > > 168 KSP preconditioned resid norm 3.368445533284e-06 true resid >> norm 2.490688061735e-04 ||r(i)||/||b|| 3.381448639774e-05 >> > > > > > 169 KSP preconditioned resid norm 2.981519724814e-06 true resid >> norm 2.490676378334e-04 ||r(i)||/||b|| 3.381432777964e-05 >> > > > > > 170 KSP preconditioned resid norm 3.034423065539e-06 true resid >> norm 2.490694458885e-04 ||r(i)||/||b|| 3.381457324777e-05 >> > > > > > 171 KSP preconditioned resid norm 2.885972780503e-06 true resid >> norm 2.490688033729e-04 ||r(i)||/||b|| 3.381448601752e-05 >> > > > > > 172 KSP preconditioned resid norm 2.892491075033e-06 true resid >> norm 2.490692993765e-04 ||r(i)||/||b|| 3.381455335678e-05 >> > > > > > 173 KSP preconditioned resid norm 2.921316177611e-06 true resid >> norm 2.490697629787e-04 ||r(i)||/||b|| 3.381461629709e-05 >> > > > > > 174 KSP preconditioned resid norm 2.999889222269e-06 true resid >> norm 2.490707272626e-04 ||r(i)||/||b|| 3.381474721178e-05 >> > > > > > 175 KSP preconditioned resid norm 2.975590207575e-06 true resid >> norm 2.490685439925e-04 ||r(i)||/||b|| 3.381445080310e-05 >> > > > > > 176 KSP preconditioned resid norm 2.983065843597e-06 true resid >> norm 2.490701883671e-04 ||r(i)||/||b|| 3.381467404937e-05 >> > > > > > 177 KSP preconditioned resid norm 
2.965959610245e-06 true resid >> norm 2.490711538630e-04 ||r(i)||/||b|| 3.381480512861e-05 >> > > > > > 178 KSP preconditioned resid norm 3.005389788827e-06 true resid >> norm 2.490751808095e-04 ||r(i)||/||b|| 3.381535184150e-05 >> > > > > > 179 KSP preconditioned resid norm 2.956581668772e-06 true resid >> norm 2.490653125636e-04 ||r(i)||/||b|| 3.381401209257e-05 >> > > > > > 180 KSP preconditioned resid norm 2.937498883661e-06 true resid >> norm 2.490666056653e-04 ||r(i)||/||b|| 3.381418764874e-05 >> > > > > > 181 KSP preconditioned resid norm 2.913227475431e-06 true resid >> norm 2.490682436979e-04 ||r(i)||/||b|| 3.381441003402e-05 >> > > > > > 182 KSP preconditioned resid norm 3.048172862254e-06 true resid >> norm 2.490719669872e-04 ||r(i)||/||b|| 3.381491552130e-05 >> > > > > > 183 KSP preconditioned resid norm 3.023868104933e-06 true resid >> norm 2.490648745555e-04 ||r(i)||/||b|| 3.381395262699e-05 >> > > > > > 184 KSP preconditioned resid norm 2.985947506400e-06 true resid >> norm 2.490638818852e-04 ||r(i)||/||b|| 3.381381785846e-05 >> > > > > > 185 KSP preconditioned resid norm 2.840032055776e-06 true resid >> norm 2.490701112392e-04 ||r(i)||/||b|| 3.381466357820e-05 >> > > > > > 186 KSP preconditioned resid norm 2.229279683815e-06 true resid >> norm 2.490609220680e-04 ||r(i)||/||b|| 3.381341602292e-05 >> > > > > > 187 KSP preconditioned resid norm 2.441513276379e-06 true resid >> norm 2.490674056899e-04 ||r(i)||/||b|| 3.381429626300e-05 >> > > > > > 188 KSP preconditioned resid norm 2.467046864016e-06 true resid >> norm 2.490691622632e-04 ||r(i)||/||b|| 3.381453474178e-05 >> > > > > > 189 KSP preconditioned resid norm 2.482124586361e-06 true resid >> norm 2.490664992339e-04 ||r(i)||/||b|| 3.381417319923e-05 >> > > > > > 190 KSP preconditioned resid norm 2.470564926502e-06 true resid >> norm 2.490617019713e-04 ||r(i)||/||b|| 3.381352190543e-05 >> > > > > > 191 KSP preconditioned resid norm 2.457947086578e-06 true resid >> norm 2.490628644250e-04 ||r(i)||/||b|| 3.381367972437e-05 >> > > > > > 192 KSP preconditioned resid norm 2.469444741724e-06 true resid >> norm 2.490639416335e-04 ||r(i)||/||b|| 3.381382597011e-05 >> > > > > > 193 KSP preconditioned resid norm 2.469951525219e-06 true resid >> norm 2.490599769764e-04 ||r(i)||/||b|| 3.381328771385e-05 >> > > > > > 194 KSP preconditioned resid norm 2.467486786643e-06 true resid >> norm 2.490630178622e-04 ||r(i)||/||b|| 3.381370055556e-05 >> > > > > > 195 KSP preconditioned resid norm 2.409684391404e-06 true resid >> norm 2.490640302606e-04 ||r(i)||/||b|| 3.381383800245e-05 >> > > > > > 196 KSP preconditioned resid norm 2.456046691135e-06 true resid >> norm 2.490637730235e-04 ||r(i)||/||b|| 3.381380307900e-05 >> > > > > > 197 KSP preconditioned resid norm 2.300015653805e-06 true resid >> norm 2.490615406913e-04 ||r(i)||/||b|| 3.381350000947e-05 >> > > > > > 198 KSP preconditioned resid norm 2.238328275301e-06 true resid >> norm 2.490647641246e-04 ||r(i)||/||b|| 3.381393763449e-05 >> > > > > > 199 KSP preconditioned resid norm 2.317293820319e-06 true resid >> norm 2.490641611282e-04 ||r(i)||/||b|| 3.381385576951e-05 >> > > > > > 200 KSP preconditioned resid norm 2.359590971314e-06 true resid >> norm 2.490685242974e-04 ||r(i)||/||b|| 3.381444812922e-05 >> > > > > > 201 KSP preconditioned resid norm 2.311199691596e-06 true resid >> norm 2.490656791753e-04 ||r(i)||/||b|| 3.381406186510e-05 >> > > > > > 202 KSP preconditioned resid norm 2.328772904196e-06 true resid >> norm 2.490651045523e-04 ||r(i)||/||b|| 3.381398385220e-05 >> > > > > > 
203 KSP preconditioned resid norm 2.332731604717e-06 true resid >> norm 2.490649960574e-04 ||r(i)||/||b|| 3.381396912253e-05 >> > > > > > 204 KSP preconditioned resid norm 2.357629383490e-06 true resid >> norm 2.490686317727e-04 ||r(i)||/||b|| 3.381446272046e-05 >> > > > > > 205 KSP preconditioned resid norm 2.374856180299e-06 true resid >> norm 2.490645897176e-04 ||r(i)||/||b|| 3.381391395637e-05 >> > > > > > 206 KSP preconditioned resid norm 2.340395514404e-06 true resid >> norm 2.490618341127e-04 ||r(i)||/||b|| 3.381353984542e-05 >> > > > > > 207 KSP preconditioned resid norm 2.314963680954e-06 true resid >> norm 2.490676153984e-04 ||r(i)||/||b|| 3.381432473379e-05 >> > > > > > 208 KSP preconditioned resid norm 2.448070953106e-06 true resid >> norm 2.490644606776e-04 ||r(i)||/||b|| 3.381389643743e-05 >> > > > > > 209 KSP preconditioned resid norm 2.428805110632e-06 true resid >> norm 2.490635817597e-04 ||r(i)||/||b|| 3.381377711234e-05 >> > > > > > 210 KSP preconditioned resid norm 2.537929937808e-06 true resid >> norm 2.490680589404e-04 ||r(i)||/||b|| 3.381438495066e-05 >> > > > > > 211 KSP preconditioned resid norm 2.515909029682e-06 true resid >> norm 2.490687803038e-04 ||r(i)||/||b|| 3.381448288557e-05 >> > > > > > 212 KSP preconditioned resid norm 2.497907513266e-06 true resid >> norm 2.490618016885e-04 ||r(i)||/||b|| 3.381353544340e-05 >> > > > > > 213 KSP preconditioned resid norm 1.783501869502e-06 true resid >> norm 2.490632647470e-04 ||r(i)||/||b|| 3.381373407354e-05 >> > > > > > 214 KSP preconditioned resid norm 1.767420653144e-06 true resid >> norm 2.490685569328e-04 ||r(i)||/||b|| 3.381445255992e-05 >> > > > > > 215 KSP preconditioned resid norm 1.854926068272e-06 true resid >> norm 2.490609365464e-04 ||r(i)||/||b|| 3.381341798856e-05 >> > > > > > 216 KSP preconditioned resid norm 1.818308539774e-06 true resid >> norm 2.490639142283e-04 ||r(i)||/||b|| 3.381382224948e-05 >> > > > > > 217 KSP preconditioned resid norm 1.809431578070e-06 true resid >> norm 2.490605125049e-04 ||r(i)||/||b|| 3.381336041915e-05 >> > > > > > 218 KSP preconditioned resid norm 1.789862735999e-06 true resid >> norm 2.490564024901e-04 ||r(i)||/||b|| 3.381280242859e-05 >> > > > > > 219 KSP preconditioned resid norm 1.769239890163e-06 true resid >> norm 2.490647825316e-04 ||r(i)||/||b|| 3.381394013349e-05 >> > > > > > 220 KSP preconditioned resid norm 1.780760773109e-06 true resid >> norm 2.490622606663e-04 ||r(i)||/||b|| 3.381359775589e-05 >> > > > > > 221 KSP preconditioned resid norm 5.009024913368e-07 true resid >> norm 2.490659101637e-04 ||r(i)||/||b|| 3.381409322492e-05 >> > > > > > 222 KSP preconditioned resid norm 4.974450322799e-07 true resid >> norm 2.490714287402e-04 ||r(i)||/||b|| 3.381484244693e-05 >> > > > > > 223 KSP preconditioned resid norm 4.938819481519e-07 true resid >> norm 2.490665661715e-04 ||r(i)||/||b|| 3.381418228693e-05 >> > > > > > 224 KSP preconditioned resid norm 4.973231831266e-07 true resid >> norm 2.490725000995e-04 ||r(i)||/||b|| 3.381498789855e-05 >> > > > > > 225 KSP preconditioned resid norm 5.086864036771e-07 true resid >> norm 2.490664132954e-04 ||r(i)||/||b|| 3.381416153192e-05 >> > > > > > 226 KSP preconditioned resid norm 5.046954570561e-07 true resid >> norm 2.490698772594e-04 ||r(i)||/||b|| 3.381463181226e-05 >> > > > > > 227 KSP preconditioned resid norm 5.086852920874e-07 true resid >> norm 2.490703544723e-04 ||r(i)||/||b|| 3.381469660041e-05 >> > > > > > 228 KSP preconditioned resid norm 5.182381756169e-07 true resid >> norm 2.490665200032e-04 
||r(i)||/||b|| 3.381417601896e-05 >> > > > > > 229 KSP preconditioned resid norm 5.261455182896e-07 true resid >> norm 2.490697169472e-04 ||r(i)||/||b|| 3.381461004770e-05 >> > > > > > 230 KSP preconditioned resid norm 5.265262522400e-07 true resid >> norm 2.490726890541e-04 ||r(i)||/||b|| 3.381501355172e-05 >> > > > > > 231 KSP preconditioned resid norm 5.220652263946e-07 true resid >> norm 2.490689325236e-04 ||r(i)||/||b|| 3.381450355149e-05 >> > > > > > 232 KSP preconditioned resid norm 5.256466259888e-07 true resid >> norm 2.490694033989e-04 ||r(i)||/||b|| 3.381456747923e-05 >> > > > > > 233 KSP preconditioned resid norm 5.443022648374e-07 true resid >> norm 2.490650183144e-04 ||r(i)||/||b|| 3.381397214423e-05 >> > > > > > 234 KSP preconditioned resid norm 5.562619006436e-07 true resid >> norm 2.490764576883e-04 ||r(i)||/||b|| 3.381552519520e-05 >> > > > > > 235 KSP preconditioned resid norm 5.998148629545e-07 true resid >> norm 2.490714032716e-04 ||r(i)||/||b|| 3.381483898922e-05 >> > > > > > 236 KSP preconditioned resid norm 6.498977322955e-07 true resid >> norm 2.490650270144e-04 ||r(i)||/||b|| 3.381397332537e-05 >> > > > > > 237 KSP preconditioned resid norm 6.503686003429e-07 true resid >> norm 2.490706976108e-04 ||r(i)||/||b|| 3.381474318615e-05 >> > > > > > 238 KSP preconditioned resid norm 6.566719023119e-07 true resid >> norm 2.490664107559e-04 ||r(i)||/||b|| 3.381416118714e-05 >> > > > > > 239 KSP preconditioned resid norm 6.549737473208e-07 true resid >> norm 2.490721547909e-04 ||r(i)||/||b|| 3.381494101821e-05 >> > > > > > 240 KSP preconditioned resid norm 6.616898981418e-07 true resid >> norm 2.490679659838e-04 ||r(i)||/||b|| 3.381437233053e-05 >> > > > > > 241 KSP preconditioned resid norm 6.829917691021e-07 true resid >> norm 2.490728328614e-04 ||r(i)||/||b|| 3.381503307553e-05 >> > > > > > 242 KSP preconditioned resid norm 7.030239869389e-07 true resid >> norm 2.490706345115e-04 ||r(i)||/||b|| 3.381473461955e-05 >> > > > > > 243 KSP preconditioned resid norm 7.018435683340e-07 true resid >> norm 2.490650978460e-04 ||r(i)||/||b|| 3.381398294172e-05 >> > > > > > 244 KSP preconditioned resid norm 7.058047080376e-07 true resid >> norm 2.490685975642e-04 ||r(i)||/||b|| 3.381445807618e-05 >> > > > > > 245 KSP preconditioned resid norm 6.896300385099e-07 true resid >> norm 2.490708566380e-04 ||r(i)||/||b|| 3.381476477625e-05 >> > > > > > 246 KSP preconditioned resid norm 7.093960074437e-07 true resid >> norm 2.490667427871e-04 ||r(i)||/||b|| 3.381420626490e-05 >> > > > > > 247 KSP preconditioned resid norm 7.817121711853e-07 true resid >> norm 2.490692299030e-04 ||r(i)||/||b|| 3.381454392480e-05 >> > > > > > 248 KSP preconditioned resid norm 7.976109778309e-07 true resid >> norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 >> > > > > > 249 KSP preconditioned resid norm 7.855322750445e-07 true resid >> norm 2.490720861966e-04 ||r(i)||/||b|| 3.381493170560e-05 >> > > > > > 250 KSP preconditioned resid norm 7.778531114042e-07 true resid >> norm 2.490673235034e-04 ||r(i)||/||b|| 3.381428510506e-05 >> > > > > > 251 KSP preconditioned resid norm 7.848682182070e-07 true resid >> norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 >> > > > > > 252 KSP preconditioned resid norm 7.967291867330e-07 true resid >> norm 2.490724820229e-04 ||r(i)||/||b|| 3.381498544442e-05 >> > > > > > 253 KSP preconditioned resid norm 7.865012959525e-07 true resid >> norm 2.490666662028e-04 ||r(i)||/||b|| 3.381419586754e-05 >> > > > > > 254 KSP preconditioned resid norm 7.656025385804e-07 true 
resid >> norm 2.490686283214e-04 ||r(i)||/||b|| 3.381446225190e-05 >> > > > > > 255 KSP preconditioned resid norm 7.757018653468e-07 true resid >> norm 2.490655983763e-04 ||r(i)||/||b|| 3.381405089553e-05 >> > > > > > 256 KSP preconditioned resid norm 6.686490372981e-07 true resid >> norm 2.490715698964e-04 ||r(i)||/||b|| 3.381486161081e-05 >> > > > > > 257 KSP preconditioned resid norm 6.596005109428e-07 true resid >> norm 2.490666403003e-04 ||r(i)||/||b|| 3.381419235092e-05 >> > > > > > 258 KSP preconditioned resid norm 6.681742296333e-07 true resid >> norm 2.490683725835e-04 ||r(i)||/||b|| 3.381442753198e-05 >> > > > > > 259 KSP preconditioned resid norm 1.089245482033e-06 true resid >> norm 2.490688086568e-04 ||r(i)||/||b|| 3.381448673488e-05 >> > > > > > 260 KSP preconditioned resid norm 1.099844873189e-06 true resid >> norm 2.490690703265e-04 ||r(i)||/||b|| 3.381452226011e-05 >> > > > > > 261 KSP preconditioned resid norm 1.112925540869e-06 true resid >> norm 2.490664481058e-04 ||r(i)||/||b|| 3.381416625790e-05 >> > > > > > 262 KSP preconditioned resid norm 1.113056910480e-06 true resid >> norm 2.490658753273e-04 ||r(i)||/||b|| 3.381408849541e-05 >> > > > > > 263 KSP preconditioned resid norm 1.104801535149e-06 true resid >> norm 2.490736510776e-04 ||r(i)||/||b|| 3.381514415953e-05 >> > > > > > 264 KSP preconditioned resid norm 1.158709147873e-06 true resid >> norm 2.490607531152e-04 ||r(i)||/||b|| 3.381339308528e-05 >> > > > > > 265 KSP preconditioned resid norm 1.178985740182e-06 true resid >> norm 2.490727895619e-04 ||r(i)||/||b|| 3.381502719703e-05 >> > > > > > 266 KSP preconditioned resid norm 1.165130533478e-06 true resid >> norm 2.490639076693e-04 ||r(i)||/||b|| 3.381382135901e-05 >> > > > > > 267 KSP preconditioned resid norm 1.181364114499e-06 true resid >> norm 2.490667871436e-04 ||r(i)||/||b|| 3.381421228690e-05 >> > > > > > 268 KSP preconditioned resid norm 1.170295348543e-06 true resid >> norm 2.490662613306e-04 ||r(i)||/||b|| 3.381414090063e-05 >> > > > > > 269 KSP preconditioned resid norm 1.213243016230e-06 true resid >> norm 2.490666173719e-04 ||r(i)||/||b|| 3.381418923808e-05 >> > > > > > 270 KSP preconditioned resid norm 1.239691953997e-06 true resid >> norm 2.490678323197e-04 ||r(i)||/||b|| 3.381435418381e-05 >> > > > > > 271 KSP preconditioned resid norm 1.219891740100e-06 true resid >> norm 2.490625009256e-04 ||r(i)||/||b|| 3.381363037437e-05 >> > > > > > 272 KSP preconditioned resid norm 1.231321334346e-06 true resid >> norm 2.490659733696e-04 ||r(i)||/||b|| 3.381410180599e-05 >> > > > > > 273 KSP preconditioned resid norm 1.208183234158e-06 true resid >> norm 2.490685987255e-04 ||r(i)||/||b|| 3.381445823385e-05 >> > > > > > 274 KSP preconditioned resid norm 1.211768545589e-06 true resid >> norm 2.490671548953e-04 ||r(i)||/||b|| 3.381426221421e-05 >> > > > > > 275 KSP preconditioned resid norm 1.209433459842e-06 true resid >> norm 2.490669016096e-04 ||r(i)||/||b|| 3.381422782722e-05 >> > > > > > 276 KSP preconditioned resid norm 1.223729184405e-06 true resid >> norm 2.490658128014e-04 ||r(i)||/||b|| 3.381408000666e-05 >> > > > > > 277 KSP preconditioned resid norm 1.243915201868e-06 true resid >> norm 2.490693375756e-04 ||r(i)||/||b|| 3.381455854282e-05 >> > > > > > 278 KSP preconditioned resid norm 1.231994655529e-06 true resid >> norm 2.490682988311e-04 ||r(i)||/||b|| 3.381441751910e-05 >> > > > > > 279 KSP preconditioned resid norm 1.227930683777e-06 true resid >> norm 2.490667825866e-04 ||r(i)||/||b|| 3.381421166823e-05 >> > > > > > 280 KSP preconditioned 
resid norm 1.193458846469e-06 true resid >> norm 2.490687366117e-04 ||r(i)||/||b|| 3.381447695378e-05 >> > > > > > 281 KSP preconditioned resid norm 1.217089059805e-06 true resid >> norm 2.490674797371e-04 ||r(i)||/||b|| 3.381430631591e-05 >> > > > > > 282 KSP preconditioned resid norm 1.249318287709e-06 true resid >> norm 2.490662866951e-04 ||r(i)||/||b|| 3.381414434420e-05 >> > > > > > 283 KSP preconditioned resid norm 1.183320029547e-06 true resid >> norm 2.490645783630e-04 ||r(i)||/||b|| 3.381391241482e-05 >> > > > > > 284 KSP preconditioned resid norm 1.174730603102e-06 true resid >> norm 2.490686881647e-04 ||r(i)||/||b|| 3.381447037643e-05 >> > > > > > 285 KSP preconditioned resid norm 1.175838261923e-06 true resid >> norm 2.490665969300e-04 ||r(i)||/||b|| 3.381418646281e-05 >> > > > > > 286 KSP preconditioned resid norm 1.188946188368e-06 true resid >> norm 2.490661974622e-04 ||r(i)||/||b|| 3.381413222961e-05 >> > > > > > 287 KSP preconditioned resid norm 1.177848565707e-06 true resid >> norm 2.490660236206e-04 ||r(i)||/||b|| 3.381410862824e-05 >> > > > > > 288 KSP preconditioned resid norm 1.200075508281e-06 true resid >> norm 2.490645353536e-04 ||r(i)||/||b|| 3.381390657571e-05 >> > > > > > 289 KSP preconditioned resid norm 1.184589570618e-06 true resid >> norm 2.490664920355e-04 ||r(i)||/||b|| 3.381417222195e-05 >> > > > > > 290 KSP preconditioned resid norm 1.221114703873e-06 true resid >> norm 2.490670597538e-04 ||r(i)||/||b|| 3.381424929746e-05 >> > > > > > 291 KSP preconditioned resid norm 1.249479658256e-06 true resid >> norm 2.490641582876e-04 ||r(i)||/||b|| 3.381385538385e-05 >> > > > > > 292 KSP preconditioned resid norm 1.245768496850e-06 true resid >> norm 2.490704480588e-04 ||r(i)||/||b|| 3.381470930606e-05 >> > > > > > 293 KSP preconditioned resid norm 1.243742607953e-06 true resid >> norm 2.490649690604e-04 ||r(i)||/||b|| 3.381396545733e-05 >> > > > > > 294 KSP preconditioned resid norm 1.342758483339e-06 true resid >> norm 2.490676207432e-04 ||r(i)||/||b|| 3.381432545942e-05 >> > > > > > 295 KSP preconditioned resid norm 1.353816099600e-06 true resid >> norm 2.490695263153e-04 ||r(i)||/||b|| 3.381458416681e-05 >> > > > > > 296 KSP preconditioned resid norm 1.343886763293e-06 true resid >> norm 2.490673674307e-04 ||r(i)||/||b|| 3.381429106879e-05 >> > > > > > 297 KSP preconditioned resid norm 1.355511022815e-06 true resid >> norm 2.490686565995e-04 ||r(i)||/||b|| 3.381446609103e-05 >> > > > > > 298 KSP preconditioned resid norm 1.347247627243e-06 true resid >> norm 2.490696287707e-04 ||r(i)||/||b|| 3.381459807652e-05 >> > > > > > 299 KSP preconditioned resid norm 1.414742595618e-06 true resid >> norm 2.490749815091e-04 ||r(i)||/||b|| 3.381532478374e-05 >> > > > > > 300 KSP preconditioned resid norm 1.418560683189e-06 true resid >> norm 2.490721501153e-04 ||r(i)||/||b|| 3.381494038343e-05 >> > > > > > 301 KSP preconditioned resid norm 1.416276404923e-06 true resid >> norm 2.490689576447e-04 ||r(i)||/||b|| 3.381450696203e-05 >> > > > > > 302 KSP preconditioned resid norm 1.431448272112e-06 true resid >> norm 2.490688812701e-04 ||r(i)||/||b|| 3.381449659312e-05 >> > > > > > 303 KSP preconditioned resid norm 1.446154958969e-06 true resid >> norm 2.490727536322e-04 ||r(i)||/||b|| 3.381502231909e-05 >> > > > > > 304 KSP preconditioned resid norm 1.468860617921e-06 true resid >> norm 2.490692363788e-04 ||r(i)||/||b|| 3.381454480397e-05 >> > > > > > 305 KSP preconditioned resid norm 1.627595214971e-06 true resid >> norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 >> 
> > > > > 306 KSP preconditioned resid norm 1.614384672893e-06 true resid >> norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 >> > > > > > 307 KSP preconditioned resid norm 1.605568020532e-06 true resid >> norm 2.490699757693e-04 ||r(i)||/||b|| 3.381464518632e-05 >> > > > > > 308 KSP preconditioned resid norm 1.617069685075e-06 true resid >> norm 2.490649282923e-04 ||r(i)||/||b|| 3.381395992249e-05 >> > > > > > 309 KSP preconditioned resid norm 1.654297792738e-06 true resid >> norm 2.490644766626e-04 ||r(i)||/||b|| 3.381389860760e-05 >> > > > > > 310 KSP preconditioned resid norm 1.587528143215e-06 true resid >> norm 2.490696752096e-04 ||r(i)||/||b|| 3.381460438124e-05 >> > > > > > 311 KSP preconditioned resid norm 1.662782022388e-06 true resid >> norm 2.490699317737e-04 ||r(i)||/||b|| 3.381463921332e-05 >> > > > > > 312 KSP preconditioned resid norm 1.618211471748e-06 true resid >> norm 2.490735831308e-04 ||r(i)||/||b|| 3.381513493483e-05 >> > > > > > 313 KSP preconditioned resid norm 1.609074961921e-06 true resid >> norm 2.490679566436e-04 ||r(i)||/||b|| 3.381437106247e-05 >> > > > > > 314 KSP preconditioned resid norm 1.548068942878e-06 true resid >> norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 >> > > > > > 315 KSP preconditioned resid norm 1.526718322150e-06 true resid >> norm 2.490619832967e-04 ||r(i)||/||b|| 3.381356009919e-05 >> > > > > > 316 KSP preconditioned resid norm 1.553150959105e-06 true resid >> norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 >> > > > > > 317 KSP preconditioned resid norm 1.615015320906e-06 true resid >> norm 2.490672348079e-04 ||r(i)||/||b|| 3.381427306343e-05 >> > > > > > 318 KSP preconditioned resid norm 1.602904469797e-06 true resid >> norm 2.490696731006e-04 ||r(i)||/||b|| 3.381460409491e-05 >> > > > > > 319 KSP preconditioned resid norm 1.538140323073e-06 true resid >> norm 2.490722982494e-04 ||r(i)||/||b|| 3.381496049466e-05 >> > > > > > 320 KSP preconditioned resid norm 1.534779679430e-06 true resid >> norm 2.490778499789e-04 ||r(i)||/||b|| 3.381571421763e-05 >> > > > > > 321 KSP preconditioned resid norm 1.547155843355e-06 true resid >> norm 2.490767612985e-04 ||r(i)||/||b|| 3.381556641442e-05 >> > > > > > 322 KSP preconditioned resid norm 1.422137008870e-06 true resid >> norm 2.490737676309e-04 ||r(i)||/||b|| 3.381515998323e-05 >> > > > > > 323 KSP preconditioned resid norm 1.403072558954e-06 true resid >> norm 2.490741361870e-04 ||r(i)||/||b|| 3.381521001975e-05 >> > > > > > 324 KSP preconditioned resid norm 1.373070436118e-06 true resid >> norm 2.490742214990e-04 ||r(i)||/||b|| 3.381522160202e-05 >> > > > > > 325 KSP preconditioned resid norm 1.359547585233e-06 true resid >> norm 2.490792987570e-04 ||r(i)||/||b|| 3.381591090902e-05 >> > > > > > 326 KSP preconditioned resid norm 1.370351913612e-06 true resid >> norm 2.490727161158e-04 ||r(i)||/||b|| 3.381501722573e-05 >> > > > > > 327 KSP preconditioned resid norm 1.365238666187e-06 true resid >> norm 2.490716949642e-04 ||r(i)||/||b|| 3.381487859046e-05 >> > > > > > 328 KSP preconditioned resid norm 1.369073373042e-06 true resid >> norm 2.490807360288e-04 ||r(i)||/||b|| 3.381610603826e-05 >> > > > > > 329 KSP preconditioned resid norm 1.426698981572e-06 true resid >> norm 2.490791479521e-04 ||r(i)||/||b|| 3.381589043520e-05 >> > > > > > 330 KSP preconditioned resid norm 1.445542403570e-06 true resid >> norm 2.490775981409e-04 ||r(i)||/||b|| 3.381568002720e-05 >> > > > > > 331 KSP preconditioned resid norm 1.464506963984e-06 true resid >> norm 2.490740562430e-04 
||r(i)||/||b|| 3.381519916626e-05 >> > > > > > 332 KSP preconditioned resid norm 1.461462964401e-06 true resid >> norm 2.490768016856e-04 ||r(i)||/||b|| 3.381557189753e-05 >> > > > > > 333 KSP preconditioned resid norm 1.476680847971e-06 true resid >> norm 2.490744321516e-04 ||r(i)||/||b|| 3.381525020097e-05 >> > > > > > 334 KSP preconditioned resid norm 1.459640372198e-06 true resid >> norm 2.490788817993e-04 ||r(i)||/||b|| 3.381585430132e-05 >> > > > > > 335 KSP preconditioned resid norm 1.790770882365e-06 true resid >> norm 2.490771711471e-04 ||r(i)||/||b|| 3.381562205697e-05 >> > > > > > 336 KSP preconditioned resid norm 1.803770155018e-06 true resid >> norm 2.490768953858e-04 ||r(i)||/||b|| 3.381558461860e-05 >> > > > > > 337 KSP preconditioned resid norm 1.787821255995e-06 true resid >> norm 2.490767985676e-04 ||r(i)||/||b|| 3.381557147421e-05 >> > > > > > 338 KSP preconditioned resid norm 1.749912220831e-06 true resid >> norm 2.490760198704e-04 ||r(i)||/||b|| 3.381546575545e-05 >> > > > > > 339 KSP preconditioned resid norm 1.802915839010e-06 true resid >> norm 2.490815556273e-04 ||r(i)||/||b|| 3.381621730993e-05 >> > > > > > 340 KSP preconditioned resid norm 1.800777670709e-06 true resid >> norm 2.490823909286e-04 ||r(i)||/||b|| 3.381633071347e-05 >> > > > > > 341 KSP preconditioned resid norm 1.962516327690e-06 true resid >> norm 2.490773477410e-04 ||r(i)||/||b|| 3.381564603199e-05 >> > > > > > 342 KSP preconditioned resid norm 1.981726465132e-06 true resid >> norm 2.490769116884e-04 ||r(i)||/||b|| 3.381558683191e-05 >> > > > > > 343 KSP preconditioned resid norm 1.963419167052e-06 true resid >> norm 2.490764009914e-04 ||r(i)||/||b|| 3.381551749783e-05 >> > > > > > 344 KSP preconditioned resid norm 1.992082169278e-06 true resid >> norm 2.490806829883e-04 ||r(i)||/||b|| 3.381609883728e-05 >> > > > > > 345 KSP preconditioned resid norm 1.981005134253e-06 true resid >> norm 2.490748677339e-04 ||r(i)||/||b|| 3.381530933721e-05 >> > > > > > 346 KSP preconditioned resid norm 1.959802663114e-06 true resid >> norm 2.490773752317e-04 ||r(i)||/||b|| 3.381564976423e-05 >> > > > > > >> > > > > > On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley < >> knepley at gmail.com> wrote: >> > > > > > On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang >> wrote: >> > > > > > ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY); >> > > > > > ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY); >> > > > > > >> > > > > > ierr = VecAssemblyBegin(x); >> > > > > > ierr = VecAssemblyEnd(x); >> > > > > > This is probably unnecessary >> > > > > > >> > > > > > ierr = VecAssemblyBegin(b); >> > > > > > ierr = VecAssemblyEnd(b); >> > > > > > This is probably unnecessary >> > > > > > >> > > > > > >> > > > > > ierr = MatNullSpaceCreate(PETSC_COMM_ >> WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp); >> > > > > > ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8 >> > > > > > Is your rhs consistent with this nullspace? >> > > > > > >> > > > > > // KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN); >> > > > > > KSPSetOperators(ksp,A,A); >> > > > > > >> > > > > > KSPSetType(ksp,KSPBCGS); >> > > > > > >> > > > > > KSPSetComputeSingularValues(ksp, PETSC_TRUE); >> > > > > > #if defined(__HYPRE__) >> > > > > > KSPGetPC(ksp, &pc); >> > > > > > PCSetType(pc, PCHYPRE); >> > > > > > PCHYPRESetType(pc,"boomeramg"); >> > > > > > This is terribly unnecessary. 
You just use >> > > > > > >> > > > > > -pc_type hypre -pc_hypre_type boomeramg >> > > > > > >> > > > > > or >> > > > > > >> > > > > > -pc_type gamg >> > > > > > >> > > > > > #else >> > > > > > KSPSetType(ksp,KSPBCGSL); >> > > > > > KSPBCGSLSetEll(ksp,2); >> > > > > > #endif /* defined(__HYPRE__) */ >> > > > > > >> > > > > > KSPSetFromOptions(ksp); >> > > > > > KSPSetUp(ksp); >> > > > > > >> > > > > > ierr = KSPSolve(ksp,b,x); >> > > > > > >> > > > > > >> > > > > > command line >> > > > > > >> > > > > > You did not provide any of what I asked for in the >> previous mail. >> > > > > > >> > > > > > Matt >> > > > > > >> > > > > > On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley < >> knepley at gmail.com> wrote: >> > > > > > On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang >> wrote: >> > > > > > hi, >> > > > > > >> > > > > > I implemented the HYPRE preconditioner for my study because, without a >> preconditioner, the PETSc solver takes thousands of iterations >> to converge for fine-grid simulations. >> > > > > > >> > > > > > With HYPRE, depending on the parallel partition, it takes >> HYPRE forever to do anything; the output file shows that the >> simulation is hanging, with no output. >> > > > > > >> > > > > > Any idea what happened? I will post a snippet of code. >> > > > > > >> > > > > > 1) For any question about convergence, we need to see the >> output of >> > > > > > >> > > > > > -ksp_view_pre -ksp_view -ksp_monitor_true_residual >> -ksp_converged_reason >> > > > > > >> > > > > > 2) Hypre has many preconditioners; which one are you talking >> about? >> > > > > > >> > > > > > 3) PETSc has some preconditioners in common with Hypre, like AMG >> > > > > > >> > > > > > Thanks, >> > > > > > >> > > > > > Matt >> > > > > > >> > > > > > -- >> > > > > > Hao Zhang >> > > > > > Dept. of Applied Mathematics and Statistics, >> > > > > > Stony Brook University, >> > > > > > Stony Brook, New York, 11790 >> > > > > > >> > > > > > >> > > > > > -- >> > > > > > What most experimenters take for granted before they begin >> their experiments is infinitely more interesting than any results to which >> their experiments lead. >> > > > > > -- Norbert Wiener >> > > > > > >> > > > > > https://www.cse.buffalo.edu/~knepley/ >> > > > > > >> > > > > > >> > > > > > >> > > > > > -- >> > > > > > Hao Zhang >> > > > > > Dept. of Applied Mathematics and Statistics, >> > > > > > Stony Brook University, >> > > > > > Stony Brook, New York, 11790 >> > > > > > >> > > > > > >> > > > > > >> > > > > > -- >> > > > > > What most experimenters take for granted before they begin >> their experiments is infinitely more interesting than any results to which >> their experiments lead. >> > > > > > -- Norbert Wiener >> > > > > > >> > > > > > https://www.cse.buffalo.edu/~knepley/ >> > > > > > >> > > > > > >> > > > > > >> > > > > > -- >> > > > > > Hao Zhang >> > > > > > Dept. of Applied Mathematics and Statistics, >> > > > > > Stony Brook University, >> > > > > > Stony Brook, New York, 11790 >> > > > > > >> > > > > > >> > > > > > >> > > > > > -- >> > > > > > Hao Zhang >> > > > > > Dept. of Applied Mathematics and Statistics, >> > > > > > Stony Brook University, >> > > > > > Stony Brook, New York, 11790 >> > > > > >> > > > > -- >> > > > > Hao Zhang >> > > > > Dept. of Applied Mathematics and Statistics, >> > > > > Stony Brook University, >> > > > > Stony Brook, New York, 11790 >> > > > >> > > > >> > > > >> > > > -- >> > > > Hao Zhang >> > > > Dept. of Applied Mathematics and Statistics, >> > > > Stony Brook University, >> > > > Stony Brook, New York, 11790 >> > > >> > > >> > > >> > > -- >> > > Hao Zhang >> > > Dept. of Applied Mathematics and Statistics, >> > > Stony Brook University, >> > > Stony Brook, New York, 11790 >> > >> > >> > >> > -- >> > Hao Zhang >> > Dept. of Applied Mathematics and Statistics, >> > Stony Brook University, >> > Stony Brook, New York, 11790 >> >> > > > -- > Hao Zhang > Dept. of Applied Mathematics and Statistics, > Stony Brook University, > Stony Brook, New York, 11790 > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/
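Pulling the advice in this thread together: hardwire only the operators and the null space, and let the options database choose the solver, preconditioner, and tolerances. A hedged sketch of the slimmed-down solve, using the same variable names as the snippet above; the executable name ./solver is a placeholder:

    ierr = MatSetNullSpace(A,nullsp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);  /* picks up -ksp_type, -pc_type, tolerances, ... */
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);

run, for example, as

    mpirun -n 24 ./solver -ksp_type gmres -pc_type hypre -pc_hypre_type boomeramg \
        -ksp_rtol 1.e-8 -ksp_monitor_true_residual -ksp_converged_reason -ksp_view

which applies the tolerance advice above (a realistic -ksp_rtol instead of 1e-14) and produces the monitoring output that was requested.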
of Applid Mathematics and Statistics, >> > > > Stony Brook University, >> > > > Stony Brook, New York, 11790 >> > > >> > > >> > > >> > > >> > > -- >> > > Hao Zhang >> > > Dept. of Applid Mathematics and Statistics, >> > > Stony Brook University, >> > > Stony Brook, New York, 11790 >> > >> > >> > >> > >> > -- >> > Hao Zhang >> > Dept. of Applid Mathematics and Statistics, >> > Stony Brook University, >> > Stony Brook, New York, 11790 >> >> > > > -- > Hao Zhang > Dept. of Applid Mathematics and Statistics, > Stony Brook University, > Stony Brook, New York, 11790 > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Oct 22 08:26:39 2017 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 22 Oct 2017 09:26:39 -0400 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: References: <98D45588-381F-44FB-9D20-65D1638E357F@mcs.anl.gov> <876F0ACF-62C0-4DC9-B203-2893B25E3938@mcs.anl.gov> <1D94CC99-2395-4BBE-A0F3-80D23D85FA45@mcs.anl.gov> <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> Message-ID: On Sat, Oct 21, 2017 at 10:16 PM, zakaryah . wrote: > OK, it turns out Lukasz was exactly correct. With whatever method I try, > the solver or stepper approaches a critical point, which is associated with > some kind of snap-through. I have looked into the control techniques and > they are pretty ingenious, and I think they should work for my problem, in > that I hope to continue through the critical point. I have a technical > question about the implementation, though. > > Following Riks 1979 for example, the control parameter is the approximate > arc-length in the phase space of loading intensity and displacements. It > represents one additional variable in the system, and there is one > additional equation in the system (in Riks, this is eq. 3.9). > > In my implementation, the displacements are implemented as a DMDA with 3 > dof, since I'm working in 3D. I'm not sure about the best way to add the > single additional variable and equality. The way I see it, I either give > up on using the DMDA, in which case I'm not sure how to efficiently > implement the stencil I need to calculate spatial derivatives of the > displacements, or I have to add a rather large number of extra variables. > For example, if my DMDA is WxHxD, I would have to make it (W+1)xHxD, and > each of the extra HxD variables will have 3 dof. Then 3xHxD-1 variables > are in the nullspace (because they don't represent anything, so I would > have to add a bunch of zeros to the function and the Jacobian), while the > remaining variable is used as the control parameter. I'm aware of other > methods, e.g. Crisfield 1983, but I'm interested in whether there is a > straightforward way to implement Riks' method in PETSc. I'm sure I'm > missing something so hopefully someone can give me some hints. > You use a DMComposite to handle two DMs, and then use a DMRedundant http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMREDUNDANT.html for the extra variable, so all processes can see it. 
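For concreteness, a minimal sketch of that DMComposite/DMRedundant setup might look like the following (illustrative only and untested; error checking is omitted, W, H, D, snes, and the assembled Jacobian J are assumed to exist, and the exact creation sequence differs slightly between PETSc versions):

  DM da, red, pack;

  /* displacement field: 3 dof per point on a WxHxD structured grid */
  DMDACreate3d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,
               DMDA_STENCIL_BOX,W,H,D,PETSC_DECIDE,PETSC_DECIDE,PETSC_DECIDE,
               3,1,NULL,NULL,NULL,&da);

  /* one redundant scalar for the arc-length control parameter, owned by
     rank 0 but visible on all processes */
  DMRedundantCreate(PETSC_COMM_WORLD,0,1,&red);

  /* pack the two DMs so the solver sees a single unknown vector [u; lambda] */
  DMCompositeCreate(PETSC_COMM_WORLD,&pack);
  DMCompositeAddDM(pack,da);
  DMCompositeAddDM(pack,red);
  SNESSetDM(snes,pack);

  /* if preallocation of the coupled Jacobian misses the bordering entries,
     the allocation check mentioned just below can be switched off */
  MatSetOption(J,MAT_NEW_NONZERO_ALLOCATION_ERR,PETSC_FALSE);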
The only shortcoming here is that the combined Jacobian does not know how to allocate for the coupling between your arclength parameter and the displacements. I would just turn off allocation checking and stick in what you need to. Thanks, Matt > Thanks for all the help! > > On Thu, Oct 12, 2017 at 2:02 PM, zakaryah . wrote: > >> Thanks for the response, Matt - these are excellent questions. >> >> On theoretical grounds, I am certain that the solution to the continuous >> PDE exists. Without any serious treatment, I think this means the >> discretized system should have a solution up to discretization error, but >> perhaps this is indeed a bad approach. >> >> I am not sure whether the equations are "really hard to solve". At each >> point, the equations are third order polynomials of the state variable at >> that point and at nearby points (i.e. in the stencil). One possible >> complication is that the external forces which are applied to the interior >> of the material can be fairly complex - they are smooth, but they can have >> many inflection points. >> >> I don't have a great test case for which I know a good solution. To my >> thinking, there is no way that time-stepping the parabolic version of the >> same PDE can fail to yield a solution at infinite time. So, I'm going to >> try starting there. Converting the problem to a minimization is a bit >> trickier, because the discretization has to be performed one step earlier >> in the calculation, and therefore the gradient and Hessian would need to be >> recalculated. >> >> Even if there are some problems with time-stepping (speed of >> convergence?), maybe I can use the solutions as better test cases for the >> elliptic PDE solved via SNES. >> >> Can you give me any additional lingo or references for the fracture >> problem? >> >> Thanks, Zak >> >> On Wed, Oct 11, 2017 at 8:53 PM, Matthew Knepley >> wrote: >> >>> On Wed, Oct 11, 2017 at 11:33 AM, zakaryah . wrote: >>> >>>> Many thanks for the suggestions, Matt. >>>> >>>> I tried putting the solvers in a loop, like this: >>>> >>>> do { >>>> NewtonLS >>>> check convergence >>>> if (converged) break >>>> NRichardson or NGMRES >>>> } while (!converged) >>>> >>>> The results were interesting, to me at least. With NRichardson, there >>>> was indeed improvement in the residual norm, followed by improvement with >>>> NewtonLS, and so on for a few iterations of this loop. In each case, after >>>> a few iterations the NewtonLS appeared to be stuck in the same way as after >>>> the first iteration. Eventually neither method was able to reduce the >>>> residual norm, which was still significant, so this was not a total >>>> success. With NGMRES, the initial behavior was similar, but eventually the >>>> NGMRES progress became erratic. The minimal residual norm was a bit better >>>> using NGMRES than NRichardson, but neither combination of methods fully >>>> converged. For both NRichardson and NGMRES, I simply used the defaults, as >>>> I have no knowledge of how to tune the options for my problem. >>>> >>> >>> Are you certain that the equations have a solution? I become a little >>> concerned when richardson stops converging. It's >>> still possible you have really hard to solve equations, it just becomes >>> less likely. And even if they truly are hard to solve, >>> then there should be physical reasons for this. For example, it could be >>> that discretizing the minimizing PDE is just the >>> wrong thing to do.
I believe this is the case in fracture, where you >>> attack the minimization problem directly. >>> >>> Matt >>> >>> >>>> On Tue, Oct 10, 2017 at 4:08 PM, Matthew Knepley >>>> wrote: >>>> >>>>> On Tue, Oct 10, 2017 at 12:08 PM, zakaryah . >>>>> wrote: >>>>> >>>>>> Thanks for clearing that up. >>>>>> >>>>>> I'd appreciate any further help. Here's a summary: >>>>>> >>>>>> My ultimate goal is to find a vector field which minimizes an >>>>>> action. The action is a (nonlinear) function of the field and its first >>>>>> spatial derivatives. >>>>>> >>>>>> My current approach is to derive the (continuous) Euler-Lagrange >>>>>> equations, which results in a nonlinear PDE that the minimizing field must >>>>>> satisfy. These Euler-Lagrange equations are then discretized, and I'm >>>>>> trying to use an SNES to solve them. >>>>>> >>>>>> The problem is that the solver seems to reach a point at which the >>>>>> Jacobian (this corresponds to the second variation of the action, which is >>>>>> like a Hessian of the energy) becomes nearly singular, but where the >>>>>> residual (RHS of PDE) is not close to zero. The residual does not decrease >>>>>> over additional SNES iterations, and the line search results in tiny step >>>>>> sizes. My interpretation is that this point of stagnation is a critical >>>>>> point. >>>>>> >>>>> >>>>> The normal thing to do here (I think) is to engage solvers which do >>>>> not depend on that particular point. So using >>>>> NRichardson, or maybe NGMRES, to get past that. I would be interested >>>>> to see if this is successful. >>>>> >>>>> Matt >>>>> >>>>> >>>>>> I have checked the hand-coded Jacobian very carefully and I am >>>>>> confident that it is correct. >>>>>> >>>>>> I am guessing that such a situation is well-known in the field, but I >>>>>> don't know the lingo or literature. If anyone has suggestions I'd be >>>>>> thrilled. Are there documentation/methodologies within PETSc for this type >>>>>> of situation? >>>>>> >>>>>> Is there any advantage to discretizing the action itself and using >>>>>> the optimization routines? With minor modifications I'll have the gradient >>>>>> and Hessian calculations coded. Are the optimization routines likely to >>>>>> stagnate in the same way as the nonlinear solver, or can they take >>>>>> advantage of the structure of the problem to overcome this? >>>>>> >>>>>> Thanks a lot in advance for any help. >>>>>> >>>>>> On Sun, Oct 8, 2017 at 5:57 AM, Barry Smith >>>>>> wrote: >>>>>> >>>>>>> >>>>>>> There is apparently confusion in understanding the ordering. Is >>>>>>> this all on one process where you get funny results? Are you using >>>>>>> MatSetValuesStencil() to provide the matrix (it is generally easier than >>>>>>> providing it yourself)? In parallel MatView() always maps the rows and >>>>>>> columns to the natural ordering before printing, if you use a matrix >>>>>>> created from the DMDA. If you create the matrix yourself it has a different >>>>>>> MatView in parallel that is in the PETSc ordering. >>>>>>> >>>>>>> >>>>>>> Barry >>>>>>> >>>>>>> >>>>>>> >>>>>>> > On Oct 8, 2017, at 8:05 AM, zakaryah . wrote: >>>>>>> > >>>>>>> > I'm more confused than ever. I don't understand the output of >>>>>>> -snes_type test -snes_test_display.
>>>>>>> > >>>>>>> > For the user-defined state of the vector (where I'd like to test >>>>>>> the Jacobian), the finite difference Jacobian at row 0 evaluates as: >>>>>>> > >>>>>>> > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) >>>>>>> (4, 287.797) (5, 744.695) (9, -1454.66) (10, 6.08793) (11, 148.172) >>>>>>> (12, 13.1089) (13, -36.5783) (14, -9.99399) (27, -3423.49) (28, >>>>>>> -2175.34) (29, 548.662) (30, 145.753) (31, 17.6603) (32, -15.1079) >>>>>>> (36, 76.8575) (37, 16.325) (38, 4.83918) >>>>>>> > >>>>>>> > But the hand-coded Jacobian at row 0 evaluates as: >>>>>>> > >>>>>>> > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) >>>>>>> (4, 287.797) (5, 744.695) (33, -1454.66) (34, 6.08792) (35, 148.172) >>>>>>> (36, 13.1089) (37, -36.5783) (38, -9.99399) (231, -3423.49) (232, >>>>>>> -2175.34) (233, 548.662) (234, 145.753) (235, 17.6603) (236, -15.1079) >>>>>>> (264, 76.8575) (265, 16.325) (266, 4.83917) (267, 0.) (268, 0.) (269, >>>>>>> 0.) >>>>>>> > and the difference between the Jacobians at row 0 evaluates as: >>>>>>> > >>>>>>> > row 0: (0, 0.000189908) (1, 7.17315e-05) (2, 9.31778e-05) (3, >>>>>>> 0.000514947) (4, 0.000178659) (5, 0.000178217) (9, -2.25457e-05) (10, >>>>>>> -6.34278e-06) (11, -5.93241e-07) (12, 9.48544e-06) (13, 4.79709e-06) >>>>>>> (14, 2.40016e-06) (27, -0.000335696) (28, -0.000106734) (29, >>>>>>> -0.000106653) (30, 2.73119e-06) (31, -7.93382e-07) (32, 1.24048e-07) >>>>>>> (36, -4.0302e-06) (37, 3.67494e-06) (38, -2.70115e-06) (39, 0.) (40, >>>>>>> 0.) (41, 0.) >>>>>>> > >>>>>>> > The difference between the column numbering between the finite >>>>>>> difference and the hand-coded Jacobians looks like a serious problem to me, >>>>>>> but I'm probably missing something. >>>>>>> > >>>>>>> > I am trying to use a 3D DMDA with 3 dof, a box stencil of width 1, >>>>>>> and for this test problem the grid dimensions are 11x7x6. For a grid point >>>>>>> x,y,z, and dof c, is the index calculated as c + 3*x + 3*11*y + 3*11*7*z? >>>>>>> If so, then the column numbers of the hand-coded Jacobian match those of >>>>>>> the 27 point stencil I have in mind. However, I am then at a loss to >>>>>>> explain the column numbers in the finite difference Jacobian. >>>>>>> > >>>>>>> > >>>>>>> > >>>>>>> > >>>>>>> > On Sat, Oct 7, 2017 at 1:49 PM, zakaryah . >>>>>>> wrote: >>>>>>> > OK - I ran with -snes_monitor -snes_converged_reason >>>>>>> -snes_linesearch_monitor -ksp_monitor -ksp_monitor_true_residual >>>>>>> -ksp_converged_reason -pc_type svd -pc_svd_monitor -snes_type newtonls >>>>>>> -snes_compare_explicit >>>>>>> > >>>>>>> > and here is the full error message, output immediately after >>>>>>> > >>>>>>> > Finite difference Jacobian >>>>>>> > Mat Object: 24 MPI processes >>>>>>> > type: mpiaij >>>>>>> > >>>>>>> > [0]PETSC ERROR: --------------------- Error Message >>>>>>> -------------------------------------------------------------- >>>>>>> > >>>>>>> > [0]PETSC ERROR: Invalid argument >>>>>>> > >>>>>>> > [0]PETSC ERROR: Matrix not generated from a DMDA >>>>>>> > >>>>>>> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/d >>>>>>> ocumentation/faq.html for trouble shooting. 
>>>>>>> > >>>>>>> > [0]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017 >>>>>>> > >>>>>>> > [0]PETSC ERROR: ./CalculateOpticalFlow on a arch-linux2-c-opt >>>>>>> named node046.hpc.rockefeller.internal by zfrentz Sat Oct 7 >>>>>>> 13:44:44 2017 >>>>>>> > >>>>>>> > [0]PETSC ERROR: Configure options --prefix=/ru-auth/local/home/zfrentz/PETSc-3.7.6 >>>>>>> --download-fblaslapack -with-debugging=0 >>>>>>> > >>>>>>> > [0]PETSC ERROR: #1 MatView_MPI_DA() line 551 in >>>>>>> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/dm/impl >>>>>>> s/da/fdda.c >>>>>>> > >>>>>>> > [0]PETSC ERROR: #2 MatView() line 901 in >>>>>>> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/mat/int >>>>>>> erface/matrix.c >>>>>>> > >>>>>>> > [0]PETSC ERROR: #3 SNESComputeJacobian() line 2371 in >>>>>>> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/in >>>>>>> terface/snes.c >>>>>>> > >>>>>>> > [0]PETSC ERROR: #4 SNESSolve_NEWTONLS() line 228 in >>>>>>> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/im >>>>>>> pls/ls/ls.c >>>>>>> > >>>>>>> > [0]PETSC ERROR: #5 SNESSolve() line 4005 in >>>>>>> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/in >>>>>>> terface/snes.c >>>>>>> > >>>>>>> > [0]PETSC ERROR: #6 solveWarp3D() line 659 in >>>>>>> /ru-auth/local/home/zfrentz/Code/OpticalFlow/working/October >>>>>>> 6_2017/mshs.c >>>>>>> > >>>>>>> > >>>>>>> > On Tue, Oct 3, 2017 at 5:37 PM, Jed Brown >>>>>>> wrote: >>>>>>> > Always always always send the whole error message. >>>>>>> > >>>>>>> > "zakaryah ." writes: >>>>>>> > >>>>>>> > > I tried -snes_compare_explicit, and got the following error: >>>>>>> > > >>>>>>> > > [0]PETSC ERROR: Invalid argument >>>>>>> > > >>>>>>> > > [0]PETSC ERROR: Matrix not generated from a DMDA >>>>>>> > > >>>>>>> > > What am I doing wrong? >>>>>>> > > >>>>>>> > > On Tue, Oct 3, 2017 at 10:08 AM, Jed Brown >>>>>>> wrote: >>>>>>> > > >>>>>>> > >> Barry Smith writes: >>>>>>> > >> >>>>>>> > >> >> On Oct 3, 2017, at 5:54 AM, zakaryah . >>>>>>> wrote: >>>>>>> > >> >> >>>>>>> > >> >> I'm still working on this. I've made some progress, and it >>>>>>> looks like >>>>>>> > >> the issue is with the KSP, at least for now. The Jacobian may >>>>>>> be >>>>>>> > >> ill-conditioned. Is it possible to use -snes_test_display >>>>>>> during an >>>>>>> > >> intermediate step of the analysis? I would like to inspect the >>>>>>> Jacobian >>>>>>> > >> after several solves have already completed, >>>>>>> > >> > >>>>>>> > >> > No, our current code for testing Jacobians is poor >>>>>>> quality and >>>>>>> > >> poorly organized. Needs a major refactoring to do things >>>>>>> properly. Sorry >>>>>>> > >> >>>>>>> > >> You can use -snes_compare_explicit or -snes_compare_coloring to >>>>>>> output >>>>>>> > >> differences on each Newton step. >>>>>>> > >> >>>>>>> > >>>>>>> > >>>>>>> >>>>>>> >>>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>>> https://www.cse.buffalo.edu/~knepley/ >>>>> >>>>> >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead.
>>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Lukasz.Kaczmarczyk at glasgow.ac.uk Sun Oct 22 08:26:47 2017 From: Lukasz.Kaczmarczyk at glasgow.ac.uk (Lukasz Kaczmarczyk) Date: Sun, 22 Oct 2017 13:26:47 +0000 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: References: <98D45588-381F-44FB-9D20-65D1638E357F@mcs.anl.gov> <876F0ACF-62C0-4DC9-B203-2893B25E3938@mcs.anl.gov> <1D94CC99-2395-4BBE-A0F3-80D23D85FA45@mcs.anl.gov> <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> Message-ID: <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk> On 22 Oct 2017, at 03:16, zakaryah . > wrote: OK, it turns out Lukasz was exactly correct. With whatever method I try, the solver or stepper approaches a critical point, which is associated with some kind of snap-through. I have looked into the control techniques and they are pretty ingenious, and I think they should work for my problem, in that I hope to continue through the critical point. I have a technical question about the implementation, though. Following Riks 1979 for example, the control parameter is the approximate arc-length in the phase space of loading intensity and displacements. It represents one additional variable in the system, and there is one additional equation in the system (in Riks, this is eq. 3.9). In my implementation, the displacements are implemented as a DMDA with 3 dof, since I'm working in 3D. I'm not sure about the best way to add the single additional variable and equality. The way I see it, I either give up on using the DMDA, in which case I'm not sure how to efficiently implement the stencil I need to calculate spatial derivatives of the displacements, or I have to add a rather large number of extra variables. For example, if my DMDA is WxHxD, I would have to make it (W+1)xHxD, and each of the extra HxD variables will have 3 dof. Then 3xHxD-1 variables are in the nullspace (because they don't represent anything, so I would have to add a bunch of zeros to the function and the Jacobian), while the remaining variable is used as the control parameter. I'm aware of other methods, e.g. Crisfield 1983, but I'm interested in whether there is a straightforward way to implement Riks' method in PETSc. I'm sure I'm missing something so hopefully someone can give me some hints. Thanks for all the help! Zakaryah, If you would like to have a peek at how we do that, you can see http://mofem.eng.gla.ac.uk/mofem/html/_arc_length_tools_8hpp_source.html http://mofem.eng.gla.ac.uk/mofem/html/_arc_length_tools_8cpp_source.html The implementation has some features specific to MoFEM. However, you can follow the same idea: implement a shell matrix which adds a column and a row for the controlling and the controlled equation, respectively. This shell matrix has to have an operator for matrix-vector multiplication. Then you have to add a preconditioner, which is based on Riks and others.
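A bare-bones sketch of such a bordered shell operator, applying [K c; a^T b] to x = [u; lambda], is below. This is serial and schematic, not the actual MoFEM code; the ArcCtx struct and its fields are invented for the illustration:

typedef struct {
  Mat         K;  /* n x n tangent matrix                        */
  Vec         c;  /* extra column: derivative of F w.r.t. lambda */
  Vec         a;  /* extra row: gradient of the control equation */
  PetscScalar b;  /* d(control)/d(lambda)                        */
} ArcCtx;

static PetscErrorCode ArcMult(Mat A,Vec x,Vec y)
{
  ArcCtx            *ctx;
  Vec                xu,yu;
  const PetscScalar *xa;
  PetscScalar       *ya,adotu;
  PetscInt           n;
  PetscErrorCode     ierr;

  PetscFunctionBeginUser;
  ierr = MatShellGetContext(A,(void**)&ctx);CHKERRQ(ierr);
  ierr = VecGetSize(ctx->c,&n);CHKERRQ(ierr);
  ierr = VecGetArrayRead(x,&xa);CHKERRQ(ierr);
  ierr = VecGetArray(y,&ya);CHKERRQ(ierr);
  /* wrap the first n entries of x and y as Vecs without copying */
  ierr = VecCreateSeqWithArray(PETSC_COMM_SELF,1,n,xa,&xu);CHKERRQ(ierr);
  ierr = VecCreateSeqWithArray(PETSC_COMM_SELF,1,n,ya,&yu);CHKERRQ(ierr);
  ierr = MatMult(ctx->K,xu,yu);CHKERRQ(ierr);    /* y_u  = K u        */
  ierr = VecAXPY(yu,xa[n],ctx->c);CHKERRQ(ierr); /* y_u += lambda * c */
  ierr = VecDot(ctx->a,xu,&adotu);CHKERRQ(ierr);
  ya[n] = adotu + ctx->b*xa[n];                  /* control-equation row */
  ierr = VecDestroy(&xu);CHKERRQ(ierr);
  ierr = VecDestroy(&yu);CHKERRQ(ierr);
  ierr = VecRestoreArrayRead(x,&xa);CHKERRQ(ierr);
  ierr = VecRestoreArray(y,&ya);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

/* wiring (schematic): the bordered operator has global size n+1 */
MatCreateShell(PETSC_COMM_SELF,n+1,n+1,n+1,n+1,&ctx,&A);
MatShellSetOperation(A,MATOP_MULT,(void(*)(void))ArcMult);

In parallel one would split x and y with DMCompositeGetAccess() or VecGetSubVector() instead of raw arrays; the point is only that the extra row and column are applied without ever assembling the (n+1) x (n+1) matrix explicitly.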
In fact, you can use the FieldSplit preconditioner as well; the Riks method is a variant of a Schur complement. Such an implementation allows running a multigrid preconditioner, and other preconditioners, together with the control equation. Hope that this will be helpful. Regards, Lukasz On Thu, Oct 12, 2017 at 2:02 PM, zakaryah . > wrote: Thanks for the response, Matt - these are excellent questions. On theoretical grounds, I am certain that the solution to the continuous PDE exists. Without any serious treatment, I think this means the discretized system should have a solution up to discretization error, but perhaps this is indeed a bad approach. I am not sure whether the equations are "really hard to solve". At each point, the equations are third order polynomials of the state variable at that point and at nearby points (i.e. in the stencil). One possible complication is that the external forces which are applied to the interior of the material can be fairly complex - they are smooth, but they can have many inflection points. I don't have a great test case for which I know a good solution. To my thinking, there is no way that time-stepping the parabolic version of the same PDE can fail to yield a solution at infinite time. So, I'm going to try starting there. Converting the problem to a minimization is a bit trickier, because the discretization has to be performed one step earlier in the calculation, and therefore the gradient and Hessian would need to be recalculated. Even if there are some problems with time-stepping (speed of convergence?), maybe I can use the solutions as better test cases for the elliptic PDE solved via SNES. Can you give me any additional lingo or references for the fracture problem? Thanks, Zak On Wed, Oct 11, 2017 at 8:53 PM, Matthew Knepley > wrote: On Wed, Oct 11, 2017 at 11:33 AM, zakaryah . > wrote: Many thanks for the suggestions, Matt. I tried putting the solvers in a loop, like this: do { NewtonLS check convergence if (converged) break NRichardson or NGMRES } while (!converged) The results were interesting, to me at least. With NRichardson, there was indeed improvement in the residual norm, followed by improvement with NewtonLS, and so on for a few iterations of this loop. In each case, after a few iterations the NewtonLS appeared to be stuck in the same way as after the first iteration. Eventually neither method was able to reduce the residual norm, which was still significant, so this was not a total success. With NGMRES, the initial behavior was similar, but eventually the NGMRES progress became erratic. The minimal residual norm was a bit better using NGMRES than NRichardson, but neither combination of methods fully converged. For both NRichardson and NGMRES, I simply used the defaults, as I have no knowledge of how to tune the options for my problem. Are you certain that the equations have a solution? I become a little concerned when richardson stops converging. It's still possible you have really hard to solve equations, it just becomes less likely. And even if they truly are hard to solve, then there should be physical reasons for this. For example, it could be that discretizing the minimizing PDE is just the wrong thing to do. I believe this is the case in fracture, where you attack the minimization problem directly. Matt On Tue, Oct 10, 2017 at 4:08 PM, Matthew Knepley > wrote: On Tue, Oct 10, 2017 at 12:08 PM, zakaryah . > wrote: Thanks for clearing that up. I'd appreciate any further help.
Here's a summary: My ultimate goal is to find a vector field which minimizes an action. The action is a (nonlinear) function of the field and its first spatial derivatives. My current approach is to derive the (continuous) Euler-Lagrange equations, which results in a nonlinear PDE that the minimizing field must satisfy. These Euler-Lagrange equations are then discretized, and I'm trying to use an SNES to solve them. The problem is that the solver seems to reach a point at which the Jacobian (this corresponds to the second variation of the action, which is like a Hessian of the energy) becomes nearly singular, but where the residual (RHS of PDE) is not close to zero. The residual does not decrease over additional SNES iterations, and the line search results in tiny step sizes. My interpretation is that this point of stagnation is a critical point. The normal thing to do here (I think) is to engage solvers which do not depend on that particular point. So using NRichardson, or maybe NGMRES, to get past that. I would be interested to see if this is successful. Matt I have checked the hand-coded Jacobian very carefully and I am confident that it is correct. I am guessing that such a situation is well-known in the field, but I don't know the lingo or literature. If anyone has suggestions I'd be thrilled. Are there documentation/methodologies within PETSc for this type of situation? Is there any advantage to discretizing the action itself and using the optimization routines? With minor modifications I'll have the gradient and Hessian calculations coded. Are the optimization routines likely to stagnate in the same way as the nonlinear solver, or can they take advantage of the structure of the problem to overcome this? Thanks a lot in advance for any help. On Sun, Oct 8, 2017 at 5:57 AM, Barry Smith > wrote: There is apparently confusing in understanding the ordering. Is this all on one process that you get funny results? Are you using MatSetValuesStencil() to provide the matrix (it is generally easier than providing it yourself). In parallel MatView() always maps the rows and columns to the natural ordering before printing, if you use a matrix created from the DMDA. If you create the matrix yourself it has a different MatView in parallel that is in in thePETSc ordering.\ Barry > On Oct 8, 2017, at 8:05 AM, zakaryah . > wrote: > > I'm more confused than ever. I don't understand the output of -snes_type test -snes_test_display. > > For the user-defined state of the vector (where I'd like to test the Jacobian), the finite difference Jacobian at row 0 evaluates as: > > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, 287.797) (5, 744.695) (9, -1454.66) (10, 6.08793) (11, 148.172) (12, 13.1089) (13, -36.5783) (14, -9.99399) (27, -3423.49) (28, -2175.34) (29, 548.662) (30, 145.753) (31, 17.6603) (32, -15.1079) (36, 76.8575) (37, 16.325) (38, 4.83918) > > But the hand-coded Jacobian at row 0 evaluates as: > > row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, 287.797) (5, 744.695) (33, -1454.66) (34, 6.08792) (35, 148.172) (36, 13.1089) (37, -36.5783) (38, -9.99399) (231, -3423.49) (232, -2175.34) (233, 548.662) (234, 145.753) (235, 17.6603) (236, -15.1079) (264, 76.8575) (265, 16.325) (266, 4.83917) (267, 0.) (268, 0.) (269, 0.) 
> and the difference between the Jacobians at row 0 evaluates as: > > row 0: (0, 0.000189908) (1, 7.17315e-05) (2, 9.31778e-05) (3, 0.000514947) (4, 0.000178659) (5, 0.000178217) (9, -2.25457e-05) (10, -6.34278e-06) (11, -5.93241e-07) (12, 9.48544e-06) (13, 4.79709e-06) (14, 2.40016e-06) (27, -0.000335696) (28, -0.000106734) (29, -0.000106653) (30, 2.73119e-06) (31, -7.93382e-07) (32, 1.24048e-07) (36, -4.0302e-06) (37, 3.67494e-06) (38, -2.70115e-06) (39, 0.) (40, 0.) (41, 0.) > > The difference between the column numbering between the finite difference and the hand-coded Jacobians looks like a serious problem to me, but I'm probably missing something. > > I am trying to use a 3D DMDA with 3 dof, a box stencil of width 1, and for this test problem the grid dimensions are 11x7x6. For a grid point x,y,z, and dof c, is the index calculated as c + 3*x + 3*11*y + 3*11*7*z? If so, then the column numbers of the hand-coded Jacobian match those of the 27 point stencil I have in mind. However, I am then at a loss to explain the column numbers in the finite difference Jacobian. > > > > > On Sat, Oct 7, 2017 at 1:49 PM, zakaryah . > wrote: > OK - I ran with -snes_monitor -snes_converged_reason -snes_linesearch_monitor -ksp_monitor -ksp_monitor_true_residual -ksp_converged_reason -pc_type svd -pc_svd_monitor -snes_type newtonls -snes_compare_explicit > > and here is the full error message, output immediately after > > Finite difference Jacobian > Mat Object: 24 MPI processes > type: mpiaij > > [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > > [0]PETSC ERROR: Invalid argument > > [0]PETSC ERROR: Matrix not generated from a DMDA > > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. > > [0]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017 > > [0]PETSC ERROR: ./CalculateOpticalFlow on a arch-linux2-c-opt named node046.hpc.rockefeller.internal by zfrentz Sat Oct 7 13:44:44 2017 > > [0]PETSC ERROR: Configure options --prefix=/ru-auth/local/home/zfrentz/PETSc-3.7.6 --download-fblaslapack -with-debugging=0 > > [0]PETSC ERROR: #1 MatView_MPI_DA() line 551 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/dm/impls/da/fdda.c > > [0]PETSC ERROR: #2 MatView() line 901 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/mat/interface/matrix.c > > [0]PETSC ERROR: #3 SNESComputeJacobian() line 2371 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/interface/snes.c > > [0]PETSC ERROR: #4 SNESSolve_NEWTONLS() line 228 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/impls/ls/ls.c > > [0]PETSC ERROR: #5 SNESSolve() line 4005 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/interface/snes.c > > [0]PETSC ERROR: #6 solveWarp3D() line 659 in /ru-auth/local/home/zfrentz/Code/OpticalFlow/working/October6_2017/mshs.c > > > On Tue, Oct 3, 2017 at 5:37 PM, Jed Brown > wrote: > Always always always send the whole error message. > > "zakaryah ." > writes: > > > I tried -snes_compare_explicit, and got the following error: > > > > [0]PETSC ERROR: Invalid argument > > > > [0]PETSC ERROR: Matrix not generated from a DMDA > > > > What am I doing wrong? > > > > On Tue, Oct 3, 2017 at 10:08 AM, Jed Brown > wrote: > > > >> Barry Smith > writes: > >> > >> >> On Oct 3, 2017, at 5:54 AM, zakaryah . > wrote: > >> >> > >> >> I'm still working on this. I've made some progress, and it looks like > >> the issue is with the KSP, at least for now. 
The Jacobian may be > >> ill-conditioned. Is it possible to use -snes_test_display during an > >> intermediate step of the analysis? I would like to inspect the Jacobian > >> after several solves have already completed, > >> > > >> > No, our currently code for testing Jacobians is poor quality and > >> poorly organized. Needs a major refactoring to do things properly. Sorry > >> > >> You can use -snes_compare_explicit or -snes_compare_coloring to output > >> differences on each Newton step. > >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Sun Oct 22 10:41:29 2017 From: jed at jedbrown.org (Jed Brown) Date: Sun, 22 Oct 2017 09:41:29 -0600 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk> References: <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk> Message-ID: <877evnyyty.fsf@jedbrown.org> Alternatively, see DMComposite and src/snes/examples/tutorials/ex22.c. Lukasz Kaczmarczyk writes: > On 22 Oct 2017, at 03:16, zakaryah . > wrote: > > OK, it turns out Lukasz was exactly correct. With whatever method I try, the solver or stepper approaches a critical point, which is associated with some kind of snap-through. I have looked into the control techniques and they are pretty ingenious, and I think they should work for my problem, in that I hope to continue through the critical point. I have a technical question about the implementation, though. > > Following Riks 1979 for example, the control parameter is the approximate arc-length in the phase space of loading intensity and displacements. It represents one additional variable in the system, and there is one additional equation in the system (in Riks, this is eq. 3.9). > > In my implementation, the displacements are implemented as a DMDA with 3 dof, since I'm working in 3D. I'm not sure about the best way to add the single additional variable and equality. The way I see it, I either give up on using the DMDA, in which case I'm not sure how to efficiently implement the stencil I need to calculate spatial derivatives of the displacements, or I have to add a rather large number of extra variables. For example, if my DMDA is WxHxD, I would have to make it (W+1)xHxD, and each of the extra HxD variables will have 3 dof. Then 3xHxD-1 variables are in the nullspace (because they don't represent anything, so I would have to add a bunch of zeros to the function and the Jacobian), while the remaining variable is used as the control parameter. I'm aware of other methods, e.g. Crisfield 1983, but I'm interested in whether there is a straightforward way to implement Riks' method in PETSc. I'm sure I'm missing something so hopefully someone can give me some hints. > > Thanks for all the help! 
> > > Zakaryah, > > If you like to have a peek how we doing that, you can see > http://mofem.eng.gla.ac.uk/mofem/html/_arc_length_tools_8hpp_source.html > http://mofem.eng.gla.ac.uk/mofem/html/_arc_length_tools_8cpp_source.html > > The implementation is specific features related to MoFEM implementation. However, you can follow the same idea; implement shell matrix, which adds column and row with controlling and controlled equation, respectively, This shell matrix has to have an operator for matrix-vector multiplication. Then you have to add preconditioner, which is based on Riks and others. In fact you can use as well FieldSplit pre-conditioner, Riks method is some variant of Schur complement. > > Such implementation allows running multi-grid preconditioner and other preconditions with control equation. > > Hope that this will be helpful. > > Regards, > Lukasz > > > > On Thu, Oct 12, 2017 at 2:02 PM, zakaryah . > wrote: > Thanks for the response, Matt - these are excellent questions. > > On theoretical grounds, I am certain that the solution to the continuous PDE exists. Without any serious treatment, I think this means the discretized system should have a solution up to discretization error, but perhaps this is indeed a bad approach. > > I am not sure whether the equations are "really hard to solve". At each point, the equations are third order polynomials of the state variable at that point and at nearby points (i.e. in the stencil). One possible complication is that the external forces which are applied to the interior of the material can be fairly complex - they are smooth, but they can have many inflection points. > > I don't have a great test case for which I know a good solution. To my thinking, there is no way that time-stepping the parabolic version of the same PDE can fail to yield a solution at infinite time. So, I'm going to try starting there. Converting the problem to a minimization is a bit trickier, because the discretization has to be performed one step earlier in the calculation, and therefore the gradient and Hessian would need to be recalculated. > > Even if there are some problems with time-stepping (speed of convergence?), maybe I can use the solutions as better test cases for the elliptic PDE solved via SNES. > > Can you give me any additional lingo or references for the fracture problem? > > Thanks, Zak > > On Wed, Oct 11, 2017 at 8:53 PM, Matthew Knepley > wrote: > On Wed, Oct 11, 2017 at 11:33 AM, zakaryah . > wrote: > Many thanks for the suggestions, Matt. > > I tried putting the solvers in a loop, like this: > > do { > NewtonLS > check convergence > if (converged) break > NRichardson or NGMRES > } while (!converged) > > The results were interesting, to me at least. With NRichardson, there was indeed improvement in the residual norm, followed by improvement with NewtonLS, and so on for a few iterations of this loop. In each case, after a few iterations the NewtonLS appeared to be stuck in the same way as after the first iteration. Eventually neither method was able to reduce the residual norm, which was still significant, so this was not a total success. With NGMRES, the initial behavior was similar, but eventually the NGMRES progress became erratic. The minimal residual norm was a bit better using NGMRES than NRichardson, but neither combination of methods fully converged. For both NRichardson and NGMRES, I simply used the defaults, as I have no knowledge of how to tune the options for my problem. > > Are you certain that the equations have a solution? 
I become a little concerned when richardson stops converging. Its > still possible you have really hard to solve equations, it just becomes less likely. And even if they truly are hard to solve, > then there should be physical reasons for this. For example, it could be that discretizing the minimizing PDE is just the > wrong thing to do. I believe this is the case in fracture, where you attack the minimization problem directly. > > Matt > > On Tue, Oct 10, 2017 at 4:08 PM, Matthew Knepley > wrote: > On Tue, Oct 10, 2017 at 12:08 PM, zakaryah . > wrote: > Thanks for clearing that up. > > I'd appreciate any further help. Here's a summary: > > My ultimate goal is to find a vector field which minimizes an action. The action is a (nonlinear) function of the field and its first spatial derivatives. > > My current approach is to derive the (continuous) Euler-Lagrange equations, which results in a nonlinear PDE that the minimizing field must satisfy. These Euler-Lagrange equations are then discretized, and I'm trying to use an SNES to solve them. > > The problem is that the solver seems to reach a point at which the Jacobian (this corresponds to the second variation of the action, which is like a Hessian of the energy) becomes nearly singular, but where the residual (RHS of PDE) is not close to zero. The residual does not decrease over additional SNES iterations, and the line search results in tiny step sizes. My interpretation is that this point of stagnation is a critical point. > > The normal thing to do here (I think) is to engage solvers which do not depend on that particular point. So using > NRichardson, or maybe NGMRES, to get past that. I would be interested to see if this is successful. > > Matt > > I have checked the hand-coded Jacobian very carefully and I am confident that it is correct. > > I am guessing that such a situation is well-known in the field, but I don't know the lingo or literature. If anyone has suggestions I'd be thrilled. Are there documentation/methodologies within PETSc for this type of situation? > > Is there any advantage to discretizing the action itself and using the optimization routines? With minor modifications I'll have the gradient and Hessian calculations coded. Are the optimization routines likely to stagnate in the same way as the nonlinear solver, or can they take advantage of the structure of the problem to overcome this? > > Thanks a lot in advance for any help. > > On Sun, Oct 8, 2017 at 5:57 AM, Barry Smith > wrote: > > There is apparently confusing in understanding the ordering. Is this all on one process that you get funny results? Are you using MatSetValuesStencil() to provide the matrix (it is generally easier than providing it yourself). In parallel MatView() always maps the rows and columns to the natural ordering before printing, if you use a matrix created from the DMDA. If you create the matrix yourself it has a different MatView in parallel that is in in thePETSc ordering.\ > > > Barry > > > >> On Oct 8, 2017, at 8:05 AM, zakaryah . > wrote: >> >> I'm more confused than ever. I don't understand the output of -snes_type test -snes_test_display. 
>> >> For the user-defined state of the vector (where I'd like to test the Jacobian), the finite difference Jacobian at row 0 evaluates as: >> >> row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, 287.797) (5, 744.695) (9, -1454.66) (10, 6.08793) (11, 148.172) (12, 13.1089) (13, -36.5783) (14, -9.99399) (27, -3423.49) (28, -2175.34) (29, 548.662) (30, 145.753) (31, 17.6603) (32, -15.1079) (36, 76.8575) (37, 16.325) (38, 4.83918) >> >> But the hand-coded Jacobian at row 0 evaluates as: >> >> row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, 287.797) (5, 744.695) (33, -1454.66) (34, 6.08792) (35, 148.172) (36, 13.1089) (37, -36.5783) (38, -9.99399) (231, -3423.49) (232, -2175.34) (233, 548.662) (234, 145.753) (235, 17.6603) (236, -15.1079) (264, 76.8575) (265, 16.325) (266, 4.83917) (267, 0.) (268, 0.) (269, 0.) >> and the difference between the Jacobians at row 0 evaluates as: >> >> row 0: (0, 0.000189908) (1, 7.17315e-05) (2, 9.31778e-05) (3, 0.000514947) (4, 0.000178659) (5, 0.000178217) (9, -2.25457e-05) (10, -6.34278e-06) (11, -5.93241e-07) (12, 9.48544e-06) (13, 4.79709e-06) (14, 2.40016e-06) (27, -0.000335696) (28, -0.000106734) (29, -0.000106653) (30, 2.73119e-06) (31, -7.93382e-07) (32, 1.24048e-07) (36, -4.0302e-06) (37, 3.67494e-06) (38, -2.70115e-06) (39, 0.) (40, 0.) (41, 0.) >> >> The difference between the column numbering between the finite difference and the hand-coded Jacobians looks like a serious problem to me, but I'm probably missing something. >> >> I am trying to use a 3D DMDA with 3 dof, a box stencil of width 1, and for this test problem the grid dimensions are 11x7x6. For a grid point x,y,z, and dof c, is the index calculated as c + 3*x + 3*11*y + 3*11*7*z? If so, then the column numbers of the hand-coded Jacobian match those of the 27 point stencil I have in mind. However, I am then at a loss to explain the column numbers in the finite difference Jacobian. >> >> >> >> >> On Sat, Oct 7, 2017 at 1:49 PM, zakaryah . > wrote: >> OK - I ran with -snes_monitor -snes_converged_reason -snes_linesearch_monitor -ksp_monitor -ksp_monitor_true_residual -ksp_converged_reason -pc_type svd -pc_svd_monitor -snes_type newtonls -snes_compare_explicit >> >> and here is the full error message, output immediately after >> >> Finite difference Jacobian >> Mat Object: 24 MPI processes >> type: mpiaij >> >> [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- >> >> [0]PETSC ERROR: Invalid argument >> >> [0]PETSC ERROR: Matrix not generated from a DMDA >> >> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. 
>> >> [0]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017 >> >> [0]PETSC ERROR: ./CalculateOpticalFlow on a arch-linux2-c-opt named node046.hpc.rockefeller.internal by zfrentz Sat Oct 7 13:44:44 2017 >> >> [0]PETSC ERROR: Configure options --prefix=/ru-auth/local/home/zfrentz/PETSc-3.7.6 --download-fblaslapack -with-debugging=0 >> >> [0]PETSC ERROR: #1 MatView_MPI_DA() line 551 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/dm/impls/da/fdda.c >> >> [0]PETSC ERROR: #2 MatView() line 901 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/mat/interface/matrix.c >> >> [0]PETSC ERROR: #3 SNESComputeJacobian() line 2371 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/interface/snes.c >> >> [0]PETSC ERROR: #4 SNESSolve_NEWTONLS() line 228 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/impls/ls/ls.c >> >> [0]PETSC ERROR: #5 SNESSolve() line 4005 in /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/interface/snes.c >> >> [0]PETSC ERROR: #6 solveWarp3D() line 659 in /ru-auth/local/home/zfrentz/Code/OpticalFlow/working/October6_2017/mshs.c >> >> >> On Tue, Oct 3, 2017 at 5:37 PM, Jed Brown > wrote: >> Always always always send the whole error message. >> >> "zakaryah ." > writes: >> >> > I tried -snes_compare_explicit, and got the following error: >> > >> > [0]PETSC ERROR: Invalid argument >> > >> > [0]PETSC ERROR: Matrix not generated from a DMDA >> > >> > What am I doing wrong? >> > >> > On Tue, Oct 3, 2017 at 10:08 AM, Jed Brown > wrote: >> > >> >> Barry Smith > writes: >> >> >> >> >> On Oct 3, 2017, at 5:54 AM, zakaryah . > wrote: >> >> >> >> >> >> I'm still working on this. I've made some progress, and it looks like >> >> the issue is with the KSP, at least for now. The Jacobian may be >> >> ill-conditioned. Is it possible to use -snes_test_display during an >> >> intermediate step of the analysis? I would like to inspect the Jacobian >> >> after several solves have already completed, >> >> > >> >> > No, our current code for testing Jacobians is poor quality and >> >> poorly organized. Needs a major refactoring to do things properly. Sorry >> >> >> >> You can use -snes_compare_explicit or -snes_compare_coloring to output >> >> differences on each Newton step. >> >> >> >> > > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ From bsmith at mcs.anl.gov Sun Oct 22 11:11:16 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sun, 22 Oct 2017 11:11:16 -0500 Subject: [petsc-users] HYPRE hanging or slow? from observation In-Reply-To: References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov> <64D7FC90-EBE8-409C-B300-04A118155A98@mcs.anl.gov> <5DDD356C-2000-4803-9F3A-02B429898A7A@mcs.anl.gov> Message-ID: > On Oct 21, 2017, at 11:16 PM, Hao Zhang wrote: > > the reason is that when I do a finer grid simulation, the matrix becomes stiffer. Are you saying that for a finer grid but everything else the same, the convergence of hypre (with the same GMRES) with the same options gets much worse? This normally will not happen; that is the fundamental beauty of multigrid methods (when they work well).
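One cheap way to check this is to rerun the identical problem at a few resolutions and compare iteration counts. Sketched here with an invented -grid_n option and executable name, but standard PETSc options:

  mpirun -n 24 ./solver -grid_n 32 -ksp_type gmres -pc_type hypre -pc_hypre_type boomeramg -ksp_monitor_true_residual -ksp_converged_reason
  mpirun -n 24 ./solver -grid_n 48 ... (same options)
  mpirun -n 24 ./solver -grid_n 64 ... (same options)

With a healthy BoomerAMG setup the iteration count should stay nearly flat as the grid is refined; a count that blows up at some resolution points to the mesh, the physics, or a bug rather than to hypre itself.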
Yes, the matrix condition number increases, but multigrid doesn't care about that; its number of iterations should remain pretty much the same. Something must be different (with this finer grid case): either the mesh becomes horrible, or the physics changes, or there are errors in the code that lead to the problem. What happens if you just refine the mesh a little? Then a little more? Then a little more? Does the convergence rate suddenly go bad at some point, or does it just get worse slowly? Barry > Much larger condition number. Just to give you some perspective, it will take 6000 iterations to converge, and the solver does converge. I want to reduce the number of iterations while keeping the convergence rate; that's the main motivation for all this heavy lifting. Please advise. A snippet will be provided upon request. > > Thanks again. > > On Sun, Oct 22, 2017 at 12:08 AM, Barry Smith wrote: > > Oh, you change KSP but not hypre. I did not understand this. > > Why not just use GMRES all the time? Why mess with BCGS if it is not robust? Not worth the small optimization if it breaks everything. > > Barry > > > > > On Oct 21, 2017, at 11:05 PM, Hao Zhang wrote: > > > > this is the initial pressure solver output regarding use of PETSc. It failed to converge after 40000 iterations, so GMRES is used afterwards. > > > > 39987 KSP preconditioned resid norm 3.853125986269e-08 true resid norm 1.147359212216e-05 ||r(i)||/||b|| 1.557696568706e-06 > > 39988 KSP preconditioned resid norm 3.853126044003e-08 true resid norm 1.147359257282e-05 ||r(i)||/||b|| 1.557696629889e-06 > > 39989 KSP preconditioned resid norm 3.853126052100e-08 true resid norm 1.147359233695e-05 ||r(i)||/||b|| 1.557696597866e-06 > > 39990 KSP preconditioned resid norm 3.853126027357e-08 true resid norm 1.147359219860e-05 ||r(i)||/||b|| 1.557696579083e-06 > > 39991 KSP preconditioned resid norm 3.853126058478e-08 true resid norm 1.147359234281e-05 ||r(i)||/||b|| 1.557696598662e-06 > > 39992 KSP preconditioned resid norm 3.853126064006e-08 true resid norm 1.147359261420e-05 ||r(i)||/||b|| 1.557696635506e-06 > > 39993 KSP preconditioned resid norm 3.853126050203e-08 true resid norm 1.147359235972e-05 ||r(i)||/||b|| 1.557696600957e-06 > > 39994 KSP preconditioned resid norm 3.853126050182e-08 true resid norm 1.147359253713e-05 ||r(i)||/||b|| 1.557696625043e-06 > > 39995 KSP preconditioned resid norm 3.853125976795e-08 true resid norm 1.147359226222e-05 ||r(i)||/||b|| 1.557696587720e-06 > > 39996 KSP preconditioned resid norm 3.853125805127e-08 true resid norm 1.147359262747e-05 ||r(i)||/||b|| 1.557696637308e-06 > > 39997 KSP preconditioned resid norm 3.853125811756e-08 true resid norm 1.147359216008e-05 ||r(i)||/||b|| 1.557696573853e-06 > > 39998 KSP preconditioned resid norm 3.853125827833e-08 true resid norm 1.147359238372e-05 ||r(i)||/||b|| 1.557696604216e-06 > > 39999 KSP preconditioned resid norm 3.853127937068e-08 true resid norm 1.147359264043e-05 ||r(i)||/||b|| 1.557696639067e-06 > > 40000 KSP preconditioned resid norm 3.853122732867e-08 true resid norm 1.147359257776e-05 ||r(i)||/||b|| 1.557696630559e-06 > > Linear solve did not converge due to DIVERGED_ITS iterations 40000 > > KSP Object: 24 MPI processes > > type: bcgs > > maximum iterations=40000, initial guess is zero > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000.
> > left preconditioning > > using PRECONDITIONED norm type for convergence test > > PC Object: 24 MPI processes > > type: hypre > > HYPRE BoomerAMG preconditioning > > Cycle type V > > Maximum number of levels 25 > > Maximum number of iterations PER hypre call 1 > > Convergence tolerance PER hypre call 0. > > Threshold for strong coupling 0.25 > > Interpolation truncation factor 0. > > Interpolation: max elements per row 0 > > Number of levels of aggressive coarsening 0 > > Number of paths for aggressive coarsening 1 > > Maximum row sums 0.9 > > Sweeps down 1 > > Sweeps up 1 > > Sweeps on coarse 1 > > Relax down symmetric-SOR/Jacobi > > Relax up symmetric-SOR/Jacobi > > Relax on coarse Gaussian-elimination > > Relax weight (all) 1. > > Outer relax weight (all) 1. > > Using CF-relaxation > > Not using more complex smoothers. > > Measure type local > > Coarsen type Falgout > > Interpolation type classical > > linear system matrix = precond matrix: > > Mat Object: A 24 MPI processes > > type: mpiaij > > rows=497664, cols=497664 > > total: nonzeros=3363552, allocated nonzeros=6967296 > > total number of mallocs used during MatSetValues calls =0 > > has attached null space > > not using I-node (on process 0) routines > > > > The solution diverges for p0! The residual is 3.853123e-08. Solve again using GMRES! > > KSP Object: 24 MPI processes > > type: gmres > > restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement > > happy breakdown tolerance 1e-30 > > maximum iterations=40000, initial guess is zero > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. > > left preconditioning > > using PRECONDITIONED norm type for convergence test > > PC Object: 24 MPI processes > > type: hypre > > HYPRE BoomerAMG preconditioning > > Cycle type V > > Maximum number of levels 25 > > Maximum number of iterations PER hypre call 1 > > Convergence tolerance PER hypre call 0. > > Threshold for strong coupling 0.25 > > Interpolation truncation factor 0. > > Interpolation: max elements per row 0 > > Number of levels of aggressive coarsening 0 > > Number of paths for aggressive coarsening 1 > > Maximum row sums 0.9 > > Sweeps down 1 > > Sweeps up 1 > > Sweeps on coarse 1 > > Relax down symmetric-SOR/Jacobi > > Relax up symmetric-SOR/Jacobi > > Relax on coarse Gaussian-elimination > > Relax weight (all) 1. > > Outer relax weight (all) 1. > > Using CF-relaxation > > Not using more complex smoothers. 
> > Measure type local > > Coarsen type Falgout > > Interpolation type classical > > linear system matrix = precond matrix: > > Mat Object: A 24 MPI processes > > type: mpiaij > > rows=497664, cols=497664 > > total: nonzeros=3363552, allocated nonzeros=6967296 > > total number of mallocs used during MatSetValues calls =0 > > has attached null space > > not using I-node (on process 0) routines > > 0 KSP preconditioned resid norm 1.593802941804e+01 true resid norm 7.365742695119e+00 ||r(i)||/||b|| 1.000000000000e+00 > > 1 KSP preconditioned resid norm 6.338666661133e-01 true resid norm 2.880722209358e+00 ||r(i)||/||b|| 3.910973174867e-01 > > 2 KSP preconditioned resid norm 3.913420828350e-02 true resid norm 9.544089760671e-01 ||r(i)||/||b|| 1.295740315093e-01 > > 3 KSP preconditioned resid norm 2.928070366435e-03 true resid norm 1.874294004628e-01 ||r(i)||/||b|| 2.544609664236e-02 > > 4 KSP preconditioned resid norm 2.165607525823e-04 true resid norm 3.121122463949e-02 ||r(i)||/||b|| 4.237349298146e-03 > > 5 KSP preconditioned resid norm 1.635476480407e-05 true resid norm 3.984315313831e-03 ||r(i)||/||b|| 5.409251284967e-04 > > 6 KSP preconditioned resid norm 1.283358350575e-06 true resid norm 4.566583802915e-04 ||r(i)||/||b|| 6.199760149022e-05 > > 7 KSP preconditioned resid norm 8.479469225747e-08 true resid norm 3.824581791112e-05 ||r(i)||/||b|| 5.192391248810e-06 > > 8 KSP preconditioned resid norm 5.358636504683e-09 true resid norm 2.829730442033e-06 ||r(i)||/||b|| 3.841744898187e-07 > > 9 KSP preconditioned resid norm 3.447874504193e-10 true resid norm 1.756617036538e-07 ||r(i)||/||b|| 2.384847135242e-08 > > 10 KSP preconditioned resid norm 2.480228743414e-11 true resid norm 1.309399823577e-08 ||r(i)||/||b|| 1.777688792258e-09 > > 11 KSP preconditioned resid norm 1.728967759950e-12 true resid norm 9.406967045789e-10 ||r(i)||/||b|| 1.277124036931e-10 > > 12 KSP preconditioned resid norm 1.075458632828e-13 true resid norm 5.994505136212e-11 ||r(i)||/||b|| 8.138358050689e-12 > > Linear solve converged due to CONVERGED_RTOL iterations 12 > > KSP Object: 24 MPI processes > > type: gmres > > restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement > > happy breakdown tolerance 1e-30 > > maximum iterations=40000, initial guess is zero > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. > > left preconditioning > > using PRECONDITIONED norm type for convergence test > > PC Object: 24 MPI processes > > type: hypre > > HYPRE BoomerAMG preconditioning > > Cycle type V > > Maximum number of levels 25 > > Maximum number of iterations PER hypre call 1 > > Convergence tolerance PER hypre call 0. > > Threshold for strong coupling 0.25 > > Interpolation truncation factor 0. > > Interpolation: max elements per row 0 > > Number of levels of aggressive coarsening 0 > > Number of paths for aggressive coarsening 1 > > Maximum row sums 0.9 > > Sweeps down 1 > > Sweeps up 1 > > Sweeps on coarse 1 > > Relax down symmetric-SOR/Jacobi > > Relax up symmetric-SOR/Jacobi > > Relax on coarse Gaussian-elimination > > Relax weight (all) 1. > > Outer relax weight (all) 1. > > Using CF-relaxation > > Not using more complex smoothers. 
> > Measure type local > > Coarsen type Falgout > > Interpolation type classical > > linear system matrix = precond matrix: > > Mat Object: A 24 MPI processes > > type: mpiaij > > rows=497664, cols=497664 > > total: nonzeros=3363552, allocated nonzeros=6967296 > > total number of mallocs used during MatSetValues calls =0 > > has attached null space > > not using I-node (on process 0) routines > > The max singular value of A = 1.000872 in poisson_solver3d_P0_vd > > The min singular value of A = 0.667688 in poisson_solver3d_P0_vd > > The Cond Num of A = 1.499012 in poisson_solver3d_P0_vd > > In poisson_solver3d_pressure(): num_iter = 12, rel_residual = 1.075459e-13 > > > > The max value of p0 is 0.03115845493408858 > > > > The min value of p0 is -0.07156715468428149 > > > > On Sun, Oct 22, 2017 at 12:00 AM, Barry Smith wrote: > > > > > On Oct 21, 2017, at 10:50 PM, Hao Zhang wrote: > > > > > > the incompressible NS solver algorithm call PETSc solver at different stage of each time step. The one you were saying "This is good. 12 digit reduction" is after the initial pressure solver, in which usually HYPRE doesn't give a good convergence, so the fall-back solver GMRES will be called after. > > > > Hmm, I don't understand. hypre should do well on a pressure solve. In fact, very well. > > > > > > Barry, you were mentioning that I could have a wrong nullspace. that particular solver is aimed to give an initial pressure profile for 3d incompressible NS simulation using all neumann boundary conditions. could you give some insight how to test if I have a wrong nullspace etc? > > > > -ksp_test_null_space > > > > But if your null space is consistently from all Neumann boundary conditions then it likely is not wrong. > > > > Barry > > > > > > > > Thanks! > > > > > > On Sat, Oct 21, 2017 at 11:41 PM, Barry Smith wrote: > > > > > > This is good. You get more than 12 digit reduction in the true residual norm. This is good AMG convergence. Expected when everything goes well. > > > > > > What is different in this case from the previous case that does not converge reasonably? > > > > > > Barry > > > > > > > > > > On Oct 21, 2017, at 9:29 PM, Hao Zhang wrote: > > > > > > > > Barry, Please advise what you make of this? this is poisson solver with all neumann BC 3d case Finite difference Scheme was used. > > > > Thanks! I'm in learning mode. > > > > > > > > KSP Object: 24 MPI processes > > > > type: bcgs > > > > maximum iterations=40000, initial guess is zero > > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. > > > > left preconditioning > > > > using PRECONDITIONED norm type for convergence test > > > > PC Object: 24 MPI processes > > > > type: hypre > > > > HYPRE BoomerAMG preconditioning > > > > Cycle type V > > > > Maximum number of levels 25 > > > > Maximum number of iterations PER hypre call 1 > > > > Convergence tolerance PER hypre call 0. > > > > Threshold for strong coupling 0.25 > > > > Interpolation truncation factor 0. > > > > Interpolation: max elements per row 0 > > > > Number of levels of aggressive coarsening 0 > > > > Number of paths for aggressive coarsening 1 > > > > Maximum row sums 0.9 > > > > Sweeps down 1 > > > > Sweeps up 1 > > > > Sweeps on coarse 1 > > > > Relax down symmetric-SOR/Jacobi > > > > Relax up symmetric-SOR/Jacobi > > > > Relax on coarse Gaussian-elimination > > > > Relax weight (all) 1. > > > > Outer relax weight (all) 1. > > > > Using CF-relaxation > > > > Not using more complex smoothers. 
> > > > > > > > Thanks! > > > > On Sat, Oct 21, 2017 at 11:41 PM, Barry Smith wrote: > > > > > > This is good. You get more than 12 digit reduction in the true residual norm. This is good AMG convergence. Expected when everything goes well. > > > > > > What is different in this case from the previous case that does not converge reasonably? > > > > > > Barry > > > > > > > > > > On Oct 21, 2017, at 9:29 PM, Hao Zhang wrote: > > > > > > > > Barry, please advise what you make of this. This is a Poisson solver for a 3d case with all Neumann BCs; a finite difference scheme was used. > > > > Thanks! I'm in learning mode. > > > > KSP Object: 24 MPI processes > > > > type: bcgs > > > > maximum iterations=40000, initial guess is zero > > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. > > > > left preconditioning > > > > using PRECONDITIONED norm type for convergence test > > > > PC Object: 24 MPI processes > > > > type: hypre > > > > HYPRE BoomerAMG preconditioning > > > > Cycle type V > > > > Maximum number of levels 25 > > > > Maximum number of iterations PER hypre call 1 > > > > Convergence tolerance PER hypre call 0. > > > > Threshold for strong coupling 0.25 > > > > Interpolation truncation factor 0. > > > > Interpolation: max elements per row 0 > > > > Number of levels of aggressive coarsening 0 > > > > Number of paths for aggressive coarsening 1 > > > > Maximum row sums 0.9 > > > > Sweeps down 1 > > > > Sweeps up 1 > > > > Sweeps on coarse 1 > > > > Relax down symmetric-SOR/Jacobi > > > > Relax up symmetric-SOR/Jacobi > > > > Relax on coarse Gaussian-elimination > > > > Relax weight (all) 1. > > > > Outer relax weight (all) 1. > > > > Using CF-relaxation > > > > Not using more complex smoothers. > > > > Measure type local > > > > Coarsen type Falgout > > > > Interpolation type classical > > > > linear system matrix = precond matrix: > > > > Mat Object: A 24 MPI processes > > > > type: mpiaij > > > > rows=497664, cols=497664 > > > > total: nonzeros=3363552, allocated nonzeros=6967296 > > > > total number of mallocs used during MatSetValues calls =0 > > > > has attached null space > > > > not using I-node (on process 0) routines > > > > 0 KSP preconditioned resid norm 2.697270170623e-02 true resid norm 9.637159071207e+00 ||r(i)||/||b|| 1.000000000000e+00 > > > > 1 KSP preconditioned resid norm 4.828857674609e-04 true resid norm 6.294379664645e-01 ||r(i)||/||b|| 6.531364293291e-02 > > > > 2 KSP preconditioned resid norm 4.533649595815e-06 true resid norm 1.135508605857e-02 ||r(i)||/||b|| 1.178260727531e-03 > > > > 3 KSP preconditioned resid norm 1.131704082606e-07 true resid norm 1.196393029874e-04 ||r(i)||/||b|| 1.241437462051e-05 > > > > 4 KSP preconditioned resid norm 3.866281379569e-10 true resid norm 5.121520801846e-07 ||r(i)||/||b|| 5.314347064320e-08 > > > > 5 KSP preconditioned resid norm 1.070114785241e-11 true resid norm 1.203693733135e-08 ||r(i)||/||b|| 1.249013038221e-09 > > > > 6 KSP preconditioned resid norm 2.578780418765e-14 true resid norm 6.020297525927e-11 ||r(i)||/||b|| 6.246962908306e-12 > > > > 7 KSP preconditioned resid norm 8.691764190203e-16 true resid norm 1.866088098154e-12 ||r(i)||/||b|| 1.936346680973e-13 > > > > Linear solve converged due to CONVERGED_ATOL iterations 7 > > > > > > > > > > > > > > > > On Sat, Oct 21, 2017 at 6:53 PM, Barry Smith wrote: > > > > > On Oct 21, 2017, at 5:47 PM, Hao Zhang wrote: > > > > > > hi, Barry: > > > > > what do you mean by "absurd" regarding setting tolerance = 1e-14? > > > > Trying to decrease the initial residual norm down by a factor of 1e-14 with an iterative method (or even a direct method) is unrealistic (usually unachievable) and almost never necessary. You are requiring || r_n || < 1.e-14 || r_0 || when with double precision numbers you only have roughly 14 decimal digits total to compute with. Round off alone will lead to differences far larger than 1e-14. > > > > If you are using the solver in the context of a nonlinear problem (i.e. inside Newton's method) then 1.e-6 is generally more than plenty to get quadratic convergence of Newton's method. > > > > If you are solving a linear problem then it is extremely likely that errors due to discretization (from the finite element method etc) and the model are much much larger than even 1.e-8. > > > > So, in summary > > > > 1.e-14 is probably unachievable > > > > 1.e-14 is almost for sure not needed. > > > > Barry
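A minimal sketch of setting the more realistic tolerance Barry suggests, assuming an already-created KSP named ksp; the remaining arguments keep their defaults:

    ierr = KSPSetTolerances(ksp,1.e-6,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);CHKERRQ(ierr);  /* rtol = 1e-6; abstol, dtol, maxits unchanged */

The command-line equivalent is -ksp_rtol 1e-6, which takes precedence if KSPSetFromOptions() is called after this.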
> > > > > > > > > > > > > On Sat, Oct 21, 2017 at 18:42 Barry Smith wrote: > > > > > Run with -ksp_view_mat binary -ksp_view_rhs binary and send the resulting output file called binaryoutput to petsc-maint at mcs.anl.gov > > > > > Note you can also use -ksp_type gmres with hypre, unlikely to be a reason to use bcgs > > > > > BTW: tolerances: relative=1e-14, is absurd > > > > > My guess is your null space is incorrect. > > > > > > > > > > > > > > On Oct 21, 2017, at 4:34 PM, Hao Zhang wrote: > > > > > > if this solver doesn't converge, I have a fall-back solution, which uses the GMRES solver. This setup is fine with me; I just want to know if HYPRE is a reliable solution for me. Or I will have to go without a preconditioner. > > > > > > Thanks! > > > > > > On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang wrote: > > > > > > this is a serial run, still dumping output; parallel is more or less the same. > > > > > > KSP Object: 1 MPI processes > > > > > > type: bcgs > > > > > > maximum iterations=40000, initial guess is zero > > > > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. > > > > > > left preconditioning > > > > > > using PRECONDITIONED norm type for convergence test > > > > > > PC Object: 1 MPI processes > > > > > > type: hypre > > > > > > HYPRE BoomerAMG preconditioning > > > > > > Cycle type V > > > > > > Maximum number of levels 25 > > > > > > Maximum number of iterations PER hypre call 1 > > > > > > Convergence tolerance PER hypre call 0. > > > > > > Threshold for strong coupling 0.25 > > > > > > Interpolation truncation factor 0. > > > > > > Interpolation: max elements per row 0 > > > > > > Number of levels of aggressive coarsening 0 > > > > > > Number of paths for aggressive coarsening 1 > > > > > > Maximum row sums 0.9 > > > > > > Sweeps down 1 > > > > > > Sweeps up 1 > > > > > > Sweeps on coarse 1 > > > > > > Relax down symmetric-SOR/Jacobi > > > > > > Relax up symmetric-SOR/Jacobi > > > > > > Relax on coarse Gaussian-elimination > > > > > > Relax weight (all) 1. > > > > > > Outer relax weight (all) 1. > > > > > > Using CF-relaxation > > > > > > Not using more complex smoothers. > > > > > > Measure type local > > > > > > Coarsen type Falgout > > > > > > Interpolation type classical > > > > > > linear system matrix = precond matrix: > > > > > > Mat Object: A 1 MPI processes > > > > > > type: seqaij > > > > > > rows=497664, cols=497664 > > > > > > total: nonzeros=3363552, allocated nonzeros=3483648 > > > > > > total number of mallocs used during MatSetValues calls =0 > > > > > > has attached null space > > > > > > not using I-node routines > > > > > > 0 KSP preconditioned resid norm 1.630377897956e+01 true resid norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00 > > > > > > 1 KSP preconditioned resid norm 3.815819909205e-02 true resid norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01 > > > > > > 2 KSP preconditioned resid norm 4.312671277701e-04 true resid norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03 > > > > > > 3 KSP preconditioned resid norm 3.011875330569e-04 true resid norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05 > > > > > > 4 KSP preconditioned resid norm 3.011783821295e-04 true resid norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05 > > > > > > 5 KSP preconditioned resid norm 3.011783818372e-04 true resid norm 2.504053673004e-04 ||r(i)||/||b|| 3.399594279422e-05 > > > > > > 6 KSP preconditioned resid norm 3.011783818375e-04 true resid norm 2.503984078890e-04 ||r(i)||/||b|| 3.399499795925e-05 > > > > > > 7 KSP preconditioned resid norm 3.011783887442e-04 true resid norm 2.504121501772e-04 ||r(i)||/||b|| 3.399686366224e-05 > > > > > > 8 KSP preconditioned resid norm 3.010913654181e-04 true resid norm 2.504150259031e-04 ||r(i)||/||b|| 3.399725408124e-05 > > > > > > 9 KSP preconditioned resid norm 3.006520688232e-04 true resid norm 2.504061607382e-04 ||r(i)||/||b|| 3.399605051423e-05 > > > > > > 10 KSP preconditioned resid norm 3.007309991942e-04 true resid norm 2.503843638523e-04 ||r(i)||/||b|| 3.399309128978e-05 > > > > > > 11 KSP preconditioned resid norm 3.015946168077e-04 true resid norm 2.503644677844e-04 ||r(i)||/||b|| 3.399039012728e-05 > > > > > > 12 KSP
preconditioned resid norm 2.956643907377e-04 true resid norm 2.503863046509e-04 ||r(i)||/||b|| 3.399335477965e-05 > > > > > > 13 KSP preconditioned resid norm 2.997992358936e-04 true resid norm 2.504336903093e-04 ||r(i)||/||b|| 3.399978802886e-05 > > > > > > 14 KSP preconditioned resid norm 2.481415420420e-05 true resid norm 2.491591201250e-04 ||r(i)||/||b|| 3.382674774806e-05 > > > > > > 15 KSP preconditioned resid norm 2.615494786181e-05 true resid norm 2.490353237273e-04 ||r(i)||/||b|| 3.380994069915e-05 > > > > > > 16 KSP preconditioned resid norm 2.645126692130e-05 true resid norm 2.490535523344e-04 ||r(i)||/||b|| 3.381241548111e-05 > > > > > > 17 KSP preconditioned resid norm 2.667223026209e-05 true resid norm 2.490482602536e-04 ||r(i)||/||b|| 3.381169700898e-05 > > > > > > 18 KSP preconditioned resid norm 2.650813432116e-05 true resid norm 2.490473169014e-04 ||r(i)||/||b|| 3.381156893606e-05 > > > > > > 19 KSP preconditioned resid norm 2.613309555449e-05 true resid norm 2.490465633690e-04 ||r(i)||/||b|| 3.381146663375e-05 > > > > > > 20 KSP preconditioned resid norm 2.644160446804e-05 true resid norm 2.490532739949e-04 ||r(i)||/||b|| 3.381237769272e-05 > > > > > > 21 KSP preconditioned resid norm 2.635987608975e-05 true resid norm 2.490499548926e-04 ||r(i)||/||b|| 3.381192707933e-05 > > > > > > 22 KSP preconditioned resid norm 2.640527129095e-05 true resid norm 2.490594066529e-04 ||r(i)||/||b|| 3.381321028466e-05 > > > > > > 23 KSP preconditioned resid norm 2.627505117691e-05 true resid norm 2.490550162585e-04 ||r(i)||/||b|| 3.381261422875e-05 > > > > > > 24 KSP preconditioned resid norm 2.642659196388e-05 true resid norm 2.490504347640e-04 ||r(i)||/||b|| 3.381199222842e-05 > > > > > > 25 KSP preconditioned resid norm 2.659432190695e-05 true resid norm 2.490510775152e-04 ||r(i)||/||b|| 3.381207949065e-05 > > > > > > 26 KSP preconditioned resid norm 2.687918062951e-05 true resid norm 2.490518882015e-04 ||r(i)||/||b|| 3.381218955237e-05 > > > > > > 27 KSP preconditioned resid norm 2.662909048432e-05 true resid norm 2.490446263285e-04 ||r(i)||/||b|| 3.381120365409e-05 > > > > > > 28 KSP preconditioned resid norm 2.085466483199e-05 true resid norm 2.490131612366e-04 ||r(i)||/||b|| 3.380693183886e-05 > > > > > > 29 KSP preconditioned resid norm 2.098541330282e-05 true resid norm 2.490126933398e-04 ||r(i)||/||b|| 3.380686831549e-05 > > > > > > 30 KSP preconditioned resid norm 2.175345180286e-05 true resid norm 2.490098852429e-04 ||r(i)||/||b|| 3.380648707805e-05 > > > > > > 31 KSP preconditioned resid norm 2.182182437676e-05 true resid norm 2.490028301020e-04 ||r(i)||/||b|| 3.380552924648e-05 > > > > > > 32 KSP preconditioned resid norm 2.152970404369e-05 true resid norm 2.490089939838e-04 ||r(i)||/||b|| 3.380636607747e-05 > > > > > > 33 KSP preconditioned resid norm 2.187932450016e-05 true resid norm 2.490085293931e-04 ||r(i)||/||b|| 3.380630300295e-05 > > > > > > 34 KSP preconditioned resid norm 2.207255875067e-05 true resid norm 2.490039036092e-04 ||r(i)||/||b|| 3.380567498971e-05 > > > > > > 35 KSP preconditioned resid norm 2.205060279701e-05 true resid norm 2.490101636150e-04 ||r(i)||/||b|| 3.380652487086e-05 > > > > > > 36 KSP preconditioned resid norm 2.168654200416e-05 true resid norm 2.490091609876e-04 ||r(i)||/||b|| 3.380638875052e-05 > > > > > > 37 KSP preconditioned resid norm 2.164521042361e-05 true resid norm 2.490083143913e-04 ||r(i)||/||b|| 3.380627381352e-05 > > > > > > 38 KSP preconditioned resid norm 2.154429063973e-05 true resid norm 2.490075485470e-04 ||r(i)||/||b|| 
3.380616983972e-05 > > > > > > 39 KSP preconditioned resid norm 2.165962086228e-05 true resid norm 2.490099695056e-04 ||r(i)||/||b|| 3.380649851786e-05 > > > > > > 40 KSP preconditioned resid norm 2.153877616091e-05 true resid norm 2.490090652619e-04 ||r(i)||/||b|| 3.380637575444e-05 > > > > > > 41 KSP preconditioned resid norm 2.347651187611e-05 true resid norm 2.490233544624e-04 ||r(i)||/||b|| 3.380831570825e-05 > > > > > > 42 KSP preconditioned resid norm 2.352860162514e-05 true resid norm 2.490191394202e-04 ||r(i)||/||b|| 3.380774345879e-05 > > > > > > 43 KSP preconditioned resid norm 2.312377506928e-05 true resid norm 2.490209491359e-04 ||r(i)||/||b|| 3.380798915237e-05 > > > > > > 44 KSP preconditioned resid norm 2.295770973533e-05 true resid norm 2.490178136759e-04 ||r(i)||/||b|| 3.380756347093e-05 > > > > > > 45 KSP preconditioned resid norm 2.833646456041e-05 true resid norm 2.489991602651e-04 ||r(i)||/||b|| 3.380503101608e-05 > > > > > > 46 KSP preconditioned resid norm 2.760296424494e-05 true resid norm 2.490104320666e-04 ||r(i)||/||b|| 3.380656131682e-05 > > > > > > 47 KSP preconditioned resid norm 2.451504295239e-05 true resid norm 2.490241388672e-04 ||r(i)||/||b|| 3.380842220189e-05 > > > > > > 48 KSP preconditioned resid norm 2.512391514098e-05 true resid norm 2.490245923753e-04 ||r(i)||/||b|| 3.380848377180e-05 > > > > > > 49 KSP preconditioned resid norm 2.483419450528e-05 true resid norm 2.490273364402e-04 ||r(i)||/||b|| 3.380885631602e-05 > > > > > > 50 KSP preconditioned resid norm 2.507460538466e-05 true resid norm 2.490309488780e-04 ||r(i)||/||b|| 3.380934675371e-05 > > > > > > 51 KSP preconditioned resid norm 2.499708772881e-05 true resid norm 2.490300908170e-04 ||r(i)||/||b|| 3.380923026022e-05 > > > > > > 52 KSP preconditioned resid norm 1.059778259446e-05 true resid norm 2.489352833521e-04 ||r(i)||/||b|| 3.379635885420e-05 > > > > > > 53 KSP preconditioned resid norm 1.074975117060e-05 true resid norm 2.489294722901e-04 ||r(i)||/||b|| 3.379556992330e-05 > > > > > > 54 KSP preconditioned resid norm 1.095242219559e-05 true resid norm 2.489295454212e-04 ||r(i)||/||b|| 3.379557985184e-05 > > > > > > 55 KSP preconditioned resid norm 8.359999674720e-06 true resid norm 2.489673581944e-04 ||r(i)||/||b|| 3.380071345137e-05 > > > > > > 56 KSP preconditioned resid norm 8.368232998470e-06 true resid norm 2.489700421343e-04 ||r(i)||/||b|| 3.380107783281e-05 > > > > > > 57 KSP preconditioned resid norm 8.443378041101e-06 true resid norm 2.489702900875e-04 ||r(i)||/||b|| 3.380111149584e-05 > > > > > > 58 KSP preconditioned resid norm 8.647159584302e-06 true resid norm 2.489640805831e-04 ||r(i)||/||b|| 3.380026847095e-05 > > > > > > 59 KSP preconditioned resid norm 1.024742790737e-05 true resid norm 2.489447846660e-04 ||r(i)||/||b|| 3.379764878711e-05 > > > > > > 60 KSP preconditioned resid norm 1.033394118910e-05 true resid norm 2.489441404923e-04 ||r(i)||/||b|| 3.379756133175e-05 > > > > > > 61 KSP preconditioned resid norm 1.030066336446e-05 true resid norm 2.489399918556e-04 ||r(i)||/||b|| 3.379699809776e-05 > > > > > > 62 KSP preconditioned resid norm 1.029956398963e-05 true resid norm 2.489445295139e-04 ||r(i)||/||b|| 3.379761414674e-05 > > > > > > 63 KSP preconditioned resid norm 1.028190129002e-05 true resid norm 2.489456200527e-04 ||r(i)||/||b|| 3.379776220225e-05 > > > > > > 64 KSP preconditioned resid norm 9.878799185773e-06 true resid norm 2.489488742330e-04 ||r(i)||/||b|| 3.379820400160e-05 > > > > > > 65 KSP preconditioned resid norm 9.917711104174e-06 true resid 
norm 2.489478066593e-04 ||r(i)||/||b|| 3.379805906391e-05 > > > > > > 66 KSP preconditioned resid norm 1.003572019576e-05 true resid norm 2.489441995703e-04 ||r(i)||/||b|| 3.379756935240e-05 > > > > > > 67 KSP preconditioned resid norm 9.924487278236e-06 true resid norm 2.489475403451e-04 ||r(i)||/||b|| 3.379802290812e-05 > > > > > > 68 KSP preconditioned resid norm 9.804213483359e-06 true resid norm 2.489457781760e-04 ||r(i)||/||b|| 3.379778366964e-05 > > > > > > 69 KSP preconditioned resid norm 9.748922705476e-06 true resid norm 2.489408473578e-04 ||r(i)||/||b|| 3.379711424383e-05 > > > > > > 70 KSP preconditioned resid norm 9.886044523689e-06 true resid norm 2.489514438395e-04 ||r(i)||/||b|| 3.379855286071e-05 > > > > > > 71 KSP preconditioned resid norm 1.083888478689e-05 true resid norm 2.489420898851e-04 ||r(i)||/||b|| 3.379728293386e-05 > > > > > > 72 KSP preconditioned resid norm 1.106561823757e-05 true resid norm 2.489364778104e-04 ||r(i)||/||b|| 3.379652101821e-05 > > > > > > 73 KSP preconditioned resid norm 1.132091515426e-05 true resid norm 2.489456804535e-04 ||r(i)||/||b|| 3.379777040248e-05 > > > > > > 74 KSP preconditioned resid norm 1.330905328963e-05 true resid norm 2.489317458981e-04 ||r(i)||/||b|| 3.379587859660e-05 > > > > > > 75 KSP preconditioned resid norm 1.305555302619e-05 true resid norm 2.489320939810e-04 ||r(i)||/||b|| 3.379592585359e-05 > > > > > > 76 KSP preconditioned resid norm 1.308083397399e-05 true resid norm 2.489299951581e-04 ||r(i)||/||b|| 3.379564090977e-05 > > > > > > 77 KSP preconditioned resid norm 1.320098861853e-05 true resid norm 2.489323669317e-04 ||r(i)||/||b|| 3.379596291036e-05 > > > > > > 78 KSP preconditioned resid norm 1.300160788274e-05 true resid norm 2.489306393356e-04 ||r(i)||/||b|| 3.379572836564e-05 > > > > > > 79 KSP preconditioned resid norm 1.317651537793e-05 true resid norm 2.489381364970e-04 ||r(i)||/||b|| 3.379674620752e-05 > > > > > > 80 KSP preconditioned resid norm 1.309769805765e-05 true resid norm 2.489285056062e-04 ||r(i)||/||b|| 3.379543868279e-05 > > > > > > 81 KSP preconditioned resid norm 1.293686496271e-05 true resid norm 2.489347818072e-04 ||r(i)||/||b|| 3.379629076264e-05 > > > > > > 82 KSP preconditioned resid norm 1.311788285799e-05 true resid norm 2.489320040215e-04 ||r(i)||/||b|| 3.379591364037e-05 > > > > > > 83 KSP preconditioned resid norm 1.313667378798e-05 true resid norm 2.489329437217e-04 ||r(i)||/||b|| 3.379604121748e-05 > > > > > > 84 KSP preconditioned resid norm 1.416138205017e-05 true resid norm 2.489266908838e-04 ||r(i)||/||b|| 3.379519230948e-05 > > > > > > 85 KSP preconditioned resid norm 1.452253464774e-05 true resid norm 2.489285688375e-04 ||r(i)||/||b|| 3.379544726729e-05 > > > > > > 86 KSP preconditioned resid norm 1.426709413370e-05 true resid norm 2.489362313402e-04 ||r(i)||/||b|| 3.379648755651e-05 > > > > > > 87 KSP preconditioned resid norm 1.427480849552e-05 true resid norm 2.489378183000e-04 ||r(i)||/||b|| 3.379670300795e-05 > > > > > > 88 KSP preconditioned resid norm 1.413870980147e-05 true resid norm 2.489325756118e-04 ||r(i)||/||b|| 3.379599124153e-05 > > > > > > 89 KSP preconditioned resid norm 1.353259857657e-05 true resid norm 2.489318968308e-04 ||r(i)||/||b|| 3.379589908776e-05 > > > > > > 90 KSP preconditioned resid norm 1.347676448611e-05 true resid norm 2.489332074417e-04 ||r(i)||/||b|| 3.379607702106e-05 > > > > > > 91 KSP preconditioned resid norm 1.362825902909e-05 true resid norm 2.489344974971e-04 ||r(i)||/||b|| 3.379625216367e-05 > > > > > > 92 KSP preconditioned 
resid norm 1.346280901052e-05 true resid norm 2.489302570131e-04 ||r(i)||/||b|| 3.379567646016e-05 > > > > > > 93 KSP preconditioned resid norm 1.328052169696e-05 true resid norm 2.489346601224e-04 ||r(i)||/||b|| 3.379627424228e-05 > > > > > > 94 KSP preconditioned resid norm 1.554682082515e-05 true resid norm 2.489309078759e-04 ||r(i)||/||b|| 3.379576482365e-05 > > > > > > 95 KSP preconditioned resid norm 1.557128675775e-05 true resid norm 2.489317143582e-04 ||r(i)||/||b|| 3.379587431462e-05 > > > > > > 96 KSP preconditioned resid norm 1.542571813923e-05 true resid norm 2.489319910303e-04 ||r(i)||/||b|| 3.379591187663e-05 > > > > > > 97 KSP preconditioned resid norm 1.570516684444e-05 true resid norm 2.489321980894e-04 ||r(i)||/||b|| 3.379593998772e-05 > > > > > > 98 KSP preconditioned resid norm 1.600431789899e-05 true resid norm 2.489297450311e-04 ||r(i)||/||b|| 3.379560695162e-05 > > > > > > 99 KSP preconditioned resid norm 1.587495554658e-05 true resid norm 2.489339000570e-04 ||r(i)||/||b|| 3.379617105303e-05 > > > > > > 100 KSP preconditioned resid norm 1.621163002878e-05 true resid norm 2.489299953360e-04 ||r(i)||/||b|| 3.379564093392e-05 > > > > > > 101 KSP preconditioned resid norm 1.627060872574e-05 true resid norm 2.489301570161e-04 ||r(i)||/||b|| 3.379566288419e-05 > > > > > > 102 KSP preconditioned resid norm 1.622931647243e-05 true resid norm 2.489277930910e-04 ||r(i)||/||b|| 3.379534194913e-05 > > > > > > 103 KSP preconditioned resid norm 1.612544300282e-05 true resid norm 2.489317483299e-04 ||r(i)||/||b|| 3.379587892674e-05 > > > > > > 104 KSP preconditioned resid norm 1.880131646630e-05 true resid norm 2.489335862583e-04 ||r(i)||/||b|| 3.379612845059e-05 > > > > > > 105 KSP preconditioned resid norm 1.880563295793e-05 true resid norm 2.489365017923e-04 ||r(i)||/||b|| 3.379652427408e-05 > > > > > > 106 KSP preconditioned resid norm 1.860619184027e-05 true resid norm 2.489362382373e-04 ||r(i)||/||b|| 3.379648849288e-05 > > > > > > 107 KSP preconditioned resid norm 1.877134148719e-05 true resid norm 2.489425484523e-04 ||r(i)||/||b|| 3.379734519061e-05 > > > > > > 108 KSP preconditioned resid norm 1.914810713538e-05 true resid norm 2.489347415573e-04 ||r(i)||/||b|| 3.379628529818e-05 > > > > > > 109 KSP preconditioned resid norm 1.220673255622e-05 true resid norm 2.490196186357e-04 ||r(i)||/||b|| 3.380780851884e-05 > > > > > > 110 KSP preconditioned resid norm 1.215819132910e-05 true resid norm 2.490233518072e-04 ||r(i)||/||b|| 3.380831534776e-05 > > > > > > 111 KSP preconditioned resid norm 1.196565427400e-05 true resid norm 2.490279438906e-04 ||r(i)||/||b|| 3.380893878570e-05 > > > > > > 112 KSP preconditioned resid norm 1.171748185197e-05 true resid norm 2.490242578983e-04 ||r(i)||/||b|| 3.380843836198e-05 > > > > > > 113 KSP preconditioned resid norm 1.162855824118e-05 true resid norm 2.490229018536e-04 ||r(i)||/||b|| 3.380825426043e-05 > > > > > > 114 KSP preconditioned resid norm 1.175594685689e-05 true resid norm 2.490274328440e-04 ||r(i)||/||b|| 3.380886940415e-05 > > > > > > 115 KSP preconditioned resid norm 1.167979454122e-05 true resid norm 2.490271161036e-04 ||r(i)||/||b|| 3.380882640232e-05 > > > > > > 116 KSP preconditioned resid norm 1.181010893019e-05 true resid norm 2.490235420657e-04 ||r(i)||/||b|| 3.380834117795e-05 > > > > > > 117 KSP preconditioned resid norm 1.175206638194e-05 true resid norm 2.490263165345e-04 ||r(i)||/||b|| 3.380871784992e-05 > > > > > > 118 KSP preconditioned resid norm 1.183804125791e-05 true resid norm 2.490221353083e-04 
||r(i)||/||b|| 3.380815019145e-05 > > > > > > 119 KSP preconditioned resid norm 1.186426973727e-05 true resid norm 2.490227115336e-04 ||r(i)||/||b|| 3.380822842189e-05 > > > > > > 120 KSP preconditioned resid norm 1.181986776689e-05 true resid norm 2.490257884230e-04 ||r(i)||/||b|| 3.380864615159e-05 > > > > > > 121 KSP preconditioned resid norm 1.131443277370e-05 true resid norm 2.490259110230e-04 ||r(i)||/||b|| 3.380866279620e-05 > > > > > > 122 KSP preconditioned resid norm 1.114920075859e-05 true resid norm 2.490249829382e-04 ||r(i)||/||b|| 3.380853679603e-05 > > > > > > 123 KSP preconditioned resid norm 1.082073321672e-05 true resid norm 2.490314084868e-04 ||r(i)||/||b|| 3.380940915187e-05 > > > > > > 124 KSP preconditioned resid norm 3.307785860990e-06 true resid norm 2.490613501549e-04 ||r(i)||/||b|| 3.381347414155e-05 > > > > > > 125 KSP preconditioned resid norm 3.287051720572e-06 true resid norm 2.490584648195e-04 ||r(i)||/||b|| 3.381308241794e-05 > > > > > > 126 KSP preconditioned resid norm 3.286797046069e-06 true resid norm 2.490654396386e-04 ||r(i)||/||b|| 3.381402934473e-05 > > > > > > 127 KSP preconditioned resid norm 3.311592899411e-06 true resid norm 2.490627973588e-04 ||r(i)||/||b|| 3.381367061922e-05 > > > > > > 128 KSP preconditioned resid norm 3.560993694635e-06 true resid norm 2.490571732816e-04 ||r(i)||/||b|| 3.381290707406e-05 > > > > > > 129 KSP preconditioned resid norm 3.411994617661e-06 true resid norm 2.490652122141e-04 ||r(i)||/||b|| 3.381399846875e-05 > > > > > > 130 KSP preconditioned resid norm 3.412383310721e-06 true resid norm 2.490633454022e-04 ||r(i)||/||b|| 3.381374502359e-05 > > > > > > 131 KSP preconditioned resid norm 3.288320044878e-06 true resid norm 2.490639470096e-04 ||r(i)||/||b|| 3.381382669999e-05 > > > > > > 132 KSP preconditioned resid norm 3.273215756565e-06 true resid norm 2.490640390847e-04 ||r(i)||/||b|| 3.381383920043e-05 > > > > > > 133 KSP preconditioned resid norm 3.236969051459e-06 true resid norm 2.490678216102e-04 ||r(i)||/||b|| 3.381435272985e-05 > > > > > > 134 KSP preconditioned resid norm 3.203260913942e-06 true resid norm 2.490640965346e-04 ||r(i)||/||b|| 3.381384700005e-05 > > > > > > 135 KSP preconditioned resid norm 3.224117152353e-06 true resid norm 2.490655026376e-04 ||r(i)||/||b|| 3.381403789770e-05 > > > > > > 136 KSP preconditioned resid norm 3.221577997984e-06 true resid norm 2.490684737611e-04 ||r(i)||/||b|| 3.381444126823e-05 > > > > > > 137 KSP preconditioned resid norm 3.195936222128e-06 true resid norm 2.490673982333e-04 ||r(i)||/||b|| 3.381429525066e-05 > > > > > > 138 KSP preconditioned resid norm 3.207528137426e-06 true resid norm 2.490641247196e-04 ||r(i)||/||b|| 3.381385082655e-05 > > > > > > 139 KSP preconditioned resid norm 3.240134271963e-06 true resid norm 2.490615861251e-04 ||r(i)||/||b|| 3.381350617773e-05 > > > > > > 140 KSP preconditioned resid norm 2.698833607230e-06 true resid norm 2.490638954889e-04 ||r(i)||/||b|| 3.381381970535e-05 > > > > > > 141 KSP preconditioned resid norm 2.599151209137e-06 true resid norm 2.490657106698e-04 ||r(i)||/||b|| 3.381406614091e-05 > > > > > > 142 KSP preconditioned resid norm 2.633939920994e-06 true resid norm 2.490707754695e-04 ||r(i)||/||b|| 3.381475375652e-05 > > > > > > 143 KSP preconditioned resid norm 2.519609221376e-06 true resid norm 2.490639100480e-04 ||r(i)||/||b|| 3.381382168195e-05 > > > > > > 144 KSP preconditioned resid norm 3.768526937684e-06 true resid norm 2.490654096698e-04 ||r(i)||/||b|| 3.381402527606e-05 > > > > > > 145 KSP preconditioned 
resid norm 3.707841943289e-06 true resid norm 2.490630207923e-04 ||r(i)||/||b|| 3.381370095336e-05 > > > > > > 146 KSP preconditioned resid norm 3.698827503486e-06 true resid norm 2.490646071561e-04 ||r(i)||/||b|| 3.381391632387e-05 > > > > > > 147 KSP preconditioned resid norm 3.642747039615e-06 true resid norm 2.490610990161e-04 ||r(i)||/||b|| 3.381344004604e-05 > > > > > > 148 KSP preconditioned resid norm 3.613100087842e-06 true resid norm 2.490617159023e-04 ||r(i)||/||b|| 3.381352379676e-05 > > > > > > 149 KSP preconditioned resid norm 3.637646399299e-06 true resid norm 2.490648063023e-04 ||r(i)||/||b|| 3.381394336069e-05 > > > > > > 150 KSP preconditioned resid norm 3.640235367864e-06 true resid norm 2.490648516718e-04 ||r(i)||/||b|| 3.381394952022e-05 > > > > > > 151 KSP preconditioned resid norm 3.724708848977e-06 true resid norm 2.490622201040e-04 ||r(i)||/||b|| 3.381359224901e-05 > > > > > > 152 KSP preconditioned resid norm 3.665185002770e-06 true resid norm 2.490664302790e-04 ||r(i)||/||b|| 3.381416383766e-05 > > > > > > 153 KSP preconditioned resid norm 3.348992579120e-06 true resid norm 2.490655722697e-04 ||r(i)||/||b|| 3.381404735121e-05 > > > > > > 154 KSP preconditioned resid norm 3.309431137943e-06 true resid norm 2.490727563300e-04 ||r(i)||/||b|| 3.381502268535e-05 > > > > > > 155 KSP preconditioned resid norm 3.299031245428e-06 true resid norm 2.490688392843e-04 ||r(i)||/||b|| 3.381449089298e-05 > > > > > > 156 KSP preconditioned resid norm 3.297127463503e-06 true resid norm 2.490642207769e-04 ||r(i)||/||b|| 3.381386386763e-05 > > > > > > 157 KSP preconditioned resid norm 3.297370198641e-06 true resid norm 2.490666651723e-04 ||r(i)||/||b|| 3.381419572764e-05 > > > > > > 158 KSP preconditioned resid norm 3.290873165210e-06 true resid norm 2.490679189538e-04 ||r(i)||/||b|| 3.381436594557e-05 > > > > > > 159 KSP preconditioned resid norm 3.346705292419e-06 true resid norm 2.490617329776e-04 ||r(i)||/||b|| 3.381352611496e-05 > > > > > > 160 KSP preconditioned resid norm 3.429583550890e-06 true resid norm 2.490675116236e-04 ||r(i)||/||b|| 3.381431064494e-05 > > > > > > 161 KSP preconditioned resid norm 3.425238504679e-06 true resid norm 2.490648199058e-04 ||r(i)||/||b|| 3.381394520756e-05 > > > > > > 162 KSP preconditioned resid norm 3.423484857849e-06 true resid norm 2.490723208298e-04 ||r(i)||/||b|| 3.381496356025e-05 > > > > > > 163 KSP preconditioned resid norm 3.383655922943e-06 true resid norm 2.490659981249e-04 ||r(i)||/||b|| 3.381410516686e-05 > > > > > > 164 KSP preconditioned resid norm 3.477197358452e-06 true resid norm 2.490665979073e-04 ||r(i)||/||b|| 3.381418659549e-05 > > > > > > 165 KSP preconditioned resid norm 3.454672202601e-06 true resid norm 2.490651358644e-04 ||r(i)||/||b|| 3.381398810323e-05 > > > > > > 166 KSP preconditioned resid norm 3.399075522566e-06 true resid norm 2.490678159511e-04 ||r(i)||/||b|| 3.381435196154e-05 > > > > > > 167 KSP preconditioned resid norm 3.305455787400e-06 true resid norm 2.490651924523e-04 ||r(i)||/||b|| 3.381399578581e-05 > > > > > > 168 KSP preconditioned resid norm 3.368445533284e-06 true resid norm 2.490688061735e-04 ||r(i)||/||b|| 3.381448639774e-05 > > > > > > 169 KSP preconditioned resid norm 2.981519724814e-06 true resid norm 2.490676378334e-04 ||r(i)||/||b|| 3.381432777964e-05 > > > > > > 170 KSP preconditioned resid norm 3.034423065539e-06 true resid norm 2.490694458885e-04 ||r(i)||/||b|| 3.381457324777e-05 > > > > > > 171 KSP preconditioned resid norm 2.885972780503e-06 true resid norm 2.490688033729e-04 
||r(i)||/||b|| 3.381448601752e-05 > > > > > > 172 KSP preconditioned resid norm 2.892491075033e-06 true resid norm 2.490692993765e-04 ||r(i)||/||b|| 3.381455335678e-05 > > > > > > 173 KSP preconditioned resid norm 2.921316177611e-06 true resid norm 2.490697629787e-04 ||r(i)||/||b|| 3.381461629709e-05 > > > > > > 174 KSP preconditioned resid norm 2.999889222269e-06 true resid norm 2.490707272626e-04 ||r(i)||/||b|| 3.381474721178e-05 > > > > > > 175 KSP preconditioned resid norm 2.975590207575e-06 true resid norm 2.490685439925e-04 ||r(i)||/||b|| 3.381445080310e-05 > > > > > > 176 KSP preconditioned resid norm 2.983065843597e-06 true resid norm 2.490701883671e-04 ||r(i)||/||b|| 3.381467404937e-05 > > > > > > 177 KSP preconditioned resid norm 2.965959610245e-06 true resid norm 2.490711538630e-04 ||r(i)||/||b|| 3.381480512861e-05 > > > > > > 178 KSP preconditioned resid norm 3.005389788827e-06 true resid norm 2.490751808095e-04 ||r(i)||/||b|| 3.381535184150e-05 > > > > > > 179 KSP preconditioned resid norm 2.956581668772e-06 true resid norm 2.490653125636e-04 ||r(i)||/||b|| 3.381401209257e-05 > > > > > > 180 KSP preconditioned resid norm 2.937498883661e-06 true resid norm 2.490666056653e-04 ||r(i)||/||b|| 3.381418764874e-05 > > > > > > 181 KSP preconditioned resid norm 2.913227475431e-06 true resid norm 2.490682436979e-04 ||r(i)||/||b|| 3.381441003402e-05 > > > > > > 182 KSP preconditioned resid norm 3.048172862254e-06 true resid norm 2.490719669872e-04 ||r(i)||/||b|| 3.381491552130e-05 > > > > > > 183 KSP preconditioned resid norm 3.023868104933e-06 true resid norm 2.490648745555e-04 ||r(i)||/||b|| 3.381395262699e-05 > > > > > > 184 KSP preconditioned resid norm 2.985947506400e-06 true resid norm 2.490638818852e-04 ||r(i)||/||b|| 3.381381785846e-05 > > > > > > 185 KSP preconditioned resid norm 2.840032055776e-06 true resid norm 2.490701112392e-04 ||r(i)||/||b|| 3.381466357820e-05 > > > > > > 186 KSP preconditioned resid norm 2.229279683815e-06 true resid norm 2.490609220680e-04 ||r(i)||/||b|| 3.381341602292e-05 > > > > > > 187 KSP preconditioned resid norm 2.441513276379e-06 true resid norm 2.490674056899e-04 ||r(i)||/||b|| 3.381429626300e-05 > > > > > > 188 KSP preconditioned resid norm 2.467046864016e-06 true resid norm 2.490691622632e-04 ||r(i)||/||b|| 3.381453474178e-05 > > > > > > 189 KSP preconditioned resid norm 2.482124586361e-06 true resid norm 2.490664992339e-04 ||r(i)||/||b|| 3.381417319923e-05 > > > > > > 190 KSP preconditioned resid norm 2.470564926502e-06 true resid norm 2.490617019713e-04 ||r(i)||/||b|| 3.381352190543e-05 > > > > > > 191 KSP preconditioned resid norm 2.457947086578e-06 true resid norm 2.490628644250e-04 ||r(i)||/||b|| 3.381367972437e-05 > > > > > > 192 KSP preconditioned resid norm 2.469444741724e-06 true resid norm 2.490639416335e-04 ||r(i)||/||b|| 3.381382597011e-05 > > > > > > 193 KSP preconditioned resid norm 2.469951525219e-06 true resid norm 2.490599769764e-04 ||r(i)||/||b|| 3.381328771385e-05 > > > > > > 194 KSP preconditioned resid norm 2.467486786643e-06 true resid norm 2.490630178622e-04 ||r(i)||/||b|| 3.381370055556e-05 > > > > > > 195 KSP preconditioned resid norm 2.409684391404e-06 true resid norm 2.490640302606e-04 ||r(i)||/||b|| 3.381383800245e-05 > > > > > > 196 KSP preconditioned resid norm 2.456046691135e-06 true resid norm 2.490637730235e-04 ||r(i)||/||b|| 3.381380307900e-05 > > > > > > 197 KSP preconditioned resid norm 2.300015653805e-06 true resid norm 2.490615406913e-04 ||r(i)||/||b|| 3.381350000947e-05 > > > > > > 198 KSP preconditioned 
resid norm 2.238328275301e-06 true resid norm 2.490647641246e-04 ||r(i)||/||b|| 3.381393763449e-05 > > > > > > 199 KSP preconditioned resid norm 2.317293820319e-06 true resid norm 2.490641611282e-04 ||r(i)||/||b|| 3.381385576951e-05 > > > > > > 200 KSP preconditioned resid norm 2.359590971314e-06 true resid norm 2.490685242974e-04 ||r(i)||/||b|| 3.381444812922e-05 > > > > > > 201 KSP preconditioned resid norm 2.311199691596e-06 true resid norm 2.490656791753e-04 ||r(i)||/||b|| 3.381406186510e-05 > > > > > > 202 KSP preconditioned resid norm 2.328772904196e-06 true resid norm 2.490651045523e-04 ||r(i)||/||b|| 3.381398385220e-05 > > > > > > 203 KSP preconditioned resid norm 2.332731604717e-06 true resid norm 2.490649960574e-04 ||r(i)||/||b|| 3.381396912253e-05 > > > > > > 204 KSP preconditioned resid norm 2.357629383490e-06 true resid norm 2.490686317727e-04 ||r(i)||/||b|| 3.381446272046e-05 > > > > > > 205 KSP preconditioned resid norm 2.374856180299e-06 true resid norm 2.490645897176e-04 ||r(i)||/||b|| 3.381391395637e-05 > > > > > > 206 KSP preconditioned resid norm 2.340395514404e-06 true resid norm 2.490618341127e-04 ||r(i)||/||b|| 3.381353984542e-05 > > > > > > 207 KSP preconditioned resid norm 2.314963680954e-06 true resid norm 2.490676153984e-04 ||r(i)||/||b|| 3.381432473379e-05 > > > > > > 208 KSP preconditioned resid norm 2.448070953106e-06 true resid norm 2.490644606776e-04 ||r(i)||/||b|| 3.381389643743e-05 > > > > > > 209 KSP preconditioned resid norm 2.428805110632e-06 true resid norm 2.490635817597e-04 ||r(i)||/||b|| 3.381377711234e-05 > > > > > > 210 KSP preconditioned resid norm 2.537929937808e-06 true resid norm 2.490680589404e-04 ||r(i)||/||b|| 3.381438495066e-05 > > > > > > 211 KSP preconditioned resid norm 2.515909029682e-06 true resid norm 2.490687803038e-04 ||r(i)||/||b|| 3.381448288557e-05 > > > > > > 212 KSP preconditioned resid norm 2.497907513266e-06 true resid norm 2.490618016885e-04 ||r(i)||/||b|| 3.381353544340e-05 > > > > > > 213 KSP preconditioned resid norm 1.783501869502e-06 true resid norm 2.490632647470e-04 ||r(i)||/||b|| 3.381373407354e-05 > > > > > > 214 KSP preconditioned resid norm 1.767420653144e-06 true resid norm 2.490685569328e-04 ||r(i)||/||b|| 3.381445255992e-05 > > > > > > 215 KSP preconditioned resid norm 1.854926068272e-06 true resid norm 2.490609365464e-04 ||r(i)||/||b|| 3.381341798856e-05 > > > > > > 216 KSP preconditioned resid norm 1.818308539774e-06 true resid norm 2.490639142283e-04 ||r(i)||/||b|| 3.381382224948e-05 > > > > > > 217 KSP preconditioned resid norm 1.809431578070e-06 true resid norm 2.490605125049e-04 ||r(i)||/||b|| 3.381336041915e-05 > > > > > > 218 KSP preconditioned resid norm 1.789862735999e-06 true resid norm 2.490564024901e-04 ||r(i)||/||b|| 3.381280242859e-05 > > > > > > 219 KSP preconditioned resid norm 1.769239890163e-06 true resid norm 2.490647825316e-04 ||r(i)||/||b|| 3.381394013349e-05 > > > > > > 220 KSP preconditioned resid norm 1.780760773109e-06 true resid norm 2.490622606663e-04 ||r(i)||/||b|| 3.381359775589e-05 > > > > > > 221 KSP preconditioned resid norm 5.009024913368e-07 true resid norm 2.490659101637e-04 ||r(i)||/||b|| 3.381409322492e-05 > > > > > > 222 KSP preconditioned resid norm 4.974450322799e-07 true resid norm 2.490714287402e-04 ||r(i)||/||b|| 3.381484244693e-05 > > > > > > 223 KSP preconditioned resid norm 4.938819481519e-07 true resid norm 2.490665661715e-04 ||r(i)||/||b|| 3.381418228693e-05 > > > > > > 224 KSP preconditioned resid norm 4.973231831266e-07 true resid norm 2.490725000995e-04 
||r(i)||/||b|| 3.381498789855e-05 > > > > > > 225 KSP preconditioned resid norm 5.086864036771e-07 true resid norm 2.490664132954e-04 ||r(i)||/||b|| 3.381416153192e-05 > > > > > > 226 KSP preconditioned resid norm 5.046954570561e-07 true resid norm 2.490698772594e-04 ||r(i)||/||b|| 3.381463181226e-05 > > > > > > 227 KSP preconditioned resid norm 5.086852920874e-07 true resid norm 2.490703544723e-04 ||r(i)||/||b|| 3.381469660041e-05 > > > > > > 228 KSP preconditioned resid norm 5.182381756169e-07 true resid norm 2.490665200032e-04 ||r(i)||/||b|| 3.381417601896e-05 > > > > > > 229 KSP preconditioned resid norm 5.261455182896e-07 true resid norm 2.490697169472e-04 ||r(i)||/||b|| 3.381461004770e-05 > > > > > > 230 KSP preconditioned resid norm 5.265262522400e-07 true resid norm 2.490726890541e-04 ||r(i)||/||b|| 3.381501355172e-05 > > > > > > 231 KSP preconditioned resid norm 5.220652263946e-07 true resid norm 2.490689325236e-04 ||r(i)||/||b|| 3.381450355149e-05 > > > > > > 232 KSP preconditioned resid norm 5.256466259888e-07 true resid norm 2.490694033989e-04 ||r(i)||/||b|| 3.381456747923e-05 > > > > > > 233 KSP preconditioned resid norm 5.443022648374e-07 true resid norm 2.490650183144e-04 ||r(i)||/||b|| 3.381397214423e-05 > > > > > > 234 KSP preconditioned resid norm 5.562619006436e-07 true resid norm 2.490764576883e-04 ||r(i)||/||b|| 3.381552519520e-05 > > > > > > 235 KSP preconditioned resid norm 5.998148629545e-07 true resid norm 2.490714032716e-04 ||r(i)||/||b|| 3.381483898922e-05 > > > > > > 236 KSP preconditioned resid norm 6.498977322955e-07 true resid norm 2.490650270144e-04 ||r(i)||/||b|| 3.381397332537e-05 > > > > > > 237 KSP preconditioned resid norm 6.503686003429e-07 true resid norm 2.490706976108e-04 ||r(i)||/||b|| 3.381474318615e-05 > > > > > > 238 KSP preconditioned resid norm 6.566719023119e-07 true resid norm 2.490664107559e-04 ||r(i)||/||b|| 3.381416118714e-05 > > > > > > 239 KSP preconditioned resid norm 6.549737473208e-07 true resid norm 2.490721547909e-04 ||r(i)||/||b|| 3.381494101821e-05 > > > > > > 240 KSP preconditioned resid norm 6.616898981418e-07 true resid norm 2.490679659838e-04 ||r(i)||/||b|| 3.381437233053e-05 > > > > > > 241 KSP preconditioned resid norm 6.829917691021e-07 true resid norm 2.490728328614e-04 ||r(i)||/||b|| 3.381503307553e-05 > > > > > > 242 KSP preconditioned resid norm 7.030239869389e-07 true resid norm 2.490706345115e-04 ||r(i)||/||b|| 3.381473461955e-05 > > > > > > 243 KSP preconditioned resid norm 7.018435683340e-07 true resid norm 2.490650978460e-04 ||r(i)||/||b|| 3.381398294172e-05 > > > > > > 244 KSP preconditioned resid norm 7.058047080376e-07 true resid norm 2.490685975642e-04 ||r(i)||/||b|| 3.381445807618e-05 > > > > > > 245 KSP preconditioned resid norm 6.896300385099e-07 true resid norm 2.490708566380e-04 ||r(i)||/||b|| 3.381476477625e-05 > > > > > > 246 KSP preconditioned resid norm 7.093960074437e-07 true resid norm 2.490667427871e-04 ||r(i)||/||b|| 3.381420626490e-05 > > > > > > 247 KSP preconditioned resid norm 7.817121711853e-07 true resid norm 2.490692299030e-04 ||r(i)||/||b|| 3.381454392480e-05 > > > > > > 248 KSP preconditioned resid norm 7.976109778309e-07 true resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > > > 249 KSP preconditioned resid norm 7.855322750445e-07 true resid norm 2.490720861966e-04 ||r(i)||/||b|| 3.381493170560e-05 > > > > > > 250 KSP preconditioned resid norm 7.778531114042e-07 true resid norm 2.490673235034e-04 ||r(i)||/||b|| 3.381428510506e-05 > > > > > > 251 KSP preconditioned 
resid norm 7.848682182070e-07 true resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > > > 252 KSP preconditioned resid norm 7.967291867330e-07 true resid norm 2.490724820229e-04 ||r(i)||/||b|| 3.381498544442e-05 > > > > > > 253 KSP preconditioned resid norm 7.865012959525e-07 true resid norm 2.490666662028e-04 ||r(i)||/||b|| 3.381419586754e-05 > > > > > > 254 KSP preconditioned resid norm 7.656025385804e-07 true resid norm 2.490686283214e-04 ||r(i)||/||b|| 3.381446225190e-05 > > > > > > 255 KSP preconditioned resid norm 7.757018653468e-07 true resid norm 2.490655983763e-04 ||r(i)||/||b|| 3.381405089553e-05 > > > > > > 256 KSP preconditioned resid norm 6.686490372981e-07 true resid norm 2.490715698964e-04 ||r(i)||/||b|| 3.381486161081e-05 > > > > > > 257 KSP preconditioned resid norm 6.596005109428e-07 true resid norm 2.490666403003e-04 ||r(i)||/||b|| 3.381419235092e-05 > > > > > > 258 KSP preconditioned resid norm 6.681742296333e-07 true resid norm 2.490683725835e-04 ||r(i)||/||b|| 3.381442753198e-05 > > > > > > 259 KSP preconditioned resid norm 1.089245482033e-06 true resid norm 2.490688086568e-04 ||r(i)||/||b|| 3.381448673488e-05 > > > > > > 260 KSP preconditioned resid norm 1.099844873189e-06 true resid norm 2.490690703265e-04 ||r(i)||/||b|| 3.381452226011e-05 > > > > > > 261 KSP preconditioned resid norm 1.112925540869e-06 true resid norm 2.490664481058e-04 ||r(i)||/||b|| 3.381416625790e-05 > > > > > > 262 KSP preconditioned resid norm 1.113056910480e-06 true resid norm 2.490658753273e-04 ||r(i)||/||b|| 3.381408849541e-05 > > > > > > 263 KSP preconditioned resid norm 1.104801535149e-06 true resid norm 2.490736510776e-04 ||r(i)||/||b|| 3.381514415953e-05 > > > > > > 264 KSP preconditioned resid norm 1.158709147873e-06 true resid norm 2.490607531152e-04 ||r(i)||/||b|| 3.381339308528e-05 > > > > > > 265 KSP preconditioned resid norm 1.178985740182e-06 true resid norm 2.490727895619e-04 ||r(i)||/||b|| 3.381502719703e-05 > > > > > > 266 KSP preconditioned resid norm 1.165130533478e-06 true resid norm 2.490639076693e-04 ||r(i)||/||b|| 3.381382135901e-05 > > > > > > 267 KSP preconditioned resid norm 1.181364114499e-06 true resid norm 2.490667871436e-04 ||r(i)||/||b|| 3.381421228690e-05 > > > > > > 268 KSP preconditioned resid norm 1.170295348543e-06 true resid norm 2.490662613306e-04 ||r(i)||/||b|| 3.381414090063e-05 > > > > > > 269 KSP preconditioned resid norm 1.213243016230e-06 true resid norm 2.490666173719e-04 ||r(i)||/||b|| 3.381418923808e-05 > > > > > > 270 KSP preconditioned resid norm 1.239691953997e-06 true resid norm 2.490678323197e-04 ||r(i)||/||b|| 3.381435418381e-05 > > > > > > 271 KSP preconditioned resid norm 1.219891740100e-06 true resid norm 2.490625009256e-04 ||r(i)||/||b|| 3.381363037437e-05 > > > > > > 272 KSP preconditioned resid norm 1.231321334346e-06 true resid norm 2.490659733696e-04 ||r(i)||/||b|| 3.381410180599e-05 > > > > > > 273 KSP preconditioned resid norm 1.208183234158e-06 true resid norm 2.490685987255e-04 ||r(i)||/||b|| 3.381445823385e-05 > > > > > > 274 KSP preconditioned resid norm 1.211768545589e-06 true resid norm 2.490671548953e-04 ||r(i)||/||b|| 3.381426221421e-05 > > > > > > 275 KSP preconditioned resid norm 1.209433459842e-06 true resid norm 2.490669016096e-04 ||r(i)||/||b|| 3.381422782722e-05 > > > > > > 276 KSP preconditioned resid norm 1.223729184405e-06 true resid norm 2.490658128014e-04 ||r(i)||/||b|| 3.381408000666e-05 > > > > > > 277 KSP preconditioned resid norm 1.243915201868e-06 true resid norm 2.490693375756e-04 
||r(i)||/||b|| 3.381455854282e-05 > > > > > > 278 KSP preconditioned resid norm 1.231994655529e-06 true resid norm 2.490682988311e-04 ||r(i)||/||b|| 3.381441751910e-05 > > > > > > 279 KSP preconditioned resid norm 1.227930683777e-06 true resid norm 2.490667825866e-04 ||r(i)||/||b|| 3.381421166823e-05 > > > > > > 280 KSP preconditioned resid norm 1.193458846469e-06 true resid norm 2.490687366117e-04 ||r(i)||/||b|| 3.381447695378e-05 > > > > > > 281 KSP preconditioned resid norm 1.217089059805e-06 true resid norm 2.490674797371e-04 ||r(i)||/||b|| 3.381430631591e-05 > > > > > > 282 KSP preconditioned resid norm 1.249318287709e-06 true resid norm 2.490662866951e-04 ||r(i)||/||b|| 3.381414434420e-05 > > > > > > 283 KSP preconditioned resid norm 1.183320029547e-06 true resid norm 2.490645783630e-04 ||r(i)||/||b|| 3.381391241482e-05 > > > > > > 284 KSP preconditioned resid norm 1.174730603102e-06 true resid norm 2.490686881647e-04 ||r(i)||/||b|| 3.381447037643e-05 > > > > > > 285 KSP preconditioned resid norm 1.175838261923e-06 true resid norm 2.490665969300e-04 ||r(i)||/||b|| 3.381418646281e-05 > > > > > > 286 KSP preconditioned resid norm 1.188946188368e-06 true resid norm 2.490661974622e-04 ||r(i)||/||b|| 3.381413222961e-05 > > > > > > 287 KSP preconditioned resid norm 1.177848565707e-06 true resid norm 2.490660236206e-04 ||r(i)||/||b|| 3.381410862824e-05 > > > > > > 288 KSP preconditioned resid norm 1.200075508281e-06 true resid norm 2.490645353536e-04 ||r(i)||/||b|| 3.381390657571e-05 > > > > > > 289 KSP preconditioned resid norm 1.184589570618e-06 true resid norm 2.490664920355e-04 ||r(i)||/||b|| 3.381417222195e-05 > > > > > > 290 KSP preconditioned resid norm 1.221114703873e-06 true resid norm 2.490670597538e-04 ||r(i)||/||b|| 3.381424929746e-05 > > > > > > 291 KSP preconditioned resid norm 1.249479658256e-06 true resid norm 2.490641582876e-04 ||r(i)||/||b|| 3.381385538385e-05 > > > > > > 292 KSP preconditioned resid norm 1.245768496850e-06 true resid norm 2.490704480588e-04 ||r(i)||/||b|| 3.381470930606e-05 > > > > > > 293 KSP preconditioned resid norm 1.243742607953e-06 true resid norm 2.490649690604e-04 ||r(i)||/||b|| 3.381396545733e-05 > > > > > > 294 KSP preconditioned resid norm 1.342758483339e-06 true resid norm 2.490676207432e-04 ||r(i)||/||b|| 3.381432545942e-05 > > > > > > 295 KSP preconditioned resid norm 1.353816099600e-06 true resid norm 2.490695263153e-04 ||r(i)||/||b|| 3.381458416681e-05 > > > > > > 296 KSP preconditioned resid norm 1.343886763293e-06 true resid norm 2.490673674307e-04 ||r(i)||/||b|| 3.381429106879e-05 > > > > > > 297 KSP preconditioned resid norm 1.355511022815e-06 true resid norm 2.490686565995e-04 ||r(i)||/||b|| 3.381446609103e-05 > > > > > > 298 KSP preconditioned resid norm 1.347247627243e-06 true resid norm 2.490696287707e-04 ||r(i)||/||b|| 3.381459807652e-05 > > > > > > 299 KSP preconditioned resid norm 1.414742595618e-06 true resid norm 2.490749815091e-04 ||r(i)||/||b|| 3.381532478374e-05 > > > > > > 300 KSP preconditioned resid norm 1.418560683189e-06 true resid norm 2.490721501153e-04 ||r(i)||/||b|| 3.381494038343e-05 > > > > > > 301 KSP preconditioned resid norm 1.416276404923e-06 true resid norm 2.490689576447e-04 ||r(i)||/||b|| 3.381450696203e-05 > > > > > > 302 KSP preconditioned resid norm 1.431448272112e-06 true resid norm 2.490688812701e-04 ||r(i)||/||b|| 3.381449659312e-05 > > > > > > 303 KSP preconditioned resid norm 1.446154958969e-06 true resid norm 2.490727536322e-04 ||r(i)||/||b|| 3.381502231909e-05 > > > > > > 304 KSP preconditioned 
resid norm 1.468860617921e-06 true resid norm 2.490692363788e-04 ||r(i)||/||b|| 3.381454480397e-05 > > > > > > 305 KSP preconditioned resid norm 1.627595214971e-06 true resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 > > > > > > 306 KSP preconditioned resid norm 1.614384672893e-06 true resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 > > > > > > 307 KSP preconditioned resid norm 1.605568020532e-06 true resid norm 2.490699757693e-04 ||r(i)||/||b|| 3.381464518632e-05 > > > > > > 308 KSP preconditioned resid norm 1.617069685075e-06 true resid norm 2.490649282923e-04 ||r(i)||/||b|| 3.381395992249e-05 > > > > > > 309 KSP preconditioned resid norm 1.654297792738e-06 true resid norm 2.490644766626e-04 ||r(i)||/||b|| 3.381389860760e-05 > > > > > > 310 KSP preconditioned resid norm 1.587528143215e-06 true resid norm 2.490696752096e-04 ||r(i)||/||b|| 3.381460438124e-05 > > > > > > 311 KSP preconditioned resid norm 1.662782022388e-06 true resid norm 2.490699317737e-04 ||r(i)||/||b|| 3.381463921332e-05 > > > > > > 312 KSP preconditioned resid norm 1.618211471748e-06 true resid norm 2.490735831308e-04 ||r(i)||/||b|| 3.381513493483e-05 > > > > > > 313 KSP preconditioned resid norm 1.609074961921e-06 true resid norm 2.490679566436e-04 ||r(i)||/||b|| 3.381437106247e-05 > > > > > > 314 KSP preconditioned resid norm 1.548068942878e-06 true resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 > > > > > > 315 KSP preconditioned resid norm 1.526718322150e-06 true resid norm 2.490619832967e-04 ||r(i)||/||b|| 3.381356009919e-05 > > > > > > 316 KSP preconditioned resid norm 1.553150959105e-06 true resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 > > > > > > 317 KSP preconditioned resid norm 1.615015320906e-06 true resid norm 2.490672348079e-04 ||r(i)||/||b|| 3.381427306343e-05 > > > > > > 318 KSP preconditioned resid norm 1.602904469797e-06 true resid norm 2.490696731006e-04 ||r(i)||/||b|| 3.381460409491e-05 > > > > > > 319 KSP preconditioned resid norm 1.538140323073e-06 true resid norm 2.490722982494e-04 ||r(i)||/||b|| 3.381496049466e-05 > > > > > > 320 KSP preconditioned resid norm 1.534779679430e-06 true resid norm 2.490778499789e-04 ||r(i)||/||b|| 3.381571421763e-05 > > > > > > 321 KSP preconditioned resid norm 1.547155843355e-06 true resid norm 2.490767612985e-04 ||r(i)||/||b|| 3.381556641442e-05 > > > > > > 322 KSP preconditioned resid norm 1.422137008870e-06 true resid norm 2.490737676309e-04 ||r(i)||/||b|| 3.381515998323e-05 > > > > > > 323 KSP preconditioned resid norm 1.403072558954e-06 true resid norm 2.490741361870e-04 ||r(i)||/||b|| 3.381521001975e-05 > > > > > > 324 KSP preconditioned resid norm 1.373070436118e-06 true resid norm 2.490742214990e-04 ||r(i)||/||b|| 3.381522160202e-05 > > > > > > 325 KSP preconditioned resid norm 1.359547585233e-06 true resid norm 2.490792987570e-04 ||r(i)||/||b|| 3.381591090902e-05 > > > > > > 326 KSP preconditioned resid norm 1.370351913612e-06 true resid norm 2.490727161158e-04 ||r(i)||/||b|| 3.381501722573e-05 > > > > > > 327 KSP preconditioned resid norm 1.365238666187e-06 true resid norm 2.490716949642e-04 ||r(i)||/||b|| 3.381487859046e-05 > > > > > > 328 KSP preconditioned resid norm 1.369073373042e-06 true resid norm 2.490807360288e-04 ||r(i)||/||b|| 3.381610603826e-05 > > > > > > 329 KSP preconditioned resid norm 1.426698981572e-06 true resid norm 2.490791479521e-04 ||r(i)||/||b|| 3.381589043520e-05 > > > > > > 330 KSP preconditioned resid norm 1.445542403570e-06 true resid norm 2.490775981409e-04 
||r(i)||/||b|| 3.381568002720e-05 > > > > > > 331 KSP preconditioned resid norm 1.464506963984e-06 true resid norm 2.490740562430e-04 ||r(i)||/||b|| 3.381519916626e-05 > > > > > > 332 KSP preconditioned resid norm 1.461462964401e-06 true resid norm 2.490768016856e-04 ||r(i)||/||b|| 3.381557189753e-05 > > > > > > 333 KSP preconditioned resid norm 1.476680847971e-06 true resid norm 2.490744321516e-04 ||r(i)||/||b|| 3.381525020097e-05 > > > > > > 334 KSP preconditioned resid norm 1.459640372198e-06 true resid norm 2.490788817993e-04 ||r(i)||/||b|| 3.381585430132e-05 > > > > > > 335 KSP preconditioned resid norm 1.790770882365e-06 true resid norm 2.490771711471e-04 ||r(i)||/||b|| 3.381562205697e-05 > > > > > > 336 KSP preconditioned resid norm 1.803770155018e-06 true resid norm 2.490768953858e-04 ||r(i)||/||b|| 3.381558461860e-05 > > > > > > 337 KSP preconditioned resid norm 1.787821255995e-06 true resid norm 2.490767985676e-04 ||r(i)||/||b|| 3.381557147421e-05 > > > > > > 338 KSP preconditioned resid norm 1.749912220831e-06 true resid norm 2.490760198704e-04 ||r(i)||/||b|| 3.381546575545e-05 > > > > > > 339 KSP preconditioned resid norm 1.802915839010e-06 true resid norm 2.490815556273e-04 ||r(i)||/||b|| 3.381621730993e-05 > > > > > > 340 KSP preconditioned resid norm 1.800777670709e-06 true resid norm 2.490823909286e-04 ||r(i)||/||b|| 3.381633071347e-05 > > > > > > 341 KSP preconditioned resid norm 1.962516327690e-06 true resid norm 2.490773477410e-04 ||r(i)||/||b|| 3.381564603199e-05 > > > > > > 342 KSP preconditioned resid norm 1.981726465132e-06 true resid norm 2.490769116884e-04 ||r(i)||/||b|| 3.381558683191e-05 > > > > > > 343 KSP preconditioned resid norm 1.963419167052e-06 true resid norm 2.490764009914e-04 ||r(i)||/||b|| 3.381551749783e-05 > > > > > > 344 KSP preconditioned resid norm 1.992082169278e-06 true resid norm 2.490806829883e-04 ||r(i)||/||b|| 3.381609883728e-05 > > > > > > 345 KSP preconditioned resid norm 1.981005134253e-06 true resid norm 2.490748677339e-04 ||r(i)||/||b|| 3.381530933721e-05 > > > > > > 346 KSP preconditioned resid norm 1.959802663114e-06 true resid norm 2.490773752317e-04 ||r(i)||/||b|| 3.381564976423e-05 > > > > > > > > > > > > On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley wrote: > > > > > > On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang wrote: > > > > > > ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY); > > > > > > ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY); > > > > > > > > > > > > ierr = VecAssemblyBegin(x); > > > > > > ierr = VecAssemblyEnd(x); > > > > > > This is probably unnecessary > > > > > > > > > > > > ierr = VecAssemblyBegin(b); > > > > > > ierr = VecAssemblyEnd(b); > > > > > > This is probably unnecessary > > > > > > > > > > > > > > > > > > ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp); > > > > > > ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8 > > > > > > Is your rhs consistent with this nullspace? > > > > > > > > > > > > // KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN); > > > > > > KSPSetOperators(ksp,A,A); > > > > > > > > > > > > KSPSetType(ksp,KSPBCGS); > > > > > > > > > > > > KSPSetComputeSingularValues(ksp, PETSC_TRUE); > > > > > > #if defined(__HYPRE__) > > > > > > KSPGetPC(ksp, &pc); > > > > > > PCSetType(pc, PCHYPRE); > > > > > > PCHYPRESetType(pc,"boomeramg"); > > > > > > This is terribly unnecessary. 
You just use > > > > > > -pc_type hypre -pc_hypre_type boomeramg > > > > > > or > > > > > > -pc_type gamg > > > > > > #else > > > > > > KSPSetType(ksp,KSPBCGSL); > > > > > > KSPBCGSLSetEll(ksp,2); > > > > > > #endif /* defined(__HYPRE__) */ > > > > > > KSPSetFromOptions(ksp); > > > > > > KSPSetUp(ksp); > > > > > > ierr = KSPSolve(ksp,b,x); > > > > > > command line > > > > > > You did not provide any of what I asked for in the previous mail. > > > > > > Matt
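A minimal sketch of that options-driven setup, combined with making the right-hand side consistent with the attached null space (A, b, x, and PETSc 3.8 are assumed from the snippet above; MatNullSpaceRemove projects the constant component out of b, which an all-Neumann problem needs in order to have a solution):

    MatNullSpace nullsp;
    KSP          ksp;
    ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,NULL,&nullsp);CHKERRQ(ierr);
    ierr = MatSetNullSpace(A,nullsp);CHKERRQ(ierr);
    ierr = MatNullSpaceRemove(nullsp,b);CHKERRQ(ierr);  /* make b orthogonal to the constant vector */
    ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);
    ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);  /* solver and preconditioner now come from the command line */
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);

Run with, e.g., -ksp_type gmres -pc_type hypre -pc_hypre_type boomeramg -ksp_monitor_true_residual -ksp_converged_reason, so the whole solver stack can be swapped without recompiling.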
> > > > > > On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley wrote: > > > > > > On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang wrote: > > > > > > hi, > > > > > > I implemented the HYPRE preconditioner for my study because, without a preconditioner, the PETSc solver takes thousands of iterations to converge for a fine-grid simulation. > > > > > > with HYPRE, depending on the parallel partition, it takes HYPRE forever to do anything. The observation from the output file is that the simulation is hanging with no output. > > > > > > Any idea what happened? I will post a snippet of code. > > > > > > 1) For any question about convergence, we need to see the output of > > > > > > -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason > > > > > > 2) Hypre has many preconditioners, which one are you talking about? > > > > > > 3) PETSc has some preconditioners in common with Hypre, like AMG > > > > > > Thanks, > > > > > > Matt > > > > > > -- > > > > > > Hao Zhang > > > > > > Dept. of Applied Mathematics and Statistics, > > > > > > Stony Brook University, > > > > > > Stony Brook, New York, 11790 > > > > > > -- > > > > > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > > > > > > -- Norbert Wiener > > > > > > https://www.cse.buffalo.edu/~knepley/ > > > > > > -- > > > > > > Hao Zhang > > > > > > Dept. of Applied Mathematics and Statistics, > > > > > > Stony Brook University, > > > > > > Stony Brook, New York, 11790 > > > > > > -- > > > > > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > > > > > > -- Norbert Wiener > > > > > > https://www.cse.buffalo.edu/~knepley/ > > > > > > -- > > > > > > Hao Zhang > > > > > > Dept. of Applied Mathematics and Statistics, > > > > > > Stony Brook University, > > > > > > Stony Brook, New York, 11790 > > > > > > -- > > > > > > Hao Zhang > > > > > > Dept. of Applied Mathematics and Statistics, > > > > > > Stony Brook University, > > > > > > Stony Brook, New York, 11790 > > > > > -- > > > > > Hao Zhang > > > > > Dept. of Applied Mathematics and Statistics, > > > > > Stony Brook University, > > > > > Stony Brook, New York, 11790 > > > > -- > > > > Hao Zhang > > > > Dept. of Applied Mathematics and Statistics, > > > > Stony Brook University, > > > > Stony Brook, New York, 11790 > > > -- > > > Hao Zhang > > > Dept. of Applied Mathematics and Statistics, > > > Stony Brook University, > > > Stony Brook, New York, 11790 > > > > > -- > > Hao Zhang > > Dept. of Applied Mathematics and Statistics, > > Stony Brook University, > > Stony Brook, New York, 11790 > > > -- > Hao Zhang > Dept. of Applied Mathematics and Statistics, > Stony Brook University, > Stony Brook, New York, 11790 From hbcbh1999 at gmail.com Sun Oct 22 14:35:45 2017 From: hbcbh1999 at gmail.com (Hao Zhang) Date: Sun, 22 Oct 2017 15:35:45 -0400 Subject: [petsc-users] HYPRE hanging or slow? from observation In-Reply-To: References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov> <64D7FC90-EBE8-409C-B300-04A118155A98@mcs.anl.gov> <5DDD356C-2000-4803-9F3A-02B429898A7A@mcs.anl.gov> Message-ID: Thanks for all the inputs. Before simulating finer grids, HYPRE wasn't used and the simulations were just fine. I will do a few tests and post more information later. On Sun, Oct 22, 2017 at 12:11 PM, Barry Smith wrote: > > > On Oct 21, 2017, at 11:16 PM, Hao Zhang wrote: > > > > the reason is that when I do a finer-grid simulation, the matrix becomes stiffer. > > Are you saying that for a finer grid but everything else the same, the > convergence of hypre (with the same GMRES) with the same options gets much > worse? This normally will not happen, that is the fundamental beauty of > multigrid methods (when they work well). > > Yes the matrix condition number increases but multigrid doesn't care > about that, its number of iterations should remain pretty much the same. > > Something must be different (with this finer grid case), either the > mesh becomes horrible, or the physics changes, or there are errors in the > code that lead to the problem. > > What happens if you just refine the mesh a little? Then a little > more? Then a little more? Does the convergence rate suddenly go bad at some > point, or does it just get worse slowly? > > Barry > > > > > Much larger condition number. Just to give you a perspective, it will > take 6000 iterations to converge, and the solver does converge. I want to > reduce the number of iterations while keeping the convergence rate. That's > the main drive for doing so much heavy lifting. Please advise; a snippet will > be provided upon request. > > > > Thanks again. > > > > On Sun, Oct 22, 2017 at 12:08 AM, Barry Smith > wrote: > > > > Oh, you change KSP but not hypre. I did not understand this. > > > > Why not just use GMRES all the time? Why mess with BCGS if it is not > robust? Not worth the small optimization if it breaks everything. > > > > Barry > > > > > > > > > On Oct 21, 2017, at 11:05 PM, Hao Zhang wrote: > > > > > > this is the initial pressure solver output regarding use of PETSc. It > failed to converge after 40000 iterations, and then GMRES is used.
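A minimal sketch of that fall-back logic, assuming the same ksp, b, and x as above and that the first attempt was configured as BCGS; the structure is illustrative, since the thread's actual driver code was not posted:

    KSPConvergedReason reason;
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
    ierr = KSPGetConvergedReason(ksp,&reason);CHKERRQ(ierr);
    if (reason == KSP_DIVERGED_ITS) {  /* BCGS hit the iteration limit */
      ierr = KSPSetType(ksp,KSPGMRES);CHKERRQ(ierr);  /* retry the same system with GMRES */
      ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
    }

Given Barry's point above, though, if BCGS is not robust for this problem it is simpler to run GMRES from the start. The output below is from such a failed BCGS solve followed by the GMRES retry: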
> > > > > > 39987 KSP preconditioned resid norm 3.853125986269e-08 true resid norm > 1.147359212216e-05 ||r(i)||/||b|| 1.557696568706e-06 > > > 39988 KSP preconditioned resid norm 3.853126044003e-08 true resid norm > 1.147359257282e-05 ||r(i)||/||b|| 1.557696629889e-06 > > > 39989 KSP preconditioned resid norm 3.853126052100e-08 true resid norm > 1.147359233695e-05 ||r(i)||/||b|| 1.557696597866e-06 > > > 39990 KSP preconditioned resid norm 3.853126027357e-08 true resid norm > 1.147359219860e-05 ||r(i)||/||b|| 1.557696579083e-06 > > > 39991 KSP preconditioned resid norm 3.853126058478e-08 true resid norm > 1.147359234281e-05 ||r(i)||/||b|| 1.557696598662e-06 > > > 39992 KSP preconditioned resid norm 3.853126064006e-08 true resid norm > 1.147359261420e-05 ||r(i)||/||b|| 1.557696635506e-06 > > > 39993 KSP preconditioned resid norm 3.853126050203e-08 true resid norm > 1.147359235972e-05 ||r(i)||/||b|| 1.557696600957e-06 > > > 39994 KSP preconditioned resid norm 3.853126050182e-08 true resid norm > 1.147359253713e-05 ||r(i)||/||b|| 1.557696625043e-06 > > > 39995 KSP preconditioned resid norm 3.853125976795e-08 true resid norm > 1.147359226222e-05 ||r(i)||/||b|| 1.557696587720e-06 > > > 39996 KSP preconditioned resid norm 3.853125805127e-08 true resid norm > 1.147359262747e-05 ||r(i)||/||b|| 1.557696637308e-06 > > > 39997 KSP preconditioned resid norm 3.853125811756e-08 true resid norm > 1.147359216008e-05 ||r(i)||/||b|| 1.557696573853e-06 > > > 39998 KSP preconditioned resid norm 3.853125827833e-08 true resid norm > 1.147359238372e-05 ||r(i)||/||b|| 1.557696604216e-06 > > > 39999 KSP preconditioned resid norm 3.853127937068e-08 true resid norm > 1.147359264043e-05 ||r(i)||/||b|| 1.557696639067e-06 > > > 40000 KSP preconditioned resid norm 3.853122732867e-08 true resid norm > 1.147359257776e-05 ||r(i)||/||b|| 1.557696630559e-06 > > > Linear solve did not converge due to DIVERGED_ITS iterations 40000 > > > KSP Object: 24 MPI processes > > > type: bcgs > > > maximum iterations=40000, initial guess is zero > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. > > > left preconditioning > > > using PRECONDITIONED norm type for convergence test > > > PC Object: 24 MPI processes > > > type: hypre > > > HYPRE BoomerAMG preconditioning > > > Cycle type V > > > Maximum number of levels 25 > > > Maximum number of iterations PER hypre call 1 > > > Convergence tolerance PER hypre call 0. > > > Threshold for strong coupling 0.25 > > > Interpolation truncation factor 0. > > > Interpolation: max elements per row 0 > > > Number of levels of aggressive coarsening 0 > > > Number of paths for aggressive coarsening 1 > > > Maximum row sums 0.9 > > > Sweeps down 1 > > > Sweeps up 1 > > > Sweeps on coarse 1 > > > Relax down symmetric-SOR/Jacobi > > > Relax up symmetric-SOR/Jacobi > > > Relax on coarse Gaussian-elimination > > > Relax weight (all) 1. > > > Outer relax weight (all) 1. > > > Using CF-relaxation > > > Not using more complex smoothers. > > > Measure type local > > > Coarsen type Falgout > > > Interpolation type classical > > > linear system matrix = precond matrix: > > > Mat Object: A 24 MPI processes > > > type: mpiaij > > > rows=497664, cols=497664 > > > total: nonzeros=3363552, allocated nonzeros=6967296 > > > total number of mallocs used during MatSetValues calls =0 > > > has attached null space > > > not using I-node (on process 0) routines > > > > > > The solution diverges for p0! The residual is 3.853123e-08. Solve > again using GMRES! 
> > > KSP Object: 24 MPI processes > > > type: gmres > > > restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > > > happy breakdown tolerance 1e-30 > > > maximum iterations=40000, initial guess is zero > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. > > > left preconditioning > > > using PRECONDITIONED norm type for convergence test > > > PC Object: 24 MPI processes > > > type: hypre > > > HYPRE BoomerAMG preconditioning > > > Cycle type V > > > Maximum number of levels 25 > > > Maximum number of iterations PER hypre call 1 > > > Convergence tolerance PER hypre call 0. > > > Threshold for strong coupling 0.25 > > > Interpolation truncation factor 0. > > > Interpolation: max elements per row 0 > > > Number of levels of aggressive coarsening 0 > > > Number of paths for aggressive coarsening 1 > > > Maximum row sums 0.9 > > > Sweeps down 1 > > > Sweeps up 1 > > > Sweeps on coarse 1 > > > Relax down symmetric-SOR/Jacobi > > > Relax up symmetric-SOR/Jacobi > > > Relax on coarse Gaussian-elimination > > > Relax weight (all) 1. > > > Outer relax weight (all) 1. > > > Using CF-relaxation > > > Not using more complex smoothers. > > > Measure type local > > > Coarsen type Falgout > > > Interpolation type classical > > > linear system matrix = precond matrix: > > > Mat Object: A 24 MPI processes > > > type: mpiaij > > > rows=497664, cols=497664 > > > total: nonzeros=3363552, allocated nonzeros=6967296 > > > total number of mallocs used during MatSetValues calls =0 > > > has attached null space > > > not using I-node (on process 0) routines > > > 0 KSP preconditioned resid norm 1.593802941804e+01 true resid norm > 7.365742695119e+00 ||r(i)||/||b|| 1.000000000000e+00 > > > 1 KSP preconditioned resid norm 6.338666661133e-01 true resid norm > 2.880722209358e+00 ||r(i)||/||b|| 3.910973174867e-01 > > > 2 KSP preconditioned resid norm 3.913420828350e-02 true resid norm > 9.544089760671e-01 ||r(i)||/||b|| 1.295740315093e-01 > > > 3 KSP preconditioned resid norm 2.928070366435e-03 true resid norm > 1.874294004628e-01 ||r(i)||/||b|| 2.544609664236e-02 > > > 4 KSP preconditioned resid norm 2.165607525823e-04 true resid norm > 3.121122463949e-02 ||r(i)||/||b|| 4.237349298146e-03 > > > 5 KSP preconditioned resid norm 1.635476480407e-05 true resid norm > 3.984315313831e-03 ||r(i)||/||b|| 5.409251284967e-04 > > > 6 KSP preconditioned resid norm 1.283358350575e-06 true resid norm > 4.566583802915e-04 ||r(i)||/||b|| 6.199760149022e-05 > > > 7 KSP preconditioned resid norm 8.479469225747e-08 true resid norm > 3.824581791112e-05 ||r(i)||/||b|| 5.192391248810e-06 > > > 8 KSP preconditioned resid norm 5.358636504683e-09 true resid norm > 2.829730442033e-06 ||r(i)||/||b|| 3.841744898187e-07 > > > 9 KSP preconditioned resid norm 3.447874504193e-10 true resid norm > 1.756617036538e-07 ||r(i)||/||b|| 2.384847135242e-08 > > > 10 KSP preconditioned resid norm 2.480228743414e-11 true resid norm > 1.309399823577e-08 ||r(i)||/||b|| 1.777688792258e-09 > > > 11 KSP preconditioned resid norm 1.728967759950e-12 true resid norm > 9.406967045789e-10 ||r(i)||/||b|| 1.277124036931e-10 > > > 12 KSP preconditioned resid norm 1.075458632828e-13 true resid norm > 5.994505136212e-11 ||r(i)||/||b|| 8.138358050689e-12 > > > Linear solve converged due to CONVERGED_RTOL iterations 12 > > > KSP Object: 24 MPI processes > > > type: gmres > > > restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > > > happy breakdown tolerance 
1e-30
> > >   maximum iterations=40000, initial guess is zero
> > >   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
> > >   left preconditioning
> > >   using PRECONDITIONED norm type for convergence test
> > > PC Object: 24 MPI processes
> > >   type: hypre
> > >     HYPRE BoomerAMG preconditioning
> > >       Cycle type V
> > >       Maximum number of levels 25
> > >       Maximum number of iterations PER hypre call 1
> > >       Convergence tolerance PER hypre call 0.
> > >       Threshold for strong coupling 0.25
> > >       Interpolation truncation factor 0.
> > >       Interpolation: max elements per row 0
> > >       Number of levels of aggressive coarsening 0
> > >       Number of paths for aggressive coarsening 1
> > >       Maximum row sums 0.9
> > >       Sweeps down 1
> > >       Sweeps up 1
> > >       Sweeps on coarse 1
> > >       Relax down symmetric-SOR/Jacobi
> > >       Relax up symmetric-SOR/Jacobi
> > >       Relax on coarse Gaussian-elimination
> > >       Relax weight (all) 1.
> > >       Outer relax weight (all) 1.
> > >       Using CF-relaxation
> > >       Not using more complex smoothers.
> > >       Measure type local
> > >       Coarsen type Falgout
> > >       Interpolation type classical
> > >   linear system matrix = precond matrix:
> > >   Mat Object: A 24 MPI processes
> > >     type: mpiaij
> > >     rows=497664, cols=497664
> > >     total: nonzeros=3363552, allocated nonzeros=6967296
> > >     total number of mallocs used during MatSetValues calls =0
> > >       has attached null space
> > >       not using I-node (on process 0) routines
> > > The max singular value of A = 1.000872 in poisson_solver3d_P0_vd
> > > The min singular value of A = 0.667688 in poisson_solver3d_P0_vd
> > > The Cond Num of A = 1.499012 in poisson_solver3d_P0_vd
> > > In poisson_solver3d_pressure(): num_iter = 12, rel_residual = 1.075459e-13
> > >
> > > The max value of p0 is 0.03115845493408858
> > >
> > > The min value of p0 is -0.07156715468428149
> > >
> > > On Sun, Oct 22, 2017 at 12:00 AM, Barry Smith wrote:
> > >
> > > > On Oct 21, 2017, at 10:50 PM, Hao Zhang wrote:
> > > >
> > > > The incompressible NS solver algorithm calls the PETSc solver at different stages of each time step. The one you were saying "This is good. 12 digit reduction" about is after the initial pressure solve, where HYPRE usually doesn't give good convergence, so the fall-back GMRES solver is called afterwards.
> > >
> > >    Hmm, I don't understand. hypre should do well on a pressure solve. In fact, very well.
> > >
> > > > Barry, you were mentioning that I could have a wrong null space. That particular solver is meant to give an initial pressure profile for a 3D incompressible NS simulation using all-Neumann boundary conditions. Could you give some insight into how to test whether I have a wrong null space, etc.?
> > >
> > >    -ksp_test_null_space
> > >
> > >    But if your null space is consistently from all Neumann boundary conditions then it likely is not wrong.
> > >
> > >    Barry
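The "has attached null space" lines in the -ksp_view output come from attaching a MatNullSpace to the operator. A minimal sketch of how the constant null space of an all-Neumann Poisson operator is typically attached (assuming A and b are the assembled matrix and right-hand side; the function name is illustrative):

#include <petscksp.h>

/* Sketch: a pure-Neumann Poisson operator has the constant vector in
   its null space; attach it and make the right-hand side consistent. */
PetscErrorCode attach_constant_nullspace(Mat A, Vec b)
{
  MatNullSpace   nsp;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /* PETSC_TRUE: the null space contains the constant vector */
  ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,NULL,&nsp);CHKERRQ(ierr);
  ierr = MatSetNullSpace(A,nsp);CHKERRQ(ierr);
  /* project the null-space component out of b so the system is consistent */
  ierr = MatNullSpaceRemove(nsp,b);CHKERRQ(ierr);
  ierr = MatNullSpaceDestroy(&nsp);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

With the null space attached, the -ksp_test_null_space option Barry mentions should then report whether the attached space really is a null space of A.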
> > > > Thanks!
> > > >
> > > > On Sat, Oct 21, 2017 at 11:41 PM, Barry Smith wrote:
> > > >
> > > >    This is good. You get more than a 12 digit reduction in the true residual norm. This is good AMG convergence. Expected when everything goes well.
> > > >
> > > >    What is different in this case from the previous case that does not converge reasonably?
> > > >
> > > >    Barry
> > > >
> > > > > On Oct 21, 2017, at 9:29 PM, Hao Zhang wrote:
> > > > >
> > > > > Barry, please advise what you make of this? This is the Poisson solver for the all-Neumann-BC 3D case; a finite difference scheme was used.
> > > > > Thanks! I'm in learning mode.
> > > > >
> > > > > KSP Object: 24 MPI processes
> > > > >   type: bcgs
> > > > >   maximum iterations=40000, initial guess is zero
> > > > >   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
> > > > >   left preconditioning
> > > > >   using PRECONDITIONED norm type for convergence test
> > > > > PC Object: 24 MPI processes
> > > > >   type: hypre
> > > > >     HYPRE BoomerAMG preconditioning
> > > > >       Cycle type V
> > > > >       Maximum number of levels 25
> > > > >       Maximum number of iterations PER hypre call 1
> > > > >       Convergence tolerance PER hypre call 0.
> > > > >       Threshold for strong coupling 0.25
> > > > >       Interpolation truncation factor 0.
> > > > >       Interpolation: max elements per row 0
> > > > >       Number of levels of aggressive coarsening 0
> > > > >       Number of paths for aggressive coarsening 1
> > > > >       Maximum row sums 0.9
> > > > >       Sweeps down 1
> > > > >       Sweeps up 1
> > > > >       Sweeps on coarse 1
> > > > >       Relax down symmetric-SOR/Jacobi
> > > > >       Relax up symmetric-SOR/Jacobi
> > > > >       Relax on coarse Gaussian-elimination
> > > > >       Relax weight (all) 1.
> > > > >       Outer relax weight (all) 1.
> > > > >       Using CF-relaxation
> > > > >       Not using more complex smoothers.
> > > > >       Measure type local
> > > > >       Coarsen type Falgout
> > > > >       Interpolation type classical
> > > > >   linear system matrix = precond matrix:
> > > > >   Mat Object: A 24 MPI processes
> > > > >     type: mpiaij
> > > > >     rows=497664, cols=497664
> > > > >     total: nonzeros=3363552, allocated nonzeros=6967296
> > > > >     total number of mallocs used during MatSetValues calls =0
> > > > >       has attached null space
> > > > >       not using I-node (on process 0) routines
> > > > >   0 KSP preconditioned resid norm 2.697270170623e-02 true resid norm 9.637159071207e+00 ||r(i)||/||b|| 1.000000000000e+00
> > > > >   1 KSP preconditioned resid norm 4.828857674609e-04 true resid norm 6.294379664645e-01 ||r(i)||/||b|| 6.531364293291e-02
> > > > >   2 KSP preconditioned resid norm 4.533649595815e-06 true resid norm 1.135508605857e-02 ||r(i)||/||b|| 1.178260727531e-03
> > > > >   3 KSP preconditioned resid norm 1.131704082606e-07 true resid norm 1.196393029874e-04 ||r(i)||/||b|| 1.241437462051e-05
> > > > >   4 KSP preconditioned resid norm 3.866281379569e-10 true resid norm 5.121520801846e-07 ||r(i)||/||b|| 5.314347064320e-08
> > > > >   5 KSP preconditioned resid norm 1.070114785241e-11 true resid norm 1.203693733135e-08 ||r(i)||/||b|| 1.249013038221e-09
> > > > >   6 KSP preconditioned resid norm 2.578780418765e-14 true resid norm 6.020297525927e-11 ||r(i)||/||b|| 6.246962908306e-12
> > > > >   7 KSP preconditioned resid norm 8.691764190203e-16 true resid norm 1.866088098154e-12 ||r(i)||/||b|| 1.936346680973e-13
> > > > > Linear solve converged due to CONVERGED_ATOL iterations 7
> > > > >
> > > > > On Sat, Oct 21, 2017 at 6:53 PM, Barry Smith wrote:
> > > > >
> > > > > > On Oct 21, 2017, at 5:47 PM, Hao Zhang wrote:
> > > > > >
> > > > > > hi, Barry:
> > > > > > what do you mean by "absurd" about setting the tolerance to 1e-14?
> > > > >
> > > > >    Trying to decrease the initial residual norm by a factor of 1e-14 with an iterative method (or even a direct method) is unrealistic (usually unachievable) and almost never necessary. You are requiring || r_n || < 1.e-14 || r_0 || when with double precision numbers you only have roughly 14 decimal digits total to compute with. Round off alone will lead to differences far larger than 1e-14.
> > > > >
> > > > >    If you are using the solver in the context of a nonlinear problem (i.e. inside Newton's method) then 1.e-6 is generally more than plenty to get quadratic convergence of Newton's method.
> > > > >
> > > > >    If you are solving a linear problem then it is extremely likely that errors due to the discretization (from the finite element method etc.) and the model are much, much larger than even 1.e-8.
> > > > >
> > > > >    So, in summary
> > > > >
> > > > >    1.e-14 is probably unachievable
> > > > >
> > > > >    1.e-14 is almost for sure not needed.
> > > > >
> > > > >    Barry
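To make the tolerance advice concrete, here is a small sketch of setting a realistic relative tolerance programmatically (KSPSetTolerances is the API counterpart of -ksp_rtol and friends; the function name and the maxits value are illustrative):

#include <petscksp.h>

/* Sketch: sensible tolerances along the lines suggested above; an rtol
   of 1e-6 is usually plenty, whereas 1e-14 asks for more than double
   precision can deliver. */
PetscErrorCode set_sane_tolerances(KSP ksp)
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /*                          rtol    abstol         dtol          maxits */
  ierr = KSPSetTolerances(ksp,1.0e-6,PETSC_DEFAULT,PETSC_DEFAULT,10000);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

The command-line equivalent is simply -ksp_rtol 1.0e-6, leaving the absolute and divergence tolerances at their defaults.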
> > > > >
> > > > > > On Sat, Oct 21, 2017 at 18:42 Barry Smith wrote:
> > > > > >
> > > > > >    Run with -ksp_view_mat binary -ksp_view_rhs binary and send the resulting output file called binaryoutput to petsc-maint at mcs.anl.gov
> > > > > >
> > > > > >    Note you can also use -ksp_type gmres with hypre, unlikely to be a reason to use bcgs
> > > > > >
> > > > > >    BTW: tolerances: relative=1e-14, is absurd
> > > > > >
> > > > > >    My guess is your null space is incorrect.
> > > > > >
> > > > > > > On Oct 21, 2017, at 4:34 PM, Hao Zhang wrote:
> > > > > > >
> > > > > > > If this solver doesn't converge, I have a fall-back solution, which uses the GMRES solver. This setup is fine with me. I just want to know whether HYPRE is a reliable solution for me, or whether I will have to go without a preconditioner.
> > > > > > >
> > > > > > > Thanks!
> > > > > > >
> > > > > > > On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
> > > > > > > This is a serial run, still dumping output; parallel is more or less the same.
> > > > > > >
> > > > > > > KSP Object: 1 MPI processes
> > > > > > >   type: bcgs
> > > > > > >   maximum iterations=40000, initial guess is zero
> > > > > > >   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
> > > > > > >   left preconditioning
> > > > > > >   using PRECONDITIONED norm type for convergence test
> > > > > > > PC Object: 1 MPI processes
> > > > > > >   type: hypre
> > > > > > >     HYPRE BoomerAMG preconditioning
> > > > > > >       Cycle type V
> > > > > > >       Maximum number of levels 25
> > > > > > >       Maximum number of iterations PER hypre call 1
> > > > > > >       Convergence tolerance PER hypre call 0.
> > > > > > >       Threshold for strong coupling 0.25
> > > > > > >       Interpolation truncation factor 0.
> > > > > > >       Interpolation: max elements per row 0
> > > > > > >       Number of levels of aggressive coarsening 0
> > > > > > >       Number of paths for aggressive coarsening 1
> > > > > > >       Maximum row sums 0.9
> > > > > > >       Sweeps down 1
> > > > > > >       Sweeps up 1
> > > > > > >       Sweeps on coarse 1
> > > > > > >       Relax down symmetric-SOR/Jacobi
> > > > > > >       Relax up symmetric-SOR/Jacobi
> > > > > > >       Relax on coarse Gaussian-elimination
> > > > > > >       Relax weight (all) 1.
> > > > > > >       Outer relax weight (all) 1.
> > > > > > >       Using CF-relaxation
> > > > > > >       Not using more complex smoothers.
> > > > > > > Measure type local > > > > > > > Coarsen type Falgout > > > > > > > Interpolation type classical > > > > > > > linear system matrix = precond matrix: > > > > > > > Mat Object: A 1 MPI processes > > > > > > > type: seqaij > > > > > > > rows=497664, cols=497664 > > > > > > > total: nonzeros=3363552, allocated nonzeros=3483648 > > > > > > > total number of mallocs used during MatSetValues calls =0 > > > > > > > has attached null space > > > > > > > not using I-node routines > > > > > > > 0 KSP preconditioned resid norm 1.630377897956e+01 true > resid norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00 > > > > > > > 1 KSP preconditioned resid norm 3.815819909205e-02 true > resid norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01 > > > > > > > 2 KSP preconditioned resid norm 4.312671277701e-04 true > resid norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03 > > > > > > > 3 KSP preconditioned resid norm 3.011875330569e-04 true > resid norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05 > > > > > > > 4 KSP preconditioned resid norm 3.011783821295e-04 true > resid norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05 > > > > > > > 5 KSP preconditioned resid norm 3.011783818372e-04 true > resid norm 2.504053673004e-04 ||r(i)||/||b|| 3.399594279422e-05 > > > > > > > 6 KSP preconditioned resid norm 3.011783818375e-04 true > resid norm 2.503984078890e-04 ||r(i)||/||b|| 3.399499795925e-05 > > > > > > > 7 KSP preconditioned resid norm 3.011783887442e-04 true > resid norm 2.504121501772e-04 ||r(i)||/||b|| 3.399686366224e-05 > > > > > > > 8 KSP preconditioned resid norm 3.010913654181e-04 true > resid norm 2.504150259031e-04 ||r(i)||/||b|| 3.399725408124e-05 > > > > > > > 9 KSP preconditioned resid norm 3.006520688232e-04 true > resid norm 2.504061607382e-04 ||r(i)||/||b|| 3.399605051423e-05 > > > > > > > 10 KSP preconditioned resid norm 3.007309991942e-04 true > resid norm 2.503843638523e-04 ||r(i)||/||b|| 3.399309128978e-05 > > > > > > > 11 KSP preconditioned resid norm 3.015946168077e-04 true > resid norm 2.503644677844e-04 ||r(i)||/||b|| 3.399039012728e-05 > > > > > > > 12 KSP preconditioned resid norm 2.956643907377e-04 true > resid norm 2.503863046509e-04 ||r(i)||/||b|| 3.399335477965e-05 > > > > > > > 13 KSP preconditioned resid norm 2.997992358936e-04 true > resid norm 2.504336903093e-04 ||r(i)||/||b|| 3.399978802886e-05 > > > > > > > 14 KSP preconditioned resid norm 2.481415420420e-05 true > resid norm 2.491591201250e-04 ||r(i)||/||b|| 3.382674774806e-05 > > > > > > > 15 KSP preconditioned resid norm 2.615494786181e-05 true > resid norm 2.490353237273e-04 ||r(i)||/||b|| 3.380994069915e-05 > > > > > > > 16 KSP preconditioned resid norm 2.645126692130e-05 true > resid norm 2.490535523344e-04 ||r(i)||/||b|| 3.381241548111e-05 > > > > > > > 17 KSP preconditioned resid norm 2.667223026209e-05 true > resid norm 2.490482602536e-04 ||r(i)||/||b|| 3.381169700898e-05 > > > > > > > 18 KSP preconditioned resid norm 2.650813432116e-05 true > resid norm 2.490473169014e-04 ||r(i)||/||b|| 3.381156893606e-05 > > > > > > > 19 KSP preconditioned resid norm 2.613309555449e-05 true > resid norm 2.490465633690e-04 ||r(i)||/||b|| 3.381146663375e-05 > > > > > > > 20 KSP preconditioned resid norm 2.644160446804e-05 true > resid norm 2.490532739949e-04 ||r(i)||/||b|| 3.381237769272e-05 > > > > > > > 21 KSP preconditioned resid norm 2.635987608975e-05 true > resid norm 2.490499548926e-04 ||r(i)||/||b|| 3.381192707933e-05 > > > > > > > 22 KSP preconditioned resid norm 
2.640527129095e-05 true > resid norm 2.490594066529e-04 ||r(i)||/||b|| 3.381321028466e-05 > > > > > > > 23 KSP preconditioned resid norm 2.627505117691e-05 true > resid norm 2.490550162585e-04 ||r(i)||/||b|| 3.381261422875e-05 > > > > > > > 24 KSP preconditioned resid norm 2.642659196388e-05 true > resid norm 2.490504347640e-04 ||r(i)||/||b|| 3.381199222842e-05 > > > > > > > 25 KSP preconditioned resid norm 2.659432190695e-05 true > resid norm 2.490510775152e-04 ||r(i)||/||b|| 3.381207949065e-05 > > > > > > > 26 KSP preconditioned resid norm 2.687918062951e-05 true > resid norm 2.490518882015e-04 ||r(i)||/||b|| 3.381218955237e-05 > > > > > > > 27 KSP preconditioned resid norm 2.662909048432e-05 true > resid norm 2.490446263285e-04 ||r(i)||/||b|| 3.381120365409e-05 > > > > > > > 28 KSP preconditioned resid norm 2.085466483199e-05 true > resid norm 2.490131612366e-04 ||r(i)||/||b|| 3.380693183886e-05 > > > > > > > 29 KSP preconditioned resid norm 2.098541330282e-05 true > resid norm 2.490126933398e-04 ||r(i)||/||b|| 3.380686831549e-05 > > > > > > > 30 KSP preconditioned resid norm 2.175345180286e-05 true > resid norm 2.490098852429e-04 ||r(i)||/||b|| 3.380648707805e-05 > > > > > > > 31 KSP preconditioned resid norm 2.182182437676e-05 true > resid norm 2.490028301020e-04 ||r(i)||/||b|| 3.380552924648e-05 > > > > > > > 32 KSP preconditioned resid norm 2.152970404369e-05 true > resid norm 2.490089939838e-04 ||r(i)||/||b|| 3.380636607747e-05 > > > > > > > 33 KSP preconditioned resid norm 2.187932450016e-05 true > resid norm 2.490085293931e-04 ||r(i)||/||b|| 3.380630300295e-05 > > > > > > > 34 KSP preconditioned resid norm 2.207255875067e-05 true > resid norm 2.490039036092e-04 ||r(i)||/||b|| 3.380567498971e-05 > > > > > > > 35 KSP preconditioned resid norm 2.205060279701e-05 true > resid norm 2.490101636150e-04 ||r(i)||/||b|| 3.380652487086e-05 > > > > > > > 36 KSP preconditioned resid norm 2.168654200416e-05 true > resid norm 2.490091609876e-04 ||r(i)||/||b|| 3.380638875052e-05 > > > > > > > 37 KSP preconditioned resid norm 2.164521042361e-05 true > resid norm 2.490083143913e-04 ||r(i)||/||b|| 3.380627381352e-05 > > > > > > > 38 KSP preconditioned resid norm 2.154429063973e-05 true > resid norm 2.490075485470e-04 ||r(i)||/||b|| 3.380616983972e-05 > > > > > > > 39 KSP preconditioned resid norm 2.165962086228e-05 true > resid norm 2.490099695056e-04 ||r(i)||/||b|| 3.380649851786e-05 > > > > > > > 40 KSP preconditioned resid norm 2.153877616091e-05 true > resid norm 2.490090652619e-04 ||r(i)||/||b|| 3.380637575444e-05 > > > > > > > 41 KSP preconditioned resid norm 2.347651187611e-05 true > resid norm 2.490233544624e-04 ||r(i)||/||b|| 3.380831570825e-05 > > > > > > > 42 KSP preconditioned resid norm 2.352860162514e-05 true > resid norm 2.490191394202e-04 ||r(i)||/||b|| 3.380774345879e-05 > > > > > > > 43 KSP preconditioned resid norm 2.312377506928e-05 true > resid norm 2.490209491359e-04 ||r(i)||/||b|| 3.380798915237e-05 > > > > > > > 44 KSP preconditioned resid norm 2.295770973533e-05 true > resid norm 2.490178136759e-04 ||r(i)||/||b|| 3.380756347093e-05 > > > > > > > 45 KSP preconditioned resid norm 2.833646456041e-05 true > resid norm 2.489991602651e-04 ||r(i)||/||b|| 3.380503101608e-05 > > > > > > > 46 KSP preconditioned resid norm 2.760296424494e-05 true > resid norm 2.490104320666e-04 ||r(i)||/||b|| 3.380656131682e-05 > > > > > > > 47 KSP preconditioned resid norm 2.451504295239e-05 true > resid norm 2.490241388672e-04 ||r(i)||/||b|| 3.380842220189e-05 > > > > > > > 48 KSP preconditioned 
resid norm 2.512391514098e-05 true > resid norm 2.490245923753e-04 ||r(i)||/||b|| 3.380848377180e-05 > > > > > > > 49 KSP preconditioned resid norm 2.483419450528e-05 true > resid norm 2.490273364402e-04 ||r(i)||/||b|| 3.380885631602e-05 > > > > > > > 50 KSP preconditioned resid norm 2.507460538466e-05 true > resid norm 2.490309488780e-04 ||r(i)||/||b|| 3.380934675371e-05 > > > > > > > 51 KSP preconditioned resid norm 2.499708772881e-05 true > resid norm 2.490300908170e-04 ||r(i)||/||b|| 3.380923026022e-05 > > > > > > > 52 KSP preconditioned resid norm 1.059778259446e-05 true > resid norm 2.489352833521e-04 ||r(i)||/||b|| 3.379635885420e-05 > > > > > > > 53 KSP preconditioned resid norm 1.074975117060e-05 true > resid norm 2.489294722901e-04 ||r(i)||/||b|| 3.379556992330e-05 > > > > > > > 54 KSP preconditioned resid norm 1.095242219559e-05 true > resid norm 2.489295454212e-04 ||r(i)||/||b|| 3.379557985184e-05 > > > > > > > 55 KSP preconditioned resid norm 8.359999674720e-06 true > resid norm 2.489673581944e-04 ||r(i)||/||b|| 3.380071345137e-05 > > > > > > > 56 KSP preconditioned resid norm 8.368232998470e-06 true > resid norm 2.489700421343e-04 ||r(i)||/||b|| 3.380107783281e-05 > > > > > > > 57 KSP preconditioned resid norm 8.443378041101e-06 true > resid norm 2.489702900875e-04 ||r(i)||/||b|| 3.380111149584e-05 > > > > > > > 58 KSP preconditioned resid norm 8.647159584302e-06 true > resid norm 2.489640805831e-04 ||r(i)||/||b|| 3.380026847095e-05 > > > > > > > 59 KSP preconditioned resid norm 1.024742790737e-05 true > resid norm 2.489447846660e-04 ||r(i)||/||b|| 3.379764878711e-05 > > > > > > > 60 KSP preconditioned resid norm 1.033394118910e-05 true > resid norm 2.489441404923e-04 ||r(i)||/||b|| 3.379756133175e-05 > > > > > > > 61 KSP preconditioned resid norm 1.030066336446e-05 true > resid norm 2.489399918556e-04 ||r(i)||/||b|| 3.379699809776e-05 > > > > > > > 62 KSP preconditioned resid norm 1.029956398963e-05 true > resid norm 2.489445295139e-04 ||r(i)||/||b|| 3.379761414674e-05 > > > > > > > 63 KSP preconditioned resid norm 1.028190129002e-05 true > resid norm 2.489456200527e-04 ||r(i)||/||b|| 3.379776220225e-05 > > > > > > > 64 KSP preconditioned resid norm 9.878799185773e-06 true > resid norm 2.489488742330e-04 ||r(i)||/||b|| 3.379820400160e-05 > > > > > > > 65 KSP preconditioned resid norm 9.917711104174e-06 true > resid norm 2.489478066593e-04 ||r(i)||/||b|| 3.379805906391e-05 > > > > > > > 66 KSP preconditioned resid norm 1.003572019576e-05 true > resid norm 2.489441995703e-04 ||r(i)||/||b|| 3.379756935240e-05 > > > > > > > 67 KSP preconditioned resid norm 9.924487278236e-06 true > resid norm 2.489475403451e-04 ||r(i)||/||b|| 3.379802290812e-05 > > > > > > > 68 KSP preconditioned resid norm 9.804213483359e-06 true > resid norm 2.489457781760e-04 ||r(i)||/||b|| 3.379778366964e-05 > > > > > > > 69 KSP preconditioned resid norm 9.748922705476e-06 true > resid norm 2.489408473578e-04 ||r(i)||/||b|| 3.379711424383e-05 > > > > > > > 70 KSP preconditioned resid norm 9.886044523689e-06 true > resid norm 2.489514438395e-04 ||r(i)||/||b|| 3.379855286071e-05 > > > > > > > 71 KSP preconditioned resid norm 1.083888478689e-05 true > resid norm 2.489420898851e-04 ||r(i)||/||b|| 3.379728293386e-05 > > > > > > > 72 KSP preconditioned resid norm 1.106561823757e-05 true > resid norm 2.489364778104e-04 ||r(i)||/||b|| 3.379652101821e-05 > > > > > > > 73 KSP preconditioned resid norm 1.132091515426e-05 true > resid norm 2.489456804535e-04 ||r(i)||/||b|| 3.379777040248e-05 > > > > > > > 74 KSP 
preconditioned resid norm 1.330905328963e-05 true > resid norm 2.489317458981e-04 ||r(i)||/||b|| 3.379587859660e-05 > > > > > > > 75 KSP preconditioned resid norm 1.305555302619e-05 true > resid norm 2.489320939810e-04 ||r(i)||/||b|| 3.379592585359e-05 > > > > > > > 76 KSP preconditioned resid norm 1.308083397399e-05 true > resid norm 2.489299951581e-04 ||r(i)||/||b|| 3.379564090977e-05 > > > > > > > 77 KSP preconditioned resid norm 1.320098861853e-05 true > resid norm 2.489323669317e-04 ||r(i)||/||b|| 3.379596291036e-05 > > > > > > > 78 KSP preconditioned resid norm 1.300160788274e-05 true > resid norm 2.489306393356e-04 ||r(i)||/||b|| 3.379572836564e-05 > > > > > > > 79 KSP preconditioned resid norm 1.317651537793e-05 true > resid norm 2.489381364970e-04 ||r(i)||/||b|| 3.379674620752e-05 > > > > > > > 80 KSP preconditioned resid norm 1.309769805765e-05 true > resid norm 2.489285056062e-04 ||r(i)||/||b|| 3.379543868279e-05 > > > > > > > 81 KSP preconditioned resid norm 1.293686496271e-05 true > resid norm 2.489347818072e-04 ||r(i)||/||b|| 3.379629076264e-05 > > > > > > > 82 KSP preconditioned resid norm 1.311788285799e-05 true > resid norm 2.489320040215e-04 ||r(i)||/||b|| 3.379591364037e-05 > > > > > > > 83 KSP preconditioned resid norm 1.313667378798e-05 true > resid norm 2.489329437217e-04 ||r(i)||/||b|| 3.379604121748e-05 > > > > > > > 84 KSP preconditioned resid norm 1.416138205017e-05 true > resid norm 2.489266908838e-04 ||r(i)||/||b|| 3.379519230948e-05 > > > > > > > 85 KSP preconditioned resid norm 1.452253464774e-05 true > resid norm 2.489285688375e-04 ||r(i)||/||b|| 3.379544726729e-05 > > > > > > > 86 KSP preconditioned resid norm 1.426709413370e-05 true > resid norm 2.489362313402e-04 ||r(i)||/||b|| 3.379648755651e-05 > > > > > > > 87 KSP preconditioned resid norm 1.427480849552e-05 true > resid norm 2.489378183000e-04 ||r(i)||/||b|| 3.379670300795e-05 > > > > > > > 88 KSP preconditioned resid norm 1.413870980147e-05 true > resid norm 2.489325756118e-04 ||r(i)||/||b|| 3.379599124153e-05 > > > > > > > 89 KSP preconditioned resid norm 1.353259857657e-05 true > resid norm 2.489318968308e-04 ||r(i)||/||b|| 3.379589908776e-05 > > > > > > > 90 KSP preconditioned resid norm 1.347676448611e-05 true > resid norm 2.489332074417e-04 ||r(i)||/||b|| 3.379607702106e-05 > > > > > > > 91 KSP preconditioned resid norm 1.362825902909e-05 true > resid norm 2.489344974971e-04 ||r(i)||/||b|| 3.379625216367e-05 > > > > > > > 92 KSP preconditioned resid norm 1.346280901052e-05 true > resid norm 2.489302570131e-04 ||r(i)||/||b|| 3.379567646016e-05 > > > > > > > 93 KSP preconditioned resid norm 1.328052169696e-05 true > resid norm 2.489346601224e-04 ||r(i)||/||b|| 3.379627424228e-05 > > > > > > > 94 KSP preconditioned resid norm 1.554682082515e-05 true > resid norm 2.489309078759e-04 ||r(i)||/||b|| 3.379576482365e-05 > > > > > > > 95 KSP preconditioned resid norm 1.557128675775e-05 true > resid norm 2.489317143582e-04 ||r(i)||/||b|| 3.379587431462e-05 > > > > > > > 96 KSP preconditioned resid norm 1.542571813923e-05 true > resid norm 2.489319910303e-04 ||r(i)||/||b|| 3.379591187663e-05 > > > > > > > 97 KSP preconditioned resid norm 1.570516684444e-05 true > resid norm 2.489321980894e-04 ||r(i)||/||b|| 3.379593998772e-05 > > > > > > > 98 KSP preconditioned resid norm 1.600431789899e-05 true > resid norm 2.489297450311e-04 ||r(i)||/||b|| 3.379560695162e-05 > > > > > > > 99 KSP preconditioned resid norm 1.587495554658e-05 true > resid norm 2.489339000570e-04 ||r(i)||/||b|| 3.379617105303e-05 > > > > > > > 
100 KSP preconditioned resid norm 1.621163002878e-05 true > resid norm 2.489299953360e-04 ||r(i)||/||b|| 3.379564093392e-05 > > > > > > > 101 KSP preconditioned resid norm 1.627060872574e-05 true > resid norm 2.489301570161e-04 ||r(i)||/||b|| 3.379566288419e-05 > > > > > > > 102 KSP preconditioned resid norm 1.622931647243e-05 true > resid norm 2.489277930910e-04 ||r(i)||/||b|| 3.379534194913e-05 > > > > > > > 103 KSP preconditioned resid norm 1.612544300282e-05 true > resid norm 2.489317483299e-04 ||r(i)||/||b|| 3.379587892674e-05 > > > > > > > 104 KSP preconditioned resid norm 1.880131646630e-05 true > resid norm 2.489335862583e-04 ||r(i)||/||b|| 3.379612845059e-05 > > > > > > > 105 KSP preconditioned resid norm 1.880563295793e-05 true > resid norm 2.489365017923e-04 ||r(i)||/||b|| 3.379652427408e-05 > > > > > > > 106 KSP preconditioned resid norm 1.860619184027e-05 true > resid norm 2.489362382373e-04 ||r(i)||/||b|| 3.379648849288e-05 > > > > > > > 107 KSP preconditioned resid norm 1.877134148719e-05 true > resid norm 2.489425484523e-04 ||r(i)||/||b|| 3.379734519061e-05 > > > > > > > 108 KSP preconditioned resid norm 1.914810713538e-05 true > resid norm 2.489347415573e-04 ||r(i)||/||b|| 3.379628529818e-05 > > > > > > > 109 KSP preconditioned resid norm 1.220673255622e-05 true > resid norm 2.490196186357e-04 ||r(i)||/||b|| 3.380780851884e-05 > > > > > > > 110 KSP preconditioned resid norm 1.215819132910e-05 true > resid norm 2.490233518072e-04 ||r(i)||/||b|| 3.380831534776e-05 > > > > > > > 111 KSP preconditioned resid norm 1.196565427400e-05 true > resid norm 2.490279438906e-04 ||r(i)||/||b|| 3.380893878570e-05 > > > > > > > 112 KSP preconditioned resid norm 1.171748185197e-05 true > resid norm 2.490242578983e-04 ||r(i)||/||b|| 3.380843836198e-05 > > > > > > > 113 KSP preconditioned resid norm 1.162855824118e-05 true > resid norm 2.490229018536e-04 ||r(i)||/||b|| 3.380825426043e-05 > > > > > > > 114 KSP preconditioned resid norm 1.175594685689e-05 true > resid norm 2.490274328440e-04 ||r(i)||/||b|| 3.380886940415e-05 > > > > > > > 115 KSP preconditioned resid norm 1.167979454122e-05 true > resid norm 2.490271161036e-04 ||r(i)||/||b|| 3.380882640232e-05 > > > > > > > 116 KSP preconditioned resid norm 1.181010893019e-05 true > resid norm 2.490235420657e-04 ||r(i)||/||b|| 3.380834117795e-05 > > > > > > > 117 KSP preconditioned resid norm 1.175206638194e-05 true > resid norm 2.490263165345e-04 ||r(i)||/||b|| 3.380871784992e-05 > > > > > > > 118 KSP preconditioned resid norm 1.183804125791e-05 true > resid norm 2.490221353083e-04 ||r(i)||/||b|| 3.380815019145e-05 > > > > > > > 119 KSP preconditioned resid norm 1.186426973727e-05 true > resid norm 2.490227115336e-04 ||r(i)||/||b|| 3.380822842189e-05 > > > > > > > 120 KSP preconditioned resid norm 1.181986776689e-05 true > resid norm 2.490257884230e-04 ||r(i)||/||b|| 3.380864615159e-05 > > > > > > > 121 KSP preconditioned resid norm 1.131443277370e-05 true > resid norm 2.490259110230e-04 ||r(i)||/||b|| 3.380866279620e-05 > > > > > > > 122 KSP preconditioned resid norm 1.114920075859e-05 true > resid norm 2.490249829382e-04 ||r(i)||/||b|| 3.380853679603e-05 > > > > > > > 123 KSP preconditioned resid norm 1.082073321672e-05 true > resid norm 2.490314084868e-04 ||r(i)||/||b|| 3.380940915187e-05 > > > > > > > 124 KSP preconditioned resid norm 3.307785860990e-06 true > resid norm 2.490613501549e-04 ||r(i)||/||b|| 3.381347414155e-05 > > > > > > > 125 KSP preconditioned resid norm 3.287051720572e-06 true > resid norm 2.490584648195e-04 ||r(i)||/||b|| 
3.381308241794e-05 > > > > > > > 126 KSP preconditioned resid norm 3.286797046069e-06 true > resid norm 2.490654396386e-04 ||r(i)||/||b|| 3.381402934473e-05 > > > > > > > 127 KSP preconditioned resid norm 3.311592899411e-06 true > resid norm 2.490627973588e-04 ||r(i)||/||b|| 3.381367061922e-05 > > > > > > > 128 KSP preconditioned resid norm 3.560993694635e-06 true > resid norm 2.490571732816e-04 ||r(i)||/||b|| 3.381290707406e-05 > > > > > > > 129 KSP preconditioned resid norm 3.411994617661e-06 true > resid norm 2.490652122141e-04 ||r(i)||/||b|| 3.381399846875e-05 > > > > > > > 130 KSP preconditioned resid norm 3.412383310721e-06 true > resid norm 2.490633454022e-04 ||r(i)||/||b|| 3.381374502359e-05 > > > > > > > 131 KSP preconditioned resid norm 3.288320044878e-06 true > resid norm 2.490639470096e-04 ||r(i)||/||b|| 3.381382669999e-05 > > > > > > > 132 KSP preconditioned resid norm 3.273215756565e-06 true > resid norm 2.490640390847e-04 ||r(i)||/||b|| 3.381383920043e-05 > > > > > > > 133 KSP preconditioned resid norm 3.236969051459e-06 true > resid norm 2.490678216102e-04 ||r(i)||/||b|| 3.381435272985e-05 > > > > > > > 134 KSP preconditioned resid norm 3.203260913942e-06 true > resid norm 2.490640965346e-04 ||r(i)||/||b|| 3.381384700005e-05 > > > > > > > 135 KSP preconditioned resid norm 3.224117152353e-06 true > resid norm 2.490655026376e-04 ||r(i)||/||b|| 3.381403789770e-05 > > > > > > > 136 KSP preconditioned resid norm 3.221577997984e-06 true > resid norm 2.490684737611e-04 ||r(i)||/||b|| 3.381444126823e-05 > > > > > > > 137 KSP preconditioned resid norm 3.195936222128e-06 true > resid norm 2.490673982333e-04 ||r(i)||/||b|| 3.381429525066e-05 > > > > > > > 138 KSP preconditioned resid norm 3.207528137426e-06 true > resid norm 2.490641247196e-04 ||r(i)||/||b|| 3.381385082655e-05 > > > > > > > 139 KSP preconditioned resid norm 3.240134271963e-06 true > resid norm 2.490615861251e-04 ||r(i)||/||b|| 3.381350617773e-05 > > > > > > > 140 KSP preconditioned resid norm 2.698833607230e-06 true > resid norm 2.490638954889e-04 ||r(i)||/||b|| 3.381381970535e-05 > > > > > > > 141 KSP preconditioned resid norm 2.599151209137e-06 true > resid norm 2.490657106698e-04 ||r(i)||/||b|| 3.381406614091e-05 > > > > > > > 142 KSP preconditioned resid norm 2.633939920994e-06 true > resid norm 2.490707754695e-04 ||r(i)||/||b|| 3.381475375652e-05 > > > > > > > 143 KSP preconditioned resid norm 2.519609221376e-06 true > resid norm 2.490639100480e-04 ||r(i)||/||b|| 3.381382168195e-05 > > > > > > > 144 KSP preconditioned resid norm 3.768526937684e-06 true > resid norm 2.490654096698e-04 ||r(i)||/||b|| 3.381402527606e-05 > > > > > > > 145 KSP preconditioned resid norm 3.707841943289e-06 true > resid norm 2.490630207923e-04 ||r(i)||/||b|| 3.381370095336e-05 > > > > > > > 146 KSP preconditioned resid norm 3.698827503486e-06 true > resid norm 2.490646071561e-04 ||r(i)||/||b|| 3.381391632387e-05 > > > > > > > 147 KSP preconditioned resid norm 3.642747039615e-06 true > resid norm 2.490610990161e-04 ||r(i)||/||b|| 3.381344004604e-05 > > > > > > > 148 KSP preconditioned resid norm 3.613100087842e-06 true > resid norm 2.490617159023e-04 ||r(i)||/||b|| 3.381352379676e-05 > > > > > > > 149 KSP preconditioned resid norm 3.637646399299e-06 true > resid norm 2.490648063023e-04 ||r(i)||/||b|| 3.381394336069e-05 > > > > > > > 150 KSP preconditioned resid norm 3.640235367864e-06 true > resid norm 2.490648516718e-04 ||r(i)||/||b|| 3.381394952022e-05 > > > > > > > 151 KSP preconditioned resid norm 3.724708848977e-06 true > resid norm 
2.490622201040e-04 ||r(i)||/||b|| 3.381359224901e-05 > > > > > > > 152 KSP preconditioned resid norm 3.665185002770e-06 true > resid norm 2.490664302790e-04 ||r(i)||/||b|| 3.381416383766e-05 > > > > > > > 153 KSP preconditioned resid norm 3.348992579120e-06 true > resid norm 2.490655722697e-04 ||r(i)||/||b|| 3.381404735121e-05 > > > > > > > 154 KSP preconditioned resid norm 3.309431137943e-06 true > resid norm 2.490727563300e-04 ||r(i)||/||b|| 3.381502268535e-05 > > > > > > > 155 KSP preconditioned resid norm 3.299031245428e-06 true > resid norm 2.490688392843e-04 ||r(i)||/||b|| 3.381449089298e-05 > > > > > > > 156 KSP preconditioned resid norm 3.297127463503e-06 true > resid norm 2.490642207769e-04 ||r(i)||/||b|| 3.381386386763e-05 > > > > > > > 157 KSP preconditioned resid norm 3.297370198641e-06 true > resid norm 2.490666651723e-04 ||r(i)||/||b|| 3.381419572764e-05 > > > > > > > 158 KSP preconditioned resid norm 3.290873165210e-06 true > resid norm 2.490679189538e-04 ||r(i)||/||b|| 3.381436594557e-05 > > > > > > > 159 KSP preconditioned resid norm 3.346705292419e-06 true > resid norm 2.490617329776e-04 ||r(i)||/||b|| 3.381352611496e-05 > > > > > > > 160 KSP preconditioned resid norm 3.429583550890e-06 true > resid norm 2.490675116236e-04 ||r(i)||/||b|| 3.381431064494e-05 > > > > > > > 161 KSP preconditioned resid norm 3.425238504679e-06 true > resid norm 2.490648199058e-04 ||r(i)||/||b|| 3.381394520756e-05 > > > > > > > 162 KSP preconditioned resid norm 3.423484857849e-06 true > resid norm 2.490723208298e-04 ||r(i)||/||b|| 3.381496356025e-05 > > > > > > > 163 KSP preconditioned resid norm 3.383655922943e-06 true > resid norm 2.490659981249e-04 ||r(i)||/||b|| 3.381410516686e-05 > > > > > > > 164 KSP preconditioned resid norm 3.477197358452e-06 true > resid norm 2.490665979073e-04 ||r(i)||/||b|| 3.381418659549e-05 > > > > > > > 165 KSP preconditioned resid norm 3.454672202601e-06 true > resid norm 2.490651358644e-04 ||r(i)||/||b|| 3.381398810323e-05 > > > > > > > 166 KSP preconditioned resid norm 3.399075522566e-06 true > resid norm 2.490678159511e-04 ||r(i)||/||b|| 3.381435196154e-05 > > > > > > > 167 KSP preconditioned resid norm 3.305455787400e-06 true > resid norm 2.490651924523e-04 ||r(i)||/||b|| 3.381399578581e-05 > > > > > > > 168 KSP preconditioned resid norm 3.368445533284e-06 true > resid norm 2.490688061735e-04 ||r(i)||/||b|| 3.381448639774e-05 > > > > > > > 169 KSP preconditioned resid norm 2.981519724814e-06 true > resid norm 2.490676378334e-04 ||r(i)||/||b|| 3.381432777964e-05 > > > > > > > 170 KSP preconditioned resid norm 3.034423065539e-06 true > resid norm 2.490694458885e-04 ||r(i)||/||b|| 3.381457324777e-05 > > > > > > > 171 KSP preconditioned resid norm 2.885972780503e-06 true > resid norm 2.490688033729e-04 ||r(i)||/||b|| 3.381448601752e-05 > > > > > > > 172 KSP preconditioned resid norm 2.892491075033e-06 true > resid norm 2.490692993765e-04 ||r(i)||/||b|| 3.381455335678e-05 > > > > > > > 173 KSP preconditioned resid norm 2.921316177611e-06 true > resid norm 2.490697629787e-04 ||r(i)||/||b|| 3.381461629709e-05 > > > > > > > 174 KSP preconditioned resid norm 2.999889222269e-06 true > resid norm 2.490707272626e-04 ||r(i)||/||b|| 3.381474721178e-05 > > > > > > > 175 KSP preconditioned resid norm 2.975590207575e-06 true > resid norm 2.490685439925e-04 ||r(i)||/||b|| 3.381445080310e-05 > > > > > > > 176 KSP preconditioned resid norm 2.983065843597e-06 true > resid norm 2.490701883671e-04 ||r(i)||/||b|| 3.381467404937e-05 > > > > > > > 177 KSP preconditioned resid norm 
2.965959610245e-06 true > resid norm 2.490711538630e-04 ||r(i)||/||b|| 3.381480512861e-05 > > > > > > > 178 KSP preconditioned resid norm 3.005389788827e-06 true > resid norm 2.490751808095e-04 ||r(i)||/||b|| 3.381535184150e-05 > > > > > > > 179 KSP preconditioned resid norm 2.956581668772e-06 true > resid norm 2.490653125636e-04 ||r(i)||/||b|| 3.381401209257e-05 > > > > > > > 180 KSP preconditioned resid norm 2.937498883661e-06 true > resid norm 2.490666056653e-04 ||r(i)||/||b|| 3.381418764874e-05 > > > > > > > 181 KSP preconditioned resid norm 2.913227475431e-06 true > resid norm 2.490682436979e-04 ||r(i)||/||b|| 3.381441003402e-05 > > > > > > > 182 KSP preconditioned resid norm 3.048172862254e-06 true > resid norm 2.490719669872e-04 ||r(i)||/||b|| 3.381491552130e-05 > > > > > > > 183 KSP preconditioned resid norm 3.023868104933e-06 true > resid norm 2.490648745555e-04 ||r(i)||/||b|| 3.381395262699e-05 > > > > > > > 184 KSP preconditioned resid norm 2.985947506400e-06 true > resid norm 2.490638818852e-04 ||r(i)||/||b|| 3.381381785846e-05 > > > > > > > 185 KSP preconditioned resid norm 2.840032055776e-06 true > resid norm 2.490701112392e-04 ||r(i)||/||b|| 3.381466357820e-05 > > > > > > > 186 KSP preconditioned resid norm 2.229279683815e-06 true > resid norm 2.490609220680e-04 ||r(i)||/||b|| 3.381341602292e-05 > > > > > > > 187 KSP preconditioned resid norm 2.441513276379e-06 true > resid norm 2.490674056899e-04 ||r(i)||/||b|| 3.381429626300e-05 > > > > > > > 188 KSP preconditioned resid norm 2.467046864016e-06 true > resid norm 2.490691622632e-04 ||r(i)||/||b|| 3.381453474178e-05 > > > > > > > 189 KSP preconditioned resid norm 2.482124586361e-06 true > resid norm 2.490664992339e-04 ||r(i)||/||b|| 3.381417319923e-05 > > > > > > > 190 KSP preconditioned resid norm 2.470564926502e-06 true > resid norm 2.490617019713e-04 ||r(i)||/||b|| 3.381352190543e-05 > > > > > > > 191 KSP preconditioned resid norm 2.457947086578e-06 true > resid norm 2.490628644250e-04 ||r(i)||/||b|| 3.381367972437e-05 > > > > > > > 192 KSP preconditioned resid norm 2.469444741724e-06 true > resid norm 2.490639416335e-04 ||r(i)||/||b|| 3.381382597011e-05 > > > > > > > 193 KSP preconditioned resid norm 2.469951525219e-06 true > resid norm 2.490599769764e-04 ||r(i)||/||b|| 3.381328771385e-05 > > > > > > > 194 KSP preconditioned resid norm 2.467486786643e-06 true > resid norm 2.490630178622e-04 ||r(i)||/||b|| 3.381370055556e-05 > > > > > > > 195 KSP preconditioned resid norm 2.409684391404e-06 true > resid norm 2.490640302606e-04 ||r(i)||/||b|| 3.381383800245e-05 > > > > > > > 196 KSP preconditioned resid norm 2.456046691135e-06 true > resid norm 2.490637730235e-04 ||r(i)||/||b|| 3.381380307900e-05 > > > > > > > 197 KSP preconditioned resid norm 2.300015653805e-06 true > resid norm 2.490615406913e-04 ||r(i)||/||b|| 3.381350000947e-05 > > > > > > > 198 KSP preconditioned resid norm 2.238328275301e-06 true > resid norm 2.490647641246e-04 ||r(i)||/||b|| 3.381393763449e-05 > > > > > > > 199 KSP preconditioned resid norm 2.317293820319e-06 true > resid norm 2.490641611282e-04 ||r(i)||/||b|| 3.381385576951e-05 > > > > > > > 200 KSP preconditioned resid norm 2.359590971314e-06 true > resid norm 2.490685242974e-04 ||r(i)||/||b|| 3.381444812922e-05 > > > > > > > 201 KSP preconditioned resid norm 2.311199691596e-06 true > resid norm 2.490656791753e-04 ||r(i)||/||b|| 3.381406186510e-05 > > > > > > > 202 KSP preconditioned resid norm 2.328772904196e-06 true > resid norm 2.490651045523e-04 ||r(i)||/||b|| 3.381398385220e-05 > > > > > > > 
203 KSP preconditioned resid norm 2.332731604717e-06 true > resid norm 2.490649960574e-04 ||r(i)||/||b|| 3.381396912253e-05 > > > > > > > 204 KSP preconditioned resid norm 2.357629383490e-06 true > resid norm 2.490686317727e-04 ||r(i)||/||b|| 3.381446272046e-05 > > > > > > > 205 KSP preconditioned resid norm 2.374856180299e-06 true > resid norm 2.490645897176e-04 ||r(i)||/||b|| 3.381391395637e-05 > > > > > > > 206 KSP preconditioned resid norm 2.340395514404e-06 true > resid norm 2.490618341127e-04 ||r(i)||/||b|| 3.381353984542e-05 > > > > > > > 207 KSP preconditioned resid norm 2.314963680954e-06 true > resid norm 2.490676153984e-04 ||r(i)||/||b|| 3.381432473379e-05 > > > > > > > 208 KSP preconditioned resid norm 2.448070953106e-06 true > resid norm 2.490644606776e-04 ||r(i)||/||b|| 3.381389643743e-05 > > > > > > > 209 KSP preconditioned resid norm 2.428805110632e-06 true > resid norm 2.490635817597e-04 ||r(i)||/||b|| 3.381377711234e-05 > > > > > > > 210 KSP preconditioned resid norm 2.537929937808e-06 true > resid norm 2.490680589404e-04 ||r(i)||/||b|| 3.381438495066e-05 > > > > > > > 211 KSP preconditioned resid norm 2.515909029682e-06 true > resid norm 2.490687803038e-04 ||r(i)||/||b|| 3.381448288557e-05 > > > > > > > 212 KSP preconditioned resid norm 2.497907513266e-06 true > resid norm 2.490618016885e-04 ||r(i)||/||b|| 3.381353544340e-05 > > > > > > > 213 KSP preconditioned resid norm 1.783501869502e-06 true > resid norm 2.490632647470e-04 ||r(i)||/||b|| 3.381373407354e-05 > > > > > > > 214 KSP preconditioned resid norm 1.767420653144e-06 true > resid norm 2.490685569328e-04 ||r(i)||/||b|| 3.381445255992e-05 > > > > > > > 215 KSP preconditioned resid norm 1.854926068272e-06 true > resid norm 2.490609365464e-04 ||r(i)||/||b|| 3.381341798856e-05 > > > > > > > 216 KSP preconditioned resid norm 1.818308539774e-06 true > resid norm 2.490639142283e-04 ||r(i)||/||b|| 3.381382224948e-05 > > > > > > > 217 KSP preconditioned resid norm 1.809431578070e-06 true > resid norm 2.490605125049e-04 ||r(i)||/||b|| 3.381336041915e-05 > > > > > > > 218 KSP preconditioned resid norm 1.789862735999e-06 true > resid norm 2.490564024901e-04 ||r(i)||/||b|| 3.381280242859e-05 > > > > > > > 219 KSP preconditioned resid norm 1.769239890163e-06 true > resid norm 2.490647825316e-04 ||r(i)||/||b|| 3.381394013349e-05 > > > > > > > 220 KSP preconditioned resid norm 1.780760773109e-06 true > resid norm 2.490622606663e-04 ||r(i)||/||b|| 3.381359775589e-05 > > > > > > > 221 KSP preconditioned resid norm 5.009024913368e-07 true > resid norm 2.490659101637e-04 ||r(i)||/||b|| 3.381409322492e-05 > > > > > > > 222 KSP preconditioned resid norm 4.974450322799e-07 true > resid norm 2.490714287402e-04 ||r(i)||/||b|| 3.381484244693e-05 > > > > > > > 223 KSP preconditioned resid norm 4.938819481519e-07 true > resid norm 2.490665661715e-04 ||r(i)||/||b|| 3.381418228693e-05 > > > > > > > 224 KSP preconditioned resid norm 4.973231831266e-07 true > resid norm 2.490725000995e-04 ||r(i)||/||b|| 3.381498789855e-05 > > > > > > > 225 KSP preconditioned resid norm 5.086864036771e-07 true > resid norm 2.490664132954e-04 ||r(i)||/||b|| 3.381416153192e-05 > > > > > > > 226 KSP preconditioned resid norm 5.046954570561e-07 true > resid norm 2.490698772594e-04 ||r(i)||/||b|| 3.381463181226e-05 > > > > > > > 227 KSP preconditioned resid norm 5.086852920874e-07 true > resid norm 2.490703544723e-04 ||r(i)||/||b|| 3.381469660041e-05 > > > > > > > 228 KSP preconditioned resid norm 5.182381756169e-07 true > resid norm 2.490665200032e-04 ||r(i)||/||b|| 
3.381417601896e-05 > > > > > > > 229 KSP preconditioned resid norm 5.261455182896e-07 true > resid norm 2.490697169472e-04 ||r(i)||/||b|| 3.381461004770e-05 > > > > > > > 230 KSP preconditioned resid norm 5.265262522400e-07 true > resid norm 2.490726890541e-04 ||r(i)||/||b|| 3.381501355172e-05 > > > > > > > 231 KSP preconditioned resid norm 5.220652263946e-07 true > resid norm 2.490689325236e-04 ||r(i)||/||b|| 3.381450355149e-05 > > > > > > > 232 KSP preconditioned resid norm 5.256466259888e-07 true > resid norm 2.490694033989e-04 ||r(i)||/||b|| 3.381456747923e-05 > > > > > > > 233 KSP preconditioned resid norm 5.443022648374e-07 true > resid norm 2.490650183144e-04 ||r(i)||/||b|| 3.381397214423e-05 > > > > > > > 234 KSP preconditioned resid norm 5.562619006436e-07 true > resid norm 2.490764576883e-04 ||r(i)||/||b|| 3.381552519520e-05 > > > > > > > 235 KSP preconditioned resid norm 5.998148629545e-07 true > resid norm 2.490714032716e-04 ||r(i)||/||b|| 3.381483898922e-05 > > > > > > > 236 KSP preconditioned resid norm 6.498977322955e-07 true > resid norm 2.490650270144e-04 ||r(i)||/||b|| 3.381397332537e-05 > > > > > > > 237 KSP preconditioned resid norm 6.503686003429e-07 true > resid norm 2.490706976108e-04 ||r(i)||/||b|| 3.381474318615e-05 > > > > > > > 238 KSP preconditioned resid norm 6.566719023119e-07 true > resid norm 2.490664107559e-04 ||r(i)||/||b|| 3.381416118714e-05 > > > > > > > 239 KSP preconditioned resid norm 6.549737473208e-07 true > resid norm 2.490721547909e-04 ||r(i)||/||b|| 3.381494101821e-05 > > > > > > > 240 KSP preconditioned resid norm 6.616898981418e-07 true > resid norm 2.490679659838e-04 ||r(i)||/||b|| 3.381437233053e-05 > > > > > > > 241 KSP preconditioned resid norm 6.829917691021e-07 true > resid norm 2.490728328614e-04 ||r(i)||/||b|| 3.381503307553e-05 > > > > > > > 242 KSP preconditioned resid norm 7.030239869389e-07 true > resid norm 2.490706345115e-04 ||r(i)||/||b|| 3.381473461955e-05 > > > > > > > 243 KSP preconditioned resid norm 7.018435683340e-07 true > resid norm 2.490650978460e-04 ||r(i)||/||b|| 3.381398294172e-05 > > > > > > > 244 KSP preconditioned resid norm 7.058047080376e-07 true > resid norm 2.490685975642e-04 ||r(i)||/||b|| 3.381445807618e-05 > > > > > > > 245 KSP preconditioned resid norm 6.896300385099e-07 true > resid norm 2.490708566380e-04 ||r(i)||/||b|| 3.381476477625e-05 > > > > > > > 246 KSP preconditioned resid norm 7.093960074437e-07 true > resid norm 2.490667427871e-04 ||r(i)||/||b|| 3.381420626490e-05 > > > > > > > 247 KSP preconditioned resid norm 7.817121711853e-07 true > resid norm 2.490692299030e-04 ||r(i)||/||b|| 3.381454392480e-05 > > > > > > > 248 KSP preconditioned resid norm 7.976109778309e-07 true > resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > > > > 249 KSP preconditioned resid norm 7.855322750445e-07 true > resid norm 2.490720861966e-04 ||r(i)||/||b|| 3.381493170560e-05 > > > > > > > 250 KSP preconditioned resid norm 7.778531114042e-07 true > resid norm 2.490673235034e-04 ||r(i)||/||b|| 3.381428510506e-05 > > > > > > > 251 KSP preconditioned resid norm 7.848682182070e-07 true > resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > > > > 252 KSP preconditioned resid norm 7.967291867330e-07 true > resid norm 2.490724820229e-04 ||r(i)||/||b|| 3.381498544442e-05 > > > > > > > 253 KSP preconditioned resid norm 7.865012959525e-07 true > resid norm 2.490666662028e-04 ||r(i)||/||b|| 3.381419586754e-05 > > > > > > > 254 KSP preconditioned resid norm 7.656025385804e-07 true > resid norm 
2.490686283214e-04 ||r(i)||/||b|| 3.381446225190e-05 > > > > > > > 255 KSP preconditioned resid norm 7.757018653468e-07 true > resid norm 2.490655983763e-04 ||r(i)||/||b|| 3.381405089553e-05 > > > > > > > 256 KSP preconditioned resid norm 6.686490372981e-07 true > resid norm 2.490715698964e-04 ||r(i)||/||b|| 3.381486161081e-05 > > > > > > > 257 KSP preconditioned resid norm 6.596005109428e-07 true > resid norm 2.490666403003e-04 ||r(i)||/||b|| 3.381419235092e-05 > > > > > > > 258 KSP preconditioned resid norm 6.681742296333e-07 true > resid norm 2.490683725835e-04 ||r(i)||/||b|| 3.381442753198e-05 > > > > > > > 259 KSP preconditioned resid norm 1.089245482033e-06 true > resid norm 2.490688086568e-04 ||r(i)||/||b|| 3.381448673488e-05 > > > > > > > 260 KSP preconditioned resid norm 1.099844873189e-06 true > resid norm 2.490690703265e-04 ||r(i)||/||b|| 3.381452226011e-05 > > > > > > > 261 KSP preconditioned resid norm 1.112925540869e-06 true > resid norm 2.490664481058e-04 ||r(i)||/||b|| 3.381416625790e-05 > > > > > > > 262 KSP preconditioned resid norm 1.113056910480e-06 true > resid norm 2.490658753273e-04 ||r(i)||/||b|| 3.381408849541e-05 > > > > > > > 263 KSP preconditioned resid norm 1.104801535149e-06 true > resid norm 2.490736510776e-04 ||r(i)||/||b|| 3.381514415953e-05 > > > > > > > 264 KSP preconditioned resid norm 1.158709147873e-06 true > resid norm 2.490607531152e-04 ||r(i)||/||b|| 3.381339308528e-05 > > > > > > > 265 KSP preconditioned resid norm 1.178985740182e-06 true > resid norm 2.490727895619e-04 ||r(i)||/||b|| 3.381502719703e-05 > > > > > > > 266 KSP preconditioned resid norm 1.165130533478e-06 true > resid norm 2.490639076693e-04 ||r(i)||/||b|| 3.381382135901e-05 > > > > > > > 267 KSP preconditioned resid norm 1.181364114499e-06 true > resid norm 2.490667871436e-04 ||r(i)||/||b|| 3.381421228690e-05 > > > > > > > 268 KSP preconditioned resid norm 1.170295348543e-06 true > resid norm 2.490662613306e-04 ||r(i)||/||b|| 3.381414090063e-05 > > > > > > > 269 KSP preconditioned resid norm 1.213243016230e-06 true > resid norm 2.490666173719e-04 ||r(i)||/||b|| 3.381418923808e-05 > > > > > > > 270 KSP preconditioned resid norm 1.239691953997e-06 true > resid norm 2.490678323197e-04 ||r(i)||/||b|| 3.381435418381e-05 > > > > > > > 271 KSP preconditioned resid norm 1.219891740100e-06 true > resid norm 2.490625009256e-04 ||r(i)||/||b|| 3.381363037437e-05 > > > > > > > 272 KSP preconditioned resid norm 1.231321334346e-06 true > resid norm 2.490659733696e-04 ||r(i)||/||b|| 3.381410180599e-05 > > > > > > > 273 KSP preconditioned resid norm 1.208183234158e-06 true > resid norm 2.490685987255e-04 ||r(i)||/||b|| 3.381445823385e-05 > > > > > > > 274 KSP preconditioned resid norm 1.211768545589e-06 true > resid norm 2.490671548953e-04 ||r(i)||/||b|| 3.381426221421e-05 > > > > > > > 275 KSP preconditioned resid norm 1.209433459842e-06 true > resid norm 2.490669016096e-04 ||r(i)||/||b|| 3.381422782722e-05 > > > > > > > 276 KSP preconditioned resid norm 1.223729184405e-06 true > resid norm 2.490658128014e-04 ||r(i)||/||b|| 3.381408000666e-05 > > > > > > > 277 KSP preconditioned resid norm 1.243915201868e-06 true > resid norm 2.490693375756e-04 ||r(i)||/||b|| 3.381455854282e-05 > > > > > > > 278 KSP preconditioned resid norm 1.231994655529e-06 true > resid norm 2.490682988311e-04 ||r(i)||/||b|| 3.381441751910e-05 > > > > > > > 279 KSP preconditioned resid norm 1.227930683777e-06 true > resid norm 2.490667825866e-04 ||r(i)||/||b|| 3.381421166823e-05 > > > > > > > 280 KSP preconditioned resid norm 
> [quoted solver log, iterations 280-346: the preconditioned residual norm
> oscillates between roughly 1.17e-06 and 1.99e-06 while the true residual
> norm stagnates at 2.4907e-04, ||r(i)||/||b|| ~ 3.381e-05]

On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley <knepley at gmail.com> wrote:
On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:

>   ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
>   ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);
>
>   ierr = VecAssemblyBegin(x);
>   ierr = VecAssemblyEnd(x);

This is probably unnecessary.

>   ierr = VecAssemblyBegin(b);
>   ierr = VecAssemblyEnd(b);

This is probably unnecessary.

>   ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp);
>   ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8

Is your rhs consistent with this nullspace?

>   // KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
>   KSPSetOperators(ksp,A,A);
>
>   KSPSetType(ksp,KSPBCGS);
>
>   KSPSetComputeSingularValues(ksp, PETSC_TRUE);
> #if defined(__HYPRE__)
>   KSPGetPC(ksp, &pc);
>   PCSetType(pc, PCHYPRE);
>   PCHYPRESetType(pc,"boomeramg");

This is terribly unnecessary.
You just use

  -pc_type hypre -pc_hypre_type boomeramg

or

  -pc_type gamg

> #else
>   KSPSetType(ksp,KSPBCGSL);
>   KSPBCGSLSetEll(ksp,2);
> #endif /* defined(__HYPRE__) */
>
>   KSPSetFromOptions(ksp);
>   KSPSetUp(ksp);
>
>   ierr = KSPSolve(ksp,b,x);
>
> command line

You did not provide any of what I asked for in the previous mail.

   Matt

On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley <knepley at gmail.com> wrote:
On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:

> hi,
>
> I implemented a HYPRE preconditioner for my study because, without a
> preconditioner, the PETSc solver takes thousands of iterations to converge
> on fine-grid simulations.
>
> With HYPRE, depending on the parallel partition, it takes HYPRE forever to
> do anything; what I observe in the output file is that the simulation
> hangs with no output.
>
> Any idea what happened? I will post a snippet of code.

1) For any question about convergence, we need to see the output of

  -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason

2) Hypre has many preconditioners; which one are you talking about?

3) PETSc has some preconditioners in common with Hypre, like AMG.

  Thanks,

    Matt
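A minimal sketch of the options-driven setup Matt is recommending, assuming a
PETSc 3.8-era build; the function name SolveWithOptions is made up for this
illustration, and error handling is reduced to CHKERRQ:

  #include <petscksp.h>

  /* Build a KSP with nothing about the preconditioner hard-coded; the whole
     solver is then chosen at run time, e.g. with
       -ksp_type gmres -pc_type hypre -pc_hypre_type boomeramg
     or
       -ksp_type gmres -pc_type gamg                                      */
  PetscErrorCode SolveWithOptions(Mat A, Vec b, Vec x)
  {
    KSP            ksp;
    PetscErrorCode ierr;

    ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); /* reads -ksp_* and -pc_* */
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
    ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
    return 0;
  }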
--
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

--
Hao Zhang
Dept. of Applied Mathematics and Statistics,
Stony Brook University,
Stony Brook, New York, 11790

From bsmith at mcs.anl.gov  Sun Oct 22 14:57:17 2017
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Sun, 22 Oct 2017 14:57:17 -0500
Subject: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To:
References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov>
 <64D7FC90-EBE8-409C-B300-04A118155A98@mcs.anl.gov>
 <5DDD356C-2000-4803-9F3A-02B429898A7A@mcs.anl.gov>
Message-ID: <8C8ADFC3-C71F-4DE1-B21C-5BF50EBC652D@mcs.anl.gov>

   One thing important to understand is that multigrid is an optimal, or
nearly optimal, algorithm. This means that, when it works, the number of
iterations remains nearly constant as you refine the mesh, regardless of the
problem size and number of processes. Simple preconditioners such as ILU,
block Jacobi, and one-level additive Schwarz have iteration counts that grow
with the problem size, and likely also with the number of processes. Thus
those algorithms become essentially impractical for very large problems,
while multigrid can remain practical (when it works).

   Good luck

   Barry

> On Oct 22, 2017, at 2:35 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
>
> Thanks for all the inputs. Before the finer-grid simulations, HYPRE wasn't
> used and the simulations were just fine. I will do a few tests and post
> more information later.
>
> On Sun, Oct 22, 2017 at 12:11 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
>> On Oct 21, 2017, at 11:16 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
>>
>> The reason is that when I do a finer-grid simulation the matrix becomes
>> stiffer.
>
>    Are you saying that for a finer grid, but everything else the same, the
> convergence of hypre (with the same GMRES) with the same options gets much
> worse? This normally will not happen; that is the fundamental beauty of
> multigrid methods (when they work well).
>
>    Yes, the matrix condition number increases, but multigrid doesn't care
> about that; its number of iterations should remain pretty much the same.
>
>    Something must be different (with this finer-grid case): either the
> mesh becomes horrible, or the physics changes, or there are errors in the
> code that lead to the problem.
>
>    What happens if you just refine the mesh a little? Then a little more?
> Then a little more? Does the convergence rate suddenly go bad at some
> point, or does it just get worse slowly?
>
>    Barry
>
>> Much larger condition number. Just to give you some perspective: it takes
>> 6000 iterations to converge, and the solver does converge. I want to
>> reduce the number of iterations while keeping the convergence rate. That
>> is the main drive for all this heavy lifting. Please advise; a snippet
>> will be provided upon request.
>>
>> Thanks again.
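A sketch of the gradual refinement study Barry suggests, written as a shell
loop; the executable name ns_solver and the -grid option are placeholders for
the user's own code, while the -ksp_* and -pc_* options are standard PETSc
ones:

  # Same solver options, successively finer grids; watch the iteration count.
  for n in 32 64 128 256; do
    mpiexec -np 24 ./ns_solver -grid $n,$n,$n \
        -ksp_type gmres -pc_type hypre -pc_hypre_type boomeramg \
        -ksp_monitor_true_residual -ksp_converged_reason
  done
  # With healthy multigrid the iteration count stays roughly flat as n
  # grows; a sudden blow-up at one refinement level points at the mesh, the
  # physics, or a bug rather than at the preconditioner.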
>> On Sun, Oct 22, 2017 at 12:08 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>
>>    Oh, you changed the KSP but not hypre. I did not understand this.
>>
>>    Why not just use GMRES all the time? Why mess with BCGS if it is not
>> robust? It is not worth the small optimization if it breaks everything.
>>
>>    Barry
>>
>>> On Oct 21, 2017, at 11:05 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
>>>
>>> This is the initial pressure solver output regarding the use of PETSc.
>>> It failed to converge after 40000 iterations, so GMRES is then used.
>>>
>>> [iterations 39987-40000: the preconditioned residual norm stagnates at
>>> 3.8531e-08 and the true residual norm at 1.1474e-05,
>>> ||r(i)||/||b|| ~ 1.5577e-06]
>>> Linear solve did not converge due to DIVERGED_ITS iterations 40000
>>> KSP Object: 24 MPI processes
>>>   type: bcgs
>>>   maximum iterations=40000, initial guess is zero
>>>   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
>>>   left preconditioning
>>>   using PRECONDITIONED norm type for convergence test
>>> PC Object: 24 MPI processes
>>>   type: hypre
>>>     HYPRE BoomerAMG preconditioning
>>>       Cycle type V
>>>       Maximum number of levels 25
>>>       Maximum number of iterations PER hypre call 1
>>>       Convergence tolerance PER hypre call 0.
>>>       Threshold for strong coupling 0.25
>>>       Interpolation truncation factor 0.
>>>       Interpolation: max elements per row 0
>>>       Number of levels of aggressive coarsening 0
>>>       Number of paths for aggressive coarsening 1
>>>       Maximum row sums 0.9
>>>       Sweeps down         1
>>>       Sweeps up           1
>>>       Sweeps on coarse    1
>>>       Relax down          symmetric-SOR/Jacobi
>>>       Relax up            symmetric-SOR/Jacobi
>>>       Relax on coarse     Gaussian-elimination
>>>       Relax weight  (all)      1.
>>>       Outer relax weight (all) 1.
>>>       Using CF-relaxation
>>>       Not using more complex smoothers.
>>>       Measure type        local
>>>       Coarsen type        Falgout
>>>       Interpolation type  classical
>>>   linear system matrix = precond matrix:
>>>   Mat Object: A 24 MPI processes
>>>     type: mpiaij
>>>     rows=497664, cols=497664
>>>     total: nonzeros=3363552, allocated nonzeros=6967296
>>>     total number of mallocs used during MatSetValues calls =0
>>>       has attached null space
>>>       not using I-node (on process 0) routines
>>>
>>> The solution diverges for p0! The residual is 3.853123e-08. Solve again
>>> using GMRES!
>>> KSP Object: 24 MPI processes
>>>   type: gmres
>>>     restart=30, using Classical (unmodified) Gram-Schmidt
>>>     Orthogonalization with no iterative refinement
>>>     happy breakdown tolerance 1e-30
>>>   maximum iterations=40000, initial guess is zero
>>>   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
>>>   left preconditioning
>>>   using PRECONDITIONED norm type for convergence test
>>> PC Object: 24 MPI processes
>>>   [hypre BoomerAMG settings and matrix information identical to the view
>>>   above]
>>>   0 KSP preconditioned resid norm 1.593802941804e+01 true resid norm 7.365742695119e+00 ||r(i)||/||b|| 1.000000000000e+00
>>>   1 KSP preconditioned resid norm 6.338666661133e-01 true resid norm 2.880722209358e+00 ||r(i)||/||b|| 3.910973174867e-01
>>>   2 KSP preconditioned resid norm 3.913420828350e-02 true resid norm 9.544089760671e-01 ||r(i)||/||b|| 1.295740315093e-01
>>>   3 KSP preconditioned resid norm 2.928070366435e-03 true resid norm 1.874294004628e-01 ||r(i)||/||b|| 2.544609664236e-02
>>>   4 KSP preconditioned resid norm 2.165607525823e-04 true resid norm 3.121122463949e-02 ||r(i)||/||b|| 4.237349298146e-03
>>>   5 KSP preconditioned resid norm 1.635476480407e-05 true resid norm 3.984315313831e-03 ||r(i)||/||b|| 5.409251284967e-04
>>>   6 KSP preconditioned resid norm 1.283358350575e-06 true resid norm 4.566583802915e-04 ||r(i)||/||b|| 6.199760149022e-05
>>>   7 KSP preconditioned resid norm 8.479469225747e-08 true resid norm 3.824581791112e-05 ||r(i)||/||b|| 5.192391248810e-06
>>>   8 KSP preconditioned resid norm 5.358636504683e-09 true resid norm 2.829730442033e-06 ||r(i)||/||b|| 3.841744898187e-07
>>>   9 KSP preconditioned resid norm 3.447874504193e-10 true resid norm 1.756617036538e-07 ||r(i)||/||b|| 2.384847135242e-08
>>>  10 KSP preconditioned resid norm 2.480228743414e-11 true resid norm 1.309399823577e-08 ||r(i)||/||b|| 1.777688792258e-09
>>>  11 KSP preconditioned resid norm 1.728967759950e-12 true resid norm 9.406967045789e-10 ||r(i)||/||b|| 1.277124036931e-10
>>>  12 KSP preconditioned resid norm 1.075458632828e-13 true resid norm 5.994505136212e-11 ||r(i)||/||b|| 8.138358050689e-12
>>> Linear solve converged due to CONVERGED_RTOL iterations 12
>>> KSP Object: 24 MPI processes
>>>   type: gmres
>>>   [solver, preconditioner, and matrix information identical to the GMRES
>>>   view above]
>>> The max singular value of A = 1.000872 in poisson_solver3d_P0_vd
>>> The min singular value of A = 0.667688 in poisson_solver3d_P0_vd
>>> The Cond Num of A = 1.499012 in poisson_solver3d_P0_vd
>>> In poisson_solver3d_pressure(): num_iter = 12, rel_residual = 1.075459e-13
>>>
>>> The max value of p0 is 0.03115845493408858
>>>
>>> The min value of p0 is -0.07156715468428149
>>>
>>> On Sun, Oct 22, 2017 at 12:00 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>>
>>>> On Oct 21, 2017, at 10:50 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
>>>>
>>>> The incompressible NS solver algorithm calls the PETSc solver at
>>>> different stages of each time step. The one you were saying "This is
>>>> good. 12 digit reduction" about comes after the initial pressure solve,
>>>> for which HYPRE usually doesn't give good convergence, so the fall-back
>>>> GMRES solver is called afterwards.
>>>
>>>    Hmm, I don't understand. hypre should do well on a pressure solve. In
>>> fact, very well.
>>>
>>>> Barry, you were mentioning that I could have a wrong nullspace. That
>>>> particular solver is aimed at giving an initial pressure profile for a
>>>> 3d incompressible NS simulation using all-Neumann boundary conditions.
>>>> Could you give some insight into how to test whether I have a wrong
>>>> nullspace, etc.?
>>>
>>>    -ksp_test_null_space
>>>
>>>    But if your null space is consistently from all Neumann boundary
>>> conditions then it likely is not wrong.
>>>
>>>    Barry
>>>
>>>> Thanks!
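A hedged sketch of making the right-hand side consistent with the constant
null space of the all-Neumann pressure operator; the calls are PETSc 3.8-era
API, but the helper function itself is illustrative, not part of PETSc:

  #include <petscksp.h>

  /* Attach the constant null space of the all-Neumann operator to A and
     project it out of b so that the singular system is consistent. */
  PetscErrorCode AttachConstantNullSpace(Mat A, Vec b)
  {
    MatNullSpace   nullsp;
    PetscErrorCode ierr;

    /* PETSC_TRUE: the null space contains the constant vector */
    ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,NULL,&nullsp);CHKERRQ(ierr);
    ierr = MatSetNullSpace(A,nullsp);CHKERRQ(ierr);
    /* A singular system is solvable only if b has no component in the
       null space; remove the constants from b before KSPSolve. */
    ierr = MatNullSpaceRemove(nullsp,b);CHKERRQ(ierr);
    ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);
    return 0;
  }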
>>>> On Sat, Oct 21, 2017 at 11:41 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>>>
>>>>    This is good. You get more than a 12 digit reduction in the true
>>>> residual norm. This is good AMG convergence, expected when everything
>>>> goes well.
>>>>
>>>>    What is different in this case from the previous case that does not
>>>> converge reasonably?
>>>>
>>>>    Barry
>>>>
>>>>> On Oct 21, 2017, at 9:29 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
>>>>>
>>>>> Barry, please advise what you make of this. This is the Poisson solver
>>>>> with all-Neumann BCs, 3d case; a finite difference scheme was used.
>>>>> Thanks! I'm in learning mode.
>>>>>
>>>>> KSP Object: 24 MPI processes
>>>>>   type: bcgs
>>>>>   maximum iterations=40000, initial guess is zero
>>>>>   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
>>>>>   left preconditioning
>>>>>   using PRECONDITIONED norm type for convergence test
>>>>> PC Object: 24 MPI processes
>>>>>   type: hypre
>>>>>   [hypre BoomerAMG settings and matrix information identical to the
>>>>>   views above]
>>>>>   0 KSP preconditioned resid norm 2.697270170623e-02 true resid norm 9.637159071207e+00 ||r(i)||/||b|| 1.000000000000e+00
>>>>>   1 KSP preconditioned resid norm 4.828857674609e-04 true resid norm 6.294379664645e-01 ||r(i)||/||b|| 6.531364293291e-02
>>>>>   2 KSP preconditioned resid norm 4.533649595815e-06 true resid norm 1.135508605857e-02 ||r(i)||/||b|| 1.178260727531e-03
>>>>>   3 KSP preconditioned resid norm 1.131704082606e-07 true resid norm 1.196393029874e-04 ||r(i)||/||b|| 1.241437462051e-05
>>>>>   4 KSP preconditioned resid norm 3.866281379569e-10 true resid norm 5.121520801846e-07 ||r(i)||/||b|| 5.314347064320e-08
>>>>>   5 KSP preconditioned resid norm 1.070114785241e-11 true resid norm 1.203693733135e-08 ||r(i)||/||b|| 1.249013038221e-09
>>>>>   6 KSP preconditioned resid norm 2.578780418765e-14 true resid norm 6.020297525927e-11 ||r(i)||/||b|| 6.246962908306e-12
>>>>>   7 KSP preconditioned resid norm 8.691764190203e-16 true resid norm 1.866088098154e-12 ||r(i)||/||b|| 1.936346680973e-13
>>>>> Linear solve converged due to CONVERGED_ATOL iterations 7
>>>>>
>>>>> On Sat, Oct 21, 2017 at 6:53 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>>>>
>>>>>> On Oct 21, 2017, at 5:47 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
>>>>>>
>>>>>> hi, Barry:
>>>>>> What do you mean by "absurd" about setting tolerance = 1e-14?
>>>>>
>>>>>    Trying to decrease the initial residual norm by a factor of 1e-14
>>>>> with an iterative method (or even a direct method) is unrealistic
>>>>> (usually unachievable) and almost never necessary. You are requiring
>>>>> || r_n || < 1.e-14 || r_0 || when, with double precision numbers, you
>>>>> only have roughly 14 decimal digits total to compute with. Round-off
>>>>> alone will lead to differences far larger than 1e-14.
>>>>>
>>>>>    If you are using the solver in the context of a nonlinear problem
>>>>> (i.e. inside Newton's method) then 1.e-6 is generally more than plenty
>>>>> to get quadratic convergence of Newton's method.
>>>>>
>>>>>    If you are solving a linear problem then it is extremely likely
>>>>> that errors due to discretization (from the finite element method
>>>>> etc.) and the model are much, much larger than even 1.e-8.
>>>>>
>>>>>    So, in summary:
>>>>>
>>>>>    1.e-14 is probably unachievable.
>>>>>
>>>>>    1.e-14 is almost for sure not needed.
>>>>>
>>>>>    Barry
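A hedged sketch of setting saner tolerances programmatically with PETSc's
KSPSetTolerances; the 1e-6 relative tolerance is an illustrative value in the
spirit of Barry's advice, not one taken from this thread:

  /* rtol 1e-6; leave abstol, dtol, and maxits at their defaults */
  ierr = KSPSetTolerances(ksp,1.e-6,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);CHKERRQ(ierr);

or, equivalently, on the command line:

  -ksp_rtol 1e-6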
>>>>>> On Sat, Oct 21, 2017 at 18:42, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>>>>>
>>>>>>    Run with -ksp_view_mat binary -ksp_view_rhs binary and send the
>>>>>> resulting output file, called binaryoutput, to
>>>>>> petsc-maint at mcs.anl.gov
>>>>>>
>>>>>>    Note you can also use -ksp_type gmres with hypre; there is unlikely
>>>>>> to be a reason to use bcgs.
>>>>>>
>>>>>>    BTW: "tolerances: relative=1e-14" is absurd.
>>>>>>
>>>>>>    My guess is your null space is incorrect.
>>>>>>
>>>>>>> On Oct 21, 2017, at 4:34 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
>>>>>>>
>>>>>>> If this solver doesn't converge, I have a fall-back solution, which
>>>>>>> uses the GMRES solver; that setup is fine with me. I just want to
>>>>>>> know whether HYPRE is a reliable solution for me, or whether I will
>>>>>>> have to go without a preconditioner.
>>>>>>>
>>>>>>> Thanks!
>>>>>>>
>>>>>>> On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
>>>>>>> This is a serial run, still dumping output; parallel is more or less
>>>>>>> the same.
>>>>>>>
>>>>>>> KSP Object: 1 MPI processes
>>>>>>>   type: bcgs
>>>>>>>   maximum iterations=40000, initial guess is zero
>>>>>>>   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
>>>>>>>   left preconditioning
>>>>>>>   using PRECONDITIONED norm type for convergence test
>>>>>>> PC Object: 1 MPI processes
>>>>>>>   type: hypre
>>>>>>>   [hypre BoomerAMG settings identical to the views above]
>>>>>>>   linear system matrix = precond matrix:
>>>>>>>   Mat Object: A 1 MPI processes
>>>>>>>     type: seqaij
>>>>>>>     rows=497664, cols=497664
>>>>>>>     total: nonzeros=3363552, allocated nonzeros=3483648
>>>>>>>     total number of mallocs used during MatSetValues calls =0
>>>>>>>       has attached null space
>>>>>>>       not using I-node routines
>>>>>>>   0 KSP preconditioned resid norm 1.630377897956e+01 true resid norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00
>>>>>>>   1 KSP preconditioned resid norm 3.815819909205e-02 true resid norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01
>>>>>>>   2 KSP preconditioned resid norm 4.312671277701e-04 true resid norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03
>>>>>>>   3 KSP preconditioned resid norm 3.011875330569e-04 true resid norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05
>>>>>>>   4 KSP preconditioned resid norm 3.011783821295e-04 true resid norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05
>>>>>>>   5 KSP preconditioned resid norm 3.011783818372e-04 true resid norm 2.504053673004e-04 ||r(i)||/||b|| 3.399594279422e-05
>>>>>>>   [iterations 6-232: the preconditioned residual norm drifts slowly
>>>>>>>   down from about 3.0e-04 to about 5e-07 while the true residual
>>>>>>>   norm stagnates near 2.490e-04, ||r(i)||/||b|| ~ 3.381e-05; the
>>>>>>>   archived log breaks off at iteration 232]
resid norm 5.256466259888e-07 true resid norm 2.490694033989e-04 ||r(i)||/||b|| 3.381456747923e-05 > > > > > > > 233 KSP preconditioned resid norm 5.443022648374e-07 true resid norm 2.490650183144e-04 ||r(i)||/||b|| 3.381397214423e-05 > > > > > > > 234 KSP preconditioned resid norm 5.562619006436e-07 true resid norm 2.490764576883e-04 ||r(i)||/||b|| 3.381552519520e-05 > > > > > > > 235 KSP preconditioned resid norm 5.998148629545e-07 true resid norm 2.490714032716e-04 ||r(i)||/||b|| 3.381483898922e-05 > > > > > > > 236 KSP preconditioned resid norm 6.498977322955e-07 true resid norm 2.490650270144e-04 ||r(i)||/||b|| 3.381397332537e-05 > > > > > > > 237 KSP preconditioned resid norm 6.503686003429e-07 true resid norm 2.490706976108e-04 ||r(i)||/||b|| 3.381474318615e-05 > > > > > > > 238 KSP preconditioned resid norm 6.566719023119e-07 true resid norm 2.490664107559e-04 ||r(i)||/||b|| 3.381416118714e-05 > > > > > > > 239 KSP preconditioned resid norm 6.549737473208e-07 true resid norm 2.490721547909e-04 ||r(i)||/||b|| 3.381494101821e-05 > > > > > > > 240 KSP preconditioned resid norm 6.616898981418e-07 true resid norm 2.490679659838e-04 ||r(i)||/||b|| 3.381437233053e-05 > > > > > > > 241 KSP preconditioned resid norm 6.829917691021e-07 true resid norm 2.490728328614e-04 ||r(i)||/||b|| 3.381503307553e-05 > > > > > > > 242 KSP preconditioned resid norm 7.030239869389e-07 true resid norm 2.490706345115e-04 ||r(i)||/||b|| 3.381473461955e-05 > > > > > > > 243 KSP preconditioned resid norm 7.018435683340e-07 true resid norm 2.490650978460e-04 ||r(i)||/||b|| 3.381398294172e-05 > > > > > > > 244 KSP preconditioned resid norm 7.058047080376e-07 true resid norm 2.490685975642e-04 ||r(i)||/||b|| 3.381445807618e-05 > > > > > > > 245 KSP preconditioned resid norm 6.896300385099e-07 true resid norm 2.490708566380e-04 ||r(i)||/||b|| 3.381476477625e-05 > > > > > > > 246 KSP preconditioned resid norm 7.093960074437e-07 true resid norm 2.490667427871e-04 ||r(i)||/||b|| 3.381420626490e-05 > > > > > > > 247 KSP preconditioned resid norm 7.817121711853e-07 true resid norm 2.490692299030e-04 ||r(i)||/||b|| 3.381454392480e-05 > > > > > > > 248 KSP preconditioned resid norm 7.976109778309e-07 true resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > > > > 249 KSP preconditioned resid norm 7.855322750445e-07 true resid norm 2.490720861966e-04 ||r(i)||/||b|| 3.381493170560e-05 > > > > > > > 250 KSP preconditioned resid norm 7.778531114042e-07 true resid norm 2.490673235034e-04 ||r(i)||/||b|| 3.381428510506e-05 > > > > > > > 251 KSP preconditioned resid norm 7.848682182070e-07 true resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > > > > 252 KSP preconditioned resid norm 7.967291867330e-07 true resid norm 2.490724820229e-04 ||r(i)||/||b|| 3.381498544442e-05 > > > > > > > 253 KSP preconditioned resid norm 7.865012959525e-07 true resid norm 2.490666662028e-04 ||r(i)||/||b|| 3.381419586754e-05 > > > > > > > 254 KSP preconditioned resid norm 7.656025385804e-07 true resid norm 2.490686283214e-04 ||r(i)||/||b|| 3.381446225190e-05 > > > > > > > 255 KSP preconditioned resid norm 7.757018653468e-07 true resid norm 2.490655983763e-04 ||r(i)||/||b|| 3.381405089553e-05 > > > > > > > 256 KSP preconditioned resid norm 6.686490372981e-07 true resid norm 2.490715698964e-04 ||r(i)||/||b|| 3.381486161081e-05 > > > > > > > 257 KSP preconditioned resid norm 6.596005109428e-07 true resid norm 2.490666403003e-04 ||r(i)||/||b|| 3.381419235092e-05 > > > > > > > 258 KSP preconditioned resid norm 
6.681742296333e-07 true resid norm 2.490683725835e-04 ||r(i)||/||b|| 3.381442753198e-05 > > > > > > > 259 KSP preconditioned resid norm 1.089245482033e-06 true resid norm 2.490688086568e-04 ||r(i)||/||b|| 3.381448673488e-05 > > > > > > > 260 KSP preconditioned resid norm 1.099844873189e-06 true resid norm 2.490690703265e-04 ||r(i)||/||b|| 3.381452226011e-05 > > > > > > > 261 KSP preconditioned resid norm 1.112925540869e-06 true resid norm 2.490664481058e-04 ||r(i)||/||b|| 3.381416625790e-05 > > > > > > > 262 KSP preconditioned resid norm 1.113056910480e-06 true resid norm 2.490658753273e-04 ||r(i)||/||b|| 3.381408849541e-05 > > > > > > > 263 KSP preconditioned resid norm 1.104801535149e-06 true resid norm 2.490736510776e-04 ||r(i)||/||b|| 3.381514415953e-05 > > > > > > > 264 KSP preconditioned resid norm 1.158709147873e-06 true resid norm 2.490607531152e-04 ||r(i)||/||b|| 3.381339308528e-05 > > > > > > > 265 KSP preconditioned resid norm 1.178985740182e-06 true resid norm 2.490727895619e-04 ||r(i)||/||b|| 3.381502719703e-05 > > > > > > > 266 KSP preconditioned resid norm 1.165130533478e-06 true resid norm 2.490639076693e-04 ||r(i)||/||b|| 3.381382135901e-05 > > > > > > > 267 KSP preconditioned resid norm 1.181364114499e-06 true resid norm 2.490667871436e-04 ||r(i)||/||b|| 3.381421228690e-05 > > > > > > > 268 KSP preconditioned resid norm 1.170295348543e-06 true resid norm 2.490662613306e-04 ||r(i)||/||b|| 3.381414090063e-05 > > > > > > > 269 KSP preconditioned resid norm 1.213243016230e-06 true resid norm 2.490666173719e-04 ||r(i)||/||b|| 3.381418923808e-05 > > > > > > > 270 KSP preconditioned resid norm 1.239691953997e-06 true resid norm 2.490678323197e-04 ||r(i)||/||b|| 3.381435418381e-05 > > > > > > > 271 KSP preconditioned resid norm 1.219891740100e-06 true resid norm 2.490625009256e-04 ||r(i)||/||b|| 3.381363037437e-05 > > > > > > > 272 KSP preconditioned resid norm 1.231321334346e-06 true resid norm 2.490659733696e-04 ||r(i)||/||b|| 3.381410180599e-05 > > > > > > > 273 KSP preconditioned resid norm 1.208183234158e-06 true resid norm 2.490685987255e-04 ||r(i)||/||b|| 3.381445823385e-05 > > > > > > > 274 KSP preconditioned resid norm 1.211768545589e-06 true resid norm 2.490671548953e-04 ||r(i)||/||b|| 3.381426221421e-05 > > > > > > > 275 KSP preconditioned resid norm 1.209433459842e-06 true resid norm 2.490669016096e-04 ||r(i)||/||b|| 3.381422782722e-05 > > > > > > > 276 KSP preconditioned resid norm 1.223729184405e-06 true resid norm 2.490658128014e-04 ||r(i)||/||b|| 3.381408000666e-05 > > > > > > > 277 KSP preconditioned resid norm 1.243915201868e-06 true resid norm 2.490693375756e-04 ||r(i)||/||b|| 3.381455854282e-05 > > > > > > > 278 KSP preconditioned resid norm 1.231994655529e-06 true resid norm 2.490682988311e-04 ||r(i)||/||b|| 3.381441751910e-05 > > > > > > > 279 KSP preconditioned resid norm 1.227930683777e-06 true resid norm 2.490667825866e-04 ||r(i)||/||b|| 3.381421166823e-05 > > > > > > > 280 KSP preconditioned resid norm 1.193458846469e-06 true resid norm 2.490687366117e-04 ||r(i)||/||b|| 3.381447695378e-05 > > > > > > > 281 KSP preconditioned resid norm 1.217089059805e-06 true resid norm 2.490674797371e-04 ||r(i)||/||b|| 3.381430631591e-05 > > > > > > > 282 KSP preconditioned resid norm 1.249318287709e-06 true resid norm 2.490662866951e-04 ||r(i)||/||b|| 3.381414434420e-05 > > > > > > > 283 KSP preconditioned resid norm 1.183320029547e-06 true resid norm 2.490645783630e-04 ||r(i)||/||b|| 3.381391241482e-05 > > > > > > > 284 KSP preconditioned resid norm 1.174730603102e-06 
true resid norm 2.490686881647e-04 ||r(i)||/||b|| 3.381447037643e-05 > > > > > > > 285 KSP preconditioned resid norm 1.175838261923e-06 true resid norm 2.490665969300e-04 ||r(i)||/||b|| 3.381418646281e-05 > > > > > > > 286 KSP preconditioned resid norm 1.188946188368e-06 true resid norm 2.490661974622e-04 ||r(i)||/||b|| 3.381413222961e-05 > > > > > > > 287 KSP preconditioned resid norm 1.177848565707e-06 true resid norm 2.490660236206e-04 ||r(i)||/||b|| 3.381410862824e-05 > > > > > > > 288 KSP preconditioned resid norm 1.200075508281e-06 true resid norm 2.490645353536e-04 ||r(i)||/||b|| 3.381390657571e-05 > > > > > > > 289 KSP preconditioned resid norm 1.184589570618e-06 true resid norm 2.490664920355e-04 ||r(i)||/||b|| 3.381417222195e-05 > > > > > > > 290 KSP preconditioned resid norm 1.221114703873e-06 true resid norm 2.490670597538e-04 ||r(i)||/||b|| 3.381424929746e-05 > > > > > > > 291 KSP preconditioned resid norm 1.249479658256e-06 true resid norm 2.490641582876e-04 ||r(i)||/||b|| 3.381385538385e-05 > > > > > > > 292 KSP preconditioned resid norm 1.245768496850e-06 true resid norm 2.490704480588e-04 ||r(i)||/||b|| 3.381470930606e-05 > > > > > > > 293 KSP preconditioned resid norm 1.243742607953e-06 true resid norm 2.490649690604e-04 ||r(i)||/||b|| 3.381396545733e-05 > > > > > > > 294 KSP preconditioned resid norm 1.342758483339e-06 true resid norm 2.490676207432e-04 ||r(i)||/||b|| 3.381432545942e-05 > > > > > > > 295 KSP preconditioned resid norm 1.353816099600e-06 true resid norm 2.490695263153e-04 ||r(i)||/||b|| 3.381458416681e-05 > > > > > > > 296 KSP preconditioned resid norm 1.343886763293e-06 true resid norm 2.490673674307e-04 ||r(i)||/||b|| 3.381429106879e-05 > > > > > > > 297 KSP preconditioned resid norm 1.355511022815e-06 true resid norm 2.490686565995e-04 ||r(i)||/||b|| 3.381446609103e-05 > > > > > > > 298 KSP preconditioned resid norm 1.347247627243e-06 true resid norm 2.490696287707e-04 ||r(i)||/||b|| 3.381459807652e-05 > > > > > > > 299 KSP preconditioned resid norm 1.414742595618e-06 true resid norm 2.490749815091e-04 ||r(i)||/||b|| 3.381532478374e-05 > > > > > > > 300 KSP preconditioned resid norm 1.418560683189e-06 true resid norm 2.490721501153e-04 ||r(i)||/||b|| 3.381494038343e-05 > > > > > > > 301 KSP preconditioned resid norm 1.416276404923e-06 true resid norm 2.490689576447e-04 ||r(i)||/||b|| 3.381450696203e-05 > > > > > > > 302 KSP preconditioned resid norm 1.431448272112e-06 true resid norm 2.490688812701e-04 ||r(i)||/||b|| 3.381449659312e-05 > > > > > > > 303 KSP preconditioned resid norm 1.446154958969e-06 true resid norm 2.490727536322e-04 ||r(i)||/||b|| 3.381502231909e-05 > > > > > > > 304 KSP preconditioned resid norm 1.468860617921e-06 true resid norm 2.490692363788e-04 ||r(i)||/||b|| 3.381454480397e-05 > > > > > > > 305 KSP preconditioned resid norm 1.627595214971e-06 true resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 > > > > > > > 306 KSP preconditioned resid norm 1.614384672893e-06 true resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 > > > > > > > 307 KSP preconditioned resid norm 1.605568020532e-06 true resid norm 2.490699757693e-04 ||r(i)||/||b|| 3.381464518632e-05 > > > > > > > 308 KSP preconditioned resid norm 1.617069685075e-06 true resid norm 2.490649282923e-04 ||r(i)||/||b|| 3.381395992249e-05 > > > > > > > 309 KSP preconditioned resid norm 1.654297792738e-06 true resid norm 2.490644766626e-04 ||r(i)||/||b|| 3.381389860760e-05 > > > > > > > 310 KSP preconditioned resid norm 1.587528143215e-06 true resid norm 
2.490696752096e-04 ||r(i)||/||b|| 3.381460438124e-05 > > > > > > > 311 KSP preconditioned resid norm 1.662782022388e-06 true resid norm 2.490699317737e-04 ||r(i)||/||b|| 3.381463921332e-05 > > > > > > > 312 KSP preconditioned resid norm 1.618211471748e-06 true resid norm 2.490735831308e-04 ||r(i)||/||b|| 3.381513493483e-05 > > > > > > > 313 KSP preconditioned resid norm 1.609074961921e-06 true resid norm 2.490679566436e-04 ||r(i)||/||b|| 3.381437106247e-05 > > > > > > > 314 KSP preconditioned resid norm 1.548068942878e-06 true resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 > > > > > > > 315 KSP preconditioned resid norm 1.526718322150e-06 true resid norm 2.490619832967e-04 ||r(i)||/||b|| 3.381356009919e-05 > > > > > > > 316 KSP preconditioned resid norm 1.553150959105e-06 true resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 > > > > > > > 317 KSP preconditioned resid norm 1.615015320906e-06 true resid norm 2.490672348079e-04 ||r(i)||/||b|| 3.381427306343e-05 > > > > > > > 318 KSP preconditioned resid norm 1.602904469797e-06 true resid norm 2.490696731006e-04 ||r(i)||/||b|| 3.381460409491e-05 > > > > > > > 319 KSP preconditioned resid norm 1.538140323073e-06 true resid norm 2.490722982494e-04 ||r(i)||/||b|| 3.381496049466e-05 > > > > > > > 320 KSP preconditioned resid norm 1.534779679430e-06 true resid norm 2.490778499789e-04 ||r(i)||/||b|| 3.381571421763e-05 > > > > > > > 321 KSP preconditioned resid norm 1.547155843355e-06 true resid norm 2.490767612985e-04 ||r(i)||/||b|| 3.381556641442e-05 > > > > > > > 322 KSP preconditioned resid norm 1.422137008870e-06 true resid norm 2.490737676309e-04 ||r(i)||/||b|| 3.381515998323e-05 > > > > > > > 323 KSP preconditioned resid norm 1.403072558954e-06 true resid norm 2.490741361870e-04 ||r(i)||/||b|| 3.381521001975e-05 > > > > > > > 324 KSP preconditioned resid norm 1.373070436118e-06 true resid norm 2.490742214990e-04 ||r(i)||/||b|| 3.381522160202e-05 > > > > > > > 325 KSP preconditioned resid norm 1.359547585233e-06 true resid norm 2.490792987570e-04 ||r(i)||/||b|| 3.381591090902e-05 > > > > > > > 326 KSP preconditioned resid norm 1.370351913612e-06 true resid norm 2.490727161158e-04 ||r(i)||/||b|| 3.381501722573e-05 > > > > > > > 327 KSP preconditioned resid norm 1.365238666187e-06 true resid norm 2.490716949642e-04 ||r(i)||/||b|| 3.381487859046e-05 > > > > > > > 328 KSP preconditioned resid norm 1.369073373042e-06 true resid norm 2.490807360288e-04 ||r(i)||/||b|| 3.381610603826e-05 > > > > > > > 329 KSP preconditioned resid norm 1.426698981572e-06 true resid norm 2.490791479521e-04 ||r(i)||/||b|| 3.381589043520e-05 > > > > > > > 330 KSP preconditioned resid norm 1.445542403570e-06 true resid norm 2.490775981409e-04 ||r(i)||/||b|| 3.381568002720e-05 > > > > > > > 331 KSP preconditioned resid norm 1.464506963984e-06 true resid norm 2.490740562430e-04 ||r(i)||/||b|| 3.381519916626e-05 > > > > > > > 332 KSP preconditioned resid norm 1.461462964401e-06 true resid norm 2.490768016856e-04 ||r(i)||/||b|| 3.381557189753e-05 > > > > > > > 333 KSP preconditioned resid norm 1.476680847971e-06 true resid norm 2.490744321516e-04 ||r(i)||/||b|| 3.381525020097e-05 > > > > > > > 334 KSP preconditioned resid norm 1.459640372198e-06 true resid norm 2.490788817993e-04 ||r(i)||/||b|| 3.381585430132e-05 > > > > > > > 335 KSP preconditioned resid norm 1.790770882365e-06 true resid norm 2.490771711471e-04 ||r(i)||/||b|| 3.381562205697e-05 > > > > > > > 336 KSP preconditioned resid norm 1.803770155018e-06 true resid norm 2.490768953858e-04 
> > > > > > >
> > > > > > > On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley wrote:
> > > > > > > On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang wrote:
> > > > > > >
> > > > > > > ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
> > > > > > > ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);
> > > > > > >
> > > > > > > ierr = VecAssemblyBegin(x);
> > > > > > > ierr = VecAssemblyEnd(x);
> > > > > > > This is probably unnecessary
> > > > > > >
> > > > > > > ierr = VecAssemblyBegin(b);
> > > > > > > ierr = VecAssemblyEnd(b);
> > > > > > > This is probably unnecessary
> > > > > > >
> > > > > > > ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp);
> > > > > > > ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8
> > > > > > > Is your rhs consistent with this nullspace?
> > > > > > >
> > > > > > > // KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
> > > > > > > KSPSetOperators(ksp,A,A);
> > > > > > >
> > > > > > > KSPSetType(ksp,KSPBCGS);
> > > > > > >
> > > > > > > KSPSetComputeSingularValues(ksp, PETSC_TRUE);
> > > > > > > #if defined(__HYPRE__)
> > > > > > > KSPGetPC(ksp, &pc);
> > > > > > > PCSetType(pc, PCHYPRE);
> > > > > > > PCHYPRESetType(pc,"boomeramg");
> > > > > > > This is terribly unnecessary. You just use
> > > > > > >
> > > > > > >    -pc_type hypre -pc_hypre_type boomeramg
> > > > > > >
> > > > > > > or
> > > > > > >
> > > > > > >    -pc_type gamg
> > > > > > >
> > > > > > > #else
> > > > > > > KSPSetType(ksp,KSPBCGSL);
> > > > > > > KSPBCGSLSetEll(ksp,2);
> > > > > > > #endif /* defined(__HYPRE__) */
> > > > > > >
> > > > > > > KSPSetFromOptions(ksp);
> > > > > > > KSPSetUp(ksp);
> > > > > > >
> > > > > > > ierr = KSPSolve(ksp,b,x);
> > > > > > >
> > > > > > > command line
> > > > > > >
> > > > > > > You did not provide any of what I asked for in the previous mail.
> > > > > > >
> > > > > > >    Matt
> > > > > > >
> > > > > > > On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley wrote:
> > > > > > > On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang wrote:
> > > > > > > hi,
> > > > > > >
> > > > > > > I implemented a HYPRE preconditioner for my study because, without a preconditioner, the PETSc solver takes thousands of iterations to converge on fine-grid simulations.
> > > > > > >
> > > > > > > with HYPRE, depending on the parallel partition, it takes HYPRE forever to do anything. The observation from the output file is that the simulation hangs, producing no output.
> > > > > > >
> > > > > > > Any idea what happened? I will post a snippet of code.
> > > > > > >
> > > > > > > 1) For any question about convergence, we need to see the output of
> > > > > > >
> > > > > > >    -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason
> > > > > > >
> > > > > > > 2) Hypre has many preconditioners, which one are you talking about
> > > > > > >
> > > > > > > 3) PETSc has some preconditioners in common with Hypre, like AMG
> > > > > > >
> > > > > > >   Thanks,
> > > > > > >
> > > > > > >      Matt
> > > > > > >
> > > > > > > --
> > > > > > > Hao Zhang
> > > > > > > Dept. of Applied Mathematics and Statistics,
> > > > > > > Stony Brook University,
> > > > > > > Stony Brook, New York, 11790
> > > > > > >
> > > > > > > --
> > > > > > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> > > > > > > -- Norbert Wiener
> > > > > > >
> > > > > > > https://www.cse.buffalo.edu/~knepley/
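Matthew Knepley's question above, whether the right-hand side is consistent with the attached null space, can be checked and enforced directly. The following is a minimal sketch, not taken from the poster's code: a Mat A and Vec b are assumed to exist already, and the null space is the constant vector appropriate for an all-Neumann operator.

    MatNullSpace   nullsp;
    PetscBool      isNull;
    PetscErrorCode ierr;

    /* the constant vector spans the null space of an all-Neumann operator */
    ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,NULL,&nullsp);CHKERRQ(ierr);

    /* verify that the assembled matrix really annihilates it */
    ierr = MatNullSpaceTest(nullsp,A,&isNull);CHKERRQ(ierr);
    if (!isNull) {
      ierr = PetscPrintf(PETSC_COMM_WORLD,"constant vector is NOT in the null space of A\n");CHKERRQ(ierr);
    }

    /* attach it and project it out of the rhs: b := b - (b,e) e */
    ierr = MatSetNullSpace(A,nullsp);CHKERRQ(ierr);
    ierr = MatNullSpaceRemove(nullsp,b);CHKERRQ(ierr);
    ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);

An inconsistent rhs with an attached null space is one common way to get exactly the kind of stagnation shown in the monitor output above.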

From mfadams at lbl.gov  Mon Oct 23 08:01:18 2017
From: mfadams at lbl.gov (Mark Adams)
Date: Mon, 23 Oct 2017 09:01:18 -0400
Subject: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To: <8C8ADFC3-C71F-4DE1-B21C-5BF50EBC652D@mcs.anl.gov>
References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov>
	<64D7FC90-EBE8-409C-B300-04A118155A98@mcs.anl.gov>
	<5DDD356C-2000-4803-9F3A-02B429898A7A@mcs.anl.gov>
	<8C8ADFC3-C71F-4DE1-B21C-5BF50EBC652D@mcs.anl.gov>
Message-ID: 

Just to be clear: 1) are you solving the Laplacian (div grad), 2) what type of discretization are you using, and 3) do you have stretched or otherwise terrible grids in some way?

On Sun, Oct 22, 2017 at 3:57 PM, Barry Smith wrote:
>
>    One thing important to understand is that multigrid is an optimal or nearly optimal algorithm. This means that, when it works, the number of iterations remains nearly constant as you refine the mesh, regardless of the problem size and number of processes. Simple preconditioners such as ILU, block Jacobi, one-level additive Schwarz etc. have iteration counts that increase with the problem size and likely also with the number of processes. Thus these algorithms become essentially impractical for very large problems, while multigrid can remain practical (when it works).
>
>    Good luck
>
>    Barry
>
> On Oct 22, 2017, at 2:35 PM, Hao Zhang wrote:
> >
> > Thanks for all the inputs. Before simulating the finer grid, HYPRE wasn't used and the simulations were just fine. I will do a few tests and post more information later.
> >
> > On Sun, Oct 22, 2017 at 12:11 PM, Barry Smith wrote:
> >
> > > On Oct 21, 2017, at 11:16 PM, Hao Zhang wrote:
> > >
> > > the reason is that when I do a finer grid simulation, the matrix becomes stiffer.
> >
> >    Are you saying that for a finer grid, but everything else the same, the convergence of hypre (with the same GMRES) with the same options gets much worse? This normally will not happen; that is the fundamental beauty of multigrid methods (when they work well).
> >
> >    Yes, the matrix condition number increases, but multigrid doesn't care about that; its number of iterations should remain pretty much the same.
> >
> >    Something must be different (with this finer grid case): either the mesh becomes horrible, or the physics changes, or there are errors in the code that lead to the problem.
> >
> >    What happens if you just refine the mesh a little? Then a little more? Then a little more? Does the convergence rate suddenly go bad at some point, or does it just get worse slowly?
> >
> >    Barry
> >
> > > Much larger condition number. Just to give you a perspective: it takes 6000 iterations to converge, and the solver does converge. I want to reduce the number of iterations while keeping the convergence rate. That's the main drive for doing so much heavy lifting. Please advise; a snippet will be provided upon request.
> > >
> > > Thanks again.
> > >
> > > On Sun, Oct 22, 2017 at 12:08 AM, Barry Smith wrote:
> > >
> > >    Oh, you change KSP but not hypre. I did not understand this.
> > >
> > >    Why not just use GMRES all the time? Why mess with BCGS if it is not robust? Not worth the small optimization if it breaks everything.
> > >
> > >    Barry
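Barry's refinement experiment can be made quantitative by recording the Krylov iteration count after each solve. A minimal sketch, assuming an existing KSP ksp on which KSPSolve() has just returned:

    PetscInt           its;
    KSPConvergedReason reason;
    PetscErrorCode     ierr;

    ierr = KSPGetConvergedReason(ksp,&reason);CHKERRQ(ierr);
    ierr = KSPGetIterationNumber(ksp,&its);CHKERRQ(ierr);
    ierr = PetscPrintf(PETSC_COMM_WORLD,"iterations %D, converged reason %D\n",its,(PetscInt)reason);CHKERRQ(ierr);
    /* healthy multigrid: 'its' stays roughly flat as the mesh is refined;
       steady growth points at the mesh, the physics, or a bug, not the method */

The same information is printed without any code changes by running with -ksp_converged_reason.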
> > > > On Oct 21, 2017, at 11:05 PM, Hao Zhang wrote:
> > > >
> > > > this is the initial pressure solver output regarding the use of PETSc. It failed to converge after 40000 iterations, so GMRES is used next.
> > > >
> > > > 39987 KSP preconditioned resid norm 3.853125986269e-08 true resid norm 1.147359212216e-05 ||r(i)||/||b|| 1.557696568706e-06
> > > > [iterations 39988-39999 omitted: the true residual norm is stuck near 1.147359e-05, ||r(i)||/||b|| about 1.5577e-06]
> > > > 40000 KSP preconditioned resid norm 3.853122732867e-08 true resid norm 1.147359257776e-05 ||r(i)||/||b|| 1.557696630559e-06
> > > > Linear solve did not converge due to DIVERGED_ITS iterations 40000
> > > > KSP Object: 24 MPI processes
> > > >   type: bcgs
> > > >   maximum iterations=40000, initial guess is zero
> > > >   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
> > > >   left preconditioning
> > > >   using PRECONDITIONED norm type for convergence test
> > > > PC Object: 24 MPI processes
> > > >   type: hypre
> > > >     HYPRE BoomerAMG preconditioning
> > > >       Cycle type V
> > > >       Maximum number of levels 25
> > > >       Maximum number of iterations PER hypre call 1
> > > >       Convergence tolerance PER hypre call 0.
> > > >       Threshold for strong coupling 0.25
> > > >       Interpolation truncation factor 0.
> > > >       Interpolation: max elements per row 0
> > > >       Number of levels of aggressive coarsening 0
> > > >       Number of paths for aggressive coarsening 1
> > > >       Maximum row sums 0.9
> > > >       Sweeps down         1
> > > >       Sweeps up           1
> > > >       Sweeps on coarse    1
> > > >       Relax down          symmetric-SOR/Jacobi
> > > >       Relax up            symmetric-SOR/Jacobi
> > > >       Relax on coarse     Gaussian-elimination
> > > >       Relax weight  (all)      1.
> > > >       Outer relax weight (all) 1.
> > > >       Using CF-relaxation
> > > >       Not using more complex smoothers.
> > > >       Measure type        local
> > > >       Coarsen type        Falgout
> > > >       Interpolation type  classical
> > > >   linear system matrix = precond matrix:
> > > >   Mat Object: A 24 MPI processes
> > > >     type: mpiaij
> > > >     rows=497664, cols=497664
> > > >     total: nonzeros=3363552, allocated nonzeros=6967296
> > > >     total number of mallocs used during MatSetValues calls =0
> > > >       has attached null space
> > > >       not using I-node (on process 0) routines
> > > >
> > > > The solution diverges for p0! The residual is 3.853123e-08. Solve again using GMRES!
> > > > KSP Object: 24 MPI processes
> > > >   type: gmres
> > > >     restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
> > > >     happy breakdown tolerance 1e-30
> > > >   maximum iterations=40000, initial guess is zero
> > > >   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
> > > >   left preconditioning
> > > >   using PRECONDITIONED norm type for convergence test
> > > > PC Object: 24 MPI processes
> > > >   type: hypre
> > > >     HYPRE BoomerAMG preconditioning (options identical to the view above)
> > > >   linear system matrix = precond matrix:
> > > >   Mat Object: A 24 MPI processes
> > > >     type: mpiaij
> > > >     rows=497664, cols=497664
> > > >     total: nonzeros=3363552, allocated nonzeros=6967296
> > > >     total number of mallocs used during MatSetValues calls =0
> > > >       has attached null space
> > > >       not using I-node (on process 0) routines
> > > > 0 KSP preconditioned resid norm 1.593802941804e+01 true resid norm 7.365742695119e+00 ||r(i)||/||b|| 1.000000000000e+00
> > > > 1 KSP preconditioned resid norm 6.338666661133e-01 true resid norm 2.880722209358e+00 ||r(i)||/||b|| 3.910973174867e-01
> > > > 2 KSP preconditioned resid norm 3.913420828350e-02 true resid norm 9.544089760671e-01 ||r(i)||/||b|| 1.295740315093e-01
> > > > 3 KSP preconditioned resid norm 2.928070366435e-03 true resid norm 1.874294004628e-01 ||r(i)||/||b|| 2.544609664236e-02
> > > > 4 KSP preconditioned resid norm 2.165607525823e-04 true resid norm 3.121122463949e-02 ||r(i)||/||b|| 4.237349298146e-03
> > > > 5 KSP preconditioned resid norm 1.635476480407e-05 true resid norm 3.984315313831e-03 ||r(i)||/||b|| 5.409251284967e-04
> > > > 6 KSP preconditioned resid norm 1.283358350575e-06 true resid norm 4.566583802915e-04 ||r(i)||/||b|| 6.199760149022e-05
> > > > 7 KSP preconditioned resid norm 8.479469225747e-08 true resid norm 3.824581791112e-05 ||r(i)||/||b|| 5.192391248810e-06
> > > > 8 KSP preconditioned resid norm 5.358636504683e-09 true resid norm 2.829730442033e-06 ||r(i)||/||b|| 3.841744898187e-07
> > > > 9 KSP preconditioned resid norm 3.447874504193e-10 true resid norm 1.756617036538e-07 ||r(i)||/||b|| 2.384847135242e-08
> > > > 10 KSP preconditioned resid norm 2.480228743414e-11 true resid norm 1.309399823577e-08 ||r(i)||/||b|| 1.777688792258e-09
> > > > 11 KSP preconditioned resid norm 1.728967759950e-12 true resid norm 9.406967045789e-10 ||r(i)||/||b|| 1.277124036931e-10
> > > > 12 KSP preconditioned resid norm 1.075458632828e-13 true resid norm 5.994505136212e-11 ||r(i)||/||b|| 8.138358050689e-12
> > > > Linear solve converged due to CONVERGED_RTOL iterations 12
> > > > KSP Object: 24 MPI processes
> > > >   type: gmres
> > > >     restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
> > > >     happy breakdown tolerance 1e-30
> > > >   maximum iterations=40000, initial guess is zero
> > > >   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
> > > >   left preconditioning
> > > >   using PRECONDITIONED norm type for convergence test
> > > > PC Object: 24 MPI processes
> > > >   type: hypre
> > > >     HYPRE BoomerAMG preconditioning (options identical to the view above)
> > > >   linear system matrix = precond matrix:
> > > >   Mat Object: A 24 MPI processes
> > > >     type: mpiaij
> > > >     rows=497664, cols=497664
> > > >     total: nonzeros=3363552, allocated nonzeros=6967296
> > > >     total number of mallocs used during MatSetValues calls =0
> > > >       has attached null space
> > > >       not using I-node (on process 0) routines
> > > > The max singular value of A = 1.000872 in poisson_solver3d_P0_vd
> > > > The min singular value of A = 0.667688 in poisson_solver3d_P0_vd
> > > > The Cond Num of A = 1.499012 in poisson_solver3d_P0_vd
> > > > In poisson_solver3d_pressure(): num_iter = 12, rel_residual = 1.075459e-13
> > > >
> > > > The max value of p0 is 0.03115845493408858
> > > >
> > > > The min value of p0 is -0.07156715468428149
> > > >
> > > > On Sun, Oct 22, 2017 at 12:00 AM, Barry Smith wrote:
> > > >
> > > > > On Oct 21, 2017, at 10:50 PM, Hao Zhang wrote:
> > > > >
> > > > > the incompressible NS solver algorithm calls the PETSc solver at different stages of each time step. The solve you were saying "This is good. 12 digit reduction" about comes after the initial pressure solver, for which HYPRE usually doesn't give good convergence, so the fall-back GMRES solver is called afterwards.
> > > >
> > > >    Hmm, I don't understand. hypre should do well on a pressure solve. In fact, very well.
> > > >
> > > > > Barry, you were mentioning that I could have a wrong nullspace. That particular solver is meant to give an initial pressure profile for a 3d incompressible NS simulation using all Neumann boundary conditions. Could you give some insight into how to test whether I have a wrong nullspace?
> > > >
> > > >    -ksp_test_null_space
> > > >
> > > >    But if your null space consistently comes from all Neumann boundary conditions then it likely is not wrong.
> > > >
> > > >   Barry
> > > >
> > > > > Thanks!
> > > > >
> > > > > On Sat, Oct 21, 2017 at 11:41 PM, Barry Smith wrote:
> > > > >
> > > > >    This is good. You get more than a 12 digit reduction in the true residual norm. This is good AMG convergence. Expected when everything goes well.
> > > > >
> > > > >    What is different in this case from the previous case that does not converge reasonably?
> > > > >
> > > > >    Barry
> > > > >
> > > > > > On Oct 21, 2017, at 9:29 PM, Hao Zhang wrote:
> > > > > >
> > > > > > Barry, please advise what you make of this. This is the Poisson solver, 3d case with all Neumann BCs; a finite difference scheme was used.
> > > > > > Thanks! I'm in learning mode.
> > > > > >
> > > > > > KSP Object: 24 MPI processes
> > > > > >   type: bcgs
> > > > > >   maximum iterations=40000, initial guess is zero
> > > > > >   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
> > > > > >   left preconditioning
> > > > > >   using PRECONDITIONED norm type for convergence test
> > > > > > PC Object: 24 MPI processes
> > > > > >   type: hypre
> > > > > >     HYPRE BoomerAMG preconditioning (options identical to the views above)
> > > > > >   linear system matrix = precond matrix:
> > > > > >   Mat Object: A 24 MPI processes
> > > > > >     type: mpiaij
> > > > > >     rows=497664, cols=497664
> > > > > >     total: nonzeros=3363552, allocated nonzeros=6967296
> > > > > >     total number of mallocs used during MatSetValues calls =0
> > > > > >       has attached null space
> > > > > >       not using I-node (on process 0) routines
> > > > > > 0 KSP preconditioned resid norm 2.697270170623e-02 true resid norm 9.637159071207e+00 ||r(i)||/||b|| 1.000000000000e+00
> > > > > > 1 KSP preconditioned resid norm 4.828857674609e-04 true resid norm 6.294379664645e-01 ||r(i)||/||b|| 6.531364293291e-02
> > > > > > 2 KSP preconditioned resid norm 4.533649595815e-06 true resid norm 1.135508605857e-02 ||r(i)||/||b|| 1.178260727531e-03
> > > > > > 3 KSP preconditioned resid norm 1.131704082606e-07 true resid norm 1.196393029874e-04 ||r(i)||/||b|| 1.241437462051e-05
> > > > > > 4 KSP preconditioned resid norm 3.866281379569e-10 true resid norm 5.121520801846e-07 ||r(i)||/||b|| 5.314347064320e-08
> > > > > > 5 KSP preconditioned resid norm 1.070114785241e-11 true resid norm 1.203693733135e-08 ||r(i)||/||b|| 1.249013038221e-09
> > > > > > 6 KSP preconditioned resid norm 2.578780418765e-14 true resid norm 6.020297525927e-11 ||r(i)||/||b|| 6.246962908306e-12
> > > > > > 7 KSP preconditioned resid norm 8.691764190203e-16 true resid norm 1.866088098154e-12 ||r(i)||/||b|| 1.936346680973e-13
> > > > > > Linear solve converged due to CONVERGED_ATOL iterations 7
> > > > > >
> > > > > > On Sat, Oct 21, 2017 at 6:53 PM, Barry Smith wrote:
> > > > > >
> > > > > > > On Oct 21, 2017, at 5:47 PM, Hao Zhang wrote:
> > > > > > >
> > > > > > > hi, Barry:
> > > > > > > what do you mean by "setting tolerance = 1e-14 is absurd"?
> > > > > >
> > > > > >    Trying to decrease the initial residual norm down by a factor of 1e-14 with an iterative method (or even a direct method) is unrealistic, usually unachievable, and almost never necessary. You are requiring || r_n || < 1.e-14 || r_0 || when with double precision numbers you only have roughly 14 decimal digits total to compute with. Round-off alone will lead to differences far larger than 1e-14.
> > > > > >
> > > > > >    If you are using the solver in the context of a nonlinear problem (i.e. inside Newton's method) then 1.e-6 is generally more than plenty to get quadratic convergence of Newton's method.
> > > > > >
> > > > > >    If you are solving a linear problem then it is extremely likely that errors due to discretization (from the finite element method etc.) and the model are much, much larger than even 1.e-8.
> > > > > >
> > > > > >    So, in summary:
> > > > > >
> > > > > >    1.e-14 is probably unachievable
> > > > > >
> > > > > >    1.e-14 is almost for sure not needed.
> > > > > >
> > > > > >    Barry
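A concrete form of the advice above, as a sketch: replace rtol = atol = 1e-14 with the 1e-6 relative reduction Barry suggests, keeping a tiny absolute floor and leaving the divergence tolerance and iteration limit at their defaults (ksp is assumed to exist):

    /* rtol=1e-6, abstol=1e-50, dtol and maxits left at PETSC_DEFAULT */
    ierr = KSPSetTolerances(ksp,1.e-6,1.e-50,PETSC_DEFAULT,PETSC_DEFAULT);CHKERRQ(ierr);

Equivalently, -ksp_rtol 1.e-6 on the command line overrides whatever is hard-coded, provided KSPSetFromOptions() is called.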
> > > > > > >
> > > > > > > On Sat, Oct 21, 2017 at 18:42 Barry Smith wrote:
> > > > > > >
> > > > > > >    Run with -ksp_view_mat binary -ksp_view_rhs binary and send the resulting output file called binaryoutput to petsc-maint at mcs.anl.gov
> > > > > > >
> > > > > > >    Note you can also use -ksp_type gmres with hypre; there is unlikely to be a reason to use bcgs.
> > > > > > >
> > > > > > >    BTW: tolerances: relative=1e-14 is absurd.
> > > > > > >
> > > > > > >    My guess is your null space is incorrect.
> > > > > > >
> > > > > > > > On Oct 21, 2017, at 4:34 PM, Hao Zhang wrote:
> > > > > > > >
> > > > > > > > if this solver doesn't converge, I have a fall-back solution, which uses the GMRES solver. This setup is fine with me. I just want to know whether HYPRE is a reliable solution for me, or whether I will have to go without a preconditioner.
> > > > > > > >
> > > > > > > > Thanks!
> > > > > > > >
> > > > > > > > On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang wrote:
> > > > > > > > this is a serial run, still dumping output; parallel is more or less the same.
> > > > > > > >
> > > > > > > > KSP Object: 1 MPI processes
> > > > > > > >   type: bcgs
> > > > > > > >   maximum iterations=40000, initial guess is zero
> > > > > > > >   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
> > > > > > > >   left preconditioning
> > > > > > > >   using PRECONDITIONED norm type for convergence test
> > > > > > > > PC Object: 1 MPI processes
> > > > > > > >   type: hypre
> > > > > > > >     HYPRE BoomerAMG preconditioning (options identical to the views above)
> > > > > > > >   linear system matrix = precond matrix:
> > > > > > > >   Mat Object: A 1 MPI processes
> > > > > > > >     type: seqaij
> > > > > > > >     rows=497664, cols=497664
> > > > > > > >     total: nonzeros=3363552, allocated nonzeros=3483648
> > > > > > > >     total number of mallocs used during MatSetValues calls =0
> > > > > > > >       has attached null space
> > > > > > > >       not using I-node routines
> > > > > > > > 0 KSP preconditioned resid norm 1.630377897956e+01 true resid norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00
> > > > > > > > 1 KSP preconditioned resid norm 3.815819909205e-02 true resid norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01
> > > > > > > > 2 KSP preconditioned resid norm 4.312671277701e-04 true resid norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03
> > > > > > > > 3 KSP preconditioned resid norm 3.011875330569e-04 true resid norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05
> > > > > > > > 4 KSP preconditioned resid norm 3.011783821295e-04 true resid norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05
> > > > > > > > [iterations 5-123 omitted: from iteration 4 on, the true residual norm stalls near 2.490e-04 (||r(i)||/||b|| about 3.38e-05) while the preconditioned residual wanders between roughly 8e-06 and 3e-04; the archived message breaks off mid-record at iteration 123]
||r(i)||/||b|| 3.380940915187e-05 > > > > > > > > 124 KSP preconditioned resid norm 3.307785860990e-06 true > resid norm 2.490613501549e-04 ||r(i)||/||b|| 3.381347414155e-05 > > > > > > > > 125 KSP preconditioned resid norm 3.287051720572e-06 true > resid norm 2.490584648195e-04 ||r(i)||/||b|| 3.381308241794e-05 > > > > > > > > 126 KSP preconditioned resid norm 3.286797046069e-06 true > resid norm 2.490654396386e-04 ||r(i)||/||b|| 3.381402934473e-05 > > > > > > > > 127 KSP preconditioned resid norm 3.311592899411e-06 true > resid norm 2.490627973588e-04 ||r(i)||/||b|| 3.381367061922e-05 > > > > > > > > 128 KSP preconditioned resid norm 3.560993694635e-06 true > resid norm 2.490571732816e-04 ||r(i)||/||b|| 3.381290707406e-05 > > > > > > > > 129 KSP preconditioned resid norm 3.411994617661e-06 true > resid norm 2.490652122141e-04 ||r(i)||/||b|| 3.381399846875e-05 > > > > > > > > 130 KSP preconditioned resid norm 3.412383310721e-06 true > resid norm 2.490633454022e-04 ||r(i)||/||b|| 3.381374502359e-05 > > > > > > > > 131 KSP preconditioned resid norm 3.288320044878e-06 true > resid norm 2.490639470096e-04 ||r(i)||/||b|| 3.381382669999e-05 > > > > > > > > 132 KSP preconditioned resid norm 3.273215756565e-06 true > resid norm 2.490640390847e-04 ||r(i)||/||b|| 3.381383920043e-05 > > > > > > > > 133 KSP preconditioned resid norm 3.236969051459e-06 true > resid norm 2.490678216102e-04 ||r(i)||/||b|| 3.381435272985e-05 > > > > > > > > 134 KSP preconditioned resid norm 3.203260913942e-06 true > resid norm 2.490640965346e-04 ||r(i)||/||b|| 3.381384700005e-05 > > > > > > > > 135 KSP preconditioned resid norm 3.224117152353e-06 true > resid norm 2.490655026376e-04 ||r(i)||/||b|| 3.381403789770e-05 > > > > > > > > 136 KSP preconditioned resid norm 3.221577997984e-06 true > resid norm 2.490684737611e-04 ||r(i)||/||b|| 3.381444126823e-05 > > > > > > > > 137 KSP preconditioned resid norm 3.195936222128e-06 true > resid norm 2.490673982333e-04 ||r(i)||/||b|| 3.381429525066e-05 > > > > > > > > 138 KSP preconditioned resid norm 3.207528137426e-06 true > resid norm 2.490641247196e-04 ||r(i)||/||b|| 3.381385082655e-05 > > > > > > > > 139 KSP preconditioned resid norm 3.240134271963e-06 true > resid norm 2.490615861251e-04 ||r(i)||/||b|| 3.381350617773e-05 > > > > > > > > 140 KSP preconditioned resid norm 2.698833607230e-06 true > resid norm 2.490638954889e-04 ||r(i)||/||b|| 3.381381970535e-05 > > > > > > > > 141 KSP preconditioned resid norm 2.599151209137e-06 true > resid norm 2.490657106698e-04 ||r(i)||/||b|| 3.381406614091e-05 > > > > > > > > 142 KSP preconditioned resid norm 2.633939920994e-06 true > resid norm 2.490707754695e-04 ||r(i)||/||b|| 3.381475375652e-05 > > > > > > > > 143 KSP preconditioned resid norm 2.519609221376e-06 true > resid norm 2.490639100480e-04 ||r(i)||/||b|| 3.381382168195e-05 > > > > > > > > 144 KSP preconditioned resid norm 3.768526937684e-06 true > resid norm 2.490654096698e-04 ||r(i)||/||b|| 3.381402527606e-05 > > > > > > > > 145 KSP preconditioned resid norm 3.707841943289e-06 true > resid norm 2.490630207923e-04 ||r(i)||/||b|| 3.381370095336e-05 > > > > > > > > 146 KSP preconditioned resid norm 3.698827503486e-06 true > resid norm 2.490646071561e-04 ||r(i)||/||b|| 3.381391632387e-05 > > > > > > > > 147 KSP preconditioned resid norm 3.642747039615e-06 true > resid norm 2.490610990161e-04 ||r(i)||/||b|| 3.381344004604e-05 > > > > > > > > 148 KSP preconditioned resid norm 3.613100087842e-06 true > resid norm 2.490617159023e-04 ||r(i)||/||b|| 3.381352379676e-05 > > > > > > > > 149 
KSP preconditioned resid norm 3.637646399299e-06 true > resid norm 2.490648063023e-04 ||r(i)||/||b|| 3.381394336069e-05 > > > > > > > > 150 KSP preconditioned resid norm 3.640235367864e-06 true > resid norm 2.490648516718e-04 ||r(i)||/||b|| 3.381394952022e-05 > > > > > > > > 151 KSP preconditioned resid norm 3.724708848977e-06 true > resid norm 2.490622201040e-04 ||r(i)||/||b|| 3.381359224901e-05 > > > > > > > > 152 KSP preconditioned resid norm 3.665185002770e-06 true > resid norm 2.490664302790e-04 ||r(i)||/||b|| 3.381416383766e-05 > > > > > > > > 153 KSP preconditioned resid norm 3.348992579120e-06 true > resid norm 2.490655722697e-04 ||r(i)||/||b|| 3.381404735121e-05 > > > > > > > > 154 KSP preconditioned resid norm 3.309431137943e-06 true > resid norm 2.490727563300e-04 ||r(i)||/||b|| 3.381502268535e-05 > > > > > > > > 155 KSP preconditioned resid norm 3.299031245428e-06 true > resid norm 2.490688392843e-04 ||r(i)||/||b|| 3.381449089298e-05 > > > > > > > > 156 KSP preconditioned resid norm 3.297127463503e-06 true > resid norm 2.490642207769e-04 ||r(i)||/||b|| 3.381386386763e-05 > > > > > > > > 157 KSP preconditioned resid norm 3.297370198641e-06 true > resid norm 2.490666651723e-04 ||r(i)||/||b|| 3.381419572764e-05 > > > > > > > > 158 KSP preconditioned resid norm 3.290873165210e-06 true > resid norm 2.490679189538e-04 ||r(i)||/||b|| 3.381436594557e-05 > > > > > > > > 159 KSP preconditioned resid norm 3.346705292419e-06 true > resid norm 2.490617329776e-04 ||r(i)||/||b|| 3.381352611496e-05 > > > > > > > > 160 KSP preconditioned resid norm 3.429583550890e-06 true > resid norm 2.490675116236e-04 ||r(i)||/||b|| 3.381431064494e-05 > > > > > > > > 161 KSP preconditioned resid norm 3.425238504679e-06 true > resid norm 2.490648199058e-04 ||r(i)||/||b|| 3.381394520756e-05 > > > > > > > > 162 KSP preconditioned resid norm 3.423484857849e-06 true > resid norm 2.490723208298e-04 ||r(i)||/||b|| 3.381496356025e-05 > > > > > > > > 163 KSP preconditioned resid norm 3.383655922943e-06 true > resid norm 2.490659981249e-04 ||r(i)||/||b|| 3.381410516686e-05 > > > > > > > > 164 KSP preconditioned resid norm 3.477197358452e-06 true > resid norm 2.490665979073e-04 ||r(i)||/||b|| 3.381418659549e-05 > > > > > > > > 165 KSP preconditioned resid norm 3.454672202601e-06 true > resid norm 2.490651358644e-04 ||r(i)||/||b|| 3.381398810323e-05 > > > > > > > > 166 KSP preconditioned resid norm 3.399075522566e-06 true > resid norm 2.490678159511e-04 ||r(i)||/||b|| 3.381435196154e-05 > > > > > > > > 167 KSP preconditioned resid norm 3.305455787400e-06 true > resid norm 2.490651924523e-04 ||r(i)||/||b|| 3.381399578581e-05 > > > > > > > > 168 KSP preconditioned resid norm 3.368445533284e-06 true > resid norm 2.490688061735e-04 ||r(i)||/||b|| 3.381448639774e-05 > > > > > > > > 169 KSP preconditioned resid norm 2.981519724814e-06 true > resid norm 2.490676378334e-04 ||r(i)||/||b|| 3.381432777964e-05 > > > > > > > > 170 KSP preconditioned resid norm 3.034423065539e-06 true > resid norm 2.490694458885e-04 ||r(i)||/||b|| 3.381457324777e-05 > > > > > > > > 171 KSP preconditioned resid norm 2.885972780503e-06 true > resid norm 2.490688033729e-04 ||r(i)||/||b|| 3.381448601752e-05 > > > > > > > > 172 KSP preconditioned resid norm 2.892491075033e-06 true > resid norm 2.490692993765e-04 ||r(i)||/||b|| 3.381455335678e-05 > > > > > > > > 173 KSP preconditioned resid norm 2.921316177611e-06 true > resid norm 2.490697629787e-04 ||r(i)||/||b|| 3.381461629709e-05 > > > > > > > > 174 KSP preconditioned resid norm 2.999889222269e-06 true 
> resid norm 2.490707272626e-04 ||r(i)||/||b|| 3.381474721178e-05 > > > > > > > > 175 KSP preconditioned resid norm 2.975590207575e-06 true > resid norm 2.490685439925e-04 ||r(i)||/||b|| 3.381445080310e-05 > > > > > > > > 176 KSP preconditioned resid norm 2.983065843597e-06 true > resid norm 2.490701883671e-04 ||r(i)||/||b|| 3.381467404937e-05 > > > > > > > > 177 KSP preconditioned resid norm 2.965959610245e-06 true > resid norm 2.490711538630e-04 ||r(i)||/||b|| 3.381480512861e-05 > > > > > > > > 178 KSP preconditioned resid norm 3.005389788827e-06 true > resid norm 2.490751808095e-04 ||r(i)||/||b|| 3.381535184150e-05 > > > > > > > > 179 KSP preconditioned resid norm 2.956581668772e-06 true > resid norm 2.490653125636e-04 ||r(i)||/||b|| 3.381401209257e-05 > > > > > > > > 180 KSP preconditioned resid norm 2.937498883661e-06 true > resid norm 2.490666056653e-04 ||r(i)||/||b|| 3.381418764874e-05 > > > > > > > > 181 KSP preconditioned resid norm 2.913227475431e-06 true > resid norm 2.490682436979e-04 ||r(i)||/||b|| 3.381441003402e-05 > > > > > > > > 182 KSP preconditioned resid norm 3.048172862254e-06 true > resid norm 2.490719669872e-04 ||r(i)||/||b|| 3.381491552130e-05 > > > > > > > > 183 KSP preconditioned resid norm 3.023868104933e-06 true > resid norm 2.490648745555e-04 ||r(i)||/||b|| 3.381395262699e-05 > > > > > > > > 184 KSP preconditioned resid norm 2.985947506400e-06 true > resid norm 2.490638818852e-04 ||r(i)||/||b|| 3.381381785846e-05 > > > > > > > > 185 KSP preconditioned resid norm 2.840032055776e-06 true > resid norm 2.490701112392e-04 ||r(i)||/||b|| 3.381466357820e-05 > > > > > > > > 186 KSP preconditioned resid norm 2.229279683815e-06 true > resid norm 2.490609220680e-04 ||r(i)||/||b|| 3.381341602292e-05 > > > > > > > > 187 KSP preconditioned resid norm 2.441513276379e-06 true > resid norm 2.490674056899e-04 ||r(i)||/||b|| 3.381429626300e-05 > > > > > > > > 188 KSP preconditioned resid norm 2.467046864016e-06 true > resid norm 2.490691622632e-04 ||r(i)||/||b|| 3.381453474178e-05 > > > > > > > > 189 KSP preconditioned resid norm 2.482124586361e-06 true > resid norm 2.490664992339e-04 ||r(i)||/||b|| 3.381417319923e-05 > > > > > > > > 190 KSP preconditioned resid norm 2.470564926502e-06 true > resid norm 2.490617019713e-04 ||r(i)||/||b|| 3.381352190543e-05 > > > > > > > > 191 KSP preconditioned resid norm 2.457947086578e-06 true > resid norm 2.490628644250e-04 ||r(i)||/||b|| 3.381367972437e-05 > > > > > > > > 192 KSP preconditioned resid norm 2.469444741724e-06 true > resid norm 2.490639416335e-04 ||r(i)||/||b|| 3.381382597011e-05 > > > > > > > > 193 KSP preconditioned resid norm 2.469951525219e-06 true > resid norm 2.490599769764e-04 ||r(i)||/||b|| 3.381328771385e-05 > > > > > > > > 194 KSP preconditioned resid norm 2.467486786643e-06 true > resid norm 2.490630178622e-04 ||r(i)||/||b|| 3.381370055556e-05 > > > > > > > > 195 KSP preconditioned resid norm 2.409684391404e-06 true > resid norm 2.490640302606e-04 ||r(i)||/||b|| 3.381383800245e-05 > > > > > > > > 196 KSP preconditioned resid norm 2.456046691135e-06 true > resid norm 2.490637730235e-04 ||r(i)||/||b|| 3.381380307900e-05 > > > > > > > > 197 KSP preconditioned resid norm 2.300015653805e-06 true > resid norm 2.490615406913e-04 ||r(i)||/||b|| 3.381350000947e-05 > > > > > > > > 198 KSP preconditioned resid norm 2.238328275301e-06 true > resid norm 2.490647641246e-04 ||r(i)||/||b|| 3.381393763449e-05 > > > > > > > > 199 KSP preconditioned resid norm 2.317293820319e-06 true > resid norm 2.490641611282e-04 ||r(i)||/||b|| 
3.381385576951e-05 > > > > > > > > 200 KSP preconditioned resid norm 2.359590971314e-06 true > resid norm 2.490685242974e-04 ||r(i)||/||b|| 3.381444812922e-05 > > > > > > > > 201 KSP preconditioned resid norm 2.311199691596e-06 true > resid norm 2.490656791753e-04 ||r(i)||/||b|| 3.381406186510e-05 > > > > > > > > 202 KSP preconditioned resid norm 2.328772904196e-06 true > resid norm 2.490651045523e-04 ||r(i)||/||b|| 3.381398385220e-05 > > > > > > > > 203 KSP preconditioned resid norm 2.332731604717e-06 true > resid norm 2.490649960574e-04 ||r(i)||/||b|| 3.381396912253e-05 > > > > > > > > 204 KSP preconditioned resid norm 2.357629383490e-06 true > resid norm 2.490686317727e-04 ||r(i)||/||b|| 3.381446272046e-05 > > > > > > > > 205 KSP preconditioned resid norm 2.374856180299e-06 true > resid norm 2.490645897176e-04 ||r(i)||/||b|| 3.381391395637e-05 > > > > > > > > 206 KSP preconditioned resid norm 2.340395514404e-06 true > resid norm 2.490618341127e-04 ||r(i)||/||b|| 3.381353984542e-05 > > > > > > > > 207 KSP preconditioned resid norm 2.314963680954e-06 true > resid norm 2.490676153984e-04 ||r(i)||/||b|| 3.381432473379e-05 > > > > > > > > 208 KSP preconditioned resid norm 2.448070953106e-06 true > resid norm 2.490644606776e-04 ||r(i)||/||b|| 3.381389643743e-05 > > > > > > > > 209 KSP preconditioned resid norm 2.428805110632e-06 true > resid norm 2.490635817597e-04 ||r(i)||/||b|| 3.381377711234e-05 > > > > > > > > 210 KSP preconditioned resid norm 2.537929937808e-06 true > resid norm 2.490680589404e-04 ||r(i)||/||b|| 3.381438495066e-05 > > > > > > > > 211 KSP preconditioned resid norm 2.515909029682e-06 true > resid norm 2.490687803038e-04 ||r(i)||/||b|| 3.381448288557e-05 > > > > > > > > 212 KSP preconditioned resid norm 2.497907513266e-06 true > resid norm 2.490618016885e-04 ||r(i)||/||b|| 3.381353544340e-05 > > > > > > > > 213 KSP preconditioned resid norm 1.783501869502e-06 true > resid norm 2.490632647470e-04 ||r(i)||/||b|| 3.381373407354e-05 > > > > > > > > 214 KSP preconditioned resid norm 1.767420653144e-06 true > resid norm 2.490685569328e-04 ||r(i)||/||b|| 3.381445255992e-05 > > > > > > > > 215 KSP preconditioned resid norm 1.854926068272e-06 true > resid norm 2.490609365464e-04 ||r(i)||/||b|| 3.381341798856e-05 > > > > > > > > 216 KSP preconditioned resid norm 1.818308539774e-06 true > resid norm 2.490639142283e-04 ||r(i)||/||b|| 3.381382224948e-05 > > > > > > > > 217 KSP preconditioned resid norm 1.809431578070e-06 true > resid norm 2.490605125049e-04 ||r(i)||/||b|| 3.381336041915e-05 > > > > > > > > 218 KSP preconditioned resid norm 1.789862735999e-06 true > resid norm 2.490564024901e-04 ||r(i)||/||b|| 3.381280242859e-05 > > > > > > > > 219 KSP preconditioned resid norm 1.769239890163e-06 true > resid norm 2.490647825316e-04 ||r(i)||/||b|| 3.381394013349e-05 > > > > > > > > 220 KSP preconditioned resid norm 1.780760773109e-06 true > resid norm 2.490622606663e-04 ||r(i)||/||b|| 3.381359775589e-05 > > > > > > > > 221 KSP preconditioned resid norm 5.009024913368e-07 true > resid norm 2.490659101637e-04 ||r(i)||/||b|| 3.381409322492e-05 > > > > > > > > 222 KSP preconditioned resid norm 4.974450322799e-07 true > resid norm 2.490714287402e-04 ||r(i)||/||b|| 3.381484244693e-05 > > > > > > > > 223 KSP preconditioned resid norm 4.938819481519e-07 true > resid norm 2.490665661715e-04 ||r(i)||/||b|| 3.381418228693e-05 > > > > > > > > 224 KSP preconditioned resid norm 4.973231831266e-07 true > resid norm 2.490725000995e-04 ||r(i)||/||b|| 3.381498789855e-05 > > > > > > > > 225 KSP 
preconditioned resid norm 5.086864036771e-07 true > resid norm 2.490664132954e-04 ||r(i)||/||b|| 3.381416153192e-05 > > > > > > > > 226 KSP preconditioned resid norm 5.046954570561e-07 true > resid norm 2.490698772594e-04 ||r(i)||/||b|| 3.381463181226e-05 > > > > > > > > 227 KSP preconditioned resid norm 5.086852920874e-07 true > resid norm 2.490703544723e-04 ||r(i)||/||b|| 3.381469660041e-05 > > > > > > > > 228 KSP preconditioned resid norm 5.182381756169e-07 true > resid norm 2.490665200032e-04 ||r(i)||/||b|| 3.381417601896e-05 > > > > > > > > 229 KSP preconditioned resid norm 5.261455182896e-07 true > resid norm 2.490697169472e-04 ||r(i)||/||b|| 3.381461004770e-05 > > > > > > > > 230 KSP preconditioned resid norm 5.265262522400e-07 true > resid norm 2.490726890541e-04 ||r(i)||/||b|| 3.381501355172e-05 > > > > > > > > 231 KSP preconditioned resid norm 5.220652263946e-07 true > resid norm 2.490689325236e-04 ||r(i)||/||b|| 3.381450355149e-05 > > > > > > > > 232 KSP preconditioned resid norm 5.256466259888e-07 true > resid norm 2.490694033989e-04 ||r(i)||/||b|| 3.381456747923e-05 > > > > > > > > 233 KSP preconditioned resid norm 5.443022648374e-07 true > resid norm 2.490650183144e-04 ||r(i)||/||b|| 3.381397214423e-05 > > > > > > > > 234 KSP preconditioned resid norm 5.562619006436e-07 true > resid norm 2.490764576883e-04 ||r(i)||/||b|| 3.381552519520e-05 > > > > > > > > 235 KSP preconditioned resid norm 5.998148629545e-07 true > resid norm 2.490714032716e-04 ||r(i)||/||b|| 3.381483898922e-05 > > > > > > > > 236 KSP preconditioned resid norm 6.498977322955e-07 true > resid norm 2.490650270144e-04 ||r(i)||/||b|| 3.381397332537e-05 > > > > > > > > 237 KSP preconditioned resid norm 6.503686003429e-07 true > resid norm 2.490706976108e-04 ||r(i)||/||b|| 3.381474318615e-05 > > > > > > > > 238 KSP preconditioned resid norm 6.566719023119e-07 true > resid norm 2.490664107559e-04 ||r(i)||/||b|| 3.381416118714e-05 > > > > > > > > 239 KSP preconditioned resid norm 6.549737473208e-07 true > resid norm 2.490721547909e-04 ||r(i)||/||b|| 3.381494101821e-05 > > > > > > > > 240 KSP preconditioned resid norm 6.616898981418e-07 true > resid norm 2.490679659838e-04 ||r(i)||/||b|| 3.381437233053e-05 > > > > > > > > 241 KSP preconditioned resid norm 6.829917691021e-07 true > resid norm 2.490728328614e-04 ||r(i)||/||b|| 3.381503307553e-05 > > > > > > > > 242 KSP preconditioned resid norm 7.030239869389e-07 true > resid norm 2.490706345115e-04 ||r(i)||/||b|| 3.381473461955e-05 > > > > > > > > 243 KSP preconditioned resid norm 7.018435683340e-07 true > resid norm 2.490650978460e-04 ||r(i)||/||b|| 3.381398294172e-05 > > > > > > > > 244 KSP preconditioned resid norm 7.058047080376e-07 true > resid norm 2.490685975642e-04 ||r(i)||/||b|| 3.381445807618e-05 > > > > > > > > 245 KSP preconditioned resid norm 6.896300385099e-07 true > resid norm 2.490708566380e-04 ||r(i)||/||b|| 3.381476477625e-05 > > > > > > > > 246 KSP preconditioned resid norm 7.093960074437e-07 true > resid norm 2.490667427871e-04 ||r(i)||/||b|| 3.381420626490e-05 > > > > > > > > 247 KSP preconditioned resid norm 7.817121711853e-07 true > resid norm 2.490692299030e-04 ||r(i)||/||b|| 3.381454392480e-05 > > > > > > > > 248 KSP preconditioned resid norm 7.976109778309e-07 true > resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > > > > > 249 KSP preconditioned resid norm 7.855322750445e-07 true > resid norm 2.490720861966e-04 ||r(i)||/||b|| 3.381493170560e-05 > > > > > > > > 250 KSP preconditioned resid norm 7.778531114042e-07 true > 
resid norm 2.490673235034e-04 ||r(i)||/||b|| 3.381428510506e-05 > > > > > > > > 251 KSP preconditioned resid norm 7.848682182070e-07 true > resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > > > > > 252 KSP preconditioned resid norm 7.967291867330e-07 true > resid norm 2.490724820229e-04 ||r(i)||/||b|| 3.381498544442e-05 > > > > > > > > 253 KSP preconditioned resid norm 7.865012959525e-07 true > resid norm 2.490666662028e-04 ||r(i)||/||b|| 3.381419586754e-05 > > > > > > > > 254 KSP preconditioned resid norm 7.656025385804e-07 true > resid norm 2.490686283214e-04 ||r(i)||/||b|| 3.381446225190e-05 > > > > > > > > 255 KSP preconditioned resid norm 7.757018653468e-07 true > resid norm 2.490655983763e-04 ||r(i)||/||b|| 3.381405089553e-05 > > > > > > > > 256 KSP preconditioned resid norm 6.686490372981e-07 true > resid norm 2.490715698964e-04 ||r(i)||/||b|| 3.381486161081e-05 > > > > > > > > 257 KSP preconditioned resid norm 6.596005109428e-07 true > resid norm 2.490666403003e-04 ||r(i)||/||b|| 3.381419235092e-05 > > > > > > > > 258 KSP preconditioned resid norm 6.681742296333e-07 true > resid norm 2.490683725835e-04 ||r(i)||/||b|| 3.381442753198e-05 > > > > > > > > 259 KSP preconditioned resid norm 1.089245482033e-06 true > resid norm 2.490688086568e-04 ||r(i)||/||b|| 3.381448673488e-05 > > > > > > > > 260 KSP preconditioned resid norm 1.099844873189e-06 true > resid norm 2.490690703265e-04 ||r(i)||/||b|| 3.381452226011e-05 > > > > > > > > 261 KSP preconditioned resid norm 1.112925540869e-06 true > resid norm 2.490664481058e-04 ||r(i)||/||b|| 3.381416625790e-05 > > > > > > > > 262 KSP preconditioned resid norm 1.113056910480e-06 true > resid norm 2.490658753273e-04 ||r(i)||/||b|| 3.381408849541e-05 > > > > > > > > 263 KSP preconditioned resid norm 1.104801535149e-06 true > resid norm 2.490736510776e-04 ||r(i)||/||b|| 3.381514415953e-05 > > > > > > > > 264 KSP preconditioned resid norm 1.158709147873e-06 true > resid norm 2.490607531152e-04 ||r(i)||/||b|| 3.381339308528e-05 > > > > > > > > 265 KSP preconditioned resid norm 1.178985740182e-06 true > resid norm 2.490727895619e-04 ||r(i)||/||b|| 3.381502719703e-05 > > > > > > > > 266 KSP preconditioned resid norm 1.165130533478e-06 true > resid norm 2.490639076693e-04 ||r(i)||/||b|| 3.381382135901e-05 > > > > > > > > 267 KSP preconditioned resid norm 1.181364114499e-06 true > resid norm 2.490667871436e-04 ||r(i)||/||b|| 3.381421228690e-05 > > > > > > > > 268 KSP preconditioned resid norm 1.170295348543e-06 true > resid norm 2.490662613306e-04 ||r(i)||/||b|| 3.381414090063e-05 > > > > > > > > 269 KSP preconditioned resid norm 1.213243016230e-06 true > resid norm 2.490666173719e-04 ||r(i)||/||b|| 3.381418923808e-05 > > > > > > > > 270 KSP preconditioned resid norm 1.239691953997e-06 true > resid norm 2.490678323197e-04 ||r(i)||/||b|| 3.381435418381e-05 > > > > > > > > 271 KSP preconditioned resid norm 1.219891740100e-06 true > resid norm 2.490625009256e-04 ||r(i)||/||b|| 3.381363037437e-05 > > > > > > > > 272 KSP preconditioned resid norm 1.231321334346e-06 true > resid norm 2.490659733696e-04 ||r(i)||/||b|| 3.381410180599e-05 > > > > > > > > 273 KSP preconditioned resid norm 1.208183234158e-06 true > resid norm 2.490685987255e-04 ||r(i)||/||b|| 3.381445823385e-05 > > > > > > > > 274 KSP preconditioned resid norm 1.211768545589e-06 true > resid norm 2.490671548953e-04 ||r(i)||/||b|| 3.381426221421e-05 > > > > > > > > 275 KSP preconditioned resid norm 1.209433459842e-06 true > resid norm 2.490669016096e-04 ||r(i)||/||b|| 
3.381422782722e-05 > > > > > > > > 276 KSP preconditioned resid norm 1.223729184405e-06 true > resid norm 2.490658128014e-04 ||r(i)||/||b|| 3.381408000666e-05 > > > > > > > > 277 KSP preconditioned resid norm 1.243915201868e-06 true > resid norm 2.490693375756e-04 ||r(i)||/||b|| 3.381455854282e-05 > > > > > > > > 278 KSP preconditioned resid norm 1.231994655529e-06 true > resid norm 2.490682988311e-04 ||r(i)||/||b|| 3.381441751910e-05 > > > > > > > > 279 KSP preconditioned resid norm 1.227930683777e-06 true > resid norm 2.490667825866e-04 ||r(i)||/||b|| 3.381421166823e-05 > > > > > > > > 280 KSP preconditioned resid norm 1.193458846469e-06 true > resid norm 2.490687366117e-04 ||r(i)||/||b|| 3.381447695378e-05 > > > > > > > > 281 KSP preconditioned resid norm 1.217089059805e-06 true > resid norm 2.490674797371e-04 ||r(i)||/||b|| 3.381430631591e-05 > > > > > > > > 282 KSP preconditioned resid norm 1.249318287709e-06 true > resid norm 2.490662866951e-04 ||r(i)||/||b|| 3.381414434420e-05 > > > > > > > > 283 KSP preconditioned resid norm 1.183320029547e-06 true > resid norm 2.490645783630e-04 ||r(i)||/||b|| 3.381391241482e-05 > > > > > > > > 284 KSP preconditioned resid norm 1.174730603102e-06 true > resid norm 2.490686881647e-04 ||r(i)||/||b|| 3.381447037643e-05 > > > > > > > > 285 KSP preconditioned resid norm 1.175838261923e-06 true > resid norm 2.490665969300e-04 ||r(i)||/||b|| 3.381418646281e-05 > > > > > > > > 286 KSP preconditioned resid norm 1.188946188368e-06 true > resid norm 2.490661974622e-04 ||r(i)||/||b|| 3.381413222961e-05 > > > > > > > > 287 KSP preconditioned resid norm 1.177848565707e-06 true > resid norm 2.490660236206e-04 ||r(i)||/||b|| 3.381410862824e-05 > > > > > > > > 288 KSP preconditioned resid norm 1.200075508281e-06 true > resid norm 2.490645353536e-04 ||r(i)||/||b|| 3.381390657571e-05 > > > > > > > > 289 KSP preconditioned resid norm 1.184589570618e-06 true > resid norm 2.490664920355e-04 ||r(i)||/||b|| 3.381417222195e-05 > > > > > > > > 290 KSP preconditioned resid norm 1.221114703873e-06 true > resid norm 2.490670597538e-04 ||r(i)||/||b|| 3.381424929746e-05 > > > > > > > > 291 KSP preconditioned resid norm 1.249479658256e-06 true > resid norm 2.490641582876e-04 ||r(i)||/||b|| 3.381385538385e-05 > > > > > > > > 292 KSP preconditioned resid norm 1.245768496850e-06 true > resid norm 2.490704480588e-04 ||r(i)||/||b|| 3.381470930606e-05 > > > > > > > > 293 KSP preconditioned resid norm 1.243742607953e-06 true > resid norm 2.490649690604e-04 ||r(i)||/||b|| 3.381396545733e-05 > > > > > > > > 294 KSP preconditioned resid norm 1.342758483339e-06 true > resid norm 2.490676207432e-04 ||r(i)||/||b|| 3.381432545942e-05 > > > > > > > > 295 KSP preconditioned resid norm 1.353816099600e-06 true > resid norm 2.490695263153e-04 ||r(i)||/||b|| 3.381458416681e-05 > > > > > > > > 296 KSP preconditioned resid norm 1.343886763293e-06 true > resid norm 2.490673674307e-04 ||r(i)||/||b|| 3.381429106879e-05 > > > > > > > > 297 KSP preconditioned resid norm 1.355511022815e-06 true > resid norm 2.490686565995e-04 ||r(i)||/||b|| 3.381446609103e-05 > > > > > > > > 298 KSP preconditioned resid norm 1.347247627243e-06 true > resid norm 2.490696287707e-04 ||r(i)||/||b|| 3.381459807652e-05 > > > > > > > > 299 KSP preconditioned resid norm 1.414742595618e-06 true > resid norm 2.490749815091e-04 ||r(i)||/||b|| 3.381532478374e-05 > > > > > > > > 300 KSP preconditioned resid norm 1.418560683189e-06 true > resid norm 2.490721501153e-04 ||r(i)||/||b|| 3.381494038343e-05 > > > > > > > > 301 KSP 
preconditioned resid norm 1.416276404923e-06 true > resid norm 2.490689576447e-04 ||r(i)||/||b|| 3.381450696203e-05 > > > > > > > > 302 KSP preconditioned resid norm 1.431448272112e-06 true > resid norm 2.490688812701e-04 ||r(i)||/||b|| 3.381449659312e-05 > > > > > > > > 303 KSP preconditioned resid norm 1.446154958969e-06 true > resid norm 2.490727536322e-04 ||r(i)||/||b|| 3.381502231909e-05 > > > > > > > > 304 KSP preconditioned resid norm 1.468860617921e-06 true > resid norm 2.490692363788e-04 ||r(i)||/||b|| 3.381454480397e-05 > > > > > > > > 305 KSP preconditioned resid norm 1.627595214971e-06 true > resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 > > > > > > > > 306 KSP preconditioned resid norm 1.614384672893e-06 true > resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 > > > > > > > > 307 KSP preconditioned resid norm 1.605568020532e-06 true > resid norm 2.490699757693e-04 ||r(i)||/||b|| 3.381464518632e-05 > > > > > > > > 308 KSP preconditioned resid norm 1.617069685075e-06 true > resid norm 2.490649282923e-04 ||r(i)||/||b|| 3.381395992249e-05 > > > > > > > > 309 KSP preconditioned resid norm 1.654297792738e-06 true > resid norm 2.490644766626e-04 ||r(i)||/||b|| 3.381389860760e-05 > > > > > > > > 310 KSP preconditioned resid norm 1.587528143215e-06 true > resid norm 2.490696752096e-04 ||r(i)||/||b|| 3.381460438124e-05 > > > > > > > > 311 KSP preconditioned resid norm 1.662782022388e-06 true > resid norm 2.490699317737e-04 ||r(i)||/||b|| 3.381463921332e-05 > > > > > > > > 312 KSP preconditioned resid norm 1.618211471748e-06 true > resid norm 2.490735831308e-04 ||r(i)||/||b|| 3.381513493483e-05 > > > > > > > > 313 KSP preconditioned resid norm 1.609074961921e-06 true > resid norm 2.490679566436e-04 ||r(i)||/||b|| 3.381437106247e-05 > > > > > > > > 314 KSP preconditioned resid norm 1.548068942878e-06 true > resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 > > > > > > > > 315 KSP preconditioned resid norm 1.526718322150e-06 true > resid norm 2.490619832967e-04 ||r(i)||/||b|| 3.381356009919e-05 > > > > > > > > 316 KSP preconditioned resid norm 1.553150959105e-06 true > resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 > > > > > > > > 317 KSP preconditioned resid norm 1.615015320906e-06 true > resid norm 2.490672348079e-04 ||r(i)||/||b|| 3.381427306343e-05 > > > > > > > > 318 KSP preconditioned resid norm 1.602904469797e-06 true > resid norm 2.490696731006e-04 ||r(i)||/||b|| 3.381460409491e-05 > > > > > > > > 319 KSP preconditioned resid norm 1.538140323073e-06 true > resid norm 2.490722982494e-04 ||r(i)||/||b|| 3.381496049466e-05 > > > > > > > > 320 KSP preconditioned resid norm 1.534779679430e-06 true > resid norm 2.490778499789e-04 ||r(i)||/||b|| 3.381571421763e-05 > > > > > > > > 321 KSP preconditioned resid norm 1.547155843355e-06 true > resid norm 2.490767612985e-04 ||r(i)||/||b|| 3.381556641442e-05 > > > > > > > > 322 KSP preconditioned resid norm 1.422137008870e-06 true > resid norm 2.490737676309e-04 ||r(i)||/||b|| 3.381515998323e-05 > > > > > > > > 323 KSP preconditioned resid norm 1.403072558954e-06 true > resid norm 2.490741361870e-04 ||r(i)||/||b|| 3.381521001975e-05 > > > > > > > > 324 KSP preconditioned resid norm 1.373070436118e-06 true > resid norm 2.490742214990e-04 ||r(i)||/||b|| 3.381522160202e-05 > > > > > > > > 325 KSP preconditioned resid norm 1.359547585233e-06 true > resid norm 2.490792987570e-04 ||r(i)||/||b|| 3.381591090902e-05 > > > > > > > > 326 KSP preconditioned resid norm 1.370351913612e-06 true > 

On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley <knepley at gmail.com> wrote:

On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:

>   ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
>   ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);
>
>   ierr = VecAssemblyBegin(x);
>   ierr = VecAssemblyEnd(x);

This is probably unnecessary.

>   ierr = VecAssemblyBegin(b);
>   ierr = VecAssemblyEnd(b);

This is probably unnecessary.
>   ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp);
>   ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8

Is your rhs consistent with this nullspace?

>   // KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
>   KSPSetOperators(ksp,A,A);
>
>   KSPSetType(ksp,KSPBCGS);
>
>   KSPSetComputeSingularValues(ksp, PETSC_TRUE);
> #if defined(__HYPRE__)
>   KSPGetPC(ksp, &pc);
>   PCSetType(pc, PCHYPRE);
>   PCHYPRESetType(pc,"boomeramg");

This is terribly unnecessary. You just use

    -pc_type hypre -pc_hypre_type boomeramg

or

    -pc_type gamg

> #else
>   KSPSetType(ksp,KSPBCGSL);
>   KSPBCGSLSetEll(ksp,2);
> #endif /* defined(__HYPRE__) */
>
>   KSPSetFromOptions(ksp);
>   KSPSetUp(ksp);
>
>   ierr = KSPSolve(ksp,b,x);
>
> command line

You did not provide any of what I asked for in the previous mail.

   Matt

On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley <knepley at gmail.com> wrote:

On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:

> hi,
>
> I implemented a HYPRE preconditioner for my study because, without a
> preconditioner, the PETSc solver takes thousands of iterations to converge
> on fine-grid simulations.
>
> With HYPRE, depending on the parallel partition, HYPRE takes forever to do
> anything; the observation from the output file is that the simulation hangs
> with no output.
>
> Any idea what happened? I will post a snippet of code.

1) For any question about convergence, we need to see the output of

    -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason

2) Hypre has many preconditioners; which one are you talking about?

3) PETSc has some preconditioners in common with Hypre, like AMG.

  Thanks,

     Matt

--
Hao Zhang
Dept. of Applied Mathematics and Statistics,
Stony Brook University,
Stony Brook, New York, 11790

--
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
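To make the options-driven approach concrete, a minimal setup might look like
the following sketch. This is illustrative only, not the poster's actual code;
the function name and error handling are assumptions.

    #include <petscksp.h>

    /* Sketch: defer every solver choice to the command line. All of the
       hard-coded KSPSetType/PCSetType/PCHYPRESetType calls above collapse
       into a single KSPSetFromOptions() call. */
    PetscErrorCode SolvePressure(Mat A, Vec b, Vec x)
    {
      KSP            ksp;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
      ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
      ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); /* reads -ksp_type, -pc_type, ... */
      ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
      ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }

The same binary can then be run as, for example,

    mpirun -n 24 ./app -ksp_type gmres -pc_type hypre -pc_hypre_type boomeramg \
        -ksp_monitor_true_residual -ksp_converged_reason -ksp_view

so switching Krylov methods or preconditioners never requires recompiling.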
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From hbcbh1999 at gmail.com  Mon Oct 23 09:04:09 2017
From: hbcbh1999 at gmail.com (Hao Zhang)
Date: Mon, 23 Oct 2017 10:04:09 -0400
Subject: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To: 
References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov>
 <64D7FC90-EBE8-409C-B300-04A118155A98@mcs.anl.gov>
 <5DDD356C-2000-4803-9F3A-02B429898A7A@mcs.anl.gov>
 <8C8ADFC3-C71F-4DE1-B21C-5BF50EBC652D@mcs.anl.gov>
Message-ID: 

The big picture: I'm solving the 3D incompressible Navier-Stokes equations
with a finite difference method on a staggered/MAC grid. The particular
function in question is the Poisson pressure solver, i.e. the Laplacian that
Mark Adams mentioned. The simulation runs fine on medium-size meshes. When I
push to a very fine grid (still not DNS resolution), I have difficulty
getting meaningful physical results. All PETSc solvers converge, but with
huge iteration counts. That is when, and why, I started using HYPRE.

On Mon, Oct 23, 2017 at 9:01 AM, Mark Adams wrote:

> Just to be clear: 1) are you solving the Laplacian (div grad)? 2) What type
> of discretization are you using? And 3) do you have stretched or otherwise
> terrible grids in some way?
>
> On Sun, Oct 22, 2017 at 3:57 PM, Barry Smith wrote:
>>
>>    One thing important to understand is that multigrid is an optimal, or
>> nearly optimal, algorithm. This means that, when it works, the number of
>> iterations remains nearly constant as you refine the mesh, regardless of
>> the problem size and number of processes. Simple preconditioners such as
>> ILU, block Jacobi, and one-level additive Schwarz have iteration counts
>> that increase with the problem size, and likely also with the number of
>> processes. Those algorithms therefore become essentially impractical for
>> very large problems, while multigrid can remain practical (when it works).
>>
>>    Good luck,
>>
>>    Barry
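As a rough, textbook-level illustration of this point (standard estimates,
not numbers from this thread): for a second-order discretization of a 3D
Poisson problem with mesh spacing h,

    \kappa(A) = \mathcal{O}(h^{-2}), \qquad
    n_{\mathrm{iter}}^{\mathrm{CG+ILU/Jacobi}} \sim \sqrt{\kappa(A)} = \mathcal{O}(h^{-1}), \qquad
    n_{\mathrm{iter}}^{\mathrm{MG}} = \mathcal{O}(1),

so each uniform refinement roughly doubles the iteration count of a one-level
method, while a working multigrid cycle stays essentially flat.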
> On Oct 22, 2017, at 2:35 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
>
> Thanks for all the inputs. Before simulating on the finer grid, HYPRE wasn't
> used and the simulations were just fine. I will do a few tests and post more
> information later.
>
> On Sun, Oct 22, 2017 at 12:11 PM, Barry Smith wrote:
>
>> On Oct 21, 2017, at 11:16 PM, Hao Zhang wrote:
>>
>> The reason is that when I run a finer-grid simulation, the matrix becomes
>> stiffer.
>
>    Are you saying that for a finer grid, but everything else the same, the
> convergence of hypre (with the same GMRES) with the same options gets much
> worse? This normally will not happen; that is the fundamental beauty of
> multigrid methods (when they work well).
>
>    Yes, the matrix condition number increases, but multigrid doesn't care
> about that; its number of iterations should remain pretty much the same.
>
>    Something must be different (with this finer grid case): either the mesh
> becomes horrible, or the physics changes, or there are errors in the code
> that lead to the problem.
>
>    What happens if you just refine the mesh a little? Then a little more?
> Then a little more? Does the convergence rate suddenly go bad at some point,
> or does it just get worse slowly?
>
>    Barry
>
>> Much larger condition number. Just to give you a perspective: it takes 6000
>> iterations to converge, and the solver does converge. I want to reduce the
>> number of iterations while keeping the convergence rate. That's the main
>> motivation for all this heavy lifting. Please advise; a snippet will be
>> provided upon request.
>>
>> Thanks again.
>>
>> On Sun, Oct 22, 2017 at 12:08 AM, Barry Smith wrote:
>>
>>    Oh, you change the KSP but not hypre. I did not understand this.
>>
>>    Why not just use GMRES all the time? Why mess with BCGS if it is not
>> robust? It is not worth the small optimization if it breaks everything.
>>
>>    Barry
>>
>>> On Oct 21, 2017, at 11:05 PM, Hao Zhang wrote:
>>>
>>> This is the initial pressure solver output regarding the use of PETSc. It
>>> failed to converge after 40000 iterations; GMRES was then used.
39987 KSP preconditioned resid norm 3.853125986269e-08 true resid norm 1.147359212216e-05 ||r(i)||/||b|| 1.557696568706e-06
[... iterations 39988 through 39999 trimmed: both residual norms are frozen at essentially these values ...]
40000 KSP preconditioned resid norm 3.853122732867e-08 true resid norm 1.147359257776e-05 ||r(i)||/||b|| 1.557696630559e-06
Linear solve did not converge due to DIVERGED_ITS iterations 40000
KSP Object: 24 MPI processes
  type: bcgs
  maximum iterations=40000, initial guess is zero
  tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: 24 MPI processes
  type: hypre
    HYPRE BoomerAMG preconditioning
    Cycle type V
    Maximum number of levels 25
    Maximum number of iterations PER hypre call 1
    Convergence tolerance PER hypre call 0.
    Threshold for strong coupling 0.25
    Interpolation truncation factor 0.
    Interpolation: max elements per row 0
    Number of levels of aggressive coarsening 0
    Number of paths for aggressive coarsening 1
    Maximum row sums 0.9
    Sweeps down         1
    Sweeps up           1
    Sweeps on coarse    1
    Relax down          symmetric-SOR/Jacobi
    Relax up            symmetric-SOR/Jacobi
    Relax on coarse     Gaussian-elimination
    Relax weight  (all)      1.
    Outer relax weight (all) 1.
    Using CF-relaxation
    Not using more complex smoothers.
    Measure type        local
    Coarsen type        Falgout
    Interpolation type  classical
  linear system matrix = precond matrix:
  Mat Object: A 24 MPI processes
    type: mpiaij
    rows=497664, cols=497664
    total: nonzeros=3363552, allocated nonzeros=6967296
    total number of mallocs used during MatSetValues calls =0
      has attached null space
      not using I-node (on process 0) routines

The solution diverges for p0! The residual is 3.853123e-08. Solve again using GMRES!
KSP Object: 24 MPI processes
  type: gmres
    restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    happy breakdown tolerance 1e-30
  maximum iterations=40000, initial guess is zero
  tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: 24 MPI processes
  type: hypre
    [BoomerAMG options identical to the view above]
  linear system matrix = precond matrix:
  Mat Object: A 24 MPI processes
    type: mpiaij
    rows=497664, cols=497664
    total: nonzeros=3363552, allocated nonzeros=6967296
    total number of mallocs used during MatSetValues calls =0
      has attached null space
      not using I-node (on process 0) routines
  0 KSP preconditioned resid norm 1.593802941804e+01 true resid norm 7.365742695119e+00 ||r(i)||/||b|| 1.000000000000e+00
  1 KSP preconditioned resid norm 6.338666661133e-01 true resid norm 2.880722209358e+00 ||r(i)||/||b|| 3.910973174867e-01
  2 KSP preconditioned resid norm 3.913420828350e-02 true resid norm 9.544089760671e-01 ||r(i)||/||b|| 1.295740315093e-01
  3 KSP preconditioned resid norm 2.928070366435e-03 true resid norm 1.874294004628e-01 ||r(i)||/||b|| 2.544609664236e-02
  4 KSP preconditioned resid norm 2.165607525823e-04 true resid norm 3.121122463949e-02 ||r(i)||/||b|| 4.237349298146e-03
  5 KSP preconditioned resid norm 1.635476480407e-05 true resid norm 3.984315313831e-03 ||r(i)||/||b|| 5.409251284967e-04
  6 KSP preconditioned resid norm 1.283358350575e-06 true resid norm 4.566583802915e-04 ||r(i)||/||b|| 6.199760149022e-05
  7 KSP preconditioned resid norm 8.479469225747e-08 true resid norm 3.824581791112e-05 ||r(i)||/||b|| 5.192391248810e-06
  8 KSP preconditioned resid norm 5.358636504683e-09 true resid norm 2.829730442033e-06 ||r(i)||/||b|| 3.841744898187e-07
  9 KSP preconditioned resid norm 3.447874504193e-10 true resid norm 1.756617036538e-07 ||r(i)||/||b|| 2.384847135242e-08
 10 KSP preconditioned resid norm 2.480228743414e-11 true resid norm 1.309399823577e-08 ||r(i)||/||b|| 1.777688792258e-09
 11 KSP preconditioned resid norm 1.728967759950e-12 true resid norm 9.406967045789e-10 ||r(i)||/||b|| 1.277124036931e-10
 12 KSP preconditioned resid norm 1.075458632828e-13 true resid norm 5.994505136212e-11 ||r(i)||/||b|| 8.138358050689e-12
Linear solve converged due to CONVERGED_RTOL iterations 12
KSP Object: 24 MPI processes
  type: gmres
    [same GMRES, tolerance, and BoomerAMG settings as the view above]
  linear system matrix = precond matrix:
  Mat Object: A 24 MPI processes
    type: mpiaij
    rows=497664, cols=497664
    total: nonzeros=3363552, allocated nonzeros=6967296
    total number of mallocs used during MatSetValues calls =0
      has attached null space
      not using I-node (on process 0) routines
The max singular value of A = 1.000872 in poisson_solver3d_P0_vd
The min singular value of A = 0.667688 in poisson_solver3d_P0_vd
The Cond Num of A = 1.499012 in poisson_solver3d_P0_vd
In poisson_solver3d_pressure(): num_iter = 12, rel_residual = 1.075459e-13

The max value of p0 is 0.03115845493408858

The min value of p0 is -0.07156715468428149

On Sun, Oct 22, 2017 at 12:00 AM, Barry Smith wrote:

>> On Oct 21, 2017, at 10:50 PM, Hao Zhang wrote:
>>
>> The incompressible NS solver algorithm calls the PETSc solver at different
>> stages of each time step. The one you were saying "This is good, a 12-digit
>> reduction" about is the initial pressure solve, for which HYPRE usually
>> doesn't give good convergence, so the fallback GMRES solver is called
>> afterwards.
>
>    Hmm, I don't understand. hypre should do well on a pressure solve. In
> fact, very well.
>
>> Barry, you were mentioning that I could have a wrong null space. That
>> particular solver is meant to give an initial pressure profile for a 3D
>> incompressible NS simulation using all-Neumann boundary conditions. Could
>> you give some insight into how to test whether I have a wrong null space?
>
>    -ksp_test_null_space
>
>    But if your null space consistently comes from all-Neumann boundary
> conditions, then it is likely not wrong.
>
>    Barry
>
>> Thanks!
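A minimal sketch of what this looks like in code, against the PETSc 3.8 API
(the MatNullSpaceRemove step, which makes the right-hand side consistent, is
the part that is easy to forget):

    /* Sketch (PETSc 3.8 API): constant null space for an all-Neumann Poisson
       operator A, with a sanity check and a consistent rhs b. */
    MatNullSpace   nullsp;
    PetscBool      isNull;
    PetscErrorCode ierr;

    ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,NULL,&nullsp);CHKERRQ(ierr);
    ierr = MatNullSpaceTest(nullsp,A,&isNull);CHKERRQ(ierr);   /* does A applied to the constant vector vanish? */
    if (!isNull) SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"constant vector is not a null vector of A");
    ierr = MatSetNullSpace(A,nullsp);CHKERRQ(ierr);
    ierr = MatNullSpaceRemove(nullsp,b);CHKERRQ(ierr);         /* project the rhs onto range(A) */
    ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);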
On Sat, Oct 21, 2017 at 11:41 PM, Barry Smith wrote:

>    This is good. You get more than a 12-digit reduction in the true
> residual norm. This is good AMG convergence, expected when everything goes
> well.
>
>    What is different in this case from the previous case that does not
> converge reasonably?
>
>    Barry
>
>> On Oct 21, 2017, at 9:29 PM, Hao Zhang wrote:
>>
>> Barry, please advise: what do you make of this? This is the Poisson solver
>> for the 3D all-Neumann-BC case; a finite difference scheme was used.
>> Thanks! I'm in learning mode.
>>
>> KSP Object: 24 MPI processes
>>   type: bcgs
>>   [same tolerances and BoomerAMG settings as the views above]
>>   0 KSP preconditioned resid norm 2.697270170623e-02 true resid norm 9.637159071207e+00 ||r(i)||/||b|| 1.000000000000e+00
>>   1 KSP preconditioned resid norm 4.828857674609e-04 true resid norm 6.294379664645e-01 ||r(i)||/||b|| 6.531364293291e-02
>>   2 KSP preconditioned resid norm 4.533649595815e-06 true resid norm 1.135508605857e-02 ||r(i)||/||b|| 1.178260727531e-03
>>   3 KSP preconditioned resid norm 1.131704082606e-07 true resid norm 1.196393029874e-04 ||r(i)||/||b|| 1.241437462051e-05
>>   4 KSP preconditioned resid norm 3.866281379569e-10 true resid norm 5.121520801846e-07 ||r(i)||/||b|| 5.314347064320e-08
>>   5 KSP preconditioned resid norm 1.070114785241e-11 true resid norm 1.203693733135e-08 ||r(i)||/||b|| 1.249013038221e-09
>>   6 KSP preconditioned resid norm 2.578780418765e-14 true resid norm 6.020297525927e-11 ||r(i)||/||b|| 6.246962908306e-12
>>   7 KSP preconditioned resid norm 8.691764190203e-16 true resid norm 1.866088098154e-12 ||r(i)||/||b|| 1.936346680973e-13
>> Linear solve converged due to CONVERGED_ATOL iterations 7

On Sat, Oct 21, 2017 at 6:53 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>> On Oct 21, 2017, at 5:47 PM, Hao Zhang wrote:
>>
>> hi, Barry:
>> What do you mean by "absurd" for the tolerance of 1e-14?
>
>    Trying to decrease the initial residual norm by a factor of 1e-14 with
> an iterative method (or even a direct method) is unrealistic, usually
> unachievable, and almost never necessary. You are requiring
>
>    || r_n || < 1e-14 * || r_0 ||
>
> when, with double precision numbers, you only have roughly 16 decimal
> digits total to compute with. Round-off alone will lead to differences far
> larger than 1e-14.
>
>    If you are using the solver in the context of a nonlinear problem (i.e.
> inside Newton's method), then 1e-6 is generally more than plenty to get
> quadratic convergence of Newton's method.
>
>    If you are solving a linear problem, then it is extremely likely that
> the errors due to discretization (from the finite element method, etc.) and
> the model are much, much larger than even 1e-8.
>
>    So, in summary:
>
>    1e-14 is probably unachievable.
>
>    1e-14 is almost for sure not needed.
>
>    Barry
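In code, realistic tolerances would be set with KSPSetTolerances; a sketch
(the values are illustrative, not a prescription from this thread):

    /* Sketch: rtol ~ 1e-8 is already well below typical discretization
       error; abstol is left tiny so the relative test governs convergence. */
    ierr = KSPSetTolerances(ksp,1.e-8,1.e-50,PETSC_DEFAULT,10000);CHKERRQ(ierr);

or equivalently -ksp_rtol 1.0e-8 on the command line.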
>> > > > > >
>> > > > > > >
>> > > > > > > On Sat, Oct 21, 2017 at 18:42 Barry Smith wrote:
>> > > > > > >
>> > > > > > >    Run with -ksp_view_mat binary -ksp_view_rhs binary and send the resulting output file called binaryoutput to petsc-maint at mcs.anl.gov
>> > > > > > >
>> > > > > > >    Note you can also use -ksp_type gmres with hypre; there is unlikely to be a reason to use bcgs.
>> > > > > > >
>> > > > > > >    BTW: "tolerances: relative=1e-14" is absurd.
>> > > > > > >
>> > > > > > >    My guess is your null space is incorrect.
>> > > > > > >
>> > > > > > > > On Oct 21, 2017, at 4:34 PM, Hao Zhang wrote:
>> > > > > > > >
>> > > > > > > > If this solver doesn't converge, I have a fall-back solution, which uses the GMRES solver. This setup is fine with me. I just want to know whether HYPRE is a reliable solution for me, or whether I will have to go without a preconditioner.
>> > > > > > > >
>> > > > > > > > Thanks!
>> > > > > > > >
>> > > > > > > > On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
>> > > > > > > > This is a serial run, still dumping output; parallel is more or less the same.
>> > > > > > > >
>> > > > > > > > KSP Object: 1 MPI processes
>> > > > > > > > type: bcgs
>> > > > > > > > maximum iterations=40000, initial guess is zero
>> > > > > > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000.
>> > > > > > > > left preconditioning
>> > > > > > > > using PRECONDITIONED norm type for convergence test
>> > > > > > > > PC Object: 1 MPI processes
>> > > > > > > > type: hypre
>> > > > > > > > HYPRE BoomerAMG preconditioning
>> > > > > > > > Cycle type V
>> > > > > > > > Maximum number of levels 25
>> > > > > > > > Maximum number of iterations PER hypre call 1
>> > > > > > > > Convergence tolerance PER hypre call 0.
>> > > > > > > > Threshold for strong coupling 0.25
>> > > > > > > > Interpolation truncation factor 0.
>> > > > > > > > Interpolation: max elements per row 0
>> > > > > > > > Number of levels of aggressive coarsening 0
>> > > > > > > > Number of paths for aggressive coarsening 1
>> > > > > > > > Maximum row sums 0.9
>> > > > > > > > Sweeps down 1
>> > > > > > > > Sweeps up 1
>> > > > > > > > Sweeps on coarse 1
>> > > > > > > > Relax down symmetric-SOR/Jacobi
>> > > > > > > > Relax up symmetric-SOR/Jacobi
>> > > > > > > > Relax on coarse Gaussian-elimination
>> > > > > > > > Relax weight (all) 1.
>> > > > > > > > Outer relax weight (all) 1.
>> > > > > > > > Using CF-relaxation
>> > > > > > > > Not using more complex smoothers.
>> > > > > > > > Measure type local >> > > > > > > > Coarsen type Falgout >> > > > > > > > Interpolation type classical >> > > > > > > > linear system matrix = precond matrix: >> > > > > > > > Mat Object: A 1 MPI processes >> > > > > > > > type: seqaij >> > > > > > > > rows=497664, cols=497664 >> > > > > > > > total: nonzeros=3363552, allocated nonzeros=3483648 >> > > > > > > > total number of mallocs used during MatSetValues calls >> =0 >> > > > > > > > has attached null space >> > > > > > > > not using I-node routines >> > > > > > > > 0 KSP preconditioned resid norm 1.630377897956e+01 true >> resid norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00 >> > > > > > > > 1 KSP preconditioned resid norm 3.815819909205e-02 true >> resid norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01 >> > > > > > > > 2 KSP preconditioned resid norm 4.312671277701e-04 true >> resid norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03 >> > > > > > > > 3 KSP preconditioned resid norm 3.011875330569e-04 true >> resid norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05 >> > > > > > > > 4 KSP preconditioned resid norm 3.011783821295e-04 true >> resid norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05 >> > > > > > > > 5 KSP preconditioned resid norm 3.011783818372e-04 true >> resid norm 2.504053673004e-04 ||r(i)||/||b|| 3.399594279422e-05 >> > > > > > > > 6 KSP preconditioned resid norm 3.011783818375e-04 true >> resid norm 2.503984078890e-04 ||r(i)||/||b|| 3.399499795925e-05 >> > > > > > > > 7 KSP preconditioned resid norm 3.011783887442e-04 true >> resid norm 2.504121501772e-04 ||r(i)||/||b|| 3.399686366224e-05 >> > > > > > > > 8 KSP preconditioned resid norm 3.010913654181e-04 true >> resid norm 2.504150259031e-04 ||r(i)||/||b|| 3.399725408124e-05 >> > > > > > > > 9 KSP preconditioned resid norm 3.006520688232e-04 true >> resid norm 2.504061607382e-04 ||r(i)||/||b|| 3.399605051423e-05 >> > > > > > > > 10 KSP preconditioned resid norm 3.007309991942e-04 true >> resid norm 2.503843638523e-04 ||r(i)||/||b|| 3.399309128978e-05 >> > > > > > > > 11 KSP preconditioned resid norm 3.015946168077e-04 true >> resid norm 2.503644677844e-04 ||r(i)||/||b|| 3.399039012728e-05 >> > > > > > > > 12 KSP preconditioned resid norm 2.956643907377e-04 true >> resid norm 2.503863046509e-04 ||r(i)||/||b|| 3.399335477965e-05 >> > > > > > > > 13 KSP preconditioned resid norm 2.997992358936e-04 true >> resid norm 2.504336903093e-04 ||r(i)||/||b|| 3.399978802886e-05 >> > > > > > > > 14 KSP preconditioned resid norm 2.481415420420e-05 true >> resid norm 2.491591201250e-04 ||r(i)||/||b|| 3.382674774806e-05 >> > > > > > > > 15 KSP preconditioned resid norm 2.615494786181e-05 true >> resid norm 2.490353237273e-04 ||r(i)||/||b|| 3.380994069915e-05 >> > > > > > > > 16 KSP preconditioned resid norm 2.645126692130e-05 true >> resid norm 2.490535523344e-04 ||r(i)||/||b|| 3.381241548111e-05 >> > > > > > > > 17 KSP preconditioned resid norm 2.667223026209e-05 true >> resid norm 2.490482602536e-04 ||r(i)||/||b|| 3.381169700898e-05 >> > > > > > > > 18 KSP preconditioned resid norm 2.650813432116e-05 true >> resid norm 2.490473169014e-04 ||r(i)||/||b|| 3.381156893606e-05 >> > > > > > > > 19 KSP preconditioned resid norm 2.613309555449e-05 true >> resid norm 2.490465633690e-04 ||r(i)||/||b|| 3.381146663375e-05 >> > > > > > > > 20 KSP preconditioned resid norm 2.644160446804e-05 true >> resid norm 2.490532739949e-04 ||r(i)||/||b|| 3.381237769272e-05 >> > > > > > > > 21 KSP preconditioned resid norm 2.635987608975e-05 true >> 
resid norm 2.490499548926e-04 ||r(i)||/||b|| 3.381192707933e-05 >> > > > > > > > 22 KSP preconditioned resid norm 2.640527129095e-05 true >> resid norm 2.490594066529e-04 ||r(i)||/||b|| 3.381321028466e-05 >> > > > > > > > 23 KSP preconditioned resid norm 2.627505117691e-05 true >> resid norm 2.490550162585e-04 ||r(i)||/||b|| 3.381261422875e-05 >> > > > > > > > 24 KSP preconditioned resid norm 2.642659196388e-05 true >> resid norm 2.490504347640e-04 ||r(i)||/||b|| 3.381199222842e-05 >> > > > > > > > 25 KSP preconditioned resid norm 2.659432190695e-05 true >> resid norm 2.490510775152e-04 ||r(i)||/||b|| 3.381207949065e-05 >> > > > > > > > 26 KSP preconditioned resid norm 2.687918062951e-05 true >> resid norm 2.490518882015e-04 ||r(i)||/||b|| 3.381218955237e-05 >> > > > > > > > 27 KSP preconditioned resid norm 2.662909048432e-05 true >> resid norm 2.490446263285e-04 ||r(i)||/||b|| 3.381120365409e-05 >> > > > > > > > 28 KSP preconditioned resid norm 2.085466483199e-05 true >> resid norm 2.490131612366e-04 ||r(i)||/||b|| 3.380693183886e-05 >> > > > > > > > 29 KSP preconditioned resid norm 2.098541330282e-05 true >> resid norm 2.490126933398e-04 ||r(i)||/||b|| 3.380686831549e-05 >> > > > > > > > 30 KSP preconditioned resid norm 2.175345180286e-05 true >> resid norm 2.490098852429e-04 ||r(i)||/||b|| 3.380648707805e-05 >> > > > > > > > 31 KSP preconditioned resid norm 2.182182437676e-05 true >> resid norm 2.490028301020e-04 ||r(i)||/||b|| 3.380552924648e-05 >> > > > > > > > 32 KSP preconditioned resid norm 2.152970404369e-05 true >> resid norm 2.490089939838e-04 ||r(i)||/||b|| 3.380636607747e-05 >> > > > > > > > 33 KSP preconditioned resid norm 2.187932450016e-05 true >> resid norm 2.490085293931e-04 ||r(i)||/||b|| 3.380630300295e-05 >> > > > > > > > 34 KSP preconditioned resid norm 2.207255875067e-05 true >> resid norm 2.490039036092e-04 ||r(i)||/||b|| 3.380567498971e-05 >> > > > > > > > 35 KSP preconditioned resid norm 2.205060279701e-05 true >> resid norm 2.490101636150e-04 ||r(i)||/||b|| 3.380652487086e-05 >> > > > > > > > 36 KSP preconditioned resid norm 2.168654200416e-05 true >> resid norm 2.490091609876e-04 ||r(i)||/||b|| 3.380638875052e-05 >> > > > > > > > 37 KSP preconditioned resid norm 2.164521042361e-05 true >> resid norm 2.490083143913e-04 ||r(i)||/||b|| 3.380627381352e-05 >> > > > > > > > 38 KSP preconditioned resid norm 2.154429063973e-05 true >> resid norm 2.490075485470e-04 ||r(i)||/||b|| 3.380616983972e-05 >> > > > > > > > 39 KSP preconditioned resid norm 2.165962086228e-05 true >> resid norm 2.490099695056e-04 ||r(i)||/||b|| 3.380649851786e-05 >> > > > > > > > 40 KSP preconditioned resid norm 2.153877616091e-05 true >> resid norm 2.490090652619e-04 ||r(i)||/||b|| 3.380637575444e-05 >> > > > > > > > 41 KSP preconditioned resid norm 2.347651187611e-05 true >> resid norm 2.490233544624e-04 ||r(i)||/||b|| 3.380831570825e-05 >> > > > > > > > 42 KSP preconditioned resid norm 2.352860162514e-05 true >> resid norm 2.490191394202e-04 ||r(i)||/||b|| 3.380774345879e-05 >> > > > > > > > 43 KSP preconditioned resid norm 2.312377506928e-05 true >> resid norm 2.490209491359e-04 ||r(i)||/||b|| 3.380798915237e-05 >> > > > > > > > 44 KSP preconditioned resid norm 2.295770973533e-05 true >> resid norm 2.490178136759e-04 ||r(i)||/||b|| 3.380756347093e-05 >> > > > > > > > 45 KSP preconditioned resid norm 2.833646456041e-05 true >> resid norm 2.489991602651e-04 ||r(i)||/||b|| 3.380503101608e-05 >> > > > > > > > 46 KSP preconditioned resid norm 2.760296424494e-05 true >> resid norm 2.490104320666e-04 
||r(i)||/||b|| 3.380656131682e-05 >> > > > > > > > 47 KSP preconditioned resid norm 2.451504295239e-05 true >> resid norm 2.490241388672e-04 ||r(i)||/||b|| 3.380842220189e-05 >> > > > > > > > 48 KSP preconditioned resid norm 2.512391514098e-05 true >> resid norm 2.490245923753e-04 ||r(i)||/||b|| 3.380848377180e-05 >> > > > > > > > 49 KSP preconditioned resid norm 2.483419450528e-05 true >> resid norm 2.490273364402e-04 ||r(i)||/||b|| 3.380885631602e-05 >> > > > > > > > 50 KSP preconditioned resid norm 2.507460538466e-05 true >> resid norm 2.490309488780e-04 ||r(i)||/||b|| 3.380934675371e-05 >> > > > > > > > 51 KSP preconditioned resid norm 2.499708772881e-05 true >> resid norm 2.490300908170e-04 ||r(i)||/||b|| 3.380923026022e-05 >> > > > > > > > 52 KSP preconditioned resid norm 1.059778259446e-05 true >> resid norm 2.489352833521e-04 ||r(i)||/||b|| 3.379635885420e-05 >> > > > > > > > 53 KSP preconditioned resid norm 1.074975117060e-05 true >> resid norm 2.489294722901e-04 ||r(i)||/||b|| 3.379556992330e-05 >> > > > > > > > 54 KSP preconditioned resid norm 1.095242219559e-05 true >> resid norm 2.489295454212e-04 ||r(i)||/||b|| 3.379557985184e-05 >> > > > > > > > 55 KSP preconditioned resid norm 8.359999674720e-06 true >> resid norm 2.489673581944e-04 ||r(i)||/||b|| 3.380071345137e-05 >> > > > > > > > 56 KSP preconditioned resid norm 8.368232998470e-06 true >> resid norm 2.489700421343e-04 ||r(i)||/||b|| 3.380107783281e-05 >> > > > > > > > 57 KSP preconditioned resid norm 8.443378041101e-06 true >> resid norm 2.489702900875e-04 ||r(i)||/||b|| 3.380111149584e-05 >> > > > > > > > 58 KSP preconditioned resid norm 8.647159584302e-06 true >> resid norm 2.489640805831e-04 ||r(i)||/||b|| 3.380026847095e-05 >> > > > > > > > 59 KSP preconditioned resid norm 1.024742790737e-05 true >> resid norm 2.489447846660e-04 ||r(i)||/||b|| 3.379764878711e-05 >> > > > > > > > 60 KSP preconditioned resid norm 1.033394118910e-05 true >> resid norm 2.489441404923e-04 ||r(i)||/||b|| 3.379756133175e-05 >> > > > > > > > 61 KSP preconditioned resid norm 1.030066336446e-05 true >> resid norm 2.489399918556e-04 ||r(i)||/||b|| 3.379699809776e-05 >> > > > > > > > 62 KSP preconditioned resid norm 1.029956398963e-05 true >> resid norm 2.489445295139e-04 ||r(i)||/||b|| 3.379761414674e-05 >> > > > > > > > 63 KSP preconditioned resid norm 1.028190129002e-05 true >> resid norm 2.489456200527e-04 ||r(i)||/||b|| 3.379776220225e-05 >> > > > > > > > 64 KSP preconditioned resid norm 9.878799185773e-06 true >> resid norm 2.489488742330e-04 ||r(i)||/||b|| 3.379820400160e-05 >> > > > > > > > 65 KSP preconditioned resid norm 9.917711104174e-06 true >> resid norm 2.489478066593e-04 ||r(i)||/||b|| 3.379805906391e-05 >> > > > > > > > 66 KSP preconditioned resid norm 1.003572019576e-05 true >> resid norm 2.489441995703e-04 ||r(i)||/||b|| 3.379756935240e-05 >> > > > > > > > 67 KSP preconditioned resid norm 9.924487278236e-06 true >> resid norm 2.489475403451e-04 ||r(i)||/||b|| 3.379802290812e-05 >> > > > > > > > 68 KSP preconditioned resid norm 9.804213483359e-06 true >> resid norm 2.489457781760e-04 ||r(i)||/||b|| 3.379778366964e-05 >> > > > > > > > 69 KSP preconditioned resid norm 9.748922705476e-06 true >> resid norm 2.489408473578e-04 ||r(i)||/||b|| 3.379711424383e-05 >> > > > > > > > 70 KSP preconditioned resid norm 9.886044523689e-06 true >> resid norm 2.489514438395e-04 ||r(i)||/||b|| 3.379855286071e-05 >> > > > > > > > 71 KSP preconditioned resid norm 1.083888478689e-05 true >> resid norm 2.489420898851e-04 ||r(i)||/||b|| 
3.379728293386e-05 >> > > > > > > > 72 KSP preconditioned resid norm 1.106561823757e-05 true >> resid norm 2.489364778104e-04 ||r(i)||/||b|| 3.379652101821e-05 >> > > > > > > > 73 KSP preconditioned resid norm 1.132091515426e-05 true >> resid norm 2.489456804535e-04 ||r(i)||/||b|| 3.379777040248e-05 >> > > > > > > > 74 KSP preconditioned resid norm 1.330905328963e-05 true >> resid norm 2.489317458981e-04 ||r(i)||/||b|| 3.379587859660e-05 >> > > > > > > > 75 KSP preconditioned resid norm 1.305555302619e-05 true >> resid norm 2.489320939810e-04 ||r(i)||/||b|| 3.379592585359e-05 >> > > > > > > > 76 KSP preconditioned resid norm 1.308083397399e-05 true >> resid norm 2.489299951581e-04 ||r(i)||/||b|| 3.379564090977e-05 >> > > > > > > > 77 KSP preconditioned resid norm 1.320098861853e-05 true >> resid norm 2.489323669317e-04 ||r(i)||/||b|| 3.379596291036e-05 >> > > > > > > > 78 KSP preconditioned resid norm 1.300160788274e-05 true >> resid norm 2.489306393356e-04 ||r(i)||/||b|| 3.379572836564e-05 >> > > > > > > > 79 KSP preconditioned resid norm 1.317651537793e-05 true >> resid norm 2.489381364970e-04 ||r(i)||/||b|| 3.379674620752e-05 >> > > > > > > > 80 KSP preconditioned resid norm 1.309769805765e-05 true >> resid norm 2.489285056062e-04 ||r(i)||/||b|| 3.379543868279e-05 >> > > > > > > > 81 KSP preconditioned resid norm 1.293686496271e-05 true >> resid norm 2.489347818072e-04 ||r(i)||/||b|| 3.379629076264e-05 >> > > > > > > > 82 KSP preconditioned resid norm 1.311788285799e-05 true >> resid norm 2.489320040215e-04 ||r(i)||/||b|| 3.379591364037e-05 >> > > > > > > > 83 KSP preconditioned resid norm 1.313667378798e-05 true >> resid norm 2.489329437217e-04 ||r(i)||/||b|| 3.379604121748e-05 >> > > > > > > > 84 KSP preconditioned resid norm 1.416138205017e-05 true >> resid norm 2.489266908838e-04 ||r(i)||/||b|| 3.379519230948e-05 >> > > > > > > > 85 KSP preconditioned resid norm 1.452253464774e-05 true >> resid norm 2.489285688375e-04 ||r(i)||/||b|| 3.379544726729e-05 >> > > > > > > > 86 KSP preconditioned resid norm 1.426709413370e-05 true >> resid norm 2.489362313402e-04 ||r(i)||/||b|| 3.379648755651e-05 >> > > > > > > > 87 KSP preconditioned resid norm 1.427480849552e-05 true >> resid norm 2.489378183000e-04 ||r(i)||/||b|| 3.379670300795e-05 >> > > > > > > > 88 KSP preconditioned resid norm 1.413870980147e-05 true >> resid norm 2.489325756118e-04 ||r(i)||/||b|| 3.379599124153e-05 >> > > > > > > > 89 KSP preconditioned resid norm 1.353259857657e-05 true >> resid norm 2.489318968308e-04 ||r(i)||/||b|| 3.379589908776e-05 >> > > > > > > > 90 KSP preconditioned resid norm 1.347676448611e-05 true >> resid norm 2.489332074417e-04 ||r(i)||/||b|| 3.379607702106e-05 >> > > > > > > > 91 KSP preconditioned resid norm 1.362825902909e-05 true >> resid norm 2.489344974971e-04 ||r(i)||/||b|| 3.379625216367e-05 >> > > > > > > > 92 KSP preconditioned resid norm 1.346280901052e-05 true >> resid norm 2.489302570131e-04 ||r(i)||/||b|| 3.379567646016e-05 >> > > > > > > > 93 KSP preconditioned resid norm 1.328052169696e-05 true >> resid norm 2.489346601224e-04 ||r(i)||/||b|| 3.379627424228e-05 >> > > > > > > > 94 KSP preconditioned resid norm 1.554682082515e-05 true >> resid norm 2.489309078759e-04 ||r(i)||/||b|| 3.379576482365e-05 >> > > > > > > > 95 KSP preconditioned resid norm 1.557128675775e-05 true >> resid norm 2.489317143582e-04 ||r(i)||/||b|| 3.379587431462e-05 >> > > > > > > > 96 KSP preconditioned resid norm 1.542571813923e-05 true >> resid norm 2.489319910303e-04 ||r(i)||/||b|| 3.379591187663e-05 >> > > > > 
> > > 97 KSP preconditioned resid norm 1.570516684444e-05 true >> resid norm 2.489321980894e-04 ||r(i)||/||b|| 3.379593998772e-05 >> > > > > > > > 98 KSP preconditioned resid norm 1.600431789899e-05 true >> resid norm 2.489297450311e-04 ||r(i)||/||b|| 3.379560695162e-05 >> > > > > > > > 99 KSP preconditioned resid norm 1.587495554658e-05 true >> resid norm 2.489339000570e-04 ||r(i)||/||b|| 3.379617105303e-05 >> > > > > > > > 100 KSP preconditioned resid norm 1.621163002878e-05 true >> resid norm 2.489299953360e-04 ||r(i)||/||b|| 3.379564093392e-05 >> > > > > > > > 101 KSP preconditioned resid norm 1.627060872574e-05 true >> resid norm 2.489301570161e-04 ||r(i)||/||b|| 3.379566288419e-05 >> > > > > > > > 102 KSP preconditioned resid norm 1.622931647243e-05 true >> resid norm 2.489277930910e-04 ||r(i)||/||b|| 3.379534194913e-05 >> > > > > > > > 103 KSP preconditioned resid norm 1.612544300282e-05 true >> resid norm 2.489317483299e-04 ||r(i)||/||b|| 3.379587892674e-05 >> > > > > > > > 104 KSP preconditioned resid norm 1.880131646630e-05 true >> resid norm 2.489335862583e-04 ||r(i)||/||b|| 3.379612845059e-05 >> > > > > > > > 105 KSP preconditioned resid norm 1.880563295793e-05 true >> resid norm 2.489365017923e-04 ||r(i)||/||b|| 3.379652427408e-05 >> > > > > > > > 106 KSP preconditioned resid norm 1.860619184027e-05 true >> resid norm 2.489362382373e-04 ||r(i)||/||b|| 3.379648849288e-05 >> > > > > > > > 107 KSP preconditioned resid norm 1.877134148719e-05 true >> resid norm 2.489425484523e-04 ||r(i)||/||b|| 3.379734519061e-05 >> > > > > > > > 108 KSP preconditioned resid norm 1.914810713538e-05 true >> resid norm 2.489347415573e-04 ||r(i)||/||b|| 3.379628529818e-05 >> > > > > > > > 109 KSP preconditioned resid norm 1.220673255622e-05 true >> resid norm 2.490196186357e-04 ||r(i)||/||b|| 3.380780851884e-05 >> > > > > > > > 110 KSP preconditioned resid norm 1.215819132910e-05 true >> resid norm 2.490233518072e-04 ||r(i)||/||b|| 3.380831534776e-05 >> > > > > > > > 111 KSP preconditioned resid norm 1.196565427400e-05 true >> resid norm 2.490279438906e-04 ||r(i)||/||b|| 3.380893878570e-05 >> > > > > > > > 112 KSP preconditioned resid norm 1.171748185197e-05 true >> resid norm 2.490242578983e-04 ||r(i)||/||b|| 3.380843836198e-05 >> > > > > > > > 113 KSP preconditioned resid norm 1.162855824118e-05 true >> resid norm 2.490229018536e-04 ||r(i)||/||b|| 3.380825426043e-05 >> > > > > > > > 114 KSP preconditioned resid norm 1.175594685689e-05 true >> resid norm 2.490274328440e-04 ||r(i)||/||b|| 3.380886940415e-05 >> > > > > > > > 115 KSP preconditioned resid norm 1.167979454122e-05 true >> resid norm 2.490271161036e-04 ||r(i)||/||b|| 3.380882640232e-05 >> > > > > > > > 116 KSP preconditioned resid norm 1.181010893019e-05 true >> resid norm 2.490235420657e-04 ||r(i)||/||b|| 3.380834117795e-05 >> > > > > > > > 117 KSP preconditioned resid norm 1.175206638194e-05 true >> resid norm 2.490263165345e-04 ||r(i)||/||b|| 3.380871784992e-05 >> > > > > > > > 118 KSP preconditioned resid norm 1.183804125791e-05 true >> resid norm 2.490221353083e-04 ||r(i)||/||b|| 3.380815019145e-05 >> > > > > > > > 119 KSP preconditioned resid norm 1.186426973727e-05 true >> resid norm 2.490227115336e-04 ||r(i)||/||b|| 3.380822842189e-05 >> > > > > > > > 120 KSP preconditioned resid norm 1.181986776689e-05 true >> resid norm 2.490257884230e-04 ||r(i)||/||b|| 3.380864615159e-05 >> > > > > > > > 121 KSP preconditioned resid norm 1.131443277370e-05 true >> resid norm 2.490259110230e-04 ||r(i)||/||b|| 3.380866279620e-05 >> > > > > > > > 
122 KSP preconditioned resid norm 1.114920075859e-05 true >> resid norm 2.490249829382e-04 ||r(i)||/||b|| 3.380853679603e-05 >> > > > > > > > 123 KSP preconditioned resid norm 1.082073321672e-05 true >> resid norm 2.490314084868e-04 ||r(i)||/||b|| 3.380940915187e-05 >> > > > > > > > 124 KSP preconditioned resid norm 3.307785860990e-06 true >> resid norm 2.490613501549e-04 ||r(i)||/||b|| 3.381347414155e-05 >> > > > > > > > 125 KSP preconditioned resid norm 3.287051720572e-06 true >> resid norm 2.490584648195e-04 ||r(i)||/||b|| 3.381308241794e-05 >> > > > > > > > 126 KSP preconditioned resid norm 3.286797046069e-06 true >> resid norm 2.490654396386e-04 ||r(i)||/||b|| 3.381402934473e-05 >> > > > > > > > 127 KSP preconditioned resid norm 3.311592899411e-06 true >> resid norm 2.490627973588e-04 ||r(i)||/||b|| 3.381367061922e-05 >> > > > > > > > 128 KSP preconditioned resid norm 3.560993694635e-06 true >> resid norm 2.490571732816e-04 ||r(i)||/||b|| 3.381290707406e-05 >> > > > > > > > 129 KSP preconditioned resid norm 3.411994617661e-06 true >> resid norm 2.490652122141e-04 ||r(i)||/||b|| 3.381399846875e-05 >> > > > > > > > 130 KSP preconditioned resid norm 3.412383310721e-06 true >> resid norm 2.490633454022e-04 ||r(i)||/||b|| 3.381374502359e-05 >> > > > > > > > 131 KSP preconditioned resid norm 3.288320044878e-06 true >> resid norm 2.490639470096e-04 ||r(i)||/||b|| 3.381382669999e-05 >> > > > > > > > 132 KSP preconditioned resid norm 3.273215756565e-06 true >> resid norm 2.490640390847e-04 ||r(i)||/||b|| 3.381383920043e-05 >> > > > > > > > 133 KSP preconditioned resid norm 3.236969051459e-06 true >> resid norm 2.490678216102e-04 ||r(i)||/||b|| 3.381435272985e-05 >> > > > > > > > 134 KSP preconditioned resid norm 3.203260913942e-06 true >> resid norm 2.490640965346e-04 ||r(i)||/||b|| 3.381384700005e-05 >> > > > > > > > 135 KSP preconditioned resid norm 3.224117152353e-06 true >> resid norm 2.490655026376e-04 ||r(i)||/||b|| 3.381403789770e-05 >> > > > > > > > 136 KSP preconditioned resid norm 3.221577997984e-06 true >> resid norm 2.490684737611e-04 ||r(i)||/||b|| 3.381444126823e-05 >> > > > > > > > 137 KSP preconditioned resid norm 3.195936222128e-06 true >> resid norm 2.490673982333e-04 ||r(i)||/||b|| 3.381429525066e-05 >> > > > > > > > 138 KSP preconditioned resid norm 3.207528137426e-06 true >> resid norm 2.490641247196e-04 ||r(i)||/||b|| 3.381385082655e-05 >> > > > > > > > 139 KSP preconditioned resid norm 3.240134271963e-06 true >> resid norm 2.490615861251e-04 ||r(i)||/||b|| 3.381350617773e-05 >> > > > > > > > 140 KSP preconditioned resid norm 2.698833607230e-06 true >> resid norm 2.490638954889e-04 ||r(i)||/||b|| 3.381381970535e-05 >> > > > > > > > 141 KSP preconditioned resid norm 2.599151209137e-06 true >> resid norm 2.490657106698e-04 ||r(i)||/||b|| 3.381406614091e-05 >> > > > > > > > 142 KSP preconditioned resid norm 2.633939920994e-06 true >> resid norm 2.490707754695e-04 ||r(i)||/||b|| 3.381475375652e-05 >> > > > > > > > 143 KSP preconditioned resid norm 2.519609221376e-06 true >> resid norm 2.490639100480e-04 ||r(i)||/||b|| 3.381382168195e-05 >> > > > > > > > 144 KSP preconditioned resid norm 3.768526937684e-06 true >> resid norm 2.490654096698e-04 ||r(i)||/||b|| 3.381402527606e-05 >> > > > > > > > 145 KSP preconditioned resid norm 3.707841943289e-06 true >> resid norm 2.490630207923e-04 ||r(i)||/||b|| 3.381370095336e-05 >> > > > > > > > 146 KSP preconditioned resid norm 3.698827503486e-06 true >> resid norm 2.490646071561e-04 ||r(i)||/||b|| 3.381391632387e-05 >> > > > > > > > 147 
KSP preconditioned resid norm 3.642747039615e-06 true >> resid norm 2.490610990161e-04 ||r(i)||/||b|| 3.381344004604e-05 >> > > > > > > > 148 KSP preconditioned resid norm 3.613100087842e-06 true >> resid norm 2.490617159023e-04 ||r(i)||/||b|| 3.381352379676e-05 >> > > > > > > > 149 KSP preconditioned resid norm 3.637646399299e-06 true >> resid norm 2.490648063023e-04 ||r(i)||/||b|| 3.381394336069e-05 >> > > > > > > > 150 KSP preconditioned resid norm 3.640235367864e-06 true >> resid norm 2.490648516718e-04 ||r(i)||/||b|| 3.381394952022e-05 >> > > > > > > > 151 KSP preconditioned resid norm 3.724708848977e-06 true >> resid norm 2.490622201040e-04 ||r(i)||/||b|| 3.381359224901e-05 >> > > > > > > > 152 KSP preconditioned resid norm 3.665185002770e-06 true >> resid norm 2.490664302790e-04 ||r(i)||/||b|| 3.381416383766e-05 >> > > > > > > > 153 KSP preconditioned resid norm 3.348992579120e-06 true >> resid norm 2.490655722697e-04 ||r(i)||/||b|| 3.381404735121e-05 >> > > > > > > > 154 KSP preconditioned resid norm 3.309431137943e-06 true >> resid norm 2.490727563300e-04 ||r(i)||/||b|| 3.381502268535e-05 >> > > > > > > > 155 KSP preconditioned resid norm 3.299031245428e-06 true >> resid norm 2.490688392843e-04 ||r(i)||/||b|| 3.381449089298e-05 >> > > > > > > > 156 KSP preconditioned resid norm 3.297127463503e-06 true >> resid norm 2.490642207769e-04 ||r(i)||/||b|| 3.381386386763e-05 >> > > > > > > > 157 KSP preconditioned resid norm 3.297370198641e-06 true >> resid norm 2.490666651723e-04 ||r(i)||/||b|| 3.381419572764e-05 >> > > > > > > > 158 KSP preconditioned resid norm 3.290873165210e-06 true >> resid norm 2.490679189538e-04 ||r(i)||/||b|| 3.381436594557e-05 >> > > > > > > > 159 KSP preconditioned resid norm 3.346705292419e-06 true >> resid norm 2.490617329776e-04 ||r(i)||/||b|| 3.381352611496e-05 >> > > > > > > > 160 KSP preconditioned resid norm 3.429583550890e-06 true >> resid norm 2.490675116236e-04 ||r(i)||/||b|| 3.381431064494e-05 >> > > > > > > > 161 KSP preconditioned resid norm 3.425238504679e-06 true >> resid norm 2.490648199058e-04 ||r(i)||/||b|| 3.381394520756e-05 >> > > > > > > > 162 KSP preconditioned resid norm 3.423484857849e-06 true >> resid norm 2.490723208298e-04 ||r(i)||/||b|| 3.381496356025e-05 >> > > > > > > > 163 KSP preconditioned resid norm 3.383655922943e-06 true >> resid norm 2.490659981249e-04 ||r(i)||/||b|| 3.381410516686e-05 >> > > > > > > > 164 KSP preconditioned resid norm 3.477197358452e-06 true >> resid norm 2.490665979073e-04 ||r(i)||/||b|| 3.381418659549e-05 >> > > > > > > > 165 KSP preconditioned resid norm 3.454672202601e-06 true >> resid norm 2.490651358644e-04 ||r(i)||/||b|| 3.381398810323e-05 >> > > > > > > > 166 KSP preconditioned resid norm 3.399075522566e-06 true >> resid norm 2.490678159511e-04 ||r(i)||/||b|| 3.381435196154e-05 >> > > > > > > > 167 KSP preconditioned resid norm 3.305455787400e-06 true >> resid norm 2.490651924523e-04 ||r(i)||/||b|| 3.381399578581e-05 >> > > > > > > > 168 KSP preconditioned resid norm 3.368445533284e-06 true >> resid norm 2.490688061735e-04 ||r(i)||/||b|| 3.381448639774e-05 >> > > > > > > > 169 KSP preconditioned resid norm 2.981519724814e-06 true >> resid norm 2.490676378334e-04 ||r(i)||/||b|| 3.381432777964e-05 >> > > > > > > > 170 KSP preconditioned resid norm 3.034423065539e-06 true >> resid norm 2.490694458885e-04 ||r(i)||/||b|| 3.381457324777e-05 >> > > > > > > > 171 KSP preconditioned resid norm 2.885972780503e-06 true >> resid norm 2.490688033729e-04 ||r(i)||/||b|| 3.381448601752e-05 >> > > > > > > > 172 KSP 
preconditioned resid norm 2.892491075033e-06 true >> resid norm 2.490692993765e-04 ||r(i)||/||b|| 3.381455335678e-05 >> > > > > > > > 173 KSP preconditioned resid norm 2.921316177611e-06 true >> resid norm 2.490697629787e-04 ||r(i)||/||b|| 3.381461629709e-05 >> > > > > > > > 174 KSP preconditioned resid norm 2.999889222269e-06 true >> resid norm 2.490707272626e-04 ||r(i)||/||b|| 3.381474721178e-05 >> > > > > > > > 175 KSP preconditioned resid norm 2.975590207575e-06 true >> resid norm 2.490685439925e-04 ||r(i)||/||b|| 3.381445080310e-05 >> > > > > > > > 176 KSP preconditioned resid norm 2.983065843597e-06 true >> resid norm 2.490701883671e-04 ||r(i)||/||b|| 3.381467404937e-05 >> > > > > > > > 177 KSP preconditioned resid norm 2.965959610245e-06 true >> resid norm 2.490711538630e-04 ||r(i)||/||b|| 3.381480512861e-05 >> > > > > > > > 178 KSP preconditioned resid norm 3.005389788827e-06 true >> resid norm 2.490751808095e-04 ||r(i)||/||b|| 3.381535184150e-05 >> > > > > > > > 179 KSP preconditioned resid norm 2.956581668772e-06 true >> resid norm 2.490653125636e-04 ||r(i)||/||b|| 3.381401209257e-05 >> > > > > > > > 180 KSP preconditioned resid norm 2.937498883661e-06 true >> resid norm 2.490666056653e-04 ||r(i)||/||b|| 3.381418764874e-05 >> > > > > > > > 181 KSP preconditioned resid norm 2.913227475431e-06 true >> resid norm 2.490682436979e-04 ||r(i)||/||b|| 3.381441003402e-05 >> > > > > > > > 182 KSP preconditioned resid norm 3.048172862254e-06 true >> resid norm 2.490719669872e-04 ||r(i)||/||b|| 3.381491552130e-05 >> > > > > > > > 183 KSP preconditioned resid norm 3.023868104933e-06 true >> resid norm 2.490648745555e-04 ||r(i)||/||b|| 3.381395262699e-05 >> > > > > > > > 184 KSP preconditioned resid norm 2.985947506400e-06 true >> resid norm 2.490638818852e-04 ||r(i)||/||b|| 3.381381785846e-05 >> > > > > > > > 185 KSP preconditioned resid norm 2.840032055776e-06 true >> resid norm 2.490701112392e-04 ||r(i)||/||b|| 3.381466357820e-05 >> > > > > > > > 186 KSP preconditioned resid norm 2.229279683815e-06 true >> resid norm 2.490609220680e-04 ||r(i)||/||b|| 3.381341602292e-05 >> > > > > > > > 187 KSP preconditioned resid norm 2.441513276379e-06 true >> resid norm 2.490674056899e-04 ||r(i)||/||b|| 3.381429626300e-05 >> > > > > > > > 188 KSP preconditioned resid norm 2.467046864016e-06 true >> resid norm 2.490691622632e-04 ||r(i)||/||b|| 3.381453474178e-05 >> > > > > > > > 189 KSP preconditioned resid norm 2.482124586361e-06 true >> resid norm 2.490664992339e-04 ||r(i)||/||b|| 3.381417319923e-05 >> > > > > > > > 190 KSP preconditioned resid norm 2.470564926502e-06 true >> resid norm 2.490617019713e-04 ||r(i)||/||b|| 3.381352190543e-05 >> > > > > > > > 191 KSP preconditioned resid norm 2.457947086578e-06 true >> resid norm 2.490628644250e-04 ||r(i)||/||b|| 3.381367972437e-05 >> > > > > > > > 192 KSP preconditioned resid norm 2.469444741724e-06 true >> resid norm 2.490639416335e-04 ||r(i)||/||b|| 3.381382597011e-05 >> > > > > > > > 193 KSP preconditioned resid norm 2.469951525219e-06 true >> resid norm 2.490599769764e-04 ||r(i)||/||b|| 3.381328771385e-05 >> > > > > > > > 194 KSP preconditioned resid norm 2.467486786643e-06 true >> resid norm 2.490630178622e-04 ||r(i)||/||b|| 3.381370055556e-05 >> > > > > > > > 195 KSP preconditioned resid norm 2.409684391404e-06 true >> resid norm 2.490640302606e-04 ||r(i)||/||b|| 3.381383800245e-05 >> > > > > > > > 196 KSP preconditioned resid norm 2.456046691135e-06 true >> resid norm 2.490637730235e-04 ||r(i)||/||b|| 3.381380307900e-05 >> > > > > > > > 197 KSP 
preconditioned resid norm 2.300015653805e-06 true >> resid norm 2.490615406913e-04 ||r(i)||/||b|| 3.381350000947e-05 >> > > > > > > > 198 KSP preconditioned resid norm 2.238328275301e-06 true >> resid norm 2.490647641246e-04 ||r(i)||/||b|| 3.381393763449e-05 >> > > > > > > > 199 KSP preconditioned resid norm 2.317293820319e-06 true >> resid norm 2.490641611282e-04 ||r(i)||/||b|| 3.381385576951e-05 >> > > > > > > > 200 KSP preconditioned resid norm 2.359590971314e-06 true >> resid norm 2.490685242974e-04 ||r(i)||/||b|| 3.381444812922e-05 >> > > > > > > > 201 KSP preconditioned resid norm 2.311199691596e-06 true >> resid norm 2.490656791753e-04 ||r(i)||/||b|| 3.381406186510e-05 >> > > > > > > > 202 KSP preconditioned resid norm 2.328772904196e-06 true >> resid norm 2.490651045523e-04 ||r(i)||/||b|| 3.381398385220e-05 >> > > > > > > > 203 KSP preconditioned resid norm 2.332731604717e-06 true >> resid norm 2.490649960574e-04 ||r(i)||/||b|| 3.381396912253e-05 >> > > > > > > > 204 KSP preconditioned resid norm 2.357629383490e-06 true >> resid norm 2.490686317727e-04 ||r(i)||/||b|| 3.381446272046e-05 >> > > > > > > > 205 KSP preconditioned resid norm 2.374856180299e-06 true >> resid norm 2.490645897176e-04 ||r(i)||/||b|| 3.381391395637e-05 >> > > > > > > > 206 KSP preconditioned resid norm 2.340395514404e-06 true >> resid norm 2.490618341127e-04 ||r(i)||/||b|| 3.381353984542e-05 >> > > > > > > > 207 KSP preconditioned resid norm 2.314963680954e-06 true >> resid norm 2.490676153984e-04 ||r(i)||/||b|| 3.381432473379e-05 >> > > > > > > > 208 KSP preconditioned resid norm 2.448070953106e-06 true >> resid norm 2.490644606776e-04 ||r(i)||/||b|| 3.381389643743e-05 >> > > > > > > > 209 KSP preconditioned resid norm 2.428805110632e-06 true >> resid norm 2.490635817597e-04 ||r(i)||/||b|| 3.381377711234e-05 >> > > > > > > > 210 KSP preconditioned resid norm 2.537929937808e-06 true >> resid norm 2.490680589404e-04 ||r(i)||/||b|| 3.381438495066e-05 >> > > > > > > > 211 KSP preconditioned resid norm 2.515909029682e-06 true >> resid norm 2.490687803038e-04 ||r(i)||/||b|| 3.381448288557e-05 >> > > > > > > > 212 KSP preconditioned resid norm 2.497907513266e-06 true >> resid norm 2.490618016885e-04 ||r(i)||/||b|| 3.381353544340e-05 >> > > > > > > > 213 KSP preconditioned resid norm 1.783501869502e-06 true >> resid norm 2.490632647470e-04 ||r(i)||/||b|| 3.381373407354e-05 >> > > > > > > > 214 KSP preconditioned resid norm 1.767420653144e-06 true >> resid norm 2.490685569328e-04 ||r(i)||/||b|| 3.381445255992e-05 >> > > > > > > > 215 KSP preconditioned resid norm 1.854926068272e-06 true >> resid norm 2.490609365464e-04 ||r(i)||/||b|| 3.381341798856e-05 >> > > > > > > > 216 KSP preconditioned resid norm 1.818308539774e-06 true >> resid norm 2.490639142283e-04 ||r(i)||/||b|| 3.381382224948e-05 >> > > > > > > > 217 KSP preconditioned resid norm 1.809431578070e-06 true >> resid norm 2.490605125049e-04 ||r(i)||/||b|| 3.381336041915e-05 >> > > > > > > > 218 KSP preconditioned resid norm 1.789862735999e-06 true >> resid norm 2.490564024901e-04 ||r(i)||/||b|| 3.381280242859e-05 >> > > > > > > > 219 KSP preconditioned resid norm 1.769239890163e-06 true >> resid norm 2.490647825316e-04 ||r(i)||/||b|| 3.381394013349e-05 >> > > > > > > > 220 KSP preconditioned resid norm 1.780760773109e-06 true >> resid norm 2.490622606663e-04 ||r(i)||/||b|| 3.381359775589e-05 >> > > > > > > > 221 KSP preconditioned resid norm 5.009024913368e-07 true >> resid norm 2.490659101637e-04 ||r(i)||/||b|| 3.381409322492e-05 >> > > > > > > > 222 KSP 
preconditioned resid norm 4.974450322799e-07 true >> resid norm 2.490714287402e-04 ||r(i)||/||b|| 3.381484244693e-05 >> > > > > > > > 223 KSP preconditioned resid norm 4.938819481519e-07 true >> resid norm 2.490665661715e-04 ||r(i)||/||b|| 3.381418228693e-05 >> > > > > > > > 224 KSP preconditioned resid norm 4.973231831266e-07 true >> resid norm 2.490725000995e-04 ||r(i)||/||b|| 3.381498789855e-05 >> > > > > > > > 225 KSP preconditioned resid norm 5.086864036771e-07 true >> resid norm 2.490664132954e-04 ||r(i)||/||b|| 3.381416153192e-05 >> > > > > > > > 226 KSP preconditioned resid norm 5.046954570561e-07 true >> resid norm 2.490698772594e-04 ||r(i)||/||b|| 3.381463181226e-05 >> > > > > > > > 227 KSP preconditioned resid norm 5.086852920874e-07 true >> resid norm 2.490703544723e-04 ||r(i)||/||b|| 3.381469660041e-05 >> > > > > > > > 228 KSP preconditioned resid norm 5.182381756169e-07 true >> resid norm 2.490665200032e-04 ||r(i)||/||b|| 3.381417601896e-05 >> > > > > > > > 229 KSP preconditioned resid norm 5.261455182896e-07 true >> resid norm 2.490697169472e-04 ||r(i)||/||b|| 3.381461004770e-05 >> > > > > > > > 230 KSP preconditioned resid norm 5.265262522400e-07 true >> resid norm 2.490726890541e-04 ||r(i)||/||b|| 3.381501355172e-05 >> > > > > > > > 231 KSP preconditioned resid norm 5.220652263946e-07 true >> resid norm 2.490689325236e-04 ||r(i)||/||b|| 3.381450355149e-05 >> > > > > > > > 232 KSP preconditioned resid norm 5.256466259888e-07 true >> resid norm 2.490694033989e-04 ||r(i)||/||b|| 3.381456747923e-05 >> > > > > > > > 233 KSP preconditioned resid norm 5.443022648374e-07 true >> resid norm 2.490650183144e-04 ||r(i)||/||b|| 3.381397214423e-05 >> > > > > > > > 234 KSP preconditioned resid norm 5.562619006436e-07 true >> resid norm 2.490764576883e-04 ||r(i)||/||b|| 3.381552519520e-05 >> > > > > > > > 235 KSP preconditioned resid norm 5.998148629545e-07 true >> resid norm 2.490714032716e-04 ||r(i)||/||b|| 3.381483898922e-05 >> > > > > > > > 236 KSP preconditioned resid norm 6.498977322955e-07 true >> resid norm 2.490650270144e-04 ||r(i)||/||b|| 3.381397332537e-05 >> > > > > > > > 237 KSP preconditioned resid norm 6.503686003429e-07 true >> resid norm 2.490706976108e-04 ||r(i)||/||b|| 3.381474318615e-05 >> > > > > > > > 238 KSP preconditioned resid norm 6.566719023119e-07 true >> resid norm 2.490664107559e-04 ||r(i)||/||b|| 3.381416118714e-05 >> > > > > > > > 239 KSP preconditioned resid norm 6.549737473208e-07 true >> resid norm 2.490721547909e-04 ||r(i)||/||b|| 3.381494101821e-05 >> > > > > > > > 240 KSP preconditioned resid norm 6.616898981418e-07 true >> resid norm 2.490679659838e-04 ||r(i)||/||b|| 3.381437233053e-05 >> > > > > > > > 241 KSP preconditioned resid norm 6.829917691021e-07 true >> resid norm 2.490728328614e-04 ||r(i)||/||b|| 3.381503307553e-05 >> > > > > > > > 242 KSP preconditioned resid norm 7.030239869389e-07 true >> resid norm 2.490706345115e-04 ||r(i)||/||b|| 3.381473461955e-05 >> > > > > > > > 243 KSP preconditioned resid norm 7.018435683340e-07 true >> resid norm 2.490650978460e-04 ||r(i)||/||b|| 3.381398294172e-05 >> > > > > > > > 244 KSP preconditioned resid norm 7.058047080376e-07 true >> resid norm 2.490685975642e-04 ||r(i)||/||b|| 3.381445807618e-05 >> > > > > > > > 245 KSP preconditioned resid norm 6.896300385099e-07 true >> resid norm 2.490708566380e-04 ||r(i)||/||b|| 3.381476477625e-05 >> > > > > > > > 246 KSP preconditioned resid norm 7.093960074437e-07 true >> resid norm 2.490667427871e-04 ||r(i)||/||b|| 3.381420626490e-05 >> > > > > > > > 247 KSP 
preconditioned resid norm 7.817121711853e-07 true >> resid norm 2.490692299030e-04 ||r(i)||/||b|| 3.381454392480e-05 >> > > > > > > > 248 KSP preconditioned resid norm 7.976109778309e-07 true >> resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 >> > > > > > > > 249 KSP preconditioned resid norm 7.855322750445e-07 true >> resid norm 2.490720861966e-04 ||r(i)||/||b|| 3.381493170560e-05 >> > > > > > > > 250 KSP preconditioned resid norm 7.778531114042e-07 true >> resid norm 2.490673235034e-04 ||r(i)||/||b|| 3.381428510506e-05 >> > > > > > > > 251 KSP preconditioned resid norm 7.848682182070e-07 true >> resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 >> > > > > > > > 252 KSP preconditioned resid norm 7.967291867330e-07 true >> resid norm 2.490724820229e-04 ||r(i)||/||b|| 3.381498544442e-05 >> > > > > > > > 253 KSP preconditioned resid norm 7.865012959525e-07 true >> resid norm 2.490666662028e-04 ||r(i)||/||b|| 3.381419586754e-05 >> > > > > > > > 254 KSP preconditioned resid norm 7.656025385804e-07 true >> resid norm 2.490686283214e-04 ||r(i)||/||b|| 3.381446225190e-05 >> > > > > > > > 255 KSP preconditioned resid norm 7.757018653468e-07 true >> resid norm 2.490655983763e-04 ||r(i)||/||b|| 3.381405089553e-05 >> > > > > > > > 256 KSP preconditioned resid norm 6.686490372981e-07 true >> resid norm 2.490715698964e-04 ||r(i)||/||b|| 3.381486161081e-05 >> > > > > > > > 257 KSP preconditioned resid norm 6.596005109428e-07 true >> resid norm 2.490666403003e-04 ||r(i)||/||b|| 3.381419235092e-05 >> > > > > > > > 258 KSP preconditioned resid norm 6.681742296333e-07 true >> resid norm 2.490683725835e-04 ||r(i)||/||b|| 3.381442753198e-05 >> > > > > > > > 259 KSP preconditioned resid norm 1.089245482033e-06 true >> resid norm 2.490688086568e-04 ||r(i)||/||b|| 3.381448673488e-05 >> > > > > > > > 260 KSP preconditioned resid norm 1.099844873189e-06 true >> resid norm 2.490690703265e-04 ||r(i)||/||b|| 3.381452226011e-05 >> > > > > > > > 261 KSP preconditioned resid norm 1.112925540869e-06 true >> resid norm 2.490664481058e-04 ||r(i)||/||b|| 3.381416625790e-05 >> > > > > > > > 262 KSP preconditioned resid norm 1.113056910480e-06 true >> resid norm 2.490658753273e-04 ||r(i)||/||b|| 3.381408849541e-05 >> > > > > > > > 263 KSP preconditioned resid norm 1.104801535149e-06 true >> resid norm 2.490736510776e-04 ||r(i)||/||b|| 3.381514415953e-05 >> > > > > > > > 264 KSP preconditioned resid norm 1.158709147873e-06 true >> resid norm 2.490607531152e-04 ||r(i)||/||b|| 3.381339308528e-05 >> > > > > > > > 265 KSP preconditioned resid norm 1.178985740182e-06 true >> resid norm 2.490727895619e-04 ||r(i)||/||b|| 3.381502719703e-05 >> > > > > > > > 266 KSP preconditioned resid norm 1.165130533478e-06 true >> resid norm 2.490639076693e-04 ||r(i)||/||b|| 3.381382135901e-05 >> > > > > > > > 267 KSP preconditioned resid norm 1.181364114499e-06 true >> resid norm 2.490667871436e-04 ||r(i)||/||b|| 3.381421228690e-05 >> > > > > > > > 268 KSP preconditioned resid norm 1.170295348543e-06 true >> resid norm 2.490662613306e-04 ||r(i)||/||b|| 3.381414090063e-05 >> > > > > > > > 269 KSP preconditioned resid norm 1.213243016230e-06 true >> resid norm 2.490666173719e-04 ||r(i)||/||b|| 3.381418923808e-05 >> > > > > > > > 270 KSP preconditioned resid norm 1.239691953997e-06 true >> resid norm 2.490678323197e-04 ||r(i)||/||b|| 3.381435418381e-05 >> > > > > > > > 271 KSP preconditioned resid norm 1.219891740100e-06 true >> resid norm 2.490625009256e-04 ||r(i)||/||b|| 3.381363037437e-05 >> > > > > > > > 272 KSP 
preconditioned resid norm 1.231321334346e-06 true >> resid norm 2.490659733696e-04 ||r(i)||/||b|| 3.381410180599e-05 >> > > > > > > > 273 KSP preconditioned resid norm 1.208183234158e-06 true >> resid norm 2.490685987255e-04 ||r(i)||/||b|| 3.381445823385e-05 >> > > > > > > > 274 KSP preconditioned resid norm 1.211768545589e-06 true >> resid norm 2.490671548953e-04 ||r(i)||/||b|| 3.381426221421e-05 >> > > > > > > > 275 KSP preconditioned resid norm 1.209433459842e-06 true >> resid norm 2.490669016096e-04 ||r(i)||/||b|| 3.381422782722e-05 >> > > > > > > > 276 KSP preconditioned resid norm 1.223729184405e-06 true >> resid norm 2.490658128014e-04 ||r(i)||/||b|| 3.381408000666e-05 >> > > > > > > > 277 KSP preconditioned resid norm 1.243915201868e-06 true >> resid norm 2.490693375756e-04 ||r(i)||/||b|| 3.381455854282e-05 >> > > > > > > > 278 KSP preconditioned resid norm 1.231994655529e-06 true >> resid norm 2.490682988311e-04 ||r(i)||/||b|| 3.381441751910e-05 >> > > > > > > > 279 KSP preconditioned resid norm 1.227930683777e-06 true >> resid norm 2.490667825866e-04 ||r(i)||/||b|| 3.381421166823e-05 >> > > > > > > > 280 KSP preconditioned resid norm 1.193458846469e-06 true >> resid norm 2.490687366117e-04 ||r(i)||/||b|| 3.381447695378e-05 >> > > > > > > > 281 KSP preconditioned resid norm 1.217089059805e-06 true >> resid norm 2.490674797371e-04 ||r(i)||/||b|| 3.381430631591e-05 >> > > > > > > > 282 KSP preconditioned resid norm 1.249318287709e-06 true >> resid norm 2.490662866951e-04 ||r(i)||/||b|| 3.381414434420e-05 >> > > > > > > > 283 KSP preconditioned resid norm 1.183320029547e-06 true >> resid norm 2.490645783630e-04 ||r(i)||/||b|| 3.381391241482e-05 >> > > > > > > > 284 KSP preconditioned resid norm 1.174730603102e-06 true >> resid norm 2.490686881647e-04 ||r(i)||/||b|| 3.381447037643e-05 >> > > > > > > > 285 KSP preconditioned resid norm 1.175838261923e-06 true >> resid norm 2.490665969300e-04 ||r(i)||/||b|| 3.381418646281e-05 >> > > > > > > > 286 KSP preconditioned resid norm 1.188946188368e-06 true >> resid norm 2.490661974622e-04 ||r(i)||/||b|| 3.381413222961e-05 >> > > > > > > > 287 KSP preconditioned resid norm 1.177848565707e-06 true >> resid norm 2.490660236206e-04 ||r(i)||/||b|| 3.381410862824e-05 >> > > > > > > > 288 KSP preconditioned resid norm 1.200075508281e-06 true >> resid norm 2.490645353536e-04 ||r(i)||/||b|| 3.381390657571e-05 >> > > > > > > > 289 KSP preconditioned resid norm 1.184589570618e-06 true >> resid norm 2.490664920355e-04 ||r(i)||/||b|| 3.381417222195e-05 >> > > > > > > > 290 KSP preconditioned resid norm 1.221114703873e-06 true >> resid norm 2.490670597538e-04 ||r(i)||/||b|| 3.381424929746e-05 >> > > > > > > > 291 KSP preconditioned resid norm 1.249479658256e-06 true >> resid norm 2.490641582876e-04 ||r(i)||/||b|| 3.381385538385e-05 >> > > > > > > > 292 KSP preconditioned resid norm 1.245768496850e-06 true >> resid norm 2.490704480588e-04 ||r(i)||/||b|| 3.381470930606e-05 >> > > > > > > > 293 KSP preconditioned resid norm 1.243742607953e-06 true >> resid norm 2.490649690604e-04 ||r(i)||/||b|| 3.381396545733e-05 >> > > > > > > > 294 KSP preconditioned resid norm 1.342758483339e-06 true >> resid norm 2.490676207432e-04 ||r(i)||/||b|| 3.381432545942e-05 >> > > > > > > > 295 KSP preconditioned resid norm 1.353816099600e-06 true >> resid norm 2.490695263153e-04 ||r(i)||/||b|| 3.381458416681e-05 >> > > > > > > > 296 KSP preconditioned resid norm 1.343886763293e-06 true >> resid norm 2.490673674307e-04 ||r(i)||/||b|| 3.381429106879e-05 >> > > > > > > > 297 KSP 
preconditioned resid norm 1.355511022815e-06 true >> resid norm 2.490686565995e-04 ||r(i)||/||b|| 3.381446609103e-05 >> > > > > > > > 298 KSP preconditioned resid norm 1.347247627243e-06 true >> resid norm 2.490696287707e-04 ||r(i)||/||b|| 3.381459807652e-05 >> > > > > > > > 299 KSP preconditioned resid norm 1.414742595618e-06 true >> resid norm 2.490749815091e-04 ||r(i)||/||b|| 3.381532478374e-05 >> > > > > > > > 300 KSP preconditioned resid norm 1.418560683189e-06 true >> resid norm 2.490721501153e-04 ||r(i)||/||b|| 3.381494038343e-05 >> > > > > > > > 301 KSP preconditioned resid norm 1.416276404923e-06 true >> resid norm 2.490689576447e-04 ||r(i)||/||b|| 3.381450696203e-05 >> > > > > > > > 302 KSP preconditioned resid norm 1.431448272112e-06 true >> resid norm 2.490688812701e-04 ||r(i)||/||b|| 3.381449659312e-05 >> > > > > > > > 303 KSP preconditioned resid norm 1.446154958969e-06 true >> resid norm 2.490727536322e-04 ||r(i)||/||b|| 3.381502231909e-05 >> > > > > > > > 304 KSP preconditioned resid norm 1.468860617921e-06 true >> resid norm 2.490692363788e-04 ||r(i)||/||b|| 3.381454480397e-05 >> > > > > > > > 305 KSP preconditioned resid norm 1.627595214971e-06 true >> resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 >> > > > > > > > 306 KSP preconditioned resid norm 1.614384672893e-06 true >> resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 >> > > > > > > > 307 KSP preconditioned resid norm 1.605568020532e-06 true >> resid norm 2.490699757693e-04 ||r(i)||/||b|| 3.381464518632e-05 >> > > > > > > > 308 KSP preconditioned resid norm 1.617069685075e-06 true >> resid norm 2.490649282923e-04 ||r(i)||/||b|| 3.381395992249e-05 >> > > > > > > > 309 KSP preconditioned resid norm 1.654297792738e-06 true >> resid norm 2.490644766626e-04 ||r(i)||/||b|| 3.381389860760e-05 >> > > > > > > > 310 KSP preconditioned resid norm 1.587528143215e-06 true >> resid norm 2.490696752096e-04 ||r(i)||/||b|| 3.381460438124e-05 >> > > > > > > > 311 KSP preconditioned resid norm 1.662782022388e-06 true >> resid norm 2.490699317737e-04 ||r(i)||/||b|| 3.381463921332e-05 >> > > > > > > > 312 KSP preconditioned resid norm 1.618211471748e-06 true >> resid norm 2.490735831308e-04 ||r(i)||/||b|| 3.381513493483e-05 >> > > > > > > > 313 KSP preconditioned resid norm 1.609074961921e-06 true >> resid norm 2.490679566436e-04 ||r(i)||/||b|| 3.381437106247e-05 >> > > > > > > > 314 KSP preconditioned resid norm 1.548068942878e-06 true >> resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 >> > > > > > > > 315 KSP preconditioned resid norm 1.526718322150e-06 true >> resid norm 2.490619832967e-04 ||r(i)||/||b|| 3.381356009919e-05 >> > > > > > > > 316 KSP preconditioned resid norm 1.553150959105e-06 true >> resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 >> > > > > > > > 317 KSP preconditioned resid norm 1.615015320906e-06 true >> resid norm 2.490672348079e-04 ||r(i)||/||b|| 3.381427306343e-05 >> > > > > > > > 318 KSP preconditioned resid norm 1.602904469797e-06 true >> resid norm 2.490696731006e-04 ||r(i)||/||b|| 3.381460409491e-05 >> > > > > > > > 319 KSP preconditioned resid norm 1.538140323073e-06 true >> resid norm 2.490722982494e-04 ||r(i)||/||b|| 3.381496049466e-05 >> > > > > > > > 320 KSP preconditioned resid norm 1.534779679430e-06 true >> resid norm 2.490778499789e-04 ||r(i)||/||b|| 3.381571421763e-05 >> > > > > > > > 321 KSP preconditioned resid norm 1.547155843355e-06 true >> resid norm 2.490767612985e-04 ||r(i)||/||b|| 3.381556641442e-05 >> > > > > > > > 322 KSP 
preconditioned resid norm 1.422137008870e-06 true >> resid norm 2.490737676309e-04 ||r(i)||/||b|| 3.381515998323e-05 >> > > > > > > > 323 KSP preconditioned resid norm 1.403072558954e-06 true >> resid norm 2.490741361870e-04 ||r(i)||/||b|| 3.381521001975e-05 >> > > > > > > > 324 KSP preconditioned resid norm 1.373070436118e-06 true >> resid norm 2.490742214990e-04 ||r(i)||/||b|| 3.381522160202e-05 >> > > > > > > > 325 KSP preconditioned resid norm 1.359547585233e-06 true >> resid norm 2.490792987570e-04 ||r(i)||/||b|| 3.381591090902e-05 >> > > > > > > > 326 KSP preconditioned resid norm 1.370351913612e-06 true >> resid norm 2.490727161158e-04 ||r(i)||/||b|| 3.381501722573e-05 >> > > > > > > > 327 KSP preconditioned resid norm 1.365238666187e-06 true >> resid norm 2.490716949642e-04 ||r(i)||/||b|| 3.381487859046e-05 >> > > > > > > > 328 KSP preconditioned resid norm 1.369073373042e-06 true >> resid norm 2.490807360288e-04 ||r(i)||/||b|| 3.381610603826e-05 >> > > > > > > > 329 KSP preconditioned resid norm 1.426698981572e-06 true >> resid norm 2.490791479521e-04 ||r(i)||/||b|| 3.381589043520e-05 >> > > > > > > > 330 KSP preconditioned resid norm 1.445542403570e-06 true >> resid norm 2.490775981409e-04 ||r(i)||/||b|| 3.381568002720e-05 >> > > > > > > > 331 KSP preconditioned resid norm 1.464506963984e-06 true >> resid norm 2.490740562430e-04 ||r(i)||/||b|| 3.381519916626e-05 >> > > > > > > > 332 KSP preconditioned resid norm 1.461462964401e-06 true >> resid norm 2.490768016856e-04 ||r(i)||/||b|| 3.381557189753e-05 >> > > > > > > > 333 KSP preconditioned resid norm 1.476680847971e-06 true >> resid norm 2.490744321516e-04 ||r(i)||/||b|| 3.381525020097e-05 >> > > > > > > > 334 KSP preconditioned resid norm 1.459640372198e-06 true >> resid norm 2.490788817993e-04 ||r(i)||/||b|| 3.381585430132e-05 >> > > > > > > > 335 KSP preconditioned resid norm 1.790770882365e-06 true >> resid norm 2.490771711471e-04 ||r(i)||/||b|| 3.381562205697e-05 >> > > > > > > > 336 KSP preconditioned resid norm 1.803770155018e-06 true >> resid norm 2.490768953858e-04 ||r(i)||/||b|| 3.381558461860e-05 >> > > > > > > > 337 KSP preconditioned resid norm 1.787821255995e-06 true >> resid norm 2.490767985676e-04 ||r(i)||/||b|| 3.381557147421e-05 >> > > > > > > > 338 KSP preconditioned resid norm 1.749912220831e-06 true >> resid norm 2.490760198704e-04 ||r(i)||/||b|| 3.381546575545e-05 >> > > > > > > > 339 KSP preconditioned resid norm 1.802915839010e-06 true >> resid norm 2.490815556273e-04 ||r(i)||/||b|| 3.381621730993e-05 >> > > > > > > > 340 KSP preconditioned resid norm 1.800777670709e-06 true >> resid norm 2.490823909286e-04 ||r(i)||/||b|| 3.381633071347e-05 >> > > > > > > > 341 KSP preconditioned resid norm 1.962516327690e-06 true >> resid norm 2.490773477410e-04 ||r(i)||/||b|| 3.381564603199e-05 >> > > > > > > > 342 KSP preconditioned resid norm 1.981726465132e-06 true >> resid norm 2.490769116884e-04 ||r(i)||/||b|| 3.381558683191e-05 >> > > > > > > > 343 KSP preconditioned resid norm 1.963419167052e-06 true >> resid norm 2.490764009914e-04 ||r(i)||/||b|| 3.381551749783e-05 >> > > > > > > > 344 KSP preconditioned resid norm 1.992082169278e-06 true >> resid norm 2.490806829883e-04 ||r(i)||/||b|| 3.381609883728e-05 >> > > > > > > > 345 KSP preconditioned resid norm 1.981005134253e-06 true >> resid norm 2.490748677339e-04 ||r(i)||/||b|| 3.381530933721e-05 >> > > > > > > > 346 KSP preconditioned resid norm 1.959802663114e-06 true >> resid norm 2.490773752317e-04 ||r(i)||/||b|| 3.381564976423e-05 >> > > > > > > > >> > > > > > 
>> > > > > > > On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley <knepley at gmail.com> wrote:
>> > > > > > > > On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
>> > > > > > > > ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
>> > > > > > > > ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);
>> > > > > > > >
>> > > > > > > > ierr = VecAssemblyBegin(x);
>> > > > > > > > ierr = VecAssemblyEnd(x);
>> > > > > > > > This is probably unnecessary
>> > > > > > > >
>> > > > > > > > ierr = VecAssemblyBegin(b);
>> > > > > > > > ierr = VecAssemblyEnd(b);
>> > > > > > > > This is probably unnecessary
>> > > > > > > >
>> > > > > > > > ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp);
>> > > > > > > > ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8
>> > > > > > > > Is your rhs consistent with this nullspace?
>> > > > > > > >
>> > > > > > > > // KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
>> > > > > > > > KSPSetOperators(ksp,A,A);
>> > > > > > > >
>> > > > > > > > KSPSetType(ksp,KSPBCGS);
>> > > > > > > >
>> > > > > > > > KSPSetComputeSingularValues(ksp, PETSC_TRUE);
>> > > > > > > > #if defined(__HYPRE__)
>> > > > > > > > KSPGetPC(ksp, &pc);
>> > > > > > > > PCSetType(pc, PCHYPRE);
>> > > > > > > > PCHYPRESetType(pc,"boomeramg");
>> > > > > > > > This is terribly unnecessary. You just use
>> > > > > > > >
>> > > > > > > > -pc_type hypre -pc_hypre_type boomeramg
>> > > > > > > >
>> > > > > > > > or
>> > > > > > > >
>> > > > > > > > -pc_type gamg
>> > > > > > > >
>> > > > > > > > #else
>> > > > > > > > KSPSetType(ksp,KSPBCGSL);
>> > > > > > > > KSPBCGSLSetEll(ksp,2);
>> > > > > > > > #endif /* defined(__HYPRE__) */
>> > > > > > > >
>> > > > > > > > KSPSetFromOptions(ksp);
>> > > > > > > > KSPSetUp(ksp);
>> > > > > > > >
>> > > > > > > > ierr = KSPSolve(ksp,b,x);
>> > > > > > > >
>> > > > > > > > command line
>> > > > > > > >
>> > > > > > > > You did not provide any of what I asked for in the previous mail.
>> > > > > > > >
>> > > > > > > > Matt
>> > > > > > > >
>> > > > > > > > On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley <knepley at gmail.com> wrote:
>> > > > > > > > On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
>> > > > > > > > hi,
>> > > > > > > >
>> > > > > > > > I implemented the HYPRE preconditioner for my study because, without a preconditioner, the PETSc solver takes thousands of iterations to converge for a fine-grid simulation.
>> > > > > > > >
>> > > > > > > > With HYPRE, depending on the parallel partition, it will take HYPRE forever to do anything. The observation from the output file is that the simulation is hanging with no output.
>> > > > > > > >
>> > > > > > > > Any idea what happened? Will post a snippet of code.
>> > > > > > > >
>> > > > > > > > 1) For any question about convergence, we need to see the output of
>> > > > > > > >
>> > > > > > > > -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason
>> > > > > > > >
>> > > > > > > > 2) Hypre has many preconditioners, which one are you talking about
>> > > > > > > >
>> > > > > > > > 3) PETSc has some preconditioners in common with Hypre, like AMG
>> > > > > > > >
>> > > > > > > > Thanks,
>> > > > > > > >
>> > > > > > > > Matt
>> > > > > > > >
>> > > > > > > > --
>> > > > > > > > Hao Zhang
>> > > > > > > > Dept. of Applied Mathematics and Statistics,
>> > > > > > > > Stony Brook University,
>> > > > > > > > Stony Brook, New York, 11790
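Matt's question above, whether the right-hand side is consistent with the attached null space, can be checked directly. Below is a minimal sketch, assuming the A, b, and nullsp names from the quoted snippet and abbreviating error handling; MatNullSpaceTest verifies the matrix, and projecting the null space out of b with MatNullSpaceRemove is one way (not the only way) to make the system consistent:

    MatNullSpace nullsp;
    PetscBool    isNull;

    /* the constant vector spans the null space of an all-Neumann pressure operator */
    ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,NULL,&nullsp);CHKERRQ(ierr);
    /* does A annihilate the constant vector, i.e. is A*1 = 0 ? */
    ierr = MatNullSpaceTest(nullsp,A,&isNull);CHKERRQ(ierr);
    if (!isNull) {
      ierr = PetscPrintf(PETSC_COMM_WORLD,"constant vector is NOT a null vector of A\n");CHKERRQ(ierr);
    }
    ierr = MatSetNullSpace(A,nullsp);CHKERRQ(ierr);
    /* project the null space out of b (here: subtract its mean) so the RHS is consistent */
    ierr = MatNullSpaceRemove(nullsp,b);CHKERRQ(ierr);
    ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);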
>> > > > > > > >
>> > > > > > > > --
>> > > > > > > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>> > > > > > > > -- Norbert Wiener
>> > > > > > > >
>> > > > > > > > https://www.cse.buffalo.edu/~knepley/
>>
--
Hao Zhang
Dept. of Applied Mathematics and Statistics,
Stony Brook University,
Stony Brook, New York, 11790
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From mfadams at lbl.gov Mon Oct 23 09:07:23 2017
From: mfadams at lbl.gov (Mark Adams)
Date: Mon, 23 Oct 2017 10:07:23 -0400
Subject: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To:
References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov>
 <64D7FC90-EBE8-409C-B300-04A118155A98@mcs.anl.gov>
 <5DDD356C-2000-4803-9F3A-02B429898A7A@mcs.anl.gov>
 <8C8ADFC3-C71F-4DE1-B21C-5BF50EBC652D@mcs.anl.gov>
Message-ID:

On Mon, Oct 23, 2017 at 10:04 AM, Hao Zhang wrote:

> The big picture is I'm solving the 3D incompressible Navier-Stokes
> equations using a staggered/MAC grid with a finite difference method.
The particular > function is poisson pressure solver or Laplacian Mark Adam mentioned. The > simulation runs fine for medium size mesh grid. When I try harder to go > very fine grid, not DNS level, I'm having some difficulties to have meaning > physical results. All PETSc solvers converge but come with huge iterations. > That's when/how I started using HYPRE. > Are we just talking about the convergence of the pressure solve? > > On Mon, Oct 23, 2017 at 9:01 AM, Mark Adams wrote: > >> Just to be clear: 1) are you solving the Laplacian (div grad) and 2) what >> type of discretizations are you using? and 3) do you have stretched or >> terrible grids in some way? >> >> On Sun, Oct 22, 2017 at 3:57 PM, Barry Smith wrote: >> >>> >>> One thing important to understand is that multigrid is an optimal or >>> nearly optimal algorithm. This means, when it works, as you refine the mesh >>> the number of iterations remains nearly constant, regardless of the problem >>> size and number of processes. Simple preconditioners such as ILU, block >>> Jacobi, one level additive Schwarz etc have iterations that increase with >>> the problem size and likely also with the number of processes. Thus these >>> algorithms become essentially impractical for very large problems while >>> multigrid can remain practical (when it works). >>> >>> Good luck >>> >>> Barry >>> > On Oct 22, 2017, at 2:35 PM, Hao Zhang wrote: >>> > >>> > Thanks for all the inputs. before simulating finer grid, HYPRE wasn't >>> used and simulations were just fine. I will do a few tests and post more >>> information later. >>> > >>> > On Sun, Oct 22, 2017 at 12:11 PM, Barry Smith >>> wrote: >>> > >>> > > On Oct 21, 2017, at 11:16 PM, Hao Zhang wrote: >>> > > >>> > > the reason is when I do finer grid simulation, matrix become more >>> stiff. >>> > >>> > Are you saying that for a finer grid but everything else the same, >>> the convergence of hypre (with the same GMRES) with the same options gets >>> much worse? This normally will not happen, that is the fundamental beauty >>> of multigrid methods (when they work well). >>> > >>> > Yes the matrix condition number increases but multigrid doesn't >>> care about that, its number of iterations should remain pretty much the >>> same. >>> > >>> > Something must be different (with this finer grid case), either >>> the mesh becomes horrible, or the physics changes, or there are errors in >>> the code that lead to the problem. >>> > >>> > What happens if you just refine the mesh a little? Then a little >>> more? Then a little more? Does the convergence rate suddenly go bad at some >>> point, or does it just get worse slowly? >>> > >>> > Barry >>> > >>> > >>> > >>> > > Much larger condition number. just to give you a perspective, it >>> will take 6000 iterations to converge and the solver does converge. I want >>> to reduce the number of iterations while keeping the convergence rate. >>> that's main drive to do so much heavy lifting around. please advise. >>> snippet will be provided upon request. >>> > > >>> > > Thanks again. >>> > > >>> > > On Sun, Oct 22, 2017 at 12:08 AM, Barry Smith >>> wrote: >>> > > >>> > > Oh, you change KSP but not hypre. I did not understand this. >>> > > >>> > > Why not just use GMRES all the time? Why mess with BCGS if it is >>> not robust? Not worth the small optimization if it breaks everything. 
>>> > > > On Oct 21, 2017, at 11:05 PM, Hao Zhang wrote:
>>> > > >
>>> > > > this is the initial pressure solver output regarding the use of
>>> PETSc. It failed to converge after 40000 iterations, so GMRES was then
>>> used.
>>> > > >
>>> > > > 39987 KSP preconditioned resid norm 3.853125986269e-08 true resid norm 1.147359212216e-05 ||r(i)||/||b|| 1.557696568706e-06
>>> > > > 39988 KSP preconditioned resid norm 3.853126044003e-08 true resid norm 1.147359257282e-05 ||r(i)||/||b|| 1.557696629889e-06
>>> > > > 39989 KSP preconditioned resid norm 3.853126052100e-08 true resid norm 1.147359233695e-05 ||r(i)||/||b|| 1.557696597866e-06
>>> > > > 39990 KSP preconditioned resid norm 3.853126027357e-08 true resid norm 1.147359219860e-05 ||r(i)||/||b|| 1.557696579083e-06
>>> > > > 39991 KSP preconditioned resid norm 3.853126058478e-08 true resid norm 1.147359234281e-05 ||r(i)||/||b|| 1.557696598662e-06
>>> > > > 39992 KSP preconditioned resid norm 3.853126064006e-08 true resid norm 1.147359261420e-05 ||r(i)||/||b|| 1.557696635506e-06
>>> > > > 39993 KSP preconditioned resid norm 3.853126050203e-08 true resid norm 1.147359235972e-05 ||r(i)||/||b|| 1.557696600957e-06
>>> > > > 39994 KSP preconditioned resid norm 3.853126050182e-08 true resid norm 1.147359253713e-05 ||r(i)||/||b|| 1.557696625043e-06
>>> > > > 39995 KSP preconditioned resid norm 3.853125976795e-08 true resid norm 1.147359226222e-05 ||r(i)||/||b|| 1.557696587720e-06
>>> > > > 39996 KSP preconditioned resid norm 3.853125805127e-08 true resid norm 1.147359262747e-05 ||r(i)||/||b|| 1.557696637308e-06
>>> > > > 39997 KSP preconditioned resid norm 3.853125811756e-08 true resid norm 1.147359216008e-05 ||r(i)||/||b|| 1.557696573853e-06
>>> > > > 39998 KSP preconditioned resid norm 3.853125827833e-08 true resid norm 1.147359238372e-05 ||r(i)||/||b|| 1.557696604216e-06
>>> > > > 39999 KSP preconditioned resid norm 3.853127937068e-08 true resid norm 1.147359264043e-05 ||r(i)||/||b|| 1.557696639067e-06
>>> > > > 40000 KSP preconditioned resid norm 3.853122732867e-08 true resid norm 1.147359257776e-05 ||r(i)||/||b|| 1.557696630559e-06
>>> > > > Linear solve did not converge due to DIVERGED_ITS iterations 40000
>>> > > > KSP Object: 24 MPI processes
>>> > > > type: bcgs
>>> > > > maximum iterations=40000, initial guess is zero
>>> > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000.
>>> > > > left preconditioning
>>> > > > using PRECONDITIONED norm type for convergence test
>>> > > > PC Object: 24 MPI processes
>>> > > > type: hypre
>>> > > > HYPRE BoomerAMG preconditioning
>>> > > > Cycle type V
>>> > > > Maximum number of levels 25
>>> > > > Maximum number of iterations PER hypre call 1
>>> > > > Convergence tolerance PER hypre call 0.
>>> > > > Threshold for strong coupling 0.25
>>> > > > Interpolation truncation factor 0.
>>> > > > Interpolation: max elements per row 0
>>> > > > Number of levels of aggressive coarsening 0
>>> > > > Number of paths for aggressive coarsening 1
>>> > > > Maximum row sums 0.9
>>> > > > Sweeps down 1
>>> > > > Sweeps up 1
>>> > > > Sweeps on coarse 1
>>> > > > Relax down symmetric-SOR/Jacobi
>>> > > > Relax up symmetric-SOR/Jacobi
>>> > > > Relax on coarse Gaussian-elimination
>>> > > > Relax weight (all) 1.
>>> > > > Outer relax weight (all) 1.
>>> > > > Using CF-relaxation
>>> > > > Not using more complex smoothers.
>>> > > > Measure type local >>> > > > Coarsen type Falgout >>> > > > Interpolation type classical >>> > > > linear system matrix = precond matrix: >>> > > > Mat Object: A 24 MPI processes >>> > > > type: mpiaij >>> > > > rows=497664, cols=497664 >>> > > > total: nonzeros=3363552, allocated nonzeros=6967296 >>> > > > total number of mallocs used during MatSetValues calls =0 >>> > > > has attached null space >>> > > > not using I-node (on process 0) routines >>> > > > >>> > > > The solution diverges for p0! The residual is 3.853123e-08. Solve >>> again using GMRES! >>> > > > KSP Object: 24 MPI processes >>> > > > type: gmres >>> > > > restart=30, using Classical (unmodified) Gram-Schmidt >>> Orthogonalization with no iterative refinement >>> > > > happy breakdown tolerance 1e-30 >>> > > > maximum iterations=40000, initial guess is zero >>> > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. >>> > > > left preconditioning >>> > > > using PRECONDITIONED norm type for convergence test >>> > > > PC Object: 24 MPI processes >>> > > > type: hypre >>> > > > HYPRE BoomerAMG preconditioning >>> > > > Cycle type V >>> > > > Maximum number of levels 25 >>> > > > Maximum number of iterations PER hypre call 1 >>> > > > Convergence tolerance PER hypre call 0. >>> > > > Threshold for strong coupling 0.25 >>> > > > Interpolation truncation factor 0. >>> > > > Interpolation: max elements per row 0 >>> > > > Number of levels of aggressive coarsening 0 >>> > > > Number of paths for aggressive coarsening 1 >>> > > > Maximum row sums 0.9 >>> > > > Sweeps down 1 >>> > > > Sweeps up 1 >>> > > > Sweeps on coarse 1 >>> > > > Relax down symmetric-SOR/Jacobi >>> > > > Relax up symmetric-SOR/Jacobi >>> > > > Relax on coarse Gaussian-elimination >>> > > > Relax weight (all) 1. >>> > > > Outer relax weight (all) 1. >>> > > > Using CF-relaxation >>> > > > Not using more complex smoothers. 
>>> > > > Measure type local >>> > > > Coarsen type Falgout >>> > > > Interpolation type classical >>> > > > linear system matrix = precond matrix: >>> > > > Mat Object: A 24 MPI processes >>> > > > type: mpiaij >>> > > > rows=497664, cols=497664 >>> > > > total: nonzeros=3363552, allocated nonzeros=6967296 >>> > > > total number of mallocs used during MatSetValues calls =0 >>> > > > has attached null space >>> > > > not using I-node (on process 0) routines >>> > > > 0 KSP preconditioned resid norm 1.593802941804e+01 true resid >>> norm 7.365742695119e+00 ||r(i)||/||b|| 1.000000000000e+00 >>> > > > 1 KSP preconditioned resid norm 6.338666661133e-01 true resid >>> norm 2.880722209358e+00 ||r(i)||/||b|| 3.910973174867e-01 >>> > > > 2 KSP preconditioned resid norm 3.913420828350e-02 true resid >>> norm 9.544089760671e-01 ||r(i)||/||b|| 1.295740315093e-01 >>> > > > 3 KSP preconditioned resid norm 2.928070366435e-03 true resid >>> norm 1.874294004628e-01 ||r(i)||/||b|| 2.544609664236e-02 >>> > > > 4 KSP preconditioned resid norm 2.165607525823e-04 true resid >>> norm 3.121122463949e-02 ||r(i)||/||b|| 4.237349298146e-03 >>> > > > 5 KSP preconditioned resid norm 1.635476480407e-05 true resid >>> norm 3.984315313831e-03 ||r(i)||/||b|| 5.409251284967e-04 >>> > > > 6 KSP preconditioned resid norm 1.283358350575e-06 true resid >>> norm 4.566583802915e-04 ||r(i)||/||b|| 6.199760149022e-05 >>> > > > 7 KSP preconditioned resid norm 8.479469225747e-08 true resid >>> norm 3.824581791112e-05 ||r(i)||/||b|| 5.192391248810e-06 >>> > > > 8 KSP preconditioned resid norm 5.358636504683e-09 true resid >>> norm 2.829730442033e-06 ||r(i)||/||b|| 3.841744898187e-07 >>> > > > 9 KSP preconditioned resid norm 3.447874504193e-10 true resid >>> norm 1.756617036538e-07 ||r(i)||/||b|| 2.384847135242e-08 >>> > > > 10 KSP preconditioned resid norm 2.480228743414e-11 true resid >>> norm 1.309399823577e-08 ||r(i)||/||b|| 1.777688792258e-09 >>> > > > 11 KSP preconditioned resid norm 1.728967759950e-12 true resid >>> norm 9.406967045789e-10 ||r(i)||/||b|| 1.277124036931e-10 >>> > > > 12 KSP preconditioned resid norm 1.075458632828e-13 true resid >>> norm 5.994505136212e-11 ||r(i)||/||b|| 8.138358050689e-12 >>> > > > Linear solve converged due to CONVERGED_RTOL iterations 12 >>> > > > KSP Object: 24 MPI processes >>> > > > type: gmres >>> > > > restart=30, using Classical (unmodified) Gram-Schmidt >>> Orthogonalization with no iterative refinement >>> > > > happy breakdown tolerance 1e-30 >>> > > > maximum iterations=40000, initial guess is zero >>> > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. >>> > > > left preconditioning >>> > > > using PRECONDITIONED norm type for convergence test >>> > > > PC Object: 24 MPI processes >>> > > > type: hypre >>> > > > HYPRE BoomerAMG preconditioning >>> > > > Cycle type V >>> > > > Maximum number of levels 25 >>> > > > Maximum number of iterations PER hypre call 1 >>> > > > Convergence tolerance PER hypre call 0. >>> > > > Threshold for strong coupling 0.25 >>> > > > Interpolation truncation factor 0. >>> > > > Interpolation: max elements per row 0 >>> > > > Number of levels of aggressive coarsening 0 >>> > > > Number of paths for aggressive coarsening 1 >>> > > > Maximum row sums 0.9 >>> > > > Sweeps down 1 >>> > > > Sweeps up 1 >>> > > > Sweeps on coarse 1 >>> > > > Relax down symmetric-SOR/Jacobi >>> > > > Relax up symmetric-SOR/Jacobi >>> > > > Relax on coarse Gaussian-elimination >>> > > > Relax weight (all) 1. >>> > > > Outer relax weight (all) 1. 
>>> > > > Using CF-relaxation >>> > > > Not using more complex smoothers. >>> > > > Measure type local >>> > > > Coarsen type Falgout >>> > > > Interpolation type classical >>> > > > linear system matrix = precond matrix: >>> > > > Mat Object: A 24 MPI processes >>> > > > type: mpiaij >>> > > > rows=497664, cols=497664 >>> > > > total: nonzeros=3363552, allocated nonzeros=6967296 >>> > > > total number of mallocs used during MatSetValues calls =0 >>> > > > has attached null space >>> > > > not using I-node (on process 0) routines >>> > > > The max singular value of A = 1.000872 in poisson_solver3d_P0_vd >>> > > > The min singular value of A = 0.667688 in poisson_solver3d_P0_vd >>> > > > The Cond Num of A = 1.499012 in poisson_solver3d_P0_vd >>> > > > In poisson_solver3d_pressure(): num_iter = 12, rel_residual = >>> 1.075459e-13 >>> > > > >>> > > > The max value of p0 is 0.03115845493408858 >>> > > > >>> > > > The min value of p0 is -0.07156715468428149 >>> > > > >>> > > > On Sun, Oct 22, 2017 at 12:00 AM, Barry Smith >>> wrote: >>> > > > >>> > > > > On Oct 21, 2017, at 10:50 PM, Hao Zhang >>> wrote: >>> > > > > >>> > > > > the incompressible NS solver algorithm call PETSc solver at >>> different stage of each time step. The one you were saying "This is good. >>> 12 digit reduction" is after the initial pressure solver, in which usually >>> HYPRE doesn't give a good convergence, so the fall-back solver GMRES will >>> be called after. >>> > > > >>> > > > Hmm, I don't understand. hypre should do well on a pressure >>> solve. In fact, very well. >>> > > > > >>> > > > > Barry, you were mentioning that I could have a wrong nullspace. >>> that particular solver is aimed to give an initial pressure profile for 3d >>> incompressible NS simulation using all neumann boundary conditions. could >>> you give some insight how to test if I have a wrong nullspace etc? >>> > > > >>> > > > -ksp_test_null_space >>> > > > >>> > > > But if your null space is consistently from all Neumann >>> boundary conditions then it likely is not wrong. >>> > > > >>> > > > Barry >>> > > > >>> > > > > >>> > > > > Thanks! >>> > > > > >>> > > > > On Sat, Oct 21, 2017 at 11:41 PM, Barry Smith < >>> bsmith at mcs.anl.gov> wrote: >>> > > > > >>> > > > > This is good. You get more than 12 digit reduction in the true >>> residual norm. This is good AMG convergence. Expected when everything goes >>> well. >>> > > > > >>> > > > > What is different in this case from the previous case that >>> does not converge reasonably? >>> > > > > >>> > > > > Barry >>> > > > > >>> > > > > >>> > > > > > On Oct 21, 2017, at 9:29 PM, Hao Zhang >>> wrote: >>> > > > > > >>> > > > > > Barry, Please advise what you make of this? this is poisson >>> solver with all neumann BC 3d case Finite difference Scheme was used. >>> > > > > > Thanks! I'm in learning mode. >>> > > > > > >>> > > > > > KSP Object: 24 MPI processes >>> > > > > > type: bcgs >>> > > > > > maximum iterations=40000, initial guess is zero >>> > > > > > tolerances: relative=1e-14, absolute=1e-14, >>> divergence=10000. >>> > > > > > left preconditioning >>> > > > > > using PRECONDITIONED norm type for convergence test >>> > > > > > PC Object: 24 MPI processes >>> > > > > > type: hypre >>> > > > > > HYPRE BoomerAMG preconditioning >>> > > > > > Cycle type V >>> > > > > > Maximum number of levels 25 >>> > > > > > Maximum number of iterations PER hypre call 1 >>> > > > > > Convergence tolerance PER hypre call 0. 
>>> > > > > > Threshold for strong coupling 0.25 >>> > > > > > Interpolation truncation factor 0. >>> > > > > > Interpolation: max elements per row 0 >>> > > > > > Number of levels of aggressive coarsening 0 >>> > > > > > Number of paths for aggressive coarsening 1 >>> > > > > > Maximum row sums 0.9 >>> > > > > > Sweeps down 1 >>> > > > > > Sweeps up 1 >>> > > > > > Sweeps on coarse 1 >>> > > > > > Relax down symmetric-SOR/Jacobi >>> > > > > > Relax up symmetric-SOR/Jacobi >>> > > > > > Relax on coarse Gaussian-elimination >>> > > > > > Relax weight (all) 1. >>> > > > > > Outer relax weight (all) 1. >>> > > > > > Using CF-relaxation >>> > > > > > Not using more complex smoothers. >>> > > > > > Measure type local >>> > > > > > Coarsen type Falgout >>> > > > > > Interpolation type classical >>> > > > > > linear system matrix = precond matrix: >>> > > > > > Mat Object: A 24 MPI processes >>> > > > > > type: mpiaij >>> > > > > > rows=497664, cols=497664 >>> > > > > > total: nonzeros=3363552, allocated nonzeros=6967296 >>> > > > > > total number of mallocs used during MatSetValues calls =0 >>> > > > > > has attached null space >>> > > > > > not using I-node (on process 0) routines >>> > > > > > 0 KSP preconditioned resid norm 2.697270170623e-02 true >>> resid norm 9.637159071207e+00 ||r(i)||/||b|| 1.000000000000e+00 >>> > > > > > 1 KSP preconditioned resid norm 4.828857674609e-04 true >>> resid norm 6.294379664645e-01 ||r(i)||/||b|| 6.531364293291e-02 >>> > > > > > 2 KSP preconditioned resid norm 4.533649595815e-06 true >>> resid norm 1.135508605857e-02 ||r(i)||/||b|| 1.178260727531e-03 >>> > > > > > 3 KSP preconditioned resid norm 1.131704082606e-07 true >>> resid norm 1.196393029874e-04 ||r(i)||/||b|| 1.241437462051e-05 >>> > > > > > 4 KSP preconditioned resid norm 3.866281379569e-10 true >>> resid norm 5.121520801846e-07 ||r(i)||/||b|| 5.314347064320e-08 >>> > > > > > 5 KSP preconditioned resid norm 1.070114785241e-11 true >>> resid norm 1.203693733135e-08 ||r(i)||/||b|| 1.249013038221e-09 >>> > > > > > 6 KSP preconditioned resid norm 2.578780418765e-14 true >>> resid norm 6.020297525927e-11 ||r(i)||/||b|| 6.246962908306e-12 >>> > > > > > 7 KSP preconditioned resid norm 8.691764190203e-16 true >>> resid norm 1.866088098154e-12 ||r(i)||/||b|| 1.936346680973e-13 >>> > > > > > Linear solve converged due to CONVERGED_ATOL iterations 7 >>> > > > > > >>> > > > > > >>> > > > > > >>> > > > > > On Sat, Oct 21, 2017 at 6:53 PM, Barry Smith < >>> bsmith at mcs.anl.gov> wrote: >>> > > > > > >>> > > > > > > On Oct 21, 2017, at 5:47 PM, Hao Zhang >>> wrote: >>> > > > > > > >>> > > > > > > hi, Barry: >>> > > > > > > what do you mean absurd by setting tolerance =1e-14? >>> > > > > > >>> > > > > > Trying to decrease the initial residual norm down by a >>> factor of 1e-14 with an iterative method (or even direct method) is >>> unrealistic, usually unachievable) and almost never necessary. You are >>> requiring || r_n || < 1.e-14 >>> || r_0|| when >>> with double precision numbers you only have roughly 14 decimal digits total >>> to compute with. Round off alone will lead to differences far larger than >>> 1e-14 >>> > > > > > >>> > > > > > If you are using the solver in the context of a nonlinear >>> problem (i.e. inside Newton's method) then 1.e-6 >>> is generally >>> more than plenty to get quadratic convergence of Newton's method. 
>>> > > > > > >>> > > > > > If you are solving a linear problem then it is extremely >>> likely that errors due to discretization errors (from finite element method >>> etc) and the model are much much larger than even 1.e-8 >>> . >>> > > > > > >>> > > > > > So, in summary >>> > > > > > >>> > > > > > 1.e-14 >>> is probably >>> unachievable >>> > > > > > >>> > > > > > 1.e-14 >>> is almost for >>> sure not needed. >>> > > > > > >>> > > > > > Barry >>> > > > > > >>> > > > > > >>> > > > > > > >>> > > > > > > On Sat, Oct 21, 2017 at 18:42 Barry Smith < >>> bsmith at mcs.anl.gov> wrote: >>> > > > > > > >>> > > > > > > Run with -ksp_view_mat binary -ksp_view_rhs binary and >>> send the resulting output file called binaryoutput to >>> petsc-maint at mcs.anl.gov >>> > > > > > > >>> > > > > > > Note you can also use -ksp_type gmres with hypre, unlikely >>> to be a reason to use bcgs >>> > > > > > > >>> > > > > > > BTW: tolerances: relative=1e-14, is absurd >>> > > > > > > >>> > > > > > > My guess is your null space is incorrect. >>> > > > > > > >>> > > > > > > >>> > > > > > > >>> > > > > > > >>> > > > > > > > On Oct 21, 2017, at 4:34 PM, Hao Zhang < >>> hbcbh1999 at gmail.com> wrote: >>> > > > > > > > >>> > > > > > > > if this solver doesn't converge. I have a fall-back >>> solution, which uses GMRES solver. this setup is fine with me. I just want >>> to know if HYPRE is a reliable solution for me. Or I will have to go >>> without preconditioner. >>> > > > > > > > >>> > > > > > > > Thanks! >>> > > > > > > > >>> > > > > > > > On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang < >>> hbcbh1999 at gmail.com> wrote: >>> > > > > > > > this is serial run. still dumping output. parallel more or >>> less the same. >>> > > > > > > > >>> > > > > > > > KSP Object: 1 MPI processes >>> > > > > > > > type: bcgs >>> > > > > > > > maximum iterations=40000, initial guess is zero >>> > > > > > > > tolerances: relative=1e-14, absolute=1e-14, >>> divergence=10000. >>> > > > > > > > left preconditioning >>> > > > > > > > using PRECONDITIONED norm type for convergence test >>> > > > > > > > PC Object: 1 MPI processes >>> > > > > > > > type: hypre >>> > > > > > > > HYPRE BoomerAMG preconditioning >>> > > > > > > > Cycle type V >>> > > > > > > > Maximum number of levels 25 >>> > > > > > > > Maximum number of iterations PER hypre call 1 >>> > > > > > > > Convergence tolerance PER hypre call 0. >>> > > > > > > > Threshold for strong coupling 0.25 >>> > > > > > > > Interpolation truncation factor 0. >>> > > > > > > > Interpolation: max elements per row 0 >>> > > > > > > > Number of levels of aggressive coarsening 0 >>> > > > > > > > Number of paths for aggressive coarsening 1 >>> > > > > > > > Maximum row sums 0.9 >>> > > > > > > > Sweeps down 1 >>> > > > > > > > Sweeps up 1 >>> > > > > > > > Sweeps on coarse 1 >>> > > > > > > > Relax down symmetric-SOR/Jacobi >>> > > > > > > > Relax up symmetric-SOR/Jacobi >>> > > > > > > > Relax on coarse Gaussian-elimination >>> > > > > > > > Relax weight (all) 1. >>> > > > > > > > Outer relax weight (all) 1. >>> > > > > > > > Using CF-relaxation >>> > > > > > > > Not using more complex smoothers. 
>>> > > > > > > > Measure type local >>> > > > > > > > Coarsen type Falgout >>> > > > > > > > Interpolation type classical >>> > > > > > > > linear system matrix = precond matrix: >>> > > > > > > > Mat Object: A 1 MPI processes >>> > > > > > > > type: seqaij >>> > > > > > > > rows=497664, cols=497664 >>> > > > > > > > total: nonzeros=3363552, allocated nonzeros=3483648 >>> > > > > > > > total number of mallocs used during MatSetValues calls >>> =0 >>> > > > > > > > has attached null space >>> > > > > > > > not using I-node routines >>> > > > > > > > 0 KSP preconditioned resid norm 1.630377897956e+01 true >>> resid norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00 >>> > > > > > > > 1 KSP preconditioned resid norm 3.815819909205e-02 true >>> resid norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01 >>> > > > > > > > 2 KSP preconditioned resid norm 4.312671277701e-04 true >>> resid norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03 >>> > > > > > > > 3 KSP preconditioned resid norm 3.011875330569e-04 true >>> resid norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05 >>> > > > > > > > 4 KSP preconditioned resid norm 3.011783821295e-04 true >>> resid norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05 >>> > > > > > > > 5 KSP preconditioned resid norm 3.011783818372e-04 true >>> resid norm 2.504053673004e-04 ||r(i)||/||b|| 3.399594279422e-05 >>> > > > > > > > 6 KSP preconditioned resid norm 3.011783818375e-04 true >>> resid norm 2.503984078890e-04 ||r(i)||/||b|| 3.399499795925e-05 >>> > > > > > > > 7 KSP preconditioned resid norm 3.011783887442e-04 true >>> resid norm 2.504121501772e-04 ||r(i)||/||b|| 3.399686366224e-05 >>> > > > > > > > 8 KSP preconditioned resid norm 3.010913654181e-04 true >>> resid norm 2.504150259031e-04 ||r(i)||/||b|| 3.399725408124e-05 >>> > > > > > > > 9 KSP preconditioned resid norm 3.006520688232e-04 true >>> resid norm 2.504061607382e-04 ||r(i)||/||b|| 3.399605051423e-05 >>> > > > > > > > 10 KSP preconditioned resid norm 3.007309991942e-04 true >>> resid norm 2.503843638523e-04 ||r(i)||/||b|| 3.399309128978e-05 >>> > > > > > > > 11 KSP preconditioned resid norm 3.015946168077e-04 true >>> resid norm 2.503644677844e-04 ||r(i)||/||b|| 3.399039012728e-05 >>> > > > > > > > 12 KSP preconditioned resid norm 2.956643907377e-04 true >>> resid norm 2.503863046509e-04 ||r(i)||/||b|| 3.399335477965e-05 >>> > > > > > > > 13 KSP preconditioned resid norm 2.997992358936e-04 true >>> resid norm 2.504336903093e-04 ||r(i)||/||b|| 3.399978802886e-05 >>> > > > > > > > 14 KSP preconditioned resid norm 2.481415420420e-05 true >>> resid norm 2.491591201250e-04 ||r(i)||/||b|| 3.382674774806e-05 >>> > > > > > > > 15 KSP preconditioned resid norm 2.615494786181e-05 true >>> resid norm 2.490353237273e-04 ||r(i)||/||b|| 3.380994069915e-05 >>> > > > > > > > 16 KSP preconditioned resid norm 2.645126692130e-05 true >>> resid norm 2.490535523344e-04 ||r(i)||/||b|| 3.381241548111e-05 >>> > > > > > > > 17 KSP preconditioned resid norm 2.667223026209e-05 true >>> resid norm 2.490482602536e-04 ||r(i)||/||b|| 3.381169700898e-05 >>> > > > > > > > 18 KSP preconditioned resid norm 2.650813432116e-05 true >>> resid norm 2.490473169014e-04 ||r(i)||/||b|| 3.381156893606e-05 >>> > > > > > > > 19 KSP preconditioned resid norm 2.613309555449e-05 true >>> resid norm 2.490465633690e-04 ||r(i)||/||b|| 3.381146663375e-05 >>> > > > > > > > 20 KSP preconditioned resid norm 2.644160446804e-05 true >>> resid norm 2.490532739949e-04 ||r(i)||/||b|| 3.381237769272e-05 >>> > > > > > > > 21 
KSP preconditioned resid norm 2.635987608975e-05 true >>> resid norm 2.490499548926e-04 ||r(i)||/||b|| 3.381192707933e-05 >>> > > > > > > > 22 KSP preconditioned resid norm 2.640527129095e-05 true >>> resid norm 2.490594066529e-04 ||r(i)||/||b|| 3.381321028466e-05 >>> > > > > > > > 23 KSP preconditioned resid norm 2.627505117691e-05 true >>> resid norm 2.490550162585e-04 ||r(i)||/||b|| 3.381261422875e-05 >>> > > > > > > > 24 KSP preconditioned resid norm 2.642659196388e-05 true >>> resid norm 2.490504347640e-04 ||r(i)||/||b|| 3.381199222842e-05 >>> > > > > > > > 25 KSP preconditioned resid norm 2.659432190695e-05 true >>> resid norm 2.490510775152e-04 ||r(i)||/||b|| 3.381207949065e-05 >>> > > > > > > > 26 KSP preconditioned resid norm 2.687918062951e-05 true >>> resid norm 2.490518882015e-04 ||r(i)||/||b|| 3.381218955237e-05 >>> > > > > > > > 27 KSP preconditioned resid norm 2.662909048432e-05 true >>> resid norm 2.490446263285e-04 ||r(i)||/||b|| 3.381120365409e-05 >>> > > > > > > > 28 KSP preconditioned resid norm 2.085466483199e-05 true >>> resid norm 2.490131612366e-04 ||r(i)||/||b|| 3.380693183886e-05 >>> > > > > > > > 29 KSP preconditioned resid norm 2.098541330282e-05 true >>> resid norm 2.490126933398e-04 ||r(i)||/||b|| 3.380686831549e-05 >>> > > > > > > > 30 KSP preconditioned resid norm 2.175345180286e-05 true >>> resid norm 2.490098852429e-04 ||r(i)||/||b|| 3.380648707805e-05 >>> > > > > > > > 31 KSP preconditioned resid norm 2.182182437676e-05 true >>> resid norm 2.490028301020e-04 ||r(i)||/||b|| 3.380552924648e-05 >>> > > > > > > > 32 KSP preconditioned resid norm 2.152970404369e-05 true >>> resid norm 2.490089939838e-04 ||r(i)||/||b|| 3.380636607747e-05 >>> > > > > > > > 33 KSP preconditioned resid norm 2.187932450016e-05 true >>> resid norm 2.490085293931e-04 ||r(i)||/||b|| 3.380630300295e-05 >>> > > > > > > > 34 KSP preconditioned resid norm 2.207255875067e-05 true >>> resid norm 2.490039036092e-04 ||r(i)||/||b|| 3.380567498971e-05 >>> > > > > > > > 35 KSP preconditioned resid norm 2.205060279701e-05 true >>> resid norm 2.490101636150e-04 ||r(i)||/||b|| 3.380652487086e-05 >>> > > > > > > > 36 KSP preconditioned resid norm 2.168654200416e-05 true >>> resid norm 2.490091609876e-04 ||r(i)||/||b|| 3.380638875052e-05 >>> > > > > > > > 37 KSP preconditioned resid norm 2.164521042361e-05 true >>> resid norm 2.490083143913e-04 ||r(i)||/||b|| 3.380627381352e-05 >>> > > > > > > > 38 KSP preconditioned resid norm 2.154429063973e-05 true >>> resid norm 2.490075485470e-04 ||r(i)||/||b|| 3.380616983972e-05 >>> > > > > > > > 39 KSP preconditioned resid norm 2.165962086228e-05 true >>> resid norm 2.490099695056e-04 ||r(i)||/||b|| 3.380649851786e-05 >>> > > > > > > > 40 KSP preconditioned resid norm 2.153877616091e-05 true >>> resid norm 2.490090652619e-04 ||r(i)||/||b|| 3.380637575444e-05 >>> > > > > > > > 41 KSP preconditioned resid norm 2.347651187611e-05 true >>> resid norm 2.490233544624e-04 ||r(i)||/||b|| 3.380831570825e-05 >>> > > > > > > > 42 KSP preconditioned resid norm 2.352860162514e-05 true >>> resid norm 2.490191394202e-04 ||r(i)||/||b|| 3.380774345879e-05 >>> > > > > > > > 43 KSP preconditioned resid norm 2.312377506928e-05 true >>> resid norm 2.490209491359e-04 ||r(i)||/||b|| 3.380798915237e-05 >>> > > > > > > > 44 KSP preconditioned resid norm 2.295770973533e-05 true >>> resid norm 2.490178136759e-04 ||r(i)||/||b|| 3.380756347093e-05 >>> > > > > > > > 45 KSP preconditioned resid norm 2.833646456041e-05 true >>> resid norm 2.489991602651e-04 ||r(i)||/||b|| 3.380503101608e-05 
>>> > > > > > > > 46 KSP preconditioned resid norm 2.760296424494e-05 true >>> resid norm 2.490104320666e-04 ||r(i)||/||b|| 3.380656131682e-05 >>> > > > > > > > 47 KSP preconditioned resid norm 2.451504295239e-05 true >>> resid norm 2.490241388672e-04 ||r(i)||/||b|| 3.380842220189e-05 >>> > > > > > > > 48 KSP preconditioned resid norm 2.512391514098e-05 true >>> resid norm 2.490245923753e-04 ||r(i)||/||b|| 3.380848377180e-05 >>> > > > > > > > 49 KSP preconditioned resid norm 2.483419450528e-05 true >>> resid norm 2.490273364402e-04 ||r(i)||/||b|| 3.380885631602e-05 >>> > > > > > > > 50 KSP preconditioned resid norm 2.507460538466e-05 true >>> resid norm 2.490309488780e-04 ||r(i)||/||b|| 3.380934675371e-05 >>> > > > > > > > 51 KSP preconditioned resid norm 2.499708772881e-05 true >>> resid norm 2.490300908170e-04 ||r(i)||/||b|| 3.380923026022e-05 >>> > > > > > > > 52 KSP preconditioned resid norm 1.059778259446e-05 true >>> resid norm 2.489352833521e-04 ||r(i)||/||b|| 3.379635885420e-05 >>> > > > > > > > 53 KSP preconditioned resid norm 1.074975117060e-05 true >>> resid norm 2.489294722901e-04 ||r(i)||/||b|| 3.379556992330e-05 >>> > > > > > > > 54 KSP preconditioned resid norm 1.095242219559e-05 true >>> resid norm 2.489295454212e-04 ||r(i)||/||b|| 3.379557985184e-05 >>> > > > > > > > 55 KSP preconditioned resid norm 8.359999674720e-06 true >>> resid norm 2.489673581944e-04 ||r(i)||/||b|| 3.380071345137e-05 >>> > > > > > > > 56 KSP preconditioned resid norm 8.368232998470e-06 true >>> resid norm 2.489700421343e-04 ||r(i)||/||b|| 3.380107783281e-05 >>> > > > > > > > 57 KSP preconditioned resid norm 8.443378041101e-06 true >>> resid norm 2.489702900875e-04 ||r(i)||/||b|| 3.380111149584e-05 >>> > > > > > > > 58 KSP preconditioned resid norm 8.647159584302e-06 true >>> resid norm 2.489640805831e-04 ||r(i)||/||b|| 3.380026847095e-05 >>> > > > > > > > 59 KSP preconditioned resid norm 1.024742790737e-05 true >>> resid norm 2.489447846660e-04 ||r(i)||/||b|| 3.379764878711e-05 >>> > > > > > > > 60 KSP preconditioned resid norm 1.033394118910e-05 true >>> resid norm 2.489441404923e-04 ||r(i)||/||b|| 3.379756133175e-05 >>> > > > > > > > 61 KSP preconditioned resid norm 1.030066336446e-05 true >>> resid norm 2.489399918556e-04 ||r(i)||/||b|| 3.379699809776e-05 >>> > > > > > > > 62 KSP preconditioned resid norm 1.029956398963e-05 true >>> resid norm 2.489445295139e-04 ||r(i)||/||b|| 3.379761414674e-05 >>> > > > > > > > 63 KSP preconditioned resid norm 1.028190129002e-05 true >>> resid norm 2.489456200527e-04 ||r(i)||/||b|| 3.379776220225e-05 >>> > > > > > > > 64 KSP preconditioned resid norm 9.878799185773e-06 true >>> resid norm 2.489488742330e-04 ||r(i)||/||b|| 3.379820400160e-05 >>> > > > > > > > 65 KSP preconditioned resid norm 9.917711104174e-06 true >>> resid norm 2.489478066593e-04 ||r(i)||/||b|| 3.379805906391e-05 >>> > > > > > > > 66 KSP preconditioned resid norm 1.003572019576e-05 true >>> resid norm 2.489441995703e-04 ||r(i)||/||b|| 3.379756935240e-05 >>> > > > > > > > 67 KSP preconditioned resid norm 9.924487278236e-06 true >>> resid norm 2.489475403451e-04 ||r(i)||/||b|| 3.379802290812e-05 >>> > > > > > > > 68 KSP preconditioned resid norm 9.804213483359e-06 true >>> resid norm 2.489457781760e-04 ||r(i)||/||b|| 3.379778366964e-05 >>> > > > > > > > 69 KSP preconditioned resid norm 9.748922705476e-06 true >>> resid norm 2.489408473578e-04 ||r(i)||/||b|| 3.379711424383e-05 >>> > > > > > > > 70 KSP preconditioned resid norm 9.886044523689e-06 true >>> resid norm 2.489514438395e-04 
||r(i)||/||b|| 3.379855286071e-05 >>> > > > > > > > 71 KSP preconditioned resid norm 1.083888478689e-05 true >>> resid norm 2.489420898851e-04 ||r(i)||/||b|| 3.379728293386e-05 >>> > > > > > > > 72 KSP preconditioned resid norm 1.106561823757e-05 true >>> resid norm 2.489364778104e-04 ||r(i)||/||b|| 3.379652101821e-05 >>> > > > > > > > 73 KSP preconditioned resid norm 1.132091515426e-05 true >>> resid norm 2.489456804535e-04 ||r(i)||/||b|| 3.379777040248e-05 >>> > > > > > > > 74 KSP preconditioned resid norm 1.330905328963e-05 true >>> resid norm 2.489317458981e-04 ||r(i)||/||b|| 3.379587859660e-05 >>> > > > > > > > 75 KSP preconditioned resid norm 1.305555302619e-05 true >>> resid norm 2.489320939810e-04 ||r(i)||/||b|| 3.379592585359e-05 >>> > > > > > > > 76 KSP preconditioned resid norm 1.308083397399e-05 true >>> resid norm 2.489299951581e-04 ||r(i)||/||b|| 3.379564090977e-05 >>> > > > > > > > 77 KSP preconditioned resid norm 1.320098861853e-05 true >>> resid norm 2.489323669317e-04 ||r(i)||/||b|| 3.379596291036e-05 >>> > > > > > > > 78 KSP preconditioned resid norm 1.300160788274e-05 true >>> resid norm 2.489306393356e-04 ||r(i)||/||b|| 3.379572836564e-05 >>> > > > > > > > 79 KSP preconditioned resid norm 1.317651537793e-05 true >>> resid norm 2.489381364970e-04 ||r(i)||/||b|| 3.379674620752e-05 >>> > > > > > > > 80 KSP preconditioned resid norm 1.309769805765e-05 true >>> resid norm 2.489285056062e-04 ||r(i)||/||b|| 3.379543868279e-05 >>> > > > > > > > 81 KSP preconditioned resid norm 1.293686496271e-05 true >>> resid norm 2.489347818072e-04 ||r(i)||/||b|| 3.379629076264e-05 >>> > > > > > > > 82 KSP preconditioned resid norm 1.311788285799e-05 true >>> resid norm 2.489320040215e-04 ||r(i)||/||b|| 3.379591364037e-05 >>> > > > > > > > 83 KSP preconditioned resid norm 1.313667378798e-05 true >>> resid norm 2.489329437217e-04 ||r(i)||/||b|| 3.379604121748e-05 >>> > > > > > > > 84 KSP preconditioned resid norm 1.416138205017e-05 true >>> resid norm 2.489266908838e-04 ||r(i)||/||b|| 3.379519230948e-05 >>> > > > > > > > 85 KSP preconditioned resid norm 1.452253464774e-05 true >>> resid norm 2.489285688375e-04 ||r(i)||/||b|| 3.379544726729e-05 >>> > > > > > > > 86 KSP preconditioned resid norm 1.426709413370e-05 true >>> resid norm 2.489362313402e-04 ||r(i)||/||b|| 3.379648755651e-05 >>> > > > > > > > 87 KSP preconditioned resid norm 1.427480849552e-05 true >>> resid norm 2.489378183000e-04 ||r(i)||/||b|| 3.379670300795e-05 >>> > > > > > > > 88 KSP preconditioned resid norm 1.413870980147e-05 true >>> resid norm 2.489325756118e-04 ||r(i)||/||b|| 3.379599124153e-05 >>> > > > > > > > 89 KSP preconditioned resid norm 1.353259857657e-05 true >>> resid norm 2.489318968308e-04 ||r(i)||/||b|| 3.379589908776e-05 >>> > > > > > > > 90 KSP preconditioned resid norm 1.347676448611e-05 true >>> resid norm 2.489332074417e-04 ||r(i)||/||b|| 3.379607702106e-05 >>> > > > > > > > 91 KSP preconditioned resid norm 1.362825902909e-05 true >>> resid norm 2.489344974971e-04 ||r(i)||/||b|| 3.379625216367e-05 >>> > > > > > > > 92 KSP preconditioned resid norm 1.346280901052e-05 true >>> resid norm 2.489302570131e-04 ||r(i)||/||b|| 3.379567646016e-05 >>> > > > > > > > 93 KSP preconditioned resid norm 1.328052169696e-05 true >>> resid norm 2.489346601224e-04 ||r(i)||/||b|| 3.379627424228e-05 >>> > > > > > > > 94 KSP preconditioned resid norm 1.554682082515e-05 true >>> resid norm 2.489309078759e-04 ||r(i)||/||b|| 3.379576482365e-05 >>> > > > > > > > 95 KSP preconditioned resid norm 1.557128675775e-05 true >>> resid 
norm 2.489317143582e-04 ||r(i)||/||b|| 3.379587431462e-05 >>> > > > > > > > 96 KSP preconditioned resid norm 1.542571813923e-05 true >>> resid norm 2.489319910303e-04 ||r(i)||/||b|| 3.379591187663e-05 >>> > > > > > > > 97 KSP preconditioned resid norm 1.570516684444e-05 true >>> resid norm 2.489321980894e-04 ||r(i)||/||b|| 3.379593998772e-05 >>> > > > > > > > 98 KSP preconditioned resid norm 1.600431789899e-05 true >>> resid norm 2.489297450311e-04 ||r(i)||/||b|| 3.379560695162e-05 >>> > > > > > > > 99 KSP preconditioned resid norm 1.587495554658e-05 true >>> resid norm 2.489339000570e-04 ||r(i)||/||b|| 3.379617105303e-05 >>> > > > > > > > 100 KSP preconditioned resid norm 1.621163002878e-05 true >>> resid norm 2.489299953360e-04 ||r(i)||/||b|| 3.379564093392e-05 >>> > > > > > > > 101 KSP preconditioned resid norm 1.627060872574e-05 true >>> resid norm 2.489301570161e-04 ||r(i)||/||b|| 3.379566288419e-05 >>> > > > > > > > 102 KSP preconditioned resid norm 1.622931647243e-05 true >>> resid norm 2.489277930910e-04 ||r(i)||/||b|| 3.379534194913e-05 >>> > > > > > > > 103 KSP preconditioned resid norm 1.612544300282e-05 true >>> resid norm 2.489317483299e-04 ||r(i)||/||b|| 3.379587892674e-05 >>> > > > > > > > 104 KSP preconditioned resid norm 1.880131646630e-05 true >>> resid norm 2.489335862583e-04 ||r(i)||/||b|| 3.379612845059e-05 >>> > > > > > > > 105 KSP preconditioned resid norm 1.880563295793e-05 true >>> resid norm 2.489365017923e-04 ||r(i)||/||b|| 3.379652427408e-05 >>> > > > > > > > 106 KSP preconditioned resid norm 1.860619184027e-05 true >>> resid norm 2.489362382373e-04 ||r(i)||/||b|| 3.379648849288e-05 >>> > > > > > > > 107 KSP preconditioned resid norm 1.877134148719e-05 true >>> resid norm 2.489425484523e-04 ||r(i)||/||b|| 3.379734519061e-05 >>> > > > > > > > 108 KSP preconditioned resid norm 1.914810713538e-05 true >>> resid norm 2.489347415573e-04 ||r(i)||/||b|| 3.379628529818e-05 >>> > > > > > > > 109 KSP preconditioned resid norm 1.220673255622e-05 true >>> resid norm 2.490196186357e-04 ||r(i)||/||b|| 3.380780851884e-05 >>> > > > > > > > 110 KSP preconditioned resid norm 1.215819132910e-05 true >>> resid norm 2.490233518072e-04 ||r(i)||/||b|| 3.380831534776e-05 >>> > > > > > > > 111 KSP preconditioned resid norm 1.196565427400e-05 true >>> resid norm 2.490279438906e-04 ||r(i)||/||b|| 3.380893878570e-05 >>> > > > > > > > 112 KSP preconditioned resid norm 1.171748185197e-05 true >>> resid norm 2.490242578983e-04 ||r(i)||/||b|| 3.380843836198e-05 >>> > > > > > > > 113 KSP preconditioned resid norm 1.162855824118e-05 true >>> resid norm 2.490229018536e-04 ||r(i)||/||b|| 3.380825426043e-05 >>> > > > > > > > 114 KSP preconditioned resid norm 1.175594685689e-05 true >>> resid norm 2.490274328440e-04 ||r(i)||/||b|| 3.380886940415e-05 >>> > > > > > > > 115 KSP preconditioned resid norm 1.167979454122e-05 true >>> resid norm 2.490271161036e-04 ||r(i)||/||b|| 3.380882640232e-05 >>> > > > > > > > 116 KSP preconditioned resid norm 1.181010893019e-05 true >>> resid norm 2.490235420657e-04 ||r(i)||/||b|| 3.380834117795e-05 >>> > > > > > > > 117 KSP preconditioned resid norm 1.175206638194e-05 true >>> resid norm 2.490263165345e-04 ||r(i)||/||b|| 3.380871784992e-05 >>> > > > > > > > 118 KSP preconditioned resid norm 1.183804125791e-05 true >>> resid norm 2.490221353083e-04 ||r(i)||/||b|| 3.380815019145e-05 >>> > > > > > > > 119 KSP preconditioned resid norm 1.186426973727e-05 true >>> resid norm 2.490227115336e-04 ||r(i)||/||b|| 3.380822842189e-05 >>> > > > > > > > 120 KSP preconditioned 
resid norm 1.181986776689e-05 true >>> resid norm 2.490257884230e-04 ||r(i)||/||b|| 3.380864615159e-05 >>> > > > > > > > 121 KSP preconditioned resid norm 1.131443277370e-05 true >>> resid norm 2.490259110230e-04 ||r(i)||/||b|| 3.380866279620e-05 >>> > > > > > > > 122 KSP preconditioned resid norm 1.114920075859e-05 true >>> resid norm 2.490249829382e-04 ||r(i)||/||b|| 3.380853679603e-05 >>> > > > > > > > 123 KSP preconditioned resid norm 1.082073321672e-05 true >>> resid norm 2.490314084868e-04 ||r(i)||/||b|| 3.380940915187e-05 >>> > > > > > > > 124 KSP preconditioned resid norm 3.307785860990e-06 true >>> resid norm 2.490613501549e-04 ||r(i)||/||b|| 3.381347414155e-05 >>> > > > > > > > 125 KSP preconditioned resid norm 3.287051720572e-06 true >>> resid norm 2.490584648195e-04 ||r(i)||/||b|| 3.381308241794e-05 >>> > > > > > > > 126 KSP preconditioned resid norm 3.286797046069e-06 true >>> resid norm 2.490654396386e-04 ||r(i)||/||b|| 3.381402934473e-05 >>> > > > > > > > 127 KSP preconditioned resid norm 3.311592899411e-06 true >>> resid norm 2.490627973588e-04 ||r(i)||/||b|| 3.381367061922e-05 >>> > > > > > > > 128 KSP preconditioned resid norm 3.560993694635e-06 true >>> resid norm 2.490571732816e-04 ||r(i)||/||b|| 3.381290707406e-05 >>> > > > > > > > 129 KSP preconditioned resid norm 3.411994617661e-06 true >>> resid norm 2.490652122141e-04 ||r(i)||/||b|| 3.381399846875e-05 >>> > > > > > > > 130 KSP preconditioned resid norm 3.412383310721e-06 true >>> resid norm 2.490633454022e-04 ||r(i)||/||b|| 3.381374502359e-05 >>> > > > > > > > 131 KSP preconditioned resid norm 3.288320044878e-06 true >>> resid norm 2.490639470096e-04 ||r(i)||/||b|| 3.381382669999e-05 >>> > > > > > > > 132 KSP preconditioned resid norm 3.273215756565e-06 true >>> resid norm 2.490640390847e-04 ||r(i)||/||b|| 3.381383920043e-05 >>> > > > > > > > 133 KSP preconditioned resid norm 3.236969051459e-06 true >>> resid norm 2.490678216102e-04 ||r(i)||/||b|| 3.381435272985e-05 >>> > > > > > > > 134 KSP preconditioned resid norm 3.203260913942e-06 true >>> resid norm 2.490640965346e-04 ||r(i)||/||b|| 3.381384700005e-05 >>> > > > > > > > 135 KSP preconditioned resid norm 3.224117152353e-06 true >>> resid norm 2.490655026376e-04 ||r(i)||/||b|| 3.381403789770e-05 >>> > > > > > > > 136 KSP preconditioned resid norm 3.221577997984e-06 true >>> resid norm 2.490684737611e-04 ||r(i)||/||b|| 3.381444126823e-05 >>> > > > > > > > 137 KSP preconditioned resid norm 3.195936222128e-06 true >>> resid norm 2.490673982333e-04 ||r(i)||/||b|| 3.381429525066e-05 >>> > > > > > > > 138 KSP preconditioned resid norm 3.207528137426e-06 true >>> resid norm 2.490641247196e-04 ||r(i)||/||b|| 3.381385082655e-05 >>> > > > > > > > 139 KSP preconditioned resid norm 3.240134271963e-06 true >>> resid norm 2.490615861251e-04 ||r(i)||/||b|| 3.381350617773e-05 >>> > > > > > > > 140 KSP preconditioned resid norm 2.698833607230e-06 true >>> resid norm 2.490638954889e-04 ||r(i)||/||b|| 3.381381970535e-05 >>> > > > > > > > 141 KSP preconditioned resid norm 2.599151209137e-06 true >>> resid norm 2.490657106698e-04 ||r(i)||/||b|| 3.381406614091e-05 >>> > > > > > > > 142 KSP preconditioned resid norm 2.633939920994e-06 true >>> resid norm 2.490707754695e-04 ||r(i)||/||b|| 3.381475375652e-05 >>> > > > > > > > 143 KSP preconditioned resid norm 2.519609221376e-06 true >>> resid norm 2.490639100480e-04 ||r(i)||/||b|| 3.381382168195e-05 >>> > > > > > > > 144 KSP preconditioned resid norm 3.768526937684e-06 true >>> resid norm 2.490654096698e-04 ||r(i)||/||b|| 
3.381402527606e-05 >>> > > > > > > > 145 KSP preconditioned resid norm 3.707841943289e-06 true >>> resid norm 2.490630207923e-04 ||r(i)||/||b|| 3.381370095336e-05 >>> > > > > > > > 146 KSP preconditioned resid norm 3.698827503486e-06 true >>> resid norm 2.490646071561e-04 ||r(i)||/||b|| 3.381391632387e-05 >>> > > > > > > > 147 KSP preconditioned resid norm 3.642747039615e-06 true >>> resid norm 2.490610990161e-04 ||r(i)||/||b|| 3.381344004604e-05 >>> > > > > > > > 148 KSP preconditioned resid norm 3.613100087842e-06 true >>> resid norm 2.490617159023e-04 ||r(i)||/||b|| 3.381352379676e-05 >>> > > > > > > > 149 KSP preconditioned resid norm 3.637646399299e-06 true >>> resid norm 2.490648063023e-04 ||r(i)||/||b|| 3.381394336069e-05 >>> > > > > > > > 150 KSP preconditioned resid norm 3.640235367864e-06 true >>> resid norm 2.490648516718e-04 ||r(i)||/||b|| 3.381394952022e-05 >>> > > > > > > > 151 KSP preconditioned resid norm 3.724708848977e-06 true >>> resid norm 2.490622201040e-04 ||r(i)||/||b|| 3.381359224901e-05 >>> > > > > > > > 152 KSP preconditioned resid norm 3.665185002770e-06 true >>> resid norm 2.490664302790e-04 ||r(i)||/||b|| 3.381416383766e-05 >>> > > > > > > > 153 KSP preconditioned resid norm 3.348992579120e-06 true >>> resid norm 2.490655722697e-04 ||r(i)||/||b|| 3.381404735121e-05 >>> > > > > > > > 154 KSP preconditioned resid norm 3.309431137943e-06 true >>> resid norm 2.490727563300e-04 ||r(i)||/||b|| 3.381502268535e-05 >>> > > > > > > > 155 KSP preconditioned resid norm 3.299031245428e-06 true >>> resid norm 2.490688392843e-04 ||r(i)||/||b|| 3.381449089298e-05 >>> > > > > > > > 156 KSP preconditioned resid norm 3.297127463503e-06 true >>> resid norm 2.490642207769e-04 ||r(i)||/||b|| 3.381386386763e-05 >>> > > > > > > > 157 KSP preconditioned resid norm 3.297370198641e-06 true >>> resid norm 2.490666651723e-04 ||r(i)||/||b|| 3.381419572764e-05 >>> > > > > > > > 158 KSP preconditioned resid norm 3.290873165210e-06 true >>> resid norm 2.490679189538e-04 ||r(i)||/||b|| 3.381436594557e-05 >>> > > > > > > > 159 KSP preconditioned resid norm 3.346705292419e-06 true >>> resid norm 2.490617329776e-04 ||r(i)||/||b|| 3.381352611496e-05 >>> > > > > > > > 160 KSP preconditioned resid norm 3.429583550890e-06 true >>> resid norm 2.490675116236e-04 ||r(i)||/||b|| 3.381431064494e-05 >>> > > > > > > > 161 KSP preconditioned resid norm 3.425238504679e-06 true >>> resid norm 2.490648199058e-04 ||r(i)||/||b|| 3.381394520756e-05 >>> > > > > > > > 162 KSP preconditioned resid norm 3.423484857849e-06 true >>> resid norm 2.490723208298e-04 ||r(i)||/||b|| 3.381496356025e-05 >>> > > > > > > > 163 KSP preconditioned resid norm 3.383655922943e-06 true >>> resid norm 2.490659981249e-04 ||r(i)||/||b|| 3.381410516686e-05 >>> > > > > > > > 164 KSP preconditioned resid norm 3.477197358452e-06 true >>> resid norm 2.490665979073e-04 ||r(i)||/||b|| 3.381418659549e-05 >>> > > > > > > > 165 KSP preconditioned resid norm 3.454672202601e-06 true >>> resid norm 2.490651358644e-04 ||r(i)||/||b|| 3.381398810323e-05 >>> > > > > > > > 166 KSP preconditioned resid norm 3.399075522566e-06 true >>> resid norm 2.490678159511e-04 ||r(i)||/||b|| 3.381435196154e-05 >>> > > > > > > > 167 KSP preconditioned resid norm 3.305455787400e-06 true >>> resid norm 2.490651924523e-04 ||r(i)||/||b|| 3.381399578581e-05 >>> > > > > > > > 168 KSP preconditioned resid norm 3.368445533284e-06 true >>> resid norm 2.490688061735e-04 ||r(i)||/||b|| 3.381448639774e-05 >>> > > > > > > > 169 KSP preconditioned resid norm 2.981519724814e-06 true >>> 
resid norm 2.490676378334e-04 ||r(i)||/||b|| 3.381432777964e-05 >>> > > > > > > > 170 KSP preconditioned resid norm 3.034423065539e-06 true >>> resid norm 2.490694458885e-04 ||r(i)||/||b|| 3.381457324777e-05 >>> > > > > > > > 171 KSP preconditioned resid norm 2.885972780503e-06 true >>> resid norm 2.490688033729e-04 ||r(i)||/||b|| 3.381448601752e-05 >>> > > > > > > > 172 KSP preconditioned resid norm 2.892491075033e-06 true >>> resid norm 2.490692993765e-04 ||r(i)||/||b|| 3.381455335678e-05 >>> > > > > > > > 173 KSP preconditioned resid norm 2.921316177611e-06 true >>> resid norm 2.490697629787e-04 ||r(i)||/||b|| 3.381461629709e-05 >>> > > > > > > > 174 KSP preconditioned resid norm 2.999889222269e-06 true >>> resid norm 2.490707272626e-04 ||r(i)||/||b|| 3.381474721178e-05 >>> > > > > > > > 175 KSP preconditioned resid norm 2.975590207575e-06 true >>> resid norm 2.490685439925e-04 ||r(i)||/||b|| 3.381445080310e-05 >>> > > > > > > > 176 KSP preconditioned resid norm 2.983065843597e-06 true >>> resid norm 2.490701883671e-04 ||r(i)||/||b|| 3.381467404937e-05 >>> > > > > > > > 177 KSP preconditioned resid norm 2.965959610245e-06 true >>> resid norm 2.490711538630e-04 ||r(i)||/||b|| 3.381480512861e-05 >>> > > > > > > > 178 KSP preconditioned resid norm 3.005389788827e-06 true >>> resid norm 2.490751808095e-04 ||r(i)||/||b|| 3.381535184150e-05 >>> > > > > > > > 179 KSP preconditioned resid norm 2.956581668772e-06 true >>> resid norm 2.490653125636e-04 ||r(i)||/||b|| 3.381401209257e-05 >>> > > > > > > > 180 KSP preconditioned resid norm 2.937498883661e-06 true >>> resid norm 2.490666056653e-04 ||r(i)||/||b|| 3.381418764874e-05 >>> > > > > > > > 181 KSP preconditioned resid norm 2.913227475431e-06 true >>> resid norm 2.490682436979e-04 ||r(i)||/||b|| 3.381441003402e-05 >>> > > > > > > > 182 KSP preconditioned resid norm 3.048172862254e-06 true >>> resid norm 2.490719669872e-04 ||r(i)||/||b|| 3.381491552130e-05 >>> > > > > > > > 183 KSP preconditioned resid norm 3.023868104933e-06 true >>> resid norm 2.490648745555e-04 ||r(i)||/||b|| 3.381395262699e-05 >>> > > > > > > > 184 KSP preconditioned resid norm 2.985947506400e-06 true >>> resid norm 2.490638818852e-04 ||r(i)||/||b|| 3.381381785846e-05 >>> > > > > > > > 185 KSP preconditioned resid norm 2.840032055776e-06 true >>> resid norm 2.490701112392e-04 ||r(i)||/||b|| 3.381466357820e-05 >>> > > > > > > > 186 KSP preconditioned resid norm 2.229279683815e-06 true >>> resid norm 2.490609220680e-04 ||r(i)||/||b|| 3.381341602292e-05 >>> > > > > > > > 187 KSP preconditioned resid norm 2.441513276379e-06 true >>> resid norm 2.490674056899e-04 ||r(i)||/||b|| 3.381429626300e-05 >>> > > > > > > > 188 KSP preconditioned resid norm 2.467046864016e-06 true >>> resid norm 2.490691622632e-04 ||r(i)||/||b|| 3.381453474178e-05 >>> > > > > > > > 189 KSP preconditioned resid norm 2.482124586361e-06 true >>> resid norm 2.490664992339e-04 ||r(i)||/||b|| 3.381417319923e-05 >>> > > > > > > > 190 KSP preconditioned resid norm 2.470564926502e-06 true >>> resid norm 2.490617019713e-04 ||r(i)||/||b|| 3.381352190543e-05 >>> > > > > > > > 191 KSP preconditioned resid norm 2.457947086578e-06 true >>> resid norm 2.490628644250e-04 ||r(i)||/||b|| 3.381367972437e-05 >>> > > > > > > > 192 KSP preconditioned resid norm 2.469444741724e-06 true >>> resid norm 2.490639416335e-04 ||r(i)||/||b|| 3.381382597011e-05 >>> > > > > > > > 193 KSP preconditioned resid norm 2.469951525219e-06 true >>> resid norm 2.490599769764e-04 ||r(i)||/||b|| 3.381328771385e-05 >>> > > > > > > > 194 KSP 
preconditioned resid norm 2.467486786643e-06 true >>> resid norm 2.490630178622e-04 ||r(i)||/||b|| 3.381370055556e-05 >>> > > > > > > > 195 KSP preconditioned resid norm 2.409684391404e-06 true >>> resid norm 2.490640302606e-04 ||r(i)||/||b|| 3.381383800245e-05 >>> > > > > > > > 196 KSP preconditioned resid norm 2.456046691135e-06 true >>> resid norm 2.490637730235e-04 ||r(i)||/||b|| 3.381380307900e-05 >>> > > > > > > > 197 KSP preconditioned resid norm 2.300015653805e-06 true >>> resid norm 2.490615406913e-04 ||r(i)||/||b|| 3.381350000947e-05 >>> > > > > > > > 198 KSP preconditioned resid norm 2.238328275301e-06 true >>> resid norm 2.490647641246e-04 ||r(i)||/||b|| 3.381393763449e-05 >>> > > > > > > > 199 KSP preconditioned resid norm 2.317293820319e-06 true >>> resid norm 2.490641611282e-04 ||r(i)||/||b|| 3.381385576951e-05 >>> > > > > > > > 200 KSP preconditioned resid norm 2.359590971314e-06 true >>> resid norm 2.490685242974e-04 ||r(i)||/||b|| 3.381444812922e-05 >>> > > > > > > > 201 KSP preconditioned resid norm 2.311199691596e-06 true >>> resid norm 2.490656791753e-04 ||r(i)||/||b|| 3.381406186510e-05 >>> > > > > > > > 202 KSP preconditioned resid norm 2.328772904196e-06 true >>> resid norm 2.490651045523e-04 ||r(i)||/||b|| 3.381398385220e-05 >>> > > > > > > > 203 KSP preconditioned resid norm 2.332731604717e-06 true >>> resid norm 2.490649960574e-04 ||r(i)||/||b|| 3.381396912253e-05 >>> > > > > > > > 204 KSP preconditioned resid norm 2.357629383490e-06 true >>> resid norm 2.490686317727e-04 ||r(i)||/||b|| 3.381446272046e-05 >>> > > > > > > > 205 KSP preconditioned resid norm 2.374856180299e-06 true >>> resid norm 2.490645897176e-04 ||r(i)||/||b|| 3.381391395637e-05 >>> > > > > > > > 206 KSP preconditioned resid norm 2.340395514404e-06 true >>> resid norm 2.490618341127e-04 ||r(i)||/||b|| 3.381353984542e-05 >>> > > > > > > > 207 KSP preconditioned resid norm 2.314963680954e-06 true >>> resid norm 2.490676153984e-04 ||r(i)||/||b|| 3.381432473379e-05 >>> > > > > > > > 208 KSP preconditioned resid norm 2.448070953106e-06 true >>> resid norm 2.490644606776e-04 ||r(i)||/||b|| 3.381389643743e-05 >>> > > > > > > > 209 KSP preconditioned resid norm 2.428805110632e-06 true >>> resid norm 2.490635817597e-04 ||r(i)||/||b|| 3.381377711234e-05 >>> > > > > > > > 210 KSP preconditioned resid norm 2.537929937808e-06 true >>> resid norm 2.490680589404e-04 ||r(i)||/||b|| 3.381438495066e-05 >>> > > > > > > > 211 KSP preconditioned resid norm 2.515909029682e-06 true >>> resid norm 2.490687803038e-04 ||r(i)||/||b|| 3.381448288557e-05 >>> > > > > > > > 212 KSP preconditioned resid norm 2.497907513266e-06 true >>> resid norm 2.490618016885e-04 ||r(i)||/||b|| 3.381353544340e-05 >>> > > > > > > > 213 KSP preconditioned resid norm 1.783501869502e-06 true >>> resid norm 2.490632647470e-04 ||r(i)||/||b|| 3.381373407354e-05 >>> > > > > > > > 214 KSP preconditioned resid norm 1.767420653144e-06 true >>> resid norm 2.490685569328e-04 ||r(i)||/||b|| 3.381445255992e-05 >>> > > > > > > > 215 KSP preconditioned resid norm 1.854926068272e-06 true >>> resid norm 2.490609365464e-04 ||r(i)||/||b|| 3.381341798856e-05 >>> > > > > > > > 216 KSP preconditioned resid norm 1.818308539774e-06 true >>> resid norm 2.490639142283e-04 ||r(i)||/||b|| 3.381382224948e-05 >>> > > > > > > > 217 KSP preconditioned resid norm 1.809431578070e-06 true >>> resid norm 2.490605125049e-04 ||r(i)||/||b|| 3.381336041915e-05 >>> > > > > > > > 218 KSP preconditioned resid norm 1.789862735999e-06 true >>> resid norm 2.490564024901e-04 ||r(i)||/||b|| 
3.381280242859e-05
 219 KSP preconditioned resid norm 1.769239890163e-06 true resid norm 2.490647825316e-04 ||r(i)||/||b|| 3.381394013349e-05
 220 KSP preconditioned resid norm 1.780760773109e-06 true resid norm 2.490622606663e-04 ||r(i)||/||b|| 3.381359775589e-05
 221 KSP preconditioned resid norm 5.009024913368e-07 true resid norm 2.490659101637e-04 ||r(i)||/||b|| 3.381409322492e-05
[iterations 222-341 omitted: the preconditioned residual wanders between roughly 5e-07 and 2e-06 while the true residual stays pinned at 2.4906e-04 to 2.4908e-04, i.e. ||r(i)||/||b|| ~ 3.381e-05, with no further reduction]
 342 KSP preconditioned resid norm 1.981726465132e-06 true resid norm 2.490769116884e-04 ||r(i)||/||b|| 3.381558683191e-05
 343 KSP preconditioned resid norm 1.963419167052e-06 true resid norm 2.490764009914e-04 ||r(i)||/||b|| 3.381551749783e-05
 344 KSP preconditioned resid norm 1.992082169278e-06 true resid norm 2.490806829883e-04 ||r(i)||/||b|| 3.381609883728e-05
 345 KSP preconditioned resid norm 1.981005134253e-06 true resid norm 2.490748677339e-04 ||r(i)||/||b|| 3.381530933721e-05
 346 KSP preconditioned resid norm 1.959802663114e-06 true resid norm 2.490773752317e-04 ||r(i)||/||b|| 3.381564976423e-05
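[A note on reading the monitor output above: with -ksp_monitor_true_residual, the second number on each line is the residual of the actual linear system. Here the preconditioned residual keeps moving, between roughly 5e-07 and 2e-06, while the true residual never leaves ~2.4907e-04 (relative ~3.381e-05). A persistent gap like this between the two norms usually means the solve has hit the floor set by an inconsistent right-hand side, matching the incorrect-null-space guess Barry makes later in this thread.]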
On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley <knepley at gmail.com> wrote:
On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:

>   ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
>   ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);
>
>   ierr = VecAssemblyBegin(x);
>   ierr = VecAssemblyEnd(x);

This is probably unnecessary.

>   ierr = VecAssemblyBegin(b);
>   ierr = VecAssemblyEnd(b);

This is probably unnecessary.

>   ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp);
>   ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8

Is your rhs consistent with this nullspace?

>   // KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
>   KSPSetOperators(ksp,A,A);
>
>   KSPSetType(ksp,KSPBCGS);
>
>   KSPSetComputeSingularValues(ksp, PETSC_TRUE);
> #if defined(__HYPRE__)
>   KSPGetPC(ksp, &pc);
>   PCSetType(pc, PCHYPRE);
>   PCHYPRESetType(pc,"boomeramg");

This is terribly unnecessary. You just use

  -pc_type hypre -pc_hypre_type boomeramg

or

  -pc_type gamg

> #else
>   KSPSetType(ksp,KSPBCGSL);
>   KSPBCGSLSetEll(ksp,2);
> #endif /* defined(__HYPRE__) */
>
>   KSPSetFromOptions(ksp);
>   KSPSetUp(ksp);
>
>   ierr = KSPSolve(ksp,b,x);
>
> command line

You did not provide any of what I asked for in the previous mail.

   Matt

On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley <knepley at gmail.com> wrote:
On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:

> hi,
>
> I implemented the HYPRE preconditioner for my study because, without a
> preconditioner, the PETSc solver takes thousands of iterations to
> converge on fine-grid simulations.
>
> With HYPRE, depending on the parallel partition, it takes HYPRE forever
> to do anything; from the output file, the simulation appears to hang
> with no output.
>
> Any idea what happened? Will post a snippet of code.

1) For any question about convergence, we need to see the output of

   -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason

2) Hypre has many preconditioners; which one are you talking about?

3) PETSc has some preconditioners in common with Hypre, like AMG.

  Thanks,

     Matt

--
Hao Zhang
Dept. of Applied Mathematics and Statistics,
Stony Brook University,
Stony Brook, New York, 11790

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
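Pulling Matt's inline comments together: below is a minimal sketch of the setup he is pointing at, assuming PETSc 3.8 (matching the "Petsc-3.8" comment in the snippet above). The function name SolvePressure and its arguments are illustrative, and error handling is abbreviated. The two key points are that the right-hand side must be made consistent with the attached null space before the solve, and that the solver and preconditioner are best left to the options database rather than hardcoded.

    #include <petscksp.h>

    /* Sketch: solve A x = b where A has the constant null space
       (all-Neumann Poisson); solver/PC come from the command line. */
    PetscErrorCode SolvePressure(Mat A, Vec b, Vec x)
    {
      KSP            ksp;
      MatNullSpace   nullsp;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      /* Attach the constant null space to the operator ... */
      ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,NULL,&nullsp);CHKERRQ(ierr);
      ierr = MatSetNullSpace(A,nullsp);CHKERRQ(ierr);
      /* ... and project it out of b so the system is consistent
         (this is Matt's "Is your rhs consistent with this nullspace?"). */
      ierr = MatNullSpaceRemove(nullsp,b);CHKERRQ(ierr);
      ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);

      ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
      ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
      /* No KSPSetType/PCSetType calls here: choose the method at run
         time, e.g. -ksp_type gmres -pc_type hypre -pc_hypre_type
         boomeramg, or -pc_type gamg. */
      ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
      ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
      ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }

With the solver left to the options database, a run line in the style of the earlier tests (binary name hypothetical) would be:

    mpirun -n 24 ./solver -ksp_type gmres -pc_type hypre -pc_hypre_type boomeramg \
        -ksp_view -ksp_monitor_true_residual -ksp_converged_reason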
From hbcbh1999 at gmail.com  Mon Oct 23 09:09:41 2017
From: hbcbh1999 at gmail.com (Hao Zhang)
Date: Mon, 23 Oct 2017 10:09:41 -0400
Subject: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To:
References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov>
	<64D7FC90-EBE8-409C-B300-04A118155A98@mcs.anl.gov>
	<5DDD356C-2000-4803-9F3A-02B429898A7A@mcs.anl.gov>
	<8C8ADFC3-C71F-4DE1-B21C-5BF50EBC652D@mcs.anl.gov>
Message-ID:

Yes.

On Mon, Oct 23, 2017 at 10:07 AM, Mark Adams wrote:

> On Mon, Oct 23, 2017 at 10:04 AM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
>
>> The big picture is that I'm solving the 3D incompressible Navier-Stokes
>> equations on a staggered/MAC grid with the finite difference method.
>> The particular function is the Poisson pressure solver, the Laplacian
>> Mark Adams mentioned. The simulation runs fine for medium-size mesh
>> grids. When I push to very fine grids (not DNS level), I have
>> difficulties getting meaningful physical results. All PETSc solvers
>> converge, but with huge iteration counts. That's when/how I started
>> using HYPRE.
>
> Are we just talking about the convergence of the pressure solve?

On Mon, Oct 23, 2017 at 9:01 AM, Mark Adams wrote:

> Just to be clear: 1) are you solving the Laplacian (div grad), 2) what
> type of discretization are you using, and 3) do you have stretched or
> otherwise terrible grids in some way?

On Sun, Oct 22, 2017 at 3:57 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>    One thing important to understand is that multigrid is an optimal or
> nearly optimal algorithm. This means that, when it works, the number of
> iterations remains nearly constant as you refine the mesh, regardless of
> the problem size and the number of processes. Simple preconditioners
> such as ILU, block Jacobi, or one-level additive Schwarz have iteration
> counts that increase with the problem size, and likely also with the
> number of processes. Those algorithms become essentially impractical for
> very large problems, while multigrid can remain practical (when it
> works).
>
>    Good luck
>
>    Barry

> On Oct 22, 2017, at 2:35 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
>
> Thanks for all the inputs. Before simulating the finer grid, HYPRE
> wasn't used and the simulations were just fine. I will do a few tests
> and post more information later.

On Sun, Oct 22, 2017 at 12:11 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>> On Oct 21, 2017, at 11:16 PM, Hao Zhang wrote: the reason is that when
>> I do finer-grid simulations, the matrix becomes stiffer.
>
>    Are you saying that for a finer grid, but everything else the same,
> the convergence of hypre (with the same GMRES) with the same options
> gets much worse? This normally will not happen; that is the fundamental
> beauty of multigrid methods (when they work well).
>
>    Yes, the matrix condition number increases, but multigrid doesn't
> care about that; its number of iterations should remain pretty much the
> same.
>
>    Something must be different (with this finer grid case): either the
> mesh becomes horrible, or the physics changes, or there are errors in
> the code that lead to the problem.
>    What happens if you just refine the mesh a little? Then a little
> more? Then a little more? Does the convergence rate suddenly go bad at
> some point, or does it just get worse slowly?
>
>    Barry

>> Much larger condition number. Just to give you a perspective: it takes
>> 6000 iterations to converge, and the solver does converge. I want to
>> reduce the number of iterations while keeping the convergence rate;
>> that's the main drive for so much heavy lifting here. Please advise; a
>> snippet will be provided upon request.
>>
>> Thanks again.

On Sun, Oct 22, 2017 at 12:08 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>    Oh, you change KSP but not hypre. I did not understand this.
>
>    Why not just use GMRES all the time? Why mess with BCGS if it is not
> robust? Not worth the small optimization if it breaks everything.
>
>    Barry

>>> On Oct 21, 2017, at 11:05 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
>>>
>>> This is the initial pressure-solver output regarding the use of
>>> PETSc. It failed to converge after 40000 iterations; GMRES is then
>>> used.

39987 KSP preconditioned resid norm 3.853125986269e-08 true resid norm 1.147359212216e-05 ||r(i)||/||b|| 1.557696568706e-06
[iterations 39988-39999 omitted: both norms are completely flat at this point]
40000 KSP preconditioned resid norm 3.853122732867e-08 true resid norm 1.147359257776e-05 ||r(i)||/||b|| 1.557696630559e-06
Linear solve did not converge due to DIVERGED_ITS iterations 40000
KSP Object: 24 MPI processes
  type: bcgs
  maximum iterations=40000, initial guess is zero
  tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: 24 MPI processes
  type: hypre
    HYPRE BoomerAMG preconditioning
      Cycle type V
      Maximum number of levels 25
      Maximum number of iterations PER hypre call 1
      Convergence tolerance PER hypre call 0.
      Threshold for strong coupling 0.25
      Interpolation truncation factor 0.
      Interpolation: max elements per row 0
      Number of levels of aggressive coarsening 0
      Number of paths for aggressive coarsening 1
      Maximum row sums 0.9
      Sweeps down         1
      Sweeps up           1
      Sweeps on coarse    1
      Relax down          symmetric-SOR/Jacobi
      Relax up            symmetric-SOR/Jacobi
      Relax on coarse     Gaussian-elimination
      Relax weight  (all)      1.
      Outer relax weight (all) 1.
      Using CF-relaxation
      Not using more complex smoothers.
      Measure type        local
      Coarsen type        Falgout
      Interpolation type  classical
  linear system matrix = precond matrix:
  Mat Object: A 24 MPI processes
    type: mpiaij
    rows=497664, cols=497664
    total: nonzeros=3363552, allocated nonzeros=6967296
    total number of mallocs used during MatSetValues calls =0
      has attached null space
      not using I-node (on process 0) routines

The solution diverges for p0! The residual is 3.853123e-08. Solve again using GMRES!

[-ksp_view for the GMRES retry: type gmres, restart=30 with Classical
(unmodified) Gram-Schmidt orthogonalization and no iterative refinement,
happy breakdown tolerance 1e-30; the tolerances, BoomerAMG settings, and
matrix are identical to the block above]

 0 KSP preconditioned resid norm 1.593802941804e+01 true resid norm 7.365742695119e+00 ||r(i)||/||b|| 1.000000000000e+00
 1 KSP preconditioned resid norm 6.338666661133e-01 true resid norm 2.880722209358e+00 ||r(i)||/||b|| 3.910973174867e-01
 2 KSP preconditioned resid norm 3.913420828350e-02 true resid norm 9.544089760671e-01 ||r(i)||/||b|| 1.295740315093e-01
 3 KSP preconditioned resid norm 2.928070366435e-03 true resid norm 1.874294004628e-01 ||r(i)||/||b|| 2.544609664236e-02
 4 KSP preconditioned resid norm 2.165607525823e-04 true resid norm 3.121122463949e-02 ||r(i)||/||b|| 4.237349298146e-03
 5 KSP preconditioned resid norm 1.635476480407e-05 true resid norm 3.984315313831e-03 ||r(i)||/||b|| 5.409251284967e-04
 6 KSP preconditioned resid norm 1.283358350575e-06 true resid norm 4.566583802915e-04 ||r(i)||/||b|| 6.199760149022e-05
 7 KSP preconditioned resid norm 8.479469225747e-08 true resid norm 3.824581791112e-05 ||r(i)||/||b|| 5.192391248810e-06
 8 KSP preconditioned resid norm 5.358636504683e-09 true resid norm 2.829730442033e-06 ||r(i)||/||b|| 3.841744898187e-07
 9 KSP preconditioned resid norm 3.447874504193e-10 true resid norm 1.756617036538e-07 ||r(i)||/||b|| 2.384847135242e-08
10 KSP preconditioned resid norm 2.480228743414e-11 true resid norm 1.309399823577e-08 ||r(i)||/||b|| 1.777688792258e-09
11 KSP preconditioned resid norm 1.728967759950e-12 true resid norm 9.406967045789e-10 ||r(i)||/||b|| 1.277124036931e-10
12 KSP preconditioned resid norm 1.075458632828e-13 true resid norm 5.994505136212e-11 ||r(i)||/||b|| 8.138358050689e-12
Linear solve converged due to CONVERGED_RTOL iterations 12

[the same GMRES -ksp_view block is then printed once more, unchanged]
The max singular value of A = 1.000872 in poisson_solver3d_P0_vd
The min singular value of A = 0.667688 in poisson_solver3d_P0_vd
The Cond Num of A = 1.499012 in poisson_solver3d_P0_vd
In poisson_solver3d_pressure(): num_iter = 12, rel_residual = 1.075459e-13

The max value of p0 is 0.03115845493408858
The min value of p0 is -0.07156715468428149

On Sun, Oct 22, 2017 at 12:00 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>> On Oct 21, 2017, at 10:50 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
>>
>> The incompressible NS solver algorithm calls the PETSc solver at
>> different stages of each time step. The one you were calling "good, a
>> 12-digit reduction" is the one after the initial pressure solve, where
>> HYPRE usually does not give good convergence, so the fall-back GMRES
>> solver is called afterwards.
>
>    Hmm, I don't understand. hypre should do well on a pressure solve.
> In fact, very well.
>
>> Barry, you were mentioning that I could have a wrong null space. That
>> particular solver is aimed at giving an initial pressure profile for a
>> 3D incompressible NS simulation using all-Neumann boundary conditions.
>> Could you give some insight into how to test whether I have a wrong
>> null space?
>
>    -ksp_test_null_space
>
>    But if your null space is consistently from all Neumann boundary
> conditions then it likely is not wrong.
>
>    Barry
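Barry's -ksp_test_null_space performs this check at solve time; the same test can also be done directly in code. A minimal sketch, again assuming PETSc 3.8, with the function name CheckConstantNullSpace purely illustrative:

    #include <petscmat.h>

    /* Sketch: ask PETSc whether the constant vector really is a null
       vector of A, as it should be for an all-Neumann Poisson matrix. */
    PetscErrorCode CheckConstantNullSpace(Mat A)
    {
      MatNullSpace   nullsp;
      PetscBool      isNull;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = MatNullSpaceCreate(PetscObjectComm((PetscObject)A),PETSC_TRUE,0,NULL,&nullsp);CHKERRQ(ierr);
      /* MatNullSpaceTest applies A to the claimed null vectors and
         checks that the result is (numerically) zero. */
      ierr = MatNullSpaceTest(nullsp,A,&isNull);CHKERRQ(ierr);
      if (!isNull) {
        ierr = PetscPrintf(PETSC_COMM_WORLD,"constant vector is NOT in the null space of A; check the boundary rows\n");CHKERRQ(ierr);
      }
      ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }

If this test fails, the all-Neumann discretization does not actually annihilate constants (often a boundary-row issue), and no choice of KSP or PC will rescue the solve.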
On Sat, Oct 21, 2017 at 11:41 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>    This is good. You get more than a 12-digit reduction in the true
> residual norm. This is good AMG convergence, expected when everything
> goes well.
>
>    What is different in this case from the previous case that does not
> converge reasonably?
>
>    Barry

> On Oct 21, 2017, at 9:29 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
>
> Barry, please advise on what you make of this. This is the Poisson
> solver for the all-Neumann-BC 3D case; a finite difference scheme was
> used. Thanks! I'm in learning mode.

[-ksp_view: the same bcgs + BoomerAMG configuration and the same
24-process mpiaij matrix (with attached null space) as shown above]

 0 KSP preconditioned resid norm 2.697270170623e-02 true resid norm 9.637159071207e+00 ||r(i)||/||b|| 1.000000000000e+00
 1 KSP preconditioned resid norm 4.828857674609e-04 true resid norm 6.294379664645e-01 ||r(i)||/||b|| 6.531364293291e-02
 2 KSP preconditioned resid norm 4.533649595815e-06 true resid norm 1.135508605857e-02 ||r(i)||/||b|| 1.178260727531e-03
 3 KSP preconditioned resid norm 1.131704082606e-07 true resid norm 1.196393029874e-04 ||r(i)||/||b|| 1.241437462051e-05
 4 KSP preconditioned resid norm 3.866281379569e-10 true resid norm 5.121520801846e-07 ||r(i)||/||b|| 5.314347064320e-08
 5 KSP preconditioned resid norm 1.070114785241e-11 true resid norm 1.203693733135e-08 ||r(i)||/||b|| 1.249013038221e-09
 6 KSP preconditioned resid norm 2.578780418765e-14 true resid norm 6.020297525927e-11 ||r(i)||/||b|| 6.246962908306e-12
 7 KSP preconditioned resid norm 8.691764190203e-16 true resid norm 1.866088098154e-12 ||r(i)||/||b|| 1.936346680973e-13
Linear solve converged due to CONVERGED_ATOL iterations 7

On Sat, Oct 21, 2017 at 6:53 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>> On Oct 21, 2017, at 5:47 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
>>
>> Hi, Barry: what do you mean by "absurd" for setting the tolerance to
>> 1e-14?
>    Trying to decrease the initial residual norm by a factor of 1e-14
> with an iterative method (or even a direct method) is unrealistic,
> usually unachievable, and almost never necessary. You are requiring
> ||r_n|| < 1e-14 ||r_0||, when with double precision numbers you only
> have roughly 14 decimal digits in total to compute with. Round-off
> alone will lead to differences far larger than 1e-14.
>
>    If you are using the solver in the context of a nonlinear problem
> (i.e. inside Newton's method), then 1e-6 is generally more than plenty
> to get quadratic convergence of Newton's method.
>
>    If you are solving a linear problem, then it is extremely likely
> that errors due to discretization (from the finite element method etc.)
> and the model are much, much larger than even 1e-8.
>
>    So, in summary:
>
>    1e-14 is probably unachievable.
>
>    1e-14 is almost for sure not needed.
>
>    Barry

On Sat, Oct 21, 2017 at 18:42, Barry Smith <bsmith at mcs.anl.gov> wrote:

>    Run with -ksp_view_mat binary -ksp_view_rhs binary and send the
> resulting output file, called binaryoutput, to petsc-maint at mcs.anl.gov.
>
>    Note you can also use -ksp_type gmres with hypre; there is unlikely
> to be a reason to use bcgs.
>
>    BTW: "tolerances: relative=1e-14" is absurd.
>
>    My guess is your null space is incorrect.

>> On Oct 21, 2017, at 4:34 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
>>
>> If this solver doesn't converge, I have a fall-back solution which
>> uses the GMRES solver; that setup is fine with me. I just want to know
>> whether HYPRE is a reliable solution for me, or whether I will have to
>> go without a preconditioner.
>>
>> Thanks!

On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:

> This is a serial run, still dumping output; parallel is more or less
> the same.

[-ksp_view: bcgs with the same BoomerAMG settings as above, now on 1 MPI
process; Mat type seqaij, rows=497664, cols=497664, nonzeros=3363552
(allocated 3483648), has attached null space, not using I-node routines]
 0 KSP preconditioned resid norm 1.630377897956e+01 true resid norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00
 1 KSP preconditioned resid norm 3.815819909205e-02 true resid norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01
 2 KSP preconditioned resid norm 4.312671277701e-04 true resid norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03
 3 KSP preconditioned resid norm 3.011875330569e-04 true resid norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05
 4 KSP preconditioned resid norm 3.011783821295e-04 true resid norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05
[iterations 5-13 omitted: both norms are flat, the true residual stuck at ~2.504e-04]
14 KSP preconditioned resid norm 2.481415420420e-05 true resid norm 2.491591201250e-04 ||r(i)||/||b|| 3.382674774806e-05
[iterations 15-113 omitted: the preconditioned residual drifts between about 8.4e-06 and 2.8e-05 while the true residual stays pinned between 2.4893e-04 and 2.4903e-04, i.e. ||r(i)||/||b|| ~ 3.380e-05; the monitor output continues in this stalled pattern, and the digest is truncated here]
> > > > > > 114 KSP preconditioned resid norm 1.175594685689e-05 true >>>> resid norm 2.490274328440e-04 ||r(i)||/||b|| 3.380886940415e-05 >>>> > > > > > > > 115 KSP preconditioned resid norm 1.167979454122e-05 true >>>> resid norm 2.490271161036e-04 ||r(i)||/||b|| 3.380882640232e-05 >>>> > > > > > > > 116 KSP preconditioned resid norm 1.181010893019e-05 true >>>> resid norm 2.490235420657e-04 ||r(i)||/||b|| 3.380834117795e-05 >>>> > > > > > > > 117 KSP preconditioned resid norm 1.175206638194e-05 true >>>> resid norm 2.490263165345e-04 ||r(i)||/||b|| 3.380871784992e-05 >>>> > > > > > > > 118 KSP preconditioned resid norm 1.183804125791e-05 true >>>> resid norm 2.490221353083e-04 ||r(i)||/||b|| 3.380815019145e-05 >>>> > > > > > > > 119 KSP preconditioned resid norm 1.186426973727e-05 true >>>> resid norm 2.490227115336e-04 ||r(i)||/||b|| 3.380822842189e-05 >>>> > > > > > > > 120 KSP preconditioned resid norm 1.181986776689e-05 true >>>> resid norm 2.490257884230e-04 ||r(i)||/||b|| 3.380864615159e-05 >>>> > > > > > > > 121 KSP preconditioned resid norm 1.131443277370e-05 true >>>> resid norm 2.490259110230e-04 ||r(i)||/||b|| 3.380866279620e-05 >>>> > > > > > > > 122 KSP preconditioned resid norm 1.114920075859e-05 true >>>> resid norm 2.490249829382e-04 ||r(i)||/||b|| 3.380853679603e-05 >>>> > > > > > > > 123 KSP preconditioned resid norm 1.082073321672e-05 true >>>> resid norm 2.490314084868e-04 ||r(i)||/||b|| 3.380940915187e-05 >>>> > > > > > > > 124 KSP preconditioned resid norm 3.307785860990e-06 true >>>> resid norm 2.490613501549e-04 ||r(i)||/||b|| 3.381347414155e-05 >>>> > > > > > > > 125 KSP preconditioned resid norm 3.287051720572e-06 true >>>> resid norm 2.490584648195e-04 ||r(i)||/||b|| 3.381308241794e-05 >>>> > > > > > > > 126 KSP preconditioned resid norm 3.286797046069e-06 true >>>> resid norm 2.490654396386e-04 ||r(i)||/||b|| 3.381402934473e-05 >>>> > > > > > > > 127 KSP preconditioned resid norm 3.311592899411e-06 true >>>> resid norm 2.490627973588e-04 ||r(i)||/||b|| 3.381367061922e-05 >>>> > > > > > > > 128 KSP preconditioned resid norm 3.560993694635e-06 true >>>> resid norm 2.490571732816e-04 ||r(i)||/||b|| 3.381290707406e-05 >>>> > > > > > > > 129 KSP preconditioned resid norm 3.411994617661e-06 true >>>> resid norm 2.490652122141e-04 ||r(i)||/||b|| 3.381399846875e-05 >>>> > > > > > > > 130 KSP preconditioned resid norm 3.412383310721e-06 true >>>> resid norm 2.490633454022e-04 ||r(i)||/||b|| 3.381374502359e-05 >>>> > > > > > > > 131 KSP preconditioned resid norm 3.288320044878e-06 true >>>> resid norm 2.490639470096e-04 ||r(i)||/||b|| 3.381382669999e-05 >>>> > > > > > > > 132 KSP preconditioned resid norm 3.273215756565e-06 true >>>> resid norm 2.490640390847e-04 ||r(i)||/||b|| 3.381383920043e-05 >>>> > > > > > > > 133 KSP preconditioned resid norm 3.236969051459e-06 true >>>> resid norm 2.490678216102e-04 ||r(i)||/||b|| 3.381435272985e-05 >>>> > > > > > > > 134 KSP preconditioned resid norm 3.203260913942e-06 true >>>> resid norm 2.490640965346e-04 ||r(i)||/||b|| 3.381384700005e-05 >>>> > > > > > > > 135 KSP preconditioned resid norm 3.224117152353e-06 true >>>> resid norm 2.490655026376e-04 ||r(i)||/||b|| 3.381403789770e-05 >>>> > > > > > > > 136 KSP preconditioned resid norm 3.221577997984e-06 true >>>> resid norm 2.490684737611e-04 ||r(i)||/||b|| 3.381444126823e-05 >>>> > > > > > > > 137 KSP preconditioned resid norm 3.195936222128e-06 true >>>> resid norm 2.490673982333e-04 ||r(i)||/||b|| 3.381429525066e-05 >>>> > > > > > > > 138 KSP preconditioned resid norm 
3.207528137426e-06 true >>>> resid norm 2.490641247196e-04 ||r(i)||/||b|| 3.381385082655e-05 >>>> > > > > > > > 139 KSP preconditioned resid norm 3.240134271963e-06 true >>>> resid norm 2.490615861251e-04 ||r(i)||/||b|| 3.381350617773e-05 >>>> > > > > > > > 140 KSP preconditioned resid norm 2.698833607230e-06 true >>>> resid norm 2.490638954889e-04 ||r(i)||/||b|| 3.381381970535e-05 >>>> > > > > > > > 141 KSP preconditioned resid norm 2.599151209137e-06 true >>>> resid norm 2.490657106698e-04 ||r(i)||/||b|| 3.381406614091e-05 >>>> > > > > > > > 142 KSP preconditioned resid norm 2.633939920994e-06 true >>>> resid norm 2.490707754695e-04 ||r(i)||/||b|| 3.381475375652e-05 >>>> > > > > > > > 143 KSP preconditioned resid norm 2.519609221376e-06 true >>>> resid norm 2.490639100480e-04 ||r(i)||/||b|| 3.381382168195e-05 >>>> > > > > > > > 144 KSP preconditioned resid norm 3.768526937684e-06 true >>>> resid norm 2.490654096698e-04 ||r(i)||/||b|| 3.381402527606e-05 >>>> > > > > > > > 145 KSP preconditioned resid norm 3.707841943289e-06 true >>>> resid norm 2.490630207923e-04 ||r(i)||/||b|| 3.381370095336e-05 >>>> > > > > > > > 146 KSP preconditioned resid norm 3.698827503486e-06 true >>>> resid norm 2.490646071561e-04 ||r(i)||/||b|| 3.381391632387e-05 >>>> > > > > > > > 147 KSP preconditioned resid norm 3.642747039615e-06 true >>>> resid norm 2.490610990161e-04 ||r(i)||/||b|| 3.381344004604e-05 >>>> > > > > > > > 148 KSP preconditioned resid norm 3.613100087842e-06 true >>>> resid norm 2.490617159023e-04 ||r(i)||/||b|| 3.381352379676e-05 >>>> > > > > > > > 149 KSP preconditioned resid norm 3.637646399299e-06 true >>>> resid norm 2.490648063023e-04 ||r(i)||/||b|| 3.381394336069e-05 >>>> > > > > > > > 150 KSP preconditioned resid norm 3.640235367864e-06 true >>>> resid norm 2.490648516718e-04 ||r(i)||/||b|| 3.381394952022e-05 >>>> > > > > > > > 151 KSP preconditioned resid norm 3.724708848977e-06 true >>>> resid norm 2.490622201040e-04 ||r(i)||/||b|| 3.381359224901e-05 >>>> > > > > > > > 152 KSP preconditioned resid norm 3.665185002770e-06 true >>>> resid norm 2.490664302790e-04 ||r(i)||/||b|| 3.381416383766e-05 >>>> > > > > > > > 153 KSP preconditioned resid norm 3.348992579120e-06 true >>>> resid norm 2.490655722697e-04 ||r(i)||/||b|| 3.381404735121e-05 >>>> > > > > > > > 154 KSP preconditioned resid norm 3.309431137943e-06 true >>>> resid norm 2.490727563300e-04 ||r(i)||/||b|| 3.381502268535e-05 >>>> > > > > > > > 155 KSP preconditioned resid norm 3.299031245428e-06 true >>>> resid norm 2.490688392843e-04 ||r(i)||/||b|| 3.381449089298e-05 >>>> > > > > > > > 156 KSP preconditioned resid norm 3.297127463503e-06 true >>>> resid norm 2.490642207769e-04 ||r(i)||/||b|| 3.381386386763e-05 >>>> > > > > > > > 157 KSP preconditioned resid norm 3.297370198641e-06 true >>>> resid norm 2.490666651723e-04 ||r(i)||/||b|| 3.381419572764e-05 >>>> > > > > > > > 158 KSP preconditioned resid norm 3.290873165210e-06 true >>>> resid norm 2.490679189538e-04 ||r(i)||/||b|| 3.381436594557e-05 >>>> > > > > > > > 159 KSP preconditioned resid norm 3.346705292419e-06 true >>>> resid norm 2.490617329776e-04 ||r(i)||/||b|| 3.381352611496e-05 >>>> > > > > > > > 160 KSP preconditioned resid norm 3.429583550890e-06 true >>>> resid norm 2.490675116236e-04 ||r(i)||/||b|| 3.381431064494e-05 >>>> > > > > > > > 161 KSP preconditioned resid norm 3.425238504679e-06 true >>>> resid norm 2.490648199058e-04 ||r(i)||/||b|| 3.381394520756e-05 >>>> > > > > > > > 162 KSP preconditioned resid norm 3.423484857849e-06 true >>>> resid norm 
2.490723208298e-04 ||r(i)||/||b|| 3.381496356025e-05 >>>> > > > > > > > 163 KSP preconditioned resid norm 3.383655922943e-06 true >>>> resid norm 2.490659981249e-04 ||r(i)||/||b|| 3.381410516686e-05 >>>> > > > > > > > 164 KSP preconditioned resid norm 3.477197358452e-06 true >>>> resid norm 2.490665979073e-04 ||r(i)||/||b|| 3.381418659549e-05 >>>> > > > > > > > 165 KSP preconditioned resid norm 3.454672202601e-06 true >>>> resid norm 2.490651358644e-04 ||r(i)||/||b|| 3.381398810323e-05 >>>> > > > > > > > 166 KSP preconditioned resid norm 3.399075522566e-06 true >>>> resid norm 2.490678159511e-04 ||r(i)||/||b|| 3.381435196154e-05 >>>> > > > > > > > 167 KSP preconditioned resid norm 3.305455787400e-06 true >>>> resid norm 2.490651924523e-04 ||r(i)||/||b|| 3.381399578581e-05 >>>> > > > > > > > 168 KSP preconditioned resid norm 3.368445533284e-06 true >>>> resid norm 2.490688061735e-04 ||r(i)||/||b|| 3.381448639774e-05 >>>> > > > > > > > 169 KSP preconditioned resid norm 2.981519724814e-06 true >>>> resid norm 2.490676378334e-04 ||r(i)||/||b|| 3.381432777964e-05 >>>> > > > > > > > 170 KSP preconditioned resid norm 3.034423065539e-06 true >>>> resid norm 2.490694458885e-04 ||r(i)||/||b|| 3.381457324777e-05 >>>> > > > > > > > 171 KSP preconditioned resid norm 2.885972780503e-06 true >>>> resid norm 2.490688033729e-04 ||r(i)||/||b|| 3.381448601752e-05 >>>> > > > > > > > 172 KSP preconditioned resid norm 2.892491075033e-06 true >>>> resid norm 2.490692993765e-04 ||r(i)||/||b|| 3.381455335678e-05 >>>> > > > > > > > 173 KSP preconditioned resid norm 2.921316177611e-06 true >>>> resid norm 2.490697629787e-04 ||r(i)||/||b|| 3.381461629709e-05 >>>> > > > > > > > 174 KSP preconditioned resid norm 2.999889222269e-06 true >>>> resid norm 2.490707272626e-04 ||r(i)||/||b|| 3.381474721178e-05 >>>> > > > > > > > 175 KSP preconditioned resid norm 2.975590207575e-06 true >>>> resid norm 2.490685439925e-04 ||r(i)||/||b|| 3.381445080310e-05 >>>> > > > > > > > 176 KSP preconditioned resid norm 2.983065843597e-06 true >>>> resid norm 2.490701883671e-04 ||r(i)||/||b|| 3.381467404937e-05 >>>> > > > > > > > 177 KSP preconditioned resid norm 2.965959610245e-06 true >>>> resid norm 2.490711538630e-04 ||r(i)||/||b|| 3.381480512861e-05 >>>> > > > > > > > 178 KSP preconditioned resid norm 3.005389788827e-06 true >>>> resid norm 2.490751808095e-04 ||r(i)||/||b|| 3.381535184150e-05 >>>> > > > > > > > 179 KSP preconditioned resid norm 2.956581668772e-06 true >>>> resid norm 2.490653125636e-04 ||r(i)||/||b|| 3.381401209257e-05 >>>> > > > > > > > 180 KSP preconditioned resid norm 2.937498883661e-06 true >>>> resid norm 2.490666056653e-04 ||r(i)||/||b|| 3.381418764874e-05 >>>> > > > > > > > 181 KSP preconditioned resid norm 2.913227475431e-06 true >>>> resid norm 2.490682436979e-04 ||r(i)||/||b|| 3.381441003402e-05 >>>> > > > > > > > 182 KSP preconditioned resid norm 3.048172862254e-06 true >>>> resid norm 2.490719669872e-04 ||r(i)||/||b|| 3.381491552130e-05 >>>> > > > > > > > 183 KSP preconditioned resid norm 3.023868104933e-06 true >>>> resid norm 2.490648745555e-04 ||r(i)||/||b|| 3.381395262699e-05 >>>> > > > > > > > 184 KSP preconditioned resid norm 2.985947506400e-06 true >>>> resid norm 2.490638818852e-04 ||r(i)||/||b|| 3.381381785846e-05 >>>> > > > > > > > 185 KSP preconditioned resid norm 2.840032055776e-06 true >>>> resid norm 2.490701112392e-04 ||r(i)||/||b|| 3.381466357820e-05 >>>> > > > > > > > 186 KSP preconditioned resid norm 2.229279683815e-06 true >>>> resid norm 2.490609220680e-04 ||r(i)||/||b|| 
3.381341602292e-05 >>>> > > > > > > > 187 KSP preconditioned resid norm 2.441513276379e-06 true >>>> resid norm 2.490674056899e-04 ||r(i)||/||b|| 3.381429626300e-05 >>>> > > > > > > > 188 KSP preconditioned resid norm 2.467046864016e-06 true >>>> resid norm 2.490691622632e-04 ||r(i)||/||b|| 3.381453474178e-05 >>>> > > > > > > > 189 KSP preconditioned resid norm 2.482124586361e-06 true >>>> resid norm 2.490664992339e-04 ||r(i)||/||b|| 3.381417319923e-05 >>>> > > > > > > > 190 KSP preconditioned resid norm 2.470564926502e-06 true >>>> resid norm 2.490617019713e-04 ||r(i)||/||b|| 3.381352190543e-05 >>>> > > > > > > > 191 KSP preconditioned resid norm 2.457947086578e-06 true >>>> resid norm 2.490628644250e-04 ||r(i)||/||b|| 3.381367972437e-05 >>>> > > > > > > > 192 KSP preconditioned resid norm 2.469444741724e-06 true >>>> resid norm 2.490639416335e-04 ||r(i)||/||b|| 3.381382597011e-05 >>>> > > > > > > > 193 KSP preconditioned resid norm 2.469951525219e-06 true >>>> resid norm 2.490599769764e-04 ||r(i)||/||b|| 3.381328771385e-05 >>>> > > > > > > > 194 KSP preconditioned resid norm 2.467486786643e-06 true >>>> resid norm 2.490630178622e-04 ||r(i)||/||b|| 3.381370055556e-05 >>>> > > > > > > > 195 KSP preconditioned resid norm 2.409684391404e-06 true >>>> resid norm 2.490640302606e-04 ||r(i)||/||b|| 3.381383800245e-05 >>>> > > > > > > > 196 KSP preconditioned resid norm 2.456046691135e-06 true >>>> resid norm 2.490637730235e-04 ||r(i)||/||b|| 3.381380307900e-05 >>>> > > > > > > > 197 KSP preconditioned resid norm 2.300015653805e-06 true >>>> resid norm 2.490615406913e-04 ||r(i)||/||b|| 3.381350000947e-05 >>>> > > > > > > > 198 KSP preconditioned resid norm 2.238328275301e-06 true >>>> resid norm 2.490647641246e-04 ||r(i)||/||b|| 3.381393763449e-05 >>>> > > > > > > > 199 KSP preconditioned resid norm 2.317293820319e-06 true >>>> resid norm 2.490641611282e-04 ||r(i)||/||b|| 3.381385576951e-05 >>>> > > > > > > > 200 KSP preconditioned resid norm 2.359590971314e-06 true >>>> resid norm 2.490685242974e-04 ||r(i)||/||b|| 3.381444812922e-05 >>>> > > > > > > > 201 KSP preconditioned resid norm 2.311199691596e-06 true >>>> resid norm 2.490656791753e-04 ||r(i)||/||b|| 3.381406186510e-05 >>>> > > > > > > > 202 KSP preconditioned resid norm 2.328772904196e-06 true >>>> resid norm 2.490651045523e-04 ||r(i)||/||b|| 3.381398385220e-05 >>>> > > > > > > > 203 KSP preconditioned resid norm 2.332731604717e-06 true >>>> resid norm 2.490649960574e-04 ||r(i)||/||b|| 3.381396912253e-05 >>>> > > > > > > > 204 KSP preconditioned resid norm 2.357629383490e-06 true >>>> resid norm 2.490686317727e-04 ||r(i)||/||b|| 3.381446272046e-05 >>>> > > > > > > > 205 KSP preconditioned resid norm 2.374856180299e-06 true >>>> resid norm 2.490645897176e-04 ||r(i)||/||b|| 3.381391395637e-05 >>>> > > > > > > > 206 KSP preconditioned resid norm 2.340395514404e-06 true >>>> resid norm 2.490618341127e-04 ||r(i)||/||b|| 3.381353984542e-05 >>>> > > > > > > > 207 KSP preconditioned resid norm 2.314963680954e-06 true >>>> resid norm 2.490676153984e-04 ||r(i)||/||b|| 3.381432473379e-05 >>>> > > > > > > > 208 KSP preconditioned resid norm 2.448070953106e-06 true >>>> resid norm 2.490644606776e-04 ||r(i)||/||b|| 3.381389643743e-05 >>>> > > > > > > > 209 KSP preconditioned resid norm 2.428805110632e-06 true >>>> resid norm 2.490635817597e-04 ||r(i)||/||b|| 3.381377711234e-05 >>>> > > > > > > > 210 KSP preconditioned resid norm 2.537929937808e-06 true >>>> resid norm 2.490680589404e-04 ||r(i)||/||b|| 3.381438495066e-05 >>>> > > > > > > > 211 KSP 
preconditioned resid norm 2.515909029682e-06 true >>>> resid norm 2.490687803038e-04 ||r(i)||/||b|| 3.381448288557e-05 >>>> > > > > > > > 212 KSP preconditioned resid norm 2.497907513266e-06 true >>>> resid norm 2.490618016885e-04 ||r(i)||/||b|| 3.381353544340e-05 >>>> > > > > > > > 213 KSP preconditioned resid norm 1.783501869502e-06 true >>>> resid norm 2.490632647470e-04 ||r(i)||/||b|| 3.381373407354e-05 >>>> > > > > > > > 214 KSP preconditioned resid norm 1.767420653144e-06 true >>>> resid norm 2.490685569328e-04 ||r(i)||/||b|| 3.381445255992e-05 >>>> > > > > > > > 215 KSP preconditioned resid norm 1.854926068272e-06 true >>>> resid norm 2.490609365464e-04 ||r(i)||/||b|| 3.381341798856e-05 >>>> > > > > > > > 216 KSP preconditioned resid norm 1.818308539774e-06 true >>>> resid norm 2.490639142283e-04 ||r(i)||/||b|| 3.381382224948e-05 >>>> > > > > > > > 217 KSP preconditioned resid norm 1.809431578070e-06 true >>>> resid norm 2.490605125049e-04 ||r(i)||/||b|| 3.381336041915e-05 >>>> > > > > > > > 218 KSP preconditioned resid norm 1.789862735999e-06 true >>>> resid norm 2.490564024901e-04 ||r(i)||/||b|| 3.381280242859e-05 >>>> > > > > > > > 219 KSP preconditioned resid norm 1.769239890163e-06 true >>>> resid norm 2.490647825316e-04 ||r(i)||/||b|| 3.381394013349e-05 >>>> > > > > > > > 220 KSP preconditioned resid norm 1.780760773109e-06 true >>>> resid norm 2.490622606663e-04 ||r(i)||/||b|| 3.381359775589e-05 >>>> > > > > > > > 221 KSP preconditioned resid norm 5.009024913368e-07 true >>>> resid norm 2.490659101637e-04 ||r(i)||/||b|| 3.381409322492e-05 >>>> > > > > > > > 222 KSP preconditioned resid norm 4.974450322799e-07 true >>>> resid norm 2.490714287402e-04 ||r(i)||/||b|| 3.381484244693e-05 >>>> > > > > > > > 223 KSP preconditioned resid norm 4.938819481519e-07 true >>>> resid norm 2.490665661715e-04 ||r(i)||/||b|| 3.381418228693e-05 >>>> > > > > > > > 224 KSP preconditioned resid norm 4.973231831266e-07 true >>>> resid norm 2.490725000995e-04 ||r(i)||/||b|| 3.381498789855e-05 >>>> > > > > > > > 225 KSP preconditioned resid norm 5.086864036771e-07 true >>>> resid norm 2.490664132954e-04 ||r(i)||/||b|| 3.381416153192e-05 >>>> > > > > > > > 226 KSP preconditioned resid norm 5.046954570561e-07 true >>>> resid norm 2.490698772594e-04 ||r(i)||/||b|| 3.381463181226e-05 >>>> > > > > > > > 227 KSP preconditioned resid norm 5.086852920874e-07 true >>>> resid norm 2.490703544723e-04 ||r(i)||/||b|| 3.381469660041e-05 >>>> > > > > > > > 228 KSP preconditioned resid norm 5.182381756169e-07 true >>>> resid norm 2.490665200032e-04 ||r(i)||/||b|| 3.381417601896e-05 >>>> > > > > > > > 229 KSP preconditioned resid norm 5.261455182896e-07 true >>>> resid norm 2.490697169472e-04 ||r(i)||/||b|| 3.381461004770e-05 >>>> > > > > > > > 230 KSP preconditioned resid norm 5.265262522400e-07 true >>>> resid norm 2.490726890541e-04 ||r(i)||/||b|| 3.381501355172e-05 >>>> > > > > > > > 231 KSP preconditioned resid norm 5.220652263946e-07 true >>>> resid norm 2.490689325236e-04 ||r(i)||/||b|| 3.381450355149e-05 >>>> > > > > > > > 232 KSP preconditioned resid norm 5.256466259888e-07 true >>>> resid norm 2.490694033989e-04 ||r(i)||/||b|| 3.381456747923e-05 >>>> > > > > > > > 233 KSP preconditioned resid norm 5.443022648374e-07 true >>>> resid norm 2.490650183144e-04 ||r(i)||/||b|| 3.381397214423e-05 >>>> > > > > > > > 234 KSP preconditioned resid norm 5.562619006436e-07 true >>>> resid norm 2.490764576883e-04 ||r(i)||/||b|| 3.381552519520e-05 >>>> > > > > > > > 235 KSP preconditioned resid norm 5.998148629545e-07 true 
>>>> resid norm 2.490714032716e-04 ||r(i)||/||b|| 3.381483898922e-05 >>>> > > > > > > > 236 KSP preconditioned resid norm 6.498977322955e-07 true >>>> resid norm 2.490650270144e-04 ||r(i)||/||b|| 3.381397332537e-05 >>>> > > > > > > > 237 KSP preconditioned resid norm 6.503686003429e-07 true >>>> resid norm 2.490706976108e-04 ||r(i)||/||b|| 3.381474318615e-05 >>>> > > > > > > > 238 KSP preconditioned resid norm 6.566719023119e-07 true >>>> resid norm 2.490664107559e-04 ||r(i)||/||b|| 3.381416118714e-05 >>>> > > > > > > > 239 KSP preconditioned resid norm 6.549737473208e-07 true >>>> resid norm 2.490721547909e-04 ||r(i)||/||b|| 3.381494101821e-05 >>>> > > > > > > > 240 KSP preconditioned resid norm 6.616898981418e-07 true >>>> resid norm 2.490679659838e-04 ||r(i)||/||b|| 3.381437233053e-05 >>>> > > > > > > > 241 KSP preconditioned resid norm 6.829917691021e-07 true >>>> resid norm 2.490728328614e-04 ||r(i)||/||b|| 3.381503307553e-05 >>>> > > > > > > > 242 KSP preconditioned resid norm 7.030239869389e-07 true >>>> resid norm 2.490706345115e-04 ||r(i)||/||b|| 3.381473461955e-05 >>>> > > > > > > > 243 KSP preconditioned resid norm 7.018435683340e-07 true >>>> resid norm 2.490650978460e-04 ||r(i)||/||b|| 3.381398294172e-05 >>>> > > > > > > > 244 KSP preconditioned resid norm 7.058047080376e-07 true >>>> resid norm 2.490685975642e-04 ||r(i)||/||b|| 3.381445807618e-05 >>>> > > > > > > > 245 KSP preconditioned resid norm 6.896300385099e-07 true >>>> resid norm 2.490708566380e-04 ||r(i)||/||b|| 3.381476477625e-05 >>>> > > > > > > > 246 KSP preconditioned resid norm 7.093960074437e-07 true >>>> resid norm 2.490667427871e-04 ||r(i)||/||b|| 3.381420626490e-05 >>>> > > > > > > > 247 KSP preconditioned resid norm 7.817121711853e-07 true >>>> resid norm 2.490692299030e-04 ||r(i)||/||b|| 3.381454392480e-05 >>>> > > > > > > > 248 KSP preconditioned resid norm 7.976109778309e-07 true >>>> resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 >>>> > > > > > > > 249 KSP preconditioned resid norm 7.855322750445e-07 true >>>> resid norm 2.490720861966e-04 ||r(i)||/||b|| 3.381493170560e-05 >>>> > > > > > > > 250 KSP preconditioned resid norm 7.778531114042e-07 true >>>> resid norm 2.490673235034e-04 ||r(i)||/||b|| 3.381428510506e-05 >>>> > > > > > > > 251 KSP preconditioned resid norm 7.848682182070e-07 true >>>> resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 >>>> > > > > > > > 252 KSP preconditioned resid norm 7.967291867330e-07 true >>>> resid norm 2.490724820229e-04 ||r(i)||/||b|| 3.381498544442e-05 >>>> > > > > > > > 253 KSP preconditioned resid norm 7.865012959525e-07 true >>>> resid norm 2.490666662028e-04 ||r(i)||/||b|| 3.381419586754e-05 >>>> > > > > > > > 254 KSP preconditioned resid norm 7.656025385804e-07 true >>>> resid norm 2.490686283214e-04 ||r(i)||/||b|| 3.381446225190e-05 >>>> > > > > > > > 255 KSP preconditioned resid norm 7.757018653468e-07 true >>>> resid norm 2.490655983763e-04 ||r(i)||/||b|| 3.381405089553e-05 >>>> > > > > > > > 256 KSP preconditioned resid norm 6.686490372981e-07 true >>>> resid norm 2.490715698964e-04 ||r(i)||/||b|| 3.381486161081e-05 >>>> > > > > > > > 257 KSP preconditioned resid norm 6.596005109428e-07 true >>>> resid norm 2.490666403003e-04 ||r(i)||/||b|| 3.381419235092e-05 >>>> > > > > > > > 258 KSP preconditioned resid norm 6.681742296333e-07 true >>>> resid norm 2.490683725835e-04 ||r(i)||/||b|| 3.381442753198e-05 >>>> > > > > > > > 259 KSP preconditioned resid norm 1.089245482033e-06 true >>>> resid norm 2.490688086568e-04 ||r(i)||/||b|| 
3.381448673488e-05 >>>> > > > > > > > 260 KSP preconditioned resid norm 1.099844873189e-06 true >>>> resid norm 2.490690703265e-04 ||r(i)||/||b|| 3.381452226011e-05 >>>> > > > > > > > 261 KSP preconditioned resid norm 1.112925540869e-06 true >>>> resid norm 2.490664481058e-04 ||r(i)||/||b|| 3.381416625790e-05 >>>> > > > > > > > 262 KSP preconditioned resid norm 1.113056910480e-06 true >>>> resid norm 2.490658753273e-04 ||r(i)||/||b|| 3.381408849541e-05 >>>> > > > > > > > 263 KSP preconditioned resid norm 1.104801535149e-06 true >>>> resid norm 2.490736510776e-04 ||r(i)||/||b|| 3.381514415953e-05 >>>> > > > > > > > 264 KSP preconditioned resid norm 1.158709147873e-06 true >>>> resid norm 2.490607531152e-04 ||r(i)||/||b|| 3.381339308528e-05 >>>> > > > > > > > 265 KSP preconditioned resid norm 1.178985740182e-06 true >>>> resid norm 2.490727895619e-04 ||r(i)||/||b|| 3.381502719703e-05 >>>> > > > > > > > 266 KSP preconditioned resid norm 1.165130533478e-06 true >>>> resid norm 2.490639076693e-04 ||r(i)||/||b|| 3.381382135901e-05 >>>> > > > > > > > 267 KSP preconditioned resid norm 1.181364114499e-06 true >>>> resid norm 2.490667871436e-04 ||r(i)||/||b|| 3.381421228690e-05 >>>> > > > > > > > 268 KSP preconditioned resid norm 1.170295348543e-06 true >>>> resid norm 2.490662613306e-04 ||r(i)||/||b|| 3.381414090063e-05 >>>> > > > > > > > 269 KSP preconditioned resid norm 1.213243016230e-06 true >>>> resid norm 2.490666173719e-04 ||r(i)||/||b|| 3.381418923808e-05 >>>> > > > > > > > 270 KSP preconditioned resid norm 1.239691953997e-06 true >>>> resid norm 2.490678323197e-04 ||r(i)||/||b|| 3.381435418381e-05 >>>> > > > > > > > 271 KSP preconditioned resid norm 1.219891740100e-06 true >>>> resid norm 2.490625009256e-04 ||r(i)||/||b|| 3.381363037437e-05 >>>> > > > > > > > 272 KSP preconditioned resid norm 1.231321334346e-06 true >>>> resid norm 2.490659733696e-04 ||r(i)||/||b|| 3.381410180599e-05 >>>> > > > > > > > 273 KSP preconditioned resid norm 1.208183234158e-06 true >>>> resid norm 2.490685987255e-04 ||r(i)||/||b|| 3.381445823385e-05 >>>> > > > > > > > 274 KSP preconditioned resid norm 1.211768545589e-06 true >>>> resid norm 2.490671548953e-04 ||r(i)||/||b|| 3.381426221421e-05 >>>> > > > > > > > 275 KSP preconditioned resid norm 1.209433459842e-06 true >>>> resid norm 2.490669016096e-04 ||r(i)||/||b|| 3.381422782722e-05 >>>> > > > > > > > 276 KSP preconditioned resid norm 1.223729184405e-06 true >>>> resid norm 2.490658128014e-04 ||r(i)||/||b|| 3.381408000666e-05 >>>> > > > > > > > 277 KSP preconditioned resid norm 1.243915201868e-06 true >>>> resid norm 2.490693375756e-04 ||r(i)||/||b|| 3.381455854282e-05 >>>> > > > > > > > 278 KSP preconditioned resid norm 1.231994655529e-06 true >>>> resid norm 2.490682988311e-04 ||r(i)||/||b|| 3.381441751910e-05 >>>> > > > > > > > 279 KSP preconditioned resid norm 1.227930683777e-06 true >>>> resid norm 2.490667825866e-04 ||r(i)||/||b|| 3.381421166823e-05 >>>> > > > > > > > 280 KSP preconditioned resid norm 1.193458846469e-06 true >>>> resid norm 2.490687366117e-04 ||r(i)||/||b|| 3.381447695378e-05 >>>> > > > > > > > 281 KSP preconditioned resid norm 1.217089059805e-06 true >>>> resid norm 2.490674797371e-04 ||r(i)||/||b|| 3.381430631591e-05 >>>> > > > > > > > 282 KSP preconditioned resid norm 1.249318287709e-06 true >>>> resid norm 2.490662866951e-04 ||r(i)||/||b|| 3.381414434420e-05 >>>> > > > > > > > 283 KSP preconditioned resid norm 1.183320029547e-06 true >>>> resid norm 2.490645783630e-04 ||r(i)||/||b|| 3.381391241482e-05 >>>> > > > > > > > 284 KSP 
preconditioned resid norm 1.174730603102e-06 true >>>> resid norm 2.490686881647e-04 ||r(i)||/||b|| 3.381447037643e-05 >>>> > > > > > > > 285 KSP preconditioned resid norm 1.175838261923e-06 true >>>> resid norm 2.490665969300e-04 ||r(i)||/||b|| 3.381418646281e-05 >>>> > > > > > > > 286 KSP preconditioned resid norm 1.188946188368e-06 true >>>> resid norm 2.490661974622e-04 ||r(i)||/||b|| 3.381413222961e-05 >>>> > > > > > > > 287 KSP preconditioned resid norm 1.177848565707e-06 true >>>> resid norm 2.490660236206e-04 ||r(i)||/||b|| 3.381410862824e-05 >>>> > > > > > > > 288 KSP preconditioned resid norm 1.200075508281e-06 true >>>> resid norm 2.490645353536e-04 ||r(i)||/||b|| 3.381390657571e-05 >>>> > > > > > > > 289 KSP preconditioned resid norm 1.184589570618e-06 true >>>> resid norm 2.490664920355e-04 ||r(i)||/||b|| 3.381417222195e-05 >>>> > > > > > > > 290 KSP preconditioned resid norm 1.221114703873e-06 true >>>> resid norm 2.490670597538e-04 ||r(i)||/||b|| 3.381424929746e-05 >>>> > > > > > > > 291 KSP preconditioned resid norm 1.249479658256e-06 true >>>> resid norm 2.490641582876e-04 ||r(i)||/||b|| 3.381385538385e-05 >>>> > > > > > > > 292 KSP preconditioned resid norm 1.245768496850e-06 true >>>> resid norm 2.490704480588e-04 ||r(i)||/||b|| 3.381470930606e-05 >>>> > > > > > > > 293 KSP preconditioned resid norm 1.243742607953e-06 true >>>> resid norm 2.490649690604e-04 ||r(i)||/||b|| 3.381396545733e-05 >>>> > > > > > > > 294 KSP preconditioned resid norm 1.342758483339e-06 true >>>> resid norm 2.490676207432e-04 ||r(i)||/||b|| 3.381432545942e-05 >>>> > > > > > > > 295 KSP preconditioned resid norm 1.353816099600e-06 true >>>> resid norm 2.490695263153e-04 ||r(i)||/||b|| 3.381458416681e-05 >>>> > > > > > > > 296 KSP preconditioned resid norm 1.343886763293e-06 true >>>> resid norm 2.490673674307e-04 ||r(i)||/||b|| 3.381429106879e-05 >>>> > > > > > > > 297 KSP preconditioned resid norm 1.355511022815e-06 true >>>> resid norm 2.490686565995e-04 ||r(i)||/||b|| 3.381446609103e-05 >>>> > > > > > > > 298 KSP preconditioned resid norm 1.347247627243e-06 true >>>> resid norm 2.490696287707e-04 ||r(i)||/||b|| 3.381459807652e-05 >>>> > > > > > > > 299 KSP preconditioned resid norm 1.414742595618e-06 true >>>> resid norm 2.490749815091e-04 ||r(i)||/||b|| 3.381532478374e-05 >>>> > > > > > > > 300 KSP preconditioned resid norm 1.418560683189e-06 true >>>> resid norm 2.490721501153e-04 ||r(i)||/||b|| 3.381494038343e-05 >>>> > > > > > > > 301 KSP preconditioned resid norm 1.416276404923e-06 true >>>> resid norm 2.490689576447e-04 ||r(i)||/||b|| 3.381450696203e-05 >>>> > > > > > > > 302 KSP preconditioned resid norm 1.431448272112e-06 true >>>> resid norm 2.490688812701e-04 ||r(i)||/||b|| 3.381449659312e-05 >>>> > > > > > > > 303 KSP preconditioned resid norm 1.446154958969e-06 true >>>> resid norm 2.490727536322e-04 ||r(i)||/||b|| 3.381502231909e-05 >>>> > > > > > > > 304 KSP preconditioned resid norm 1.468860617921e-06 true >>>> resid norm 2.490692363788e-04 ||r(i)||/||b|| 3.381454480397e-05 >>>> > > > > > > > 305 KSP preconditioned resid norm 1.627595214971e-06 true >>>> resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 >>>> > > > > > > > 306 KSP preconditioned resid norm 1.614384672893e-06 true >>>> resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 >>>> > > > > > > > 307 KSP preconditioned resid norm 1.605568020532e-06 true >>>> resid norm 2.490699757693e-04 ||r(i)||/||b|| 3.381464518632e-05 >>>> > > > > > > > 308 KSP preconditioned resid norm 1.617069685075e-06 true 
>>>> resid norm 2.490649282923e-04 ||r(i)||/||b|| 3.381395992249e-05 >>>> > > > > > > > 309 KSP preconditioned resid norm 1.654297792738e-06 true >>>> resid norm 2.490644766626e-04 ||r(i)||/||b|| 3.381389860760e-05 >>>> > > > > > > > 310 KSP preconditioned resid norm 1.587528143215e-06 true >>>> resid norm 2.490696752096e-04 ||r(i)||/||b|| 3.381460438124e-05 >>>> > > > > > > > 311 KSP preconditioned resid norm 1.662782022388e-06 true >>>> resid norm 2.490699317737e-04 ||r(i)||/||b|| 3.381463921332e-05 >>>> > > > > > > > 312 KSP preconditioned resid norm 1.618211471748e-06 true >>>> resid norm 2.490735831308e-04 ||r(i)||/||b|| 3.381513493483e-05 >>>> > > > > > > > 313 KSP preconditioned resid norm 1.609074961921e-06 true >>>> resid norm 2.490679566436e-04 ||r(i)||/||b|| 3.381437106247e-05 >>>> > > > > > > > 314 KSP preconditioned resid norm 1.548068942878e-06 true >>>> resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 >>>> > > > > > > > 315 KSP preconditioned resid norm 1.526718322150e-06 true >>>> resid norm 2.490619832967e-04 ||r(i)||/||b|| 3.381356009919e-05 >>>> > > > > > > > 316 KSP preconditioned resid norm 1.553150959105e-06 true >>>> resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 >>>> > > > > > > > 317 KSP preconditioned resid norm 1.615015320906e-06 true >>>> resid norm 2.490672348079e-04 ||r(i)||/||b|| 3.381427306343e-05 >>>> > > > > > > > 318 KSP preconditioned resid norm 1.602904469797e-06 true >>>> resid norm 2.490696731006e-04 ||r(i)||/||b|| 3.381460409491e-05 >>>> > > > > > > > 319 KSP preconditioned resid norm 1.538140323073e-06 true >>>> resid norm 2.490722982494e-04 ||r(i)||/||b|| 3.381496049466e-05 >>>> > > > > > > > 320 KSP preconditioned resid norm 1.534779679430e-06 true >>>> resid norm 2.490778499789e-04 ||r(i)||/||b|| 3.381571421763e-05 >>>> > > > > > > > 321 KSP preconditioned resid norm 1.547155843355e-06 true >>>> resid norm 2.490767612985e-04 ||r(i)||/||b|| 3.381556641442e-05 >>>> > > > > > > > 322 KSP preconditioned resid norm 1.422137008870e-06 true >>>> resid norm 2.490737676309e-04 ||r(i)||/||b|| 3.381515998323e-05 >>>> > > > > > > > 323 KSP preconditioned resid norm 1.403072558954e-06 true >>>> resid norm 2.490741361870e-04 ||r(i)||/||b|| 3.381521001975e-05 >>>> > > > > > > > 324 KSP preconditioned resid norm 1.373070436118e-06 true >>>> resid norm 2.490742214990e-04 ||r(i)||/||b|| 3.381522160202e-05 >>>> > > > > > > > 325 KSP preconditioned resid norm 1.359547585233e-06 true >>>> resid norm 2.490792987570e-04 ||r(i)||/||b|| 3.381591090902e-05 >>>> > > > > > > > 326 KSP preconditioned resid norm 1.370351913612e-06 true >>>> resid norm 2.490727161158e-04 ||r(i)||/||b|| 3.381501722573e-05 >>>> > > > > > > > 327 KSP preconditioned resid norm 1.365238666187e-06 true >>>> resid norm 2.490716949642e-04 ||r(i)||/||b|| 3.381487859046e-05 >>>> > > > > > > > 328 KSP preconditioned resid norm 1.369073373042e-06 true >>>> resid norm 2.490807360288e-04 ||r(i)||/||b|| 3.381610603826e-05 >>>> > > > > > > > 329 KSP preconditioned resid norm 1.426698981572e-06 true >>>> resid norm 2.490791479521e-04 ||r(i)||/||b|| 3.381589043520e-05 >>>> > > > > > > > 330 KSP preconditioned resid norm 1.445542403570e-06 true >>>> resid norm 2.490775981409e-04 ||r(i)||/||b|| 3.381568002720e-05 >>>> > > > > > > > 331 KSP preconditioned resid norm 1.464506963984e-06 true >>>> resid norm 2.490740562430e-04 ||r(i)||/||b|| 3.381519916626e-05 >>>> > > > > > > > 332 KSP preconditioned resid norm 1.461462964401e-06 true >>>> resid norm 2.490768016856e-04 ||r(i)||/||b|| 
3.381557189753e-05 >>>> > > > > > > > 333 KSP preconditioned resid norm 1.476680847971e-06 true >>>> resid norm 2.490744321516e-04 ||r(i)||/||b|| 3.381525020097e-05 >>>> > > > > > > > 334 KSP preconditioned resid norm 1.459640372198e-06 true >>>> resid norm 2.490788817993e-04 ||r(i)||/||b|| 3.381585430132e-05 >>>> > > > > > > > 335 KSP preconditioned resid norm 1.790770882365e-06 true >>>> resid norm 2.490771711471e-04 ||r(i)||/||b|| 3.381562205697e-05 >>>> > > > > > > > 336 KSP preconditioned resid norm 1.803770155018e-06 true >>>> resid norm 2.490768953858e-04 ||r(i)||/||b|| 3.381558461860e-05 >>>> > > > > > > > 337 KSP preconditioned resid norm 1.787821255995e-06 true >>>> resid norm 2.490767985676e-04 ||r(i)||/||b|| 3.381557147421e-05 >>>> > > > > > > > 338 KSP preconditioned resid norm 1.749912220831e-06 true >>>> resid norm 2.490760198704e-04 ||r(i)||/||b|| 3.381546575545e-05 >>>> > > > > > > > 339 KSP preconditioned resid norm 1.802915839010e-06 true >>>> resid norm 2.490815556273e-04 ||r(i)||/||b|| 3.381621730993e-05 >>>> > > > > > > > 340 KSP preconditioned resid norm 1.800777670709e-06 true >>>> resid norm 2.490823909286e-04 ||r(i)||/||b|| 3.381633071347e-05 >>>> > > > > > > > 341 KSP preconditioned resid norm 1.962516327690e-06 true >>>> resid norm 2.490773477410e-04 ||r(i)||/||b|| 3.381564603199e-05 >>>> > > > > > > > 342 KSP preconditioned resid norm 1.981726465132e-06 true >>>> resid norm 2.490769116884e-04 ||r(i)||/||b|| 3.381558683191e-05 >>>> > > > > > > > 343 KSP preconditioned resid norm 1.963419167052e-06 true >>>> resid norm 2.490764009914e-04 ||r(i)||/||b|| 3.381551749783e-05 >>>> > > > > > > > 344 KSP preconditioned resid norm 1.992082169278e-06 true >>>> resid norm 2.490806829883e-04 ||r(i)||/||b|| 3.381609883728e-05 >>>> > > > > > > > 345 KSP preconditioned resid norm 1.981005134253e-06 true >>>> resid norm 2.490748677339e-04 ||r(i)||/||b|| 3.381530933721e-05 >>>> > > > > > > > 346 KSP preconditioned resid norm 1.959802663114e-06 true >>>> resid norm 2.490773752317e-04 ||r(i)||/||b|| 3.381564976423e-05 >>>> > > > > > > > >>>> > > > > > > > On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley < >>>> knepley at gmail.com> wrote: >>>> > > > > > > > On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang < >>>> hbcbh1999 at gmail.com> wrote: >>>> > > > > > > > ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY); >>>> > > > > > > > ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY); >>>> > > > > > > > >>>> > > > > > > > ierr = VecAssemblyBegin(x); >>>> > > > > > > > ierr = VecAssemblyEnd(x); >>>> > > > > > > > This is probably unnecessary >>>> > > > > > > > >>>> > > > > > > > ierr = VecAssemblyBegin(b); >>>> > > > > > > > ierr = VecAssemblyEnd(b); >>>> > > > > > > > This is probably unnecessary >>>> > > > > > > > >>>> > > > > > > > >>>> > > > > > > > ierr = MatNullSpaceCreate(PETSC_COMM_ >>>> WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp); >>>> > > > > > > > ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8 >>>> > > > > > > > Is your rhs consistent with this nullspace? >>>> > > > > > > > >>>> > > > > > > > // KSPSetOperators(ksp,A,A,DIFFER >>>> ENT_NONZERO_PATTERN); >>>> > > > > > > > KSPSetOperators(ksp,A,A); >>>> > > > > > > > >>>> > > > > > > > KSPSetType(ksp,KSPBCGS); >>>> > > > > > > > >>>> > > > > > > > KSPSetComputeSingularValues(ksp, PETSC_TRUE); >>>> > > > > > > > #if defined(__HYPRE__) >>>> > > > > > > > KSPGetPC(ksp, &pc); >>>> > > > > > > > PCSetType(pc, PCHYPRE); >>>> > > > > > > > PCHYPRESetType(pc,"boomeramg"); >>>> > > > > > > > This is terribly unnecessary. 
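For illustration only (not code from this thread): a minimal sketch of the options-driven setup Matt is describing, with the right-hand side made consistent with the constant null space. It assumes A, b, and x are already assembled as in the snippet above and uses PETSc 3.8-era APIs:

    KSP          ksp;
    MatNullSpace nullsp;

    ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,NULL,&nullsp);CHKERRQ(ierr);
    ierr = MatSetNullSpace(A,nullsp);CHKERRQ(ierr);
    ierr = MatNullSpaceRemove(nullsp,b);CHKERRQ(ierr);    /* project b into range(A) so the system is consistent */
    ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);

    ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);          /* no hard-coded KSP/PC types in the source */
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);

The solver and preconditioner are then chosen at run time, e.g. -ksp_type gmres -pc_type hypre -pc_hypre_type boomeramg, and can be swapped for -pc_type gamg without recompiling.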
>>>> > > > > > > >
>>>> > > > > > > > On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley <knepley at gmail.com> wrote:
>>>> > > > > > > > On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
>>>> > > > > > > > hi,
>>>> > > > > > > >
>>>> > > > > > > > I implemented the HYPRE preconditioner for my study because, without a preconditioner, the PETSc solver takes thousands of iterations to converge for fine-grid simulations.
>>>> > > > > > > >
>>>> > > > > > > > With HYPRE, depending on the parallel partition, it will take HYPRE forever to do anything; the observation from the output file is that the simulation hangs with no output.
>>>> > > > > > > >
>>>> > > > > > > > Any idea what happened? I will post a snippet of code.
>>>> > > > > > > >
>>>> > > > > > > > 1) For any question about convergence, we need to see the output of
>>>> > > > > > > >
>>>> > > > > > > > -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason
>>>> > > > > > > >
>>>> > > > > > > > 2) Hypre has many preconditioners, which one are you talking about
>>>> > > > > > > >
>>>> > > > > > > > 3) PETSc has some preconditioners in common with Hypre, like AMG
>>>> > > > > > > >
>>>> > > > > > > > Thanks,
>>>> > > > > > > >
>>>> > > > > > > >    Matt
>>>> > > > > > > >
>>>> > > > > > > > --
>>>> > > > > > > > Hao Zhang
>>>> > > > > > > > Dept. of Applied Mathematics and Statistics,
>>>> > > > > > > > Stony Brook University,
>>>> > > > > > > > Stony Brook, New York, 11790
>>>> > > > > > > >
>>>> > > > > > > > --
>>>> > > > > > > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>>>> > > > > > > > -- Norbert Wiener
>>>> > > > > > > >
>>>> > > > > > > > https://www.cse.buffalo.edu/~knepley/

--
Hao Zhang
Dept. of Applied Mathematics and Statistics,
Stony Brook University,
Stony Brook, New York, 11790
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
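A small companion sketch (again an illustration, assuming the ksp, b, x from the snippet earlier in the thread): the same information those options print can also be checked programmatically after the solve, which is handy when deciding whether to fall back to another solver:

    KSPConvergedReason reason;
    PetscInt           its;
    PetscReal          rnorm;

    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
    ierr = KSPGetConvergedReason(ksp,&reason);CHKERRQ(ierr);   /* negative values mean divergence */
    ierr = KSPGetIterationNumber(ksp,&its);CHKERRQ(ierr);
    ierr = KSPGetResidualNorm(ksp,&rnorm);CHKERRQ(ierr);
    ierr = PetscPrintf(PETSC_COMM_WORLD,"KSP %s after %D iterations, residual norm %g\n",
                       KSPConvergedReasons[reason],its,(double)rnorm);CHKERRQ(ierr);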
From mfadams at lbl.gov  Mon Oct 23 09:23:36 2017
From: mfadams at lbl.gov (Mark Adams)
Date: Mon, 23 Oct 2017 10:23:36 -0400
Subject: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To:
References:
Message-ID:

On Mon, Oct 23, 2017 at 10:04 AM, Hao Zhang wrote:

> The big picture is I'm solving the 3d incompressible Navier-Stokes equations using a staggered/MAC grid with the finite difference method. The particular function is the Poisson pressure solver, the Laplacian Mark Adams mentioned. The simulation runs fine for a medium-size mesh grid. When I try harder to go to a very fine grid, not DNS level, I'm having some difficulties getting meaningful physical results.

I assume that you are getting good solutions when the solver is converging well. I would not trust that your solution is ever accurate (converged) when you see this (stagnation), unless you get the true residual way, way down. How low do you get the true residual?

0 KSP preconditioned resid norm 1.630377897956e+01 true resid norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00
1 KSP preconditioned resid norm 3.815819909205e-02 true resid norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01
2 KSP preconditioned resid norm 4.312671277701e-04 true resid norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03
3 KSP preconditioned resid norm 3.011875330569e-04 true resid norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05
4 KSP preconditioned resid norm 3.011783821295e-04 true resid norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05
5 KSP preconditioned resid norm 3.011783818372e-04 true resid norm 2.504053673004e-04 ||r(i)||/||b|| 3.399594279422e-05
6 KSP preconditioned resid norm 3.011783818375e-04 true resid norm 2.503984078890e-04 ||r(i)||/||b|| 3.399499795925e-05

I would take the first test case (smallest) in which you can see "nonphysical results", use GMRES and ILU with lots of iterations, or a direct solver ideally, and get a good solution. If you can not get a good solution then do not worry about multigrid convergence.

One possibility is that your solution is going crazy and your density is getting non-smooth, and thus the coefficients in the pressure solve are getting large and non-smooth (I assume you have the density term in the pressure solve). This would cause a problem for multigrid.

> All PETSc solvers converge but come with huge iterations. That's when/how
> I started using HYPRE.

-------------- next part --------------
An HTML attachment was scrubbed...
URL:
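The verification runs Mark suggests can be driven entirely from the command line. A sketch, using PETSc 3.8-era option names; ./solver stands in for the application binary, and the direct solve assumes an external factorization package such as SuperLU_DIST was configured in:

    # iterative reference solve: GMRES with a long restart + ILU, tight tolerance
    mpirun -n 1 ./solver -ksp_type gmres -ksp_gmres_restart 200 -pc_type ilu \
        -ksp_rtol 1e-12 -ksp_max_it 100000 \
        -ksp_monitor_true_residual -ksp_converged_reason

    # direct solve as the gold standard
    mpirun -n 1 ./solver -ksp_type preonly -pc_type lu \
        -pc_factor_mat_solver_package superlu_dist

If the fields computed this way are still nonphysical, the problem is upstream of the multigrid solver.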
From jroman at dsic.upv.es  Mon Oct 23 10:18:22 2017
From: jroman at dsic.upv.es (Jose E. Roman)
Date: Mon, 23 Oct 2017 17:18:22 +0200
Subject: [petsc-users] "Must select a target sorting criterion if using shift-and-invert"
In-Reply-To:
References:
Message-ID:

Changed. Hope it is clear now.
https://bitbucket.org/slepc/slepc/commits/511900656a27a161c1df6fe2e42fd8d66d071800
Jose

> On 21 Oct 2017, at 14:27, Matthew Knepley wrote:
>
> On Sat, Oct 21, 2017 at 2:20 AM, Jose E. Roman wrote:
> This was added in 3.8 to check the common case when people incorrectly set shift-and-invert with EPS_SMALLEST_MAGNITUDE. To compute the smallest eigenvalues with shift-and-invert, the correct way is to set target=0 and which=EPS_TARGET_MAGNITUDE. See for instance
> http://slepc.upv.es/documentation/current/src/eps/examples/tutorials/ex13.c.html
>
> Jose, one thing we are trying to do in PETSc now is to give the options to fix a problem (or at least representative options) directly in the error message. Or maybe a pointer to the relevant manual or tutorial section. This gives users a hand up.
>
> Thanks,
>
>    Matt
>
> Jose
>
> > On 21 Oct 2017, at 1:51, Matthew Knepley wrote:
> >
> > On Fri, Oct 20, 2017 at 7:43 PM, Kong, Fande wrote:
> > Hi All,
> >
> > I am trying to solve a generalized eigenvalue problem (using SLEPc) with "-eps_type krylovschur -st_type sinvert". I got an error message: "Must select a target sorting criterion if using shift-and-invert".
> >
> > Not sure how to proceed. I do not quite understand this sentence.
> >
> > You need to know how to choose the shift. So for instance you want the smallest eigenvalues, or the closest to zero, etc. I don't know the options, but they are in the manual.
> >
> >    Matt
> >
> > Fande,
> >
> > --
> > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> > -- Norbert Wiener
> >
> > https://www.cse.buffalo.edu/~knepley/
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
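A minimal sketch of the corrected setup Jose describes (an illustration, assuming matrices A and B for the generalized problem already exist; the same thing can be done purely with options):

    EPS eps;
    ST  st;

    ierr = EPSCreate(PETSC_COMM_WORLD,&eps);CHKERRQ(ierr);
    ierr = EPSSetOperators(eps,A,B);CHKERRQ(ierr);
    ierr = EPSGetST(eps,&st);CHKERRQ(ierr);
    ierr = STSetType(st,STSINVERT);CHKERRQ(ierr);                          /* shift-and-invert */
    ierr = EPSSetTarget(eps,0.0);CHKERRQ(ierr);                            /* target = 0 */
    ierr = EPSSetWhichEigenpairs(eps,EPS_TARGET_MAGNITUDE);CHKERRQ(ierr);  /* sort by |lambda - target| */
    ierr = EPSSetFromOptions(eps);CHKERRQ(ierr);
    ierr = EPSSolve(eps);CHKERRQ(ierr);

The command-line equivalent is -st_type sinvert -eps_target 0 -eps_target_magnitude, which avoids the "Must select a target sorting criterion" error.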
From bsmith at mcs.anl.gov  Mon Oct 23 10:36:08 2017
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Mon, 23 Oct 2017 10:36:08 -0500
Subject: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To:
References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov> <64D7FC90-EBE8-409C-B300-04A118155A98@mcs.anl.gov> <5DDD356C-2000-4803-9F3A-02B429898A7A@mcs.anl.gov> <8C8ADFC3-C71F-4DE1-B21C-5BF50EBC652D@mcs.anl.gov>
Message-ID:

   I'm confused. Is hypre + GMRES ever not working fine for you? Why not just always use hypre + gmres; no basic solver is going to be faster for large problems ever.

   Barry

> On Oct 23, 2017, at 9:09 AM, Hao Zhang wrote:
>
> Yes.
>
> On Mon, Oct 23, 2017 at 10:07 AM, Mark Adams wrote:
>
> On Mon, Oct 23, 2017 at 10:04 AM, Hao Zhang wrote:
> The big picture is I'm solving the 3d incompressible Navier-Stokes equations using a staggered/MAC grid with the finite difference method. The particular function is the Poisson pressure solver, the Laplacian Mark Adams mentioned. The simulation runs fine for a medium-size mesh grid. When I try harder to go to a very fine grid, not DNS level, I'm having some difficulties getting meaningful physical results.
>
> Are we just talking about the convergence of the pressure solve?
>
> On Mon, Oct 23, 2017 at 9:01 AM, Mark Adams wrote:
> Just to be clear: 1) are you solving the Laplacian (div grad), 2) what type of discretizations are you using? and 3) do you have stretched or terrible grids in some way?
>
> On Sun, Oct 22, 2017 at 3:57 PM, Barry Smith wrote:
>
>    One thing important to understand is that multigrid is an optimal or nearly optimal algorithm. This means, when it works, as you refine the mesh the number of iterations remains nearly constant, regardless of the problem size and number of processes. Simple preconditioners such as ILU, block Jacobi, one-level additive Schwarz, etc. have iteration counts that increase with the problem size and likely also with the number of processes. Thus these algorithms become essentially impractical for very large problems, while multigrid can remain practical (when it works).
>
>    Good luck
>
>    Barry
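One way to see Barry's point empirically (a sketch only; the -grid flag here is hypothetical, standing in for however the application sets its resolution): run the identical solve at a few refinements and watch the iteration counts.

    for n in 32 48 64 96 128; do
      mpirun -n 8 ./solver -grid $n $n $n \
          -ksp_type gmres -pc_type hypre -pc_hypre_type boomeramg \
          -ksp_converged_reason
    done
    # optimal multigrid: iteration counts stay nearly flat as n grows
    # ILU / block Jacobi / one-level Schwarz: iteration counts grow with n

A sudden jump at one refinement level points at the mesh or the physics at that scale rather than at the solver.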
> > On Oct 22, 2017, at 2:35 PM, Hao Zhang wrote:
> > >
> > > Thanks for all the inputs. Before simulating the finer grid, HYPRE wasn't used and the simulations were just fine. I will do a few tests and post more information later.
> > >
> > > On Sun, Oct 22, 2017 at 12:11 PM, Barry Smith wrote:
> > >
> > > > > On Oct 21, 2017, at 11:16 PM, Hao Zhang wrote:
> > > > >
> > > > > the reason is when I do finer grid simulations, the matrix becomes stiffer.
> > > >
> > > >    Are you saying that for a finer grid but everything else the same, the convergence of hypre (with the same GMRES) with the same options gets much worse? This normally will not happen, that is the fundamental beauty of multigrid methods (when they work well).
> > > >
> > > >    Yes the matrix condition number increases but multigrid doesn't care about that, its number of iterations should remain pretty much the same.
> > > >
> > > >    Something must be different (with this finer grid case): either the mesh becomes horrible, or the physics changes, or there are errors in the code that lead to the problem.
> >
> >    What happens if you just refine the mesh a little? Then a little more? Then a little more? Does the convergence rate suddenly go bad at some point, or does it just get worse slowly?
> >
> >    Barry
> >
> > > Much larger condition number. Just to give you a perspective, it will take 6000 iterations to converge, and the solver does converge. I want to reduce the number of iterations while keeping the convergence rate. That's the main drive to do so much heavy lifting around. Please advise; a snippet will be provided upon request.
> > >
> > > Thanks again.
> > >
> > > On Sun, Oct 22, 2017 at 12:08 AM, Barry Smith wrote:
> > >
> > >    Oh, you change KSP but not hypre. I did not understand this.
> > >
> > >    Why not just use GMRES all the time? Why mess with BCGS if it is not robust? Not worth the small optimization if it breaks everything.
> > >
> > >    Barry
> > >
> > > > On Oct 21, 2017, at 11:05 PM, Hao Zhang wrote:
> > > >
> > > > this is the initial pressure solver output regarding the use of PETSc. It failed to converge after 40000 iterations; GMRES was then used.
> > > >
> > > > 39987 KSP preconditioned resid norm 3.853125986269e-08 true resid norm 1.147359212216e-05 ||r(i)||/||b|| 1.557696568706e-06
> > > > [... iterations 39988-39999 snipped: the preconditioned residual is stuck at ~3.8531e-08 and the true residual at ~1.14736e-05 ...]
> > > > 40000 KSP preconditioned resid norm 3.853122732867e-08 true resid norm 1.147359257776e-05 ||r(i)||/||b|| 1.557696630559e-06
> > > > Linear solve did not converge due to DIVERGED_ITS iterations 40000
> > > > KSP Object: 24 MPI processes
> > > >   type: bcgs
> > > >   maximum iterations=40000, initial guess is zero
> > > >   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
> > > >   left preconditioning
> > > >   using PRECONDITIONED norm type for convergence test
> > > > PC Object: 24 MPI processes
> > > >   type: hypre
> > > >     HYPRE BoomerAMG preconditioning
> > > >       Cycle type V
> > > >       Maximum number of levels 25
> > > >       Maximum number of iterations PER hypre call 1
> > > >       Convergence tolerance PER hypre call 0.
> > > >       Threshold for strong coupling 0.25
> > > >       Interpolation truncation factor 0.
> > > >       Interpolation: max elements per row 0
> > > >       Number of levels of aggressive coarsening 0
> > > >       Number of paths for aggressive coarsening 1
> > > >       Maximum row sums 0.9
> > > >       Sweeps down         1
> > > >       Sweeps up           1
> > > >       Sweeps on coarse    1
> > > >       Relax down          symmetric-SOR/Jacobi
> > > >       Relax up            symmetric-SOR/Jacobi
> > > >       Relax on coarse     Gaussian-elimination
> > > >       Relax weight  (all)      1.
> > > >       Outer relax weight (all) 1.
> > > >       Using CF-relaxation
> > > >       Not using more complex smoothers.
> > > >       Measure type        local
> > > >       Coarsen type        Falgout
> > > >       Interpolation type  classical
> > > >   linear system matrix = precond matrix:
> > > >   Mat Object: A 24 MPI processes
> > > >     type: mpiaij
> > > >     rows=497664, cols=497664
> > > >     total: nonzeros=3363552, allocated nonzeros=6967296
> > > >     total number of mallocs used during MatSetValues calls =0
> > > >       has attached null space
> > > >       not using I-node (on process 0) routines
> > > >
> > > > The solution diverges for p0! The residual is 3.853123e-08. Solve again using GMRES!
> > > > KSP Object: 24 MPI processes
> > > >   type: gmres
> > > >     restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
> > > >     happy breakdown tolerance 1e-30
> > > >   maximum iterations=40000, initial guess is zero
> > > >   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
> > > >   left preconditioning
> > > >   using PRECONDITIONED norm type for convergence test
> > > > PC Object: 24 MPI processes
> > > >   type: hypre
> > > >     HYPRE BoomerAMG preconditioning
> > > >       Cycle type V
> > > >       Maximum number of levels 25
> > > >       Maximum number of iterations PER hypre call 1
> > > >       Convergence tolerance PER hypre call 0.
> > > >       Threshold for strong coupling 0.25
> > > >       Interpolation truncation factor 0.
> > > >       Interpolation: max elements per row 0
> > > >       Number of levels of aggressive coarsening 0
> > > >       Number of paths for aggressive coarsening 1
> > > >       Maximum row sums 0.9
> > > >       Sweeps down         1
> > > >       Sweeps up           1
> > > >       Sweeps on coarse    1
> > > >       Relax down          symmetric-SOR/Jacobi
> > > >       Relax up            symmetric-SOR/Jacobi
> > > >       Relax on coarse     Gaussian-elimination
> > > >       Relax weight  (all)      1.
> > > >       Outer relax weight (all) 1.
> > > >       Using CF-relaxation
> > > >       Not using more complex smoothers.
> > > > Measure type local > > > > Coarsen type Falgout > > > > Interpolation type classical > > > > linear system matrix = precond matrix: > > > > Mat Object: A 24 MPI processes > > > > type: mpiaij > > > > rows=497664, cols=497664 > > > > total: nonzeros=3363552, allocated nonzeros=6967296 > > > > total number of mallocs used during MatSetValues calls =0 > > > > has attached null space > > > > not using I-node (on process 0) routines > > > > 0 KSP preconditioned resid norm 1.593802941804e+01 true resid norm 7.365742695119e+00 ||r(i)||/||b|| 1.000000000000e+00 > > > > 1 KSP preconditioned resid norm 6.338666661133e-01 true resid norm 2.880722209358e+00 ||r(i)||/||b|| 3.910973174867e-01 > > > > 2 KSP preconditioned resid norm 3.913420828350e-02 true resid norm 9.544089760671e-01 ||r(i)||/||b|| 1.295740315093e-01 > > > > 3 KSP preconditioned resid norm 2.928070366435e-03 true resid norm 1.874294004628e-01 ||r(i)||/||b|| 2.544609664236e-02 > > > > 4 KSP preconditioned resid norm 2.165607525823e-04 true resid norm 3.121122463949e-02 ||r(i)||/||b|| 4.237349298146e-03 > > > > 5 KSP preconditioned resid norm 1.635476480407e-05 true resid norm 3.984315313831e-03 ||r(i)||/||b|| 5.409251284967e-04 > > > > 6 KSP preconditioned resid norm 1.283358350575e-06 true resid norm 4.566583802915e-04 ||r(i)||/||b|| 6.199760149022e-05 > > > > 7 KSP preconditioned resid norm 8.479469225747e-08 true resid norm 3.824581791112e-05 ||r(i)||/||b|| 5.192391248810e-06 > > > > 8 KSP preconditioned resid norm 5.358636504683e-09 true resid norm 2.829730442033e-06 ||r(i)||/||b|| 3.841744898187e-07 > > > > 9 KSP preconditioned resid norm 3.447874504193e-10 true resid norm 1.756617036538e-07 ||r(i)||/||b|| 2.384847135242e-08 > > > > 10 KSP preconditioned resid norm 2.480228743414e-11 true resid norm 1.309399823577e-08 ||r(i)||/||b|| 1.777688792258e-09 > > > > 11 KSP preconditioned resid norm 1.728967759950e-12 true resid norm 9.406967045789e-10 ||r(i)||/||b|| 1.277124036931e-10 > > > > 12 KSP preconditioned resid norm 1.075458632828e-13 true resid norm 5.994505136212e-11 ||r(i)||/||b|| 8.138358050689e-12 > > > > Linear solve converged due to CONVERGED_RTOL iterations 12 > > > > KSP Object: 24 MPI processes > > > > type: gmres > > > > restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement > > > > happy breakdown tolerance 1e-30 > > > > maximum iterations=40000, initial guess is zero > > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. > > > > left preconditioning > > > > using PRECONDITIONED norm type for convergence test > > > > PC Object: 24 MPI processes > > > > type: hypre > > > > HYPRE BoomerAMG preconditioning > > > > Cycle type V > > > > Maximum number of levels 25 > > > > Maximum number of iterations PER hypre call 1 > > > > Convergence tolerance PER hypre call 0. > > > > Threshold for strong coupling 0.25 > > > > Interpolation truncation factor 0. > > > > Interpolation: max elements per row 0 > > > > Number of levels of aggressive coarsening 0 > > > > Number of paths for aggressive coarsening 1 > > > > Maximum row sums 0.9 > > > > Sweeps down 1 > > > > Sweeps up 1 > > > > Sweeps on coarse 1 > > > > Relax down symmetric-SOR/Jacobi > > > > Relax up symmetric-SOR/Jacobi > > > > Relax on coarse Gaussian-elimination > > > > Relax weight (all) 1. > > > > Outer relax weight (all) 1. > > > > Using CF-relaxation > > > > Not using more complex smoothers. 
> > > > Measure type local > > > > Coarsen type Falgout > > > > Interpolation type classical > > > > linear system matrix = precond matrix: > > > > Mat Object: A 24 MPI processes > > > > type: mpiaij > > > > rows=497664, cols=497664 > > > > total: nonzeros=3363552, allocated nonzeros=6967296 > > > > total number of mallocs used during MatSetValues calls =0 > > > > has attached null space > > > > not using I-node (on process 0) routines > > > > The max singular value of A = 1.000872 in poisson_solver3d_P0_vd > > > > The min singular value of A = 0.667688 in poisson_solver3d_P0_vd > > > > The Cond Num of A = 1.499012 in poisson_solver3d_P0_vd > > > > In poisson_solver3d_pressure(): num_iter = 12, rel_residual = 1.075459e-13
> > > >
> > > > The max value of p0 is 0.03115845493408858
> > > >
> > > > The min value of p0 is -0.07156715468428149
> > > >
> > > > On Sun, Oct 22, 2017 at 12:00 AM, Barry Smith wrote:
> > > >
> > > > > On Oct 21, 2017, at 10:50 PM, Hao Zhang wrote:
> > > > >
> > > > > The incompressible NS solver calls the PETSc solver at different stages of each time step. The one you were saying "This is good. 12 digit reduction" about is the initial pressure solve, where HYPRE usually doesn't give good convergence, so the fall-back GMRES solver is called afterwards.
> > > >
> > > > Hmm, I don't understand. hypre should do well on a pressure solve. In fact, very well.
> > > >
> > > > > Barry, you were mentioning that I could have a wrong nullspace. That particular solver is meant to give an initial pressure profile for a 3D incompressible NS simulation with all-Neumann boundary conditions. Could you give some insight into how to test whether I have a wrong nullspace?
> > > >
> > > > -ksp_test_null_space
> > > >
> > > > But if your null space comes consistently from all-Neumann boundary conditions then it is likely not wrong.
> > > >
> > > > Barry
> > > >
> > > > > Thanks!
> > > > >
> > > > > On Sat, Oct 21, 2017 at 11:41 PM, Barry Smith wrote:
> > > > >
> > > > > This is good. You get more than a 12 digit reduction in the true residual norm. This is good AMG convergence, expected when everything goes well.
> > > > >
> > > > > What is different in this case from the previous case that does not converge reasonably?
> > > > >
> > > > > Barry
> > > > >
> > > > > > On Oct 21, 2017, at 9:29 PM, Hao Zhang wrote:
> > > > > >
> > > > > > Barry, please advise what you make of this. This is the Poisson solver for the 3D all-Neumann-BC case; a finite difference scheme was used.
> > > > > > Thanks! I'm in learning mode.
> > > > > >
> > > > > > KSP Object: 24 MPI processes > > > > > > type: bcgs > > > > > > maximum iterations=40000, initial guess is zero > > > > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. > > > > > > left preconditioning > > > > > > using PRECONDITIONED norm type for convergence test > > > > > > PC Object: 24 MPI processes > > > > > > type: hypre > > > > > > HYPRE BoomerAMG preconditioning > > > > > > Cycle type V > > > > > > Maximum number of levels 25 > > > > > > Maximum number of iterations PER hypre call 1 > > > > > > Convergence tolerance PER hypre call 0. > > > > > > Threshold for strong coupling 0.25 > > > > > > Interpolation truncation factor 0.
> > > > > > Interpolation: max elements per row 0 > > > > > > Number of levels of aggressive coarsening 0 > > > > > > Number of paths for aggressive coarsening 1 > > > > > > Maximum row sums 0.9 > > > > > > Sweeps down 1 > > > > > > Sweeps up 1 > > > > > > Sweeps on coarse 1 > > > > > > Relax down symmetric-SOR/Jacobi > > > > > > Relax up symmetric-SOR/Jacobi > > > > > > Relax on coarse Gaussian-elimination > > > > > > Relax weight (all) 1. > > > > > > Outer relax weight (all) 1. > > > > > > Using CF-relaxation > > > > > > Not using more complex smoothers. > > > > > > Measure type local > > > > > > Coarsen type Falgout > > > > > > Interpolation type classical > > > > > > linear system matrix = precond matrix: > > > > > > Mat Object: A 24 MPI processes > > > > > > type: mpiaij > > > > > > rows=497664, cols=497664 > > > > > > total: nonzeros=3363552, allocated nonzeros=6967296 > > > > > > total number of mallocs used during MatSetValues calls =0 > > > > > > has attached null space > > > > > > not using I-node (on process 0) routines > > > > > > 0 KSP preconditioned resid norm 2.697270170623e-02 true resid norm 9.637159071207e+00 ||r(i)||/||b|| 1.000000000000e+00 > > > > > > 1 KSP preconditioned resid norm 4.828857674609e-04 true resid norm 6.294379664645e-01 ||r(i)||/||b|| 6.531364293291e-02 > > > > > > 2 KSP preconditioned resid norm 4.533649595815e-06 true resid norm 1.135508605857e-02 ||r(i)||/||b|| 1.178260727531e-03 > > > > > > 3 KSP preconditioned resid norm 1.131704082606e-07 true resid norm 1.196393029874e-04 ||r(i)||/||b|| 1.241437462051e-05 > > > > > > 4 KSP preconditioned resid norm 3.866281379569e-10 true resid norm 5.121520801846e-07 ||r(i)||/||b|| 5.314347064320e-08 > > > > > > 5 KSP preconditioned resid norm 1.070114785241e-11 true resid norm 1.203693733135e-08 ||r(i)||/||b|| 1.249013038221e-09 > > > > > > 6 KSP preconditioned resid norm 2.578780418765e-14 true resid norm 6.020297525927e-11 ||r(i)||/||b|| 6.246962908306e-12 > > > > > > 7 KSP preconditioned resid norm 8.691764190203e-16 true resid norm 1.866088098154e-12 ||r(i)||/||b|| 1.936346680973e-13 > > > > > > Linear solve converged due to CONVERGED_ATOL iterations 7
> > > > > >
> > > > > > On Sat, Oct 21, 2017 at 6:53 PM, Barry Smith wrote:
> > > > > >
> > > > > > > On Oct 21, 2017, at 5:47 PM, Hao Zhang wrote:
> > > > > > >
> > > > > > > Hi, Barry:
> > > > > > > What do you mean by "setting tolerance = 1e-14 is absurd"?
> > > > > >
> > > > > > Trying to decrease the initial residual norm by a factor of 1e-14 with an iterative method (or even a direct method) is unrealistic (usually unachievable) and almost never necessary. You are requiring || r_n || < 1.e-14 || r_0 || when with double precision numbers you only have roughly 14 decimal digits total to compute with. Round off alone will lead to differences far larger than 1e-14.
> > > > > >
> > > > > > If you are using the solver in the context of a nonlinear problem (i.e. inside Newton's method) then 1.e-6 is generally more than plenty to get quadratic convergence of Newton's method.
> > > > > >
> > > > > > If you are solving a linear problem then it is extremely likely that errors due to discretization (from the finite element method, etc.) and the model are much, much larger than even 1.e-8.
> > > > > >
> > > > > > So, in summary:
> > > > > >
> > > > > > 1.e-14 is probably unachievable.
> > > > > >
> > > > > > 1.e-14 is almost for sure not needed.
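[As a concrete illustration of the point above: a minimal sketch, not from the thread, of setting a realistic relative tolerance in code. It assumes the ksp object from the code snippet quoted later in this exchange and an ierr declared as a PetscErrorCode.

  /* Relax the relative tolerance to 1e-6; leave the absolute tolerance,
     divergence tolerance, and iteration cap at their defaults. */
  ierr = KSPSetTolerances(ksp, 1.e-6, PETSC_DEFAULT, PETSC_DEFAULT, PETSC_DEFAULT);CHKERRQ(ierr);

The same effect is available at runtime with -ksp_rtol 1.e-6, with no recompilation needed.]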
> > > > > > Barry
> > > > > >
> > > > > > > On Sat, Oct 21, 2017 at 18:42 Barry Smith wrote:
> > > > > > >
> > > > > > > Run with -ksp_view_mat binary -ksp_view_rhs binary and send the resulting output file called binaryoutput to petsc-maint at mcs.anl.gov
> > > > > > >
> > > > > > > Note you can also use -ksp_type gmres with hypre; there is unlikely to be a reason to use bcgs.
> > > > > > >
> > > > > > > BTW: tolerances: relative=1e-14 is absurd.
> > > > > > >
> > > > > > > My guess is your null space is incorrect.
> > > > > > >
> > > > > > > > On Oct 21, 2017, at 4:34 PM, Hao Zhang wrote:
> > > > > > > >
> > > > > > > > If this solver doesn't converge, I have a fall-back solution which uses the GMRES solver. This setup is fine with me. I just want to know whether HYPRE is a reliable solution for me, or whether I will have to go without a preconditioner.
> > > > > > > >
> > > > > > > > Thanks!
> > > > > > > >
> > > > > > > > On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang wrote:
> > > > > > > > This is a serial run, still dumping output; parallel is more or less the same.
> > > > > > > >
> > > > > > > > KSP Object: 1 MPI processes > > > > > > > > type: bcgs > > > > > > > > maximum iterations=40000, initial guess is zero > > > > > > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. > > > > > > > > left preconditioning > > > > > > > > using PRECONDITIONED norm type for convergence test > > > > > > > > PC Object: 1 MPI processes > > > > > > > > type: hypre > > > > > > > > HYPRE BoomerAMG preconditioning > > > > > > > > Cycle type V > > > > > > > > Maximum number of levels 25 > > > > > > > > Maximum number of iterations PER hypre call 1 > > > > > > > > Convergence tolerance PER hypre call 0. > > > > > > > > Threshold for strong coupling 0.25 > > > > > > > > Interpolation truncation factor 0. > > > > > > > > Interpolation: max elements per row 0 > > > > > > > > Number of levels of aggressive coarsening 0 > > > > > > > > Number of paths for aggressive coarsening 1 > > > > > > > > Maximum row sums 0.9 > > > > > > > > Sweeps down 1 > > > > > > > > Sweeps up 1 > > > > > > > > Sweeps on coarse 1 > > > > > > > > Relax down symmetric-SOR/Jacobi > > > > > > > > Relax up symmetric-SOR/Jacobi > > > > > > > > Relax on coarse Gaussian-elimination > > > > > > > > Relax weight (all) 1. > > > > > > > > Outer relax weight (all) 1. > > > > > > > > Using CF-relaxation > > > > > > > > Not using more complex smoothers.
> > > > > > > > Measure type local > > > > > > > > Coarsen type Falgout > > > > > > > > Interpolation type classical > > > > > > > > linear system matrix = precond matrix: > > > > > > > > Mat Object: A 1 MPI processes > > > > > > > > type: seqaij > > > > > > > > rows=497664, cols=497664 > > > > > > > > total: nonzeros=3363552, allocated nonzeros=3483648 > > > > > > > > total number of mallocs used during MatSetValues calls =0 > > > > > > > > has attached null space > > > > > > > > not using I-node routines > > > > > > > > 0 KSP preconditioned resid norm 1.630377897956e+01 true resid norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00 > > > > > > > > 1 KSP preconditioned resid norm 3.815819909205e-02 true resid norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01 > > > > > > > > 2 KSP preconditioned resid norm 4.312671277701e-04 true resid norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03 > > > > > > > > 3 KSP preconditioned resid norm 3.011875330569e-04 true resid norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05 > > > > > > > > 4 KSP preconditioned resid norm 3.011783821295e-04 true resid norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05 > > > > > > > > 5 KSP preconditioned resid norm 3.011783818372e-04 true resid norm 2.504053673004e-04 ||r(i)||/||b|| 3.399594279422e-05 > > > > > > > > 6 KSP preconditioned resid norm 3.011783818375e-04 true resid norm 2.503984078890e-04 ||r(i)||/||b|| 3.399499795925e-05 > > > > > > > > 7 KSP preconditioned resid norm 3.011783887442e-04 true resid norm 2.504121501772e-04 ||r(i)||/||b|| 3.399686366224e-05 > > > > > > > > 8 KSP preconditioned resid norm 3.010913654181e-04 true resid norm 2.504150259031e-04 ||r(i)||/||b|| 3.399725408124e-05 > > > > > > > > 9 KSP preconditioned resid norm 3.006520688232e-04 true resid norm 2.504061607382e-04 ||r(i)||/||b|| 3.399605051423e-05 > > > > > > > > 10 KSP preconditioned resid norm 3.007309991942e-04 true resid norm 2.503843638523e-04 ||r(i)||/||b|| 3.399309128978e-05 > > > > > > > > 11 KSP preconditioned resid norm 3.015946168077e-04 true resid norm 2.503644677844e-04 ||r(i)||/||b|| 3.399039012728e-05 > > > > > > > > 12 KSP preconditioned resid norm 2.956643907377e-04 true resid norm 2.503863046509e-04 ||r(i)||/||b|| 3.399335477965e-05 > > > > > > > > 13 KSP preconditioned resid norm 2.997992358936e-04 true resid norm 2.504336903093e-04 ||r(i)||/||b|| 3.399978802886e-05 > > > > > > > > 14 KSP preconditioned resid norm 2.481415420420e-05 true resid norm 2.491591201250e-04 ||r(i)||/||b|| 3.382674774806e-05 > > > > > > > > 15 KSP preconditioned resid norm 2.615494786181e-05 true resid norm 2.490353237273e-04 ||r(i)||/||b|| 3.380994069915e-05 > > > > > > > > 16 KSP preconditioned resid norm 2.645126692130e-05 true resid norm 2.490535523344e-04 ||r(i)||/||b|| 3.381241548111e-05 > > > > > > > > 17 KSP preconditioned resid norm 2.667223026209e-05 true resid norm 2.490482602536e-04 ||r(i)||/||b|| 3.381169700898e-05 > > > > > > > > 18 KSP preconditioned resid norm 2.650813432116e-05 true resid norm 2.490473169014e-04 ||r(i)||/||b|| 3.381156893606e-05 > > > > > > > > 19 KSP preconditioned resid norm 2.613309555449e-05 true resid norm 2.490465633690e-04 ||r(i)||/||b|| 3.381146663375e-05 > > > > > > > > 20 KSP preconditioned resid norm 2.644160446804e-05 true resid norm 2.490532739949e-04 ||r(i)||/||b|| 3.381237769272e-05 > > > > > > > > 21 KSP preconditioned resid norm 2.635987608975e-05 true resid norm 2.490499548926e-04 ||r(i)||/||b|| 3.381192707933e-05 > > > > > > > > 22 KSP preconditioned 
resid norm 2.640527129095e-05 true resid norm 2.490594066529e-04 ||r(i)||/||b|| 3.381321028466e-05 > > > > > > > > 23 KSP preconditioned resid norm 2.627505117691e-05 true resid norm 2.490550162585e-04 ||r(i)||/||b|| 3.381261422875e-05 > > > > > > > > 24 KSP preconditioned resid norm 2.642659196388e-05 true resid norm 2.490504347640e-04 ||r(i)||/||b|| 3.381199222842e-05 > > > > > > > > 25 KSP preconditioned resid norm 2.659432190695e-05 true resid norm 2.490510775152e-04 ||r(i)||/||b|| 3.381207949065e-05 > > > > > > > > 26 KSP preconditioned resid norm 2.687918062951e-05 true resid norm 2.490518882015e-04 ||r(i)||/||b|| 3.381218955237e-05 > > > > > > > > 27 KSP preconditioned resid norm 2.662909048432e-05 true resid norm 2.490446263285e-04 ||r(i)||/||b|| 3.381120365409e-05 > > > > > > > > 28 KSP preconditioned resid norm 2.085466483199e-05 true resid norm 2.490131612366e-04 ||r(i)||/||b|| 3.380693183886e-05 > > > > > > > > 29 KSP preconditioned resid norm 2.098541330282e-05 true resid norm 2.490126933398e-04 ||r(i)||/||b|| 3.380686831549e-05 > > > > > > > > 30 KSP preconditioned resid norm 2.175345180286e-05 true resid norm 2.490098852429e-04 ||r(i)||/||b|| 3.380648707805e-05 > > > > > > > > 31 KSP preconditioned resid norm 2.182182437676e-05 true resid norm 2.490028301020e-04 ||r(i)||/||b|| 3.380552924648e-05 > > > > > > > > 32 KSP preconditioned resid norm 2.152970404369e-05 true resid norm 2.490089939838e-04 ||r(i)||/||b|| 3.380636607747e-05 > > > > > > > > 33 KSP preconditioned resid norm 2.187932450016e-05 true resid norm 2.490085293931e-04 ||r(i)||/||b|| 3.380630300295e-05 > > > > > > > > 34 KSP preconditioned resid norm 2.207255875067e-05 true resid norm 2.490039036092e-04 ||r(i)||/||b|| 3.380567498971e-05 > > > > > > > > 35 KSP preconditioned resid norm 2.205060279701e-05 true resid norm 2.490101636150e-04 ||r(i)||/||b|| 3.380652487086e-05 > > > > > > > > 36 KSP preconditioned resid norm 2.168654200416e-05 true resid norm 2.490091609876e-04 ||r(i)||/||b|| 3.380638875052e-05 > > > > > > > > 37 KSP preconditioned resid norm 2.164521042361e-05 true resid norm 2.490083143913e-04 ||r(i)||/||b|| 3.380627381352e-05 > > > > > > > > 38 KSP preconditioned resid norm 2.154429063973e-05 true resid norm 2.490075485470e-04 ||r(i)||/||b|| 3.380616983972e-05 > > > > > > > > 39 KSP preconditioned resid norm 2.165962086228e-05 true resid norm 2.490099695056e-04 ||r(i)||/||b|| 3.380649851786e-05 > > > > > > > > 40 KSP preconditioned resid norm 2.153877616091e-05 true resid norm 2.490090652619e-04 ||r(i)||/||b|| 3.380637575444e-05 > > > > > > > > 41 KSP preconditioned resid norm 2.347651187611e-05 true resid norm 2.490233544624e-04 ||r(i)||/||b|| 3.380831570825e-05 > > > > > > > > 42 KSP preconditioned resid norm 2.352860162514e-05 true resid norm 2.490191394202e-04 ||r(i)||/||b|| 3.380774345879e-05 > > > > > > > > 43 KSP preconditioned resid norm 2.312377506928e-05 true resid norm 2.490209491359e-04 ||r(i)||/||b|| 3.380798915237e-05 > > > > > > > > 44 KSP preconditioned resid norm 2.295770973533e-05 true resid norm 2.490178136759e-04 ||r(i)||/||b|| 3.380756347093e-05 > > > > > > > > 45 KSP preconditioned resid norm 2.833646456041e-05 true resid norm 2.489991602651e-04 ||r(i)||/||b|| 3.380503101608e-05 > > > > > > > > 46 KSP preconditioned resid norm 2.760296424494e-05 true resid norm 2.490104320666e-04 ||r(i)||/||b|| 3.380656131682e-05 > > > > > > > > 47 KSP preconditioned resid norm 2.451504295239e-05 true resid norm 2.490241388672e-04 ||r(i)||/||b|| 3.380842220189e-05 > > > > > > > > 48 KSP 
preconditioned resid norm 2.512391514098e-05 true resid norm 2.490245923753e-04 ||r(i)||/||b|| 3.380848377180e-05 > > > > > > > > 49 KSP preconditioned resid norm 2.483419450528e-05 true resid norm 2.490273364402e-04 ||r(i)||/||b|| 3.380885631602e-05 > > > > > > > > 50 KSP preconditioned resid norm 2.507460538466e-05 true resid norm 2.490309488780e-04 ||r(i)||/||b|| 3.380934675371e-05 > > > > > > > > 51 KSP preconditioned resid norm 2.499708772881e-05 true resid norm 2.490300908170e-04 ||r(i)||/||b|| 3.380923026022e-05 > > > > > > > > 52 KSP preconditioned resid norm 1.059778259446e-05 true resid norm 2.489352833521e-04 ||r(i)||/||b|| 3.379635885420e-05 > > > > > > > > 53 KSP preconditioned resid norm 1.074975117060e-05 true resid norm 2.489294722901e-04 ||r(i)||/||b|| 3.379556992330e-05 > > > > > > > > 54 KSP preconditioned resid norm 1.095242219559e-05 true resid norm 2.489295454212e-04 ||r(i)||/||b|| 3.379557985184e-05 > > > > > > > > 55 KSP preconditioned resid norm 8.359999674720e-06 true resid norm 2.489673581944e-04 ||r(i)||/||b|| 3.380071345137e-05 > > > > > > > > 56 KSP preconditioned resid norm 8.368232998470e-06 true resid norm 2.489700421343e-04 ||r(i)||/||b|| 3.380107783281e-05 > > > > > > > > 57 KSP preconditioned resid norm 8.443378041101e-06 true resid norm 2.489702900875e-04 ||r(i)||/||b|| 3.380111149584e-05 > > > > > > > > 58 KSP preconditioned resid norm 8.647159584302e-06 true resid norm 2.489640805831e-04 ||r(i)||/||b|| 3.380026847095e-05 > > > > > > > > 59 KSP preconditioned resid norm 1.024742790737e-05 true resid norm 2.489447846660e-04 ||r(i)||/||b|| 3.379764878711e-05 > > > > > > > > 60 KSP preconditioned resid norm 1.033394118910e-05 true resid norm 2.489441404923e-04 ||r(i)||/||b|| 3.379756133175e-05 > > > > > > > > 61 KSP preconditioned resid norm 1.030066336446e-05 true resid norm 2.489399918556e-04 ||r(i)||/||b|| 3.379699809776e-05 > > > > > > > > 62 KSP preconditioned resid norm 1.029956398963e-05 true resid norm 2.489445295139e-04 ||r(i)||/||b|| 3.379761414674e-05 > > > > > > > > 63 KSP preconditioned resid norm 1.028190129002e-05 true resid norm 2.489456200527e-04 ||r(i)||/||b|| 3.379776220225e-05 > > > > > > > > 64 KSP preconditioned resid norm 9.878799185773e-06 true resid norm 2.489488742330e-04 ||r(i)||/||b|| 3.379820400160e-05 > > > > > > > > 65 KSP preconditioned resid norm 9.917711104174e-06 true resid norm 2.489478066593e-04 ||r(i)||/||b|| 3.379805906391e-05 > > > > > > > > 66 KSP preconditioned resid norm 1.003572019576e-05 true resid norm 2.489441995703e-04 ||r(i)||/||b|| 3.379756935240e-05 > > > > > > > > 67 KSP preconditioned resid norm 9.924487278236e-06 true resid norm 2.489475403451e-04 ||r(i)||/||b|| 3.379802290812e-05 > > > > > > > > 68 KSP preconditioned resid norm 9.804213483359e-06 true resid norm 2.489457781760e-04 ||r(i)||/||b|| 3.379778366964e-05 > > > > > > > > 69 KSP preconditioned resid norm 9.748922705476e-06 true resid norm 2.489408473578e-04 ||r(i)||/||b|| 3.379711424383e-05 > > > > > > > > 70 KSP preconditioned resid norm 9.886044523689e-06 true resid norm 2.489514438395e-04 ||r(i)||/||b|| 3.379855286071e-05 > > > > > > > > 71 KSP preconditioned resid norm 1.083888478689e-05 true resid norm 2.489420898851e-04 ||r(i)||/||b|| 3.379728293386e-05 > > > > > > > > 72 KSP preconditioned resid norm 1.106561823757e-05 true resid norm 2.489364778104e-04 ||r(i)||/||b|| 3.379652101821e-05 > > > > > > > > 73 KSP preconditioned resid norm 1.132091515426e-05 true resid norm 2.489456804535e-04 ||r(i)||/||b|| 3.379777040248e-05 > > > > > > > > 
74 KSP preconditioned resid norm 1.330905328963e-05 true resid norm 2.489317458981e-04 ||r(i)||/||b|| 3.379587859660e-05 > > > > > > > > 75 KSP preconditioned resid norm 1.305555302619e-05 true resid norm 2.489320939810e-04 ||r(i)||/||b|| 3.379592585359e-05 > > > > > > > > 76 KSP preconditioned resid norm 1.308083397399e-05 true resid norm 2.489299951581e-04 ||r(i)||/||b|| 3.379564090977e-05 > > > > > > > > 77 KSP preconditioned resid norm 1.320098861853e-05 true resid norm 2.489323669317e-04 ||r(i)||/||b|| 3.379596291036e-05 > > > > > > > > 78 KSP preconditioned resid norm 1.300160788274e-05 true resid norm 2.489306393356e-04 ||r(i)||/||b|| 3.379572836564e-05 > > > > > > > > 79 KSP preconditioned resid norm 1.317651537793e-05 true resid norm 2.489381364970e-04 ||r(i)||/||b|| 3.379674620752e-05 > > > > > > > > 80 KSP preconditioned resid norm 1.309769805765e-05 true resid norm 2.489285056062e-04 ||r(i)||/||b|| 3.379543868279e-05 > > > > > > > > 81 KSP preconditioned resid norm 1.293686496271e-05 true resid norm 2.489347818072e-04 ||r(i)||/||b|| 3.379629076264e-05 > > > > > > > > 82 KSP preconditioned resid norm 1.311788285799e-05 true resid norm 2.489320040215e-04 ||r(i)||/||b|| 3.379591364037e-05 > > > > > > > > 83 KSP preconditioned resid norm 1.313667378798e-05 true resid norm 2.489329437217e-04 ||r(i)||/||b|| 3.379604121748e-05 > > > > > > > > 84 KSP preconditioned resid norm 1.416138205017e-05 true resid norm 2.489266908838e-04 ||r(i)||/||b|| 3.379519230948e-05 > > > > > > > > 85 KSP preconditioned resid norm 1.452253464774e-05 true resid norm 2.489285688375e-04 ||r(i)||/||b|| 3.379544726729e-05 > > > > > > > > 86 KSP preconditioned resid norm 1.426709413370e-05 true resid norm 2.489362313402e-04 ||r(i)||/||b|| 3.379648755651e-05 > > > > > > > > 87 KSP preconditioned resid norm 1.427480849552e-05 true resid norm 2.489378183000e-04 ||r(i)||/||b|| 3.379670300795e-05 > > > > > > > > 88 KSP preconditioned resid norm 1.413870980147e-05 true resid norm 2.489325756118e-04 ||r(i)||/||b|| 3.379599124153e-05 > > > > > > > > 89 KSP preconditioned resid norm 1.353259857657e-05 true resid norm 2.489318968308e-04 ||r(i)||/||b|| 3.379589908776e-05 > > > > > > > > 90 KSP preconditioned resid norm 1.347676448611e-05 true resid norm 2.489332074417e-04 ||r(i)||/||b|| 3.379607702106e-05 > > > > > > > > 91 KSP preconditioned resid norm 1.362825902909e-05 true resid norm 2.489344974971e-04 ||r(i)||/||b|| 3.379625216367e-05 > > > > > > > > 92 KSP preconditioned resid norm 1.346280901052e-05 true resid norm 2.489302570131e-04 ||r(i)||/||b|| 3.379567646016e-05 > > > > > > > > 93 KSP preconditioned resid norm 1.328052169696e-05 true resid norm 2.489346601224e-04 ||r(i)||/||b|| 3.379627424228e-05 > > > > > > > > 94 KSP preconditioned resid norm 1.554682082515e-05 true resid norm 2.489309078759e-04 ||r(i)||/||b|| 3.379576482365e-05 > > > > > > > > 95 KSP preconditioned resid norm 1.557128675775e-05 true resid norm 2.489317143582e-04 ||r(i)||/||b|| 3.379587431462e-05 > > > > > > > > 96 KSP preconditioned resid norm 1.542571813923e-05 true resid norm 2.489319910303e-04 ||r(i)||/||b|| 3.379591187663e-05 > > > > > > > > 97 KSP preconditioned resid norm 1.570516684444e-05 true resid norm 2.489321980894e-04 ||r(i)||/||b|| 3.379593998772e-05 > > > > > > > > 98 KSP preconditioned resid norm 1.600431789899e-05 true resid norm 2.489297450311e-04 ||r(i)||/||b|| 3.379560695162e-05 > > > > > > > > 99 KSP preconditioned resid norm 1.587495554658e-05 true resid norm 2.489339000570e-04 ||r(i)||/||b|| 3.379617105303e-05 > > > > 
> > > > 100 KSP preconditioned resid norm 1.621163002878e-05 true resid norm 2.489299953360e-04 ||r(i)||/||b|| 3.379564093392e-05 > > > > > > > > 101 KSP preconditioned resid norm 1.627060872574e-05 true resid norm 2.489301570161e-04 ||r(i)||/||b|| 3.379566288419e-05 > > > > > > > > 102 KSP preconditioned resid norm 1.622931647243e-05 true resid norm 2.489277930910e-04 ||r(i)||/||b|| 3.379534194913e-05 > > > > > > > > 103 KSP preconditioned resid norm 1.612544300282e-05 true resid norm 2.489317483299e-04 ||r(i)||/||b|| 3.379587892674e-05 > > > > > > > > 104 KSP preconditioned resid norm 1.880131646630e-05 true resid norm 2.489335862583e-04 ||r(i)||/||b|| 3.379612845059e-05 > > > > > > > > 105 KSP preconditioned resid norm 1.880563295793e-05 true resid norm 2.489365017923e-04 ||r(i)||/||b|| 3.379652427408e-05 > > > > > > > > 106 KSP preconditioned resid norm 1.860619184027e-05 true resid norm 2.489362382373e-04 ||r(i)||/||b|| 3.379648849288e-05 > > > > > > > > 107 KSP preconditioned resid norm 1.877134148719e-05 true resid norm 2.489425484523e-04 ||r(i)||/||b|| 3.379734519061e-05 > > > > > > > > 108 KSP preconditioned resid norm 1.914810713538e-05 true resid norm 2.489347415573e-04 ||r(i)||/||b|| 3.379628529818e-05 > > > > > > > > 109 KSP preconditioned resid norm 1.220673255622e-05 true resid norm 2.490196186357e-04 ||r(i)||/||b|| 3.380780851884e-05 > > > > > > > > 110 KSP preconditioned resid norm 1.215819132910e-05 true resid norm 2.490233518072e-04 ||r(i)||/||b|| 3.380831534776e-05 > > > > > > > > 111 KSP preconditioned resid norm 1.196565427400e-05 true resid norm 2.490279438906e-04 ||r(i)||/||b|| 3.380893878570e-05 > > > > > > > > 112 KSP preconditioned resid norm 1.171748185197e-05 true resid norm 2.490242578983e-04 ||r(i)||/||b|| 3.380843836198e-05 > > > > > > > > 113 KSP preconditioned resid norm 1.162855824118e-05 true resid norm 2.490229018536e-04 ||r(i)||/||b|| 3.380825426043e-05 > > > > > > > > 114 KSP preconditioned resid norm 1.175594685689e-05 true resid norm 2.490274328440e-04 ||r(i)||/||b|| 3.380886940415e-05 > > > > > > > > 115 KSP preconditioned resid norm 1.167979454122e-05 true resid norm 2.490271161036e-04 ||r(i)||/||b|| 3.380882640232e-05 > > > > > > > > 116 KSP preconditioned resid norm 1.181010893019e-05 true resid norm 2.490235420657e-04 ||r(i)||/||b|| 3.380834117795e-05 > > > > > > > > 117 KSP preconditioned resid norm 1.175206638194e-05 true resid norm 2.490263165345e-04 ||r(i)||/||b|| 3.380871784992e-05 > > > > > > > > 118 KSP preconditioned resid norm 1.183804125791e-05 true resid norm 2.490221353083e-04 ||r(i)||/||b|| 3.380815019145e-05 > > > > > > > > 119 KSP preconditioned resid norm 1.186426973727e-05 true resid norm 2.490227115336e-04 ||r(i)||/||b|| 3.380822842189e-05 > > > > > > > > 120 KSP preconditioned resid norm 1.181986776689e-05 true resid norm 2.490257884230e-04 ||r(i)||/||b|| 3.380864615159e-05 > > > > > > > > 121 KSP preconditioned resid norm 1.131443277370e-05 true resid norm 2.490259110230e-04 ||r(i)||/||b|| 3.380866279620e-05 > > > > > > > > 122 KSP preconditioned resid norm 1.114920075859e-05 true resid norm 2.490249829382e-04 ||r(i)||/||b|| 3.380853679603e-05 > > > > > > > > 123 KSP preconditioned resid norm 1.082073321672e-05 true resid norm 2.490314084868e-04 ||r(i)||/||b|| 3.380940915187e-05 > > > > > > > > 124 KSP preconditioned resid norm 3.307785860990e-06 true resid norm 2.490613501549e-04 ||r(i)||/||b|| 3.381347414155e-05 > > > > > > > > 125 KSP preconditioned resid norm 3.287051720572e-06 true resid norm 2.490584648195e-04 
||r(i)||/||b|| 3.381308241794e-05 > > > > > > > > 126 KSP preconditioned resid norm 3.286797046069e-06 true resid norm 2.490654396386e-04 ||r(i)||/||b|| 3.381402934473e-05 > > > > > > > > 127 KSP preconditioned resid norm 3.311592899411e-06 true resid norm 2.490627973588e-04 ||r(i)||/||b|| 3.381367061922e-05 > > > > > > > > 128 KSP preconditioned resid norm 3.560993694635e-06 true resid norm 2.490571732816e-04 ||r(i)||/||b|| 3.381290707406e-05 > > > > > > > > 129 KSP preconditioned resid norm 3.411994617661e-06 true resid norm 2.490652122141e-04 ||r(i)||/||b|| 3.381399846875e-05 > > > > > > > > 130 KSP preconditioned resid norm 3.412383310721e-06 true resid norm 2.490633454022e-04 ||r(i)||/||b|| 3.381374502359e-05 > > > > > > > > 131 KSP preconditioned resid norm 3.288320044878e-06 true resid norm 2.490639470096e-04 ||r(i)||/||b|| 3.381382669999e-05 > > > > > > > > 132 KSP preconditioned resid norm 3.273215756565e-06 true resid norm 2.490640390847e-04 ||r(i)||/||b|| 3.381383920043e-05 > > > > > > > > 133 KSP preconditioned resid norm 3.236969051459e-06 true resid norm 2.490678216102e-04 ||r(i)||/||b|| 3.381435272985e-05 > > > > > > > > 134 KSP preconditioned resid norm 3.203260913942e-06 true resid norm 2.490640965346e-04 ||r(i)||/||b|| 3.381384700005e-05 > > > > > > > > 135 KSP preconditioned resid norm 3.224117152353e-06 true resid norm 2.490655026376e-04 ||r(i)||/||b|| 3.381403789770e-05 > > > > > > > > 136 KSP preconditioned resid norm 3.221577997984e-06 true resid norm 2.490684737611e-04 ||r(i)||/||b|| 3.381444126823e-05 > > > > > > > > 137 KSP preconditioned resid norm 3.195936222128e-06 true resid norm 2.490673982333e-04 ||r(i)||/||b|| 3.381429525066e-05 > > > > > > > > 138 KSP preconditioned resid norm 3.207528137426e-06 true resid norm 2.490641247196e-04 ||r(i)||/||b|| 3.381385082655e-05 > > > > > > > > 139 KSP preconditioned resid norm 3.240134271963e-06 true resid norm 2.490615861251e-04 ||r(i)||/||b|| 3.381350617773e-05 > > > > > > > > 140 KSP preconditioned resid norm 2.698833607230e-06 true resid norm 2.490638954889e-04 ||r(i)||/||b|| 3.381381970535e-05 > > > > > > > > 141 KSP preconditioned resid norm 2.599151209137e-06 true resid norm 2.490657106698e-04 ||r(i)||/||b|| 3.381406614091e-05 > > > > > > > > 142 KSP preconditioned resid norm 2.633939920994e-06 true resid norm 2.490707754695e-04 ||r(i)||/||b|| 3.381475375652e-05 > > > > > > > > 143 KSP preconditioned resid norm 2.519609221376e-06 true resid norm 2.490639100480e-04 ||r(i)||/||b|| 3.381382168195e-05 > > > > > > > > 144 KSP preconditioned resid norm 3.768526937684e-06 true resid norm 2.490654096698e-04 ||r(i)||/||b|| 3.381402527606e-05 > > > > > > > > 145 KSP preconditioned resid norm 3.707841943289e-06 true resid norm 2.490630207923e-04 ||r(i)||/||b|| 3.381370095336e-05 > > > > > > > > 146 KSP preconditioned resid norm 3.698827503486e-06 true resid norm 2.490646071561e-04 ||r(i)||/||b|| 3.381391632387e-05 > > > > > > > > 147 KSP preconditioned resid norm 3.642747039615e-06 true resid norm 2.490610990161e-04 ||r(i)||/||b|| 3.381344004604e-05 > > > > > > > > 148 KSP preconditioned resid norm 3.613100087842e-06 true resid norm 2.490617159023e-04 ||r(i)||/||b|| 3.381352379676e-05 > > > > > > > > 149 KSP preconditioned resid norm 3.637646399299e-06 true resid norm 2.490648063023e-04 ||r(i)||/||b|| 3.381394336069e-05 > > > > > > > > 150 KSP preconditioned resid norm 3.640235367864e-06 true resid norm 2.490648516718e-04 ||r(i)||/||b|| 3.381394952022e-05 > > > > > > > > 151 KSP preconditioned resid norm 3.724708848977e-06 
true resid norm 2.490622201040e-04 ||r(i)||/||b|| 3.381359224901e-05 > > > > > > > > 152 KSP preconditioned resid norm 3.665185002770e-06 true resid norm 2.490664302790e-04 ||r(i)||/||b|| 3.381416383766e-05 > > > > > > > > 153 KSP preconditioned resid norm 3.348992579120e-06 true resid norm 2.490655722697e-04 ||r(i)||/||b|| 3.381404735121e-05 > > > > > > > > 154 KSP preconditioned resid norm 3.309431137943e-06 true resid norm 2.490727563300e-04 ||r(i)||/||b|| 3.381502268535e-05 > > > > > > > > 155 KSP preconditioned resid norm 3.299031245428e-06 true resid norm 2.490688392843e-04 ||r(i)||/||b|| 3.381449089298e-05 > > > > > > > > 156 KSP preconditioned resid norm 3.297127463503e-06 true resid norm 2.490642207769e-04 ||r(i)||/||b|| 3.381386386763e-05 > > > > > > > > 157 KSP preconditioned resid norm 3.297370198641e-06 true resid norm 2.490666651723e-04 ||r(i)||/||b|| 3.381419572764e-05 > > > > > > > > 158 KSP preconditioned resid norm 3.290873165210e-06 true resid norm 2.490679189538e-04 ||r(i)||/||b|| 3.381436594557e-05 > > > > > > > > 159 KSP preconditioned resid norm 3.346705292419e-06 true resid norm 2.490617329776e-04 ||r(i)||/||b|| 3.381352611496e-05 > > > > > > > > 160 KSP preconditioned resid norm 3.429583550890e-06 true resid norm 2.490675116236e-04 ||r(i)||/||b|| 3.381431064494e-05 > > > > > > > > 161 KSP preconditioned resid norm 3.425238504679e-06 true resid norm 2.490648199058e-04 ||r(i)||/||b|| 3.381394520756e-05 > > > > > > > > 162 KSP preconditioned resid norm 3.423484857849e-06 true resid norm 2.490723208298e-04 ||r(i)||/||b|| 3.381496356025e-05 > > > > > > > > 163 KSP preconditioned resid norm 3.383655922943e-06 true resid norm 2.490659981249e-04 ||r(i)||/||b|| 3.381410516686e-05 > > > > > > > > 164 KSP preconditioned resid norm 3.477197358452e-06 true resid norm 2.490665979073e-04 ||r(i)||/||b|| 3.381418659549e-05 > > > > > > > > 165 KSP preconditioned resid norm 3.454672202601e-06 true resid norm 2.490651358644e-04 ||r(i)||/||b|| 3.381398810323e-05 > > > > > > > > 166 KSP preconditioned resid norm 3.399075522566e-06 true resid norm 2.490678159511e-04 ||r(i)||/||b|| 3.381435196154e-05 > > > > > > > > 167 KSP preconditioned resid norm 3.305455787400e-06 true resid norm 2.490651924523e-04 ||r(i)||/||b|| 3.381399578581e-05 > > > > > > > > 168 KSP preconditioned resid norm 3.368445533284e-06 true resid norm 2.490688061735e-04 ||r(i)||/||b|| 3.381448639774e-05 > > > > > > > > 169 KSP preconditioned resid norm 2.981519724814e-06 true resid norm 2.490676378334e-04 ||r(i)||/||b|| 3.381432777964e-05 > > > > > > > > 170 KSP preconditioned resid norm 3.034423065539e-06 true resid norm 2.490694458885e-04 ||r(i)||/||b|| 3.381457324777e-05 > > > > > > > > 171 KSP preconditioned resid norm 2.885972780503e-06 true resid norm 2.490688033729e-04 ||r(i)||/||b|| 3.381448601752e-05 > > > > > > > > 172 KSP preconditioned resid norm 2.892491075033e-06 true resid norm 2.490692993765e-04 ||r(i)||/||b|| 3.381455335678e-05 > > > > > > > > 173 KSP preconditioned resid norm 2.921316177611e-06 true resid norm 2.490697629787e-04 ||r(i)||/||b|| 3.381461629709e-05 > > > > > > > > 174 KSP preconditioned resid norm 2.999889222269e-06 true resid norm 2.490707272626e-04 ||r(i)||/||b|| 3.381474721178e-05 > > > > > > > > 175 KSP preconditioned resid norm 2.975590207575e-06 true resid norm 2.490685439925e-04 ||r(i)||/||b|| 3.381445080310e-05 > > > > > > > > 176 KSP preconditioned resid norm 2.983065843597e-06 true resid norm 2.490701883671e-04 ||r(i)||/||b|| 3.381467404937e-05 > > > > > > > > 177 KSP 
preconditioned resid norm 2.965959610245e-06 true resid norm 2.490711538630e-04 ||r(i)||/||b|| 3.381480512861e-05 > > > > > > > > 178 KSP preconditioned resid norm 3.005389788827e-06 true resid norm 2.490751808095e-04 ||r(i)||/||b|| 3.381535184150e-05 > > > > > > > > 179 KSP preconditioned resid norm 2.956581668772e-06 true resid norm 2.490653125636e-04 ||r(i)||/||b|| 3.381401209257e-05 > > > > > > > > 180 KSP preconditioned resid norm 2.937498883661e-06 true resid norm 2.490666056653e-04 ||r(i)||/||b|| 3.381418764874e-05 > > > > > > > > 181 KSP preconditioned resid norm 2.913227475431e-06 true resid norm 2.490682436979e-04 ||r(i)||/||b|| 3.381441003402e-05 > > > > > > > > 182 KSP preconditioned resid norm 3.048172862254e-06 true resid norm 2.490719669872e-04 ||r(i)||/||b|| 3.381491552130e-05 > > > > > > > > 183 KSP preconditioned resid norm 3.023868104933e-06 true resid norm 2.490648745555e-04 ||r(i)||/||b|| 3.381395262699e-05 > > > > > > > > 184 KSP preconditioned resid norm 2.985947506400e-06 true resid norm 2.490638818852e-04 ||r(i)||/||b|| 3.381381785846e-05 > > > > > > > > 185 KSP preconditioned resid norm 2.840032055776e-06 true resid norm 2.490701112392e-04 ||r(i)||/||b|| 3.381466357820e-05 > > > > > > > > 186 KSP preconditioned resid norm 2.229279683815e-06 true resid norm 2.490609220680e-04 ||r(i)||/||b|| 3.381341602292e-05 > > > > > > > > 187 KSP preconditioned resid norm 2.441513276379e-06 true resid norm 2.490674056899e-04 ||r(i)||/||b|| 3.381429626300e-05 > > > > > > > > 188 KSP preconditioned resid norm 2.467046864016e-06 true resid norm 2.490691622632e-04 ||r(i)||/||b|| 3.381453474178e-05 > > > > > > > > 189 KSP preconditioned resid norm 2.482124586361e-06 true resid norm 2.490664992339e-04 ||r(i)||/||b|| 3.381417319923e-05 > > > > > > > > 190 KSP preconditioned resid norm 2.470564926502e-06 true resid norm 2.490617019713e-04 ||r(i)||/||b|| 3.381352190543e-05 > > > > > > > > 191 KSP preconditioned resid norm 2.457947086578e-06 true resid norm 2.490628644250e-04 ||r(i)||/||b|| 3.381367972437e-05 > > > > > > > > 192 KSP preconditioned resid norm 2.469444741724e-06 true resid norm 2.490639416335e-04 ||r(i)||/||b|| 3.381382597011e-05 > > > > > > > > 193 KSP preconditioned resid norm 2.469951525219e-06 true resid norm 2.490599769764e-04 ||r(i)||/||b|| 3.381328771385e-05 > > > > > > > > 194 KSP preconditioned resid norm 2.467486786643e-06 true resid norm 2.490630178622e-04 ||r(i)||/||b|| 3.381370055556e-05 > > > > > > > > 195 KSP preconditioned resid norm 2.409684391404e-06 true resid norm 2.490640302606e-04 ||r(i)||/||b|| 3.381383800245e-05 > > > > > > > > 196 KSP preconditioned resid norm 2.456046691135e-06 true resid norm 2.490637730235e-04 ||r(i)||/||b|| 3.381380307900e-05 > > > > > > > > 197 KSP preconditioned resid norm 2.300015653805e-06 true resid norm 2.490615406913e-04 ||r(i)||/||b|| 3.381350000947e-05 > > > > > > > > 198 KSP preconditioned resid norm 2.238328275301e-06 true resid norm 2.490647641246e-04 ||r(i)||/||b|| 3.381393763449e-05 > > > > > > > > 199 KSP preconditioned resid norm 2.317293820319e-06 true resid norm 2.490641611282e-04 ||r(i)||/||b|| 3.381385576951e-05 > > > > > > > > 200 KSP preconditioned resid norm 2.359590971314e-06 true resid norm 2.490685242974e-04 ||r(i)||/||b|| 3.381444812922e-05 > > > > > > > > 201 KSP preconditioned resid norm 2.311199691596e-06 true resid norm 2.490656791753e-04 ||r(i)||/||b|| 3.381406186510e-05 > > > > > > > > 202 KSP preconditioned resid norm 2.328772904196e-06 true resid norm 2.490651045523e-04 ||r(i)||/||b|| 
3.381398385220e-05 > > > > > > > > 203 KSP preconditioned resid norm 2.332731604717e-06 true resid norm 2.490649960574e-04 ||r(i)||/||b|| 3.381396912253e-05 > > > > > > > > 204 KSP preconditioned resid norm 2.357629383490e-06 true resid norm 2.490686317727e-04 ||r(i)||/||b|| 3.381446272046e-05 > > > > > > > > 205 KSP preconditioned resid norm 2.374856180299e-06 true resid norm 2.490645897176e-04 ||r(i)||/||b|| 3.381391395637e-05 > > > > > > > > 206 KSP preconditioned resid norm 2.340395514404e-06 true resid norm 2.490618341127e-04 ||r(i)||/||b|| 3.381353984542e-05 > > > > > > > > 207 KSP preconditioned resid norm 2.314963680954e-06 true resid norm 2.490676153984e-04 ||r(i)||/||b|| 3.381432473379e-05 > > > > > > > > 208 KSP preconditioned resid norm 2.448070953106e-06 true resid norm 2.490644606776e-04 ||r(i)||/||b|| 3.381389643743e-05 > > > > > > > > 209 KSP preconditioned resid norm 2.428805110632e-06 true resid norm 2.490635817597e-04 ||r(i)||/||b|| 3.381377711234e-05 > > > > > > > > 210 KSP preconditioned resid norm 2.537929937808e-06 true resid norm 2.490680589404e-04 ||r(i)||/||b|| 3.381438495066e-05 > > > > > > > > 211 KSP preconditioned resid norm 2.515909029682e-06 true resid norm 2.490687803038e-04 ||r(i)||/||b|| 3.381448288557e-05 > > > > > > > > 212 KSP preconditioned resid norm 2.497907513266e-06 true resid norm 2.490618016885e-04 ||r(i)||/||b|| 3.381353544340e-05 > > > > > > > > 213 KSP preconditioned resid norm 1.783501869502e-06 true resid norm 2.490632647470e-04 ||r(i)||/||b|| 3.381373407354e-05 > > > > > > > > 214 KSP preconditioned resid norm 1.767420653144e-06 true resid norm 2.490685569328e-04 ||r(i)||/||b|| 3.381445255992e-05 > > > > > > > > 215 KSP preconditioned resid norm 1.854926068272e-06 true resid norm 2.490609365464e-04 ||r(i)||/||b|| 3.381341798856e-05 > > > > > > > > 216 KSP preconditioned resid norm 1.818308539774e-06 true resid norm 2.490639142283e-04 ||r(i)||/||b|| 3.381382224948e-05 > > > > > > > > 217 KSP preconditioned resid norm 1.809431578070e-06 true resid norm 2.490605125049e-04 ||r(i)||/||b|| 3.381336041915e-05 > > > > > > > > 218 KSP preconditioned resid norm 1.789862735999e-06 true resid norm 2.490564024901e-04 ||r(i)||/||b|| 3.381280242859e-05 > > > > > > > > 219 KSP preconditioned resid norm 1.769239890163e-06 true resid norm 2.490647825316e-04 ||r(i)||/||b|| 3.381394013349e-05 > > > > > > > > 220 KSP preconditioned resid norm 1.780760773109e-06 true resid norm 2.490622606663e-04 ||r(i)||/||b|| 3.381359775589e-05 > > > > > > > > 221 KSP preconditioned resid norm 5.009024913368e-07 true resid norm 2.490659101637e-04 ||r(i)||/||b|| 3.381409322492e-05 > > > > > > > > 222 KSP preconditioned resid norm 4.974450322799e-07 true resid norm 2.490714287402e-04 ||r(i)||/||b|| 3.381484244693e-05 > > > > > > > > 223 KSP preconditioned resid norm 4.938819481519e-07 true resid norm 2.490665661715e-04 ||r(i)||/||b|| 3.381418228693e-05 > > > > > > > > 224 KSP preconditioned resid norm 4.973231831266e-07 true resid norm 2.490725000995e-04 ||r(i)||/||b|| 3.381498789855e-05 > > > > > > > > 225 KSP preconditioned resid norm 5.086864036771e-07 true resid norm 2.490664132954e-04 ||r(i)||/||b|| 3.381416153192e-05 > > > > > > > > 226 KSP preconditioned resid norm 5.046954570561e-07 true resid norm 2.490698772594e-04 ||r(i)||/||b|| 3.381463181226e-05 > > > > > > > > 227 KSP preconditioned resid norm 5.086852920874e-07 true resid norm 2.490703544723e-04 ||r(i)||/||b|| 3.381469660041e-05 > > > > > > > > 228 KSP preconditioned resid norm 5.182381756169e-07 true resid norm 
2.490665200032e-04 ||r(i)||/||b|| 3.381417601896e-05 > > > > > > > > 229 KSP preconditioned resid norm 5.261455182896e-07 true resid norm 2.490697169472e-04 ||r(i)||/||b|| 3.381461004770e-05 > > > > > > > > 230 KSP preconditioned resid norm 5.265262522400e-07 true resid norm 2.490726890541e-04 ||r(i)||/||b|| 3.381501355172e-05 > > > > > > > > 231 KSP preconditioned resid norm 5.220652263946e-07 true resid norm 2.490689325236e-04 ||r(i)||/||b|| 3.381450355149e-05 > > > > > > > > 232 KSP preconditioned resid norm 5.256466259888e-07 true resid norm 2.490694033989e-04 ||r(i)||/||b|| 3.381456747923e-05 > > > > > > > > 233 KSP preconditioned resid norm 5.443022648374e-07 true resid norm 2.490650183144e-04 ||r(i)||/||b|| 3.381397214423e-05 > > > > > > > > 234 KSP preconditioned resid norm 5.562619006436e-07 true resid norm 2.490764576883e-04 ||r(i)||/||b|| 3.381552519520e-05 > > > > > > > > 235 KSP preconditioned resid norm 5.998148629545e-07 true resid norm 2.490714032716e-04 ||r(i)||/||b|| 3.381483898922e-05 > > > > > > > > 236 KSP preconditioned resid norm 6.498977322955e-07 true resid norm 2.490650270144e-04 ||r(i)||/||b|| 3.381397332537e-05 > > > > > > > > 237 KSP preconditioned resid norm 6.503686003429e-07 true resid norm 2.490706976108e-04 ||r(i)||/||b|| 3.381474318615e-05 > > > > > > > > 238 KSP preconditioned resid norm 6.566719023119e-07 true resid norm 2.490664107559e-04 ||r(i)||/||b|| 3.381416118714e-05 > > > > > > > > 239 KSP preconditioned resid norm 6.549737473208e-07 true resid norm 2.490721547909e-04 ||r(i)||/||b|| 3.381494101821e-05 > > > > > > > > 240 KSP preconditioned resid norm 6.616898981418e-07 true resid norm 2.490679659838e-04 ||r(i)||/||b|| 3.381437233053e-05 > > > > > > > > 241 KSP preconditioned resid norm 6.829917691021e-07 true resid norm 2.490728328614e-04 ||r(i)||/||b|| 3.381503307553e-05 > > > > > > > > 242 KSP preconditioned resid norm 7.030239869389e-07 true resid norm 2.490706345115e-04 ||r(i)||/||b|| 3.381473461955e-05 > > > > > > > > 243 KSP preconditioned resid norm 7.018435683340e-07 true resid norm 2.490650978460e-04 ||r(i)||/||b|| 3.381398294172e-05 > > > > > > > > 244 KSP preconditioned resid norm 7.058047080376e-07 true resid norm 2.490685975642e-04 ||r(i)||/||b|| 3.381445807618e-05 > > > > > > > > 245 KSP preconditioned resid norm 6.896300385099e-07 true resid norm 2.490708566380e-04 ||r(i)||/||b|| 3.381476477625e-05 > > > > > > > > 246 KSP preconditioned resid norm 7.093960074437e-07 true resid norm 2.490667427871e-04 ||r(i)||/||b|| 3.381420626490e-05 > > > > > > > > 247 KSP preconditioned resid norm 7.817121711853e-07 true resid norm 2.490692299030e-04 ||r(i)||/||b|| 3.381454392480e-05 > > > > > > > > 248 KSP preconditioned resid norm 7.976109778309e-07 true resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > > > > > 249 KSP preconditioned resid norm 7.855322750445e-07 true resid norm 2.490720861966e-04 ||r(i)||/||b|| 3.381493170560e-05 > > > > > > > > 250 KSP preconditioned resid norm 7.778531114042e-07 true resid norm 2.490673235034e-04 ||r(i)||/||b|| 3.381428510506e-05 > > > > > > > > 251 KSP preconditioned resid norm 7.848682182070e-07 true resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > > > > > 252 KSP preconditioned resid norm 7.967291867330e-07 true resid norm 2.490724820229e-04 ||r(i)||/||b|| 3.381498544442e-05 > > > > > > > > 253 KSP preconditioned resid norm 7.865012959525e-07 true resid norm 2.490666662028e-04 ||r(i)||/||b|| 3.381419586754e-05 > > > > > > > > 254 KSP preconditioned resid norm 
7.656025385804e-07 true resid norm 2.490686283214e-04 ||r(i)||/||b|| 3.381446225190e-05 > > > > > > > > 255 KSP preconditioned resid norm 7.757018653468e-07 true resid norm 2.490655983763e-04 ||r(i)||/||b|| 3.381405089553e-05 > > > > > > > > 256 KSP preconditioned resid norm 6.686490372981e-07 true resid norm 2.490715698964e-04 ||r(i)||/||b|| 3.381486161081e-05 > > > > > > > > 257 KSP preconditioned resid norm 6.596005109428e-07 true resid norm 2.490666403003e-04 ||r(i)||/||b|| 3.381419235092e-05 > > > > > > > > 258 KSP preconditioned resid norm 6.681742296333e-07 true resid norm 2.490683725835e-04 ||r(i)||/||b|| 3.381442753198e-05 > > > > > > > > 259 KSP preconditioned resid norm 1.089245482033e-06 true resid norm 2.490688086568e-04 ||r(i)||/||b|| 3.381448673488e-05 > > > > > > > > 260 KSP preconditioned resid norm 1.099844873189e-06 true resid norm 2.490690703265e-04 ||r(i)||/||b|| 3.381452226011e-05 > > > > > > > > 261 KSP preconditioned resid norm 1.112925540869e-06 true resid norm 2.490664481058e-04 ||r(i)||/||b|| 3.381416625790e-05 > > > > > > > > 262 KSP preconditioned resid norm 1.113056910480e-06 true resid norm 2.490658753273e-04 ||r(i)||/||b|| 3.381408849541e-05 > > > > > > > > 263 KSP preconditioned resid norm 1.104801535149e-06 true resid norm 2.490736510776e-04 ||r(i)||/||b|| 3.381514415953e-05 > > > > > > > > 264 KSP preconditioned resid norm 1.158709147873e-06 true resid norm 2.490607531152e-04 ||r(i)||/||b|| 3.381339308528e-05 > > > > > > > > 265 KSP preconditioned resid norm 1.178985740182e-06 true resid norm 2.490727895619e-04 ||r(i)||/||b|| 3.381502719703e-05 > > > > > > > > 266 KSP preconditioned resid norm 1.165130533478e-06 true resid norm 2.490639076693e-04 ||r(i)||/||b|| 3.381382135901e-05 > > > > > > > > 267 KSP preconditioned resid norm 1.181364114499e-06 true resid norm 2.490667871436e-04 ||r(i)||/||b|| 3.381421228690e-05 > > > > > > > > 268 KSP preconditioned resid norm 1.170295348543e-06 true resid norm 2.490662613306e-04 ||r(i)||/||b|| 3.381414090063e-05 > > > > > > > > 269 KSP preconditioned resid norm 1.213243016230e-06 true resid norm 2.490666173719e-04 ||r(i)||/||b|| 3.381418923808e-05 > > > > > > > > 270 KSP preconditioned resid norm 1.239691953997e-06 true resid norm 2.490678323197e-04 ||r(i)||/||b|| 3.381435418381e-05 > > > > > > > > 271 KSP preconditioned resid norm 1.219891740100e-06 true resid norm 2.490625009256e-04 ||r(i)||/||b|| 3.381363037437e-05 > > > > > > > > 272 KSP preconditioned resid norm 1.231321334346e-06 true resid norm 2.490659733696e-04 ||r(i)||/||b|| 3.381410180599e-05 > > > > > > > > 273 KSP preconditioned resid norm 1.208183234158e-06 true resid norm 2.490685987255e-04 ||r(i)||/||b|| 3.381445823385e-05 > > > > > > > > 274 KSP preconditioned resid norm 1.211768545589e-06 true resid norm 2.490671548953e-04 ||r(i)||/||b|| 3.381426221421e-05 > > > > > > > > 275 KSP preconditioned resid norm 1.209433459842e-06 true resid norm 2.490669016096e-04 ||r(i)||/||b|| 3.381422782722e-05 > > > > > > > > 276 KSP preconditioned resid norm 1.223729184405e-06 true resid norm 2.490658128014e-04 ||r(i)||/||b|| 3.381408000666e-05 > > > > > > > > 277 KSP preconditioned resid norm 1.243915201868e-06 true resid norm 2.490693375756e-04 ||r(i)||/||b|| 3.381455854282e-05 > > > > > > > > 278 KSP preconditioned resid norm 1.231994655529e-06 true resid norm 2.490682988311e-04 ||r(i)||/||b|| 3.381441751910e-05 > > > > > > > > 279 KSP preconditioned resid norm 1.227930683777e-06 true resid norm 2.490667825866e-04 ||r(i)||/||b|| 3.381421166823e-05 > > > > > > > > 
280 KSP preconditioned resid norm 1.193458846469e-06 true resid norm 2.490687366117e-04 ||r(i)||/||b|| 3.381447695378e-05 > > > > > > > > 281 KSP preconditioned resid norm 1.217089059805e-06 true resid norm 2.490674797371e-04 ||r(i)||/||b|| 3.381430631591e-05 > > > > > > > > 282 KSP preconditioned resid norm 1.249318287709e-06 true resid norm 2.490662866951e-04 ||r(i)||/||b|| 3.381414434420e-05 > > > > > > > > 283 KSP preconditioned resid norm 1.183320029547e-06 true resid norm 2.490645783630e-04 ||r(i)||/||b|| 3.381391241482e-05 > > > > > > > > 284 KSP preconditioned resid norm 1.174730603102e-06 true resid norm 2.490686881647e-04 ||r(i)||/||b|| 3.381447037643e-05 > > > > > > > > 285 KSP preconditioned resid norm 1.175838261923e-06 true resid norm 2.490665969300e-04 ||r(i)||/||b|| 3.381418646281e-05 > > > > > > > > 286 KSP preconditioned resid norm 1.188946188368e-06 true resid norm 2.490661974622e-04 ||r(i)||/||b|| 3.381413222961e-05 > > > > > > > > 287 KSP preconditioned resid norm 1.177848565707e-06 true resid norm 2.490660236206e-04 ||r(i)||/||b|| 3.381410862824e-05 > > > > > > > > 288 KSP preconditioned resid norm 1.200075508281e-06 true resid norm 2.490645353536e-04 ||r(i)||/||b|| 3.381390657571e-05 > > > > > > > > 289 KSP preconditioned resid norm 1.184589570618e-06 true resid norm 2.490664920355e-04 ||r(i)||/||b|| 3.381417222195e-05 > > > > > > > > 290 KSP preconditioned resid norm 1.221114703873e-06 true resid norm 2.490670597538e-04 ||r(i)||/||b|| 3.381424929746e-05 > > > > > > > > 291 KSP preconditioned resid norm 1.249479658256e-06 true resid norm 2.490641582876e-04 ||r(i)||/||b|| 3.381385538385e-05 > > > > > > > > 292 KSP preconditioned resid norm 1.245768496850e-06 true resid norm 2.490704480588e-04 ||r(i)||/||b|| 3.381470930606e-05 > > > > > > > > 293 KSP preconditioned resid norm 1.243742607953e-06 true resid norm 2.490649690604e-04 ||r(i)||/||b|| 3.381396545733e-05 > > > > > > > > 294 KSP preconditioned resid norm 1.342758483339e-06 true resid norm 2.490676207432e-04 ||r(i)||/||b|| 3.381432545942e-05 > > > > > > > > 295 KSP preconditioned resid norm 1.353816099600e-06 true resid norm 2.490695263153e-04 ||r(i)||/||b|| 3.381458416681e-05 > > > > > > > > 296 KSP preconditioned resid norm 1.343886763293e-06 true resid norm 2.490673674307e-04 ||r(i)||/||b|| 3.381429106879e-05 > > > > > > > > 297 KSP preconditioned resid norm 1.355511022815e-06 true resid norm 2.490686565995e-04 ||r(i)||/||b|| 3.381446609103e-05 > > > > > > > > 298 KSP preconditioned resid norm 1.347247627243e-06 true resid norm 2.490696287707e-04 ||r(i)||/||b|| 3.381459807652e-05 > > > > > > > > 299 KSP preconditioned resid norm 1.414742595618e-06 true resid norm 2.490749815091e-04 ||r(i)||/||b|| 3.381532478374e-05 > > > > > > > > 300 KSP preconditioned resid norm 1.418560683189e-06 true resid norm 2.490721501153e-04 ||r(i)||/||b|| 3.381494038343e-05 > > > > > > > > 301 KSP preconditioned resid norm 1.416276404923e-06 true resid norm 2.490689576447e-04 ||r(i)||/||b|| 3.381450696203e-05 > > > > > > > > 302 KSP preconditioned resid norm 1.431448272112e-06 true resid norm 2.490688812701e-04 ||r(i)||/||b|| 3.381449659312e-05 > > > > > > > > 303 KSP preconditioned resid norm 1.446154958969e-06 true resid norm 2.490727536322e-04 ||r(i)||/||b|| 3.381502231909e-05 > > > > > > > > 304 KSP preconditioned resid norm 1.468860617921e-06 true resid norm 2.490692363788e-04 ||r(i)||/||b|| 3.381454480397e-05 > > > > > > > > 305 KSP preconditioned resid norm 1.627595214971e-06 true resid norm 2.490687019603e-04 ||r(i)||/||b|| 
3.381447224938e-05
> > > > > > > > 306 KSP preconditioned resid norm 1.614384672893e-06 true resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05
> > > > > > > > 307 KSP preconditioned resid norm 1.605568020532e-06 true resid norm 2.490699757693e-04 ||r(i)||/||b|| 3.381464518632e-05
> > > > > > > > [... iterations 308-344 omitted: the preconditioned residual oscillates between ~1.3e-06 and ~2.0e-06 while the true residual stays pinned near 2.4907e-04 ...]
> > > > > > > > 345 KSP preconditioned resid norm 1.981005134253e-06 true resid norm 2.490748677339e-04 ||r(i)||/||b|| 3.381530933721e-05
> > > > > > > > 346 KSP preconditioned resid norm 1.959802663114e-06 true resid norm 2.490773752317e-04 ||r(i)||/||b|| 3.381564976423e-05
> > > > > > > >
> > > > > > > > On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley wrote:
> > > > > > > > On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
> > > > > > > >
> > > > > > > > ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
> > > > > > > > ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);
> > > > > > > >
> > > > > > > > ierr = VecAssemblyBegin(x);
> > > > > > > > ierr = VecAssemblyEnd(x);
> > > > > > > >
> > > > > > > > This is probably unnecessary
> > > > > > > >
> > > > > > > > ierr = VecAssemblyBegin(b);
> > > > > > > > ierr = VecAssemblyEnd(b);
> > > > > > > >
> > > > > > > > This is probably unnecessary
> > > > > > > >
> > > > > > > > ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp);
> > > > > > > > ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8
> > > > > > > >
> > > > > > > > Is your rhs consistent with this nullspace?
> > > > > > > >
> > > > > > > > // KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
> > > > > > > > KSPSetOperators(ksp,A,A);
> > > > > > > >
> > > > > > > > KSPSetType(ksp,KSPBCGS);
> > > > > > > >
> > > > > > > > KSPSetComputeSingularValues(ksp, PETSC_TRUE);
> > > > > > > > #if defined(__HYPRE__)
> > > > > > > > KSPGetPC(ksp, &pc);
> > > > > > > > PCSetType(pc, PCHYPRE);
> > > > > > > > PCHYPRESetType(pc,"boomeramg");
> > > > > > > >
> > > > > > > > This is terribly unnecessary. You just use
> > > > > > > >
> > > > > > > >   -pc_type hypre -pc_hypre_type boomeramg
> > > > > > > >
> > > > > > > > or
> > > > > > > >
> > > > > > > >   -pc_type gamg
> > > > > > > >
> > > > > > > > #else
> > > > > > > > KSPSetType(ksp,KSPBCGSL);
> > > > > > > > KSPBCGSLSetEll(ksp,2);
> > > > > > > > #endif /* defined(__HYPRE__) */
> > > > > > > >
> > > > > > > > KSPSetFromOptions(ksp);
> > > > > > > > KSPSetUp(ksp);
> > > > > > > >
> > > > > > > > ierr = KSPSolve(ksp,b,x);
> > > > > > > >
> > > > > > > > command line
> > > > > > > >
> > > > > > > > You did not provide any of what I asked for in the previous mail.
> > > > > > > >
> > > > > > > >    Matt
> > > > > > > >
> > > > > > > > On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley wrote:
> > > > > > > > On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
> > > > > > > > hi,
> > > > > > > >
> > > > > > > > I implemented a HYPRE preconditioner for my study because, without a preconditioner, the PETSc solver takes thousands of iterations to converge on a fine-grid simulation.
> > > > > > > >
> > > > > > > > with HYPRE, depending on the parallel partition, it takes HYPRE forever to do anything; the output file shows the simulation hanging with no output.
> > > > > > > >
> > > > > > > > Any idea what happened? will post a snippet of code.
> > > > > > > >
> > > > > > > > 1) For any question about convergence, we need to see the output of
> > > > > > > >
> > > > > > > >   -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason
> > > > > > > >
> > > > > > > > 2) Hypre has many preconditioners, which one are you talking about
> > > > > > > >
> > > > > > > > 3) PETSc has some preconditioners in common with Hypre, like AMG
> > > > > > > >
> > > > > > > >   Thanks,
> > > > > > > >
> > > > > > > >      Matt
> > > > > > > >
> > > > > > > > --
> > > > > > > > Hao Zhang
> > > > > > > > Dept. of Applied Mathematics and Statistics,
> > > > > > > > Stony Brook University,
> > > > > > > > Stony Brook, New York, 11790
> > > > > > > >
> > > > > > > > --
> > > > > > > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> > > > > > > > -- Norbert Wiener
> > > > > > > >
> > > > > > > > https://www.cse.buffalo.edu/~knepley/
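Matt's suggestion above amounts to hard-coding nothing about the solver. A minimal sketch of that options-driven setup (not Hao's actual code: the function and variable names are hypothetical, and CHKERRQ error checking is omitted for brevity) might look like:

#include <petscksp.h>

/* Sketch: options-driven pressure solve. The source fixes only the
   operator and the null space; KSP and PC types come from the command
   line, so hypre/gamg or gmres/bcgs can be switched without recompiling. */
PetscErrorCode SolvePressure(Mat A, Vec b, Vec x)
{
  KSP          ksp;
  MatNullSpace nullsp;

  /* constant null space of the all-Neumann pressure operator */
  MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nullsp);
  MatSetNullSpace(A, nullsp);
  MatNullSpaceRemove(nullsp, b);  /* Matt's point: keep the rhs consistent */

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPSetFromOptions(ksp);         /* everything else from the command line */
  KSPSolve(ksp, b, x);

  MatNullSpaceDestroy(&nullsp);
  KSPDestroy(&ksp);
  return 0;
}

Run with, e.g., -ksp_type gmres -pc_type hypre -pc_hypre_type boomeramg -ksp_monitor_true_residual -ksp_converged_reason, or swap in -pc_type gamg with no code change.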
From mfadams at lbl.gov  Mon Oct 23 10:38:15 2017
From: mfadams at lbl.gov (Mark Adams)
Date: Mon, 23 Oct 2017 11:38:15 -0400
Subject: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To: 
References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov>
 <64D7FC90-EBE8-409C-B300-04A118155A98@mcs.anl.gov>
 <5DDD356C-2000-4803-9F3A-02B429898A7A@mcs.anl.gov>
 <8C8ADFC3-C71F-4DE1-B21C-5BF50EBC652D@mcs.anl.gov>
Message-ID: 

On Mon, Oct 23, 2017 at 11:36 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>   I'm confused. Is hypre + GMRES ever not working fine for you? Why not
> just always use hypre + gmres; no basic solver is going to be faster for
> large problems ever.

I assume the problem is that as he refines the mesh the solver dies *and*
the solution goes bad. Right?

>    Barry
>
> > On Oct 23, 2017, at 9:09 AM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
> >
> > Yes.
> >
> > On Mon, Oct 23, 2017 at 10:07 AM, Mark Adams wrote:
> >
> > On Mon, Oct 23, 2017 at 10:04 AM, Hao Zhang wrote:
> > The big picture: I'm solving the 3d incompressible Navier-Stokes
> > equations on a staggered/MAC grid with the finite difference method. The
> > particular function is the Poisson pressure solver, the Laplacian Mark
> > Adams mentioned. The simulation runs fine on a medium-size mesh. When I
> > push to a very fine grid (not DNS level), I have difficulty getting
> > meaningful physical results. All PETSc solvers converge but need a huge
> > number of iterations. That's when/how I started using HYPRE.
> >
> > Are we just talking about the convergence of the pressure solve?
> >
> > On Mon, Oct 23, 2017 at 9:01 AM, Mark Adams wrote:
> > Just to be clear: 1) are you solving the Laplacian (div grad), 2) what
> > type of discretization are you using, and 3) do you have stretched or
> > terrible grids in some way?
> >
> > On Sun, Oct 22, 2017 at 3:57 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >
> >   One thing important to understand is that multigrid is an optimal or
> > nearly optimal algorithm. This means that, when it works, the number of
> > iterations remains nearly constant as you refine the mesh, regardless of
> > the problem size and number of processes. Simple preconditioners such as
> > ILU, block Jacobi, and one-level additive Schwarz have iteration counts
> > that increase with the problem size, and likely also with the number of
> > processes. Thus these algorithms become essentially impractical for very
> > large problems, while multigrid can remain practical (when it works).
> >
> >    Good luck
> >
> >    Barry
> >
> > > On Oct 22, 2017, at 2:35 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
> > >
> > > Thanks for all the inputs. Before simulating on the finer grid, HYPRE
> > > wasn't used and the simulations were just fine. I will do a few tests
> > > and post more information later.
> > >
> > > On Sun, Oct 22, 2017 at 12:11 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> > >
> > > > On Oct 21, 2017, at 11:16 PM, Hao Zhang wrote:
> > > >
> > > > the reason is that when I do a finer-grid simulation, the matrix
> > > > becomes stiffer.
> > >
> > >    Are you saying that for a finer grid, but everything else the same,
> > > the convergence of hypre (with the same GMRES) with the same options
> > > gets much worse? This normally will not happen; that is the fundamental
> > > beauty of multigrid methods (when they work well).
> > >
> > >    Yes, the matrix condition number increases, but multigrid doesn't
> > > care about that; its number of iterations should remain pretty much the
> > > same.
> > >
> > >    Something must be different (with this finer grid case): either the
> > > mesh becomes horrible, or the physics changes, or there are errors in
> > > the code that lead to the problem.
> > >
> > >    What happens if you just refine the mesh a little? Then a little
> > > more? Then a little more? Does the convergence rate suddenly go bad at
> > > some point, or does it just get worse slowly?
> > >
> > >    Barry
> > >
> > > > Much larger condition number. Just to give you a perspective: it
> > > > takes 6000 iterations to converge, and the solver does converge. I
> > > > want to reduce the number of iterations while keeping the convergence
> > > > rate. That's the main motivation for so much heavy lifting. Please
> > > > advise; a snippet will be provided upon request.
> > > >
> > > > Thanks again.
> > > >
> > > > On Sun, Oct 22, 2017 at 12:08 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> > > >
> > > >    Oh, you change KSP but not hypre. I did not understand this.
> > > >
> > > >    Why not just use GMRES all the time? Why mess with BCGS if it is
> > > > not robust? Not worth the small optimization if it breaks everything.
> > > >
> > > >    Barry
> > > >
> > > > > On Oct 21, 2017, at 11:05 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
> > > > >
> > > > > this is the initial pressure solver output regarding the use of
> > > > > PETSc. It failed to converge after 40000 iterations; GMRES is then
> > > > > used.
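The bcgs-then-gmres fall-back described here can be expressed with KSPGetConvergedReason. A sketch, assuming a single KSP object is reused (illustrative code, not taken from Hao's solver):

  KSPConvergedReason reason;

  KSPSetType(ksp, KSPBCGS);
  KSPSolve(ksp, b, x);
  KSPGetConvergedReason(ksp, &reason);
  if (reason < 0) {        /* any KSP_DIVERGED_* value, e.g. DIVERGED_ITS at 40000 */
    KSPSetType(ksp, KSPGMRES);
    VecZeroEntries(x);     /* the logs below show "initial guess is zero" */
    KSPSolve(ksp, b, x);
  }

As Barry says above, simply using GMRES throughout removes the need for this machinery. The log of the failing bcgs attempt follows.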
> > > > > 39987 KSP preconditioned resid norm 3.853125986269e-08 true resid norm 1.147359212216e-05 ||r(i)||/||b|| 1.557696568706e-06
> > > > > 39988 KSP preconditioned resid norm 3.853126044003e-08 true resid norm 1.147359257282e-05 ||r(i)||/||b|| 1.557696629889e-06
> > > > > [... iterations 39989-39998 omitted: both residuals essentially unchanged ...]
> > > > > 39999 KSP preconditioned resid norm 3.853127937068e-08 true resid norm 1.147359264043e-05 ||r(i)||/||b|| 1.557696639067e-06
> > > > > 40000 KSP preconditioned resid norm 3.853122732867e-08 true resid norm 1.147359257776e-05 ||r(i)||/||b|| 1.557696630559e-06
> > > > > Linear solve did not converge due to DIVERGED_ITS iterations 40000
> > > > > KSP Object: 24 MPI processes
> > > > >   type: bcgs
> > > > >   maximum iterations=40000, initial guess is zero
> > > > >   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
> > > > >   left preconditioning
> > > > >   using PRECONDITIONED norm type for convergence test
> > > > > PC Object: 24 MPI processes
> > > > >   type: hypre
> > > > >     HYPRE BoomerAMG preconditioning
> > > > >       Cycle type V
> > > > >       Maximum number of levels 25
> > > > >       Maximum number of iterations PER hypre call 1
> > > > >       Convergence tolerance PER hypre call 0.
> > > > >       Threshold for strong coupling 0.25
> > > > >       Interpolation truncation factor 0.
> > > > >       Interpolation: max elements per row 0
> > > > >       Number of levels of aggressive coarsening 0
> > > > >       Number of paths for aggressive coarsening 1
> > > > >       Maximum row sums 0.9
> > > > >       Sweeps down 1
> > > > >       Sweeps up 1
> > > > >       Sweeps on coarse 1
> > > > >       Relax down symmetric-SOR/Jacobi
> > > > >       Relax up symmetric-SOR/Jacobi
> > > > >       Relax on coarse Gaussian-elimination
> > > > >       Relax weight (all) 1.
> > > > >       Outer relax weight (all) 1.
> > > > >       Using CF-relaxation
> > > > >       Not using more complex smoothers.
> > > > >       Measure type local
> > > > >       Coarsen type Falgout
> > > > >       Interpolation type classical
> > > > >   linear system matrix = precond matrix:
> > > > >   Mat Object: A 24 MPI processes
> > > > >     type: mpiaij
> > > > >     rows=497664, cols=497664
> > > > >     total: nonzeros=3363552, allocated nonzeros=6967296
> > > > >     total number of mallocs used during MatSetValues calls =0
> > > > >       has attached null space
> > > > >       not using I-node (on process 0) routines
> > > > >
> > > > > The solution diverges for p0! The residual is 3.853123e-08. Solve again using GMRES!
> > > > > KSP Object: 24 MPI processes
> > > > >   type: gmres
> > > > >     restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
> > > > >     happy breakdown tolerance 1e-30
> > > > >   maximum iterations=40000, initial guess is zero
> > > > >   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
> > > > >   left preconditioning
> > > > >   using PRECONDITIONED norm type for convergence test
> > > > > PC Object: 24 MPI processes
> > > > >   [hypre BoomerAMG options and matrix identical to the view above]
> > > > > 0 KSP preconditioned resid norm 1.593802941804e+01 true resid norm 7.365742695119e+00 ||r(i)||/||b|| 1.000000000000e+00
> > > > > 1 KSP preconditioned resid norm 6.338666661133e-01 true resid norm 2.880722209358e+00 ||r(i)||/||b|| 3.910973174867e-01
> > > > > 2 KSP preconditioned resid norm 3.913420828350e-02 true resid norm 9.544089760671e-01 ||r(i)||/||b|| 1.295740315093e-01
> > > > > 3 KSP preconditioned resid norm 2.928070366435e-03 true resid norm 1.874294004628e-01 ||r(i)||/||b|| 2.544609664236e-02
> > > > > 4 KSP preconditioned resid norm 2.165607525823e-04 true resid norm 3.121122463949e-02 ||r(i)||/||b|| 4.237349298146e-03
> > > > > 5 KSP preconditioned resid norm 1.635476480407e-05 true resid norm 3.984315313831e-03 ||r(i)||/||b|| 5.409251284967e-04
> > > > > 6 KSP preconditioned resid norm 1.283358350575e-06 true resid norm 4.566583802915e-04 ||r(i)||/||b|| 6.199760149022e-05
> > > > > 7 KSP preconditioned resid norm 8.479469225747e-08 true resid norm 3.824581791112e-05 ||r(i)||/||b|| 5.192391248810e-06
> > > > > 8 KSP preconditioned resid norm 5.358636504683e-09 true resid norm 2.829730442033e-06 ||r(i)||/||b|| 3.841744898187e-07
> > > > > 9 KSP preconditioned resid norm 3.447874504193e-10 true resid norm 1.756617036538e-07 ||r(i)||/||b|| 2.384847135242e-08
> > > > > 10 KSP preconditioned resid norm 2.480228743414e-11 true resid norm 1.309399823577e-08 ||r(i)||/||b|| 1.777688792258e-09
> > > > > 11 KSP preconditioned resid norm 1.728967759950e-12 true resid norm 9.406967045789e-10 ||r(i)||/||b|| 1.277124036931e-10
> > > > > 12 KSP preconditioned resid norm 1.075458632828e-13 true resid norm 5.994505136212e-11 ||r(i)||/||b|| 8.138358050689e-12
> > > > > Linear solve converged due to CONVERGED_RTOL iterations 12
> > > > > [KSP/PC view repeated: gmres + hypre BoomerAMG, identical to the view above]
> > > > > The max singular value of A = 1.000872 in poisson_solver3d_P0_vd
> > > > > The min singular value of A = 0.667688 in poisson_solver3d_P0_vd
> > > > > The Cond Num of A = 1.499012 in poisson_solver3d_P0_vd
> > > > > In poisson_solver3d_pressure(): num_iter = 12, rel_residual = 1.075459e-13
> > > > >
> > > > > The max value of p0 is 0.03115845493408858
> > > > >
> > > > > The min value of p0 is -0.07156715468428149
> > > > >
> > > > > On Sun, Oct 22, 2017 at 12:00 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> > > > >
> > > > > > On Oct 21, 2017, at 10:50 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
> > > > > >
> > > > > > the incompressible NS solver algorithm calls the PETSc solver at
> > > > > > different stages of each time step. The one you were saying "This is
> > > > > > good. 12 digit reduction" about is after the initial pressure solve,
> > > > > > in which HYPRE usually doesn't give good convergence, so the
> > > > > > fall-back GMRES solver is called afterwards.
> > > > >
> > > > >    Hmm, I don't understand. hypre should do well on a pressure solve.
> > > > > In fact, very well.
> > > > >
> > > > > > Barry, you were mentioning that I could have a wrong nullspace.
> > > > > > That particular solver is meant to give an initial pressure profile
> > > > > > for a 3d incompressible NS simulation using all Neumann boundary
> > > > > > conditions. Could you give some insight on how to test whether I
> > > > > > have a wrong nullspace, etc.?
> > > > >
> > > > >    -ksp_test_null_space
> > > > >
> > > > >    But if your null space is consistently from all Neumann boundary
> > > > > conditions then it likely is not wrong.
> > > > >
> > > > >    Barry
> > > > >
> > > > > > Thanks!
> > > > > >
> > > > > > On Sat, Oct 21, 2017 at 11:41 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> > > > > >
> > > > > >   This is good. You get more than a 12 digit reduction in the true
> > > > > > residual norm. This is good AMG convergence. Expected when everything
> > > > > > goes well.
> > > > > >
> > > > > >   What is different in this case from the previous case that does
> > > > > > not converge reasonably?
> > > > > >
> > > > > >   Barry
> > > > > >
> > > > > > > On Oct 21, 2017, at 9:29 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
> > > > > > >
> > > > > > > Barry, please advise what you make of this. This is the Poisson
> > > > > > > solver with all Neumann BCs, 3d case; a finite difference scheme
> > > > > > > was used. Thanks! I'm in learning mode.
> > > > > > >
> > > > > > > KSP Object: 24 MPI processes
> > > > > > >   type: bcgs
> > > > > > >   [tolerances, hypre BoomerAMG options, and matrix identical to the views above]
> > > > > > > 0 KSP preconditioned resid norm 2.697270170623e-02 true resid norm 9.637159071207e+00 ||r(i)||/||b|| 1.000000000000e+00
> > > > > > > 1 KSP preconditioned resid norm 4.828857674609e-04 true resid norm 6.294379664645e-01 ||r(i)||/||b|| 6.531364293291e-02
> > > > > > > 2 KSP preconditioned resid norm 4.533649595815e-06 true resid norm 1.135508605857e-02 ||r(i)||/||b|| 1.178260727531e-03
> > > > > > > 3 KSP preconditioned resid norm 1.131704082606e-07 true resid norm 1.196393029874e-04 ||r(i)||/||b|| 1.241437462051e-05
> > > > > > > 4 KSP preconditioned resid norm 3.866281379569e-10 true resid norm 5.121520801846e-07 ||r(i)||/||b|| 5.314347064320e-08
> > > > > > > 5 KSP preconditioned resid norm 1.070114785241e-11 true resid norm 1.203693733135e-08 ||r(i)||/||b|| 1.249013038221e-09
> > > > > > > 6 KSP preconditioned resid norm 2.578780418765e-14 true resid norm 6.020297525927e-11 ||r(i)||/||b|| 6.246962908306e-12
> > > > > > > 7 KSP preconditioned resid norm 8.691764190203e-16 true resid norm 1.866088098154e-12 ||r(i)||/||b|| 1.936346680973e-13
> > > > > > > Linear solve converged due to CONVERGED_ATOL iterations 7
> > > > > > >
> > > > > > > On Sat, Oct 21, 2017 at 6:53 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> > > > > > >
> > > > > > > > On Oct 21, 2017, at 5:47 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
> > > > > > > >
> > > > > > > > hi, Barry:
> > > > > > > > what do you mean by "absurd" in setting tolerance = 1e-14?
> > > > > > >
> > > > > > >    Trying to decrease the initial residual norm down by a factor of
> > > > > > > 1e-14 with an iterative method (or even a direct method) is
> > > > > > > unrealistic, usually unachievable, and almost never necessary. You are
> > > > > > > requiring || r_n || < 1.e-14 || r_0 || when with double precision
> > > > > > > numbers you only have roughly 14 decimal digits total to compute
> > > > > > > with. Round-off alone will lead to differences far larger than 1e-14.
> > > > > > >
> > > > > > >    If you are using the solver in the context of a nonlinear problem
> > > > > > > (i.e. inside Newton's method) then 1.e-6 is generally more than
> > > > > > > plenty to get quadratic convergence of Newton's method.
> > > > > > >
> > > > > > >    If you are solving a linear problem then it is extremely likely
> > > > > > > that errors due to discretization (from the finite element method,
> > > > > > > etc.) and the model are much, much larger than even 1.e-8.
> > > > > > >    So, in summary
> > > > > > >
> > > > > > >    1.e-14 is probably unachievable
> > > > > > >
> > > > > > >    1.e-14 is almost for sure not needed.
> > > > > > >
> > > > > > >    Barry
> > > > > > >
> > > > > > > > On Sat, Oct 21, 2017 at 18:42 Barry Smith <bsmith at mcs.anl.gov> wrote:
> > > > > > > >
> > > > > > > >   Run with -ksp_view_mat binary -ksp_view_rhs binary and send the
> > > > > > > > resulting output file called binaryoutput to petsc-maint at mcs.anl.gov
> > > > > > > >
> > > > > > > >   Note you can also use -ksp_type gmres with hypre; there is
> > > > > > > > unlikely to be a reason to use bcgs
> > > > > > > >
> > > > > > > >   BTW: tolerances: relative=1e-14, is absurd
> > > > > > > >
> > > > > > > >   My guess is your null space is incorrect.
> > > > > > > >
> > > > > > > > > On Oct 21, 2017, at 4:34 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
> > > > > > > > >
> > > > > > > > > if this solver doesn't converge, I have a fall-back solution,
> > > > > > > > > which uses the GMRES solver. This setup is fine with me. I just
> > > > > > > > > want to know whether HYPRE is a reliable solution for me, or
> > > > > > > > > whether I will have to go without a preconditioner.
> > > > > > > > >
> > > > > > > > > Thanks!
> > > > > > > > >
> > > > > > > > > On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang <hbcbh1999 at gmail.com> wrote:
> > > > > > > > > this is a serial run, still dumping output. Parallel is more or
> > > > > > > > > less the same.
> > > > > > > > >
> > > > > > > > > KSP Object: 1 MPI processes
> > > > > > > > >   type: bcgs
> > > > > > > > >   maximum iterations=40000, initial guess is zero
> > > > > > > > >   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
> > > > > > > > >   left preconditioning
> > > > > > > > >   using PRECONDITIONED norm type for convergence test
> > > > > > > > > PC Object: 1 MPI processes
> > > > > > > > >   type: hypre
> > > > > > > > >     [HYPRE BoomerAMG options identical to the views above]
> > > > > > > > >   linear system matrix = precond matrix:
> > > > > > > > >   Mat Object: A 1 MPI processes
> > > > > > > > >     type: seqaij
> > > > > > > > >     rows=497664, cols=497664
> > > > > > > > >     total: nonzeros=3363552, allocated nonzeros=3483648
> > > > > > > > >     total number of mallocs used during MatSetValues calls =0
> > > > > > > > >       has attached null space
> > > > > > > > >       not using I-node routines
> > > > > > > > > 0 KSP preconditioned resid norm 1.630377897956e+01 true resid norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00
> > > > > > > > > 1 KSP preconditioned resid norm 3.815819909205e-02 true resid norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01
> > > > > > > > > 2 KSP preconditioned resid norm 4.312671277701e-04 true resid norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03
> > > > > > > > > 3 KSP preconditioned resid norm 3.011875330569e-04 true resid norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05
> > > > > > > > > 4 KSP preconditioned resid norm 3.011783821295e-04 true resid norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05
> > > > > > > > > [... iterations 5-13 omitted: the preconditioned residual is stuck near 3.01e-04, the true residual near 2.504e-04 ...]
> > > > > > > > > 14 KSP preconditioned resid norm 2.481415420420e-05 true resid norm 2.491591201250e-04 ||r(i)||/||b|| 3.382674774806e-05
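Barry's guess that the null space is incorrect can be probed directly in code, as well as with the -ksp_test_null_space option he mentions above. A sketch, assuming the PETSc 3.8 API and the same constant-null-space setup as in Hao's snippet (names are illustrative):

  MatNullSpace nullsp;
  PetscBool    isNull;
  PetscScalar  sum;

  MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nullsp);
  MatNullSpaceTest(nullsp, A, &isNull);  /* does A annihilate the constant vector? */
  PetscPrintf(PETSC_COMM_WORLD, "constant null space: %s\n", isNull ? "yes" : "no");
  VecSum(b, &sum);  /* for symmetric A, a consistent rhs has sum(b) ~ 0 */
  PetscPrintf(PETSC_COMM_WORLD, "sum(b) = %g\n", (double)PetscRealPart(sum));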
> > > > > > > > > [... iterations 15-243 omitted: the preconditioned residual creeps down from ~2.5e-05 to ~7e-07 while the true residual never leaves ~2.4907e-04 ...]
> > > > > > > > > 244 KSP preconditioned resid norm 7.058047080376e-07 true resid norm 2.490685975642e-04 ||r(i)||/||b|| 3.381445807618e-05
> > > > > > > > > 245 KSP preconditioned resid norm 6.896300385099e-07 true resid norm 2.490708566380e-04 ||r(i)||/||b|| 3.381476477625e-05
> > > > > > > > > 246 KSP preconditioned resid norm 7.093960074437e-07 true resid norm 2.490667427871e-04 ||r(i)||/||b||
3.381420626490e-05 > > > > > > > > > 247 KSP preconditioned resid norm 7.817121711853e-07 true > resid norm 2.490692299030e-04 ||r(i)||/||b|| 3.381454392480e-05 > > > > > > > > > 248 KSP preconditioned resid norm 7.976109778309e-07 true > resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > > > > > > 249 KSP preconditioned resid norm 7.855322750445e-07 true > resid norm 2.490720861966e-04 ||r(i)||/||b|| 3.381493170560e-05 > > > > > > > > > 250 KSP preconditioned resid norm 7.778531114042e-07 true > resid norm 2.490673235034e-04 ||r(i)||/||b|| 3.381428510506e-05 > > > > > > > > > 251 KSP preconditioned resid norm 7.848682182070e-07 true > resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > > > > > > 252 KSP preconditioned resid norm 7.967291867330e-07 true > resid norm 2.490724820229e-04 ||r(i)||/||b|| 3.381498544442e-05 > > > > > > > > > 253 KSP preconditioned resid norm 7.865012959525e-07 true > resid norm 2.490666662028e-04 ||r(i)||/||b|| 3.381419586754e-05 > > > > > > > > > 254 KSP preconditioned resid norm 7.656025385804e-07 true > resid norm 2.490686283214e-04 ||r(i)||/||b|| 3.381446225190e-05 > > > > > > > > > 255 KSP preconditioned resid norm 7.757018653468e-07 true > resid norm 2.490655983763e-04 ||r(i)||/||b|| 3.381405089553e-05 > > > > > > > > > 256 KSP preconditioned resid norm 6.686490372981e-07 true > resid norm 2.490715698964e-04 ||r(i)||/||b|| 3.381486161081e-05 > > > > > > > > > 257 KSP preconditioned resid norm 6.596005109428e-07 true > resid norm 2.490666403003e-04 ||r(i)||/||b|| 3.381419235092e-05 > > > > > > > > > 258 KSP preconditioned resid norm 6.681742296333e-07 true > resid norm 2.490683725835e-04 ||r(i)||/||b|| 3.381442753198e-05 > > > > > > > > > 259 KSP preconditioned resid norm 1.089245482033e-06 true > resid norm 2.490688086568e-04 ||r(i)||/||b|| 3.381448673488e-05 > > > > > > > > > 260 KSP preconditioned resid norm 1.099844873189e-06 true > resid norm 2.490690703265e-04 ||r(i)||/||b|| 3.381452226011e-05 > > > > > > > > > 261 KSP preconditioned resid norm 1.112925540869e-06 true > resid norm 2.490664481058e-04 ||r(i)||/||b|| 3.381416625790e-05 > > > > > > > > > 262 KSP preconditioned resid norm 1.113056910480e-06 true > resid norm 2.490658753273e-04 ||r(i)||/||b|| 3.381408849541e-05 > > > > > > > > > 263 KSP preconditioned resid norm 1.104801535149e-06 true > resid norm 2.490736510776e-04 ||r(i)||/||b|| 3.381514415953e-05 > > > > > > > > > 264 KSP preconditioned resid norm 1.158709147873e-06 true > resid norm 2.490607531152e-04 ||r(i)||/||b|| 3.381339308528e-05 > > > > > > > > > 265 KSP preconditioned resid norm 1.178985740182e-06 true > resid norm 2.490727895619e-04 ||r(i)||/||b|| 3.381502719703e-05 > > > > > > > > > 266 KSP preconditioned resid norm 1.165130533478e-06 true > resid norm 2.490639076693e-04 ||r(i)||/||b|| 3.381382135901e-05 > > > > > > > > > 267 KSP preconditioned resid norm 1.181364114499e-06 true > resid norm 2.490667871436e-04 ||r(i)||/||b|| 3.381421228690e-05 > > > > > > > > > 268 KSP preconditioned resid norm 1.170295348543e-06 true > resid norm 2.490662613306e-04 ||r(i)||/||b|| 3.381414090063e-05 > > > > > > > > > 269 KSP preconditioned resid norm 1.213243016230e-06 true > resid norm 2.490666173719e-04 ||r(i)||/||b|| 3.381418923808e-05 > > > > > > > > > 270 KSP preconditioned resid norm 1.239691953997e-06 true > resid norm 2.490678323197e-04 ||r(i)||/||b|| 3.381435418381e-05 > > > > > > > > > 271 KSP preconditioned resid norm 1.219891740100e-06 true > resid norm 2.490625009256e-04 ||r(i)||/||b|| 
3.381363037437e-05 > > > > > > > > > 272 KSP preconditioned resid norm 1.231321334346e-06 true > resid norm 2.490659733696e-04 ||r(i)||/||b|| 3.381410180599e-05 > > > > > > > > > 273 KSP preconditioned resid norm 1.208183234158e-06 true > resid norm 2.490685987255e-04 ||r(i)||/||b|| 3.381445823385e-05 > > > > > > > > > 274 KSP preconditioned resid norm 1.211768545589e-06 true > resid norm 2.490671548953e-04 ||r(i)||/||b|| 3.381426221421e-05 > > > > > > > > > 275 KSP preconditioned resid norm 1.209433459842e-06 true > resid norm 2.490669016096e-04 ||r(i)||/||b|| 3.381422782722e-05 > > > > > > > > > 276 KSP preconditioned resid norm 1.223729184405e-06 true > resid norm 2.490658128014e-04 ||r(i)||/||b|| 3.381408000666e-05 > > > > > > > > > 277 KSP preconditioned resid norm 1.243915201868e-06 true > resid norm 2.490693375756e-04 ||r(i)||/||b|| 3.381455854282e-05 > > > > > > > > > 278 KSP preconditioned resid norm 1.231994655529e-06 true > resid norm 2.490682988311e-04 ||r(i)||/||b|| 3.381441751910e-05 > > > > > > > > > 279 KSP preconditioned resid norm 1.227930683777e-06 true > resid norm 2.490667825866e-04 ||r(i)||/||b|| 3.381421166823e-05 > > > > > > > > > 280 KSP preconditioned resid norm 1.193458846469e-06 true > resid norm 2.490687366117e-04 ||r(i)||/||b|| 3.381447695378e-05 > > > > > > > > > 281 KSP preconditioned resid norm 1.217089059805e-06 true > resid norm 2.490674797371e-04 ||r(i)||/||b|| 3.381430631591e-05 > > > > > > > > > 282 KSP preconditioned resid norm 1.249318287709e-06 true > resid norm 2.490662866951e-04 ||r(i)||/||b|| 3.381414434420e-05 > > > > > > > > > 283 KSP preconditioned resid norm 1.183320029547e-06 true > resid norm 2.490645783630e-04 ||r(i)||/||b|| 3.381391241482e-05 > > > > > > > > > 284 KSP preconditioned resid norm 1.174730603102e-06 true > resid norm 2.490686881647e-04 ||r(i)||/||b|| 3.381447037643e-05 > > > > > > > > > 285 KSP preconditioned resid norm 1.175838261923e-06 true > resid norm 2.490665969300e-04 ||r(i)||/||b|| 3.381418646281e-05 > > > > > > > > > 286 KSP preconditioned resid norm 1.188946188368e-06 true > resid norm 2.490661974622e-04 ||r(i)||/||b|| 3.381413222961e-05 > > > > > > > > > 287 KSP preconditioned resid norm 1.177848565707e-06 true > resid norm 2.490660236206e-04 ||r(i)||/||b|| 3.381410862824e-05 > > > > > > > > > 288 KSP preconditioned resid norm 1.200075508281e-06 true > resid norm 2.490645353536e-04 ||r(i)||/||b|| 3.381390657571e-05 > > > > > > > > > 289 KSP preconditioned resid norm 1.184589570618e-06 true > resid norm 2.490664920355e-04 ||r(i)||/||b|| 3.381417222195e-05 > > > > > > > > > 290 KSP preconditioned resid norm 1.221114703873e-06 true > resid norm 2.490670597538e-04 ||r(i)||/||b|| 3.381424929746e-05 > > > > > > > > > 291 KSP preconditioned resid norm 1.249479658256e-06 true > resid norm 2.490641582876e-04 ||r(i)||/||b|| 3.381385538385e-05 > > > > > > > > > 292 KSP preconditioned resid norm 1.245768496850e-06 true > resid norm 2.490704480588e-04 ||r(i)||/||b|| 3.381470930606e-05 > > > > > > > > > 293 KSP preconditioned resid norm 1.243742607953e-06 true > resid norm 2.490649690604e-04 ||r(i)||/||b|| 3.381396545733e-05 > > > > > > > > > 294 KSP preconditioned resid norm 1.342758483339e-06 true > resid norm 2.490676207432e-04 ||r(i)||/||b|| 3.381432545942e-05 > > > > > > > > > 295 KSP preconditioned resid norm 1.353816099600e-06 true > resid norm 2.490695263153e-04 ||r(i)||/||b|| 3.381458416681e-05 > > > > > > > > > 296 KSP preconditioned resid norm 1.343886763293e-06 true > resid norm 2.490673674307e-04 ||r(i)||/||b|| 
3.381429106879e-05 > > > > > > > > > 297 KSP preconditioned resid norm 1.355511022815e-06 true > resid norm 2.490686565995e-04 ||r(i)||/||b|| 3.381446609103e-05 > > > > > > > > > 298 KSP preconditioned resid norm 1.347247627243e-06 true > resid norm 2.490696287707e-04 ||r(i)||/||b|| 3.381459807652e-05 > > > > > > > > > 299 KSP preconditioned resid norm 1.414742595618e-06 true > resid norm 2.490749815091e-04 ||r(i)||/||b|| 3.381532478374e-05 > > > > > > > > > 300 KSP preconditioned resid norm 1.418560683189e-06 true > resid norm 2.490721501153e-04 ||r(i)||/||b|| 3.381494038343e-05 > > > > > > > > > 301 KSP preconditioned resid norm 1.416276404923e-06 true > resid norm 2.490689576447e-04 ||r(i)||/||b|| 3.381450696203e-05 > > > > > > > > > 302 KSP preconditioned resid norm 1.431448272112e-06 true > resid norm 2.490688812701e-04 ||r(i)||/||b|| 3.381449659312e-05 > > > > > > > > > 303 KSP preconditioned resid norm 1.446154958969e-06 true > resid norm 2.490727536322e-04 ||r(i)||/||b|| 3.381502231909e-05 > > > > > > > > > 304 KSP preconditioned resid norm 1.468860617921e-06 true > resid norm 2.490692363788e-04 ||r(i)||/||b|| 3.381454480397e-05 > > > > > > > > > 305 KSP preconditioned resid norm 1.627595214971e-06 true > resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 > > > > > > > > > 306 KSP preconditioned resid norm 1.614384672893e-06 true > resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 > > > > > > > > > 307 KSP preconditioned resid norm 1.605568020532e-06 true > resid norm 2.490699757693e-04 ||r(i)||/||b|| 3.381464518632e-05 > > > > > > > > > 308 KSP preconditioned resid norm 1.617069685075e-06 true > resid norm 2.490649282923e-04 ||r(i)||/||b|| 3.381395992249e-05 > > > > > > > > > 309 KSP preconditioned resid norm 1.654297792738e-06 true > resid norm 2.490644766626e-04 ||r(i)||/||b|| 3.381389860760e-05 > > > > > > > > > 310 KSP preconditioned resid norm 1.587528143215e-06 true > resid norm 2.490696752096e-04 ||r(i)||/||b|| 3.381460438124e-05 > > > > > > > > > 311 KSP preconditioned resid norm 1.662782022388e-06 true > resid norm 2.490699317737e-04 ||r(i)||/||b|| 3.381463921332e-05 > > > > > > > > > 312 KSP preconditioned resid norm 1.618211471748e-06 true > resid norm 2.490735831308e-04 ||r(i)||/||b|| 3.381513493483e-05 > > > > > > > > > 313 KSP preconditioned resid norm 1.609074961921e-06 true > resid norm 2.490679566436e-04 ||r(i)||/||b|| 3.381437106247e-05 > > > > > > > > > 314 KSP preconditioned resid norm 1.548068942878e-06 true > resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 > > > > > > > > > 315 KSP preconditioned resid norm 1.526718322150e-06 true > resid norm 2.490619832967e-04 ||r(i)||/||b|| 3.381356009919e-05 > > > > > > > > > 316 KSP preconditioned resid norm 1.553150959105e-06 true > resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 > > > > > > > > > 317 KSP preconditioned resid norm 1.615015320906e-06 true > resid norm 2.490672348079e-04 ||r(i)||/||b|| 3.381427306343e-05 > > > > > > > > > 318 KSP preconditioned resid norm 1.602904469797e-06 true > resid norm 2.490696731006e-04 ||r(i)||/||b|| 3.381460409491e-05 > > > > > > > > > 319 KSP preconditioned resid norm 1.538140323073e-06 true > resid norm 2.490722982494e-04 ||r(i)||/||b|| 3.381496049466e-05 > > > > > > > > > 320 KSP preconditioned resid norm 1.534779679430e-06 true > resid norm 2.490778499789e-04 ||r(i)||/||b|| 3.381571421763e-05 > > > > > > > > > 321 KSP preconditioned resid norm 1.547155843355e-06 true > resid norm 2.490767612985e-04 ||r(i)||/||b|| 
3.381556641442e-05 > > > > > > > > > 322 KSP preconditioned resid norm 1.422137008870e-06 true > resid norm 2.490737676309e-04 ||r(i)||/||b|| 3.381515998323e-05 > > > > > > > > > 323 KSP preconditioned resid norm 1.403072558954e-06 true > resid norm 2.490741361870e-04 ||r(i)||/||b|| 3.381521001975e-05 > > > > > > > > > 324 KSP preconditioned resid norm 1.373070436118e-06 true > resid norm 2.490742214990e-04 ||r(i)||/||b|| 3.381522160202e-05 > > > > > > > > > 325 KSP preconditioned resid norm 1.359547585233e-06 true > resid norm 2.490792987570e-04 ||r(i)||/||b|| 3.381591090902e-05 > > > > > > > > > 326 KSP preconditioned resid norm 1.370351913612e-06 true > resid norm 2.490727161158e-04 ||r(i)||/||b|| 3.381501722573e-05 > > > > > > > > > 327 KSP preconditioned resid norm 1.365238666187e-06 true > resid norm 2.490716949642e-04 ||r(i)||/||b|| 3.381487859046e-05 > > > > > > > > > 328 KSP preconditioned resid norm 1.369073373042e-06 true > resid norm 2.490807360288e-04 ||r(i)||/||b|| 3.381610603826e-05 > > > > > > > > > 329 KSP preconditioned resid norm 1.426698981572e-06 true > resid norm 2.490791479521e-04 ||r(i)||/||b|| 3.381589043520e-05 > > > > > > > > > 330 KSP preconditioned resid norm 1.445542403570e-06 true > resid norm 2.490775981409e-04 ||r(i)||/||b|| 3.381568002720e-05 > > > > > > > > > 331 KSP preconditioned resid norm 1.464506963984e-06 true > resid norm 2.490740562430e-04 ||r(i)||/||b|| 3.381519916626e-05 > > > > > > > > > 332 KSP preconditioned resid norm 1.461462964401e-06 true > resid norm 2.490768016856e-04 ||r(i)||/||b|| 3.381557189753e-05 > > > > > > > > > 333 KSP preconditioned resid norm 1.476680847971e-06 true > resid norm 2.490744321516e-04 ||r(i)||/||b|| 3.381525020097e-05 > > > > > > > > > 334 KSP preconditioned resid norm 1.459640372198e-06 true > resid norm 2.490788817993e-04 ||r(i)||/||b|| 3.381585430132e-05 > > > > > > > > > 335 KSP preconditioned resid norm 1.790770882365e-06 true > resid norm 2.490771711471e-04 ||r(i)||/||b|| 3.381562205697e-05 > > > > > > > > > 336 KSP preconditioned resid norm 1.803770155018e-06 true > resid norm 2.490768953858e-04 ||r(i)||/||b|| 3.381558461860e-05 > > > > > > > > > 337 KSP preconditioned resid norm 1.787821255995e-06 true > resid norm 2.490767985676e-04 ||r(i)||/||b|| 3.381557147421e-05 > > > > > > > > > 338 KSP preconditioned resid norm 1.749912220831e-06 true > resid norm 2.490760198704e-04 ||r(i)||/||b|| 3.381546575545e-05 > > > > > > > > > 339 KSP preconditioned resid norm 1.802915839010e-06 true > resid norm 2.490815556273e-04 ||r(i)||/||b|| 3.381621730993e-05 > > > > > > > > > 340 KSP preconditioned resid norm 1.800777670709e-06 true > resid norm 2.490823909286e-04 ||r(i)||/||b|| 3.381633071347e-05 > > > > > > > > > 341 KSP preconditioned resid norm 1.962516327690e-06 true > resid norm 2.490773477410e-04 ||r(i)||/||b|| 3.381564603199e-05 > > > > > > > > > 342 KSP preconditioned resid norm 1.981726465132e-06 true > resid norm 2.490769116884e-04 ||r(i)||/||b|| 3.381558683191e-05 > > > > > > > > > 343 KSP preconditioned resid norm 1.963419167052e-06 true > resid norm 2.490764009914e-04 ||r(i)||/||b|| 3.381551749783e-05 > > > > > > > > > 344 KSP preconditioned resid norm 1.992082169278e-06 true > resid norm 2.490806829883e-04 ||r(i)||/||b|| 3.381609883728e-05 > > > > > > > > > 345 KSP preconditioned resid norm 1.981005134253e-06 true > resid norm 2.490748677339e-04 ||r(i)||/||b|| 3.381530933721e-05 > > > > > > > > > 346 KSP preconditioned resid norm 1.959802663114e-06 true > resid norm 2.490773752317e-04 ||r(i)||/||b|| 
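A stagnating true residual like the one above, with the preconditioned residual still wandering, is the classic signature of an inconsistent right-hand side for a singular (all-Neumann) operator. A minimal sketch of the check and the fix, assuming the constant vector spans the null space; the names A, b, and nullsp follow the snippet quoted below, and the code is an illustration, not taken from the thread:

    MatNullSpace nullsp;
    PetscBool    isNull;

    ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,NULL,&nullsp);
    ierr = MatNullSpaceTest(nullsp,A,&isNull);   /* does A really annihilate the constant vector? */
    ierr = MatSetNullSpace(A,nullsp);            /* lets KSP project the null space out of the iteration */
    ierr = MatNullSpaceRemove(nullsp,b);         /* makes b consistent: removes its component in the null space */
    ierr = MatNullSpaceDestroy(&nullsp);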
> > > > > > > > > > > > > > > > > > On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley < knepley at gmail.com> wrote: > > > > > > > > > On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang < hbcbh1999 at gmail.com> wrote: > > > > > > > > > ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY); > > > > > > > > > ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY); > > > > > > > > > > > > > > > > > > ierr = VecAssemblyBegin(x); > > > > > > > > > ierr = VecAssemblyEnd(x); > > > > > > > > > This is probably unnecessary > > > > > > > > > > > > > > > > > > ierr = VecAssemblyBegin(b); > > > > > > > > > ierr = VecAssemblyEnd(b); > > > > > > > > > This is probably unnecessary > > > > > > > > > > > > > > > > > > ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp); > > > > > > > > > ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8 > > > > > > > > > Is your rhs consistent with this nullspace? > > > > > > > > > > > > > > > > > > // KSPSetOperators(ksp,A,A, DIFFERENT_NONZERO_PATTERN); > > > > > > > > > KSPSetOperators(ksp,A,A); > > > > > > > > > > > > > > > > > > KSPSetType(ksp,KSPBCGS); > > > > > > > > > > > > > > > > > > KSPSetComputeSingularValues(ksp, PETSC_TRUE); > > > > > > > > > #if defined(__HYPRE__) > > > > > > > > > KSPGetPC(ksp, &pc); > > > > > > > > > PCSetType(pc, PCHYPRE); > > > > > > > > > PCHYPRESetType(pc,"boomeramg"); > > > > > > > > > This is terribly unnecessary. You just use > > > > > > > > > > > > > > > > > > -pc_type hypre -pc_hypre_type boomeramg > > > > > > > > > > > > > > > > > > or > > > > > > > > > > > > > > > > > > -pc_type gamg > > > > > > > > > > > > > > > > > > #else > > > > > > > > > KSPSetType(ksp,KSPBCGSL); > > > > > > > > > KSPBCGSLSetEll(ksp,2); > > > > > > > > > #endif /* defined(__HYPRE__) */ > > > > > > > > > > > > > > > > > > KSPSetFromOptions(ksp); > > > > > > > > > KSPSetUp(ksp); > > > > > > > > > > > > > > > > > > ierr = KSPSolve(ksp,b,x); > > > > > > > > > > > > > > > > > > command line > > > > > > > > > > > > > > > > > > You did not provide any of what I asked for in the previous mail. > > > > > > > > > > > > > > > > > > Matt > > > > > > > > > > > > > > > > > > On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley < knepley at gmail.com> wrote: > > > > > > > > > On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang < hbcbh1999 at gmail.com> wrote: > > > > > > > > > hi, > > > > > > > > > > > > > > > > > > I implemented the HYPRE preconditioner for my study because, without a preconditioner, the PETSc solver takes thousands of iterations to converge for a fine-grid simulation. > > > > > > > > > > > > > > > > > > with HYPRE, depending on the parallel partition, it takes HYPRE forever to do anything; the output file shows the simulation hanging with no output. > > > > > > > > > > > > > > > > > > Any idea what happened? I will post a snippet of code. > > > > > > > > > > > > > > > > > > 1) For any question about convergence, we need to see the output of > > > > > > > > > > > > > > > > > > -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason > > > > > > > > > > > > > > > > > > 2) Hypre has many preconditioners, which one are you talking about? > > > > > > > > > > > > > > > > > > 3) PETSc has some preconditioners in common with Hypre, like AMG > > > > > > > > > > > > > > > > > > Thanks, > > > > > > > > > > > > > > > > > > Matt > > > > > > > > > -- > > > > > > > > > Hao Zhang > > > > > > > > > Dept. of Applied Mathematics and Statistics, > > > > > > > > > Stony Brook University, > > > > > > > > > Stony Brook, New York, 11790 > > > > > > > > > -- > > > > > > > > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > > > > > > > > > -- Norbert Wiener > > > > > > > > > https://www.cse.buffalo.edu/~knepley/
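Matt's point above is that the preconditioner should be chosen from the options database rather than hardcoded. A minimal sketch of that options-driven setup, assuming the usual A, b, and x from the snippet above; this is an illustration, not code from the thread:

    KSP ksp;

    ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);
    ierr = KSPSetOperators(ksp,A,A);
    ierr = KSPSetFromOptions(ksp);   /* picks up -ksp_type, -pc_type hypre -pc_hypre_type boomeramg, -ksp_rtol, ... */
    ierr = KSPSolve(ksp,b,x);
    ierr = KSPDestroy(&ksp);

The same binary can then switch between hypre, gamg, or plain Krylov runs purely from the command line.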
-------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Mon Oct 23 10:51:06 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 23 Oct 2017 10:51:06 -0500 Subject: [petsc-users] HYPRE hanging or slow?
from observation In-Reply-To: References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov> <64D7FC90-EBE8-409C-B300-04A118155A98@mcs.anl.gov> <5DDD356C-2000-4803-9F3A-02B429898A7A@mcs.anl.gov> <8C8ADFC3-C71F-4DE1-B21C-5BF50EBC652D@mcs.anl.gov> Message-ID: <1FF3BF35-124D-4754-B1BE-62DD8FFBD5B0@mcs.anl.gov> > On Oct 23, 2017, at 10:38 AM, Mark Adams wrote: > > On Mon, Oct 23, 2017 at 11:36 AM, Barry Smith wrote: > > I'm confused. Is hypre + GMRES ever not working fine for you? Why not just always use hypre + gmres; no basic solver is going to be faster for large problems ever. > > I assume the problem is that as he refines the mesh the solver dies *and* the solution goes bad. Right? > He never said this when using hypre + gmres; he said this happened before they tried hypre. > Barry > > > On Oct 23, 2017, at 9:09 AM, Hao Zhang wrote: > > > Yes. > > > On Mon, Oct 23, 2017 at 10:07 AM, Mark Adams wrote: > > > On Mon, Oct 23, 2017 at 10:04 AM, Hao Zhang wrote: > > The big picture is that I'm solving the 3d incompressible Navier-Stokes equations on a staggered/MAC grid with a finite difference method. The particular function is the Poisson pressure solver, the Laplacian Mark Adams mentioned. The simulation runs fine for medium-size mesh grids. When I push to a very fine grid (not DNS level), I have some difficulty getting meaningful physical results. All PETSc solvers converge but take a huge number of iterations. That's when/how I started using HYPRE. > > Are we just talking about the convergence of the pressure solve? > > On Mon, Oct 23, 2017 at 9:01 AM, Mark Adams wrote: > > Just to be clear: 1) are you solving the Laplacian (div grad), 2) what type of discretizations are you using? and 3) do you have stretched or terrible grids in some way? > > On Sun, Oct 22, 2017 at 3:57 PM, Barry Smith wrote: > > One thing important to understand is that multigrid is an optimal or nearly optimal algorithm. This means, when it works, as you refine the mesh the number of iterations remains nearly constant, regardless of the problem size and number of processes. Simple preconditioners such as ILU, block Jacobi, one-level additive Schwarz etc have iterations that increase with the problem size and likely also with the number of processes. Thus these algorithms become essentially impractical for very large problems while multigrid can remain practical (when it works). > > Good luck > > Barry > > > On Oct 22, 2017, at 2:35 PM, Hao Zhang wrote: > > > Thanks for all the inputs. Before simulating on the finer grid, HYPRE wasn't used and the simulations were just fine. I will do a few tests and post more information later. > > > On Sun, Oct 22, 2017 at 12:11 PM, Barry Smith wrote: > > > > On Oct 21, 2017, at 11:16 PM, Hao Zhang wrote: > > > > the reason is that when I do the finer grid simulation, the matrix becomes stiffer. > > > Are you saying that for a finer grid but everything else the same, the convergence of hypre (with the same GMRES) with the same options gets much worse? This normally will not happen, that is the fundamental beauty of multigrid methods (when they work well). > > > Yes, the matrix condition number increases but multigrid doesn't care about that, its number of iterations should remain pretty much the same. > > > Something must be different (with this finer grid case), either the mesh becomes horrible, or the physics changes, or there are errors in the code that lead to the problem.
> > > What happens if you just refine the mesh a little? Then a little more? Then a little more? Does the convergence rate suddenly go bad at some point, or does it just get worse slowly? > > > Barry > > > > Much larger condition number. Just to give you a perspective: it takes 6000 iterations to converge, and the solver does converge. I want to reduce the number of iterations while keeping the convergence rate. That's the main drive for all this heavy lifting. Please advise; a snippet will be provided upon request. > > > > Thanks again. > > > > On Sun, Oct 22, 2017 at 12:08 AM, Barry Smith wrote: > > > > Oh, you change KSP but not hypre. I did not understand this. > > > > Why not just use GMRES all the time? Why mess with BCGS if it is not robust? Not worth the small optimization if it breaks everything. > > > > Barry > > > > > On Oct 21, 2017, at 11:05 PM, Hao Zhang wrote: > > > > > this is the initial pressure solver output regarding the use of PETSc. It failed to converge after 40000 iterations, so GMRES is used next. > > > > > [ KSP monitor lines for iterations 39987-39999 omitted: both residuals are flat, with the preconditioned residual stuck at ~3.8531e-08 and the true residual at ~1.1474e-05 ] > > > > > 40000 KSP preconditioned resid norm 3.853122732867e-08 true resid norm 1.147359257776e-05 ||r(i)||/||b|| 1.557696630559e-06 > > > > > Linear solve did not converge due to DIVERGED_ITS iterations 40000 > > > > > KSP Object: 24 MPI processes > > > > > type: bcgs > > > > > maximum iterations=40000, initial guess is zero > > > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000.
> > > > > left preconditioning > > > > > using PRECONDITIONED norm type for convergence test > > > > > PC Object: 24 MPI processes > > > > > type: hypre > > > > > HYPRE BoomerAMG preconditioning > > > > > Cycle type V > > > > > Maximum number of levels 25 > > > > > Maximum number of iterations PER hypre call 1 > > > > > Convergence tolerance PER hypre call 0. > > > > > Threshold for strong coupling 0.25 > > > > > Interpolation truncation factor 0. > > > > > Interpolation: max elements per row 0 > > > > > Number of levels of aggressive coarsening 0 > > > > > Number of paths for aggressive coarsening 1 > > > > > Maximum row sums 0.9 > > > > > Sweeps down 1 > > > > > Sweeps up 1 > > > > > Sweeps on coarse 1 > > > > > Relax down symmetric-SOR/Jacobi > > > > > Relax up symmetric-SOR/Jacobi > > > > > Relax on coarse Gaussian-elimination > > > > > Relax weight (all) 1. > > > > > Outer relax weight (all) 1. > > > > > Using CF-relaxation > > > > > Not using more complex smoothers. > > > > > Measure type local > > > > > Coarsen type Falgout > > > > > Interpolation type classical > > > > > linear system matrix = precond matrix: > > > > > Mat Object: A 24 MPI processes > > > > > type: mpiaij > > > > > rows=497664, cols=497664 > > > > > total: nonzeros=3363552, allocated nonzeros=6967296 > > > > > total number of mallocs used during MatSetValues calls =0 > > > > > has attached null space > > > > > not using I-node (on process 0) routines > > > > > > > > > > The solution diverges for p0! The residual is 3.853123e-08. Solve again using GMRES! > > > > > KSP Object: 24 MPI processes > > > > > type: gmres > > > > > restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement > > > > > happy breakdown tolerance 1e-30 > > > > > maximum iterations=40000, initial guess is zero > > > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. > > > > > left preconditioning > > > > > using PRECONDITIONED norm type for convergence test > > > > > PC Object: 24 MPI processes > > > > > type: hypre > > > > > HYPRE BoomerAMG preconditioning > > > > > Cycle type V > > > > > Maximum number of levels 25 > > > > > Maximum number of iterations PER hypre call 1 > > > > > Convergence tolerance PER hypre call 0. > > > > > Threshold for strong coupling 0.25 > > > > > Interpolation truncation factor 0. > > > > > Interpolation: max elements per row 0 > > > > > Number of levels of aggressive coarsening 0 > > > > > Number of paths for aggressive coarsening 1 > > > > > Maximum row sums 0.9 > > > > > Sweeps down 1 > > > > > Sweeps up 1 > > > > > Sweeps on coarse 1 > > > > > Relax down symmetric-SOR/Jacobi > > > > > Relax up symmetric-SOR/Jacobi > > > > > Relax on coarse Gaussian-elimination > > > > > Relax weight (all) 1. > > > > > Outer relax weight (all) 1. > > > > > Using CF-relaxation > > > > > Not using more complex smoothers. 
> > > > > Measure type local > > > > > Coarsen type Falgout > > > > > Interpolation type classical > > > > > linear system matrix = precond matrix: > > > > > Mat Object: A 24 MPI processes > > > > > type: mpiaij > > > > > rows=497664, cols=497664 > > > > > total: nonzeros=3363552, allocated nonzeros=6967296 > > > > > total number of mallocs used during MatSetValues calls =0 > > > > > has attached null space > > > > > not using I-node (on process 0) routines > > > > > 0 KSP preconditioned resid norm 1.593802941804e+01 true resid norm 7.365742695119e+00 ||r(i)||/||b|| 1.000000000000e+00 > > > > > 1 KSP preconditioned resid norm 6.338666661133e-01 true resid norm 2.880722209358e+00 ||r(i)||/||b|| 3.910973174867e-01 > > > > > 2 KSP preconditioned resid norm 3.913420828350e-02 true resid norm 9.544089760671e-01 ||r(i)||/||b|| 1.295740315093e-01 > > > > > 3 KSP preconditioned resid norm 2.928070366435e-03 true resid norm 1.874294004628e-01 ||r(i)||/||b|| 2.544609664236e-02 > > > > > 4 KSP preconditioned resid norm 2.165607525823e-04 true resid norm 3.121122463949e-02 ||r(i)||/||b|| 4.237349298146e-03 > > > > > 5 KSP preconditioned resid norm 1.635476480407e-05 true resid norm 3.984315313831e-03 ||r(i)||/||b|| 5.409251284967e-04 > > > > > 6 KSP preconditioned resid norm 1.283358350575e-06 true resid norm 4.566583802915e-04 ||r(i)||/||b|| 6.199760149022e-05 > > > > > 7 KSP preconditioned resid norm 8.479469225747e-08 true resid norm 3.824581791112e-05 ||r(i)||/||b|| 5.192391248810e-06 > > > > > 8 KSP preconditioned resid norm 5.358636504683e-09 true resid norm 2.829730442033e-06 ||r(i)||/||b|| 3.841744898187e-07 > > > > > 9 KSP preconditioned resid norm 3.447874504193e-10 true resid norm 1.756617036538e-07 ||r(i)||/||b|| 2.384847135242e-08 > > > > > 10 KSP preconditioned resid norm 2.480228743414e-11 true resid norm 1.309399823577e-08 ||r(i)||/||b|| 1.777688792258e-09 > > > > > 11 KSP preconditioned resid norm 1.728967759950e-12 true resid norm 9.406967045789e-10 ||r(i)||/||b|| 1.277124036931e-10 > > > > > 12 KSP preconditioned resid norm 1.075458632828e-13 true resid norm 5.994505136212e-11 ||r(i)||/||b|| 8.138358050689e-12 > > > > > Linear solve converged due to CONVERGED_RTOL iterations 12 > > > > > KSP Object: 24 MPI processes > > > > > type: gmres > > > > > restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement > > > > > happy breakdown tolerance 1e-30 > > > > > maximum iterations=40000, initial guess is zero > > > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. > > > > > left preconditioning > > > > > using PRECONDITIONED norm type for convergence test > > > > > PC Object: 24 MPI processes > > > > > type: hypre > > > > > HYPRE BoomerAMG preconditioning > > > > > Cycle type V > > > > > Maximum number of levels 25 > > > > > Maximum number of iterations PER hypre call 1 > > > > > Convergence tolerance PER hypre call 0. > > > > > Threshold for strong coupling 0.25 > > > > > Interpolation truncation factor 0. > > > > > Interpolation: max elements per row 0 > > > > > Number of levels of aggressive coarsening 0 > > > > > Number of paths for aggressive coarsening 1 > > > > > Maximum row sums 0.9 > > > > > Sweeps down 1 > > > > > Sweeps up 1 > > > > > Sweeps on coarse 1 > > > > > Relax down symmetric-SOR/Jacobi > > > > > Relax up symmetric-SOR/Jacobi > > > > > Relax on coarse Gaussian-elimination > > > > > Relax weight (all) 1. > > > > > Outer relax weight (all) 1. > > > > > Using CF-relaxation > > > > > Not using more complex smoothers. 
> > > > > Measure type local > > > > > Coarsen type Falgout > > > > > Interpolation type classical > > > > > linear system matrix = precond matrix: > > > > > Mat Object: A 24 MPI processes > > > > > type: mpiaij > > > > > rows=497664, cols=497664 > > > > > total: nonzeros=3363552, allocated nonzeros=6967296 > > > > > total number of mallocs used during MatSetValues calls =0 > > > > > has attached null space > > > > > not using I-node (on process 0) routines > > > > > The max singular value of A = 1.000872 in poisson_solver3d_P0_vd > > > > > The min singular value of A = 0.667688 in poisson_solver3d_P0_vd > > > > > The Cond Num of A = 1.499012 in poisson_solver3d_P0_vd > > > > > In poisson_solver3d_pressure(): num_iter = 12, rel_residual = 1.075459e-13 > > > > > The max value of p0 is 0.03115845493408858 > > > > > The min value of p0 is -0.07156715468428149 > > > > > On Sun, Oct 22, 2017 at 12:00 AM, Barry Smith wrote: > > > > > > On Oct 21, 2017, at 10:50 PM, Hao Zhang wrote: > > > > > > the incompressible NS solver algorithm calls the PETSc solver at different stages of each time step. The one you were saying "This is good. 12 digit reduction" about is after the initial pressure solver, where HYPRE usually doesn't give good convergence, so the fall-back GMRES solver is called afterwards. > > > > > Hmm, I don't understand. hypre should do well on a pressure solve. In fact, very well. > > > > > > Barry, you were mentioning that I could have a wrong nullspace. That particular solver is meant to give an initial pressure profile for a 3d incompressible NS simulation using all-Neumann boundary conditions. Could you give some insight into how to test whether I have a wrong nullspace, etc.? > > > > > -ksp_test_null_space > > > > > But if your null space is consistently from all Neumann boundary conditions then it likely is not wrong. > > > > > Barry > > > > > > Thanks! > > > > > > On Sat, Oct 21, 2017 at 11:41 PM, Barry Smith wrote: > > > > > > This is good. You get more than a 12 digit reduction in the true residual norm. This is good AMG convergence, expected when everything goes well. > > > > > > What is different in this case from the previous case that does not converge reasonably? > > > > > > Barry > > > > > > > On Oct 21, 2017, at 9:29 PM, Hao Zhang wrote: > > > > > > > Barry, please advise what you make of this? This is the Poisson solver with all-Neumann BC, 3d case; a finite difference scheme was used. > > > > > > > Thanks! I'm in learning mode. > > > > > > > KSP Object: 24 MPI processes > > > > > > > type: bcgs > > > > > > > maximum iterations=40000, initial guess is zero > > > > > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. > > > > > > > left preconditioning > > > > > > > using PRECONDITIONED norm type for convergence test > > > > > > > PC Object: 24 MPI processes > > > > > > > type: hypre > > > > > > > HYPRE BoomerAMG preconditioning [ BoomerAMG options identical to the view shown earlier ]
> > > > > > > linear system matrix = precond matrix: > > > > > > > Mat Object: A 24 MPI processes > > > > > > > type: mpiaij > > > > > > > rows=497664, cols=497664 > > > > > > > total: nonzeros=3363552, allocated nonzeros=6967296 > > > > > > > total number of mallocs used during MatSetValues calls =0 > > > > > > > has attached null space > > > > > > > not using I-node (on process 0) routines > > > > > > > 0 KSP preconditioned resid norm 2.697270170623e-02 true resid norm 9.637159071207e+00 ||r(i)||/||b|| 1.000000000000e+00 > > > > > > > 1 KSP preconditioned resid norm 4.828857674609e-04 true resid norm 6.294379664645e-01 ||r(i)||/||b|| 6.531364293291e-02 > > > > > > > 2 KSP preconditioned resid norm 4.533649595815e-06 true resid norm 1.135508605857e-02 ||r(i)||/||b|| 1.178260727531e-03 > > > > > > > 3 KSP preconditioned resid norm 1.131704082606e-07 true resid norm 1.196393029874e-04 ||r(i)||/||b|| 1.241437462051e-05 > > > > > > > 4 KSP preconditioned resid norm 3.866281379569e-10 true resid norm 5.121520801846e-07 ||r(i)||/||b|| 5.314347064320e-08 > > > > > > > 5 KSP preconditioned resid norm 1.070114785241e-11 true resid norm 1.203693733135e-08 ||r(i)||/||b|| 1.249013038221e-09 > > > > > > > 6 KSP preconditioned resid norm 2.578780418765e-14 true resid norm 6.020297525927e-11 ||r(i)||/||b|| 6.246962908306e-12 > > > > > > > 7 KSP preconditioned resid norm 8.691764190203e-16 true resid norm 1.866088098154e-12 ||r(i)||/||b|| 1.936346680973e-13 > > > > > > > Linear solve converged due to CONVERGED_ATOL iterations 7 > > > > > > > On Sat, Oct 21, 2017 at 6:53 PM, Barry Smith wrote: > > > > > > > > On Oct 21, 2017, at 5:47 PM, Hao Zhang wrote: > > > > > > > > hi, Barry: > > > > > > > > what do you mean by absurd in setting tolerance = 1e-14? > > > > > > > Trying to decrease the initial residual norm by a factor of 1e-14 with an iterative method (or even a direct method) is unrealistic (usually unachievable) and almost never necessary. You are requiring || r_n || < 1.e-14 || r_0 || when with double precision numbers you only have roughly 14 decimal digits total to compute with. Round-off alone will lead to differences far larger than 1e-14. > > > > > > > If you are using the solver in the context of a nonlinear problem (i.e. inside Newton's method) then 1.e-6 is generally more than plenty to get quadratic convergence of Newton's method. > > > > > > > If you are solving a linear problem then it is extremely likely that errors due to discretization (from the finite element method etc) and the model are much, much larger than even 1.e-8.
> > > > > > > So, in summary > > > > > > > 1.e-14 is probably unachievable > > > > > > > 1.e-14 is almost for sure not needed. > > > > > > > Barry
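For concreteness, a minimal sketch of requesting a realistic relative tolerance in the spirit of Barry's summary, equivalent to -ksp_rtol 1.0e-8 on the command line; the ksp handle is assumed from the snippets above, and the exact value is a judgment call rather than something prescribed in the thread:

    ierr = KSPSetTolerances(ksp,1.0e-8,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);
                            /* rtol,  abstol,       dtol,         maxits */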
> > > > > > > > On Sat, Oct 21, 2017 at 18:42 Barry Smith wrote: > > > > > > > > Run with -ksp_view_mat binary -ksp_view_rhs binary and send the resulting output file called binaryoutput to petsc-maint at mcs.anl.gov > > > > > > > > Note you can also use -ksp_type gmres with hypre, unlikely to be a reason to use bcgs > > > > > > > > BTW: tolerances: relative=1e-14, is absurd > > > > > > > > My guess is your null space is incorrect. > > > > > > > > > On Oct 21, 2017, at 4:34 PM, Hao Zhang wrote: > > > > > > > > > If this solver doesn't converge, I have a fall-back solution that uses the GMRES solver. This setup is fine with me; I just want to know whether HYPRE is a reliable solution for me, or whether I will have to go without a preconditioner. > > > > > > > > > Thanks! > > > > > > > > > On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang wrote: > > > > > > > > > this is a serial run, still dumping output; parallel is more or less the same. > > > > > > > > > KSP Object: 1 MPI processes > > > > > > > > > type: bcgs > > > > > > > > > maximum iterations=40000, initial guess is zero > > > > > > > > > tolerances: relative=1e-14, absolute=1e-14, divergence=10000. > > > > > > > > > left preconditioning > > > > > > > > > using PRECONDITIONED norm type for convergence test > > > > > > > > > PC Object: 1 MPI processes > > > > > > > > > type: hypre > > > > > > > > > HYPRE BoomerAMG preconditioning [ BoomerAMG options identical to the views above ] > > > > > > > > > linear system matrix = precond matrix: > > > > > > > > > Mat Object: A 1 MPI processes > > > > > > > > > type: seqaij > > > > > > > > > rows=497664, cols=497664 > > > > > > > > > total: nonzeros=3363552, allocated nonzeros=3483648 > > > > > > > > > total number of mallocs used during MatSetValues calls =0 > > > > > > > > > has attached null space > > > > > > > > > not using I-node routines > > > > > > > > > 0 KSP preconditioned resid norm 1.630377897956e+01 true resid norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00 > > > > > > > > > 1 KSP preconditioned resid norm 3.815819909205e-02 true resid norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01 > > > > > > > > > 2 KSP preconditioned resid norm 4.312671277701e-04 true resid norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03 > > > > > > > > > 3 KSP preconditioned resid norm 3.011875330569e-04 true resid norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05 > > > > > > > > > 4 KSP preconditioned resid norm 3.011783821295e-04 true resid norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05 > > > > > > > > > 5 KSP preconditioned resid norm 3.011783818372e-04 true resid norm 2.504053673004e-04 ||r(i)||/||b|| 3.399594279422e-05 > > > > > > > > > 6 KSP preconditioned resid norm 3.011783818375e-04 true resid norm 2.503984078890e-04 ||r(i)||/||b|| 3.399499795925e-05 > > > > > > > > > 7 KSP preconditioned resid norm 3.011783887442e-04 true resid norm 2.504121501772e-04 ||r(i)||/||b|| 3.399686366224e-05
> > > > > > > > > [ KSP monitor lines for iterations 8-98 omitted: the preconditioned residual decays from ~3.0e-04 to the 1e-05 range and then drifts between roughly 8e-06 and 3e-05, while the true residual stays pinned near 2.490e-04 (||r(i)||/||b|| ~ 3.38e-05); the quoted log is truncated at iteration 98 ]
1.600431789899e-05 true resid norm 2.489297450311e-04 ||r(i)||/||b|| 3.379560695162e-05 > > > > > > > > > 99 KSP preconditioned resid norm 1.587495554658e-05 true resid norm 2.489339000570e-04 ||r(i)||/||b|| 3.379617105303e-05 > > > > > > > > > 100 KSP preconditioned resid norm 1.621163002878e-05 true resid norm 2.489299953360e-04 ||r(i)||/||b|| 3.379564093392e-05 > > > > > > > > > 101 KSP preconditioned resid norm 1.627060872574e-05 true resid norm 2.489301570161e-04 ||r(i)||/||b|| 3.379566288419e-05 > > > > > > > > > 102 KSP preconditioned resid norm 1.622931647243e-05 true resid norm 2.489277930910e-04 ||r(i)||/||b|| 3.379534194913e-05 > > > > > > > > > 103 KSP preconditioned resid norm 1.612544300282e-05 true resid norm 2.489317483299e-04 ||r(i)||/||b|| 3.379587892674e-05 > > > > > > > > > 104 KSP preconditioned resid norm 1.880131646630e-05 true resid norm 2.489335862583e-04 ||r(i)||/||b|| 3.379612845059e-05 > > > > > > > > > 105 KSP preconditioned resid norm 1.880563295793e-05 true resid norm 2.489365017923e-04 ||r(i)||/||b|| 3.379652427408e-05 > > > > > > > > > 106 KSP preconditioned resid norm 1.860619184027e-05 true resid norm 2.489362382373e-04 ||r(i)||/||b|| 3.379648849288e-05 > > > > > > > > > 107 KSP preconditioned resid norm 1.877134148719e-05 true resid norm 2.489425484523e-04 ||r(i)||/||b|| 3.379734519061e-05 > > > > > > > > > 108 KSP preconditioned resid norm 1.914810713538e-05 true resid norm 2.489347415573e-04 ||r(i)||/||b|| 3.379628529818e-05 > > > > > > > > > 109 KSP preconditioned resid norm 1.220673255622e-05 true resid norm 2.490196186357e-04 ||r(i)||/||b|| 3.380780851884e-05 > > > > > > > > > 110 KSP preconditioned resid norm 1.215819132910e-05 true resid norm 2.490233518072e-04 ||r(i)||/||b|| 3.380831534776e-05 > > > > > > > > > 111 KSP preconditioned resid norm 1.196565427400e-05 true resid norm 2.490279438906e-04 ||r(i)||/||b|| 3.380893878570e-05 > > > > > > > > > 112 KSP preconditioned resid norm 1.171748185197e-05 true resid norm 2.490242578983e-04 ||r(i)||/||b|| 3.380843836198e-05 > > > > > > > > > 113 KSP preconditioned resid norm 1.162855824118e-05 true resid norm 2.490229018536e-04 ||r(i)||/||b|| 3.380825426043e-05 > > > > > > > > > 114 KSP preconditioned resid norm 1.175594685689e-05 true resid norm 2.490274328440e-04 ||r(i)||/||b|| 3.380886940415e-05 > > > > > > > > > 115 KSP preconditioned resid norm 1.167979454122e-05 true resid norm 2.490271161036e-04 ||r(i)||/||b|| 3.380882640232e-05 > > > > > > > > > 116 KSP preconditioned resid norm 1.181010893019e-05 true resid norm 2.490235420657e-04 ||r(i)||/||b|| 3.380834117795e-05 > > > > > > > > > 117 KSP preconditioned resid norm 1.175206638194e-05 true resid norm 2.490263165345e-04 ||r(i)||/||b|| 3.380871784992e-05 > > > > > > > > > 118 KSP preconditioned resid norm 1.183804125791e-05 true resid norm 2.490221353083e-04 ||r(i)||/||b|| 3.380815019145e-05 > > > > > > > > > 119 KSP preconditioned resid norm 1.186426973727e-05 true resid norm 2.490227115336e-04 ||r(i)||/||b|| 3.380822842189e-05 > > > > > > > > > 120 KSP preconditioned resid norm 1.181986776689e-05 true resid norm 2.490257884230e-04 ||r(i)||/||b|| 3.380864615159e-05 > > > > > > > > > 121 KSP preconditioned resid norm 1.131443277370e-05 true resid norm 2.490259110230e-04 ||r(i)||/||b|| 3.380866279620e-05 > > > > > > > > > 122 KSP preconditioned resid norm 1.114920075859e-05 true resid norm 2.490249829382e-04 ||r(i)||/||b|| 3.380853679603e-05 > > > > > > > > > 123 KSP preconditioned resid norm 1.082073321672e-05 true resid norm 2.490314084868e-04 
||r(i)||/||b|| 3.380940915187e-05 > > > > > > > > > 124 KSP preconditioned resid norm 3.307785860990e-06 true resid norm 2.490613501549e-04 ||r(i)||/||b|| 3.381347414155e-05 > > > > > > > > > 125 KSP preconditioned resid norm 3.287051720572e-06 true resid norm 2.490584648195e-04 ||r(i)||/||b|| 3.381308241794e-05 > > > > > > > > > 126 KSP preconditioned resid norm 3.286797046069e-06 true resid norm 2.490654396386e-04 ||r(i)||/||b|| 3.381402934473e-05 > > > > > > > > > 127 KSP preconditioned resid norm 3.311592899411e-06 true resid norm 2.490627973588e-04 ||r(i)||/||b|| 3.381367061922e-05 > > > > > > > > > 128 KSP preconditioned resid norm 3.560993694635e-06 true resid norm 2.490571732816e-04 ||r(i)||/||b|| 3.381290707406e-05 > > > > > > > > > 129 KSP preconditioned resid norm 3.411994617661e-06 true resid norm 2.490652122141e-04 ||r(i)||/||b|| 3.381399846875e-05 > > > > > > > > > 130 KSP preconditioned resid norm 3.412383310721e-06 true resid norm 2.490633454022e-04 ||r(i)||/||b|| 3.381374502359e-05 > > > > > > > > > 131 KSP preconditioned resid norm 3.288320044878e-06 true resid norm 2.490639470096e-04 ||r(i)||/||b|| 3.381382669999e-05 > > > > > > > > > 132 KSP preconditioned resid norm 3.273215756565e-06 true resid norm 2.490640390847e-04 ||r(i)||/||b|| 3.381383920043e-05 > > > > > > > > > 133 KSP preconditioned resid norm 3.236969051459e-06 true resid norm 2.490678216102e-04 ||r(i)||/||b|| 3.381435272985e-05 > > > > > > > > > 134 KSP preconditioned resid norm 3.203260913942e-06 true resid norm 2.490640965346e-04 ||r(i)||/||b|| 3.381384700005e-05 > > > > > > > > > 135 KSP preconditioned resid norm 3.224117152353e-06 true resid norm 2.490655026376e-04 ||r(i)||/||b|| 3.381403789770e-05 > > > > > > > > > 136 KSP preconditioned resid norm 3.221577997984e-06 true resid norm 2.490684737611e-04 ||r(i)||/||b|| 3.381444126823e-05 > > > > > > > > > 137 KSP preconditioned resid norm 3.195936222128e-06 true resid norm 2.490673982333e-04 ||r(i)||/||b|| 3.381429525066e-05 > > > > > > > > > 138 KSP preconditioned resid norm 3.207528137426e-06 true resid norm 2.490641247196e-04 ||r(i)||/||b|| 3.381385082655e-05 > > > > > > > > > 139 KSP preconditioned resid norm 3.240134271963e-06 true resid norm 2.490615861251e-04 ||r(i)||/||b|| 3.381350617773e-05 > > > > > > > > > 140 KSP preconditioned resid norm 2.698833607230e-06 true resid norm 2.490638954889e-04 ||r(i)||/||b|| 3.381381970535e-05 > > > > > > > > > 141 KSP preconditioned resid norm 2.599151209137e-06 true resid norm 2.490657106698e-04 ||r(i)||/||b|| 3.381406614091e-05 > > > > > > > > > 142 KSP preconditioned resid norm 2.633939920994e-06 true resid norm 2.490707754695e-04 ||r(i)||/||b|| 3.381475375652e-05 > > > > > > > > > 143 KSP preconditioned resid norm 2.519609221376e-06 true resid norm 2.490639100480e-04 ||r(i)||/||b|| 3.381382168195e-05 > > > > > > > > > 144 KSP preconditioned resid norm 3.768526937684e-06 true resid norm 2.490654096698e-04 ||r(i)||/||b|| 3.381402527606e-05 > > > > > > > > > 145 KSP preconditioned resid norm 3.707841943289e-06 true resid norm 2.490630207923e-04 ||r(i)||/||b|| 3.381370095336e-05 > > > > > > > > > 146 KSP preconditioned resid norm 3.698827503486e-06 true resid norm 2.490646071561e-04 ||r(i)||/||b|| 3.381391632387e-05 > > > > > > > > > 147 KSP preconditioned resid norm 3.642747039615e-06 true resid norm 2.490610990161e-04 ||r(i)||/||b|| 3.381344004604e-05 > > > > > > > > > 148 KSP preconditioned resid norm 3.613100087842e-06 true resid norm 2.490617159023e-04 ||r(i)||/||b|| 3.381352379676e-05 > > > > > > > > > 
149 KSP preconditioned resid norm 3.637646399299e-06 true resid norm 2.490648063023e-04 ||r(i)||/||b|| 3.381394336069e-05 > > > > > > > > > 150 KSP preconditioned resid norm 3.640235367864e-06 true resid norm 2.490648516718e-04 ||r(i)||/||b|| 3.381394952022e-05 > > > > > > > > > 151 KSP preconditioned resid norm 3.724708848977e-06 true resid norm 2.490622201040e-04 ||r(i)||/||b|| 3.381359224901e-05 > > > > > > > > > 152 KSP preconditioned resid norm 3.665185002770e-06 true resid norm 2.490664302790e-04 ||r(i)||/||b|| 3.381416383766e-05 > > > > > > > > > 153 KSP preconditioned resid norm 3.348992579120e-06 true resid norm 2.490655722697e-04 ||r(i)||/||b|| 3.381404735121e-05 > > > > > > > > > 154 KSP preconditioned resid norm 3.309431137943e-06 true resid norm 2.490727563300e-04 ||r(i)||/||b|| 3.381502268535e-05 > > > > > > > > > 155 KSP preconditioned resid norm 3.299031245428e-06 true resid norm 2.490688392843e-04 ||r(i)||/||b|| 3.381449089298e-05 > > > > > > > > > 156 KSP preconditioned resid norm 3.297127463503e-06 true resid norm 2.490642207769e-04 ||r(i)||/||b|| 3.381386386763e-05 > > > > > > > > > 157 KSP preconditioned resid norm 3.297370198641e-06 true resid norm 2.490666651723e-04 ||r(i)||/||b|| 3.381419572764e-05 > > > > > > > > > 158 KSP preconditioned resid norm 3.290873165210e-06 true resid norm 2.490679189538e-04 ||r(i)||/||b|| 3.381436594557e-05 > > > > > > > > > 159 KSP preconditioned resid norm 3.346705292419e-06 true resid norm 2.490617329776e-04 ||r(i)||/||b|| 3.381352611496e-05 > > > > > > > > > 160 KSP preconditioned resid norm 3.429583550890e-06 true resid norm 2.490675116236e-04 ||r(i)||/||b|| 3.381431064494e-05 > > > > > > > > > 161 KSP preconditioned resid norm 3.425238504679e-06 true resid norm 2.490648199058e-04 ||r(i)||/||b|| 3.381394520756e-05 > > > > > > > > > 162 KSP preconditioned resid norm 3.423484857849e-06 true resid norm 2.490723208298e-04 ||r(i)||/||b|| 3.381496356025e-05 > > > > > > > > > 163 KSP preconditioned resid norm 3.383655922943e-06 true resid norm 2.490659981249e-04 ||r(i)||/||b|| 3.381410516686e-05 > > > > > > > > > 164 KSP preconditioned resid norm 3.477197358452e-06 true resid norm 2.490665979073e-04 ||r(i)||/||b|| 3.381418659549e-05 > > > > > > > > > 165 KSP preconditioned resid norm 3.454672202601e-06 true resid norm 2.490651358644e-04 ||r(i)||/||b|| 3.381398810323e-05 > > > > > > > > > 166 KSP preconditioned resid norm 3.399075522566e-06 true resid norm 2.490678159511e-04 ||r(i)||/||b|| 3.381435196154e-05 > > > > > > > > > 167 KSP preconditioned resid norm 3.305455787400e-06 true resid norm 2.490651924523e-04 ||r(i)||/||b|| 3.381399578581e-05 > > > > > > > > > 168 KSP preconditioned resid norm 3.368445533284e-06 true resid norm 2.490688061735e-04 ||r(i)||/||b|| 3.381448639774e-05 > > > > > > > > > 169 KSP preconditioned resid norm 2.981519724814e-06 true resid norm 2.490676378334e-04 ||r(i)||/||b|| 3.381432777964e-05 > > > > > > > > > 170 KSP preconditioned resid norm 3.034423065539e-06 true resid norm 2.490694458885e-04 ||r(i)||/||b|| 3.381457324777e-05 > > > > > > > > > 171 KSP preconditioned resid norm 2.885972780503e-06 true resid norm 2.490688033729e-04 ||r(i)||/||b|| 3.381448601752e-05 > > > > > > > > > 172 KSP preconditioned resid norm 2.892491075033e-06 true resid norm 2.490692993765e-04 ||r(i)||/||b|| 3.381455335678e-05 > > > > > > > > > 173 KSP preconditioned resid norm 2.921316177611e-06 true resid norm 2.490697629787e-04 ||r(i)||/||b|| 3.381461629709e-05 > > > > > > > > > 174 KSP preconditioned resid norm 2.999889222269e-06 
true resid norm 2.490707272626e-04 ||r(i)||/||b|| 3.381474721178e-05 > > > > > > > > > 175 KSP preconditioned resid norm 2.975590207575e-06 true resid norm 2.490685439925e-04 ||r(i)||/||b|| 3.381445080310e-05 > > > > > > > > > 176 KSP preconditioned resid norm 2.983065843597e-06 true resid norm 2.490701883671e-04 ||r(i)||/||b|| 3.381467404937e-05 > > > > > > > > > 177 KSP preconditioned resid norm 2.965959610245e-06 true resid norm 2.490711538630e-04 ||r(i)||/||b|| 3.381480512861e-05 > > > > > > > > > 178 KSP preconditioned resid norm 3.005389788827e-06 true resid norm 2.490751808095e-04 ||r(i)||/||b|| 3.381535184150e-05 > > > > > > > > > 179 KSP preconditioned resid norm 2.956581668772e-06 true resid norm 2.490653125636e-04 ||r(i)||/||b|| 3.381401209257e-05 > > > > > > > > > 180 KSP preconditioned resid norm 2.937498883661e-06 true resid norm 2.490666056653e-04 ||r(i)||/||b|| 3.381418764874e-05 > > > > > > > > > 181 KSP preconditioned resid norm 2.913227475431e-06 true resid norm 2.490682436979e-04 ||r(i)||/||b|| 3.381441003402e-05 > > > > > > > > > 182 KSP preconditioned resid norm 3.048172862254e-06 true resid norm 2.490719669872e-04 ||r(i)||/||b|| 3.381491552130e-05 > > > > > > > > > 183 KSP preconditioned resid norm 3.023868104933e-06 true resid norm 2.490648745555e-04 ||r(i)||/||b|| 3.381395262699e-05 > > > > > > > > > 184 KSP preconditioned resid norm 2.985947506400e-06 true resid norm 2.490638818852e-04 ||r(i)||/||b|| 3.381381785846e-05 > > > > > > > > > 185 KSP preconditioned resid norm 2.840032055776e-06 true resid norm 2.490701112392e-04 ||r(i)||/||b|| 3.381466357820e-05 > > > > > > > > > 186 KSP preconditioned resid norm 2.229279683815e-06 true resid norm 2.490609220680e-04 ||r(i)||/||b|| 3.381341602292e-05 > > > > > > > > > 187 KSP preconditioned resid norm 2.441513276379e-06 true resid norm 2.490674056899e-04 ||r(i)||/||b|| 3.381429626300e-05 > > > > > > > > > 188 KSP preconditioned resid norm 2.467046864016e-06 true resid norm 2.490691622632e-04 ||r(i)||/||b|| 3.381453474178e-05 > > > > > > > > > 189 KSP preconditioned resid norm 2.482124586361e-06 true resid norm 2.490664992339e-04 ||r(i)||/||b|| 3.381417319923e-05 > > > > > > > > > 190 KSP preconditioned resid norm 2.470564926502e-06 true resid norm 2.490617019713e-04 ||r(i)||/||b|| 3.381352190543e-05 > > > > > > > > > 191 KSP preconditioned resid norm 2.457947086578e-06 true resid norm 2.490628644250e-04 ||r(i)||/||b|| 3.381367972437e-05 > > > > > > > > > 192 KSP preconditioned resid norm 2.469444741724e-06 true resid norm 2.490639416335e-04 ||r(i)||/||b|| 3.381382597011e-05 > > > > > > > > > 193 KSP preconditioned resid norm 2.469951525219e-06 true resid norm 2.490599769764e-04 ||r(i)||/||b|| 3.381328771385e-05 > > > > > > > > > 194 KSP preconditioned resid norm 2.467486786643e-06 true resid norm 2.490630178622e-04 ||r(i)||/||b|| 3.381370055556e-05 > > > > > > > > > 195 KSP preconditioned resid norm 2.409684391404e-06 true resid norm 2.490640302606e-04 ||r(i)||/||b|| 3.381383800245e-05 > > > > > > > > > 196 KSP preconditioned resid norm 2.456046691135e-06 true resid norm 2.490637730235e-04 ||r(i)||/||b|| 3.381380307900e-05 > > > > > > > > > 197 KSP preconditioned resid norm 2.300015653805e-06 true resid norm 2.490615406913e-04 ||r(i)||/||b|| 3.381350000947e-05 > > > > > > > > > 198 KSP preconditioned resid norm 2.238328275301e-06 true resid norm 2.490647641246e-04 ||r(i)||/||b|| 3.381393763449e-05 > > > > > > > > > 199 KSP preconditioned resid norm 2.317293820319e-06 true resid norm 2.490641611282e-04 ||r(i)||/||b|| 
3.381385576951e-05 > > > > > > > > > 200 KSP preconditioned resid norm 2.359590971314e-06 true resid norm 2.490685242974e-04 ||r(i)||/||b|| 3.381444812922e-05 > > > > > > > > > 201 KSP preconditioned resid norm 2.311199691596e-06 true resid norm 2.490656791753e-04 ||r(i)||/||b|| 3.381406186510e-05 > > > > > > > > > 202 KSP preconditioned resid norm 2.328772904196e-06 true resid norm 2.490651045523e-04 ||r(i)||/||b|| 3.381398385220e-05 > > > > > > > > > 203 KSP preconditioned resid norm 2.332731604717e-06 true resid norm 2.490649960574e-04 ||r(i)||/||b|| 3.381396912253e-05 > > > > > > > > > 204 KSP preconditioned resid norm 2.357629383490e-06 true resid norm 2.490686317727e-04 ||r(i)||/||b|| 3.381446272046e-05 > > > > > > > > > 205 KSP preconditioned resid norm 2.374856180299e-06 true resid norm 2.490645897176e-04 ||r(i)||/||b|| 3.381391395637e-05 > > > > > > > > > 206 KSP preconditioned resid norm 2.340395514404e-06 true resid norm 2.490618341127e-04 ||r(i)||/||b|| 3.381353984542e-05 > > > > > > > > > 207 KSP preconditioned resid norm 2.314963680954e-06 true resid norm 2.490676153984e-04 ||r(i)||/||b|| 3.381432473379e-05 > > > > > > > > > 208 KSP preconditioned resid norm 2.448070953106e-06 true resid norm 2.490644606776e-04 ||r(i)||/||b|| 3.381389643743e-05 > > > > > > > > > 209 KSP preconditioned resid norm 2.428805110632e-06 true resid norm 2.490635817597e-04 ||r(i)||/||b|| 3.381377711234e-05 > > > > > > > > > 210 KSP preconditioned resid norm 2.537929937808e-06 true resid norm 2.490680589404e-04 ||r(i)||/||b|| 3.381438495066e-05 > > > > > > > > > 211 KSP preconditioned resid norm 2.515909029682e-06 true resid norm 2.490687803038e-04 ||r(i)||/||b|| 3.381448288557e-05 > > > > > > > > > 212 KSP preconditioned resid norm 2.497907513266e-06 true resid norm 2.490618016885e-04 ||r(i)||/||b|| 3.381353544340e-05 > > > > > > > > > 213 KSP preconditioned resid norm 1.783501869502e-06 true resid norm 2.490632647470e-04 ||r(i)||/||b|| 3.381373407354e-05 > > > > > > > > > 214 KSP preconditioned resid norm 1.767420653144e-06 true resid norm 2.490685569328e-04 ||r(i)||/||b|| 3.381445255992e-05 > > > > > > > > > 215 KSP preconditioned resid norm 1.854926068272e-06 true resid norm 2.490609365464e-04 ||r(i)||/||b|| 3.381341798856e-05 > > > > > > > > > 216 KSP preconditioned resid norm 1.818308539774e-06 true resid norm 2.490639142283e-04 ||r(i)||/||b|| 3.381382224948e-05 > > > > > > > > > 217 KSP preconditioned resid norm 1.809431578070e-06 true resid norm 2.490605125049e-04 ||r(i)||/||b|| 3.381336041915e-05 > > > > > > > > > 218 KSP preconditioned resid norm 1.789862735999e-06 true resid norm 2.490564024901e-04 ||r(i)||/||b|| 3.381280242859e-05 > > > > > > > > > 219 KSP preconditioned resid norm 1.769239890163e-06 true resid norm 2.490647825316e-04 ||r(i)||/||b|| 3.381394013349e-05 > > > > > > > > > 220 KSP preconditioned resid norm 1.780760773109e-06 true resid norm 2.490622606663e-04 ||r(i)||/||b|| 3.381359775589e-05 > > > > > > > > > 221 KSP preconditioned resid norm 5.009024913368e-07 true resid norm 2.490659101637e-04 ||r(i)||/||b|| 3.381409322492e-05 > > > > > > > > > 222 KSP preconditioned resid norm 4.974450322799e-07 true resid norm 2.490714287402e-04 ||r(i)||/||b|| 3.381484244693e-05 > > > > > > > > > 223 KSP preconditioned resid norm 4.938819481519e-07 true resid norm 2.490665661715e-04 ||r(i)||/||b|| 3.381418228693e-05 > > > > > > > > > 224 KSP preconditioned resid norm 4.973231831266e-07 true resid norm 2.490725000995e-04 ||r(i)||/||b|| 3.381498789855e-05 > > > > > > > > > 225 KSP 
preconditioned resid norm 5.086864036771e-07 true resid norm 2.490664132954e-04 ||r(i)||/||b|| 3.381416153192e-05 > > > > > > > > > 226 KSP preconditioned resid norm 5.046954570561e-07 true resid norm 2.490698772594e-04 ||r(i)||/||b|| 3.381463181226e-05 > > > > > > > > > 227 KSP preconditioned resid norm 5.086852920874e-07 true resid norm 2.490703544723e-04 ||r(i)||/||b|| 3.381469660041e-05 > > > > > > > > > 228 KSP preconditioned resid norm 5.182381756169e-07 true resid norm 2.490665200032e-04 ||r(i)||/||b|| 3.381417601896e-05 > > > > > > > > > 229 KSP preconditioned resid norm 5.261455182896e-07 true resid norm 2.490697169472e-04 ||r(i)||/||b|| 3.381461004770e-05 > > > > > > > > > 230 KSP preconditioned resid norm 5.265262522400e-07 true resid norm 2.490726890541e-04 ||r(i)||/||b|| 3.381501355172e-05 > > > > > > > > > 231 KSP preconditioned resid norm 5.220652263946e-07 true resid norm 2.490689325236e-04 ||r(i)||/||b|| 3.381450355149e-05 > > > > > > > > > 232 KSP preconditioned resid norm 5.256466259888e-07 true resid norm 2.490694033989e-04 ||r(i)||/||b|| 3.381456747923e-05 > > > > > > > > > 233 KSP preconditioned resid norm 5.443022648374e-07 true resid norm 2.490650183144e-04 ||r(i)||/||b|| 3.381397214423e-05 > > > > > > > > > 234 KSP preconditioned resid norm 5.562619006436e-07 true resid norm 2.490764576883e-04 ||r(i)||/||b|| 3.381552519520e-05 > > > > > > > > > 235 KSP preconditioned resid norm 5.998148629545e-07 true resid norm 2.490714032716e-04 ||r(i)||/||b|| 3.381483898922e-05 > > > > > > > > > 236 KSP preconditioned resid norm 6.498977322955e-07 true resid norm 2.490650270144e-04 ||r(i)||/||b|| 3.381397332537e-05 > > > > > > > > > 237 KSP preconditioned resid norm 6.503686003429e-07 true resid norm 2.490706976108e-04 ||r(i)||/||b|| 3.381474318615e-05 > > > > > > > > > 238 KSP preconditioned resid norm 6.566719023119e-07 true resid norm 2.490664107559e-04 ||r(i)||/||b|| 3.381416118714e-05 > > > > > > > > > 239 KSP preconditioned resid norm 6.549737473208e-07 true resid norm 2.490721547909e-04 ||r(i)||/||b|| 3.381494101821e-05 > > > > > > > > > 240 KSP preconditioned resid norm 6.616898981418e-07 true resid norm 2.490679659838e-04 ||r(i)||/||b|| 3.381437233053e-05 > > > > > > > > > 241 KSP preconditioned resid norm 6.829917691021e-07 true resid norm 2.490728328614e-04 ||r(i)||/||b|| 3.381503307553e-05 > > > > > > > > > 242 KSP preconditioned resid norm 7.030239869389e-07 true resid norm 2.490706345115e-04 ||r(i)||/||b|| 3.381473461955e-05 > > > > > > > > > 243 KSP preconditioned resid norm 7.018435683340e-07 true resid norm 2.490650978460e-04 ||r(i)||/||b|| 3.381398294172e-05 > > > > > > > > > 244 KSP preconditioned resid norm 7.058047080376e-07 true resid norm 2.490685975642e-04 ||r(i)||/||b|| 3.381445807618e-05 > > > > > > > > > 245 KSP preconditioned resid norm 6.896300385099e-07 true resid norm 2.490708566380e-04 ||r(i)||/||b|| 3.381476477625e-05 > > > > > > > > > 246 KSP preconditioned resid norm 7.093960074437e-07 true resid norm 2.490667427871e-04 ||r(i)||/||b|| 3.381420626490e-05 > > > > > > > > > 247 KSP preconditioned resid norm 7.817121711853e-07 true resid norm 2.490692299030e-04 ||r(i)||/||b|| 3.381454392480e-05 > > > > > > > > > 248 KSP preconditioned resid norm 7.976109778309e-07 true resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > > > > > > 249 KSP preconditioned resid norm 7.855322750445e-07 true resid norm 2.490720861966e-04 ||r(i)||/||b|| 3.381493170560e-05 > > > > > > > > > 250 KSP preconditioned resid norm 7.778531114042e-07 true 
resid norm 2.490673235034e-04 ||r(i)||/||b|| 3.381428510506e-05 > > > > > > > > > 251 KSP preconditioned resid norm 7.848682182070e-07 true resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 > > > > > > > > > 252 KSP preconditioned resid norm 7.967291867330e-07 true resid norm 2.490724820229e-04 ||r(i)||/||b|| 3.381498544442e-05 > > > > > > > > > 253 KSP preconditioned resid norm 7.865012959525e-07 true resid norm 2.490666662028e-04 ||r(i)||/||b|| 3.381419586754e-05 > > > > > > > > > 254 KSP preconditioned resid norm 7.656025385804e-07 true resid norm 2.490686283214e-04 ||r(i)||/||b|| 3.381446225190e-05 > > > > > > > > > 255 KSP preconditioned resid norm 7.757018653468e-07 true resid norm 2.490655983763e-04 ||r(i)||/||b|| 3.381405089553e-05 > > > > > > > > > 256 KSP preconditioned resid norm 6.686490372981e-07 true resid norm 2.490715698964e-04 ||r(i)||/||b|| 3.381486161081e-05 > > > > > > > > > 257 KSP preconditioned resid norm 6.596005109428e-07 true resid norm 2.490666403003e-04 ||r(i)||/||b|| 3.381419235092e-05 > > > > > > > > > 258 KSP preconditioned resid norm 6.681742296333e-07 true resid norm 2.490683725835e-04 ||r(i)||/||b|| 3.381442753198e-05 > > > > > > > > > 259 KSP preconditioned resid norm 1.089245482033e-06 true resid norm 2.490688086568e-04 ||r(i)||/||b|| 3.381448673488e-05 > > > > > > > > > 260 KSP preconditioned resid norm 1.099844873189e-06 true resid norm 2.490690703265e-04 ||r(i)||/||b|| 3.381452226011e-05 > > > > > > > > > 261 KSP preconditioned resid norm 1.112925540869e-06 true resid norm 2.490664481058e-04 ||r(i)||/||b|| 3.381416625790e-05 > > > > > > > > > 262 KSP preconditioned resid norm 1.113056910480e-06 true resid norm 2.490658753273e-04 ||r(i)||/||b|| 3.381408849541e-05 > > > > > > > > > 263 KSP preconditioned resid norm 1.104801535149e-06 true resid norm 2.490736510776e-04 ||r(i)||/||b|| 3.381514415953e-05 > > > > > > > > > 264 KSP preconditioned resid norm 1.158709147873e-06 true resid norm 2.490607531152e-04 ||r(i)||/||b|| 3.381339308528e-05 > > > > > > > > > 265 KSP preconditioned resid norm 1.178985740182e-06 true resid norm 2.490727895619e-04 ||r(i)||/||b|| 3.381502719703e-05 > > > > > > > > > 266 KSP preconditioned resid norm 1.165130533478e-06 true resid norm 2.490639076693e-04 ||r(i)||/||b|| 3.381382135901e-05 > > > > > > > > > 267 KSP preconditioned resid norm 1.181364114499e-06 true resid norm 2.490667871436e-04 ||r(i)||/||b|| 3.381421228690e-05 > > > > > > > > > 268 KSP preconditioned resid norm 1.170295348543e-06 true resid norm 2.490662613306e-04 ||r(i)||/||b|| 3.381414090063e-05 > > > > > > > > > 269 KSP preconditioned resid norm 1.213243016230e-06 true resid norm 2.490666173719e-04 ||r(i)||/||b|| 3.381418923808e-05 > > > > > > > > > 270 KSP preconditioned resid norm 1.239691953997e-06 true resid norm 2.490678323197e-04 ||r(i)||/||b|| 3.381435418381e-05 > > > > > > > > > 271 KSP preconditioned resid norm 1.219891740100e-06 true resid norm 2.490625009256e-04 ||r(i)||/||b|| 3.381363037437e-05 > > > > > > > > > 272 KSP preconditioned resid norm 1.231321334346e-06 true resid norm 2.490659733696e-04 ||r(i)||/||b|| 3.381410180599e-05 > > > > > > > > > 273 KSP preconditioned resid norm 1.208183234158e-06 true resid norm 2.490685987255e-04 ||r(i)||/||b|| 3.381445823385e-05 > > > > > > > > > 274 KSP preconditioned resid norm 1.211768545589e-06 true resid norm 2.490671548953e-04 ||r(i)||/||b|| 3.381426221421e-05 > > > > > > > > > 275 KSP preconditioned resid norm 1.209433459842e-06 true resid norm 2.490669016096e-04 ||r(i)||/||b|| 
3.381422782722e-05 > > > > > > > > > 276 KSP preconditioned resid norm 1.223729184405e-06 true resid norm 2.490658128014e-04 ||r(i)||/||b|| 3.381408000666e-05 > > > > > > > > > 277 KSP preconditioned resid norm 1.243915201868e-06 true resid norm 2.490693375756e-04 ||r(i)||/||b|| 3.381455854282e-05 > > > > > > > > > 278 KSP preconditioned resid norm 1.231994655529e-06 true resid norm 2.490682988311e-04 ||r(i)||/||b|| 3.381441751910e-05 > > > > > > > > > 279 KSP preconditioned resid norm 1.227930683777e-06 true resid norm 2.490667825866e-04 ||r(i)||/||b|| 3.381421166823e-05 > > > > > > > > > 280 KSP preconditioned resid norm 1.193458846469e-06 true resid norm 2.490687366117e-04 ||r(i)||/||b|| 3.381447695378e-05 > > > > > > > > > 281 KSP preconditioned resid norm 1.217089059805e-06 true resid norm 2.490674797371e-04 ||r(i)||/||b|| 3.381430631591e-05 > > > > > > > > > 282 KSP preconditioned resid norm 1.249318287709e-06 true resid norm 2.490662866951e-04 ||r(i)||/||b|| 3.381414434420e-05 > > > > > > > > > 283 KSP preconditioned resid norm 1.183320029547e-06 true resid norm 2.490645783630e-04 ||r(i)||/||b|| 3.381391241482e-05 > > > > > > > > > 284 KSP preconditioned resid norm 1.174730603102e-06 true resid norm 2.490686881647e-04 ||r(i)||/||b|| 3.381447037643e-05 > > > > > > > > > 285 KSP preconditioned resid norm 1.175838261923e-06 true resid norm 2.490665969300e-04 ||r(i)||/||b|| 3.381418646281e-05 > > > > > > > > > 286 KSP preconditioned resid norm 1.188946188368e-06 true resid norm 2.490661974622e-04 ||r(i)||/||b|| 3.381413222961e-05 > > > > > > > > > 287 KSP preconditioned resid norm 1.177848565707e-06 true resid norm 2.490660236206e-04 ||r(i)||/||b|| 3.381410862824e-05 > > > > > > > > > 288 KSP preconditioned resid norm 1.200075508281e-06 true resid norm 2.490645353536e-04 ||r(i)||/||b|| 3.381390657571e-05 > > > > > > > > > 289 KSP preconditioned resid norm 1.184589570618e-06 true resid norm 2.490664920355e-04 ||r(i)||/||b|| 3.381417222195e-05 > > > > > > > > > 290 KSP preconditioned resid norm 1.221114703873e-06 true resid norm 2.490670597538e-04 ||r(i)||/||b|| 3.381424929746e-05 > > > > > > > > > 291 KSP preconditioned resid norm 1.249479658256e-06 true resid norm 2.490641582876e-04 ||r(i)||/||b|| 3.381385538385e-05 > > > > > > > > > 292 KSP preconditioned resid norm 1.245768496850e-06 true resid norm 2.490704480588e-04 ||r(i)||/||b|| 3.381470930606e-05 > > > > > > > > > 293 KSP preconditioned resid norm 1.243742607953e-06 true resid norm 2.490649690604e-04 ||r(i)||/||b|| 3.381396545733e-05 > > > > > > > > > 294 KSP preconditioned resid norm 1.342758483339e-06 true resid norm 2.490676207432e-04 ||r(i)||/||b|| 3.381432545942e-05 > > > > > > > > > 295 KSP preconditioned resid norm 1.353816099600e-06 true resid norm 2.490695263153e-04 ||r(i)||/||b|| 3.381458416681e-05 > > > > > > > > > 296 KSP preconditioned resid norm 1.343886763293e-06 true resid norm 2.490673674307e-04 ||r(i)||/||b|| 3.381429106879e-05 > > > > > > > > > 297 KSP preconditioned resid norm 1.355511022815e-06 true resid norm 2.490686565995e-04 ||r(i)||/||b|| 3.381446609103e-05 > > > > > > > > > 298 KSP preconditioned resid norm 1.347247627243e-06 true resid norm 2.490696287707e-04 ||r(i)||/||b|| 3.381459807652e-05 > > > > > > > > > 299 KSP preconditioned resid norm 1.414742595618e-06 true resid norm 2.490749815091e-04 ||r(i)||/||b|| 3.381532478374e-05 > > > > > > > > > 300 KSP preconditioned resid norm 1.418560683189e-06 true resid norm 2.490721501153e-04 ||r(i)||/||b|| 3.381494038343e-05 > > > > > > > > > 301 KSP 
preconditioned resid norm 1.416276404923e-06 true resid norm 2.490689576447e-04 ||r(i)||/||b|| 3.381450696203e-05 > > > > > > > > > 302 KSP preconditioned resid norm 1.431448272112e-06 true resid norm 2.490688812701e-04 ||r(i)||/||b|| 3.381449659312e-05 > > > > > > > > > 303 KSP preconditioned resid norm 1.446154958969e-06 true resid norm 2.490727536322e-04 ||r(i)||/||b|| 3.381502231909e-05 > > > > > > > > > 304 KSP preconditioned resid norm 1.468860617921e-06 true resid norm 2.490692363788e-04 ||r(i)||/||b|| 3.381454480397e-05 > > > > > > > > > 305 KSP preconditioned resid norm 1.627595214971e-06 true resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 > > > > > > > > > 306 KSP preconditioned resid norm 1.614384672893e-06 true resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 > > > > > > > > > 307 KSP preconditioned resid norm 1.605568020532e-06 true resid norm 2.490699757693e-04 ||r(i)||/||b|| 3.381464518632e-05 > > > > > > > > > 308 KSP preconditioned resid norm 1.617069685075e-06 true resid norm 2.490649282923e-04 ||r(i)||/||b|| 3.381395992249e-05 > > > > > > > > > 309 KSP preconditioned resid norm 1.654297792738e-06 true resid norm 2.490644766626e-04 ||r(i)||/||b|| 3.381389860760e-05 > > > > > > > > > 310 KSP preconditioned resid norm 1.587528143215e-06 true resid norm 2.490696752096e-04 ||r(i)||/||b|| 3.381460438124e-05 > > > > > > > > > 311 KSP preconditioned resid norm 1.662782022388e-06 true resid norm 2.490699317737e-04 ||r(i)||/||b|| 3.381463921332e-05 > > > > > > > > > 312 KSP preconditioned resid norm 1.618211471748e-06 true resid norm 2.490735831308e-04 ||r(i)||/||b|| 3.381513493483e-05 > > > > > > > > > 313 KSP preconditioned resid norm 1.609074961921e-06 true resid norm 2.490679566436e-04 ||r(i)||/||b|| 3.381437106247e-05 > > > > > > > > > 314 KSP preconditioned resid norm 1.548068942878e-06 true resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 > > > > > > > > > 315 KSP preconditioned resid norm 1.526718322150e-06 true resid norm 2.490619832967e-04 ||r(i)||/||b|| 3.381356009919e-05 > > > > > > > > > 316 KSP preconditioned resid norm 1.553150959105e-06 true resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 > > > > > > > > > 317 KSP preconditioned resid norm 1.615015320906e-06 true resid norm 2.490672348079e-04 ||r(i)||/||b|| 3.381427306343e-05 > > > > > > > > > 318 KSP preconditioned resid norm 1.602904469797e-06 true resid norm 2.490696731006e-04 ||r(i)||/||b|| 3.381460409491e-05 > > > > > > > > > 319 KSP preconditioned resid norm 1.538140323073e-06 true resid norm 2.490722982494e-04 ||r(i)||/||b|| 3.381496049466e-05 > > > > > > > > > 320 KSP preconditioned resid norm 1.534779679430e-06 true resid norm 2.490778499789e-04 ||r(i)||/||b|| 3.381571421763e-05 > > > > > > > > > 321 KSP preconditioned resid norm 1.547155843355e-06 true resid norm 2.490767612985e-04 ||r(i)||/||b|| 3.381556641442e-05 > > > > > > > > > 322 KSP preconditioned resid norm 1.422137008870e-06 true resid norm 2.490737676309e-04 ||r(i)||/||b|| 3.381515998323e-05 > > > > > > > > > 323 KSP preconditioned resid norm 1.403072558954e-06 true resid norm 2.490741361870e-04 ||r(i)||/||b|| 3.381521001975e-05 > > > > > > > > > 324 KSP preconditioned resid norm 1.373070436118e-06 true resid norm 2.490742214990e-04 ||r(i)||/||b|| 3.381522160202e-05 > > > > > > > > > 325 KSP preconditioned resid norm 1.359547585233e-06 true resid norm 2.490792987570e-04 ||r(i)||/||b|| 3.381591090902e-05 > > > > > > > > > 326 KSP preconditioned resid norm 1.370351913612e-06 true 
resid norm 2.490727161158e-04 ||r(i)||/||b|| 3.381501722573e-05 > > > > > > > > > 327 KSP preconditioned resid norm 1.365238666187e-06 true resid norm 2.490716949642e-04 ||r(i)||/||b|| 3.381487859046e-05 > > > > > > > > > 328 KSP preconditioned resid norm 1.369073373042e-06 true resid norm 2.490807360288e-04 ||r(i)||/||b|| 3.381610603826e-05 > > > > > > > > > 329 KSP preconditioned resid norm 1.426698981572e-06 true resid norm 2.490791479521e-04 ||r(i)||/||b|| 3.381589043520e-05 > > > > > > > > > 330 KSP preconditioned resid norm 1.445542403570e-06 true resid norm 2.490775981409e-04 ||r(i)||/||b|| 3.381568002720e-05 > > > > > > > > > 331 KSP preconditioned resid norm 1.464506963984e-06 true resid norm 2.490740562430e-04 ||r(i)||/||b|| 3.381519916626e-05 > > > > > > > > > 332 KSP preconditioned resid norm 1.461462964401e-06 true resid norm 2.490768016856e-04 ||r(i)||/||b|| 3.381557189753e-05 > > > > > > > > > 333 KSP preconditioned resid norm 1.476680847971e-06 true resid norm 2.490744321516e-04 ||r(i)||/||b|| 3.381525020097e-05 > > > > > > > > > 334 KSP preconditioned resid norm 1.459640372198e-06 true resid norm 2.490788817993e-04 ||r(i)||/||b|| 3.381585430132e-05 > > > > > > > > > 335 KSP preconditioned resid norm 1.790770882365e-06 true resid norm 2.490771711471e-04 ||r(i)||/||b|| 3.381562205697e-05 > > > > > > > > > 336 KSP preconditioned resid norm 1.803770155018e-06 true resid norm 2.490768953858e-04 ||r(i)||/||b|| 3.381558461860e-05 > > > > > > > > > 337 KSP preconditioned resid norm 1.787821255995e-06 true resid norm 2.490767985676e-04 ||r(i)||/||b|| 3.381557147421e-05 > > > > > > > > > 338 KSP preconditioned resid norm 1.749912220831e-06 true resid norm 2.490760198704e-04 ||r(i)||/||b|| 3.381546575545e-05 > > > > > > > > > 339 KSP preconditioned resid norm 1.802915839010e-06 true resid norm 2.490815556273e-04 ||r(i)||/||b|| 3.381621730993e-05 > > > > > > > > > 340 KSP preconditioned resid norm 1.800777670709e-06 true resid norm 2.490823909286e-04 ||r(i)||/||b|| 3.381633071347e-05 > > > > > > > > > 341 KSP preconditioned resid norm 1.962516327690e-06 true resid norm 2.490773477410e-04 ||r(i)||/||b|| 3.381564603199e-05 > > > > > > > > > 342 KSP preconditioned resid norm 1.981726465132e-06 true resid norm 2.490769116884e-04 ||r(i)||/||b|| 3.381558683191e-05 > > > > > > > > > 343 KSP preconditioned resid norm 1.963419167052e-06 true resid norm 2.490764009914e-04 ||r(i)||/||b|| 3.381551749783e-05 > > > > > > > > > 344 KSP preconditioned resid norm 1.992082169278e-06 true resid norm 2.490806829883e-04 ||r(i)||/||b|| 3.381609883728e-05 > > > > > > > > > 345 KSP preconditioned resid norm 1.981005134253e-06 true resid norm 2.490748677339e-04 ||r(i)||/||b|| 3.381530933721e-05 > > > > > > > > > 346 KSP preconditioned resid norm 1.959802663114e-06 true resid norm 2.490773752317e-04 ||r(i)||/||b|| 3.381564976423e-05 > > > > > > > > > > > > > > > > > > On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley wrote: > > > > > > > > > On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang wrote: > > > > > > > > > ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY); > > > > > > > > > ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY); > > > > > > > > > > > > > > > > > > ierr = VecAssemblyBegin(x); > > > > > > > > > ierr = VecAssemblyEnd(x); > > > > > > > > > This is probably unnecessary > > > > > > > > > > > > > > > > > > ierr = VecAssemblyBegin(b); > > > > > > > > > ierr = VecAssemblyEnd(b); > > > > > > > > > This is probably unnecessary > > > > > > > > > > > > > > > > > > > > > > > > > > > ierr = 
> > > > > > > > > ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp);
> > > > > > > > > ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8
> > > > > > > > > Is your rhs consistent with this nullspace?
> > > > > > > > >
> > > > > > > > > // KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
> > > > > > > > > KSPSetOperators(ksp,A,A);
> > > > > > > > >
> > > > > > > > > KSPSetType(ksp,KSPBCGS);
> > > > > > > > >
> > > > > > > > > KSPSetComputeSingularValues(ksp, PETSC_TRUE);
> > > > > > > > > #if defined(__HYPRE__)
> > > > > > > > > KSPGetPC(ksp, &pc);
> > > > > > > > > PCSetType(pc, PCHYPRE);
> > > > > > > > > PCHYPRESetType(pc,"boomeramg");
> > > > > > > > > This is terribly unnecessary. You just use
> > > > > > > > >
> > > > > > > > > -pc_type hypre -pc_hypre_type boomeramg
> > > > > > > > >
> > > > > > > > > or
> > > > > > > > >
> > > > > > > > > -pc_type gamg
> > > > > > > > >
> > > > > > > > > #else
> > > > > > > > > KSPSetType(ksp,KSPBCGSL);
> > > > > > > > > KSPBCGSLSetEll(ksp,2);
> > > > > > > > > #endif /* defined(__HYPRE__) */
> > > > > > > > >
> > > > > > > > > KSPSetFromOptions(ksp);
> > > > > > > > > KSPSetUp(ksp);
> > > > > > > > >
> > > > > > > > > ierr = KSPSolve(ksp,b,x);
> > > > > > > > >
> > > > > > > > > command line
> > > > > > > > >
> > > > > > > > > You did not provide any of what I asked for in the previous mail.
> > > > > > > > >
> > > > > > > > > Matt
> > > > > > > > >
> > > > > > > > > On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley wrote:
> > > > > > > > > On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang wrote:
> > > > > > > > > hi,
> > > > > > > > >
> > > > > > > > > I implemented a HYPRE preconditioner for my study because, without a preconditioner, the PETSc solver takes thousands of iterations to converge on fine-grid simulations.
> > > > > > > > >
> > > > > > > > > with HYPRE, depending on the parallel partition, it will take HYPRE forever to do anything. observation of the output file is that the simulation is hanging with no output.
> > > > > > > > >
> > > > > > > > > Any idea what happened? will post snippet of code.
> > > > > > > > >
> > > > > > > > > 1) For any question about convergence, we need to see the output of
> > > > > > > > >
> > > > > > > > > -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason
> > > > > > > > >
> > > > > > > > > 2) Hypre has many preconditioners, which one are you talking about
> > > > > > > > >
> > > > > > > > > 3) PETSc has some preconditioners in common with Hypre, like AMG
> > > > > > > > >
> > > > > > > > > Thanks,
> > > > > > > > >
> > > > > > > > > Matt
> > > > > > > > >
> > > > > > > > > --
> > > > > > > > > Hao Zhang
> > > > > > > > > Dept. of Applied Mathematics and Statistics,
> > > > > > > > > Stony Brook University,
> > > > > > > > > Stony Brook, New York, 11790
> > > > > > > > >
> > > > > > > > > --
> > > > > > > > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> > > > > > > > > -- Norbert Wiener
> > > > > > > > >
> > > > > > > > > https://www.cse.buffalo.edu/~knepley/
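(Aside, not from the thread: a minimal sketch of the consistency check Matt is asking about. It reuses A, b, nullsp, and ierr from the snippet above and assumes the PETSc >= 3.5 calling sequences; it is illustrative only, not the poster's code.

    PetscBool isNull;
    /* verify that the attached vector space really is a null space of A */
    ierr = MatNullSpaceTest(nullsp,A,&isNull);CHKERRQ(ierr);
    if (!isNull) {
      ierr = PetscPrintf(PETSC_COMM_WORLD,"attached space is NOT a null space of A!\n");CHKERRQ(ierr);
    }
    /* project the null-space component out of the rhs so the system is consistent */
    ierr = MatNullSpaceRemove(nullsp,b);CHKERRQ(ierr);

If MatNullSpaceRemove changes b noticeably, the right-hand side was inconsistent with the attached null space, which can stall Krylov convergence in exactly the way shown in the log above.)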

From griffith at cims.nyu.edu  Mon Oct 23 10:55:10 2017
From: griffith at cims.nyu.edu (Boyce Griffith)
Date: Mon, 23 Oct 2017 11:55:10 -0400
Subject: [petsc-users] HYPRE hanging or slow? from observation
In-Reply-To: <1FF3BF35-124D-4754-B1BE-62DD8FFBD5B0@mcs.anl.gov>
References: <3BAED004-5500-4F83-AA38-351EEAC532E0@mcs.anl.gov> <64D7FC90-EBE8-409C-B300-04A118155A98@mcs.anl.gov> <5DDD356C-2000-4803-9F3A-02B429898A7A@mcs.anl.gov> <8C8ADFC3-C71F-4DE1-B21C-5BF50EBC652D@mcs.anl.gov> <1FF3BF35-124D-4754-B1BE-62DD8FFBD5B0@mcs.anl.gov>
Message-ID: <0A3EE134-4543-435B-9BDF-C99BB45DD742@cims.nyu.edu>

> On Oct 23, 2017, at 11:51 AM, Barry Smith wrote:
>
>> On Oct 23, 2017, at 10:38 AM, Mark Adams wrote:
>>
>> On Mon, Oct 23, 2017 at 11:36 AM, Barry Smith wrote:
>>
>>    I'm confused. Is hypre + GMRES ever not working fine for you? Why not just always use hypre + gmres; no basic solver is going to be faster for large problems ever.
>>
>> I assume the problem is that as he refines the mesh the solver dies *and* the solution goes bad. Right?
> He never said this when using hypre + gmres; he said this happened before they tried hypre.

Hao, is there any possibility that you wind up with inconsistent boundary conditions under grid refinement?  From the thread, it looks like this is using Dirichlet boundary conditions for the velocity, so the pressure Poisson solve is homogeneous Neumann.

-- Boyce

>    Barry
>
>> Barry
>>
>>> On Oct 23, 2017, at 9:09 AM, Hao Zhang wrote:
>>>
>>> Yes.
>>>
>>> On Mon, Oct 23, 2017 at 10:07 AM, Mark Adams wrote:
>>>
>>> On Mon, Oct 23, 2017 at 10:04 AM, Hao Zhang wrote:
>>> The big picture is that I'm solving the 3d incompressible Navier-Stokes equations on a staggered/MAC grid with a finite difference method. The particular function is the Poisson pressure solver, the Laplacian Mark Adams mentioned. The simulation runs fine for medium-size mesh grids. When I push to a very fine grid, not DNS level, I have some difficulty getting meaningful physical results. All PETSc solvers converge, but with huge iteration counts. That's when/how I started using HYPRE.
>>>
>>> Are we just talking about the convergence of the pressure solve?
>>>
>>> On Mon, Oct 23, 2017 at 9:01 AM, Mark Adams wrote:
>>> Just to be clear: 1) are you solving the Laplacian (div grad) and 2) what type of discretizations are you using? and 3) do you have stretched or terrible grids in some way?
>>>
>>> On Sun, Oct 22, 2017 at 3:57 PM, Barry Smith wrote:
>>>
>>>    One thing important to understand is that multigrid is an optimal or nearly optimal algorithm. This means, when it works, as you refine the mesh the number of iterations remains nearly constant, regardless of the problem size and number of processes. Simple preconditioners such as ILU, block Jacobi, one level additive Schwarz etc have iterations that increase with the problem size and likely also with the number of processes. Thus these algorithms become essentially impractical for very large problems while multigrid can remain practical (when it works).
>>>
>>>   Good luck
>>>
>>>   Barry
>>>
>>>> On Oct 22, 2017, at 2:35 PM, Hao Zhang wrote:
>>>>
>>>> Thanks for all the inputs. Before simulating on the finer grid, HYPRE wasn't used and simulations were just fine. I will do a few tests and post more information later.
>>>>
>>>> On Sun, Oct 22, 2017 at 12:11 PM, Barry Smith wrote:
>>>>
>>>>> On Oct 21, 2017, at 11:16 PM, Hao Zhang wrote:
>>>>>
>>>>> the reason is that when I do a finer grid simulation, the matrix becomes stiffer.
>>>>
>>>>    Are you saying that for a finer grid but everything else the same, the convergence of hypre (with the same GMRES) with the same options gets much worse? This normally will not happen, that is the fundamental beauty of multigrid methods (when they work well).
>>>>
>>>>    Yes the matrix condition number increases but multigrid doesn't care about that, its number of iterations should remain pretty much the same.
>>>>
>>>>    Something must be different (with this finer grid case), either the mesh becomes horrible, or the physics changes, or there are errors in the code that lead to the problem.
>>>>
>>>>    What happens if you just refine the mesh a little? Then a little more? Then a little more? Does the convergence rate suddenly go bad at some point, or does it just get worse slowly?
>>>>
>>>>    Barry
>>>>
>>>>> Much larger condition number. just to give you a perspective, it will take 6000 iterations to converge and the solver does converge. I want to reduce the number of iterations while keeping the convergence rate. that's the main drive to do so much heavy lifting around. please advise. snippet will be provided upon request.
>>>>>
>>>>> Thanks again.
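(Aside, not from the thread: Boyce's consistency question has a cheap numerical test. For a pure-Neumann pressure Poisson problem -div(grad p) = f with dp/dn = g, solvability requires that the integral of f over the domain plus the integral of g over the boundary vanish; discretely, the assembled right-hand side must be orthogonal to the constant vector. A hedged sketch, assuming b is the assembled rhs Vec:

    PetscScalar sum;
    PetscReal   bnorm;
    ierr = VecSum(b,&sum);CHKERRQ(ierr);            /* discrete "integral" of the rhs */
    ierr = VecNorm(b,NORM_2,&bnorm);CHKERRQ(ierr);
    ierr = PetscPrintf(PETSC_COMM_WORLD,"compatibility defect |sum(b)|/||b|| = %g\n",
                       (double)(PetscAbsScalar(sum)/bnorm));CHKERRQ(ierr);

If this ratio fails to shrink toward rounding level under grid refinement, the discrete boundary data are inconsistent, which would fit the observed loss of convergence on fine grids.)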
>>>>>
>>>>> On Sun, Oct 22, 2017 at 12:08 AM, Barry Smith wrote:
>>>>>
>>>>>    Oh, you change KSP but not hypre. I did not understand this.
>>>>>
>>>>>    Why not just use GMRES all the time? Why mess with BCGS if it is not robust? Not worth the small optimization if it breaks everything.
>>>>>
>>>>>    Barry
>>>>>
>>>>>> On Oct 21, 2017, at 11:05 PM, Hao Zhang wrote:
>>>>>>
>>>>>> this is the initial pressure solver output regarding the use of PETSc. it failed to converge after 40000 iterations; GMRES is then used.
>>>>>>
>>>>>> 39987 KSP preconditioned resid norm 3.853125986269e-08 true resid norm 1.147359212216e-05 ||r(i)||/||b|| 1.557696568706e-06
>>>>>> 39988 KSP preconditioned resid norm 3.853126044003e-08 true resid norm 1.147359257282e-05 ||r(i)||/||b|| 1.557696629889e-06
>>>>>> 39989 KSP preconditioned resid norm 3.853126052100e-08 true resid norm 1.147359233695e-05 ||r(i)||/||b|| 1.557696597866e-06
>>>>>> 39990 KSP preconditioned resid norm 3.853126027357e-08 true resid norm 1.147359219860e-05 ||r(i)||/||b|| 1.557696579083e-06
>>>>>> 39991 KSP preconditioned resid norm 3.853126058478e-08 true resid norm 1.147359234281e-05 ||r(i)||/||b|| 1.557696598662e-06
>>>>>> 39992 KSP preconditioned resid norm 3.853126064006e-08 true resid norm 1.147359261420e-05 ||r(i)||/||b|| 1.557696635506e-06
>>>>>> 39993 KSP preconditioned resid norm 3.853126050203e-08 true resid norm 1.147359235972e-05 ||r(i)||/||b|| 1.557696600957e-06
>>>>>> 39994 KSP preconditioned resid norm 3.853126050182e-08 true resid norm 1.147359253713e-05 ||r(i)||/||b|| 1.557696625043e-06
>>>>>> 39995 KSP preconditioned resid norm 3.853125976795e-08 true resid norm 1.147359226222e-05 ||r(i)||/||b|| 1.557696587720e-06
>>>>>> 39996 KSP preconditioned resid norm 3.853125805127e-08 true resid norm 1.147359262747e-05 ||r(i)||/||b|| 1.557696637308e-06
>>>>>> 39997 KSP preconditioned resid norm 3.853125811756e-08 true resid norm 1.147359216008e-05 ||r(i)||/||b|| 1.557696573853e-06
>>>>>> 39998 KSP preconditioned resid norm 3.853125827833e-08 true resid norm 1.147359238372e-05 ||r(i)||/||b|| 1.557696604216e-06
>>>>>> 39999 KSP preconditioned resid norm 3.853127937068e-08 true resid norm 1.147359264043e-05 ||r(i)||/||b|| 1.557696639067e-06
>>>>>> 40000 KSP preconditioned resid norm 3.853122732867e-08 true resid norm 1.147359257776e-05 ||r(i)||/||b|| 1.557696630559e-06
>>>>>> Linear solve did not converge due to DIVERGED_ITS iterations 40000
>>>>>> KSP Object: 24 MPI processes
>>>>>>   type: bcgs
>>>>>>   maximum iterations=40000, initial guess is zero
>>>>>>   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
>>>>>>   left preconditioning
>>>>>>   using PRECONDITIONED norm type for convergence test
>>>>>> PC Object: 24 MPI processes
>>>>>>   type: hypre
>>>>>>     HYPRE BoomerAMG preconditioning
>>>>>>       Cycle type V
>>>>>>       Maximum number of levels 25
>>>>>>       Maximum number of iterations PER hypre call 1
>>>>>>       Convergence tolerance PER hypre call 0.
>>>>>>       Threshold for strong coupling 0.25
>>>>>>       Interpolation truncation factor 0.
>>>>>>       Interpolation: max elements per row 0
>>>>>>       Number of levels of aggressive coarsening 0
>>>>>>       Number of paths for aggressive coarsening 1
>>>>>>       Maximum row sums 0.9
>>>>>>       Sweeps down         1
>>>>>>       Sweeps up           1
>>>>>>       Sweeps on coarse    1
>>>>>>       Relax down          symmetric-SOR/Jacobi
>>>>>>       Relax up            symmetric-SOR/Jacobi
>>>>>>       Relax on coarse     Gaussian-elimination
>>>>>>       Relax weight  (all)      1.
>>>>>>       Outer relax weight (all) 1.
>>>>>>       Using CF-relaxation
>>>>>>       Not using more complex smoothers.
>>>>>>       Measure type        local
>>>>>>       Coarsen type        Falgout
>>>>>>       Interpolation type  classical
>>>>>>   linear system matrix = precond matrix:
>>>>>>   Mat Object: A 24 MPI processes
>>>>>>     type: mpiaij
>>>>>>     rows=497664, cols=497664
>>>>>>     total: nonzeros=3363552, allocated nonzeros=6967296
>>>>>>     total number of mallocs used during MatSetValues calls =0
>>>>>>       has attached null space
>>>>>>       not using I-node (on process 0) routines
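(Aside, not from the thread: the switch Barry suggests needs no code changes once KSPSetFromOptions is called; it is purely a command-line matter. A hypothetical invocation, with the executable name and process count as placeholders:

    mpirun -n 24 ./solver -ksp_type gmres \
        -pc_type hypre -pc_hypre_type boomeramg \
        -ksp_monitor_true_residual -ksp_converged_reason -ksp_view

The monitor and converged_reason options produce exactly the diagnostics requested earlier in the thread.)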
>>>>>>
>>>>>> The solution diverges for p0! The residual is 3.853123e-08. Solve again using GMRES!
>>>>>> KSP Object: 24 MPI processes
>>>>>>   type: gmres
>>>>>>     restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
>>>>>>     happy breakdown tolerance 1e-30
>>>>>>   maximum iterations=40000, initial guess is zero
>>>>>>   tolerances:  relative=1e-14, absolute=1e-14, divergence=10000.
>>>>>>   left preconditioning
>>>>>>   using PRECONDITIONED norm type for convergence test
>>>>>> PC Object: 24 MPI processes
>>>>>>   [hypre/BoomerAMG options and matrix information identical to the BCGS view above]
>>>>>> Measure type local >>>>>> Coarsen type Falgout >>>>>> Interpolation type classical >>>>>> linear system matrix = precond matrix: >>>>>> Mat Object: A 24 MPI processes >>>>>> type: mpiaij >>>>>> rows=497664, cols=497664 >>>>>> total: nonzeros=3363552, allocated nonzeros=6967296 >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> has attached null space >>>>>> not using I-node (on process 0) routines >>>>>> 0 KSP preconditioned resid norm 1.593802941804e+01 true resid norm 7.365742695119e+00 ||r(i)||/||b|| 1.000000000000e+00 >>>>>> 1 KSP preconditioned resid norm 6.338666661133e-01 true resid norm 2.880722209358e+00 ||r(i)||/||b|| 3.910973174867e-01 >>>>>> 2 KSP preconditioned resid norm 3.913420828350e-02 true resid norm 9.544089760671e-01 ||r(i)||/||b|| 1.295740315093e-01 >>>>>> 3 KSP preconditioned resid norm 2.928070366435e-03 true resid norm 1.874294004628e-01 ||r(i)||/||b|| 2.544609664236e-02 >>>>>> 4 KSP preconditioned resid norm 2.165607525823e-04 true resid norm 3.121122463949e-02 ||r(i)||/||b|| 4.237349298146e-03 >>>>>> 5 KSP preconditioned resid norm 1.635476480407e-05 true resid norm 3.984315313831e-03 ||r(i)||/||b|| 5.409251284967e-04 >>>>>> 6 KSP preconditioned resid norm 1.283358350575e-06 true resid norm 4.566583802915e-04 ||r(i)||/||b|| 6.199760149022e-05 >>>>>> 7 KSP preconditioned resid norm 8.479469225747e-08 true resid norm 3.824581791112e-05 ||r(i)||/||b|| 5.192391248810e-06 >>>>>> 8 KSP preconditioned resid norm 5.358636504683e-09 true resid norm 2.829730442033e-06 ||r(i)||/||b|| 3.841744898187e-07 >>>>>> 9 KSP preconditioned resid norm 3.447874504193e-10 true resid norm 1.756617036538e-07 ||r(i)||/||b|| 2.384847135242e-08 >>>>>> 10 KSP preconditioned resid norm 2.480228743414e-11 true resid norm 1.309399823577e-08 ||r(i)||/||b|| 1.777688792258e-09 >>>>>> 11 KSP preconditioned resid norm 1.728967759950e-12 true resid norm 9.406967045789e-10 ||r(i)||/||b|| 1.277124036931e-10 >>>>>> 12 KSP preconditioned resid norm 1.075458632828e-13 true resid norm 5.994505136212e-11 ||r(i)||/||b|| 8.138358050689e-12 >>>>>> Linear solve converged due to CONVERGED_RTOL iterations 12 >>>>>> KSP Object: 24 MPI processes >>>>>> type: gmres >>>>>> restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >>>>>> happy breakdown tolerance 1e-30 >>>>>> maximum iterations=40000, initial guess is zero >>>>>> tolerances: relative=1e-14, absolute=1e-14, divergence=10000. >>>>>> left preconditioning >>>>>> using PRECONDITIONED norm type for convergence test >>>>>> PC Object: 24 MPI processes >>>>>> type: hypre >>>>>> HYPRE BoomerAMG preconditioning >>>>>> Cycle type V >>>>>> Maximum number of levels 25 >>>>>> Maximum number of iterations PER hypre call 1 >>>>>> Convergence tolerance PER hypre call 0. >>>>>> Threshold for strong coupling 0.25 >>>>>> Interpolation truncation factor 0. >>>>>> Interpolation: max elements per row 0 >>>>>> Number of levels of aggressive coarsening 0 >>>>>> Number of paths for aggressive coarsening 1 >>>>>> Maximum row sums 0.9 >>>>>> Sweeps down 1 >>>>>> Sweeps up 1 >>>>>> Sweeps on coarse 1 >>>>>> Relax down symmetric-SOR/Jacobi >>>>>> Relax up symmetric-SOR/Jacobi >>>>>> Relax on coarse Gaussian-elimination >>>>>> Relax weight (all) 1. >>>>>> Outer relax weight (all) 1. >>>>>> Using CF-relaxation >>>>>> Not using more complex smoothers. 
>>>>>> Measure type local >>>>>> Coarsen type Falgout >>>>>> Interpolation type classical >>>>>> linear system matrix = precond matrix: >>>>>> Mat Object: A 24 MPI processes >>>>>> type: mpiaij >>>>>> rows=497664, cols=497664 >>>>>> total: nonzeros=3363552, allocated nonzeros=6967296 >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> has attached null space >>>>>> not using I-node (on process 0) routines >>>>>> The max singular value of A = 1.000872 in poisson_solver3d_P0_vd >>>>>> The min singular value of A = 0.667688 in poisson_solver3d_P0_vd >>>>>> The Cond Num of A = 1.499012 in poisson_solver3d_P0_vd >>>>>> In poisson_solver3d_pressure(): num_iter = 12, rel_residual = 1.075459e-13 >>>>>> >>>>>> The max value of p0 is 0.03115845493408858 >>>>>> >>>>>> The min value of p0 is -0.07156715468428149 >>>>>> >>>>>> On Sun, Oct 22, 2017 at 12:00 AM, Barry Smith wrote: >>>>>> >>>>>>> On Oct 21, 2017, at 10:50 PM, Hao Zhang wrote: >>>>>>> >>>>>>> the incompressible NS solver algorithm calls the PETSc solver at different stages of each time step. The case where you said "This is good. 12 digit reduction" is after the initial pressure solve, where HYPRE usually does not give good convergence, so the fall-back GMRES solver is called afterwards. >>>>>> >>>>>> Hmm, I don't understand. hypre should do well on a pressure solve. In fact, very well. >>>>>>> >>>>>>> Barry, you were mentioning that I could have a wrong nullspace. That particular solver is meant to give an initial pressure profile for a 3d incompressible NS simulation using all Neumann boundary conditions. Could you give some insight into how to test whether I have a wrong nullspace? >>>>>> >>>>>> -ksp_test_null_space >>>>>> >>>>>> But if your null space really does come from all Neumann boundary conditions then it likely is not wrong. >>>>>> >>>>>> Barry >>>>>> >>>>>>> >>>>>>> Thanks! >>>>>>> >>>>>>> On Sat, Oct 21, 2017 at 11:41 PM, Barry Smith wrote: >>>>>>> >>>>>>> This is good. You get more than a 12 digit reduction in the true residual norm. This is good AMG convergence, expected when everything goes well. >>>>>>> >>>>>>> What is different in this case from the previous case that does not converge reasonably? >>>>>>> >>>>>>> Barry >>>>>>> >>>>>>> >>>>>>>> On Oct 21, 2017, at 9:29 PM, Hao Zhang wrote: >>>>>>>> >>>>>>>> Barry, please advise what you make of this. This is a Poisson solver for a 3d case with all Neumann BCs; a finite difference scheme was used. >>>>>>>> Thanks! I'm in learning mode. >>>>>>>> >>>>>>>> KSP Object: 24 MPI processes >>>>>>>> type: bcgs >>>>>>>> maximum iterations=40000, initial guess is zero >>>>>>>> tolerances: relative=1e-14, absolute=1e-14, divergence=10000. >>>>>>>> left preconditioning >>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>> PC Object: 24 MPI processes >>>>>>>> type: hypre >>>>>>>> HYPRE BoomerAMG preconditioning >>>>>>>> Cycle type V >>>>>>>> Maximum number of levels 25 >>>>>>>> Maximum number of iterations PER hypre call 1 >>>>>>>> Convergence tolerance PER hypre call 0. >>>>>>>> Threshold for strong coupling 0.25 >>>>>>>> Interpolation truncation factor 0. >>>>>>>> Interpolation: max elements per row 0 >>>>>>>> Number of levels of aggressive coarsening 0 >>>>>>>> Number of paths for aggressive coarsening 1 >>>>>>>> Maximum row sums 0.9 >>>>>>>> Sweeps down 1 >>>>>>>> Sweeps up 1 >>>>>>>> Sweeps on coarse 1 >>>>>>>> Relax down symmetric-SOR/Jacobi >>>>>>>> Relax up symmetric-SOR/Jacobi >>>>>>>> Relax on coarse Gaussian-elimination >>>>>>>> Relax weight (all) 1.
>>>>>>>> Outer relax weight (all) 1. >>>>>>>> Using CF-relaxation >>>>>>>> Not using more complex smoothers. >>>>>>>> Measure type local >>>>>>>> Coarsen type Falgout >>>>>>>> Interpolation type classical >>>>>>>> linear system matrix = precond matrix: >>>>>>>> Mat Object: A 24 MPI processes >>>>>>>> type: mpiaij >>>>>>>> rows=497664, cols=497664 >>>>>>>> total: nonzeros=3363552, allocated nonzeros=6967296 >>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>> has attached null space >>>>>>>> not using I-node (on process 0) routines >>>>>>>> 0 KSP preconditioned resid norm 2.697270170623e-02 true resid norm 9.637159071207e+00 ||r(i)||/||b|| 1.000000000000e+00 >>>>>>>> 1 KSP preconditioned resid norm 4.828857674609e-04 true resid norm 6.294379664645e-01 ||r(i)||/||b|| 6.531364293291e-02 >>>>>>>> 2 KSP preconditioned resid norm 4.533649595815e-06 true resid norm 1.135508605857e-02 ||r(i)||/||b|| 1.178260727531e-03 >>>>>>>> 3 KSP preconditioned resid norm 1.131704082606e-07 true resid norm 1.196393029874e-04 ||r(i)||/||b|| 1.241437462051e-05 >>>>>>>> 4 KSP preconditioned resid norm 3.866281379569e-10 true resid norm 5.121520801846e-07 ||r(i)||/||b|| 5.314347064320e-08 >>>>>>>> 5 KSP preconditioned resid norm 1.070114785241e-11 true resid norm 1.203693733135e-08 ||r(i)||/||b|| 1.249013038221e-09 >>>>>>>> 6 KSP preconditioned resid norm 2.578780418765e-14 true resid norm 6.020297525927e-11 ||r(i)||/||b|| 6.246962908306e-12 >>>>>>>> 7 KSP preconditioned resid norm 8.691764190203e-16 true resid norm 1.866088098154e-12 ||r(i)||/||b|| 1.936346680973e-13 >>>>>>>> Linear solve converged due to CONVERGED_ATOL iterations 7 >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> On Sat, Oct 21, 2017 at 6:53 PM, Barry Smith wrote: >>>>>>>> >>>>>>>>> On Oct 21, 2017, at 5:47 PM, Hao Zhang wrote: >>>>>>>>> >>>>>>>>> hi, Barry: >>>>>>>>> what do you mean by "absurd" for setting tolerance = 1e-14? >>>>>>>> >>>>>>>> Trying to decrease the initial residual norm down by a factor of 1e-14 with an iterative method (or even a direct method) is unrealistic (usually unachievable) and almost never necessary. You are requiring || r_n || < 1.e-14 || r_0 || when with double precision numbers you only have roughly 14 decimal digits total to compute with. Round off alone will lead to differences far larger than 1e-14. >>>>>>>> >>>>>>>> If you are using the solver in the context of a nonlinear problem (i.e. inside Newton's method) then 1.e-6 is generally more than plenty to get quadratic convergence of Newton's method. >>>>>>>> >>>>>>>> If you are solving a linear problem then it is extremely likely that errors due to discretization (from the finite element method etc.) and the model are much, much larger than even 1.e-8. >>>>>>>> >>>>>>>> So, in summary >>>>>>>> >>>>>>>> 1.e-14 is probably unachievable >>>>>>>> >>>>>>>> 1.e-14 is almost for sure not needed. >>>>>>>> >>>>>>>> Barry >>>>>>>> >>>>>>>> >>>>>>>>> >>>>>>>>> On Sat, Oct 21, 2017 at 18:42 Barry Smith wrote: >>>>>>>>> >>>>>>>>> Run with -ksp_view_mat binary -ksp_view_rhs binary and send the resulting output file called binaryoutput to petsc-maint at mcs.anl.gov >>>>>>>>> >>>>>>>>> Note you can also use -ksp_type gmres with hypre; there is unlikely to be a reason to use bcgs >>>>>>>>> >>>>>>>>> BTW: tolerances: relative=1e-14, is absurd >>>>>>>>> >>>>>>>>> My guess is your null space is incorrect. >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>>
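
[A minimal sketch, added here for readers of the archive, of what Barry's two suggestions (a realistic tolerance and a verified, consistent null space) could look like in code. This is not code from the original posts: it reuses the ksp, A and b names from the snippet quoted further down in this thread, and the rtol of 1.e-8 is illustrative only.]

#include <petscksp.h>

/* Sketch: assumes ksp, A (the all-Neumann Poisson matrix) and b (the
   right-hand side) have already been created as in the quoted code. */
MatNullSpace   nullsp;
PetscBool      isNull;
PetscErrorCode ierr;

ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp);
ierr = MatSetNullSpace(A,nullsp);

/* Verify that the constant vector really is a null vector of A; this is
   what -ksp_test_null_space checks from the command line. */
ierr = MatNullSpaceTest(nullsp,A,&isNull);
if (!isNull) PetscPrintf(PETSC_COMM_WORLD,"constants are NOT a null space of A\n");

/* Make the right-hand side consistent with the null space: project out
   the null-space component, i.e. enforce a mean-zero b for an
   all-Neumann problem. */
ierr = MatNullSpaceRemove(nullsp,b);

/* A realistic relative tolerance; 1e-14 demands more digits than double
   precision can deliver. */
ierr = KSPSetTolerances(ksp,1.e-8,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);

If MatNullSpaceTest fails here, the discretization of the boundary conditions, not the solver, is the first thing to inspect.

>>>>>>>>>> On Oct 21, 2017, at 4:34 PM, Hao Zhang wrote: >>>>>>>>>> >>>>>>>>>> if this solver doesn't converge.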
I have a fall-back solution, which uses GMRES solver. this setup is fine with me. I just want to know if HYPRE is a reliable solution for me. Or I will have to go without preconditioner. >>>>>>>>>> >>>>>>>>>> Thanks! >>>>>>>>>> >>>>>>>>>> On Sat, Oct 21, 2017 at 5:30 PM, Hao Zhang wrote: >>>>>>>>>> this is serial run. still dumping output. parallel more or less the same. >>>>>>>>>> >>>>>>>>>> KSP Object: 1 MPI processes >>>>>>>>>> type: bcgs >>>>>>>>>> maximum iterations=40000, initial guess is zero >>>>>>>>>> tolerances: relative=1e-14, absolute=1e-14, divergence=10000. >>>>>>>>>> left preconditioning >>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>> PC Object: 1 MPI processes >>>>>>>>>> type: hypre >>>>>>>>>> HYPRE BoomerAMG preconditioning >>>>>>>>>> Cycle type V >>>>>>>>>> Maximum number of levels 25 >>>>>>>>>> Maximum number of iterations PER hypre call 1 >>>>>>>>>> Convergence tolerance PER hypre call 0. >>>>>>>>>> Threshold for strong coupling 0.25 >>>>>>>>>> Interpolation truncation factor 0. >>>>>>>>>> Interpolation: max elements per row 0 >>>>>>>>>> Number of levels of aggressive coarsening 0 >>>>>>>>>> Number of paths for aggressive coarsening 1 >>>>>>>>>> Maximum row sums 0.9 >>>>>>>>>> Sweeps down 1 >>>>>>>>>> Sweeps up 1 >>>>>>>>>> Sweeps on coarse 1 >>>>>>>>>> Relax down symmetric-SOR/Jacobi >>>>>>>>>> Relax up symmetric-SOR/Jacobi >>>>>>>>>> Relax on coarse Gaussian-elimination >>>>>>>>>> Relax weight (all) 1. >>>>>>>>>> Outer relax weight (all) 1. >>>>>>>>>> Using CF-relaxation >>>>>>>>>> Not using more complex smoothers. >>>>>>>>>> Measure type local >>>>>>>>>> Coarsen type Falgout >>>>>>>>>> Interpolation type classical >>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>> Mat Object: A 1 MPI processes >>>>>>>>>> type: seqaij >>>>>>>>>> rows=497664, cols=497664 >>>>>>>>>> total: nonzeros=3363552, allocated nonzeros=3483648 >>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>> has attached null space >>>>>>>>>> not using I-node routines >>>>>>>>>> 0 KSP preconditioned resid norm 1.630377897956e+01 true resid norm 7.365742695123e+00 ||r(i)||/||b|| 1.000000000000e+00 >>>>>>>>>> 1 KSP preconditioned resid norm 3.815819909205e-02 true resid norm 7.592300567353e-01 ||r(i)||/||b|| 1.030758320187e-01 >>>>>>>>>> 2 KSP preconditioned resid norm 4.312671277701e-04 true resid norm 2.553060965521e-02 ||r(i)||/||b|| 3.466128360975e-03 >>>>>>>>>> 3 KSP preconditioned resid norm 3.011875330569e-04 true resid norm 6.836208627386e-04 ||r(i)||/||b|| 9.281085303065e-05 >>>>>>>>>> 4 KSP preconditioned resid norm 3.011783821295e-04 true resid norm 2.504661140204e-04 ||r(i)||/||b|| 3.400418998972e-05 >>>>>>>>>> 5 KSP preconditioned resid norm 3.011783818372e-04 true resid norm 2.504053673004e-04 ||r(i)||/||b|| 3.399594279422e-05 >>>>>>>>>> 6 KSP preconditioned resid norm 3.011783818375e-04 true resid norm 2.503984078890e-04 ||r(i)||/||b|| 3.399499795925e-05 >>>>>>>>>> 7 KSP preconditioned resid norm 3.011783887442e-04 true resid norm 2.504121501772e-04 ||r(i)||/||b|| 3.399686366224e-05 >>>>>>>>>> 8 KSP preconditioned resid norm 3.010913654181e-04 true resid norm 2.504150259031e-04 ||r(i)||/||b|| 3.399725408124e-05 >>>>>>>>>> 9 KSP preconditioned resid norm 3.006520688232e-04 true resid norm 2.504061607382e-04 ||r(i)||/||b|| 3.399605051423e-05 >>>>>>>>>> 10 KSP preconditioned resid norm 3.007309991942e-04 true resid norm 2.503843638523e-04 ||r(i)||/||b|| 3.399309128978e-05 >>>>>>>>>> 11 KSP preconditioned resid norm 3.015946168077e-04 
true resid norm 2.503644677844e-04 ||r(i)||/||b|| 3.399039012728e-05 >>>>>>>>>> 12 KSP preconditioned resid norm 2.956643907377e-04 true resid norm 2.503863046509e-04 ||r(i)||/||b|| 3.399335477965e-05 >>>>>>>>>> 13 KSP preconditioned resid norm 2.997992358936e-04 true resid norm 2.504336903093e-04 ||r(i)||/||b|| 3.399978802886e-05 >>>>>>>>>> 14 KSP preconditioned resid norm 2.481415420420e-05 true resid norm 2.491591201250e-04 ||r(i)||/||b|| 3.382674774806e-05 >>>>>>>>>> 15 KSP preconditioned resid norm 2.615494786181e-05 true resid norm 2.490353237273e-04 ||r(i)||/||b|| 3.380994069915e-05 >>>>>>>>>> 16 KSP preconditioned resid norm 2.645126692130e-05 true resid norm 2.490535523344e-04 ||r(i)||/||b|| 3.381241548111e-05 >>>>>>>>>> 17 KSP preconditioned resid norm 2.667223026209e-05 true resid norm 2.490482602536e-04 ||r(i)||/||b|| 3.381169700898e-05 >>>>>>>>>> 18 KSP preconditioned resid norm 2.650813432116e-05 true resid norm 2.490473169014e-04 ||r(i)||/||b|| 3.381156893606e-05 >>>>>>>>>> 19 KSP preconditioned resid norm 2.613309555449e-05 true resid norm 2.490465633690e-04 ||r(i)||/||b|| 3.381146663375e-05 >>>>>>>>>> 20 KSP preconditioned resid norm 2.644160446804e-05 true resid norm 2.490532739949e-04 ||r(i)||/||b|| 3.381237769272e-05 >>>>>>>>>> 21 KSP preconditioned resid norm 2.635987608975e-05 true resid norm 2.490499548926e-04 ||r(i)||/||b|| 3.381192707933e-05 >>>>>>>>>> 22 KSP preconditioned resid norm 2.640527129095e-05 true resid norm 2.490594066529e-04 ||r(i)||/||b|| 3.381321028466e-05 >>>>>>>>>> 23 KSP preconditioned resid norm 2.627505117691e-05 true resid norm 2.490550162585e-04 ||r(i)||/||b|| 3.381261422875e-05 >>>>>>>>>> 24 KSP preconditioned resid norm 2.642659196388e-05 true resid norm 2.490504347640e-04 ||r(i)||/||b|| 3.381199222842e-05 >>>>>>>>>> 25 KSP preconditioned resid norm 2.659432190695e-05 true resid norm 2.490510775152e-04 ||r(i)||/||b|| 3.381207949065e-05 >>>>>>>>>> 26 KSP preconditioned resid norm 2.687918062951e-05 true resid norm 2.490518882015e-04 ||r(i)||/||b|| 3.381218955237e-05 >>>>>>>>>> 27 KSP preconditioned resid norm 2.662909048432e-05 true resid norm 2.490446263285e-04 ||r(i)||/||b|| 3.381120365409e-05 >>>>>>>>>> 28 KSP preconditioned resid norm 2.085466483199e-05 true resid norm 2.490131612366e-04 ||r(i)||/||b|| 3.380693183886e-05 >>>>>>>>>> 29 KSP preconditioned resid norm 2.098541330282e-05 true resid norm 2.490126933398e-04 ||r(i)||/||b|| 3.380686831549e-05 >>>>>>>>>> 30 KSP preconditioned resid norm 2.175345180286e-05 true resid norm 2.490098852429e-04 ||r(i)||/||b|| 3.380648707805e-05 >>>>>>>>>> 31 KSP preconditioned resid norm 2.182182437676e-05 true resid norm 2.490028301020e-04 ||r(i)||/||b|| 3.380552924648e-05 >>>>>>>>>> 32 KSP preconditioned resid norm 2.152970404369e-05 true resid norm 2.490089939838e-04 ||r(i)||/||b|| 3.380636607747e-05 >>>>>>>>>> 33 KSP preconditioned resid norm 2.187932450016e-05 true resid norm 2.490085293931e-04 ||r(i)||/||b|| 3.380630300295e-05 >>>>>>>>>> 34 KSP preconditioned resid norm 2.207255875067e-05 true resid norm 2.490039036092e-04 ||r(i)||/||b|| 3.380567498971e-05 >>>>>>>>>> 35 KSP preconditioned resid norm 2.205060279701e-05 true resid norm 2.490101636150e-04 ||r(i)||/||b|| 3.380652487086e-05 >>>>>>>>>> 36 KSP preconditioned resid norm 2.168654200416e-05 true resid norm 2.490091609876e-04 ||r(i)||/||b|| 3.380638875052e-05 >>>>>>>>>> 37 KSP preconditioned resid norm 2.164521042361e-05 true resid norm 2.490083143913e-04 ||r(i)||/||b|| 3.380627381352e-05 >>>>>>>>>> 38 KSP preconditioned resid norm 
2.154429063973e-05 true resid norm 2.490075485470e-04 ||r(i)||/||b|| 3.380616983972e-05 >>>>>>>>>> 39 KSP preconditioned resid norm 2.165962086228e-05 true resid norm 2.490099695056e-04 ||r(i)||/||b|| 3.380649851786e-05 >>>>>>>>>> 40 KSP preconditioned resid norm 2.153877616091e-05 true resid norm 2.490090652619e-04 ||r(i)||/||b|| 3.380637575444e-05 >>>>>>>>>> 41 KSP preconditioned resid norm 2.347651187611e-05 true resid norm 2.490233544624e-04 ||r(i)||/||b|| 3.380831570825e-05 >>>>>>>>>> 42 KSP preconditioned resid norm 2.352860162514e-05 true resid norm 2.490191394202e-04 ||r(i)||/||b|| 3.380774345879e-05 >>>>>>>>>> 43 KSP preconditioned resid norm 2.312377506928e-05 true resid norm 2.490209491359e-04 ||r(i)||/||b|| 3.380798915237e-05 >>>>>>>>>> 44 KSP preconditioned resid norm 2.295770973533e-05 true resid norm 2.490178136759e-04 ||r(i)||/||b|| 3.380756347093e-05 >>>>>>>>>> 45 KSP preconditioned resid norm 2.833646456041e-05 true resid norm 2.489991602651e-04 ||r(i)||/||b|| 3.380503101608e-05 >>>>>>>>>> 46 KSP preconditioned resid norm 2.760296424494e-05 true resid norm 2.490104320666e-04 ||r(i)||/||b|| 3.380656131682e-05 >>>>>>>>>> 47 KSP preconditioned resid norm 2.451504295239e-05 true resid norm 2.490241388672e-04 ||r(i)||/||b|| 3.380842220189e-05 >>>>>>>>>> 48 KSP preconditioned resid norm 2.512391514098e-05 true resid norm 2.490245923753e-04 ||r(i)||/||b|| 3.380848377180e-05 >>>>>>>>>> 49 KSP preconditioned resid norm 2.483419450528e-05 true resid norm 2.490273364402e-04 ||r(i)||/||b|| 3.380885631602e-05 >>>>>>>>>> 50 KSP preconditioned resid norm 2.507460538466e-05 true resid norm 2.490309488780e-04 ||r(i)||/||b|| 3.380934675371e-05 >>>>>>>>>> 51 KSP preconditioned resid norm 2.499708772881e-05 true resid norm 2.490300908170e-04 ||r(i)||/||b|| 3.380923026022e-05 >>>>>>>>>> 52 KSP preconditioned resid norm 1.059778259446e-05 true resid norm 2.489352833521e-04 ||r(i)||/||b|| 3.379635885420e-05 >>>>>>>>>> 53 KSP preconditioned resid norm 1.074975117060e-05 true resid norm 2.489294722901e-04 ||r(i)||/||b|| 3.379556992330e-05 >>>>>>>>>> 54 KSP preconditioned resid norm 1.095242219559e-05 true resid norm 2.489295454212e-04 ||r(i)||/||b|| 3.379557985184e-05 >>>>>>>>>> 55 KSP preconditioned resid norm 8.359999674720e-06 true resid norm 2.489673581944e-04 ||r(i)||/||b|| 3.380071345137e-05 >>>>>>>>>> 56 KSP preconditioned resid norm 8.368232998470e-06 true resid norm 2.489700421343e-04 ||r(i)||/||b|| 3.380107783281e-05 >>>>>>>>>> 57 KSP preconditioned resid norm 8.443378041101e-06 true resid norm 2.489702900875e-04 ||r(i)||/||b|| 3.380111149584e-05 >>>>>>>>>> 58 KSP preconditioned resid norm 8.647159584302e-06 true resid norm 2.489640805831e-04 ||r(i)||/||b|| 3.380026847095e-05 >>>>>>>>>> 59 KSP preconditioned resid norm 1.024742790737e-05 true resid norm 2.489447846660e-04 ||r(i)||/||b|| 3.379764878711e-05 >>>>>>>>>> 60 KSP preconditioned resid norm 1.033394118910e-05 true resid norm 2.489441404923e-04 ||r(i)||/||b|| 3.379756133175e-05 >>>>>>>>>> 61 KSP preconditioned resid norm 1.030066336446e-05 true resid norm 2.489399918556e-04 ||r(i)||/||b|| 3.379699809776e-05 >>>>>>>>>> 62 KSP preconditioned resid norm 1.029956398963e-05 true resid norm 2.489445295139e-04 ||r(i)||/||b|| 3.379761414674e-05 >>>>>>>>>> 63 KSP preconditioned resid norm 1.028190129002e-05 true resid norm 2.489456200527e-04 ||r(i)||/||b|| 3.379776220225e-05 >>>>>>>>>> 64 KSP preconditioned resid norm 9.878799185773e-06 true resid norm 2.489488742330e-04 ||r(i)||/||b|| 3.379820400160e-05 >>>>>>>>>> 65 KSP preconditioned 
resid norm 9.917711104174e-06 true resid norm 2.489478066593e-04 ||r(i)||/||b|| 3.379805906391e-05 >>>>>>>>>> 66 KSP preconditioned resid norm 1.003572019576e-05 true resid norm 2.489441995703e-04 ||r(i)||/||b|| 3.379756935240e-05 >>>>>>>>>> 67 KSP preconditioned resid norm 9.924487278236e-06 true resid norm 2.489475403451e-04 ||r(i)||/||b|| 3.379802290812e-05 >>>>>>>>>> 68 KSP preconditioned resid norm 9.804213483359e-06 true resid norm 2.489457781760e-04 ||r(i)||/||b|| 3.379778366964e-05 >>>>>>>>>> 69 KSP preconditioned resid norm 9.748922705476e-06 true resid norm 2.489408473578e-04 ||r(i)||/||b|| 3.379711424383e-05 >>>>>>>>>> 70 KSP preconditioned resid norm 9.886044523689e-06 true resid norm 2.489514438395e-04 ||r(i)||/||b|| 3.379855286071e-05 >>>>>>>>>> 71 KSP preconditioned resid norm 1.083888478689e-05 true resid norm 2.489420898851e-04 ||r(i)||/||b|| 3.379728293386e-05 >>>>>>>>>> 72 KSP preconditioned resid norm 1.106561823757e-05 true resid norm 2.489364778104e-04 ||r(i)||/||b|| 3.379652101821e-05 >>>>>>>>>> 73 KSP preconditioned resid norm 1.132091515426e-05 true resid norm 2.489456804535e-04 ||r(i)||/||b|| 3.379777040248e-05 >>>>>>>>>> 74 KSP preconditioned resid norm 1.330905328963e-05 true resid norm 2.489317458981e-04 ||r(i)||/||b|| 3.379587859660e-05 >>>>>>>>>> 75 KSP preconditioned resid norm 1.305555302619e-05 true resid norm 2.489320939810e-04 ||r(i)||/||b|| 3.379592585359e-05 >>>>>>>>>> 76 KSP preconditioned resid norm 1.308083397399e-05 true resid norm 2.489299951581e-04 ||r(i)||/||b|| 3.379564090977e-05 >>>>>>>>>> 77 KSP preconditioned resid norm 1.320098861853e-05 true resid norm 2.489323669317e-04 ||r(i)||/||b|| 3.379596291036e-05 >>>>>>>>>> 78 KSP preconditioned resid norm 1.300160788274e-05 true resid norm 2.489306393356e-04 ||r(i)||/||b|| 3.379572836564e-05 >>>>>>>>>> 79 KSP preconditioned resid norm 1.317651537793e-05 true resid norm 2.489381364970e-04 ||r(i)||/||b|| 3.379674620752e-05 >>>>>>>>>> 80 KSP preconditioned resid norm 1.309769805765e-05 true resid norm 2.489285056062e-04 ||r(i)||/||b|| 3.379543868279e-05 >>>>>>>>>> 81 KSP preconditioned resid norm 1.293686496271e-05 true resid norm 2.489347818072e-04 ||r(i)||/||b|| 3.379629076264e-05 >>>>>>>>>> 82 KSP preconditioned resid norm 1.311788285799e-05 true resid norm 2.489320040215e-04 ||r(i)||/||b|| 3.379591364037e-05 >>>>>>>>>> 83 KSP preconditioned resid norm 1.313667378798e-05 true resid norm 2.489329437217e-04 ||r(i)||/||b|| 3.379604121748e-05 >>>>>>>>>> 84 KSP preconditioned resid norm 1.416138205017e-05 true resid norm 2.489266908838e-04 ||r(i)||/||b|| 3.379519230948e-05 >>>>>>>>>> 85 KSP preconditioned resid norm 1.452253464774e-05 true resid norm 2.489285688375e-04 ||r(i)||/||b|| 3.379544726729e-05 >>>>>>>>>> 86 KSP preconditioned resid norm 1.426709413370e-05 true resid norm 2.489362313402e-04 ||r(i)||/||b|| 3.379648755651e-05 >>>>>>>>>> 87 KSP preconditioned resid norm 1.427480849552e-05 true resid norm 2.489378183000e-04 ||r(i)||/||b|| 3.379670300795e-05 >>>>>>>>>> 88 KSP preconditioned resid norm 1.413870980147e-05 true resid norm 2.489325756118e-04 ||r(i)||/||b|| 3.379599124153e-05 >>>>>>>>>> 89 KSP preconditioned resid norm 1.353259857657e-05 true resid norm 2.489318968308e-04 ||r(i)||/||b|| 3.379589908776e-05 >>>>>>>>>> 90 KSP preconditioned resid norm 1.347676448611e-05 true resid norm 2.489332074417e-04 ||r(i)||/||b|| 3.379607702106e-05 >>>>>>>>>> 91 KSP preconditioned resid norm 1.362825902909e-05 true resid norm 2.489344974971e-04 ||r(i)||/||b|| 3.379625216367e-05 >>>>>>>>>> 92 KSP 
preconditioned resid norm 1.346280901052e-05 true resid norm 2.489302570131e-04 ||r(i)||/||b|| 3.379567646016e-05 >>>>>>>>>> 93 KSP preconditioned resid norm 1.328052169696e-05 true resid norm 2.489346601224e-04 ||r(i)||/||b|| 3.379627424228e-05 >>>>>>>>>> 94 KSP preconditioned resid norm 1.554682082515e-05 true resid norm 2.489309078759e-04 ||r(i)||/||b|| 3.379576482365e-05 >>>>>>>>>> 95 KSP preconditioned resid norm 1.557128675775e-05 true resid norm 2.489317143582e-04 ||r(i)||/||b|| 3.379587431462e-05 >>>>>>>>>> 96 KSP preconditioned resid norm 1.542571813923e-05 true resid norm 2.489319910303e-04 ||r(i)||/||b|| 3.379591187663e-05 >>>>>>>>>> 97 KSP preconditioned resid norm 1.570516684444e-05 true resid norm 2.489321980894e-04 ||r(i)||/||b|| 3.379593998772e-05 >>>>>>>>>> 98 KSP preconditioned resid norm 1.600431789899e-05 true resid norm 2.489297450311e-04 ||r(i)||/||b|| 3.379560695162e-05 >>>>>>>>>> 99 KSP preconditioned resid norm 1.587495554658e-05 true resid norm 2.489339000570e-04 ||r(i)||/||b|| 3.379617105303e-05 >>>>>>>>>> 100 KSP preconditioned resid norm 1.621163002878e-05 true resid norm 2.489299953360e-04 ||r(i)||/||b|| 3.379564093392e-05 >>>>>>>>>> 101 KSP preconditioned resid norm 1.627060872574e-05 true resid norm 2.489301570161e-04 ||r(i)||/||b|| 3.379566288419e-05 >>>>>>>>>> 102 KSP preconditioned resid norm 1.622931647243e-05 true resid norm 2.489277930910e-04 ||r(i)||/||b|| 3.379534194913e-05 >>>>>>>>>> 103 KSP preconditioned resid norm 1.612544300282e-05 true resid norm 2.489317483299e-04 ||r(i)||/||b|| 3.379587892674e-05 >>>>>>>>>> 104 KSP preconditioned resid norm 1.880131646630e-05 true resid norm 2.489335862583e-04 ||r(i)||/||b|| 3.379612845059e-05 >>>>>>>>>> 105 KSP preconditioned resid norm 1.880563295793e-05 true resid norm 2.489365017923e-04 ||r(i)||/||b|| 3.379652427408e-05 >>>>>>>>>> 106 KSP preconditioned resid norm 1.860619184027e-05 true resid norm 2.489362382373e-04 ||r(i)||/||b|| 3.379648849288e-05 >>>>>>>>>> 107 KSP preconditioned resid norm 1.877134148719e-05 true resid norm 2.489425484523e-04 ||r(i)||/||b|| 3.379734519061e-05 >>>>>>>>>> 108 KSP preconditioned resid norm 1.914810713538e-05 true resid norm 2.489347415573e-04 ||r(i)||/||b|| 3.379628529818e-05 >>>>>>>>>> 109 KSP preconditioned resid norm 1.220673255622e-05 true resid norm 2.490196186357e-04 ||r(i)||/||b|| 3.380780851884e-05 >>>>>>>>>> 110 KSP preconditioned resid norm 1.215819132910e-05 true resid norm 2.490233518072e-04 ||r(i)||/||b|| 3.380831534776e-05 >>>>>>>>>> 111 KSP preconditioned resid norm 1.196565427400e-05 true resid norm 2.490279438906e-04 ||r(i)||/||b|| 3.380893878570e-05 >>>>>>>>>> 112 KSP preconditioned resid norm 1.171748185197e-05 true resid norm 2.490242578983e-04 ||r(i)||/||b|| 3.380843836198e-05 >>>>>>>>>> 113 KSP preconditioned resid norm 1.162855824118e-05 true resid norm 2.490229018536e-04 ||r(i)||/||b|| 3.380825426043e-05 >>>>>>>>>> 114 KSP preconditioned resid norm 1.175594685689e-05 true resid norm 2.490274328440e-04 ||r(i)||/||b|| 3.380886940415e-05 >>>>>>>>>> 115 KSP preconditioned resid norm 1.167979454122e-05 true resid norm 2.490271161036e-04 ||r(i)||/||b|| 3.380882640232e-05 >>>>>>>>>> 116 KSP preconditioned resid norm 1.181010893019e-05 true resid norm 2.490235420657e-04 ||r(i)||/||b|| 3.380834117795e-05 >>>>>>>>>> 117 KSP preconditioned resid norm 1.175206638194e-05 true resid norm 2.490263165345e-04 ||r(i)||/||b|| 3.380871784992e-05 >>>>>>>>>> 118 KSP preconditioned resid norm 1.183804125791e-05 true resid norm 2.490221353083e-04 ||r(i)||/||b|| 
3.380815019145e-05 >>>>>>>>>> 119 KSP preconditioned resid norm 1.186426973727e-05 true resid norm 2.490227115336e-04 ||r(i)||/||b|| 3.380822842189e-05 >>>>>>>>>> 120 KSP preconditioned resid norm 1.181986776689e-05 true resid norm 2.490257884230e-04 ||r(i)||/||b|| 3.380864615159e-05 >>>>>>>>>> 121 KSP preconditioned resid norm 1.131443277370e-05 true resid norm 2.490259110230e-04 ||r(i)||/||b|| 3.380866279620e-05 >>>>>>>>>> 122 KSP preconditioned resid norm 1.114920075859e-05 true resid norm 2.490249829382e-04 ||r(i)||/||b|| 3.380853679603e-05 >>>>>>>>>> 123 KSP preconditioned resid norm 1.082073321672e-05 true resid norm 2.490314084868e-04 ||r(i)||/||b|| 3.380940915187e-05 >>>>>>>>>> 124 KSP preconditioned resid norm 3.307785860990e-06 true resid norm 2.490613501549e-04 ||r(i)||/||b|| 3.381347414155e-05 >>>>>>>>>> 125 KSP preconditioned resid norm 3.287051720572e-06 true resid norm 2.490584648195e-04 ||r(i)||/||b|| 3.381308241794e-05 >>>>>>>>>> 126 KSP preconditioned resid norm 3.286797046069e-06 true resid norm 2.490654396386e-04 ||r(i)||/||b|| 3.381402934473e-05 >>>>>>>>>> 127 KSP preconditioned resid norm 3.311592899411e-06 true resid norm 2.490627973588e-04 ||r(i)||/||b|| 3.381367061922e-05 >>>>>>>>>> 128 KSP preconditioned resid norm 3.560993694635e-06 true resid norm 2.490571732816e-04 ||r(i)||/||b|| 3.381290707406e-05 >>>>>>>>>> 129 KSP preconditioned resid norm 3.411994617661e-06 true resid norm 2.490652122141e-04 ||r(i)||/||b|| 3.381399846875e-05 >>>>>>>>>> 130 KSP preconditioned resid norm 3.412383310721e-06 true resid norm 2.490633454022e-04 ||r(i)||/||b|| 3.381374502359e-05 >>>>>>>>>> 131 KSP preconditioned resid norm 3.288320044878e-06 true resid norm 2.490639470096e-04 ||r(i)||/||b|| 3.381382669999e-05 >>>>>>>>>> 132 KSP preconditioned resid norm 3.273215756565e-06 true resid norm 2.490640390847e-04 ||r(i)||/||b|| 3.381383920043e-05 >>>>>>>>>> 133 KSP preconditioned resid norm 3.236969051459e-06 true resid norm 2.490678216102e-04 ||r(i)||/||b|| 3.381435272985e-05 >>>>>>>>>> 134 KSP preconditioned resid norm 3.203260913942e-06 true resid norm 2.490640965346e-04 ||r(i)||/||b|| 3.381384700005e-05 >>>>>>>>>> 135 KSP preconditioned resid norm 3.224117152353e-06 true resid norm 2.490655026376e-04 ||r(i)||/||b|| 3.381403789770e-05 >>>>>>>>>> 136 KSP preconditioned resid norm 3.221577997984e-06 true resid norm 2.490684737611e-04 ||r(i)||/||b|| 3.381444126823e-05 >>>>>>>>>> 137 KSP preconditioned resid norm 3.195936222128e-06 true resid norm 2.490673982333e-04 ||r(i)||/||b|| 3.381429525066e-05 >>>>>>>>>> 138 KSP preconditioned resid norm 3.207528137426e-06 true resid norm 2.490641247196e-04 ||r(i)||/||b|| 3.381385082655e-05 >>>>>>>>>> 139 KSP preconditioned resid norm 3.240134271963e-06 true resid norm 2.490615861251e-04 ||r(i)||/||b|| 3.381350617773e-05 >>>>>>>>>> 140 KSP preconditioned resid norm 2.698833607230e-06 true resid norm 2.490638954889e-04 ||r(i)||/||b|| 3.381381970535e-05 >>>>>>>>>> 141 KSP preconditioned resid norm 2.599151209137e-06 true resid norm 2.490657106698e-04 ||r(i)||/||b|| 3.381406614091e-05 >>>>>>>>>> 142 KSP preconditioned resid norm 2.633939920994e-06 true resid norm 2.490707754695e-04 ||r(i)||/||b|| 3.381475375652e-05 >>>>>>>>>> 143 KSP preconditioned resid norm 2.519609221376e-06 true resid norm 2.490639100480e-04 ||r(i)||/||b|| 3.381382168195e-05 >>>>>>>>>> 144 KSP preconditioned resid norm 3.768526937684e-06 true resid norm 2.490654096698e-04 ||r(i)||/||b|| 3.381402527606e-05 >>>>>>>>>> 145 KSP preconditioned resid norm 3.707841943289e-06 true resid 
norm 2.490630207923e-04 ||r(i)||/||b|| 3.381370095336e-05 >>>>>>>>>> 146 KSP preconditioned resid norm 3.698827503486e-06 true resid norm 2.490646071561e-04 ||r(i)||/||b|| 3.381391632387e-05 >>>>>>>>>> 147 KSP preconditioned resid norm 3.642747039615e-06 true resid norm 2.490610990161e-04 ||r(i)||/||b|| 3.381344004604e-05 >>>>>>>>>> 148 KSP preconditioned resid norm 3.613100087842e-06 true resid norm 2.490617159023e-04 ||r(i)||/||b|| 3.381352379676e-05 >>>>>>>>>> 149 KSP preconditioned resid norm 3.637646399299e-06 true resid norm 2.490648063023e-04 ||r(i)||/||b|| 3.381394336069e-05 >>>>>>>>>> 150 KSP preconditioned resid norm 3.640235367864e-06 true resid norm 2.490648516718e-04 ||r(i)||/||b|| 3.381394952022e-05 >>>>>>>>>> 151 KSP preconditioned resid norm 3.724708848977e-06 true resid norm 2.490622201040e-04 ||r(i)||/||b|| 3.381359224901e-05 >>>>>>>>>> 152 KSP preconditioned resid norm 3.665185002770e-06 true resid norm 2.490664302790e-04 ||r(i)||/||b|| 3.381416383766e-05 >>>>>>>>>> 153 KSP preconditioned resid norm 3.348992579120e-06 true resid norm 2.490655722697e-04 ||r(i)||/||b|| 3.381404735121e-05 >>>>>>>>>> 154 KSP preconditioned resid norm 3.309431137943e-06 true resid norm 2.490727563300e-04 ||r(i)||/||b|| 3.381502268535e-05 >>>>>>>>>> 155 KSP preconditioned resid norm 3.299031245428e-06 true resid norm 2.490688392843e-04 ||r(i)||/||b|| 3.381449089298e-05 >>>>>>>>>> 156 KSP preconditioned resid norm 3.297127463503e-06 true resid norm 2.490642207769e-04 ||r(i)||/||b|| 3.381386386763e-05 >>>>>>>>>> 157 KSP preconditioned resid norm 3.297370198641e-06 true resid norm 2.490666651723e-04 ||r(i)||/||b|| 3.381419572764e-05 >>>>>>>>>> 158 KSP preconditioned resid norm 3.290873165210e-06 true resid norm 2.490679189538e-04 ||r(i)||/||b|| 3.381436594557e-05 >>>>>>>>>> 159 KSP preconditioned resid norm 3.346705292419e-06 true resid norm 2.490617329776e-04 ||r(i)||/||b|| 3.381352611496e-05 >>>>>>>>>> 160 KSP preconditioned resid norm 3.429583550890e-06 true resid norm 2.490675116236e-04 ||r(i)||/||b|| 3.381431064494e-05 >>>>>>>>>> 161 KSP preconditioned resid norm 3.425238504679e-06 true resid norm 2.490648199058e-04 ||r(i)||/||b|| 3.381394520756e-05 >>>>>>>>>> 162 KSP preconditioned resid norm 3.423484857849e-06 true resid norm 2.490723208298e-04 ||r(i)||/||b|| 3.381496356025e-05 >>>>>>>>>> 163 KSP preconditioned resid norm 3.383655922943e-06 true resid norm 2.490659981249e-04 ||r(i)||/||b|| 3.381410516686e-05 >>>>>>>>>> 164 KSP preconditioned resid norm 3.477197358452e-06 true resid norm 2.490665979073e-04 ||r(i)||/||b|| 3.381418659549e-05 >>>>>>>>>> 165 KSP preconditioned resid norm 3.454672202601e-06 true resid norm 2.490651358644e-04 ||r(i)||/||b|| 3.381398810323e-05 >>>>>>>>>> 166 KSP preconditioned resid norm 3.399075522566e-06 true resid norm 2.490678159511e-04 ||r(i)||/||b|| 3.381435196154e-05 >>>>>>>>>> 167 KSP preconditioned resid norm 3.305455787400e-06 true resid norm 2.490651924523e-04 ||r(i)||/||b|| 3.381399578581e-05 >>>>>>>>>> 168 KSP preconditioned resid norm 3.368445533284e-06 true resid norm 2.490688061735e-04 ||r(i)||/||b|| 3.381448639774e-05 >>>>>>>>>> 169 KSP preconditioned resid norm 2.981519724814e-06 true resid norm 2.490676378334e-04 ||r(i)||/||b|| 3.381432777964e-05 >>>>>>>>>> 170 KSP preconditioned resid norm 3.034423065539e-06 true resid norm 2.490694458885e-04 ||r(i)||/||b|| 3.381457324777e-05 >>>>>>>>>> 171 KSP preconditioned resid norm 2.885972780503e-06 true resid norm 2.490688033729e-04 ||r(i)||/||b|| 3.381448601752e-05 >>>>>>>>>> 172 KSP preconditioned 
resid norm 2.892491075033e-06 true resid norm 2.490692993765e-04 ||r(i)||/||b|| 3.381455335678e-05 >>>>>>>>>> 173 KSP preconditioned resid norm 2.921316177611e-06 true resid norm 2.490697629787e-04 ||r(i)||/||b|| 3.381461629709e-05 >>>>>>>>>> 174 KSP preconditioned resid norm 2.999889222269e-06 true resid norm 2.490707272626e-04 ||r(i)||/||b|| 3.381474721178e-05 >>>>>>>>>> 175 KSP preconditioned resid norm 2.975590207575e-06 true resid norm 2.490685439925e-04 ||r(i)||/||b|| 3.381445080310e-05 >>>>>>>>>> 176 KSP preconditioned resid norm 2.983065843597e-06 true resid norm 2.490701883671e-04 ||r(i)||/||b|| 3.381467404937e-05 >>>>>>>>>> 177 KSP preconditioned resid norm 2.965959610245e-06 true resid norm 2.490711538630e-04 ||r(i)||/||b|| 3.381480512861e-05 >>>>>>>>>> 178 KSP preconditioned resid norm 3.005389788827e-06 true resid norm 2.490751808095e-04 ||r(i)||/||b|| 3.381535184150e-05 >>>>>>>>>> 179 KSP preconditioned resid norm 2.956581668772e-06 true resid norm 2.490653125636e-04 ||r(i)||/||b|| 3.381401209257e-05 >>>>>>>>>> 180 KSP preconditioned resid norm 2.937498883661e-06 true resid norm 2.490666056653e-04 ||r(i)||/||b|| 3.381418764874e-05 >>>>>>>>>> 181 KSP preconditioned resid norm 2.913227475431e-06 true resid norm 2.490682436979e-04 ||r(i)||/||b|| 3.381441003402e-05 >>>>>>>>>> 182 KSP preconditioned resid norm 3.048172862254e-06 true resid norm 2.490719669872e-04 ||r(i)||/||b|| 3.381491552130e-05 >>>>>>>>>> 183 KSP preconditioned resid norm 3.023868104933e-06 true resid norm 2.490648745555e-04 ||r(i)||/||b|| 3.381395262699e-05 >>>>>>>>>> 184 KSP preconditioned resid norm 2.985947506400e-06 true resid norm 2.490638818852e-04 ||r(i)||/||b|| 3.381381785846e-05 >>>>>>>>>> 185 KSP preconditioned resid norm 2.840032055776e-06 true resid norm 2.490701112392e-04 ||r(i)||/||b|| 3.381466357820e-05 >>>>>>>>>> 186 KSP preconditioned resid norm 2.229279683815e-06 true resid norm 2.490609220680e-04 ||r(i)||/||b|| 3.381341602292e-05 >>>>>>>>>> 187 KSP preconditioned resid norm 2.441513276379e-06 true resid norm 2.490674056899e-04 ||r(i)||/||b|| 3.381429626300e-05 >>>>>>>>>> 188 KSP preconditioned resid norm 2.467046864016e-06 true resid norm 2.490691622632e-04 ||r(i)||/||b|| 3.381453474178e-05 >>>>>>>>>> 189 KSP preconditioned resid norm 2.482124586361e-06 true resid norm 2.490664992339e-04 ||r(i)||/||b|| 3.381417319923e-05 >>>>>>>>>> 190 KSP preconditioned resid norm 2.470564926502e-06 true resid norm 2.490617019713e-04 ||r(i)||/||b|| 3.381352190543e-05 >>>>>>>>>> 191 KSP preconditioned resid norm 2.457947086578e-06 true resid norm 2.490628644250e-04 ||r(i)||/||b|| 3.381367972437e-05 >>>>>>>>>> 192 KSP preconditioned resid norm 2.469444741724e-06 true resid norm 2.490639416335e-04 ||r(i)||/||b|| 3.381382597011e-05 >>>>>>>>>> 193 KSP preconditioned resid norm 2.469951525219e-06 true resid norm 2.490599769764e-04 ||r(i)||/||b|| 3.381328771385e-05 >>>>>>>>>> 194 KSP preconditioned resid norm 2.467486786643e-06 true resid norm 2.490630178622e-04 ||r(i)||/||b|| 3.381370055556e-05 >>>>>>>>>> 195 KSP preconditioned resid norm 2.409684391404e-06 true resid norm 2.490640302606e-04 ||r(i)||/||b|| 3.381383800245e-05 >>>>>>>>>> 196 KSP preconditioned resid norm 2.456046691135e-06 true resid norm 2.490637730235e-04 ||r(i)||/||b|| 3.381380307900e-05 >>>>>>>>>> 197 KSP preconditioned resid norm 2.300015653805e-06 true resid norm 2.490615406913e-04 ||r(i)||/||b|| 3.381350000947e-05 >>>>>>>>>> 198 KSP preconditioned resid norm 2.238328275301e-06 true resid norm 2.490647641246e-04 ||r(i)||/||b|| 
3.381393763449e-05 >>>>>>>>>> 199 KSP preconditioned resid norm 2.317293820319e-06 true resid norm 2.490641611282e-04 ||r(i)||/||b|| 3.381385576951e-05 >>>>>>>>>> 200 KSP preconditioned resid norm 2.359590971314e-06 true resid norm 2.490685242974e-04 ||r(i)||/||b|| 3.381444812922e-05 >>>>>>>>>> 201 KSP preconditioned resid norm 2.311199691596e-06 true resid norm 2.490656791753e-04 ||r(i)||/||b|| 3.381406186510e-05 >>>>>>>>>> 202 KSP preconditioned resid norm 2.328772904196e-06 true resid norm 2.490651045523e-04 ||r(i)||/||b|| 3.381398385220e-05 >>>>>>>>>> 203 KSP preconditioned resid norm 2.332731604717e-06 true resid norm 2.490649960574e-04 ||r(i)||/||b|| 3.381396912253e-05 >>>>>>>>>> 204 KSP preconditioned resid norm 2.357629383490e-06 true resid norm 2.490686317727e-04 ||r(i)||/||b|| 3.381446272046e-05 >>>>>>>>>> 205 KSP preconditioned resid norm 2.374856180299e-06 true resid norm 2.490645897176e-04 ||r(i)||/||b|| 3.381391395637e-05 >>>>>>>>>> 206 KSP preconditioned resid norm 2.340395514404e-06 true resid norm 2.490618341127e-04 ||r(i)||/||b|| 3.381353984542e-05 >>>>>>>>>> 207 KSP preconditioned resid norm 2.314963680954e-06 true resid norm 2.490676153984e-04 ||r(i)||/||b|| 3.381432473379e-05 >>>>>>>>>> 208 KSP preconditioned resid norm 2.448070953106e-06 true resid norm 2.490644606776e-04 ||r(i)||/||b|| 3.381389643743e-05 >>>>>>>>>> 209 KSP preconditioned resid norm 2.428805110632e-06 true resid norm 2.490635817597e-04 ||r(i)||/||b|| 3.381377711234e-05 >>>>>>>>>> 210 KSP preconditioned resid norm 2.537929937808e-06 true resid norm 2.490680589404e-04 ||r(i)||/||b|| 3.381438495066e-05 >>>>>>>>>> 211 KSP preconditioned resid norm 2.515909029682e-06 true resid norm 2.490687803038e-04 ||r(i)||/||b|| 3.381448288557e-05 >>>>>>>>>> 212 KSP preconditioned resid norm 2.497907513266e-06 true resid norm 2.490618016885e-04 ||r(i)||/||b|| 3.381353544340e-05 >>>>>>>>>> 213 KSP preconditioned resid norm 1.783501869502e-06 true resid norm 2.490632647470e-04 ||r(i)||/||b|| 3.381373407354e-05 >>>>>>>>>> 214 KSP preconditioned resid norm 1.767420653144e-06 true resid norm 2.490685569328e-04 ||r(i)||/||b|| 3.381445255992e-05 >>>>>>>>>> 215 KSP preconditioned resid norm 1.854926068272e-06 true resid norm 2.490609365464e-04 ||r(i)||/||b|| 3.381341798856e-05 >>>>>>>>>> 216 KSP preconditioned resid norm 1.818308539774e-06 true resid norm 2.490639142283e-04 ||r(i)||/||b|| 3.381382224948e-05 >>>>>>>>>> 217 KSP preconditioned resid norm 1.809431578070e-06 true resid norm 2.490605125049e-04 ||r(i)||/||b|| 3.381336041915e-05 >>>>>>>>>> 218 KSP preconditioned resid norm 1.789862735999e-06 true resid norm 2.490564024901e-04 ||r(i)||/||b|| 3.381280242859e-05 >>>>>>>>>> 219 KSP preconditioned resid norm 1.769239890163e-06 true resid norm 2.490647825316e-04 ||r(i)||/||b|| 3.381394013349e-05 >>>>>>>>>> 220 KSP preconditioned resid norm 1.780760773109e-06 true resid norm 2.490622606663e-04 ||r(i)||/||b|| 3.381359775589e-05 >>>>>>>>>> 221 KSP preconditioned resid norm 5.009024913368e-07 true resid norm 2.490659101637e-04 ||r(i)||/||b|| 3.381409322492e-05 >>>>>>>>>> 222 KSP preconditioned resid norm 4.974450322799e-07 true resid norm 2.490714287402e-04 ||r(i)||/||b|| 3.381484244693e-05 >>>>>>>>>> 223 KSP preconditioned resid norm 4.938819481519e-07 true resid norm 2.490665661715e-04 ||r(i)||/||b|| 3.381418228693e-05 >>>>>>>>>> 224 KSP preconditioned resid norm 4.973231831266e-07 true resid norm 2.490725000995e-04 ||r(i)||/||b|| 3.381498789855e-05 >>>>>>>>>> 225 KSP preconditioned resid norm 5.086864036771e-07 true resid 
norm 2.490664132954e-04 ||r(i)||/||b|| 3.381416153192e-05 >>>>>>>>>> 226 KSP preconditioned resid norm 5.046954570561e-07 true resid norm 2.490698772594e-04 ||r(i)||/||b|| 3.381463181226e-05 >>>>>>>>>> 227 KSP preconditioned resid norm 5.086852920874e-07 true resid norm 2.490703544723e-04 ||r(i)||/||b|| 3.381469660041e-05 >>>>>>>>>> 228 KSP preconditioned resid norm 5.182381756169e-07 true resid norm 2.490665200032e-04 ||r(i)||/||b|| 3.381417601896e-05 >>>>>>>>>> 229 KSP preconditioned resid norm 5.261455182896e-07 true resid norm 2.490697169472e-04 ||r(i)||/||b|| 3.381461004770e-05 >>>>>>>>>> 230 KSP preconditioned resid norm 5.265262522400e-07 true resid norm 2.490726890541e-04 ||r(i)||/||b|| 3.381501355172e-05 >>>>>>>>>> 231 KSP preconditioned resid norm 5.220652263946e-07 true resid norm 2.490689325236e-04 ||r(i)||/||b|| 3.381450355149e-05 >>>>>>>>>> 232 KSP preconditioned resid norm 5.256466259888e-07 true resid norm 2.490694033989e-04 ||r(i)||/||b|| 3.381456747923e-05 >>>>>>>>>> 233 KSP preconditioned resid norm 5.443022648374e-07 true resid norm 2.490650183144e-04 ||r(i)||/||b|| 3.381397214423e-05 >>>>>>>>>> 234 KSP preconditioned resid norm 5.562619006436e-07 true resid norm 2.490764576883e-04 ||r(i)||/||b|| 3.381552519520e-05 >>>>>>>>>> 235 KSP preconditioned resid norm 5.998148629545e-07 true resid norm 2.490714032716e-04 ||r(i)||/||b|| 3.381483898922e-05 >>>>>>>>>> 236 KSP preconditioned resid norm 6.498977322955e-07 true resid norm 2.490650270144e-04 ||r(i)||/||b|| 3.381397332537e-05 >>>>>>>>>> 237 KSP preconditioned resid norm 6.503686003429e-07 true resid norm 2.490706976108e-04 ||r(i)||/||b|| 3.381474318615e-05 >>>>>>>>>> 238 KSP preconditioned resid norm 6.566719023119e-07 true resid norm 2.490664107559e-04 ||r(i)||/||b|| 3.381416118714e-05 >>>>>>>>>> 239 KSP preconditioned resid norm 6.549737473208e-07 true resid norm 2.490721547909e-04 ||r(i)||/||b|| 3.381494101821e-05 >>>>>>>>>> 240 KSP preconditioned resid norm 6.616898981418e-07 true resid norm 2.490679659838e-04 ||r(i)||/||b|| 3.381437233053e-05 >>>>>>>>>> 241 KSP preconditioned resid norm 6.829917691021e-07 true resid norm 2.490728328614e-04 ||r(i)||/||b|| 3.381503307553e-05 >>>>>>>>>> 242 KSP preconditioned resid norm 7.030239869389e-07 true resid norm 2.490706345115e-04 ||r(i)||/||b|| 3.381473461955e-05 >>>>>>>>>> 243 KSP preconditioned resid norm 7.018435683340e-07 true resid norm 2.490650978460e-04 ||r(i)||/||b|| 3.381398294172e-05 >>>>>>>>>> 244 KSP preconditioned resid norm 7.058047080376e-07 true resid norm 2.490685975642e-04 ||r(i)||/||b|| 3.381445807618e-05 >>>>>>>>>> 245 KSP preconditioned resid norm 6.896300385099e-07 true resid norm 2.490708566380e-04 ||r(i)||/||b|| 3.381476477625e-05 >>>>>>>>>> 246 KSP preconditioned resid norm 7.093960074437e-07 true resid norm 2.490667427871e-04 ||r(i)||/||b|| 3.381420626490e-05 >>>>>>>>>> 247 KSP preconditioned resid norm 7.817121711853e-07 true resid norm 2.490692299030e-04 ||r(i)||/||b|| 3.381454392480e-05 >>>>>>>>>> 248 KSP preconditioned resid norm 7.976109778309e-07 true resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 >>>>>>>>>> 249 KSP preconditioned resid norm 7.855322750445e-07 true resid norm 2.490720861966e-04 ||r(i)||/||b|| 3.381493170560e-05 >>>>>>>>>> 250 KSP preconditioned resid norm 7.778531114042e-07 true resid norm 2.490673235034e-04 ||r(i)||/||b|| 3.381428510506e-05 >>>>>>>>>> 251 KSP preconditioned resid norm 7.848682182070e-07 true resid norm 2.490686360729e-04 ||r(i)||/||b|| 3.381446330426e-05 >>>>>>>>>> 252 KSP preconditioned 
resid norm 7.967291867330e-07 true resid norm 2.490724820229e-04 ||r(i)||/||b|| 3.381498544442e-05 >>>>>>>>>> 253 KSP preconditioned resid norm 7.865012959525e-07 true resid norm 2.490666662028e-04 ||r(i)||/||b|| 3.381419586754e-05 >>>>>>>>>> 254 KSP preconditioned resid norm 7.656025385804e-07 true resid norm 2.490686283214e-04 ||r(i)||/||b|| 3.381446225190e-05 >>>>>>>>>> 255 KSP preconditioned resid norm 7.757018653468e-07 true resid norm 2.490655983763e-04 ||r(i)||/||b|| 3.381405089553e-05 >>>>>>>>>> 256 KSP preconditioned resid norm 6.686490372981e-07 true resid norm 2.490715698964e-04 ||r(i)||/||b|| 3.381486161081e-05 >>>>>>>>>> 257 KSP preconditioned resid norm 6.596005109428e-07 true resid norm 2.490666403003e-04 ||r(i)||/||b|| 3.381419235092e-05 >>>>>>>>>> 258 KSP preconditioned resid norm 6.681742296333e-07 true resid norm 2.490683725835e-04 ||r(i)||/||b|| 3.381442753198e-05 >>>>>>>>>> 259 KSP preconditioned resid norm 1.089245482033e-06 true resid norm 2.490688086568e-04 ||r(i)||/||b|| 3.381448673488e-05 >>>>>>>>>> 260 KSP preconditioned resid norm 1.099844873189e-06 true resid norm 2.490690703265e-04 ||r(i)||/||b|| 3.381452226011e-05 >>>>>>>>>> 261 KSP preconditioned resid norm 1.112925540869e-06 true resid norm 2.490664481058e-04 ||r(i)||/||b|| 3.381416625790e-05 >>>>>>>>>> 262 KSP preconditioned resid norm 1.113056910480e-06 true resid norm 2.490658753273e-04 ||r(i)||/||b|| 3.381408849541e-05 >>>>>>>>>> 263 KSP preconditioned resid norm 1.104801535149e-06 true resid norm 2.490736510776e-04 ||r(i)||/||b|| 3.381514415953e-05 >>>>>>>>>> 264 KSP preconditioned resid norm 1.158709147873e-06 true resid norm 2.490607531152e-04 ||r(i)||/||b|| 3.381339308528e-05 >>>>>>>>>> 265 KSP preconditioned resid norm 1.178985740182e-06 true resid norm 2.490727895619e-04 ||r(i)||/||b|| 3.381502719703e-05 >>>>>>>>>> 266 KSP preconditioned resid norm 1.165130533478e-06 true resid norm 2.490639076693e-04 ||r(i)||/||b|| 3.381382135901e-05 >>>>>>>>>> 267 KSP preconditioned resid norm 1.181364114499e-06 true resid norm 2.490667871436e-04 ||r(i)||/||b|| 3.381421228690e-05 >>>>>>>>>> 268 KSP preconditioned resid norm 1.170295348543e-06 true resid norm 2.490662613306e-04 ||r(i)||/||b|| 3.381414090063e-05 >>>>>>>>>> 269 KSP preconditioned resid norm 1.213243016230e-06 true resid norm 2.490666173719e-04 ||r(i)||/||b|| 3.381418923808e-05 >>>>>>>>>> 270 KSP preconditioned resid norm 1.239691953997e-06 true resid norm 2.490678323197e-04 ||r(i)||/||b|| 3.381435418381e-05 >>>>>>>>>> 271 KSP preconditioned resid norm 1.219891740100e-06 true resid norm 2.490625009256e-04 ||r(i)||/||b|| 3.381363037437e-05 >>>>>>>>>> 272 KSP preconditioned resid norm 1.231321334346e-06 true resid norm 2.490659733696e-04 ||r(i)||/||b|| 3.381410180599e-05 >>>>>>>>>> 273 KSP preconditioned resid norm 1.208183234158e-06 true resid norm 2.490685987255e-04 ||r(i)||/||b|| 3.381445823385e-05 >>>>>>>>>> 274 KSP preconditioned resid norm 1.211768545589e-06 true resid norm 2.490671548953e-04 ||r(i)||/||b|| 3.381426221421e-05 >>>>>>>>>> 275 KSP preconditioned resid norm 1.209433459842e-06 true resid norm 2.490669016096e-04 ||r(i)||/||b|| 3.381422782722e-05 >>>>>>>>>> 276 KSP preconditioned resid norm 1.223729184405e-06 true resid norm 2.490658128014e-04 ||r(i)||/||b|| 3.381408000666e-05 >>>>>>>>>> 277 KSP preconditioned resid norm 1.243915201868e-06 true resid norm 2.490693375756e-04 ||r(i)||/||b|| 3.381455854282e-05 >>>>>>>>>> 278 KSP preconditioned resid norm 1.231994655529e-06 true resid norm 2.490682988311e-04 ||r(i)||/||b|| 
3.381441751910e-05 >>>>>>>>>> 279 KSP preconditioned resid norm 1.227930683777e-06 true resid norm 2.490667825866e-04 ||r(i)||/||b|| 3.381421166823e-05 >>>>>>>>>> 280 KSP preconditioned resid norm 1.193458846469e-06 true resid norm 2.490687366117e-04 ||r(i)||/||b|| 3.381447695378e-05 >>>>>>>>>> 281 KSP preconditioned resid norm 1.217089059805e-06 true resid norm 2.490674797371e-04 ||r(i)||/||b|| 3.381430631591e-05 >>>>>>>>>> 282 KSP preconditioned resid norm 1.249318287709e-06 true resid norm 2.490662866951e-04 ||r(i)||/||b|| 3.381414434420e-05 >>>>>>>>>> 283 KSP preconditioned resid norm 1.183320029547e-06 true resid norm 2.490645783630e-04 ||r(i)||/||b|| 3.381391241482e-05 >>>>>>>>>> 284 KSP preconditioned resid norm 1.174730603102e-06 true resid norm 2.490686881647e-04 ||r(i)||/||b|| 3.381447037643e-05 >>>>>>>>>> 285 KSP preconditioned resid norm 1.175838261923e-06 true resid norm 2.490665969300e-04 ||r(i)||/||b|| 3.381418646281e-05 >>>>>>>>>> 286 KSP preconditioned resid norm 1.188946188368e-06 true resid norm 2.490661974622e-04 ||r(i)||/||b|| 3.381413222961e-05 >>>>>>>>>> 287 KSP preconditioned resid norm 1.177848565707e-06 true resid norm 2.490660236206e-04 ||r(i)||/||b|| 3.381410862824e-05 >>>>>>>>>> 288 KSP preconditioned resid norm 1.200075508281e-06 true resid norm 2.490645353536e-04 ||r(i)||/||b|| 3.381390657571e-05 >>>>>>>>>> 289 KSP preconditioned resid norm 1.184589570618e-06 true resid norm 2.490664920355e-04 ||r(i)||/||b|| 3.381417222195e-05 >>>>>>>>>> 290 KSP preconditioned resid norm 1.221114703873e-06 true resid norm 2.490670597538e-04 ||r(i)||/||b|| 3.381424929746e-05 >>>>>>>>>> 291 KSP preconditioned resid norm 1.249479658256e-06 true resid norm 2.490641582876e-04 ||r(i)||/||b|| 3.381385538385e-05 >>>>>>>>>> 292 KSP preconditioned resid norm 1.245768496850e-06 true resid norm 2.490704480588e-04 ||r(i)||/||b|| 3.381470930606e-05 >>>>>>>>>> 293 KSP preconditioned resid norm 1.243742607953e-06 true resid norm 2.490649690604e-04 ||r(i)||/||b|| 3.381396545733e-05 >>>>>>>>>> 294 KSP preconditioned resid norm 1.342758483339e-06 true resid norm 2.490676207432e-04 ||r(i)||/||b|| 3.381432545942e-05 >>>>>>>>>> 295 KSP preconditioned resid norm 1.353816099600e-06 true resid norm 2.490695263153e-04 ||r(i)||/||b|| 3.381458416681e-05 >>>>>>>>>> 296 KSP preconditioned resid norm 1.343886763293e-06 true resid norm 2.490673674307e-04 ||r(i)||/||b|| 3.381429106879e-05 >>>>>>>>>> 297 KSP preconditioned resid norm 1.355511022815e-06 true resid norm 2.490686565995e-04 ||r(i)||/||b|| 3.381446609103e-05 >>>>>>>>>> 298 KSP preconditioned resid norm 1.347247627243e-06 true resid norm 2.490696287707e-04 ||r(i)||/||b|| 3.381459807652e-05 >>>>>>>>>> 299 KSP preconditioned resid norm 1.414742595618e-06 true resid norm 2.490749815091e-04 ||r(i)||/||b|| 3.381532478374e-05 >>>>>>>>>> 300 KSP preconditioned resid norm 1.418560683189e-06 true resid norm 2.490721501153e-04 ||r(i)||/||b|| 3.381494038343e-05 >>>>>>>>>> 301 KSP preconditioned resid norm 1.416276404923e-06 true resid norm 2.490689576447e-04 ||r(i)||/||b|| 3.381450696203e-05 >>>>>>>>>> 302 KSP preconditioned resid norm 1.431448272112e-06 true resid norm 2.490688812701e-04 ||r(i)||/||b|| 3.381449659312e-05 >>>>>>>>>> 303 KSP preconditioned resid norm 1.446154958969e-06 true resid norm 2.490727536322e-04 ||r(i)||/||b|| 3.381502231909e-05 >>>>>>>>>> 304 KSP preconditioned resid norm 1.468860617921e-06 true resid norm 2.490692363788e-04 ||r(i)||/||b|| 3.381454480397e-05 >>>>>>>>>> 305 KSP preconditioned resid norm 1.627595214971e-06 true resid 
norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 >>>>>>>>>> 306 KSP preconditioned resid norm 1.614384672893e-06 true resid norm 2.490687019603e-04 ||r(i)||/||b|| 3.381447224938e-05 >>>>>>>>>> 307 KSP preconditioned resid norm 1.605568020532e-06 true resid norm 2.490699757693e-04 ||r(i)||/||b|| 3.381464518632e-05 >>>>>>>>>> 308 KSP preconditioned resid norm 1.617069685075e-06 true resid norm 2.490649282923e-04 ||r(i)||/||b|| 3.381395992249e-05 >>>>>>>>>> 309 KSP preconditioned resid norm 1.654297792738e-06 true resid norm 2.490644766626e-04 ||r(i)||/||b|| 3.381389860760e-05 >>>>>>>>>> 310 KSP preconditioned resid norm 1.587528143215e-06 true resid norm 2.490696752096e-04 ||r(i)||/||b|| 3.381460438124e-05 >>>>>>>>>> 311 KSP preconditioned resid norm 1.662782022388e-06 true resid norm 2.490699317737e-04 ||r(i)||/||b|| 3.381463921332e-05 >>>>>>>>>> 312 KSP preconditioned resid norm 1.618211471748e-06 true resid norm 2.490735831308e-04 ||r(i)||/||b|| 3.381513493483e-05 >>>>>>>>>> 313 KSP preconditioned resid norm 1.609074961921e-06 true resid norm 2.490679566436e-04 ||r(i)||/||b|| 3.381437106247e-05 >>>>>>>>>> 314 KSP preconditioned resid norm 1.548068942878e-06 true resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 >>>>>>>>>> 315 KSP preconditioned resid norm 1.526718322150e-06 true resid norm 2.490619832967e-04 ||r(i)||/||b|| 3.381356009919e-05 >>>>>>>>>> 316 KSP preconditioned resid norm 1.553150959105e-06 true resid norm 2.490660071226e-04 ||r(i)||/||b|| 3.381410638842e-05 >>>>>>>>>> 317 KSP preconditioned resid norm 1.615015320906e-06 true resid norm 2.490672348079e-04 ||r(i)||/||b|| 3.381427306343e-05 >>>>>>>>>> 318 KSP preconditioned resid norm 1.602904469797e-06 true resid norm 2.490696731006e-04 ||r(i)||/||b|| 3.381460409491e-05 >>>>>>>>>> 319 KSP preconditioned resid norm 1.538140323073e-06 true resid norm 2.490722982494e-04 ||r(i)||/||b|| 3.381496049466e-05 >>>>>>>>>> 320 KSP preconditioned resid norm 1.534779679430e-06 true resid norm 2.490778499789e-04 ||r(i)||/||b|| 3.381571421763e-05 >>>>>>>>>> 321 KSP preconditioned resid norm 1.547155843355e-06 true resid norm 2.490767612985e-04 ||r(i)||/||b|| 3.381556641442e-05 >>>>>>>>>> 322 KSP preconditioned resid norm 1.422137008870e-06 true resid norm 2.490737676309e-04 ||r(i)||/||b|| 3.381515998323e-05 >>>>>>>>>> 323 KSP preconditioned resid norm 1.403072558954e-06 true resid norm 2.490741361870e-04 ||r(i)||/||b|| 3.381521001975e-05 >>>>>>>>>> 324 KSP preconditioned resid norm 1.373070436118e-06 true resid norm 2.490742214990e-04 ||r(i)||/||b|| 3.381522160202e-05 >>>>>>>>>> 325 KSP preconditioned resid norm 1.359547585233e-06 true resid norm 2.490792987570e-04 ||r(i)||/||b|| 3.381591090902e-05 >>>>>>>>>> 326 KSP preconditioned resid norm 1.370351913612e-06 true resid norm 2.490727161158e-04 ||r(i)||/||b|| 3.381501722573e-05 >>>>>>>>>> 327 KSP preconditioned resid norm 1.365238666187e-06 true resid norm 2.490716949642e-04 ||r(i)||/||b|| 3.381487859046e-05 >>>>>>>>>> 328 KSP preconditioned resid norm 1.369073373042e-06 true resid norm 2.490807360288e-04 ||r(i)||/||b|| 3.381610603826e-05 >>>>>>>>>> 329 KSP preconditioned resid norm 1.426698981572e-06 true resid norm 2.490791479521e-04 ||r(i)||/||b|| 3.381589043520e-05 >>>>>>>>>> 330 KSP preconditioned resid norm 1.445542403570e-06 true resid norm 2.490775981409e-04 ||r(i)||/||b|| 3.381568002720e-05 >>>>>>>>>> 331 KSP preconditioned resid norm 1.464506963984e-06 true resid norm 2.490740562430e-04 ||r(i)||/||b|| 3.381519916626e-05 >>>>>>>>>> 332 KSP preconditioned 
resid norm 1.461462964401e-06 true resid norm 2.490768016856e-04 ||r(i)||/||b|| 3.381557189753e-05 >>>>>>>>>> 333 KSP preconditioned resid norm 1.476680847971e-06 true resid norm 2.490744321516e-04 ||r(i)||/||b|| 3.381525020097e-05 >>>>>>>>>> 334 KSP preconditioned resid norm 1.459640372198e-06 true resid norm 2.490788817993e-04 ||r(i)||/||b|| 3.381585430132e-05 >>>>>>>>>> 335 KSP preconditioned resid norm 1.790770882365e-06 true resid norm 2.490771711471e-04 ||r(i)||/||b|| 3.381562205697e-05 >>>>>>>>>> 336 KSP preconditioned resid norm 1.803770155018e-06 true resid norm 2.490768953858e-04 ||r(i)||/||b|| 3.381558461860e-05 >>>>>>>>>> 337 KSP preconditioned resid norm 1.787821255995e-06 true resid norm 2.490767985676e-04 ||r(i)||/||b|| 3.381557147421e-05 >>>>>>>>>> 338 KSP preconditioned resid norm 1.749912220831e-06 true resid norm 2.490760198704e-04 ||r(i)||/||b|| 3.381546575545e-05 >>>>>>>>>> 339 KSP preconditioned resid norm 1.802915839010e-06 true resid norm 2.490815556273e-04 ||r(i)||/||b|| 3.381621730993e-05 >>>>>>>>>> 340 KSP preconditioned resid norm 1.800777670709e-06 true resid norm 2.490823909286e-04 ||r(i)||/||b|| 3.381633071347e-05 >>>>>>>>>> 341 KSP preconditioned resid norm 1.962516327690e-06 true resid norm 2.490773477410e-04 ||r(i)||/||b|| 3.381564603199e-05 >>>>>>>>>> 342 KSP preconditioned resid norm 1.981726465132e-06 true resid norm 2.490769116884e-04 ||r(i)||/||b|| 3.381558683191e-05 >>>>>>>>>> 343 KSP preconditioned resid norm 1.963419167052e-06 true resid norm 2.490764009914e-04 ||r(i)||/||b|| 3.381551749783e-05 >>>>>>>>>> 344 KSP preconditioned resid norm 1.992082169278e-06 true resid norm 2.490806829883e-04 ||r(i)||/||b|| 3.381609883728e-05 >>>>>>>>>> 345 KSP preconditioned resid norm 1.981005134253e-06 true resid norm 2.490748677339e-04 ||r(i)||/||b|| 3.381530933721e-05 >>>>>>>>>> 346 KSP preconditioned resid norm 1.959802663114e-06 true resid norm 2.490773752317e-04 ||r(i)||/||b|| 3.381564976423e-05 >>>>>>>>>> >>>>>>>>>> On Sat, Oct 21, 2017 at 5:25 PM, Matthew Knepley wrote: >>>>>>>>>> On Sat, Oct 21, 2017 at 5:21 PM, Hao Zhang wrote: >>>>>>>>>> ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY); >>>>>>>>>> ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY); >>>>>>>>>> >>>>>>>>>> ierr = VecAssemblyBegin(x); >>>>>>>>>> ierr = VecAssemblyEnd(x); >>>>>>>>>> This is probably unnecessary >>>>>>>>>> >>>>>>>>>> ierr = VecAssemblyBegin(b); >>>>>>>>>> ierr = VecAssemblyEnd(b); >>>>>>>>>> This is probably unnecessary >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp); >>>>>>>>>> ierr = MatSetNullSpace(A,nullsp); // Petsc-3.8 >>>>>>>>>> Is your rhs consistent with this nullspace? >>>>>>>>>> >>>>>>>>>> // KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN); >>>>>>>>>> KSPSetOperators(ksp,A,A); >>>>>>>>>> >>>>>>>>>> KSPSetType(ksp,KSPBCGS); >>>>>>>>>> >>>>>>>>>> KSPSetComputeSingularValues(ksp, PETSC_TRUE); >>>>>>>>>> #if defined(__HYPRE__) >>>>>>>>>> KSPGetPC(ksp, &pc); >>>>>>>>>> PCSetType(pc, PCHYPRE); >>>>>>>>>> PCHYPRESetType(pc,"boomeramg"); >>>>>>>>>> This is terribly unnecessary. 
>>>>>>>>>> You just use
>>>>>>>>>>
>>>>>>>>>> -pc_type hypre -pc_hypre_type boomeramg
>>>>>>>>>>
>>>>>>>>>> or
>>>>>>>>>>
>>>>>>>>>> -pc_type gamg
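>>>>>>>>>>
>>>>>>>>>> /* A minimal sketch of the options-driven setup this enables (assumes
>>>>>>>>>>    A, b, x are already assembled; error checking omitted): */
>>>>>>>>>> KSP ksp;
>>>>>>>>>> ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);
>>>>>>>>>> ierr = KSPSetOperators(ksp,A,A);
>>>>>>>>>> ierr = KSPSetFromOptions(ksp); /* -ksp_type, -pc_type, ... are picked up here */
>>>>>>>>>> ierr = KSPSolve(ksp,b,x);
>>>>>>>>>> ierr = KSPDestroy(&ksp);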
>>>>>>>>>>
>>>>>>>>>> #else
>>>>>>>>>> KSPSetType(ksp,KSPBCGSL);
>>>>>>>>>> KSPBCGSLSetEll(ksp,2);
>>>>>>>>>> #endif /* defined(__HYPRE__) */
>>>>>>>>>>
>>>>>>>>>> KSPSetFromOptions(ksp);
>>>>>>>>>> KSPSetUp(ksp);
>>>>>>>>>>
>>>>>>>>>> ierr = KSPSolve(ksp,b,x);
>>>>>>>>>>
>>>>>>>>>> command line
>>>>>>>>>>
>>>>>>>>>> You did not provide any of what I asked for in the previous mail.
>>>>>>>>>>
>>>>>>>>>>    Matt
>>>>>>>>>>
>>>>>>>>>> On Sat, Oct 21, 2017 at 5:16 PM, Matthew Knepley wrote:
>>>>>>>>>> On Sat, Oct 21, 2017 at 5:04 PM, Hao Zhang wrote:
>>>>>>>>>> hi,
>>>>>>>>>>
>>>>>>>>>> I implemented the HYPRE preconditioner for my study due to the fact that without a preconditioner, the PETSc solver will take thousands of iterations to converge for a fine-grid simulation.
>>>>>>>>>>
>>>>>>>>>> with HYPRE, depending on the parallel partition, it will take HYPRE forever to do anything. observation of the output file is that the simulation is hanging with no output.
>>>>>>>>>>
>>>>>>>>>> Any idea what happened? will post snippet of code.
>>>>>>>>>>
>>>>>>>>>> 1) For any question about convergence, we need to see the output of
>>>>>>>>>>
>>>>>>>>>> -ksp_view_pre -ksp_view -ksp_monitor_true_residual -ksp_converged_reason
>>>>>>>>>>
>>>>>>>>>> 2) Hypre has many preconditioners, which one are you talking about
>>>>>>>>>>
>>>>>>>>>> 3) PETSc has some preconditioners in common with Hypre, like AMG
>>>>>>>>>>
>>>>>>>>>>   Thanks,
>>>>>>>>>>
>>>>>>>>>>     Matt
>>>>>>>>>>
>>>>>>>>>> --
>>>>>>>>>> Hao Zhang
>>>>>>>>>> Dept. of Applied Mathematics and Statistics,
>>>>>>>>>> Stony Brook University,
>>>>>>>>>> Stony Brook, New York, 11790
>>>>>>>>>>
>>>>>>>>>> --
>>>>>>>>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>>>>>>>>>> -- Norbert Wiener
>>>>>>>>>>
>>>>>>>>>> https://www.cse.buffalo.edu/~knepley/

From jroman at dsic.upv.es Mon Oct 23 11:03:19 2017
From: jroman at dsic.upv.es (Jose E. Roman)
Date: Mon, 23 Oct 2017 18:03:19 +0200
Subject: [petsc-users] slepc NHEP error
In-Reply-To: <2792468E-84AC-4F09-A347-A4AB2F68A3E5@dsic.upv.es>
References: <5EBDC484-4AC6-4AF3-8D3C-FF999830604F@ornl.gov> <8B4ECCCC-86B1-4580-8D3A-97DF12F02D7E@ornl.gov> <708A0DB5-AE36-40EE-86A9-288A9282A8B9@ornl.gov> <248F9117-5F43-438A-9E79-10D1E9EF9795@mcs.anl.gov> <0BED9C76-8FC4-4E58-B12C-45E21EC183DE@ornl.gov> <92EA6381-8F1F-4C9F-88BD-C1357B8C1C42@mcs.anl.gov> <9C04F853-DEDF-4F28-B63E-316AB14E0A97@mcs.anl.gov> <34243810-1A49-499E-812E-0C0CCCC38565@mcs.anl.gov> <4D23FC40-1491-44AF-825E-C4C4160F1F1E@ornl.gov> <2A3DB53A-D92A-4FC6-8454-5C11039B0343@dsic.upv.es> <4650CE13-784F-4FBE-B5FC-45717BD30103@ornl.gov> <3A414042-4AC3-4B8F-8CE6-6C0A45509ECF@dsic.upv.es> <6C5B1E55-B678-4A54-881B-421E627932E5@dsic.upv.es> <25798DA5-ECA6-40E5-995D-2BE90D6FDBAF@mcs.anl.gov> <2756491B-117D-4985-BB1A-BDF91A21D5BC@mcs.anl.gov> <32692246-AB15-4605-BEB1-82CE12B43FCB@mcs.anl.gov> <3F1CBD8E-315C-4A2F-B38D-E907C8790BC4@dsic.upv.es> <0C7C7E28-DDD7-4867-A387-71CA499740CC@ornl.gov> <2792468E-84AC-4F09-A347-A4AB2F68A3E5@dsic.upv.es>
Message-ID: <3E32E90C-45D5-4717-A62F-14BA11F23D74@dsic.upv.es>

To close this old thread, I would like to mention that in SLEPc 3.8 we have added a command-line option that should fix the problem:

-ds_parallel synchronized

This option forces a synchronization of the results of local computations in DS (those that involve LAPACK calls), so that all MPI processes have exactly the same result. This was causing the failure you reported. If this option is not provided, the behaviour is the same as in SLEPc 3.7 and before, i.e., all processes do the computation redundantly (-ds_parallel redundant).
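The same thing can be set from code. A sketch from memory, so please double-check the 3.8 man page for DSSetParallel():

    DS ds;
    ierr = EPSGetDS(eps,&ds);CHKERRQ(ierr);   /* eps is the already-created EPS solver */
    ierr = DSSetParallel(ds,DS_PARALLEL_SYNCHRONIZED);CHKERRQ(ierr);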
Jose

> El 16 jun 2017, a las 17:36, Jose E. Roman escribió:
>
> I still need to work on this, but in principle my previous comments are confirmed. In particular, in my tests it seems that the problem does not appear if PETSc has been configured with --download-fblaslapack
> If you have a deadline, I would suggest you to go this way, until I can find a more definitive solution.
>
> Jose
>
>> El 16 jun 2017, a las 14:50, Kannan, Ramakrishnan escribió:
>>
>> Jose/Barry,
>>
>> Excellent. This is good news. I have a deadline on this code next Wednesday and hope it is not a big one to address. Please keep me posted.
>> --
>> Regards,
>> Ramki
>>
>> On 6/16/17, 8:44 AM, "Jose E. Roman" wrote:
>>
>> I was able to reproduce the problem. I will try to track it down.
>> Jose
>>
>>> El 16 jun 2017, a las 2:03, Barry Smith escribió:
>>>
>>> Ok, got it.
>>>
>>>> On Jun 15, 2017, at 6:56 PM, Kannan, Ramakrishnan wrote:
>>>>
>>>> You don't need to install. Just download and extract the tar file. There will be a folder of include files. Point to this in build.sh.
>>>>
>>>> Regards, Ramki
>>>> Android keyboard at work. Excuse typos and brevity
>>>> From: Barry Smith
>>>> Sent: Thursday, June 15, 2017 7:54 PM
>>>> To: "Kannan, Ramakrishnan"
>>>> CC: "Jose E. Roman", petsc-users at mcs.anl.gov
>>>> Subject: Re: [petsc-users] slepc NHEP error
>>>>
>>>> brew install Armadillo fails for me on brew install hdf5. I have reported this to home-brew and hopefully they'll have a fix within a couple of days so I can try to run the test case.
>>>>
>>>> Barry
>>>>
>>>>> On Jun 15, 2017, at 6:34 PM, Kannan, Ramakrishnan wrote:
>>>>>
>>>>> Barry,
>>>>>
>>>>> Attached is the quick test program I extracted out of my existing code. This is not clean but you can still understand it. I use slepc 3.7.3 and 32-bit real petsc 3.7.4.
>>>>>
>>>>> This requires armadillo from http://arma.sourceforge.net/download.html. Just extract it and give the correct path to armadillo in build.sh.
>>>>>
>>>>> I compiled and ran the code. The error and the output file are also in the tar.gz file.
>>>>>
>>>>> Appreciate your kind support and looking forward to an early resolution.
>>>>> --
>>>>> Regards,
>>>>> Ramki
>>>>>
>>>>> On 6/15/17, 4:35 PM, "Barry Smith" wrote:
>>>>>
>>>>>> On Jun 15, 2017, at 1:45 PM, Kannan, Ramakrishnan wrote:
>>>>>>
>>>>>> Attached is the latest error w/ 32-bit petsc and the uniform random input matrix. Let me know if you are looking for more information.
>>>>>
>>>>> Could you please send the full program that reads in the data files and runs SLEPc generating the problem? We don't have any way of using the data files you sent us.
>>>>>
>>>>> Barry
>>>>>
>>>>>> --
>>>>>> Regards,
>>>>>> Ramki
>>>>>>
>>>>>> On 6/15/17, 2:27 PM, "Jose E. Roman" wrote:
>>>>>>
>>>>>>> El 15 jun 2017, a las 19:35, Barry Smith escribió:
>>>>>>>
>>>>>>> So where in the code is the decision on how many columns to use made? If we look at that it might help see why it could ever produce different results on different processes.
>>>>>>
>>>>>> After seeing the call stack again, I think my previous comment is wrong. I really don't know what is happening. If the number of columns was different in different processes, it would have failed before reaching that line of code.
>>>>>>
>>>>>> Ramki: could you send me the matrix somehow? I could try it in a machine here. Which options are you using for the solver?
>>>>>>
>>>>>> Jose

From knepley at gmail.com Mon Oct 23 11:03:36 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 23 Oct 2017 12:03:36 -0400
Subject: [petsc-users] "Must select a target sorting criterion if using shift-and-invert"
In-Reply-To: 
References: 
Message-ID: 

On Mon, Oct 23, 2017 at 11:18 AM, Jose E. Roman wrote:

> Changed. Hope it is clear now.
> https://bitbucket.org/slepc/slepc/commits/511900656a27a161c1df6fe2e42fd8d66d071800

Great, that is much clearer.

  Thanks,

     Matt

> Jose
>
> > El 21 oct 2017, a las 14:27, Matthew Knepley escribió:
> >
> > On Sat, Oct 21, 2017 at 2:20 AM, Jose E. Roman wrote:
> > This was added in 3.8 to check the common case when people incorrectly set shift-and-invert with EPS_SMALLEST_MAGNITUDE. To compute the smallest eigenvalues with shift-and-invert the correct way is to set target=0 and which=EPS_TARGET_MAGNITUDE. See for instance
> > http://slepc.upv.es/documentation/current/src/eps/examples/tutorials/ex13.c.html
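> > In code that recipe is roughly the following (an untested sketch; eps is the EPS solver created beforehand):
> >
> >     ST st;
> >     ierr = EPSSetWhichEigenpairs(eps,EPS_TARGET_MAGNITUDE);CHKERRQ(ierr);
> >     ierr = EPSSetTarget(eps,0.0);CHKERRQ(ierr);
> >     ierr = EPSGetST(eps,&st);CHKERRQ(ierr);      /* st is the spectral transform of eps */
> >     ierr = STSetType(st,STSINVERT);CHKERRQ(ierr);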
> >
> > Jose, one thing we are trying to do in PETSc now is to give the options to fix a problem (or at least representative options) directly in the error message. Or maybe a pointer to the relevant manual or tutorial section. This gives users a hand up.
> >
> >   Thanks,
> >
> >     Matt
> >
> > Jose
> >
> > > El 21 oct 2017, a las 1:51, Matthew Knepley escribió:
> > >
> > > On Fri, Oct 20, 2017 at 7:43 PM, Kong, Fande wrote:
> > > Hi All,
> > >
> > > I am trying to solve a generalized eigenvalue problem (using SLEPc) with "-eps_type krylovschur -st_type sinvert". I got an error message: "Must select a target sorting criterion if using shift-and-invert".
> > >
> > > Not sure how to proceed. I do not quite understand this sentence.
> > >
> > > You need to know how to choose the shift. So for instance you want the smallest eigenvalues, or the closest to zero, etc.
> > > I don't know the options, but they are in the manual.
> > >
> > >    Matt
> > >
> > > Fande,

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From fdkong.jd at gmail.com Mon Oct 23 11:08:37 2017
From: fdkong.jd at gmail.com (Fande Kong)
Date: Mon, 23 Oct 2017 10:08:37 -0600
Subject: [petsc-users] "Must select a target sorting criterion if using shift-and-invert"
In-Reply-To: 
References: 
Message-ID: 

Thanks Jose, I like this change.

Fande

On Mon, Oct 23, 2017 at 10:03 AM, Matthew Knepley wrote:

> On Mon, Oct 23, 2017 at 11:18 AM, Jose E. Roman wrote:
> > Changed. Hope it is clear now.
> > https://bitbucket.org/slepc/slepc/commits/511900656a27a161c1df6fe2e42fd8d66d071800
>
> Great, that is much clearer.
>
>   Thanks,
>
>     Matt
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From kannanr at ornl.gov Mon Oct 23 11:46:07 2017
From: kannanr at ornl.gov (Kannan, Ramakrishnan)
Date: Mon, 23 Oct 2017 16:46:07 +0000
Subject: [petsc-users] slepc NHEP error
In-Reply-To: <3E32E90C-45D5-4717-A62F-14BA11F23D74@dsic.upv.es>
References: <5EBDC484-4AC6-4AF3-8D3C-FF999830604F@ornl.gov> <8B4ECCCC-86B1-4580-8D3A-97DF12F02D7E@ornl.gov> <708A0DB5-AE36-40EE-86A9-288A9282A8B9@ornl.gov> <248F9117-5F43-438A-9E79-10D1E9EF9795@mcs.anl.gov> <0BED9C76-8FC4-4E58-B12C-45E21EC183DE@ornl.gov> <92EA6381-8F1F-4C9F-88BD-C1357B8C1C42@mcs.anl.gov> <9C04F853-DEDF-4F28-B63E-316AB14E0A97@mcs.anl.gov> <34243810-1A49-499E-812E-0C0CCCC38565@mcs.anl.gov> <4D23FC40-1491-44AF-825E-C4C4160F1F1E@ornl.gov> <2A3DB53A-D92A-4FC6-8454-5C11039B0343@dsic.upv.es> <4650CE13-784F-4FBE-B5FC-45717BD30103@ornl.gov> <3A414042-4AC3-4B8F-8CE6-6C0A45509ECF@dsic.upv.es> <6C5B1E55-B678-4A54-881B-421E627932E5@dsic.upv.es> <2756491B-117D-4985-BB1A-BDF91A21D5BC@mcs.anl.gov> <32692246-AB15-4605-BEB1-82CE12B43FCB@mcs.anl.gov> <3F1CBD8E-315C-4A2F-B38D-E907C8790BC4@dsic.upv.es> <0C7C7E28-DDD7-4867-A387-71CA499740CC@ornl.gov> <2792468E-84AC-4F09-A347-A4AB2F68A3E5@dsic.upv.es> <3E32E90C-45D5-4717-A62F-14BA11F23D74@dsic.upv.es>
Message-ID: <9FB6F7FC-EAE9-41FB-B296-A18555DB8CF3@ornl.gov>

Jose,

Really appreciate your fix on this. I will try out slepc 3.8 with the mentioned option and also with fblaslapack. I will get back to you if that fixes the problem.
--
Regards,
Ramki

On 10/23/17, 12:03 PM, "Jose E. Roman" wrote:

    To close this old thread, I would like to mention that in SLEPc 3.8 we have added a command-line option that should fix the problem:

    -ds_parallel synchronized

    This option forces a synchronization of the results of local computations in DS (those that involve LAPACK calls), so that all MPI processes have exactly the same result. This was causing the failure you reported.
    If this option is not provided, the behaviour is the same as in SLEPc 3.7 and before, i.e., all processes do the computation redundantly (-ds_parallel redundant).

    Jose
From zakaryah at gmail.com Mon Oct 23 14:13:08 2017
From: zakaryah at gmail.com (zakaryah .)
Date: Mon, 23 Oct 2017 15:13:08 -0400
Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods
In-Reply-To: <877evnyyty.fsf@jedbrown.org>
References: <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk> <877evnyyty.fsf@jedbrown.org>
Message-ID: 

Thanks Matt and Jed, the suggestion of DMComposite is perfect for what I want to do. Thanks Lukasz - your implementation is also illuminating.

I'm working on the code now. I've realized that this is a good opportunity to set up the code so that it will work properly with multigrid. My problem has a dependence on some external fields. In other words, there is constant data at each point in the DMDA. I don't know how to implement those so that they will be scaled properly as the grid is coarsened/refined. Any hints?

Thanks again - it's amazing to me how thorough the PETSc methods are, and the ease with which the user can access so many powerful methods, while your support is so knowledgeable and responsive.

On Sun, Oct 22, 2017 at 11:41 AM, Jed Brown wrote:

> Alternatively, see DMComposite and src/snes/examples/tutorials/ex22.c.
>
> Lukasz Kaczmarczyk writes:
>
>> On 22 Oct 2017, at 03:16, zakaryah . wrote:
>>
>> OK, it turns out Lukasz was exactly correct. With whatever method I try, the solver or stepper approaches a critical point, which is associated with some kind of snap-through. I have looked into the control techniques and they are pretty ingenious, and I think they should work for my problem, in that I hope to continue through the critical point. I have a technical question about the implementation, though.
>>
>> Following Riks 1979 for example, the control parameter is the approximate arc-length in the phase space of loading intensity and displacements. It represents one additional variable in the system, and there is one additional equation in the system (in Riks, this is eq. 3.9).
>>
>> In my implementation, the displacements are implemented as a DMDA with 3 dof, since I'm working in 3D. I'm not sure about the best way to add the single additional variable and equation. The way I see it, I either give up on using the DMDA, in which case I'm not sure how to efficiently implement the stencil I need to calculate spatial derivatives of the displacements, or I have to add a rather large number of extra variables. For example, if my DMDA is WxHxD, I would have to make it (W+1)xHxD, and each of the extra HxD variables will have 3 dof. Then 3xHxD-1 variables are in the nullspace (because they don't represent anything, so I would have to add a bunch of zeros to the function and the Jacobian), while the remaining variable is used as the control parameter. I'm aware of other methods, e.g. Crisfield 1983, but I'm interested in whether there is a straightforward way to implement Riks' method in PETSc. I'm sure I'm missing something so hopefully someone can give me some hints.
>>
>> Thanks for all the help!
> > > > > > Zakaryah, > > > > If you like to have a peek how we doing that, you can see > > http://mofem.eng.gla.ac.uk/mofem/html/_arc_length_tools_8hpp_source.html > > http://mofem.eng.gla.ac.uk/mofem/html/_arc_length_tools_8cpp_source.html > > > > The implementation is specific features related to MoFEM implementation. > However, you can follow the same idea; implement shell matrix, which adds > column and row with controlling and controlled equation, respectively, This > shell matrix has to have an operator for matrix-vector multiplication. Then > you have to add preconditioner, which is based on Riks and others. In fact > you can use as well FieldSplit pre-conditioner, Riks method is some variant > of Schur complement. > > > > Such implementation allows running multi-grid preconditioner and other > preconditions with control equation. > > > > Hope that this will be helpful. > > > > Regards, > > Lukasz > > > > > > > > On Thu, Oct 12, 2017 at 2:02 PM, zakaryah . zakaryah at gmail.com>> wrote: > > Thanks for the response, Matt - these are excellent questions. > > > > On theoretical grounds, I am certain that the solution to the continuous > PDE exists. Without any serious treatment, I think this means the > discretized system should have a solution up to discretization error, but > perhaps this is indeed a bad approach. > > > > I am not sure whether the equations are "really hard to solve". At each > point, the equations are third order polynomials of the state variable at > that point and at nearby points (i.e. in the stencil). One possible > complication is that the external forces which are applied to the interior > of the material can be fairly complex - they are smooth, but they can have > many inflection points. > > > > I don't have a great test case for which I know a good solution. To my > thinking, there is no way that time-stepping the parabolic version of the > same PDE can fail to yield a solution at infinite time. So, I'm going to > try starting there. Converting the problem to a minimization is a bit > trickier, because the discretization has to be performed one step earlier > in the calculation, and therefore the gradient and Hessian would need to be > recalculated. > > > > Even if there are some problems with time-stepping (speed of > convergence?), maybe I can use the solutions as better test cases for the > elliptic PDE solved via SNES. > > > > Can you give me any additional lingo or references for the fracture > problem? > > > > Thanks, Zak > > > > On Wed, Oct 11, 2017 at 8:53 PM, Matthew Knepley > wrote: > > On Wed, Oct 11, 2017 at 11:33 AM, zakaryah . zakaryah at gmail.com>> wrote: > > Many thanks for the suggestions, Matt. > > > > I tried putting the solvers in a loop, like this: > > > > do { > > NewtonLS > > check convergence > > if (converged) break > > NRichardson or NGMRES > > } while (!converged) > > > > The results were interesting, to me at least. With NRichardson, there > was indeed improvement in the residual norm, followed by improvement with > NewtonLS, and so on for a few iterations of this loop. In each case, after > a few iterations the NewtonLS appeared to be stuck in the same way as after > the first iteration. Eventually neither method was able to reduce the > residual norm, which was still significant, so this was not a total > success. With NGMRES, the initial behavior was similar, but eventually the > NGMRES progress became erratic. 
The minimal residual norm was a bit better > using NGMRES than NRichardson, but neither combination of methods fully > converged. For both NRichardson and NGMRES, I simply used the defaults, as > I have no knowledge of how to tune the options for my problem. > > > > Are you certain that the equations have a solution? I become a little > concerned when richardson stops converging. Its > > still possible you have really hard to solve equations, it just becomes > less likely. And even if they truly are hard to solve, > > then there should be physical reasons for this. For example, it could be > that discretizing the minimizing PDE is just the > > wrong thing to do. I believe this is the case in fracture, where you > attack the minimization problem directly. > > > > Matt > > > > On Tue, Oct 10, 2017 at 4:08 PM, Matthew Knepley > wrote: > > On Tue, Oct 10, 2017 at 12:08 PM, zakaryah . zakaryah at gmail.com>> wrote: > > Thanks for clearing that up. > > > > I'd appreciate any further help. Here's a summary: > > > > My ultimate goal is to find a vector field which minimizes an action. > The action is a (nonlinear) function of the field and its first spatial > derivatives. > > > > My current approach is to derive the (continuous) Euler-Lagrange > equations, which results in a nonlinear PDE that the minimizing field must > satisfy. These Euler-Lagrange equations are then discretized, and I'm > trying to use an SNES to solve them. > > > > The problem is that the solver seems to reach a point at which the > Jacobian (this corresponds to the second variation of the action, which is > like a Hessian of the energy) becomes nearly singular, but where the > residual (RHS of PDE) is not close to zero. The residual does not decrease > over additional SNES iterations, and the line search results in tiny step > sizes. My interpretation is that this point of stagnation is a critical > point. > > > > The normal thing to do here (I think) is to engage solvers which do not > depend on that particular point. So using > > NRichardson, or maybe NGMRES, to get past that. I would be interested to > see if this is successful. > > > > Matt > > > > I have checked the hand-coded Jacobian very carefully and I am confident > that it is correct. > > > > I am guessing that such a situation is well-known in the field, but I > don't know the lingo or literature. If anyone has suggestions I'd be > thrilled. Are there documentation/methodologies within PETSc for this type > of situation? > > > > Is there any advantage to discretizing the action itself and using the > optimization routines? With minor modifications I'll have the gradient and > Hessian calculations coded. Are the optimization routines likely to > stagnate in the same way as the nonlinear solver, or can they take > advantage of the structure of the problem to overcome this? > > > > Thanks a lot in advance for any help. > > > > On Sun, Oct 8, 2017 at 5:57 AM, Barry Smith bsmith at mcs.anl.gov>> wrote: > > > > There is apparently confusing in understanding the ordering. Is this > all on one process that you get funny results? Are you using > MatSetValuesStencil() to provide the matrix (it is generally easier than > providing it yourself). In parallel MatView() always maps the rows and > columns to the natural ordering before printing, if you use a matrix > created from the DMDA. If you create the matrix yourself it has a different > MatView in parallel that is in in thePETSc ordering.\ > > > > > > Barry > > > > > > > >> On Oct 8, 2017, at 8:05 AM, zakaryah . 
aryah at gmail.com>> wrote: > >> > >> I'm more confused than ever. I don't understand the output of > -snes_type test -snes_test_display. > >> > >> For the user-defined state of the vector (where I'd like to test the > Jacobian), the finite difference Jacobian at row 0 evaluates as: > >> > >> row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, > 287.797) (5, 744.695) (9, -1454.66) (10, 6.08793) (11, 148.172) (12, > 13.1089) (13, -36.5783) (14, -9.99399) (27, -3423.49) (28, -2175.34) > (29, 548.662) (30, 145.753) (31, 17.6603) (32, -15.1079) (36, 76.8575) > (37, 16.325) (38, 4.83918) > >> > >> But the hand-coded Jacobian at row 0 evaluates as: > >> > >> row 0: (0, 10768.6) (1, 2715.33) (2, -1422.41) (3, -6121.71) (4, > 287.797) (5, 744.695) (33, -1454.66) (34, 6.08792) (35, 148.172) (36, > 13.1089) (37, -36.5783) (38, -9.99399) (231, -3423.49) (232, -2175.34) > (233, 548.662) (234, 145.753) (235, 17.6603) (236, -15.1079) (264, > 76.8575) (265, 16.325) (266, 4.83917) (267, 0.) (268, 0.) (269, 0.) > >> and the difference between the Jacobians at row 0 evaluates as: > >> > >> row 0: (0, 0.000189908) (1, 7.17315e-05) (2, 9.31778e-05) (3, > 0.000514947) (4, 0.000178659) (5, 0.000178217) (9, -2.25457e-05) (10, > -6.34278e-06) (11, -5.93241e-07) (12, 9.48544e-06) (13, 4.79709e-06) > (14, 2.40016e-06) (27, -0.000335696) (28, -0.000106734) (29, > -0.000106653) (30, 2.73119e-06) (31, -7.93382e-07) (32, 1.24048e-07) > (36, -4.0302e-06) (37, 3.67494e-06) (38, -2.70115e-06) (39, 0.) (40, > 0.) (41, 0.) > >> > >> The difference between the column numbering between the finite > difference and the hand-coded Jacobians looks like a serious problem to me, > but I'm probably missing something. > >> > >> I am trying to use a 3D DMDA with 3 dof, a box stencil of width 1, and > for this test problem the grid dimensions are 11x7x6. For a grid point > x,y,z, and dof c, is the index calculated as c + 3*x + 3*11*y + 3*11*7*z? > If so, then the column numbers of the hand-coded Jacobian match those of > the 27 point stencil I have in mind. However, I am then at a loss to > explain the column numbers in the finite difference Jacobian. > >> > >> > >> > >> > >> On Sat, Oct 7, 2017 at 1:49 PM, zakaryah . zakaryah at gmail.com>> wrote: > >> OK - I ran with -snes_monitor -snes_converged_reason > -snes_linesearch_monitor -ksp_monitor -ksp_monitor_true_residual > -ksp_converged_reason -pc_type svd -pc_svd_monitor -snes_type newtonls > -snes_compare_explicit > >> > >> and here is the full error message, output immediately after > >> > >> Finite difference Jacobian > >> Mat Object: 24 MPI processes > >> type: mpiaij > >> > >> [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > >> > >> [0]PETSC ERROR: Invalid argument > >> > >> [0]PETSC ERROR: Matrix not generated from a DMDA > >> > >> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html > for trouble shooting. 
> >> > >> [0]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017 > >> > >> [0]PETSC ERROR: ./CalculateOpticalFlow on a arch-linux2-c-opt named > node046.hpc.rockefeller.internal by zfrentz Sat Oct 7 13:44:44 2017 > >> > >> [0]PETSC ERROR: Configure options --prefix=/ru-auth/local/home/zfrentz/PETSc-3.7.6 > --download-fblaslapack -with-debugging=0 > >> > >> [0]PETSC ERROR: #1 MatView_MPI_DA() line 551 in > /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/dm/impls/da/fdda.c > >> > >> [0]PETSC ERROR: #2 MatView() line 901 in /rugpfs/fs0/home/zfrentz/ > PETSc/build/petsc-3.7.6/src/mat/interface/matrix.c > >> > >> [0]PETSC ERROR: #3 SNESComputeJacobian() line 2371 in > /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/interface/snes.c > >> > >> [0]PETSC ERROR: #4 SNESSolve_NEWTONLS() line 228 in > /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/impls/ls/ls.c > >> > >> [0]PETSC ERROR: #5 SNESSolve() line 4005 in /rugpfs/fs0/home/zfrentz/ > PETSc/build/petsc-3.7.6/src/snes/interface/snes.c > >> > >> [0]PETSC ERROR: #6 solveWarp3D() line 659 in > /ru-auth/local/home/zfrentz/Code/OpticalFlow/working/October6_2017/mshs.c > >> > >> > >> On Tue, Oct 3, 2017 at 5:37 PM, Jed Brown jedbrown.org>> wrote: > >> Always always always send the whole error message. > >> > >> "zakaryah ." > writes: > >> > >> > I tried -snes_compare_explicit, and got the following error: > >> > > >> > [0]PETSC ERROR: Invalid argument > >> > > >> > [0]PETSC ERROR: Matrix not generated from a DMDA > >> > > >> > What am I doing wrong? > >> > > >> > On Tue, Oct 3, 2017 at 10:08 AM, Jed Brown jed at jedbrown.org>> wrote: > >> > > >> >> Barry Smith > writes: > >> >> > >> >> >> On Oct 3, 2017, at 5:54 AM, zakaryah . > wrote: > >> >> >> > >> >> >> I'm still working on this. I've made some progress, and it looks > like > >> >> the issue is with the KSP, at least for now. The Jacobian may be > >> >> ill-conditioned. Is it possible to use -snes_test_display during an > >> >> intermediate step of the analysis? I would like to inspect the > Jacobian > >> >> after several solves have already completed, > >> >> > > >> >> > No, our currently code for testing Jacobians is poor quality and > >> >> poorly organized. Needs a major refactoring to do things properly. > Sorry > >> >> > >> >> You can use -snes_compare_explicit or -snes_compare_coloring to > output > >> >> differences on each Newton step. > >> >> > >> > >> > > > > > > > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > > > > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 

From bsmith at mcs.anl.gov Mon Oct 23 14:37:37 2017
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Mon, 23 Oct 2017 14:37:37 -0500
Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods
In-Reply-To: 
References: <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk> <877evnyyty.fsf@jedbrown.org>
Message-ID: <29550785-0F7A-488B-A159-DD42DC29A228@mcs.anl.gov>

> On Oct 23, 2017, at 2:13 PM, zakaryah . wrote:
>
> Thanks Matt and Jed, the suggestion of DMComposite is perfect for what I want to do. Thanks Lukasz - your implementation is also illuminating.
>
> I'm working on the code now. I've realized that this is a good opportunity to set up the code so that it will work properly with multigrid. My problem has a dependence on some external fields. In other words, there is constant data at each point in the DMDA. I don't know how to implement those so that they will be scaled properly as the grid is coarsened/refined. Any hints?

   Depending on the meaning of the fields, to coarsen one can use either injection (just grab the values at the coarse grid points) or restriction (normally this is the transpose of linear interpolation; it sort of averages the values near the coarse grid points). To refine one generally uses linear interpolation. The DMDA can provide these operations (it also provides them to the geometric multigrid solver): DMCreateInjection(), DMCreateInterpolation(), DMCreateRestriction().
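   For example, a rough (untested) sketch of moving a field between two DMDA levels; dacoarse/dafine and fcoarse/ffine are placeholder names for your two DMDAs and the field vectors living on them:

    Mat interp;
    Vec scale;
    ierr = DMCreateInterpolation(dacoarse,dafine,&interp,&scale);CHKERRQ(ierr);
    /* refine the field: coarse -> fine, by linear interpolation */
    ierr = MatInterpolate(interp,fcoarse,ffine);CHKERRQ(ierr);
    /* coarsen the field: fine -> coarse, transpose of interpolation, then rescale */
    ierr = MatRestrict(interp,ffine,fcoarse);CHKERRQ(ierr);
    ierr = VecPointwiseMult(fcoarse,scale,fcoarse);CHKERRQ(ierr);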
   Barry

From zakaryah at gmail.com Mon Oct 23 19:30:24 2017
From: zakaryah at gmail.com (zakaryah .)
Date: Mon, 23 Oct 2017 20:30:24 -0400 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: <29550785-0F7A-488B-A159-DD42DC29A228@mcs.anl.gov> References: <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk> <877evnyyty.fsf@jedbrown.org> <29550785-0F7A-488B-A159-DD42DC29A228@mcs.anl.gov> Message-ID: Thanks Barry. For my problem, restriction seems more natural. I still don't understand how to actually introduce the field. As I understand it, the multi-grid procedures (coarsening, interpolating, etc.) are performed on the state variables, which for my problem can be naturally represented by a DMDA. I imagine for the external fields, I create a second DMDA, but I'm not sure if I somehow couple it to the state variable DMDA or just pass it through the user defined context or what. I have another question about the DMComposite. How do I calculate the value of the function for the redundant DM, as it depends on the state variables at all grid locations? The example ex21 is easy, because the function for the redundant variable depends only on a Lagrange multiplier which is known to belong to the first processor. I am hoping that I can do something like increment to the function value for the redundant DM? Then everything is safe because it happens between VecGetArray and VecRestoreArray? Or, is the thread safety due to the DMCompositeScatter/DMCompositeGather calls? In that case, am I forced to use ADD_VALUES? My last question, for now. Matt - you said it would be tricky to preallocate the composite matrix, and that I should turn off allocation and just stick in what I need. Does this mean I should call DMSetMatrixStructureOnly with PETSC_TRUE? In that case, can I assume that the initial assembly of the matrix will be slow due to allocation on the fly, but since the matrix always has the same non-zero structure, this is not an issue for the many repeated uses of the matrix that will occur? Thanks so much for all the help. On Mon, Oct 23, 2017 at 3:37 PM, Barry Smith wrote: > > > On Oct 23, 2017, at 2:13 PM, zakaryah . wrote: > > > > Thanks Matt and Jed, the suggestion of DMComposite is perfect for what I > want to do. Thanks Lukasz, as your implementation is also illuminating. > > > > I'm working on the code now. I've realized that this is a good > opportunity to set up the code so that it will work properly with > multigrid. My problem has a dependence on some external fields. In other > words, there is constant data at each point in the DMDA. I don't know how > to implement those so that they will be scaled properly as the grid is > coarsened/refined. Any hints? > > Depending on the meaning of the fields to coarsen one can use either > injection (just grab values on the coarse grid points) or restriction > (normally this is the transpose of linear interpolation and sort of an > averages values near the coarse grid points. To refine one generally uses > linear interpolation. The DMDA can provide these operations (it also > provides them to the geometric multigrid solver). DMCreateInjection(), > DMCreateInterpolation(), DMCreateRestriction(). 
>
> >
> > Thanks again - it's amazing to me how thorough the PETSc methods are,
> and the ease with which the user can access so many powerful methods, while
> your support is so knowledgeable and responsive.
> >
> > On Sun, Oct 22, 2017 at 11:41 AM, Jed Brown <jed at jedbrown.org> wrote:
> > Alternatively, see DMComposite and src/snes/examples/tutorials/ex22.c.
> >
> > Lukasz Kaczmarczyk writes:
> >
> > > On 22 Oct 2017, at 03:16, zakaryah . <zakaryah at gmail.com> wrote:
> > >
> > > OK, it turns out Lukasz was exactly correct.  With whatever method I
> try, the solver or stepper approaches a critical point, which is associated
> with some kind of snap-through.  I have looked into the control techniques
> and they are pretty ingenious, and I think they should work for my problem,
> in that I hope to continue through the critical point.  I have a technical
> question about the implementation, though.
> > >
> > > Following Riks 1979 for example, the control parameter is the
> approximate arc-length in the phase space of loading intensity and
> displacements.  It represents one additional variable in the system, and
> there is one additional equation in the system (in Riks, this is eq. 3.9).
> > >
> > > In my implementation, the displacements are implemented as a DMDA with
> 3 dof, since I'm working in 3D.  I'm not sure about the best way to add the
> single additional variable and equality.  The way I see it, I either give
> up on using the DMDA, in which case I'm not sure how to efficiently
> implement the stencil I need to calculate spatial derivatives of the
> displacements, or I have to add a rather large number of extra variables.
> For example, if my DMDA is WxHxD, I would have to make it (W+1)xHxD, and
> each of the extra HxD variables will have 3 dof.  Then 3xHxD-1 variables
> are in the nullspace (because they don't represent anything, so I would
> have to add a bunch of zeros to the function and the Jacobian), while the
> remaining variable is used as the control parameter.  I'm aware of other
> methods, e.g. Crisfield 1983, but I'm interested in whether there is a
> straightforward way to implement Riks' method in PETSc.  I'm sure I'm
> missing something so hopefully someone can give me some hints.
> > >
> > > Thanks for all the help!
> > >
> > >
> > > Zakaryah,
> > >
> > > If you would like to have a peek at how we do that, you can see
> > > http://mofem.eng.gla.ac.uk/mofem/html/_arc_length_tools_8hpp_source.html
> > > http://mofem.eng.gla.ac.uk/mofem/html/_arc_length_tools_8cpp_source.html
> > >
> > > The implementation has some features specific to MoFEM.  However, you
> can follow the same idea: implement a shell matrix, which adds a column and
> a row with the controlling and controlled equations, respectively.  This
> shell matrix has to have an operator for matrix-vector multiplication.
> Then you have to add a preconditioner, which is based on Riks and others.
> In fact you can use the FieldSplit preconditioner as well; Riks' method is
> some variant of a Schur complement.
> > >
> > > Such an implementation allows running a multi-grid preconditioner and
> other preconditioners with the control equation.
> > >
> > > Hope that this will be helpful.
> > >
> > > Regards,
> > > Lukasz
> > >
> > >
> > >
> > > On Thu, Oct 12, 2017 at 2:02 PM, zakaryah . <zakaryah at gmail.com> wrote:
> > > Thanks for the response, Matt - these are excellent questions.
> > >
> > > On theoretical grounds, I am certain that the solution to the
> continuous PDE exists.
> Without any serious treatment, I think this means the discretized system
> should have a solution up to discretization error, but perhaps this is
> indeed a bad approach.
> > >
> > > I am not sure whether the equations are "really hard to solve".  At
> each point, the equations are third order polynomials of the state variable
> at that point and at nearby points (i.e. in the stencil).  One possible
> complication is that the external forces which are applied to the interior
> of the material can be fairly complex - they are smooth, but they can have
> many inflection points.
> > >
> > > I don't have a great test case for which I know a good solution.  To
> my thinking, there is no way that time-stepping the parabolic version of
> the same PDE can fail to yield a solution at infinite time.  So, I'm going
> to try starting there.  Converting the problem to a minimization is a bit
> trickier, because the discretization has to be performed one step earlier
> in the calculation, and therefore the gradient and Hessian would need to be
> recalculated.
> > >
> > > Even if there are some problems with time-stepping (speed of
> convergence?), maybe I can use the solutions as better test cases for the
> elliptic PDE solved via SNES.
> > >
> > > Can you give me any additional lingo or references for the fracture
> problem?
> > >
> > > Thanks, Zak
> > >
> > > On Wed, Oct 11, 2017 at 8:53 PM, Matthew Knepley <knepley at gmail.com> wrote:
> > > On Wed, Oct 11, 2017 at 11:33 AM, zakaryah . <zakaryah at gmail.com> wrote:
> > > Many thanks for the suggestions, Matt.
> > >
> > > I tried putting the solvers in a loop, like this:
> > >
> > > do {
> > >   NewtonLS
> > >   check convergence
> > >   if (converged) break
> > >   NRichardson or NGMRES
> > > } while (!converged)
> > >
> > > The results were interesting, to me at least.  With NRichardson, there
> was indeed improvement in the residual norm, followed by improvement with
> NewtonLS, and so on for a few iterations of this loop.  In each case, after
> a few iterations the NewtonLS appeared to be stuck in the same way as after
> the first iteration.  Eventually neither method was able to reduce the
> residual norm, which was still significant, so this was not a total
> success.  With NGMRES, the initial behavior was similar, but eventually the
> NGMRES progress became erratic.  The minimal residual norm was a bit better
> using NGMRES than NRichardson, but neither combination of methods fully
> converged.  For both NRichardson and NGMRES, I simply used the defaults, as
> I have no knowledge of how to tune the options for my problem.
> > >
> > > Are you certain that the equations have a solution?  I become a little
> concerned when Richardson stops converging.  It's still possible you have
> really hard to solve equations, it just becomes less likely.  And even if
> they truly are hard to solve, then there should be physical reasons for
> this.  For example, it could be that discretizing the minimizing PDE is
> just the wrong thing to do.  I believe this is the case in fracture, where
> you attack the minimization problem directly.
> > >
> > >    Matt
> > >
> > > On Tue, Oct 10, 2017 at 4:08 PM, Matthew Knepley <knepley at gmail.com> wrote:
> > > On Tue, Oct 10, 2017 at 12:08 PM, zakaryah . <zakaryah at gmail.com> wrote:
> > > Thanks for clearing that up.
> > >
> > > I'd appreciate any further help.  Here's a summary:
> > >
> > > My ultimate goal is to find a vector field which minimizes an action.
> The action is a (nonlinear) function of the field and its first spatial
> derivatives.
> > >
> > > My current approach is to derive the (continuous) Euler-Lagrange
> equations, which results in a nonlinear PDE that the minimizing field must
> satisfy.  These Euler-Lagrange equations are then discretized, and I'm
> trying to use an SNES to solve them.
> > >
> > > The problem is that the solver seems to reach a point at which the
> Jacobian (this corresponds to the second variation of the action, which is
> like a Hessian of the energy) becomes nearly singular, but where the
> residual (RHS of PDE) is not close to zero.  The residual does not decrease
> over additional SNES iterations, and the line search results in tiny step
> sizes.  My interpretation is that this point of stagnation is a critical
> point.
> > >
> > > The normal thing to do here (I think) is to engage solvers which do
> not depend on that particular point.  So using
> > > NRichardson, or maybe NGMRES, to get past that.  I would be interested
> to see if this is successful.
> > >
> > >    Matt
> > >
> > > I have checked the hand-coded Jacobian very carefully and I am
> confident that it is correct.
> > >
> > > I am guessing that such a situation is well-known in the field, but I
> don't know the lingo or literature.  If anyone has suggestions I'd be
> thrilled.  Are there documentation/methodologies within PETSc for this type
> of situation?
> > >
> > > Is there any advantage to discretizing the action itself and using the
> optimization routines?  With minor modifications I'll have the gradient and
> Hessian calculations coded.  Are the optimization routines likely to
> stagnate in the same way as the nonlinear solver, or can they take
> advantage of the structure of the problem to overcome this?
> > >
> > > Thanks a lot in advance for any help.
> > >
> > > On Sun, Oct 8, 2017 at 5:57 AM, Barry Smith wrote:
> > >
> > >    There is apparently confusion in understanding the ordering.  Is
> this all on one process that you get funny results?  Are you using
> MatSetValuesStencil() to provide the matrix (it is generally easier than
> providing it yourself)?  In parallel MatView() always maps the rows and
> columns to the natural ordering before printing, if you use a matrix
> created from the DMDA.  If you create the matrix yourself it has a
> different MatView in parallel that is in the PETSc ordering.
> > >
> > >    Barry
> > >
> > >
> > >> On Oct 8, 2017, at 8:05 AM, zakaryah . <zakaryah at gmail.com> wrote:
> > >>
> > >> I'm more confused than ever.  I don't understand the output of
> -snes_type test -snes_test_display.
> > >>
> > >> For the user-defined state of the vector (where I'd like to test the
> Jacobian), the finite difference Jacobian at row 0 evaluates as:
> > >>
> > >> row 0: (0, 10768.6)  (1, 2715.33)  (2, -1422.41)  (3, -6121.71)  (4,
> 287.797)  (5, 744.695)  (9, -1454.66)  (10, 6.08793)  (11, 148.172)  (12,
> 13.1089)  (13, -36.5783)  (14, -9.99399)  (27, -3423.49)  (28, -2175.34)
> (29, 548.662)  (30, 145.753)  (31, 17.6603)  (32, -15.1079)  (36, 76.8575)
> (37, 16.325)  (38, 4.83918)
> > >>
> > >> But the hand-coded Jacobian at row 0 evaluates as:
> > >>
> > >> row 0: (0, 10768.6)  (1, 2715.33)  (2, -1422.41)  (3, -6121.71)  (4,
> 287.797)  (5, 744.695)  (33, -1454.66)  (34, 6.08792)  (35, 148.172)  (36,
> 13.1089)  (37, -36.5783)  (38, -9.99399)  (231, -3423.49)  (232, -2175.34)
> (233, 548.662)  (234, 145.753)  (235, 17.6603)  (236, -15.1079)  (264,
> 76.8575)  (265, 16.325)  (266, 4.83917)  (267, 0.)  (268, 0.)  (269, 0.)
> > >> and the difference between the Jacobians at row 0 evaluates as:
> > >>
> > >> row 0: (0, 0.000189908)  (1, 7.17315e-05)  (2, 9.31778e-05)  (3,
> 0.000514947)  (4, 0.000178659)  (5, 0.000178217)  (9, -2.25457e-05)  (10,
> -6.34278e-06)  (11, -5.93241e-07)  (12, 9.48544e-06)  (13, 4.79709e-06)
> (14, 2.40016e-06)  (27, -0.000335696)  (28, -0.000106734)  (29,
> -0.000106653)  (30, 2.73119e-06)  (31, -7.93382e-07)  (32, 1.24048e-07)
> (36, -4.0302e-06)  (37, 3.67494e-06)  (38, -2.70115e-06)  (39, 0.)  (40,
> 0.)  (41, 0.)
> > >>
> > >> The difference between the column numbering between the finite
> difference and the hand-coded Jacobians looks like a serious problem to me,
> but I'm probably missing something.
> > >>
> > >> I am trying to use a 3D DMDA with 3 dof, a box stencil of width 1,
> and for this test problem the grid dimensions are 11x7x6.  For a grid point
> x,y,z, and dof c, is the index calculated as c + 3*x + 3*11*y + 3*11*7*z?
> If so, then the column numbers of the hand-coded Jacobian match those of
> the 27 point stencil I have in mind.  However, I am then at a loss to
> explain the column numbers in the finite difference Jacobian.
> > >>
> > >>
> > >>
> > >>
> > >> On Sat, Oct 7, 2017 at 1:49 PM, zakaryah . <zakaryah at gmail.com> wrote:
> > >> OK - I ran with -snes_monitor -snes_converged_reason
> -snes_linesearch_monitor -ksp_monitor -ksp_monitor_true_residual
> -ksp_converged_reason -pc_type svd -pc_svd_monitor -snes_type newtonls
> -snes_compare_explicit
> > >>
> > >> and here is the full error message, output immediately after
> > >>
> > >> Finite difference Jacobian
> > >> Mat Object: 24 MPI processes
> > >>   type: mpiaij
> > >>
> > >> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > >>
> > >> [0]PETSC ERROR: Invalid argument
> > >>
> > >> [0]PETSC ERROR: Matrix not generated from a DMDA
> > >>
> > >> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > >>
> > >> [0]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017
> > >>
> > >> [0]PETSC ERROR: ./CalculateOpticalFlow on a arch-linux2-c-opt named
> node046.hpc.rockefeller.internal by zfrentz Sat Oct 7 13:44:44 2017
> > >>
> > >> [0]PETSC ERROR: Configure options --prefix=/ru-auth/local/home/zfrentz/PETSc-3.7.6
> --download-fblaslapack -with-debugging=0
> > >>
> > >> [0]PETSC ERROR: #1 MatView_MPI_DA() line 551 in
> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/dm/impls/da/fdda.c
> > >>
> > >> [0]PETSC ERROR: #2 MatView() line 901 in
> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/mat/interface/matrix.c
> > >>
> > >> [0]PETSC ERROR: #3 SNESComputeJacobian() line 2371 in
> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/interface/snes.c
> > >>
> > >> [0]PETSC ERROR: #4 SNESSolve_NEWTONLS() line 228 in
> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/impls/ls/ls.c
> > >>
> > >> [0]PETSC ERROR: #5 SNESSolve() line 4005 in
> /rugpfs/fs0/home/zfrentz/PETSc/build/petsc-3.7.6/src/snes/interface/snes.c
> > >>
> > >> [0]PETSC ERROR: #6 solveWarp3D() line 659 in
> /ru-auth/local/home/zfrentz/Code/OpticalFlow/working/October6_2017/mshs.c
> > >>
> > >>
> > >> On Tue, Oct 3, 2017 at 5:37 PM, Jed Brown <jed at jedbrown.org> wrote:
> > >> Always always always send the whole error message.
> > >>
> > >> "zakaryah ." <zakaryah at gmail.com> writes:
> > >>
> > >> > I tried -snes_compare_explicit, and got the following error:
> > >> >
> > >> > [0]PETSC ERROR: Invalid argument
> > >> >
> > >> > [0]PETSC ERROR: Matrix not generated from a DMDA
> > >> >
> > >> > What am I doing wrong?
> > >> >
> > >> > On Tue, Oct 3, 2017 at 10:08 AM, Jed Brown <jed at jedbrown.org> wrote:
> > >> >
> > >> >> Barry Smith writes:
> > >> >>
> > >> >> >> On Oct 3, 2017, at 5:54 AM, zakaryah . <zakaryah at gmail.com> wrote:
> > >> >> >>
> > >> >> >> I'm still working on this.  I've made some progress, and it
> looks like the issue is with the KSP, at least for now.  The Jacobian may
> be ill-conditioned.  Is it possible to use -snes_test_display during an
> intermediate step of the analysis?  I would like to inspect the Jacobian
> after several solves have already completed,
> > >> >> >
> > >> >> >    No, our current code for testing Jacobians is poor quality
> and poorly organized.  Needs a major refactoring to do things properly.
> Sorry
> > >> >>
> > >> >> You can use -snes_compare_explicit or -snes_compare_coloring to
> output differences on each Newton step.
> > >> >>
> > >>
> >
> > --
> > What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> >    -- Norbert Wiener
> >
> > https://www.cse.buffalo.edu/~knepley/

From jed at jedbrown.org  Mon Oct 23 22:09:17 2017
From: jed at jedbrown.org (Jed Brown)
Date: Mon, 23 Oct 2017 21:09:17 -0600
Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods
In-Reply-To:
References: <87o9qp5kou.fsf@jedbrown.org>
 <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov>
 <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org>
 <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov>
 <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk>
 <877evnyyty.fsf@jedbrown.org>
 <29550785-0F7A-488B-A159-DD42DC29A228@mcs.anl.gov>
Message-ID: <8760b5xmw2.fsf@jedbrown.org>

There is DMCoarsenHookAdd() and DMGetNamedGlobalVector()/DMGetNamedLocalVector()
that provide a composable way to store and transfer auxiliary variables
between grids.

"zakaryah ." <zakaryah at gmail.com> writes:

> Thanks Barry.  For my problem, restriction seems more natural.  I still
> don't understand how to actually introduce the field.  As I understand it,
> the multi-grid procedures (coarsening, interpolating, etc.) are performed
> on the state variables, which for my problem can be naturally represented
> by a DMDA.  I imagine for the external fields, I create a second DMDA, but
> I'm not sure if I somehow couple it to the state variable DMDA or just pass
> it through the user-defined context.
>
> I have another question about the DMComposite.  How do I calculate the
> value of the function for the redundant DM, as it depends on the state
> variables at all grid locations?  The example ex21 is easy, because the
> function for the redundant variable depends only on a Lagrange multiplier
> which is known to belong to the first processor.  I am hoping that I can do
> something like increment the function value for the redundant DM?  Then
> everything is safe because it happens between VecGetArray and
> VecRestoreArray?  Or, is the thread safety due to the
> DMCompositeScatter/DMCompositeGather calls?  In that case, am I forced to
> use ADD_VALUES?
>
> My last question, for now.  Matt - you said it would be tricky to
> preallocate the composite matrix, and that I should turn off allocation and
> just stick in what I need.  Does this mean I should call
> DMSetMatrixStructureOnly with PETSC_TRUE?  In that case, can I assume that
> the initial assembly of the matrix will be slow due to allocation on the
> fly, but since the matrix always has the same non-zero structure, this is
> not an issue for the many repeated uses of the matrix that will occur?
>
> Thanks so much for all the help.
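A minimal sketch of the hook mechanism Jed mentions, assuming the external
field is kept in a named vector called "external_field" attached to each
level's DM; the hook names, the vector name, and the use of the injection
matrix are illustrative assumptions, not code from this thread:

static PetscErrorCode CoarsenHook(DM dm_f,DM dm_c,void *ctx)
{
  return 0; /* nothing to set up per level in this sketch */
}

/* called whenever a solver restricts from the fine DM to the coarse DM */
static PetscErrorCode RestrictHook(DM dm_f,Mat restrct,Vec rscale,Mat inject,DM dm_c,void *ctx)
{
  Vec            aux_f,aux_c;
  PetscErrorCode ierr;

  ierr = DMGetNamedGlobalVector(dm_f,"external_field",&aux_f);CHKERRQ(ierr);
  ierr = DMGetNamedGlobalVector(dm_c,"external_field",&aux_c);CHKERRQ(ierr);
  ierr = MatRestrict(inject,aux_f,aux_c);CHKERRQ(ierr); /* injection; restrct/rscale would average instead */
  ierr = DMRestoreNamedGlobalVector(dm_c,"external_field",&aux_c);CHKERRQ(ierr);
  ierr = DMRestoreNamedGlobalVector(dm_f,"external_field",&aux_f);CHKERRQ(ierr);
  return 0;
}

/* registration, e.g. right after creating the fine-level DMDA */
ierr = DMCoarsenHookAdd(da,CoarsenHook,RestrictHook,NULL);CHKERRQ(ierr);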
From knepley at gmail.com  Tue Oct 24 06:57:45 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 24 Oct 2017 07:57:45 -0400
Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods
In-Reply-To:
References: <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov>
 <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov>
 <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org>
 <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov>
 <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk>
 <877evnyyty.fsf@jedbrown.org>
 <29550785-0F7A-488B-A159-DD42DC29A228@mcs.anl.gov>
Message-ID:

On Mon, Oct 23, 2017 at 8:30 PM, zakaryah . <zakaryah at gmail.com> wrote:

> Thanks Barry.  For my problem, restriction seems more natural.  I still
> don't understand how to actually introduce the field.  As I understand it,
> the multi-grid procedures (coarsening, interpolating, etc.) are performed
> on the state variables, which for my problem can be naturally represented
> by a DMDA.  I imagine for the external fields, I create a second DMDA, but
> I'm not sure if I somehow couple it to the state variable DMDA or just pass
> it through the user-defined context.
>
> I have another question about the DMComposite.  How do I calculate the
> value of the function for the redundant DM, as it depends on the state
> variables at all grid locations?  The example ex21 is easy, because the
> function for the redundant variable depends only on a Lagrange multiplier
> which is known to belong to the first processor.  I am hoping that I can do
> something like increment the function value for the redundant DM?  Then
> everything is safe because it happens between VecGetArray and
> VecRestoreArray?  Or, is the thread safety due to the
> DMCompositeScatter/DMCompositeGather calls?  In that case, am I forced to
> use ADD_VALUES?
>

If it's the global vector, one guy will have the value and everyone else
will have no values.  Then you just stick in the value on that process.
To calculate the value, just have everyone calculate the local value and
then use MPI_Reduce() to get the value to stick in.

> My last question, for now.  Matt - you said it would be tricky to
> preallocate the composite matrix, and that I should turn off allocation and
> just stick in what I need.  Does this mean I should call
> DMSetMatrixStructureOnly with PETSC_TRUE?  In that case, can I assume that
> the initial assembly of the matrix will be slow due to allocation on the
> fly, but since the matrix always has the same non-zero structure, this is
> not an issue for the many repeated uses of the matrix that will occur?
>

No.  Use the matrix the DM gives you, but it will have no allocation for
the coupling.  So call MatSetOption() with PETSC_FALSE for the
ALLOCATION_ERR flag.

  Thanks,

     Matt
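A minimal sketch of the reduction Matt describes, assuming process 0 owns
the redundant variable, xs/xm are the local range from DMDAGetCorners(),
and some_term() and fredundant stand in for the problem-specific
contribution and the array of the redundant part; all of these names are
illustrative assumptions:

  PetscScalar    local = 0.0,total = 0.0;
  PetscMPIInt    rank;
  PetscInt       i;
  PetscErrorCode ierr;

  ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);
  for (i = xs; i < xs+xm; i++) local += some_term(i);                           /* local part of the sum */
  ierr = MPI_Reduce(&local,&total,1,MPIU_SCALAR,MPIU_SUM,0,PETSC_COMM_WORLD);CHKERRQ(ierr);
  if (!rank) fredundant[0] = total;                                             /* only the owner sticks it in */

For the allocation point, the option Matt mentions would be cleared on the
composite Jacobian before assembling the coupling entries:

  ierr = MatSetOption(J,MAT_NEW_NONZERO_ALLOCATION_ERR,PETSC_FALSE);CHKERRQ(ierr);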
-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
   -- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From zakaryah at gmail.com  Tue Oct 24 13:45:28 2017
From: zakaryah at gmail.com (zakaryah .)
Date: Tue, 24 Oct 2017 14:45:28 -0400 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: References: <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk> <877evnyyty.fsf@jedbrown.org> <29550785-0F7A-488B-A159-DD42DC29A228@mcs.anl.gov> Message-ID: I see - I use a local variable to add up the terms on each process, then call MPI_Reduce within the function on process 0, which owns the redundant variable. I have one more question - for the calculation of the Jacobian, my life is made much much easier by using MatSetValuesStencil. However, the matrix which the SNES Jacobian receives as argument is the "full" matrix, containing the DMDA variables (displacements), plus the redundant variable. How do I access the submatrix corresponding just to the DMDA? If I can do that, then I can call MatSetValuesStencil on the submatrix. Is this the right approach? I'm not sure how to set the elements of the Jacobian which correspond to the redundant variable, either - i.e., how do I get the ordering? -------------- next part -------------- An HTML attachment was scrubbed... URL: From zakaryah at gmail.com Tue Oct 24 14:36:04 2017 From: zakaryah at gmail.com (zakaryah .) Date: Tue, 24 Oct 2017 15:36:04 -0400 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: References: <66C5A502-B1F1-47FB-80CA-57708ECA613F@mcs.anl.gov> <87o9qp5kou.fsf@jedbrown.org> <3B9781BF-4A27-4D51-814E-EEEEBDBBEAF9@mcs.anl.gov> <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk> <877evnyyty.fsf@jedbrown.org> <29550785-0F7A-488B-A159-DD42DC29A228@mcs.anl.gov> Message-ID: Well I made a little progress by considering SNES ex28.c. In the Jacobian routine, I call DMCompositeGetLocalISs, then use the IS to call MatGetLocalSubMatrix. I call these J_rr, J_rh, J_hr, and J_hh, where r represents the redundant variables and h represents the displacements. I assume I can call MatSetValuesStencil on J_hh, as before, and MatSetValue on J_rr (which is 1x1). I'm guessing that J_rr, J_rh, and J_hr can only be set on the processor which owns the redundant variable - is this correct? How do I determine the ordering for J_hr and J_rh? On Tue, Oct 24, 2017 at 2:45 PM, zakaryah . wrote: > I see - I use a local variable to add up the terms on each process, then > call MPI_Reduce within the function on process 0, which owns the redundant > variable. > > I have one more question - for the calculation of the Jacobian, my life is > made much much easier by using MatSetValuesStencil. However, the matrix > which the SNES Jacobian receives as argument is the "full" matrix, > containing the DMDA variables (displacements), plus the redundant > variable. How do I access the submatrix corresponding just to the DMDA? > If I can do that, then I can call MatSetValuesStencil on the submatrix. Is > this the right approach? I'm not sure how to set the elements of the > Jacobian which correspond to the redundant variable, either - i.e., how do > I get the ordering? > > -------------- next part -------------- An HTML attachment was scrubbed... 
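[As a reference for the block-extraction approach described above, a skeleton of such a Jacobian routine might look like the following sketch, in the spirit of SNES ex28.c. The field order in the composite (displacements first, redundant variable second) and all names are illustrative; the stencil question is taken up in the reply below.

    /* sketch: pull the four blocks out of the composite Jacobian */
    PetscErrorCode FormJacobian(SNES snes, Vec X, Mat J, Mat Jpre, void *ctx)
    {
      DM             dac;
      IS            *is;                      /* one local IS per field */
      Mat            J_hh, J_hr, J_rh, J_rr;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = SNESGetDM(snes, &dac);CHKERRQ(ierr);
      ierr = DMCompositeGetLocalISs(dac, &is);CHKERRQ(ierr);
      ierr = MatGetLocalSubMatrix(Jpre, is[0], is[0], &J_hh);CHKERRQ(ierr); /* displacements */
      ierr = MatGetLocalSubMatrix(Jpre, is[0], is[1], &J_hr);CHKERRQ(ierr);
      ierr = MatGetLocalSubMatrix(Jpre, is[1], is[0], &J_rh);CHKERRQ(ierr);
      ierr = MatGetLocalSubMatrix(Jpre, is[1], is[1], &J_rr);CHKERRQ(ierr); /* redundant, 1x1 */

      /* ... fill the blocks here: MatSetValuesStencil() on J_hh once a stencil
         is attached, MatSetValuesLocal() on J_hr/J_rh, MatSetValue() on J_rr ... */

      ierr = MatRestoreLocalSubMatrix(Jpre, is[0], is[0], &J_hh);CHKERRQ(ierr);
      ierr = MatRestoreLocalSubMatrix(Jpre, is[0], is[1], &J_hr);CHKERRQ(ierr);
      ierr = MatRestoreLocalSubMatrix(Jpre, is[1], is[0], &J_rh);CHKERRQ(ierr);
      ierr = MatRestoreLocalSubMatrix(Jpre, is[1], is[1], &J_rr);CHKERRQ(ierr);
      ierr = ISDestroy(&is[0]);CHKERRQ(ierr);
      ierr = ISDestroy(&is[1]);CHKERRQ(ierr);
      ierr = PetscFree(is);CHKERRQ(ierr);
      ierr = MatAssemblyBegin(Jpre, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
      ierr = MatAssemblyEnd(Jpre, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }
]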
URL: 

From jed at jedbrown.org Tue Oct 24 22:01:18 2017 From: jed at jedbrown.org (Jed Brown) Date: Tue, 24 Oct 2017 21:01:18 -0600 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: References: <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk> <877evnyyty.fsf@jedbrown.org> <29550785-0F7A-488B-A159-DD42DC29A228@mcs.anl.gov> Message-ID: <87inf4vsld.fsf@jedbrown.org>

Hmm, this is a use case we would like to support, but I'm not sure how to do it. I think at this point that your local submatrix will only have a stencil set if you were using MatNest, which I don't recommend (whatever you do, don't depend on it). But since you have the DMDA for your J_hh block, you can call MatSetStencil() yourself.

    ierr = DMDAGetGhostCorners(da,&starts[0],&starts[1],&starts[2],&dims[0],&dims[1],&dims[2]);CHKERRQ(ierr);
    ierr = MatSetStencil(A,dim,dims,starts,dof);CHKERRQ(ierr);

I don't know of code you could use to translate the stencil coordinates to local indices (it's just a lexicographic ordering so simple for you to write), but you can use MatSetValuesLocal on the J_hr/J_rh blocks.

"zakaryah ." writes: > Well I made a little progress by considering SNES ex28.c. In the Jacobian > routine, I call DMCompositeGetLocalISs, then use the IS to call > MatGetLocalSubMatrix. I call these J_rr, J_rh, J_hr, and J_hh, where r > represents the redundant variables and h represents the displacements. I > assume I can call MatSetValuesStencil on J_hh, as before, and MatSetValue > on J_rr (which is 1x1). I'm guessing that J_rr, J_rh, and J_hr can only be > set on the processor which owns the redundant variable - is this correct? > How do I determine the ordering for J_hr and J_rh? > > On Tue, Oct 24, 2017 at 2:45 PM, zakaryah . wrote: > >> I see - I use a local variable to add up the terms on each process, then >> call MPI_Reduce within the function on process 0, which owns the redundant >> variable. >> >> I have one more question - for the calculation of the Jacobian, my life is >> made much much easier by using MatSetValuesStencil. However, the matrix >> which the SNES Jacobian receives as argument is the "full" matrix, >> containing the DMDA variables (displacements), plus the redundant >> variable. How do I access the submatrix corresponding just to the DMDA? >> If I can do that, then I can call MatSetValuesStencil on the submatrix. Is >> this the right approach? I'm not sure how to set the elements of the >> Jacobian which correspond to the redundant variable, either - i.e., how do >> I get the ordering? >> >>

From niko.karin at gmail.com Wed Oct 25 03:32:33 2017 From: niko.karin at gmail.com (Karin&NiKo) Date: Wed, 25 Oct 2017 10:32:33 +0200 Subject: [petsc-users] Matload with given parallel layout Message-ID:

Dear PETSc team, I have a code that creates a parallel matrix based on domain decomposition. I serialize this matrix with MatView. Then, I would like to reload it with MatLoad, but not let PETSc decide the parallel layout; rather, I want to use the original distribution of the degrees of freedom. Does MatLoad help to do that? Shall I use the IS of the local/global mapping? I look forward to reading you, Nicolas -------------- next part -------------- An HTML attachment was scrubbed...
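[Returning to Jed's remark above that the stencil-to-local translation is just a lexicographic ordering, one possible helper is sketched below; it assumes the ghosted corners and widths come from DMDAGetGhostCorners() and that the dof components are interleaved:

    /* sketch: local (ghosted) index of stencil point (i,j,k), component c */
    static inline PetscInt StencilToLocal(PetscInt i, PetscInt j, PetscInt k, PetscInt c,
                                          PetscInt gxs, PetscInt gys, PetscInt gzs,
                                          PetscInt gxm, PetscInt gym, PetscInt gzm,
                                          PetscInt dof)
    {
      (void)gzm; /* kept for symmetry; the slowest dimension needs no extent */
      return (((k - gzs)*gym + (j - gys))*gxm + (i - gxs))*dof + c;
    }

The returned index can then be used with MatSetValuesLocal() on the J_hr/J_rh blocks.]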
URL: From knepley at gmail.com Wed Oct 25 04:40:12 2017 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 25 Oct 2017 05:40:12 -0400 Subject: [petsc-users] Matload with given parallel layout In-Reply-To: References: Message-ID: On Wed, Oct 25, 2017 at 4:32 AM, Karin&NiKo wrote: > Dear PETSc team, > > I have a code that creates a parallel matrix based on domain > decomposition. I serialize this matrix with MatView. > Then, I would like to relaod it with MatLoad but not leaving PETSc decide > the parallel layout but rather use the original distribution of the degrees > of freedom. > Does MatLoad help to do that? Shall I use the IS of the local/global > mapping ? > You can prescribe the layout of the Mat that is passed to MatLoad(). However, if you are asking that the original layout be stored in the file, we do not do that. It could be made to work by writing the layout to the .info file and then having the option call MatSetSizes() on load. Should not be much code. Thanks, Matt > I look forward to reading you, > Nicolas > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From niko.karin at gmail.com Wed Oct 25 05:10:32 2017 From: niko.karin at gmail.com (Karin&NiKo) Date: Wed, 25 Oct 2017 12:10:32 +0200 Subject: [petsc-users] Matload with given parallel layout In-Reply-To: References: Message-ID: Thank you very much for your answer. Is there an example in the PETSc tests that shows how to prescribe the layout of the Mat that is passed to MatLoad()? I see how to specify a local size with MatSetSize but I do not see how to assign an entry of the Mat to a given process... Thanks, Nicolas 2017-10-25 11:40 GMT+02:00 Matthew Knepley : > On Wed, Oct 25, 2017 at 4:32 AM, Karin&NiKo wrote: > >> Dear PETSc team, >> >> I have a code that creates a parallel matrix based on domain >> decomposition. I serialize this matrix with MatView. >> Then, I would like to relaod it with MatLoad but not leaving PETSc decide >> the parallel layout but rather use the original distribution of the degrees >> of freedom. >> Does MatLoad help to do that? Shall I use the IS of the local/global >> mapping ? >> > > You can prescribe the layout of the Mat that is passed to MatLoad(). > However, if you are asking that the original > layout be stored in the file, we do not do that. It could be made to work > by writing the layout to the .info file and > then having the option call MatSetSizes() on load. Should not be much code. > > Thanks, > > Matt > > >> I look forward to reading you, >> Nicolas >> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Oct 25 05:13:28 2017 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 25 Oct 2017 06:13:28 -0400 Subject: [petsc-users] Matload with given parallel layout In-Reply-To: References: Message-ID: On Wed, Oct 25, 2017 at 6:10 AM, Karin&NiKo wrote: > Thank you very much for your answer. > Is there an example in the PETSc tests that shows how to prescribe the > layout of the Mat that is passed to MatLoad()? 
I see how to specify a local > size with MatSetSize but I do not see how to assign an entry of the Mat to > a given process... > Mat object only have contiguous division of rows on processes, so specifying the size of each partition is all you can do. Matt > Thanks, > Nicolas > > 2017-10-25 11:40 GMT+02:00 Matthew Knepley : > >> On Wed, Oct 25, 2017 at 4:32 AM, Karin&NiKo wrote: >> >>> Dear PETSc team, >>> >>> I have a code that creates a parallel matrix based on domain >>> decomposition. I serialize this matrix with MatView. >>> Then, I would like to relaod it with MatLoad but not leaving PETSc >>> decide the parallel layout but rather use the original distribution of the >>> degrees of freedom. >>> Does MatLoad help to do that? Shall I use the IS of the local/global >>> mapping ? >>> >> >> You can prescribe the layout of the Mat that is passed to MatLoad(). >> However, if you are asking that the original >> layout be stored in the file, we do not do that. It could be made to work >> by writing the layout to the .info file and >> then having the option call MatSetSizes() on load. Should not be much >> code. >> >> Thanks, >> >> Matt >> >> >>> I look forward to reading you, >>> Nicolas >>> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Wed Oct 25 09:55:35 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 25 Oct 2017 09:55:35 -0500 Subject: [petsc-users] Matload with given parallel layout In-Reply-To: References: Message-ID: An HTML attachment was scrubbed... URL: From barryfsmith at me.com Wed Oct 25 10:29:58 2017 From: barryfsmith at me.com (Barry Smith) Date: Wed, 25 Oct 2017 10:29:58 -0500 Subject: [petsc-users] TSTHETA not working without explicit declaration with TSSetRHSJacobian In-Reply-To: References: <47137a4c-c897-39cb-e0d0-d19179635c19@yahoo.com> Message-ID: <494C02FC-FC89-4EE3-9C6D-B40BF9B43F5D@me.com> Sorry for the delay in answering this if you have not gotten and answer. If there is no DM and you want to use SNESComputeJacobianColor then you need to provide a Jacobian with the correct nonzero pattern (all the "nonzero" entries can just be set to zero when you pass in the matrix) this is because the coloring code uses the nonzero structure to color the matrix and then compute the Jacobians. It cannot work without the nonzero structure. I checked this and the manual page does explicitly state this. Barry > On Oct 12, 2017, at 12:08 PM, Ali Berk Kahraman wrote: > > My apologies for missing this earlier, but I have found these lines in the documentation that provides the solution for my problem. > > "To use a fully implicit method like TSTHETA or TSGL, either provide the Jacobian of F () (and G() > if G() is provided) or use a DM that provides a coloring so the Jacobian can be computed efficiently > via finite differences." > > However, I'm still confused with this, since the computation process of the Jacobian happens anyway, instead of giving an error such as "no jacobian is entered, aborting". 
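[To make the advice above concrete ("provide a Jacobian with the correct nonzero pattern; the entries can all be zero"), a minimal sketch for an illustrative 1D three-point stencil with no DM; ts is assumed to be the already-configured TS, and the size N is made up:

    /* sketch: assemble only the pattern, then hand it to the inner SNES */
    Mat      J;
    SNES     snes;
    PetscInt i, rstart, rend, N = 100;   /* illustrative global size */

    ierr = MatCreate(PETSC_COMM_WORLD, &J);CHKERRQ(ierr);
    ierr = MatSetSizes(J, PETSC_DECIDE, PETSC_DECIDE, N, N);CHKERRQ(ierr);
    ierr = MatSetFromOptions(J);CHKERRQ(ierr);
    ierr = MatSetUp(J);CHKERRQ(ierr);
    ierr = MatGetOwnershipRange(J, &rstart, &rend);CHKERRQ(ierr);
    for (i = rstart; i < rend; i++) {
      PetscInt    cols[3], nc = 0;
      PetscScalar vals[3] = {0.0, 0.0, 0.0};  /* zeros are fine: coloring reads structure only */
      if (i > 0)     cols[nc++] = i - 1;
      cols[nc++] = i;
      if (i < N - 1) cols[nc++] = i + 1;
      ierr = MatSetValues(J, 1, &i, nc, cols, vals, INSERT_VALUES);CHKERRQ(ierr);
    }
    ierr = MatAssemblyBegin(J, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(J, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = TSGetSNES(ts, &snes);CHKERRQ(ierr);
    ierr = SNESSetJacobian(snes, J, J, SNESComputeJacobianDefaultColor, NULL);CHKERRQ(ierr);
]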
> > > On 12-10-2017 19:45, Ali Berk Kahraman wrote: >> Hello All, >> >> >> I am trying to use TS solver without declaring RHS Jacobian, because I do not have it analytically. I'm getting the error posted at the end of the e-mail. As I understand, the SNES solver within the TS solver sees that I have not declared a Jacobian, so it calls the function SNESComputeJacobianColor to get a finite difference aproximation of the Jacobian, as I am too lazy to make this approximation myself, and SNESComputeJacobianColor calls MatFDColoringCreate() and this function finally gives the error "Matrix is in wrong state, Matrix must be assembled by calls to MatAssemblyBegin/End(); ". >> >> I am not sure if this is a bug, or it is something I'm doing wrong. It looks like a bug to me since the error is generated when the code understands that I haven't provided a Jacobian and consequently trying to compute it for me. However, I cannot be sure because I'm still pretty inexperienced using PETSc, so I'm writing this here and not to petsc-maint. Any ideas? >> >> >> Best Regards , >> >> Ali >> >> >> [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- >> [0]PETSC ERROR: Object is in wrong state >> [0]PETSC ERROR: Matrix must be assembled by calls to MatAssemblyBegin/End(); >> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. >> [0]PETSC ERROR: Petsc Release Version 3.8.0, unknown >> [0]PETSC ERROR: ./FastWavelet1DTransientHeat on a arch-linux2-c-debug named abk-CFDLab by abk Thu Oct 12 19:39:21 2017 >> [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mpich --download-fblaslapack >> [0]PETSC ERROR: #1 MatFDColoringCreate() line 464 in /home/abk/petsc/src/mat/matfd/fdmatrix.c >> [0]PETSC ERROR: #2 SNESComputeJacobianDefaultColor() line 83 in /home/abk/petsc/src/snes/interface/snesj2.c >> [0]PETSC ERROR: #3 SNESComputeJacobian() line 2358 in /home/abk/petsc/src/snes/interface/snes.c >> [0]PETSC ERROR: #4 SNESSolve_KSPONLY() line 36 in /home/abk/petsc/src/snes/impls/ksponly/ksponly.c >> [0]PETSC ERROR: #5 SNESSolve() line 4106 in /home/abk/petsc/src/snes/interface/snes.c >> [0]PETSC ERROR: #6 TS_SNESSolve() line 176 in /home/abk/petsc/src/ts/impls/implicit/theta/theta.c >> [0]PETSC ERROR: #7 TSStep_Theta() line 216 in /home/abk/petsc/src/ts/impls/implicit/theta/theta.c >> [0]PETSC ERROR: #8 TSStep() line 4120 in /home/abk/petsc/src/ts/interface/ts.c >> [0]PETSC ERROR: #9 TSSolve() line 4374 in /home/abk/petsc/src/ts/interface/ts.c >> [0]PETSC ERROR: #10 main() line 886 in /home/abk/Dropbox/MyWorkspace/WaveletCollocation/FastWaveletCollocation1D/FastWavelet1DTransientHeat.c >> [0]PETSC ERROR: No PETSc Option Table entries >> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- >> application called MPI_Abort(MPI_COMM_WORLD, 73) - process 0 >> [unset]: aborting job: >> application called MPI_Abort(MPI_COMM_WORLD, 73) - process 0 >> >> >> ------------------ >> (program exited with code: 73) >> Press return to continue >> >> > From niko.karin at gmail.com Wed Oct 25 11:59:31 2017 From: niko.karin at gmail.com (Karin&NiKo) Date: Wed, 25 Oct 2017 18:59:31 +0200 Subject: [petsc-users] Matload with given parallel layout In-Reply-To: References: Message-ID: Barry, Matt, Thank you very much for that clarification. 
My question is mainly motivated by the will to transfer test matrices from an industrial software to a more light and agile PETSc code, in order to run numerical experiments. Since we are dealing with DDM, I would like to ensure that we keep the parallel layout of our matrices. Regards, Nicolas 2017-10-25 16:55 GMT+02:00 Barry Smith : > > On Oct 25, 2017, at 5:13 AM, Matthew Knepley wrote: > > On Wed, Oct 25, 2017 at 6:10 AM, Karin&NiKo wrote: > Thank you very much for your answer. > Is there an example in the PETSc tests that shows how to prescribe the > layout of the Mat that is passed to MatLoad()? I see how to specify a local > size with MatSetSize but I do not see how to assign an entry of the Mat to > a given process... > > Mat object only have contiguous division of rows on processes, so > specifying the size of each partition is all you can do. > > > And indeed if you want the same layout as your MatViewed matrix then > this is exactly what you need. MatView just stores from row 0 to n-1 in > order and MatLoad reads from 0 to n-1 in order. If you want a different > parallel ordering, like based on using a partitioning then you load the > matrix and use MatPartitioningCreate() and then MatCreateSubMatrix() to > redistribute the matrix (note in this case the "sub matrix" has the same > size as the original matrix". > > Note we don't recommend saving big old matrices to files and then > reloading them for solves etc. This is not scalable, better to write your > applications so the entire process doesn't require saving and load matrices. > > Barry > > > > Matt > > Thanks, > Nicolas > > 2017-10-25 11:40 GMT+02:00 Matthew Knepley : > On Wed, Oct 25, 2017 at 4:32 AM, Karin&NiKo wrote: > Dear PETSc team, > > I have a code that creates a parallel matrix based on domain > decomposition. I serialize this matrix with MatView. > Then, I would like to relaod it with MatLoad but not leaving PETSc decide > the parallel layout but rather use the original distribution of the degrees > of freedom. > Does MatLoad help to do that? Shall I use the IS of the local/global > mapping ? > > You can prescribe the layout of the Mat that is passed to MatLoad(). > However, if you are asking that the original > layout be stored in the file, we do not do that. It could be made to work > by writing the layout to the .info file and > then having the option call MatSetSizes() on load. Should not be much code. > > Thanks, > > Matt > > I look forward to reading you, > Nicolas > > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Wed Oct 25 12:41:09 2017 From: bsmith at mcs.anl.gov (Smith, Barry F.) Date: Wed, 25 Oct 2017 17:41:09 +0000 Subject: [petsc-users] Matload with given parallel layout In-Reply-To: References: Message-ID: <3386FCEE-E969-4C18-8B02-670D8BCC42E8@mcs.anl.gov> > On Oct 25, 2017, at 11:59 AM, Karin&NiKo wrote: > > Barry, Matt, > Thank you very much for that clarification. 
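[Pulling the advice in this thread together, a sketch of reloading with the original layout; lm and ln stand for this rank's local row and column counts saved from the original decomposition, and the file name is made up:

    /* sketch: fix the parallel layout first, then MatLoad() fills it */
    PetscViewer viewer;
    Mat         A;
    PetscInt    lm, ln;   /* local sizes known from the original decomposition */

    ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "amat.dat", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
    ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
    ierr = MatSetSizes(A, lm, ln, PETSC_DETERMINE, PETSC_DETERMINE);CHKERRQ(ierr);
    ierr = MatSetFromOptions(A);CHKERRQ(ierr);
    ierr = MatLoad(A, viewer);CHKERRQ(ierr);
    ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
]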
My question is mainly motivated by the will to transfer test matrices from an industrial software to a more light and agile PETSc code, in order to run numerical experiments. Since we are dealing with DDM, I would like to ensure that we keep the parallel layout of our matrices. Got it. > > Regards, > Nicolas > > 2017-10-25 16:55 GMT+02:00 Barry Smith : > >> On Oct 25, 2017, at 5:13 AM, Matthew Knepley wrote: >> >> On Wed, Oct 25, 2017 at 6:10 AM, Karin&NiKo wrote: >> Thank you very much for your answer. >> Is there an example in the PETSc tests that shows how to prescribe the layout of the Mat that is passed to MatLoad()? I see how to specify a local size with MatSetSize but I do not see how to assign an entry of the Mat to a given process... >> >> Mat object only have contiguous division of rows on processes, so specifying the size of each partition is all you can do. > > And indeed if you want the same layout as your MatViewed matrix then this is exactly what you need. MatView just stores from row 0 to n-1 in order and MatLoad reads from 0 to n-1 in order. If you want a different parallel ordering, like based on using a partitioning then you load the matrix and use MatPartitioningCreate() and then MatCreateSubMatrix() to redistribute the matrix (note in this case the "sub matrix" has the same size as the original matrix". > > Note we don't recommend saving big old matrices to files and then reloading them for solves etc. This is not scalable, better to write your applications so the entire process doesn't require saving and load matrices. > > Barry > > >> >> Matt >> >> Thanks, >> Nicolas >> >> 2017-10-25 11:40 GMT+02:00 Matthew Knepley : >> On Wed, Oct 25, 2017 at 4:32 AM, Karin&NiKo wrote: >> Dear PETSc team, >> >> I have a code that creates a parallel matrix based on domain decomposition. I serialize this matrix with MatView. >> Then, I would like to relaod it with MatLoad but not leaving PETSc decide the parallel layout but rather use the original distribution of the degrees of freedom. >> Does MatLoad help to do that? Shall I use the IS of the local/global mapping ? >> >> You can prescribe the layout of the Mat that is passed to MatLoad(). However, if you are asking that the original >> layout be stored in the file, we do not do that. It could be made to work by writing the layout to the .info file and >> then having the option call MatSetSizes() on load. Should not be much code. >> >> Thanks, >> >> Matt >> >> I look forward to reading you, >> Nicolas >> >> >> >> >> >> -- >> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> >> >> >> -- >> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ > > From ccetinbas at anl.gov Wed Oct 25 13:52:55 2017 From: ccetinbas at anl.gov (Cetinbas, Cankur Firat) Date: Wed, 25 Oct 2017 18:52:55 +0000 Subject: [petsc-users] petsc4py Message-ID: Hi, We have a code in python to solve for 3D water transport in a nano scale pore geometry obtained by tomography image at APS. The code generates a sparse matrix with csr_matrix method in Python and solve the corresponding linear system with linsolve in Python. 
For small domains our code runs well but we need to make these two steps (sparse matrix generation and solving the linear system) parallel to be able to run the code on realistic-size domains.

I know the diagonal values and the non-diagonal values with corresponding positions (positions are not regular, it is not like 5 point stencil). These values and locations change every time step and the matrix size grows (as water moves in pores).

I checked the attached presentation, I was wondering if there is an easy way to generate a parallel sparse matrix from row, column and corresponding values as in Python csr? Since my off diagonal locations are not generic for each processor, it is not so trivial to me as a beginner in petsc.

By the way, in the matrix generation example (in the attached pdf), when I use the "getOwnershipRange" call, both Istart and Iend return 0. How can I generate the same matrix with multiple nodes in petsc4py?

Thanks for your help in advance. Regards, Firat

-------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Lisandro-Dalcin-petsc4py.pdf Type: application/pdf Size: 218697 bytes Desc: Lisandro-Dalcin-petsc4py.pdf URL:

From bsmith at mcs.anl.gov Wed Oct 25 15:44:57 2017 From: bsmith at mcs.anl.gov (Smith, Barry F.) Date: Wed, 25 Oct 2017 20:44:57 +0000 Subject: [petsc-users] petsc4py In-Reply-To: References: Message-ID:

Since the matrix changes size you will need to create a new matrix, fill it up, solve with it, destroy it for each time-step. No big deal.

> On Oct 25, 2017, at 1:52 PM, Cetinbas, Cankur Firat wrote: > > Hi, > > We have a code in python to solve for 3D water transport in a nano scale pore geometry obtained by tomography image at APS. > > The code generates a sparse matrix with csr_matrix method in Python and solve the corresponding linear system with linsolve in Python. For small domains our code runs well but we need to make these two steps (sparse matrix generation and solving the linear system) parallel to be able to run the code on realistic-size domains. > > I know the diagonal values and the non-diagonal values with corresponding positions (positions are not regular, it is not like 5 point stencil). These values and locations change every time step and the matrix size grows (as water moves in pores). > > I checked the attached presentation, I was wondering if there is an easy way to generate a parallel sparse matrix from row, column and corresponding values as in Python csr? Since my off diagonal locations are not generic for each processor, it is not so trivial to me as a beginner in petsc.

Each process will call MatSetValues() with "its" nonzero entries.

> > By the way, in the matrix generation example (in the attached pdf), when I use the "getOwnershipRange" call, both Istart and Iend return 0. How can I generate the same matrix with multiple nodes in petsc4py?

You call MatCreate() with the MPI_COMM world communicator and that single matrix is owned by all the processes. Did you set the matrix size before calling getOwnershipRange? If you still have trouble with getOwnershipRange() returning all zeros on all processes you will need to email the sample code that produces the problem. Barry

> > Thanks for your help in advance. > > Regards, > > Firat > >
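[As a concrete reference for Barry's points (set the sizes first, then query the ownership range, then let each process insert its own rows), a small sketch in C follows; the helpers nnz_of_row(), cols_of_row() and vals_of_row() are hypothetical stand-ins for the application's CSR-style connectivity data. The petsc4py calls mirror these closely (setSizes, setUp, getOwnershipRange, setValues).

    /* sketch, not the poster's code: each rank inserts only the rows it owns */
    Mat      A;
    PetscInt rstart, rend, row, N = 100000;   /* N: illustrative global size */

    ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
    ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N);CHKERRQ(ierr); /* before the ownership query */
    ierr = MatSetFromOptions(A);CHKERRQ(ierr);
    ierr = MatSetUp(A);CHKERRQ(ierr);
    ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr); /* now returns a nontrivial range */
    for (row = rstart; row < rend; row++) {
      PetscInt           n    = nnz_of_row(row);     /* hypothetical application data */
      const PetscInt    *cols = cols_of_row(row);
      const PetscScalar *vals = vals_of_row(row);
      ierr = MatSetValues(A, 1, &row, n, cols, vals, INSERT_VALUES);CHKERRQ(ierr);
    }
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
]

From zakaryah at gmail.com Wed Oct 25 20:40:44 2017 From: zakaryah at gmail.com (zakaryah .)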
Date: Wed, 25 Oct 2017 21:40:44 -0400 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: <87inf4vsld.fsf@jedbrown.org> References: <87bmlo9vds.fsf@jedbrown.org> <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk> <877evnyyty.fsf@jedbrown.org> <29550785-0F7A-488B-A159-DD42DC29A228@mcs.anl.gov> <87inf4vsld.fsf@jedbrown.org> Message-ID: Thanks Jed, that seems like it will work. I still have a problem, which I suspect is pretty simple to solve. To initialize the solve, I need to run the SNES once without the control parameter. My composite DM is dac, the DMDA inside dac is dah, the redundant field is dab, and I have the vectors set up for the solve and the residual. But I don't know how to setup the matrices for the Jacobian, in the main loop, i.e. in the scope from which SNESSolve is called. I think I have the matrix set up properly for the composite, and the evaluation routine for the composite system extracts the submatrices as we discussed. But I'm not sure how to setup a global matrix for the first solve, which is in a smaller space. I've gotten a lot out of SNES ex28 but there the smaller solve and the full solve never run in the same launch. I guess I can create two matrices for the Jacobians, one like DMCreateMatrix(dac,&J) for the full system and DMCreateMatrix(dah,&J_hh) for the initial solve. Is there a more efficient way - can I just create J and somehow extract the global submatrix J_hh? On Tue, Oct 24, 2017 at 11:01 PM, Jed Brown wrote: > Hmm, this is a use case we would like to support, but I'm not sure how > to do it. I think at this point that your local submatrix will only > have a stencil set if you were using MatNest, which I don't recommend > (whatever you do, don't depend on it). But since you have the DMDA for > your J_hh block, you can call MatSetStencil() yourself. > > ierr = DMDAGetGhostCorners(da,&starts[0],&starts[1],&starts[ > 2],&dims[0],&dims[1],&dims[2]);CHKERRQ(ierr); > ierr = MatSetStencil(A,dim,dims,starts,dof);CHKERRQ(ierr); > > I don't know of code you could use to translate the stencil coordinates > to local indices (it's just a lexicographic ordering so simple for you > to write), but you can use MatSetValuesLocal on the J_hr/J_rh blocks. > > "zakaryah ." writes: > > > Well I made a little progress by considering SNES ex28.c. In the > Jacobian > > routine, I call DMCompositeGetLocalISs, then use the IS to call > > MatGetLocalSubMatrix. I call these J_rr, J_rh, J_hr, and J_hh, where r > > represents the redundant variables and h represents the displacements. I > > assume I can call MatSetValuesStencil on J_hh, as before, and MatSetValue > > on J_rr (which is 1x1). I'm guessing that J_rr, J_rh, and J_hr can only > be > > set on the processor which owns the redundant variable - is this correct? > > How do I determine the ordering for J_hr and J_rh? > > > > On Tue, Oct 24, 2017 at 2:45 PM, zakaryah . wrote: > > > >> I see - I use a local variable to add up the terms on each process, then > >> call MPI_Reduce within the function on process 0, which owns the > redundant > >> variable. > >> > >> I have one more question - for the calculation of the Jacobian, my life > is > >> made much much easier by using MatSetValuesStencil. However, the matrix > >> which the SNES Jacobian receives as argument is the "full" matrix, > >> containing the DMDA variables (displacements), plus the redundant > >> variable. 
How do I access the submatrix corresponding just to the DMDA? > >> If I can do that, then I can call MatSetValuesStencil on the > submatrix. Is > >> this the right approach? I'm not sure how to set the elements of the > >> Jacobian which correspond to the redundant variable, either - i.e., how > do > >> I get the ordering? > >> > >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Wed Oct 25 21:22:05 2017 From: jed at jedbrown.org (Jed Brown) Date: Wed, 25 Oct 2017 20:22:05 -0600 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: References: <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk> <877evnyyty.fsf@jedbrown.org> <29550785-0F7A-488B-A159-DD42DC29A228@mcs.anl.gov> <87inf4vsld.fsf@jedbrown.org> Message-ID: <87k1zisl6a.fsf@jedbrown.org> "zakaryah ." writes: > Thanks Jed, that seems like it will work. > > I still have a problem, which I suspect is pretty simple to solve. To > initialize the solve, I need to run the SNES once without the control > parameter. My composite DM is dac, the DMDA inside dac is dah, the > redundant field is dab, and I have the vectors set up for the solve and the > residual. But I don't know how to setup the matrices for the Jacobian, in > the main loop, i.e. in the scope from which SNESSolve is called. I think I > have the matrix set up properly for the composite, and the evaluation > routine for the composite system extracts the submatrices as we discussed. > But I'm not sure how to setup a global matrix for the first solve, which is > in a smaller space. I've gotten a lot out of SNES ex28 but there the > smaller solve and the full solve never run in the same launch. Why not use a different SNES? Or just make that extra equation trivial u_r = 0? > I guess I can create two matrices for the Jacobians, one like > DMCreateMatrix(dac,&J) for the full system and > DMCreateMatrix(dah,&J_hh) for the initial solve. Is there a more > efficient way - can I just create J and somehow extract the global > submatrix J_hh? > > On Tue, Oct 24, 2017 at 11:01 PM, Jed Brown wrote: > >> Hmm, this is a use case we would like to support, but I'm not sure how >> to do it. I think at this point that your local submatrix will only >> have a stencil set if you were using MatNest, which I don't recommend >> (whatever you do, don't depend on it). But since you have the DMDA for >> your J_hh block, you can call MatSetStencil() yourself. >> >> ierr = DMDAGetGhostCorners(da,&starts[0],&starts[1],&starts[ >> 2],&dims[0],&dims[1],&dims[2]);CHKERRQ(ierr); >> ierr = MatSetStencil(A,dim,dims,starts,dof);CHKERRQ(ierr); >> >> I don't know of code you could use to translate the stencil coordinates >> to local indices (it's just a lexicographic ordering so simple for you >> to write), but you can use MatSetValuesLocal on the J_hr/J_rh blocks. >> >> "zakaryah ." writes: >> >> > Well I made a little progress by considering SNES ex28.c. In the >> Jacobian >> > routine, I call DMCompositeGetLocalISs, then use the IS to call >> > MatGetLocalSubMatrix. I call these J_rr, J_rh, J_hr, and J_hh, where r >> > represents the redundant variables and h represents the displacements. I >> > assume I can call MatSetValuesStencil on J_hh, as before, and MatSetValue >> > on J_rr (which is 1x1). I'm guessing that J_rr, J_rh, and J_hr can only >> be >> > set on the processor which owns the redundant variable - is this correct? 
>> > How do I determine the ordering for J_hr and J_rh? >> > >> > On Tue, Oct 24, 2017 at 2:45 PM, zakaryah . wrote: >> > >> >> I see - I use a local variable to add up the terms on each process, then >> >> call MPI_Reduce within the function on process 0, which owns the >> redundant >> >> variable. >> >> >> >> I have one more question - for the calculation of the Jacobian, my life >> is >> >> made much much easier by using MatSetValuesStencil. However, the matrix >> >> which the SNES Jacobian receives as argument is the "full" matrix, >> >> containing the DMDA variables (displacements), plus the redundant >> >> variable. How do I access the submatrix corresponding just to the DMDA? >> >> If I can do that, then I can call MatSetValuesStencil on the >> submatrix. Is >> >> this the right approach? I'm not sure how to set the elements of the >> >> Jacobian which correspond to the redundant variable, either - i.e., how >> do >> >> I get the ordering? >> >> >> >> >> From david.knezevic at akselos.com Sat Oct 28 09:31:35 2017 From: david.knezevic at akselos.com (David Knezevic) Date: Sat, 28 Oct 2017 10:31:35 -0400 Subject: [petsc-users] DIVERGED_DTOL with SNES after switching to 3.8 Message-ID: Hi all, I updated to the head of the maint branch (b5b0337). Previously (with version 3.7), I had a SNES test case that converged like this: NL step 0, |residual|_2 = 9.074962e-03 NL step 1, |residual|_2 = 5.516783e+03 NL step 2, |residual|_2 = 1.370884e-12 Number of nonlinear iterations: 2 Nonlinear solver convergence/divergence reason: CONVERGED_FNORM_RELATIVE and now I get the following behavior: NL step 0, |residual|_2 = 9.074962e-03 NL step 1, |residual|_2 = 5.516783e+03 Number of nonlinear iterations: 1 Nonlinear solver convergence/divergence reason: DIVERGED_DTOL I didn't change anything in my code, so I guess the default DTOL options in SNES have changed? If someone could suggest how I can recover the old behavior, that'd be great. (Note that this is a contact problem using an augmented Lagrangian penalty method, so the convergence is pretty nasty and it's hard to avoid these large jumps in the residual, so I don't want to get the DIVERGED_DTOL error in this case.) Thanks, David -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Sat Oct 28 09:44:19 2017 From: bsmith at mcs.anl.gov (Smith, Barry F.) Date: Sat, 28 Oct 2017 14:44:19 +0000 Subject: [petsc-users] DIVERGED_DTOL with SNES after switching to 3.8 In-Reply-To: References: Message-ID: Run with something like -snes_divergence_tolerance 1e20 Barry > On Oct 28, 2017, at 9:31 AM, David Knezevic wrote: > > Hi all, > > I updated to the head of the maint branch (b5b0337). Previously (with version 3.7), I had a SNES test case that converged like this: > > NL step 0, |residual|_2 = 9.074962e-03 > NL step 1, |residual|_2 = 5.516783e+03 > NL step 2, |residual|_2 = 1.370884e-12 > Number of nonlinear iterations: 2 > Nonlinear solver convergence/divergence reason: CONVERGED_FNORM_RELATIVE > > and now I get the following behavior: > > NL step 0, |residual|_2 = 9.074962e-03 > NL step 1, |residual|_2 = 5.516783e+03 > Number of nonlinear iterations: 1 > Nonlinear solver convergence/divergence reason: DIVERGED_DTOL > > I didn't change anything in my code, so I guess the default DTOL options in SNES have changed? If someone could suggest how I can recover the old behavior, that'd be great. 
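[For reference, the same control is available programmatically; a two-line sketch (per the SNESSetDivergenceTolerance manual page, a negative tolerance such as -1 deactivates the divergence test entirely):

    SNES snes;   /* assumed created and configured elsewhere */
    ierr = SNESSetDivergenceTolerance(snes, 1e20);CHKERRQ(ierr);  /* effectively never trigger DIVERGED_DTOL */
    /* or: ierr = SNESSetDivergenceTolerance(snes, -1.0);CHKERRQ(ierr);  to disable the test */
]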
> > (Note that this is a contact problem using an augmented Lagrangian penalty method, so the convergence is pretty nasty and it's hard to avoid these large jumps in the residual, so I don't want to get the DIVERGED_DTOL error in this case.) > > Thanks, > David From david.knezevic at akselos.com Sat Oct 28 09:54:00 2017 From: david.knezevic at akselos.com (David Knezevic) Date: Sat, 28 Oct 2017 10:54:00 -0400 Subject: [petsc-users] DIVERGED_DTOL with SNES after switching to 3.8 In-Reply-To: References: Message-ID: That works, thanks! David On Sat, Oct 28, 2017 at 10:44 AM, Smith, Barry F. wrote: > > Run with something like -snes_divergence_tolerance 1e20 > > Barry > > > > > > > > > On Oct 28, 2017, at 9:31 AM, David Knezevic > wrote: > > > > Hi all, > > > > I updated to the head of the maint branch (b5b0337). Previously (with > version 3.7), I had a SNES test case that converged like this: > > > > NL step 0, |residual|_2 = 9.074962e-03 > > NL step 1, |residual|_2 = 5.516783e+03 > > NL step 2, |residual|_2 = 1.370884e-12 > > Number of nonlinear iterations: 2 > > Nonlinear solver convergence/divergence reason: CONVERGED_FNORM_RELATIVE > > > > and now I get the following behavior: > > > > NL step 0, |residual|_2 = 9.074962e-03 > > NL step 1, |residual|_2 = 5.516783e+03 > > Number of nonlinear iterations: 1 > > Nonlinear solver convergence/divergence reason: DIVERGED_DTOL > > > > I didn't change anything in my code, so I guess the default DTOL options > in SNES have changed? If someone could suggest how I can recover the old > behavior, that'd be great. > > > > (Note that this is a contact problem using an augmented Lagrangian > penalty method, so the convergence is pretty nasty and it's hard to avoid > these large jumps in the residual, so I don't want to get the DIVERGED_DTOL > error in this case.) > > > > Thanks, > > David > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mlohry at gmail.com Sun Oct 29 11:50:47 2017 From: mlohry at gmail.com (Mark Lohry) Date: Sun, 29 Oct 2017 12:50:47 -0400 Subject: [petsc-users] preconditioning matrix-free newton-krylov In-Reply-To: References: <1C4B04A3719F56479255009C095BE3B593D45932@CSGMBX214W.pu.win.princeton.edu> Message-ID: Thanks again Barry, I've got the preconditioners hooked up with -snes_mf_operator and at least AMG looks to be working great on a high order unstructured DG problem. Couple questions on the SNESSetLagJacobian + SNESSetLagPreconditioner code flow: 1) With -snes_mf_operator, and given SNESSetLagJacobian(snes, 1) (default) and SNESSetLagPreconditioner(snes, 2), after the first KSP solve in a newton iteration, will it do the finite different jacobian calculation? Or will the Jacobian only be computed when the preconditioner lag setting demands it on the 3rd newton step? I suspect it's the latter based on where I see the code pause. 2) How do implicit TS and SNESSetLagPreconditioner/Persists interact? Does the counter since-last-preconditioner-compute reset with time steps, or does that lag counter just increment with every SNES solve regardless of how many nonlinear solves might have happened in a given timestep? 
Say lag preconditioner is 2, and a time stepper uses 3, 2, and 3 nonlinear solves on 3 steps, is the flow (time step 1)->(update preconditioner)->(snes solve)->(snes solve)->(update preconditioner)->(snes solve) (time step 2)->(snes solve)->(update preconditioner)->(snes solve) (time step 3)->(snes solve)->(update preconditioner)->(snes solve)->(snes solve) or (time step 1)->(update preconditioner)->(snes solve)->(snes solve)->(update preconditioner)->(snes solve) (time step 2)->(update preconditioner)->(snes solve)->(snes solve) (time step 3)->(update preconditioner)->(snes solve)->(snes solve)->(update preconditioner)->(snes solve) ? I think for implicit time stepping I'd probably want the preconditioner to be recomputed just once at the beginning of each time step, or some multiple of that. Does that sound reasonable? 3) Are there any hooks analogous to KSPSetPreSolve/PostSolve for the FD computation of the jacobians, or for the computation of the preconditioner? I'd like to get a handle on the relative costs of these. Best, Mark On Sat, Sep 23, 2017 at 3:28 PM, Mark Lohry wrote: > Great, thanks Barry. > > On Sat, Sep 23, 2017 at 3:12 PM, Barry Smith wrote: > >> >> > On Sep 23, 2017, at 12:48 PM, Mark W. Lohry >> wrote: >> > >> > I'm currently using JFNK in an application where I don't have a >> hand-coded jacobian, and it's working well enough but as expected the >> scaling isn't great. >> > >> > What is the general process for using PC with MatMFFDComputeJacobian? >> Does it make sense to occasionally have petsc re-compute the jacobian via >> finite differences, and then recompute the preconditioner? Any that just >> need the sparsity structure? >> >> Mark >> >> Yes, this is a common approach. SNESSetLagJacobian -snes_lag_jacobian >> >> The normal approach in SNES to use matrix-free for the operator and >> use finite differences to compute an approximate Jacobian used to construct >> preconditioners is to to create a sparse matrix with the sparsity of the >> approximate Jacobian (yes you need a way to figure out the sparsity, if you >> use DMDA it will figure out the sparsity for you). Then you use >> >> SNESSetJacobian(snes,J,J, SNESComputeJacobianDefaultColor, NULL); >> >> and use the options database option -snes_mf_operator >> >> >> > Are there any PCs that don't work in the matrix-free context? >> >> If you do the above you can use almost all the PC since you are >> providing an explicit matrix from which to build the preconditioner >> >> > Are there any example codes I overlooked? >> > >> > Last but not least... can the Boomer-AMG preconditioner work with JFNK? >> To really show my ignorance of AMG, can it actually be written as a matrix >> P^-1(Ax-b)=0, , or is it just a linear operator? >> >> Again, if you provide an approximate Jacobian like above you can use it >> with BoomerAMG, if you provide NO explicit matrix you cannot use BoomerAMG >> or almost any other preconditioner. >> >> Barry >> >> > >> > Thanks, >> > Mark >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From zakaryah at gmail.com Sun Oct 29 16:15:13 2017 From: zakaryah at gmail.com (zakaryah .) 
Date: Sun, 29 Oct 2017 17:15:13 -0400 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: <87k1zisl6a.fsf@jedbrown.org> References: <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk> <877evnyyty.fsf@jedbrown.org> <29550785-0F7A-488B-A159-DD42DC29A228@mcs.anl.gov> <87inf4vsld.fsf@jedbrown.org> <87k1zisl6a.fsf@jedbrown.org> Message-ID: Good point, Jed - I feel silly for missing this. Can I use -snes_type test -snes_test_display with the Jacobian generated from a DMComposite? When I try, it looks like the finite difference Jacobian is missing all the elements in the row corresponding to the redundant variable, except the diagonal, which is wrong. I'm not sure my code for setting the submatrices is correct. I'm especially uncertain about the submatrix J_bh, where b is the redundant variable and h is the displacements. This submatrix has only one row, and all of its columns are non-zero. Can its values be set with MatSetValuesLocal, on all processors? Is there an example of manually coding a Jacobian with a DMRedundant? -------------- next part -------------- An HTML attachment was scrubbed... URL: From giuntoli1991 at gmail.com Sun Oct 29 17:20:09 2017 From: giuntoli1991 at gmail.com (Guido Giuntoli) Date: Sun, 29 Oct 2017 23:20:09 +0100 Subject: [petsc-users] setting ghost padding region Message-ID: I have a ghost vector "b" and I want to work with my local representation adding values in my local part and my ghost padding region. Is this possible or I have to use VecSetValues ? VecGhostGetLocalForm( b , &b_loc ); VecGetArray( b_loc, &b_arr ); ... VecSetValues( b_loc, n , index, r_e, ADD_VALUES ); VecRestoreArray( b_loc, &b_arr ); // this communicate something in the ghost padding region ? I tried to see the code of VecRestoreArray but does not seem to be communicating anything. I am seeing some routines and I think that adding VecGhostUpdateBegin(b, ADD_VALUES, SCATTER_REVERSE); VecGhostUpdateEnd(b, ADD_VALUES, SCATTER_REVERSE); can do what I want, i.e. adding the ghost padding regions on all the processes that correspond.Is this true ? Thank you, Guido. -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Sun Oct 29 17:30:52 2017 From: bsmith at mcs.anl.gov (Smith, Barry F.) Date: Sun, 29 Oct 2017 22:30:52 +0000 Subject: [petsc-users] setting ghost padding region In-Reply-To: References: Message-ID: <5E45790B-52A6-47EF-A355-E14406C86417@mcs.anl.gov> > On Oct 29, 2017, at 5:20 PM, Guido Giuntoli wrote: > > I have a ghost vector "b" and I want to work with my local representation adding values in my local part and my ghost padding region. Is this possible or I have to use VecSetValues ? What do you really want to do? put values into the local and ghosted part and then add up all the ghosted parts into the local parts or put values into the local parts and then have the vector fill up the ghosted locations based on the local parts? > > VecGhostGetLocalForm( b , &b_loc ); VecGetArray( b_loc, &b_arr ); > ... > VecSetValues( b_loc, n , index, r_e, ADD_VALUES ); > VecRestoreArray( b_loc, &b_arr ); // this communicate something in the ghost padding region ? No, restore array communicates nothing. And you don't mix VecGetArray()/RestoreArray() with VecSetValues(). 
You either call VecSetValues() and then VecAssemblyBegin/End() to put values into a vector OR you use VecGetArray() put the values into the array and then call VecRestoreArray() > > I tried to see the code of VecRestoreArray but does not seem to be communicating anything. I am seeing some routines and I think that adding > > VecGhostUpdateBegin(b, ADD_VALUES, SCATTER_REVERSE); > VecGhostUpdateEnd(b, ADD_VALUES, SCATTER_REVERSE); > > can do what I want, i.e. adding the ghost padding regions on all the processes that correspond.Is this true ? Thank you, Guido. This will take the values you put into the ghost locations and add them up and put them into the appropriate global (non-ghost) locations. Barry From giuntoli1991 at gmail.com Sun Oct 29 17:37:34 2017 From: giuntoli1991 at gmail.com (Guido Giuntoli) Date: Sun, 29 Oct 2017 23:37:34 +0100 Subject: [petsc-users] setting ghost padding region In-Reply-To: <5E45790B-52A6-47EF-A355-E14406C86417@mcs.anl.gov> References: <5E45790B-52A6-47EF-A355-E14406C86417@mcs.anl.gov> Message-ID: very clear., thank you Barry. 2017-10-29 23:30 GMT+01:00 Smith, Barry F. : > > > On Oct 29, 2017, at 5:20 PM, Guido Giuntoli > wrote: > > > > I have a ghost vector "b" and I want to work with my local > representation adding values in my local part and my ghost padding region. > Is this possible or I have to use VecSetValues ? > > What do you really want to do? > > put values into the local and ghosted part and then add up all the > ghosted parts into the local parts or > > put values into the local parts and then have the vector fill up the > ghosted locations based on the local parts? > > > > > VecGhostGetLocalForm( b , &b_loc ); > VecGetArray( b_loc, > &b_arr ); > > ... > > VecSetValues( b_loc, n , index, r_e, ADD_VALUES ); > > VecRestoreArray( b_loc, &b_arr ); // this communicate something in the > ghost padding region ? > > No, restore array communicates nothing. And you don't mix > VecGetArray()/RestoreArray() with VecSetValues(). You either call > VecSetValues() and then VecAssemblyBegin/End() to put values into a vector > OR you use VecGetArray() put the values into the array and then call > VecRestoreArray() > > > > I tried to see the code of VecRestoreArray but does not seem to be > communicating anything. I am seeing some routines and I think that adding > > > > VecGhostUpdateBegin(b, ADD_VALUES, SCATTER_REVERSE); > > VecGhostUpdateEnd(b, ADD_VALUES, SCATTER_REVERSE); > > > > can do what I want, i.e. adding the ghost padding regions on all the > processes that correspond.Is this true ? Thank you, Guido. > > This will take the values you put into the ghost locations and add them > up and put them into the appropriate global (non-ghost) locations. > > > > Barry > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Oct 29 18:07:07 2017 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 29 Oct 2017 19:07:07 -0400 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: References: <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk> <877evnyyty.fsf@jedbrown.org> <29550785-0F7A-488B-A159-DD42DC29A228@mcs.anl.gov> <87inf4vsld.fsf@jedbrown.org> <87k1zisl6a.fsf@jedbrown.org> Message-ID: On Sun, Oct 29, 2017 at 5:15 PM, zakaryah . wrote: > Good point, Jed - I feel silly for missing this. 
> > Can I use -snes_type test -snes_test_display with the Jacobian generated > from a DMComposite? When I try, it looks like the finite difference > Jacobian is missing all the elements in the row corresponding to the > redundant variable, except the diagonal, which is wrong. > Well, this leads me to believe the residual function is wrong. What the FD Jacobian does is just call the residual twice with different solutions. Thus if the residual is different when you perturb the redundant variable, you should have Jacobian entries there. > I'm not sure my code for setting the submatrices is correct. I'm > especially uncertain about the submatrix J_bh, where b is the redundant > variable and h is the displacements. This submatrix has only one row, and > all of its columns are non-zero. Can its values be set with > MatSetValuesLocal, on all processors? > > Is there an example of manually coding a Jacobian with a DMRedundant? > I don't think so. We welcome contributions. Matt -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Mon Oct 30 10:29:22 2017 From: bsmith at mcs.anl.gov (Smith, Barry F.) Date: Mon, 30 Oct 2017 15:29:22 +0000 Subject: [petsc-users] preconditioning matrix-free newton-krylov In-Reply-To: References: <1C4B04A3719F56479255009C095BE3B593D45932@CSGMBX214W.pu.win.princeton.edu> Message-ID: <9D8F0366-3B1E-461F-8126-ADD87FA38F65@mcs.anl.gov> > On Oct 29, 2017, at 11:50 AM, Mark Lohry wrote: > > Thanks again Barry, I've got the preconditioners hooked up with -snes_mf_operator and at least AMG looks to be working great on a high order unstructured DG problem. > > Couple questions on the SNESSetLagJacobian + SNESSetLagPreconditioner code flow: > > 1) With -snes_mf_operator, and given SNESSetLagJacobian(snes, 1) (default) and SNESSetLagPreconditioner(snes, 2), after the first KSP solve in a newton iteration, will it do the finite different jacobian calculation? Or will the Jacobian only be computed when the preconditioner lag setting demands it on the 3rd newton step? I suspect it's the latter based on where I see the code pause. SNES with -snes_mf_operator will ALWAYS use the matrix-free finite difference f(x+h) - f(x) to apply the matrix vector product. The LagJacobian and LagPreconditioner are not coordinated. The first determines how often the Jacobian used for preconditioning is recomputed and the second determines how often the preconditioner is recomputed. If you are using -snes_mf_operator then it never makes sense to have lagJacobian < lagPreconditioner since it would recompute the Jacobian but not actually use it. It also makes no sense for lagPreconditioner < lagJacobian because you'd be recomputing the preconditioner on the same Jacobian. But actually if you don't change the Jacobian used in building the preconditioner then when it tries to recompute the preconditioner it determines the matrix has not changed so skips rebuilding the preconditioner. So when using -snes_mf_operator there is really no reason generally to set the preconditioner lag. > > 2) How do implicit TS and SNESSetLagPreconditioner/Persists interact? 
Does the counter since-last-preconditioner-compute reset with time steps, or does that lag counter just increment with every SNES solve regardless of how many nonlinear solves might have happened in a given timestep? Say lag preconditioner is 2, and a time stepper uses 3, 2, and 3 nonlinear solves on 3 steps, is the flow
> > (time step 1)->(update preconditioner)->(snes solve)->(snes solve)->(update preconditioner)->(snes solve)
> (time step 2)->(snes solve)->(update preconditioner)->(snes solve)
> (time step 3)->(snes solve)->(update preconditioner)->(snes solve)->(snes solve)
> > or
> > (time step 1)->(update preconditioner)->(snes solve)->(snes solve)->(update preconditioner)->(snes solve)
> (time step 2)->(update preconditioner)->(snes solve)->(snes solve)
> (time step 3)->(update preconditioner)->(snes solve)->(snes solve)->(update preconditioner)->(snes solve)
> > ?
> > I think for implicit time stepping I'd probably want the preconditioner to be recomputed just once at the beginning of each time step, or some multiple of that. Does that sound reasonable?

Yes, what you want to do is completely reasonable. You can use SNESSetLagJacobian() and SNESSetLagJacobianPersists() in combination to have the Jacobian recomputed every fixed number of times; if you set the persists flag and set LagJacobian to 10 it will recompute the Jacobian used in the preconditioner every 10th time a new Jacobian is needed. If you want to compute the new Jacobian used to build the preconditioner once at the beginning of each new TS stage you can set SNESSetLagJacobian() to -2 in the TS prestage call. There are possibly other tricks you can do by setting the two flags at different locations.

An alternative to hardwiring how often the Jacobian used to build the preconditioner is rebuilt is to rebuild based on when the preconditioner starts "working less well". Here you could put an additional KSPMonitor or SNESMonitor that detects if the number of linear iterations is above a certain amount and then sets the recompute Jacobian flag to -2 so that for the next solve it recreates the Jacobian used in building the preconditioner.

> > 3) Are there any hooks analogous to KSPSetPreSolve/PostSolve for the FD computation of the jacobians, or for the computation of the preconditioner? I'd like to get a handle on the relative costs of these.

No, do you just want the time? You can get that from the logging; for example -log_view

> > > Best, > Mark > >
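[To make the prestage idea concrete, a sketch of resetting the lag from a time-step hook. This is illustrative only: the hook name is made up, the every-5th-step cadence is arbitrary, and TSSetPreStage (which also receives the stage time) can be used the same way as the pre-step hook shown here.

    /* sketch: force one fresh preconditioner Jacobian every 5th time step */
    static PetscErrorCode MyPreStep(TS ts)
    {
      SNES           snes;
      PetscInt       step;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = TSGetStepNumber(ts, &step);CHKERRQ(ierr);
      ierr = TSGetSNES(ts, &snes);CHKERRQ(ierr);
      if (step % 5 == 0) {
        ierr = SNESSetLagJacobian(snes, -2);CHKERRQ(ierr); /* rebuild at the next solve */
      }
      PetscFunctionReturn(0);
    }

    /* registered once during setup: ierr = TSSetPreStep(ts, MyPreStep);CHKERRQ(ierr); */
]

On Sat, Sep 23, 2017 at 3:28 PM, Mark Lohry wrote: > Great, thanks Barry. > > On Sat, Sep 23, 2017 at 3:12 PM, Barry Smith wrote: > > > On Sep 23, 2017, at 12:48 PM, Mark W. Lohry wrote: > > > > I'm currently using JFNK in an application where I don't have a hand-coded jacobian, and it's working well enough but as expected the scaling isn't great. > > > > What is the general process for using PC with MatMFFDComputeJacobian? Does it make sense to occasionally have petsc re-compute the jacobian via finite differences, and then recompute the preconditioner? Any that just need the sparsity structure? > > Mark > > Yes, this is a common approach. SNESSetLagJacobian -snes_lag_jacobian > > The normal approach in SNES to use matrix-free for the operator and use finite differences to compute an approximate Jacobian used to construct preconditioners is to create a sparse matrix with the sparsity of the approximate Jacobian (yes you need a way to figure out the sparsity, if you use DMDA it will figure out the sparsity for you).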
> > > Are there any PCs that don't work in the matrix-free context?
> >
> > If you do the above you can use almost all the PC, since you are providing an explicit matrix from which to build the preconditioner
> >
> > > Are there any example codes I overlooked?
> > >
> > > Last but not least... can the Boomer-AMG preconditioner work with JFNK? To really show my ignorance of AMG, can it actually be written as a matrix P^-1(Ax-b)=0, or is it just a linear operator?
> >
> > Again, if you provide an approximate Jacobian like above you can use it with BoomerAMG; if you provide NO explicit matrix you cannot use BoomerAMG or almost any other preconditioner.
> >
> > Barry
> >
> > > Thanks,
> > > Mark

From mc0710 at gmail.com Mon Oct 30 10:49:38 2017
From: mc0710 at gmail.com (Mani Chandra)
Date: Mon, 30 Oct 2017 21:19:38 +0530
Subject: [petsc-users] Poisson problem with boundaries inside the domain
Message-ID: 

Hello,

I'm trying to solve the Poisson problem but with Dirichlet boundaries inside the domain (in addition to those imposed in the ghost zones). I'm using DMDA to create a structured grid and SNES coupled to this DMDA to solve the problem.

The issue is that SNES doesn't converge when I impose Dirichlet boundary values inside the domain. I tried setting the residual corresponding to these internal points to zero but that isn't helping. Is there a right way to approach this problem? I just want to tell SNES to remove the chosen internal grid points from the solver iterations.

Thanks,
Mani

From bsmith at mcs.anl.gov Mon Oct 30 10:58:31 2017
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Mon, 30 Oct 2017 15:58:31 +0000
Subject: [petsc-users] Poisson problem with boundaries inside the domain
In-Reply-To: 
References: 
Message-ID: <9E71D9F6-5D8E-4296-BDCF-C57A9915D311@mcs.anl.gov>

   If you are using DMDA then you can't just "remove" some grid points from the vector.

   What you need to do is for your function evaluation at these interior grid points do f[j][i] = u[j][i] - ub in the FormFunction (ub is the Dirichlet value) and put a 1 on the diagonal of that row of the Jacobian (and nothing off-diagonal).

   Barry

> On Oct 30, 2017, at 10:49 AM, Mani Chandra wrote:
> [...]
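A minimal sketch of that residual logic for a 2D DMDA Poisson problem (the mask bc_mask, the value ub, and the AppCtx layout are hypothetical names; ghost-zone handling is elided):

    typedef struct { PetscScalar ub; PetscInt **bc_mask; } AppCtx;  /* hypothetical */

    PetscErrorCode FormFunctionLocal(DMDALocalInfo *info, PetscScalar **u, PetscScalar **f, void *ptr)
    {
      AppCtx *ctx = (AppCtx*)ptr;
      for (PetscInt j = info->ys; j < info->ys + info->ym; j++) {
        for (PetscInt i = info->xs; i < info->xs + info->xm; i++) {
          if (ctx->bc_mask[j][i]) {
            /* pinned interior Dirichlet point: residual is zero only at u = ub */
            f[j][i] = u[j][i] - ctx->ub;
          } else {
            /* ordinary 5-point Laplacian row (zero right-hand side here) */
            f[j][i] = 4.0*u[j][i] - u[j][i-1] - u[j][i+1] - u[j-1][i] - u[j+1][i];
          }
        }
      }
      return 0;
    }

Registered with DMDASNESSetFunctionLocal(), this is all SNES needs; a finite-difference or coloring Jacobian then picks up the unit diagonal rows automatically.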
From bhatiamanav at gmail.com Mon Oct 30 11:14:11 2017
From: bhatiamanav at gmail.com (Manav Bhatia)
Date: Mon, 30 Oct 2017 11:14:11 -0500
Subject: [petsc-users] configuration error
Message-ID: 

Hi,

I am trying to install petsc 3.8 on a new MacBook machine with OS 10.13. I have installed openmpi from macports and I am getting this error on configuration. Attached is also the configure.log file.

I am not sure how to proceed with this. Any advice will be greatly appreciated!

Regards,
Manav

===============================================================================
             Configuring PETSc to compile on your system
===============================================================================
===============================================================================
  ***** WARNING: Using default optimization C flags -g -O3
  You might consider manually setting optimal optimization flags for your system with COPTFLAGS="optimization flags" see config/examples/arch-*-opt.py for examples
===============================================================================
===============================================================================
  ***** WARNING: Using default C++ optimization flags -g -O3
  You might consider manually setting optimal optimization flags for your system with CXXOPTFLAGS="optimization flags" see config/examples/arch-*-opt.py for examples
===============================================================================
===============================================================================
  ***** WARNING: Using default FORTRAN optimization flags -g -O
  You might consider manually setting optimal optimization flags for your system with FOPTFLAGS="optimization flags" see config/examples/arch-*-opt.py for examples
===============================================================================
===============================================================================
  WARNING! Compiling PETSc with no debugging, this should only be done for timing and production runs. All development should be done when configured using --with-debugging=1
===============================================================================
TESTING: checkCLibraries from config.compilers(config/BuildSystem/config/compilers.py:171)
*******************************************************************************
         UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
-------------------------------------------------------------------------------
C libraries cannot directly be used from Fortran
*******************************************************************************

From fande.kong at inl.gov Mon Oct 30 11:36:45 2017
From: fande.kong at inl.gov (Kong, Fande)
Date: Mon, 30 Oct 2017 10:36:45 -0600
Subject: [petsc-users] configuration error
In-Reply-To: 
References: 
Message-ID: 

We had exactly the same issue when we upgraded compilers. I guess this is somehow related to gfortran. A simple way to work around it for us was to change "if with_rpath:" to "if False" at line 54 of config/BuildSystem/config/libraries.py.

Not sure if it works for you.

Fande,

On Mon, Oct 30, 2017 at 10:14 AM, Manav Bhatia wrote:
> [...]
From mc0710 at gmail.com Mon Oct 30 11:37:16 2017
From: mc0710 at gmail.com (Mani Chandra)
Date: Mon, 30 Oct 2017 22:07:16 +0530
Subject: [petsc-users] Poisson problem with boundaries inside the domain
In-Reply-To: <9E71D9F6-5D8E-4296-BDCF-C57A9915D311@mcs.anl.gov>
References: <9E71D9F6-5D8E-4296-BDCF-C57A9915D311@mcs.anl.gov>
Message-ID: 

Thanks! f[j][i] = u[j][i] - ub did it. It even works with the automated Jacobian assembly.

On Mon, Oct 30, 2017 at 9:28 PM, Smith, Barry F. wrote:
> [...]
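For a hand-coded Jacobian, the pinned rows would carry just the 1 on the diagonal that Barry describes; a sketch using the same hypothetical mask as above:

    if (ctx->bc_mask[j][i]) {
      MatStencil  row = {0};
      PetscScalar one = 1.0;
      row.i = i; row.j = j;            /* a pinned interior point */
      MatSetValuesStencil(J, 1, &row, 1, &row, &one, INSERT_VALUES);
    }
    /* ...all other rows get the usual 5-point stencil entries... */
    MatAssemblyBegin(J, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(J, MAT_FINAL_ASSEMBLY);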
From bhatiamanav at gmail.com Mon Oct 30 11:43:12 2017
From: bhatiamanav at gmail.com (Manav Bhatia)
Date: Mon, 30 Oct 2017 11:43:12 -0500
Subject: [petsc-users] configuration error
In-Reply-To: 
References: 
Message-ID: 

Fande,

I made the change you recommended and it seems to have moved past that stage in the configuration.

Thanks for your help!

Regards,
Manav

> On Oct 30, 2017, at 11:36 AM, Kong, Fande wrote:
> [...]

From bhatiamanav at gmail.com Mon Oct 30 11:57:24 2017
From: bhatiamanav at gmail.com (Manav Bhatia)
Date: Mon, 30 Oct 2017 11:57:24 -0500
Subject: [petsc-users] configuration error
In-Reply-To: 
References: 
Message-ID: <07C87D43-41E1-42D1-940A-F85BECFE9A8E@gmail.com>

I am now getting errors with mumps (please see below). Interestingly, I just compiled this on another machine with clang-3.8 and gfortran-6.7 without problems.

-Manav

mumps_c.c:307:53: error: no member named 'nnz' in 'CMUMPS_STRUC_C'; did you mean 'nz'?
/Users/manav/Documents/codes/numerical_lib/petsc/include/cmumps_c.h:56:20: note: 'nz' declared here (MUMPS_INT nz;)
mumps_c.c:307:92: error: no member named 'nnz_loc' in 'CMUMPS_STRUC_C'; did you mean 'nz_loc'?
/Users/manav/Documents/codes/numerical_lib/petsc/include/cmumps_c.h:62:20: note: 'nz_loc' declared here (MUMPS_INT nz_loc;)
mumps_c.c:419:42: error: no member named 'nnz' in 'CMUMPS_STRUC_C'; did you mean 'nz'?
mumps_c.c:420:46: error: no member named 'nnz_loc' in 'CMUMPS_STRUC_C'; did you mean 'nz_loc'?
mumps_c.c:419:29: warning: incompatible pointer types passing 'int *' to parameter of type 'int64_t *' (aka 'long long *') [-Wincompatible-pointer-types]
mumps_c.c:420:33: warning: incompatible pointer types passing 'int *' to parameter of type 'int64_t *' (aka 'long long *') [-Wincompatible-pointer-types]
[...]
2 warnings and 4 errors generated.

> On Oct 30, 2017, at 11:36 AM, Kong, Fande wrote:
> [...]

From rchurchi at pppl.gov Mon Oct 30 12:18:21 2017
From: rchurchi at pppl.gov (Randy Michael Churchill)
Date: Mon, 30 Oct 2017 12:18:21 -0500
Subject: [petsc-users] Updating Fortran code to petsc 3.8
Message-ID: 

I'm updating my Fortran code to petsc 3.8. I have several modules with types, and often when a type is initialized in the code, there are certain Petsc variables that aren't ready to be initialized, so they are set to 0. Later, in several parts of the code, these are checked with an if statement, e.g.:

module modmat
#include <petsc/finclude/petsc.h>
#include <petsc/finclude/petscmat.h>
use petsc
use petscmat

type mymat
   Mat :: Amat
end type mymat

contains
subroutine mymat_init(this)
   type(mymat) :: this
   this%Amat = 0
end subroutine mymat_init
end module modmat


program main
use modmat
implicit none
type(mymat) :: mymat1

call mymat_init(mymat1)
call somethingamazing(mymat1)
end program main

subroutine somethingamazing(this)
use modmat
type(mymat) :: this
if (this%Amat == 0) then
   print *, 'Amat not initialized'
endif
end subroutine somethingamazing

However, with Petsc 3.8, if I set Amat to 0, I get the error:
error #6303: The assignment operation or the binary expression operation is invalid for the data types of the two operands.

I replaced the 0 with PETSC_NULL_MAT, but then the if statement gives an error:
error #6355: This binary operation is invalid for this data type.

I change the if statement to:
if (this%Amat == PETSC_NULL_MAT) then
and it compiles.

Is this the right way to update Fortran code to v3.8? Is there any alternative? My concern is we will have no backwards compatibility, because I believe the specific PETSC_NULL_XXX objects were only introduced in 3.8. So if we update to 3.8, we would either have to drop support for <3.8 or have a lot of #if PETSC_VERSION statements.

--
R. Michael Churchill

From bsmith at mcs.anl.gov Mon Oct 30 12:43:16 2017
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Mon, 30 Oct 2017 17:43:16 +0000
Subject: [petsc-users] Updating Fortran code to petsc 3.8
In-Reply-To: 
References: 
Message-ID: <832CB0BB-333F-4B38-AFCD-14E0CEC9E69D@mcs.anl.gov>

   Please clarify.

1) You can successfully update to 3.8 and compile and run the code?

2) You do not have a way to support both 3.7 and 3.8 except by putting a large number of #ifdef in the code?

   Yes, 3.8 is a dramatic shift in API (to one we feel is much better); it is not simple to write code that works with both versions. We recommend just using a 3.8 version once you verify it behaves as the previous version. Note that on any computer, including those at NERSC, if 3.8 is not installed by default you may install it yourself.

   Barry

> On Oct 30, 2017, at 12:18 PM, Randy Michael Churchill wrote:
> [...]
From bsmith at mcs.anl.gov Mon Oct 30 12:55:17 2017
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Mon, 30 Oct 2017 17:55:17 +0000
Subject: [petsc-users] preconditioning matrix-free newton-krylov
In-Reply-To: 
References: <1C4B04A3719F56479255009C095BE3B593D45932@CSGMBX214W.pu.win.princeton.edu> <9D8F0366-3B1E-461F-8126-ADD87FA38F65@mcs.anl.gov>
Message-ID: <0BB37C70-80EB-4043-BB4F-4BF65BF7F8EB@mcs.anl.gov>

> On Oct 30, 2017, at 12:39 PM, Mark Lohry wrote:
>
> > > 3) Are there any hooks analogous to KSPSetPreSolve/PostSolve for the FD computation of the jacobians, or for the computation of the preconditioner? I'd like to get a handle on the relative costs of these.
> >
> > No, do you just want the time? You can get that from the logging; for example -log_view
>
> Yes, was just thinking in regards to your suggestion of recomputing when the number of linear iterations gets too high; I assume it's the ratio of preconditioner cost vs linear solver cost at runtime that's the metric of interest, and not the absolute value of either. But I'll cross that bridge when I come to it.
>
> When I had asked, I was looking to see where a long pause was happening, thinking it was the FD jacobian; it turned out to be before that, in MatColoringApply, which seems surprisingly expensive. MATCOLORINGJP took ~15 minutes on 32 cores on a small 153,000^2 system, with MATCOLORINGGREEDY taking 30 seconds. Any guidance there, or is this expected? I'm not using DM, just manually entering the sparsity resulting from a metis decomposition of a tetrahedral mesh.

   Hmm, metis doesn't really have anything to do with the sparsity of the Jacobian does it?

   Matt,

   These times are huge. What is going on?

   Barry

> Thanks for the info on the lag logic, I'll play with the TS pre/post calls for the time-accurate problems and only use LagJacobian.
>
> On Mon, Oct 30, 2017 at 11:29 AM, Smith, Barry F. wrote:
> [...]
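If the coloring itself is the bottleneck, the algorithm can be switched without code changes via the option -mat_coloring_type greedy; done explicitly it might look like this sketch (J being the assembled preconditioner matrix from earlier):

    MatColoring mc;
    ISColoring  iscoloring;
    MatColoringCreate(J, &mc);
    MatColoringSetType(mc, MATCOLORINGGREEDY);
    MatColoringSetFromOptions(mc);   /* still overridable from the command line */
    MatColoringApply(mc, &iscoloring);
    /* hand iscoloring to MatFDColoringCreate(), or simply rely on the
       -mat_coloring_type option and let SNESComputeJacobianDefaultColor
       do all of this internally */
    ISColoringDestroy(&iscoloring);
    MatColoringDestroy(&mc);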
From rchurchi at pppl.gov Mon Oct 30 13:06:55 2017
From: rchurchi at pppl.gov (Randy Michael Churchill)
Date: Mon, 30 Oct 2017 13:06:55 -0500
Subject: [petsc-users] Updating Fortran code to petsc 3.8
In-Reply-To: <832CB0BB-333F-4B38-AFCD-14E0CEC9E69D@mcs.anl.gov>
References: <832CB0BB-333F-4B38-AFCD-14E0CEC9E69D@mcs.anl.gov>
Message-ID: 

> Please clarify.
>
> 1) You can successfully update to 3.8 and compile and run the code?

I haven't updated all of the code to 3.8; I wanted to make sure there wasn't any trick to maintain backwards compatibility before I change everything. I'm only in the process of changing several of the codebase's files, updating any "Amat = 0" statements or "if (Amat==0) then" statements, which do not compile with 3.8 (they need the specific PETSC_NULL_XXX instead of 0).
When I make those changes, that section of the code compiles (btw, on NERSC Edison, with my own compiled petsc v3.8).

> 2) You do not have a way to support both 3.7 and 3.8 except by putting a large number of #ifdef in the code?

If I have to change "Amat = 0" to "Amat = PETSC_NULL_MAT" for 3.8, and change all of the PETSC_NULL_OBJECT for 3.8, many of those PETSC_NULL_XXX references are not included in v3.7, correct? If not, and I want to keep backwards compatibility, everywhere there is a PETSC_NULL_OBJECT I would have to do:

#include <petscversion.h>
#if PETSC_VERSION_GE(3,8,0)
call MatNullSpaceCreate(comm, PETSC_TRUE, 0, PETSC_NULL_VEC, nullsp, ierr)
#else
call MatNullSpaceCreate(comm, PETSC_TRUE, 0, PETSC_NULL_OBJECT, nullsp, ierr)
#endif

If this is the only way to upgrade our codebase to use petsc v3.8, that's fine; I just wanted to verify that there wasn't another way which would be cleaner (e.g. perhaps creating an include file with all of the PETSC_NULL_XXX definitions which aren't in v3.7, and thus avoid many of these #if PETSC_VERSION statements).

> Yes, 3.8 is a dramatic shift in API (to one we feel is much better); it is not simple to write code that works with both versions. We recommend just using a 3.8 version once you verify it behaves as the previous version. Note that on any computer, including those at NERSC, if 3.8 is not installed by default you may install it yourself.
>
>    Barry
>
> > On Oct 30, 2017, at 12:18 PM, Randy Michael Churchill wrote:
> > [...]

--
R. Michael Churchill
From balay at mcs.anl.gov Mon Oct 30 13:16:06 2017
From: balay at mcs.anl.gov (Satish Balay)
Date: Mon, 30 Oct 2017 13:16:06 -0500
Subject: [petsc-users] Updating Fortran code to petsc 3.8
In-Reply-To: 
References: <832CB0BB-333F-4B38-AFCD-14E0CEC9E69D@mcs.anl.gov>
Message-ID: 

On Mon, 30 Oct 2017, Randy Michael Churchill wrote:
> [...]
> If this is the only way to upgrade our codebase to use petsc v3.8, that's fine; I just wanted to verify that there wasn't another way, which would be cleaner (e.g. perhaps creating an include file with all of the PETSC_NULL_XXX definitions which aren't in v3.7, and thus avoid many of these #if PETSC_VERSION statements).

Perhaps the following simple addition will keep the 3.7 compatibility [for this case]

#if PETSC_VERSION_LT(3,8,0)
#define PETSC_NULL_VEC PETSC_NULL_OBJECT
#define PETSC_NULL_MAT PETSC_NULL_OBJECT
#endif

Satish

From mlohry at gmail.com Mon Oct 30 13:58:50 2017
From: mlohry at gmail.com (Mark Lohry)
Date: Mon, 30 Oct 2017 14:58:50 -0400
Subject: [petsc-users] preconditioning matrix-free newton-krylov
In-Reply-To: <0BB37C70-80EB-4043-BB4F-4BF65BF7F8EB@mcs.anl.gov>
References: <1C4B04A3719F56479255009C095BE3B593D45932@CSGMBX214W.pu.win.princeton.edu> <9D8F0366-3B1E-461F-8126-ADD87FA38F65@mcs.anl.gov> <0BB37C70-80EB-4043-BB4F-4BF65BF7F8EB@mcs.anl.gov>
Message-ID: 

> Hmm, metis doesn't really have anything to do with the sparsity of the Jacobian does it?

No, I just mean I'm doing initial partitioning and parallel communication for the residual evaluations independently of petsc, and then doing a 1-to-1 mapping to the petsc solution vector. Along with manually setting the non-zero structure of the MPIAIJ system as in the user manual. I don't think there's anything wrong with the system structure, as it gives the same correct answer as the un-preconditioned matrix-free approach.

The exact system those MatColoring times came from has (100x100) blocks on the diagonals corresponding to the tetrahedral cells, with those having 4 neighbor blocks on the same row (or fewer for elements on boundaries).

On Mon, Oct 30, 2017 at 1:55 PM, Smith, Barry F.
wrote:
> [...]

From bsmith at mcs.anl.gov Mon Oct 30 14:02:39 2017
From: bsmith at mcs.anl.gov (Smith, Barry F.)
Date: Mon, 30 Oct 2017 19:02:39 +0000
Subject: [petsc-users] preconditioning matrix-free newton-krylov
In-Reply-To: 
References: 
Message-ID: 

> On Oct 30, 2017, at 1:58 PM, Mark Lohry wrote:
> [...]
> The exact system those MatColoring times came from has (100x100) blocks on the diagonals corresponding to the tetrahedral cells, with those having 4 neighbor blocks on the same row (or fewer for elements on boundaries).

   Hmm, are those blocks dense? If so you could benefit enormously from using BAIJ format.

   Matt,

   Sounds like performance bugs for the parallel coloring apply algorithms with big "diagonal blocks"

   Mark,

   Could you run with -ksp_view_mat binary and send the resulting file called binaryoutput and we can run the coloring codes local to performance debug.

   Barry

> On Mon, Oct 30, 2017 at 1:55 PM, Smith, Barry F. wrote:
> [...]
Then you use > > > > > > SNESSetJacobian(snes,J,J, SNESComputeJacobianDefaultColor, NULL); > > > > > > and use the options database option -snes_mf_operator > > > > > > > > > > Are there any PCs that don't work in the matrix-free context? > > > > > > If you do the above you can use almost all the PC since you are providing an explicit matrix from which to build the preconditioner > > > > > > > Are there any example codes I overlooked? > > > > > > > > Last but not least... can the Boomer-AMG preconditioner work with JFNK? To really show my ignorance of AMG, can it actually be written as a matrix P^-1(Ax-b)=0, , or is it just a linear operator? > > > > > > Again, if you provide an approximate Jacobian like above you can use it with BoomerAMG, if you provide NO explicit matrix you cannot use BoomerAMG or almost any other preconditioner. > > > > > > Barry > > > > > > > > > > > Thanks, > > > > Mark > > > > > > > > > > > > > > > From mlohry at gmail.com Mon Oct 30 14:23:04 2017 From: mlohry at gmail.com (Mark Lohry) Date: Mon, 30 Oct 2017 15:23:04 -0400 Subject: [petsc-users] preconditioning matrix-free newton-krylov In-Reply-To: References: <1C4B04A3719F56479255009C095BE3B593D45932@CSGMBX214W.pu.win.princeton.edu> <9D8F0366-3B1E-461F-8126-ADD87FA38F65@mcs.anl.gov> <0BB37C70-80EB-4043-BB4F-4BF65BF7F8EB@mcs.anl.gov> Message-ID: > > > Hmm, are those blocks dense? If so you could benefit enormously from > using BAIJ format. Yes they're dense blocks. Usually coupled compressible 3D NS with DG elements, 5 equations x order (N+1)*(N+2)*(N+3)/3 block size. So block sizes of 50^2 to 175^2 are typical. I'll try BAIJ; I initially set it up with AIJ as it seemed better supported in parallel on the linear solver table, but I suppose these are rather large blocks... still surprising performance as this was overall a pretty small system (1,536 elements/diagonal 100^2 blocks). Could you run with -ksp_view_mat binary and send the resulting file called > binaryoutput and we can run the coloring codes local to performance debug. Will send this evening. On Mon, Oct 30, 2017 at 3:02 PM, Smith, Barry F. wrote: > > > On Oct 30, 2017, at 1:58 PM, Mark Lohry wrote: > > > > Hmm, metis doesn't really have anything to do with the sparsity of the > Jacobian does it? > > > > No, I just mean I'm doing initial partitioning and parallel > communication for the residual evaluations independently of petsc, and then > doing a 1-to-1 mapping to the petsc solution vector. Along with manually > setting the non-zero structure of the MPIAIJ system as in the user manual. > I don't think there's anything wrong with the system structure as it gives > the same correct answer as the un-preconditioned matrix-free approach. > > > > The exact system those MatColoring times came from has size (100x100) > blocks on the diagonals corresponding to the tetrahedral cells, with those > having 4 neighbor blocks on the same row (or fewer for elements on > boundaries.) > > Hmm, are those blocks dense? If so you could benefit enormously from > using BAIJ format. > > Matt, > > Sounds like performance bugs for the parallel coloring apply > algorithms with big "diagonal blocks" > > Mark, > > Could you run with -ksp_view_mat binary and send the resulting file > called binaryoutput and we can run the coloring codes local to performance > debug. > > > Barry > > > > > On Mon, Oct 30, 2017 at 1:55 PM, Smith, Barry F. 
> wrote: > > > > > On Oct 30, 2017, at 12:39 PM, Mark Lohry wrote: > > > > > > > > > > > > > > 3) Are there any hooks analogous to KSPSetPreSolve/PostSolve for the > FD computation of the jacobians, or for the computation of the > preconditioner? I'd like to get a handle on the relative costs of these. > > > > > > No, do you just want the time? You can get that from the logging; > for example -log_view > > > > > > Yes, was just thinking in regards to your suggestion of recomputing > when the number of linear iterations gets too high; I assume it's the ratio > of preconditioner cost vs linear solver cost at runtime that's the metric > of interest, and not the absolute value of either. But I'll cross that > bridge when I come to it. > > > > > > When I had asked, I was looking to see where a long pause was > happening thinking it was the FD jacobian; turned out to be before that in > MatColoringApply which seems surprisingly expensive. MATCOLORINGJP took ~15 > minutes on 32 cores on a small 153,000^2 system, with MATCOLORINGGREEDY > taking 30 seconds. Any guidance there, or is this expected? I'm not using > DM, just manually entering the sparsity resulting from a metis > decomposition of a tetrahedral mesh. > > > > Hmm, metis doesn't really have anything to do with the sparsity of > the Jacobian does it? > > > > Matt, > > > > These times are huge. What is going on? > > > > Barry > > > > > > > > > > > Thanks for the info on the lag logic, I'll play with the TS pre/post > calls for the time-accurate problems and only use LagJacobian. > > > > > > On Mon, Oct 30, 2017 at 11:29 AM, Smith, Barry F. > wrote: > > > > > > > On Oct 29, 2017, at 11:50 AM, Mark Lohry wrote: > > > > > > > > Thanks again Barry, I've got the preconditioners hooked up with > -snes_mf_operator and at least AMG looks to be working great on a high > order unstructured DG problem. > > > > > > > > Couple questions on the SNESSetLagJacobian + > SNESSetLagPreconditioner code flow: > > > > > > > > 1) With -snes_mf_operator, and given SNESSetLagJacobian(snes, 1) > (default) and SNESSetLagPreconditioner(snes, 2), after the first KSP solve > in a newton iteration, will it do the finite different jacobian > calculation? Or will the Jacobian only be computed when the preconditioner > lag setting demands it on the 3rd newton step? I suspect it's the latter > based on where I see the code pause. > > > > > > SNES with -snes_mf_operator will ALWAYS use the matrix-free finite > difference f(x+h) - f(x) to apply the matrix vector product. > > > > > > The LagJacobian and LagPreconditioner are not coordinated. The > first determines how often the Jacobian used for preconditioning is > recomputed and the second determines how often the preconditioner is > recomputed. > > > > > > If you are using -snes_mf_operator then it never makes sense to > have lagJacobian < lagPreconditioner since it would recompute the Jacobian > but not actually use it. It also makes no sense for lagPreconditioner < > lagJacobian because you'd be recomputing the preconditioner on the same > Jacobian. > > > > > > But actually if you don't change the Jacobian used in building the > preconditioner then when it tries to recompute the preconditioner it > determines the matrix has not changed so skips rebuilding the > preconditioner. So when using -snes_mf_operator there is really no reason > generally to set the preconditioner lag. > > > > > > > > 2) How do implicit TS and SNESSetLagPreconditioner/Persists > interact? 
Does the counter since-last-preconditioner-compute reset with > time steps, or does that lag counter just increment with every SNES solve > regardless of how many nonlinear solves might have happened in a given > timestep? Say lag preconditioner is 2, and a time stepper uses 3, 2, and 3 > nonlinear solves on 3 steps, is the flow > > > > > > > > (time step 1)->(update preconditioner)->(snes solve)->(snes > solve)->(update preconditioner)->(snes solve) > > > > (time step 2)->(snes solve)->(update preconditioner)->(snes solve) > > > > (time step 3)->(snes solve)->(update preconditioner)->(snes > solve)->(snes solve) > > > > > > > > or > > > > > > > > (time step 1)->(update preconditioner)->(snes solve)->(snes > solve)->(update preconditioner)->(snes solve) > > > > (time step 2)->(update preconditioner)->(snes solve)->(snes solve) > > > > (time step 3)->(update preconditioner)->(snes solve)->(snes > solve)->(update preconditioner)->(snes solve) > > > > > > > > ? > > > > > > > > I think for implicit time stepping I'd probably want the > preconditioner to be recomputed just once at the beginning of each time > step, or some multiple of that. Does that sound reasonable? > > > > > > Yes, what you want to do is completely reasonable. > > > > > > You can use SNESSetLagJacobian() and SNESSetLagJacobianPersists() > in combination to have the Jacobian recomputed ever fixed number of times; > if you set the persists flag and set LagJacobian to 10 it will recompute > the Jacobian used in the preconditioner every 10th time a new Jacobian is > needed. > > > > > > If you want to compute the new Jacobian used to build the > preconditioner once at the beginning of each new TS stage you can set > SNESSetLagJacobian() to negative -2 in the TS prestage call. There are > possibly other tricks you can do by setting the two flags at different > locations. > > > > > > An alternative to hardwiring how often the Jacobian used to build > the preconditioner is rebuilt is to rebuild based on when the > preconditioner starts "working less well". Here you could put an additional > KSPMonitor or SNESMonitor that detects if the number of linear iterations > is above a certain amount and then sets the recompute Jacobian flag to -2 > so that for the next solve it recreates the Jacobian used in building the > preconditioner. > > > > > > > > > > > > > > 3) Are there any hooks analogous to KSPSetPreSolve/PostSolve for the > FD computation of the jacobians, or for the computation of the > preconditioner? I'd like to get a handle on the relative costs of these. > > > > > > No, do you just want the time? You can get that from the logging; > for example -log_view > > > > > > > > > > > > > > > Best, > > > > Mark > > > > > > > > On Sat, Sep 23, 2017 at 3:28 PM, Mark Lohry > wrote: > > > > Great, thanks Barry. > > > > > > > > On Sat, Sep 23, 2017 at 3:12 PM, Barry Smith > wrote: > > > > > > > > > On Sep 23, 2017, at 12:48 PM, Mark W. Lohry > wrote: > > > > > > > > > > I'm currently using JFNK in an application where I don't have a > hand-coded jacobian, and it's working well enough but as expected the > scaling isn't great. > > > > > > > > > > What is the general process for using PC with > MatMFFDComputeJacobian? Does it make sense to occasionally have petsc > re-compute the jacobian via finite differences, and then recompute the > preconditioner? Any that just need the sparsity structure? > > > > > > > > Mark > > > > > > > > Yes, this is a common approach. 
SNESSetLagJacobian > -snes_lag_jacobian > > > > > > > > The normal approach in SNES to use matrix-free for the operator > and use finite differences to compute an approximate Jacobian used to > construct preconditioners is to to create a sparse matrix with the sparsity > of the approximate Jacobian (yes you need a way to figure out the sparsity, > if you use DMDA it will figure out the sparsity for you). Then you use > > > > > > > > SNESSetJacobian(snes,J,J, SNESComputeJacobianDefaultColor, NULL); > > > > > > > > and use the options database option -snes_mf_operator > > > > > > > > > > > > > Are there any PCs that don't work in the matrix-free context? > > > > > > > > If you do the above you can use almost all the PC since you are > providing an explicit matrix from which to build the preconditioner > > > > > > > > > Are there any example codes I overlooked? > > > > > > > > > > Last but not least... can the Boomer-AMG preconditioner work with > JFNK? To really show my ignorance of AMG, can it actually be written as a > matrix P^-1(Ax-b)=0, , or is it just a linear operator? > > > > > > > > Again, if you provide an approximate Jacobian like above you can > use it with BoomerAMG, if you provide NO explicit matrix you cannot use > BoomerAMG or almost any other preconditioner. > > > > > > > > Barry > > > > > > > > > > > > > > Thanks, > > > > > Mark > > > > > > > > > > > > > > > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Mon Oct 30 14:28:30 2017 From: bsmith at mcs.anl.gov (Smith, Barry F.) Date: Mon, 30 Oct 2017 19:28:30 +0000 Subject: [petsc-users] preconditioning matrix-free newton-krylov In-Reply-To: References: <1C4B04A3719F56479255009C095BE3B593D45932@CSGMBX214W.pu.win.princeton.edu> <9D8F0366-3B1E-461F-8126-ADD87FA38F65@mcs.anl.gov> <0BB37C70-80EB-4043-BB4F-4BF65BF7F8EB@mcs.anl.gov> Message-ID: <22616605-EB8A-4ED6-BDCD-7CC1C805AFEB@mcs.anl.gov> > On Oct 30, 2017, at 2:23 PM, Mark Lohry wrote: > > > Hmm, are those blocks dense? If so you could benefit enormously from using BAIJ format. > > > Yes they're dense blocks. Usually coupled compressible 3D NS with DG elements, 5 equations x order (N+1)*(N+2)*(N+3)/3 block size. So block sizes of 50^2 to 175^2 are typical. I'll try BAIJ; I initially set it up with AIJ as it seemed better supported in parallel on the linear solver table, but I suppose these are rather large blocks... still surprising performance as this was overall a pretty small system (1,536 elements/diagonal 100^2 blocks). Something is really wrong to get those huge times. > > > Could you run with -ksp_view_mat binary and send the resulting file called binaryoutput and we can run the coloring codes local to performance debug. > > > Will send this evening. Thanks but send for the AIJ case, not BAIJ > > > On Mon, Oct 30, 2017 at 3:02 PM, Smith, Barry F. wrote: > > > On Oct 30, 2017, at 1:58 PM, Mark Lohry wrote: > > > > Hmm, metis doesn't really have anything to do with the sparsity of the Jacobian does it? > > > > No, I just mean I'm doing initial partitioning and parallel communication for the residual evaluations independently of petsc, and then doing a 1-to-1 mapping to the petsc solution vector. Along with manually setting the non-zero structure of the MPIAIJ system as in the user manual. I don't think there's anything wrong with the system structure as it gives the same correct answer as the un-preconditioned matrix-free approach. 
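As a rough illustration of the BAIJ format being discussed, here is a minimal sketch assuming dense 100x100 blocks, one diagonal block per tetrahedral cell and at most 4 dense neighbor blocks per block row; the block size, local cell count, and preallocation counts below are illustrative stand-ins, not values taken from the actual application:

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat      A;
  PetscInt bs     = 100; /* dense block size (hypothetical) */
  PetscInt ncells = 48;  /* cells owned by this rank (hypothetical) */

  PetscInitialize(&argc, &argv, NULL, NULL);
  /* Preallocate at the block level: each block row has its diagonal block
     plus up to 4 neighbor blocks. Here we crudely allow all 4 neighbors in
     either the diagonal or off-diagonal portion; exact per-row counts could
     instead be passed through the d_nnz/o_nnz arrays. */
  MatCreateBAIJ(PETSC_COMM_WORLD, bs,
                bs*ncells, bs*ncells,             /* local rows, cols */
                PETSC_DETERMINE, PETSC_DETERMINE, /* global sizes */
                5, NULL, 4, NULL, &A);
  /* fill with MatSetValuesBlocked(), one dense bs x bs block at a time */
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
  MatDestroy(&A);
  PetscFinalize();
  return 0;
}

The point of BAIJ with blocks this size is that column indices are stored once per block rather than once per scalar entry, and the matrix kernels operate on dense blocks, which is usually a large win over AIJ.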
> > > > The exact system those MatColoring times came from has size (100x100) blocks on the diagonals corresponding to the tetrahedral cells, with those having 4 neighbor blocks on the same row (or fewer for elements on boundaries.) > > Hmm, are those blocks dense? If so you could benefit enormously from using BAIJ format. > > Matt, > > Sounds like performance bugs for the parallel coloring apply algorithms with big "diagonal blocks" > > Mark, > > Could you run with -ksp_view_mat binary and send the resulting file called binaryoutput and we can run the coloring codes local to performance debug. > > > Barry > > > > > On Mon, Oct 30, 2017 at 1:55 PM, Smith, Barry F. wrote: > > > > > On Oct 30, 2017, at 12:39 PM, Mark Lohry wrote: > > > > > > > > > > > > > > 3) Are there any hooks analogous to KSPSetPreSolve/PostSolve for the FD computation of the jacobians, or for the computation of the preconditioner? I'd like to get a handle on the relative costs of these. > > > > > > No, do you just want the time? You can get that from the logging; for example -log_view > > > > > > Yes, was just thinking in regards to your suggestion of recomputing when the number of linear iterations gets too high; I assume it's the ratio of preconditioner cost vs linear solver cost at runtime that's the metric of interest, and not the absolute value of either. But I'll cross that bridge when I come to it. > > > > > > When I had asked, I was looking to see where a long pause was happening thinking it was the FD jacobian; turned out to be before that in MatColoringApply which seems surprisingly expensive. MATCOLORINGJP took ~15 minutes on 32 cores on a small 153,000^2 system, with MATCOLORINGGREEDY taking 30 seconds. Any guidance there, or is this expected? I'm not using DM, just manually entering the sparsity resulting from a metis decomposition of a tetrahedral mesh. > > > > Hmm, metis doesn't really have anything to do with the sparsity of the Jacobian does it? > > > > Matt, > > > > These times are huge. What is going on? > > > > Barry > > > > > > > > > > > Thanks for the info on the lag logic, I'll play with the TS pre/post calls for the time-accurate problems and only use LagJacobian. > > > > > > On Mon, Oct 30, 2017 at 11:29 AM, Smith, Barry F. wrote: > > > > > > > On Oct 29, 2017, at 11:50 AM, Mark Lohry wrote: > > > > > > > > Thanks again Barry, I've got the preconditioners hooked up with -snes_mf_operator and at least AMG looks to be working great on a high order unstructured DG problem. > > > > > > > > Couple questions on the SNESSetLagJacobian + SNESSetLagPreconditioner code flow: > > > > > > > > 1) With -snes_mf_operator, and given SNESSetLagJacobian(snes, 1) (default) and SNESSetLagPreconditioner(snes, 2), after the first KSP solve in a newton iteration, will it do the finite different jacobian calculation? Or will the Jacobian only be computed when the preconditioner lag setting demands it on the 3rd newton step? I suspect it's the latter based on where I see the code pause. > > > > > > SNES with -snes_mf_operator will ALWAYS use the matrix-free finite difference f(x+h) - f(x) to apply the matrix vector product. > > > > > > The LagJacobian and LagPreconditioner are not coordinated. The first determines how often the Jacobian used for preconditioning is recomputed and the second determines how often the preconditioner is recomputed. > > > > > > If you are using -snes_mf_operator then it never makes sense to have lagJacobian < lagPreconditioner since it would recompute the Jacobian but not actually use it. 
It also makes no sense for lagPreconditioner < lagJacobian because you'd be recomputing the preconditioner on the same Jacobian. > > > > > > But actually if you don't change the Jacobian used in building the preconditioner then when it tries to recompute the preconditioner it determines the matrix has not changed so skips rebuilding the preconditioner. So when using -snes_mf_operator there is really no reason generally to set the preconditioner lag. > > > > > > > > 2) How do implicit TS and SNESSetLagPreconditioner/Persists interact? Does the counter since-last-preconditioner-compute reset with time steps, or does that lag counter just increment with every SNES solve regardless of how many nonlinear solves might have happened in a given timestep? Say lag preconditioner is 2, and a time stepper uses 3, 2, and 3 nonlinear solves on 3 steps, is the flow > > > > > > > > (time step 1)->(update preconditioner)->(snes solve)->(snes solve)->(update preconditioner)->(snes solve) > > > > (time step 2)->(snes solve)->(update preconditioner)->(snes solve) > > > > (time step 3)->(snes solve)->(update preconditioner)->(snes solve)->(snes solve) > > > > > > > > or > > > > > > > > (time step 1)->(update preconditioner)->(snes solve)->(snes solve)->(update preconditioner)->(snes solve) > > > > (time step 2)->(update preconditioner)->(snes solve)->(snes solve) > > > > (time step 3)->(update preconditioner)->(snes solve)->(snes solve)->(update preconditioner)->(snes solve) > > > > > > > > ? > > > > > > > > I think for implicit time stepping I'd probably want the preconditioner to be recomputed just once at the beginning of each time step, or some multiple of that. Does that sound reasonable? > > > > > > Yes, what you want to do is completely reasonable. > > > > > > You can use SNESSetLagJacobian() and SNESSetLagJacobianPersists() in combination to have the Jacobian recomputed ever fixed number of times; if you set the persists flag and set LagJacobian to 10 it will recompute the Jacobian used in the preconditioner every 10th time a new Jacobian is needed. > > > > > > If you want to compute the new Jacobian used to build the preconditioner once at the beginning of each new TS stage you can set SNESSetLagJacobian() to negative -2 in the TS prestage call. There are possibly other tricks you can do by setting the two flags at different locations. > > > > > > An alternative to hardwiring how often the Jacobian used to build the preconditioner is rebuilt is to rebuild based on when the preconditioner starts "working less well". Here you could put an additional KSPMonitor or SNESMonitor that detects if the number of linear iterations is above a certain amount and then sets the recompute Jacobian flag to -2 so that for the next solve it recreates the Jacobian used in building the preconditioner. > > > > > > > > > > > > > > 3) Are there any hooks analogous to KSPSetPreSolve/PostSolve for the FD computation of the jacobians, or for the computation of the preconditioner? I'd like to get a handle on the relative costs of these. > > > > > > No, do you just want the time? You can get that from the logging; for example -log_view > > > > > > > > > > > > > > > Best, > > > > Mark > > > > > > > > On Sat, Sep 23, 2017 at 3:28 PM, Mark Lohry wrote: > > > > Great, thanks Barry. > > > > > > > > On Sat, Sep 23, 2017 at 3:12 PM, Barry Smith wrote: > > > > > > > > > On Sep 23, 2017, at 12:48 PM, Mark W. 
Lohry wrote: > > > > > > > > > > I'm currently using JFNK in an application where I don't have a hand-coded jacobian, and it's working well enough but as expected the scaling isn't great. > > > > > > > > > > What is the general process for using PC with MatMFFDComputeJacobian? Does it make sense to occasionally have petsc re-compute the jacobian via finite differences, and then recompute the preconditioner? Any that just need the sparsity structure? > > > > > > > > Mark > > > > > > > > Yes, this is a common approach. SNESSetLagJacobian -snes_lag_jacobian > > > > > > > > The normal approach in SNES to use matrix-free for the operator and use finite differences to compute an approximate Jacobian used to construct preconditioners is to to create a sparse matrix with the sparsity of the approximate Jacobian (yes you need a way to figure out the sparsity, if you use DMDA it will figure out the sparsity for you). Then you use > > > > > > > > SNESSetJacobian(snes,J,J, SNESComputeJacobianDefaultColor, NULL); > > > > > > > > and use the options database option -snes_mf_operator > > > > > > > > > > > > > Are there any PCs that don't work in the matrix-free context? > > > > > > > > If you do the above you can use almost all the PC since you are providing an explicit matrix from which to build the preconditioner > > > > > > > > > Are there any example codes I overlooked? > > > > > > > > > > Last but not least... can the Boomer-AMG preconditioner work with JFNK? To really show my ignorance of AMG, can it actually be written as a matrix P^-1(Ax-b)=0, , or is it just a linear operator? > > > > > > > > Again, if you provide an approximate Jacobian like above you can use it with BoomerAMG, if you provide NO explicit matrix you cannot use BoomerAMG or almost any other preconditioner. > > > > > > > > Barry > > > > > > > > > > > > > > Thanks, > > > > > Mark > > > > > > > > > > > > > > > > > > > > > > > > From balay at mcs.anl.gov Mon Oct 30 18:14:54 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 30 Oct 2017 18:14:54 -0500 Subject: [petsc-users] configuration error In-Reply-To: <07C87D43-41E1-42D1-940A-F85BECFE9A8E@gmail.com> References: <07C87D43-41E1-42D1-940A-F85BECFE9A8E@gmail.com> Message-ID: --prefix=/Users/manav/Documents/codes/numerical_lib/petsc/petsc-3.8.0/../ You have a strange prefix. You are basically using: --prefix=/Users/manav/Documents/codes/numerical_lib/petsc The general convention is to use a different prefix for different versions of libraries [or different type of builds] Can you redo a clean build [i.e rm -rf PETSC_ARCH] into a clean prefix location? Satish On Mon, 30 Oct 2017, Manav Bhatia wrote: > I am not getting errors with mumps (please see below). > Interestingly, I just compiled this on another machine with clang-3.8 and gfortran-6.7 without problems. > > -Manav > > mumps_c.c:307:53: error: no member named 'nnz' in 'CMUMPS_STRUC_C'; did you mean 'nz'? > mumps_par->n=0; mumps_par->nz=0; mumps_par->nnz=0; mumps_par->nz_loc=0; mumps_par->nnz_loc=0; mumps_par->nelt=0;mumps_par->instance_number=0;mumps_par->deficiency=0;mumps_par->lwk_user=0;mumps_par->size_schur=0;mumps_par->lrhs=0; mumps_par->lredrhs=0; mumps_par->nrhs=0; mumps_par->nz_rhs=0; mumps_par->lsol_loc=0; > ^~~ > nz > /Users/manav/Documents/codes/numerical_lib/petsc/include/cmumps_c.h:56:20: note: 'nz' declared here > MUMPS_INT nz; > ^ > mumps_c.c:307:92: error: no member named 'nnz_loc' in 'CMUMPS_STRUC_C'; did you mean 'nz_loc'? 
> mumps_par->n=0; mumps_par->nz=0; mumps_par->nnz=0; mumps_par->nz_loc=0; mumps_par->nnz_loc=0; mumps_par->nelt=0;mumps_par->instance_number=0;mumps_par->deficiency=0;mumps_par->lwk_user=0;mumps_par->size_schur=0;mumps_par->lrhs=0; mumps_par->lredrhs=0; mumps_par->nrhs=0; mumps_par->nz_rhs=0; mumps_par->lsol_loc=0; > ^~~~~~~ > nz_loc > /Users/manav/Documents/codes/numerical_lib/petsc/include/cmumps_c.h:62:20: note: 'nz_loc' declared here > MUMPS_INT nz_loc; > ^ > mumps_c.c:419:42: error: no member named 'nnz' in 'CMUMPS_STRUC_C'; did you mean 'nz'? > &(mumps_par->nz), &(mumps_par->nnz), irn, &irn_avail, jcn, &jcn_avail, a, &a_avail, > ^~~ > nz > /Users/manav/Documents/codes/numerical_lib/petsc/include/cmumps_c.h:56:20: note: 'nz' declared here > MUMPS_INT nz; > ^ > mumps_c.c:420:46: error: no member named 'nnz_loc' in 'CMUMPS_STRUC_C'; did you mean 'nz_loc'? > &(mumps_par->nz_loc), &(mumps_par->nnz_loc), irn_loc, &irn_loc_avail, jcn_loc, &jcn_loc_avail, > ^~~~~~~ > nz_loc > /Users/manav/Documents/codes/numerical_lib/petsc/include/cmumps_c.h:62:20: note: 'nz_loc' declared here > MUMPS_INT nz_loc; > ^ > mumps_c.c:419:29: warning: incompatible pointer types passing 'int *' to parameter of type 'int64_t *' (aka 'long long *') [-Wincompatible-pointer-types] > &(mumps_par->nz), &(mumps_par->nnz), irn, &irn_avail, jcn, &jcn_avail, a, &a_avail, > ^~~~~~~~~~~~~~~~~ > mumps_c.c:99:28: note: passing argument to parameter 'nnz' here > MUMPS_INT8 *nnz, > ^ > mumps_c.c:420:33: warning: incompatible pointer types passing 'int *' to parameter of type 'int64_t *' (aka 'long long *') [-Wincompatible-pointer-types] > &(mumps_par->nz_loc), &(mumps_par->nnz_loc), irn_loc, &irn_loc_avail, jcn_loc, &jcn_loc_avail, > ^~~~~~~~~~~~~~~~~~~~~ > mumps_c.c:107:28: note: passing argument to parameter 'nnz_loc' here > MUMPS_INT8 *nnz_loc, > ^ > 2 warnings and 4 errors generated. > > > > > On Oct 30, 2017, at 11:36 AM, Kong, Fande wrote: > > > > We had exactly the same issue when upgraded compilers. I guess this is somehow related to gfortran. A simple way to work around for us is to change if with_rpath: to if False at line 54 of config/BuildSystem/config/libraries.py. > > > > Not sure if it works for you. > > > > Fande, > > > > > > > > > > On Mon, Oct 30, 2017 at 10:14 AM, Manav Bhatia > wrote: > > Hi, > > > > I am trying to install pets 3.8 on a new MacBook machine with OS 10.13. I have installed openmpi from macports and I am getting this error on configuration. Attached is also the configure.log file. > > > > I am not sure how to proceed with this. Any advice will be greatly appreciated! 
> Regards,
> Manav
>
> ===============================================================================
>              Configuring PETSc to compile on your system
> ===============================================================================
> ***** WARNING: Using default optimization C flags -g -O3
>       You might consider manually setting optimal optimization flags for your
>       system with COPTFLAGS="optimization flags" see config/examples/arch-*-opt.py for examples
> ===============================================================================
> ***** WARNING: Using default C++ optimization flags -g -O3
>       You might consider manually setting optimal optimization flags for your
>       system with CXXOPTFLAGS="optimization flags" see config/examples/arch-*-opt.py for examples
> ===============================================================================
> ***** WARNING: Using default FORTRAN optimization flags -g -O
>       You might consider manually setting optimal optimization flags for your
>       system with FOPTFLAGS="optimization flags" see config/examples/arch-*-opt.py for examples
> ===============================================================================
> WARNING! Compiling PETSc with no debugging, this should only be done for
>          timing and production runs. All development should be done when
>          configured using --with-debugging=1
> ===============================================================================
> TESTING: checkCLibraries from config.compilers(config/BuildSystem/config/compilers.py:171)
> *******************************************************************************
>          UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
> -------------------------------------------------------------------------------
> C libraries cannot directly be used from Fortran
> *******************************************************************************

From mlohry at gmail.com Mon Oct 30 18:38:07 2017
From: mlohry at gmail.com (Mark Lohry)
Date: Mon, 30 Oct 2017 19:38:07 -0400
Subject: [petsc-users] preconditioning matrix-free newton-krylov
In-Reply-To: <22616605-EB8A-4ED6-BDCD-7CC1C805AFEB@mcs.anl.gov>
References: <1C4B04A3719F56479255009C095BE3B593D45932@CSGMBX214W.pu.win.princeton.edu> <9D8F0366-3B1E-461F-8126-ADD87FA38F65@mcs.anl.gov> <0BB37C70-80EB-4043-BB4F-4BF65BF7F8EB@mcs.anl.gov> <22616605-EB8A-4ED6-BDCD-7CC1C805AFEB@mcs.anl.gov>
Message-ID: 

Sparsity pattern binary AIJ gzipped here, should have 100^2 blocks of all 1's indicating the non-zero positions:

https://github.com/mlohry/petsc_miscellany/blob/master/jacobian_sparsity.dat.gz

Using 32 processes on 2x16-core AMD 6274, timing for MatColoringApply is ~877 seconds/~15 minutes for MATCOLORINGJP, ~30 seconds for MATCOLORINGGREEDY. Let me know if you can't reproduce this and I'll put together a MWE.

MATCOLORINGID and MATCOLORINGSL both seem to work in 9 seconds, though documentation says they're both serial?

Regarding BAIJ, neither JP nor GREEDY work, throwing "Matrix must be AIJ for greedy coloring", so am I out of luck for using MPIBAIJ?

On Mon, Oct 30, 2017 at 3:28 PM, Smith, Barry F. wrote:
>
> > On Oct 30, 2017, at 2:23 PM, Mark Lohry wrote:
> >
> > Hmm, are those blocks dense?
If so you could benefit enormously from > using BAIJ format. > > > > > > Yes they're dense blocks. Usually coupled compressible 3D NS with DG > elements, 5 equations x order (N+1)*(N+2)*(N+3)/3 block size. So block > sizes of 50^2 to 175^2 are typical. I'll try BAIJ; I initially set it up > with AIJ as it seemed better supported in parallel on the linear solver > table, but I suppose these are rather large blocks... still surprising > performance as this was overall a pretty small system (1,536 > elements/diagonal 100^2 blocks). > > Something is really wrong to get those huge times. > > > > > > Could you run with -ksp_view_mat binary and send the resulting file > called binaryoutput and we can run the coloring codes local to performance > debug. > > > > > > Will send this evening. > > Thanks but send for the AIJ case, not BAIJ > > > > > > > On Mon, Oct 30, 2017 at 3:02 PM, Smith, Barry F. > wrote: > > > > > On Oct 30, 2017, at 1:58 PM, Mark Lohry wrote: > > > > > > Hmm, metis doesn't really have anything to do with the sparsity of the > Jacobian does it? > > > > > > No, I just mean I'm doing initial partitioning and parallel > communication for the residual evaluations independently of petsc, and then > doing a 1-to-1 mapping to the petsc solution vector. Along with manually > setting the non-zero structure of the MPIAIJ system as in the user manual. > I don't think there's anything wrong with the system structure as it gives > the same correct answer as the un-preconditioned matrix-free approach. > > > > > > The exact system those MatColoring times came from has size (100x100) > blocks on the diagonals corresponding to the tetrahedral cells, with those > having 4 neighbor blocks on the same row (or fewer for elements on > boundaries.) > > > > Hmm, are those blocks dense? If so you could benefit enormously from > using BAIJ format. > > > > Matt, > > > > Sounds like performance bugs for the parallel coloring apply > algorithms with big "diagonal blocks" > > > > Mark, > > > > Could you run with -ksp_view_mat binary and send the resulting file > called binaryoutput and we can run the coloring codes local to performance > debug. > > > > > > Barry > > > > > > > > On Mon, Oct 30, 2017 at 1:55 PM, Smith, Barry F. > wrote: > > > > > > > On Oct 30, 2017, at 12:39 PM, Mark Lohry wrote: > > > > > > > > > > > > > > > > > > 3) Are there any hooks analogous to KSPSetPreSolve/PostSolve for > the FD computation of the jacobians, or for the computation of the > preconditioner? I'd like to get a handle on the relative costs of these. > > > > > > > > No, do you just want the time? You can get that from the logging; > for example -log_view > > > > > > > > Yes, was just thinking in regards to your suggestion of recomputing > when the number of linear iterations gets too high; I assume it's the ratio > of preconditioner cost vs linear solver cost at runtime that's the metric > of interest, and not the absolute value of either. But I'll cross that > bridge when I come to it. > > > > > > > > When I had asked, I was looking to see where a long pause was > happening thinking it was the FD jacobian; turned out to be before that in > MatColoringApply which seems surprisingly expensive. MATCOLORINGJP took ~15 > minutes on 32 cores on a small 153,000^2 system, with MATCOLORINGGREEDY > taking 30 seconds. Any guidance there, or is this expected? I'm not using > DM, just manually entering the sparsity resulting from a metis > decomposition of a tetrahedral mesh. 
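For context, selecting and applying a coloring through the API (rather than with -mat_coloring_type on the command line) looks roughly like the following sketch; J here stands for an assembled AIJ matrix holding the Jacobian sparsity, and is a placeholder name rather than a variable from the code being discussed:

MatColoring   mc;
ISColoring    iscoloring;
MatFDColoring fdcoloring;

/* J is an assembled AIJ Mat with the Jacobian sparsity pattern */
MatColoringCreate(J, &mc);
MatColoringSetType(mc, MATCOLORINGGREEDY); /* or MATCOLORINGJP, MATCOLORINGSL, ... */
MatColoringSetFromOptions(mc);             /* lets -mat_coloring_type override */
MatColoringApply(mc, &iscoloring);
MatColoringDestroy(&mc);

/* hand the coloring to the finite-difference Jacobian machinery */
MatFDColoringCreate(J, iscoloring, &fdcoloring);
MatFDColoringSetFromOptions(fdcoloring);
MatFDColoringSetUp(J, iscoloring, fdcoloring);
ISColoringDestroy(&iscoloring);

Timing just the MatColoringApply() call, for example by bracketing it with PetscTime() or running under -log_view, is one way to confirm where the long pause reported above is actually spent.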
> > > > > > Hmm, metis doesn't really have anything to do with the sparsity of > the Jacobian does it? > > > > > > Matt, > > > > > > These times are huge. What is going on? > > > > > > Barry > > > > > > > > > > > > > > > Thanks for the info on the lag logic, I'll play with the TS pre/post > calls for the time-accurate problems and only use LagJacobian. > > > > > > > > On Mon, Oct 30, 2017 at 11:29 AM, Smith, Barry F. < > bsmith at mcs.anl.gov> wrote: > > > > > > > > > On Oct 29, 2017, at 11:50 AM, Mark Lohry wrote: > > > > > > > > > > Thanks again Barry, I've got the preconditioners hooked up with > -snes_mf_operator and at least AMG looks to be working great on a high > order unstructured DG problem. > > > > > > > > > > Couple questions on the SNESSetLagJacobian + > SNESSetLagPreconditioner code flow: > > > > > > > > > > 1) With -snes_mf_operator, and given SNESSetLagJacobian(snes, 1) > (default) and SNESSetLagPreconditioner(snes, 2), after the first KSP solve > in a newton iteration, will it do the finite different jacobian > calculation? Or will the Jacobian only be computed when the preconditioner > lag setting demands it on the 3rd newton step? I suspect it's the latter > based on where I see the code pause. > > > > > > > > SNES with -snes_mf_operator will ALWAYS use the matrix-free > finite difference f(x+h) - f(x) to apply the matrix vector product. > > > > > > > > The LagJacobian and LagPreconditioner are not coordinated. The > first determines how often the Jacobian used for preconditioning is > recomputed and the second determines how often the preconditioner is > recomputed. > > > > > > > > If you are using -snes_mf_operator then it never makes sense to > have lagJacobian < lagPreconditioner since it would recompute the Jacobian > but not actually use it. It also makes no sense for lagPreconditioner < > lagJacobian because you'd be recomputing the preconditioner on the same > Jacobian. > > > > > > > > But actually if you don't change the Jacobian used in building the > preconditioner then when it tries to recompute the preconditioner it > determines the matrix has not changed so skips rebuilding the > preconditioner. So when using -snes_mf_operator there is really no reason > generally to set the preconditioner lag. > > > > > > > > > > 2) How do implicit TS and SNESSetLagPreconditioner/Persists > interact? Does the counter since-last-preconditioner-compute reset with > time steps, or does that lag counter just increment with every SNES solve > regardless of how many nonlinear solves might have happened in a given > timestep? Say lag preconditioner is 2, and a time stepper uses 3, 2, and 3 > nonlinear solves on 3 steps, is the flow > > > > > > > > > > (time step 1)->(update preconditioner)->(snes solve)->(snes > solve)->(update preconditioner)->(snes solve) > > > > > (time step 2)->(snes solve)->(update preconditioner)->(snes solve) > > > > > (time step 3)->(snes solve)->(update preconditioner)->(snes > solve)->(snes solve) > > > > > > > > > > or > > > > > > > > > > (time step 1)->(update preconditioner)->(snes solve)->(snes > solve)->(update preconditioner)->(snes solve) > > > > > (time step 2)->(update preconditioner)->(snes solve)->(snes solve) > > > > > (time step 3)->(update preconditioner)->(snes solve)->(snes > solve)->(update preconditioner)->(snes solve) > > > > > > > > > > ? > > > > > > > > > > I think for implicit time stepping I'd probably want the > preconditioner to be recomputed just once at the beginning of each time > step, or some multiple of that. 
Does that sound reasonable? > > > > > > > > Yes, what you want to do is completely reasonable. > > > > > > > > You can use SNESSetLagJacobian() and > SNESSetLagJacobianPersists() in combination to have the Jacobian > recomputed ever fixed number of times; if you set the persists flag and set > LagJacobian to 10 it will recompute the Jacobian used in the preconditioner > every 10th time a new Jacobian is needed. > > > > > > > > If you want to compute the new Jacobian used to build the > preconditioner once at the beginning of each new TS stage you can set > SNESSetLagJacobian() to negative -2 in the TS prestage call. There are > possibly other tricks you can do by setting the two flags at different > locations. > > > > > > > > An alternative to hardwiring how often the Jacobian used to build > the preconditioner is rebuilt is to rebuild based on when the > preconditioner starts "working less well". Here you could put an additional > KSPMonitor or SNESMonitor that detects if the number of linear iterations > is above a certain amount and then sets the recompute Jacobian flag to -2 > so that for the next solve it recreates the Jacobian used in building the > preconditioner. > > > > > > > > > > > > > > > > > > 3) Are there any hooks analogous to KSPSetPreSolve/PostSolve for > the FD computation of the jacobians, or for the computation of the > preconditioner? I'd like to get a handle on the relative costs of these. > > > > > > > > No, do you just want the time? You can get that from the logging; > for example -log_view > > > > > > > > > > > > > > > > > > > Best, > > > > > Mark > > > > > > > > > > On Sat, Sep 23, 2017 at 3:28 PM, Mark Lohry > wrote: > > > > > Great, thanks Barry. > > > > > > > > > > On Sat, Sep 23, 2017 at 3:12 PM, Barry Smith > wrote: > > > > > > > > > > > On Sep 23, 2017, at 12:48 PM, Mark W. Lohry < > mlohry at princeton.edu> wrote: > > > > > > > > > > > > I'm currently using JFNK in an application where I don't have a > hand-coded jacobian, and it's working well enough but as expected the > scaling isn't great. > > > > > > > > > > > > What is the general process for using PC with > MatMFFDComputeJacobian? Does it make sense to occasionally have petsc > re-compute the jacobian via finite differences, and then recompute the > preconditioner? Any that just need the sparsity structure? > > > > > > > > > > Mark > > > > > > > > > > Yes, this is a common approach. SNESSetLagJacobian > -snes_lag_jacobian > > > > > > > > > > The normal approach in SNES to use matrix-free for the > operator and use finite differences to compute an approximate Jacobian used > to construct preconditioners is to to create a sparse matrix with the > sparsity of the approximate Jacobian (yes you need a way to figure out the > sparsity, if you use DMDA it will figure out the sparsity for you). Then > you use > > > > > > > > > > SNESSetJacobian(snes,J,J, SNESComputeJacobianDefaultColor, > NULL); > > > > > > > > > > and use the options database option -snes_mf_operator > > > > > > > > > > > > > > > > Are there any PCs that don't work in the matrix-free context? > > > > > > > > > > If you do the above you can use almost all the PC since you are > providing an explicit matrix from which to build the preconditioner > > > > > > > > > > > Are there any example codes I overlooked? > > > > > > > > > > > > Last but not least... can the Boomer-AMG preconditioner work > with JFNK? To really show my ignorance of AMG, can it actually be written > as a matrix P^-1(Ax-b)=0, , or is it just a linear operator? 
> > > > > > > > > > Again, if you provide an approximate Jacobian like above you can > use it with BoomerAMG, if you provide NO explicit matrix you cannot use > BoomerAMG or almost any other preconditioner. > > > > > > > > > > Barry > > > > > > > > > > > > > > > > > Thanks, > > > > > > Mark > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Mon Oct 30 18:40:42 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 30 Oct 2017 18:40:42 -0500 Subject: [petsc-users] configuration error In-Reply-To: References: Message-ID: The compiler library detection code is a bit messy. Its there to help with interlanguage linking. One workarround [to such failures] is to tell configure not to guess, and specify the relavent info. For eg: balay at ipro^~/petsc(master) $ ./configure --download-mpich --download-hypre CC=clang CXX=clang++ FC=gfortran --with-clib-autodetect=0 --with-fortranlib-autodetect=0 --with-cxxlib-autodetect=0 LIBS='-L/usr/local/Cellar/gcc/7.2.0/lib/gcc/7/gcc/x86_64-apple-darwin17.0.0/7.2.0/../../.. -lgfortran -lc++' Satish On Mon, 30 Oct 2017, Kong, Fande wrote: > We had exactly the same issue when upgraded compilers. I guess this is > somehow related to gfortran. A simple way to work around for us is to > change* if with_rpath*: to * if False *at line 54 of > config/BuildSystem/config/libraries.py. > > Not sure if it works for you. > > Fande, > > > > > On Mon, Oct 30, 2017 at 10:14 AM, Manav Bhatia > wrote: > > > Hi, > > > > I am trying to install pets 3.8 on a new MacBook machine with OS 10.13. > > I have installed openmpi from macports and I am getting this error on > > configuration. Attached is also the configure.log file. > > > > I am not sure how to proceed with this. Any advice will be greatly > > appreciated! 
> > Regards,
> > Manav

From ccetinbas at anl.gov Mon Oct 30 19:06:05 2017
From: ccetinbas at anl.gov (Cetinbas, Cankur Firat)
Date: Tue, 31 Oct 2017 00:06:05 +0000
Subject: [petsc-users] petsc4py sparse matrix construction time
Message-ID: 

Hello,

I am a beginner both in PETSc and mpi4py. I have been working on parallelizing our water transport code (where we solve a linear system of equations) and I started with the toy code below.

The toy code reads right hand side (rh), row, column, value vectors to construct the sparse coefficient matrix and then scatters them to construct the parallel PETSc coefficient matrix and right hand side vector.

The sparse matrix generation time is extremely high in comparison to sps.csr_matrix((val, (row, col)), shape=(n,n)) in python. For instance python generates a 181197x181197 sparse matrix in 0.06 seconds, and this code takes 1.19s on 32 cores, 6.98s on 16 cores and 29.5s on 8 cores. I was wondering if I am making a mistake in generating the sparse matrix? Is there a more efficient way?

Thanks for your help in advance.
Regards,

Firat

from petsc4py import PETSc
from mpi4py import MPI
import numpy as np
import time

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

if rank==0:
    # proc 0 loads tomo image and does fast calculations to append row, col, val, rh lists
    # in the real code these vectors will be available on proc 0; no txt files are read
    row = np.loadtxt('row.out')  # indices of non-zero rows
    col = np.loadtxt('col.out')  # indices of non-zero columns
    val = np.loadtxt('vs.out')   # values in the sparse matrix
    rh = np.loadtxt('RHS.out')   # right hand side vector
    n = row.shape[0]  # 1045699
    m = rh.shape[0]   # 181197 square sparse matrix size
else:
    n = None
    m = None
    row = None
    col = None
    val = None
    rh = None
    rh_ind = None

m_lcl = comm.bcast(m,root=0)
n_lcl = comm.bcast(n,root=0)
neq = n_lcl//size
meq = m_lcl//size
nx = np.mod(n_lcl,size)
mx = np.mod(m_lcl,size)
row_lcl = np.zeros(neq)
col_lcl = np.zeros(neq)
val_lcl = np.zeros(neq)
rh_lcl = np.zeros(meq)
a = [neq]*size   # send counts for Scatterv
am = [meq]*size  # send counts for Scatterv

if nx>0:
    for i in range(0,nx):
        if rank==i:
            row_lcl = np.zeros(neq+1)
            col_lcl = np.zeros(neq+1)
            val_lcl = np.zeros(neq+1)
        a[i] = a[i]+1
if mx>0:
    for ii in range(0,mx):
        if rank==ii:
            rh_lcl = np.zeros(meq+1)
        am[ii] = am[ii]+1

comm.Scatterv([row,a],row_lcl)
comm.Scatterv([col,a],col_lcl)
comm.Scatterv([val,a],val_lcl)
comm.Scatterv([rh,am],rh_lcl)
comm.Barrier()

A = PETSc.Mat()
A.create()
A.setSizes([m_lcl,m_lcl])
A.setType('aij')
A.setUp()
lr = row_lcl.shape[0]
for i in range(0,lr):
    A[row_lcl[i],col_lcl[i]] = val_lcl[i]
A.assemblyBegin()
A.assemblyEnd()

if size>1:  # to get the range for scattered vectors
    ami = [0]
    ami = np.array([0]+am).cumsum()
    for kk in range(0,size):
        if rank==kk:
            Is = ami[kk]
            Ie = ami[kk+1]
else:
    Is=0; Ie=m_lcl

b = PETSc.Vec()
b.create()
b.setSizes(m_lcl)
b.setFromOptions()
b.setUp()
b.setValues(list(range(Is,Ie)),rh_lcl)
b.assemblyBegin()
b.assemblyEnd()

# solution vector
x = b.duplicate()
x.assemblyBegin()
x.assemblyEnd()

# create linear solver
ksp = PETSc.KSP()
ksp.create()
ksp.setOperators(A)
ksp.setType('cg')
#ksp.getPC().setType('icc') # only sequential
ksp.getPC().setType('jacobi')
print('solving with:', ksp.getType())

#solve
st=time.time()
ksp.solve(b,x)
et=time.time()
print(et-st)

if size>1:
    #gather
    if rank==0:
        xGthr = np.zeros(m)
    else:
        xGthr = None
    comm.Gatherv(x,[xGthr,am])

From knepley at gmail.com Mon Oct 30 20:01:59 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 30 Oct 2017 21:01:59 -0400
Subject: [petsc-users] petsc4py sparse matrix construction time
In-Reply-To: 
References: 
Message-ID: 

On Mon, Oct 30, 2017 at 8:06 PM, Cetinbas, Cankur Firat wrote:
> Hello,
>
> I am a beginner both in PETSc and mpi4py. I have been working on
> parallelizing our water transport code (where we solve a linear system of
> equations) and I started with the toy code below.
>
> The sparse matrix generation time is extremely high in comparison to
> sps.csr_matrix((val, (row, col)), shape=(n,n)) in python. For instance
> python generates a 181197x181197 sparse matrix in 0.06 seconds, and this
> code takes 1.19s on 32 cores, 6.98s on 16 cores and 29.5s on 8 cores. I
> was wondering if I am making a mistake in generating the sparse matrix?
> Is there a more efficient way?

It looks like you do not preallocate the matrix. There is a chapter on this in the manual.

  Matt
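A minimal sketch of the preallocation Matt is pointing at, written against the C API that petsc4py wraps (mloc, d_nnz and o_nnz are placeholders computed from the known sparsity, not names from the posted script; in recent petsc4py versions the same information can be supplied through the matrix's preallocation call, e.g. setPreallocationNNZ, which is worth checking against the installed version):

/* Sketch: count nonzeros per local row before creating the matrix, then
   preallocate. Without this, every new nonzero triggers a reallocation
   and assembly becomes very slow. All names here are placeholders. */
Mat       A;
PetscInt  mloc;           /* number of locally owned rows */
PetscInt *d_nnz, *o_nnz;  /* per-row counts for the diagonal and
                             off-diagonal portions, length mloc,
                             filled from the row/col index lists */

MatCreate(PETSC_COMM_WORLD, &A);
MatSetSizes(A, mloc, mloc, PETSC_DETERMINE, PETSC_DETERMINE);
MatSetType(A, MATAIJ);
MatSeqAIJSetPreallocation(A, 0, d_nnz);           /* used on 1 process */
MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz); /* used in parallel */
/* then insert one full row per MatSetValues() call rather than one
   scalar entry at a time, and assemble once at the end */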
From zakaryah at gmail.com Mon Oct 30 21:32:26 2017
From: zakaryah at gmail.com (zakaryah .)
Date: Mon, 30 Oct 2017 22:32:26 -0400 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: References: <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk> <877evnyyty.fsf@jedbrown.org> <29550785-0F7A-488B-A159-DD42DC29A228@mcs.anl.gov> <87inf4vsld.fsf@jedbrown.org> <87k1zisl6a.fsf@jedbrown.org> Message-ID: You were right, of course. I fixed the problem with the function evaluation and the code seems to be working now, at least on small test problems. Is there a way to setup preallocation of the Jacobian matrix, with the entire first row and column non-zero? I set the preallocation error flag to false, as you suggested several messages ago, and this was great for testing, but now the first assembly of the Jacobian is terribly slow due to allocating on the fly. Thanks! On Sun, Oct 29, 2017 at 7:07 PM, Matthew Knepley wrote: > On Sun, Oct 29, 2017 at 5:15 PM, zakaryah . wrote: > >> Good point, Jed - I feel silly for missing this. >> >> Can I use -snes_type test -snes_test_display with the Jacobian generated >> from a DMComposite? When I try, it looks like the finite difference >> Jacobian is missing all the elements in the row corresponding to the >> redundant variable, except the diagonal, which is wrong. >> > > Well, this leads me to believe the residual function is wrong. What the FD > Jacobian does is just call the residual > twice with different solutions. Thus if the residual is different when you > perturb the redundant variable, you should > have Jacobian entries there. > > >> I'm not sure my code for setting the submatrices is correct. I'm >> especially uncertain about the submatrix J_bh, where b is the redundant >> variable and h is the displacements. This submatrix has only one row, and >> all of its columns are non-zero. Can its values be set with >> MatSetValuesLocal, on all processors? >> >> Is there an example of manually coding a Jacobian with a DMRedundant? >> > > I don't think so. We welcome contributions. > > Matt > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > -------------- next part -------------- An HTML attachment was scrubbed... URL: From rchurchi at pppl.gov Mon Oct 30 23:58:59 2017 From: rchurchi at pppl.gov (Randy Michael Churchill) Date: Mon, 30 Oct 2017 23:58:59 -0500 Subject: [petsc-users] unsorted local columns in 3.8? Message-ID: I'm running a Fortran code that was just changed over to using petsc 3.8 (previously petsc 3.7.6). An error was thrown during a KSPSetUp() call. The error is "unsorted iscol_local is not implemented yet" (see full error below). I tried to trace down the difference in the source files, but where the error occurs (MatCreateSubMatrix_MPIAIJ_SameRowDist()) doesn't seem to have existed in v3.7.6, so I'm unsure how to compare. 
It seems the error is that the order of the columns locally is unsorted, though I don't think I specify a column order in the creation of the matrix:

call MatCreate(this%comm,AA,ierr)
call MatSetSizes(AA,npetscloc,npetscloc,nreal,nreal,ierr)
call MatSetType(AA,MATAIJ,ierr)
call MatSetup(AA,ierr)
call MatGetOwnershipRange(AA,low,high,ierr)
allocate(d_nnz(npetscloc),o_nnz(npetscloc))
call getNNZ(grid,npetscloc,low,high,d_nnz,o_nnz,this%xgc_petsc,nreal,ierr)
call MatSeqAIJSetPreallocation(AA,PETSC_NULL_INTEGER,d_nnz,ierr)
call MatMPIAIJSetPreallocation(AA,PETSC_NULL_INTEGER,d_nnz,PETSC_NULL_INTEGER,o_nnz,ierr)
deallocate(d_nnz,o_nnz)
call MatSetOption(AA,MAT_IGNORE_OFF_PROC_ENTRIES,PETSC_TRUE,ierr)
call MatSetOption(AA,MAT_KEEP_NONZERO_PATTERN,PETSC_TRUE,ierr)
call MatSetup(AA,ierr)

[62]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[62]PETSC ERROR: No support for this operation for this object type
[62]PETSC ERROR: unsorted iscol_local is not implemented yet
[62]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[62]PETSC ERROR: Petsc Release Version 3.8.0, unknown
[62]PETSC ERROR: #1 MatCreateSubMatrix_MPIAIJ_SameRowDist() line 3418 in /global/u1/r/rchurchi/petsc/3.8.0/src/mat/impls/aij/mpi/mpiaij.c
[62]PETSC ERROR: #2 MatCreateSubMatrix_MPIAIJ() line 3247 in /global/u1/r/rchurchi/petsc/3.8.0/src/mat/impls/aij/mpi/mpiaij.c
[62]PETSC ERROR: #3 MatCreateSubMatrix() line 7872 in /global/u1/r/rchurchi/petsc/3.8.0/src/mat/interface/matrix.c
[62]PETSC ERROR: #4 PCGAMGCreateLevel_GAMG() line 383 in /global/u1/r/rchurchi/petsc/3.8.0/src/ksp/pc/impls/gamg/gamg.c
[62]PETSC ERROR: #5 PCSetUp_GAMG() line 561 in /global/u1/r/rchurchi/petsc/3.8.0/src/ksp/pc/impls/gamg/gamg.c
[62]PETSC ERROR: #6 PCSetUp() line 924 in /global/u1/r/rchurchi/petsc/3.8.0/src/ksp/pc/interface/precon.c
[62]PETSC ERROR: #7 KSPSetUp() line 378 in /global/u1/r/rchurchi/petsc/3.8.0/src/ksp/ksp/interface/itfunc.c

-- 
R. Michael Churchill

From mfadams at lbl.gov Tue Oct 31 07:48:56 2017
From: mfadams at lbl.gov (Mark Adams)
Date: Tue, 31 Oct 2017 08:48:56 -0400
Subject: [petsc-users] preconditioning matrix-free newton-krylov
In-Reply-To: 
References: <1C4B04A3719F56479255009C095BE3B593D45932@CSGMBX214W.pu.win.princeton.edu> <9D8F0366-3B1E-461F-8126-ADD87FA38F65@mcs.anl.gov> <0BB37C70-80EB-4043-BB4F-4BF65BF7F8EB@mcs.anl.gov> <22616605-EB8A-4ED6-BDCD-7CC1C805AFEB@mcs.anl.gov>
Message-ID: 

On Mon, Oct 30, 2017 at 7:38 PM, Mark Lohry wrote:
> Sparsity pattern binary AIJ gzipped here, should have 100^2 blocks of all
> 1's indicating the non-zero positions:
>
> https://github.com/mlohry/petsc_miscellany/blob/master/jacobian_sparsity.dat.gz
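For anyone who wants to reproduce the coloring timings locally, a sketch of loading a matrix saved with -ksp_view_mat binary; the file name below is a stand-in (PETSc writes to "binaryoutput" by default, and the gzipped file above would have to be uncompressed first):

Mat         J;
PetscViewer viewer;

/* read the saved matrix back in and re-run the coloring on it */
PetscViewerBinaryOpen(PETSC_COMM_WORLD, "jacobian_sparsity.dat",
                      FILE_MODE_READ, &viewer);
MatCreate(PETSC_COMM_WORLD, &J);
MatSetType(J, MATAIJ);
MatLoad(J, viewer);
PetscViewerDestroy(&viewer);
/* then MatColoringCreate(J, ...) / MatColoringApply(...) as sketched
   earlier in the thread, under -log_view to see where the time goes */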
> wrote: > >> >> > On Oct 30, 2017, at 2:23 PM, Mark Lohry wrote: >> > >> > >> > Hmm, are those blocks dense? If so you could benefit enormously from >> using BAIJ format. >> > >> > >> > Yes they're dense blocks. Usually coupled compressible 3D NS with DG >> elements, 5 equations x order (N+1)*(N+2)*(N+3)/3 block size. So block >> sizes of 50^2 to 175^2 are typical. I'll try BAIJ; I initially set it up >> with AIJ as it seemed better supported in parallel on the linear solver >> table, but I suppose these are rather large blocks... still surprising >> performance as this was overall a pretty small system (1,536 >> elements/diagonal 100^2 blocks). >> >> Something is really wrong to get those huge times. >> > >> > >> > Could you run with -ksp_view_mat binary and send the resulting file >> called binaryoutput and we can run the coloring codes local to performance >> debug. >> > >> > >> > Will send this evening. >> >> Thanks but send for the AIJ case, not BAIJ >> >> > >> > >> > On Mon, Oct 30, 2017 at 3:02 PM, Smith, Barry F. >> wrote: >> > >> > > On Oct 30, 2017, at 1:58 PM, Mark Lohry wrote: >> > > >> > > Hmm, metis doesn't really have anything to do with the sparsity of >> the Jacobian does it? >> > > >> > > No, I just mean I'm doing initial partitioning and parallel >> communication for the residual evaluations independently of petsc, and then >> doing a 1-to-1 mapping to the petsc solution vector. Along with manually >> setting the non-zero structure of the MPIAIJ system as in the user manual. >> I don't think there's anything wrong with the system structure as it gives >> the same correct answer as the un-preconditioned matrix-free approach. >> > > >> > > The exact system those MatColoring times came from has size (100x100) >> blocks on the diagonals corresponding to the tetrahedral cells, with those >> having 4 neighbor blocks on the same row (or fewer for elements on >> boundaries.) >> > >> > Hmm, are those blocks dense? If so you could benefit enormously from >> using BAIJ format. >> > >> > Matt, >> > >> > Sounds like performance bugs for the parallel coloring apply >> algorithms with big "diagonal blocks" >> > >> > Mark, >> > >> > Could you run with -ksp_view_mat binary and send the resulting >> file called binaryoutput and we can run the coloring codes local to >> performance debug. >> > >> > >> > Barry >> > >> > > >> > > On Mon, Oct 30, 2017 at 1:55 PM, Smith, Barry F. >> wrote: >> > > >> > > > On Oct 30, 2017, at 12:39 PM, Mark Lohry wrote: >> > > > >> > > > >> > > > > >> > > > > 3) Are there any hooks analogous to KSPSetPreSolve/PostSolve for >> the FD computation of the jacobians, or for the computation of the >> preconditioner? I'd like to get a handle on the relative costs of these. >> > > > >> > > > No, do you just want the time? You can get that from the logging; >> for example -log_view >> > > > >> > > > Yes, was just thinking in regards to your suggestion of recomputing >> when the number of linear iterations gets too high; I assume it's the ratio >> of preconditioner cost vs linear solver cost at runtime that's the metric >> of interest, and not the absolute value of either. But I'll cross that >> bridge when I come to it. >> > > > >> > > > When I had asked, I was looking to see where a long pause was >> happening thinking it was the FD jacobian; turned out to be before that in >> MatColoringApply which seems surprisingly expensive. MATCOLORINGJP took ~15 >> minutes on 32 cores on a small 153,000^2 system, with MATCOLORINGGREEDY >> taking 30 seconds. 
Any guidance there, or is this expected? I'm not using >> DM, just manually entering the sparsity resulting from a metis >> decomposition of a tetrahedral mesh. >> > > >> > > Hmm, metis doesn't really have anything to do with the sparsity of >> the Jacobian does it? >> > > >> > > Matt, >> > > >> > > These times are huge. What is going on? >> > > >> > > Barry >> > > >> > > > >> > > > >> > > > Thanks for the info on the lag logic, I'll play with the TS >> pre/post calls for the time-accurate problems and only use LagJacobian. >> > > > >> > > > On Mon, Oct 30, 2017 at 11:29 AM, Smith, Barry F. < >> bsmith at mcs.anl.gov> wrote: >> > > > >> > > > > On Oct 29, 2017, at 11:50 AM, Mark Lohry >> wrote: >> > > > > >> > > > > Thanks again Barry, I've got the preconditioners hooked up with >> -snes_mf_operator and at least AMG looks to be working great on a high >> order unstructured DG problem. >> > > > > >> > > > > Couple questions on the SNESSetLagJacobian + >> SNESSetLagPreconditioner code flow: >> > > > > >> > > > > 1) With -snes_mf_operator, and given SNESSetLagJacobian(snes, 1) >> (default) and SNESSetLagPreconditioner(snes, 2), after the first KSP solve >> in a newton iteration, will it do the finite different jacobian >> calculation? Or will the Jacobian only be computed when the preconditioner >> lag setting demands it on the 3rd newton step? I suspect it's the latter >> based on where I see the code pause. >> > > > >> > > > SNES with -snes_mf_operator will ALWAYS use the matrix-free >> finite difference f(x+h) - f(x) to apply the matrix vector product. >> > > > >> > > > The LagJacobian and LagPreconditioner are not coordinated. The >> first determines how often the Jacobian used for preconditioning is >> recomputed and the second determines how often the preconditioner is >> recomputed. >> > > > >> > > > If you are using -snes_mf_operator then it never makes sense to >> have lagJacobian < lagPreconditioner since it would recompute the Jacobian >> but not actually use it. It also makes no sense for lagPreconditioner < >> lagJacobian because you'd be recomputing the preconditioner on the same >> Jacobian. >> > > > >> > > > But actually if you don't change the Jacobian used in building the >> preconditioner then when it tries to recompute the preconditioner it >> determines the matrix has not changed so skips rebuilding the >> preconditioner. So when using -snes_mf_operator there is really no reason >> generally to set the preconditioner lag. >> > > > > >> > > > > 2) How do implicit TS and SNESSetLagPreconditioner/Persists >> interact? Does the counter since-last-preconditioner-compute reset with >> time steps, or does that lag counter just increment with every SNES solve >> regardless of how many nonlinear solves might have happened in a given >> timestep? 
Say lag preconditioner is 2, and a time stepper uses 3, 2, and 3 >> nonlinear solves on 3 steps, is the flow >> > > > > >> > > > > (time step 1)->(update preconditioner)->(snes solve)->(snes >> solve)->(update preconditioner)->(snes solve) >> > > > > (time step 2)->(snes solve)->(update preconditioner)->(snes solve) >> > > > > (time step 3)->(snes solve)->(update preconditioner)->(snes >> solve)->(snes solve) >> > > > > >> > > > > or >> > > > > >> > > > > (time step 1)->(update preconditioner)->(snes solve)->(snes >> solve)->(update preconditioner)->(snes solve) >> > > > > (time step 2)->(update preconditioner)->(snes solve)->(snes solve) >> > > > > (time step 3)->(update preconditioner)->(snes solve)->(snes >> solve)->(update preconditioner)->(snes solve) >> > > > > >> > > > > ? >> > > > > >> > > > > I think for implicit time stepping I'd probably want the >> preconditioner to be recomputed just once at the beginning of each time >> step, or some multiple of that. Does that sound reasonable? >> > > > >> > > > Yes, what you want to do is completely reasonable. >> > > > >> > > > You can use SNESSetLagJacobian() and >> SNESSetLagJacobianPersists() in combination to have the Jacobian >> recomputed ever fixed number of times; if you set the persists flag and set >> LagJacobian to 10 it will recompute the Jacobian used in the preconditioner >> every 10th time a new Jacobian is needed. >> > > > >> > > > If you want to compute the new Jacobian used to build the >> preconditioner once at the beginning of each new TS stage you can set >> SNESSetLagJacobian() to negative -2 in the TS prestage call. There are >> possibly other tricks you can do by setting the two flags at different >> locations. >> > > > >> > > > An alternative to hardwiring how often the Jacobian used to >> build the preconditioner is rebuilt is to rebuild based on when the >> preconditioner starts "working less well". Here you could put an additional >> KSPMonitor or SNESMonitor that detects if the number of linear iterations >> is above a certain amount and then sets the recompute Jacobian flag to -2 >> so that for the next solve it recreates the Jacobian used in building the >> preconditioner. >> > > > >> > > > >> > > > > >> > > > > 3) Are there any hooks analogous to KSPSetPreSolve/PostSolve for >> the FD computation of the jacobians, or for the computation of the >> preconditioner? I'd like to get a handle on the relative costs of these. >> > > > >> > > > No, do you just want the time? You can get that from the logging; >> for example -log_view >> > > > >> > > > > >> > > > > >> > > > > Best, >> > > > > Mark >> > > > > >> > > > > On Sat, Sep 23, 2017 at 3:28 PM, Mark Lohry >> wrote: >> > > > > Great, thanks Barry. >> > > > > >> > > > > On Sat, Sep 23, 2017 at 3:12 PM, Barry Smith >> wrote: >> > > > > >> > > > > > On Sep 23, 2017, at 12:48 PM, Mark W. Lohry < >> mlohry at princeton.edu> wrote: >> > > > > > >> > > > > > I'm currently using JFNK in an application where I don't have a >> hand-coded jacobian, and it's working well enough but as expected the >> scaling isn't great. >> > > > > > >> > > > > > What is the general process for using PC with >> MatMFFDComputeJacobian? Does it make sense to occasionally have petsc >> re-compute the jacobian via finite differences, and then recompute the >> preconditioner? Any that just need the sparsity structure? >> > > > > >> > > > > Mark >> > > > > >> > > > > Yes, this is a common approach. 
SNESSetLagJacobian >> -snes_lag_jacobian >> > > > >> > > > The normal approach in SNES to use matrix-free for the operator >> and use finite differences to compute an approximate Jacobian used >> to construct preconditioners is to create a sparse matrix with the >> sparsity of the approximate Jacobian (yes you need a way to figure out the >> sparsity, if you use DMDA it will figure out the sparsity for you). Then >> you use >> > > > >> > > > SNESSetJacobian(snes,J,J, SNESComputeJacobianDefaultColor, >> NULL); >> > > > >> > > > and use the options database option -snes_mf_operator >> > > > >> > > > >> > > > > Are there any PCs that don't work in the matrix-free context? >> > > > >> > > > If you do the above you can use almost all the PC since you are >> providing an explicit matrix from which to build the preconditioner >> > > > >> > > > > Are there any example codes I overlooked? >> > > > > >> > > > > Last but not least... can the Boomer-AMG preconditioner work >> with JFNK? To really show my ignorance of AMG, can it actually be written >> as a matrix P^-1(Ax-b)=0, or is it just a linear operator? >> > > > >> > > > Again, if you provide an approximate Jacobian like above you >> can use it with BoomerAMG, if you provide NO explicit matrix you cannot use >> BoomerAMG or almost any other preconditioner. >> > > > >> > > > Barry >> > > > >> > > > > >> > > > > Thanks, >> > > > > Mark >> > > > >> > > > >> > > > >> > > >> > > >> > >> > >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Oct 31 08:34:59 2017 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 31 Oct 2017 09:34:59 -0400 Subject: [petsc-users] preconditioning matrix-free newton-krylov In-Reply-To: References: <1C4B04A3719F56479255009C095BE3B593D45932@CSGMBX214W.pu.win.princeton.edu> <9D8F0366-3B1E-461F-8126-ADD87FA38F65@mcs.anl.gov> <0BB37C70-80EB-4043-BB4F-4BF65BF7F8EB@mcs.anl.gov> Message-ID: On Mon, Oct 30, 2017 at 3:02 PM, Smith, Barry F. wrote: > > > On Oct 30, 2017, at 1:58 PM, Mark Lohry wrote: > > > > Hmm, metis doesn't really have anything to do with the sparsity of the > Jacobian does it? > > > > No, I just mean I'm doing initial partitioning and parallel > communication for the residual evaluations independently of petsc, and then > doing a 1-to-1 mapping to the petsc solution vector. Along with manually > setting the non-zero structure of the MPIAIJ system as in the user manual. > I don't think there's anything wrong with the system structure as it gives > the same correct answer as the un-preconditioned matrix-free approach. > > > > The exact system those MatColoring times came from has size (100x100) > blocks on the diagonals corresponding to the tetrahedral cells, with those > having 4 neighbor blocks on the same row (or fewer for elements on > boundaries.) > > Hmm, are those blocks dense? If so you could benefit enormously from > using BAIJ format. > > Matt, > > Sounds like performance bugs for the parallel coloring apply > algorithms with big "diagonal blocks" > Peter wrote the JP code (I think). I tried to look at it last night, but abstraction is not present. It's not easy to see where a performance problem might lurk. I think if we care, we just have to instrument it and run this example. Personally I have never used anything but greedy, which works great.
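For reference, a minimal sketch of driving the coloring machinery directly on an assembled Mat A, which is an easy way to time the individual algorithms (the variable names here are illustrative, not from this thread; the greedy and JP implementations expect AIJ, so a BAIJ matrix would first need a scalar copy via MatConvert):

    MatColoring mc;
    ISColoring  iscoloring;
    Mat         Aaij = A;
    /* greedy/JP require AIJ; for a BAIJ matrix, make a scalar AIJ copy first:
       MatConvert(A,MATAIJ,MAT_INITIAL_MATRIX,&Aaij); */
    ierr = MatColoringCreate(Aaij,&mc);CHKERRQ(ierr);
    ierr = MatColoringSetType(mc,MATCOLORINGGREEDY);CHKERRQ(ierr); /* or MATCOLORINGJP */
    ierr = MatColoringSetFromOptions(mc);CHKERRQ(ierr);            /* honors -mat_coloring_type */
    ierr = MatColoringApply(mc,&iscoloring);CHKERRQ(ierr);         /* the step whose time is at issue */
    ierr = MatColoringDestroy(&mc);CHKERRQ(ierr);
    /* iscoloring would then go to MatFDColoringCreate() */
    ierr = ISColoringDestroy(&iscoloring);CHKERRQ(ierr);

With -log_view the coloring time is broken out separately, and -mat_coloring_type greedy (or jp, sl, id) switches algorithms without recompiling.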
Matt > Mark, > > Could you run with -ksp_view_mat binary and send the resulting file > called binaryoutput and we can run the coloring codes local to performance > debug. > > > Barry > > > > > On Mon, Oct 30, 2017 at 1:55 PM, Smith, Barry F. > wrote: > > > > > On Oct 30, 2017, at 12:39 PM, Mark Lohry wrote: > > > > > > > > > > > > > > 3) Are there any hooks analogous to KSPSetPreSolve/PostSolve for the > FD computation of the jacobians, or for the computation of the > preconditioner? I'd like to get a handle on the relative costs of these. > > > > > > No, do you just want the time? You can get that from the logging; > for example -log_view > > > > > > Yes, was just thinking in regards to your suggestion of recomputing > when the number of linear iterations gets too high; I assume it's the ratio > of preconditioner cost vs linear solver cost at runtime that's the metric > of interest, and not the absolute value of either. But I'll cross that > bridge when I come to it. > > > > > > When I had asked, I was looking to see where a long pause was > happening thinking it was the FD jacobian; turned out to be before that in > MatColoringApply which seems surprisingly expensive. MATCOLORINGJP took ~15 > minutes on 32 cores on a small 153,000^2 system, with MATCOLORINGGREEDY > taking 30 seconds. Any guidance there, or is this expected? I'm not using > DM, just manually entering the sparsity resulting from a metis > decomposition of a tetrahedral mesh. > > > > Hmm, metis doesn't really have anything to do with the sparsity of > the Jacobian does it? > > > > Matt, > > > > These times are huge. What is going on? > > > > Barry > > > > > > > > > > > Thanks for the info on the lag logic, I'll play with the TS pre/post > calls for the time-accurate problems and only use LagJacobian. > > > > > > On Mon, Oct 30, 2017 at 11:29 AM, Smith, Barry F. > wrote: > > > > > > > On Oct 29, 2017, at 11:50 AM, Mark Lohry wrote: > > > > > > > > Thanks again Barry, I've got the preconditioners hooked up with > -snes_mf_operator and at least AMG looks to be working great on a high > order unstructured DG problem. > > > > > > > > Couple questions on the SNESSetLagJacobian + > SNESSetLagPreconditioner code flow: > > > > > > > > 1) With -snes_mf_operator, and given SNESSetLagJacobian(snes, 1) > (default) and SNESSetLagPreconditioner(snes, 2), after the first KSP solve > in a newton iteration, will it do the finite different jacobian > calculation? Or will the Jacobian only be computed when the preconditioner > lag setting demands it on the 3rd newton step? I suspect it's the latter > based on where I see the code pause. > > > > > > SNES with -snes_mf_operator will ALWAYS use the matrix-free finite > difference f(x+h) - f(x) to apply the matrix vector product. > > > > > > The LagJacobian and LagPreconditioner are not coordinated. The > first determines how often the Jacobian used for preconditioning is > recomputed and the second determines how often the preconditioner is > recomputed. > > > > > > If you are using -snes_mf_operator then it never makes sense to > have lagJacobian < lagPreconditioner since it would recompute the Jacobian > but not actually use it. It also makes no sense for lagPreconditioner < > lagJacobian because you'd be recomputing the preconditioner on the same > Jacobian. > > > > > > But actually if you don't change the Jacobian used in building the > preconditioner then when it tries to recompute the preconditioner it > determines the matrix has not changed so skips rebuilding the > preconditioner. 
So when using -snes_mf_operator there is really no reason > generally to set the preconditioner lag. > > > > > > > > 2) How do implicit TS and SNESSetLagPreconditioner/Persists > interact? Does the counter since-last-preconditioner-compute reset with > time steps, or does that lag counter just increment with every SNES solve > regardless of how many nonlinear solves might have happened in a given > timestep? Say lag preconditioner is 2, and a time stepper uses 3, 2, and 3 > nonlinear solves on 3 steps, is the flow > > > > > > > > (time step 1)->(update preconditioner)->(snes solve)->(snes > solve)->(update preconditioner)->(snes solve) > > > > (time step 2)->(snes solve)->(update preconditioner)->(snes solve) > > > > (time step 3)->(snes solve)->(update preconditioner)->(snes > solve)->(snes solve) > > > > > > > > or > > > > > > > > (time step 1)->(update preconditioner)->(snes solve)->(snes > solve)->(update preconditioner)->(snes solve) > > > > (time step 2)->(update preconditioner)->(snes solve)->(snes solve) > > > > (time step 3)->(update preconditioner)->(snes solve)->(snes > solve)->(update preconditioner)->(snes solve) > > > > > > > > ? > > > > > > > > I think for implicit time stepping I'd probably want the > preconditioner to be recomputed just once at the beginning of each time > step, or some multiple of that. Does that sound reasonable? > > > > > > Yes, what you want to do is completely reasonable. > > > > > > You can use SNESSetLagJacobian() and SNESSetLagJacobianPersists() > in combination to have the Jacobian recomputed ever fixed number of times; > if you set the persists flag and set LagJacobian to 10 it will recompute > the Jacobian used in the preconditioner every 10th time a new Jacobian is > needed. > > > > > > If you want to compute the new Jacobian used to build the > preconditioner once at the beginning of each new TS stage you can set > SNESSetLagJacobian() to negative -2 in the TS prestage call. There are > possibly other tricks you can do by setting the two flags at different > locations. > > > > > > An alternative to hardwiring how often the Jacobian used to build > the preconditioner is rebuilt is to rebuild based on when the > preconditioner starts "working less well". Here you could put an additional > KSPMonitor or SNESMonitor that detects if the number of linear iterations > is above a certain amount and then sets the recompute Jacobian flag to -2 > so that for the next solve it recreates the Jacobian used in building the > preconditioner. > > > > > > > > > > > > > > 3) Are there any hooks analogous to KSPSetPreSolve/PostSolve for the > FD computation of the jacobians, or for the computation of the > preconditioner? I'd like to get a handle on the relative costs of these. > > > > > > No, do you just want the time? You can get that from the logging; > for example -log_view > > > > > > > > > > > > > > > Best, > > > > Mark > > > > > > > > On Sat, Sep 23, 2017 at 3:28 PM, Mark Lohry > wrote: > > > > Great, thanks Barry. > > > > > > > > On Sat, Sep 23, 2017 at 3:12 PM, Barry Smith > wrote: > > > > > > > > > On Sep 23, 2017, at 12:48 PM, Mark W. Lohry > wrote: > > > > > > > > > > I'm currently using JFNK in an application where I don't have a > hand-coded jacobian, and it's working well enough but as expected the > scaling isn't great. > > > > > > > > > > What is the general process for using PC with > MatMFFDComputeJacobian? Does it make sense to occasionally have petsc > re-compute the jacobian via finite differences, and then recompute the > preconditioner? 
Any that just need the sparsity structure? > > > > > > > > Mark > > > > > > > > Yes, this is a common approach. SNESSetLagJacobian > -snes_lag_jacobian > > > > > > > > The normal approach in SNES to use matrix-free for the operator > and use finite differences to compute an approximate Jacobian used to > construct preconditioners is to create a sparse matrix with the sparsity > of the approximate Jacobian (yes you need a way to figure out the sparsity, > if you use DMDA it will figure out the sparsity for you). Then you use > > > > > > > > SNESSetJacobian(snes,J,J, SNESComputeJacobianDefaultColor, NULL); > > > > > > > > and use the options database option -snes_mf_operator > > > > > > > > > > > > > Are there any PCs that don't work in the matrix-free context? > > > > > > > > If you do the above you can use almost all the PC since you are > providing an explicit matrix from which to build the preconditioner > > > > > > > > > Are there any example codes I overlooked? > > > > > > > > > > Last but not least... can the Boomer-AMG preconditioner work with > JFNK? To really show my ignorance of AMG, can it actually be written as a > matrix P^-1(Ax-b)=0, or is it just a linear operator? > > > > > > > > Again, if you provide an approximate Jacobian like above you can > use it with BoomerAMG, if you provide NO explicit matrix you cannot use > BoomerAMG or almost any other preconditioner. > > > > > > > > Barry > > > > > > > > > > > > > > Thanks, > > > > > Mark > > > > > > > > > > > > > > > > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From hzhang at mcs.anl.gov Tue Oct 31 09:58:38 2017 From: hzhang at mcs.anl.gov (Hong) Date: Tue, 31 Oct 2017 09:58:38 -0500 Subject: [petsc-users] unsorted local columns in 3.8? In-Reply-To: References: Message-ID: Randy: It could be a bug or a missing feature in our new MatCreateSubMatrix_MPIAIJ_SameRowDist(). It would be helpful if you can provide us a simple example that reproduces this error. Hong I'm running a Fortran code that was just changed over to using petsc 3.8 > (previously petsc 3.7.6). An error was thrown during a KSPSetUp() call. The > error is "unsorted iscol_local is not implemented yet" (see full error > below). I tried to trace down the difference in the source files, but where > the error occurs (MatCreateSubMatrix_MPIAIJ_SameRowDist()) doesn't seem > to have existed in v3.7.6, so I'm unsure how to compare.
It seems the error > is that the order of the columns locally are unsorted, though I don't think > I specify a column order in the creation of the matrix: > call MatCreate(this%comm,AA,ierr) > call MatSetSizes(AA,npetscloc,npetscloc,nreal,nreal,ierr) > call MatSetType(AA,MATAIJ,ierr) > call MatSetup(AA,ierr) > call MatGetOwnershipRange(AA,low,high,ierr) > allocate(d_nnz(npetscloc),o_nnz(npetscloc)) > call getNNZ(grid,npetscloc,low,high,d_nnz,o_nnz,this%xgc_ > petsc,nreal,ierr) > call MatSeqAIJSetPreallocation(AA,PETSC_NULL_INTEGER,d_nnz,ierr) > call MatMPIAIJSetPreallocation(AA,PETSC_NULL_INTEGER,d_nnz, > PETSC_NULL_INTEGER,o_nnz,ierr) > deallocate(d_nnz,o_nnz) > call MatSetOption(AA,MAT_IGNORE_OFF_PROC_ENTRIES,PETSC_TRUE,ierr) > call MatSetOption(AA,MAT_KEEP_NONZERO_PATTERN,PETSC_TRUE,ierr) > call MatSetup(AA,ierr) > > > [62]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > [62]PETSC ERROR: No support for this operation for this object type > [62]PETSC ERROR: unsorted iscol_local is not implemented yet > [62]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html > for trouble shooting. > [62]PETSC ERROR: Petsc Release Version 3.8.0, unknown[62]PETSC ERROR: #1 > MatCreateSubMatrix_MPIAIJ_SameRowDist() line 3418 in > /global/u1/r/rchurchi/petsc/3.8.0/src/mat/impls/aij/mpi/mpiaij.c > [62]PETSC ERROR: #2 MatCreateSubMatrix_MPIAIJ() line 3247 in > /global/u1/r/rchurchi/petsc/3.8.0/src/mat/impls/aij/mpi/mpiaij.c > [62]PETSC ERROR: #3 MatCreateSubMatrix() line 7872 in > /global/u1/r/rchurchi/petsc/3.8.0/src/mat/interface/matrix.c > [62]PETSC ERROR: #4 PCGAMGCreateLevel_GAMG() line 383 in > /global/u1/r/rchurchi/petsc/3.8.0/src/ksp/pc/impls/gamg/gamg.c > [62]PETSC ERROR: #5 PCSetUp_GAMG() line 561 in > /global/u1/r/rchurchi/petsc/3.8.0/src/ksp/pc/impls/gamg/gamg.c > [62]PETSC ERROR: #6 PCSetUp() line 924 in /global/u1/r/rchurchi/petsc/3. > 8.0/src/ksp/pc/interface/precon.c > [62]PETSC ERROR: #7 KSPSetUp() line 378 in /global/u1/r/rchurchi/petsc/3. > 8.0/src/ksp/ksp/interface/itfunc.c > > -- > R. Michael Churchill > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mlohry at gmail.com Tue Oct 31 10:01:45 2017 From: mlohry at gmail.com (Mark Lohry) Date: Tue, 31 Oct 2017 11:01:45 -0400 Subject: [petsc-users] preconditioning matrix-free newton-krylov In-Reply-To: References: <1C4B04A3719F56479255009C095BE3B593D45932@CSGMBX214W.pu.win.princeton.edu> <9D8F0366-3B1E-461F-8126-ADD87FA38F65@mcs.anl.gov> <0BB37C70-80EB-4043-BB4F-4BF65BF7F8EB@mcs.anl.gov> Message-ID: Matt, using greedy on that system took 30 seconds wallclock, does that seem reasonable? It's a bit concerning on scaling since this is a small system. MATCOLORINGSL (the default) takes <10 seconds; the user manual specifies it's a sequential method, yet seems to work on MPIAIJ without complaining. Also, any of these expected to work with BAIJ? -Mark On Tue, Oct 31, 2017 at 9:34 AM, Matthew Knepley wrote: > On Mon, Oct 30, 2017 at 3:02 PM, Smith, Barry F. > wrote: > >> >> > On Oct 30, 2017, at 1:58 PM, Mark Lohry wrote: >> > >> > Hmm, metis doesn't really have anything to do with the sparsity of the >> Jacobian does it? >> > >> > No, I just mean I'm doing initial partitioning and parallel >> communication for the residual evaluations independently of petsc, and then >> doing a 1-to-1 mapping to the petsc solution vector. Along with manually >> setting the non-zero structure of the MPIAIJ system as in the user manual. 
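On the BAIJ question above: a rough sketch of what creating such a Jacobian as a blocked matrix might look like, assuming one dense bs-by-bs diagonal block per element plus at most four neighbor blocks (nelem_local, brow, bcol, and blockvals are illustrative placeholders, not from this thread):

    Mat      J;
    PetscInt bs   = 100;                 /* dense block size, e.g. 5 fields x 20 modes */
    PetscInt mloc = nelem_local*bs;      /* locally owned rows */
    ierr = MatCreateBAIJ(PETSC_COMM_WORLD,bs,mloc,mloc,PETSC_DETERMINE,PETSC_DETERMINE,
                         5,NULL,         /* at most self + 4 neighbor blocks per block row */
                         4,NULL,         /* overestimate for off-process neighbor blocks */
                         &J);CHKERRQ(ierr);
    /* one element's dense block, a row-oriented bs*bs array, in a single call */
    ierr = MatSetValuesBlocked(J,1,&brow,1,&bcol,blockvals,INSERT_VALUES);CHKERRQ(ierr);

Preallocation and insertion then work per block rather than per scalar entry, which is where the saving comes from with blocks this dense.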
>> I don't think there's anything wrong with the system structure as it gives >> the same correct answer as the un-preconditioned matrix-free approach. >> > >> > The exact system those MatColoring times came from has size (100x100) >> blocks on the diagonals corresponding to the tetrahedral cells, with those >> having 4 neighbor blocks on the same row (or fewer for elements on >> boundaries.) >> >> Hmm, are those blocks dense? If so you could benefit enormously from >> using BAIJ format. >> >> Matt, >> >> Sounds like performance bugs for the parallel coloring apply >> algorithms with big "diagonal blocks" >> > > Peter wrote the JP code (I think). I tried to look at it last night, but > abstraction is not present. Its not easy > to see where a performance problem might lurk. I think if we care, we just > have to instrument it and run > this example. Personally I have never used anything but greedy, which > works great. > > Matt > > >> Mark, >> >> Could you run with -ksp_view_mat binary and send the resulting file >> called binaryoutput and we can run the coloring codes local to performance >> debug. >> >> >> Barry >> >> > >> > On Mon, Oct 30, 2017 at 1:55 PM, Smith, Barry F. >> wrote: >> > >> > > On Oct 30, 2017, at 12:39 PM, Mark Lohry wrote: >> > > >> > > >> > > > >> > > > 3) Are there any hooks analogous to KSPSetPreSolve/PostSolve for >> the FD computation of the jacobians, or for the computation of the >> preconditioner? I'd like to get a handle on the relative costs of these. >> > > >> > > No, do you just want the time? You can get that from the logging; >> for example -log_view >> > > >> > > Yes, was just thinking in regards to your suggestion of recomputing >> when the number of linear iterations gets too high; I assume it's the ratio >> of preconditioner cost vs linear solver cost at runtime that's the metric >> of interest, and not the absolute value of either. But I'll cross that >> bridge when I come to it. >> > > >> > > When I had asked, I was looking to see where a long pause was >> happening thinking it was the FD jacobian; turned out to be before that in >> MatColoringApply which seems surprisingly expensive. MATCOLORINGJP took ~15 >> minutes on 32 cores on a small 153,000^2 system, with MATCOLORINGGREEDY >> taking 30 seconds. Any guidance there, or is this expected? I'm not using >> DM, just manually entering the sparsity resulting from a metis >> decomposition of a tetrahedral mesh. >> > >> > Hmm, metis doesn't really have anything to do with the sparsity of >> the Jacobian does it? >> > >> > Matt, >> > >> > These times are huge. What is going on? >> > >> > Barry >> > >> > > >> > > >> > > Thanks for the info on the lag logic, I'll play with the TS pre/post >> calls for the time-accurate problems and only use LagJacobian. >> > > >> > > On Mon, Oct 30, 2017 at 11:29 AM, Smith, Barry F. >> wrote: >> > > >> > > > On Oct 29, 2017, at 11:50 AM, Mark Lohry wrote: >> > > > >> > > > Thanks again Barry, I've got the preconditioners hooked up with >> -snes_mf_operator and at least AMG looks to be working great on a high >> order unstructured DG problem. >> > > > >> > > > Couple questions on the SNESSetLagJacobian + >> SNESSetLagPreconditioner code flow: >> > > > >> > > > 1) With -snes_mf_operator, and given SNESSetLagJacobian(snes, 1) >> (default) and SNESSetLagPreconditioner(snes, 2), after the first KSP solve >> in a newton iteration, will it do the finite different jacobian >> calculation? 
Or will the Jacobian only be computed when the preconditioner >> lag setting demands it on the 3rd newton step? I suspect it's the latter >> based on where I see the code pause. >> > > >> > > SNES with -snes_mf_operator will ALWAYS use the matrix-free finite >> difference f(x+h) - f(x) to apply the matrix vector product. >> > > >> > > The LagJacobian and LagPreconditioner are not coordinated. The >> first determines how often the Jacobian used for preconditioning is >> recomputed and the second determines how often the preconditioner is >> recomputed. >> > > >> > > If you are using -snes_mf_operator then it never makes sense to >> have lagJacobian < lagPreconditioner since it would recompute the Jacobian >> but not actually use it. It also makes no sense for lagPreconditioner < >> lagJacobian because you'd be recomputing the preconditioner on the same >> Jacobian. >> > > >> > > But actually if you don't change the Jacobian used in building the >> preconditioner then when it tries to recompute the preconditioner it >> determines the matrix has not changed so skips rebuilding the >> preconditioner. So when using -snes_mf_operator there is really no reason >> generally to set the preconditioner lag. >> > > > >> > > > 2) How do implicit TS and SNESSetLagPreconditioner/Persists >> interact? Does the counter since-last-preconditioner-compute reset with >> time steps, or does that lag counter just increment with every SNES solve >> regardless of how many nonlinear solves might have happened in a given >> timestep? Say lag preconditioner is 2, and a time stepper uses 3, 2, and 3 >> nonlinear solves on 3 steps, is the flow >> > > > >> > > > (time step 1)->(update preconditioner)->(snes solve)->(snes >> solve)->(update preconditioner)->(snes solve) >> > > > (time step 2)->(snes solve)->(update preconditioner)->(snes solve) >> > > > (time step 3)->(snes solve)->(update preconditioner)->(snes >> solve)->(snes solve) >> > > > >> > > > or >> > > > >> > > > (time step 1)->(update preconditioner)->(snes solve)->(snes >> solve)->(update preconditioner)->(snes solve) >> > > > (time step 2)->(update preconditioner)->(snes solve)->(snes solve) >> > > > (time step 3)->(update preconditioner)->(snes solve)->(snes >> solve)->(update preconditioner)->(snes solve) >> > > > >> > > > ? >> > > > >> > > > I think for implicit time stepping I'd probably want the >> preconditioner to be recomputed just once at the beginning of each time >> step, or some multiple of that. Does that sound reasonable? >> > > >> > > Yes, what you want to do is completely reasonable. >> > > >> > > You can use SNESSetLagJacobian() and SNESSetLagJacobianPersists() >> in combination to have the Jacobian recomputed ever fixed number of times; >> if you set the persists flag and set LagJacobian to 10 it will recompute >> the Jacobian used in the preconditioner every 10th time a new Jacobian is >> needed. >> > > >> > > If you want to compute the new Jacobian used to build the >> preconditioner once at the beginning of each new TS stage you can set >> SNESSetLagJacobian() to negative -2 in the TS prestage call. There are >> possibly other tricks you can do by setting the two flags at different >> locations. >> > > >> > > An alternative to hardwiring how often the Jacobian used to build >> the preconditioner is rebuilt is to rebuild based on when the >> preconditioner starts "working less well". 
Here you could put an additional >> KSPMonitor or SNESMonitor that detects if the number of linear iterations >> is above a certain amount and then sets the recompute Jacobian flag to -2 >> so that for the next solve it recreates the Jacobian used in building the >> preconditioner. >> > > >> > > >> > > > >> > > > 3) Are there any hooks analogous to KSPSetPreSolve/PostSolve for >> the FD computation of the jacobians, or for the computation of the >> preconditioner? I'd like to get a handle on the relative costs of these. >> > > >> > > No, do you just want the time? You can get that from the logging; >> for example -log_view >> > > >> > > > >> > > > >> > > > Best, >> > > > Mark >> > > > >> > > > On Sat, Sep 23, 2017 at 3:28 PM, Mark Lohry >> wrote: >> > > > Great, thanks Barry. >> > > > >> > > > On Sat, Sep 23, 2017 at 3:12 PM, Barry Smith >> wrote: >> > > > >> > > > > On Sep 23, 2017, at 12:48 PM, Mark W. Lohry >> wrote: >> > > > > >> > > > > I'm currently using JFNK in an application where I don't have a >> hand-coded jacobian, and it's working well enough but as expected the >> scaling isn't great. >> > > > > >> > > > > What is the general process for using PC with >> MatMFFDComputeJacobian? Does it make sense to occasionally have petsc >> re-compute the jacobian via finite differences, and then recompute the >> preconditioner? Any that just need the sparsity structure? >> > > > >> > > > Mark >> > > > >> > > > Yes, this is a common approach. SNESSetLagJacobian >> -snes_lag_jacobian >> > > > >> > > > The normal approach in SNES to use matrix-free for the operator >> and use finite differences to compute an approximate Jacobian used to >> construct preconditioners is to to create a sparse matrix with the sparsity >> of the approximate Jacobian (yes you need a way to figure out the sparsity, >> if you use DMDA it will figure out the sparsity for you). Then you use >> > > > >> > > > SNESSetJacobian(snes,J,J, SNESComputeJacobianDefaultColor, >> NULL); >> > > > >> > > > and use the options database option -snes_mf_operator >> > > > >> > > > >> > > > > Are there any PCs that don't work in the matrix-free context? >> > > > >> > > > If you do the above you can use almost all the PC since you are >> providing an explicit matrix from which to build the preconditioner >> > > > >> > > > > Are there any example codes I overlooked? >> > > > > >> > > > > Last but not least... can the Boomer-AMG preconditioner work with >> JFNK? To really show my ignorance of AMG, can it actually be written as a >> matrix P^-1(Ax-b)=0, , or is it just a linear operator? >> > > > >> > > > Again, if you provide an approximate Jacobian like above you can >> use it with BoomerAMG, if you provide NO explicit matrix you cannot use >> BoomerAMG or almost any other preconditioner. >> > > > >> > > > Barry >> > > > >> > > > > >> > > > > Thanks, >> > > > > Mark >> > > > >> > > > >> > > > >> > > >> > > >> > >> > >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Tue Oct 31 10:17:50 2017 From: bsmith at mcs.anl.gov (Smith, Barry F.) 
Date: Tue, 31 Oct 2017 15:17:50 +0000 Subject: [petsc-users] petsc4py sparse matrix construction time In-Reply-To: References: Message-ID: You also need to make sure that most matrix entries are generated on the process that they will belong on. Barry > On Oct 30, 2017, at 8:01 PM, Matthew Knepley wrote: > > On Mon, Oct 30, 2017 at 8:06 PM, Cetinbas, Cankur Firat wrote: > Hello, > > > > I am a beginner both in PETSc and mpi4py. I have been working on parallelizing our water transport code (where we solve linear system of equations) and I started with the toy code below. > > > > The toy code reads right hand size (rh), row, column, value vectors to construct sparse coefficient matrix and then scatters them to construct parallel PETSc coefficient matrix and right hand side vector. > > > > The sparse matrix generation time is extremely high in comparison to sps.csr_matrix((val, (row, col)), shape=(n,n)) in python. For instance python generates 181197x181197 sparse matrix in 0.06 seconds and this code with 32 cores:1.19s, 16 cores:6.98s and 8 cores:29.5 s. I was wondering if I am making a mistake in generating sparse matrix? Is there a more efficient way? > > > It looks like you do not preallocate the matrix. There is a chapter on this in the manual. > > Matt > > Thanks for your help in advance. > > > > Regards, > > > > Firat > > > > from petsc4py import PETSc > > from mpi4py import MPI > > import numpy as np > > import time > > > > comm = MPI.COMM_WORLD > > rank = comm.Get_rank() > > size = comm.Get_size() > > > > if rank==0: > > # proc 0 loads tomo image and does fast calculations to append row, col, val, rh lists > > # in the real code this vectors will be available on proc 0 no txt files are read > > row = np.loadtxt('row.out') # indices of non-zero rows > > col = np.loadtxt('col.out') # indices of non-zero columns > > val = np.loadtxt('vs.out') # values in the sparse matrix > > rh = np.loadtxt('RHS.out') # right hand side vector > > n = row.shape[0] #1045699 > > m = rh.shape[0] #181197 square sparse matrix size > > else: > > n = None > > m = None > > row = None > > col = None > > val = None > > rh = None > > rh_ind = None > > > > m_lcl = comm.bcast(m,root=0) > > n_lcl = comm.bcast(n,root=0) > > neq = n_lcl//size > > meq = m_lcl//size > > nx = np.mod(n_lcl,size) > > mx = np.mod(m_lcl,size) > > row_lcl = np.zeros(neq) > > col_lcl = np.zeros(neq) > > val_lcl = np.zeros(neq) > > rh_lcl = np.zeros(meq) > > a = [neq]*size #send counts for Scatterv > > am = [meq]*size #send counts for Scatterv > > > > if nx>0: > > for i in range(0,nx): > > if rank==i: > > row_lcl = np.zeros(neq+1) > > col_lcl = np.zeros(neq+1) > > val_lcl = np.zeros(neq+1) > > a[i] = a[i]+1 > > if mx>0: > > for ii in range(0,mx): > > if rank==ii: > > rh_lcl = np.zeros(meq+1) > > am[ii] = am[ii]+1 > > > > comm.Scatterv([row,a],row_lcl) > > comm.Scatterv([col,a],col_lcl) > > comm.Scatterv([val,a],val_lcl) > > comm.Scatterv([rh,am],rh_lcl) > > comm.Barrier() > > > > A = PETSc.Mat() > > A.create() > > A.setSizes([m_lcl,m_lcl]) > > A.setType('aij') > > A.setUp() > > lr = row_lcl.shape[0] > > for i in range(0,lr): > > A[row_lcl[i],col_lcl[i]] = val_lcl[i] > > A.assemblyBegin() > > A.assemblyEnd() > > > > if size>1: # to get the range for scattered vectors > > ami = [0] > > ami = np.array([0]+am).cumsum() > > for kk in range(0,size): > > if rank==kk: > > Is = ami[kk] > > Ie = ami[kk+1] > > else: > > Is=0; Ie=m_lcl > > > > b= PETSc.Vec() > > b.create() > > b.setSizes(m_lcl) > > b.setFromOptions() > > b.setUp() > > 
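# Note (a sketch, not part of the original code): the Mat A assembled
# above was never preallocated, which is what makes the A[i,j] = v loop
# slow. One fix, with illustrative names: compute per-row nonzero counts
# first and call
#   A.setPreallocationNNZ((d_nnz, o_nnz))
# after A.setType('aij') and before inserting any values, then pass whole
# rows to A.setValues() instead of one scalar entry at a time.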
b.setValues(list(range(Is,Ie)),rh_lcl) > > b.assemblyBegin() > > b.assemblyEnd() > > > > # solution vector > > x = b.duplicate() > > x.assemblyBegin() > > x.assemblyEnd() > > > > # create linear solver > > ksp = PETSc.KSP() > > ksp.create() > > ksp.setOperators(A) > > ksp.setType('cg') > > #ksp.getPC().setType('icc') # only sequential > > ksp.getPC().setType('jacobi') > > print('solving with:', ksp.getType()) > > > > #solve > > st=time.time() > > ksp.solve(b,x) > > et=time.time() > > print(et-st) > > > > if size>1: > > #gather > > if rank==0: > > xGthr = np.zeros(m) > > else: > > xGthr = None > > comm.Gatherv(x,[xGthr,am]) > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ From bsmith at mcs.anl.gov Tue Oct 31 10:23:49 2017 From: bsmith at mcs.anl.gov (Smith, Barry F.) Date: Tue, 31 Oct 2017 15:23:49 +0000 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: References: <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk> <877evnyyty.fsf@jedbrown.org> <29550785-0F7A-488B-A159-DD42DC29A228@mcs.anl.gov> <87inf4vsld.fsf@jedbrown.org> <87k1zisl6a.fsf@jedbrown.org> Message-ID: <1553F760-492C-4394-BDBF-19B2A69A8517@mcs.anl.gov> > On Oct 30, 2017, at 9:32 PM, zakaryah . wrote: > > You were right, of course. I fixed the problem with the function evaluation and the code seems to be working now, at least on small test problems. > > Is there a way to setup preallocation of the Jacobian matrix, with the entire first row and column non-zero? No great way. What you need to do is copy the specific code that does the preallocation for your problem from src/dm/impls/da/fdda.c stick it in your code and modify it so that it does the full allocation as you need. > I set the preallocation error flag to false, as you suggested several messages ago, and this was great for testing, but now the first assembly of the Jacobian is terribly slow due to allocating on the fly. > > Thanks! > > On Sun, Oct 29, 2017 at 7:07 PM, Matthew Knepley wrote: > On Sun, Oct 29, 2017 at 5:15 PM, zakaryah . wrote: > Good point, Jed - I feel silly for missing this. > > Can I use -snes_type test -snes_test_display with the Jacobian generated from a DMComposite? When I try, it looks like the finite difference Jacobian is missing all the elements in the row corresponding to the redundant variable, except the diagonal, which is wrong. > > Well, this leads me to believe the residual function is wrong. What the FD Jacobian does is just call the residual > twice with different solutions. Thus if the residual is different when you perturb the redundant variable, you should > have Jacobian entries there. > > I'm not sure my code for setting the submatrices is correct. I'm especially uncertain about the submatrix J_bh, where b is the redundant variable and h is the displacements. This submatrix has only one row, and all of its columns are non-zero. Can its values be set with MatSetValuesLocal, on all processors? > > Is there an example of manually coding a Jacobian with a DMRedundant? > > I don't think so. We welcome contributions. > > Matt > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
> -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > From zakaryah at gmail.com Tue Oct 31 10:26:59 2017 From: zakaryah at gmail.com (zakaryah .) Date: Tue, 31 Oct 2017 11:26:59 -0400 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: <1553F760-492C-4394-BDBF-19B2A69A8517@mcs.anl.gov> References: <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk> <877evnyyty.fsf@jedbrown.org> <29550785-0F7A-488B-A159-DD42DC29A228@mcs.anl.gov> <87inf4vsld.fsf@jedbrown.org> <87k1zisl6a.fsf@jedbrown.org> <1553F760-492C-4394-BDBF-19B2A69A8517@mcs.anl.gov> Message-ID: OK Barry, I'll have a look - thanks. Should I use DMShellSetCreateMatrix to set the routine I write for matrix allocation? -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Tue Oct 31 10:36:09 2017 From: bsmith at mcs.anl.gov (Smith, Barry F.) Date: Tue, 31 Oct 2017 15:36:09 +0000 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: References: <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk> <877evnyyty.fsf@jedbrown.org> <29550785-0F7A-488B-A159-DD42DC29A228@mcs.anl.gov> <87inf4vsld.fsf@jedbrown.org> <87k1zisl6a.fsf@jedbrown.org> <1553F760-492C-4394-BDBF-19B2A69A8517@mcs.anl.gov> Message-ID: <40201D61-0CA9-4BE0-9075-2651CF66CDC3@mcs.anl.gov> > On Oct 31, 2017, at 10:26 AM, zakaryah . wrote: > > OK Barry, I'll have a look - thanks. Should I use DMShellSetCreateMatrix to set the routine I write for matrix allocation? What do you do now and how do you create/set the matrix? Are you creating one big old matrix that has all rows and columns, both the special one and the rest or something else? Barry > > From zakaryah at gmail.com Tue Oct 31 11:59:49 2017 From: zakaryah at gmail.com (zakaryah .) Date: Tue, 31 Oct 2017 12:59:49 -0400 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: <40201D61-0CA9-4BE0-9075-2651CF66CDC3@mcs.anl.gov> References: <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk> <877evnyyty.fsf@jedbrown.org> <29550785-0F7A-488B-A159-DD42DC29A228@mcs.anl.gov> <87inf4vsld.fsf@jedbrown.org> <87k1zisl6a.fsf@jedbrown.org> <1553F760-492C-4394-BDBF-19B2A69A8517@mcs.anl.gov> <40201D61-0CA9-4BE0-9075-2651CF66CDC3@mcs.anl.gov> Message-ID: First I set up the composite DM, which contains one redundant field coupled to everything else, and a DMDA, which of course only has local couplings given by the stencil. I then create the matrix with DMCreateMatrix, with the composite, packer, as the first argument. I immediately call MatSetOption with MAT_NEW_NONZERO_ALLOCATION_ERR set to FALSE. The matrix values are set by first calling MatGetLocalSubMatrix, and then MatSetValuesLocal. Apparently this is working, but when I move to larger problems, setting the off-diagonal submatrices is very slow, even when I set the entire submatrix in a single call to MatSetValuesLocal. I call the submatrices Jbb, Jbh, Jhb, and Jhh, where b is the redundant field and h are the displacements in the DMDA. Then Jbh is a 1xN matrix where N is the size of the DMDA, and all of its values are non-zero. 
When N is around 3e6, setting the values in Jbh takes several hours, even when I only call MatSetValuesLocal on it once. I think the preallocation for the DMDA is working well, because if I forego setting the "off-diagonal" submatrices Jbh and Jhb, then setting Jhh is very fast (seconds). I assume it's the allocation on the fly that is killing performance, because Jhh has 57 times as many non-zero elements as Jbh and Jhb, but setting it is several orders of magnitudes FASTER. -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Tue Oct 31 15:15:40 2017 From: bsmith at mcs.anl.gov (Smith, Barry F.) Date: Tue, 31 Oct 2017 20:15:40 +0000 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: References: <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk> <877evnyyty.fsf@jedbrown.org> <29550785-0F7A-488B-A159-DD42DC29A228@mcs.anl.gov> <87inf4vsld.fsf@jedbrown.org> <87k1zisl6a.fsf@jedbrown.org> <1553F760-492C-4394-BDBF-19B2A69A8517@mcs.anl.gov> <40201D61-0CA9-4BE0-9075-2651CF66CDC3@mcs.anl.gov> Message-ID: <45F0E494-37A6-4CA8-8429-2ECFBD2E90DD@mcs.anl.gov> > On Oct 31, 2017, at 11:59 AM, zakaryah . wrote: > > First I set up the composite DM, which contains one redundant field coupled to everything else, and a DMDA, which of course only has local couplings given by the stencil. I then create the matrix with DMCreateMatrix, with the composite, packer, as the first argument. I immediately call MatSetOption with MAT_NEW_NONZERO_ALLOCATION_ERR set to FALSE. The matrix values are set by first calling MatGetLocalSubMatrix, and then MatSetValuesLocal. Apparently this is working, but when I move to larger problems, setting the off-diagonal submatrices is very slow, even when I set the entire submatrix in a single call to MatSetValuesLocal. I call the submatrices Jbb, Jbh, Jhb, and Jhh, where b is the redundant field and h are the displacements in the DMDA. Then Jbh is a 1xN matrix where N is the size of the DMDA, and all of its values are non-zero. When N is around 3e6, setting the values in Jbh takes several hours, even when I only call MatSetValuesLocal on it once. > > I think the preallocation for the DMDA is working well, because if I forego setting the "off-diagonal" submatrices Jbh and Jhb, then setting Jhh is very fast (seconds). I assume it's the allocation on the fly that is killing performance, because Jhh has 57 times as many non-zero elements as Jbh and Jhb, but setting it is several orders of magnitudes FASTER. > From bsmith at mcs.anl.gov Tue Oct 31 15:21:15 2017 From: bsmith at mcs.anl.gov (Smith, Barry F.) Date: Tue, 31 Oct 2017 20:21:15 +0000 Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods In-Reply-To: References: <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk> <877evnyyty.fsf@jedbrown.org> <29550785-0F7A-488B-A159-DD42DC29A228@mcs.anl.gov> <87inf4vsld.fsf@jedbrown.org> <87k1zisl6a.fsf@jedbrown.org> <1553F760-492C-4394-BDBF-19B2A69A8517@mcs.anl.gov> <40201D61-0CA9-4BE0-9075-2651CF66CDC3@mcs.anl.gov> Message-ID: <7897599A-B802-4CEB-B188-DC28FB79482C@mcs.anl.gov> Looks like you need to first call DMCompositeSetCoupling() > On Oct 31, 2017, at 11:59 AM, zakaryah . 
wrote: > > First I set up the composite DM, which contains one redundant field coupled to everything else, and a DMDA, which of course only has local couplings given by the stencil. I then create the matrix with DMCreateMatrix, with the composite, packer, as the first argument. I immediately call MatSetOption with MAT_NEW_NONZERO_ALLOCATION_ERR set to FALSE. The matrix values are set by first calling MatGetLocalSubMatrix, and then MatSetValuesLocal. Apparently this is working, but when I move to larger problems, setting the off-diagonal submatrices is very slow, even when I set the entire submatrix in a single call to MatSetValuesLocal. I call the submatrices Jbb, Jbh, Jhb, and Jhh, where b is the redundant field and h are the displacements in the DMDA. Then Jbh is a 1xN matrix where N is the size of the DMDA, and all of its values are non-zero. When N is around 3e6, setting the values in Jbh takes several hours, even when I only call MatSetValuesLocal on it once. > > I think the preallocation for the DMDA is working well, because if I forego setting the "off-diagonal" submatrices Jbh and Jhb, then setting Jhh is very fast (seconds). I assume it's the allocation on the fly that is killing performance, because Jhh has 57 times as many non-zero elements as Jbh and Jhb, but setting it is several orders of magnitudes FASTER. > From bsmith at mcs.anl.gov Tue Oct 31 16:25:34 2017 From: bsmith at mcs.anl.gov (Smith, Barry F.) Date: Tue, 31 Oct 2017 21:25:34 +0000 Subject: [petsc-users] configuration error In-Reply-To: References: Message-ID: <3D08D163-2E1D-41D9-84EC-0908F9B64A3D@mcs.anl.gov> Manav, Thanks for reporting the problem Fande, Thanks for the pointer. Satish determined the correct long term fix and it is in the branch barry/fix-lto_library-option-maint and will be put in the maint branch and master branch if it passes the testing tonight. Barry > On Oct 30, 2017, at 11:36 AM, Kong, Fande wrote: > > We had exactly the same issue when upgraded compilers. I guess this is somehow related to gfortran. A simple way to work around for us is to change if with_rpath: to if False at line 54 of config/BuildSystem/config/libraries.py. > > Not sure if it works for you. > > Fande, > > > > > On Mon, Oct 30, 2017 at 10:14 AM, Manav Bhatia wrote: > Hi, > > I am trying to install pets 3.8 on a new MacBook machine with OS 10.13. I have installed openmpi from macports and I am getting this error on configuration. Attached is also the configure.log file. > > I am not sure how to proceed with this. Any advice will be greatly appreciated! 
> > Regards, > Manav > > =============================================================================== > Configuring PETSc to compile on your system > =============================================================================== > =============================================================================== ***** WARNING: Using default optimization C flags -g -O3 You might consider manually setting optimal optimization flags for your system with COPTFLAGS="optimization flags" see config/examples/arch-*-opt.py for examples =============================================================================== =============================================================================== ***** WARNING: Using default C++ optimization flags -g -O3 You might consider manually setting optimal optimization flags for your system with CXXOPTFLAGS="optimization flags" see config/examples/arch-*-opt.py for examples =============================================================================== =============================================================================== ***** WARNING: Using default FORTRAN optimization flags -g -O You might consider manually setting optimal optimization flags for your system with FOPTFLAGS="optimization flags" see config/examples/arch-*-opt.py for examples =============================================================================== =============================================================================== WARNING! Compiling PETSc with no debugging, this should only be done for timing and production runs. All development should be done when configured using --with-debugging=1 =============================================================================== TESTING: checkCLibraries from config.compilers(config/BuildSystem/config/compilers.py:171) ******************************************************************************* > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): > ------------------------------------------------------------------------------- > C libraries cannot directly be used from Fortran > ******************************************************************************* > > > > > > From bhatiamanav at gmail.com Tue Oct 31 16:27:30 2017 From: bhatiamanav at gmail.com (Manav Bhatia) Date: Tue, 31 Oct 2017 16:27:30 -0500 Subject: [petsc-users] configuration error In-Reply-To: <3D08D163-2E1D-41D9-84EC-0908F9B64A3D@mcs.anl.gov> References: <3D08D163-2E1D-41D9-84EC-0908F9B64A3D@mcs.anl.gov> Message-ID: Thanks, Barry. Does this also address the mumps compilation issue that I had shared in the same thread: mumps_c.c:307:53: error: no member named 'nnz' in 'CMUMPS_STRUC_C'; did you mean 'nz'? mumps_par->n=0; mumps_par->nz=0; mumps_par->nnz=0; mumps_par->nz_loc=0; mumps_par->nnz_loc=0; mumps_par->nelt=0;mumps_par->instance_number=0;mumps_par->deficiency=0;mumps_par->lwk_user=0;mumps_par->size_schur=0;mumps_par->lrhs=0; mumps_par->lredrhs=0; mumps_par->nrhs=0; mumps_par->nz_rhs=0; mumps_par->lsol_loc=0; ^~~ nz /Users/manav/Documents/codes/numerical_lib/petsc/include/cmumps_c.h:56:20: note: 'nz' declared here MUMPS_INT nz; ^ mumps_c.c:307:92: error: no member named 'nnz_loc' in 'CMUMPS_STRUC_C'; did you mean 'nz_loc'? 
mumps_par->n=0; mumps_par->nz=0; mumps_par->nnz=0; mumps_par->nz_loc=0; mumps_par->nnz_loc=0; mumps_par->nelt=0;mumps_par->instance_number=0;mumps_par->deficiency=0;mumps_par->lwk_user=0;mumps_par->size_schur=0;mumps_par->lrhs=0; mumps_par->lredrhs=0; mumps_par->nrhs=0; mumps_par->nz_rhs=0; mumps_par->lsol_loc=0; ^~~~~~~ nz_loc /Users/manav/Documents/codes/numerical_lib/petsc/include/cmumps_c.h:62:20: note: 'nz_loc' declared here MUMPS_INT nz_loc; ^ mumps_c.c:419:42: error: no member named 'nnz' in 'CMUMPS_STRUC_C'; did you mean 'nz'? &(mumps_par->nz), &(mumps_par->nnz), irn, &irn_avail, jcn, &jcn_avail, a, &a_avail, ^~~ nz /Users/manav/Documents/codes/numerical_lib/petsc/include/cmumps_c.h:56:20: note: 'nz' declared here MUMPS_INT nz; ^ mumps_c.c:420:46: error: no member named 'nnz_loc' in 'CMUMPS_STRUC_C'; did you mean 'nz_loc'? &(mumps_par->nz_loc), &(mumps_par->nnz_loc), irn_loc, &irn_loc_avail, jcn_loc, &jcn_loc_avail, ^~~~~~~ nz_loc /Users/manav/Documents/codes/numerical_lib/petsc/include/cmumps_c.h:62:20: note: 'nz_loc' declared here MUMPS_INT nz_loc; ^ mumps_c.c:419:29: warning: incompatible pointer types passing 'int *' to parameter of type 'int64_t *' (aka 'long long *') [-Wincompatible-pointer-types] &(mumps_par->nz), &(mumps_par->nnz), irn, &irn_avail, jcn, &jcn_avail, a, &a_avail, ^~~~~~~~~~~~~~~~~ mumps_c.c:99:28: note: passing argument to parameter 'nnz' here MUMPS_INT8 *nnz, ^ mumps_c.c:420:33: warning: incompatible pointer types passing 'int *' to parameter of type 'int64_t *' (aka 'long long *') [-Wincompatible-pointer-types] &(mumps_par->nz_loc), &(mumps_par->nnz_loc), irn_loc, &irn_loc_avail, jcn_loc, &jcn_loc_avail, ^~~~~~~~~~~~~~~~~~~~~ mumps_c.c:107:28: note: passing argument to parameter 'nnz_loc' here MUMPS_INT8 *nnz_loc, ^ 2 warnings and 4 errors generated. > On Oct 31, 2017, at 4:25 PM, Smith, Barry F. wrote: > > > Manav, > > Thanks for reporting the problem > > Fande, > > Thanks for the pointer. > > Satish determined the correct long term fix and it is in the branch barry/fix-lto_library-option-maint and will be put in the maint branch and master branch if it passes the testing tonight. > > Barry > > >> On Oct 30, 2017, at 11:36 AM, Kong, Fande wrote: >> >> We had exactly the same issue when upgraded compilers. I guess this is somehow related to gfortran. A simple way to work around for us is to change if with_rpath: to if False at line 54 of config/BuildSystem/config/libraries.py. >> >> Not sure if it works for you. >> >> Fande, >> >> >> >> >> On Mon, Oct 30, 2017 at 10:14 AM, Manav Bhatia wrote: >> Hi, >> >> I am trying to install pets 3.8 on a new MacBook machine with OS 10.13. I have installed openmpi from macports and I am getting this error on configuration. Attached is also the configure.log file. >> >> I am not sure how to proceed with this. Any advice will be greatly appreciated! 
>> 
>> Regards,
>> Manav
>> 
>> [...configure output quoted above...]
>> 

From balay at mcs.anl.gov  Tue Oct 31 16:32:46 2017
From: balay at mcs.anl.gov (Satish Balay)
Date: Tue, 31 Oct 2017 16:32:46 -0500
Subject: [petsc-users] configuration error
In-Reply-To: 
References: <3D08D163-2E1D-41D9-84EC-0908F9B64A3D@mcs.anl.gov>
Message-ID: 

I've already replied to this issue. Try a fresh build - and do not reuse
--prefix for different version builds.

Have you tried doing this?

Satish

On Tue, 31 Oct 2017, Manav Bhatia wrote:

> Thanks, Barry.
> 
> Does this also address the mumps compilation issue that I had shared in the same thread:
> 
> [...mumps build log and original message quoted above...]
From ccetinbas at anl.gov  Tue Oct 31 21:00:16 2017
From: ccetinbas at anl.gov (Cetinbas, Cankur Firat)
Date: Wed, 1 Nov 2017 02:00:16 +0000
Subject: [petsc-users] petsc4py sparse matrix construction time
In-Reply-To: 
References: 
Message-ID: 

Hi,

Thanks a lot. Based on both of your suggestions, I modified the code using Mat.createAIJ() and the csr option. The computation time decreased significantly after using this method. Still, if there is a better option, please let me know after looking at the modified code below.

At a first trial with a 1000x1000 matrix with 96019 non-zeros, the computation time did not scale with the number of cores: single-core python @ 0.0035s, single-core petsc @ 0.0024s, 2-core petsc @ 0.0036s, 4-core petsc @ 0.0032s, 8-core petsc @ 0.0030s.

Then I tried a larger matrix, 181797x181797 with more non-zeros, and got the following results: single-core python @ 0.021s, single-core petsc @ 0.031s, 2-core petsc @ 0.024s, 4-core petsc @ 0.014s, 8-core petsc @ 0.009s, 16-core petsc @ 0.0087s.

I think the optimum number of cores is highly dependent on the matrix size and the number of non-zeros. In the real code, the matrix size (and so the number of non-zero elements) will grow at every iteration, starting from very small matrices and growing to very large ones.
Is it possible to set the number of processes dynamically from the code?

Another question is about the data types: mpi4py only lets me transfer float-type data, and petsc4py only lets me use int32-type indices. Besides converting the data back and forth, is there any solution for this?

The modified code for the matrix-creation part (imports as in the original script):

from petsc4py import PETSc
from mpi4py import MPI
import numpy as np
import time

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

if rank==0:
    row = np.loadtxt('row1000.out').astype(dtype='int32')
    col = np.loadtxt('col1000.out').astype(dtype='int32')
    val = np.loadtxt('val1000.out')   # values kept as float: matrix entries are PETSc scalars
    m = 1000   # 1000 x 1000 matrix
    if size>1:
        rbc = np.bincount(row)*1.0
        ieq = int(np.floor(m/size))
        a = [ieq]*size
        ix = int(np.mod(m,size))
        if ix>0:
            for i in range(0,ix):
                a[i] = a[i]+1
        a = np.array([0]+a).cumsum()
        b = np.zeros(a.shape[0]-1)
        for i in range(0,a.shape[0]-1):
            b[i] = rbc[a[i]:a[i+1]].sum()   # b is the send counts for Scatterv
        row = row.astype(dtype=float)   # Scatterv transfers float buffers
        col = col.astype(dtype=float)
else:
    row = None
    col = None
    val = None
    indptr = None
    b = None
    m = None

if size>1:
    ml = comm.bcast(m,root=0)
    bl = comm.bcast(b,root=0)
    row_lcl = np.zeros(bl[rank])
    col_lcl = row_lcl.copy()
    val_lcl = row_lcl.copy()
    comm.Scatterv([row,b],row_lcl)
    comm.Scatterv([col,b],col_lcl)
    comm.Scatterv([val,b],val_lcl)
    comm.Barrier()
    row_lcl = row_lcl.astype(dtype='int32')   # indices back to int32 for PETSc
    col_lcl = col_lcl.astype(dtype='int32')
    # val_lcl stays float: values must not be truncated to integers
    indptr = np.bincount(row_lcl)
    indptr = indptr[indptr>0]   # assumes every local row has at least one entry
    indptr = np.insert(indptr,0,0).cumsum()
    indptr = indptr.astype(dtype='int32')
    comm.Barrier()
    pA = PETSc.Mat().createAIJ(size=(ml,ml),csr=(indptr, col_lcl, val_lcl))   # matrix generation
else:
    indptr = np.bincount(row)
    indptr = np.insert(indptr,0,0).cumsum()
    indptr = indptr.astype(dtype='int32')
    st = time.time()
    pA = PETSc.Mat().createAIJ(size=(m,m),csr=(indptr, col, val))
    print('dt:',time.time()-st)

Regards,

Firat

-----Original Message-----
From: Smith, Barry F.
Sent: Tuesday, October 31, 2017 10:18 AM
To: Matthew Knepley
Cc: Cetinbas, Cankur Firat; petsc-users at mcs.anl.gov; Ahluwalia, Rajesh K.
Subject: Re: [petsc-users] petsc4py sparse matrix construction time

   You also need to make sure that most matrix entries are generated on the process that they will belong on.

   Barry

> On Oct 30, 2017, at 8:01 PM, Matthew Knepley wrote:
> 
> On Mon, Oct 30, 2017 at 8:06 PM, Cetinbas, Cankur Firat wrote:
> Hello,
> 
> I am a beginner in both PETSc and mpi4py. I have been working on parallelizing our water transport code (where we solve a linear system of equations) and I started with the toy code below.
> 
> The toy code reads right-hand side (rh), row, column, and value vectors to construct a sparse coefficient matrix and then scatters them to construct the parallel PETSc coefficient matrix and right-hand side vector.
> 
> The sparse matrix generation time is extremely high in comparison to sps.csr_matrix((val, (row, col)), shape=(n,n)) in python. For instance, python generates the 181197x181197 sparse matrix in 0.06 seconds, while this code takes 1.19s with 32 cores, 6.98s with 16 cores, and 29.5s with 8 cores. I was wondering if I am making a mistake in generating the sparse matrix? Is there a more efficient way?
> 
> It looks like you do not preallocate the matrix. There is a chapter on this in the manual.
> 
>    Matt
> 
> Thanks for your help in advance.
> > > > Regards, > > > > Firat > > > > from petsc4py import PETSc > > from mpi4py import MPI > > import numpy as np > > import time > > > > comm = MPI.COMM_WORLD > > rank = comm.Get_rank() > > size = comm.Get_size() > > > > if rank==0: > > # proc 0 loads tomo image and does fast calculations to append row, col, val, rh lists > > # in the real code this vectors will be available on proc 0 no txt files are read > > row = np.loadtxt('row.out') # indices of non-zero rows > > col = np.loadtxt('col.out') # indices of non-zero columns > > val = np.loadtxt('vs.out') # values in the sparse matrix > > rh = np.loadtxt('RHS.out') # right hand side vector > > n = row.shape[0] #1045699 > > m = rh.shape[0] #181197 square sparse matrix size > > else: > > n = None > > m = None > > row = None > > col = None > > val = None > > rh = None > > rh_ind = None > > > > m_lcl = comm.bcast(m,root=0) > > n_lcl = comm.bcast(n,root=0) > > neq = n_lcl//size > > meq = m_lcl//size > > nx = np.mod(n_lcl,size) > > mx = np.mod(m_lcl,size) > > row_lcl = np.zeros(neq) > > col_lcl = np.zeros(neq) > > val_lcl = np.zeros(neq) > > rh_lcl = np.zeros(meq) > > a = [neq]*size #send counts for Scatterv > > am = [meq]*size #send counts for Scatterv > > > > if nx>0: > > for i in range(0,nx): > > if rank==i: > > row_lcl = np.zeros(neq+1) > > col_lcl = np.zeros(neq+1) > > val_lcl = np.zeros(neq+1) > > a[i] = a[i]+1 > > if mx>0: > > for ii in range(0,mx): > > if rank==ii: > > rh_lcl = np.zeros(meq+1) > > am[ii] = am[ii]+1 > > > > comm.Scatterv([row,a],row_lcl) > > comm.Scatterv([col,a],col_lcl) > > comm.Scatterv([val,a],val_lcl) > > comm.Scatterv([rh,am],rh_lcl) > > comm.Barrier() > > > > A = PETSc.Mat() > > A.create() > > A.setSizes([m_lcl,m_lcl]) > > A.setType('aij') > > A.setUp() > > lr = row_lcl.shape[0] > > for i in range(0,lr): > > A[row_lcl[i],col_lcl[i]] = val_lcl[i] > > A.assemblyBegin() > > A.assemblyEnd() > > > > if size>1: # to get the range for scattered vectors > > ami = [0] > > ami = np.array([0]+am).cumsum() > > for kk in range(0,size): > > if rank==kk: > > Is = ami[kk] > > Ie = ami[kk+1] > > else: > > Is=0; Ie=m_lcl > > > > b= PETSc.Vec() > > b.create() > > b.setSizes(m_lcl) > > b.setFromOptions() > > b.setUp() > > b.setValues(list(range(Is,Ie)),rh_lcl) > > b.assemblyBegin() > > b.assemblyEnd() > > > > # solution vector > > x = b.duplicate() > > x.assemblyBegin() > > x.assemblyEnd() > > > > # create linear solver > > ksp = PETSc.KSP() > > ksp.create() > > ksp.setOperators(A) > > ksp.setType('cg') > > #ksp.getPC().setType('icc') # only sequential > > ksp.getPC().setType('jacobi') > > print('solving with:', ksp.getType()) > > > > #solve > > st=time.time() > > ksp.solve(b,x) > > et=time.time() > > print(et-st) > > > > if size>1: > > #gather > > if rank==0: > > xGthr = np.zeros(m) > > else: > > xGthr = None > > comm.Gatherv(x,[xGthr,am]) > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ From zakaryah at gmail.com Tue Oct 31 21:40:48 2017 From: zakaryah at gmail.com (zakaryah .) 
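Following up on the two suggestions above (preallocation, and generating entries on the owning process), here is a minimal petsc4py sketch of both ideas together. The matrix size, the tridiagonal stencil, and the per-row nonzero estimates are illustrative assumptions, not taken from Firat's actual problem:

from petsc4py import PETSc

m = 1000                                        # assumed global size
# preallocate: ~3 nonzeros per row in the diagonal block,
# 1 per row in the off-diagonal block (covers the boundary couplings)
A = PETSc.Mat().createAIJ([m, m], nnz=(3, 1))
rstart, rend = A.getOwnershipRange()            # contiguous rows owned by this rank
for i in range(rstart, rend):                   # set only locally owned rows
    A[i, i] = 2.0
    if i > 0:
        A[i, i - 1] = -1.0
    if i < m - 1:
        A[i, i + 1] = -1.0
A.assemblyBegin()                               # nothing needs to migrate between ranks
A.assemblyEnd()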
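On the int32-indices question above: the index width petsc4py accepts follows the PETSc build (PetscInt is 32-bit unless PETSc was configured with --with-64-bit-indices), and matrix values are PetscScalar. A small sketch, assuming petsc4py exposes the matching numpy dtypes as PETSc.IntType and PETSc.ScalarType, so the dtypes need not be hard-coded:

from petsc4py import PETSc
import numpy as np

# Tiny 3x3 diagonal CSR example; dtypes come from the PETSc build
# rather than hard-coded 'int32'/'float64' (PETSc.IntType/ScalarType
# assumed available as advertised dtype attributes).
indptr = np.array([0, 1, 2, 3], dtype=PETSc.IntType)
cols   = np.array([0, 1, 2], dtype=PETSc.IntType)
vals   = np.array([4.0, 5.0, 6.0], dtype=PETSc.ScalarType)
A = PETSc.Mat().createAIJ(size=(3, 3), csr=(indptr, cols, vals))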
Date: Tue, 31 Oct 2017 22:40:48 -0400
Subject: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods
In-Reply-To: <7897599A-B802-4CEB-B188-DC28FB79482C@mcs.anl.gov>
References: <87zi979alu.fsf@jedbrown.org> <2E811513-A851-4F84-A93F-BE83D56584BB@mcs.anl.gov> <6FA17D2F-1EB3-4EC0-B13F-B19922011797@glasgow.ac.uk> <877evnyyty.fsf@jedbrown.org> <29550785-0F7A-488B-A159-DD42DC29A228@mcs.anl.gov> <87inf4vsld.fsf@jedbrown.org> <87k1zisl6a.fsf@jedbrown.org> <1553F760-492C-4394-BDBF-19B2A69A8517@mcs.anl.gov> <40201D61-0CA9-4BE0-9075-2651CF66CDC3@mcs.anl.gov> <7897599A-B802-4CEB-B188-DC28FB79482C@mcs.anl.gov>
Message-ID: 

Thanks Barry, that looks like exactly what I need.

I'm looking at pack.c and packm.c, and I want to check my understanding of what my coupling function should do. The relevant line in DMCreateMatrix_Composite_AIJ seems to be:

(*com->FormCoupleLocations)(dm,NULL,dnz,onz,__rstart,__nrows,__start,__end);

and I infer that dnz and onz are the numbers of nonzero elements in the diagonal and off-diagonal submatrices, for each row of the DMComposite matrix.

I suppose I can just set each of these in a for loop, but can I use the arguments to FormCoupleLocations as the range for the loop? Which ones - __rstart to __rstart+__nrows? How can I determine the number of rows on each processor from within the function that I pass?

From the preallocation macros, it looks like __start to __end describe the range of the columns of the diagonal submatrix - is that right? It looks like the ranges will be specific to each processor. Do I just set the values in dnz and onz, or do I need to reduce them?

Thanks for all the help! Maybe if I get things working I can carve out the core of the code to make an example program for DMRedundant/Composite.