[petsc-dev] Algebraic Multigrid
Ari Rappaport
arir at vosssci.com
Wed Aug 10 14:52:00 CDT 2016
Hi Mark,
There was indeed a bug on our end. Now that we fixed it everything is working correctly even in parallel. I have one question still though. Is there a way to get the mg_levels_pc_type jacobi option
set in the code without passing it as an argument to PetscInitialize? I'm using PCGAMGSetType(pc, PCJACOBI) but it doesn't seem to be working.
-Ari
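[Editor's note: to answer the question above, GAMG smoother options can be set programmatically through the options database rather than via PCGAMGSetType, which selects the GAMG algorithm variant (e.g. PCGAMGAGG), not the level smoother. A minimal sketch, assuming the PETSc >= 3.7 signature of PetscOptionsSetValue (the first argument is the options database, NULL for the default one); error checking omitted for brevity:]

```c
#include <petscksp.h>

int main(int argc, char **argv)
{
  KSP ksp;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Insert the smoother option into the options database before
     KSPSetFromOptions(); this is equivalent to passing
     -mg_levels_pc_type jacobi on the command line. */
  PetscOptionsSetValue(NULL, "-mg_levels_pc_type", "jacobi");

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetFromOptions(ksp);   /* picks up -mg_levels_pc_type jacobi */

  KSPDestroy(&ksp);
  PetscFinalize();
  return 0;
}
```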
----- Original Message -----
From: "Mark Adams" <mfadams at lbl.gov>
To: "Ari Rappaport" <arir at vosssci.com>
Cc: "For users of the development version of PETSc" <petsc-dev at mcs.anl.gov>
Sent: Wednesday, July 27, 2016 7:24:24 PM
Subject: Re: [petsc-dev] Algebraic Multigrid
There is clearly something very wrong with your code. I would suggest starting with an example code, for instance src/ksp/ksp/examples/tutorials/ex56.c, and incrementally adding your operator until you have something that works.
Also if you have multiple degrees of freedom per vertex, for instance elasticity, then you want to set the block size of the matrix accordingly.
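[Editor's note: the block size advice above can be sketched as follows for 3D elasticity, where each vertex carries three displacement dofs; the variable n is assumed to be the global number of equations (3 times the vertex count) and is hypothetical here:]

```c
#include <petscmat.h>

/* Sketch: create the system matrix and declare the per-vertex block size
   so GAMG can aggregate whole vertices rather than individual equations. */
void create_elasticity_matrix(PetscInt n, Mat *A)
{
  MatCreate(PETSC_COMM_WORLD, A);
  MatSetSizes(*A, PETSC_DECIDE, PETSC_DECIDE, n, n);
  MatSetBlockSize(*A, 3);   /* 3 dofs (u,v,w) per vertex */
  MatSetFromOptions(*A);
}
```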
On Wednesday, July 27, 2016, Ari Rappaport < arir at vosssci.com > wrote:
Hi Mark,
We added the Jacobi line and it appears to accept that flag now. However, we are getting all zeros for the solution vector. And PETSc is claiming to have converged in 7 iterations to the relative tolerance.
-Ari
----- Original Message -----
From: "Mark Adams" < mfadams at lbl.gov >
To: "Ari Rappaport" < arir at vosssci.com >, "For users of the development version of PETSc" < petsc-dev at mcs.anl.gov >
Sent: Wednesday, July 27, 2016 3:26:03 PM
Subject: Re: [petsc-dev] Algebraic Multigrid
Please keep this on the Petsc list.
We seem to have lost the Jacobi smoother again. I suspect you have some funny character in your Jacobi line that is confusing the parser. Go back to the old file with the two Jacobi entries, delete the other line, and confirm that Jacobi shows up in the KSP view output. There should be no SOR in the output.
On Wednesday, July 27, 2016, Ari Rappaport < arir at vosssci.com > wrote:
Hi Mark,
I added all these new things. GAMG is now finishing very quickly, but the residual is unreasonably large, by about 10 orders of magnitude. I noticed the line "Linear solve did not converge due to DIVERGED_INDEFINITE_PC iterations 2" in the output. Could this be causing the problem? It only appears to be running for 2 iterations now.
-Ari
----- Original Message -----
From: "Mark Adams" < mfadams at lbl.gov >
To: "Ari Rappaport" < arir at vosssci.com >
Cc: "For users of the development version of PETSc" < petsc-dev at mcs.anl.gov >
Sent: Tuesday, July 26, 2016 5:07:34 PM
Subject: Re: [petsc-dev] Algebraic Multigrid
Ari, I would also check that your operator is not messed up in parallel. The solver is looking pretty solid.
Also, you can configure PETSc with hypre and use '-pc_type hypre'. If hypre also works in serial but fails on multiple processes, then the problem is most probably your operator.
On Tue, Jul 26, 2016 at 6:58 PM, Mark Adams < mfadams at lbl.gov > wrote:
So remove one of the -mg_levels_pc_type jacobi entries and add -mg_coarse_ksp_type preonly, then verify that this works on one proc and then try two procs.
On Tue, Jul 26, 2016 at 6:56 PM, Mark Adams < mfadams at lbl.gov > wrote:
Oh, actually this worked. You have this ...pc_type jacobi in there twice, so one of them was "unused".
Try this with 2 processors now.
On Tue, Jul 26, 2016 at 6:42 PM, Mark Adams < mfadams at lbl.gov > wrote:
On Tue, Jul 26, 2016 at 6:24 PM, Ari Rappaport < arir at vosssci.com > wrote:
So I commented out the line PCSetType(pc, PCGAMG). The line KSPSetFromOptions(ksp) was already in the code at the end of our initialization routine. I also added .petscrc to the working dir. Here is the current output. It seems as if "Option left: name:-mg_levels_pc_type jacobi (no value)" is still present in the output... I dunno.
Yea, I dunno either. If you use -help you will get a printout of the available options. If you do this you will see stuff like -mg_levels_1_... You can also see this in the ksp_view output. There is a shortcut that lets you _not_ put the "_1" in. Try putting this in for each level, like so:
-mg_levels_1_pc_type jacobi
-mg_levels_2_pc_type jacobi
-mg_levels_3_pc_type jacobi
I also notice that the coarse grid ksp is GMRES. This is our fault. It should be preonly. Add:
-mg_coarse_ksp_type preonly
-Ari
----- Original Message -----
From: "Mark Adams" < mfadams at lbl.gov >
To: "Ari Rappaport" < arir at vosssci.com >, "For users of the development version of PETSc" < petsc-dev at mcs.anl.gov >
Sent: Tuesday, July 26, 2016 4:03:03 PM
Subject: Re: [petsc-dev] Algebraic Multigrid
At the end of this you have:
#PETSc Option Table entries:
-ksp_view
-mg_levels_pc_type jacobi
-options_left
#End of PETSc Option Table entries
There is one unused database option. It is:
Option left: name:-mg_levels_pc_type jacobi (no value)
So this jacobi parameter is not being used.
Do you call KSPSetFromOptions? Do you set solver parameters in the code, like PCGAMG?
You should not set anything in the code; it just confuses things at this point. Use KSPSetFromOptions(). You can hardwire stuff before this call, which just sets the defaults, but you should always call it last so that command line parameters override the defaults.
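[Editor's note: the recommended ordering above can be sketched as follows; the matrix A is assumed to be assembled elsewhere and is hypothetical here, and error checking is omitted for brevity:]

```c
#include <petscksp.h>

/* Sketch: hardwire defaults first, then call KSPSetFromOptions() last so
   that command line options (or a .petscrc file) override them. */
void setup_solver(Mat A, KSP *ksp)
{
  PC pc;

  KSPCreate(PETSC_COMM_WORLD, ksp);
  KSPSetOperators(*ksp, A, A);

  /* Defaults, overridable by -ksp_type / -pc_type on the command line. */
  KSPSetType(*ksp, KSPCG);
  KSPGetPC(*ksp, &pc);
  PCSetType(pc, PCGAMG);

  KSPSetFromOptions(*ksp);   /* always the last call in the setup */
}
```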
You can put this in a .petscrc file in the working directory and try again.
-ksp_type cg
-ksp_max_it 50
-ksp_rtol 1.e-6
-ksp_converged_reason
-pc_type gamg
-pc_gamg_type agg
-pc_gamg_agg_nsmooths 1
-pc_gamg_coarse_eq_limit 10
-pc_gamg_reuse_interpolation true
-pc_gamg_square_graph 1
-pc_gamg_threshold -0.05
-mg_levels_ksp_max_it 2
-mg_levels_ksp_type chebyshev
-mg_levels_esteig_ksp_type cg
-mg_levels_esteig_ksp_max_it 10
-mg_levels_ksp_chebyshev_esteig 0,.05,0,1.05
-mg_levels_pc_type jacobi
-pc_hypre_type boomeramg
-pc_hypre_boomeramg_no_CF
-pc_hypre_boomeramg_agg_nl 1
-pc_hypre_boomeramg_coarsen_type HMIS
-pc_hypre_boomeramg_interp_type ext+i