[petsc-users] GAMG issue
Mark F. Adams
mark.adams at columbia.edu
Thu Mar 15 16:28:51 CDT 2012
On Mar 15, 2012, at 4:54 PM, Matthew Knepley wrote:
> On Thu, Mar 15, 2012 at 3:24 PM, John Mousel <john.mousel at gmail.com> wrote:
> Mark,
>
> Running without the options you mentioned before leads to slightly worse performance (175 iterations).
> I have not been able to get the coarse grid solve to work with LU while running ML. It keeps hitting a zero pivot, and all the combinations of shifting I've tried haven't led me anywhere, hence the SOR on the coarse grid. Also, the ML manual suggests limiting the number of levels to 3 or 4 and performing a few sweeps of an iterative method as opposed to a direct solve.
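> (E.g., one hypothetical combination of the kind I mean: -mg_coarse_pc_type lu -mg_coarse_pc_factor_shift_type NONZERO.)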
>
This system is singular; it is a Neumann problem. That is fine. LU survived in GAMG because GAMG added just enough floating-point error to keep it alive, but it is a terrible solver for this, so SOR is fine. I do think, though, that you want richardson or GMRES (the default) for the coarse KSP rather than preonly, with, say, jacobi preconditioning.
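For example, a sketch using the standard PCMG coarse-grid option names:

-mg_coarse_ksp_type gmres -mg_coarse_pc_type jacobi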
It's not clear to me why GAMG is so bad. I suspect that the coarse grid is larger in the GAMG solve than in ML, so the simple SOR solver is not adequate. GAMG stops coarsening at 139 equations on your test.
You can run with
-pc_gamg_coarse_eq_limit 10
This will coarsen down to a tiny problem (< 10 equations), and 8 iterations of SOR should be fine.
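Putting it together, I would try something like this (a sketch, untested):

-ksp_type bcgsl -pc_type gamg -pc_gamg_agg_nsmooths 1 -pc_gamg_sym_graph -pc_gamg_verbose 1 -pc_gamg_coarse_eq_limit 10 -mg_levels_ksp_type richardson -mg_levels_pc_type sor -mg_coarse_ksp_type richardson -mg_coarse_ksp_max_it 8 -mg_coarse_pc_type sor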
Mark
> That is suspicious behavior for LU. I now suspect something is wrong with this system, namely that it is rank-deficient
> or that the aggregates are, and that you are not seeing real convergence. Can you add -ksp_monitor_true_residual to make
> sure the solver is actually converging?
>
> Matt
>
> John
>
> On Thu, Mar 15, 2012 at 12:04 PM, Mark F. Adams <mark.adams at columbia.edu> wrote:
> You also want: -pc_gamg_agg_nsmooths 1
>
> You are running plain aggregation. If it is Poisson then smoothing is good.
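> (To be concrete about what smoothing means here: the smoothed-aggregation prolongator is, roughly,
>
> P = (I - omega * D^-1 * A) * P_tent
>
> where P_tent is the plain-aggregation prolongator and D is the diagonal of A; -pc_gamg_agg_nsmooths 1 applies one such smoothing step.)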
>
> Is this problem singular? Can you try running ML with these parameters and see if its performance degrades? The ML implementation uses the PETSc infrastructure and a very similar algorithm to GAMG-SA. We should be able to get these two to match pretty well.
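> Something like this should make the ML run comparable (a sketch; ML runs inside PCMG, so the mg_levels options apply to it too):
>
> -ksp_type bcgsl -pc_type ml -mg_levels_ksp_type richardson -mg_levels_pc_type sor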
>
> Mark
>
>
> On Mar 15, 2012, at 12:21 PM, John Mousel wrote:
>
>> Mark,
>>
>> I ran with those options removed (see the run options listed below). Things actually got slightly worse. Now it's up to 142 iterations. I have attached the ksp_view output.
>>
>> -ksp_type bcgsl -pc_type gamg -pc_gamg_sym_graph -ksp_diagonal_scale -ksp_diagonal_scale_fix -mg_levels_ksp_type richardson -mg_levels_pc_type sor -pc_gamg_verbose 1
>>
>>
>> John
>>
>>
>> On Thu, Mar 15, 2012 at 10:55 AM, Mark F. Adams <mark.adams at columbia.edu> wrote:
>> John, can you run again with: -pc_gamg_verbose 1
>>
>> And I would not use: -pc_mg_levels 4 -mg_coarse_ksp_type preonly -mg_coarse_pc_type sor -mg_coarse_pc_sor_its 8
>>
>> 1) I don't think -mg_coarse_ksp_type preonly and -mg_coarse_pc_sor_its 8 do what you think; I believe that combination amounts to a single iteration. You want 'richardson', not 'preonly'.
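>> For example, to actually get 8 sweeps on the coarse grid, something like this (a sketch):
>>
>> -mg_coarse_ksp_type richardson -mg_coarse_ksp_max_it 8 -mg_coarse_pc_type sor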
>>
>> 2) Why are you using sor as the coarse solver? If your problem is singular, then you want to use as many levels as possible so that the coarse grid is tiny. I'm pretty sure HYPRE ignores the coarse solver parameters, but ML uses them, and it is converging well.
>>
>> 3) I would not specify the number of levels. GAMG, and I think the others, have internal logic for stopping at the right level. If the coarse level is large and you use just 8 iterations of sor, then convergence will suffer.
>>
>> Mark
>>
>> On Mar 15, 2012, at 11:13 AM, John Mousel wrote:
>>
>>> Mark,
>>>
>>> The changes pulled through this morning. I've run it with the options
>>>
>>> -ksp_type bcgsl -pc_type gamg -pc_gamg_sym_graph -ksp_diagonal_scale -ksp_diagonal_scale_fix -pc_mg_levels 4 -mg_levels_ksp_type richardson -mg_levels_pc_type sor -mg_coarse_ksp_type preonly -mg_coarse_pc_type sor -mg_coarse_pc_sor_its 8
>>>
>>> and it converges in the true residual, but not as fast as anticipated. The matrix arises from a non-symmetric discretization of the Poisson equation. The solve takes GAMG 114 iterations, whereas ML takes 24 iterations, BoomerAMG takes 22 iterations, and -ksp_type bcgsl -pc_type bjacobi -sub_pc_type ilu -sub_pc_factor_levels 4 takes around 170. I've attached the -ksp_view results for ML, GAMG, and HYPRE. I've attempted to make all the options the same on all levels for ML and GAMG.
>>>
>>> Any thoughts?
>>>
>>> John
>>>
>>>
>>> On Wed, Mar 14, 2012 at 6:04 PM, Mark F. Adams <mark.adams at columbia.edu> wrote:
>>> Humm, I see it with hg view (appended).
>>>
>>> Satish, my main repo looks hosed. I see this:
>>>
>>> ~/Codes/petsc-dev>hg update
>>> abort: crosses branches (merge branches or use --clean to discard changes)
>>> ~/Codes/petsc-dev>hg merge
>>> abort: branch 'default' has 3 heads - please merge with an explicit rev
>>> (run 'hg heads .' to see heads)
>>> ~/Codes/petsc-dev>hg heads
>>> changeset: 22496:8e2a98268179
>>> tag: tip
>>> user: Barry Smith <bsmith at mcs.anl.gov>
>>> date: Wed Mar 14 16:42:25 2012 -0500
>>> files: src/vec/is/interface/f90-custom/zindexf90.c src/vec/vec/interface/f90-custom/zvectorf90.c
>>> description:
>>> undoing manually changes I put in because Satish had a better fix
>>>
>>>
>>> changeset: 22492:bda4df63072d
>>> user: Mark F. Adams <mark.adams at columbia.edu>
>>> date: Wed Mar 14 17:39:52 2012 -0400
>>> files: src/ksp/pc/impls/gamg/tools.c
>>> description:
>>> fix for unsymmetric matrices.
>>>
>>>
>>> changeset: 22469:b063baf366e4
>>> user: Mark F. Adams <mark.adams at columbia.edu>
>>> date: Wed Mar 14 14:22:28 2012 -0400
>>> files: src/ksp/pc/impls/gamg/tools.c
>>> description:
>>> added fix for preallocation for unsymetric matrices.
>>>
>>> Mark
>>>
>>> my 'hg view' on my merge repo:
>>>
>>> Revision: 22492
>>> Branch: default
>>> Author: Mark F. Adams <mark.adams at columbia.edu> 2012-03-14 17:39:52
>>> Committer: Mark F. Adams <mark.adams at columbia.edu> 2012-03-14 17:39:52
>>> Tags: tip
>>> Parent: 22491:451bbbd291c2 (Small fixes to the BT linesearch)
>>>
>>> fix for unsymmetric matrices.
>>>
>>>
>>> ------------------------ src/ksp/pc/impls/gamg/tools.c ------------------------
>>> @@ -103,7 +103,7 @@
>>> PetscErrorCode ierr;
>>> PetscInt Istart,Iend,Ii,jj,ncols,nnz0,nnz1, NN, MM, nloc;
>>> PetscMPIInt mype, npe;
>>> - Mat Gmat = *a_Gmat, tGmat;
>>> + Mat Gmat = *a_Gmat, tGmat, matTrans;
>>> MPI_Comm wcomm = ((PetscObject)Gmat)->comm;
>>> const PetscScalar *vals;
>>> const PetscInt *idx;
>>> @@ -127,6 +127,10 @@
>>> ierr = MatDiagonalScale( Gmat, diag, diag ); CHKERRQ(ierr);
>>> ierr = VecDestroy( &diag ); CHKERRQ(ierr);
>>>
>>> + if( symm ) {
>>> + ierr = MatTranspose( Gmat, MAT_INITIAL_MATRIX, &matTrans ); CHKERRQ(ierr);
>>> + }
>>> +
>>> /* filter - dup zeros out matrix */
>>> ierr = PetscMalloc( nloc*sizeof(PetscInt), &d_nnz ); CHKERRQ(ierr);
>>> ierr = PetscMalloc( nloc*sizeof(PetscInt), &o_nnz ); CHKERRQ(ierr);
>>> @@ -135,6 +139,12 @@
>>> d_nnz[jj] = ncols;
>>> o_nnz[jj] = ncols;
>>> ierr = MatRestoreRow(Gmat,Ii,&ncols,PETSC_NULL,PETSC_NULL); CHKERRQ(ierr);
>>> + if( symm ) {
>>> + ierr = MatGetRow(matTrans,Ii,&ncols,PETSC_NULL,PETSC_NULL); CHKERRQ(ierr);
>>> + d_nnz[jj] += ncols;
>>> + o_nnz[jj] += ncols;
>>> + ierr = MatRestoreRow(matTrans,Ii,&ncols,PETSC_NULL,PETSC_NULL); CHKERRQ(ierr);
>>> + }
>>> if( d_nnz[jj] > nloc ) d_nnz[jj] = nloc;
>>> if( o_nnz[jj] > (MM-nloc) ) o_nnz[jj] = MM - nloc;
>>> }
>>> @@ -142,6 +152,9 @@
>>> CHKERRQ(ierr);
>>> ierr = PetscFree( d_nnz ); CHKERRQ(ierr);
>>> ierr = PetscFree( o_nnz ); CHKERRQ(ierr);
>>> + if( symm ) {
>>> + ierr = MatDestroy( &matTrans ); CHKERRQ(ierr);
>>> + }
>>>
>>>
>>>
>>>
>>> On Mar 14, 2012, at 5:53 PM, John Mousel wrote:
>>>
>>>> Mark,
>>>>
>>>> No change. Can you give me the location that you patched so I can check to make sure it pulled?
>>>> I don't see it on the petsc-dev change log.
>>>>
>>>> John
>>>>
>>>> On Wed, Mar 14, 2012 at 4:40 PM, Mark F. Adams <mark.adams at columbia.edu> wrote:
>>>> John, I've committed these changes, give a try.
>>>>
>>>> Mark
>>>>
>>>> On Mar 14, 2012, at 3:46 PM, Satish Balay wrote:
>>>>
>>>> > This is the usual merge [with uncommited changes] issue.
>>>> >
>>>> > You could use 'hg shelf' extension to shelve your local changes and
>>>> > then do a merge [as Sean would suggest] - or do the merge in a
>>>> > separate/clean clone [I normally do this..]
>>>> >
>>>> > i.e
>>>> > cd ~/Codes
>>>> > hg clone petsc-dev petsc-dev-merge
>>>> > cd petsc-dev-merge
>>>> > hg pull ssh://petsc@petsc.cs.iit.edu//hg/petsc/petsc-dev #just to be sure, look for latest changes before merge..
>>>> > hg merge
>>>> > hg commit
>>>> > hg push ssh://petsc@petsc.cs.iit.edu//hg/petsc/petsc-dev
>>>> >
>>>> > [now update your petsc-dev to latest]
>>>> > cd ~/Codes/petsc-dev
>>>> > hg pull
>>>> > hg update
>>>> >
>>>> > Satish
>>>> >
>>>> > On Wed, 14 Mar 2012, Mark F. Adams wrote:
>>>> >
>>>> >> Great, that seems to work.
>>>> >>
>>>> >> I did a 'hg commit tools.c'
>>>> >>
>>>> >> and I want to push this file only. I guess it's the only thing in the change set, so 'hg push' should be fine. But I see this:
>>>> >>
>>>> >> ~/Codes/petsc-dev/src/ksp/pc/impls/gamg>hg update
>>>> >> abort: crosses branches (merge branches or use --clean to discard changes)
>>>> >> ~/Codes/petsc-dev/src/ksp/pc/impls/gamg>hg merge
>>>> >> abort: outstanding uncommitted changes (use 'hg status' to list changes)
>>>> >> ~/Codes/petsc-dev/src/ksp/pc/impls/gamg>hg status
>>>> >> M include/petscmat.h
>>>> >> M include/private/matimpl.h
>>>> >> M src/ksp/pc/impls/gamg/agg.c
>>>> >> M src/ksp/pc/impls/gamg/gamg.c
>>>> >> M src/ksp/pc/impls/gamg/gamg.h
>>>> >> M src/ksp/pc/impls/gamg/geo.c
>>>> >> M src/mat/coarsen/coarsen.c
>>>> >> M src/mat/coarsen/impls/hem/hem.c
>>>> >> M src/mat/coarsen/impls/mis/mis.c
>>>> >>
>>>> >> Am I ready to do a push?
>>>> >>
>>>> >> Thanks,
>>>> >> Mark
>>>> >>
>>>> >> On Mar 14, 2012, at 2:44 PM, Satish Balay wrote:
>>>> >>
>>>> >>> If commit is the last hg operation that you've done - then 'hg rollback' would undo this commit.
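>>>> >>> (And for 'hg strip': enable the mq extension by adding to your ~/.hgrc
>>>> >>>
>>>> >>> [extensions]
>>>> >>> mq =
>>>> >>>
>>>> >>> after which 'hg strip <rev>' becomes available.)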
>>>> >>>
>>>> >>> Satish
>>>> >>>
>>>> >>> On Wed, 14 Mar 2012, Mark F. Adams wrote:
>>>> >>>
>>>> >>>> Damn, I'm not preallocating the graph perfectly for unsymmetric matrices, and PETSc now dies on this.
>>>> >>>>
>>>> >>>> I have a fix but I committed it with other changes that I do not want to commit. The changes are all in one file so I should be able to just commit this file.
>>>> >>>>
>>>> >>>> Anyone know how to delete a commit?
>>>> >>>>
>>>> >>>> I've tried:
>>>> >>>>
>>>> >>>> ~/Codes/petsc-dev/src/ksp/pc/impls/gamg>hg strip 22487:26ffb9eef17f
>>>> >>>> hg: unknown command 'strip'
>>>> >>>> 'strip' is provided by the following extension:
>>>> >>>>
>>>> >>>> mq manage a stack of patches
>>>> >>>>
>>>> >>>> use "hg help extensions" for information on enabling extensions
>>>> >>>>
>>>> >>>> But have not figured out how to load extensions.
>>>> >>>>
>>>> >>>> Mark
>>>> >>>>
>>>> >>>> On Mar 14, 2012, at 12:54 PM, John Mousel wrote:
>>>> >>>>
>>>> >>>>> Mark,
>>>> >>>>>
>>>> >>>>> I have a non-symmetric matrix. I am running with the following options.
>>>> >>>>>
>>>> >>>>> -pc_type gamg -pc_gamg_sym_graph -ksp_monitor_true_residual
>>>> >>>>>
>>>> >>>>> and with the inclusion of -pc_gamg_sym_graph, I get a new malloc error:
>>>> >>>>>
>>>> >>>>>
>>>> >>>>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
>>>> >>>>> [0]PETSC ERROR: Argument out of range!
>>>> >>>>> [0]PETSC ERROR: New nonzero at (5150,9319) caused a malloc!
>>>> >>>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>>> >>>>> [0]PETSC ERROR: Petsc Development HG revision: 587b25035091aaa309c87c90ac64c13408ecf34e HG Date: Wed Mar 14 09:22:54 2012 -0500
>>>> >>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>>>> >>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>>> >>>>> [0]PETSC ERROR: See docs/index.html for manual pages.
>>>> >>>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>>> >>>>> [0]PETSC ERROR: ../JohnRepo/VFOLD_exe on a linux-deb named wv.iihr.uiowa.edu by jmousel Wed Mar 14 11:51:35 2012
>>>> >>>>> [0]PETSC ERROR: Libraries linked from /home/jmousel/NumericalLibraries/petsc-hg/petsc-dev/linux-debug/lib
>>>> >>>>> [0]PETSC ERROR: Configure run at Wed Mar 14 09:46:39 2012
>>>> >>>>> [0]PETSC ERROR: Configure options --download-blacs=1 --download-hypre=1 --download-metis=1 --download-ml=1 --download-mpich=1 --download-parmetis=1 --download-scalapack=1 --with-blas-lapack-dir=/opt/intel11/mkl/lib/em64t --with-cc=gcc --with-cmake=/usr/local/bin/cmake --with-cxx=g++ --with-fc=ifort PETSC_ARCH=linux-debug
>>>> >>>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>>> >>>>> [0]PETSC ERROR: MatSetValues_MPIAIJ() line 506 in /home/jmousel/NumericalLibraries/petsc-hg/petsc-dev/src/mat/impls/aij/mpi/mpiaij.c
>>>> >>>>> [0]PETSC ERROR: MatSetValues() line 1141 in /home/jmousel/NumericalLibraries/petsc-hg/petsc-dev/src/mat/interface/matrix.c
>>>> >>>>> [0]PETSC ERROR: scaleFilterGraph() line 155 in /home/jmousel/NumericalLibraries/petsc-hg/petsc-dev/src/ksp/pc/impls/gamg/tools.c
>>>> >>>>> [0]PETSC ERROR: PCGAMGgraph_AGG() line 865 in /home/jmousel/NumericalLibraries/petsc-hg/petsc-dev/src/ksp/pc/impls/gamg/agg.c
>>>> >>>>> [0]PETSC ERROR: PCSetUp_GAMG() line 516 in /home/jmousel/NumericalLibraries/petsc-hg/petsc-dev/src/ksp/pc/impls/gamg/gamg.c
>>>> >>>>> [0]PETSC ERROR: PCSetUp() line 832 in /home/jmousel/NumericalLibraries/petsc-hg/petsc-dev/src/ksp/pc/interface/precon.c
>>>> >>>>> [0]PETSC ERROR: KSPSetUp() line 261 in /home/jmousel/NumericalLibraries/petsc-hg/petsc-dev/src/ksp/ksp/interface/itfunc.c
>>>> >>>>> [0]PETSC ERROR: KSPSolve() line 385 in /home/jmousel/NumericalLibraries/petsc-hg/petsc-dev/src/ksp/ksp/interface/itfunc.c
>>>> >>>>>
>>>> >>>>>
>>>> >>>>> John
>>>> >>>>>
>>>> >>>>>
>>>> >>>>> On Wed, Mar 14, 2012 at 11:27 AM, Mark F. Adams <mark.adams at columbia.edu> wrote:
>>>> >>>>>
>>>> >>>>> On Mar 14, 2012, at 11:56 AM, John Mousel wrote:
>>>> >>>>>
>>>> >>>>>> Mark,
>>>> >>>>>>
>>>> >>>>>> The matrix is asymmetric. Does this require setting an option?
>>>> >>>>>
>>>> >>>>> Yes: -pc_gamg_sym_graph
>>>> >>>>>
>>>> >>>>> Mark
>>>> >>>>>
>>>> >>>>>> I pulled petsc-dev this morning, so I should have (at least close to) the latest code.
>>>> >>>>>>
>>>> >>>>>> John
>>>> >>>>>>
>>>> >>>>>> On Wed, Mar 14, 2012 at 10:54 AM, Mark F. Adams <mark.adams at columbia.edu> wrote:
>>>> >>>>>>
>>>> >>>>>> On Mar 14, 2012, at 11:08 AM, John Mousel wrote:
>>>> >>>>>>
>>>> >>>>>>> I'm getting the following error when using GAMG.
>>>> >>>>>>>
>>>> >>>>>>> petsc-dev/src/ksp/pc/impls/gamg/agg.c:508: smoothAggs: Assertion `sgid==-1' failed.
>>>> >>>>>>
>>>> >>>>>> Is it possible that your matrix is structurally asymmetric?
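>>>> >>>>>> A quick way to check, if it is implemented for your matrix format (a sketch; assumes A is your assembled system matrix and ierr is declared):
>>>> >>>>>>
>>>> >>>>>> PetscBool symm;
>>>> >>>>>> ierr = MatIsStructurallySymmetric(A,&symm);CHKERRQ(ierr); /* compares the nonzero pattern of A with A^T */
>>>> >>>>>> ierr = PetscPrintf(PETSC_COMM_WORLD,"structurally symmetric? %d\n",(int)symm);CHKERRQ(ierr);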
>>>> >>>>>>
>>>> >>>>>> This code is evolving fast, so you will need to move to the dev version if you are not already using it. (I think I fixed a bug that hit this assert.)
>>>> >>>>>>
>>>> >>>>>>>
>>>> >>>>>>> When I try to alter the type of aggregation at the command line using -pc_gamg_type pa, I'm getting
>>>> >>>>>>>
>>>> >>>>>>> [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message ------------------------------------
>>>> >>>>>>> [1]PETSC ERROR: Unknown type. Check for miss-spelling or missing external package needed for type:
>>>> >>>>>>> see http://www.mcs.anl.gov/petsc/documentation/installation.html#external!
>>>> >>>>>>> [1]PETSC ERROR: Unknown GAMG type pa given!
>>>> >>>>>>>
>>>> >>>>>>> Has there been a change in the aggregation options? I just pulled petsc-dev this morning.
>>>> >>>>>>>
>>>> >>>>>>
>>>> >>>>>> Yes, this option is gone now. You can use -pc_gamg_type agg for now.
>>>> >>>>>>
>>>> >>>>>> Mark
>>>> >>>>>>
>>>> >>>>>>> John
>>>> >>>>>>
>>>> >>>>>>
>>>> >>>>>
>>>> >>>>>
>>>> >>>>
>>>> >>>>
>>>> >>>
>>>> >>>
>>>> >>
>>>> >>
>>>> >
>>>> >
>>>>
>>>>
>>>
>>>
>>> <GAMG_kspview.txt><ML_kspview.txt><HYPRE_kspview.txt>
>>
>>
>> <GAMG_kspview.txt>
>
>
>
>
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener