[petsc-dev] Seeking Advice on Petsc Preconditioners to Try

Mark F. Adams mark.adams at columbia.edu
Tue Dec 20 12:59:19 CST 2011


On Dec 20, 2011, at 10:33 AM, Dave Nystrom wrote:

> Hi Mark,
> 
> I would like to try GAMG on some of my linear solves.  Could you suggest how
> to get started?  Is it more complicated than something like:
> 
> -ksp_type cg -pc_type gamg

This is a good start.  For scalar SPD problems, '-pc_gamg_type sa' is good (this might be the default; use -help to see).
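
For a first experiment, a fuller option set could look like the following (a sketch only; -ksp_monitor and -ksp_converged_reason are just diagnostics, and the exact GAMG defaults depend on your PETSc version, so check -help):

   -ksp_type cg -pc_type gamg -pc_gamg_type sa -ksp_monitor -ksp_converged_reason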

Mark

> 
> I'm guessing I should first try it on one of my easier linear solves.  I have
> 5 of them that would have a block size of 1.  Are the other GAMG option
> defaults good to start with or should I be trying to configure them as well?
> If so, I'm not familiar enough with multigrid to know off hand how to do
> that.
> 
> Thanks,
> 
> Dave
> 
> Mark F. Adams writes:
>> 
>> On Dec 2, 2011, at 6:06 PM, Dave Nystrom wrote:
>> 
>>> Mark F. Adams writes:
>>>> It sounds like you have a symmetric positive definite system like du/dt -
>>>> div(alpha(x) grad) u.  The du/dt term makes the system easier to solve.
>>>> I'm guessing your hard system does not have this mass term and so is
>>>> purely elliptic.  Multigrid is well suited for this type of problem, but
>>>> the vector nature requires some thought.  You could use PETSc AMG -pc_type
>>>> gamg but you need to tell it that you have a system of two dof/vertex.
>>>> You can do that with something like:
>>>> 
>>>> ierr = MatSetBlockSize( mat, 2 );      CHKERRQ(ierr);
>>>> 
>>>> For the best results from GAMG you need to give it null space information
>>>> but we can worry about that later.
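>>>> 
>>>> For context, a minimal sketch of where that call fits during matrix creation
>>>> (the matrix name A and global size n here are placeholders, not from your
>>>> code):
>>>> 
>>>> Mat A;
>>>> ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr);
>>>> ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n); CHKERRQ(ierr);
>>>> ierr = MatSetType(A, MATAIJ); CHKERRQ(ierr);
>>>> ierr = MatSetBlockSize(A, 2); CHKERRQ(ierr);   /* two dof per vertex */
>>>> /* ... set values with MatSetValuesBlocked() and assemble as usual ... */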
>>> 
>>> Hi Mark,
>>> 
>>> I have been interested in trying some of the multigrid capabilities in
>>> petsc.  I'm not sure I remember seeing GAMG so I guess I should go look for
>>> that.  
>> 
>> GAMG is pretty new.
>> 
>>> I have tried sacusp and sacusppoly but did not get good results on
>>> this particular linear system.  
>>> In particular, sacusppoly seems broken.  I can't get it to work even with
>>> the petsc src/ksp/ksp/examples/tutorials/ex2.c example.  Thrust complains
>>> about an invalid device pointer, I believe.  Anyway, I can get the other
>>> preconditioners to work just fine on this petsc example problem.  When I
>>> try sacusp on this matrix for the case of generating a rhs from a known
>>> solution vector, the computed solution seems to diverge from the exact
>>> solution.  We also have an interface to an external agmg package which is
>>> not able to solve this problem but works well on the other 5 linear solves.
>>> So I'd like to try more from the multigrid toolbox but do not know much
>>> about how to supply the extra stuff that these packages often need.
>>> 
>>> So, it sounds like you are suggesting that I try gamg and that I could at
>>> least try it out without having to initially supply lots of additional info.
>>> So I will take a look at gamg.
>>> 
>> 
>> There are many things that can break a solver, but most solvers want to know that it's a system, so if you can set the block size and try GAMG, that would be a good start.
>> 
>> Mark
>> 
>>> Thanks,
>>> 
>>> Dave
>>> 
>>>> Mark
>>>> 
>>>> On Nov 30, 2011, at 8:15 AM, Matthew Knepley wrote:
>>>> 
>>>>> On Wed, Nov 30, 2011 at 12:41 AM, Dave Nystrom <dnystrom1 at comcast.net> wrote:
>>>>> I have a linear system in a code that I have interfaced to petsc that is
>>>>> taking about 80 percent of the run time per timestep.  This linear system
>>>>> has a symmetric block banded matrix where the blocks are 2x2.  The matrix
>>>>> looks as follows:
>>>>> 
>>>>> 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0
>>>>> 1X X                     Y Y Y
>>>>> 2X X X                     Y Y Y
>>>>> 3  X X X                     Y Y Y
>>>>> 4    X X X                     Y Y Y
>>>>> 5      X X X                     Y Y Y
>>>>> 6        X X X                     Y Y Y
>>>>> 7          X X X                     Y Y Y
>>>>> 8            X X X                     Y Y Y
>>>>> 9              X X X                     Y Y Y
>>>>> 0                X X X                     Y Y Y
>>>>> 1                  X X X                     Y Y Y
>>>>> 2                    X X X                     Y Y Y
>>>>> 3Z                     X X X                     Y Y Y
>>>>> 4Z Z                     X X X                     Y Y Y
>>>>> 5Z Z Z                     X X X                     Y Y Y
>>>>> 6  Z Z Z                     X X X                     Y Y Y
>>>>> 7    Z Z Z                     X X X                     Y Y Y
>>>>> 8      Z Z Z                     X X X                     Y Y Y
>>>>> 9        Z Z Z                     X X X                     Y Y Y
>>>>> 0          Z Z Z                     X X X                     Y Y Y
>>>>> 
>>>>> So in my diagram above, X, Y and Z are 2x2 blocks.  The symmetry of the
>>>>> matrix requires that X_ij = transpose(X_ji) and Y_ij = transpose(Z_ji).  So
>>>>> far, I have just input this matrix to petsc without indicating that it was
>>>>> block banded with 2x2 blocks.  I have also not told petsc that the matrix is
>>>>> symmetric.  And I have allowed petsc to decide the best way to store the
>>>>> matrix.
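>>>>> 
>>>>> If it helps, my understanding is that telling petsc both of those facts is
>>>>> only a couple of calls, something like this (my own sketch, assuming the
>>>>> matrix is named A; corrections welcome):
>>>>> 
>>>>> ierr = MatSetBlockSize(A, 2); CHKERRQ(ierr);              /* 2x2 blocks */
>>>>> ierr = MatSetOption(A, MAT_SYMMETRIC, PETSC_TRUE); CHKERRQ(ierr);
>>>>> 
>>>>> and a symmetric storage format can be selected at run time with
>>>>> -mat_type sbaij, so that only the upper triangle is stored.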
>>>>> 
>>>>> I can solve this linear system over the course of a run using -ksp_type
>>>>> preonly -pc_type lu.  But that will not scale very well to larger problems
>>>>> that I want to solve.  I can also solve this system over the course of a run
>>>>> using -ksp_type cg -pc_type jacobi -vec_type cusp -mat_type aijcusp.
>>>>> However, over the course of a run, the iteration count ranges from 771 to
>>>>> 47300.  I have also tried sacusp, ainvcusp, sacusppoly, ilu(k) and icc(k)
>>>>> with k=0.  The sacusppoly preconditioner fails because of a thrust error
>>>>> related to an invalid device pointer, if I am remembering correctly.  I
>>>>> reported this problem to petsc-maint a while back and have also reported it
>>>>> on the cusp bug tracker, but it does not appear that anyone has really
>>>>> looked into it.  The other preconditioners, sacusp, ilu(k) and icc(k), do
>>>>> not converge to a solution, and those runs fail.
>>>>> 
>>>>> All preconditioners are custom.  Have you done a literature search for PCs
>>>>> known to work for this problem?  Can you say anything about the spectrum of
>>>>> the operator?  Its conditioning?  What is the principal symbol (if it's a
>>>>> PDE)?  The sparsity pattern alone is not enough to recommend a PC.
>>>>> 
>>>>>  Matt
>>>>> 
>>>>> I'm wondering if there are suggestions of other preconditioners in petsc that
>>>>> I should try.  The only third party package that I have tried is the
>>>>> txpetscgpu package.  I have not tried hypre or any of the multigrid
>>>>> preconditioners yet.  I'm not sure how difficult it is to try those
>>>>> packages.  Anyway, so far I have not found a preconditioner available in
>>>>> petsc that solves this problem robustly, and I would be interested in any
>>>>> suggestions of things to try.
>>>>> 
>>>>> I'd be happy to provide additional info and am planning on packaging up a
>>>>> couple of examples of the matrix and rhs for people I am interacting with at
>>>>> Tech-X and EMPhotonics.  So I'd be happy to provide the matrix examples for
>>>>> this forum as well if anyone wants a copy.
>>>>> 
>>>>> Thanks,
>>>>> 
>>>>> Dave
>>>>> 
>>>>> --
>>>>> Dave Nystrom
>>>>> 
>>>>> phone: 505-661-9943 (home office)
>>>>>     505-662-6893 (home)
>>>>> skype: dave.nystrom76
>>>>> email: dnystrom1 at comcast.net
>>>>> smail: 219 Loma del Escolar
>>>>>     Los Alamos, NM 87544
>>>>> 
>>>>> 
>>>>> 
>>>>> -- 
>>>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>>>>> -- Norbert Wiener
>>>> 
>>> 
>> 
> 
