[petsc-dev] Seeking Advice on Petsc Preconditioners to Try

Dave Nystrom Dave.Nystrom at tachyonlogic.com
Fri Dec 2 18:18:31 CST 2011


Hi Barry,

Barry Smith writes:
 > Dave,
 > 
 > Does this come from a structured 2d grid?  If so, then in addition to
 > algebraic multigrid you could consider geometric multigrid which could
 > work very well.

Yes, this problem comes from spatial discretization on a uniform structured
2d grid.  Some of my other comments on this thread today may be of interest
to you.

 > Do you want to run in parallel?  How are you generating the matrix and
 > managing the mesh?  If you use the DMDA object it may be easy to use
 > geometric multigrid; see src/ksp/ksp/examples/tutorials/ex34.c for a
 > linear problem and src/snes/examples/tutorials/ex5.c for an example
 > that uses DMDA.

In principle, we want to run in parallel, but we could get a lot of mileage
out of improving our single-node performance first.  I'm not sure whether we
will need to extend this code to 3d; if we stick with 2d, we might be able to
solve problems with sufficient resolution on a single node.  The immediate
focus is to see what we can do on a single node without MPI.  However, one of
the motivations for interfacing to petsc was to be able to leverage its
extensive MPI capabilities.

I have not yet learned about DMDA.  Petsc is such a large and extensive
toolbox that I have not gotten that far yet.  Also, we have been focused on
what we can do on a single multi-core and/or multi-socket node, with and
without GPU acceleration.  I have been encouraged by some of the results I
have gotten with GPU acceleration on the src/ksp/ksp/examples/tutorials/ex2.c
problem.  Anyway, it sounds like I should be investigating geometric
multigrid, according to you and Mark.
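
From a first look at ex34.c, here is what I think the basic setup would look
like for our problem.  This is an untested sketch: the 65x65 grid size is a
placeholder, and ComputeMatrix/ComputeRHS are assembly routines I would still
have to write.

  #include <petscdmda.h>
  #include <petscksp.h>

  /* User-supplied assembly callbacks, as in ex34.c (not shown here). */
  extern PetscErrorCode ComputeMatrix(KSP,Mat,Mat,MatStructure*,void*);
  extern PetscErrorCode ComputeRHS(KSP,Vec,void*);

  int main(int argc,char **argv)
  {
    DM  da;
    KSP ksp;

    PetscInitialize(&argc,&argv,(char*)0,(char*)0);
    /* dof = 2 matches our 2x2 blocks; the grid sizes are placeholders */
    DMDACreate2d(PETSC_COMM_WORLD,DMDA_BOUNDARY_NONE,DMDA_BOUNDARY_NONE,
                 DMDA_STENCIL_STAR,65,65,PETSC_DECIDE,PETSC_DECIDE,
                 2,1,PETSC_NULL,PETSC_NULL,&da);
    KSPCreate(PETSC_COMM_WORLD,&ksp);
    KSPSetDM(ksp,da);  /* lets -pc_type mg build the grid hierarchy */
    KSPSetComputeOperators(ksp,ComputeMatrix,PETSC_NULL);
    KSPSetComputeRHS(ksp,ComputeRHS,PETSC_NULL);
    KSPSetFromOptions(ksp);
    KSPSolve(ksp,PETSC_NULL,PETSC_NULL);
    KSPDestroy(&ksp);
    DMDestroy(&da);
    PetscFinalize();
    return 0;
  }

If I am reading the manual pages correctly, something like -ksp_type cg
-pc_type mg -pc_mg_levels 4 would then select geometric multigrid on that
hierarchy.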

Thanks,

Dave

 > Barry
 > 
 > On Nov 30, 2011, at 7:15 AM, Matthew Knepley wrote:
 > 
 > > On Wed, Nov 30, 2011 at 12:41 AM, Dave Nystrom <dnystrom1 at comcast.net> wrote:
 > > I have a linear system in a code that I have interfaced to petsc that is
 > > taking about 80 percent of the run time per timestep.  This linear system is
 > > a symmetric block banded matrix where the blocks are 2x2.  The matrix looks
 > > as follows:
 > > 
 > >  1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0
 > > 1X X                     Y Y Y
 > > 2X X X                     Y Y Y
 > > 3  X X X                     Y Y Y
 > > 4    X X X                     Y Y Y
 > > 5      X X X                     Y Y Y
 > > 6        X X X                     Y Y Y
 > > 7          X X X                     Y Y Y
 > > 8            X X X                     Y Y Y
 > > 9              X X X                     Y Y Y
 > > 0                X X X                     Y Y Y
 > > 1                  X X X                     Y Y Y
 > > 2                    X X X                     Y Y Y
 > > 3Z                     X X X                     Y Y Y
 > > 4Z Z                     X X X                     Y Y Y
 > > 5Z Z Z                     X X X                     Y Y Y
 > > 6  Z Z Z                     X X X                     Y Y Y
 > > 7    Z Z Z                     X X X                     Y Y Y
 > > 8      Z Z Z                     X X X                     Y Y Y
 > > 9        Z Z Z                     X X X                     Y Y Y
 > > 0          Z Z Z                     X X X                     Y Y Y
 > > 
 > > So in my diagram above, X, Y and Z are 2x2 blocks.  The symmetry of the
 > > matrix requires that X_ij = transpose(X_ji) and Y_ij = transpose(Z_ji).  So
 > > far, I have just input this matrix to petsc without indicating that it was
 > > block banded with 2x2 blocks.  I have also not told petsc that the matrix is
 > > symmetric.  And I have allowed petsc to decide the best way to store the
 > > matrix.
 > > 
 > > I can solve this linear system over the course of a run using -ksp_type
 > > preonly -pc_type lu.  But that will not scale very well to the larger
 > > problems that I want to solve.  I can also solve this system over the
 > > course of a run using -ksp_type cg -pc_type jacobi -vec_type cusp
 > > -mat_type aijcusp.  However, over the course of a run, the iteration
 > > count ranges from 771 to 47300.  I have also tried sacusp, ainvcusp,
 > > sacusppoly, ilu(k) and icc(k) with k=0.  The sacusppoly preconditioner
 > > fails because of a thrust error related to an invalid device pointer, if
 > > I am remembering correctly.  I reported this problem to petsc-maint a
 > > while back and have also reported it on the cusp bug tracker, but it
 > > does not appear that anyone has looked into the bug yet.  The remaining
 > > preconditioners, sacusp, ilu(k) and icc(k), do not converge to a
 > > solution and the runs fail.
 > > 
 > > All preconditioners are custom.  Have you done a literature search for
 > > PCs known to work for this problem?  Can you say anything about the
 > > spectrum of the operator?  Its conditioning?  What is the principal
 > > symbol (if it is a PDE)?  The pattern alone is not enough to recommend
 > > a PC.
 > > 
 > >    Matt
 > >  
 > > I'm wondering whether there are other preconditioners in petsc that I
 > > should try.  The only third party package that I have tried is the
 > > txpetscgpu package.  I have not tried hypre or any of the multigrid
 > > preconditioners yet, and I'm not sure how difficult they are to try.
 > > Anyway, so far I have not found a preconditioner available in petsc that
 > > provides a robust solution to this problem, and I would be interested in
 > > any suggestions of things to try.
 > > 
 > > I'd be happy to provide additional info.  I am planning on packaging up
 > > a couple of examples of the matrix and rhs for the people I am
 > > interacting with at Tech-X and EMPhotonics, so I could provide the
 > > matrix examples to this list as well if anyone wants a copy.
 > > 
 > > Thanks,
 > > 
 > > Dave
 > > 
 > > --
 > > Dave Nystrom
 > > 
 > > phone: 505-661-9943 (home office)
 > >       505-662-6893 (home)
 > > skype: dave.nystrom76
 > > email: dnystrom1 at comcast.net
 > > smail: 219 Loma del Escolar
 > >       Los Alamos, NM 87544
 > > 
 > > 
 > > 
 > > -- 
 > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
 > > -- Norbert Wiener
 > 
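
P.S.  Regarding the matrix layout in the quoted message: for what it's worth,
here is roughly how I understand one would declare the 2x2 block structure
and the symmetry to petsc using a MATSBAIJ matrix, which stores only the
upper triangle.  This is an untested sketch; nb, the block indices, the
values, and the preallocation count of 5 block nonzeros per block row are
all placeholders.

  Mat         A;
  PetscInt    nb = 400;                    /* number of 2x2 block rows */
  PetscInt    i = 0, j = 0;                /* block row/column; SBAIJ wants j >= i */
  PetscScalar v[4] = {4.0,-1.0,-1.0,4.0};  /* one 2x2 block, row-major */

  /* bs = 2 declares the blocks; only upper-triangular blocks are stored */
  MatCreateSeqSBAIJ(PETSC_COMM_SELF,2,2*nb,2*nb,5,PETSC_NULL,&A);
  MatSetValuesBlocked(A,1,&i,1,&j,v,INSERT_VALUES);
  MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);

On the algebraic multigrid side, I gather that -pc_type hypre -pc_hypre_type
boomeramg is also easy to try once petsc has been configured with
--download-hypre=1.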


