[petsc-dev] Seeking Advice on Petsc Preconditioners to Try
Dave Nystrom
dnystrom1 at comcast.net
Wed Nov 30 00:41:41 CST 2011
I have a linear system in a code that I have interfaced to petsc, and solving
it takes about 80 percent of the run time per timestep. The matrix of this
linear system is symmetric and block banded, with 2x2 blocks. It looks as
follows:
  1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0
1 X X                     Y Y Y
2 X X X                     Y Y Y
3   X X X                     Y Y Y
4     X X X                     Y Y Y
5       X X X                     Y Y Y
6         X X X                     Y Y Y
7           X X X                     Y Y Y
8             X X X                     Y Y Y
9               X X X                     Y Y Y
0                 X X X                     Y Y Y
1                   X X X                     Y Y Y
2                     X X X                     Y Y Y
3 Z                     X X X                     Y Y Y
4 Z Z                     X X X                     Y Y Y
5 Z Z Z                     X X X                     Y Y Y
6   Z Z Z                     X X X                     Y Y Y
7     Z Z Z                     X X X                     Y Y Y
8       Z Z Z                     X X X                     Y Y Y
9         Z Z Z                     X X X                     Y Y Y
0           Z Z Z                     X X X                     Y Y Y
(Only the first 20 block columns are labeled; the Y band continues to the
right of the labeled region in the later rows.)
So in my diagram above, X, Y and Z are 2x2 blocks. The symmetry of the
matrix requires that X_ij = transpose(X_ji) and Y_ij = transpose(Z_ji). So
far, I have just input this matrix to petsc without indicating that it was
block banded with 2x2 blocks. I have also not told petsc that the matrix is
symmetric. And I have allowed petsc to decide the best way to store the
matrix.
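For what it's worth, my understanding is that telling petsc about the 2x2
blocks and the symmetry would look roughly like the sketch below, using sbaij
(or baij) with block size 2. This is not what my code currently does, the
function name is made up, and the preallocation count is just a rough guess
from the band structure above, so please correct me if I have any of this
wrong:

  #include <petscmat.h>

  /* Rough sketch: create a symmetric matrix of 2x2 blocks.  With MATSBAIJ
     only the upper triangle is stored, so only the X and Y blocks would be
     inserted; MATBAIJ would store the Z blocks explicitly as well. */
  PetscErrorCode CreateBlockMatrix(MPI_Comm comm, PetscInt n, Mat *A)
  {
    PetscErrorCode ierr;

    ierr = MatCreate(comm, A);CHKERRQ(ierr);
    ierr = MatSetSizes(*A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
    ierr = MatSetType(*A, MATSBAIJ);CHKERRQ(ierr);           /* or MATBAIJ */
    ierr = MatSetBlockSize(*A, 2);CHKERRQ(ierr);
    ierr = MatSetOption(*A, MAT_SYMMETRIC, PETSC_TRUE);CHKERRQ(ierr);
    /* Roughly 5 block nonzeros per block row in the upper triangle for this
       band structure; the MPI preallocation variant would be needed for a
       parallel run. */
    ierr = MatSeqSBAIJSetPreallocation(*A, 2, 5, NULL);CHKERRQ(ierr);
    /* ... then insert the 2x2 blocks with MatSetValuesBlocked() ... */
    return 0;
  }

I assume the storage format could also be picked from the command line with
-mat_type sbaij when the type is left to MatSetFromOptions, as I do now.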
I can solve this linear system over the course of a run using -ksp_type
preonly -pc_type lu, but that will not scale very well to the larger problems
that I want to solve. I can also solve it using -ksp_type cg -pc_type jacobi
-vec_type cusp -mat_type aijcusp, although over the course of a run the
iteration count ranges from 771 to 47300. I have also tried sacusp, ainvcusp,
sacusppoly, ilu(k) and icc(k) with k=0. The sacusppoly preconditioner fails
because of a Thrust error related to an invalid device pointer, if I am
remembering correctly. I reported this problem to petsc-maint a while back
and have also reported it on the cusp bug tracker, but it does not appear
that anyone has looked into the bug yet. The other preconditioners, sacusp,
ilu(k) and icc(k), do not converge to a solution and those runs fail.
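For reference, the two working configurations amount to command lines roughly
like the ones below. The executable name is just a placeholder, and the last
two options on the second line are only there to report the residuals and the
convergence reason:

  ./mycode -ksp_type preonly -pc_type lu

  ./mycode -ksp_type cg -pc_type jacobi -vec_type cusp -mat_type aijcusp \
           -ksp_monitor_true_residual -ksp_converged_reason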
I'm wondering if there are other preconditioners available in petsc that I
should try. The only third-party package I have tried is the txpetscgpu
package. I have not tried hypre or any of the multigrid preconditioners yet,
and I'm not sure how difficult it is to try those packages. So far I have
not found a preconditioner in petsc that solves this problem robustly, and I
would be interested in any suggestions that anyone might have of things to
try.
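In case it is useful for the discussion, my understanding is that trying
hypre's algebraic multigrid from petsc would just be a matter of options like
the following, assuming petsc was configured with --download-hypre; please
correct me if I have this wrong:

  ./mycode -ksp_type cg -pc_type hypre -pc_hypre_type boomeramg \
           -ksp_converged_reason

Similarly, I believe -pc_type ml would select the ML smoothed aggregation
package if petsc was configured with --download-ml.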
I'd be happy to provide additional info. I am planning on packaging up a
couple of examples of the matrix and rhs for the people I am interacting with
at Tech-X and EMPhotonics, so I could provide those matrix examples to this
forum as well if anyone wants a copy.
Thanks,
Dave
--
Dave Nystrom
phone: 505-661-9943 (home office)
505-662-6893 (home)
skype: dave.nystrom76
email: dnystrom1 at comcast.net
smail: 219 Loma del Escolar
Los Alamos, NM 87544