[petsc-dev] Should -pc_type mg be a linear preconditioner by default?

Jed Brown jedbrown at mcs.anl.gov
Fri Oct 28 15:14:10 CDT 2011


On Thu, Oct 27, 2011 at 18:14, Barry Smith <bsmith at mcs.anl.gov> wrote:

> Why not use Eisenstat-trick SOR with Chebychev? Much cheaper than SOR plus
> a separate matrix-vector product.
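
(Context for other readers, stated loosely: writing A = L + D + U,
Eisenstat's trick applies the SSOR-preconditioned operator using only the
two triangular solves with D/omega + L and D/omega + U, folding the
multiply by A into them, so each iteration costs about one SSOR
application instead of an SSOR application plus a separate matvec.
PCEISENSTAT implements this.)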


Hmm, the MatShell tricks that Eisenstat is doing are interfering with the
eigenvalue estimation in Chebychev. I'm not sure how
PC{Pre,Post}Solve_Eisenstat is supposed to interact with KSPChebychev's
reuse of the same preconditioning context for eigenvalue estimation. Could
you have a look at that?
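
In API terms, the failing combination is roughly the following (a minimal
stand-alone sketch of my own, not ex2 itself; a 1-D Laplacian stands in
for ex2's 2-D operator, and I set the estimation option through the
options database rather than relying on the exact C interface):

#include <petscksp.h>

/* Sketch: Chebychev with eigenvalue estimation on top of the Eisenstat PC */
int main(int argc,char **argv)
{
  Mat            A;
  Vec            x,b;
  KSP            ksp;
  PC             pc;
  PetscInt       i,n = 100;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc,&argv,0,0);CHKERRQ(ierr);
  ierr = MatCreateSeqAIJ(PETSC_COMM_SELF,n,n,3,PETSC_NULL,&A);CHKERRQ(ierr);
  for (i=0; i<n; i++) {                          /* tridiagonal [-1 2 -1] */
    if (i>0)   {ierr = MatSetValue(A,i,i-1,-1.0,INSERT_VALUES);CHKERRQ(ierr);}
    if (i<n-1) {ierr = MatSetValue(A,i,i+1,-1.0,INSERT_VALUES);CHKERRQ(ierr);}
    ierr = MatSetValue(A,i,i,2.0,INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatGetVecs(A,&x,&b);CHKERRQ(ierr);
  ierr = VecSet(b,1.0);CHKERRQ(ierr);

  ierr = PetscOptionsSetValue("-ksp_chebychev_estimate_eigenvalues","1,0,0,1.1");CHKERRQ(ierr);
  ierr = KSPCreate(PETSC_COMM_SELF,&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN);CHKERRQ(ierr);
  ierr = KSPSetType(ksp,KSPCHEBYCHEV);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,PCEISENSTAT);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);   /* picks up the estimation option */
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);        /* dies in PCPreSolve_Eisenstat() */

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}

The command-line equivalent with ex2: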

$ ./ex2 -ksp_type chebychev -ksp_chebychev_estimate_eigenvalues 1,0,0,1.1
-pc_type eisenstat
[0]PETSC ERROR: --------------------- Error Message
------------------------------------
[0]PETSC ERROR: No support for this operation for this object type!
[0]PETSC ERROR: Cannot have different mat and pmat!
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Development HG revision:
711caad7ce0e7d4332772ac626a1541284bd2edb  HG Date: Thu Oct 20 09:59:19 2011
-0700
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: ./ex2 on a ompi named batura by jed Fri Oct 28 14:06:44 2011
[0]PETSC ERROR: Libraries linked from /home/jed/petsc/ompi/lib
[0]PETSC ERROR: Configure run at Thu Oct 20 14:11:15 2011
[0]PETSC ERROR: Configure options --download-blacs --download-hypre
--download-ml --download-mumps --download-parmetis --download-parms
--download-scalapack --download-spai --download-sundials --download-superlu
--download-superlu_dist --download-umfpack --with-parmetis-dir=/usr
--with-shared-libraries --with-single-library=0 --with-sowing
PETSC_ARCH=ompi
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: PCPreSolve_Eisenstat() line 62 in
/home/jed/petsc/src/ksp/pc/impls/eisens/eisen.c
[0]PETSC ERROR: PCPreSolve() line 1381 in
/home/jed/petsc/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: KSPSolve() line 410 in
/home/jed/petsc/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: KSPSolve_Chebychev() line 209 in
/home/jed/petsc/src/ksp/ksp/impls/cheby/cheby.c
[0]PETSC ERROR: KSPSolve() line 429 in
/home/jed/petsc/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: main() line 199 in src/ksp/ksp/examples/tutorials/ex2.c
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 56.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------


> In fact, why not do that also in the symmetric case?


It's unnecessary; Richardson/SOR and Chebychev/Jacobi are both fine. The
latter is much nicer if we have a GPU. But I suppose we could do both just
to make things more predictable.
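
For concreteness, the two smoother configurations would be selected with
something like (option names as I recall them; untested):

$ ./ex2 -pc_type mg -mg_levels_ksp_type richardson -mg_levels_pc_type sor
$ ./ex2 -pc_type mg -mg_levels_ksp_type chebychev -mg_levels_pc_type jacobi

and Barry's suggestion above would be -mg_levels_ksp_type chebychev
-mg_levels_pc_type eisenstat, once the eigenvalue-estimation issue is
sorted out.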