Symmetric matrices.

billy at dem.uminho.pt
Mon Nov 13 20:00:27 CST 2006



Does the direct solver work in parallel? When I try it with more than one
processor, I get:

[0]PETSC ERROR: MatCholeskyFactorSymbolic() line 2319 in src/mat/interface/matrix.c
[0]PETSC ERROR: No support for this operation for this object type!
[0]PETSC ERROR: Mat type mpisbaij!
[0]PETSC ERROR: PCSetUp_Cholesky() line 237 in src/ksp/pc/impls/factor/cholesky/cholesky.c
[0]PETSC ERROR: PCSetUp() line 798 in src/ksp/pc/interface/precon.c
[0]PETSC ERROR: KSPSetUp() line 234 in src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: KSPSolve() line 334 in src/ksp/ksp/interface/itfunc.c
[1]PETSC ERROR: MatCholeskyFactorSymbolic() line 2319 in src/mat/interface/matrix.c
[1]PETSC ERROR: No support for this operation for this object type!
[1]PETSC ERROR: Mat type mpisbaij!
[1]PETSC ERROR: PCSetUp_Cholesky() line 237 in src/ksp/pc/impls/factor/cholesky/cholesky.c
[1]PETSC ERROR: PCSetUp() line 798 in src/ksp/pc/interface/precon.c
[1]PETSC ERROR: KSPSetUp() line 234 in src/ksp/ksp/interface/itfunc.c
[1]PETSC ERROR: KSPSolve() line 334 in src/ksp/ksp/interface/itfunc.c
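
For reference, a minimal sketch of the solver setup that produces the trace
above (A, b and x stand in for the already-assembled MPISBAIJ matrix and
vectors; error checking is omitted and exact call signatures vary between
PETSc versions):

  KSP ksp;
  PC  pc;

  KSPCreate(PETSC_COMM_WORLD,&ksp);
  KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
  KSPSetType(ksp,KSPPREONLY);   /* no Krylov iterations: just apply the factored PC */
  KSPGetPC(ksp,&pc);
  PCSetType(pc,PCCHOLESKY);     /* calls MatCholeskyFactorSymbolic() during PCSetUp() */
  KSPSetFromOptions(ksp);       /* lets -ksp_type / -pc_type override these at run time */
  KSPSolve(ksp,b,x);
  KSPDestroy(ksp);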


Billy.

Quoting Hong Zhang <hzhang at mcs.anl.gov>:

> 
> > Which solvers are more appropriate for symmetric matrices?
> 
> Iterative solvers: ksp_type cg, pc_type icc
> Direct solvers:    ksp_type preonly, pc_type cholesky
> 
> Non-symmetric solvers also work, but the above solvers
> are more efficient in general.
> 
> You can run PETSc code with various ksp/pc combinations,
> and use the option '-log_summary' to evaluate the performance.
> Use '-help' to see all solver options.
> 
> Hong
> 
> 
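
A run-time sketch of the combinations Hong suggests above (./app is a
placeholder for the application binary; the options are picked up through
KSPSetFromOptions):

  ./app -ksp_type cg -pc_type icc -log_summary      (iterative, symmetric)
  ./app -ksp_type preonly -pc_type cholesky          (direct)
  ./app -help                                        (list all solver options)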
