[petsc-users] Parallel Incomplete Choleski Factorization

Michele Rosso mrosso at uci.edu
Tue Jul 17 12:43:52 CDT 2012


Hi Hong,

I am having some problems with the block Jacobi preconditioner.
I am solving a 3D Poisson equation with periodic BCs, discretized
using finite differences (7-point stencil).
With periodic BCs every row of the matrix sums to zero, so the problem is
singular and the nullspace (the constant vector) has to be removed.
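For reference, this is how I convinced myself the matrix is singular (just a
sketch, using the same variable names as in the fragment below; the names
ones and Aones are made up): applying A to the constant vector should give
(numerically) zero.

     ! Verify that the constant vector is in the nullspace of A:
     ! every row of the periodic 7-point stencil sums to zero,
     ! so ||A*ones|| should be ~ 0 up to roundoff.
     Vec                ones, Aones
     PetscScalar        one
     PetscReal          nrm
     one = 1.0d0
     call VecDuplicate(x, ones, petsc_err)
     call VecDuplicate(x, Aones, petsc_err)
     call VecSet(ones, one, petsc_err)
     call MatMult(A, ones, Aones, petsc_err)
     call VecNorm(Aones, NORM_2, nrm, petsc_err)
     call VecDestroy(ones, petsc_err)
     call VecDestroy(Aones, petsc_err)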
If I solve with PCG + the Jacobi preconditioner, the results are fine.
If I use PCG + the block Jacobi preconditioner + ICC on each block, the
results are fine on the majority of the processors,
but on a few of them the error is very large.
Do you have any idea/suggestion on how to fix this problem?
This is the fragment of code I am using (PETSc 3.1 and Fortran 90):

     PetscErrorCode     petsc_err
     Mat                A
     PC                 pc, subpc
     KSP                ksp
     KSP                subksp(1)
     Vec                b, x
     MatNullSpace       nullspace
     PetscReal          tol
     :
     :
     :
     call KSPCreate(PETSC_COMM_WORLD,ksp,petsc_err)
     call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,petsc_err)
     call KSPGetPC(ksp,pc,petsc_err)
     call PCSetType(pc,PCBJACOBI,petsc_err)
     call KSPSetUp(ksp,petsc_err)

     ! KSP context for each block
     call PCBJacobiGetSubKSP(pc,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER, &
          & subksp(1),petsc_err)

     call KSPGetPC(subksp(1),subpc,petsc_err)
     call PCSetType(subpc,PCICC,petsc_err)
     call KSPSetType(subksp(1),KSPCG, petsc_err)
     call KSPSetTolerances(subksp(1),tol,PETSC_DEFAULT_DOUBLE_PRECISION, &
          & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,petsc_err)

     ! Remove the nullspace (the constant vector) from the singular system.
     ! Note: the Fortran null-object argument is PETSC_NULL_OBJECT.
     call MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL_OBJECT, &
          & nullspace,petsc_err)
     call KSPSetNullSpace(ksp,nullspace,petsc_err)
     call MatNullSpaceRemove(nullspace,b,PETSC_NULL_OBJECT,petsc_err)

     call KSPSolve(ksp,b,x,petsc_err)
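
For completeness, the same solver/preconditioner combination can also be
selected at run time with command-line options, as Hong suggested below
(this assumes the code also calls KSPSetFromOptions(ksp,petsc_err) before
KSPSolve; the executable name poisson3d is made up):

     mpiexec -n 8 ./poisson3d -ksp_type cg -pc_type bjacobi \
         -sub_ksp_type preonly -sub_pc_type icc \
         -ksp_monitor_true_residual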



Thank you,

Michele

On 07/13/2012 12:14 PM, Hong Zhang wrote:
> Michele :
>
>
>     I need to use the ICC factorization as preconditioner, but I
>     noticed that no parallel version is supported.
>     Is that correct?
>
> Correct.
>
>     If so, is there a work around, like building  the preconditioner
>     "by hand" by using PETSc functions?
>
> You may try block jacobi with icc in the blocks  '-ksp_type cg 
> -pc_type bjacobi -sub_pc_type icc'
>
> Hong
>
>

