[petsc-users] Parallel Incomplete Cholesky Factorization

Hong Zhang hzhang at mcs.anl.gov
Tue Jul 17 13:03:32 CDT 2012


Michele:

>
> I have some problems with the block Jacobi preconditioner.
> I am solving a 3D Poisson equation with periodic BCs, discretized
> using finite differences (7-point stencil).
> Thus the problem is singular and the null space has to be removed.
>

For Poisson equations, a multigrid preconditioner should be the method of
choice.
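
A minimal sketch of that setup (assuming petsc-3.3, where the algebraic
multigrid type PCGAMG is available from Fortran; it reuses the variable
names from your code below and is not a verified drop-in):

    call KSPCreate(PETSC_COMM_WORLD,ksp,petsc_err)
    call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,petsc_err)
    call KSPSetType(ksp,KSPCG,petsc_err)
    call KSPGetPC(ksp,pc,petsc_err)
    ! algebraic multigrid; can also be chosen at runtime with -pc_type gamg
    call PCSetType(pc,PCGAMG,petsc_err)
    call KSPSetFromOptions(ksp,petsc_err)

The null space still has to be attached to the KSP, as your code already does.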

> If I solve with the PCG method + Jacobi preconditioner, the results are fine.
> If I use PCG + block Jacobi preconditioner + ICC on each block, the results
> are fine on the majority of the processors, but on a few of them the error
> is very large.
>

How do you know it is only a few of them?
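
For example, running with '-ksp_monitor_true_residual -ksp_converged_reason'
prints the true residual norm at each iteration and the reason the global
solve stopped, which is an easier check than inspecting the solution on
individual processors.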

> Do you have any ideas/suggestions on how to fix this problem?
> This is the fragment of code I am using (petsc 3.1 and Fortran 90):
>

Please update to petsc-3.3. petsc-3.1 is too old.

>
>     PetscErrorCode   petsc_err
>     Mat              A
>     PC               pc, subpc
>     KSP              ksp
>     KSP              subksp(1)
>     :
>     :
>     :
>     call KSPCreate(PETSC_COMM_WORLD,ksp,petsc_err)
>     call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,petsc_err)
>

call KSPSetType(ksp,KSPCG,petsc_err) ! the default type is GMRES; I guess you want CG

>     call KSPGetPC(ksp,pc,petsc_err)
>     call PCSetType(pc,PCBJACOBI,petsc_err)
> !   call KSPSetUp(ksp,petsc_err)   ! call this at the end, after all options are set
>
>     ! KSP context for each single block
>     call PCBJacobiGetSubKSP(pc,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,subksp(1),petsc_err)
>
>     call KSPGetPC(subksp(1),subpc,petsc_err)
>     call PCSetType(subpc,PCICC,petsc_err)
>


>     call KSPSetType(subksp(1),KSPPREONLY,petsc_err)
>


>     call KSPSetTolerances(subksp(1),tol,PETSC_DEFAULT_DOUBLE_PRECISION,&
>          & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,petsc_err)
>
>    ! Remove nullspace from the singular system (Check PETSC_NULL)
>     call MatNullSpaceCreate(MPI_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,nullspace,petsc_err)
>     call KSPSetNullSpace(ksp, nullspace, petsc_err)
>     call MatNullSpaceRemove(nullspace, b, PETSC_NULL,petsc_err)
>
>     call KSPSolve(ksp,b,x,petsc_err)
>
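
A side remark on the null-space lines above (a sketch of what I think is
intended; please check it against the manual): with PETSC_TRUE for the
constant and n = 0, the vector argument is unused, and from Fortran it is
normally passed as PETSC_NULL_OBJECT rather than PETSC_NULL, e.g.

    call MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL_OBJECT,nullspace,petsc_err)

Using PETSC_COMM_WORLD here also keeps the communicator consistent with the
other PETSc objects.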

I modified your code slightly. All these options can be provided at runtime:
'-ksp_type cg -pc_type bjacobi -sub_pc_type icc'
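
For the runtime options to be picked up, call KSPSetFromOptions() after
setting the operators and before the solve, e.g. (same variable names as
above):

    call KSPSetFromOptions(ksp,petsc_err)
    call KSPSolve(ksp,b,x,petsc_err)

and then run with something like
'mpiexec -n 8 ./your_code -ksp_type cg -pc_type bjacobi -sub_pc_type icc'
(the executable name is only a placeholder).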

Hong

>  On 07/13/2012 12:14 PM, Hong Zhang wrote:
>
> Michele:
>
>>
>> I need to use the ICC factorization as a preconditioner, but I noticed that
>> no parallel version is supported.
>> Is that correct?
>>
> Correct.
>
>
>> If so, is there a workaround, like building the preconditioner
>> "by hand" using PETSc functions?
>>
> You may try block Jacobi with ICC on the blocks:
> '-ksp_type cg -pc_type bjacobi -sub_pc_type icc'
>
>  Hong