[petsc-users] Parallel Incomplete Choleski Factorization

Michele Rosso mrosso at uci.edu
Wed Jul 18 12:35:07 CDT 2012


Thank you.

Michele

On 07/18/2012 09:09 AM, Satish Balay wrote:
> It's now usable on Intrepid with PETSC_DIR=/soft/apps/libraries/petsc/3.3-p2/xl-opt
>
> Check /soft/apps/libraries/petsc/README
>
> Satish
>
> On Tue, 17 Jul 2012, Michele Rosso wrote:
>
>> Thanks a lot.
>>
>> Please let me know when version 3.3 is available.
>>
>> Michele
>>
>> On 07/17/2012 12:13 PM, Barry Smith wrote:
>>>>> Please update to petsc-3.3. petsc-3.1 is too old.
>>>>       I would do that, but the version installed on the platform I am
>>>> working on (Intrepid at ALCF) is 3.1-p2.
>>>      Satish,
>>>
>>>         Please fix this.
>>>
>>>        Thanks
>>>
>>>        Barry
>>>
>>> On Jul 17, 2012, at 1:36 PM, Michele Rosso wrote:
>>>
>>>> On 07/17/2012 11:03 AM, Hong Zhang wrote:
>>>>> Michele :
>>>>>
>>>>> I have some problems with the block Jacobi preconditioner.
>>>>> I am solving a 3D Poisson equation with periodic BCs, discretized
>>>>> using finite differences (7-point stencil).
>>>>> Thus the problem is singular and the null space has to be removed.
>>>>>
>>>>> For Poisson equations, multigrid preconditioning should be the method
>>>>> of choice.
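>>>>> For example, a minimal sketch of runtime options (option names assume
>>>>> petsc-3.3; algebraic multigrid assumes the install provides PCGAMG, and
>>>>> geometric multigrid assumes the grid is managed through a DMDA attached
>>>>> to the KSP, so treat these as a starting point rather than a recipe):
>>>>>
>>>>>    algebraic multigrid:   -ksp_type cg -pc_type gamg
>>>>>    geometric multigrid:   -ksp_type cg -pc_type mg -da_refine 4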
>>>> Thank you for the suggestion. I do not have any experience with multigrid,
>>>> but I will try.
>>>>> If I solve with the PCG method + Jacobi preconditioner, the results
>>>>> are fine.
>>>>> If I use PCG + block Jacobi preconditioner + ICC on each block, the
>>>>> results are fine on the majority of the processors,
>>>>> but on a few of them the error is very large.
>>>>>    How do you know "few of them"?
>>>>    Basically the solution is not correct at some grid points, say 6
>>>> nodes out of 64^3. The 6 nodes with problems belong to 2 of the 128
>>>> processors I am using.
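>>>> A first thing to check (assuming the usual PETSc monitoring options are
>>>> available in this build) is whether the solver actually reports
>>>> convergence on those runs, e.g. by adding
>>>>
>>>>    -ksp_converged_reason -ksp_monitor_true_residual
>>>>
>>>> so that any convergence failure shows up in the reported reason and the
>>>> true residual history rather than only in the final solution.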
>>>>> Do you have any ideas/suggestions on how to fix this problem?
>>>>> This is the fragment of code I am using (petsc 3.1 and Fortran 90):
>>>>>    Please update to petsc-3.3. petsc-3.1 is too old.
>>>>       I would do that, but the version installed on the platform I am
>>>> working on (Intrepid at ALCF) is 3.1-p2.
>>>>
>>>>>       PetscErrorCode   petsc_err
>>>>>       Mat              A
>>>>>       PC               pc, subpc
>>>>>       KSP              ksp
>>>>>       KSP              subksp(1)
>>>>>       :
>>>>>       :
>>>>>       :
>>>>>       call KSPCreate(PETSC_COMM_WORLD,ksp,petsc_err)
>>>>>       call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,petsc_err)
>>>>>    call KSPSetType(ksp,KSPCG,petsc_err) ! the default type is gmres; I guess you want CG
>>>>>
>>>>>       call KSPGetPC(ksp,pc,petsc_err)
>>>>>       call PCSetType(pc,PCBJACOBI,petsc_err)
>>>>> !    call KSPSetUp(ksp,petsc_err)    call this at the end
>>>>>       ! KSP context for each single block
>>>>>       call PCBJacobiGetSubKSP(pc,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER, &
>>>>>            & subksp(1),petsc_err)
>>>>>       call KSPGetPC(subksp(1),subpc,petsc_err)
>>>>>       call PCSetType(subpc,PCICC,petsc_err)
>>>>>       call KSPSetType(subksp(1),KSPPREONLY,petsc_err)
>>>>>       call KSPSetTolerances(subksp(1),tol,PETSC_DEFAULT_DOUBLE_PRECISION, &
>>>>>            & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,petsc_err)
>>>>>
>>>>>       ! Remove nullspace from the singular system (Check PETSC_NULL)
>>>>>       call MatNullSpaceCreate(MPI_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL, &
>>>>>            & nullspace,petsc_err)
>>>>>       call KSPSetNullSpace(ksp,nullspace,petsc_err)
>>>>>       call MatNullSpaceRemove(nullspace,b,PETSC_NULL,petsc_err)
>>>>>
>>>>>       call KSPSolve(ksp,b,x,petsc_err)
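>>>>> On the "(Check PETSC_NULL)" comment above: from Fortran the typed
>>>>> constant PETSC_NULL_OBJECT is normally what is passed for unused object
>>>>> arguments, so a sketch of the null-space calls (assuming the petsc-3.3
>>>>> Fortran bindings) might look like:
>>>>>
>>>>>       ! constant null space of the periodic Poisson operator;
>>>>>       ! PETSC_NULL_OBJECT stands in for the unused Vec arguments
>>>>>       call MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0, &
>>>>>            & PETSC_NULL_OBJECT,nullspace,petsc_err)
>>>>>       call KSPSetNullSpace(ksp,nullspace,petsc_err)
>>>>>       call MatNullSpaceRemove(nullspace,b,PETSC_NULL_OBJECT,petsc_err)
>>>>>       call MatNullSpaceDestroy(nullspace,petsc_err)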
>>>>>
>>>>> I modified your code slightly. All these options can be provided at
>>>>> runtime:
>>>>> '-ksp_type cg -pc_type bjacobi -sub_pc_type icc'
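>>>>> For the runtime options to take effect, the code also needs a call to
>>>>> KSPSetFromOptions before KSPSolve; a minimal sketch, against the same
>>>>> ksp object as above:
>>>>>
>>>>>       call KSPSetFromOptions(ksp,petsc_err)  ! reads -ksp_type, -pc_type, ... from the command line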
>>>>>
>>>>> Hong
>>>>>
>>>>> On 07/13/2012 12:14 PM, Hong Zhang wrote:
>>>>>> Michele :
>>>>>>
>>>>>> I need to use the ICC factorization as a preconditioner, but I noticed
>>>>>> that no parallel version is supported.
>>>>>> Is that correct?
>>>>>> Correct.
>>>>>>    If so, is there a workaround, like building the preconditioner
>>>>>> "by hand" using PETSc functions?
>>>>>> You may try block Jacobi with ICC in the blocks: '-ksp_type cg
>>>>>> -pc_type bjacobi -sub_pc_type icc'
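>>>>>> As a usage sketch (the executable name here is hypothetical), adding
>>>>>> -ksp_view confirms which solver and preconditioner are actually used:
>>>>>>
>>>>>>    mpiexec -n 128 ./poisson -ksp_type cg -pc_type bjacobi -sub_pc_type icc -ksp_view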
>>>>>>
>>>>>> Hong
>>>>>>
>>>>>
>>
>>
