[petsc-users] Use block Jacobi preconditioner with SNES

Jed Brown jed at jedbrown.org
Sun Aug 26 17:08:06 CDT 2018


Okay, interesting.  I take it you either are not running in parallel or
need to have several subdomains (of varying size) per process.

One approach would be to use PCASM (with zero overlap, it is equivalent
to Block Jacobi) and then use -mat_partitioning_type to select a
partitioner (could be a general graph partitioner or could be a custom
implementation that you provide).  I don't know if this would feel
overly clumsy or ultimately be a cleaner and more generic approach.
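
For what it's worth, a rough sketch of that route in code (untested, and it
sidesteps MatPartitioning by reusing the per-block sizes from your SadeqSize
array directly; with zero overlap the result behaves like block Jacobi):

    PetscInt nblocks = 2*Nx*Ny*Nz;   /* same block count as in your code */
    PetscInt start = 0, i;
    IS       *is;

    ierr = PetscMalloc1(nblocks, &is);CHKERRQ(ierr);
    for (i = 0; i < nblocks; i++) {
      /* one contiguous index set per (variable-size) block */
      ierr = ISCreateStride(PETSC_COMM_SELF, SadeqSize[i], start, 1, &is[i]);CHKERRQ(ierr);
      start += SadeqSize[i];
    }
    ierr = PCSetType(Petsc_pc, PCASM);CHKERRQ(ierr);
    ierr = PCASMSetOverlap(Petsc_pc, 0);CHKERRQ(ierr);
    ierr = PCASMSetLocalSubdomains(Petsc_pc, nblocks, is, NULL);CHKERRQ(ierr);
    /* ISDestroy()/PetscFree() cleanup omitted for brevity */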

Ali Reza Khaz'ali <arkhazali at cc.iut.ac.ir> writes:

> Dear Barry and Jed,
>
> Thanks for your great suggestions. I set the options as Jed proposed and 
> it worked. However, the block Jacobi preconditioner treats the entire 
> Jacobian matrix as only one giant block. That is acceptable for small 
> systems, but in larger ones it requires a lot of memory and runtime, as 
> direct methods do. In addition, I cannot use the -pc_bjacobi_blocks 
> option, because the block sizes of the Jacobian matrix are variable (in 
> my sample code, the SadeqSize array contains the block sizes), and 
> apparently the -pc_bjacobi_blocks option assumes equal block sizes.
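>
> For reference, a minimal sketch of the difference (the block count and 
> sizes below are made up for illustration; PCBJacobiSetTotalBlocks takes 
> an optional array of per-block sizes):
>
>      PetscInt lens[4] = {3, 7, 7, 3};   /* made-up variable block sizes */
>      /* equal block sizes: roughly what -pc_bjacobi_blocks 4 requests */
>      ierr = PCBJacobiSetTotalBlocks(Petsc_pc, 4, NULL);
>      CHKERRQ(ierr);
>      /* variable block sizes: pass the sizes explicitly, as with SadeqSize */
>      ierr = PCBJacobiSetTotalBlocks(Petsc_pc, 4, lens);
>      CHKERRQ(ierr);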
> I have to add an explanation about the unusually high tolerances in my 
> code sample: in fact, I am facing a seriously ill-conditioned problem. 
> Direct methods (like LU) solve the system without difficulty, but I 
> could not find any preconditioner + iterative solver combination that 
> solves it with the required precision. Since direct solvers are useless 
> for large systems, I had to design a specific preconditioner for it (I 
> wrote a paper about it: 
> https://link.springer.com/article/10.1007/s10596-014-9421-3), but the 
> designed preconditioner was rather slow (even on a GPU), and I hope to 
> find a faster solution in PETSc. So for now I am searching for a 
> combination that just works for me, and then I'll refine it.
>
> Best wishes,
> Ali
>
> On 8/25/2018 4:31 AM, Smith, Barry F. wrote:
>>     What you would like to do is reasonable, but unfortunately the order of operations in SNES/KSP means that KSPSetUp() cannot be called before SNESSolve(), because the Jacobian matrix has not yet been provided to the KSP. Thus you have to, as Jed suggested, use the options database to set the options. Note that if you want the options "hardwired" into the code you can use PetscOptionsSetValue() from the code.
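>>
>>     For example, a minimal sketch of hardwiring the options this way 
>> (the option names are illustrative; the factor-solver option name in 
>> particular depends on your PETSc version):
>>
>>      ierr = PetscOptionsSetValue(NULL, "-pc_type", "bjacobi");
>>      CHKERRQ(ierr);
>>      ierr = PetscOptionsSetValue(NULL, "-sub_pc_type", "lu");
>>      CHKERRQ(ierr);
>>      ierr = PetscOptionsSetValue(NULL, "-sub_pc_factor_mat_solver_package", "mkl_pardiso");
>>      CHKERRQ(ierr);
>>      /* make sure SNESSetFromOptions() runs after the values are set, before SNESSolve() */
>>      ierr = SNESSetFromOptions(snes);
>>      CHKERRQ(ierr);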
>>
>>    Barry
>>
>>
>>> On Aug 24, 2018, at 1:48 PM, Ali Reza Khaz'ali <arkhazali at cc.iut.ac.ir> wrote:
>>>
>>> Hi,
>>>
>>> I am trying to use a block Jacobi preconditioner in SNES (SNESNEWTONLS). However, the PCBJacobiGetSubKSP function returns an error stating "Object is in wrong state, Must call KSPSetUp() or PCSetUp() first". When I add KSPSetUp, I get an error: "Must call DMShellSetGlobalVector() or DMShellSetCreateGlobalVector()", and if PCSetUp is added, the error "Object is in wrong state, Matrix must be set first" is printed.
>>>
>>> Below is a part of my code. It is run serially. Any help is much appreciated.
>>>
>>>      ierr = SNESGetKSP(snes, &Petsc_ksp);
>>>      CHKERRQ(ierr);
>>>      ierr = KSPGetPC(Petsc_ksp, &Petsc_pc);
>>>      CHKERRQ(ierr);
>>>      ierr = KSPSetTolerances(Petsc_ksp, 1.e-3, 1e-3, PETSC_DEFAULT, 20000);
>>>      CHKERRQ(ierr);
>>>      ierr = SNESSetTolerances(snes, 1e-1, 1e-1, 1e-1, 2000, 2000);
>>>      CHKERRQ(ierr);
>>>      ierr = SNESSetType(snes, SNESNEWTONLS);
>>>      CHKERRQ(ierr);
>>>      ierr = KSPSetType(Petsc_ksp, KSPGMRES);
>>>      CHKERRQ(ierr);
>>>      ierr = PCSetType(Petsc_pc, PCBJACOBI);
>>>      CHKERRQ(ierr);
>>>      ierr = PCBJacobiSetTotalBlocks(Petsc_pc, 2*Nx*Ny*Nz, SadeqSize);
>>>      CHKERRQ(ierr);
>>>
>>>      ierr = SNESSetUp(snes);
>>>      CHKERRQ(ierr);
>>>      ierr = PCBJacobiGetSubKSP(Petsc_pc, &nLocal, &Firstly, &subKSP);
>>>      CHKERRQ(ierr);
>>>
>>>      for (i = 0; i < nLocal; i++) {
>>>          ierr = KSPGetPC(subKSP[i], &SubPc);
>>>          CHKERRQ(ierr);
>>>          ierr = PCSetType(SubPc, PCLU);
>>>          CHKERRQ(ierr);
>>>          ierr = PCFactorSetMatSolverPackage(SubPc, "mkl_pardiso");
>>>          CHKERRQ(ierr);
>>>          ierr = KSPSetType(subKSP[i], KSPPREONLY);
>>>          CHKERRQ(ierr);
>>>          ierr = KSPSetTolerances(subKSP[i], 1.e-6, PETSC_DEFAULT, PETSC_DEFAULT, PETSC_DEFAULT);
>>>          CHKERRQ(ierr);
>>>      }
>>>      ierr = SNESSolve(snes, NULL, Petsc_X);
>>>      CHKERRQ(ierr);
>>>
>>>
>>>
>>> -- 
>>> Ali Reza Khaz’ali
>>> Assistant Professor of Petroleum Engineering,
>>> Department of Chemical Engineering
>>> Isfahan University of Technology
>>> Isfahan, Iran
>>>

