[petsc-users] Use block Jacobi preconditioner with SNES

Dave May dave.mayhem23 at gmail.com
Mon Aug 27 01:20:52 CDT 2018


If none of the suggestions provided is to your taste, why not just build
the preconditioner matrix yourself? It seems you have precise requirements
and the relevant information about the individual blocks, so you should be
able to construct the preconditioner either from A (the original operator)
or directly from the discrete problem.
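One way to do this in PETSc is to assemble a separate preconditioning matrix and pass it as the Pmat argument of SNESSetJacobian(); the KSP/PC then factors that matrix instead of the true Jacobian. A minimal sketch, with hypothetical names (A, B, FormJacobian, user are placeholders, not from the original post):

```c
/* Sketch: supply a separately assembled, block-structured matrix B next
 * to the true operator A.  The PC (e.g. PCBJACOBI with LU subsolvers)
 * sees only B, so its block layout is entirely under your control. */
extern PetscErrorCode FormJacobian(SNES, Vec, Mat, Mat, void *);

Mat A, B;  /* A: true Jacobian, B: block-structured approximation */
/* ... create and preallocate A and B with the desired block layout ... */
ierr = SNESSetJacobian(snes, A, B, FormJacobian, &user); CHKERRQ(ierr);

/* In FormJacobian(snes, x, J, P, ctx): fill J with the true Jacobian
 * and P with the block approximation (when J == P, assemble once). */
```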



On Mon, 27 Aug 2018 at 05:33, Matthew Knepley <knepley at gmail.com> wrote:

> On Sat, Aug 25, 2018 at 2:30 PM Ali Reza Khaz'ali <arkhazali at cc.iut.ac.ir>
> wrote:
>
>> Dear Barry and Jed,
>>
>> Thanks for your great suggestions. I set the options as Jed proposed and
>> it worked. However, the block Jacobi preconditioner treats the entire
>> Jacobian matrix as a single giant block. That is fine for small systems,
>> but in larger ones it requires a lot of memory and runtime, as direct
>> methods do.  In addition, I cannot use the -pc_bjacobi_blocks option,
>> because the block sizes of the Jacobian matrix are variable (in my
>> sample code, the SadeqSize array contains the block sizes), and apparently
>> the -pc_bjacobi_blocks option assumes the block sizes are equal.
>> I should add an explanation about the unusually high tolerances in my code
>> sample: in fact, I am facing a seriously ill-conditioned problem. The
>> direct methods (like LU) solve the system without trouble, but I could
>> not find any preconditioner+iterative solver combination that achieves the
>> required precision.  Since direct solvers are useless in large systems,
>>
>
> Let's discuss this point a bit further. I assume your system is sparse.
> Sparse
> direct solvers can solve such systems fairly efficiently for hundreds of
> thousands of
> unknowns. How big do you want to go? Also, do you plan on having more than
> 500K unknowns per process? If not, why not just use sparse direct solvers
> on each process?
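For reference, the per-process sparse direct solve Matt describes can be selected entirely from the command line; something like the following (the executable name is a placeholder, mkl_pardiso is taken from the code sample below, and on PETSc versions before 3.9 the last option is spelled -sub_pc_factor_mat_solver_package):

```shell
# one LU factorization per process, wrapped in block Jacobi
./myapp -ksp_type gmres -pc_type bjacobi \
        -sub_ksp_type preonly -sub_pc_type lu \
        -sub_pc_factor_mat_solver_type mkl_pardiso
```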
>
>   Thanks,
>
>      Matt
>
> I had to design a specific preconditioner for it (I wrote a paper about
>> it: https://link.springer.com/article/10.1007/s10596-014-9421-3), but
>> the designed preconditioner was rather slow (even on GPU), and I hope
>> that I can find a faster solution in PETSc. So for now I am searching
>> for a combination that just works for me, and then, I'll refine it.
>>
>> Best wishes,
>> Ali
>>
>> On 8/25/2018 4:31 AM, Smith, Barry F. wrote:
>> >     What you would like to do is reasonable, but unfortunately the order
>> of operations in SNES/KSP means that KSPSetUp() cannot be called
>> before SNESSolve(), because the Jacobian matrix has not yet been provided
>> to the KSP. Thus you have to, as Jed suggested, use the options database to
>> set the options. Note that if you want the options "hardwired" into the
>> code, you can call PetscOptionsSetValue() from the code.
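Barry's suggestion might look like the following sketch; the calls go before SNESSetFromOptions()/SNESSolve(), and the first argument NULL selects the global options database (this is the PetscOptionsSetValue() signature as of PETSc 3.7; the option values here are illustrative, not prescribed):

```c
/* Hardwire solver options into the code instead of the command line. */
ierr = PetscOptionsSetValue(NULL, "-pc_type", "bjacobi"); CHKERRQ(ierr);
ierr = PetscOptionsSetValue(NULL, "-sub_ksp_type", "preonly"); CHKERRQ(ierr);
ierr = PetscOptionsSetValue(NULL, "-sub_pc_type", "lu"); CHKERRQ(ierr);

/* The options take effect when the solver configures itself. */
ierr = SNESSetFromOptions(snes); CHKERRQ(ierr);
ierr = SNESSolve(snes, NULL, Petsc_X); CHKERRQ(ierr);
```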
>> >
>> >    Barry
>> >
>> >
>> >> On Aug 24, 2018, at 1:48 PM, Ali Reza Khaz'ali <arkhazali at cc.iut.ac.ir>
>> wrote:
>> >>
>> >> Hi,
>> >>
>> >> I am trying to use the block Jacobi preconditioner in SNES (SNESNEWTONLS).
>> However, the PCBJacobiGetSubKSP function returns an error stating "Object is
>> in wrong state, Must call KSPSetUp() or PCSetUp() first". When I add
>> KSPSetUp(), I get another error: "Must call DMShellSetGlobalVector() or
>> DMShellSetCreateGlobalVector()", and if PCSetUp() is added instead, "Object
>> is in wrong state, Matrix must be set first" is printed.
>> >>
>> >> Below is a part of my code. It is run serially. Any help is much
>> appreciated.
>> >>
>> >>      ierr = SNESGetKSP(snes, &Petsc_ksp);
>> >>      CHKERRQ(ierr);
>> >>      ierr = KSPGetPC(Petsc_ksp, &Petsc_pc);
>> >>      CHKERRQ(ierr);
>> >>      ierr = KSPSetTolerances(Petsc_ksp, 1.e-3, 1e-3, PETSC_DEFAULT,
>> 20000);
>> >>      CHKERRQ(ierr);
>> >>      ierr = SNESSetTolerances(snes, 1e-1, 1e-1, 1e-1, 2000, 2000);
>> >>      CHKERRQ(ierr);
>> >>      ierr = SNESSetType(snes, SNESNEWTONLS);
>> >>      CHKERRQ(ierr);
>> >>      ierr = KSPSetType(Petsc_ksp, KSPGMRES);
>> >>      CHKERRQ(ierr);
>> >>      ierr = PCSetType(Petsc_pc, PCBJACOBI);
>> >>      CHKERRQ(ierr);
>> >>      ierr = PCBJacobiSetTotalBlocks(Petsc_pc, 2*Nx*Ny*Nz, SadeqSize);
>> >>      CHKERRQ(ierr);
>> >>
>> >>      ierr = SNESSetUp(snes);
>> >>      CHKERRQ(ierr);
>> >>      ierr = PCBJacobiGetSubKSP(Petsc_pc, &nLocal, &Firstly, &subKSP);
>> >>      CHKERRQ(ierr);
>> >>
>> >>      for (i = 0; i < nLocal; i++) {
>> >>          ierr = KSPGetPC(subKSP[i], &SubPc);
>> >>          CHKERRQ(ierr);
>> >>          ierr = PCSetType(SubPc, PCLU);
>> >>          CHKERRQ(ierr);
>> >>          ierr = PCFactorSetMatSolverPackage(SubPc, "mkl_pardiso");
>> >>          CHKERRQ(ierr);
>> >>          ierr = KSPSetType(subKSP[i], KSPPREONLY);
>> >>          CHKERRQ(ierr);
>> >>          ierr = KSPSetTolerances(subKSP[i], 1.e-6, PETSC_DEFAULT,
>> PETSC_DEFAULT, PETSC_DEFAULT);
>> >>          CHKERRQ(ierr);
>> >>      }
>> >>      ierr = SNESSolve(snes, NULL, Petsc_X);
>> >>      CHKERRQ(ierr);
>> >>
>> >>
>> >>
>> >> --
>> >> Ali Reza Khaz’ali
>> >> Assistant Professor of Petroleum Engineering,
>> >> Department of Chemical Engineering
>> >> Isfahan University of Technology
>> >> Isfahan, Iran
>> >>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>

