[petsc-users] Red-black block Jacobi in PETSc?
Barry Smith
bsmith at mcs.anl.gov
Tue Aug 29 22:28:15 CDT 2017
Dang, you are right.
It could be added; some of the code to write it could be adapted from the PCSOR implementation, which does handle a generic block size N.
Barry
> On Aug 29, 2017, at 9:54 PM, Ben Yee <bcyee at umich.edu> wrote:
>
> Oh, I could be mistaken, but I don't think the PCApply_PBJacobi_N() function is in the master branch of petsc. (At least, it's not in the pbjacobi.c file that I downloaded for petsc-3.7.6) I did a quick search and found it in this old mat-taij branch: https://bitbucket.org/petsc/petsc/branch/debo/mat-taij . Is this the function you were referring to?
>
> Regarding the block density, I think the matrix is relatively diagonally dominant, but the density is generally a bit higher than 30%. The blocks are all of the same size. I think you're right that the factorization is faster, so I will try that first. However, I am concerned about the memory cost of factorizing all of the blocks for problems with a large number of equations, so I may have to resort to an iterative solver or to Gaussian elimination in those cases.
>
> On Tue, Aug 29, 2017 at 10:32 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
> > On Aug 29, 2017, at 9:10 PM, Ben Yee <bcyee at umich.edu> wrote:
> >
> > Thank you for your detailed response!
> >
> > In my case, the number of equations (the block size) depends on the user input (there is one equation per energy group, and the number of groups depends on how the user wants to discretize the energy variable). It typically is around 50, but can be as high as a few hundred. In pbjacobi.c, it seems that it is hard-coded to work for a fixed block size N, but I would like it to work for general N. Moreover, I would like to solve the blocks iteratively (using SOR for example) since the block sizes can get rather large.
> >
> > I haven't tried this yet, but it seems that it wouldn't be too hard to modify what you have suggested to work as I have described above. I could add an extra PetscBool to pbjacobi that indicates that I want to use my own PCApply_PBJacobi which would work for general block size N. And, in that function, instead of applying a stored inverse block, I could implement SOR iterations.
>
> There is a general PCApply_PBJacobi_N() you can start with no need for a bool.
>
> Almost for sure, if the blocks are more than 30% full you will do better to do the factorization and inversion and NOT do SOR. How dense are your blocks? I am assuming each block is of the same size?
>
>
> Barry
>
> >
> > Does that sound like a good plan, or do you suggest an alternative approach?
> >
> > Thanks again for your help on this!
> >
> > On Tue, Aug 29, 2017 at 9:11 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >
> > Ok, you should be using the BAIJ matrix format; it supports point-block Jacobi and point-block Gauss-Seidel.
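> >
> > To try those as-is, something along these lines should work (assuming the matrix is assembled as BAIJ with the per-cell block size set):
> >
> >   -mat_type baij -pc_type pbjacobi    (point-block Jacobi)
> >   -mat_type baij -pc_type sor         (point-block SOR/Gauss-Seidel; MatSOR works block-wise on BAIJ)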
> >
> > We do not have a red-black Jacobi/Gauss-Seidel but you could easily add it. You will add it, not by using any shell objects, but by adding functionality to the PCPBJACOBI code in PETSc, which is in src/ksp/pc/impls/pbjacobi/pbjacobi.c
> >
> > First you will need to add a routine to supply the colors (make it general for any number of colors, since the code is basically the same as for only two colors); call it, say,
> >
> > PCPBJacobiSetColors(PC pc, PetscInt numberofcolors, PetscInt *sizeofeachcolor, PetscInt **idsforeachcolor);
> >
> > You will have to add additional fields to PC_PBJacobi to hold this information.
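> >
> > A rough sketch of those additions, just to show the idea (a real implementation would go through PetscObjectComposeFunction()/PetscTryMethod(), and would probably copy the arrays rather than keep the caller's pointers):
> >
> > /* extra fields added to the PC_PBJacobi struct in pbjacobi.c */
> > PetscInt numberofcolors;
> > PetscInt *sizeofeachcolor;   /* sizeofeachcolor[c]    = number of blocks with color c      */
> > PetscInt **idsforeachcolor;  /* idsforeachcolor[c][i] = block index of the i-th such block */
> > Vec      b,work;             /* work vectors used by the colored apply below               */
> >
> > PetscErrorCode PCPBJacobiSetColors(PC pc,PetscInt numberofcolors,PetscInt *sizeofeachcolor,PetscInt **idsforeachcolor)
> > {
> >   PC_PBJacobi *jac = (PC_PBJacobi*)pc->data;
> >
> >   PetscFunctionBegin;
> >   jac->numberofcolors  = numberofcolors;
> >   jac->sizeofeachcolor = sizeofeachcolor;
> >   jac->idsforeachcolor = idsforeachcolor;
> >   PetscFunctionReturn(0);
> > }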
> >
> > Then copy the static PetscErrorCode PCApply_PBJacobi_N(PC pc,Vec x,Vec y) for your block size N (for example, 2) and modify it to do what you want; for example:
> >
> > static PetscErrorCode PCApply_PBJacobi_2_Color(PC pc,Vec x,Vec y)
> > {
> >   PC_PBJacobi       *jac = (PC_PBJacobi*)pc->data;
> >   PetscErrorCode    ierr;
> >   PetscInt          i,j,ii;
> >   const MatScalar   *diag;
> >   PetscScalar       x0,x1,*yy;
> >   const PetscScalar *xx;
> >
> >   PetscFunctionBegin;
> >   /* work vectors: jac->b holds the current residual, jac->work holds A*y */
> >   if (!jac->b) {
> >     ierr = VecDuplicate(x,&jac->b);CHKERRQ(ierr);
> >     ierr = VecDuplicate(x,&jac->work);CHKERRQ(ierr);
> >   }
> >
> >   ierr = VecSet(y,0.0);CHKERRQ(ierr);   /* colors not yet processed contribute zero to A*y */
> >   ierr = VecCopy(x,jac->b);CHKERRQ(ierr);
> >   for (j=0; j<jac->numberofcolors; j++) {
> >
> >     ierr = VecGetArrayRead(jac->b,&xx);CHKERRQ(ierr);
> >     ierr = VecGetArray(y,&yy);CHKERRQ(ierr);
> >
> >     /* apply the inverted 2x2 diagonal blocks belonging to this color */
> >     for (i=0; i<jac->sizeofeachcolor[j]; i++) {
> >       ii   = jac->idsforeachcolor[j][i];
> >       diag = jac->diag + 4*ii;
> >       x0   = xx[2*ii]; x1 = xx[2*ii+1];
> >       yy[2*ii]   = diag[0]*x0 + diag[2]*x1;
> >       yy[2*ii+1] = diag[1]*x0 + diag[3]*x1;
> >     }
> >     ierr = VecRestoreArrayRead(jac->b,&xx);CHKERRQ(ierr);
> >     ierr = VecRestoreArray(y,&yy);CHKERRQ(ierr);
> >
> >     /* update the residual b = x - A*y before the next color is processed */
> >     if (j < jac->numberofcolors-1) {
> >       ierr = MatMult(pc->pmat,y,jac->work);CHKERRQ(ierr);
> >       ierr = VecWAXPY(jac->b,-1.0,jac->work,x);CHKERRQ(ierr);
> >     }
> >   }
> >   PetscFunctionReturn(0);
> > }
> >
> > Finally, in PCSetUp_PBJacobi() you would set the apply function pointer to the "color" version if the user has provided the coloring information.
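> >
> > Something along these lines at the end of PCSetUp_PBJacobi(), assuming the block size is available as jac->bs the way the existing switch over block sizes uses it:
> >
> > if (jac->numberofcolors) {
> >   switch (jac->bs) {
> >   case 2:
> >     pc->ops->apply = PCApply_PBJacobi_2_Color;
> >     break;
> >   default:
> >     SETERRQ1(PetscObjectComm((PetscObject)pc),PETSC_ERR_SUP,"No colored apply written for block size %D",jac->bs);
> >   }
> > }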
> >
> > Pretty simple.
> >
> > Barry
> >
> >
> > > On Aug 29, 2017, at 6:47 PM, Ben Yee <bcyee at umich.edu> wrote:
> > >
> > > I'm solving a coupled set of equations, so each "block" corresponds to a set of unknowns for a particular spatial cell. The matrix is structured such that all of the unknowns for a given spatial cell have adjacent global matrix indices (i.e., they're next to each other in the global solution vector). Effectively, I want to do red-black Gauss-Seidel, but with blocks. Alternatively, it's the same as applying block Jacobi for all the red cells and then applying block Jacobi for all the black cells.
> > >
> > > The color of the block is determined from the geometry of the problem, which is stored in various structures in the code I'm working on, independent of PETSc. (Physically, I generally have a nice 3D Cartesian spatial grid, and the coloring is just a checkerboard in that case.)
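> > >
> > > As an illustration, for an nx*ny*nz Cartesian grid with cells numbered lexicographically, the two checkerboard color sets could be built with a sketch like the following (nx, ny, nz, and ierr assumed declared; these are the kinds of size/index arrays the preconditioner would be handed):
> > >
> > > PetscInt c,i,j,k,count[2] = {0,0},sizeofeachcolor[2] = {0,0},*idsforeachcolor[2];
> > >
> > > for (k=0; k<nz; k++) for (j=0; j<ny; j++) for (i=0; i<nx; i++) sizeofeachcolor[(i+j+k)%2]++;
> > > ierr = PetscMalloc1(sizeofeachcolor[0],&idsforeachcolor[0]);CHKERRQ(ierr);
> > > ierr = PetscMalloc1(sizeofeachcolor[1],&idsforeachcolor[1]);CHKERRQ(ierr);
> > > for (k=0; k<nz; k++) {
> > >   for (j=0; j<ny; j++) {
> > >     for (i=0; i<nx; i++) {
> > >       c = (i+j+k)%2;                                        /* checkerboard color of cell (i,j,k) */
> > >       idsforeachcolor[c][count[c]++] = i + nx*(j + ny*k);   /* lexicographic block index */
> > >     }
> > >   }
> > > }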
> > >
> > > The reason I want to do this is for research purposes. I've implemented my own interpolation matrices for PCMG, and, in my simpler 1D codes and convergence analyses, I've found that red-black smoothing significantly improved convergence for my particular problem (though I'm aware that this generally leads to poor cache efficiency).
> > >
> > > On Aug 29, 2017 7:33 PM, "Barry Smith" <bsmith at mcs.anl.gov> wrote:
> > >
> > > Ben,
> > >
> > > Please explain more what you mean by "a red-black block Jacobi smoother". What is your matrix structure? What are the blocks? How do you decide which ones are which color? Why do you wish to use such a smoother?
> > >
> > > Barry
> > >
> > > > On Aug 29, 2017, at 6:19 PM, Ben Yee <bcyee at umich.edu> wrote:
> > > >
> > > > Hi,
> > > >
> > > > For the smoother in PCMG, I want to use a red-black block Jacobi smoother. Is this available with the existing PETSc options? If so, how do I designate which blocks are red and which are black?
> > > >
> > > > If it is not possible, what would be the best way to implement this? Would I use KSPRICHARDSON+PCSHELL?
> > > >
> > > > Thanks!
> > > >
> > > > --
> > > > Ben Yee
> > > >
> > > > NERS PhD Candidate, University of Michigan
> > > > B.S. Mech. & Nuclear Eng., U.C. Berkeley
> > >
> > >
> >
> >
> >
> >
> > --
> > Ben Yee
> >
> > NERS PhD Candidate, University of Michigan
> > B.S. Mech. & Nuclear Eng., U.C. Berkeley
>
>
>
>
> --
> Ben Yee
>
> NERS PhD Candidate, University of Michigan
> B.S. Mech. & Nuclear Eng., U.C. Berkeley