[petsc-users] V-cycle multigrid with matrix shells

Barry Smith bsmith at mcs.anl.gov
Mon May 16 23:33:00 CDT 2011


On May 16, 2011, at 9:52 PM, Sylvain Barbot wrote:

> Hi Barry,
> 
> Thanks for your comments.
> 
>>>       CALL MatShellSetOperation(c%sA(l+1),MATOP_MULT,matrixsmoothfv,ierr);
>> 
>>    You say this is a smoother but you register it as a MATOP_MULT? What is the calling sequence of matrixsmoothfv() and what does it do? Smooth or multiply? MATOP_MULT is for registering a multiply; MATOP_SOR (or MATOP_RELAX in older versions of PETSc) is for providing a smoother.
> 
> My function matrixsmoothfv indeed performs the smoothing operation, so
> it is not a multiplication. I can see no documentation about MATOP_SOR;
> is there an example somewhere online?

   For each matrix operation there is a corresponding name: for matrix-vector multiply it is MATOP_MULT, and for smoothing it is MATOP_SOR. The calling sequence of the function you provide is the same as that of the matrix operation itself. For example, for MatMult() it is Mat,Vec,Vec,PetscErrorCode (see below for the smoother).
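
   For instance, a shell multiply routine in Fortran would look something like the sketch below. The name matrixmultfv is only a placeholder, the VecCopy() merely stands in for applying your operator, and the finclude paths may need adjusting for your PETSc version.

subroutine matrixmultfv(A,x,y,ierr)
  implicit none
#include "finclude/petscsys.h"
#include "finclude/petscvec.h"
#include "finclude/petscmat.h"
  Mat A
  Vec x,y
  PetscErrorCode ierr
  ! placeholder body: copies x into y; replace with your matrix-free computation of y = A*x
  call VecCopy(x,y,ierr)
end subroutine matrixmultfv
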
> Assigning MATOP_SOR to my
> smoothing matrix now generates the following runtime error message:
> 
> [0]PETSC ERROR: MatShellSetOperation_Fortran() line 107 in
> src/mat/impls/shell/ftn-custom/zshellf.c
> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
> rank 0 in job 37  catalina.gps.caltech.edu_50093   caused collective
> abort of all ranks
>  exit status of rank 0: return code 1

    This is because we hadn't added support for providing MATOP_SOR through the Fortran interface. Edit src/mat/impls/shell/ftn-custom/zshellf.c, locate the function matshellsetoperation_(), and replace it with the following code.


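/* Stub with the MatSOR calling sequence; it forwards to the user's Fortran smoother stored in fortran_func_pointers[9] */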
static PetscErrorCode oursor(Mat mat,Vec b,PetscReal omega,MatSORType flg,PetscReal shift,PetscInt its,PetscInt lits,Vec x)
{
  PetscErrorCode ierr = 0;
  (*(PetscErrorCode (PETSC_STDCALL *)(Mat*,Vec*,PetscReal*,MatSORType*,PetscReal*,PetscInt*,PetscInt*,Vec*,PetscErrorCode*))(((PetscObject)mat)->fortran_func_pointers[9]))(&mat,&b,&omega,&flg,&shift,&its,&lits,&x,&ierr);
  return ierr;
}

void PETSC_STDCALL matshellsetoperation_(Mat *mat,MatOperation *op,PetscErrorCode (PETSC_STDCALL *f)(Mat*,Vec*,Vec*,PetscErrorCode*),PetscErrorCode *ierr)
{
  MPI_Comm comm;

  *ierr = PetscObjectGetComm((PetscObject)*mat,&comm);if (*ierr) return;
  PetscObjectAllocateFortranPointers(*mat,10);
  if (*op == MATOP_MULT) {
    *ierr = MatShellSetOperation(*mat,*op,(PetscVoidFunction)ourmult);
    ((PetscObject)*mat)->fortran_func_pointers[0] = (PetscVoidFunction)f;
  } else if (*op == MATOP_MULT_TRANSPOSE) {
    *ierr = MatShellSetOperation(*mat,*op,(PetscVoidFunction)ourmulttranspose);
    ((PetscObject)*mat)->fortran_func_pointers[2] = (PetscVoidFunction)f;
  } else if (*op == MATOP_MULT_ADD) {
    *ierr = MatShellSetOperation(*mat,*op,(PetscVoidFunction)ourmultadd);
    ((PetscObject)*mat)->fortran_func_pointers[1] = (PetscVoidFunction)f;
  } else if (*op == MATOP_MULT_TRANSPOSE_ADD) {
    *ierr = MatShellSetOperation(*mat,*op,(PetscVoidFunction)ourmulttransposeadd);
    ((PetscObject)*mat)->fortran_func_pointers[3] = (PetscVoidFunction)f;
  } else if (*op == MATOP_GET_DIAGONAL) {
    *ierr = MatShellSetOperation(*mat,*op,(PetscVoidFunction)ourgetdiagonal);
    ((PetscObject)*mat)->fortran_func_pointers[4] = (PetscVoidFunction)f;
  } else if (*op == MATOP_DIAGONAL_SCALE) {
    *ierr = MatShellSetOperation(*mat,*op,(PetscVoidFunction)ourdiagonalscale);
    ((PetscObject)*mat)->fortran_func_pointers[5] = (PetscVoidFunction)f;
  } else if (*op == MATOP_DIAGONAL_SET) {
    *ierr = MatShellSetOperation(*mat,*op,(PetscVoidFunction)ourdiagonalset);
    ((PetscObject)*mat)->fortran_func_pointers[6] = (PetscVoidFunction)f;
  } else if (*op == MATOP_GET_VECS) {
    *ierr = MatShellSetOperation(*mat,*op,(PetscVoidFunction)ourgetvecs);
    ((PetscObject)*mat)->fortran_func_pointers[7] = (PetscVoidFunction)f;
  } else if (*op == MATOP_VIEW) {
    *ierr = MatShellSetOperation(*mat,*op,(PetscVoidFunction)ourview);
    ((PetscObject)*mat)->fortran_func_pointers[8] = (PetscVoidFunction)f;
  } else if (*op == MATOP_SOR) {
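    /* new case: register the oursor stub above and stash the user's Fortran smoother in slot 9 */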
    *ierr = MatShellSetOperation(*mat,*op,(PetscVoidFunction)oursor);
    ((PetscObject)*mat)->fortran_func_pointers[9] = (PetscVoidFunction)f;
  } else {
    PetscError(comm,__LINE__,"MatShellSetOperation_Fortran",__FILE__,__SDIR__,PETSC_ERR_ARG_WRONG,PETSC_ERROR_INITIAL,
               "Cannot set that matrix operation");
    *ierr = 1;
  }
}


Then in that directory run make to update the PETSc libraries. 

Now note the calling sequence of the relaxation function you need to provide. It is that of MatSOR(): Mat mat, Vec b, PetscReal omega, MatSORType flag, PetscReal shift, PetscInt its, PetscInt lits, Vec x, PetscErrorCode ierr.
In your function you can ignore the omega, flag, and shift arguments. Your function should perform its*lits smoothing steps.
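
Here is a minimal sketch of such a routine, assuming the finclude header paths of your PETSc version; the relaxation sweep itself is left as a comment for you to fill in (declare flg as a PetscEnum if MatSORType is not defined by your Fortran includes).

subroutine matrixsmoothfv(A,b,omega,flg,shift,its,lits,x,ierr)
  implicit none
#include "finclude/petscsys.h"
#include "finclude/petscvec.h"
#include "finclude/petscmat.h"
  Mat A
  Vec b,x
  PetscReal omega,shift
  MatSORType flg
  PetscInt its,lits,i
  PetscErrorCode ierr
  ! omega, flg, and shift can be ignored; do its*lits sweeps updating x from b
  do i = 1,its*lits
     ! ... apply one relaxation sweep of your smoother to x here ...
  end do
  ierr = 0
end subroutine matrixsmoothfv

With that in place, the registration on each level becomes, for example, CALL MatShellSetOperation(c%sA(l+1),MATOP_SOR,matrixsmoothfv,ierr).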

Once you have this going please let us know what happens.

  Barry




> 
>>> I also provide work space...
>>> Despite that, I have a crash upon running KSPSetUp() with "[0]PETSC
>>> ERROR: PCSetUp_BJacobi()".
>> 
>> It is trying to use bjacobi because that is the default solver; you need to tell it explicitly to use some other solver if you do not want block Jacobi. So if you registered your matrixsmoothfv() as MATOP_SOR, then you can use -mg_levels_pc_type sor to have it use that smoother. What solver options are you running with?
> 
> I setup the solver with:
> 
> CALL PCMGSetType(c%pc,PC_MG_MULTIPLICATIVE,ierr);
> 
> But this error might not be representative of the problem. Upstream,
> my smoothing matrices were not set up properly.
> 
>>> MatGetVecs() line 7265 in src/mat/interface/matrix.c
>>> [0]PETSC ERROR: KSPGetVecs() line 806 in src/ksp/ksp/interface/iterativ.c
>>> [0]PETSC ERROR: KSPSetUp_GMRES() line 94 in src/ksp/ksp/impls/gmres/gmres.c
>>> [0]PETSC ERROR: KSPSetUp() line
>>> 
>>     This is not an indication that it is trying to use a direct solver; it is an indication that you never provided ANY kind of matrix at this level. You need to provide a matrix operator on each level for it to get work vectors from, etc. It would help to have the COMPLETE error message here so I can see exactly what is happening, instead of a fragment that makes me guess what the problem is.


