[petsc-users] Problems about Assemble DMComposite Precondition Matrix

Mark Adams mfadams at lbl.gov
Mon Nov 5 09:59:18 CST 2018


On Mon, Nov 5, 2018 at 10:37 AM Yingjie Wu <yjwu16 at gmail.com> wrote:

> Thank you very much for your reply.
> My equation is a neutron diffusion equation with an eigenvalue, which
> is why I use DMComposite: besides the physical fields there is a
> single non-physical variable, the eigenvalue.
>

OK, DMComposite might be your best choice.
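
One thing to check (a guess, since I can't see all of your code): the
preconditioner matrix should be created from the composite DM, so that it
carries the local-to-global mappings that MatGetLocalSubMatrix needs. A
minimal sketch, assuming user.packer is your DMComposite and B is the Mat
you hand to SNESSetJacobian:

  ierr = DMSetUp(user.packer);CHKERRQ(ierr);
  ierr = DMCreateMatrix(user.packer,&B);CHKERRQ(ierr); /* preallocated, mappings attached */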


> I am not very familiar with FieldSplit; I will read up on it first.
> It does not seem to be a DMComposite problem, because the error says
> that the diagonal of the preconditioner matrix contains zero entries.
> What I need is to split the preconditioner matrix into sub-matrices,
> because I have neutrons of multiple energy groups (each group is a
> physical field). I want to fill only the diagonal blocks, preferably
> using MatSetValuesStencil (which simplifies the assembly of the
> five-diagonal matrices).
>
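
On FieldSplit, since you say you are not familiar with it: once the solver
knows about your fields (e.g., through the DM), PCFieldSplit is driven
almost entirely from the command line. A minimal sketch, with placeholder
sub-solver choices that are not a recommendation for your physics:

  -pc_type fieldsplit -pc_fieldsplit_type additive
  -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type ilu
  -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_type ilu

But back to your actual error: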

So you are trying to create the identity matrix with MatSetValuesStencil
but you are getting zeros on the diagonal.

I would debug this by looking at the matrix in MATLAB (or Octave). You can
use code like this:

  if (PETSC_TRUE) {
    PetscViewer viewer;
    /* dump the matrix as a MATLAB/Octave script file */
    ierr = PetscViewerASCIIOpen(comm,"Amat.m",&viewer);CHKERRQ(ierr);
    ierr = PetscViewerPushFormat(viewer,PETSC_VIEWER_ASCII_MATLAB);CHKERRQ(ierr);
    ierr = MatView(Amat,viewer);CHKERRQ(ierr);
    ierr = PetscViewerPopFormat(viewer);CHKERRQ(ierr);
    ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
  }
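
Loading Amat.m in Octave or MATLAB then gives you the matrix as a variable
(named after the Mat object), so you can spy() on it to see the sparsity
pattern and check the diagonal entries directly. I believe you can also
get similar output without any code changes by running with
-mat_view ::ascii_matlab.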

Test on a small, serial problem and see where your data is actually going,
if it is not landing on the diagonal.
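
For reference, the pattern I would expect for filling the diagonal blocks
is sketched below. This is only a sketch under my assumptions about your
setup (packer, is, and xs/xm/ys/ym are taken from your code); the thing I
want to highlight is the MatAssemblyBegin/End on the full matrix at the
end, which must come after all of the inserts, including the scalar one
for the redundant variable:

  MatStencil  row;
  PetscScalar unit = 1.0;
  PetscInt    i,j;
  Mat         B11;
  IS          *is;

  ierr = DMCompositeGetLocalISs(user->packer,&is);CHKERRQ(ierr);
  ierr = MatGetLocalSubMatrix(B,is[0],is[0],&B11);CHKERRQ(ierr);
  for (j=ys; j<ys+ym; j++) {
    for (i=xs; i<xs+xm; i++) {
      row.j = j; row.i = i;
      ierr = MatSetValuesStencil(B11,1,&row,1,&row,&unit,INSERT_VALUES);CHKERRQ(ierr);
    }
  }
  ierr = MatRestoreLocalSubMatrix(B,is[0],is[0],&B11);CHKERRQ(ierr);
  /* ... same loop for the second block, then the scalar entry ... */
  ierr = MatAssemblyBegin(B,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(B,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

It is also worth verifying that the stencil mapping actually carries over
to the local submatrix; if MatSetValuesStencil misbehaves on the block,
MatSetValuesLocal with local indices is the fallback (if I remember right,
src/snes/examples/tutorials/ex28.c uses that pattern with DMComposite).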


> Thanks,
> Yingjie
>
> On Mon, Nov 5, 2018 at 11:22 PM Mark Adams <mfadams at lbl.gov> wrote:
>
>> DMComposite is not very mature, the last time I checked, and I don't
>> know of anyone having worked on it recently; it is probably not what
>> you want anyway. FieldSplit is most likely what you want.
>>
>> What are your equations and discretization? E.g., Stokes with
>> cell-centered pressure? There are probably examples that are close to
>> what you want, and it would not be hard to move your code over.
>>
>> Mark
>>
>> On Mon, Nov 5, 2018 at 10:00 AM Yingjie Wu via petsc-users <
>> petsc-users at mcs.anl.gov> wrote:
>>
>>> Dear PETSc developers:
>>> Hi,
>>> I have recently been working on the preconditioner for my program, and
>>> some problems have come up. Please help me solve them.
>>> I have written a program that solves a system of nonlinear equations.
>>> The matrix-free method has been used to compute results, but I want to
>>> add a preconditioning matrix to it.
>>> I use a DMComposite object, which stores two sub-DM objects and a
>>> single value (two physical field variables and one scalar variable). I
>>> want to use MatGetLocalSubMatrix to fill each physical field's
>>> preconditioner sub-matrix. Since my DM objects are two-dimensional, I
>>> use MatSetValuesStencil() to fill the sub-matrices.
>>> For now, I just want to fill in an identity matrix (for the global
>>> vector) as the preconditioner for the matrix-free method (just to test
>>> whether it can be used; the identity matrix has no preconditioning
>>> effect). But the program failed:
>>>
>>>            yjwu at yjwu-XPS-8910:~/petsc-3.10.1/src/snes/examples/tutorials$
>>> mpiexec -n 1 ./ex216 -f wu-readtwogroups -snes_mf_operator -snes_view
>>> -snes_converged_reason -snes_monitor -ksp_converged_reason
>>> -ksp_monitor_true_residual
>>>
>>>   0 SNES Function norm 8.235090086536e-02
>>> iter = 0, SNES Function norm 0.0823509
>>> iter = 0, Keff ======= 1.
>>>   Linear solve did not converge due to DIVERGED_PCSETUP_FAILED
>>> iterations 0
>>>                  PCSETUP_FAILED due to FACTOR_NUMERIC_ZEROPIVOT
>>> Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations
>>> 0
>>> SNES Object: 1 MPI processes
>>>   type: newtonls
>>>   maximum iterations=50, maximum function evaluations=10000
>>>   tolerances: relative=1e-08, absolute=1e-50, solution=1e-08
>>>   total number of linear solver iterations=0
>>>   total number of function evaluations=1
>>>   norm schedule ALWAYS
>>>   SNESLineSearch Object: 1 MPI processes
>>>     type: bt
>>>       interpolation: cubic
>>>       alpha=1.000000e-04
>>>     maxstep=1.000000e+08, minlambda=1.000000e-12
>>>     tolerances: relative=1.000000e-08, absolute=1.000000e-15,
>>> lambda=1.000000e-08
>>>     maximum iterations=40
>>>   KSP Object: 1 MPI processes
>>>     type: gmres
>>>       restart=30, using Classical (unmodified) Gram-Schmidt
>>> Orthogonalization with no iterative refinement
>>>       happy breakdown tolerance 1e-30
>>>     maximum iterations=10000, initial guess is zero
>>>     tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
>>>     left preconditioning
>>>     using PRECONDITIONED norm type for convergence test
>>>   PC Object: 1 MPI processes
>>>     type: ilu
>>>       out-of-place factorization
>>>       0 levels of fill
>>>       tolerance for zero pivot 2.22045e-14
>>>       matrix ordering: natural
>>>       factor fill ratio given 1., needed 1.
>>>         Factored matrix follows:
>>>           Mat Object: 1 MPI processes
>>>             type: seqaij
>>>             rows=961, cols=961
>>>             package used to perform factorization: petsc
>>>             total: nonzeros=4625, allocated nonzeros=4625
>>>             total number of mallocs used during MatSetValues calls =0
>>>               not using I-node routines
>>>     linear system matrix followed by preconditioner matrix:
>>>     Mat Object: 1 MPI processes
>>>       type: mffd
>>>       rows=961, cols=961
>>>         Matrix-free approximation:
>>>           err=1.49012e-08 (relative error in function evaluation)
>>>           The compute h routine has not yet been set
>>>     Mat Object: 1 MPI processes
>>>       type: seqaij
>>>       rows=961, cols=961
>>>       total: nonzeros=4625, allocated nonzeros=4625
>>>       total number of mallocs used during MatSetValues calls =0
>>>         not using I-node routines
>>>
>>> It seems that some elements on the diagonal are not filled, but I
>>> don't understand what went wrong. Part of the code of my program is
>>> below; I have put all of the code in the appendix. I have tested that
>>> filling the identity into the whole Jacobian matrix (no sub-matrices,
>>> operating directly on the global preconditioner matrix) lets the
>>> program run normally. However, since the problems I develop later may
>>> be more complex and involve multiple physical fields, it would be
>>> easier to work with sub-matrices.
>>>
>>> ......
>>>
>>> /* set DMComposite */
>>>
>>>   ierr = DMDACreate2d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,20,24,PETSC_DECIDE,PETSC_DECIDE,1,1,0,0,&user.da1);CHKERRQ(ierr);
>>>   ierr = DMSetFromOptions(user.da1);CHKERRQ(ierr);
>>>   ierr = DMSetUp(user.da1);CHKERRQ(ierr);
>>>   ierr = DMCompositeAddDM(user.packer,user.da1);CHKERRQ(ierr);
>>>   ierr = DMDACreate2d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,20,24,PETSC_DECIDE,PETSC_DECIDE,1,1,0,0,&user.da2);CHKERRQ(ierr);
>>>   ierr = DMSetFromOptions(user.da2);CHKERRQ(ierr);
>>>   ierr = DMSetUp(user.da2);CHKERRQ(ierr);
>>>   ierr = DMCompositeAddDM(user.packer,user.da2);CHKERRQ(ierr);
>>>   ierr = DMRedundantCreate(PETSC_COMM_WORLD,0,1,&user.red1);CHKERRQ(ierr);
>>>   ierr = DMCompositeAddDM(user.packer,user.red1);CHKERRQ(ierr);
>>> ......
>>> /* in FormJacobian(SNES snes,Vec U,Mat J,Mat B,void *ctx)  */
>>>   ierr = DMCompositeGetLocalVectors(user->packer,&vphi1,&vphi2,&vlambda);CHKERRQ(ierr);
>>>   ierr = DMCompositeScatter(user->packer,U,vphi1,vphi2,vlambda);CHKERRQ(ierr);
>>>
>>>   ierr = VecGetArray(vlambda,&lambda);CHKERRQ(ierr);
>>>   ierr = DMDAVecGetArray(user->da1,vphi1,&phi1);CHKERRQ(ierr);
>>>   ierr = DMDAVecGetArray(user->da2,vphi2,&phi2);CHKERRQ(ierr);
>>>
>>>   ierr = DMCompositeGetLocalISs(user->packer,&is);CHKERRQ(ierr);
>>>   ierr = MatGetLocalSubMatrix(B,is[0],is[0],&B11);CHKERRQ(ierr);
>>>   ierr = MatGetLocalSubMatrix(B,is[1],is[1],&B22);CHKERRQ(ierr);
>>>
>>>   for (j=ys; j<ys+ym; j++) {
>>>     for (i=xs; i<xs+xm; i++) {
>>>       row.j = j; row.i = i;
>>>       unit = 1.0;
>>>       ierr = MatSetValuesStencil(B11,1,&row,1,&row,&unit,INSERT_VALUES);CHKERRQ(ierr);
>>>     }
>>>   }
>>>
>>>   for (j=ys; j<ys+ym; j++) {
>>>     for (i=xs; i<xs+xm; i++) {
>>>       row.j = j; row.i = i;
>>>       unit = 1.0;
>>>       ierr = MatSetValuesStencil(B22,1,&row,1,&row,&unit,INSERT_VALUES);CHKERRQ(ierr);
>>>     }
>>>   }
>>>
>>>   ierr = MatRestoreLocalSubMatrix(B,is[0],is[0],&B11);CHKERRQ(ierr);
>>>   ierr = MatRestoreLocalSubMatrix(B,is[1],is[1],&B22);CHKERRQ(ierr);
>>>
>>>
>>>   unit = 1.0;
>>>   row1 = 960; /* global index of the last row */
>>>
>>>   ierr = MatSetValues(B,1,&row1,1,&row1,&unit,INSERT_VALUES);CHKERRQ(ierr);
>>>
>>>   ierr = VecRestoreArray(vlambda,&lambda);CHKERRQ(ierr);
>>>   ierr = DMDAVecRestoreArray(user->da1,vphi1,&phi1);CHKERRQ(ierr);
>>>   ierr = DMDAVecRestoreArray(user->da2,vphi2,&phi2);CHKERRQ(ierr);
>>> ......
>>>
>>>
>>> Thanks,
>>> Yingjie
>>>
>>

