[petsc-users] [EXT]Re: Zero diagonal Error Code with all non-zero diagonal entries
Alexander B Prescott
alexprescott at email.arizona.edu
Mon Apr 6 10:27:47 CDT 2020
Thank you, Matt. You were right about the residual function; the error was
subtle. I've gotten the analytical Jacobian running and have some new
questions, but I'll start a new thread for that.
Best,
Alexander
On Fri, Apr 3, 2020 at 10:39 AM Matthew Knepley <knepley at gmail.com> wrote:
> *External Email*
> On Fri, Apr 3, 2020 at 12:20 PM Alexander B Prescott <
> alexprescott at email.arizona.edu> wrote:
>
>> Hi Matt,
>>
>> Thanks for your help. I've done as you suggested and it still doesn't
>> work -- I get the same error message as before if I add the -snes_fd flag
>> to the command line. I also added -info and included that output below.
>> Also, a similar error is returned with other PC types, e.g. LU.
>>
>
> Great. I now suspect your residual is flawed. I have several suggestions:
>
> a) It looks like you might have an indexing bug, since your residual
> does not depend on dof 0. PETSc uses 0-based indexing.
>
> b) I think the best initial check is to put in the exact solution and
> check that the residual is 0 (see the sketch after this list). I do this in SNES ex5 using MMS.
>
> c) -snes_fd forms the Jacobian without coloring. Giving no options uses
> coloring.
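>
> A minimal sketch of the check in (b), assuming the snes and FormFunction()
> from your code and a hypothetical vector xexact that you fill with the
> known solution:
>
>   Vec       r;
>   PetscReal norm;
>
>   ierr = VecDuplicate(xexact,&r);CHKERRQ(ierr);
>   ierr = SNESComputeFunction(snes,xexact,r);CHKERRQ(ierr); /* r = F(xexact) */
>   ierr = VecNorm(r,NORM_2,&norm);CHKERRQ(ierr);
>   ierr = PetscPrintf(PETSC_COMM_WORLD,"||F(x_exact)|| = %g\n",(double)norm);CHKERRQ(ierr);
>   ierr = VecDestroy(&r);CHKERRQ(ierr);
>
> If the printed norm is not essentially zero, the problem is in the residual
> (or its indexing) rather than in the Jacobian or the MG setup.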
>
> Thanks,
>
> Matt
>
>
>> [0] PetscInitialize(): PETSc successfully started: number of processors = 1
>> [0] PetscGetHostName(): Rejecting domainname, likely is NIS login2.(none)
>> [0] PetscInitialize(): Running on machine: login2
>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 268435455
>> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784
>> [0] DMGetDMSNES(): Creating new DMSNES
>> [0] PetscGetHostName(): Rejecting domainname, likely is NIS login2.(none)
>> [0] SNESSetFromOptions(): Setting default finite difference Jacobian matrix
>> [0] DMCreateMatrix_Shell(): Naively creating matrix using global vector distribution without preallocation
>> [0] MatSetUp(): Warning not preallocating matrix storage
>> [0] DMGetDMKSP(): Creating new DMKSP
>> 0 SNES Function norm 1.000000000000e+01
>> [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 25 X 25; storage space: 70 unneeded,55 used
>> [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
>> [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3
>> [0] MatCheckCompressedRow(): Found the ratio (num_zerorows 0)/(num_localrows 25) < 0.6. Do not use CompressedRow routines.
>> [0] MatSeqAIJCheckInode(): Found 25 nodes out of 25 rows. Not using Inode routines
>> [0] SNESComputeJacobian(): Rebuilding preconditioner
>> [0] PCSetUp(): Setting up PC for first time
>> [0] PCSetUp_MG(): Using outer operators to define finest grid operator because PCMGGetSmoother(pc,nlevels-1,&ksp);KSPSetOperators(ksp,...); was not called.
>> [0] PCSetUp(): Setting up PC for first time
>> [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
>> [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
>> [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>> [0]PETSC ERROR: Arguments are incompatible
>> [0]PETSC ERROR: Zero diagonal on row 0
>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>> ....
>>
>>
>> Best,
>> Alexander
>>
>>
>>
>> On Thu, Apr 2, 2020 at 9:07 PM Matthew Knepley <knepley at gmail.com> wrote:
>>
>>> *External Email*
>>> On Thu, Apr 2, 2020 at 11:33 PM Alexander B Prescott <
>>> alexprescott at email.arizona.edu> wrote:
>>>
>>>> Hello,
>>>>
>>>> I am teaching myself how to use PETSc for nonlinear equations and I've
>>>> run into a problem that I can't quite figure out. I am trying to use the
>>>> matrix coloring routines for the finite difference Jacobian approximation,
>>>> and I've followed the steps in the manual to do this.
>>>> When I run the program with an MG preconditioner, I get back the error:
>>>>
>>>> [0]PETSC ERROR: --------------------- Error Message ----------------------------
>>>> [0]PETSC ERROR: Arguments are incompatible
>>>> [0]PETSC ERROR: Zero diagonal on row 0
>>>> .....
>>>>
>>>>
>>> The easiest thing to try is just to comment out all your matrix stuff,
>>> including SNESSetJacobian(). PETSc will do the coloring automatically.
>>> If this works, you know it's your coloring. If not, then it's something
>>> with the MG setup.
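>>>
>>> A minimal sketch of that test, assuming the snes, x, and FormFunction()
>>> already in your code; all Jacobian and coloring setup is omitted so the
>>> finite difference Jacobian comes from the command line (e.g. -snes_fd):
>>>
>>>   Vec r;
>>>
>>>   ierr = VecDuplicate(x,&r);CHKERRQ(ierr);
>>>   ierr = SNESSetFunction(snes,r,FormFunction,NULL);CHKERRQ(ierr);
>>>   /* no SNESSetJacobian(), MatColoringCreate(), or MatFDColoringCreate() */
>>>   ierr = SNESSetFromOptions(snes);CHKERRQ(ierr);
>>>   ierr = VecSet(x,0.001);CHKERRQ(ierr);
>>>   ierr = SNESSolve(snes,NULL,x);CHKERRQ(ierr);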
>>>
>>> Thanks,
>>>
>>> Matt
>>>
>>>
>>>> What's interesting is that after I've added non-zero entries to the
>>>> matrix with MatSetValues() and assembled the matrix
>>>> with MatAssemblyBegin() + MatAssemblyEnd(), I can verify that every
>>>> diagonal entry is non-zero with a call to MatGetValues. I've included a
>>>> relevant code snippet below and I'm happy to send more. Any guidance is
>>>> greatly appreciated.
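>>>>
>>>> A minimal sketch of that diagonal check (a sequential run, so global row
>>>> indices are also local), assuming the assembled Mat J of global size n
>>>> from the snippet below:
>>>>
>>>>   PetscInt    i, n;
>>>>   PetscScalar v;
>>>>
>>>>   ierr = MatGetSize(J,&n,NULL);CHKERRQ(ierr);
>>>>   for (i = 0; i < n; i++) {
>>>>     ierr = MatGetValues(J,1,&i,1,&i,&v);CHKERRQ(ierr); /* read J(i,i) */
>>>>     if (PetscAbsScalar(v) == 0.0) {
>>>>       ierr = PetscPrintf(PETSC_COMM_SELF,"Zero diagonal at row %D\n",i);CHKERRQ(ierr);
>>>>     }
>>>>   }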
>>>>
>>>> Command line:
>>>>
>>>> mpirun -n 1 ./program -snes_view -snes_converged_reason -snes_monitor -ksp_monitor -ksp_converged_reason -pc_type mg
>>>>
>>>> Code snippet:
>>>>
>>>> ierr = FormJacobianColoring(snes,J);CHKERRQ(ierr); // this function sets matrix values and assembles the matrix
>>>>
>>>> // removed the code, but this is where I've used MatGetValues() to ensure that the diagonal of J (as well as other entries) has been set to 1.0
>>>>
>>>> ierr = MatColoringCreate(J,&coloring);CHKERRQ(ierr);
>>>> ierr = MatColoringSetType(coloring,MATCOLORINGSL);CHKERRQ(ierr);
>>>> ierr = MatColoringSetFromOptions(coloring);CHKERRQ(ierr);
>>>> ierr = MatColoringApply(coloring,&iscoloring);CHKERRQ(ierr);
>>>> ierr = MatColoringDestroy(&coloring);CHKERRQ(ierr);
>>>> /*
>>>> Create the data structure that SNESComputeJacobianDefaultColor() uses
>>>> to compute the actual Jacobians via finite differences.
>>>> */
>>>> ierr = MatFDColoringCreate(J,iscoloring,&fdcoloring);CHKERRQ(ierr);
>>>> ierr = MatFDColoringSetFunction(fdcoloring,(PetscErrorCode (*)(void))FormFunction,NULL);CHKERRQ(ierr);
>>>> ierr = MatFDColoringSetFromOptions(fdcoloring);CHKERRQ(ierr);
>>>> ierr = MatFDColoringSetUp(J,iscoloring,fdcoloring);CHKERRQ(ierr);
>>>> ierr = SNESSetJacobian(snes,J,J,SNESComputeJacobianDefaultColor,fdcoloring);CHKERRQ(ierr);
>>>>
>>>> ierr = SNESSetFromOptions(snes);CHKERRQ(ierr);
>>>>
>>>> ierr = VecSet(x,0.001);CHKERRQ(ierr);
>>>> ierr = SNESSolve(snes,NULL,x);CHKERRQ(ierr);
>>>>
>>>>
>>>>
>>>> And here's the full error output:
>>>>
>>>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>>> [0]PETSC ERROR: Arguments are incompatible
>>>> [0]PETSC ERROR: Zero diagonal on row 0
>>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>>> [0]PETSC ERROR: Petsc Release Version 3.10.3, Dec, 18, 2018
>>>> [0]PETSC ERROR: ./petsc_flowroute on a named i0n22 by alexprescott Thu Apr 2 20:14:01 2020
>>>> [0]PETSC ERROR: Configure options --prefix=/cm/shared/uaapps/petsc/3.10.3 --download-fblaslapack --download-metis --download-parmetis --download-hypre PETSC_ARCH=linux-gnu --with-debugging=no COPTFLAGS=-O3 CXXOPTFLAGS=-O3
>>>> [0]PETSC ERROR: #1 MatInvertDiagonal_SeqAIJ() line 1662 in /cm/local/uabuild/petsc/petsc-3.10.3/src/mat/impls/aij/seq/aij.c
>>>> [0]PETSC ERROR: #2 MatSOR_SeqAIJ() line 1693 in /cm/local/uabuild/petsc/petsc-3.10.3/src/mat/impls/aij/seq/aij.c
>>>> [0]PETSC ERROR: #3 MatSOR() line 3932 in /cm/local/uabuild/petsc/petsc-3.10.3/src/mat/interface/matrix.c
>>>> [0]PETSC ERROR: #4 PCApply_SOR() line 31 in /cm/local/uabuild/petsc/petsc-3.10.3/src/ksp/pc/impls/sor/sor.c
>>>> [0]PETSC ERROR: #5 PCApply() line 462 in /cm/local/uabuild/petsc/petsc-3.10.3/src/ksp/pc/interface/precon.c
>>>> [0]PETSC ERROR: #6 KSP_PCApply() line 281 in /cm/local/uabuild/petsc/petsc-3.10.3/include/petsc/private/kspimpl.h
>>>> [0]PETSC ERROR: #7 KSPInitialResidual() line 67 in /cm/local/uabuild/petsc/petsc-3.10.3/src/ksp/ksp/interface/itres.c
>>>> [0]PETSC ERROR: #8 KSPSolve_GMRES() line 233 in /cm/local/uabuild/petsc/petsc-3.10.3/src/ksp/ksp/impls/gmres/gmres.c
>>>> [0]PETSC ERROR: #9 KSPSolve() line 780 in /cm/local/uabuild/petsc/petsc-3.10.3/src/ksp/ksp/interface/itfunc.c
>>>> [0]PETSC ERROR: #10 KSPSolve_Chebyshev() line 367 in /cm/local/uabuild/petsc/petsc-3.10.3/src/ksp/ksp/impls/cheby/cheby.c
>>>> [0]PETSC ERROR: #11 KSPSolve() line 780 in /cm/local/uabuild/petsc/petsc-3.10.3/src/ksp/ksp/interface/itfunc.c
>>>> [0]PETSC ERROR: #12 PCMGMCycle_Private() line 20 in /cm/local/uabuild/petsc/petsc-3.10.3/src/ksp/pc/impls/mg/mg.c
>>>> [0]PETSC ERROR: #13 PCApply_MG() line 377 in /cm/local/uabuild/petsc/petsc-3.10.3/src/ksp/pc/impls/mg/mg.c
>>>> [0]PETSC ERROR: #14 PCApply() line 462 in /cm/local/uabuild/petsc/petsc-3.10.3/src/ksp/pc/interface/precon.c
>>>> [0]PETSC ERROR: #15 KSP_PCApply() line 281 in /cm/local/uabuild/petsc/petsc-3.10.3/include/petsc/private/kspimpl.h
>>>> [0]PETSC ERROR: #16 KSPInitialResidual() line 67 in /cm/local/uabuild/petsc/petsc-3.10.3/src/ksp/ksp/interface/itres.c
>>>> [0]PETSC ERROR: #17 KSPSolve_GMRES() line 233 in /cm/local/uabuild/petsc/petsc-3.10.3/src/ksp/ksp/impls/gmres/gmres.c
>>>> [0]PETSC ERROR: #18 KSPSolve() line 780 in /cm/local/uabuild/petsc/petsc-3.10.3/src/ksp/ksp/interface/itfunc.c
>>>> [0]PETSC ERROR: #19 SNESSolve_NEWTONLS() line 224 in /cm/local/uabuild/petsc/petsc-3.10.3/src/snes/impls/ls/ls.c
>>>> [0]PETSC ERROR: #20 SNESSolve() line 4397 in /cm/local/uabuild/petsc/petsc-3.10.3/src/snes/interface/snes.c
>>>> [0]PETSC ERROR: #21 main() line 705 in /home/u16/alexprescott/petsc_example/flowroute/petsc_flowroute.c
>>>> [0]PETSC ERROR: PETSc Option Table entries:
>>>> [0]PETSC ERROR: -ksp_converged_reason
>>>> [0]PETSC ERROR: -ksp_monitor
>>>> [0]PETSC ERROR: -pc_type mg
>>>> [0]PETSC ERROR: -snes_converged_reason
>>>> [0]PETSC ERROR: -snes_monitor
>>>> [0]PETSC ERROR: -snes_view
>>>> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
>>>> application called MPI_Abort(MPI_COMM_WORLD, 75) - process 0
>>>>
>>>>
>>>> Printed matrix for a 25 x 25 example
>>>>
>>>> 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
>>>> 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
>>>> 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
>>>> 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
>>>> 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
>>>> 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
>>>> 0 1 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
>>>> 0 0 1 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
>>>> 0 0 0 1 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
>>>> 0 0 0 0 1 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
>>>> 1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
>>>> 0 1 0 0 0 0 1 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0
>>>> 0 0 1 0 0 0 0 1 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0
>>>> 0 0 0 1 0 0 0 0 1 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0
>>>> 0 0 0 0 1 0 0 0 0 1 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0
>>>> 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0
>>>> 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 1 1 0 0 0 0 0 0 0 0
>>>> 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 1 1 1 0 0 0 0 0 0 0
>>>> 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 1 1 1 0 0 0 0 0 0
>>>> 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 1 1 1 0 0 0 0 0
>>>> 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 1 0 0 0 0
>>>> 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 1 1 0 0 0
>>>> 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 1 1 1 0 0
>>>> 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 1 1 1 0
>>>> 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 1 1 1
>>>>
>>>>
>>>>
>>>> Best,
>>>> Alexander
>>>>
>>>>
>>>>
>>>> --
>>>> Alexander Prescott
>>>> alexprescott at email.arizona.edu
>>>> PhD Candidate, The University of Arizona
>>>> Department of Geosciences
>>>> 1040 E. 4th Street
>>>> Tucson, AZ, 85721
>>>>
>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/
>>>
>>
>>
>> --
>> Alexander Prescott
>> alexprescott at email.arizona.edu
>> PhD Candidate, The University of Arizona
>> Department of Geosciences
>> 1040 E. 4th Street
>> Tucson, AZ, 85721
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>
--
Alexander Prescott
alexprescott at email.arizona.edu
PhD Candidate, The University of Arizona
Department of Geosciences
1040 E. 4th Street
Tucson, AZ, 85721