[petsc-users] Jacobian test by finite difference

zakaryah . zakaryah at gmail.com
Tue May 19 14:51:53 CDT 2020


I'd like to take one more pass at debugging it myself first. I think Matt
is probably right about the ordering. I need to make sure I understand the
ordering used by VecView. If I use DMCreateGlobalVector to create the
vector for the residual, calculate the residual, and then call VecView on
the residual vector, how do I ensure that the output is in natural ordering?
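
For reference, one way to guarantee natural ordering in the output is to
scatter the global vector into a natural-ordering vector before viewing
it. A minimal sketch, assuming a DMDA "da" and a residual vector "r" from
DMCreateGlobalVector (these names are placeholders, not taken from the
code in this thread):

  PetscErrorCode ierr;
  Vec            rnat;

  ierr = DMDACreateNaturalVector(da, &rnat);CHKERRQ(ierr);
  ierr = DMDAGlobalToNaturalBegin(da, r, INSERT_VALUES, rnat);CHKERRQ(ierr);
  ierr = DMDAGlobalToNaturalEnd(da, r, INSERT_VALUES, rnat);CHKERRQ(ierr);
  /* rnat is laid out in natural (lexicographic) ordering, so the
     standard viewer prints it in that order */
  ierr = VecView(rnat, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
  ierr = VecDestroy(&rnat);CHKERRQ(ierr);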

On Tue, May 19, 2020 at 1:17 PM Zhang, Hong <hongzhang at anl.gov> wrote:

> Can you post your code (at least some essential snippets) for the residual
> evaluation and the Jacobian evaluation?
>
> Hong (Mr.)
>
> On May 19, 2020, at 11:08 AM, zakaryah . <zakaryah at gmail.com> wrote:
>
> This is weird. When I run the Jacobian test manually, by adding a small
> perturbation to the configuration vector at each index and calculating
> the function for the perturbed configuration, the result looks like it
> should. This makes me think there is not a problem with my routine for
> calculating the function, but I can't explain why the finite-difference
> Jacobian that PETSc calculates would be different and have nonzero values
> outside the stencil.
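>
> (A single column of such a manual finite-difference check might look like
> the sketch below; snes, x, r0, col, and h are placeholder names, not
> taken from the actual code in this thread.)
>
>   PetscErrorCode ierr;
>   PetscMPIInt    rank;
>   Vec            xp, rp;
>   PetscInt       col = 0;      /* hypothetical global index to perturb */
>   PetscScalar    h   = 1e-7;   /* hypothetical step size */
>
>   ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
>   ierr = VecDuplicate(x, &xp);CHKERRQ(ierr);
>   ierr = VecDuplicate(x, &rp);CHKERRQ(ierr);
>   ierr = VecCopy(x, xp);CHKERRQ(ierr);
>   /* add the perturbation from one rank only, so it is applied once */
>   if (!rank) {ierr = VecSetValue(xp, col, h, ADD_VALUES);CHKERRQ(ierr);}
>   ierr = VecAssemblyBegin(xp);CHKERRQ(ierr);
>   ierr = VecAssemblyEnd(xp);CHKERRQ(ierr);
>   ierr = SNESComputeFunction(snes, xp, rp);CHKERRQ(ierr);
>   /* column col of the FD Jacobian is (F(x + h*e_col) - F(x)) / h */
>   ierr = VecAXPY(rp, -1.0, r0);CHKERRQ(ierr);
>   ierr = VecScale(rp, 1.0/h);CHKERRQ(ierr);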
>
> On Tue, May 19, 2020 at 10:13 AM zakaryah . <zakaryah at gmail.com> wrote:
>
>> Thanks Matt. I should have said that the boundary is set to
>> BOUNDARY_NONE. I am not sure what you mean about the ordering. I
>> understand that the actual indexing, internal to PETSc, does not match
>> the natural ordering, as explained in the manual. But doesn't MatView
>> always use the natural ordering? Also, my hand-coded Jacobian routine
>> uses MatSetValuesStencil, and the corresponding output from
>> -snes_test_jacobian_display exactly matches what I expect to see: the
>> correct layout of nonzero terms according to the natural ordering. It is
>> only the finite-difference Jacobian that contains the unexpected,
>> off-stencil terms.
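>>
>> (For reference, a stencil-based insertion of this kind looks roughly like
>> the sketch below; the Jacobian matrix J, the error code ierr, and the loop
>> indices i, j, k are assumed to come from the surrounding routine, and the
>> values are placeholders, not the actual hand-coded entries.)
>>
>>   MatStencil  row, col[2];
>>   PetscScalar v[2];
>>
>>   row.i = i; row.j = j; row.k = k; row.c = 0;   /* this grid point, component 0 */
>>   col[0]   = row;                               /* diagonal entry */
>>   col[1].i = i + 1; col[1].j = j; col[1].k = k; col[1].c = 0;  /* +x neighbour */
>>   v[0] = 1.0; v[1] = -1.0;                      /* placeholder values */
>>   ierr = MatSetValuesStencil(J, 1, &row, 2, col, v, INSERT_VALUES);CHKERRQ(ierr);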
>>
>> On Tue, May 19, 2020, 5:56 AM Matthew Knepley <knepley at gmail.com> wrote:
>>
>>> On Tue, May 19, 2020 at 1:57 AM zakaryah . <zakaryah at gmail.com> wrote:
>>>
>>>> Hi all,
>>>>
>>>> I'm debugging some convergence issues and I came across something
>>>> strange. I am using a nonlinear solver on a structured grid. The DMDA is
>>>> 3-dimensional, with 3 dof and a box stencil of width 1. There are some small
>>>> errors in the hand-coded Jacobian, which I am trying to sort out, but at
>>>> least the fill pattern of the matrix is correct.
>>>>
>>>> However, when I run with -snes_test_jacobian
>>>> -snes_test_jacobian_display -snes_compare_explicit, I see something very
>>>> strange. The finite difference Jacobian has large terms outside the
>>>> stencil. For example, for x,y,z,c = 0,0,0,0 (row 0), the columns 6, 7, 8,
>>>> 12, 13, and 14 (column 6 => x=2,y=0,z=0,c=0, etc.) have large values, while
>>>> columns 9 through 20 are calculated but equal to zero. The "correct"
>>>> values, i.e., in the stencil, are calculated as well, and nearly agree with
>>>> my hand-coded Jacobian. This issue does NOT occur in serial, but occurs for
>>>> any number of processors greater than 1.
>>>>
>>>> I have checked the indexing by hand, carefully, and the memory access
>>>> with valgrind, and the results were clean. Does anyone have an idea why the
>>>> finite difference calculation of the Jacobian would produce large values
>>>> outside the stencil? I am using PETSc 3.12.2 and Open MPI 3.1.0. Thanks for
>>>> your help.
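>>>>
>>>> (For reference, a DMDA configured as described here would be created with
>>>> something like the sketch below; the global sizes nx, ny, nz and the DM
>>>> variable da are placeholders, not the actual code from this thread.)
>>>>
>>>>   ierr = DMDACreate3d(PETSC_COMM_WORLD,
>>>>                       DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
>>>>                       DMDA_STENCIL_BOX,
>>>>                       nx, ny, nz,                        /* global grid size */
>>>>                       PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
>>>>                       3,                                 /* dof per grid point */
>>>>                       1,                                 /* stencil width */
>>>>                       NULL, NULL, NULL, &da);CHKERRQ(ierr);
>>>>   ierr = DMSetFromOptions(da);CHKERRQ(ierr);
>>>>   ierr = DMSetUp(da);CHKERRQ(ierr);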
>>>>
>>>
>>> Since it only appears in parallel, I am guessing that your calculation
>>> of the global ordering does not take into account that we reorder
>>> locally rather than using lexicographic ordering, and that you might
>>> also have periodic boundary conditions.
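>>>
>>> (One way to check the mapping, sketched with placeholder names rather
>>> than code from this thread: take a natural, i.e. lexicographic, global
>>> index and run it through the DMDA's application ordering.)
>>>
>>>   AO       ao;
>>>   PetscInt idx = 6;   /* hypothetical natural (lexicographic) global index */
>>>
>>>   ierr = DMDAGetAO(da, &ao);CHKERRQ(ierr);   /* the AO is owned by the DM; do not destroy it */
>>>   ierr = AOApplicationToPetsc(ao, 1, &idx);CHKERRQ(ierr);
>>>   /* idx now holds the PETSc global index actually used in parallel */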
>>>
>>>   Thanks,
>>>
>>>      Matt
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/
>>>
>>
>