[petsc-users] MatZeroRows changes my sparsity pattern

Alexander Lindsay alexlindsay239 at gmail.com
Thu Jul 15 12:33:43 CDT 2021


After talking with Fande, I don't think my proposal is a good one. Whereas
MatSetValues makes it clear that you must call through to
MatAssemblyBegin/MatAssemblyEnd after use, there is no such indication for
MatZeroRows. Most users probably expect to be able to go on to
preconditioning after MatZeroRows, no matter the value of
MAT_KEEP_NONZERO_PATTERN.
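
To make the asymmetry concrete, here is a minimal sketch (my own toy
snippet, not from MOOSE or the PETSc tests; the sizes, indices, and values
are arbitrary) of the usual MatSetValues/MatAssembly idiom followed by a
MatZeroRows call:

  static char help[] = "Sketch contrasting MatSetValues and MatZeroRows assembly behavior.\n";

  #include <petscmat.h>

  int main(int argc, char **argv)
  {
    Mat            A;
    PetscErrorCode ierr;
    PetscInt       row = 0, col = 0;
    PetscScalar    v   = 1.0;

    ierr = PetscInitialize(&argc, &argv, NULL, help);if (ierr) return ierr;
    ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, 4, 4, 2, NULL, &A);CHKERRQ(ierr);
    ierr = MatSetValues(A, 1, &row, 1, &col, &v, INSERT_VALUES);CHKERRQ(ierr);
    /* The MatSetValues manual page says these two calls are required before
       the matrix can be used. */
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    /* MatZeroRows carries no such note, yet for SeqAIJ it currently ends with
       an internal MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY). */
    ierr = MatZeroRows(A, 1, &row, 1.0, NULL, NULL);CHKERRQ(ierr);
    ierr = MatDestroy(&A);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }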

For MOOSE's use case, Fande is proposing an option that prevents the memory
from being shrunk; we would toggle it on at the beginning of Jacobian
evaluation and toggle it off before our final call to MatAssemblyBegin/End.
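
Roughly, the flow would look like the sketch below. The proposed option does
not exist yet, so MAT_KEEP_NONZERO_PATTERN appears here only as a compiling
stand-in for whatever the new "don't shrink memory" option ends up being
called; J is the Jacobian matrix:

  ierr = MatSetOption(J, MAT_KEEP_NONZERO_PATTERN, PETSC_TRUE);CHKERRQ(ierr);  /* stand-in for the proposed option */
  /* ... Jacobian evaluation: MatSetValues calls, MatZeroRows for boundary conditions ... */
  ierr = MatSetOption(J, MAT_KEEP_NONZERO_PATTERN, PETSC_FALSE);CHKERRQ(ierr); /* stand-in again */
  ierr = MatAssemblyBegin(J, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(J, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);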

With that in mind, I consider this thread to be resolved...

On Thu, Jul 15, 2021 at 9:33 AM Alexander Lindsay <alexlindsay239 at gmail.com>
wrote:

> Especially if the user has requested to keep their nonzero pattern, is
> there any harm in calling MatAssemblyEnd with FLUSH instead of FINAL? Are
> there users relying on MatZeroRows being their final assembly?
>
> On Thu, Jul 15, 2021 at 8:51 AM Alexander Lindsay <
> alexlindsay239 at gmail.com> wrote:
>
>> On Thu, Jul 15, 2021 at 8:46 AM Fande Kong <fdkong.jd at gmail.com> wrote:
>>
>>> "if (a->keepnonzeropattern)" branch does not change ilen so that
>>> A->ops->assemblyend will be fine. It would help if you made sure that
>>> elements have been inserted for these rows before you call MatZeroRows.
>>>
>>
>> So this is the crux of the problem. In ex11, let's say that I had not
>> inserted a value at (0,5) but I know I'm going to later, and I've
>> preallocated the space for it. MatZeroRows will erase that preallocation
>> with its call to MatAssemblyEnd with MAT_FINAL_ASSEMBLY, regardless of the
>> value of keepnonzeropattern.
>>
>>
>>> However, I am not sure it is necessary to call A->ops->assemblyend if
>>> we already require a->keepnonzeropattern. That being said, we might have
>>> something like this:
>>>
>>>
>>> diff --git a/src/mat/impls/aij/seq/aij.c b/src/mat/impls/aij/seq/aij.c
>>> index 42c93a82b1..3f20a599d6 100644
>>> --- a/src/mat/impls/aij/seq/aij.c
>>> +++ b/src/mat/impls/aij/seq/aij.c
>>> @@ -2203,7 +2203,9 @@ PetscErrorCode MatZeroRows_SeqAIJ(Mat A,PetscInt N,const PetscInt rows[],PetscSc
>>>  #if defined(PETSC_HAVE_DEVICE)
>>>    if (A->offloadmask != PETSC_OFFLOAD_UNALLOCATED) A->offloadmask = PETSC_OFFLOAD_CPU;
>>>  #endif
>>> -  ierr = (*A->ops->assemblyend)(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
>>> +  if (!a->keepnonzeropattern) {
>>> +    ierr = (*A->ops->assemblyend)(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
>>> +  }
>>>    PetscFunctionReturn(0);
>>>  }
>>>
>>>
>>> Fande
>>>
>>> On Thu, Jul 15, 2021 at 9:30 AM Stefano Zampini <
>>> stefano.zampini at gmail.com> wrote:
>>>
>>>> Alexander
>>>>
>>>> Do you have a small code to reproduce the issue?
>>>>
>>>> Below is the output using a PETSc example (src/mat/tests/ex11). The
>>>> pattern is kept.
>>>>
>>>> kl-18448:tests szampini$ ./ex11
>>>> Mat Object: 1 MPI processes
>>>>   type: seqaij
>>>> row 0: (0, 5.)
>>>> row 1: (0, -1.)  (1, 4.)  (2, -1.)  (6, -1.)
>>>> row 2: (2, 5.)
>>>> row 3: (2, -1.)  (3, 4.)  (4, -1.)  (8, -1.)
>>>> row 4: (4, 5.)
>>>> row 5: (0, -1.)  (5, 4.)  (6, -1.)  (10, -1.)
>>>> row 6: (6, 5.)
>>>> row 7: (2, -1.)  (6, -1.)  (7, 4.)  (8, -1.)  (12, -1.)
>>>> row 8: (8, 5.)
>>>> row 9: (4, -1.)  (8, -1.)  (9, 4.)  (14, -1.)
>>>> row 10: (10, 5.)
>>>> row 11: (6, -1.)  (10, -1.)  (11, 4.)  (12, -1.)  (16, -1.)
>>>> row 12: (12, 5.)
>>>> row 13: (8, -1.)  (12, -1.)  (13, 4.)  (14, -1.)  (18, -1.)
>>>> row 14: (14, 5.)
>>>> row 15: (10, -1.)  (15, 4.)  (16, -1.)  (20, -1.)
>>>> row 16: (16, 5.)
>>>> row 17: (12, -1.)  (16, -1.)  (17, 4.)  (18, -1.)  (22, -1.)
>>>> row 18: (18, 5.)
>>>> row 19: (14, -1.)  (18, -1.)  (19, 4.)  (24, -1.)
>>>> row 20: (20, 5.)
>>>> row 21: (16, -1.)  (20, -1.)  (21, 4.)  (22, -1.)
>>>> row 22: (22, 5.)
>>>> row 23: (18, -1.)  (22, -1.)  (23, 4.)  (24, -1.)
>>>> row 24: (19, -1.)  (23, -1.)  (24, 4.)
>>>> kl-18448:tests szampini$ ./ex11 -keep_nonzero_pattern
>>>> Mat Object: 1 MPI processes
>>>>   type: seqaij
>>>> row 0: (0, 5.)  (1, 0.)  (5, 0.)
>>>> row 1: (0, -1.)  (1, 4.)  (2, -1.)  (6, -1.)
>>>> row 2: (1, 0.)  (2, 5.)  (3, 0.)  (7, 0.)
>>>> row 3: (2, -1.)  (3, 4.)  (4, -1.)  (8, -1.)
>>>> row 4: (3, 0.)  (4, 5.)  (9, 0.)
>>>> row 5: (0, -1.)  (5, 4.)  (6, -1.)  (10, -1.)
>>>> row 6: (1, 0.)  (5, 0.)  (6, 5.)  (7, 0.)  (11, 0.)
>>>> row 7: (2, -1.)  (6, -1.)  (7, 4.)  (8, -1.)  (12, -1.)
>>>> row 8: (3, 0.)  (7, 0.)  (8, 5.)  (9, 0.)  (13, 0.)
>>>> row 9: (4, -1.)  (8, -1.)  (9, 4.)  (14, -1.)
>>>> row 10: (5, 0.)  (10, 5.)  (11, 0.)  (15, 0.)
>>>> row 11: (6, -1.)  (10, -1.)  (11, 4.)  (12, -1.)  (16, -1.)
>>>> row 12: (7, 0.)  (11, 0.)  (12, 5.)  (13, 0.)  (17, 0.)
>>>> row 13: (8, -1.)  (12, -1.)  (13, 4.)  (14, -1.)  (18, -1.)
>>>> row 14: (9, 0.)  (13, 0.)  (14, 5.)  (19, 0.)
>>>> row 15: (10, -1.)  (15, 4.)  (16, -1.)  (20, -1.)
>>>> row 16: (11, 0.)  (15, 0.)  (16, 5.)  (17, 0.)  (21, 0.)
>>>> row 17: (12, -1.)  (16, -1.)  (17, 4.)  (18, -1.)  (22, -1.)
>>>> row 18: (13, 0.)  (17, 0.)  (18, 5.)  (19, 0.)  (23, 0.)
>>>> row 19: (14, -1.)  (18, -1.)  (19, 4.)  (24, -1.)
>>>> row 20: (15, 0.)  (20, 5.)  (21, 0.)
>>>> row 21: (16, -1.)  (20, -1.)  (21, 4.)  (22, -1.)
>>>> row 22: (17, 0.)  (21, 0.)  (22, 5.)  (23, 0.)
>>>> row 23: (18, -1.)  (22, -1.)  (23, 4.)  (24, -1.)
>>>> row 24: (19, -1.)  (23, -1.)  (24, 4.)
>>>>
>>>> On Jul 15, 2021, at 4:41 PM, Alexander Lindsay <
>>>> alexlindsay239 at gmail.com> wrote:
>>>>
>>>> My interpretation of the documentation page of MatZeroRows is that if
>>>> I've set MAT_KEEP_NONZERO_PATTERN to true, then my sparsity pattern
>>>> shouldn't be changed by a call to it, e.g. a->imax should not change.
>>>> However, at least for sequential matrices, MatAssemblyEnd is called with
>>>> MAT_FINAL_ASSEMBLY at the end of MatZeroRows_SeqAIJ and that does indeed
>>>> change my sparsity pattern. Is my interpretation of the documentation page
>>>> wrong?
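>>>>
>>>> In code, the expectation is roughly the following sketch (nrows and rows
>>>> are placeholders for whichever rows are being zeroed):
>>>>
>>>>   ierr = MatSetOption(A, MAT_KEEP_NONZERO_PATTERN, PETSC_TRUE);CHKERRQ(ierr);
>>>>   ierr = MatZeroRows(A, nrows, rows, 1.0, NULL, NULL);CHKERRQ(ierr);
>>>>   /* expectation: the nonzero pattern (e.g. a->imax) is unchanged afterward */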
>>>>
>>>> Alex
>>>>
>>>>
>>>>