[petsc-users] MatScale returns different results depending on matrix size

Matthew Knepley knepley at gmail.com
Wed Jan 6 10:36:32 CST 2021


On Wed, Jan 6, 2021 at 11:05 AM Roland Richter <roland.richter at ntnu.no>
wrote:

> Hei,
>
> I ran the program in both versions using "valgrind --tool=memcheck
> --leak-check=full --show-leak-kinds=all <binary> -malloc_debug". I got
>
> ==3059== LEAK SUMMARY:
> ==3059==    definitely lost: 12,916 bytes in 32 blocks
> ==3059==    indirectly lost: 2,415 bytes in 2 blocks
> ==3059==      possibly lost: 0 bytes in 0 blocks
> ==3059==    still reachable: 103,511 bytes in 123 blocks
> ==3059==         suppressed: 0 bytes in 0 blocks
>
> but none of the leaks is related to the scaling-function itself.
>
> Did I miss something here?
>
Here is my analysis. It is certainly the case that MatScale() does not
mysteriously scale by other numbers. It is used all over the place in the
tests and in the code. Your test requires another package, so it seems
reasonable to guess that a bad interaction with that package (a memory
overwrite, a conflicting layout or format, etc.) is responsible for the
behavior you see.

  Thanks,

     Matt

> Thanks!
> Am 06.01.21 um 15:26 schrieb Matthew Knepley:
>
> On Wed, Jan 6, 2021 at 2:41 AM Roland Richter <roland.richter at ntnu.no>
> wrote:
>
>> Hei,
>>
>> I added one additional function to the code:
>>
>> void test_scaling_petsc_pointer(const Mat &in_mat,
>>                                 Mat &out_mat,
>>                                 const PetscScalar &scaling_factor) {
>>     MatCopy (in_mat, out_mat, SAME_NONZERO_PATTERN);
>>     PetscScalar *mat_ptr;
>>     MatDenseGetArray (out_mat, &mat_ptr);
>>     PetscInt r_0, r_1;
>>     MatGetLocalSize (out_mat, &r_0, &r_1);
>>     for (PetscInt i = 0; i < r_0 * r_1; ++i)
>>         mat_ptr[i] *= scaling_factor;
>>     /* the array must be restored before assembly */
>>     MatDenseRestoreArray (out_mat, &mat_ptr);
>>
>>     MatAssemblyBegin (out_mat, MAT_FINAL_ASSEMBLY);
>>     MatAssemblyEnd (out_mat, MAT_FINAL_ASSEMBLY);
>> }
>>
>> When replacing the test function test_scaling_petsc() with
>> test_scaling_petsc_pointer(), everything works as it should, but I do
>> not understand why.
>>
>> Do you have any suggestions?
>>
> The easiest explanation is that you have a memory overwrite in the code
> somewhere. Barry's suggestion to use
> valgrind is good.
>
>    Matt
>
>> Thanks!
>>
>>
>> Am 05.01.21 um 15:24 schrieb Roland Richter:
>>
>> Hei,
>>
>> the code I attached to the original mail should work out of the box, but
>> requires armadillo and PETSc to compile/run. Armadillo stores the data in
>> column-major order, and therefore I am transposing the matrices before and
>> after transferring using .st().
>>
>> Thank you for your help!
>>
>> Regards,
>>
>> Roland
>> Am 05.01.21 um 15:21 schrieb Matthew Knepley:
>>
>> On Tue, Jan 5, 2021 at 7:57 AM Roland Richter <roland.richter at ntnu.no>
>> wrote:
>>
>>> Hei,
>>>
>>> I would like to scale a given matrix with a fixed scalar value, and
>>> therefore would like to use MatScale(). Nevertheless, I observed an
>>> interesting behavior depending on the size of the matrix, and currently
>>> I am not sure why.
>>>
>>> When running the attached code, I intend to divide all elements in the
>>> matrix by a constant factor of 10. If I have three or fewer rows and
>>> 1024 columns, I get the expected result. If I have four or more rows
>>> (with the same number of columns), suddenly my scaling factor seems to
>>> be 0.01 instead of 0.1 for the PETSc-matrix. The armadillo-based matrix
>>> still behaves as expected.
>>>
>>
>> 1) It looks like you assume that the storage in your armadillo matrix is
>> row-major. I would be surprised if this were true.
>>
>> 2) I think it is unlikely that there is a problem with MatScale, so I
>> would guess either you have a memory overwrite
>> or are misinterpreting your output. If you send something I can run, I
>> will figure out which it is.
>>
>>   Thanks,
>>
>>      Matt
>>
>>
>>> I currently do not understand that behavior, but do not see any problems
>>> with the code either. Are there any possible explanations for that
>>> behavior?
>>>
>>> Thank you very much,
>>>
>>> regards,
>>>
>>> Roland Richter
>>>
>>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>>
>
