[petsc-users] PetscSFBcastAndOpBegin() fails when called from MatDiagonalScale() for n_threads > 1 with "Wrong type of object"

Roland Richter roland.richter at ntnu.no
Tue Jan 19 05:58:46 CST 2021


Hei,

thanks for the idea; that solved my problem! I noticed that I was
writing out of bounds in an earlier loop, which then caused the
behavior observed below.

Regards,

Roland

On 17.01.21 at 19:40, Stefano Zampini wrote:
> Valgrind: https://valgrind.org/
>
> On Sun, 17 Jan 2021 at 21:28, Roland Richter
> <roland.richter at ntnu.no> wrote:
>
>     Dear all,
>
>     I am currently encountering issues in my program when calling
>     MatDiagonalScale(matrix, NULL, vector) in an MPI context with at
>     least two processes. The backtrace from gdb (and the error
>     output) tells me that the problem apparently lies in
>     PetscSFBcastAndOpBegin() in petsc/src/vec/is/sf/interface/sf.c,
>     with the exact error reading "Wrong type of object: Parameter # 1".
>
>     When printing the involved matrix and vector, everything looks as
>     expected (the matrix is split evenly across the processes in
>     contiguous row blocks, and the vector is distributed over both
>     processes).
>
>     Is the issue that my vector is distributed over several processes?
>     Or is there another issue I am not seeing here?
>
>     I tried to narrow the problem down by writing an MWE, but was not
>     able to reproduce it at a smaller scale. Are there other
>     approaches I could try?
>
>     Thanks!
>
>     Regards,
>
>     Roland Richter
>
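[For readers finding this thread later: a concrete way to act on the Valgrind suggestion above, following the pattern recommended in the PETSc documentation for debugging parallel runs. The binary name ./app and its options are placeholders for your own program:]

```shell
# Run every MPI rank under Valgrind's memcheck tool; %p expands to the
# process ID, so each rank writes its own log file. -malloc off tells
# PETSc to bypass its internal allocator so Valgrind sees the raw heap
# accesses (adjust launcher, rank count, and program name as needed).
mpiexec -n 2 valgrind --tool=memcheck -q --num-callers=20 \
    --log-file=valgrind.log.%p ./app -malloc off
```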

