[petsc-users] PetscSFBcastAndOpBegin() fails when called from MatDiagonalScale() for n_threads > 1 with "Wrong type of object"

Matthew Knepley knepley at gmail.com
Sun Jan 17 14:05:23 CST 2021


On Sun, Jan 17, 2021 at 1:28 PM Roland Richter <roland.richter at ntnu.no>
wrote:

> Dear all,
>
> I am currently encountering issues in my program when calling
> MatDiagonalScale(matrix, NULL, vector) in an MPI context with at least
> two processes. The backtrace from gdb (and the error output) tells me
> that the problem apparently lies in PetscSFBcastAndOpBegin(), with the
> exact error listed as
> petsc/src/vec/is/sf/interface/sf.c: Wrong type of object: Parameter # 1
>
1) Always send the entire error trace.

2) You are using right scaling, so MatDiagonalScale() would use the matvec
scatter internally. If MatSetUp() has not been called, this scatter will
not have been created. I think we are missing a check here that the matrix
is set up.
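
For reference, here is a minimal sketch of the setup sequence that right
scaling expects (the matrix type, sizes, and values below are illustrative,
not taken from your code):

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            A;
  Vec            r;
  PetscInt       i, rstart, rend, n = 8;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);
  /* Creates the internal data structures, including the scatter that
     right scaling reuses; skipping setup/assembly can leave that
     scatter uncreated */
  ierr = MatSetUp(A);CHKERRQ(ierr);

  /* Illustrative values: put 1.0 on the locally owned diagonal */
  ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
  for (i = rstart; i < rend; ++i) {
    ierr = MatSetValue(A, i, i, 1.0, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  /* Right vector whose parallel layout matches the matrix columns */
  ierr = MatCreateVecs(A, &r, NULL);CHKERRQ(ierr);
  ierr = VecSet(r, 2.0);CHKERRQ(ierr);

  ierr = MatDiagonalScale(A, NULL, r);CHKERRQ(ierr); /* A <- A * diag(r) */

  ierr = VecDestroy(&r);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Using MatCreateVecs() for the right vector also rules out a layout
mismatch between the vector and the matrix columns as the cause.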

  Thanks,

     Matt

> When printing the involved matrix and vector, everything looks as expected
> (the matrix is split equally over the processes with contiguous rows, and
> the vector is distributed over both processes).
>
> Is the issue that my vector is distributed over several processes? Or is
> there another issue I am not seeing here?
>
> I tried to narrow the problem down by writing an MWE, but was not able to
> reproduce it at a smaller scale. Are there other approaches I could try?
>
> Thanks!
>
> Regards,
>
> Roland Richter
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/