[petsc-users] error: Petsc has generated inconsistent data, MPI_Allreduce() called in different locations (code lines) on different processors
Matthew Knepley
knepley at gmail.com
Fri Apr 5 07:20:33 CDT 2019
On Fri, Apr 5, 2019 at 3:20 AM Eda Oktay via petsc-users <
petsc-users at mcs.anl.gov> wrote:
> Hello,
>
> I am trying to calculate the unweighted Laplacian of a matrix using 2
> cores. When the matrix size is even, my program works. However, when I use
> a matrix with an odd size, I get the following error, I guess because the
> rows cannot be divided evenly among the processors:
>
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [1]PETSC ERROR: Petsc has generated inconsistent data
> [1]PETSC ERROR: MPI_Allreduce() called in different locations (code lines)
> on different processors
> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [1]PETSC ERROR: Petsc Release Version 3.10.3, Dec, 18, 2018
> [0]PETSC ERROR: Petsc has generated inconsistent data
> [0]PETSC ERROR: MPI_Allreduce() called in different locations (code lines)
> on different processors
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.10.3, Dec, 18, 2018
> [0]PETSC ERROR: ./SON_YENI_DENEME_TEMIZ_ENYENI_FINAL on a
> arch-linux2-c-debug named dfa.wls.metu.edu.tr by edaoktay Fri Apr 5
> 09:50:54 2019
> [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
> --with-fc=gfortran --with-cxx-dialect=C++11 --download-openblas
> --download-metis --download-parmetis --download-superlu_dist
> --download-slepc --download-mpich
> [1]PETSC ERROR: ./SON_YENI_DENEME_TEMIZ_ENYENI_FINAL on a
> arch-linux2-c-debug named dfa.wls.metu.edu.tr by edaoktay Fri Apr 5
> 09:50:54 2019
> [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
> --with-fc=gfortran --with-cxx-dialect=C++11 --download-openblas
> --download-metis --download-parmetis --download-superlu_dist
> --download-slepc --download-mpich
> [1]PETSC ERROR: [0]PETSC ERROR: #1 MatSetOption() line 5505 in
> /home/edaoktay/petsc-3.10.3/src/mat/interface/matrix.c
> #1 MatStashScatterBegin_BTS() line 843 in
> /home/edaoktay/petsc-3.10.3/src/mat/utils/matstash.c
> [1]PETSC ERROR: [0]PETSC ERROR: #2 MatSetOption() line 5505 in
> /home/edaoktay/petsc-3.10.3/src/mat/interface/matrix.c
> #2 MatStashScatterBegin_BTS() line 843 in
> /home/edaoktay/petsc-3.10.3/src/mat/utils/matstash.c
> [1]PETSC ERROR: #3 MatStashScatterBegin_Private() line 462 in
> /home/edaoktay/petsc-3.10.3/src/mat/utils/matstash.c
> [0]PETSC ERROR: #3 main() line 164 in
> /home/edaoktay/petsc-3.10.3/arch-linux2-c-debug/share/slepc/examples/src/eda/SON_YENI_DENEME_TEMIZ_ENYENI_FINAL.c
> [1]PETSC ERROR: #4 MatAssemblyBegin_MPIAIJ() line 774 in
> /home/edaoktay/petsc-3.10.3/src/mat/impls/aij/mpi/mpiaij.c
> [1]PETSC ERROR: [0]PETSC ERROR: PETSc Option Table entries:
> [0]PETSC ERROR: -f
> /home/edaoktay/petsc-3.10.3/share/petsc/datafiles/matrices/binary_files/airfoil1_binary
> #5 MatAssemblyBegin() line 5251 in
> /home/edaoktay/petsc-3.10.3/src/mat/interface/matrix.c
> [1]PETSC ERROR: [0]PETSC ERROR: -mat_partitioning_type parmetis
> [0]PETSC ERROR: -unweighted
> #6 main() line 169 in
> /home/edaoktay/petsc-3.10.3/arch-linux2-c-debug/share/slepc/examples/src/eda/SON_YENI_DENEME_TEMIZ_ENYENI_FINAL.c
> [1]PETSC ERROR: PETSc Option Table entries:
> [1]PETSC ERROR: [0]PETSC ERROR: ----------------End of Error Message
> -------send entire error message to petsc-maint at mcs.anl.gov----------
> -f
> /home/edaoktay/petsc-3.10.3/share/petsc/datafiles/matrices/binary_files/airfoil1_binary
> [1]PETSC ERROR: -mat_partitioning_type parmetis
> [1]PETSC ERROR: -unweighted
> [1]PETSC ERROR: application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
> ----------------End of Error Message -------send entire error message to
> petsc-maint at mcs.anl.gov----------
> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 1
>
>
> where line 164 in my main program is MatSetOption and line 169
> is MatAssemblyBegin. I am new to MPI, so I do not understand why
> MPI_Allreduce() causes a problem or how I can fix it.
>
You have to call collective methods on all processes in the same order.
This is not happening in your code. Beyond that,
there is no way for us to tell how this happened.
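For illustration only, here is a minimal sketch (assumed code, not your
program; the matrix option and the size N = 5 are just placeholders) of the
usual way this error arises with MatSetOption()/MatAssemblyBegin(): a
collective call guarded by a condition on the local row count, so that with
an odd-sized matrix on 2 processes the ranks take different code paths.

  #include <petscmat.h>
  /* Sketch: with N = 5 on 2 processes, PETSC_DECIDE gives rank 0 three rows
     and rank 1 two rows, so any branch on the local row count diverges. */
  int main(int argc, char **argv)
  {
    Mat            A;
    PetscInt       N = 5, rstart, rend, i;
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
    ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
    ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N);CHKERRQ(ierr);
    ierr = MatSetFromOptions(A);CHKERRQ(ierr);
    ierr = MatSetUp(A);CHKERRQ(ierr);
    ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);

    /* WRONG: collective call guarded by a rank-local condition, so only one
       rank reaches MatSetOption() and the ranks fall out of sync:
       if (rend - rstart > 2) MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE); */

    /* RIGHT: every rank makes every collective call, in the same order. */
    ierr = MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE);CHKERRQ(ierr);
    for (i = rstart; i < rend; i++) {  /* only setting locally owned values may differ per rank */
      ierr = MatSetValue(A, i, i, 1.0, INSERT_VALUES);CHKERRQ(ierr);
    }
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

    ierr = MatDestroy(&A);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }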
Thanks,
Matt
> Thank you,
>
> Eda
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/