[petsc-users] error: Petsc has generated inconsistent data, MPI_Allreduce() called in different locations (code lines) on different processors

Eda Oktay eda.oktay at metu.edu.tr
Sun Apr 7 04:43:27 CDT 2019


Dear Barry,

Thank you for answering. I am sending my code, my makefile, and two PETSc
binary files: airfoil1_binary has an odd-numbered size (the one for which I
got the error), and gr_30_30_binary has an even-numbered size (for which I got
the correct result). The error occurs at lines 164 and 169.

By the way, I think I have done something wrong at line 320 (I tried to
allocate an array twice), but first I need to resolve the error at lines 164
and 169; then I will work on the error at line 320.

Thanks,

Eda


Smith, Barry F. <bsmith at mcs.anl.gov> wrote on Sat, Apr 6, 2019 at 09:29:

>
>   Eda,
>
>    Can you send us your code (and any needed data files)? We certainly
> expect PETSc to perform correctly if the size of the matrix cannot be
> divided by the number of processors. It is possible the problem is due to
> bugs either in MatStashScatterBegin_BTS() or in your code.
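
(For reference, a minimal sketch of how PETSc distributes a matrix whose
global size is not divisible by the number of processes. This is illustrative
only and not taken from the attached example; the size 4253 is an arbitrary
odd number. With PETSC_DECIDE for the local sizes, PETSc splits the rows
itself, e.g. 2127 rows on rank 0 and 2126 on rank 1 when run on two
processes. Error checking is omitted for brevity.)

  #include <petscmat.h>

  int main(int argc, char **argv)
  {
    Mat      A;
    PetscInt rstart, rend, n = 4253;                  /* arbitrary odd global size */

    PetscInitialize(&argc, &argv, NULL, NULL);
    MatCreate(PETSC_COMM_WORLD, &A);
    MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n); /* let PETSc pick the local sizes */
    MatSetType(A, MATAIJ);
    MatSetUp(A);
    MatGetOwnershipRange(A, &rstart, &rend);          /* on 2 ranks: rows 0..2126 and 2127..4252 */
    /* ... fill in entries with MatSetValues(), then assemble on every rank ... */
    MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
    MatDestroy(&A);
    PetscFinalize();
    return 0;
  }
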
>
>    Thanks
>
>     Barry
>
>
> > On Apr 5, 2019, at 7:20 AM, Matthew Knepley via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
> >
> > On Fri, Apr 5, 2019 at 3:20 AM Eda Oktay via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
> > Hello,
> >
> > I am trying to calculate the unweighted Laplacian of a matrix using 2
> > cores. If the size of the matrix is an even number, my program works.
> > However, when I use a matrix whose size is an odd number, I get the
> > following error; I guess it is because the size of the matrix cannot be
> > divided evenly among the processors:
> >
> > [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > [1]PETSC ERROR: Petsc has generated inconsistent data
> > [1]PETSC ERROR: MPI_Allreduce() called in different locations (code
> lines) on different processors
> > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> > [1]PETSC ERROR: Petsc Release Version 3.10.3, Dec, 18, 2018
> > [0]PETSC ERROR: Petsc has generated inconsistent data
> > [0]PETSC ERROR: MPI_Allreduce() called in different locations (code
> lines) on different processors
> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> > [0]PETSC ERROR: Petsc Release Version 3.10.3, Dec, 18, 2018
> > [0]PETSC ERROR: ./SON_YENI_DENEME_TEMIZ_ENYENI_FINAL on a
> arch-linux2-c-debug named dfa.wls.metu.edu.tr by edaoktay Fri Apr  5
> 09:50:54 2019
> > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
> --with-fc=gfortran --with-cxx-dialect=C++11 --download-openblas
> --download-metis --download-parmetis --download-superlu_dist
> --download-slepc --download-mpich
> > [1]PETSC ERROR: ./SON_YENI_DENEME_TEMIZ_ENYENI_FINAL on a
> arch-linux2-c-debug named dfa.wls.metu.edu.tr by edaoktay Fri Apr  5
> 09:50:54 2019
> > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
> --with-fc=gfortran --with-cxx-dialect=C++11 --download-openblas
> --download-metis --download-parmetis --download-superlu_dist
> --download-slepc --download-mpich
> > [1]PETSC ERROR: [0]PETSC ERROR: #1 MatSetOption() line 5505 in
> /home/edaoktay/petsc-3.10.3/src/mat/interface/matrix.c
> > #1 MatStashScatterBegin_BTS() line 843 in
> /home/edaoktay/petsc-3.10.3/src/mat/utils/matstash.c
> > [1]PETSC ERROR: [0]PETSC ERROR: #2 MatSetOption() line 5505 in
> /home/edaoktay/petsc-3.10.3/src/mat/interface/matrix.c
> > #2 MatStashScatterBegin_BTS() line 843 in
> /home/edaoktay/petsc-3.10.3/src/mat/utils/matstash.c
> > [1]PETSC ERROR: #3 MatStashScatterBegin_Private() line 462 in
> /home/edaoktay/petsc-3.10.3/src/mat/utils/matstash.c
> > [0]PETSC ERROR: #3 main() line 164 in
> /home/edaoktay/petsc-3.10.3/arch-linux2-c-debug/share/slepc/examples/src/eda/SON_YENI_DENEME_TEMIZ_ENYENI_FINAL.c
> > [1]PETSC ERROR: #4 MatAssemblyBegin_MPIAIJ() line 774 in
> /home/edaoktay/petsc-3.10.3/src/mat/impls/aij/mpi/mpiaij.c
> > [1]PETSC ERROR: [0]PETSC ERROR: PETSc Option Table entries:
> > [0]PETSC ERROR: -f
> /home/edaoktay/petsc-3.10.3/share/petsc/datafiles/matrices/binary_files/airfoil1_binary
> > #5 MatAssemblyBegin() line 5251 in
> /home/edaoktay/petsc-3.10.3/src/mat/interface/matrix.c
> > [1]PETSC ERROR: [0]PETSC ERROR: -mat_partitioning_type parmetis
> > [0]PETSC ERROR: -unweighted
> > #6 main() line 169 in
> /home/edaoktay/petsc-3.10.3/arch-linux2-c-debug/share/slepc/examples/src/eda/SON_YENI_DENEME_TEMIZ_ENYENI_FINAL.c
> > [1]PETSC ERROR: PETSc Option Table entries:
> > [1]PETSC ERROR: [0]PETSC ERROR: ----------------End of Error Message
> -------send entire error message to petsc-maint at mcs.anl.gov----------
> > -f
> /home/edaoktay/petsc-3.10.3/share/petsc/datafiles/matrices/binary_files/airfoil1_binary
> > [1]PETSC ERROR: -mat_partitioning_type parmetis
> > [1]PETSC ERROR: -unweighted
> > [1]PETSC ERROR: application called MPI_Abort(MPI_COMM_WORLD, 1) -
> process 0
> > ----------------End of Error Message -------send entire error message to
> petsc-maint at mcs.anl.gov----------
> > application called MPI_Abort(MPI_COMM_WORLD, 1) - process 1
> >
> >
> >  where line 164 in my main program is MatSetOption and line 169 is
> > MatAssemblyBegin. I am new to MPI, so I do not understand why
> > MPI_Allreduce() causes a problem or how to fix it.
> >
> > You have to call collective methods on all processes in the same order.
> This is not happening in your code. Beyond that,
> > there is no way for us to tell how this happened.
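
(For illustration, a hypothetical fragment, not taken from the attached code,
showing the kind of mismatch described above and its fix. It assumes a Mat A
already created on PETSC_COMM_WORLD and a run on 2 processes; MatSetOption()
and MatAssemblyBegin() are both collective on the matrix, and the option
MAT_SYMMETRIC is only an example.)

  PetscMPIInt rank;
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

  /* WRONG: only rank 0 calls the collective MatSetOption(), so rank 1 goes
     straight to MatAssemblyBegin() and the two ranks reach their internal
     MPI_Allreduce() calls at different code lines, which is exactly what the
     error message reports. */
  if (rank == 0) MatSetOption(A, MAT_SYMMETRIC, PETSC_TRUE);
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  /* RIGHT: every rank executes the same sequence of collective calls. */
  MatSetOption(A, MAT_SYMMETRIC, PETSC_TRUE);
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

In a debug build, a rank-dependent guard like the one above would be expected
to trip the same consistency check shown in the log.
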
> >
> >   Thanks,
> >
> >      Matt
> >
> > Thank you,
> >
> > Eda
> >
> >
> > --
> > What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> > -- Norbert Wiener
> >
> > https://www.cse.buffalo.edu/~knepley/
>
>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: example.zip
Type: application/zip
Size: 47687 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20190407/ca9f9f52/attachment-0001.zip>

