<div dir="ltr"><div dir="ltr">On Fri, Apr 5, 2019 at 3:20 AM Eda Oktay via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov">petsc-users@mcs.anl.gov</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr">Hello,<div><br></div><div>I am trying to calculate unweighted Laplacian of a matrix by using 2 cores. If the size of matrix is in even number then my program works. However, when I try to use a matrix having odd number for size, I guess since size of the matrix cannot be divided into processors correctly, I get the following error:</div><div><br></div><div><div>[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------</div><div>[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------</div><div>[1]PETSC ERROR: Petsc has generated inconsistent data</div><div>[1]PETSC ERROR: MPI_Allreduce() called in different locations (code lines) on different processors</div><div>[1]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" target="_blank">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.</div><div>[1]PETSC ERROR: Petsc Release Version 3.10.3, Dec, 18, 2018 </div><div>[0]PETSC ERROR: Petsc has generated inconsistent data</div><div>[0]PETSC ERROR: MPI_Allreduce() called in different locations (code lines) on different processors</div><div>[0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" target="_blank">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.</div><div>[0]PETSC ERROR: Petsc Release Version 3.10.3, Dec, 18, 2018 </div><div>[0]PETSC ERROR: ./SON_YENI_DENEME_TEMIZ_ENYENI_FINAL on a arch-linux2-c-debug named <a href="http://dfa.wls.metu.edu.tr" target="_blank">dfa.wls.metu.edu.tr</a> by edaoktay Fri Apr 5 09:50:54 2019</div><div>[0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --with-cxx-dialect=C++11 --download-openblas --download-metis --download-parmetis --download-superlu_dist --download-slepc --download-mpich</div><div>[1]PETSC ERROR: ./SON_YENI_DENEME_TEMIZ_ENYENI_FINAL on a arch-linux2-c-debug named <a href="http://dfa.wls.metu.edu.tr" target="_blank">dfa.wls.metu.edu.tr</a> by edaoktay Fri Apr 5 09:50:54 2019</div><div>[1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --with-cxx-dialect=C++11 --download-openblas --download-metis --download-parmetis --download-superlu_dist --download-slepc --download-mpich</div><div>[1]PETSC ERROR: [0]PETSC ERROR: #1 MatSetOption() line 5505 in /home/edaoktay/petsc-3.10.3/src/mat/interface/matrix.c</div><div>#1 MatStashScatterBegin_BTS() line 843 in /home/edaoktay/petsc-3.10.3/src/mat/utils/matstash.c</div><div>[1]PETSC ERROR: [0]PETSC ERROR: #2 MatSetOption() line 5505 in /home/edaoktay/petsc-3.10.3/src/mat/interface/matrix.c</div><div>#2 MatStashScatterBegin_BTS() line 843 in /home/edaoktay/petsc-3.10.3/src/mat/utils/matstash.c</div><div>[1]PETSC ERROR: #3 MatStashScatterBegin_Private() line 462 in /home/edaoktay/petsc-3.10.3/src/mat/utils/matstash.c</div><div>[0]PETSC ERROR: #3 main() line 164 in /home/edaoktay/petsc-3.10.3/arch-linux2-c-debug/share/slepc/examples/src/eda/SON_YENI_DENEME_TEMIZ_ENYENI_FINAL.c</div><div>[1]PETSC ERROR: #4 MatAssemblyBegin_MPIAIJ() line 774 in 
/home/edaoktay/petsc-3.10.3/src/mat/impls/aij/mpi/mpiaij.c</div><div>[1]PETSC ERROR: [0]PETSC ERROR: PETSc Option Table entries:</div><div>[0]PETSC ERROR: -f /home/edaoktay/petsc-3.10.3/share/petsc/datafiles/matrices/binary_files/airfoil1_binary</div><div>#5 MatAssemblyBegin() line 5251 in /home/edaoktay/petsc-3.10.3/src/mat/interface/matrix.c</div><div>[1]PETSC ERROR: [0]PETSC ERROR: -mat_partitioning_type parmetis</div><div>[0]PETSC ERROR: -unweighted</div><div>#6 main() line 169 in /home/edaoktay/petsc-3.10.3/arch-linux2-c-debug/share/slepc/examples/src/eda/SON_YENI_DENEME_TEMIZ_ENYENI_FINAL.c</div><div>[1]PETSC ERROR: PETSc Option Table entries:</div><div>[1]PETSC ERROR: [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------</div><div>-f /home/edaoktay/petsc-3.10.3/share/petsc/datafiles/matrices/binary_files/airfoil1_binary</div><div>[1]PETSC ERROR: -mat_partitioning_type parmetis</div><div>[1]PETSC ERROR: -unweighted</div><div>[1]PETSC ERROR: application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0</div><div>----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------</div><div>application called MPI_Abort(MPI_COMM_WORLD, 1) - process 1</div></div><div><br></div><div>where line 164 in my main program is MatSetOption and line 169 is MatAssemblyBegin. I am new to MPI, so I do not understand why MPI_Allreduce() causes a problem or how I can fix it.</div></div></div></div></div></blockquote><div><br></div><div>You have to call collective methods on all processes in the same order. This is not happening in your code. Beyond that, there is no way for us to tell how this happened.</div>
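<div><br></div><div>Here is a minimal sketch of that pattern; the global size, the option being set (MAT_IGNORE_ZERO_ENTRIES), and the matrix entries are placeholders, not taken from your program. Every rank creates the matrix and calls MatSetOption() and MatAssemblyBegin()/MatAssemblyEnd() unconditionally and in the same order; only the purely local row loop differs between ranks. In your trace, rank 0 is still inside MatSetOption() (called from line 164 of main) while rank 1 has already entered MatAssemblyBegin() (called from line 169), which is exactly the kind of divergence that is reported as "MPI_Allreduce() called in different locations (code lines) on different processors".</div><div><br></div><pre>
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            L;
  PetscInt       n = 5, i, rstart, rend;   /* odd global size on purpose */
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

  ierr = MatCreate(PETSC_COMM_WORLD, &L);CHKERRQ(ierr);
  ierr = MatSetSizes(L, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
  ierr = MatSetFromOptions(L);CHKERRQ(ierr);
  ierr = MatSetUp(L);CHKERRQ(ierr);

  /* Collective: every rank must reach this call, so keep it outside
     any branch that depends on the local row count. */
  ierr = MatSetOption(L, MAT_IGNORE_ZERO_ENTRIES, PETSC_TRUE);CHKERRQ(ierr);

  /* Purely local work may differ per rank; only the collective calls
     must match in number and order. Placeholder diagonal entries here,
     not the actual Laplacian fill. */
  ierr = MatGetOwnershipRange(L, &rstart, &rend);CHKERRQ(ierr);
  for (i = rstart; i < rend; i++) {
    ierr = MatSetValue(L, i, i, 2.0, INSERT_VALUES);CHKERRQ(ierr);
  }

  /* Assembly is also collective: all ranks call Begin and End. */
  ierr = MatAssemblyBegin(L, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(L, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  ierr = MatDestroy(&L);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}
</pre><div><br></div><div>If a collective call really must be conditional, make the condition identical on every rank, for example by testing the global size n rather than the local row count rend - rstart.</div>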
</blockquote></div><br clear="all"><div><br></div>-- <br><div dir="ltr" class="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div></div>