[petsc-dev] MatMatMult crashes with MATMPIAIJ using one processor

Barry Smith bsmith at mcs.anl.gov
Sat Mar 11 13:06:26 CST 2017


  I can reproduce this in src/mat/examples/tests

./ex94 -f0 ${DATAFILESPATH}/matrices/arco1 -f1 ${DATAFILESPATH}/matrices/arco1 -viewer_binary_skip_info  -mat_type mpiaij

we'll fix it.

  Thanks

    Barry

> On Mar 11, 2017, at 1:04 AM, Ce Qin <qince168 at gmail.com> wrote:
> 
> Dear all,
> 
> I encountered a problem where MatMatMult crashes when run on only one processor. The matrix type is MATMPIAIJ. I am using the master branch. The same program runs in parallel without problems.
> 
> Valgrind gives the following stack trace:
> 
> ==5543== Invalid read of size 8
> ==5543==    at 0xDC094CE: MatGetBrowsOfAoCols_MPIAIJ (mpiaij.c:4509)
> ==5543==    by 0xDC3FB95: MatMatMultSymbolic_MPIAIJ_MPIAIJ_nonscalable (mpimatmatmult.c:218)
> ==5543==    by 0xDC45193: MatMatMult_MPIAIJ_MPIAIJ (mpimatmatmult.c:61)
> ==5543==    by 0xD9C657D: MatMatMult (matrix.c:9450)
> 
> ==5543== 
> ==5543== Invalid read of size 4
> ==5543==    at 0xDC094D7: MatGetBrowsOfAoCols_MPIAIJ (mpiaij.c:4510)
> ==5543==    by 0xDC3FB95: MatMatMultSymbolic_MPIAIJ_MPIAIJ_nonscalable (mpimatmatmult.c:218)
> ==5543==    by 0xDC45193: MatMatMult_MPIAIJ_MPIAIJ (mpimatmatmult.c:61)
> ==5543==    by 0xD9C657D: MatMatMult (matrix.c:9450)
> 
> ==5543== 
> ==5543== Invalid read of size 4
> ==5543==    at 0xDC0793B: MatGetBrowsOfAoCols_MPIAIJ (mpiaij.c:4513)
> ==5543==    by 0xDC3FB95: MatMatMultSymbolic_MPIAIJ_MPIAIJ_nonscalable (mpimatmatmult.c:218)
> ==5543==    by 0xDC45193: MatMatMult_MPIAIJ_MPIAIJ (mpimatmatmult.c:61)
> ==5543==    by 0xD9C657D: MatMatMult (matrix.c:9450)
> 
> ==5543== Invalid read of size 4
> ==5543==    at 0xDC0794E: MatGetBrowsOfAoCols_MPIAIJ (mpiaij.c:4520)
> ==5543==    by 0xDC3FB95: MatMatMultSymbolic_MPIAIJ_MPIAIJ_nonscalable (mpimatmatmult.c:218)
> ==5543==    by 0xDC45193: MatMatMult_MPIAIJ_MPIAIJ (mpimatmatmult.c:61)
> ==5543==    by 0xD9C657D: MatMatMult (matrix.c:9450)
> 
> This problem may be related to this earlier thread: http://lists.mcs.anl.gov/pipermail/petsc-users/2014-September/022864.html. If you need an example that reproduces this problem, please let me know.
> 
> Best,
> Ce Qin
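
[Editor's note: the poster did not attach their code. The following is a hypothetical minimal sketch of the kind of program that could hit this path: it forces the MATMPIAIJ type even on a single MPI rank and then calls MatMatMult, matching the stack traces above. Sizes and values are illustrative only, and the error-checking style follows the PETSc conventions of that era.]

```c
/* Hypothetical reproduction sketch (not the original poster's code).
 * Run with: mpiexec -n 1 ./repro  -- the MATMPIAIJ type is forced
 * explicitly, so the MPI code path is exercised even on one rank. */
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            A, B, C;
  PetscInt       i, n = 10;          /* illustrative size */
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);

  /* Create a small diagonal matrix, forcing the MPIAIJ type */
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
  ierr = MatSetType(A, MATMPIAIJ);CHKERRQ(ierr);
  ierr = MatSetUp(A);CHKERRQ(ierr);
  for (i = 0; i < n; i++) {
    ierr = MatSetValue(A, i, i, 2.0, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  ierr = MatDuplicate(A, MAT_COPY_VALUES, &B);CHKERRQ(ierr);

  /* The call reported to crash on one processor */
  ierr = MatMatMult(A, B, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &C);CHKERRQ(ierr);

  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = MatDestroy(&B);CHKERRQ(ierr);
  ierr = MatDestroy(&C);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}
```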
