[petsc-users] VecAssemblyEnd_MPI_BTS

Fande Kong fdkong.jd at gmail.com
Tue Jul 14 10:16:11 CDT 2020


Hi All,


I was running a large-scale simulation on 12288 cores and hit the error
below. The code runs fine on fewer than 12288 cores.

Any quick suggestions for tracking down this issue?

Thanks,

Fande,


[3342]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[3342]PETSC ERROR: Petsc has generated inconsistent data
[3342]PETSC ERROR: Received vector entry 0 out of local range [344829312,344964096)]
[3342]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[3342]PETSC ERROR: Petsc Release Version 3.13.3, unknown
[3342]PETSC ERROR: /home/kongf/workhome/sawtooth/griffin/griffin-opt on a arch-moose named r1i4n34 by kongf Tue Jul 14 08:44:02 2020
[3342]PETSC ERROR: Configure options --download-hypre=1 --with-debugging=no --with-shared-libraries=1 --download-fblaslapack=1 --download-metis=1 --download-ptscotch=1 --download-parmetis=1 --download-superlu_dist=1 --download-mumps=1 --download-scalapack=1 --download-slepc=1 --with-mpi=1 --with-cxx-dialect=C++11 --with-fortran-bindings=0 --with-sowing=0 --with-64-bit-indices --download-mumps=0
[3342]PETSC ERROR: #1 VecAssemblyEnd_MPI_BTS() line 324 in /home/kongf/workhome/sawtooth/moosers/petsc/src/vec/vec/impls/mpi/pbvec.c
[3342]PETSC ERROR: #2 VecAssemblyEnd() line 171 in /home/kongf/workhome/sawtooth/moosers/petsc/src/vec/vec/interface/vector.c
[cli_3342]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, 1) - process 3342
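
For reference, below is a minimal sketch (against the standard PETSc 3.13
C API) of the kind of defensive check I could wrap around our
VecSetValues() calls, so that a rank submitting a bad global index aborts
with its own rank and index, rather than a receiving rank failing later
inside VecAssemblyEnd(). CheckAndSetValues is a hypothetical helper, and
n, indices, values, and the ADD_VALUES insert mode stand in for our
application's actual data and usage:

#include <petscvec.h>

/* Hypothetical helper: validate every global index this rank is about
   to submit with VecSetValues(), so a bad index aborts on the sender
   with a useful message instead of surfacing on the receiving rank
   inside VecAssemblyEnd(). */
static PetscErrorCode CheckAndSetValues(Vec v, PetscInt n,
                                        const PetscInt indices[],
                                        const PetscScalar values[])
{
  PetscErrorCode ierr;
  PetscInt       i, N;
  PetscMPIInt    rank;

  PetscFunctionBeginUser;
  ierr = MPI_Comm_rank(PetscObjectComm((PetscObject)v), &rank);CHKERRQ(ierr);
  ierr = VecGetSize(v, &N);CHKERRQ(ierr);   /* global vector size */
  for (i = 0; i < n; i++) {
    if (indices[i] < 0 || indices[i] >= N) {
      SETERRQ3(PETSC_COMM_SELF, PETSC_ERR_ARG_OUTOFRANGE,
               "Rank %d submitting global index %D outside [0,%D)",
               rank, indices[i], N);
    }
  }
  ierr = VecSetValues(v, n, indices, values, ADD_VALUES);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

With a check like this in place, assembly would proceed through
VecAssemblyBegin()/VecAssemblyEnd() as usual, and any rank handing in a
negative or too-large global index would stop with a message identifying
itself before communication starts.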

