[petsc-users] VecAssemblyEnd_MPI_BTS

Junchao Zhang junchao.zhang at gmail.com
Tue Jul 14 14:14:48 CDT 2020


I have no idea. You can try -build_twosided allreduce. If that does not
work, try to write a small example that reproduces the problem.
--Junchao Zhang


On Tue, Jul 14, 2020 at 11:30 AM Fande Kong <fdkong.jd at gmail.com> wrote:

> The petsc configuration log was attached.
>
> Thanks,
>
> Fande,
>
>
>
> On Tue, Jul 14, 2020 at 9:16 AM Fande Kong <fdkong.jd at gmail.com> wrote:
>
>> Hi All,
>>
>>
>> I was doing a large-scale simulation using 12288 cores and had the
>> following error. The code ran fine using less than 12288 cores.
>>
>> Any quick suggestions to track down this issue?
>>
>> Thanks,
>>
>> Fande,
>>
>>
>> [3342]PETSC ERROR: --------------------- Error Message
>> --------------------------------------------------------------
>> [3342]PETSC ERROR: Petsc has generated inconsistent data
>> [3342]PETSC ERROR: Received vector entry 0 out of local range
>> [344829312,344964096)]
>> [3342]PETSC ERROR: See
>> https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
>> shooting.
>> [3342]PETSC ERROR: Petsc Release Version 3.13.3, unknown
>> [3342]PETSC ERROR: /home/kongf/workhome/sawtooth/griffin/griffin-opt on a
>> arch-moose named r1i4n34 by kongf Tue Jul 14 08:44:02 2020
>> [3342]PETSC ERROR: Configure options --download-hypre=1
>> --with-debugging=no --with-shared-libraries=1 --download-fblaslapack=1
>> --download-metis=1 --download-ptscotch=1 --download-parmetis=1
>> --download-superlu_dist=1 --download-mumps=1 --download-scalapack=1
>> --download-slepc=1 --with-mpi=1 --with-cxx-dialect=C++11
>> --with-fortran-bindings=0 --with-sowing=0 --with-64-bit-indices
>> --download-mumps=0
>> [3342]PETSC ERROR: #1 VecAssemblyEnd_MPI_BTS() line 324 in
>> /home/kongf/workhome/sawtooth/moosers/petsc/src/vec/vec/impls/mpi/pbvec.c
>> [3342]PETSC ERROR: #2 VecAssemblyEnd() line 171 in
>> /home/kongf/workhome/sawtooth/moosers/petsc/src/vec/vec/interface/vector.c
>> [cli_3342]: aborting job:
>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 3342
>>
>

