[petsc-users] Question about the error message
Michael Povolotskyi
mpovolot at purdue.edu
Mon Jun 29 10:12:23 CDT 2015
Thank you, I can try with a newer release.
Have you re-implemented the function there?
The problem is that we have not moved to the new API completely.
Is there any workaround with PETSc 3.4?
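For context, the error trace below suggests that the 3.4 MatAXPY code estimates the nonzero count of each result row too generously (108 requested in a row of length 72), presumably by summing the row counts of the two operands without accounting for shared columns. A minimal sketch of a per-row count that cannot overshoot (illustrative names only, not PETSc's actual implementation):

```python
def axpy_row_nnz(cols_x, cols_y, ncols):
    """Exact nonzero count for one row of Y + a*X on AIJ (CSR) storage.

    A naive estimate len(cols_x) + len(cols_y) double-counts columns
    present in both patterns and can exceed the row length, which is
    exactly the situation the error message rejects.  The exact count
    is the size of the column-index union, which is bounded by ncols.
    """
    exact = len(set(cols_x) | set(cols_y))
    assert exact <= ncols  # the union can never exceed the row length
    return exact

# Overlapping patterns where the naive sum overshoots:
cols_x = [0, 1, 2, 3]                 # nonzero columns of X's row
cols_y = [2, 3, 4, 5]                 # nonzero columns of Y's row
naive = len(cols_x) + len(cols_y)     # 8 -- exceeds ncols = 6
exact = axpy_row_nnz(cols_x, cols_y, ncols=6)  # 6 -- valid preallocation
```

This is only meant to illustrate why the reported nnz can exceed the row length when the operands' sparsity patterns overlap; the actual fix in later PETSc releases may differ.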
On 06/29/2015 11:08 AM, Matthew Knepley wrote:
> On Mon, Jun 29, 2015 at 9:59 AM, Michael Povolotskyi
> <mpovolot at purdue.edu <mailto:mpovolot at purdue.edu>> wrote:
>
> Dear PETSc developers and users,
> what does this error message mean?
>
> [0]PETSC ERROR: --------------------- Error Message
> ------------------------------------
> [0]PETSC ERROR: Argument out of range!
> [0]PETSC ERROR: nnz cannot be greater than row length: local row 0
> value 108 rowlength 72!
>
>
> It looks like a bug in our AXPY implementation. We try to do
> preallocation of the result, but it seems
> here that we are overallocating. Can you run with the latest release?
>
> Thanks,
>
> Matt
>
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Release Version 3.4.3, Oct, 15, 2013
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: /home/mpovolot/NEMO5/prototype/nemo on a
> linux-static named conte-fe02.rcac.purdue.edu
> <http://conte-fe02.rcac.purdue.edu> by mpovolot Mon Jun 29
> 10:49:32 2015
> [0]PETSC ERROR: Libraries linked from
> /apps/rhel6/petsc/3.4.3_impi-4.1.1.036_intel-13.1.1.163/linux-static/lib
> [0]PETSC ERROR: Configure run at Sun Jan 19 12:47:22 2014
> [0]PETSC ERROR: Configure options --with-cc=mpiicc
> --with-cxx=mpiicpc --with-fc=mpiifort --download-sowing
> --with-scalar-type=complex --with-shared-libraries=0 --with-pic=1
> --with-clanguage=C++ --with-fortran=1 --with-fortran-kernels=0
> --with-debugging=0
> --with-blas-lapack-dir=/apps/rhel6/intel/composer_xe_2013.3.163/mkl/lib/intel64
> --with-blacs-lib=/apps/rhel6/intel/composer_xe_2013.3.163/mkl/lib/intel64/libmkl_blacs_intelmpi_lp64.so
> --with-blacs-include=/apps/rhel6/intel/composer_xe_2013.3.163/mkl/include
> --with-scalapack-lib="-L/apps/rhel6/intel/composer_xe_2013.3.163/mkl/lib/intel64
> -lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64"
> --with-scalapack-include=/apps/rhel6/intel/composer_xe_2013.3.163/mkl/include
> --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3
> --download-hdf5=yes --download-metis=yes --download-parmetis=yes
> --download-superlu_dist=yes --download-mumps=yes
> --download-hypre=no --download-spooles=yes
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: MatSeqAIJSetPreallocation_SeqAIJ() line 3524 in
> /apps/rhel6/petsc/3.4.3_impi-4.1.1.036_intel-13.1.1.163/src/mat/impls/aij/seq/aij.c
> [0]PETSC ERROR: MatSeqAIJSetPreallocation() line 3496 in
> /apps/rhel6/petsc/3.4.3_impi-4.1.1.036_intel-13.1.1.163/src/mat/impls/aij/seq/aij.c
> [0]PETSC ERROR: MatAXPY_SeqAIJ() line 2710 in
> /apps/rhel6/petsc/3.4.3_impi-4.1.1.036_intel-13.1.1.163/src/mat/impls/aij/seq/aij.c
> [0]PETSC ERROR: MatAXPY() line 39 in
> /apps/rhel6/petsc/3.4.3_impi-4.1.1.036_intel-13.1.1.163/src/mat/utils/axpy.c
> terminate called after throwing an instance of 'n5_runtime_error'
> what(): [PetscMatrixNemo<cplx>] PETSc gave error with code 63:
> Argument out of range
> .
>
> Program received signal SIGABRT, Aborted.
>
> Thank you,
> Michael.
>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener
--
Michael Povolotskyi, PhD
Research Assistant Professor
Network for Computational Nanotechnology
Hall for Discovery and Learning Research, Room 441
West Lafayette, IN 47907
Phone (765) 4949396