[petsc-users] Doubt in MPIPreallocation

Barry Smith bsmith at mcs.anl.gov
Fri Feb 19 15:23:07 CST 2016


  Are you sure you have divided up the matrix so each process gets some of the rows?
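
  For example, you can print what each process actually owns (a sketch; it assumes the usual MatCreate()/MatSetSizes() sequence with PETSC_DECIDE for the local sizes):

    PetscInt    rstart, rend;
    PetscMPIInt rank;
    MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
    MatGetOwnershipRange(A, &rstart, &rend);  /* rows [rstart, rend) live on this process */
    PetscSynchronizedPrintf(PETSC_COMM_WORLD, "[%d] owns rows %D to %D\n", rank, rstart, rend);
    PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);

  A process that got no rows will report rstart == rend.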

  You are just going to have to break down and run with -start_in_debugger to see exactly why it is crashing.
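
  For example:

    $ mpiexec -n 3 ./MatSparse -start_in_debugger

  which launches one debugger window per process at startup.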

  Barry

> On Feb 19, 2016, at 3:16 PM, John Albequerque <johncfdcfd at gmail.com> wrote:
> 
> Barry,
> I forgot to mention that the code works fine on a single processor, but it fails when I increase the number of processors; in this case the error message was generated with 3 processors. And yes, it has something to do with the off-diagonal terms: as I increase the number of processes, the same error is triggered for every row for which o_nnz[i] is not zero.
> 
> 
> Many thanks.
> 
> ----
> John Albequerque.
> 
> On Sat, Feb 20, 2016 at 2:38 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> 
>   You are setting nonzero preallocations for the "off-diagonal" portion of the matrix, but on one process there is no off-diagonal portion, so you should be setting those values all to zero.
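> 
>   For a square matrix, a column of local row i belongs to the "diagonal" block when it lies in [rstart, rend) and to the "off-diagonal" block otherwise; on one process rstart = 0 and rend = M, so every o_nnz[i] must be 0. A sketch for a tridiagonal stencil (illustrative; adapt the counts to your own nonzero pattern):
> 
>     PetscInt i, M, rstart, rend, *d_nnz, *o_nnz;
>     MatGetSize(A, &M, NULL);
>     MatGetOwnershipRange(A, &rstart, &rend);   /* local rows [rstart, rend) */
>     PetscMalloc2(rend - rstart, &d_nnz, rend - rstart, &o_nnz);
>     for (i = rstart; i < rend; i++) {
>       PetscInt r = i - rstart;
>       d_nnz[r] = 1; o_nnz[r] = 0;              /* the diagonal entry itself */
>       if (i > 0)     { if (i - 1 >= rstart) d_nnz[r]++; else o_nnz[r]++; }
>       if (i < M - 1) { if (i + 1 <  rend)   d_nnz[r]++; else o_nnz[r]++; }
>     }
>     MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz);
>     PetscFree2(d_nnz, o_nnz);
> 
>   On one process every neighbor column falls inside [rstart, rend), so o_nnz[] stays all zero automatically.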
> 
>   Barry
> 
> > On Feb 19, 2016, at 2:29 PM, John Albequerque <johncfdcfd at gmail.com> wrote:
> >
> >
> > Dear Barry,
> > I am sorry, I did not quite follow you. I have also posted the entire error message so that you can get a deeper insight into it.
> >
> >
> > Many thanks.
> >
> > ----
> > John Albequerque.
> > ========================================================================================
> > Argument out of range
> > [0]PETSC ERROR: nnz cannot be greater than row length: local row 0 value 1 rowlength 0
> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > [0]PETSC ERROR: Petsc Release Version 3.6.3, unknown
> > [0]PETSC ERROR: ./MatSparse on a linux-gnu-c-debug named John by johncfd Sat Feb 20 01:36:01 2016
> > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack --download-mpich
> > [0]PETSC ERROR: #1 MatSeqAIJSetPreallocation_SeqAIJ() line 3567 in /home//MyStuff/ClonedRepos/petsc/src/mat/impls/aij/seq/aij.c
> > [0]PETSC ERROR: #2 MatSeqAIJSetPreallocation() line 3539 in /home/MyStuff/ClonedRepos/petsc/src/mat/impls/aij/seq/aij.c
> > [0]PETSC ERROR: #3 MatMPIAIJSetPreallocation_MPIAIJ() line 2835 in /home/MyStuff/ClonedRepos/petsc/src/mat/impls/aij/mpi/mpiaij.c
> > [0]PETSC ERROR: #4 MatMPIAIJSetPreallocation() line 3532 in /home/MyStuff/ClonedRepos/petsc/src/mat/impls/aij/mpi/mpiaij.c
> > =========================================================================================
> >
> >
> > On Sat, Feb 20, 2016 at 12:35 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >
> > > On Feb 19, 2016, at 12:54 PM, John Albequerque <johncfdcfd at gmail.com> wrote:
> > >
> > > Jed, one more
> > >
> > > nnz cannot be greater than row length: local row 0 value 1 rowlength 0
> >
> >   Always send the entire error message; this provides the context to know what the hey is going on.
> >
> >   It looks like you set a matrix block (if you are using an MPI matrix on one process this could be the "off-diagonal" block) which has no columns (hence the row length is zero), but you claim you need to preallocate an entry (the 1).
> >
> >   Barry
> >
> > >
> > > How do I deal with this error?
> > >
> > > Thanks.
> > >
> > > ----
> > > John Albequerque.
> > >
> > >
> > >
> > > On Fri, Feb 19, 2016 at 8:17 PM, John Albequerque <johncfdcfd at gmail.com> wrote:
> > > Thank you very much, I will try it.
> > >
> > >
> > > Thanks
> > >
> > > ----
> > > John Albequerque.
> > >
> > >
> > >
> > > On Fri, Feb 19, 2016 at 8:16 PM, Jed Brown <jed at jedbrown.org> wrote:
> > > John Albequerque <johncfdcfd at gmail.com> writes:
> > >
> > > > So Jed, what you are suggesting is that I should set only the non-zero
> > > > elements while using
> > > >
> > > >   MatSetValues(A,(high-low),idxm,nc,idxn,values,INSERT_VALUES);
> > > >
> > > > and not mention the zero elements, and for that I should loop over all
> > > > local rows and then set each of the values.
> > >
> > > Yes.  The whole point of a sparse matrix is to spend no time or storage
> > > on 0.0 entries.  If you allocate a sparse matrix and store all the zeros,
> > > you're just being inefficient relative to a dense format.
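> > >
> > > Something like this (a sketch; ncols_i, cols_i, and vals_i stand for the
> > > nonzero pattern of row i in your application):
> > >
> > >     PetscInt low, high, i;
> > >     MatGetOwnershipRange(A, &low, &high);
> > >     for (i = low; i < high; i++) {
> > >       /* cols_i[] and vals_i[] hold only the nonzeros of row i */
> > >       MatSetValues(A, 1, &i, ncols_i, cols_i, vals_i, INSERT_VALUES);
> > >     }
> > >     MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
> > >     MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);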