[petsc-users] MatMPIAIJSetPreallocation: "nnz cannot be greater than row length"

David Knezevic david.knezevic at akselos.com
Sun Feb 22 16:58:47 CST 2015


Thanks, that helps! After fixing that, I now get this error:

[1]PETSC ERROR: Petsc has generated inconsistent data
[1]PETSC ERROR: MPIAIJ Matrix was assembled but is missing garray

Any suggestions about what may be wrong now? I'll try the debugger tomorrow.
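
In case it's useful, here is roughly what runs now each time the sparsity
pattern changes (just a sketch with the libMesh layers stripped out; J is the
Jacobian, d_nnz/o_nnz are the recomputed per-row counts, and the
MatDisAssemble_MPIAIJ call is guarded as you suggested):

  /* prototype for the private routine, declared near the top of the file */
  PETSC_INTERN PetscErrorCode MatDisAssemble_MPIAIJ(Mat);

  /* ... inside the routine that rebuilds the Jacobian sparsity ... */
  ierr = MatDisAssemble_MPIAIJ(J);CHKERRQ(ierr);  /* guarded: only when J is MPIAIJ and already assembled */
  ierr = MatSeqAIJSetPreallocation(J, 0, d_nnz);CHKERRQ(ierr);
  ierr = MatMPIAIJSetPreallocation(J, 0, d_nnz, 0, o_nnz);CHKERRQ(ierr);
  /* ... MatSetValues() with the new contact pattern ... */
  ierr = MatAssemblyBegin(J, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(J, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);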

Thanks,
David



On Sun, Feb 22, 2015 at 5:45 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
>  Do not call it for a SeqAIJ matrix. Do not call it before the first time
> you have preallocated the matrix, put entries in it, and called
> MatAssemblyBegin/End().
>
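>   In other words, guard the call roughly like this (a sketch; mat is your
> Jacobian):
>
>     PetscBool ismpiaij, assembled;
>     ierr = PetscObjectTypeCompare((PetscObject)mat, MATMPIAIJ, &ismpiaij);CHKERRQ(ierr);
>     ierr = MatAssembled(mat, &assembled);CHKERRQ(ierr);
>     if (ismpiaij && assembled) {
>       ierr = MatDisAssemble_MPIAIJ(mat);CHKERRQ(ierr);
>     }
>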
>   If it still crashes, you'll need to try the debugger.
>
>   Barry
>
> > On Feb 22, 2015, at 4:09 PM, David Knezevic <david.knezevic at akselos.com>
> wrote:
> >
> > Hi Barry,
> >
> > Thanks for your help, much appreciated.
> >
> > I added a prototype for MatDisAssemble_MPIAIJ:
> > PETSC_INTERN PetscErrorCode MatDisAssemble_MPIAIJ(Mat);
> >
> > and I added a call to MatDisAssemble_MPIAIJ before
> MatMPIAIJSetPreallocation. However, I get a segfault on the call to
> MatDisAssemble_MPIAIJ. The segfault occurs in both serial and parallel.
> >
> > FYI, I'm using PETSc 3.5.2 with a non-debug build (though I could rebuild
> PETSc in debug mode if you think that would help figure out what's
> happening here).
> >
> > Thanks,
> > David
> >
> >
> >
> > On Sun, Feb 22, 2015 at 1:13 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >
> >   David,
> >
> >   This is an obscure little feature of MatMPIAIJ: each time you change
> the sparsity pattern, you need to call MatDisAssemble_MPIAIJ(Mat mat) before
> you call MatMPIAIJSetPreallocation(). This is a private PETSc function, so
> you need to provide your own prototype for it above the function in which
> you use it.
> >
> >   Let us know if this resolves the problem.
> >
> >    Barry
> >
> > We never really intended that people would call
> MatMPIAIJSetPreallocation() AFTER they had already used the matrix.
> >
> >
> > > On Feb 22, 2015, at 6:50 AM, David Knezevic <
> david.knezevic at akselos.com> wrote:
> > >
> > > Hi all,
> > >
> > > I've implemented a solver for a contact problem using SNES. The
> sparsity pattern of the Jacobian matrix needs to change at each nonlinear
> iteration (because the elements that are in contact can change), so I
> tried to deal with this by calling MatSeqAIJSetPreallocation and
> MatMPIAIJSetPreallocation during each iteration in order to update the
> preallocation.
> > >
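> > > Concretely, at each nonlinear iteration I do roughly the following (a
> > > sketch; jac is the Jacobian and d_nnz/o_nnz are recomputed from the
> > > current contact state; as far as I understand, the call that doesn't
> > > match the matrix type is a no-op):
> > >
> > > ierr = MatSeqAIJSetPreallocation(jac, 0, d_nnz);CHKERRQ(ierr);
> > > ierr = MatMPIAIJSetPreallocation(jac, 0, d_nnz, 0, o_nnz);CHKERRQ(ierr);
> > >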
> > > This seems to work fine in serial, but with two or more MPI processes
> I run into the error "nnz cannot be greater than row length", e.g.:
> > > nnz cannot be greater than row length: local row 528 value 12
> rowlength 0
> > >
> > > This error is from the call to
> > > MatSeqAIJSetPreallocation(b->B,o_nz,o_nnz); in
> MatMPIAIJSetPreallocation_MPIAIJ.
> > >
> > > Any guidance on what the problem might be would be most appreciated.
> For example, I was wondering if there is a problem with calling
> SetPreallocation on a matrix that has already been preallocated?
> > >
> > > Some notes:
> > > - I'm using PETSc via libMesh
> > > - The code that triggers this issue is available as a PR on the
> libMesh github repo, in case anyone is interested:
> https://github.com/libMesh/libmesh/pull/460/
> > > - I can try to make a minimal pure-PETSc example that reproduces this
> error, if that would be helpful.
> > >
> > > Many thanks,
> > > David
> > >
> >
> >
>
>