[petsc-users] MatMPIAIJSetPreallocation: "nnz cannot be greater than row length"
Barry Smith
bsmith at mcs.anl.gov
Sun Feb 22 12:13:05 CST 2015
David,
This is an obscure little feature of MatMPIAIJ: each time you change the sparsity pattern, you need to call MatDisAssemble_MPIAIJ(Mat mat) before you call MatMPIAIJSetPreallocation(). This is a private PETSc function, so you need to provide your own prototype for it above the function in which you use it.
Let us know if this resolves the problem.
Barry
We never really intended that people would call MatMPIAIJSetPreallocation() AFTER they had already used the matrix.
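A minimal sketch of the workaround described above. The helper function name and the nnz arrays are hypothetical; the key points are the hand-written prototype for the private MatDisAssemble_MPIAIJ and calling it before re-preallocating:

```c
#include <petscmat.h>

/* MatDisAssemble_MPIAIJ is private to PETSc, so we declare the
   prototype ourselves, as suggested above. */
PETSC_EXTERN PetscErrorCode MatDisAssemble_MPIAIJ(Mat);

/* Hypothetical helper: re-preallocate an MPIAIJ matrix that has
   already been assembled, when the sparsity pattern changes
   between nonlinear iterations. */
PetscErrorCode ResetPreallocation(Mat A, const PetscInt d_nnz[],
                                  const PetscInt o_nnz[])
{
  PetscErrorCode ierr;

  /* Undo the assembled (compressed) state so the preallocation
     routine sees a fresh matrix. */
  ierr = MatDisAssemble_MPIAIJ(A);CHKERRQ(ierr);

  /* Now it is safe to set the new per-row preallocation. */
  ierr = MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz);CHKERRQ(ierr);
  return 0;
}
```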
> On Feb 22, 2015, at 6:50 AM, David Knezevic <david.knezevic at akselos.com> wrote:
>
> Hi all,
>
> I've implemented a solver for a contact problem using SNES. The sparsity pattern of the Jacobian matrix needs to change at each nonlinear iteration (because the set of elements in contact can change), so I tried to handle this by calling MatSeqAIJSetPreallocation and MatMPIAIJSetPreallocation during each iteration in order to update the preallocation.
>
> This seems to work fine in serial, but with two or more MPI processes I run into the error "nnz cannot be greater than row length", e.g.:
> nnz cannot be greater than row length: local row 528 value 12 rowlength 0
>
> This error comes from the call to MatSeqAIJSetPreallocation(b->B, o_nz, o_nnz); in MatMPIAIJSetPreallocation_MPIAIJ.
>
> Any guidance on what the problem might be would be most appreciated. For example, I was wondering whether there is a problem with calling SetPreallocation on a matrix that has already been preallocated?
>
> Some notes:
> - I'm using PETSc via libMesh
> - The code that triggers this issue is available as a PR on the libMesh github repo, in case anyone is interested: https://github.com/libMesh/libmesh/pull/460/
> - I can try to make a minimal pure-PETSc example that reproduces this error, if that would be helpful.
>
> Many thanks,
> David
>