[petsc-users] MatMPIAIJSetPreallocation: "nnz cannot be greater than row length"

David Knezevic david.knezevic at akselos.com
Sun Feb 22 21:09:04 CST 2015


Hi Dmitry,

Thanks for the suggestion. I tried MatSetType(mat,MATMPIAIJ) followed
by MatXAIJSetPreallocation(...),
but unfortunately this still gives me the same error as before: "nnz cannot
be greater than row length: local row 168 value 24 rowlength 0".
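
For concreteness, the calls I'm making are roughly the following (a
sketch; dnnz/onnz stand for the per-row diagonal/off-diagonal counts I
compute for the new sparsity pattern):

    ierr = MatSetType(mat, MATMPIAIJ);CHKERRQ(ierr);
    /* bs = 1 for AIJ; the last two arguments are only used by SBAIJ */
    ierr = MatXAIJSetPreallocation(mat, 1, dnnz, onnz, NULL, NULL);CHKERRQ(ierr);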

I gather that the idea here is that MatSetType builds a new matrix object,
and then I should be able to pre-allocate for that new matrix however I
like, right? Was I supposed to clear the matrix object somehow before
calling MatSetType? (I didn't do any sort of clear operation.)

As I said earlier, I'll make a debug PETSc build; hopefully that will help
shed some light on what's going wrong for me.

Thanks,
David




On Sun, Feb 22, 2015 at 6:02 PM, Dmitry Karpeyev <dkarpeev at gmail.com> wrote:

> David,
> It might be easier to just rebuild the whole matrix from scratch:
> disassembling and resetting the preallocation would in effect amount to
> the same thing anyway.
> MatSetType(mat,MATMPIAIJ)
> or
> PetscObjectGetType((PetscObject)mat,&type);
> MatSetType(mat,type);
> followed by
> MatXAIJSetPreallocation(...);
> should do.
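>
> For concreteness, something along these lines (a sketch; dnnz/onnz
> stand for whatever per-row counts you compute for the new pattern):
>
>     MatType        type;
>     PetscErrorCode ierr;
>
>     /* re-setting the type rebuilds the matrix's internal data
>        structures, so it can then be preallocated afresh */
>     ierr = PetscObjectGetType((PetscObject)mat, &type);CHKERRQ(ierr);
>     ierr = MatSetType(mat, type);CHKERRQ(ierr);
>     ierr = MatXAIJSetPreallocation(mat, 1, dnnz, onnz, NULL, NULL);CHKERRQ(ierr);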
> Dmitry.
>
>
> On Sun Feb 22 2015 at 4:45:46 PM Barry Smith <bsmith at mcs.anl.gov> wrote:
>
>>
>>  Do not call it for a SeqAIJ matrix. Do not call it before the first
>> time you have preallocated the matrix, put entries in it, and done the
>> MatAssemblyBegin/End()
>>
>>   If it still crashes you'll need to try the debugger
>>
>>   Barry
>>
>> > On Feb 22, 2015, at 4:09 PM, David Knezevic <david.knezevic at akselos.com>
>> wrote:
>> >
>> > Hi Barry,
>> >
>> > Thanks for your help, much appreciated.
>> >
>> > I added a prototype for MatDisAssemble_MPIAIJ:
>> > PETSC_INTERN PetscErrorCode MatDisAssemble_MPIAIJ(Mat);
>> >
>> > and I added a call to MatDisAssemble_MPIAIJ before
>> MatMPIAIJSetPreallocation. However, I get a segfault on the call to
>> MatDisAssemble_MPIAIJ. The segfault occurs in both serial and parallel.
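>> >
>> > In code form, what I have is roughly (a sketch; d_nnz/o_nnz are the
>> > per-row counts we compute on the libMesh side):
>> >
>> >     PETSC_INTERN PetscErrorCode MatDisAssemble_MPIAIJ(Mat);
>> >
>> >     ierr = MatDisAssemble_MPIAIJ(mat);CHKERRQ(ierr);  /* segfaults here */
>> >     ierr = MatMPIAIJSetPreallocation(mat, 0, d_nnz, 0, o_nnz);CHKERRQ(ierr);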
>> >
>> > FYI, I'm using PETSc 3.5.2, and I'm using a non-debug build (though
>> I could rebuild PETSc in debug mode if you think that would help figure out
>> what's happening here).
>> >
>> > Thanks,
>> > David
>> >
>> >
>> >
>> > On Sun, Feb 22, 2015 at 1:13 PM, Barry Smith <bsmith at mcs.anl.gov>
>> wrote:
>> >
>> >   David,
>> >
>> >   This is an obscure little feature of MatMPIAIJ: each time you
>> change the sparsity pattern, before you call MatMPIAIJSetPreallocation
>> you need to call MatDisAssemble_MPIAIJ(Mat mat). This is a private
>> PETSc function, so you need to provide your own prototype for it above
>> the function you use it in.
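>> >
>> > A sketch of the intended sequence (d_nnz/o_nnz stand for your new
>> > per-row counts; the prototype goes at file scope):
>> >
>> >     PETSC_INTERN PetscErrorCode MatDisAssemble_MPIAIJ(Mat);  /* private PETSc routine */
>> >
>> >     /* inside the routine that rebuilds the matrix: */
>> >     ierr = MatDisAssemble_MPIAIJ(mat);CHKERRQ(ierr);
>> >     ierr = MatMPIAIJSetPreallocation(mat, 0, d_nnz, 0, o_nnz);CHKERRQ(ierr);
>> >     /* ... MatSetValues(...) with the new pattern ... */
>> >     ierr = MatAssemblyBegin(mat, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
>> >     ierr = MatAssemblyEnd(mat, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);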
>> >
>> >   Let us know if this resolves the problem.
>> >
>> >    Barry
>> >
>> > We never really intended that people would call
>> MatMPIAIJSetPreallocation() AFTER they had already used the matrix.
>> >
>> >
>> > > On Feb 22, 2015, at 6:50 AM, David Knezevic <
>> david.knezevic at akselos.com> wrote:
>> > >
>> > > Hi all,
>> > >
>> > > I've implemented a solver for a contact problem using SNES. The
>> sparsity pattern of the Jacobian matrix needs to change at each nonlinear
>> iteration (because the set of elements in contact can change), so I
>> tried to deal with this by calling MatSeqAIJSetPreallocation and
>> MatMPIAIJSetPreallocation during each iteration in order to update the
>> preallocation.
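>> > >
>> > > Roughly, at each iteration the update looks like this (a sketch;
>> > > nnz, d_nnz, and o_nnz are the per-row counts recomputed for the
>> > > current contact configuration):
>> > >
>> > >     /* each call is a no-op for the non-matching matrix type */
>> > >     ierr = MatSeqAIJSetPreallocation(mat, 0, nnz);CHKERRQ(ierr);
>> > >     ierr = MatMPIAIJSetPreallocation(mat, 0, d_nnz, 0, o_nnz);CHKERRQ(ierr);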
>> > >
>> > > This seems to work fine in serial, but with two or more MPI processes
>> I run into the error "nnz cannot be greater than row length", e.g.:
>> > > nnz cannot be greater than row length: local row 528 value 12
>> rowlength 0
>> > >
>> > > This error is from the call to
>> > > MatSeqAIJSetPreallocation(b->B,o_nz,o_nnz); in
>> MatMPIAIJSetPreallocation_MPIAIJ.
>> > >
>> > > Any guidance on what the problem might be would be most appreciated.
>> For example, I was wondering if there is a problem with calling
>> SetPreallocation on a matrix that has already been preallocated?
>> > >
>> > > Some notes:
>> > > - I'm using PETSc via libMesh
>> > > - The code that triggers this issue is available as a PR on the
>> libMesh github repo, in case anyone is interested:
>> https://github.com/libMesh/libmesh/pull/460/
>> > > - I can try to make a minimal pure-PETSc example that reproduces this
>> error, if that would be helpful.
>> > >
>> > > Many thanks,
>> > > David
>> > >
>> >
>> >
>>
>>