<div dir="ltr">Hi Barry,<div><br></div><div>Thanks for your help, much appreciated.</div><div><br></div><div>I added a prototype for MatDisAssemble_MPIAIJ:</div><div>PETSC_INTERN PetscErrorCode MatDisAssemble_MPIAIJ(Mat);<br></div><div><br></div><div>and I added a call to MatDisAssemble_MPIAIJ before MatMPIAIJSetPreallocation. However, I get a segfault on the call to MatDisAssemble_MPIAIJ. The segfault occurs in both serial and parallel.</div><div><br></div><div>FYI, I'm using Petsc 3.5.2, and I'm not using a non-debug build (though I could rebuild PETSc in debug mode if you think that would help figure out what's happening here).</div><div><br></div><div>Thanks,</div><div>David</div><div><br></div><div><br></div><div class="gmail_extra"><br clear="all"><div><div class="gmail_signature"><div dir="ltr"><span style="color:rgb(0,0,0);font-family:Calibri,sans-serif;font-size:12px"><pre cols="72"><span style="font-family:arial,sans-serif;font-size:small;color:rgb(34,34,34)">On Sun, Feb 22, 2015 at 1:13 PM, Barry Smith </span><span dir="ltr" style="font-family:arial,sans-serif;font-size:small;color:rgb(34,34,34)"><<a href="mailto:bsmith@mcs.anl.gov" target="_blank">bsmith@mcs.anl.gov</a>></span><span style="font-family:arial,sans-serif;font-size:small;color:rgb(34,34,34)"> wrote:</span><br></pre></span></div></div></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex"><br>
David,

This is an obscure little feature of MatMPIAIJ: each time you change the sparsity pattern, you need to call MatDisAssemble_MPIAIJ(Mat mat) before you call MatMPIAIJSetPreallocation(). This is a private PETSc function, so you need to provide your own prototype for it above the function in which you use it.
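[For reference, a minimal sketch of the call sequence described above, not taken from the original messages: it assumes an existing, previously assembled MPIAIJ matrix A, and the helper name ResetPreallocation and the d_nnz/o_nnz count arrays are made up for illustration.

    #include <petscmat.h>

    /* MatDisAssemble_MPIAIJ() is private to PETSc, so the prototype must be
       supplied by the calling code. */
    PETSC_INTERN PetscErrorCode MatDisAssemble_MPIAIJ(Mat);

    /* Hypothetical helper: A is an existing, already-assembled MPIAIJ matrix;
       d_nnz/o_nnz hold the per-row diagonal/off-diagonal counts for the new
       sparsity pattern. */
    PetscErrorCode ResetPreallocation(Mat A, const PetscInt d_nnz[], const PetscInt o_nnz[])
    {
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      /* Undo the assembled off-process structure so the pattern can change. */
      ierr = MatDisAssemble_MPIAIJ(A);CHKERRQ(ierr);
      /* Re-preallocate with the new per-row counts; the scalar d_nz/o_nz
         arguments are ignored when the d_nnz/o_nnz arrays are given. */
      ierr = MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }

The only essential point is that MatDisAssemble_MPIAIJ is called on the same matrix immediately before MatMPIAIJSetPreallocation.]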
Let us know if this resolves the problem.

Barry

We never really intended that people would call MatMPIAIJSetPreallocation() AFTER they had already used the matrix.


> On Feb 22, 2015, at 6:50 AM, David Knezevic <david.knezevic@akselos.com> wrote:
>
> Hi all,
>
> I've implemented a solver for a contact problem using SNES. The sparsity pattern of the Jacobian matrix needs to change at each nonlinear iteration (because the elements which are in contact can change), so I tried to deal with this by calling MatSeqAIJSetPreallocation and MatMPIAIJSetPreallocation during each iteration in order to update the preallocation.
>
> This seems to work fine in serial, but with two or more MPI processes I run into the error "nnz cannot be greater than row length", e.g.:
> nnz cannot be greater than row length: local row 528 value 12 rowlength 0
>
> This error is from the call to
> MatSeqAIJSetPreallocation(b->B,o_nz,o_nnz); in MatMPIAIJSetPreallocation_MPIAIJ.
>
> Any guidance on what the problem might be would be most appreciated. For example, I was wondering if there is a problem with calling SetPreallocation on a matrix that has already been preallocated?
>
> Some notes:
> - I'm using PETSc via libMesh
> - The code that triggers this issue is available as a PR on the libMesh github repo, in case anyone is interested: https://github.com/libMesh/libmesh/pull/460/
> - I can try to make a minimal pure-PETSc example that reproduces this error, if that would be helpful.
>
> Many thanks,
> David
>