<div dir="ltr"><div><div>Hello Barry,<br><br></div>The full error message is<br><br><span style="font-family:monospace,monospace">[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------<br>[0]PETSC ERROR: Argument out of range<br>[0]PETSC ERROR: Inserting a new nonzero at (25,27) in the matrix<br>[0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.<br>[0]PETSC ERROR: Petsc Release Version 3.6.2, Oct, 02, 2015 <br>[0]PETSC ERROR: /gpfs/work/(user name)/progs/PICC_Code/piccode on a linux-debug named (host) by (user) Thu Oct 15 11:22:41 2015<br>[0]PETSC ERROR: Configure options --with-fc=mpif90 PETSC_ARCH=linux-debug --with-debugging=1 --with-clanguage=cxx --with-cxx=mpicxx --with-cc=mpicc --with-blas-lib="-L/usr/global/intel/mkl/<a href="http://11.0.0.079/lib/intel64">11.0.0.079/lib/intel64</a> -lmkl_rt" --with-lapack-lib="-L/usr/global/intel/mkl/<a href="http://11.0.0.079/lib/intel64">11.0.0.079/lib/intel64</a> -lmkl_rt" --with-shared-libraries --with-x=0 --download-hypre=1<br>[0]PETSC ERROR: #1 MatSetValues_SeqAIJ() line 484 in /gpfs/scratch/(user name)/petsc-3.6.2/src/mat/impls/aij/seq/aij.c<br>[0]PETSC ERROR: #2 MatSetValues() line 1173 in /gpfs/scratch/(user name)/petsc-3.6.2/src/mat/interface/matrix.c<br>[0]PETSC ERROR: #3 MatCopy_Basic() line 3722 in /gpfs/scratch/(user name)/petsc-3.6.2/src/mat/interface/matrix.c<br>[0]PETSC ERROR: #4 MatCopy_SeqAIJ() line 2630 in /gpfs/scratch/(user name)/petsc-3.6.2/src/mat/impls/aij/seq/aij.c<br>[0]PETSC ERROR: #5 MatCopy() line 3779 in /gpfs/scratch/(user name)/petsc-3.6.2/src/mat/interface/matrix.c<br>48 MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);<br>(gdb) n<br>49 MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);</span><br><br><br></div>I tried a DMDACreate1D where I specify the dimension as n^3, where n is the number of nodes on one side of the cube, but I still get the same error. The error is generated at the MatCopy line as mentioned earlier.<br><div><div class="gmail_extra"><br><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
> In parallel you will not be able to reuse some old code to "compute the
> Finite difference discretization of a 3D regular grid" so you might
> consider just discarding that old code and using what PETSc provides for
> computing the "Finite difference discretization of a 3D regular grid."

Ah, OK. I had meant to say the finite difference discretization of the Poisson equation.

If my parallel code can generate the sparse matrix arrays for the portion owned by each rank, is there a convenient way to build the PETSc matrix that belongs to the DM from them? I had noticed that MatCreateSeqAIJWithArrays only changes the location of the Mat pointer passed into ComputeMatrix, so the actual matrix held by the DM remains all zeros and my final solution comes out as zeros too. A sketch of what I tried is below.
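Roughly, what I tried inside the matrix routine looks like the following (a sketch, assuming the usual KSPSetComputeOperators-style ComputeMatrix callback; the tiny CSR arrays here are placeholders for the ones my code actually builds):

#include <petscksp.h>

PetscErrorCode ComputeMatrix(KSP ksp, Mat J, Mat B, void *ctx)
{
  /* Placeholder CSR data: a 2x2 identity standing in for my local rows */
  static PetscInt    ia[]   = {0, 1, 2};    /* row pointers   */
  static PetscInt    ja[]   = {0, 1};       /* column indices */
  static PetscScalar vals[] = {1.0, 1.0};   /* values         */

  /* This wraps my arrays in a brand-new Mat and overwrites only the local
     pointer B; the matrix the DM/KSP actually owns is never touched, so it
     stays all zeros and the final solution comes out as zeros. */
  MatCreateSeqAIJWithArrays(PETSC_COMM_SELF, 2, 2, ia, ja, vals, &B);
  return 0;
}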
Thanking you,
K.N.Ramachandran
Ph: 814-441-4279