[petsc-users] Diagnosing Poisson Solver Behavior

Matthew Knepley knepley at gmail.com
Thu Oct 15 10:46:39 CDT 2015


On Thu, Oct 15, 2015 at 10:43 AM, K. N. Ramachandran <knram06 at gmail.com>
wrote:

> Hello Barry,
>
> The full error message is
>
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Argument out of range
> [0]PETSC ERROR: Inserting a new nonzero at (25,27) in the matrix
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.6.2, Oct, 02, 2015
> [0]PETSC ERROR: /gpfs/work/(user name)/progs/PICC_Code/piccode on a
> linux-debug named (host) by (user) Thu Oct 15 11:22:41 2015
> [0]PETSC ERROR: Configure options --with-fc=mpif90 PETSC_ARCH=linux-debug
> --with-debugging=1 --with-clanguage=cxx --with-cxx=mpicxx --with-cc=mpicc
> --with-blas-lib="-L/usr/global/intel/mkl/11.0.0.079/lib/intel64 -lmkl_rt"
> --with-lapack-lib="-L/usr/global/intel/mkl/11.0.0.079/lib/intel64
> -lmkl_rt" --with-shared-libraries --with-x=0 --download-hypre=1
> [0]PETSC ERROR: #1 MatSetValues_SeqAIJ() line 484 in /gpfs/scratch/(user
> name)/petsc-3.6.2/src/mat/impls/aij/seq/aij.c
> [0]PETSC ERROR: #2 MatSetValues() line 1173 in /gpfs/scratch/(user
> name)/petsc-3.6.2/src/mat/interface/matrix.c
> [0]PETSC ERROR: #3 MatCopy_Basic() line 3722 in /gpfs/scratch/(user
> name)/petsc-3.6.2/src/mat/interface/matrix.c
> [0]PETSC ERROR: #4 MatCopy_SeqAIJ() line 2630 in /gpfs/scratch/(user
> name)/petsc-3.6.2/src/mat/impls/aij/seq/aij.c
> [0]PETSC ERROR: #5 MatCopy() line 3779 in /gpfs/scratch/(user
> name)/petsc-3.6.2/src/mat/interface/matrix.c
> 48        MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
> (gdb) n
> 49        MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
>
>
> I tried DMDACreate1d, specifying the dimension as n^3, where n is the
> number of nodes on one side of the cube, but I still get the same error.
> The error is generated at the MatCopy line, as mentioned earlier.
>

This will not give you what you want, since it only has 1D connectivity.
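
For a cube of n x n x n nodes you want the 3D constructor, so the DMDA
carries the 7-point connectivity and preallocates the matrix to match. A
minimal sketch, assuming PETSc 3.6, non-periodic boundaries, one degree of
freedom per node, and a grid size n defined elsewhere:

  DM             da;
  PetscErrorCode ierr;

  ierr = DMDACreate3d(PETSC_COMM_WORLD,
                      DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                      DMDA_STENCIL_STAR,  /* 7-point stencil shape         */
                      n, n, n,            /* global grid: n x n x n nodes  */
                      PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                      1,                  /* one dof per node              */
                      1,                  /* stencil width                 */
                      NULL, NULL, NULL, &da);CHKERRQ(ierr);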


> In parallel you will not be able to reuse some old code to "compute the
>> Finite difference discretization of a 3D regular grid" so you might
>> consider just discarding that old code and using what PETSc provides for
>> computing the "Finite difference discretization of a 3D regular grid."
>>
>
> Ah, OK. I had meant to say the finite difference discretization of the
> Poisson equation.
>
> If my parallel code could generate the sparse matrix arrays for the
> portion owned by each rank, would there be a convenient way to build the
> PETSc matrices that belong to the DM from them? I had noticed that
> MatCreateSeqAIJWithArrays replaces the Mat pointer passed into
> ComputeMatrix with a new object, so the actual matrix held by the DM
> remains all zeros, and my final solution comes out as zeros too.
>

This seems like the hardest way to do this. We have running examples that
scale well and produce exactly the matrix you are using. In addition, they
create the matrix in parallel, so the whole thing is scalable.

In order to do it the way you want, you would need to match the
partitioning, which is hard.
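
The pattern those examples use (see, e.g.,
src/ksp/ksp/examples/tutorials/ex45.c) is to give the KSP a callback that
fills the DMDA-created matrix with MatSetValuesStencil, so each rank writes
only the rows it owns and the DM handles the layout and preallocation. A
rough sketch, in which unit grid spacing and identity rows on the Dirichlet
boundary are simplifying assumptions:

  PetscErrorCode ComputeMatrix(KSP ksp, Mat J, Mat A, void *ctx)
  {
    DM             da;
    PetscInt       i, j, k, mx, my, mz, xs, ys, zs, xm, ym, zm;
    MatStencil     row, col[7];
    PetscScalar    v[7];
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = KSPGetDM(ksp, &da);CHKERRQ(ierr);
    ierr = DMDAGetInfo(da, 0, &mx, &my, &mz, 0, 0, 0, 0, 0, 0, 0, 0, 0);CHKERRQ(ierr);
    ierr = DMDAGetCorners(da, &xs, &ys, &zs, &xm, &ym, &zm);CHKERRQ(ierr);
    for (k = zs; k < zs + zm; k++) {
      for (j = ys; j < ys + ym; j++) {
        for (i = xs; i < xs + xm; i++) {
          row.i = i; row.j = j; row.k = k;
          if (i == 0 || j == 0 || k == 0 || i == mx-1 || j == my-1 || k == mz-1) {
            v[0] = 1.0;            /* boundary row: 1 on the diagonal */
            ierr = MatSetValuesStencil(A, 1, &row, 1, &row, v, INSERT_VALUES);CHKERRQ(ierr);
          } else {
            /* interior node: 7-point Poisson stencil */
            v[0] =  6.0; col[0].i = i;   col[0].j = j;   col[0].k = k;
            v[1] = -1.0; col[1].i = i-1; col[1].j = j;   col[1].k = k;
            v[2] = -1.0; col[2].i = i+1; col[2].j = j;   col[2].k = k;
            v[3] = -1.0; col[3].i = i;   col[3].j = j-1; col[3].k = k;
            v[4] = -1.0; col[4].i = i;   col[4].j = j+1; col[4].k = k;
            v[5] = -1.0; col[5].i = i;   col[5].j = j;   col[5].k = k-1;
            v[6] = -1.0; col[6].i = i;   col[6].j = j;   col[6].k = k+1;
            ierr = MatSetValuesStencil(A, 1, &row, 7, col, v, INSERT_VALUES);CHKERRQ(ierr);
          }
        }
      }
    }
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }

Register it with KSPSetDM(ksp, da) and KSPSetComputeOperators(ksp,
ComputeMatrix, NULL), and there is no separate serial matrix to copy in at
all.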

  Matt

Thanking You,
> K.N.Ramachandran
> Ph: 814-441-4279
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener