[petsc-users] Parallel assembly of a matrix problem

Barry Smith bsmith at petsc.dev
Tue Aug 16 13:03:50 CDT 2022


MatMPIAIJSetPreallocation(A, 3, NULL, 0, NULL);

The 0 indicates you expect that there will be no matrix entries in the columns of the matrix that represent degrees of freedom that are not stored on the given rank. In your case this means you expect the matrix to be block diagonal, with one block per MPI rank. You need to put an appropriate value instead of 0. If your matrix is tridiagonal, then 1 will be enough, since only the first and last row owned by each rank have an entry in a column owned by a neighboring rank. But if your matrix has a more complicated structure (but with nonzeros on the diagonal) then you will need to use 2.
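
For example, a minimal sketch of the tridiagonal case (the sizes are illustrative and error checking is omitted):

Mat      A;
PetscInt n = 800;

MatCreate(PETSC_COMM_WORLD, &A);
MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
MatSetType(A, MATMPIAIJ);
/* at most 3 nonzeros per row in the diagonal block, and at most 1 per row
   in the off-diagonal block (the neighbor column owned by the adjacent rank) */
MatMPIAIJSetPreallocation(A, 3, NULL, 1, NULL);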

  Barry


> On Aug 16, 2022, at 1:45 PM, Patrick Alken <patrick.alken at geomag.info> wrote:
> 
> I have managed to simplify my example program and removed the dependencies on SLEPc and GSL (attached).
> 
> The program defines an 800x800 matrix with 3 nonzero entries per row, which are all set to 1.0.
> 
> When I run the program on 1 processor, everything works correctly. When I run it on 2+ processors, I get the following error:
> 
> -----
> 
> first = 400 last = 800 
> [1]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- 
> [1]PETSC ERROR: Argument out of range 
> [1]PETSC ERROR: New nonzero at (0,399) caused a malloc 
> Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check 
> [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. 
> [1]PETSC ERROR: Petsc Release Version 3.17.4, unknown  
> [1]PETSC ERROR: Configure options --download-make --download-mpich --download-scalapack --download-cmake --download-mumps 
> [1]PETSC ERROR: #1 MatSetValues_MPIAIJ() at /data/palken/petsc/src/mat/impls/aij/mpi/mpiaij.c:506 
> [1]PETSC ERROR: #2 MatSetValues() at /data/palken/petsc/src/mat/interface/matrix.c:1343 
> first = 0 last = 400 
> [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- 
> [0]PETSC ERROR: Argument out of range 
> [0]PETSC ERROR: New nonzero at (399,400) caused a malloc 
> Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check 
> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. 
> [0]PETSC ERROR: Petsc Release Version 3.17.4, unknown  
> [0]PETSC ERROR: Configure options --download-make --download-mpich --download-scalapack --download-cmake --download-mumps 
> [0]PETSC ERROR: #1 MatSetValues_MPIAIJ() at /data/palken/petsc/src/mat/impls/aij/mpi/mpiaij.c:506 
> [0]PETSC ERROR: #2 MatSetValues() at /data/palken/petsc/src/mat/interface/matrix.c:1343
> 
> -----
> 
> As far as I understand, processor 0 will have rows 0-399, while processor 1 will have rows 400-799. From my reading of the documentation, each processor will have all columns (0-799) for its block of rows. So I am at a loss to understand why the program is generating these errors. Any help is appreciated.
> 
> Patrick
> 
> On 8/15/22 04:29, Mark Adams wrote:
>> You might need to fix this:
>> 
>> if (i - j < m)
>> to 
>> if (i - j < m && i - j >= 0)
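>> 
>> For illustration, a simplified sketch of the kind of loop where this matters (here m is the band width, N the global size, and first/last the locally owned row range):
>> 
>> PetscInt nnz_row;
>> for (PetscInt i = first; i < last; ++i) {
>>   nnz_row = 0;
>>   for (PetscInt j = 0; j < N; ++j) {
>>     /* without the i - j >= 0 test, every column j > i also passes, so many
>>        more entries get counted/set than were preallocated */
>>     if (i - j < m && i - j >= 0) nnz_row++;
>>   }
>> }
>> 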
>> Mark
>> 
>> On Mon, Aug 15, 2022 at 12:12 AM Patrick Alken <patrick.alken at geomag.info> wrote:
>> I am trying to assemble a matrix in parallel in PETSc. It works with 
>> 1 processor, but when I run it with 2 processors I get errors like the 
>> following:
>> 
>> ----
>> 
>> [1]PETSC ERROR: --------------------- Error Message 
>> --------------------------------------------------------------
>> [1]PETSC ERROR: Argument out of range
>> [1]PETSC ERROR: New nonzero at (0,207) caused a malloc
>> Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn 
>> off this check
>> [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>> [1]PETSC ERROR: Petsc Release Version 3.17.4, unknown
>> [1]PETSC ERROR: #1 MatSetValues_MPIAIJ() at 
>> /data/palken/petsc/src/mat/impls/aij/mpi/mpiaij.c:506
>> 
>> ----
>> 
>> and then many MatSetValues errors follow.
>> 
>> I have attached a minimal working example and a Makefile. My code also 
>> depends on the GSL library. I think there must be a problem in how I 
>> set up the d_nnz and o_nnz arrays for the MatMPIAIJSetPreallocation() 
>> function (lines 211-235 of mwe.c).
>> 
>> I am building a matrix using radial basis functions with compact 
>> support, so the matrix entries are zero when ||Xi - Xj|| >= c, where c 
>> is the RBF shape parameter. This is accounted for on lines 224 and 254 of 
>> the code. As far as I can tell I am consistent between the way I define 
>> the nnz arrays and the way I call MatSetValues, so I don't know why PETSc 
>> would need to malloc additional memory during the assembly process.
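>> 
>> The counting logic I am aiming for is roughly the following sketch (simplified; dist(i, j) stands in for the ||Xi - Xj|| computation in mwe.c, c is the shape parameter, N is the global dimension, and first/last is this rank's row ownership range, assuming the default square layout so the same range applies to the diagonal-block columns):
>> 
>> PetscInt nlocal = last - first;
>> PetscInt *d_nnz, *o_nnz;
>> PetscMalloc2(nlocal, &d_nnz, nlocal, &o_nnz);
>> for (PetscInt i = first; i < last; ++i) {
>>   d_nnz[i - first] = 0;
>>   o_nnz[i - first] = 0;
>>   for (PetscInt j = 0; j < N; ++j) {
>>     if (dist(i, j) < c) {              /* entry (i,j) will be set        */
>>       if (j >= first && j < last)      /* column in the diagonal block   */
>>         d_nnz[i - first]++;
>>       else                             /* column in the off-diag block   */
>>         o_nnz[i - first]++;
>>     }
>>   }
>> }
>> MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz);
>> 
>> The later MatSetValues() loop applies the same dist(i, j) < c test, so in principle the counts should match what is inserted.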
>> 
>> Any pointers would be greatly appreciated.
>> 
>> Patrick
> <mwe.c><Makefile.txt>
