[petsc-users] Parallel assembly of a matrix problem

Barry Smith bsmith at petsc.dev
Mon Aug 15 08:28:55 CDT 2022


    There are two sets of tools that help compute the preallocation in a "pre-compute" step: they efficiently "count" what you will need later, so you don't have to figure out the preallocation information yourself. They are both slightly clumsy, unfortunately, but should work fine.

https://petsc.org/main/docs/manualpages/Mat/MATPREALLOCATOR/?highlight=matpreallocator
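
With the first, you create a matrix of type MATPREALLOCATOR, run your existing insertion loop on it once (it only records the nonzero pattern, not the values), and then hand the counted pattern to the real matrix. A minimal sketch of that pattern; the names A, preallocator, and the sizes m, n, M, N are placeholders, not from your code:

    Mat A, preallocator;

    /* a "counting" matrix with the same layout as the real one */
    MatCreate(PETSC_COMM_WORLD, &preallocator);
    MatSetType(preallocator, MATPREALLOCATOR);
    MatSetSizes(preallocator, m, n, M, N);
    MatSetUp(preallocator);

    /* run the SAME MatSetValues() loop you will use for the real
       assembly; only the pattern is recorded, values are discarded */
    MatAssemblyBegin(preallocator, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(preallocator, MAT_FINAL_ASSEMBLY);

    /* transfer the counted pattern to the real matrix */
    MatCreate(PETSC_COMM_WORLD, &A);
    MatSetType(A, MATMPIAIJ);
    MatSetSizes(A, m, n, M, N);
    MatPreallocatorPreallocate(preallocator, PETSC_TRUE, A);
    MatDestroy(&preallocator);
    /* now run the insertion loop a second time, on A */

Passing PETSC_TRUE as the fill argument inserts explicit zeros for the whole pattern, so the second insertion pass cannot trigger any new mallocs.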

https://petsc.org/main/docs/manualpages/Mat/MatPreallocateBegin/?highlight=matpreallocatebegin
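
The second is a set of macros you wrap around your own counting loop; they manage the d_nnz/o_nnz arrays for you. Roughly, following the manual page (nlocalrows/nlocalcols are the local sizes, and the row and cols[] passed to MatPreallocateSet() are global indices):

    PetscInt *dnz, *onz;

    MatPreallocateBegin(PETSC_COMM_WORLD, nlocalrows, nlocalcols, dnz, onz);
    /* for each locally owned global row i, gather the global column
       indices cols[0..nc-1] you will later insert, then: */
    MatPreallocateSet(i, nc, cols, dnz, onz);
    MatMPIAIJSetPreallocation(A, 0, dnz, 0, onz);
    MatPreallocateEnd(dnz, onz);  /* frees dnz/onz */

Note that Begin/End are macros that open and close a C scope, so the whole pattern has to live inside one function.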

> On Aug 15, 2022, at 12:12 AM, Patrick Alken <patrick.alken at geomag.info> wrote:
> 
> I am trying to assemble a matrix in parallel in PETSc. It works with 1 processor, but when I run it with 2 processors I get errors like the following:
> 
> ----
> 
> [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [1]PETSC ERROR: Argument out of range
> [1]PETSC ERROR: New nonzero at (0,207) caused a malloc
> Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check
> [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [1]PETSC ERROR: Petsc Release Version 3.17.4, unknown
> [1]PETSC ERROR: #1 MatSetValues_MPIAIJ() at /data/palken/petsc/src/mat/impls/aij/mpi/mpiaij.c:506
> 
> ----
> 
> and then many MatSetValues errors follow.
> 
> I have attached a minimal working example and a Makefile. My code also depends on the GSL library. I think there must be a problem in how I set up the d_nnz and o_nnz arrays for the MatMPIAIJSetPreallocation() function (lines 211-235 of mwe.c).
> 
> I am building a matrix using radial basis functions with compact support, so the matrix entries are zero when ||X_i - X_j|| >= c, where c is the RBF shape parameter. This is accounted for on lines 224 and 254 of the code. As far as I can tell, I am consistent in the way I define the nnz arrays and the way I call MatSetValues(), so I don't know why PETSc would need to malloc additional memory during the assembly process.
> 
> Any pointers would be greatly appreciated.
> 
> Patrick
> <mwe.c><Makefile.txt>
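
For reference, the MPIAIJ split works like this: a column index j in a locally owned row counts toward d_nnz only if j also falls in this process's column ownership range, and toward o_nnz otherwise. Below is a minimal hand-counting sketch, assuming a square N x N matrix with the default PETSC_DECIDE row layout; rbf_nonzero() is a hypothetical stand-in for the ||X_i - X_j|| < c test in mwe.c:

    #include <petscmat.h>

    /* hypothetical support test: PETSC_TRUE iff ||X_i - X_j|| < c */
    extern PetscBool rbf_nonzero(PetscInt i, PetscInt j);

    PetscErrorCode preallocate_rbf(Mat A, PetscInt N)
    {
      PetscInt m = PETSC_DECIDE, rstart, *d_nnz, *o_nnz;

      PetscFunctionBeginUser;
      /* default row split, then the first locally owned row */
      PetscCall(PetscSplitOwnership(PETSC_COMM_WORLD, &m, &N));
      PetscCallMPI(MPI_Scan(&m, &rstart, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD));
      rstart -= m;

      PetscCall(PetscCalloc2(m, &d_nnz, m, &o_nnz));
      for (PetscInt i = rstart; i < rstart + m; i++) {
        for (PetscInt j = 0; j < N; j++) {
          if (!rbf_nonzero(i, j)) continue;  /* structurally zero entry */
          /* square matrix, default layouts: the diagonal block spans
             the same index range in columns as the owned rows */
          if (j >= rstart && j < rstart + m) d_nnz[i - rstart]++;
          else                               o_nnz[i - rstart]++;
        }
      }
      PetscCall(MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz));
      PetscCall(PetscFree2(d_nnz, o_nnz));
      PetscFunctionReturn(0);
    }

Every later MatSetValues() call must stay inside exactly this pattern; a single (row, col) pair counted in the wrong block is enough to trigger the "new nonzero caused a malloc" error you are seeing.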
