[petsc-users] Parallel assembly of a matrix problem
Patrick Alken
patrick.alken at geomag.info
Tue Aug 16 12:45:52 CDT 2022
I have managed to simplify my example program, and removed the
dependencies on slepc and gsl (attached).
The program defines an 800x800 matrix with 3 non-zero entries per row,
all set to 1.0.
When I run the program on 1 processor, everything works correctly. When
I run it on 2+ processors, I get the following error:
-----
first = 400 last = 800
[1]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[1]PETSC ERROR: Argument out of range
[1]PETSC ERROR: New nonzero at (0,399) caused a malloc
Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn
off this check
[1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.17.4, unknown
[1]PETSC ERROR: Configure options --download-make --download-mpich
--download-scalapack --download-cmake --download-mumps
[1]PETSC ERROR: #1 MatSetValues_MPIAIJ() at
/data/palken/petsc/src/mat/impls/aij/mpi/mpiaij.c:506
[1]PETSC ERROR: #2 MatSetValues() at
/data/palken/petsc/src/mat/interface/matrix.c:1343
first = 0 last = 400
[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[0]PETSC ERROR: Argument out of range
[0]PETSC ERROR: New nonzero at (399,400) caused a malloc
Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn
off this check
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.17.4, unknown
[0]PETSC ERROR: Configure options --download-make --download-mpich
--download-scalapack --download-cmake --download-mumps
[0]PETSC ERROR: #1 MatSetValues_MPIAIJ() at
/data/palken/petsc/src/mat/impls/aij/mpi/mpiaij.c:506
[0]PETSC ERROR: #2 MatSetValues() at
/data/palken/petsc/src/mat/interface/matrix.c:1343
-----
As far as I understand, process 0 owns rows 0-399 and process 1 owns
rows 400-799 (PETSc row ownership is 0-based, matching the "first = 0
last = 400" and "first = 400 last = 800" output above, with "last"
exclusive). From my reading of the documentation, each process can set
entries in any column (0-799) of its block of rows. So I am at a loss
to understand why the program is generating these errors. Any help is
appreciated.
Patrick
On 8/15/22 04:29, Mark Adams wrote:
> You might need to fix this:
>
> if (i - j < m)
> to
> if (i - j < m && i - j >= 0)
> Mark
>
> On Mon, Aug 15, 2022 at 12:12 AM Patrick Alken
> <patrick.alken at geomag.info> wrote:
>
> I am trying to assemble a matrix in parallel in PETSc, and it is
> working for 1 processor, but when I try running it with 2 processors
> I get errors like the following:
>
> ----
>
> [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [1]PETSC ERROR: Argument out of range
> [1]PETSC ERROR: New nonzero at (0,207) caused a malloc
> Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE)
> to turn
> off this check
> [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble
> shooting.
> [1]PETSC ERROR: Petsc Release Version 3.17.4, unknown
> [1]PETSC ERROR: #1 MatSetValues_MPIAIJ() at
> /data/palken/petsc/src/mat/impls/aij/mpi/mpiaij.c:506
>
> ----
>
> and then many MatSetValues errors follow.
>
> I have attached a minimal working example and a Makefile. My code also
> depends on the GSL library. I think there must be a problem in how I
> set up the d_nnz and o_nnz arrays for the MatMPIAIJSetPreallocation()
> function (lines 211-235 of mwe.c).
>
> I am building a matrix using radial basis functions with compact
> support, so the matrix entries are zero when ||Xi - Xj|| >= c, where c
> is the RBF shape parameter. This is accounted for on lines 224 and 254
> of the code. As far as I can tell I am consistent in the way I define
> the nnz arrays and the way I call MatSetValues, so I don't know why
> PETSc would need to malloc additional memory during assembly.
>
> Any pointers would be greatly appreciated.
>
> Patrick
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20220816/6b7e3a18/attachment-0001.html>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: mwe.c
Type: text/x-csrc
Size: 1738 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20220816/6b7e3a18/attachment-0001.bin>
-------------- next part --------------
default: mwe

include ${PETSC_DIR}/lib/petsc/conf/variables

OBJECTS = mwe.o

mwe.o: mwe.c
	${CLINKER} -I${PETSC_DIR}/include -c $< -o $@

mwe: ${OBJECTS}
	${CLINKER} -o mwe ${OBJECTS} ${PETSC_KSP_LIB}

clean:
	rm -f mwe mwe.o