[petsc-users] How to correctly call MatXAIJSetPreallocation for MATIS?

Eric Chamberland Eric.Chamberland at giref.ulaval.ca
Tue Nov 10 23:10:35 CST 2020


Hi,

I am trying to add support for MATIS in our code to be able to use 
PCBDDC and other preconditioners.

So far, I have modified our matrix creation to add a call to 
MatSetLocalToGlobalMapping, and also modified our non-zero counting so 
that we give "local" non-zero counts to MatXAIJSetPreallocation.
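
In code, the setup looks roughly like this (a simplified sketch; 
nLocalDofs, globalIndices and nnz_d stand in for our own data 
structures, and I use block size 1 here):

  ISLocalToGlobalMapping l2g;
  Mat                    A;
  PetscErrorCode         ierr;

  /* globalIndices[i] = global number of local dof i (the "Local,Global"
     tables below); nLocalDofs = 11, 11 and 13 on ranks 0, 1 and 2 */
  ierr = ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, nLocalDofs,
                                      globalIndices, PETSC_COPY_VALUES,
                                      &l2g);CHKERRQ(ierr);

  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 25, 25);CHKERRQ(ierr);
  ierr = MatSetType(A, MATIS);CHKERRQ(ierr);
  ierr = MatSetLocalToGlobalMapping(A, l2g, l2g);CHKERRQ(ierr);

  /* nnz_d[i] = non-zeros counted on local row i (values listed below) */
  ierr = MatXAIJSetPreallocation(A, 1, nnz_d, NULL, NULL, NULL);CHKERRQ(ierr);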

When I run with, don't laugh, 1 process, everything is fine! So far so 
good. ;)

When I run with 2 processes, fine also...

When I run with 3 processes, ranks 1 and 2 give errors:

[1]PETSC ERROR: New nonzero at (7,5) caused a malloc

and

[2]PETSC ERROR: New nonzero at (10,2) caused a malloc

I understand that these new nonzero indices are *local* indices. The 
global indices I am trying to do assembly on are:

proc 1:

global row 3 (local 7)
global columns: 3 7 10 15 *16* 17 (local 7 9 10 4 *5* 6)  // the 
starred entry is the faulty column, meaning only 4 nnz have been 
allocated!?

proc 2:

global row 16 (local 10)
global columns: 3 8 16 17 *20* 22 23 24 (local 8 11 10 12 *2* 4 5 6)  // 
the starred entry is the faulty column, meaning only 4 nnz have been 
allocated!?
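
(For reference, the local indices above were obtained from the global 
ones with the same l2g mapping, roughly like this; nCols and globalCols 
are placeholders for our data:)

  PetscInt nOut, localCols[8];
  /* translate the global column indices to local ones */
  ierr = ISGlobalToLocalMappingApply(l2g, IS_GTOLM_MASK, nCols, globalCols,
                                     &nOut, localCols);CHKERRQ(ierr);
  /* with IS_GTOLM_MASK, indices not present in the mapping come back as -1 */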

The error is returned at the moment we do the first assembly.
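
Our assembly is essentially this (again a sketch; nRows, rows, nCols, 
cols and values come from our element computations):

  /* one call per element; for MATIS, rows/cols are global indices and
     get translated to local ones through the mapping */
  ierr = MatSetValuesBlocked(A, nRows, rows, nCols, cols, values,
                             ADD_VALUES);CHKERRQ(ierr);
  /* ... repeated over all elements ... */
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);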

After checking my nnz counts, I just can't see why PETSc complains 
about a malloc: as far as I can tell, I counted them correctly...

Global matrix "size": 25 x 25 (given to MatSetSizes)

Here are the non-zero counts given to MatXAIJSetPreallocation, followed 
by the mapping used when creating the ISLocalToGlobalMapping:

=============

Process 0:

nnz_d[Local:0]=4
nnz_d[Local:1]=6
nnz_d[Local:2]=4
nnz_d[Local:3]=4
nnz_d[Local:4]=6
nnz_d[Local:5]=4
nnz_d[Local:6]=6
nnz_d[Local:7]=8
nnz_d[Local:8]=6
nnz_d[Local:9]=9
nnz_d[Local:10]=4

Local,Global:

0,0
1,1
2,2
3,3
4,4
5,5
6,6
7,7
8,8
9,9
10,10

=============

Process 1:


nnz_d[Local:0]=4
nnz_d[Local:1]=6
nnz_d[Local:2]=6
nnz_d[Local:3]=4
nnz_d[Local:4]=9
nnz_d[Local:5]=4
nnz_d[Local:6]=6
nnz_d[Local:7]=6
nnz_d[Local:8]=4
nnz_d[Local:9]=4
nnz_d[Local:10]=8

Local,Global:

0,11
1,12
2,13
3,14
4,15
5,16
6,17
7,3
8,5
9,7
10,10

=============

Process 2:

nnz_d[Local:0]=4
nnz_d[Local:1]=4
nnz_d[Local:2]=6
nnz_d[Local:3]=6
nnz_d[Local:4]=6
nnz_d[Local:5]=6
nnz_d[Local:6]=9
nnz_d[Local:7]=4
nnz_d[Local:8]=4
nnz_d[Local:9]=4
nnz_d[Local:10]=8
nnz_d[Local:11]=6
nnz_d[Local:12]=6

Local,Global:

0,18
1,19
2,20
3,21
4,22
5,23
6,24
7,2
8,3
9,14
10,16
11,8
12,17

=============

I have run with valgrind, and everything is OK.

So, why don't I have enough values reserved for local row 7 on rank 1 
and local row 10 on rank 2?
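
If it helps, here is how I could inspect the preallocation actually in 
effect through the rank-local matrix (just a sketch using 
MatISGetLocalMat):

  Mat     lA;
  MatInfo info;
  /* each rank of a MATIS stores a sequential local matrix */
  ierr = MatISGetLocalMat(A, &lA);CHKERRQ(ierr);
  ierr = MatGetInfo(lA, MAT_LOCAL, &info);CHKERRQ(ierr);
  /* compare info.nz_allocated with info.nz_used */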

Thanks for your insights,

Eric

ps: Here is the backtrace:

[1]PETSC ERROR: --------------------- Error Message 
--------------------------------------------------------------
[1]PETSC ERROR: Argument out of range
[1]PETSC ERROR: New nonzero at (7,5) caused a malloc
Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn 
off this check
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html 
for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.11.2, May, 18, 2019
[1]PETSC ERROR: Test.MEF++.dev on a  named rohan by ericc Tue Nov 10 
23:39:47 2020
[1]PETSC ERROR: Configure options 
--prefix=/opt/petsc-3.11.2_debug_openmpi-4.0.1 --with-mpi-compilers=1 
--with-mpi-dir=/opt/openmpi-4.0.1 --with-cxx-dialect=C++11 
--with-make-np=12 --with-shared-libraries=1 --with-debugging=yes 
--with-memalign=64 --with-visibility=0 --with-64-bit-indices=0 
--download-ml=yes --download-mumps=yes --download-superlu=yes 
--download-superlu_dist=yes --download-parmetis=yes 
--download-ptscotch=yes --download-metis=yes --download-suitesparse=yes 
--download-hypre=yes 
--with-blaslapack-dir=/opt/intel/composer_xe_2015.2.164/mkl/lib/intel64 
--with-mkl_pardiso-dir=/opt/intel/composer_xe_2015.2.164/mkl 
--with-mkl_cpardiso-dir=/opt/intel/composer_xe_2015.2.164/mkl 
--with-scalapack=1 
--with-scalapack-include=/opt/intel/composer_xe_2015.2.164/mkl/include 
--with-scalapack-lib="-L/opt/intel/composer_xe_2015.2.164/mkl/lib/intel64 
-lmkl_scalapack_lp64 -lmkl_blacs_openmpi_lp64"
[1]PETSC ERROR: #1 MatSetValues_SeqAIJ() line 481 in 
/home/mefpp_ericc/depots_prepush/cpetssc/ompi-opt/petsc-3.11.2-debug/src/mat/impls/aij/seq/aij.c
[1]PETSC ERROR: #2 MatSetValues() line 1407 in 
/home/mefpp_ericc/depots_prepush/cpetssc/ompi-opt/petsc-3.11.2-debug/src/mat/interface/matrix.c
[1]PETSC ERROR: #3 MatSetValuesBlocked() line 1919 in 
/home/mefpp_ericc/depots_prepush/cpetssc/ompi-opt/petsc-3.11.2-debug/src/mat/interface/matrix.c
[1]PETSC ERROR: #4 MatSetValuesBlocked_IS() line 2609 in 
/home/mefpp_ericc/depots_prepush/cpetssc/ompi-opt/petsc-3.11.2-debug/src/mat/impls/is/matis.c
[1]PETSC ERROR: #5 MatSetValuesBlocked() line 1898 in 
/home/mefpp_ericc/depots_prepush/cpetssc/ompi-opt/petsc-3.11.2-debug/src/mat/interface/matrix.c
_GIREF_ASSERTION(false)
see file 
/home/mefpp_ericc/depots_prepush/GIREF/src/commun/Petsc/MatricePETSc.cc:858
    ---->  FATAL ERROR: PETSc error


-- 
Eric Chamberland, ing., M. Ing
Research professional
GIREF/Université Laval
(418) 656-2131 ext. 41 22 42


