[petsc-users] How to correctly call MatXAIJSetPreallocation for MATIS?

Eric Chamberland Eric.Chamberland at giref.ulaval.ca
Wed Nov 11 20:57:04 CST 2020


Hi,

I will test what Barry suggested, but I have already done 2 tests 
following Stefano's remarks (and I also upgraded to PETSc 3.14.1, which 
changed nothing):

=================================

test #1- Extract the local matrix with MatISGetLocalMat, then call 
MatXAIJSetPreallocation with the local non-zeros I sent you in my first 
email: it worked!
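
In code, this is roughly what test #1 does (a sketch; "nnz_local" stands 
for my per-row counts for the local matrix, and I use block size 1 here 
just for illustration):

    Mat lA;
    ierr = MatISGetLocalMat(A, &lA);CHKERRQ(ierr);
    /* nnz_local has one entry per row of the *local* matrix */
    ierr = MatXAIJSetPreallocation(lA, 1, nnz_local, NULL, NULL, NULL);CHKERRQ(ierr);
    ierr = MatISRestoreLocalMat(A, &lA);CHKERRQ(ierr);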

=================================

*but*:

=================================

test #2- Pass the "same" non-zero vectors as if I had a "normal" MPI 
matrix.
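
In code, test #2 is roughly this (a sketch; dnnz/onnz are the same 
per-owned-row counts I would give to a regular MPIAIJ matrix, and the 
index arrays, values and insert mode below are placeholders):

    /* A has type MATIS; MatSetLocalToGlobalMapping was already called */
    ierr = MatXAIJSetPreallocation(A, 1, dnnz, onnz, NULL, NULL);CHKERRQ(ierr);
    /* ... later, assembly with global indices, which is where it fails: */
    ierr = MatSetValuesBlocked(A, m, globalRows, n, globalCols, values, ADD_VALUES);CHKERRQ(ierr);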

This gives me a very different error:

[0]PETSC ERROR: --------------------- Error Message 
--------------------------------------------------------------
[0]PETSC ERROR: No support for this operation for this object type
[0]PETSC ERROR: Some of the column indices can not be mapped! Maybe you 
should not use MATIS
[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html 
for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.14.1, Nov 03, 2020
[0]PETSC ERROR: Test.MEF++.dev on a  named rohan by ericc Wed Nov 11 
21:35:55 2020
[0]PETSC ERROR: Configure options 
--prefix=/opt/petsc-3.14.1_debug_openmpi-4.0.5 --with-mpi-compilers=1 
--with-mpi-dir=/opt/openmpi-4.0.5 --with-cxx-dialect=C++11 
--with-make-np=12 --with-shared-libraries=1 --with-debugging=yes 
--with-memalign=64 --with-visibility=0 --with-64-bit-indices=0 
--download-ml=yes --download-mumps=yes --download-superlu=yes 
--download-superlu_dist=yes --download-parmetis=yes 
--download-ptscotch=yes --download-metis=yes --download-suitesparse=yes 
--download-hypre=yes 
--with-blaslapack-dir=/opt/intel/composer_xe_2015.2.164/mkl/lib/intel64 
--with-mkl_pardiso-dir=/opt/intel/composer_xe_2015.2.164/mkl 
--with-mkl_cpardiso-dir=/opt/intel/composer_xe_2015.2.164/mkl 
--with-scalapack=1 
--with-scalapack-include=/opt/intel/composer_xe_2015.2.164/mkl/include 
--with-scalapack-lib="-L/opt/intel/composer_xe_2015.2.164/mkl/lib/intel64 
-lmkl_scalapack_lp64 -lmkl_blacs_openmpi_lp64"
[0]PETSC ERROR: #1 MatSetValuesBlocked_IS() line 2555 in 
/home/mefpp_ericc/ompi-opt/petsc-3.14.1-debug/src/mat/impls/is/matis.c
[0]PETSC ERROR: #2 MatSetValuesBlocked() line 1848 in 
/home/mefpp_ericc/ompi-opt/petsc-3.14.1-debug/src/mat/interface/matrix.c

==================================

So I am still mystified about how to call MatXAIJSetPreallocation for 
the MATIS type.  Here are more precise questions/remarks:

a) What is the correct length of the dnnz vector for a MATIS when 
calling MatXAIJSetPreallocation?  I set it to the size of the Mat 
obtained from MatISGetLocalMat.

b) Since all non-zeros are supposed to be "local", what should one put 
in onnz for MatXAIJSetPreallocation?  In my case I put nothing: all 
entries are 0 and, as for dnnz, the length is that of the local Mat...

Since my test #1 is now working correctly, I can go on with my 
modifications, but for the convenience of other users, maybe some 
further tests should be done?

I will test Barry's suggestions ASAP.

Thanks a lot!

Eric

On 2020-11-11 6:47 a.m., Stefano Zampini wrote:
> Eric
>
> just use the same arrays you provide for the other matrix types. The 
> purpose of having support for MATIS in MatXAIJSetPreallocation is 
> exactly to not preallocate the local matrices, but to treat the matrix 
> as if it were in "assembled" (AIJ) form. The MATIS code does the local 
> preallocation for you (a little bit overestimated); see here 
> https://gitlab.com/petsc/petsc/-/blob/master/src/mat/impls/is/matis.c#L1686
> You need to provide the local2global map object before calling the 
> preallocation routine.
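>
> A minimal sketch of that sequence (the variable and array names here
> are just placeholders):
>
>    ierr = MatCreate(comm, &A);CHKERRQ(ierr);
>    ierr = MatSetSizes(A, mLocal, nLocal, M, N);CHKERRQ(ierr);
>    ierr = MatSetType(A, MATIS);CHKERRQ(ierr);
>    /* the local-to-global map must be set before preallocating */
>    ierr = MatSetLocalToGlobalMapping(A, l2g, l2g);CHKERRQ(ierr);
>    /* dnnz/onnz: the same per-owned-row counts you would pass for AIJ */
>    ierr = MatXAIJSetPreallocation(A, bs, dnnz, onnz, NULL, NULL);CHKERRQ(ierr);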
>
> Let me know if something is still unclear
>
> Stefano
>
> On Wed, Nov 11, 2020 at 12:06, Barry Smith <bsmith at petsc.dev> 
> wrote:
>
>
>
>     > On Nov 10, 2020, at 11:10 PM, Eric Chamberland
>     <Eric.Chamberland at giref.ulaval.ca> wrote:
>     >
>     > Hi,
>     >
>     > I am trying to add support for MATIS in our code to be able to
>     use PCBDDC and others.
>     >
>     > So far, I have modified our matrix creation to add a call to
>     MatSetLocalToGlobalMapping and also modified our non-zero count
>     to be able to give "local" non-zeros to MatXAIJSetPreallocation.
>     >
>     > When I run with, don't laugh, 1 process, everything is fine! So
>     far so good. ;)
>     >
>     > When I run with 2 processes, fine also...
>     >
>     > When I run with 3 processes, ranks 1 and 2 give errors:
>     >
>     > [1]PETSC ERROR: New nonzero at (7,5) caused a malloc
>     >
>     > and
>     >
>     > [2]PETSC ERROR: New nonzero at (10,2) caused a malloc
>     >
>     > I understand that these new nonzero indices are *local* indices.
>     The global indices I am trying to do assembly on are:
>     >
>     > proc 1:
>     >
>     > global Line 3 (local 7)
>     > global Columns: 3 7 10 15 *16* 17 (local 7 9 10 4 *5* 6) // the
>     *starred* value is the faulty column, meaning only 4 nnz have been
>     allocated!?
>
>       Because the local column indices may not be set "in order", the
>     error doesn't mean for sure that only 4 column slots are available;
>     it could be that the column slots were filled up with local columns
>     larger than 5 and hence all the available space was used.
>
>       Unfortunately we don't have any code in place to display what is
>     happening when the error occurs. So I suggest the following.
>
>       Stop putting in values just before the first problematic value.
>
>       At that point in your code call MatAssemblyBegin/End()
>
>       Then call MatView()
>
>       This will show you what columns have been filled up in each
>     local matrix and can help you determine if either
>
>       1) some other column entries are getting in there that you
>     didn't expect.  or
>
>       2) somehow the preallocation is not properly being determined
>     and is smaller than you expect.
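>
>       In code, that check could look roughly like this (a sketch):
>
>        /* ... set values up to, but not including, the first problematic entry ... */
>        ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
>        ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
>        ierr = MatView(A, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);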
>
>     Good luck,
>
>       Barry
>
>
>     >
>     > proc2 :
>     >
>     > global Line 16 (local 10)
>     > global Columns: 3 8 16 17 *20* 22 23 24 (local 8 11 10 12 *2* 4
>     5 6) // the *starred* value is the faulty column, meaning only 4 nnz
>     have been allocated!?
>     >
>     > The error is returned at the moment we do the first assembly.
>     >
>     > After verifying my number of nnz, I just can't find why PETSc
>     complains about a malloc since, as far as I can verify, I counted
>     them correctly...
>     >
>     > Global matrix "size": 25 x 25 (given to MatSetSizes)
>     >
>     > Here are the non-zeros given to MatXAIJSetPreallocation, followed
>     by the mapping used when creating the ISLocalToGlobalMapping (a small
>     creation sketch follows the listing):
>     >
>     > =============
>     >
>     > Process 0:
>     >
>     > nnz_d[Local:0]=4
>     > nnz_d[Local:1]=6
>     > nnz_d[Local:2]=4
>     > nnz_d[Local:3]=4
>     > nnz_d[Local:4]=6
>     > nnz_d[Local:5]=4
>     > nnz_d[Local:6]=6
>     > nnz_d[Local:7]=8
>     > nnz_d[Local:8]=6
>     > nnz_d[Local:9]=9
>     > nnz_d[Local:10]=4
>     >
>     > Local,Global:
>     >
>     > 0,0
>     > 1,1
>     > 2,2
>     > 3,3
>     > 4,4
>     > 5,5
>     > 6,6
>     > 7,7
>     > 8,8
>     > 9,9
>     > 10,10
>     >
>     > =============
>     >
>     > Process 1:
>     >
>     >
>     > nnz_d[Local:0]=4
>     > nnz_d[Local:1]=6
>     > nnz_d[Local:2]=6
>     > nnz_d[Local:3]=4
>     > nnz_d[Local:4]=9
>     > nnz_d[Local:5]=4
>     > nnz_d[Local:6]=6
>     > nnz_d[Local:7]=6
>     > nnz_d[Local:8]=4
>     > nnz_d[Local:9]=4
>     > nnz_d[Local:10]=8
>     >
>     > Local,Global:
>     >
>     > 0,11
>     > 1,12
>     > 2,13
>     > 3,14
>     > 4,15
>     > 5,16
>     > 6,17
>     > 7,3
>     > 8,5
>     > 9,7
>     > 10,10
>     >
>     > =============
>     >
>     > Process 2:
>     >
>     > nnz_d[Local:0]=4
>     > nnz_d[Local:1]=4
>     > nnz_d[Local:2]=6
>     > nnz_d[Local:3]=6
>     > nnz_d[Local:4]=6
>     > nnz_d[Local:5]=6
>     > nnz_d[Local:6]=9
>     > nnz_d[Local:7]=4
>     > nnz_d[Local:8]=4
>     > nnz_d[Local:9]=4
>     > nnz_d[Local:10]=8
>     > nnz_d[Local:11]=6
>     > nnz_d[Local:12]=6
>     >
>     >  Local,Global:
>     > 0,18
>     > 1,19
>     > 2,20
>     > 3,21
>     > 4,22
>     > 5,23
>     > 6,24
>     > 7,2
>     > 8,3
>     > 9,14
>     > 10,16
>     > 11,8
>     > 12,17
>     >
>     > =============
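>     >
>     > The mapping above is built roughly like this (a sketch; "globalIds"
>     is my array holding the global index of each local index, and nLocal
>     is the number of local indices):
>     >
>     >    ISLocalToGlobalMapping l2g;
>     >    ierr = ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, nLocal,
>     >                globalIds, PETSC_COPY_VALUES, &l2g);CHKERRQ(ierr);
>     >    ierr = MatSetLocalToGlobalMapping(A, l2g, l2g);CHKERRQ(ierr);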
>     >
>     > I have run with valgrind; everything is ok.
>     >
>     > So, why don't I have enough values reserved on local line 7 of
>     rank 1 and local line 10 of rank 2?
>     >
>     > Thanks for your insights,
>     >
>     > Eric
>     >
>     > ps: Here is the backtrace:
>     >
>     > [1]PETSC ERROR: --------------------- Error Message
>     --------------------------------------------------------------
>     > [1]PETSC ERROR: Argument out of range
>     > [1]PETSC ERROR: New nonzero at (7,5) caused a malloc
>     > Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE)
>     to turn off this check
>     > [1]PETSC ERROR: See
>     http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
>     shooting.
>     > [1]PETSC ERROR: Petsc Release Version 3.11.2, May, 18, 2019
>     > [1]PETSC ERROR: Test.MEF++.dev on a  named rohan by ericc Tue
>     Nov 10 23:39:47 2020
>     > [1]PETSC ERROR: Configure options
>     --prefix=/opt/petsc-3.11.2_debug_openmpi-4.0.1
>     --with-mpi-compilers=1 --with-mpi-dir=/opt/openmpi-4.0.1
>     --with-cxx-dialect=C++11 --with-make-np=12
>     --with-shared-libraries=1 --with-debugging=yes --with-memalign=64
>     --with-visibility=0 --with-64-bit-indices=0 --download-ml=yes
>     --download-mumps=yes --download-superlu=yes
>     --download-superlu_dist=yes --download-parmetis=yes
>     --download-ptscotch=yes --download-metis=yes
>     --download-suitesparse=yes --download-hypre=yes
>     --with-blaslapack-dir=/opt/intel/composer_xe_2015.2.164/mkl/lib/intel64
>     --with-mkl_pardiso-dir=/opt/intel/composer_xe_2015.2.164/mkl
>     --with-mkl_cpardiso-dir=/opt/intel/composer_xe_2015.2.164/mkl
>     --with-scalapack=1
>     --with-scalapack-include=/opt/intel/composer_xe_2015.2.164/mkl/include
>     --with-scalapack-lib="-L/opt/intel/composer_xe_2015.2.164/mkl/lib/intel64
>     -lmkl_scalapack_lp64 -lmkl_blacs_openmpi_lp64"
>     > [1]PETSC ERROR: #1 MatSetValues_SeqAIJ() line 481 in
>     /home/mefpp_ericc/depots_prepush/cpetssc/ompi-opt/petsc-3.11.2-debug/src/mat/impls/aij/seq/aij.c
>     > [1]PETSC ERROR: #2 MatSetValues() line 1407 in
>     /home/mefpp_ericc/depots_prepush/cpetssc/ompi-opt/petsc-3.11.2-debug/src/mat/interface/matrix.c
>     > [1]PETSC ERROR: #3 MatSetValuesBlocked() line 1919 in
>     /home/mefpp_ericc/depots_prepush/cpetssc/ompi-opt/petsc-3.11.2-debug/src/mat/interface/matrix.c
>     > [1]PETSC ERROR: #4 MatSetValuesBlocked_IS() line 2609 in
>     /home/mefpp_ericc/depots_prepush/cpetssc/ompi-opt/petsc-3.11.2-debug/src/mat/impls/is/matis.c
>     > [1]PETSC ERROR: #5 MatSetValuesBlocked() line 1898 in
>     /home/mefpp_ericc/depots_prepush/cpetssc/ompi-opt/petsc-3.11.2-debug/src/mat/interface/matrix.c
>     > _GIREF_ASSERTION(false)
>     > see file
>     /home/mefpp_ericc/depots_prepush/GIREF/src/commun/Petsc/MatricePETSc.cc:858
>     >    ---->  FATAL ERROR: PETSc error
>     >
>     >
>     > --
>     > Eric Chamberland, ing., M. Ing
>     > Professionnel de recherche
>     > GIREF/Université Laval
>     > (418) 656-2131 poste 41 22 42
>     >
>
>
>
> -- 
> Stefano

-- 
Eric Chamberland, ing., M. Ing
Professionnel de recherche
GIREF/Université Laval
(418) 656-2131 poste 41 22 42
