On Nov 12, 2020, at 5:57 AM, Eric Chamberland <Eric.Chamberland@giref.ulaval.ca> wrote:
<div class=""><p class="">Hi,</p><p class="">I will test what Barry suggested, but I have done 2 tests after
Stefano's remarks (and also upgraded to petsc 3.14.1 -- which
changed nothing):</p><p class="">=================================<br class="">
</p><p class="">test #1- Extract the local matrix with MatISGetLocalMat, call
MatXAIJSetPreallocation with the local non-zeros I sent you in my
first email: it worked!</p><p class="">=================================</p><p class="">*but*:</p><p class="">=================================</p><p class="">test #2- Pass the "same" non-zeros vectors as if I had a "normal"
mpi matrix:</p><p class="">This gives me a very different error:</p><p class="">[0]PETSC ERROR: --------------------- Error Message
>
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: No support for this operation for this object type
> [0]PETSC ERROR: Some of the column indices can not be mapped! Maybe you should not use MATIS
> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.14.1, Nov 03, 2020
> [0]PETSC ERROR: Test.MEF++.dev on a named rohan by ericc Wed Nov 11 21:35:55 2020
> [0]PETSC ERROR: Configure options --prefix=/opt/petsc-3.14.1_debug_openmpi-4.0.5 --with-mpi-compilers=1 --with-mpi-dir=/opt/openmpi-4.0.5 --with-cxx-dialect=C++11 --with-make-np=12 --with-shared-libraries=1 --with-debugging=yes --with-memalign=64 --with-visibility=0 --with-64-bit-indices=0 --download-ml=yes --download-mumps=yes --download-superlu=yes --download-superlu_dist=yes --download-parmetis=yes --download-ptscotch=yes --download-metis=yes --download-suitesparse=yes --download-hypre=yes --with-blaslapack-dir=/opt/intel/composer_xe_2015.2.164/mkl/lib/intel64 --with-mkl_pardiso-dir=/opt/intel/composer_xe_2015.2.164/mkl --with-mkl_cpardiso-dir=/opt/intel/composer_xe_2015.2.164/mkl --with-scalapack=1 --with-scalapack-include=/opt/intel/composer_xe_2015.2.164/mkl/include --with-scalapack-lib="-L/opt/intel/composer_xe_2015.2.164/mkl/lib/intel64 -lmkl_scalapack_lp64 -lmkl_blacs_openmpi_lp64"
> [0]PETSC ERROR: #1 MatSetValuesBlocked_IS() line 2555 in /home/mefpp_ericc/ompi-opt/petsc-3.14.1-debug/src/mat/impls/is/matis.c
> [0]PETSC ERROR: #2 MatSetValuesBlocked() line 1848 in /home/mefpp_ericc/ompi-opt/petsc-3.14.1-debug/src/mat/interface/matrix.c
>
> ==================================
</p><p class="">So I still mystified on how to call MatXAIJSetPreallocation for
MATIS type. </p></div></div></blockquote><div>This error tells you that you are trying to insert values (in global ordering) that do not belong to the process. Each process can insert only at the global entries listed in their part of the local-to-global map </div><div><br class=""></div><br class=""><blockquote type="cite" class=""><div class=""><p class="">Here are more precises questions/remarks:</p><p class="">a) What is the correct length for the dnnz vector for a MATIS
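As a debugging aid, one can check up front which global indices a rank is allowed to touch by querying the matrix's local-to-global mapping (this assumes MatSetLocalToGlobalMapping has already been called on the matrix). This is only a minimal sketch in the PETSc 3.14 error-checking style; the helper name and the ncols/cols arguments are illustrative, not taken from Eric's code:

  #include <petscmat.h>

  /* Sketch: report global column indices that are absent from the
     column local-to-global map of a MATIS matrix A.                */
  PetscErrorCode CheckColumnsMappable(Mat A, PetscInt ncols, const PetscInt cols[])
  {
    ISLocalToGlobalMapping rmap, cmap;
    PetscInt               *lcols, nout, i;
    PetscMPIInt            rank;
    PetscErrorCode         ierr;

    PetscFunctionBeginUser;
    ierr = MPI_Comm_rank(PetscObjectComm((PetscObject)A), &rank);CHKERRQ(ierr);
    ierr = MatGetLocalToGlobalMapping(A, &rmap, &cmap);CHKERRQ(ierr);
    (void)rmap; /* row map not needed here */
    ierr = PetscMalloc1(ncols, &lcols);CHKERRQ(ierr);
    /* IS_GTOLM_MASK keeps the array length and writes -1 for unmapped entries */
    ierr = ISGlobalToLocalMappingApply(cmap, IS_GTOLM_MASK, ncols, cols, &nout, lcols);CHKERRQ(ierr);
    for (i = 0; i < ncols; i++) {
      if (lcols[i] < 0) {
        ierr = PetscPrintf(PETSC_COMM_SELF, "[%d] global column %D is not in my local-to-global map\n", rank, cols[i]);CHKERRQ(ierr);
      }
    }
    ierr = PetscFree(lcols);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }

Any index reported by such a check corresponds to an entry that MATIS cannot map, which is what the error above complains about.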
> Here are more precise questions/remarks:
>
> a) What is the correct length for the dnnz vector for a MATIS
> when calling MatXAIJSetPreallocation? I set it to the size of the Mat
> from MatISGetLocalMat.

This is wrong. What I meant by "the same arrays" is that you call MatXAIJSetPreallocation(A,...) with the same data you use for other matrix types, which means the size of dnnz and onnz is the usual number of locally owned rows in parallel (i.e. given by MatGetLocalSize(A,&sizednzandonz,...)).
No need for a special code path for MATIS; just use the same data you use for MPIAIJ. The man page is quite clear, I guess: https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatXAIJSetPreallocation.html
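A minimal sketch of that sizing, assuming A already has its sizes, type, and local-to-global mapping set; the per-row counts below are placeholders (the real counts come from the application's connectivity), and the helper name is illustrative:

  #include <petscmat.h>

  /* Sketch: dnnz/onnz have one entry per locally owned row (MatGetLocalSize),
     exactly as for MPIAIJ; the same call then works for MATIS too.            */
  PetscErrorCode PreallocateLikeAIJ(Mat A)
  {
    PetscInt       nown, i, *dnnz, *onnz;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = MatGetLocalSize(A, &nown, NULL);CHKERRQ(ierr);
    ierr = PetscCalloc2(nown, &dnnz, nown, &onnz);CHKERRQ(ierr);
    for (i = 0; i < nown; i++) {
      dnnz[i] = 4; /* placeholder: nonzeros of row i inside the diagonal block  */
      onnz[i] = 2; /* placeholder: nonzeros of row i outside the diagonal block */
    }
    ierr = MatXAIJSetPreallocation(A, 1, dnnz, onnz, NULL, NULL);CHKERRQ(ierr);
    ierr = PetscFree2(dnnz, onnz);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }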
> b) Since all non-zeros are supposed to be "local", what do you
> put in onnz for MatXAIJSetPreallocation? In my case, I put
> nothing: all entries are 0, and as for dnnz, the length is the one
> from the local Mat…

Again, you populate dnnz and onnz as if they were for an MPIAIJ matrix: no special code path, no MatISGetLocalMat. Here you have an example creating a MATIS for a DMDA: https://gitlab.com/petsc/petsc/-/blob/master/src/dm/impls/da/fdda.c#L888
As you can see, we use the same code to preallocate for a MATAIJ or a MATIS, no special code path.
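To make the dnnz/onnz split concrete: it is the usual MPIAIJ one, based on the column ownership range. A per-row counting sketch, with illustrative names, assuming cols holds the global column indices of one locally owned row and cstart/cend delimit this rank's block of owned columns (known from the application's own row/column distribution):

  #include <petscmat.h>

  /* Sketch: classify the global columns of one owned row into the diagonal
     block (dnnz) or the off-diagonal block (onnz), as for MPIAIJ.           */
  void CountRowNnz(PetscInt cstart, PetscInt cend, PetscInt ncols, const PetscInt cols[],
                   PetscInt *dnz, PetscInt *onz)
  {
    PetscInt j;

    *dnz = *onz = 0;
    for (j = 0; j < ncols; j++) {
      if (cols[j] >= cstart && cols[j] < cend) (*dnz)++; /* owned-column (diagonal) block */
      else                                     (*onz)++; /* off-diagonal block            */
    }
  }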
</p><p class="">Since my test #1 is now all working right, I can go on with my
modifications, but for the convenience of other users, maybe some
further tests should be done?</p><div class=""><br class=""></div></div></blockquote><div><br class=""></div><div>Bugs are always possible, but this is extensively tested in CI, with 2D and 3D unstructured and non-conforming meshes, as well as with DMDA, or with user defined local-to-global maps.</div><div>It is also tested by other FEM packages (MFEM, FENICS, PetIGA) that use PETSc.</div><div>Can you provide a MWE to show your issues? We can add it to the test suite and it can be educational for other users.</div><br class=""><blockquote type="cite" class=""><div class=""><p class="">I will tests Barry's suggestions asap,</p><p class="">Thanks a lot!</p><p class="">Eric<br class="">
</p>
<div class="moz-cite-prefix">On 2020-11-11 6:47 a.m., Stefano
Zampini wrote:<br class="">
</div>
<blockquote type="cite" cite="mid:CAGPUisinaZRtUqCdRDKwmkV2Ee6iHvw5_6xHSFztpG_Nf1BmjQ@mail.gmail.com" class="">
<meta http-equiv="content-type" content="text/html; charset=UTF-8" class="">
<div dir="ltr" class="">Eric
<div class=""><br class="">
</div>
<div class="">just use the same arrays you provide for the other matrix
types. The purpose of having support for MATIS in
MatXAIJSetPreallocation is exactly to not preallocate the
local matrices, but treat the matrix as if it was in
"assembled" (AIJ) form. The MATIS code does the local
preallocation for you (a little bit overestimated), see here <a href="https://gitlab.com/petsc/petsc/-/blob/master/src/mat/impls/is/matis.c#L1686" moz-do-not-send="true" class="">https://gitlab.com/petsc/petsc/-/blob/master/src/mat/impls/is/matis.c#L1686</a></div>
<div class="">You need to provide the local2global map object before
calling the preallocation routine</div>
<div class=""><br class="">
</div>
<div class="">Let me know if something is still unclear</div>
<div class=""><br class="">
</div>
<div class="">Stefano</div>
</div>
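A minimal sketch of that ordering, with illustrative names (globalIndices is assumed to be the application-provided array mapping the nLocalDofs subdomain degrees of freedom to global numbers): the map is attached first, the AIJ-style preallocation comes after.

  #include <petscmat.h>

  /* Sketch of the setup order for a MATIS matrix: type and local-to-global
     map first, AIJ-style preallocation afterwards.                          */
  PetscErrorCode CreateMatISSketch(MPI_Comm comm, PetscInt nOwned, PetscInt N,
                                   PetscInt nLocalDofs, const PetscInt globalIndices[],
                                   const PetscInt dnnz[], const PetscInt onnz[], Mat *A)
  {
    ISLocalToGlobalMapping l2g;
    PetscErrorCode         ierr;

    PetscFunctionBeginUser;
    ierr = MatCreate(comm, A);CHKERRQ(ierr);
    ierr = MatSetSizes(*A, nOwned, nOwned, N, N);CHKERRQ(ierr);
    ierr = MatSetType(*A, MATIS);CHKERRQ(ierr);
    /* one global index per local (subdomain) degree of freedom */
    ierr = ISLocalToGlobalMappingCreate(comm, 1, nLocalDofs, globalIndices, PETSC_COPY_VALUES, &l2g);CHKERRQ(ierr);
    ierr = MatSetLocalToGlobalMapping(*A, l2g, l2g);CHKERRQ(ierr);
    ierr = ISLocalToGlobalMappingDestroy(&l2g);CHKERRQ(ierr);
    /* dnnz/onnz have length nOwned, as in the preceding sketches */
    ierr = MatXAIJSetPreallocation(*A, 1, dnnz, onnz, NULL, NULL);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }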
<br class="">
<div class="gmail_quote">
<div dir="ltr" class="gmail_attr">Il giorno mer 11 nov 2020 alle
ore 12:06 Barry Smith <<a href="mailto:bsmith@petsc.dev" moz-do-not-send="true" class="">bsmith@petsc.dev</a>> ha scritto:<br class="">
</div>
<blockquote class="gmail_quote" style="margin:0px 0px 0px
0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><br class="">
<br class="">
>>>> On Nov 10, 2020, at 11:10 PM, Eric Chamberland <Eric.Chamberland@giref.ulaval.ca> wrote:
>>>>
>>>> Hi,
>>>>
>>>> I am trying to add support for MATIS in our code to be able to use PCBDDC and others.
>>>>
>>>> So far, I have modified our matrix creation to add a call to MatSetLocalToGlobalMapping and also modified our non-zero count to be able to give "local" non-zeros to MatXAIJSetPreallocation.
>>>>
>>>> When I run with, don't laugh, 1 process, everything is fine! So far so good. ;)
>>>>
>>>> When I run with 2 processes, fine also...
>>>>
>>>> When I run with 3 processes, I have process ranks 1 and 2 giving errors:
>>>>
>>>> [1]PETSC ERROR: New nonzero at (7,5) caused a malloc
>>>>
>>>> and
>>>>
>>>> [2]PETSC ERROR: New nonzero at (10,2) caused a malloc
>>>>
>>>> I understand that these new nonzero indices are *local* indices. The global indices I am trying to do assembly on are:
>>>>
>>>> proc 1:
>>>>
>>>> global Line 3 (local 7)
>>>> global Columns: 3 7 10 15 *16* 17 (local 7 9 10 4 *5* 6) // *x* is the faulty column, meaning only 4 nnz have been allocated!?
<br class="">
Because the local column indices may not be set "in order"
it doesn't mean for sure that it thinks there are only 4
column slots available, it could<br class="">
be it filled up column slots with local columns larger than 5
and hence used up all the available space.<br class="">
<br class="">
Unfortunately we don't have any code in place to display
what is happening when the error occurs. So I suggest the
following. <br class="">
<br class="">
Stop putting in values just before the first problematic
value. <br class="">
<br class="">
At that point in your code call MatAssemblyBegin/End() <br class="">
<br class="">
Then call MatView() <br class="">
<br class="">
This will show you what columns have been filled up in each
local matrix and can help you determine if either <br class="">
<br class="">
1) some other column entries are getting in there that you
didn't expect. or<br class="">
<br class="">
2) somehow the preallocation is not properly being
determined and is smaller than you expect. <br class="">
<br class="">
Good luck,<br class="">
<br class="">
Barry<br class="">
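A sketch of those debugging steps, assuming the assembly loop can be stopped right before the problematic insertion; the helper name, the choice of MAT_FINAL_ASSEMBLY, and the stdout viewer are just one way to do it:

  #include <petscmat.h>

  /* Sketch: flush what has been inserted so far and view the matrix,
     as suggested above, to inspect which column slots are in use.    */
  PetscErrorCode DumpPartiallyAssembled(Mat A)
  {
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    /* ...stop calling MatSetValues/MatSetValuesBlocked just before the
       first problematic entry, then: */
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatView(A, PETSC_VIEWER_STDOUT_(PetscObjectComm((PetscObject)A)));CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }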
<br class="">
<br class="">
> <br class="">
> proc2 :<br class="">
> <br class="">
> (global Line 16 (local 10)<br class="">
> (global Columns: 3 8 16 17 *20* 22 23 24 (local 8 11 10
12 *2* 4 5 6) // *x* is faulty column, , meaning only 4 nnz
have been allocated!?<br class="">
> <br class="">
> The error is returned at the moment we do the first
assembly.<br class="">
> <br class="">
> After verifying my number of nnz, I just can't find why
PETSc complains about a malloc, since as I can verify, I
counted them well...<br class="">
> <br class="">
> Global matrix "size": 25 x 25 (given to MatSetSizes)<br class="">
> <br class="">
> Here are the non-zeros given to MatXAIJSetPreallocation
followed by mapping used when creating ISLocalToGlobalMapping:<br class="">
> <br class="">
>>>> =============
>>>>
>>>> Process 0:
>>>>
>>>> nnz_d[Local:0]=4
>>>> nnz_d[Local:1]=6
>>>> nnz_d[Local:2]=4
>>>> nnz_d[Local:3]=4
>>>> nnz_d[Local:4]=6
>>>> nnz_d[Local:5]=4
>>>> nnz_d[Local:6]=6
>>>> nnz_d[Local:7]=8
>>>> nnz_d[Local:8]=6
>>>> nnz_d[Local:9]=9
>>>> nnz_d[Local:10]=4
>>>>
>>>> Local,Global:
>>>>
>>>> 0,0
>>>> 1,1
>>>> 2,2
>>>> 3,3
>>>> 4,4
>>>> 5,5
>>>> 6,6
>>>> 7,7
>>>> 8,8
>>>> 9,9
>>>> 10,10
>>>>
>>>> =============
>>>>
>>>> Process 1:
>>>>
>>>> nnz_d[Local:0]=4
>>>> nnz_d[Local:1]=6
>>>> nnz_d[Local:2]=6
>>>> nnz_d[Local:3]=4
>>>> nnz_d[Local:4]=9
>>>> nnz_d[Local:5]=4
>>>> nnz_d[Local:6]=6
>>>> nnz_d[Local:7]=6
>>>> nnz_d[Local:8]=4
>>>> nnz_d[Local:9]=4
>>>> nnz_d[Local:10]=8
>>>>
>>>> Local,Global:
>>>>
>>>> 0,11
>>>> 1,12
>>>> 2,13
>>>> 3,14
>>>> 4,15
>>>> 5,16
>>>> 6,17
>>>> 7,3
>>>> 8,5
>>>> 9,7
>>>> 10,10
>>>>
>>>> =============
>>>>
>>>> Process 2:
>>>>
>>>> nnz_d[Local:0]=4
>>>> nnz_d[Local:1]=4
>>>> nnz_d[Local:2]=6
>>>> nnz_d[Local:3]=6
>>>> nnz_d[Local:4]=6
>>>> nnz_d[Local:5]=6
>>>> nnz_d[Local:6]=9
>>>> nnz_d[Local:7]=4
>>>> nnz_d[Local:8]=4
>>>> nnz_d[Local:9]=4
>>>> nnz_d[Local:10]=8
>>>> nnz_d[Local:11]=6
>>>> nnz_d[Local:12]=6
>>>>
>>>> Local,Global:
>>>> 0,18
>>>> 1,19
>>>> 2,20
>>>> 3,21
>>>> 4,22
>>>> 5,23
>>>> 6,24
>>>> 7,2
>>>> 8,3
>>>> 9,14
>>>> 10,16
>>>> 11,8
>>>> 12,17
>>>>
>>>> =============
>>>>
>>>> I have run with valgrind; everything is OK.
>>>>
>>>> So, why don't I have enough values reserved on local line 7 of rank 1 and line 10 of rank 2?
>>>>
>>>> Thanks for your insights,
>>>>
>>>> Eric
>>>>
>>>> PS: Here is the backtrace:
>>>>
>>>> [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>>> [1]PETSC ERROR: Argument out of range
>>>> [1]PETSC ERROR: New nonzero at (7,5) caused a malloc
>>>> Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check
>>>> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>>> [1]PETSC ERROR: Petsc Release Version 3.11.2, May, 18, 2019
>>>> [1]PETSC ERROR: Test.MEF++.dev on a named rohan by ericc Tue Nov 10 23:39:47 2020
>>>> [1]PETSC ERROR: Configure options --prefix=/opt/petsc-3.11.2_debug_openmpi-4.0.1 --with-mpi-compilers=1 --with-mpi-dir=/opt/openmpi-4.0.1 --with-cxx-dialect=C++11 --with-make-np=12 --with-shared-libraries=1 --with-debugging=yes --with-memalign=64 --with-visibility=0 --with-64-bit-indices=0 --download-ml=yes --download-mumps=yes --download-superlu=yes --download-superlu_dist=yes --download-parmetis=yes --download-ptscotch=yes --download-metis=yes --download-suitesparse=yes --download-hypre=yes --with-blaslapack-dir=/opt/intel/composer_xe_2015.2.164/mkl/lib/intel64 --with-mkl_pardiso-dir=/opt/intel/composer_xe_2015.2.164/mkl --with-mkl_cpardiso-dir=/opt/intel/composer_xe_2015.2.164/mkl --with-scalapack=1 --with-scalapack-include=/opt/intel/composer_xe_2015.2.164/mkl/include --with-scalapack-lib="-L/opt/intel/composer_xe_2015.2.164/mkl/lib/intel64 -lmkl_scalapack_lp64 -lmkl_blacs_openmpi_lp64"
>>>> [1]PETSC ERROR: #1 MatSetValues_SeqAIJ() line 481 in /home/mefpp_ericc/depots_prepush/cpetssc/ompi-opt/petsc-3.11.2-debug/src/mat/impls/aij/seq/aij.c
>>>> [1]PETSC ERROR: #2 MatSetValues() line 1407 in /home/mefpp_ericc/depots_prepush/cpetssc/ompi-opt/petsc-3.11.2-debug/src/mat/interface/matrix.c
>>>> [1]PETSC ERROR: #3 MatSetValuesBlocked() line 1919 in /home/mefpp_ericc/depots_prepush/cpetssc/ompi-opt/petsc-3.11.2-debug/src/mat/interface/matrix.c
>>>> [1]PETSC ERROR: #4 MatSetValuesBlocked_IS() line 2609 in /home/mefpp_ericc/depots_prepush/cpetssc/ompi-opt/petsc-3.11.2-debug/src/mat/impls/is/matis.c
>>>> [1]PETSC ERROR: #5 MatSetValuesBlocked() line 1898 in /home/mefpp_ericc/depots_prepush/cpetssc/ompi-opt/petsc-3.11.2-debug/src/mat/interface/matrix.c
>>>> _GIREF_ASSERTION(false)
>>>> voir fichier /home/mefpp_ericc/depots_prepush/GIREF/src/commun/Petsc/MatricePETSc.cc:858
>>>> ----> ERREUR FATALE: Erreur PETSc
>>>>
>>>> --
>>>> Eric Chamberland, ing., M. Ing
>>>> Professionnel de recherche
>>>> GIREF/Université Laval
>>>> (418) 656-2131 poste 41 22 42
<br clear="all" class="">
<div class=""><br class="">
</div>
-- <br class="">
<div dir="ltr" class="gmail_signature">Stefano</div>
</blockquote>
<pre class="moz-signature" cols="72">--
Eric Chamberland, ing., M. Ing
Professionnel de recherche
GIREF/Université Laval
(418) 656-2131 poste 41 22 42</pre>
</div>
</blockquote></div><br class=""></body></html>