<div dir="ltr">Franck,<div><br></div><div>is it your "real" case working properly now?</div><div><br></div><div>Stefano</div></div><div class="gmail_extra"><br><div class="gmail_quote">2017-06-20 17:31 GMT+02:00 Franck Houssen <span dir="ltr"><<a href="mailto:franck.houssen@inria.fr" target="_blank">franck.houssen@inria.fr</a>></span>:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div><div style="font-family:times new roman,new york,times,serif;font-size:12pt;color:#000000"><div>OK. <br></div><div><br></div><div>Franck<br></div><div><br></div><hr id="m_4679290877670475079zwchr"><blockquote style="border-left:2px solid #1010ff;margin-left:5px;padding-left:5px;color:#000;font-weight:normal;font-style:normal;text-decoration:none;font-family:Helvetica,Arial,sans-serif;font-size:12pt"><span class=""><b>De: </b>"Stefano Zampini" <<a href="mailto:stefano.zampini@gmail.com" target="_blank">stefano.zampini@gmail.com</a>><br><b>À: </b>"Franck Houssen" <<a href="mailto:franck.houssen@inria.fr" target="_blank">franck.houssen@inria.fr</a>><br><b>Cc: </b>"PETSc users list" <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>><br></span><b>Envoyé: </b>Mardi 20 Juin 2017 17:08:29<div><div class="h5"><br><b>Objet: </b>Re: [petsc-users] Building MatIS with dense local matrix ?<br><div><br></div><div dir="ltr">It should be fixed right now, both in master and in maint.<div>Again, sorry for this ping-pong of fixes, my brain it's not fully functional these days....<div><br></div><div><a href="https://bitbucket.org/petsc/petsc/commits/c6f20c4fa7817632f09219574920bd3bd922f6f1" target="_blank">https://bitbucket.org/petsc/<wbr>petsc/commits/<wbr>c6f20c4fa7817632f09219574920bd<wbr>3bd922f6f1</a><br></div></div></div><div class="gmail_extra"><br><div class="gmail_quote">2017-06-20 16:30 GMT+02:00 Franck Houssen <span dir="ltr"><<a href="mailto:franck.houssen@inria.fr" target="_blank">franck.houssen@inria.fr</a>></span>:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div><div style="font-family:times new roman,new york,times,serif;font-size:12pt;color:#000000"><div>OK. I moved from petsc-3.7.6 to development version (git clone bitbucket).<br></div><div>The very first version of the dummy example (= matISCSRDenseSquare/Rect.cpp) works with this fix: <a href="https://bitbucket.org/petsc/petsc/commits/4c8dd594d1988a0cbe282f8a37d9916f61e0c445" target="_blank">https://bitbucket.org/petsc/<wbr>petsc/commits/<wbr>4c8dd594d1988a0cbe282f8a37d991<wbr>6f61e0c445</a>.<br></div><div>The second version of the dummy example works too with the fix if one moves to petsc bitbucket (master).<br></div><div><br></div><div>But, the code still breaks in "my" initial "real" case (using now master from petsc bitbucket)... 
With another error "SEGV under MatISSetMPIXAIJPreallocation_<wbr>Private" (note: this is not a "new non zero" message, this seems to be another problem).<br></div><div><br></div><div>Here is a third version of the dummy example that breaks with "SEGV under MatISSetMPIXAIJPreallocation_<wbr>Private" (using master from petsc bitbucket) : the idea is the same but with N procs (not only 2) and a rectangular matrix of size N*(N+1).</div><div>With 2 procs, it works (all cases).</div><div>With 4 procs, new problems occur:<br></div><div>>> mpirun -n 4 ./matISCSRDenseSquare.exe csr; mpirun -n 4 ./matISCSRDenseSquare.exe dense => OK<br></div><div>>> mpirun -n 4 ./matISCSRDenseRect.exe csr => OK<br></div><div>but<br></div><div>>> mpirun -n 4 ./matISCSRDenseRect.exe dense; <br>dense<br>dense<br>dense<br>dense<br>[3]PETSC ERROR: ------------------------------<wbr>------------------------------<wbr>------------<br>[3]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range<br>[3]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger<br>[3]PETSC ERROR: or see <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind" target="_blank">http://www.mcs.anl.gov/petsc/<wbr>documentation/faq.html#<wbr>valgrind</a><br>[3]PETSC ERROR: or try <a href="http://valgrind.org" target="_blank">http://valgrind.org</a> on GNU/linux and Apple Mac OS X to find memory corruption errors<br>[3]PETSC ERROR: likely location of problem given in stack below<br>[3]PETSC ERROR: --------------------- Stack Frames ------------------------------<wbr>------<br>[3]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,<br>[3]PETSC ERROR: INSTEAD the line number of the start of the function<br>[3]PETSC ERROR: is given.<br>[3]PETSC ERROR: [3] MatISSetMPIXAIJPreallocation_<wbr>Private line 1055 /home/fghoussen/Documents/<wbr>INRIA/petsc/src/mat/impls/is/<wbr>matis.c<br>[3]PETSC ERROR: [3] MatISGetMPIXAIJ_IS line 1230 /home/fghoussen/Documents/<wbr>INRIA/petsc/src/mat/impls/is/<wbr>matis.c<br>[3]PETSC ERROR: [3] MatISGetMPIXAIJ line 1384 /home/fghoussen/Documents/<wbr>INRIA/petsc/src/mat/impls/is/<wbr>matis.c<br>[3]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--</div><div><br></div><div>I tried to go through this with caution... But I have to say I feel messy. 
Can you reproduce this problem on your side?

Franck

>> git diff .
--- a/matISCSRDenseRect.cpp
+++ b/matISCSRDenseRect.cpp
@@ -14,19 +14,17 @@ int main(int argc,char **argv) {
   if (matType != "csr" && matType != "dense") {cout << "error: need arg = csr or dense" << endl; return 1;}
 
   PetscInitialize(&argc, &argv, NULL, NULL);
-  int size = 0; MPI_Comm_size(MPI_COMM_WORLD, &size); if (size != 2) {cout << "error: mpi != 2" << endl; return 1;}
+  int size = 0; MPI_Comm_size(MPI_COMM_WORLD, &size);
   int rank = 0; MPI_Comm_rank(MPI_COMM_WORLD, &rank);
 
-  PetscInt localIdx[2] = {0, 0};
-  if (rank == 0) {localIdx[0] = 0; localIdx[1] = 1;}
-  else {localIdx[0] = 1; localIdx[1] = 2;}
+  PetscInt localIdx[2] = {rank, rank+1};
   ISLocalToGlobalMapping rmap;
   ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 1, &rank, PETSC_COPY_VALUES, &rmap);
   ISLocalToGlobalMapping cmap;
   ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 2, localIdx, PETSC_COPY_VALUES, &cmap);
 
   Mat A;
-  MatCreateIS(PETSC_COMM_WORLD, 1, PETSC_DECIDE, PETSC_DECIDE, 2, 3, rmap, cmap, &A);
+  MatCreateIS(PETSC_COMM_WORLD, 1, PETSC_DECIDE, PETSC_DECIDE, size, size+1, rmap, cmap, &A);
 
   Mat Aloc;
   if (matType == "csr") {cout << matType << endl; MatCreateSeqAIJ(PETSC_COMM_SELF, 1, 2, 2, NULL, &Aloc);}

--- a/matISCSRDenseSquare.cpp
+++ b/matISCSRDenseSquare.cpp
@@ -14,19 +14,17 @@ int main(int argc,char **argv) {
   if (matType != "csr" && matType != "dense") {cout << "error: need arg = csr or dense" << endl; return 1;}
 
   PetscInitialize(&argc, &argv, NULL, NULL);
-  int size = 0; MPI_Comm_size(MPI_COMM_WORLD, &size); if (size != 2) {cout << "error: mpi != 2" << endl; return 1;}
+  int size = 0; MPI_Comm_size(MPI_COMM_WORLD, &size);
   int rank = 0; MPI_Comm_rank(MPI_COMM_WORLD, &rank);
 
-  PetscInt localIdx[2] = {0, 0};
-  if (rank == 0) {localIdx[0] = 0; localIdx[1] = 1;}
-  else {localIdx[0] = 1; localIdx[1] = 2;}
+  PetscInt localIdx[2] = {rank, rank+1};
   ISLocalToGlobalMapping rmap;
   ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 2, localIdx, PETSC_COPY_VALUES, &rmap);
   ISLocalToGlobalMapping cmap;
   ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 2, localIdx, PETSC_COPY_VALUES, &cmap);
 
   Mat A;
-  MatCreateIS(PETSC_COMM_WORLD, 1, PETSC_DECIDE, PETSC_DECIDE, 3, 3, rmap, cmap, &A);
+  MatCreateIS(PETSC_COMM_WORLD, 1, PETSC_DECIDE, PETSC_DECIDE, size+1, size+1, rmap, cmap, &A);
 
   Mat Aloc;
   if (matType == "csr") {cout << matType << endl; MatCreateSeqAIJ(PETSC_COMM_SELF, 2, 2, 2, NULL, &Aloc);}
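
For reference, here is a sketch of what the whole modified matISCSRDenseRect.cpp plausibly looks like after this diff, reconstructed from the fragments quoted in this thread. It is not the original file: the variable names globalRow/locRow/locCols, the values written into Aloc (using local 0-based indices), the assembly calls, and the cleanup are assumptions, and error checking is omitted.

// Hypothetical reconstruction of the N-process matISCSRDenseRect.cpp (not the original file).
// Build: g++ -o matISCSRDenseRect.exe matISCSRDenseRect.cpp -lpetsc -lm
// Run:   mpirun -n 4 ./matISCSRDenseRect.exe dense
#include <petsc.h>
#include <iostream>
#include <string>
using namespace std;

int main(int argc, char **argv) {
  string matType = (argc > 1) ? argv[1] : "";
  if (matType != "csr" && matType != "dense") {cout << "error: need arg = csr or dense" << endl; return 1;}

  PetscInitialize(&argc, &argv, NULL, NULL);
  int size = 0; MPI_Comm_size(MPI_COMM_WORLD, &size);
  int rank = 0; MPI_Comm_rank(MPI_COMM_WORLD, &rank);

  // Global matrix is size x (size+1); each rank owns 1 row and touches 2 overlapping columns.
  PetscInt localIdx[2] = {rank, rank + 1};
  PetscInt globalRow = rank;
  ISLocalToGlobalMapping rmap, cmap;
  ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 1, &globalRow, PETSC_COPY_VALUES, &rmap);
  ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 2, localIdx, PETSC_COPY_VALUES, &cmap);

  Mat A; // global size x (size+1) MatIS
  MatCreateIS(PETSC_COMM_WORLD, 1, PETSC_DECIDE, PETSC_DECIDE, size, size + 1, rmap, cmap, &A);

  Mat Aloc; // local 1x2 matrix, either CSR (seqaij) or dense (seqdense)
  if (matType == "csr") {cout << matType << endl; MatCreateSeqAIJ(PETSC_COMM_SELF, 1, 2, 2, NULL, &Aloc);}
  else                  {cout << matType << endl; MatCreateSeqDense(PETSC_COMM_SELF, 1, 2, NULL, &Aloc);}

  // Fill the local row; local 0-based indices and the values are illustrative, not from the original.
  PetscInt locRow = 0, locCols[2] = {0, 1};
  PetscScalar localVal[2] = {1., 0.};
  MatSetValues(Aloc, 1, &locRow, 2, locCols, localVal, ADD_VALUES);
  MatAssemblyBegin(Aloc, MAT_FINAL_ASSEMBLY); MatAssemblyEnd(Aloc, MAT_FINAL_ASSEMBLY);

  // Attach the local matrix to the MatIS and assemble the global operator.
  MatISSetLocalMat(A, Aloc);
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
  MatView(A, PETSC_VIEWER_STDOUT_WORLD);

  // With 4 ranks and a dense Aloc, this conversion is where the SEGV in
  // MatISSetMPIXAIJPreallocation_Private was reported above.
  Mat B;
  MatISGetMPIXAIJ(A, MAT_INITIAL_MATRIX, &B);
  MatView(B, PETSC_VIEWER_STDOUT_WORLD);

  MatDestroy(&B); MatDestroy(&Aloc); MatDestroy(&A);
  ISLocalToGlobalMappingDestroy(&rmap); ISLocalToGlobalMappingDestroy(&cmap);
  PetscFinalize();
  return 0;
}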

From: "Stefano Zampini" <stefano.zampini@gmail.com>
To: "Franck Houssen" <franck.houssen@inria.fr>
Cc: "PETSc users list" <petsc-users@mcs.anl.gov>
Sent: Tuesday, June 20, 2017 13:23:27
Subject: Re: [petsc-users] Building MatIS with dense local matrix ?

Franck,

I tested your new example with master and it works. However, it doesn't work with maint. I fixed the rectangular case a while ago in master and forgot to add the change to maint too. Sorry for that.

This should fix the problem with maint: https://bitbucket.org/petsc/petsc/commits/0ea065fb06d751599c4157d36bfe1a1b41348e0b

Test your real case and let me know. If you could, it would be good to test against master too.

Thanks,
Stefano

2017-06-20 12:58 GMT+02:00 Franck Houssen <franck.houssen@inria.fr>:

As I said, it is often difficult to reduce the "real" problem: it turns out that your fix solves the matISCSRDenseSquare/Rect.cpp dummy example I sent, but it is still not working in my "real" situation.

I changed the matISCSRDenseSquare/Rect.cpp dummy example a bit (see the git diff below; I only changed the point that overlaps): the dummy example still fails.

"mpirun -n 2 ./matISCSRDenseSquare.exe csr" and "mpirun -n 2 ./matISCSRDenseSquare.exe dense": OK
but
"mpirun -n 2 ./matISCSRDenseRect.exe csr" and "mpirun -n 2 ./matISCSRDenseRect.exe dense": KO with error "Argument out of range - New nonzero at (0,2) caused a malloc"

I would say the problem (as far as the "real" case is concerned) is around lines 360-380 of src/mat/impls/is/matis.c, not around line 181: that change fixes a valid problem, but this is a different one.

Franck

--- a/matISCSRDenseRect.cpp
+++ b/matISCSRDenseRect.cpp
@@ -18,7 +18,7 @@ int main(int argc,char **argv) {
   int rank = 0; MPI_Comm_rank(MPI_COMM_WORLD, &rank);
 
   PetscInt localIdx[2] = {0, 0};
-  if (rank == 0) {localIdx[0] = 0; localIdx[1] = 1;}
+  if (rank == 0) {localIdx[0] = 0; localIdx[1] = 2;}
   else {localIdx[0] = 1; localIdx[1] = 2;}
   ISLocalToGlobalMapping rmap;
   ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 1, &rank, PETSC_COPY_VALUES, &rmap);
diff --git a/Graphs/Franck/03.petscDDM/02.petscMailList/matISCSRDenseSquare.cpp b/Graphs/Franck/03.petscDDM/02.petscMailList/matISCSRDenseSquare.cpp
index 4bc6190..4a6ea41 100644
--- a/matISCSRDenseSquare.cpp
+++ b/matISCSRDenseSquare.cpp
@@ -18,7 +18,7 @@ int main(int argc,char **argv) {
   int rank = 0; MPI_Comm_rank(MPI_COMM_WORLD, &rank);
 
   PetscInt localIdx[2] = {0, 0};
-  if (rank == 0) {localIdx[0] = 0; localIdx[1] = 1;}
+  if (rank == 0) {localIdx[0] = 0; localIdx[1] = 2;}
   else {localIdx[0] = 1; localIdx[1] = 2;}
   ISLocalToGlobalMapping rmap;
   ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 2, localIdx, PETSC_COPY_VALUES, &rmap);

From: "Stefano Zampini" <stefano.zampini@gmail.com>
To: "Franck Houssen" <franck.houssen@inria.fr>
Cc: "PETSc users list" <petsc-users@mcs.anl.gov>
Sent: Tuesday, June 20, 2017 00:23:24
Subject: Re: [petsc-users] Building MatIS with dense local matrix ?

It should be fixed now in maint and master:

https://bitbucket.org/petsc/petsc/commits/4c8dd594d1988a0cbe282f8a37d9916f61e0c445

Thanks for reporting the problem,
Stefano

On Jun 19, 2017, at 10:46 PM, Stefano Zampini <stefano.zampini@gmail.com> wrote:

Franck,

Thanks. I'll get back soon with a fix.

Stefano

On June 19, 2017, at 18:17, Franck Houssen <franck.houssen@inria.fr> wrote:

The problem was difficult to reduce, as reducing it makes things disappear... Luckily, I believe I got it (or at least it looks like the one I really have).

It seems that for a square matrix everything works fine with both CSR and dense local matrices. But, if I am not mistaken, it does not work for a dense rectangular matrix (still OK for CSR).

matISCSRDenseSquare.cpp: 2 procs, global 3x3 matrix, each proc adds a 2x2 local matrix into the global matrix.
matISCSRDenseRect.cpp: 2 procs, global 2x3 matrix, each proc adds a 1x2 local vector into the global matrix.

Reminder: running debian/testing with gcc-6.3 + petsc-3.7.6.

Franck

>> mpirun -n 2 ./matISCSRDenseSquare.exe csr; mpirun -n 2 ./matISCSRDenseSquare.exe dense
csr
csr
Mat Object: 2 MPI processes
  type: is
  Mat Object: 1 MPI processes
    type: seqaij
row 0: (0, 1.)  (1, 0.)
row 1: (0, 0.)  (1, 1.)
  Mat Object: 1 MPI processes
    type: seqaij
row 0: (0, 1.)  (1, 0.)
row 1: (0, 0.)  (1, 1.)
dense
dense
Mat Object: 2 MPI processes
  type: is
  Mat Object: 1 MPI processes
    type: seqdense
1.0000000000000000e+00 0.0000000000000000e+00
0.0000000000000000e+00 1.0000000000000000e+00
  Mat Object: 1 MPI processes
    type: seqdense
1.0000000000000000e+00 0.0000000000000000e+00
0.0000000000000000e+00 1.0000000000000000e+00

>> mpirun -n 2 ./matISCSRDenseRect.exe csr; mpirun -n 2 ./matISCSRDenseRect.exe dense
csr
csr
Mat Object: 2 MPI processes
  type: is
  Mat Object: 1 MPI processes
    type: seqaij
row 0: (0, 1.)  (1, 0.)
  Mat Object: 1 MPI processes
    type: seqaij
row 0: (0, 1.)  (1, 0.)
dense
dense
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Argument out of range
[0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: Argument out of range
[1]PETSC ERROR: New nonzero at (0,1) caused a malloc
Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017
[1]PETSC ERROR: ./matISCSRDenseRect.exe on a arch-linux2-c-debug named yoda by fghoussen Mon Jun 19 18:08:58 2017
New nonzero at (0,1) caused a malloc
Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017
[1]PETSC ERROR: Configure options --prefix=/home/fghoussen/Documents/INRIA/petsc-3.7.6/local --with-mpi=1 --with-pthread=1 --download-f2cblaslapack=yes --download-mumps=yes --download-scalapack=yes --download-superlu=yes --download-suitesparse=yes
[1]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 616 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/aij/mpi/mpiaij.c
[1]PETSC ERROR: #2 MatSetValues() line 1190 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/interface/matrix.c
[1]PETSC ERROR: #3 MatSetValuesLocal() line 2053 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/interface/matrix.c
[1]PETSC ERROR: #4 MatISGetMPIXAIJ_IS() line 365 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
[1]PETSC ERROR: #5 MatISGetMPIXAIJ() line 437 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
[0]PETSC ERROR: ./matISCSRDenseRect.exe on a arch-linux2-c-debug named yoda by fghoussen Mon Jun 19 18:08:58 2017
[0]PETSC ERROR: Configure options --prefix=/home/fghoussen/Documents/INRIA/petsc-3.7.6/local --with-mpi=1 --with-pthread=1 --download-f2cblaslapack=yes --download-mumps=yes --download-scalapack=yes --download-superlu=yes --download-suitesparse=yes
[0]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 582 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/aij/mpi/mpiaij.c
[0]PETSC ERROR: #2 MatSetValues() line 1190 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/interface/matrix.c
[0]PETSC ERROR: #3 MatSetValuesLocal() line 2053 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/interface/matrix.c
[0]PETSC ERROR: #4 MatISGetMPIXAIJ_IS() line 365 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
[0]PETSC ERROR: #5 MatISGetMPIXAIJ() line 437 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
Mat Object: 2 MPI processes
  type: is
  Mat Object: 1 MPI processes
    type: seqdense
1.0000000000000000e+00 0.0000000000000000e+00
  Mat Object: 1 MPI processes
    type: seqdense
1.0000000000000000e+00 0.0000000000000000e+00

>> diff matISCSRDenseSquare.cpp matISCSRDenseRect.cpp
3c3
< // ~> g++ -o matISCSRDenseSquare.exe matISCSRDenseSquare.cpp -lpetsc -lm; mpirun -n 2 matISCSRDenseSquare.exe
---
> // ~> g++ -o matISCSRDenseRect.exe matISCSRDenseRect.cpp -lpetsc -lm; mpirun -n 2 matISCSRDenseRect.exe
24c24
<   ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 2, localIdx, PETSC_COPY_VALUES, &rmap);
---
>   ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 1, &rank, PETSC_COPY_VALUES, &rmap);
29c29
<   MatCreateIS(PETSC_COMM_WORLD, 1, PETSC_DECIDE, PETSC_DECIDE, 3, 3, rmap, cmap, &A);
---
>   MatCreateIS(PETSC_COMM_WORLD, 1, PETSC_DECIDE, PETSC_DECIDE, 2, 3, rmap, cmap, &A);
32,33c32,33
<   if (matType == "csr") {cout << matType << endl; MatCreateSeqAIJ(PETSC_COMM_SELF, 2, 2, 2, NULL, &Aloc);}
<   else {cout << matType << endl; MatCreateSeqDense(PETSC_COMM_SELF, 2, 2, NULL, &Aloc);}
---
>   if (matType == "csr") {cout << matType << endl; MatCreateSeqAIJ(PETSC_COMM_SELF, 1, 2, 2, NULL, &Aloc);}
>   else {cout << matType << endl; MatCreateSeqDense(PETSC_COMM_SELF, 1, 2, NULL, &Aloc);}
35,36c35,36
<   PetscScalar localVal[4] = {1., 0., 0., 1.};
<   MatSetValues(Aloc, 2, localIdx, 2, localIdx, localVal, ADD_VALUES); // Add local 2x2 matrix
---
>   PetscScalar localVal[2] = {1., 0.}; PetscInt oneLocalRow = 0;
>   MatSetValues(Aloc, 1, &oneLocalRow, 2, localIdx, localVal, ADD_VALUES); // Add local row
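
A hypothetical reconstruction of matISCSRDenseSquare.cpp, pieced together from the description and the diff above (it is not the original file: the assembly calls, the cleanup, the locIdx name, and the use of local 0-based indices when filling Aloc are assumptions; error checking is omitted):

// Hypothetical reconstruction of matISCSRDenseSquare.cpp (2 procs, global 3x3, local 2x2) - not the original file.
// ~> g++ -o matISCSRDenseSquare.exe matISCSRDenseSquare.cpp -lpetsc -lm; mpirun -n 2 ./matISCSRDenseSquare.exe csr
#include <petsc.h>
#include <iostream>
#include <string>
using namespace std;

int main(int argc, char **argv) {
  string matType = (argc > 1) ? argv[1] : "";
  if (matType != "csr" && matType != "dense") {cout << "error: need arg = csr or dense" << endl; return 1;}

  PetscInitialize(&argc, &argv, NULL, NULL);
  int size = 0; MPI_Comm_size(MPI_COMM_WORLD, &size); if (size != 2) {cout << "error: mpi != 2" << endl; return 1;}
  int rank = 0; MPI_Comm_rank(MPI_COMM_WORLD, &rank);

  // Rank 0 owns global rows/cols {0,1}, rank 1 owns {1,2}; they overlap at global index 1.
  PetscInt localIdx[2] = {0, 0};
  if (rank == 0) {localIdx[0] = 0; localIdx[1] = 1;}
  else           {localIdx[0] = 1; localIdx[1] = 2;}
  ISLocalToGlobalMapping rmap, cmap;
  ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 2, localIdx, PETSC_COPY_VALUES, &rmap);
  ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 2, localIdx, PETSC_COPY_VALUES, &cmap);

  Mat A; // global 3x3 MatIS
  MatCreateIS(PETSC_COMM_WORLD, 1, PETSC_DECIDE, PETSC_DECIDE, 3, 3, rmap, cmap, &A);

  Mat Aloc; // local 2x2 identity, either CSR (seqaij) or dense (seqdense)
  if (matType == "csr") {cout << matType << endl; MatCreateSeqAIJ(PETSC_COMM_SELF, 2, 2, 2, NULL, &Aloc);}
  else                  {cout << matType << endl; MatCreateSeqDense(PETSC_COMM_SELF, 2, 2, NULL, &Aloc);}

  // Fill the local 2x2 block; local 0-based indices are used here (an assumption, see above).
  PetscInt locIdx[2] = {0, 1};
  PetscScalar localVal[4] = {1., 0., 0., 1.};
  MatSetValues(Aloc, 2, locIdx, 2, locIdx, localVal, ADD_VALUES);
  MatAssemblyBegin(Aloc, MAT_FINAL_ASSEMBLY); MatAssemblyEnd(Aloc, MAT_FINAL_ASSEMBLY);

  MatISSetLocalMat(A, Aloc);
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
  MatView(A, PETSC_VIEWER_STDOUT_WORLD);

  // Conversion to a global MPIAIJ matrix: per the report above, this square case works for both
  // csr and dense local matrices; the rectangular variant is where the dense case fails.
  Mat B;
  MatISGetMPIXAIJ(A, MAT_INITIAL_MATRIX, &B);

  MatDestroy(&B); MatDestroy(&Aloc); MatDestroy(&A);
  ISLocalToGlobalMappingDestroy(&rmap); ISLocalToGlobalMappingDestroy(&cmap);
  PetscFinalize();
  return 0;
}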
solid;padding-left:1ex"><div><div style="font-family:'times new roman','new york',times,serif;font-size:12pt"><div>Hi,<br></div><div><br></div><div>I try to call MatISGetMPIXAIJ on a MatIS (A) that has been feed locally by sequential (Aloc) dense matrix.</div><div>Seems this ends up with this error: [0]PETSC ERROR: New nonzero at (0,1) caused a malloc. Is this a known error / limitation ? (not supposed to work with dense matrix ?)<br></div><div><br></div><div>This (pseudo code) works fine:<br></div><div>MatCreateIS(..., A)</div><div>MatCreateSeqAIJ(..., Aloc)<br></div><div>MatISSetLocalMat(pcA, pcALoc) <br></div><div>MatISGetMPIXAIJ(A, ...) // OK !<br></div><div><br></div><div>When I try to replace MatCreateSeqAIJ(..., Aloc) with MatCreateSeqDense(..., Aloc), it does no more work. <br></div><div><br></div><div>Franck<br></div><div><br></div><div>PS: running debian/testing with gcc-6.3 + petsc-3.7.6<br></div></div></div></blockquote></div></div></blockquote><div><br></div></div></div></blockquote></div></div></div></blockquote></div><br></div></div></div></blockquote><div><br></div></div></div></blockquote></div><br><br clear="all"><div><br></div>-- <br><div class="m_4679290877670475079m_7376831623023054887gmail_signature">Stefano</div></div></div></div></blockquote><div><br></div></div></div></blockquote></div><br><br clear="all"><div><br></div>-- <br><div class="m_4679290877670475079gmail_signature">Stefano</div></div></div></div></blockquote><div><br></div></div></div></blockquote></div><br><br clear="all"><div><br></div>-- <br><div class="gmail_signature" data-smartmail="gmail_signature">Stefano</div>