[petsc-users] Building MatIS with dense local matrix ?

Stefano Zampini stefano.zampini at gmail.com
Mon Jun 19 17:23:24 CDT 2017


It should be fixed now in maint and master

https://bitbucket.org/petsc/petsc/commits/4c8dd594d1988a0cbe282f8a37d9916f61e0c445

Thanks for reporting the problem,
Stefano

> On Jun 19, 2017, at 10:46 PM, Stefano Zampini <stefano.zampini at gmail.com> wrote:
> 
> Franck,
> 
> Thanks. I'll get back soon with a fix.
> 
> Stefano
> 
> 
> On Jun 19, 2017 at 18:17, "Franck Houssen" <franck.houssen at inria.fr> wrote:
> The problem was difficult to reduce, as reducing it made it disappear... Luckily, I believe I got it (or at least, it looks like the one I really have).
> 
> It seems that for square matrices, both CSR and dense local matrices work fine. But, if I am not mistaken, it does not work for a dense rectangular matrix (still OK for CSR).
> 
> matISCSRDenseSquare.cpp: 2 procs, global 3x3 matrix; each proc adds a 2x2 local matrix into the global matrix.
> matISCSRDenseRect.cpp: 2 procs, global 2x3 matrix; each proc adds a 1x2 local row into the global matrix.
> 
> reminder: running debian/testing with gcc-6.3 + petsc-3.7.6
> 
> Franck
> 
> >> mpirun -n 2 ./matISCSRDenseSquare.exe csr; mpirun -n 2 ./matISCSRDenseSquare.exe dense
> csr
> csr
> Mat Object: 2 MPI processes
>   type: is
>   Mat Object:   1 MPI processes
>     type: seqaij
> row 0: (0, 1.)  (1, 0.) 
> row 1: (0, 0.)  (1, 1.) 
>   Mat Object:   1 MPI processes
>     type: seqaij
> row 0: (0, 1.)  (1, 0.) 
> row 1: (0, 0.)  (1, 1.) 
> dense
> dense
> Mat Object: 2 MPI processes
>   type: is
>   Mat Object:   1 MPI processes
>     type: seqdense
> 1.0000000000000000e+00 0.0000000000000000e+00 
> 0.0000000000000000e+00 1.0000000000000000e+00 
>   Mat Object:   1 MPI processes
>     type: seqdense
> 1.0000000000000000e+00 0.0000000000000000e+00 
> 0.0000000000000000e+00 1.0000000000000000e+00 
> 
> >> mpirun -n 2 ./matISCSRDenseRect.exe csr; mpirun -n 2 ./matISCSRDenseRect.exe dense
> csr
> csr
> Mat Object: 2 MPI processes
>   type: is
>   Mat Object:   1 MPI processes
>     type: seqaij
> row 0: (0, 1.)  (1, 0.) 
>   Mat Object:   1 MPI processes
>     type: seqaij
> row 0: (0, 1.)  (1, 0.) 
> dense
> dense
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Argument out of range
> [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [1]PETSC ERROR: Argument out of range
> [1]PETSC ERROR: New nonzero at (0,1) caused a malloc
> Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check
> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [1]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017 
> [1]PETSC ERROR: ./matISCSRDenseRect.exe on a arch-linux2-c-debug named yoda by fghoussen Mon Jun 19 18:08:58 2017
> New nonzero at (0,1) caused a malloc
> Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017 
> [1]PETSC ERROR: Configure options --prefix=/home/fghoussen/Documents/INRIA/petsc-3.7.6/local --with-mpi=1 --with-pthread=1 --download-f2cblaslapack=yes --download-mumps=yes --download-scalapack=yes --download-superlu=yes --download-suitesparse=yes
> [1]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 616 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/aij/mpi/mpiaij.c
> [1]PETSC ERROR: #2 MatSetValues() line 1190 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/interface/matrix.c
> [1]PETSC ERROR: #3 MatSetValuesLocal() line 2053 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/interface/matrix.c
> [1]PETSC ERROR: #4 MatISGetMPIXAIJ_IS() line 365 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
> [1]PETSC ERROR: #5 MatISGetMPIXAIJ() line 437 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
> [0]PETSC ERROR: ./matISCSRDenseRect.exe on a arch-linux2-c-debug named yoda by fghoussen Mon Jun 19 18:08:58 2017
> [0]PETSC ERROR: Configure options --prefix=/home/fghoussen/Documents/INRIA/petsc-3.7.6/local --with-mpi=1 --with-pthread=1 --download-f2cblaslapack=yes --download-mumps=yes --download-scalapack=yes --download-superlu=yes --download-suitesparse=yes
> [0]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 582 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/aij/mpi/mpiaij.c
> [0]PETSC ERROR: #2 MatSetValues() line 1190 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/interface/matrix.c
> [0]PETSC ERROR: #3 MatSetValuesLocal() line 2053 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/interface/matrix.c
> [0]PETSC ERROR: #4 MatISGetMPIXAIJ_IS() line 365 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
> [0]PETSC ERROR: #5 MatISGetMPIXAIJ() line 437 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
> Mat Object: 2 MPI processes
>   type: is
>   Mat Object:   1 MPI processes
>     type: seqdense
> 1.0000000000000000e+00 0.0000000000000000e+00 
>   Mat Object:   1 MPI processes
>     type: seqdense
> 1.0000000000000000e+00 0.0000000000000000e+00 
> 
> >> diff matISCSRDenseSquare.cpp matISCSRDenseRect.cpp
> 3c3
> < // ~> g++ -o matISCSRDenseSquare.exe matISCSRDenseSquare.cpp -lpetsc -lm; mpirun -n 2 matISCSRDenseSquare.exe
> ---
> > // ~> g++ -o matISCSRDenseRect.exe matISCSRDenseRect.cpp -lpetsc -lm; mpirun -n 2 matISCSRDenseRect.exe
> 24c24
> <   ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 2, localIdx, PETSC_COPY_VALUES, &rmap);
> ---
> >   ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 1, &rank, PETSC_COPY_VALUES, &rmap);
> 29c29
> <   MatCreateIS(PETSC_COMM_WORLD, 1, PETSC_DECIDE, PETSC_DECIDE, 3, 3, rmap, cmap, &A);
> ---
> >   MatCreateIS(PETSC_COMM_WORLD, 1, PETSC_DECIDE, PETSC_DECIDE, 2, 3, rmap, cmap, &A);
> 32,33c32,33
> <   if (matType == "csr") {cout << matType << endl; MatCreateSeqAIJ(PETSC_COMM_SELF, 2, 2, 2, NULL, &Aloc);}
> <   else {cout << matType << endl; MatCreateSeqDense(PETSC_COMM_SELF, 2, 2, NULL, &Aloc);}
> ---
> >   if (matType == "csr") {cout << matType << endl; MatCreateSeqAIJ(PETSC_COMM_SELF, 1, 2, 2, NULL, &Aloc);}
> >   else {cout << matType << endl; MatCreateSeqDense(PETSC_COMM_SELF, 1, 2, NULL, &Aloc);}
> 35,36c35,36
> <   PetscScalar localVal[4] = {1., 0., 0., 1.};
> <   MatSetValues(Aloc, 2, localIdx, 2, localIdx, localVal, ADD_VALUES); // Add local 2x2 matrix
> ---
> >   PetscScalar localVal[2] = {1., 0.}; PetscInt oneLocalRow = 0;
> >   MatSetValues(Aloc, 1, &oneLocalRow, 2, localIdx, localVal, ADD_VALUES); // Add local row
> 
> From: "Stefano Zampini" <stefano.zampini at gmail.com>
> To: "Franck Houssen" <franck.houssen at inria.fr>
> Cc: "PETSc users list" <petsc-users at mcs.anl.gov>
> Sent: Monday, June 19, 2017 15:25:35
> Subject: Re: [petsc-users] Building MatIS with dense local matrix ?
> 
> Can you send a minimal working example so that I can fix the code?
> 
> Thanks
> Stefano
> 
> On Jun 19, 2017 at 15:20, "Franck Houssen" <franck.houssen at inria.fr> wrote:
> Hi,
> 
> I am trying to call MatISGetMPIXAIJ on a MatIS (A) that has been fed locally with a sequential dense matrix (Aloc).
> This seems to end up with this error: [0]PETSC ERROR: New nonzero at (0,1) caused a malloc. Is this a known error / limitation? (Is it not supposed to work with dense matrices?)
> 
> This (pseudo code) works fine:
> MatCreateIS(..., A)
> MatCreateSeqAIJ(..., Aloc)
> MatISSetLocalMat(A, Aloc)
> MatISGetMPIXAIJ(A, ...) // OK !
> 
> When I try to replace MatCreateSeqAIJ(..., Aloc) with MatCreateSeqDense(..., Aloc), it no longer works.
> 
> Franck
> 
> PS: running debian/testing with gcc-6.3 + petsc-3.7.6
> 
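The failing rectangular dense setup discussed above can be sketched as a standalone program. This is a minimal sketch assuming the petsc-3.7 MatIS API shown in the diff (MatCreateIS with separate row and column local-to-global maps); the overlapping index layout is an assumption, not Franck's exact reproducer, and error checking (CHKERRQ) is omitted for brevity. It requires a PETSc build and mpirun to execute.

```c
/* Sketch of the dense rectangular MatIS case (assumed layout):
 * 2 ranks, global 2x3 matrix; rank r owns global row r and contributes
 * a dense 1x2 local block on global columns {r, r+1}. */
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat                    A, Aloc, B;
  ISLocalToGlobalMapping rmap, cmap;
  PetscMPIInt            rank;
  PetscInt               row, cols[2], locCols[2] = {0, 1}, locRow = 0;
  PetscScalar            vals[2] = {1., 0.};

  PetscInitialize(&argc, &argv, NULL, NULL);
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

  row     = rank;                       /* each rank owns one global row */
  cols[0] = rank; cols[1] = rank + 1;   /* overlapping global columns    */
  ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 1, &row, PETSC_COPY_VALUES, &rmap);
  ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 2, cols, PETSC_COPY_VALUES, &cmap);

  /* Global 2x3 MatIS, as in the matISCSRDenseRect.cpp diff above. */
  MatCreateIS(PETSC_COMM_WORLD, 1, PETSC_DECIDE, PETSC_DECIDE, 2, 3, rmap, cmap, &A);

  /* Dense 1x2 local block: the case that triggered the malloc error. */
  MatCreateSeqDense(PETSC_COMM_SELF, 1, 2, NULL, &Aloc);
  MatSetValues(Aloc, 1, &locRow, 2, locCols, vals, ADD_VALUES);
  MatAssemblyBegin(Aloc, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(Aloc, MAT_FINAL_ASSEMBLY);

  MatISSetLocalMat(A, Aloc);
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  /* Failed for dense rectangular local matrices before the linked fix. */
  MatISGetMPIXAIJ(A, MAT_INITIAL_MATRIX, &B);

  MatDestroy(&B); MatDestroy(&Aloc); MatDestroy(&A);
  ISLocalToGlobalMappingDestroy(&rmap);
  ISLocalToGlobalMappingDestroy(&cmap);
  PetscFinalize();
  return 0;
}
```

Run with, e.g., `mpirun -n 2 ./sketch.exe`; on petsc-3.7.6 before the fix the dense path aborts in MatISGetMPIXAIJ with the "New nonzero at (0,1) caused a malloc" error quoted above.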

