[petsc-users] Building MatIS with dense local matrix ?
Stefano Zampini
stefano.zampini at gmail.com
Tue Jun 20 06:23:27 CDT 2017
Franck,
I tested your new example with master and it works. However, it doesn't
work with maint. I fixed the rectangular case a while ago in master and
forgot to add the change to maint too. Sorry for that.
This should fix the problem with maint:
https://bitbucket.org/petsc/petsc/commits/0ea065fb06d751599c4157d36bfe1a1b41348e0b
Test your real case and let me know.
If you could, it would be good to test against master too.
Thanks,
Stefano
2017-06-20 12:58 GMT+02:00 Franck Houssen <franck.houssen at inria.fr>:
> As I said, it is often difficult to reduce the "real" problem: it turns
> out that your fix solves the "matISCSRDenseSquare/Rect.cpp" dummy example I
> sent, but it's still not working in "my real" situation.
>
> I changed the "matISCSRDenseSquare/Rect.cpp" dummy example a bit (see the
> git diff below; I just changed the overlapping point): the dummy example
> still fails.
>
> "mpirun -n 2 ./matISCSRDenseSquare.exe csr" and "mpirun -n 2
> ./matISCSRDenseSquare.exe dense" : OK
> but
> "mpirun -n 2 ./matISCSRDenseRect.exe csr" and "mpirun -n 2
> ./matISCSRDenseRect.exe dense": KO with error "Argument out of range - New
> nonzero at (0,2) caused a malloc"
>
> I would say the problem (for the "real" case I am concerned with) is
> around lines 360-380 of /src/mat/impls/is/matis.c, not around line 181:
> that fix addresses a valid problem, but this problem is another one.
>
> Franck
>
> --- a/matISCSRDenseRect.cpp
> +++ b/matISCSRDenseRect.cpp
> @@ -18,7 +18,7 @@ int main(int argc,char **argv) {
> int rank = 0; MPI_Comm_rank(MPI_COMM_WORLD, &rank);
>
> PetscInt localIdx[2] = {0, 0};
> - if (rank == 0) {localIdx[0] = 0; localIdx[1] = 1;}
> + if (rank == 0) {localIdx[0] = 0; localIdx[1] = 2;}
> else {localIdx[0] = 1; localIdx[1] = 2;}
> ISLocalToGlobalMapping rmap;
> ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 1, &rank, PETSC_COPY_VALUES, &rmap);
> diff --git a/Graphs/Franck/03.petscDDM/02.petscMailList/matISCSRDenseSquare.cpp b/Graphs/Franck/03.petscDDM/02.petscMailList/matISCSRDenseSquare.cpp
> index 4bc6190..4a6ea41 100644
> --- a/matISCSRDenseSquare.cpp
> +++ b/matISCSRDenseSquare.cpp
> @@ -18,7 +18,7 @@ int main(int argc,char **argv) {
> int rank = 0; MPI_Comm_rank(MPI_COMM_WORLD, &rank);
>
> PetscInt localIdx[2] = {0, 0};
> - if (rank == 0) {localIdx[0] = 0; localIdx[1] = 1;}
> + if (rank == 0) {localIdx[0] = 0; localIdx[1] = 2;}
> else {localIdx[0] = 1; localIdx[1] = 2;}
> ISLocalToGlobalMapping rmap;
> ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 2, localIdx, PETSC_COPY_VALUES, &rmap);
>
> ------------------------------
>
> *From: *"Stefano Zampini" <stefano.zampini at gmail.com>
> *To: *"Franck Houssen" <franck.houssen at inria.fr>
> *Cc: *"PETSc users list" <petsc-users at mcs.anl.gov>
> *Sent: *Tuesday, June 20, 2017 00:23:24
>
> *Subject: *Re: [petsc-users] Building MatIS with dense local matrix ?
>
> It should be fixed now in maint and master
>
> https://bitbucket.org/petsc/petsc/commits/4c8dd594d1988a0cbe282f8a37d9916f61e0c445
>
> Thanks for reporting the problem,
> Stefano
>
> On Jun 19, 2017, at 10:46 PM, Stefano Zampini <stefano.zampini at gmail.com> wrote:
>
> Franck,
>
> Thanks. I'll get back soon with a fix.
>
> Stefano
>
>
> On Jun 19, 2017, at 18:17, "Franck Houssen" <franck.houssen at inria.fr> wrote:
>
>> The problem was difficult to reduce, as reducing makes things disappear...
>> Luckily, I believe I got it (or at least, it looks "like" the one I
>> "really" have...).
>>
>> It seems that for a square matrix, it works fine for both csr and dense
>> local matrices. But, if I am not mistaken, it does not for a dense
>> rectangular local matrix (still OK for csr).
>>
>> matISCSRDenseSquare.cpp: 2 procs, global 3x3 matrix, each proc adds a 2x2
>> local matrix into the global matrix.
>> matISCSRDenseRect.cpp: 2 procs, global *2*x3 matrix, each proc adds a *1*x2
>> local *vector* (one row) into the global matrix.
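>>
>> For reference (if I am reading my own setup right), with ADD_VALUES the
>> overlapping contributions should assemble to the following global
>> operators. Square case: rank 0 adds a 2x2 identity at rows/columns {0,1},
>> rank 1 at rows/columns {1,2}, so the shared entry (1,1) is summed twice:
>>
>>   [ 1 0 0 ]
>>   [ 0 2 0 ]
>>   [ 0 0 1 ]
>>
>> Rect case: rank 0 adds the row [1, 0] at row 0, columns {0,1}; rank 1
>> adds [1, 0] at row 1, columns {1,2}:
>>
>>   [ 1 0 0 ]
>>   [ 0 1 0 ]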
>>
>> reminder: running debian/testing with gcc-6.3 + petsc-3.7.6
>>
>> Franck
>>
>> >> mpirun -n 2 ./matISCSRDenseSquare.exe csr; mpirun -n 2 ./matISCSRDenseSquare.exe dense
>> csr
>> csr
>> Mat Object: 2 MPI processes
>> type: is
>> Mat Object: 1 MPI processes
>> type: seqaij
>> row 0: (0, 1.) (1, 0.)
>> row 1: (0, 0.) (1, 1.)
>> Mat Object: 1 MPI processes
>> type: seqaij
>> row 0: (0, 1.) (1, 0.)
>> row 1: (0, 0.) (1, 1.)
>> dense
>> dense
>> Mat Object: 2 MPI processes
>> type: is
>> Mat Object: 1 MPI processes
>> type: seqdense
>> 1.0000000000000000e+00 0.0000000000000000e+00
>> 0.0000000000000000e+00 1.0000000000000000e+00
>> Mat Object: 1 MPI processes
>> type: seqdense
>> 1.0000000000000000e+00 0.0000000000000000e+00
>> 0.0000000000000000e+00 1.0000000000000000e+00
>>
>> >> mpirun -n 2 ./matISCSRDenseRect.exe csr; mpirun -n 2 ./matISCSRDenseRect.exe dense
>> csr
>> csr
>> Mat Object: 2 MPI processes
>> type: is
>> Mat Object: 1 MPI processes
>> type: seqaij
>> row 0: (0, 1.) (1, 0.)
>> Mat Object: 1 MPI processes
>> type: seqaij
>> row 0: (0, 1.) (1, 0.)
>> dense
>> dense
>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>> [0]PETSC ERROR: Argument out of range
>> [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>> [1]PETSC ERROR: Argument out of range
>> [1]PETSC ERROR: New nonzero at (0,1) caused a malloc
>> Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn
>> off this check
>> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> for trouble shooting.
>> [1]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017
>> [1]PETSC ERROR: ./matISCSRDenseRect.exe on a arch-linux2-c-debug named
>> yoda by fghoussen Mon Jun 19 18:08:58 2017
>> New nonzero at (0,1) caused a malloc
>> Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn
>> off this check
>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> for trouble shooting.
>> [0]PETSC ERROR: Petsc Release Version 3.7.6, Apr, 24, 2017
>> [1]PETSC ERROR: Configure options --prefix=/home/fghoussen/Documents/INRIA/petsc-3.7.6/local --with-mpi=1 --with-pthread=1 --download-f2cblaslapack=yes --download-mumps=yes --download-scalapack=yes --download-superlu=yes --download-suitesparse=yes
>> [1]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 616 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/aij/mpi/mpiaij.c
>> [1]PETSC ERROR: #2 MatSetValues() line 1190 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/interface/matrix.c
>> [1]PETSC ERROR: #3 MatSetValuesLocal() line 2053 in
>> /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/interface/matrix.c
>> [1]PETSC ERROR: #4 MatISGetMPIXAIJ_IS() line 365 in
>> /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
>> [1]PETSC ERROR: #5 MatISGetMPIXAIJ() line 437 in
>> /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
>> [0]PETSC ERROR: ./matISCSRDenseRect.exe on a arch-linux2-c-debug named
>> yoda by fghoussen Mon Jun 19 18:08:58 2017
>> [0]PETSC ERROR: Configure options --prefix=/home/fghoussen/Documents/INRIA/petsc-3.7.6/local --with-mpi=1 --with-pthread=1 --download-f2cblaslapack=yes --download-mumps=yes --download-scalapack=yes --download-superlu=yes --download-suitesparse=yes
>> [0]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 582 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/aij/mpi/mpiaij.c
>> [0]PETSC ERROR: #2 MatSetValues() line 1190 in /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/interface/matrix.c
>> [0]PETSC ERROR: #3 MatSetValuesLocal() line 2053 in
>> /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/interface/matrix.c
>> [0]PETSC ERROR: #4 MatISGetMPIXAIJ_IS() line 365 in
>> /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
>> [0]PETSC ERROR: #5 MatISGetMPIXAIJ() line 437 in
>> /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
>> Mat Object: 2 MPI processes
>> type: is
>> Mat Object: 1 MPI processes
>> type: seqdense
>> 1.0000000000000000e+00 0.0000000000000000e+00
>> Mat Object: 1 MPI processes
>> type: seqdense
>> 1.0000000000000000e+00 0.0000000000000000e+00
>>
>> >> diff matISCSRDenseSquare.cpp matISCSRDenseRect.cpp
>> 3c3
>> < // ~> g++ -o matISCSRDenseSquare.exe matISCSRDenseSquare.cpp -lpetsc -lm; mpirun -n 2 matISCSRDenseSquare.exe
>> ---
>> > // ~> g++ -o matISCSRDenseRect.exe matISCSRDenseRect.cpp -lpetsc -lm; mpirun -n 2 matISCSRDenseRect.exe
>> 24c24
>> < ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 2, localIdx, PETSC_COPY_VALUES, &rmap);
>> ---
>> > ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 1, &rank, PETSC_COPY_VALUES, &rmap);
>> 29c29
>> < MatCreateIS(PETSC_COMM_WORLD, 1, PETSC_DECIDE, PETSC_DECIDE, 3, 3, rmap, cmap, &A);
>> ---
>> > MatCreateIS(PETSC_COMM_WORLD, 1, PETSC_DECIDE, PETSC_DECIDE, 2, 3, rmap, cmap, &A);
>> 32,33c32,33
>> < if (matType == "csr") {cout << matType << endl; MatCreateSeqAIJ(PETSC_COMM_SELF, 2, 2, 2, NULL, &Aloc);}
>> < else {cout << matType << endl; MatCreateSeqDense(PETSC_COMM_SELF, 2, 2, NULL, &Aloc);}
>> ---
>> > if (matType == "csr") {cout << matType << endl; MatCreateSeqAIJ(PETSC_COMM_SELF, 1, 2, 2, NULL, &Aloc);}
>> > else {cout << matType << endl; MatCreateSeqDense(PETSC_COMM_SELF, 1, 2, NULL, &Aloc);}
>> 35,36c35,36
>> < PetscScalar localVal[4] = {1., 0., 0., 1.};
>> < MatSetValues(Aloc, 2, localIdx, 2, localIdx, localVal, ADD_VALUES); // Add local 2x2 matrix
>> ---
>> > PetscScalar localVal[2] = {1., 0.}; PetscInt oneLocalRow = 0;
>> > MatSetValues(Aloc, 1, &oneLocalRow, 2, localIdx, localVal, ADD_VALUES); // Add local row
>>
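>> For completeness, here is a minimal sketch of what the full
>> matISCSRDenseRect.cpp presumably looks like, reconstructed from the diff
>> fragments above. Everything not visible in the diffs (initialization, the
>> local column indices passed to MatSetValues on Aloc, assembly, cleanup) is
>> an assumption; in particular, the viewed local rows "row 0: (0, 1.) (1, 0.)"
>> suggest MatSetValues on Aloc uses local column indices {0, 1}, not the
>> global localIdx.
>>
>> // matISCSRDenseRect.cpp (hypothetical reconstruction, see note above)
>> // ~> g++ -o matISCSRDenseRect.exe matISCSRDenseRect.cpp -lpetsc -lm; mpirun -n 2 matISCSRDenseRect.exe [csr|dense]
>> #include <petsc.h>
>> #include <iostream>
>> #include <string>
>> using namespace std;
>>
>> int main(int argc, char **argv) {
>>   PetscInitialize(&argc, &argv, NULL, NULL);
>>   string matType = (argc > 1) ? argv[1] : "csr";
>>   int rank = 0; MPI_Comm_rank(MPI_COMM_WORLD, &rank);
>>
>>   // Global column indices seen by this process: {0,1} on rank 0, {1,2} on rank 1 (column 1 overlaps).
>>   PetscInt localIdx[2] = {0, 0};
>>   if (rank == 0) {localIdx[0] = 0; localIdx[1] = 1;}
>>   else {localIdx[0] = 1; localIdx[1] = 2;}
>>
>>   // Row map: each process owns one global row (= rank); column map: the two columns above.
>>   PetscInt globalRow = rank;
>>   ISLocalToGlobalMapping rmap, cmap;
>>   ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 1, &globalRow, PETSC_COPY_VALUES, &rmap);
>>   ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, 2, localIdx, PETSC_COPY_VALUES, &cmap);
>>
>>   // Global 2x3 MatIS; each process contributes a 1x2 local block.
>>   Mat A;
>>   MatCreateIS(PETSC_COMM_WORLD, 1, PETSC_DECIDE, PETSC_DECIDE, 2, 3, rmap, cmap, &A);
>>
>>   Mat Aloc;
>>   if (matType == "csr") {cout << matType << endl; MatCreateSeqAIJ(PETSC_COMM_SELF, 1, 2, 2, NULL, &Aloc);}
>>   else {cout << matType << endl; MatCreateSeqDense(PETSC_COMM_SELF, 1, 2, NULL, &Aloc);}
>>
>>   // Fill the local row using local indices (assumed, see note above).
>>   PetscScalar localVal[2] = {1., 0.}; PetscInt oneLocalRow = 0, localCols[2] = {0, 1};
>>   MatSetValues(Aloc, 1, &oneLocalRow, 2, localCols, localVal, ADD_VALUES); // Add local row
>>   MatAssemblyBegin(Aloc, MAT_FINAL_ASSEMBLY); MatAssemblyEnd(Aloc, MAT_FINAL_ASSEMBLY);
>>
>>   MatISSetLocalMat(A, Aloc);
>>   MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
>>   MatView(A, PETSC_VIEWER_STDOUT_WORLD);
>>
>>   // The conversion that triggers "new nonzero caused a malloc" in the dense case.
>>   Mat B;
>>   MatISGetMPIXAIJ(A, MAT_INITIAL_MATRIX, &B);
>>
>>   MatDestroy(&B); MatDestroy(&Aloc); MatDestroy(&A);
>>   ISLocalToGlobalMappingDestroy(&rmap); ISLocalToGlobalMappingDestroy(&cmap);
>>   PetscFinalize();
>>   return 0;
>> }
>>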
>> ------------------------------
>>
>> *From: *"Stefano Zampini" <stefano.zampini at gmail.com>
>> *To: *"Franck Houssen" <franck.houssen at inria.fr>
>> *Cc: *"PETSc users list" <petsc-users at mcs.anl.gov>
>> *Sent: *Monday, June 19, 2017 15:25:35
>> *Subject: *Re: [petsc-users] Building MatIS with dense local matrix ?
>>
>> Can you send a minimal working example so that I can fix the code?
>>
>> Thanks
>> Stefano
>>
>> On Jun 19, 2017, at 15:20, "Franck Houssen" <franck.houssen at inria.fr> wrote:
>>
>>> Hi,
>>>
>>> I am trying to call MatISGetMPIXAIJ on a MatIS (A) that has been fed
>>> locally with a sequential dense matrix (Aloc).
>>> This seems to end up with the error: [0]PETSC ERROR: New nonzero at (0,1)
>>> caused a malloc. Is this a known error / limitation? (Is it not supposed
>>> to work with a dense matrix?)
>>>
>>> This (pseudo code) works fine:
>>> MatCreateIS(..., A)
>>> MatCreateSeqAIJ(..., Aloc)
>>> MatISSetLocalMat(A, Aloc)
>>> MatISGetMPIXAIJ(A, ...) // OK !
>>>
>>> When I try to replace MatCreateSeqAIJ(..., Aloc) with
>>> MatCreateSeqDense(..., Aloc), it no longer works.
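>>>
>>> That is, this variant (same pseudo code, dense local matrix) fails:
>>> MatCreateIS(..., A)
>>> MatCreateSeqDense(..., Aloc)
>>> MatISSetLocalMat(A, Aloc)
>>> MatISGetMPIXAIJ(A, ...) // KO: New nonzero at (0,1) caused a malloc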
>>>
>>> Franck
>>>
>>> PS: running debian/testing with gcc-6.3 + petsc-3.7.6
>>>
>>
>>
>
>
--
Stefano