[petsc-users] Argument out of range error in MatPermute
Eda Oktay
eda.oktay at metu.edu.tr
Wed Apr 24 05:03:38 CDT 2019
Dear Stefano,
Thank you for answering. When I used MatSetSizes, I got an error since the
matrix is read from a file, but from that error I understood that the local
sizes are in fact 2127 and 2126, so I had misunderstood the problem. I am
sorry for my mistake.

However, I still cannot see where the error is, since both the communicators
and the local sizes of the IS and the matrix are the same. I still get the
same error in MatPermute:
[0]PETSC ERROR: Argument out of range
[0]PETSC ERROR: Index -1081207334 is out of range
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.10.3, Dec, 18, 2018
[0]PETSC ERROR: ./TEKRAR_TEK_SAYI_SON_YENI_DENEME_TEMIZ_ENYENI_FINAL on a
arch-linux2-c-debug named 53d.wls.metu.edu.tr by edaoktay Wed Apr 24
11:22:15 2019
[0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
--with-fc=gfortran --with-cxx-dialect=C++11 --download-openblas
--download-metis --download-parmetis --download-superlu_dist
--download-slepc --download-mpich
[0]PETSC ERROR: #1 PetscLayoutFindOwner() line 248 in
/home/edaoktay/petsc-3.10.3/include/petscis.h
[0]PETSC ERROR: #2 MatPermute_MPIAIJ() line 1685 in
/home/edaoktay/petsc-3.10.3/src/mat/impls/aij/mpi/mpiaij.c
[0]PETSC ERROR: #3 MatPermute() line 4997 in
/home/edaoktay/petsc-3.10.3/src/mat/interface/matrix.c
[0]PETSC ERROR: #4 main() line 361 in
/home/edaoktay/petsc-3.10.3/arch-linux2-c-debug/share/slepc/examples/src/eda/TEKRAR_TEK_SAYI_SON_YENI_DENEME_TEMIZ_ENYENI_FINAL.c
This is the part of my program:
  ierr = MatCreateVecs(L,&vr,NULL);CHKERRQ(ierr);
  ierr = EPSGetEigenpair(eps,0,&kr,NULL,vr,NULL);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD," The second smallest eigenvalue: %g\n",kr);CHKERRQ(ierr);

  /* sort second smallest eigenvector */
  ierr = VecGetSize(vr,&siz);CHKERRQ(ierr);
  ierr = PetscMalloc1(siz,&idx);CHKERRQ(ierr);
  for (i=0; i<siz; i++) idx[i] = i;

  /* gather the eigenvector onto every process */
  VecScatter ctx;
  Vec vout;
  ierr = VecScatterCreateToAll(vr,&ctx,&vout);CHKERRQ(ierr);
  ierr = VecScatterBegin(ctx,vr,vout,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterEnd(ctx,vr,vout,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterDestroy(&ctx);CHKERRQ(ierr);

  PetscScalar *avr;
  ierr = VecGetArray(vout,&avr);CHKERRQ(ierr);
  ierr = PetscSortRealWithPermutation(siz,avr,idx);CHKERRQ(ierr);

  /* Select out a piece of the resulting indices idx on each process; for
     example with two processes I think rank = 0 would get the first half
     of the idx and rank = 1 would get the second half. */
  PetscMPIInt rank,size;
  MPI_Comm_rank(PETSC_COMM_WORLD,&rank);
  MPI_Comm_size(PETSC_COMM_WORLD,&size);

  PetscInt mod;
  mod = siz % size;

  PetscInt *idxx,ss;
  ss = (siz-mod)/size;

  if (mod != 0){
    if (rank<mod){
      ierr = PetscMalloc1(ss+1,&idxx);CHKERRQ(ierr);
    } else {
      ierr = PetscMalloc1(ss,&idxx);CHKERRQ(ierr);
    }
  } else {
    ierr = PetscMalloc1(ss,&idxx);CHKERRQ(ierr);
  }

  j = 0;
  for (i=rank*ss; i<(rank+1)*ss; i++) {
    idxx[j] = idx[i];
    //PetscPrintf(PETSC_COMM_WORLD," idxx: %D\n",idxx[j]);
    j++;
  }

  if (mod != 0){
    if (rank<mod){
      idxx[ss+1] = idx[ss*size+rank+1];
    }
  }

  /* Permute matrix L (spy(A(p1,p1))) */
  if (mod != 0){
    if (rank<mod){
      ierr = ISCreateGeneral(PETSC_COMM_WORLD,ss+1,idxx,PETSC_COPY_VALUES,&is);CHKERRQ(ierr);
    } else {
      ierr = ISCreateGeneral(PETSC_COMM_WORLD,ss,idxx,PETSC_COPY_VALUES,&is);CHKERRQ(ierr);
    }
  } else {
    ierr = ISCreateGeneral(PETSC_COMM_WORLD,ss,idxx,PETSC_COPY_VALUES,&is);CHKERRQ(ierr);
  }

  ierr = ISSetPermutation(is);CHKERRQ(ierr);
  ierr = MatPermute(A,is,is,&PL);CHKERRQ(ierr);
I printed the IS and idxx; the IS is a permutation and the numbers look
correct. The program also works when mod == 0.
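
(For reference, a minimal sketch of an alternative I could try, assuming idx
holds the complete sorted permutation on every rank, as after the
VecScatterCreateToAll above: take the IS local size directly from the
matrix's row ownership with MatGetOwnershipRange, so the IS and Mat local
sizes agree by construction even when siz is odd.)

  PetscInt rstart,rend;
  ierr = MatGetOwnershipRange(A,&rstart,&rend);CHKERRQ(ierr);
  /* this rank owns global rows rstart..rend-1, so it takes that slice of idx */
  ierr = ISCreateGeneral(PETSC_COMM_WORLD,rend-rstart,idx+rstart,PETSC_COPY_VALUES,&is);CHKERRQ(ierr);
  ierr = ISSetPermutation(is);CHKERRQ(ierr);
  ierr = MatPermute(A,is,is,&PL);CHKERRQ(ierr);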
Thanks,
Eda
Stefano Zampini <stefano.zampini at gmail.com> wrote on Mon, Apr 22, 2019 at 18:13:
> If you are using PETSC_DECIDE for the local sizes in MatSetSizes, the
> default local sizes should be 2127 and 2126.
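>
> A sketch of where those defaults come from (PetscSplitOwnership is the
> routine behind PETSc's default layout):
>
>   PetscInt n = PETSC_DECIDE, N = 4253;
>   ierr = PetscSplitOwnership(PETSC_COMM_WORLD,&n,&N);CHKERRQ(ierr);
>   /* with 2 processes: n == 2127 on rank 0 and n == 2126 on rank 1 */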
>
> On Mon, Apr 22, 2019 at 11:28, Eda Oktay via petsc-users <petsc-users at mcs.anl.gov> wrote:
>
>> Thank you for your answers. I figured out that for a matrix of size
>> 4253*4253, the local size of the Mat is 2127 on each of the 2 processors.
>> However, I wrote the program such that the local sizes of the IS are 2127
>> and 2126.
>>
>> Is a local Mat size of 2127 on both processors correct? If it is, then I
>> will change the local sizes of the IS to match, but then they will exceed
>> the global size of the Mat. Isn't this also a problem?
>>
>> Thanks a lot,
>>
>> Eda
>>
>> Smith, Barry F. <bsmith at mcs.anl.gov> wrote on Tue, Apr 9, 2019 at 00:31:
>>
>>>
>>> Suggest printing out the IS with ISView after it is created and
>>> confirming that 1) it is a permutation and 2) that the size of the IS on
>>> each process matches the number of rows on that process.
>>>
>>> Note from the manual page: The index sets should be on the same
>>> communicator as Mat and have the same local sizes.
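>>>
>>> A minimal sketch of such a check (rank as obtained from MPI_Comm_rank):
>>>
>>>   ierr = ISView(is,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
>>>   PetscInt nis,mloc;
>>>   ierr = ISGetLocalSize(is,&nis);CHKERRQ(ierr);
>>>   ierr = MatGetLocalSize(A,&mloc,NULL);CHKERRQ(ierr);
>>>   ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD,"[%d] IS local size %D, Mat local rows %D\n",rank,nis,mloc);CHKERRQ(ierr);
>>>   ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD,PETSC_STDOUT);CHKERRQ(ierr);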
>>>
>>> Barry
>>>
>>>
>>> > On Apr 8, 2019, at 3:21 AM, Eda Oktay via petsc-users <petsc-users at mcs.anl.gov> wrote:
>>> >
>>> > Hello again,
>>> >
>>> > I solved the problem for even-sized matrices. However, when the matrix
>>> > size is odd, the number of elements in the index set differs between
>>> > processors. (For example, for size 4253*4253 and 2 processors, the index
>>> > set on processor 0 has 2127 elements while on processor 1 it has 2126.)
>>> > I think this is why MatPermute again gives the same "Argument out of
>>> > range" error. The index sets look correct, but I still do not see why I
>>> > get this error.
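>>> >
>>> > (As a sketch, the balanced split I am after is the one PETSc's default
>>> > layout uses: the first siz % size ranks get one extra entry.)
>>> >
>>> >   PetscInt rem   = siz % size;
>>> >   PetscInt nloc  = siz/size + (rank < rem ? 1 : 0);
>>> >   PetscInt start = rank*(siz/size) + PetscMin(rank,rem);
>>> >   /* this rank's slice is idx[start .. start+nloc-1] */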
>>> >
>>> > This is the part of my program:
>>> >
>>> > PetscMPIInt rank,size;
>>> > MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
>>> > MPI_Comm_size(PETSC_COMM_WORLD, &size);
>>> >
>>> > PetscInt mod;
>>> > mod = siz % size;
>>> >
>>> > PetscInt *idxx,ss;
>>> > ss = (siz-mod)/size;
>>> >
>>> > if (mod != 0){
>>> > if (rank<mod){
>>> > PetscMalloc1(ss+1,&idxx);
>>> > } else{
>>> > PetscMalloc1(ss,&idxx);
>>> > }
>>> > }
>>> >
>>> > if (rank != size-1) {
>>> > j =0;
>>> > for (i=rank*ss; i<(rank+1)*ss; i++) {
>>> > idxx[j] = idx[i];
>>> > j++;
>>> > }
>>> >
>>> > } else {
>>> >
>>> > j =0;
>>> > for (i=rank*ss; i<siz; i++) {
>>> > idxx[j] = idx[i];
>>> > j++;
>>> > }
>>> >
>>> > }
>>> >
>>> > if (mod != 0){
>>> > if (rank<mod){
>>> > idxx[ss+1] = idx[ss*size+rank+1];
>>> > }
>>> > }
>>> >
>>> > /* Permute matrix L (spy(A(p1,p1))) */
>>> >
>>> > if (mod != 0){
>>> > if (rank<mod){
>>> > ierr = ISCreateGeneral(PETSC_COMM_WORLD,ss+1,idxx,PETSC_COPY_VALUES,&is);CHKERRQ(ierr);
>>> > } else{
>>> > ierr = ISCreateGeneral(PETSC_COMM_WORLD,ss,idxx,PETSC_COPY_VALUES,&is);CHKERRQ(ierr);
>>> > }
>>> > }
>>> > ierr = ISSetPermutation(is);CHKERRQ(ierr);
>>> >
>>> > ierr = MatPermute(A,is,is,&PL);CHKERRQ(ierr);
>>> >
>>> > And I get the following error even when I use MatSetOption:
>>> >
>>> > [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>> > [0]PETSC ERROR: Argument out of range
>>> > [0]PETSC ERROR: New nonzero at (0,4252) caused a malloc
>>> > Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check
>>> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>> > [0]PETSC ERROR: Petsc Release Version 3.10.3, Dec, 18, 2018
>>> > [0]PETSC ERROR: ./SON_YENI_DENEME_TEMIZ_ENYENI_FINAL on a
>>> arch-linux2-c-debug named dd2b.wls.metu.edu.tr by edaoktay Mon Apr 8
>>> 11:10:59 2019
>>> > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
>>> --with-fc=gfortran --with-cxx-dialect=C++11 --download-openblas
>>> --download-metis --download-parmetis --download-superlu_dist
>>> --download-slepc --download-mpich
>>> > [0]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 617 in
>>> /home/edaoktay/petsc-3.10.3/src/mat/impls/aij/mpi/mpiaij.c
>>> > [0]PETSC ERROR: #2 MatSetValues() line 1349 in
>>> /home/edaoktay/petsc-3.10.3/src/mat/interface/matrix.c
>>> > [0]PETSC ERROR: #3 MatPermute_MPIAIJ() line 1714 in
>>> /home/edaoktay/petsc-3.10.3/src/mat/impls/aij/mpi/mpiaij.c
>>> > [0]PETSC ERROR: #4 MatPermute() line 4997 in
>>> /home/edaoktay/petsc-3.10.3/src/mat/interface/matrix.c
>>> > [0]PETSC ERROR: #5 main() line 352 in
>>> /home/edaoktay/petsc-3.10.3/arch-linux2-c-debug/share/slepc/examples/src/eda/SON_YENI_DENEME_TEMIZ_ENYENI_FINAL.c
>>> > [0]PETSC ERROR: PETSc Option Table entries:
>>> > [0]PETSC ERROR: -f
>>> /home/edaoktay/petsc-3.10.3/share/petsc/datafiles/matrices/binary_files/airfoil1_binary
>>> > [0]PETSC ERROR: -mat_partitioning_type parmetis
>>> > [0]PETSC ERROR: -unweighted
>>> > [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
>>> >
>>> > Thanks!
>>> >
>>> > Eda
>>> >
>>> > Eda Oktay <eda.oktay at metu.edu.tr> wrote on Mon, Mar 25, 2019 at 13:53:
>>> > I attached the whole program I wrote; the problem is at line 285. One of
>>> > the matrices I used, airfoil1_binary, is included in the folder, along
>>> > with the makefile. Is that what you want?
>>> >
>>> > Matthew Knepley <knepley at gmail.com> wrote on Mon, Mar 25, 2019 at 13:41:
>>> > That should not happen. Can you send a small example that we can
>>> > debug?
>>> >
>>> > Thanks,
>>> >
>>> > Matt
>>> >
>>> > On Mon, Mar 25, 2019 at 12:38 AM Eda Oktay via petsc-users <petsc-users at mcs.anl.gov> wrote:
>>> > Hello,
>>> >
>>> > I am trying to permute a matrix A using the following lines:
>>> >
>>> > ierr = ISCreateGeneral(PETSC_COMM_SELF,siz,idx,PETSC_COPY_VALUES,&is);CHKERRQ(ierr);
>>> > ierr = ISSetPermutation(is);CHKERRQ(ierr);
>>> > ierr = ISDuplicate(is,&newIS);CHKERRQ(ierr);
>>> > ierr = MatSetOption(A,MAT_NEW_NONZERO_ALLOCATION_ERR,PETSC_FALSE);CHKERRQ(ierr);
>>> > ierr = MatPermute(A,is,newIS,&PL);CHKERRQ(ierr);
>>> >
>>> > However, on the MatPermute line I get the following error, even though
>>> > I used MatSetOption before it:
>>> >
>>> > [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>> > [0]PETSC ERROR: Argument out of range
>>> > [0]PETSC ERROR: New nonzero at (0,485) caused a malloc
>>> > Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check
>>> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>> > [0]PETSC ERROR: Petsc Release Version 3.10.3, Dec, 18, 2018
>>> > [0]PETSC ERROR: ./DENEME_TEMIZ_ENYENI_FINAL on a arch-linux2-c-debug
>>> named 1232.wls.metu.edu.tr by edaoktay Mon Mar 25 12:15:14 2019
>>> > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
>>> --with-fc=gfortran --with-cxx-dialect=C++11 --download-openblas
>>> --download-metis --download-parmetis --download-superlu_dist
>>> --download-slepc --download-mpich
>>> > [0]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 579 in
>>> /home/edaoktay/petsc-3.10.3/src/mat/impls/aij/mpi/mpiaij.c
>>> > [0]PETSC ERROR: #2 MatAssemblyEnd_MPIAIJ() line 807 in
>>> /home/edaoktay/petsc-3.10.3/src/mat/impls/aij/mpi/mpiaij.c
>>> > [0]PETSC ERROR: #3 MatAssemblyEnd() line 5340 in
>>> /home/edaoktay/petsc-3.10.3/src/mat/interface/matrix.c
>>> > [0]PETSC ERROR: #4 MatPermute_MPIAIJ() line 1723 in
>>> /home/edaoktay/petsc-3.10.3/src/mat/impls/aij/mpi/mpiaij.c
>>> > [0]PETSC ERROR: #5 MatPermute() line 4997 in
>>> /home/edaoktay/petsc-3.10.3/src/mat/interface/matrix.c
>>> > [0]PETSC ERROR: #6 main() line 285 in
>>> /home/edaoktay/petsc-3.10.3/arch-linux2-c-debug/share/slepc/examples/src/eda/DENEME_TEMIZ_ENYENI_FINAL.c
>>> > [0]PETSC ERROR: PETSc Option Table entries:
>>> > [0]PETSC ERROR: -f
>>> /home/edaoktay/petsc-3.10.3/share/petsc/datafiles/matrices/binary_files/airfoil1_binary
>>> > [0]PETSC ERROR: -mat_partitioning_type parmetis
>>> > [0]PETSC ERROR: -weighted
>>> > [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
>>> >
>>> > I'll be glad if you can help me.
>>> >
>>> > Thanks!
>>> >
>>> > Eda
>>> >
>>> >
>>> > --
>>> > What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> > -- Norbert Wiener
>>> >
>>> > https://www.cse.buffalo.edu/~knepley/
>>>
>>>
>
> --
> Stefano
>