[petsc-users] Error in creation of MPISBAIJ matrix with DMCreateMatrix

Barry Smith bsmith at mcs.anl.gov
Mon Jan 20 13:37:03 CST 2014


   Thanks for reporting the problem. This is our error. Could you please send us the code that generates the error so we can reproduce the problem, determine the cause, and fix it?

   Barry

 We need to know the exact values of imax, jmax, kmax, etc. to reproduce the problem.
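
   For reference, a minimal standalone reproducer along the lines below would be enough for us to run. This is only a sketch: imax = jmax = kmax = 50 are placeholder dimensions and the umbrella include finclude/petsc.h is assumed, so please substitute the values and includes from your actual run.

      PROGRAM repro
      IMPLICIT NONE
#include <finclude/petsc.h>
      DM             da
      Mat            mat
      PetscErrorCode ierr
      PetscInt       imax,jmax,kmax

      CALL PetscInitialize(PETSC_NULL_CHARACTER,ierr)
!     Placeholder grid sizes; replace with the values from the failing run
      imax = 50
      jmax = 50
      kmax = 50
      CALL DMDACreate3d(PETSC_COMM_WORLD,DMDA_BOUNDARY_NONE,            &
     &    DMDA_BOUNDARY_NONE,DMDA_BOUNDARY_NONE,                        &
     &    DMDA_STENCIL_BOX,-imax,-jmax,-kmax,PETSC_DECIDE,PETSC_DECIDE, &
     &    PETSC_DECIDE,1,1,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,       &
     &    PETSC_NULL_INTEGER,da,ierr)
!     Request the symmetric block format that triggers the error
      CALL DMSetMatType(da,MATMPISBAIJ,ierr)
      CALL DMCreateMatrix(da,mat,ierr)
      CALL MatDestroy(mat,ierr)
      CALL DMDestroy(da,ierr)
      CALL PetscFinalize(ierr)
      END PROGRAM repro

   Running that with one of the processor counts that fails for you (e.g. mpirun -np 20) should let us hit the same code path in DMCreateMatrix_DA_3d_MPISBAIJ.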

 

On Jan 20, 2014, at 9:33 AM, Xiao, Jianjun (IKET) <jianjun.xiao at kit.edu> wrote:

> Dear developers,
> 
> I am using petsc-dev. I tried to create an MPISBAIJ matrix as shown below, and it seems that the matrix creation is sensitive to the number of processors.
> 
>      CALL DMDACreate3d(PETSC_COMM_WORLD,DMDA_BOUNDARY_NONE,            &
>     &    DMDA_BOUNDARY_NONE,DMDA_BOUNDARY_NONE,                        &
>     &    DMDA_STENCIL_BOX,-imax,-jmax,-kmax,PETSC_DECIDE,PETSC_DECIDE,&
>     &    PETSC_DECIDE,1,1,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,       &
>     &    PETSC_NULL_INTEGER,da,ierr)
> 
>       CALL DMSetMatType(da,MATMPISBAIJ,ierr)
>       CALL DMCreateMatrix(da,mat,ierr)
> 
> A cluster with 64 processors was used for testing. 
> 
> When the number of processors is 1, 2, 3, 4, 5, 6, 7, 8, 16, 32, or 64, the code always works well. 
> 
> For some other numbers, the behavior is not stable: sometimes the matrix is created successfully, and sometimes it fails.
> 
> When the number of processors is 20, 33, 63, or some larger number, the code always fails with the error below.
> 
> 
> mpirun: [25]PETSC ERROR: --------------------- Error Message ------------------------------------
> mpirun: [25]PETSC ERROR: Argument out of range!
> mpirun: [25]PETSC ERROR: New nonzero at (198,7038) caused a malloc!
> mpirun: [25]PETSC ERROR: ------------------------------------------------------------------------
> mpirun: [25]PETSC ERROR: Petsc Development GIT revision: f7404d5510646a3c64be49fff6ce547efef07b3d GIT Date: 2013-11-27 00:12:54 +0100
> mpirun: [25]PETSC ERROR: See docs/changes/index.html for recent updates.
> mpirun: [25]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> mpirun: [25]PETSC ERROR: See docs/index.html for manual pages.
> mpirun: [25]PETSC ERROR: Configure options --with-cc=mpicc --with-fc=mpif90 --with-cxx=mpicxx --with-debugging=1
> mpirun: [25]PETSC ERROR: ------------------------------------------------------------------------
> mpirun: [25]PETSC ERROR: MatSetValuesBlocked_SeqBAIJ() line 1836 in src/mat/impls/baij/seq/baij.c
> mpirun: [25]PETSC ERROR: MatSetValuesBlocked_MPISBAIJ() line 339 in src/mat/impls/sbaij/mpi/mpisbaij.c
> mpirun: [25]PETSC ERROR: MatSetValuesBlocked() line 1658 in src/mat/interface/matrix.c
> mpirun: [25]PETSC ERROR: DMCreateMatrix_DA_3d_MPISBAIJ() line 1780 in src/dm/impls/da/fdda.c
> mpirun: [25]PETSC ERROR: DMCreateMatrix_DA() line 777 in src/dm/impls/da/fdda.c
> mpirun: [25]PETSC ERROR: DMCreateMatrix() line 1007 in src/dm/interface/dm.c
> 
> Then I changed the matrix format to MATMPIBAIJ. DMCreateMatrix worked fine for any number of processors.
> 
>       CALL DMSetMatType(da,MATMPIBAIJ,ierr)
>       CALL DMCreateMatrix(da,gfmat,ierr)
> 
> Could you please let me know how I can fix this problem? If you need more information, please let me know. Thank you.
> 
> JJ


