[petsc-users] MPIBAIJ MatMult and non conforming object sizes error on some matrices

Steena M stm8086 at yahoo.com
Thu Dec 18 02:36:06 CST 2014


Hello,

I am loading symmetric sparse matrices (source: the University of Florida Sparse Matrix Collection) in binary format, converting them to MPIBAIJ format, and executing a parallel MatMult. The code below works for most matrices but aborts with a "Nonconforming object sizes" error for some. For instance, MPIBAIJ MatMult() on a 19366x19366 sparse matrix with block size 2 on two MPI ranks gives:

[1]PETSC ERROR: Nonconforming object sizes!
[1]PETSC ERROR: Mat mat,Vec y: local dim 9682 9683!

Another instance: the matrix thermomech_TK, with dimensions 102158x102158, block size 2, on two MPI ranks:

[1]PETSC ERROR: Nonconforming object sizes!
[1]PETSC ERROR: Mat mat,Vec y: local dim 51078 51079!

thermomech_TK completes a clean run with block size 7, with no errors. Does this mean that, depending on the sparsity pattern of a matrix, otherwise compatible block sizes will not always work? (I sketch my guess at the arithmetic below.)
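
My current guess at the cause (an assumption on my part, not verified against the PETSc sources): MatLoad distributes whole blocks across ranks, so with bs = 2 the 19366 rows split as 9684/9682, while my code forces 19366/2 = 9683 local entries on each vector. A small standalone check of that arithmetic, assuming the usual N/size-plus-remainder ownership rule:

#include <stdio.h>

/* Compare a block-aligned row split (what I assume MatLoad does for BAIJ)
   with the m/total_ranks split my code forces on the vectors. */
int main(void)
{
  int m = 19366, bs = 2, size = 2;
  int nblocks = m/bs;                                   /* 9683 blocks */
  int rank;
  for (rank = 0; rank < size; rank++) {
    int blocks = nblocks/size + ((nblocks % size) > rank ? 1 : 0);
    printf("rank %d: mat local rows %d, forced vec local size %d\n",
           rank, blocks*bs, m/size);                    /* 9684 and 9682 vs 9683 */
  }
  return 0;
}

The same arithmetic for thermomech_TK with bs = 7 gives 102158/7 = 14594 blocks, i.e. exactly 51079 rows on each of the two ranks, which would explain why that case runs cleanly.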

On a different note: for unsymmetric matrices with MatMult, in addition to setting "-mat_nonsym", is there a different recommended technique for loading the matrices?

================

PETSc code for the symmetric case is as follows:

static char help[] = "Parallel SpMV--reads binary matrix file";

#include <petscmat.h>


#undef __FUNCT__
#define __FUNCT__ "main"
int main(int argc,char **args)
{

Vec            x,y; 
Mat            A; 
PetscViewer     fd;
PetscErrorCode ierr;
PetscMPIInt    rank, total_ranks;
char           filein[PETSC_MAX_PATH_LEN]; /* binary .dat matrix file */
PetscScalar    one = 1.0;
PetscScalar    zero = 0.0;
PetscInt       m, n;

 PetscInitialize(&argc,&args,(char *)0,help);
 MPI_Comm_rank(PETSC_COMM_WORLD,&rank);
 MPI_Comm_size(PETSC_COMM_WORLD,&total_ranks);
 PetscPrintf(PETSC_COMM_WORLD,"Total ranks is %d\n",total_ranks);
 
  ierr = PetscOptionsGetString(NULL,"-fin",filein,PETSC_MAX_PATH_LEN,NULL);CHKERRQ(ierr); /* matrix filename from the command line */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,filein,FILE_MODE_READ,&fd);CHKERRQ(ierr); /* hand it to a PETSc binary viewer */
 
 /*Matrix creating and loading*/
  ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
  ierr = MatSetType(A, MATMPIBAIJ);CHKERRQ(ierr);
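  /* The block size of the loaded BAIJ matrix is set with the
     -matload_block_size command-line option (bs = 2 or bs = 7 in the runs
     described above). */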
  ierr = MatLoad(A,fd);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&fd);CHKERRQ(ierr);

/* Vector setup */
  ierr = MatGetSize(A,&m,&n);CHKERRQ(ierr); /* global dimensions */

  ierr = VecCreate(PETSC_COMM_WORLD,&x);CHKERRQ(ierr);
  ierr = VecSetType(x,VECMPI);CHKERRQ(ierr);
  ierr = VecSetSizes(x,m/total_ranks,m);CHKERRQ(ierr); /* force local size instead of PETSC_DECIDE */
  ierr = VecSetFromOptions(x);CHKERRQ(ierr);

  ierr = VecCreate(PETSC_COMM_WORLD,&y);CHKERRQ(ierr);
  ierr = VecSetType(y,VECMPI);CHKERRQ(ierr);
  ierr = VecSetSizes(y,m/total_ranks,m);CHKERRQ(ierr); /* force local size instead of PETSC_DECIDE */
  ierr = VecSetFromOptions(y);CHKERRQ(ierr);

  ierr = VecSet(x,one);CHKERRQ(ierr);
  ierr = VecSet(y,zero); CHKERRQ(ierr);

/* SpMV */
  ierr = MatMult(A,x,y);CHKERRQ(ierr);

  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&y);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
 
  ierr = PetscFinalize();
  return 0;
}
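
In case it is useful context: the workaround I am experimenting with is to let the loaded matrix dictate the vector layouts instead of forcing m/total_ranks. A sketch of what would replace the VecSetSizes() calls above (my own guess at a fix, using MatGetLocalSize() to query the ownership MatLoad actually chose):

  PetscInt mlocal, nlocal;
  ierr = MatGetLocalSize(A,&mlocal,&nlocal);CHKERRQ(ierr); /* per-rank layout of A */
  ierr = VecSetSizes(x,nlocal,n);CHKERRQ(ierr); /* x conforms to the columns of A */
  ierr = VecSetSizes(y,mlocal,m);CHKERRQ(ierr); /* y conforms to the rows of A    */

I believe MatGetVecs(A,&x,&y) would accomplish the same thing in one call.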
