[petsc-users] Is MatSetUp required with MatCreateNormal and MatCreateMPIAIJWithSplitArrays?

Mihai Alexe malexe at vt.edu
Thu Nov 29 09:48:48 CST 2012


Hello all,

I am creating a large rectangular MPIAIJ matrix and then a shell normal matrix
(MatCreateNormal) that eventually gets passed to a KSP object; all of this is
part of a constrained least-squares solver.
The code looks as follows:

 // user.A_mat and user.Hess are PETSc Mat objects

 info = MatCreateMPIAIJWithSplitArrays( PETSC_COMM_WORLD, *locrow, *loccol,
                                        nrow, *ncol, onrowidx, oncolidx,
                                        (PetscScalar*) onvals, offrowidx, offcolidx,
                                        (PetscScalar*) values, &user.A_mat ); CHKERRQ(info);

 info = MatCreateNormal( user.A_mat, &user.Hess ); CHKERRQ(info);
 info = MatSetUp( user.Hess ); CHKERRQ(info);
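
For context, the Hessian shell matrix is eventually handed to the KSP roughly
as in the sketch below. This is not the exact code: the ksp, rhs and sol names
are placeholders, and I am assuming the PETSc 3.3-series interface with the
4-argument KSPSetOperators().

 KSP ksp;
 Vec rhs, sol;   /* assumed to already hold A^T*b and the solution/initial guess */

 info = KSPCreate( PETSC_COMM_WORLD, &ksp ); CHKERRQ(info);
 /* operate directly on the shell normal matrix A^T*A */
 info = KSPSetOperators( ksp, user.Hess, user.Hess, SAME_NONZERO_PATTERN ); CHKERRQ(info);
 info = KSPSetType( ksp, KSPCG ); CHKERRQ(info);
 info = KSPSetFromOptions( ksp ); CHKERRQ(info);
 info = KSPSolve( ksp, rhs, sol ); CHKERRQ(info);
 info = KSPDestroy( &ksp ); CHKERRQ(info);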

Is MatSetUp() required for A_mat or Hess to be initialized correctly? Or is
some preallocation call (e.g. MatMPIAIJSetPreallocation(), sketched below)
needed instead?
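
(For reference, the explicit-preallocation alternative I have in mind would
look roughly like the following sketch; d_nnz and o_nnz are per-row
diagonal/off-diagonal nonzero counts that I would still have to compute, so
those names are placeholders.)

 info = MatCreate( PETSC_COMM_WORLD, &user.A_mat ); CHKERRQ(info);
 info = MatSetSizes( user.A_mat, *locrow, *loccol, nrow, *ncol ); CHKERRQ(info);
 info = MatSetType( user.A_mat, MATMPIAIJ ); CHKERRQ(info);
 info = MatMPIAIJSetPreallocation( user.A_mat, 0, d_nnz, 0, o_nnz ); CHKERRQ(info);
 /* ... fill the local rows with MatSetValues() ... */
 info = MatAssemblyBegin( user.A_mat, MAT_FINAL_ASSEMBLY ); CHKERRQ(info);
 info = MatAssemblyEnd( user.A_mat, MAT_FINAL_ASSEMBLY ); CHKERRQ(info);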

My code crashes after printing the following output (run with -info -mat_view_info):

[0] PetscCommDuplicate(): Duplicating a communicator 47534399113024 67425648 max tags = 2147483647
[0] PetscCommDuplicate(): Duplicating a communicator 47534399112000 67760592 max tags = 2147483647
[0] MatCreate_SeqAIJ_Inode(): Not using Inode routines due to -mat_no_inode
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8920860 X 1508490; storage space: 0 unneeded,34572269 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 615
Matrix Object: 1 MPI processes
  type: seqaij
  rows=8920860, cols=1508490
  total: nonzeros=34572269, allocated nonzeros=0
  total number of mallocs used during MatSetValues calls =0
    not using I-node routines
[0] PetscCommDuplicate(): Using internal PETSc communicator 47534399112000 67760592
[0] MatCreate_SeqAIJ_Inode(): Not using Inode routines due to -mat_no_inode
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8920860 X 18752388; storage space: 0 unneeded,1762711 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 349
Matrix Object: 1 MPI processes
  type: seqaij
  rows=8920860, cols=18752388
  total: nonzeros=1762711, allocated nonzeros=0
  total number of mallocs used during MatSetValues calls =0
    not using I-node routines
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8920860 X 1508490; storage space: 0 unneeded,34572269 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 615
Matrix Object: 1 MPI processes
  type: seqaij
  rows=8920860, cols=1508490
  total: nonzeros=34572269, allocated nonzeros=0
  total number of mallocs used during MatSetValues calls =0
    not using I-node routines
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8920860 X 18752388; storage space: 0 unneeded,1762711 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 349
Matrix Object: 1 MPI processes
  type: seqaij
  rows=8920860, cols=18752388
  total: nonzeros=1762711, allocated nonzeros=0
  total number of mallocs used during MatSetValues calls =0
    not using I-node routines
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8920860 X 1508490; storage space: 0 unneeded,34572269 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 615
[0] PetscCommDuplicate(): Using internal PETSc communicator 47534399112000 67760592
[0] PetscCommDuplicate(): Using internal PETSc communicator 47534399112000 67760592
[0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter
[0] VecScatterCreate(): General case: MPI to Seq
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8920860 X 38109; storage space: 0 unneeded,1762711 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 349
Matrix Object: 160 MPI processes
  type: mpiaij
  rows=131858910, cols=18752388

The code ran just fine on a smaller (pruned) input dataset.
Unfortunately I don't get a stack trace, since I am running an optimized
(production) build; I am switching to a debug build now.


Regards,
Mihai

