[petsc-users] Is MatSetUp required with MatCreateNormal and MatCreateMPIAIJWithSplitArrays?

Mihai Alexe malexe at vt.edu
Fri Dec 14 04:59:56 CST 2012


Barry,

I've tracked down the problem.

I ran with -info -mat_view_info and FPEs enabled, and got a SIGFPE after
entering MatCreateMPIAIJWithSplitArrays (PETSc did not produce a stack trace,
unfortunately). The crash is due to a floating-point exception in a typecast
inside mat/interface/matrix.c, specifically the (PetscInt)info.nz_used cast in
this block:

      if (mat->ops->getinfo) {
        MatInfo info;
        ierr = MatGetInfo(mat,MAT_GLOBAL_SUM,&info);CHKERRQ(ierr);
        ierr = PetscViewerASCIIPrintf(viewer,"total: nonzeros=%D, allocated nonzeros=%D\n",(PetscInt)info.nz_used,(PetscInt)info.nz_allocated);CHKERRQ(ierr);
        ierr = PetscViewerASCIIPrintf(viewer,"total number of mallocs used during MatSetValues calls =%D\n",(PetscInt)info.mallocs);CHKERRQ(ierr);
      }
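
To make the failure mode concrete outside of PETSc, here is a minimal
standalone sketch of my own (the file name sketch.c is made up; it assumes a
glibc/x86 system and something like "gcc -O0 sketch.c -lm"): converting a
double larger than INT_MAX to a 32-bit int is undefined behavior. On x86 it
silently yields INT_MIN, and with the invalid-operation trap enabled the very
same conversion raises SIGFPE, which matches what I saw:

    /* sketch.c: double -> 32-bit int conversion of a value above INT_MAX */
    #define _GNU_SOURCE
    #include <fenv.h>
    #include <stdio.h>

    int main(void)
    {
      volatile double nz_used = 6.0e9;  /* ~6 billion nonzeros, well above INT_MAX */
      volatile int    nz_int;

      /* Traps disabled: the out-of-range conversion silently produces
         INT_MIN (-2147483648) on x86, the value printed in the Mat info. */
      nz_int = (int)nz_used;
      printf("silent overflow: %d\n", nz_int);

      /* Traps enabled (as when I run with FPEs on): the same conversion
         raises SIGFPE instead of returning a value. */
      feclearexcept(FE_ALL_EXCEPT);
      feenableexcept(FE_INVALID);
      nz_int = (int)nz_used;            /* SIGFPE here */
      printf("never reached: %d\n", nz_int);
      return 0;
    }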

My sparse matrix has about 6 billion nonzeros. When I disable FPEs, I get a
silent overflow when MatInfo.nz_used is converted from PetscLogDouble to the
(32-bit) PetscInt:

Matrix Object: 96 MPI processes
  type: mpiaij
  rows=131857963, cols=18752388
  total: nonzeros=-2147483648, allocated nonzeros=0

and the code runs just fine. Maybe PETSc should cast nz_used to a long int?
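
For what it's worth, here is a hedged sketch of the kind of change I have in
mind (my suggestion only, not an actual PETSc patch): widen the cast in the
PetscViewerASCIIPrintf call so that counts above 2^31 - 1 survive the
conversion, e.g.

        /* hypothetical change: cast to long long and print with %lld so nonzero
           counts above 2^31 - 1 are reported correctly even with a 32-bit PetscInt */
        ierr = PetscViewerASCIIPrintf(viewer,"total: nonzeros=%lld, allocated nonzeros=%lld\n",
                                      (long long)info.nz_used,(long long)info.nz_allocated);CHKERRQ(ierr);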


Mihai
On Thu, Nov 29, 2012 at 6:25 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
> On Nov 29, 2012, at 9:48 AM, Mihai Alexe <malexe at vt.edu> wrote:
>
> > Hello all,
> >
> > I am creating a large rectangular MPIAIJ matrix, then a shell
> NormalMatrix that eventually gets passed to a KSP object (all part of a
> constrained least-squares solver).
> > Code looks as follows:
> >
> >  //user.A_mat and user.Hess are PETSc Mat
> >
> >  info = MatCreateMPIAIJWithSplitArrays( PETSC_COMM_WORLD, *locrow, *loccol,
> >                                         nrow, *ncol, onrowidx, oncolidx,
> >                                         (PetscScalar*) onvals, offrowidx, offcolidx,
> >                                         (PetscScalar*) values, &user.A_mat ); CHKERRQ(info);
> >
> >  info = MatCreateNormal( user.A_mat, &user.Hess ); CHKERRQ(info);
> >  info = MatSetUp( user.Hess );
> >
> > Is MatSetUp() required for A or Hess to be initialized correctly? Or
> some call to MatSetPreallocation?
>
>    No, you shouldn't need them. Try with valgrind:
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>
>   Barry
>
> >
> > My code crashes after displaying (with -info -mat_view_info):
> >
> > [0] PetscCommDuplicate(): Duplicating a communicator 47534399113024
> 67425648 max tags = 2147483647
> > [0] PetscCommDuplicate(): Duplicating a communicator 47534399112000
> 67760592 max tags = 2147483647
> > [0] MatCreate_SeqAIJ_Inode(): Not using Inode routines due to
> -mat_no_inode
> > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8920860 X 1508490; storage
> space: 0 unneeded,34572269 used
> > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 615
> > Matrix Object: 1 MPI processes
> >   type: seqaij
> >   rows=8920860, cols=1508490
> >   total: nonzeros=34572269, allocated nonzeros=0
> >   total number of mallocs used during MatSetValues calls =0
> >     not using I-node routines
> > [0] PetscCommDuplicate(): Using internal PETSc communicator
> 47534399112000 67760592
> > [0] MatCreate_SeqAIJ_Inode(): Not using Inode routines due to
> -mat_no_inode
> > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8920860 X 18752388; storage
> space: 0 unneeded,1762711 used
> > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 349
> > Matrix Object: 1 MPI processes
> >   type: seqaij
> >   rows=8920860, cols=18752388
> >   total: nonzeros=1762711, allocated nonzeros=0
> >   total number of mallocs used during MatSetValues calls =0
> >     not using I-node routines
> > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8920860 X 1508490; storage
> space: 0 unneeded,34572269 used
> > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 615
> > Matrix Object: 1 MPI processes
> >   type: seqaij
> >   rows=8920860, cols=1508490
> >   total: nonzeros=34572269, allocated nonzeros=0
> >   total number of mallocs used during MatSetValues calls =0
> >     not using I-node routines
> > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8920860 X 18752388; storage
> space: 0 unneeded,1762711 used
> > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 349
> > Matrix Object: 1 MPI processes
> >   type: seqaij
> >   rows=8920860, cols=18752388
> >   total: nonzeros=1762711, allocated nonzeros=0
> >   total number of mallocs used during MatSetValues calls =0
> >     not using I-node routines
> > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8920860 X 1508490; storage
> space: 0 unneeded,34572269 used
> > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 615
> > [0] PetscCommDuplicate(): Using internal PETSc communicator
> 47534399112000 67760592
> > [0] PetscCommDuplicate(): Using internal PETSc communicator
> 47534399112000 67760592
> > [0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter
> > [0] VecScatterCreate(): General case: MPI to Seq
> > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8920860 X 38109; storage
> space: 0 unneeded,1762711 used
> > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 349
> > Matrix Object: 160 MPI processes
> >   type: mpiaij
> >   rows=131858910, cols=18752388
> >
> > The code ran just fine on a smaller (pruned) input dataset.
> > I don't get a stack trace, unfortunately... (I am running in production mode
> > and am trying to switch to debug mode now).
> >
> >
> > Regards,
> > Mihai
> >
>
>