[petsc-users] Is MatSetUp required with MatCreateNormal and MatCreateMPIAIJWithSplitArrays?

Satish Balay balay at mcs.anl.gov
Fri Dec 14 10:26:03 CST 2012


pushed https://bitbucket.org/petsc/petsc-3.3/commits/6dac937a3eace3b81d6dcbb945ee7a85

Satish

On Fri, 14 Dec 2012, Barry Smith wrote:

> 
>    Mihai,
> 
>     Thanks for tracking down the problem. As a side note, you are getting close to exhausting the range of a 32-bit int with your matrix row/column sizes; once your matrix sizes exceed 2^{31}-1 you will need to configure PETSc with --with-64-bit-indices so that PETSc uses long long int for PetscInt.
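> 
>     As a rough illustration (a sketch only, not taken from this thread), a small program like the following reports which index width a given PETSc build was configured with:
> 
>       #include <petscsys.h>
>       /* Sketch: print the PetscInt width; PETSC_USE_64BIT_INDICES is defined
>          when PETSc was configured with --with-64-bit-indices. */
>       int main(int argc,char **argv)
>       {
>         PetscErrorCode ierr;
>         ierr = PetscInitialize(&argc,&argv,PETSC_NULL,PETSC_NULL);if (ierr) return ierr;
>       #if defined(PETSC_USE_64BIT_INDICES)
>         ierr = PetscPrintf(PETSC_COMM_WORLD,"PetscInt: %d bytes (64-bit indices)\n",(int)sizeof(PetscInt));CHKERRQ(ierr);
>       #else
>         ierr = PetscPrintf(PETSC_COMM_WORLD,"PetscInt: %d bytes (32-bit indices, max ~2^31-1)\n",(int)sizeof(PetscInt));CHKERRQ(ierr);
>       #endif
>         ierr = PetscFinalize();
>         return 0;
>       }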
> 
>    Satish,
> 
>      Could you please patch 3.3, replacing the use of %D with %lld and the (PetscInt) casts with (long long int) casts in these two lines:
>       ierr = PetscViewerASCIIPrintf(viewer,"total: nonzeros=%D, allocated nonzeros=%D\n",(PetscInt)info.nz_used,(PetscInt)info.nz_allocated);CHKERRQ(ierr);
>         ierr = PetscViewerASCIIPrintf(viewer,"total number of mallocs used during MatSetValues calls =%D\n",(PetscInt)info.mallocs);CHKERRQ(ierr);
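> 
>     For reference, the patched lines would then read roughly as follows (the exact commit may differ slightly):
> 
>       ierr = PetscViewerASCIIPrintf(viewer,"total: nonzeros=%lld, allocated nonzeros=%lld\n",(long long int)info.nz_used,(long long int)info.nz_allocated);CHKERRQ(ierr);
>       ierr = PetscViewerASCIIPrintf(viewer,"total number of mallocs used during MatSetValues calls =%lld\n",(long long int)info.mallocs);CHKERRQ(ierr);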
> 
>   Thanks
> 
>    Barry
> 
> On Dec 14, 2012, at 4:59 AM, Mihai Alexe <malexe at vt.edu> wrote:
> 
> > Barry,
> > 
> > I've tracked down the problem.
> > 
> > I ran with -info -mat_view_info and FPEs enabled, and got a SIGFPE after entering MatCreateMPIAIJWithSplitArrays (PETSc did not produce a stack trace, unfortunately). This was due to a floating-point exception in a typecast inside mat/interface/matrix.c:
> > 
> >       if (mat->ops->getinfo) {
> >         MatInfo info;
> >         ierr = MatGetInfo(mat,MAT_GLOBAL_SUM,&info);CHKERRQ(ierr);
> >         ierr = PetscViewerASCIIPrintf(viewer,"total: nonzeros=%D, allocated nonzeros=%D\n",(PetscInt)info.nz_used,(PetscInt)info.nz_allocated);CHKERRQ(ierr);
> >         ierr = PetscViewerASCIIPrintf(viewer,"total number of mallocs used during MatSetValues calls =%D\n",(PetscInt)info.mallocs);CHKERRQ(ierr);
> >       }
> > 
> > My sparse matrix has about 6 billion nonzeros. When I disable FPEs, I get a silent overflow when converting MatInfo.nz_used from PetscLogDouble to (32-bit) PetscInt:
> > 
> > Matrix Object: 96 MPI processes
> >   type: mpiaij
> >   rows=131857963, cols=18752388
> >   total: nonzeros=-2147483648, allocated nonzeros=0 
> > 
> > and the code runs just fine. Maybe PETSc should cast nz_used to a long int?
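> > 
> > The overflow can be reproduced outside PETSc: casting an out-of-range double to a 32-bit int is undefined behavior in C, and on x86 the conversion typically yields INT_MIN (-2147483648) while also raising the "invalid operation" floating-point exception, which is why the SIGFPE appears once FPE trapping is enabled. A standalone sketch:
> > 
> >       #include <stdio.h>
> >       /* Illustration only: mimics casting MatInfo.nz_used (a double) to a 32-bit int. */
> >       int main(void)
> >       {
> >         double nz_used   = 6.0e9;           /* ~6 billion nonzeros, beyond 2^31-1            */
> >         int    truncated = (int)nz_used;    /* out of range: UB; x86 typically gives INT_MIN */
> >         printf("nz_used = %.0f -> (int)nz_used = %d\n", nz_used, truncated);
> >         return 0;
> >       }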
> > 
> > 
> > Mihai
> > On Thu, Nov 29, 2012 at 6:25 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> > 
> > On Nov 29, 2012, at 9:48 AM, Mihai Alexe <malexe at vt.edu> wrote:
> > 
> > > Hello all,
> > >
> > > I am creating a large rectangular MPIAIJ matrix, then a shell NormalMatrix that eventually gets passed to a KSP object (all part of a constrained least-squares solver).
> > > Code looks as follows:
> > >
> > >  //user.A_mat and user.Hess are PETSc Mat
> > >
> > >  info = MatCreateMPIAIJWithSplitArrays( PETSC_COMM_WORLD, *locrow, *loccol, nrow,
> > >                                 *ncol, onrowidx, oncolidx,
> > >                                 (PetscScalar*) onvals, offrowidx, offcolidx,
> > >                                 (PetscScalar*) values, &user.A_mat ); CHKERRQ(info);
> > >
> > >  info = MatCreateNormal( user.A_mat, &user.Hess ); CHKERRQ(info);
> > >  info = MatSetUp( user.Hess ); CHKERRQ(info);
> > >
> > > Is MatSetUp() required for A or Hess to be initialized correctly? Or some call to MatSetPreallocation?
> > 
> >    No, you shouldn't need them. Try with valgrind: http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
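> > 
> >    In other words, a pattern along these lines should be enough (sketch only, reusing the variable names from your snippet; ksp is an assumed KSP object, and KSPSetOperators uses the petsc-3.3 signature):
> > 
> >      /* user.A_mat from MatCreateMPIAIJWithSplitArrays() is already assembled, and
> >         MatCreateNormal() returns a ready-to-use shell matrix applying A^T A. */
> >      KSP ksp;
> >      info = MatCreateNormal( user.A_mat, &user.Hess ); CHKERRQ(info);   /* no MatSetUp() needed */
> >      info = KSPCreate( PETSC_COMM_WORLD, &ksp ); CHKERRQ(info);
> >      info = KSPSetOperators( ksp, user.Hess, user.Hess, DIFFERENT_NONZERO_PATTERN ); CHKERRQ(info);
> > 
> >    Since the normal matrix is a shell matrix, pair it with a preconditioner that does not need explicit matrix entries (e.g. -pc_type none).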
> > 
> >   Barry
> > 
> > >
> > > My code crashes after displaying (with -info -mat_view_info):
> > >
> > > [0] PetscCommDuplicate(): Duplicating a communicator 47534399113024 67425648 max tags = 2147483647
> > > [0] PetscCommDuplicate(): Duplicating a communicator 47534399112000 67760592 max tags = 2147483647
> > > [0] MatCreate_SeqAIJ_Inode(): Not using Inode routines due to -mat_no_inode
> > > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8920860 X 1508490; storage space: 0 unneeded,34572269 used
> > > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> > > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 615
> > > Matrix Object: 1 MPI processes
> > >   type: seqaij
> > >   rows=8920860, cols=1508490
> > >   total: nonzeros=34572269, allocated nonzeros=0
> > >   total number of mallocs used during MatSetValues calls =0
> > >     not using I-node routines
> > > [0] PetscCommDuplicate(): Using internal PETSc communicator 47534399112000 67760592
> > > [0] MatCreate_SeqAIJ_Inode(): Not using Inode routines due to -mat_no_inode
> > > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8920860 X 18752388; storage space: 0 unneeded,1762711 used
> > > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> > > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 349
> > > Matrix Object: 1 MPI processes
> > >   type: seqaij
> > >   rows=8920860, cols=18752388
> > >   total: nonzeros=1762711, allocated nonzeros=0
> > >   total number of mallocs used during MatSetValues calls =0
> > >     not using I-node routines
> > > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8920860 X 1508490; storage space: 0 unneeded,34572269 used
> > > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> > > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 615
> > > Matrix Object: 1 MPI processes
> > >   type: seqaij
> > >   rows=8920860, cols=1508490
> > >   total: nonzeros=34572269, allocated nonzeros=0
> > >   total number of mallocs used during MatSetValues calls =0
> > >     not using I-node routines
> > > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8920860 X 18752388; storage space: 0 unneeded,1762711 used
> > > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> > > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 349
> > > Matrix Object: 1 MPI processes
> > >   type: seqaij
> > >   rows=8920860, cols=18752388
> > >   total: nonzeros=1762711, allocated nonzeros=0
> > >   total number of mallocs used during MatSetValues calls =0
> > >     not using I-node routines
> > > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8920860 X 1508490; storage space: 0 unneeded,34572269 used
> > > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> > > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 615
> > > [0] PetscCommDuplicate(): Using internal PETSc communicator 47534399112000 67760592
> > > [0] PetscCommDuplicate(): Using internal PETSc communicator 47534399112000 67760592
> > > [0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter
> > > [0] VecScatterCreate(): General case: MPI to Seq
> > > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 8920860 X 38109; storage space: 0 unneeded,1762711 used
> > > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> > > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 349
> > > Matrix Object: 160 MPI processes
> > >   type: mpiaij
> > >   rows=131858910, cols=18752388
> > >
> > > The code ran just fine on a smaller (pruned) input dataset.
> > > I don't get a stack trace, unfortunately (running in production mode; trying to switch to debug mode now).
> > >
> > >
> > > Regards,
> > > Mihai
> > >
> > 
> > 
> 
> 


