[petsc-users] preallocation after DMCreateMatrix?

Matthew Knepley knepley at gmail.com
Wed Nov 29 05:46:44 CST 2017


On Wed, Nov 29, 2017 at 2:26 AM, Matteo Semplice <matteo.semplice at unito.it> wrote:

> On 25/11/2017 02:05, Matthew Knepley wrote:
>
> On Fri, Nov 24, 2017 at 4:21 PM, Matteo Semplice <matteo.semplice at unito.it> wrote:
>
>> Hi.
>>
>> The manual for DMCreateMatrix says "Notes: This properly preallocates the
>> number of nonzeros in the sparse matrix so you do not need to do it
>> yourself", so I got the impression that one does not need to call the
>> preallocation routines on the matrix. Indeed, in most examples listed in
>> the manual page for DMCreateMatrix no preallocation is done, or (as in KSP
>> tutorial ex4) the preallocation routine is called with 0 entries per row.
>>
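
For reference, the pattern those Notes describe would be roughly the minimal
sketch below (assuming a 2d DMDA named da and a PetscErrorCode ierr already in
scope; the only point is that no MatSeqAIJSetPreallocation /
MatMPIAIJSetPreallocation call appears between DMCreateMatrix and
MatSetValuesStencil):

  Mat         A;
  MatStencil  row, col;
  PetscScalar v = 1.0;

  ierr = DMCreateMatrix(da, &A);CHKERRQ(ierr);  /* preallocated by the DM  */
  row.i = 1; row.j = 1;                         /* a point in the interior */
  col.i = 2; col.j = 1;                         /* one of its neighbours   */
  ierr = MatSetValuesStencil(A, 1, &row, 1, &col, &v, INSERT_VALUES);CHKERRQ(ierr);
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
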
>> However, if I read a mesh into a DMPlex or create a DMDA and then call
>> DMCreateMatrix, the resulting matrix errors out when I call MatSetValues. I
>> have currently followed the suggestion in the error message and call
>> MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE), but I'd
>> like to fix this properly.
>>
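
As noted above, that workaround is only a matter of placement; a minimal
sketch, with A being the matrix returned by DMCreateMatrix:

  ierr = DMCreateMatrix(dm, &A);CHKERRQ(ierr);
  /* accept entries outside the preallocated pattern instead of erroring out */
  ierr = MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE);CHKERRQ(ierr);

but it only hides the symptom: each nonzero outside the preallocated pattern
still triggers a malloc during assembly.
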
>
> It sounds like your nonzero pattern does not obey the topology. What
> nonzero pattern are you trying to input?
>
>
> Hi.
>
>     The problem with the DMDA was a bug in our code. Sorry for the noise.
>
> On the other hand, with the DMPlex I still experience problems. It's an FV
> code and I reduced it to the simple case of a Laplacian operator: I need
> off-diagonal entries at (i,j) whenever cell i and cell j share a common
> face.
> Below is my code, which reads in a grid of 4 cells (the unit square divided
> by its diagonals), creates a section with 1 dof per cell, creates a matrix,
> and assembles it.
>
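
Incidentally, the stencil you describe can be filled with a loop over faces.
This is only a sketch of what I would expect assembleMatrix to do, assuming a
serial run, the default section created below, dm/A/ierr in scope, and
placeholder coefficients:

  PetscSection sec;
  PetscInt     fStart, fEnd, f;

  ierr = DMGetDefaultSection(dm, &sec);CHKERRQ(ierr);
  ierr = DMPlexGetHeightStratum(dm, 1, &fStart, &fEnd);CHKERRQ(ierr);  /* faces */
  for (f = fStart; f < fEnd; ++f) {
    const PetscInt *cells;
    PetscInt        nc;

    ierr = DMPlexGetSupportSize(dm, f, &nc);CHKERRQ(ierr);
    ierr = DMPlexGetSupport(dm, f, &cells);CHKERRQ(ierr);
    if (nc == 2) {                                   /* interior face: couple the two cells */
      PetscInt    rows[2];
      PetscScalar vals[4] = {1.0, -1.0, -1.0, 1.0};  /* placeholder coefficients */

      ierr = PetscSectionGetOffset(sec, cells[0], &rows[0]);CHKERRQ(ierr);
      ierr = PetscSectionGetOffset(sec, cells[1], &rows[1]);CHKERRQ(ierr);
      ierr = MatSetValues(A, 2, rows, 2, rows, vals, ADD_VALUES);CHKERRQ(ierr);
    }
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

Every off-diagonal entry set here corresponds to a pair of cells sharing a
face, which is exactly the pattern the FV adjacency should preallocate.
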

>   ierr = DMPlexCreateFromFile(PETSC_COMM_WORLD, "square1.msh", PETSC_TRUE, &dm);CHKERRQ(ierr);
>

Can you show me the mesh?

         ierr = DMViewFromOptions(dm, NULL, "-dm_view");CHKERRQ(ierr);

and then run with -dm_view ::ascii_info_detail
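
For the test in your mail that would be, e.g.:

  ./mplexMatrix -dm_view ::ascii_info_detail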


>   ierr = DMPlexSetAdjacencyUseCone(dm, PETSC_TRUE); CHKERRQ(ierr);
>   ierr = DMPlexSetAdjacencyUseClosure(dm, PETSC_FALSE); CHKERRQ(ierr);
>

This looks like the right FVM adjacency, but your matrix appears to be
diagonal below. TS ex11 has an identical call and produces the correct
matrix, which is why I want to look at your mesh.
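
One more quick check you could do: print how many nonzeros were actually
preallocated right after DMCreateMatrix. A minimal sketch (the placement is
just a suggestion, using your dm and A):

  MatInfo info;

  ierr = DMCreateMatrix(dm, &A);CHKERRQ(ierr);
  ierr = MatGetInfo(A, MAT_LOCAL, &info);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_SELF, "preallocated nonzeros: %g\n", (double)info.nz_allocated);CHKERRQ(ierr);

With the face adjacency picked up, your 4-cell mesh should report 12
(4 diagonal entries plus 2 for each of the 4 shared faces); a value of 4
would confirm that only the diagonal was preallocated.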

  Thanks,

     Matt


>
>   ierr = DMPlexComputeGeometryFVM(dm, &ctx.cellgeom, &ctx.facegeom); CHKERRQ(ierr);
>   PetscInt nFields = 1, numComp[1];
>   ierr = DMGetDimension(dm, &ctx.dim);CHKERRQ(ierr);
>   /* u has 1 dof on each cell */
>   numComp[0] = 1;
>   PetscInt *numDofs;
>   ierr = PetscCalloc1(nFields*(ctx.dim+1), &numDofs); CHKERRQ(ierr);
>   numDofs[0*(ctx.dim+1)+ctx.dim] = 1;
>   /* Create a PetscSection with this data layout */
>   PetscSection   section;
>   DMPlexCreateSection(dm, ctx.dim, nFields, numComp, numDofs, 0, NULL, NULL, NULL, NULL, &section);
>   ierr = PetscFree(numDofs); CHKERRQ(ierr);
>   PetscSectionSetFieldName(section, 0, "u");
>   /* Tell the DM to use this data layout */
>   DMSetDefaultSection(dm, section);
>   PetscSectionDestroy(&section);
>   /* Print out the list of cells */
>   printCells(dm,ctx);
>   //Assemble the system
>   Mat A;
>   PetscPrintf(PETSC_COMM_WORLD,"Creating matrix\n");
>   ierr = DMCreateMatrix(dm,&A);CHKERRQ(ierr);
>   ierr = MatSetOption(A,MAT_SYMMETRIC,PETSC_TRUE);CHKERRQ(ierr);
>   //ierr = MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE);CHKERRQ(ierr);
>   PetscPrintf(PETSC_COMM_WORLD,"Assembling matrix\n");
>   ierr = assembleMatrix(dm,ctx,A);CHKERRQ(ierr);
>
>
> An error is generated when the assembleMatrix function tries to insert an
> element at position (0,1) of the matrix. When running with the -mat_view
> option I get the following output.
>
> matteo:~/software/petscMplex$ ./mplexMatrix -mat_view
> 0] The cells are the nodes from 0 to 3 in the DMPlex
> 0]  cell 0 has centroid (0.166667, 0.500000) and volume 0.250000
> 0]       has 3 faces: 9 10 11
> 0]  cell 1 has centroid (0.500000, 0.166667) and volume 0.250000
> 0]       has 3 faces: 12 13 9
> 0]  cell 2 has centroid (0.500000, 0.833333) and volume 0.250000
> 0]       has 3 faces: 10 14 15
> 0]  cell 3 has centroid (0.833333, 0.500000) and volume 0.250000
> 0]       has 3 faces: 14 13 16
> Creating matrix
> Mat Object: 1 MPI processes
>   type: seqaij
> row 0: (0, 0.)
> row 1: (1, 0.)
> row 2: (2, 0.)
> row 3: (3, 0.)
> Assembling matrix
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Argument out of range
> [0]PETSC ERROR: New nonzero at (0,1) caused a malloc
> Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn
> off this check
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.7.5, Jan, 01, 2017
> [0]PETSC ERROR: ./mplexMatrix on a x86_64-linux-gnu-real named signalkuppe
> by matteo Wed Nov 29 09:10:12 2017
> [0]PETSC ERROR: Configure options --build=x86_64-linux-gnu --prefix=/usr
> --includedir=${prefix}/include --mandir=${prefix}/share/man
> --infodir=${prefix}/share/info --sysconfdir=/etc --localstatedir=/var
> --with-silent-rules=0 --libdir=${prefix}/lib/x86_64-linux-gnu
> --libexecdir=${prefix}/lib/x86_64-linux-gnu --with-maintainer-mode=0
> --with-dependency-tracking=0 --with-debugging=0 --shared-library-extension=_real
> --with-clanguage=C++ --with-shared-libraries --with-pic=1 --useThreads=0
> --with-fortran-interfaces=1 --with-mpi-dir=/usr/lib/x86_64-linux-gnu/openmpi
> --with-blas-lib=-lblas --with-lapack-lib=-llapack --with-blacs=1
> --with-blacs-lib="-lblacsCinit-openmpi -lblacs-openmpi"
> --with-scalapack=1 --with-scalapack-lib=-lscalapack-openmpi
> --with-mumps=1 --with-mumps-include="[]" --with-mumps-lib="-ldmumps
> -lzmumps -lsmumps -lcmumps -lmumps_common -lpord" --with-suitesparse=1
> --with-suitesparse-include=/usr/include/suitesparse
> --with-suitesparse-lib="-lumfpack -lamd -lcholmod -lklu" --with-spooles=1
> --with-spooles-include=/usr/include/spooles --with-spooles-lib=-lspooles
> --with-ptscotch=1 --with-ptscotch-include=/usr/include/scotch
> --with-ptscotch-lib="-lptesmumps -lptscotch -lptscotcherr" --with-fftw=1
> --with-fftw-include="[]" --with-fftw-lib="-lfftw3 -lfftw3_mpi"
> --with-superlu=1 --with-superlu-include=/usr/include/superlu
> --with-superlu-lib=-lsuperlu --with-hdf5=1 --with-hdf5-dir=/usr/lib/x86_64-linux-gnu/hdf5/openmpi
> --CXX_LINKER_FLAGS=-Wl,--no-as-needed --with-hypre=1
> --with-hypre-include=/usr/include/hypre --with-hypre-lib="-lHYPRE_IJ_mv
> -lHYPRE_parcsr_ls -lHYPRE_sstruct_ls -lHYPRE_sstruct_mv -lHYPRE_struct_ls
> -lHYPRE_struct_mv -lHYPRE_utilities" --prefix=/usr/lib/petscdir/3.7.5/x86_64-linux-gnu-real
> PETSC_DIR=/build/petsc-XG7COe/petsc-3.7.5+dfsg1
> --PETSC_ARCH=x86_64-linux-gnu-real CFLAGS="-g -O2
> -fdebug-prefix-map=/build/petsc-XG7COe/petsc-3.7.5+dfsg1=.
> -fstack-protector-strong -Wformat -Werror=format-security -fPIC"
> CXXFLAGS="-g -O2 -fdebug-prefix-map=/build/petsc-XG7COe/petsc-3.7.5+dfsg1=.
> -fstack-protector-strong -Wformat -Werror=format-security -fPIC"
> FCFLAGS="-g -O2 -fdebug-prefix-map=/build/petsc-XG7COe/petsc-3.7.5+dfsg1=.
> -fstack-protector-strong -fPIC" FFLAGS="-g -O2 -fdebug-prefix-map=/build/
> petsc-XG7COe/petsc-3.7.5+dfsg1=. -fstack-protector-strong -fPIC"
> CPPFLAGS="-Wdate-time -D_FORTIFY_SOURCE=2" LDFLAGS="-Wl,-z,relro -fPIC"
> MAKEFLAGS=w
> [0]PETSC ERROR: #1 MatSetValues_SeqAIJ() line 485 in
> /build/petsc-XG7COe/petsc-3.7.5+dfsg1/src/mat/impls/aij/seq/aij.c
> [0]PETSC ERROR: #2 MatSetValues() line 1190 in
> /build/petsc-XG7COe/petsc-3.7.5+dfsg1/src/mat/interface/matrix.c
> [0]PETSC ERROR: #3 assembleMatrix() line 150 in /home/matteo/software/
> petscMplex/mplexMatrix.cpp
> [0]PETSC ERROR: #4 main() line 202 in /home/matteo/software/
> petscMplex/mplexMatrix.cpp
> [0]PETSC ERROR: PETSc Option Table entries:
> [0]PETSC ERROR: -mat_view
> [0]PETSC ERROR: ----------------End of Error Message -------send entire
> error message to petsc-maint at mcs.anl.gov----------
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 63.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
>
> I guess that the matrix is created with only diagonal entries... ?
>
> (The same happens with 3.8.0 from debian experimental)
>
> Thanks,
>     Matteo
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/