<html>
  <head>
    <meta http-equiv="Content-Type" content="text/html; charset=utf-8">
  </head>
  <body text="#000000" bgcolor="#FFFFFF">
    On 25/11/2017 02:05, Matthew Knepley wrote:<br>
    <blockquote type="cite"
cite="mid:CAMYG4GkRGmpsq9a7Ovhf8f4mjNWCSg1-rEwZVpWOUjmcT=JnKQ@mail.gmail.com">
      <div dir="ltr">
        <div class="gmail_extra">
          <div class="gmail_quote">On Fri, Nov 24, 2017 at 4:21 PM,
            Matteo Semplice <span dir="ltr"><<a
                href="mailto:matteo.semplice@unito.it" target="_blank"
                moz-do-not-send="true">matteo.semplice@unito.it</a>></span>
            wrote:<br>
            <blockquote class="gmail_quote" style="margin:0 0 0
              .8ex;border-left:1px #ccc solid;padding-left:1ex">Hi.<br>
              <br>
              The manual page for DMCreateMatrix says "Notes: This properly
              preallocates the number of nonzeros in the sparse matrix
              so you do not need to do it yourself", so I got the
              impression that one does not need to call the
              preallocation routine for the matrix. Indeed, in most
              examples listed in the manual page for DMCreateMatrix this
              is not done, or (as in KSP tutorial ex4) it is called with 0
              entries per row.<br>
              <br>
              However, if I read a mesh into a DMPlex or create a DMDA
              and then call DMCreateMatrix, the resulting matrix errors
              out when I call MatSetValues. I have currently followed
              the suggestion of the error message and call
              MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR,
              PETSC_FALSE), but I'd like to fix this properly.<br>
            </blockquote>
            <div><br>
            </div>
            <div>It sounds like your nonzero pattern does not obey the
              topology. What nonzero pattern are you trying to input?</div>
          </div>
        </div>
      </div>
    </blockquote>
    <br>
    Hi.<br>
    <br>
        The problem with the DMDA was a bug in our code. Sorry for the
    noise.<br>
    <br>
    On the other hand, with the DMPlex I still experience problems.
    It's an FV code, and I have reduced it to the simple case of a
    Laplacian operator: I need off-diagonal entries at (i,j) whenever
    cell i and cell j share a face.<br>
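    Concretely, the insertion pattern in assembleMatrix amounts to the
    following (a simplified sketch; the real flux coefficients are replaced
    by placeholders, and the cell plex points are used directly as matrix
    rows, which is valid here because the run is serial and the cells are
    points 0-3):<br>
    <pre>
  /* Sketch of the insertion pattern: loop over faces, find the two cells
     sharing each interior face, and add the two-point-flux couplings.
     The values below are placeholders for the actual flux coefficients. */
  PetscInt fStart, fEnd, f;
  ierr = DMPlexGetHeightStratum(dm, 1, &fStart, &fEnd);CHKERRQ(ierr); /* height 1 = faces */
  for (f = fStart; f < fEnd; ++f) {
    const PetscInt *cells;
    PetscInt        supportSize;
    ierr = DMPlexGetSupportSize(dm, f, &supportSize);CHKERRQ(ierr);
    if (supportSize != 2) continue;  /* boundary face: only one cell */
    ierr = DMPlexGetSupport(dm, f, &cells);CHKERRQ(ierr);
    const PetscScalar vals[4] = { 1.0, -1.0,
                                 -1.0,  1.0 }; /* placeholder coefficients */
    ierr = MatSetValues(A, 2, cells, 2, cells, vals, ADD_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    </pre>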
    Below is the code that reads in a grid of 4 cells (the unit square
    divided by its diagonals), creates a section with 1 dof per cell,
    creates a matrix, and assembles it.<br>
    <pre>
  ierr = DMPlexCreateFromFile(PETSC_COMM_WORLD, "square1.msh", PETSC_TRUE, &dm);CHKERRQ(ierr);
  ierr = DMPlexSetAdjacencyUseCone(dm, PETSC_TRUE);CHKERRQ(ierr);
  ierr = DMPlexSetAdjacencyUseClosure(dm, PETSC_FALSE);CHKERRQ(ierr);
  ierr = DMPlexComputeGeometryFVM(dm, &ctx.cellgeom, &ctx.facegeom);CHKERRQ(ierr);
  PetscInt nFields = 1, numComp[1];
  ierr = DMGetDimension(dm, &ctx.dim);CHKERRQ(ierr);
  /* u has 1 dof on each cell */
  numComp[0] = 1;
  PetscInt *numDofs;
  ierr = PetscCalloc1(nFields*(ctx.dim+1), &numDofs);CHKERRQ(ierr);
  numDofs[0*(ctx.dim+1)+ctx.dim] = 1;
  /* Create a PetscSection with this data layout */
  PetscSection section;
  ierr = DMPlexCreateSection(dm, ctx.dim, nFields, numComp, numDofs, 0, NULL, NULL, NULL, NULL, &section);CHKERRQ(ierr);
  ierr = PetscFree(numDofs);CHKERRQ(ierr);
  ierr = PetscSectionSetFieldName(section, 0, "u");CHKERRQ(ierr);
  /* Tell the DM to use this data layout */
  ierr = DMSetDefaultSection(dm, section);CHKERRQ(ierr);
  ierr = PetscSectionDestroy(&section);CHKERRQ(ierr);
  /* Print out the list of cells */
  printCells(dm, ctx);
  /* Assemble the system */
  Mat A;
  ierr = PetscPrintf(PETSC_COMM_WORLD, "Creating matrix\n");CHKERRQ(ierr);
  ierr = DMCreateMatrix(dm, &A);CHKERRQ(ierr);
  ierr = MatSetOption(A, MAT_SYMMETRIC, PETSC_TRUE);CHKERRQ(ierr);
  //ierr = MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD, "Assembling matrix\n");CHKERRQ(ierr);
  ierr = assembleMatrix(dm, ctx, A);CHKERRQ(ierr);
    </pre>
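    As a sanity check on the two adjacency calls above, one can ask the
    DMPlex directly which points it considers adjacent to a cell; with
    useCone=TRUE and useClosure=FALSE I expect cell 0 to list itself plus
    its face neighbours, cells 1 and 2 (a small diagnostic sketch):<br>
    <pre>
  /* Diagnostic sketch: print the adjacency that DMCreateMatrix should be
     using for cell 0. With FV-style adjacency (cone=TRUE, closure=FALSE)
     this should be {0, 1, 2} for the mesh printed below. */
  PetscInt  adjSize = PETSC_DETERMINE, a;
  PetscInt *adj     = NULL;
  ierr = DMPlexGetAdjacency(dm, 0, &adjSize, &adj);CHKERRQ(ierr);
  for (a = 0; a < adjSize; ++a) {
    ierr = PetscPrintf(PETSC_COMM_SELF, "cell 0 adjacent to point %D\n", adj[a]);CHKERRQ(ierr);
  }
  ierr = PetscFree(adj);CHKERRQ(ierr);
    </pre>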
    <br>
    An error is generated when the assembleMatrix function tries to
    insert an element at position (0,1) of the matrix. When running with
    the -mat_view option I get the following output.<br>
    <pre>
matteo:~/software/petscMplex$ ./mplexMatrix -mat_view
0] The cells are nodes 0 to 3 in the DMPlex
0]  cell 0 has centroid (0.166667, 0.500000) and volume 0.250000
0]       has 3 faces: 9 10 11
0]  cell 1 has centroid (0.500000, 0.166667) and volume 0.250000
0]       has 3 faces: 12 13 9
0]  cell 2 has centroid (0.500000, 0.833333) and volume 0.250000
0]       has 3 faces: 10 14 15
0]  cell 3 has centroid (0.833333, 0.500000) and volume 0.250000
0]       has 3 faces: 14 13 16
Creating matrix
Mat Object: 1 MPI processes
  type: seqaij
row 0: (0, 0.)
row 1: (1, 0.)
row 2: (2, 0.)
row 3: (3, 0.)
Assembling matrix
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Argument out of range
[0]PETSC ERROR: New nonzero at (0,1) caused a malloc
Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.7.5, Jan, 01, 2017
[0]PETSC ERROR: ./mplexMatrix on a x86_64-linux-gnu-real named signalkuppe by matteo Wed Nov 29 09:10:12 2017
[0]PETSC ERROR: Configure options --build=x86_64-linux-gnu --prefix=/usr
  --includedir=${prefix}/include --mandir=${prefix}/share/man
  --infodir=${prefix}/share/info --sysconfdir=/etc --localstatedir=/var
  --with-silent-rules=0 --libdir=${prefix}/lib/x86_64-linux-gnu
  --libexecdir=${prefix}/lib/x86_64-linux-gnu --with-maintainer-mode=0
  --with-dependency-tracking=0 --with-debugging=0
  --shared-library-extension=_real --with-clanguage=C++
  --with-shared-libraries --with-pic=1 --useThreads=0
  --with-fortran-interfaces=1
  --with-mpi-dir=/usr/lib/x86_64-linux-gnu/openmpi
  --with-blas-lib=-lblas --with-lapack-lib=-llapack --with-blacs=1
  --with-blacs-lib="-lblacsCinit-openmpi -lblacs-openmpi"
  --with-scalapack=1 --with-scalapack-lib=-lscalapack-openmpi
  --with-mumps=1 --with-mumps-include="[]"
  --with-mumps-lib="-ldmumps -lzmumps -lsmumps -lcmumps -lmumps_common -lpord"
  --with-suitesparse=1 --with-suitesparse-include=/usr/include/suitesparse
  --with-suitesparse-lib="-lumfpack -lamd -lcholmod -lklu"
  --with-spooles=1 --with-spooles-include=/usr/include/spooles
  --with-spooles-lib=-lspooles --with-ptscotch=1
  --with-ptscotch-include=/usr/include/scotch
  --with-ptscotch-lib="-lptesmumps -lptscotch -lptscotcherr"
  --with-fftw=1 --with-fftw-include="[]"
  --with-fftw-lib="-lfftw3 -lfftw3_mpi" --with-superlu=1
  --with-superlu-include=/usr/include/superlu --with-superlu-lib=-lsuperlu
  --with-hdf5=1 --with-hdf5-dir=/usr/lib/x86_64-linux-gnu/hdf5/openmpi
  --CXX_LINKER_FLAGS=-Wl,--no-as-needed --with-hypre=1
  --with-hypre-include=/usr/include/hypre
  --with-hypre-lib="-lHYPRE_IJ_mv -lHYPRE_parcsr_ls -lHYPRE_sstruct_ls
  -lHYPRE_sstruct_mv -lHYPRE_struct_ls -lHYPRE_struct_mv -lHYPRE_utilities"
  --prefix=/usr/lib/petscdir/3.7.5/x86_64-linux-gnu-real
  PETSC_DIR=/build/petsc-XG7COe/petsc-3.7.5+dfsg1
  --PETSC_ARCH=x86_64-linux-gnu-real
  CFLAGS="-g -O2 -fdebug-prefix-map=/build/petsc-XG7COe/petsc-3.7.5+dfsg1=.
  -fstack-protector-strong -Wformat -Werror=format-security -fPIC"
  CXXFLAGS="-g -O2 -fdebug-prefix-map=/build/petsc-XG7COe/petsc-3.7.5+dfsg1=.
  -fstack-protector-strong -Wformat -Werror=format-security -fPIC"
  FCFLAGS="-g -O2 -fdebug-prefix-map=/build/petsc-XG7COe/petsc-3.7.5+dfsg1=.
  -fstack-protector-strong -fPIC"
  FFLAGS="-g -O2 -fdebug-prefix-map=/build/petsc-XG7COe/petsc-3.7.5+dfsg1=.
  -fstack-protector-strong -fPIC"
  CPPFLAGS="-Wdate-time -D_FORTIFY_SOURCE=2"
  LDFLAGS="-Wl,-z,relro -fPIC" MAKEFLAGS=w
[0]PETSC ERROR: #1 MatSetValues_SeqAIJ() line 485 in /build/petsc-XG7COe/petsc-3.7.5+dfsg1/src/mat/impls/aij/seq/aij.c
[0]PETSC ERROR: #2 MatSetValues() line 1190 in /build/petsc-XG7COe/petsc-3.7.5+dfsg1/src/mat/interface/matrix.c
[0]PETSC ERROR: #3 assembleMatrix() line 150 in /home/matteo/software/petscMplex/mplexMatrix.cpp
[0]PETSC ERROR: #4 main() line 202 in /home/matteo/software/petscMplex/mplexMatrix.cpp
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -mat_view
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 63.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
    </pre>
    <br>
    I guess the matrix is being created with only the diagonal entries
    preallocated?<br>
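    One way I could confirm this guess is to query the preallocation right
    after DMCreateMatrix (a quick sketch; for 4 cells coupled through 4
    interior faces I would expect 4 + 2*4 = 12 allocated nonzeros, not 4):<br>
    <pre>
  /* Sketch: check how many nonzeros DMCreateMatrix actually preallocated. */
  MatInfo info;
  ierr = MatGetInfo(A, MAT_LOCAL, &info);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_SELF, "allocated nonzeros: %g\n",
                     (double)info.nz_allocated);CHKERRQ(ierr);
    </pre>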
    <br>
    (The same happens with 3.8.0 from Debian experimental.)<br>
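    In the meantime, a possible workaround (an untested sketch, serial case
    only, assuming one dof per cell and that the matrix rows coincide with
    the cell points as in this example) would be to build the preallocation
    by hand from the face adjacency:<br>
    <pre>
  /* Untested workaround sketch: count 1 diagonal entry per cell plus one
     entry per interior face incident to it, then re-preallocate. */
  PetscInt cStart, cEnd, fStart, fEnd, c, f, supportSize;
  PetscInt *nnz;
  ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr); /* cells */
  ierr = DMPlexGetHeightStratum(dm, 1, &fStart, &fEnd);CHKERRQ(ierr); /* faces */
  ierr = PetscCalloc1(cEnd-cStart, &nnz);CHKERRQ(ierr);
  for (c = cStart; c < cEnd; ++c) nnz[c-cStart] = 1;      /* diagonal */
  for (f = fStart; f < fEnd; ++f) {
    ierr = DMPlexGetSupportSize(dm, f, &supportSize);CHKERRQ(ierr);
    if (supportSize == 2) {                               /* interior face */
      const PetscInt *cells;
      ierr = DMPlexGetSupport(dm, f, &cells);CHKERRQ(ierr);
      nnz[cells[0]-cStart]++;
      nnz[cells[1]-cStart]++;
    }
  }
  ierr = MatSeqAIJSetPreallocation(A, 0, nnz);CHKERRQ(ierr);
  ierr = PetscFree(nnz);CHKERRQ(ierr);
    </pre>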
    <br>
    Thanks,<br>
        Matteo<br>
    <br>
  </body>
</html>