[petsc-users] Unable to extract submatrix from DMComposite Mat if block size > 1

Lawrence Mitchell lawrence.mitchell at imperial.ac.uk
Wed Feb 25 06:02:16 CST 2015


Hi all,

I have multi-field operators with the same layout as the matrices
created by a DMComposite, and I would like to be able to switch
between a MatNest representation (if I'm using fieldsplit PCs) and
AIJ (if not).  This works fine if the block sizes of all the
sub-fields are 1, but fails if not (even if they are all the same).
The code below demonstrates the problem using a DMComposite.
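
(For concreteness, the switch I have in mind is just the matrix type of
the packed DM; in the attached code it comes from -pack_dm_mat_type,
but a programmatic sketch would be roughly:

    if (use_fieldsplit) {   /* use_fieldsplit is a stand-in for whatever selects the PC */
      ierr = DMSetMatType(pack, MATNEST); CHKERRQ(ierr);
    } else {
      ierr = DMSetMatType(pack, MATAIJ); CHKERRQ(ierr);
    }
    ierr = DMCreateMatrix(pack, &A); CHKERRQ(ierr);

with assembly then going through MatGetLocalSubMatrix so that the same
code path works for either representation.)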


$ ./dm-test -ndof_v 1 -ndof_p 1
V Mat block size (1, 1)
P Mat block size (1, 1)
Composite Global Mat block size (1, 1)
Mat Object: 1 MPI processes
  type: seqaij
  rows=20, cols=20
  total: nonzeros=56, allocated nonzeros=56
  total number of mallocs used during MatSetValues calls =0
    not using I-node routines

Local (0, 0) block has block size (1, 1)
Mat Object: 1 MPI processes
  type: localref
  rows=10, cols=10

Local (0, 1) block has block size (1, 1)
Mat Object: 1 MPI processes
  type: localref
  rows=10, cols=10

Local (1, 0) block has block size (1, 1)
Mat Object: 1 MPI processes
  type: localref
  rows=10, cols=10

Local (1, 1) block has block size (1, 1)
Mat Object: 1 MPI processes
  type: localref
  rows=10, cols=10


$ ./dm-test -ndof_v 2 -ndof_p 1 -pack_dm_mat_type nest
V Mat block size (2, 2)
P Mat block size (1, 1)
Composite Global Mat block size (1, 1)
Mat Object: 1 MPI processes
  type: nest
  rows=30, cols=30
    Matrix object:
      type=nest, rows=2, cols=2
      MatNest structure:
      (0,0) : type=seqaij, rows=20, cols=20
      (0,1) : NULL
      (1,0) : NULL
      (1,1) : type=seqaij, rows=10, cols=10

Local (0, 0) block has block size (2, 2)
Mat Object: 1 MPI processes
  type: seqaij
  rows=20, cols=20, bs=2
  total: nonzeros=112, allocated nonzeros=120
  total number of mallocs used during MatSetValues calls =0
    using I-node routines: found 10 nodes, limit used is 5

Local (1, 1) block has block size (1, 1)
Mat Object: 1 MPI processes
  type: seqaij
  rows=10, cols=10
  total: nonzeros=28, allocated nonzeros=30
  total number of mallocs used during MatSetValues calls =0
    not using I-node routines

$ ./dm-test -ndof_v 2 -ndof_p 1 -pack_dm_mat_type aij
V Mat block size (2, 2)
P Mat block size (1, 1)
Composite Global Mat block size (1, 1)
Mat Object: 1 MPI processes
  type: seqaij
  rows=30, cols=30
  total: nonzeros=140, allocated nonzeros=140
  total number of mallocs used during MatSetValues calls =0
    using I-node routines: found 20 nodes, limit used is 5

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Petsc has generated inconsistent data
[0]PETSC ERROR: Blocksize of localtoglobalmapping 1 must match that of layout 2
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.5.2-1615-g4631b1d  GIT Date: 2015-01-27 21:52:20 -0600
[0]PETSC ERROR: ./dm-test on a arch-linux2-c-dbg named yam.doc.ic.ac.uk by lmitche1 Wed Feb 25 11:43:35 2015
[0]PETSC ERROR: Configure options --download-chaco=1 --download-ctetgen=1 --download-exodusii=1 --download-hdf5=1 --download-hypre=1 --download-metis=1 --download-ml=1 --download-mumps=1 --download-netcdf=1 --download-parmetis=1 --download-ptscotch=1 --download-scalapack=1 --download-superlu=1 --download-superlu_dist=1 --download-triangle=1 --with-c2html=0 --with-debugging=1 --with-make-np=24 --with-openmp=0 --with-pthreadclasses=0 --with-shared-libraries=1 --with-threadcomm=0 PETSC_ARCH=arch-linux2-c-dbg
[0]PETSC ERROR: #1 PetscLayoutSetBlockSize() line 438 in /data/lmitche1/src/deps/petsc/src/vec/is/utils/pmap.c
[0]PETSC ERROR: #2 MatCreateLocalRef() line 259 in /data/lmitche1/src/deps/petsc/src/mat/impls/localref/mlocalref.c
[0]PETSC ERROR: #3 MatGetLocalSubMatrix() line 9523 in /data/lmitche1/src/deps/petsc/src/mat/interface/matrix.c
[0]PETSC ERROR: #4 main() line 66 in /data/lmitche1/src/petsc-doodles/dm-test.c
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -ndof_p 1
[0]PETSC ERROR: -ndof_v 2
[0]PETSC ERROR: -pack_dm_mat_type aij
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------


This looks like it no longer works after the changes to unify
ISLocalToGlobalMappingApply and ISLocalToGlobalMappingApplyBlock.  At
least, on 98c3331e5 (before the raft of changes in isltog.c) I get:

$ ./dm-test -ndof_v 2 -ndof_p 1 -pack_dm_mat_type aij
V Mat block size (2, 2)
P Mat block size (1, 1)
Composite Global Mat block size (1, 1)
Mat Object: 1 MPI processes
  type: seqaij
  rows=30, cols=30
  total: nonzeros=140, allocated nonzeros=140
  total number of mallocs used during MatSetValues calls =0
    using I-node routines: found 20 nodes, limit used is 5

Local (0, 0) block has block size (2, 2)
Mat Object: 1 MPI processes
  type: localref
  rows=20, cols=20, bs=2

Local (0, 1) block has block size (1, 1)
Mat Object: 1 MPI processes
  type: localref
  rows=20, cols=10

Local (1, 0) block has block size (1, 1)
Mat Object: 1 MPI processes
  type: localref
  rows=10, cols=20

Local (1, 1) block has block size (1, 1)
Mat Object: 1 MPI processes
  type: localref
  rows=10, cols=10

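
(For reference, my reading of the unified interface is that
ISLocalToGlobalMappingApply works on point indices while
ISLocalToGlobalMappingApplyBlock works on block indices; a minimal
sketch of the distinction, with made-up indices:

    ISLocalToGlobalMapping ltog;
    PetscInt blocks[2] = {5, 6};   /* local block b -> global block blocks[b] */
    PetscInt pt = 3, blk = 1, gpt, gblk;
    ierr = ISLocalToGlobalMappingCreate(PETSC_COMM_SELF, 2, 2, blocks, PETSC_COPY_VALUES, &ltog); CHKERRQ(ierr);
    ierr = ISLocalToGlobalMappingApply(ltog, 1, &pt, &gpt); CHKERRQ(ierr);        /* gpt  = 6*2 + 1 = 13 */
    ierr = ISLocalToGlobalMappingApplyBlock(ltog, 1, &blk, &gblk); CHKERRQ(ierr); /* gblk = 6 */
    ierr = ISLocalToGlobalMappingDestroy(&ltog); CHKERRQ(ierr);

so presumably the submatrix's local-to-global mapping needs to carry the
right block size for the block form to line up with the layout.)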
Ideally, though, the off-diagonal blocks would also support
MatSetValuesBlocked (with block sizes (1, 2) and (2, 1) respectively).
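
(The sort of thing I'd like to be able to write for, say, the (1, 0)
block, whose rows come from P (bs 1) and columns from V (bs 2), is
roughly:

    PetscScalar vals[2] = {1.0, 2.0};   /* one 1x2 value block; made-up numbers */
    PetscInt    brow = 0, bcol = 0;     /* block row (bs 1) and block column (bs 2) indices */
    ierr = MatGetLocalSubMatrix(mat, ises[1], ises[0], &submat); CHKERRQ(ierr);
    ierr = MatSetValuesBlocked(submat, 1, &brow, 1, &bcol, vals, INSERT_VALUES); CHKERRQ(ierr);
    ierr = MatRestoreLocalSubMatrix(mat, ises[1], ises[0], &submat); CHKERRQ(ierr);

which would need that submatrix to report block sizes (1, 2).)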

Cheers,

Lawrence

#include <petsc.h>

int main(int argc, char **argv)
{
   PetscErrorCode ierr;
   DM v;
   DM p;
   DM pack;
   PetscInt rbs, cbs;
   PetscInt srbs, scbs;
   PetscInt i, j;
   PetscInt ndof_v = 1;
   PetscInt ndof_p = 1;
   Mat mat;
   Mat submat;
   IS *ises;
   MPI_Comm c;
   PetscViewer vwr;
   PetscInitialize(&argc, &argv, NULL, NULL);

   c = PETSC_COMM_WORLD;
   ierr = PetscOptionsGetInt(NULL, "-ndof_v", &ndof_v, NULL); CHKERRQ(ierr);
   ierr = PetscOptionsGetInt(NULL, "-ndof_p", &ndof_p, NULL); CHKERRQ(ierr);

   /* Two 1D DMDAs on 10 points with ndof_v and ndof_p dofs per point. */
   ierr = DMDACreate1d(c, DM_BOUNDARY_NONE, 10, ndof_v, 1, NULL, &v); CHKERRQ(ierr);
   ierr = DMDACreate1d(c, DM_BOUNDARY_NONE, 10, ndof_p, 1, NULL, &p); CHKERRQ(ierr);
   ierr = DMSetFromOptions(v); CHKERRQ(ierr);
   ierr = DMSetFromOptions(p); CHKERRQ(ierr);

   /* Report the block sizes of the single-field operators. */
   ierr = DMCreateMatrix(v, &mat); CHKERRQ(ierr);
   ierr = MatGetBlockSizes(mat, &rbs, &cbs); CHKERRQ(ierr);
   ierr = PetscPrintf(c, "V Mat block size (%d, %d)\n", rbs, cbs); CHKERRQ(ierr);
   ierr = MatDestroy(&mat); CHKERRQ(ierr);

   ierr = DMCreateMatrix(p, &mat); CHKERRQ(ierr);
   ierr = MatGetBlockSizes(mat, &rbs, &cbs); CHKERRQ(ierr);
   ierr = PetscPrintf(c, "P Mat block size (%d, %d)\n", rbs, cbs); CHKERRQ(ierr);
   ierr = MatDestroy(&mat); CHKERRQ(ierr);

   /* Pack the two fields; -pack_dm_mat_type selects nest or aij. */
   ierr = DMCompositeCreate(c, &pack); CHKERRQ(ierr);
   ierr = PetscObjectSetOptionsPrefix((PetscObject)pack, "pack_"); CHKERRQ(ierr);
   ierr = DMCompositeAddDM(pack, v); CHKERRQ(ierr);
   ierr = DMCompositeAddDM(pack, p); CHKERRQ(ierr);
   ierr = DMSetFromOptions(pack); CHKERRQ(ierr);

   ierr = DMCompositeGetLocalISs(pack, &ises); CHKERRQ(ierr);
   ierr = DMCreateMatrix(pack, &mat); CHKERRQ(ierr);

   ierr = PetscViewerCreate(c, &vwr); CHKERRQ(ierr);
   ierr = PetscViewerSetType(vwr, PETSCVIEWERASCII); CHKERRQ(ierr);
   ierr = PetscViewerSetFormat(vwr, PETSC_VIEWER_ASCII_INFO); CHKERRQ(ierr);
   ierr = PetscViewerSetUp(vwr); CHKERRQ(ierr);
   ierr = MatGetBlockSizes(mat, &rbs, &cbs); CHKERRQ(ierr);
   ierr = PetscPrintf(c, "Composite Global Mat block size (%d, %d)\n", rbs, cbs); CHKERRQ(ierr);
   ierr = MatView(mat, vwr); CHKERRQ(ierr);
   ierr = PetscPrintf(c, "\n"); CHKERRQ(ierr);

   /* Extract each local (i, j) block and report its block sizes. */
   for (i = 0; i < 2; i++) {
       for (j = 0; j < 2; j++) {
           ierr = MatGetLocalSubMatrix(mat, ises[i], ises[j], &submat); CHKERRQ(ierr);
           if (submat) {
               ierr = MatGetBlockSizes(submat, &srbs, &scbs); CHKERRQ(ierr);
               ierr = PetscPrintf(c, "Local (%d, %d) block has block size (%d, %d)\n", i, j, srbs, scbs); CHKERRQ(ierr);
               ierr = MatAssemblyBegin(submat, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
               ierr = MatAssemblyEnd(submat, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
               ierr = MatView(submat, vwr); CHKERRQ(ierr);
               ierr = MatDestroy(&submat); CHKERRQ(ierr);
               ierr = PetscPrintf(c, "\n"); CHKERRQ(ierr);
           }
       }
   }
   ierr = PetscViewerDestroy(&vwr); CHKERRQ(ierr);
   ierr = MatDestroy(&mat); CHKERRQ(ierr);

   ierr = DMDestroy(&pack); CHKERRQ(ierr);
   ierr = DMDestroy(&v); CHKERRQ(ierr);
   ierr = DMDestroy(&p); CHKERRQ(ierr);

   ierr = ISDestroy(&(ises[0])); CHKERRQ(ierr);
   ierr = ISDestroy(&(ises[1])); CHKERRQ(ierr);
   ierr = PetscFree(ises); CHKERRQ(ierr);

   PetscFinalize();
   return 0;
}


