[petsc-users] access to matnest block (0,1) ?
Jed Brown
jedbrown at mcs.anl.gov
Wed Jan 18 06:22:16 CST 2012
On Wed, Jan 18, 2012 at 02:07, Klaij, Christiaan <C.Klaij at marin.nl> wrote:
> I have two DMs which I add to a DMComposite and then use MATNEST when
> getting the corresponding matrix. This gives me block (0,0) and block
> (1,1). How do I set/get blocks (0,1) and (1,0)? Looking at ex28 I tried
> MatGetLocalSubMatrix but it gives a null arg...
>
So the problem is that we have no way of knowing what preallocation
(nonzero pattern) _should_ go in the off-diagonal part. Unfortunately, the
current preallocation mechanism (DMCompositeSetCoupling()) is a difficult
thing to implement and the mechanism does not directly apply to MatNest. If
you have ideas for a good preallocation API, I would like to hear them. I
need to get back to the preallocation issue because it's an obvious wart in
the multiphysics support (as long as we don't have fast dynamic
preallocation, which is a somewhat viable alternative). What I would like
is for the user to call MatGetLocalSubMatrix() for any blocks that they
want allocated and set preallocation in terms of the local ordering.
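To make that concrete, here is the assembly pattern I have in mind (it is what ex28 already does for blocks that exist; sketch only, the indices and value below are illustrative):

  Mat B;
  PetscInt row = 0,col = 0;                  /* local indices, illustrative */
  PetscScalar v = 1.0;
  MatGetLocalSubMatrix(A,is[0],is[1],&B);    /* block (0,1) in the local ordering */
  MatSetValuesLocal(B,1,&row,1,&col,&v,ADD_VALUES);
  MatRestoreLocalSubMatrix(A,is[0],is[1],&B);
  MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);

Preallocation would then be declared through the same local submatrix, in the same local indices, before the first insertion.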
The current (unfortunate) solution for MatNest with off-diagonal parts is
to create the submatrices after DMGetMatrix(), preallocate as you like, and
copy the ISLocalToGlobalMappings over.
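In code, that workaround looks roughly like this (untested sketch continuing from your program below; B01 and the preallocation counts are placeholders, and if your PETSc does not have MatNestSetSubMat() you can instead assemble the blocks first and combine them yourself with MatCreateNest()):

  Mat A00,A11,B01;
  ISLocalToGlobalMapping rmap,cmap;
  PetscInt m,n;
  MatNestGetSubMat(A,0,0,&A00);
  MatNestGetSubMat(A,1,1,&A11);
  MatGetLocalSize(A00,&m,PETSC_NULL);           /* local rows of block (0,1) */
  MatGetLocalSize(A11,PETSC_NULL,&n);           /* local cols of block (0,1) */
  MatCreate(PETSC_COMM_WORLD,&B01);
  MatSetSizes(B01,m,n,PETSC_DETERMINE,PETSC_DETERMINE);
  MatSetType(B01,MATAIJ);
  MatSeqAIJSetPreallocation(B01,3,PETSC_NULL);  /* placeholders: whatever your coupling needs */
  MatMPIAIJSetPreallocation(B01,3,PETSC_NULL,2,PETSC_NULL);
  /* copy the local-to-global mappings over: rows from the (0,0) block, columns from (1,1) */
  MatGetLocalToGlobalMapping(A00,&rmap,PETSC_NULL);
  MatGetLocalToGlobalMapping(A11,PETSC_NULL,&cmap);
  MatSetLocalToGlobalMapping(B01,rmap,cmap);
  MatNestSetSubMat(A,0,1,B01);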
> #include <petscdmda.h>
>
> int main(int argc, char **argv)
> {
>   DM da0, da1;
>   DMDABoundaryType bx = DMDA_BOUNDARY_PERIODIC, by = DMDA_BOUNDARY_PERIODIC;
>   DMDAStencilType stype = DMDA_STENCIL_STAR;
>   PetscInt Mx = 8, My = 8;
>
>   PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);
>
>   // create distributed array for Q (2 dof per node)
>   DMDACreate2d(PETSC_COMM_WORLD,bx,by,stype,Mx,My,PETSC_DECIDE,PETSC_DECIDE,2,1,PETSC_NULL,PETSC_NULL,&da0);
>
>   // create distributed array for C (1 dof per node)
>   DMDACreate2d(PETSC_COMM_WORLD,bx,by,stype,Mx,My,PETSC_DECIDE,PETSC_DECIDE,1,1,PETSC_NULL,PETSC_NULL,&da1);
>
>   // mat nest from pack
>   DM pack;
>   Mat A;
>   DMCompositeCreate(PETSC_COMM_WORLD,&pack);
>   DMCompositeAddDM(pack,da0);
>   DMCompositeAddDM(pack,da1);
>   DMSetUp(pack);
>   DMGetMatrix(pack,MATNEST,&A);
>   MatView(A,PETSC_VIEWER_STDOUT_WORLD);
>
>   // try to view block (0,1): fails, the submatrix comes back null
>   IS *is;
>   Mat G;
>   DMCompositeGetLocalISs(pack,&is);
>   MatGetLocalSubMatrix(A,is[0],is[1],&G);
>   MatView(G,PETSC_VIEWER_STDOUT_WORLD);
>
>   PetscFinalize();
>   return 0;
> }
>
>
> $ mpiexec -n 1 ./dmda-try3
> Matrix object:
> type=nest, rows=2, cols=2
> MatNest structure:
> (0,0) : type=seqaij, rows=128, cols=128
> (0,1) : PETSC_NULL
> (1,0) : PETSC_NULL
> (1,1) : type=seqaij, rows=64, cols=64
> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> [0]PETSC ERROR: Null argument, when expecting valid pointer!
> [0]PETSC ERROR: Null Object: Parameter # 1!
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Release Version 3.2.0, Patch 5, Sat Oct 29 13:45:54 CDT 2011
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: ./dmda-try3 on a linux_64b named lin0133 by cklaij Wed Jan 18 09:00:37 2012
> [0]PETSC ERROR: Libraries linked from /opt/refresco/64bit_intelv11.1_openmpi/petsc-3.2-p5/lib
> [0]PETSC ERROR: Configure run at Mon Jan 16 14:03:34 2012
> [0]PETSC ERROR: Configure options --prefix=/opt/refresco/64bit_intelv11.1_openmpi/petsc-3.2-p5 --with-mpi-dir=/opt/refresco/64bit_intelv11.1_openmpi/openmpi-1.4.4 --with-x=0 --with-mpe=0 --with-debugging=1 --with-hypre-include=/opt/refresco/64bit_intelv11.1_openmpi/hypre-2.7.0b/include --with-hypre-lib=/opt/refresco/64bit_intelv11.1_openmpi/hypre-2.7.0b/lib/libHYPRE.a --with-ml-include=/opt/refresco/64bit_intelv11.1_openmpi/ml-6.2/include --with-ml-lib=/opt/refresco/64bit_intelv11.1_openmpi/ml-6.2/lib/libml.a --with-blas-lapack-dir=/opt/intel/mkl
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: MatView() line 723 in src/mat/interface/matrix.c
>
>
>
> dr. ir. Christiaan Klaij
> CFD Researcher
> Research & Development
> E mailto:C.Klaij at marin.nl
> T +31 317 49 33 44
>
> MARIN
> 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands
> T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl
>
>