<div class="gmail_quote">On Wed, Jan 18, 2012 at 02:07, Klaij, Christiaan <span dir="ltr"><<a href="mailto:C.Klaij@marin.nl">C.Klaij@marin.nl</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
I have two DMs which I add to a DMComposite and then use MATNEST when getting the corresponding matrix. This gives me blocks (0,0) and (1,1). How do I set/get blocks (0,1) and (1,0)? Following ex28, I tried MatGetLocalSubMatrix, but it gives a null-argument error...<br>
</blockquote><div><br></div><div>So the problem is that we have no way of knowing what preallocation (nonzero pattern) _should_ go in the off-diagonal part. Unfortunately, the current preallocation mechanism (DMCompositeSetCoupling()) is difficult to implement and does not directly apply to MatNest. If you have ideas for a good preallocation API, I would like to hear them. I need to get back to the preallocation issue because it's an obvious wart in the multiphysics support (at least as long as we don't have fast dynamic preallocation, which is a somewhat viable alternative). What I would like is for the user to call MatGetLocalSubMatrix() for any blocks that they want allocated and to set the preallocation in terms of the local ordering.</div>
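<div><br></div><div>Roughly, assembly in the local ordering would then look like the untested sketch below (cf. src/snes/examples/tutorials/ex28.c), with is[] obtained from DMCompositeGetLocalISs() as in your code. The commented-out preallocation call is hypothetical; no such API exists yet.</div><div><br></div>
Mat B;<br>
PetscInt row = 0, col = 0;  /* first local row/column of the coupling block, purely illustrative */<br>
PetscScalar v = 1.0;<br>
<br>
MatGetLocalSubMatrix(A, is[0], is[1], &B);<br>
/* MatSetPreallocationLocal(B, ...);  hypothetical: nonzeros per local row */<br>
MatSetValuesLocal(B, 1, &row, 1, &col, &v, INSERT_VALUES);<br>
MatRestoreLocalSubMatrix(A, is[0], is[1], &B);<br>
MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);<br>
MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);<br>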
<div><br></div><div>The current (unfortunate) solution for MatNest with off-diagonal parts is to create the submatrices yourself after DMGetMatrix(), preallocate them as you like, and copy the ISLocalToGlobalMappings over from the corresponding diagonal blocks; a sketch follows.</div><div><br>
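<div>For your (0,1) block that could look like the following untested sketch against petsc-3.2; the preallocation counts are placeholders for your actual coupling stencil.</div><div><br></div>
Mat A00, A11, B01, B, submats[4];<br>
IS *isg;<br>
ISLocalToGlobalMapping rmap00, cmap00, rmap11, cmap11;<br>
PetscInt m0, n0, m1, n1;<br>
<br>
DMGetMatrix(pack, MATNEST, &A);<br>
MatNestGetSubMat(A, 0, 0, &A00);<br>
MatNestGetSubMat(A, 1, 1, &A11);<br>
<br>
/* create and preallocate the coupling block yourself */<br>
MatGetLocalSize(A00, &m0, &n0);<br>
MatGetLocalSize(A11, &m1, &n1);<br>
MatCreate(PETSC_COMM_WORLD, &B01);<br>
MatSetSizes(B01, m0, n1, PETSC_DETERMINE, PETSC_DETERMINE);<br>
MatSetType(B01, MATAIJ);<br>
MatSeqAIJSetPreallocation(B01, 3, PETSC_NULL);  /* placeholder counts */<br>
MatMPIAIJSetPreallocation(B01, 3, PETSC_NULL, 2, PETSC_NULL);<br>
<br>
/* copy the mappings over: rows from the (0,0) block, columns from (1,1) */<br>
MatGetLocalToGlobalMapping(A00, &rmap00, &cmap00);<br>
MatGetLocalToGlobalMapping(A11, &rmap11, &cmap11);<br>
MatSetLocalToGlobalMapping(B01, rmap00, cmap11);<br>
<br>
/* rebuild the nest with the new block; (1,0) stays empty */<br>
DMCompositeGetGlobalISs(pack, &isg);<br>
submats[0] = A00; submats[1] = B01;<br>
submats[2] = PETSC_NULL; submats[3] = A11;<br>
MatCreateNest(PETSC_COMM_WORLD, 2, isg, 2, isg, submats, &B);<br>
MatDestroy(&A);  /* B holds references to A00 and A11; use B from here on */<br>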
</div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<br>
#include <petscdmda.h><br>
<br>
int main(int argc, char **argv)<br>
{<br>
<br>
DM da0, da1;<br>
DMDABoundaryType bx = DMDA_BOUNDARY_PERIODIC, by = DMDA_BOUNDARY_PERIODIC;<br>
DMDAStencilType stype = DMDA_STENCIL_STAR;<br>
PetscInt Mx = 8, My = 8;<br>
<br>
PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);<br>
<br>
// create distributed array for Q<br>
DMDACreate2d(PETSC_COMM_WORLD,bx,by,stype,Mx,My,PETSC_DECIDE,PETSC_DECIDE,2,1,PETSC_NULL,PETSC_NULL,&da0);<br>
<br>
// create distributed array for C<br>
DMDACreate2d(PETSC_COMM_WORLD,bx,by,stype,Mx,My,PETSC_DECIDE,PETSC_DECIDE,1,1,PETSC_NULL,PETSC_NULL,&da1);<br>
<br>
// mat nest from pack<br>
DM pack;<br>
Mat A;<br>
Vec X;<br>
DMCompositeCreate(PETSC_COMM_WORLD,&pack);<br>
DMCompositeAddDM(pack,da0);<br>
DMCompositeAddDM(pack,da1);<br>
DMSetUp(pack);<br>
DMGetMatrix(pack,MATNEST,&A);<br>
MatView(A,PETSC_VIEWER_STDOUT_WORLD);<br>
<br>
IS *is;<br>
Mat G;<br>
DMCompositeGetLocalISs(pack,&is);<br>
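// for MATNEST the (0,1) block was never created (see the PETSC_NULL<br>
// entries in the output below), so G comes back null and the MatView<br>
// that follows is what raises the error<br>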
MatGetLocalSubMatrix(A,is[0],is[1],&G);<br>
MatView(G,PETSC_VIEWER_STDOUT_WORLD);<br>
<br>
PetscFinalize();<br>
<br>
return 0;<br>
<br>
}<br>
<br>
<br>
$ mpiexec -n 1 ./dmda-try3<br>
Matrix object:<br>
type=nest, rows=2, cols=2<br>
MatNest structure:<br>
(0,0) : type=seqaij, rows=128, cols=128<br>
(0,1) : PETSC_NULL<br>
(1,0) : PETSC_NULL<br>
(1,1) : type=seqaij, rows=64, cols=64<br>
[0]PETSC ERROR: --------------------- Error Message ------------------------------------<br>
[0]PETSC ERROR: Null argument, when expecting valid pointer!<br>
[0]PETSC ERROR: Null Object: Parameter # 1!<br>
[0]PETSC ERROR: ------------------------------------------------------------------------<br>
[0]PETSC ERROR: Petsc Release Version 3.2.0, Patch 5, Sat Oct 29 13:45:54 CDT 2011<br>
[0]PETSC ERROR: See docs/changes/index.html for recent updates.<br>
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.<br>
[0]PETSC ERROR: See docs/index.html for manual pages.<br>
[0]PETSC ERROR: ------------------------------------------------------------------------<br>
[0]PETSC ERROR: ./dmda-try3 on a linux_64b named lin0133 by cklaij Wed Jan 18 09:00:37 2012<br>
[0]PETSC ERROR: Libraries linked from /opt/refresco/64bit_intelv11.1_openmpi/petsc-3.2-p5/lib<br>
[0]PETSC ERROR: Configure run at Mon Jan 16 14:03:34 2012<br>
[0]PETSC ERROR: Configure options --prefix=/opt/refresco/64bit_intelv11.1_openmpi/petsc-3.2-p5 --with-mpi-dir=/opt/refresco/64bit_intelv11.1_openmpi/openmpi-1.4.4 --with-x=0 --with-mpe=0 --with-debugging=1 --with-hypre-include=/opt/refresco/64bit_intelv11.1_openmpi/hypre-2.7.0b/include --with-hypre-lib=/opt/refresco/64bit_intelv11.1_openmpi/hypre-2.7.0b/lib/libHYPRE.a --with-ml-include=/opt/refresco/64bit_intelv11.1_openmpi/ml-6.2/include --with-ml-lib=/opt/refresco/64bit_intelv11.1_openmpi/ml-6.2/lib/libml.a --with-blas-lapack-dir=/opt/intel/mkl<br>
[0]PETSC ERROR: ------------------------------------------------------------------------<br>
[0]PETSC ERROR: MatView() line 723 in src/mat/interface/matrix.c<br>
<br>
<br>
<br>
dr. ir. Christiaan Klaij<br>
CFD Researcher<br>
Research & Development<br>
E mailto:<a href="mailto:C.Klaij@marin.nl">C.Klaij@marin.nl</a><br>
T <a href="tel:%2B31%20317%2049%2033%2044" value="+31317493344">+31 317 49 33 44</a><br>
<br>
MARIN<br>
2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands<br>
T <a href="tel:%2B31%20317%2049%2039%2011" value="+31317493911">+31 317 49 39 11</a>, F <a href="tel:%2B31%20317%2049%2032%2045" value="+31317493245">+31 317 49 32 45</a>, I <a href="http://www.marin.nl" target="_blank">www.marin.nl</a><br>
<br>
</blockquote></div><br>