On Fri, Apr 5, 2019 at 7:19 AM Diogo FERREIRA SABINO via petsc-users <petsc-users@mcs.anl.gov> wrote:

> Hi,
> I'm new to PETSc and I'm trying to construct a MATNEST on two processes, by setting each block of the nested matrix with a MATMPIAIJ matrix defined on each process.
> I'm trying to use MatCreateNest or MatNestSetSubMats, but I have not been able to make it work.
> Using MatNestSetSubMats, I'm trying to construct the MATNEST, passing a pointer to the correct matrix depending on the MPI rank of the process.
> I get the following error message for the line MatNestSetSubMats(AfullNEST,1,&IS_ROW,2,&IS_COL,AfullNESTpointer);
> [0]PETSC ERROR: Invalid argument
> [0]PETSC ERROR: Wrong type of object: Parameter # 5
>
> Is there a way of doing this, or do all the blocks of the MATNEST have to exist in the same communicator as the MATNEST matrix?

Yes, they must all have the same communicator. A matrix can be empty on a process, so you just create every block on the global communicator and set the local sizes that you want (e.g., 0 on some processes).
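
For example, something along these lines (an untested sketch; the names A00, A11, subs are mine, and it assumes exactly 2 ranks) creates both diagonal blocks on PETSC_COMM_WORLD, each with zero local rows on the rank that does not "own" it, and then nests them with MatCreateNest, passing NULL index sets so PETSc builds contiguous ones:

static char help[] = "MATNEST whose blocks all live on the global communicator";
#include <petscmat.h>

int main(int argc,char **argv)
{
  Mat            A00,A11,subs[4],AfullNEST;
  PetscMPIInt    rank;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc,&argv,(char*)0,help);if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);

  /* Block (0,0): global 2x2, both rows owned by rank 0, 0 local rows elsewhere */
  ierr = MatCreate(PETSC_COMM_WORLD,&A00);CHKERRQ(ierr);
  ierr = MatSetSizes(A00,rank==0?2:0,rank==0?2:0,2,2);CHKERRQ(ierr);
  ierr = MatSetType(A00,MATMPIAIJ);CHKERRQ(ierr);
  ierr = MatMPIAIJSetPreallocation(A00,2,NULL,2,NULL);CHKERRQ(ierr);

  /* Block (1,1): global 2x2, both rows owned by rank 1, 0 local rows elsewhere */
  ierr = MatCreate(PETSC_COMM_WORLD,&A11);CHKERRQ(ierr);
  ierr = MatSetSizes(A11,rank==1?2:0,rank==1?2:0,2,2);CHKERRQ(ierr);
  ierr = MatSetType(A11,MATMPIAIJ);CHKERRQ(ierr);
  ierr = MatMPIAIJSetPreallocation(A11,2,NULL,2,NULL);CHKERRQ(ierr);

  /* Each rank fills only the block it owns; assembly is collective on the world comm */
  if (rank == 0) {
    ierr = MatSetValue(A00,0,0,5.0,INSERT_VALUES);CHKERRQ(ierr);
    ierr = MatSetValue(A00,0,1,10.0,INSERT_VALUES);CHKERRQ(ierr);
    ierr = MatSetValue(A00,1,0,15.0,INSERT_VALUES);CHKERRQ(ierr);
    ierr = MatSetValue(A00,1,1,20.0,INSERT_VALUES);CHKERRQ(ierr);
  } else {
    ierr = MatSetValue(A11,0,0,6.0,INSERT_VALUES);CHKERRQ(ierr);
    ierr = MatSetValue(A11,0,1,11.0,INSERT_VALUES);CHKERRQ(ierr);
    ierr = MatSetValue(A11,1,0,16.0,INSERT_VALUES);CHKERRQ(ierr);
    ierr = MatSetValue(A11,1,1,21.0,INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A00,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A00,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyBegin(A11,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A11,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  /* 2x2 array of blocks, row-major; NULL means a zero block */
  subs[0] = A00; subs[1] = NULL;
  subs[2] = NULL; subs[3] = A11;
  ierr = MatCreateNest(PETSC_COMM_WORLD,2,NULL,2,NULL,subs,&AfullNEST);CHKERRQ(ierr);
  ierr = MatView(AfullNEST,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);

  ierr = MatDestroy(&A00);CHKERRQ(ierr);
  ierr = MatDestroy(&A11);CHKERRQ(ierr);
  ierr = MatDestroy(&AfullNEST);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Run it with mpirun -n 2: each diagonal block then lives entirely on one rank, but every matrix shares PETSC_COMM_WORLD, so all the creation, assembly, and nest calls are collective and the same arguments are passed on every rank.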

> A simple test is given below; launch it with: mpirun -n 2 ./Main_petsc.exe
>
> static char help[] = "Create MPI Nest";
> #include <petscmat.h>
>
> #undef __FUNCT__
> #define __FUNCT__ "main"
> int main(int argc,char **argv)
> {
>   PetscInitialize(&argc,&argv,(char*)0,help);
>   //////////////////////////////////////////////////////////////////////////////
>   PetscErrorCode ierr;
>   PetscMPIInt MPIrank,MPIsize;
>   MPI_Comm_rank(PETSC_COMM_WORLD,&MPIrank);
>   MPI_Comm_size(PETSC_COMM_WORLD,&MPIsize);
>
>   //////////////////////////////////////////////////////// Create Each Matrix:
>   Mat Adiag;
>
>   // Create an Adiag that is different on each proc:
>   ierr = MatCreate(PETSC_COMM_SELF,&Adiag); CHKERRQ(ierr);
>   ierr = MatSetSizes(Adiag,2,2,PETSC_DECIDE,PETSC_DECIDE); CHKERRQ(ierr);
>   ierr = MatSetType(Adiag,MATMPIAIJ); CHKERRQ(ierr);
>   ierr = MatSetFromOptions(Adiag); CHKERRQ(ierr);
>   ierr = MatMPIAIJSetPreallocation(Adiag,2,NULL,2,NULL); CHKERRQ(ierr);
>
>   MatSetValue(Adiag,0,0,(MPIrank+5),INSERT_VALUES);
>   MatSetValue(Adiag,0,1,(MPIrank+10),INSERT_VALUES);
>   MatSetValue(Adiag,1,0,(MPIrank+15),INSERT_VALUES);
>   MatSetValue(Adiag,1,1,(MPIrank+20),INSERT_VALUES);
>   MatAssemblyBegin(Adiag,MAT_FINAL_ASSEMBLY); MatAssemblyEnd(Adiag,MAT_FINAL_ASSEMBLY);
>
>   /////////////////////////////////////////////////////////////// Create Nest:
>   MPI_Barrier(PETSC_COMM_WORLD);
>   Mat AfullNEST, *AfullNESTpointer;
>
>   PetscMalloc1(2,&AfullNESTpointer);
>   AfullNESTpointer[0]=NULL;
>   AfullNESTpointer[1]=NULL;
>   AfullNESTpointer[MPIrank]=Adiag;
>   // Rank=0 --> AfullNESTpointer[0]=Adiag; AfullNESTpointer[1]=NULL;
>   // Rank=1 --> AfullNESTpointer[0]=NULL;  AfullNESTpointer[1]=Adiag;
>
>   IS IS_ROW,IS_COL;
>   ISCreateStride(PETSC_COMM_SELF,1,MPIrank,0,&IS_ROW);
>   ISCreateStride(PETSC_COMM_SELF,2,0,1,&IS_COL);
>   // Rank=0 --> IS_ROW= [ 0 ] ; IS_COL= [ 0, 1 ] ;
>   // Rank=1 --> IS_ROW= [ 1 ] ; IS_COL= [ 0, 1 ] ;
>
>   MatCreate(PETSC_COMM_WORLD,&AfullNEST);
>   MatSetSizes(AfullNEST,2,2,PETSC_DECIDE,PETSC_DECIDE);
>   // MatSetSizes(AfullNEST,PETSC_DECIDE,PETSC_DECIDE,4,4);
>   // MatCreateNest(PETSC_COMM_WORLD,1,&IS_ROW,1,&IS_COL,AfullNESTpointer,&AfullNEST);
>   ierr = MatNestSetSubMats(AfullNEST,1,&IS_ROW,2,&IS_COL,AfullNESTpointer); CHKERRQ(ierr);
>
>   ierr = PetscFinalize(); CHKERRQ(ierr);
>   return 0;
> }