[petsc-users] Constructing a MATNEST with blocks defined in different procs

Mark Adams mfadams at lbl.gov
Fri Apr 5 07:38:46 CDT 2019


On Fri, Apr 5, 2019 at 7:19 AM Diogo FERREIRA SABINO via petsc-users <petsc-users at mcs.anl.gov> wrote:

> Hi,
> I'm new to PETSc and I'm trying to construct a MATNEST on two procs, by
> setting each block of the nested matrix with a MATMPIAIJ matrix defined on
> each proc.
> I'm trying to use MatCreateNest or MatNestSetSubMats, but I haven't been
> able to make it work.
> Using MatNestSetSubMats, I'm trying to construct the MATNEST by giving it
> a pointer to the correct matrices depending on the MPI rank of each proc.
> I get the following error message for the line
> MatNestSetSubMats(AfullNEST,1,&IS_ROW,2,&IS_COL,AfullNESTpointer);
> [0]PETSC ERROR: Invalid argument
> [0]PETSC ERROR: Wrong type of object: Parameter # 5
>
> Is there a way of doing this, or do all the blocks of the MATNEST have to
> exist on the same communicator as the MATNEST matrix?
>

Yes, they must all have the same communicator. A matrix can be empty on a
process, so you just create them with the global communicator and set the
local sizes that you want (e.g., 0 on some procs).
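
For example, here is a minimal (untested) sketch of that approach, assuming
two ranks as in your test below; the names A00/A11/blocks are just
illustrative. Both diagonal blocks live on PETSC_COMM_WORLD, each with zero
local rows/columns on the rank that does not own it, and the off-diagonal
blocks are passed as NULL, which MatCreateNest treats as zero blocks:

static char help[] = "MATNEST whose blocks all live on the global communicator";
#include <petscmat.h>

int main(int argc,char **argv)
{
  Mat            A00,A11,AfullNEST;
  Mat            blocks[4];                /* 2x2 block layout, row-major */
  PetscMPIInt    rank;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc,&argv,NULL,help); if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank); CHKERRQ(ierr);

  /* Block (0,0): 2x2 globally, all rows/cols owned by rank 0 */
  ierr = MatCreate(PETSC_COMM_WORLD,&A00); CHKERRQ(ierr);
  ierr = MatSetSizes(A00,rank==0?2:0,rank==0?2:0,2,2); CHKERRQ(ierr);
  ierr = MatSetType(A00,MATMPIAIJ); CHKERRQ(ierr);
  ierr = MatMPIAIJSetPreallocation(A00,2,NULL,0,NULL); CHKERRQ(ierr);

  /* Block (1,1): 2x2 globally, all rows/cols owned by rank 1 */
  ierr = MatCreate(PETSC_COMM_WORLD,&A11); CHKERRQ(ierr);
  ierr = MatSetSizes(A11,rank==1?2:0,rank==1?2:0,2,2); CHKERRQ(ierr);
  ierr = MatSetType(A11,MATMPIAIJ); CHKERRQ(ierr);
  ierr = MatMPIAIJSetPreallocation(A11,2,NULL,0,NULL); CHKERRQ(ierr);

  /* Only the owning rank inserts values (any rank may, since values are
     communicated at assembly time, but assembly itself is collective) */
  if (rank == 0) {
    ierr = MatSetValue(A00,0,0,5.0,INSERT_VALUES); CHKERRQ(ierr);
    ierr = MatSetValue(A00,0,1,10.0,INSERT_VALUES); CHKERRQ(ierr);
    ierr = MatSetValue(A00,1,0,15.0,INSERT_VALUES); CHKERRQ(ierr);
    ierr = MatSetValue(A00,1,1,20.0,INSERT_VALUES); CHKERRQ(ierr);
  }
  if (rank == 1) {
    ierr = MatSetValue(A11,0,0,6.0,INSERT_VALUES); CHKERRQ(ierr);
    ierr = MatSetValue(A11,0,1,11.0,INSERT_VALUES); CHKERRQ(ierr);
    ierr = MatSetValue(A11,1,0,16.0,INSERT_VALUES); CHKERRQ(ierr);
    ierr = MatSetValue(A11,1,1,21.0,INSERT_VALUES); CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A00,MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A00,MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatAssemblyBegin(A11,MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A11,MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);

  /* Every rank passes the same 2x2 array of blocks; NULL off-diagonal
     blocks are treated as zero, and NULL is_row/is_col lets PETSc build
     contiguous layouts from the block sizes */
  blocks[0] = A00; blocks[1] = NULL;
  blocks[2] = NULL; blocks[3] = A11;
  ierr = MatCreateNest(PETSC_COMM_WORLD,2,NULL,2,NULL,blocks,&AfullNEST); CHKERRQ(ierr);
  ierr = MatView(AfullNEST,PETSC_VIEWER_STDOUT_WORLD); CHKERRQ(ierr);

  ierr = MatDestroy(&A00); CHKERRQ(ierr);
  ierr = MatDestroy(&A11); CHKERRQ(ierr);
  ierr = MatDestroy(&AfullNEST); CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

As an aside, the literal "Wrong type of object: Parameter # 5" in your test
is probably because you pass nc=2 together with &IS_COL, which points to a
single IS rather than an array of two.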


> A simple test is given below; launch it with: mpirun -n 2 ./Main_petsc.exe
>
> static char help[] = "Create MPI Nest";
> #include <petscmat.h>
>
> #undef __FUNCT__
> #define __FUNCT__ "main"
> int main(int argc,char **argv)
> {
>   PetscInitialize(&argc,&argv,(char*)0,help);
>
>   //////////////////////////////////////////////////////////////////////////////
>   PetscErrorCode ierr;
>   PetscMPIInt    MPIrank,MPIsize;
>   MPI_Comm_rank(PETSC_COMM_WORLD,&MPIrank);
>   MPI_Comm_size(PETSC_COMM_WORLD,&MPIsize);
>
>   ////////////////////////////////////////////////////////   Create Each Matrix:
>   Mat Adiag;
>
>   // Create an Adiag different on each proc:
>   ierr = MatCreate(PETSC_COMM_SELF,&Adiag);                CHKERRQ(ierr);
>   ierr = MatSetSizes(Adiag,2,2,PETSC_DECIDE,PETSC_DECIDE); CHKERRQ(ierr);
>   ierr = MatSetType(Adiag,MATMPIAIJ);                      CHKERRQ(ierr);
>   ierr = MatSetFromOptions(Adiag);                         CHKERRQ(ierr);
>   ierr = MatMPIAIJSetPreallocation(Adiag,2,NULL,2,NULL);   CHKERRQ(ierr);
>
>   MatSetValue(Adiag,0,0,(MPIrank+5),INSERT_VALUES);
>   MatSetValue(Adiag,0,1,(MPIrank+10),INSERT_VALUES);
>   MatSetValue(Adiag,1,0,(MPIrank+15),INSERT_VALUES);
>   MatSetValue(Adiag,1,1,(MPIrank+20),INSERT_VALUES);
>   MatAssemblyBegin(Adiag,MAT_FINAL_ASSEMBLY);
>   MatAssemblyEnd(Adiag,MAT_FINAL_ASSEMBLY);
>
>   ///////////////////////////////////////////////////////////////   Create Nest:
>   MPI_Barrier(PETSC_COMM_WORLD);
>   Mat AfullNEST,*AfullNESTpointer;
>
>   PetscMalloc1(2,&AfullNESTpointer);
>   AfullNESTpointer[0]=NULL;
>   AfullNESTpointer[1]=NULL;
>   AfullNESTpointer[MPIrank]=Adiag;
>   // Rank=0 --> AfullNESTpointer[0]=Adiag; AfullNESTpointer[1]=NULL;
>   // Rank=1 --> AfullNESTpointer[0]=NULL;  AfullNESTpointer[1]=Adiag;
>
>   IS IS_ROW,IS_COL;
>   ISCreateStride(PETSC_COMM_SELF,1,MPIrank,0,&IS_ROW);
>   ISCreateStride(PETSC_COMM_SELF,2,0,1,&IS_COL);
>   // Rank=0 --> IS_ROW= [ 0 ] ; IS_COL= [ 0, 1 ] ;
>   // Rank=1 --> IS_ROW= [ 1 ] ; IS_COL= [ 0, 1 ] ;
>
>   MatCreate(PETSC_COMM_WORLD,&AfullNEST);
>   MatSetSizes(AfullNEST,2,2,PETSC_DECIDE,PETSC_DECIDE);
>   // MatSetSizes(AfullNEST,PETSC_DECIDE,PETSC_DECIDE,4,4);
>   // MatCreateNest(PETSC_COMM_WORLD,1,&IS_ROW,1,&IS_COL,AfullNESTpointer,&AfullNEST);
>   ierr = MatNestSetSubMats(AfullNEST,1,&IS_ROW,2,&IS_COL,AfullNESTpointer); CHKERRQ(ierr);
>
>   ierr = PetscFinalize(); CHKERRQ(ierr);
>   return 0;
> }
>
>