[petsc-users] block matrix without MatCreateNest

Matthew Knepley knepley at gmail.com
Sat Jul 30 10:02:21 CDT 2016


On Thu, Jul 28, 2016 at 3:38 AM, Klaij, Christiaan <C.Klaij at marin.nl> wrote:

> I'm trying to understand how to assemble a block matrix in a
> format-independent manner, so that I can switch between types
> mpiaij and matnest.
>
> The manual states that the key to format-independent assembly is
> to use MatGetLocalSubMatrix. So, in the code below, I'm using
> this to assemble a 3-by-3 block matrix A and setting the diagonal
> of block A02. This seems to work for type mpiaij, but not for
> type matnest. What am I missing?
>
> Chris
>
>
> $ cat mattry.F90
> program mattry
>
>   use petscksp
>   implicit none
> #include <petsc/finclude/petsckspdef.h>
>
>   PetscInt :: n=4   ! setting 4 cells per process
>
>   PetscErrorCode         :: ierr
>   PetscInt               :: size,rank,i
>   Mat                    :: A,A02
>   IS                     :: isg0,isg1,isg2
>   IS                     :: isl0,isl1,isl2
>   ISLocalToGlobalMapping :: map
>
>   integer, allocatable, dimension(:) :: idx
>
>   call PetscInitialize(PETSC_NULL_CHARACTER,ierr); CHKERRQ(ierr)
>   call MPI_Comm_size(PETSC_COMM_WORLD,size,ierr); CHKERRQ(ierr)
>   call MPI_Comm_rank(PETSC_COMM_WORLD,rank,ierr);CHKERRQ(ierr)
>
>   ! local index sets for 3 fields
>   allocate(idx(n))
>   idx=(/ (i-1, i=1,n) /)
>   call ISCreateGeneral(PETSC_COMM_WORLD,n,idx,PETSC_COPY_VALUES,isl0,ierr); CHKERRQ(ierr)
>   call ISCreateGeneral(PETSC_COMM_WORLD,n,idx+n,PETSC_COPY_VALUES,isl1,ierr); CHKERRQ(ierr)
>   call ISCreateGeneral(PETSC_COMM_WORLD,n,idx+2*n,PETSC_COPY_VALUES,isl2,ierr); CHKERRQ(ierr)
> !  call ISView(isl2,PETSC_VIEWER_STDOUT_WORLD,ierr); CHKERRQ(ierr)
>   deallocate(idx)
>
>   ! global index sets for 3 fields
>   allocate(idx(n))
>   idx=(/ (i-1+rank*3*n, i=1,n) /)
>   call ISCreateGeneral(PETSC_COMM_WORLD,n,idx,PETSC_COPY_VALUES,isg0,ierr); CHKERRQ(ierr)
>   call ISCreateGeneral(PETSC_COMM_WORLD,n,idx+n,PETSC_COPY_VALUES,isg1,ierr); CHKERRQ(ierr)
>   call ISCreateGeneral(PETSC_COMM_WORLD,n,idx+2*n,PETSC_COPY_VALUES,isg2,ierr); CHKERRQ(ierr)
> !  call ISView(isg2,PETSC_VIEWER_STDOUT_WORLD,ierr); CHKERRQ(ierr)
>   deallocate(idx)
>
>   ! local-to-global mapping
>   allocate(idx(3*n))
>   idx=(/ (i-1+rank*3*n, i=1,3*n) /)
>   call ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD,1,3*n,idx,PETSC_COPY_VALUES,map,ierr); CHKERRQ(ierr)
> !  call ISLocalToGlobalMappingView(map,PETSC_VIEWER_STDOUT_WORLD,ierr); CHKERRQ(ierr)
>   deallocate(idx)
>
>   ! create the 3-by-3 block matrix
>   call MatCreate(PETSC_COMM_WORLD,A,ierr); CHKERRQ(ierr)
>   call MatSetSizes(A,3*n,3*n,PETSC_DECIDE,PETSC_DECIDE,ierr); CHKERRQ(ierr)
> !  call MatSetType(A,MATNEST,ierr); CHKERRQ(ierr)
>   call MatSetUp(A,ierr); CHKERRQ(ierr)
>

I am sorry I have not had time to run this, but I believe you need to
insert a call here:

  MatNestSetSubMats(A, 3, [isg0, isg1, isg2], 3, [isg0, isg1, isg2], NULL);

coming from
http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatNestSetSubMats.html#MatNestSetSubMats
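
In Fortran that would look something like the sketch below (untested; it
reuses the isg0, isg1, isg2 index sets created above and passes
PETSC_NULL_OBJECT where the C call has NULL for the submatrix array, so
whether the Fortran binding accepts that is an assumption on my part):

  IS :: isgs(3)   ! array holding the three global index sets

  ! only needed when running with -A_mat_type nest
  isgs(1) = isg0
  isgs(2) = isg1
  isgs(3) = isg2
  call MatNestSetSubMats(A,3,isgs,3,isgs,PETSC_NULL_OBJECT,ierr); CHKERRQ(ierr)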

Thanks,

    Matt

>   call MatSetOptionsPrefix(A,"A_",ierr); CHKERRQ(ierr)
>   call MatSetLocalToGlobalMapping(A,map,map,ierr); CHKERRQ(ierr)
>   call MatSetFromOptions(A,ierr); CHKERRQ(ierr)
>
>   ! set diagonal of block A02 to 0.65
>   call MatGetLocalSubMatrix(A,isl0,isl2,A02,ierr); CHKERRQ(ierr)
>   do i=1,n
>      call MatSetValuesLocal(A02,1,i-1,1,i-1,0.65d0,INSERT_VALUES,ierr); CHKERRQ(ierr)
>   end do
>   call MatRestoreLocalSubMatrix(A,isl0,isl2,A02,ierr); CHKERRQ(ierr)
>   call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr); CHKERRQ(ierr)
>   call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr); CHKERRQ(ierr)
>
>   ! verify
>   call MatGetSubMatrix(A,isg0,isg2,MAT_INITIAL_MATRIX,A02,ierr); CHKERRQ(ierr)
>   call MatView(A02,PETSC_VIEWER_STDOUT_WORLD,ierr);CHKERRQ(ierr)
>
>   call PetscFinalize(ierr)
>
> end program mattry
>
> $ mpiexec -n 2 ./mattry -A_mat_type mpiaij
> Mat Object: 2 MPI processes
>   type: mpiaij
> row 0: (0, 0.65)
> row 1: (1, 0.65)
> row 2: (2, 0.65)
> row 3: (3, 0.65)
> row 4: (4, 0.65)
> row 5: (5, 0.65)
> row 6: (6, 0.65)
> row 7: (7, 0.65)
>
> $ mpiexec -n 2 ./mattry -A_mat_type nest
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Null argument, when expecting valid pointer
> [0]PETSC ERROR: Null Pointer: Parameter # 3
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.7.3, Jul, 24, 2016
> [0]PETSC ERROR: ./mattry on a linux_64bit_debug named lin0322.marin.local by cklaij Thu Jul 28 10:31:04 2016
> [0]PETSC ERROR: Configure options
> --with-mpi-dir=/home/cklaij/ReFRESCO/Dev/trunk/Libs/install/openmpi/1.8.7
> --with-clanguage=c++ --with-x=1 --with-debugging=1
> --with-blas-lapack-dir=/opt/intel/composer_xe_2015.1.133/mkl
> --with-shared-libraries=0
> [0]PETSC ERROR: #1 MatNestFindIS() line 298 in
> /home/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc/3.7.3-dbg/src/mat/impls/nest/matnest.c
> [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [1]PETSC ERROR: Null argument, when expecting valid pointer
> [1]PETSC ERROR: Null Pointer: Parameter # 3
> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [1]PETSC ERROR: Petsc Release Version 3.7.3, Jul, 24, 2016
> [1]PETSC ERROR: ./mattry on a linux_64bit_debug named lin0322.marin.local by cklaij Thu Jul 28 10:31:04 2016
> [1]PETSC ERROR: Configure options
> --with-mpi-dir=/home/cklaij/ReFRESCO/Dev/trunk/Libs/install/openmpi/1.8.7
> --with-clanguage=c++ --with-x=1 --with-debugging=1
> --with-blas-lapack-dir=/opt/intel/composer_xe_2015.1.133/mkl
> --with-shared-libraries=0
> [1]PETSC ERROR: #1 MatNestFindIS() line 298 in
> /home/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc/3.7.3-dbg/src/mat/impls/nest/matnest.c
> [1]PETSC ERROR: #2 MatNestFindSubMat() line 371 in
> /home/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc/3.7.3-dbg/src/mat/impls/nest/matnest.c
> [1]PETSC ERROR: #3 MatGetLocalSubMatrix_Nest() line 414 in
> /home/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc/3.7.3-dbg/src/mat/impls/nest/matnest.c
> [1]PETSC ERROR: #4 MatGetLocalSubMatrix() line 10099 in
> /home/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc/3.7.3-dbg/src/mat/interface/matrix.c
> #2 MatNestFindSubMat() line 371 in
> /home/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc/3.7.3-dbg/src/mat/impls/nest/matnest.c
> [0]PETSC ERROR: #3 MatGetLocalSubMatrix_Nest() line 414 in
> /home/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc/3.7.3-dbg/src/mat/impls/nest/matnest.c
> [0]PETSC ERROR: #4 MatGetLocalSubMatrix() line 10099 in
> /home/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc/3.7.3-dbg/src/mat/interface/matrix.c
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
> with errorcode 85.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> [lin0322.marin.local:11985] 1 more process has sent help message
> help-mpi-api.txt / mpi-abort
> [lin0322.marin.local:11985] Set MCA parameter "orte_base_help_aggregate"
> to 0 to see all help / error messages
> $
>
>
> dr. ir. Christiaan Klaij  | CFD Researcher | Research & Development
> MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl
>
> MARIN news:
> http://www.marin.nl/web/News/News-items/Ship-design-in-EU-project-Holiship.htm
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

