[petsc-users] block matrix without MatCreateNest

Matthew Knepley knepley at gmail.com
Mon Aug 1 09:15:09 CDT 2016


On Mon, Aug 1, 2016 at 8:59 AM, Klaij, Christiaan <C.Klaij at marin.nl> wrote:

> Matt,
>
> Thanks for your suggestions. Here's the outcome:
>
> 1) without the "if type=nest" protection, I get a "Cannot
> locate function MatNestSetSubMats_C" error when using
> type mpiaij, see below.
>
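
For reference, a minimal sketch of the "if type=nest" guard mentioned in (1),
reusing the variables of the mattry program quoted below (hedged: this is not
part of the original program; it assumes the PETSc 3.7 Fortran bindings, where
MatType is a character string and MATNEST compares equal to 'nest'):

    ! Sketch only: query the runtime type and skip MatNestSetSubMats()
    ! unless the matrix really is a MATNEST.
    call MatGetType(A,type,ierr); CHKERRQ(ierr)
    if (type == MATNEST) then
       call MatNestSetSubMats(A,3,isg,3,isg,PETSC_NULL_OBJECT,ierr); CHKERRQ(ierr)
    end if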

That is a bug: it should be using PetscTryMethod() there, not PetscUseMethod().
We will fix it. That way the function can be called for any matrix type,
which is the intention.
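
For context, the call site in src/mat/impls/nest/matnest.c is C code; a hedged
sketch of the change described above (not the verbatim PETSc source) is:

    /* PetscUseMethod() raises "Cannot locate function ..._C" when the matrix
     * type does not implement the method; PetscTryMethod() silently skips it,
     * so MatNestSetSubMats() becomes a no-op for, e.g., MPIAIJ matrices. */
    PetscErrorCode MatNestSetSubMats(Mat A,PetscInt nr,const IS is_row[],
                                     PetscInt nc,const IS is_col[],const Mat a[])
    {
      PetscErrorCode ierr;

      PetscFunctionBegin;
      ierr = PetscTryMethod(A,"MatNestSetSubMats_C",
                            (Mat,PetscInt,const IS[],PetscInt,const IS[],const Mat[]),
                            (A,nr,is_row,nc,is_col,a));CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }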


> 2) with the isg in a proper array, I get the same "Invalid
> Pointer to Object" error, see below.
>

Can you send a small example that we can run? It obviously should work this
way.

  Thanks,

     Matt


> Chris
>
>
> $ cat mattry.F90
> program mattry
>
>   use petscksp
>   implicit none
> #include <petsc/finclude/petsckspdef.h>
>
>   PetscInt :: n=4   ! setting 4 cells per process
>
>   PetscErrorCode         :: ierr
>   PetscInt               :: size,rank,i
>   Mat                    :: A,A02
>   MatType                :: type
>   IS                     :: isg(3)
>   IS                     :: isl(3)
>   ISLocalToGlobalMapping :: map
>
>   integer, allocatable, dimension(:) :: idx
>
>   call PetscInitialize(PETSC_NULL_CHARACTER,ierr); CHKERRQ(ierr)
>   call MPI_Comm_size(PETSC_COMM_WORLD,size,ierr); CHKERRQ(ierr)
>   call MPI_Comm_rank(PETSC_COMM_WORLD,rank,ierr);CHKERRQ(ierr)
>
>   ! local index sets for 3 fields
>   allocate(idx(n))
>   idx=(/ (i-1, i=1,n) /)
>   call ISCreateGeneral(PETSC_COMM_WORLD,n,idx,PETSC_COPY_VALUES,isl(1),ierr); CHKERRQ(ierr)
>   call ISCreateGeneral(PETSC_COMM_WORLD,n,idx+n,PETSC_COPY_VALUES,isl(2),ierr); CHKERRQ(ierr)
>   call ISCreateGeneral(PETSC_COMM_WORLD,n,idx+2*n,PETSC_COPY_VALUES,isl(3),ierr); CHKERRQ(ierr)
> !  call ISView(isl(3),PETSC_VIEWER_STDOUT_WORLD,ierr); CHKERRQ(ierr)
>   deallocate(idx)
>
>   ! global index sets for 3 fields
>   allocate(idx(n))
>   idx=(/ (i-1+rank*3*n, i=1,n) /)
>   call ISCreateGeneral(PETSC_COMM_WORLD,n,idx,PETSC_COPY_VALUES,isg(1),ierr); CHKERRQ(ierr)
>   call ISCreateGeneral(PETSC_COMM_WORLD,n,idx+n,PETSC_COPY_VALUES,isg(2),ierr); CHKERRQ(ierr)
>   call ISCreateGeneral(PETSC_COMM_WORLD,n,idx+2*n,PETSC_COPY_VALUES,isg(3),ierr); CHKERRQ(ierr)
> !  call ISView(isg(3),PETSC_VIEWER_STDOUT_WORLD,ierr); CHKERRQ(ierr)
>   deallocate(idx)
>
>   ! local-to-global mapping
>   allocate(idx(3*n))
>   idx=(/ (i-1+rank*3*n, i=1,3*n) /)
>   call ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD,1,3*n,idx,PETSC_COPY_VALUES,map,ierr); CHKERRQ(ierr)
> !  call ISLocalToGlobalMappingView(map,PETSC_VIEWER_STDOUT_WORLD,ierr); CHKERRQ(ierr)
>   deallocate(idx)
>
>   ! create the 3-by-3 block matrix
>   call MatCreate(PETSC_COMM_WORLD,A,ierr); CHKERRQ(ierr)
>   call MatSetSizes(A,3*n,3*n,PETSC_DECIDE,PETSC_DECIDE,ierr); CHKERRQ(ierr)
>   call MatSetUp(A,ierr); CHKERRQ(ierr)
>   call MatSetOptionsPrefix(A,"A_",ierr); CHKERRQ(ierr)
>   call MatSetLocalToGlobalMapping(A,map,map,ierr); CHKERRQ(ierr)
>   call MatSetFromOptions(A,ierr); CHKERRQ(ierr)
>
>   ! setup nest
>   call MatNestSetSubMats(A,3,isg,3,isg,PETSC_NULL_OBJECT,ierr); CHKERRQ(ierr)
>
>   ! set diagonal of block A02 to 0.65
>   call MatGetLocalSubMatrix(A,isl(1),isl(3),A02,ierr); CHKERRQ(ierr)
>   do i=1,n
>      call MatSetValuesLocal(A02,1,i-1,1,i-1,0.65d0,INSERT_VALUES,ierr); CHKERRQ(ierr)
>   end do
>   call MatRestoreLocalSubMatrix(A,isl(1),isl(3),A02,ierr); CHKERRQ(ierr)
>   call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr); CHKERRQ(ierr)
>   call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr); CHKERRQ(ierr)
>
>   ! verify
>   call MatGetSubMatrix(A,isg(1),isg(3),MAT_INITIAL_MATRIX,A02,ierr); CHKERRQ(ierr)
>   call MatView(A02,PETSC_VIEWER_STDOUT_WORLD,ierr);CHKERRQ(ierr)
>
>   call PetscFinalize(ierr)
>
> end program mattry
>
>
> $ mpiexec -n 2 ./mattry -A_mat_type mpiaij
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: No support for this operation for this object type
> [0]PETSC ERROR: Cannot locate function MatNestSetSubMats_C in object
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.7.3, Jul, 24, 2016
> [0]PETSC ERROR: ./mattry on a linux_64bit_debug named lin0322.marin.local by cklaij Mon Aug  1 15:42:10 2016
> [0]PETSC ERROR: Configure options
> --with-mpi-dir=/home/cklaij/ReFRESCO/Dev/trunk/Libs/install/openmpi/1.8.7
> --with-clanguage=c++ --with-x=1 --with-debugging=1
> --with-blas-lapack-dir=/opt/intel/composer_xe_2015.1.133/mkl
> --with-shared-libraries=0
> [0]PETSC ERROR: #1 MatNestSetSubMats() line 1105 in
> /home/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc/3.7.3-dbg/src/mat/impls/nest/matnest.c
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 56.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> [1]PETSC ERROR:
> ------------------------------------------------------------------------
> [1]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the
> batch system) has told this process to end
> [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [1]PETSC ERROR: or see
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS
> X to find memory corruption errors
> [1]PETSC ERROR: likely location of problem given in stack below
> [1]PETSC ERROR: ---------------------  Stack Frames
> ------------------------------------
> [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not
> available,
> [1]PETSC ERROR:       INSTEAD the line number of the start of the function
> [1]PETSC ERROR:       is given.
> [1]PETSC ERROR: [1] PetscSleep line 35
> /home/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc/3.7.3-dbg/src/sys/utils/psleep.c
> [1]PETSC ERROR: [1] PetscTraceBackErrorHandler line 193
> /home/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc/3.7.3-dbg/src/sys/error/errtrace.c
> [1]PETSC ERROR: [1] PetscError line 364
> /home/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc/3.7.3-dbg/src/sys/error/err.c
> [1]PETSC ERROR: [1] MatNestSetSubMats line 1092
> /home/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc/3.7.3-dbg/src/mat/impls/nest/matnest.c
> [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [1]PETSC ERROR: Signal received
> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [1]PETSC ERROR: Petsc Release Version 3.7.3, Jul, 24, 2016
> [1]PETSC ERROR: ./mattry on a linux_64bit_debug named lin0322.marin.local by cklaij Mon Aug  1 15:42:10 2016
> [1]PETSC ERROR: Configure options
> --with-mpi-dir=/home/cklaij/ReFRESCO/Dev/trunk/Libs/install/openmpi/1.8.7
> --with-clanguage=c++ --with-x=1 --with-debugging=1
> --with-blas-lapack-dir=/opt/intel/composer_xe_2015.1.133/mkl
> --with-shared-libraries=0
> [1]PETSC ERROR: #1 User provided function() line 0 in  unknown file
> [lin0322.marin.local:19455] 1 more process has sent help message
> help-mpi-api.txt / mpi-abort
> [lin0322.marin.local:19455] Set MCA parameter "orte_base_help_aggregate"
> to 0 to see all help / error messages
>
>
> $ mpiexec -n 2 ./mattry -A_mat_type nest
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Corrupt argument:
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: Invalid Pointer to Object: Parameter # 1
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [1]PETSC ERROR: Corrupt argument:
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [1]PETSC ERROR: Invalid Pointer to Object: Parameter # 1
> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [1]PETSC ERROR: Petsc Release Version 3.7.3, Jul, 24, 2016
> [1]PETSC ERROR: ./mattry on a linux_64bit_debug named lin0322.marin.local by cklaij Mon Aug  1 15:42:25 2016
> [1]PETSC ERROR: Configure options
> --with-mpi-dir=/home/cklaij/ReFRESCO/Dev/trunk/Libs/install/openmpi/1.8.7
> --with-clanguage=c++ --with-x=1 --with-debugging=1
> --with-blas-lapack-dir=/opt/intel/composer_xe_2015.1.133/mkl
> --with-shared-libraries=0
> [1]PETSC ERROR: #1 PetscObjectReference() line 534 in
> /home/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc/3.7.3-dbg/src/sys/objects/inherit.c
> [1]PETSC ERROR: #2 MatNestSetSubMats_Nest() line 1042 in
> /home/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc/3.7.3-dbg/src/mat/impls/nest/matnest.c
> [1]PETSC ERROR: #3 MatNestSetSubMats() line 1105 in
> /home/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc/3.7.3-dbg/src/mat/impls/nest/matnest.c
> [0]PETSC ERROR: Petsc Release Version 3.7.3, Jul, 24, 2016
> [0]PETSC ERROR: ./mattry on a linux_64bit_debug named lin0322.marin.local by cklaij Mon Aug  1 15:42:25 2016
> [0]PETSC ERROR: Configure options
> --with-mpi-dir=/home/cklaij/ReFRESCO/Dev/trunk/Libs/install/openmpi/1.8.7
> --with-clanguage=c++ --with-x=1 --with-debugging=1
> --with-blas-lapack-dir=/opt/intel/composer_xe_2015.1.133/mkl
> --with-shared-libraries=0
> [0]PETSC ERROR: #1 PetscObjectReference() line 534 in
> /home/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc/3.7.3-dbg/src/sys/objects/inherit.c
> [0]PETSC ERROR: #2 MatNestSetSubMats_Nest() line 1042 in
> /home/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc/3.7.3-dbg/src/mat/impls/nest/matnest.c
> [0]PETSC ERROR: #3 MatNestSetSubMats() line 1105 in
> /home/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc/3.7.3-dbg/src/mat/impls/nest/matnest.c
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
> with errorcode 64.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> [lin0322.marin.local:19465] 1 more process has sent help message
> help-mpi-api.txt / mpi-abort
> [lin0322.marin.local:19465] Set MCA parameter "orte_base_help_aggregate"
> to 0 to see all help / error messages
> $
>
>
> dr. ir. Christiaan Klaij  | CFD Researcher | Research & Development
> MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl
>
> MARIN news:
> http://www.marin.nl/web/News/News-items/Ship-design-in-EU-project-Holiship.htm
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener