[petsc-users] Understanding index sets for PCGASM

Matthew Knepley knepley at gmail.com
Tue May 9 09:24:38 CDT 2023


On Tue, May 9, 2023 at 10:05 AM LEONARDO MUTTI <
leonardo.mutti01 at universitadipavia.it> wrote:

> Great, thanks! I can now successfully run
> https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/ksp/tests/ex71f.F90.
>
> Going forward with my experiments, let me post a new code snippet (very
> similar to ex71f.F90) that I cannot get to work; I am probably setting
> up the IS objects incorrectly.
>
> I have an 8x8 matrix A=diag(1,1,2,2,...,2) and a vector b=(0.5,...,0.5).
> We have only one processor, and I want to solve Ax=b using GASM. In
> particular, KSP is set to preonly, GASM is the preconditioner, and on
> each submatrix it uses a direct LU solver (sub_ksp = preonly, sub_pc = lu).
>
> For the GASM algorithm, I divide A into diag(1,1) and diag(2,2,...,2). For
> simplicity I set 0 overlap. Now I want to use GASM to solve Ax=b. The code
> follows.
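>
> (Aside: the same solver configuration could equivalently be requested at
> run time through the options database, e.g.
>
>    -ksp_type preonly -pc_type gasm -sub_ksp_type preonly -sub_pc_type lu
>
> but below everything is set programmatically.)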
>
> #include <petsc/finclude/petscmat.h>
> #include <petsc/finclude/petscksp.h>
> #include <petsc/finclude/petscpc.h>
>       USE petscmat
>       USE petscksp
>       USE petscpc
>       USE MPI
>
>       Mat   :: A
>       Vec   :: b, x
>       PetscInt :: M, I, J, ISLen, NSub
>       PetscMPIInt :: size
>       PetscErrorCode :: ierr
>       PetscScalar :: v
>       KSP            :: ksp
>       PC             :: pc
>       IS :: subdomains_IS(2), inflated_IS(2)
>       PetscInt,DIMENSION(4) :: indices_first_domain
>       PetscInt,DIMENSION(36) :: indices_second_domain
>
>       call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
>       call MPI_Comm_size(PETSC_COMM_WORLD, size, ierr)
>
>
>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>       ! INTRO: create matrix and right hand side
>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>
>       WRITE(*,*) "Assembling A,b"
>
>       M = 8
>       call MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
>      & M, M, PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,
>      & PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,A, ierr)
>       DO I=1,M
>          DO J=1,M
>             IF ((I .EQ. J) .AND. (I .LE. 2 )) THEN
>                v = 1
>             ELSE IF ((I .EQ. J) .AND. (I .GT. 2 )) THEN
>                v = 2
>             ELSE
>                v = 0
>             ENDIF
>             call MatSetValue(A, I-1, J-1, v, INSERT_VALUES, ierr)
>          END DO
>       END DO
>
>       call MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY, ierr)
>       call MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY, ierr)
>
>       call VecCreate(PETSC_COMM_WORLD,b,ierr)
>       call VecSetSizes(b, PETSC_DECIDE, M,ierr)
>       call VecSetFromOptions(b,ierr)
>
>       do I=1,M
>          v = 0.5
>          call VecSetValue(b,I-1,v, INSERT_VALUES,ierr)
>       end do
>
>       call VecAssemblyBegin(b,ierr)
>       call VecAssemblyEnd(b,ierr)
>
>
>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>       ! FIRST KSP/PC SETUP
>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>
>       WRITE(*,*) "KSP/PC first setup"
>
>       call KSPCreate(PETSC_COMM_WORLD, ksp, ierr)
>       call KSPSetOperators(ksp, A, A, ierr)
>       call KSPSetType(ksp, 'preonly', ierr)
>       call KSPGetPC(ksp, pc, ierr)
>       call KSPSetUp(ksp, ierr)
>       call PCSetType(pc, PCGASM, ierr)
>
>
>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>       ! GASM, SETTING SUBDOMAINS
>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>
>       WRITE(*,*) "Setting GASM subdomains"
>
>       ! Let's create the subdomain IS and inflated_IS
>       ! They are equal if no overlap is present
>       ! They are 1: 0,1,8,9
>       !          2: 18,...,23,26,...,31,...,58,...,63
>
>       indices_first_domain = [0,1,8,9] ! corresponds to diag(1,1)
>       do I=0,5
>          do J=0,5
>             ! corresponds to diag(2,2,...,2)
>             indices_second_domain(I*6+1+J) = 18 + J + 8*I
>             !WRITE(*,*) I*6+1+J, 18 + J + 8*I
>          end do
>       end do
>
>       ! Convert into IS
>       ISLen = 4
>       call ISCreateGeneral(PETSC_COMM_WORLD,ISLen,indices_first_domain,
>      & PETSC_COPY_VALUES, subdomains_IS(1), ierr)
>       call ISCreateGeneral(PETSC_COMM_WORLD,ISLen,indices_first_domain,
>      & PETSC_COPY_VALUES, inflated_IS(1), ierr)
>       ISLen = 36
>       call ISCreateGeneral(PETSC_COMM_WORLD,ISLen,indices_second_domain,
>      & PETSC_COPY_VALUES, subdomains_IS(2), ierr)
>       call ISCreateGeneral(PETSC_COMM_WORLD,ISLen,indices_second_domain,
>      & PETSC_COPY_VALUES, inflated_IS(2), ierr)
>
>       NSub = 2
>       call PCGASMSetSubdomains(pc,NSub,
>      & subdomains_IS,inflated_IS,ierr)
>       call PCGASMDestroySubdomains(NSub,
>      & subdomains_IS,inflated_IS,ierr)
>
>
>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>       ! GASM: SET SUBSOLVERS
>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>
>       WRITE(*,*) "Setting subsolvers for GASM"
>
>       call PCSetUp(pc, ierr) ! should I add this?
>
>       call PetscOptionsSetValue(PETSC_NULL_OPTIONS,
>      & "-sub_pc_type", "lu", ierr)
>       call PetscOptionsSetValue(PETSC_NULL_OPTIONS,
>      & "-sub_ksp_type", "preonly", ierr)
>
>       call KSPSetFromOptions(ksp, ierr)
>       call PCSetFromOptions(pc, ierr)
>
>
>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>       ! DUMMY SOLUTION: DID IT WORK?
>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>
>       WRITE(*,*) "Solve"
>
>       call VecDuplicate(b,x,ierr)
>       call KSPSolve(ksp,b,x,ierr)
>
>       call MatDestroy(A, ierr)
>       call KSPDestroy(ksp, ierr)
>       call PetscFinalize(ierr)
>
> This code fails at multiple points. At call PCSetUp(pc, ierr) it
> produces:
>
> [0]PETSC ERROR: Argument out of range
> [0]PETSC ERROR: Scatter indices in ix are out of range
> ...
> [0]PETSC ERROR: #1 VecScatterCreate() at
> ***\src\vec\is\sf\INTERF~1\vscat.c:736
> [0]PETSC ERROR: #2 PCSetUp_GASM() at ***\src\ksp\pc\impls\gasm\gasm.c:433
> [0]PETSC ERROR: #3 PCSetUp() at ***\src\ksp\pc\INTERF~1\precon.c:994
>
> And at call KSPSolve(ksp,b,x,ierr) it produces:
>
> forrtl: severe (157): Program Exception - access violation
>
>
> The index sets are set up consistently with the outputs of, e.g.,
> https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/ksp/tests/output/ex71f_1.out:
> in particular, each element of the matrix A corresponds to a number from
> 0 to 63.
>

This is not correct, I believe. The indices are row/col indices, not
indices into dense blocks, so for your example they all lie in [0, 8).
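
Concretely, a minimal sketch (illustrative only, not tested) of what the
two index sets could look like under this interpretation, reusing the
declarations from your snippet:

      ! Row indices, not indices into the 64 dense entries:
      ! subdomain 1 = rows {0,1}  -> the diag(1,1) block
      ! subdomain 2 = rows {2..7} -> the diag(2,...,2) block
      indices_first_domain(1:2) = [0,1]
      do I=0,5
         indices_second_domain(I+1) = 2 + I
      end do
      ISLen = 2
      call ISCreateGeneral(PETSC_COMM_WORLD,ISLen,indices_first_domain,
     & PETSC_COPY_VALUES,subdomains_IS(1),ierr)
      ISLen = 6
      call ISCreateGeneral(PETSC_COMM_WORLD,ISLen,indices_second_domain,
     & PETSC_COPY_VALUES,subdomains_IS(2),ierr)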

  Thanks,

     Matt


> Note that each submatrix does not represent a physical subdomain; the
> subdivision is purely algebraic.
> I thus have the following questions:
>
>    - is this the correct way of creating the IS objects, given my
>    objective stated at the beginning of the email? Is the ordering correct?
>    - what am I doing wrong that generates the above errors?
>
> Thanks for the patience and the time.
> Best,
> Leonardo
>
> On Fri, May 5, 2023 at 6:43 PM Barry Smith <bsmith at petsc.dev> wrote:
>
>>
>>   Added in barry/2023-05-04/add-pcgasm-set-subdomains; see also
>> https://gitlab.com/petsc/petsc/-/merge_requests/6419
>>
>>   Barry
>>
>>
>> On May 4, 2023, at 11:23 AM, LEONARDO MUTTI <
>> leonardo.mutti01 at universitadipavia.it> wrote:
>>
>> Thank you for the help.
>> Adding to my example:
>>
>>
>>       call PCGASMSetSubdomains(pc,NSub,subdomains_IS,inflated_IS,ierr)
>>       call PCGASMDestroySubdomains(NSub,subdomains_IS,inflated_IS,ierr)
>>
>> results in:
>>
>>       Error LNK2019 unresolved external symbol PCGASMDESTROYSUBDOMAINS
>> referenced in function ...
>>
>>       Error LNK2019 unresolved external symbol PCGASMSETSUBDOMAINS
>> referenced in function ...
>> I'm not sure if the interfaces are missing or if I have a compilation
>> problem.
>> Thank you again.
>> Best,
>> Leonardo
>>
>> On Sat, Apr 29, 2023 at 8:30 PM Barry Smith <bsmith at petsc.dev> wrote:
>>
>>>
>>>    Thank you for the test code. I have a fix in the branch
>>> barry/2023-04-29/fix-pcasmcreatesubdomains2d with
>>> merge request https://gitlab.com/petsc/petsc/-/merge_requests/6394
>>>
>>>    The functions did not have proper Fortran stubs and interfaces so I
>>> had to provide them manually in the new branch.
>>>
>>>    Use
>>>
>>>    git fetch
>>>    git checkout barry/2023-04-29/fix-pcasmcreatesubdomains2d
>>>    ./configure etc
>>>
>>>    Your now-working test code is in src/ksp/ksp/tests/ex71f.F90. I had
>>> to change things slightly, and I updated the error handling for the
>>> latest version.
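>>>
>>>    (For illustration, a sketch of the updated error-handling style in
>>> recent PETSc: each Fortran call is wrapped in PetscCallA, e.g.
>>>
>>>       PetscCallA(KSPCreate(PETSC_COMM_WORLD, ksp, ierr))
>>>       PetscCallA(KSPSetOperators(ksp, A, A, ierr))
>>>
>>> see the test source for the exact form.)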
>>>
>>>    Please let us know if you have any further questions.
>>>
>>>   Barry
>>>
>>>
>>>
>>>
>>> On Apr 28, 2023, at 12:07 PM, LEONARDO MUTTI <
>>> leonardo.mutti01 at universitadipavia.it> wrote:
>>>
>>> Hello. I am having a hard time understanding the index sets to feed to
>>> PCGASMSetSubdomains, and I am working in Fortran (as a PETSc novice). To
>>> get more intuition on how the IS objects behave, I tried the following
>>> minimal (non-)working example, which should tile a 16x16 matrix into 16
>>> square, non-overlapping submatrices:
>>>
>>> #include <petsc/finclude/petscmat.h>
>>> #include <petsc/finclude/petscksp.h>
>>> #include <petsc/finclude/petscpc.h>
>>>       USE petscmat
>>>       USE petscksp
>>>       USE petscpc
>>>
>>>       Mat   :: A
>>>       PetscInt :: M, NSubx, dof, overlap, NSub
>>>       INTEGER :: I,J
>>>       PetscErrorCode      :: ierr
>>>       PetscScalar :: v
>>>       KSP            :: ksp
>>>       PC             :: pc
>>>       IS :: subdomains_IS, inflated_IS
>>>
>>>       call PetscInitialize(PETSC_NULL_CHARACTER , ierr)
>>>
>>> !-----Create a dummy matrix
>>>       M = 16
>>>       call MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
>>>      & M, M,
>>>      & PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,
>>>      & PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,
>>>      & A, ierr)
>>>
>>>       DO I=1,M
>>>          DO J=1,M
>>>             v = I*J
>>>             CALL MatSetValue (A,I-1,J-1,v,
>>>      &                     INSERT_VALUES , ierr)
>>>          END DO
>>>       END DO
>>>
>>>       call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY , ierr)
>>>       call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY , ierr)
>>>
>>> !-----Create KSP and PC
>>>       call KSPCreate(PETSC_COMM_WORLD,ksp, ierr)
>>>       call KSPSetOperators(ksp,A,A, ierr)
>>>       call KSPSetType(ksp,"bcgs",ierr)
>>>       call KSPGetPC(ksp,pc,ierr)
>>>       call KSPSetUp(ksp, ierr)
>>>       call PCSetType(pc,PCGASM, ierr)
>>>       call PCSetUp(pc , ierr)
>>>
>>> !-----GASM setup
>>>       NSubx = 4
>>>       dof = 1
>>>       overlap = 0
>>>
>>>       call PCGASMCreateSubdomains2D(pc,
>>>      &      M, M,
>>>      &      NSubx, NSubx,
>>>      &      dof, overlap,
>>>      &      NSub, subdomains_IS, inflated_IS, ierr)
>>>
>>>       call ISView(subdomains_IS, PETSC_VIEWER_STDOUT_WORLD, ierr)
>>>
>>>       call KSPDestroy(ksp, ierr)
>>>       call PetscFinalize(ierr)
>>>
>>> Running this on one processor, I get NSub = 4.
>>> If PCASM and PCASMCreateSubdomains2D are used instead, I get NSub = 16
>>> as expected.
>>> Moreover, at the end I get "forrtl: severe (157): Program Exception -
>>> access violation". So:
>>> 1) why do I get two different results with ASM and GASM?
>>> 2) why do I get an access violation and how can I solve it?
>>> In fact, in C, subdomains_IS and inflated_IS should be pointers to IS
>>> objects. As I see in the Fortran interface, the arguments to
>>> PCGASMCreateSubdomains2D are IS objects:
>>>
>>>        subroutine PCGASMCreateSubdomains2D(a,b,c,d,e,f,g,h,i,j,z)
>>>        import tPC,tIS
>>>        PC a ! PC
>>>        PetscInt b ! PetscInt
>>>        PetscInt c ! PetscInt
>>>        PetscInt d ! PetscInt
>>>        PetscInt e ! PetscInt
>>>        PetscInt f ! PetscInt
>>>        PetscInt g ! PetscInt
>>>        PetscInt h ! PetscInt
>>>        IS i ! IS
>>>        IS j ! IS
>>>        PetscErrorCode z
>>>        end subroutine PCGASMCreateSubdomains2D
>>> Thus:
>>> 3) what should be inside, e.g., subdomains_IS? I expect it to contain,
>>> for every created subdomain, the list of rows and columns defining the
>>> subblock in the matrix; am I right?
>>>
>>> Context: I have a block-tridiagonal system arising from space-time
>>> finite elements, and I want to solve it with GMRES plus a PCGASM
>>> preconditioner, where each overlapping submatrix lies on the diagonal,
>>> is 3x3 blocks in size, and may span multiple processes. This is PETSc
>>> 3.17.1 on Windows.
>>>
>>> Thanks in advance,
>>> Leonardo
>>>
>>>
>>>
>>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/