[petsc-users] Understanding index sets for PCGASM

LEONARDO MUTTI leonardo.mutti01 at universitadipavia.it
Tue May 9 11:44:57 CDT 2023


A partial correction to a typo in my previous message: I expect 9x(16+16)
numbers to be stored in subdomain_IS, i.e. #subdomains x (row indices of the
submatrix + column indices of the submatrix).

On Tue, May 9, 2023 at 6:31 PM LEONARDO MUTTI <
leonardo.mutti01 at universitadipavia.it> wrote:

>
>
> ---------- Forwarded message ---------
> From: LEONARDO MUTTI <leonardo.mutti01 at universitadipavia.it>
> Date: Tue, May 9, 2023 at 6:29 PM
> Subject: Re: [petsc-users] Understanding index sets for PCGASM
> To: Matthew Knepley <knepley at gmail.com>
>
>
> Thank you for your answer, but I am still confused, sorry.
> Consider
> https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/ksp/tests/ex71f.F90 on
> one processor.
> Let M = 12 for the sake of simplicity, i.e. we deal with a 12x12 2D grid
> and hence a 144x144 matrix.
> Let NSubx = 3, so that the grid is split into 3 vertical and 3 horizontal
> subdivisions.
> We should obtain 9 subdomains that are grids of 4x4 nodes each, thus
> corresponding to 9 submatrices of size 16x16.
> In my run I obtain NSub = 9 (great) and subdomain_IS(i), i=1,...,9, reads:
>
> IS Object: 1 MPI process
>   type: general
> Number of indices in set 16
> 0 0
> 1 1
> 2 2
> 3 3
> 4 12
> 5 13
> 6 14
> 7 15
> 8 24
> 9 25
> 10 26
> 11 27
> 12 36
> 13 37
> 14 38
> 15 39
> IS Object: 1 MPI process
>   type: general
> Number of indices in set 16
> 0 4
> 1 5
> 2 6
> 3 7
> 4 16
> 5 17
> 6 18
> 7 19
> 8 28
> 9 29
> 10 30
> 11 31
> 12 40
> 13 41
> 14 42
> 15 43
> IS Object: 1 MPI process
>   type: general
> Number of indices in set 16
> 0 8
> 1 9
> 2 10
> 3 11
> 4 20
> 5 21
> 6 22
> 7 23
> 8 32
> 9 33
> 10 34
> 11 35
> 12 44
> 13 45
> 14 46
> 15 47
> IS Object: 1 MPI process
>   type: general
> Number of indices in set 16
> 0 48
> 1 49
> 2 50
> 3 51
> 4 60
> 5 61
> 6 62
> 7 63
> 8 72
> 9 73
> 10 74
> 11 75
> 12 84
> 13 85
> 14 86
> 15 87
> IS Object: 1 MPI process
>   type: general
> Number of indices in set 16
> 0 52
> 1 53
> 2 54
> 3 55
> 4 64
> 5 65
> 6 66
> 7 67
> 8 76
> 9 77
> 10 78
> 11 79
> 12 88
> 13 89
> 14 90
> 15 91
> IS Object: 1 MPI process
>   type: general
> Number of indices in set 16
> 0 56
> 1 57
> 2 58
> 3 59
> 4 68
> 5 69
> 6 70
> 7 71
> 8 80
> 9 81
> 10 82
> 11 83
> 12 92
> 13 93
> 14 94
> 15 95
> IS Object: 1 MPI process
>   type: general
> Number of indices in set 16
> 0 96
> 1 97
> 2 98
> 3 99
> 4 108
> 5 109
> 6 110
> 7 111
> 8 120
> 9 121
> 10 122
> 11 123
> 12 132
> 13 133
> 14 134
> 15 135
> IS Object: 1 MPI process
>   type: general
> Number of indices in set 16
> 0 100
> 1 101
> 2 102
> 3 103
> 4 112
> 5 113
> 6 114
> 7 115
> 8 124
> 9 125
> 10 126
> 11 127
> 12 136
> 13 137
> 14 138
> 15 139
> IS Object: 1 MPI process
>   type: general
> Number of indices in set 16
> 0 104
> 1 105
> 2 106
> 3 107
> 4 116
> 5 117
> 6 118
> 7 119
> 8 128
> 9 129
> 10 130
> 11 131
> 12 140
> 13 141
> 14 142
> 15 143
>
> As you said, no number here reaches 144.
> But the count of numbers stored in subdomain_IS is 9x16 = #subdomains x 16,
> whereas I would expect, also given your latest reply, 9x16x16x2 =
> #subdomains x submatrix height x submatrix width x length of a (row, column) pair.
> It would really help me if you could briefly explain how the output above
> encodes the subdivision into subdomains.
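>
> For what it is worth, my best guess: the first set above is exactly what I
> obtain if each grid node (r,c) is numbered idx = r*12 + c and the set lists
> the node indices belonging to one 4x4 subdomain, i.e. one index per node
> rather than one (row, column) pair per matrix entry. A minimal sketch of
> that guess (the variable names r, c are mine):
>
>       INTEGER :: r, c
>       DO r = 0, 3
>          DO c = 0, 3
>             ! prints "position  node index", matching the first IS above
>             WRITE(*,*) r*4 + c, r*12 + c
>          END DO
>       END DO
>
> If that reading is right, it would also explain the 9x16 count.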
> Many thanks again,
> Leonardo
>
>
>
> On Tue, May 9, 2023 at 4:24 PM Matthew Knepley <knepley at gmail.com>
> wrote:
>
>> On Tue, May 9, 2023 at 10:05 AM LEONARDO MUTTI <
>> leonardo.mutti01 at universitadipavia.it> wrote:
>>
>>> Great thanks! I can now successfully run
>>> https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/ksp/tests/ex71f.F90.
>>>
>>> Going forward with my experiments, let me post a new code snippet (very
>>> similar to ex71f.F90) that I cannot get to work; probably I am setting up
>>> the IS objects incorrectly.
>>>
>>> I have an 8x8 matrix A = diag(1,1,2,2,...,2) and a vector
>>> b = (0.5,...,0.5). We have only one processor, and I want to solve
>>> Ax = b using GASM. In particular, KSP is set to preonly, GASM is the
>>> preconditioner, and it uses an LU direct solver on each submatrix
>>> (sub_ksp = preonly, sub_pc = lu).
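>>>
>>> Note that since A is diagonal, the exact solution is
>>> x = (0.5, 0.5, 0.25, ..., 0.25), which gives me an easy check on the
>>> result.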
>>>
>>> For the GASM algorithm, I divide A into diag(1,1) and diag(2,2,...,2).
>>> For simplicity I set 0 overlap. Now I want to use GASM to solve Ax=b. The
>>> code follows.
>>>
>>> #include <petsc/finclude/petscmat.h>
>>> #include <petsc/finclude/petscksp.h>
>>> #include <petsc/finclude/petscpc.h>
>>>       USE petscmat
>>>       USE petscksp
>>>       USE petscpc
>>>       USE MPI
>>>
>>>       Mat   :: A
>>>       Vec   :: b, x
>>>       PetscInt :: M, I, J, ISLen, NSub
>>>       PetscMPIInt :: size
>>>       PetscErrorCode :: ierr
>>>       PetscScalar :: v
>>>       KSP            :: ksp
>>>       PC             :: pc
>>>       IS :: subdomains_IS(2), inflated_IS(2)
>>>       PetscInt,DIMENSION(4) :: indices_first_domain
>>>       PetscInt,DIMENSION(36) :: indices_second_domain
>>>
>>>       call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
>>>       call MPI_Comm_size(PETSC_COMM_WORLD, size, ierr)
>>>
>>>
>>>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>       ! INTRO: create matrix and right hand side
>>>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>
>>>       WRITE(*,*) "Assembling A,b"
>>>
>>>       M = 8
>>>       call MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
>>>      & M, M, PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,
>>>      & PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,A, ierr)
>>>       DO I=1,M
>>>          DO J=1,M
>>>             IF ((I .EQ. J) .AND. (I .LE. 2 )) THEN
>>>                v = 1
>>>             ELSE IF ((I .EQ. J) .AND. (I .GT. 2 )) THEN
>>>                v = 2
>>>             ELSE
>>>                v = 0
>>>             ENDIF
>>>             call MatSetValue(A, I-1, J-1, v, INSERT_VALUES, ierr)
>>>          END DO
>>>       END DO
>>>
>>>       call MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY, ierr)
>>>       call MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY, ierr)
>>>
>>>       call VecCreate(PETSC_COMM_WORLD,b,ierr)
>>>       call VecSetSizes(b, PETSC_DECIDE, M,ierr)
>>>       call VecSetFromOptions(b,ierr)
>>>
>>>       do I=1,M
>>>          v = 0.5
>>>          call VecSetValue(b,I-1,v, INSERT_VALUES,ierr)
>>>       end do
>>>
>>>       call VecAssemblyBegin(b,ierr)
>>>       call VecAssemblyEnd(b,ierr)
>>>
>>>
>>>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>       ! FIRST KSP/PC SETUP
>>>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>
>>>       WRITE(*,*) "KSP/PC first setup"
>>>
>>>       call KSPCreate(PETSC_COMM_WORLD, ksp, ierr)
>>>       call KSPSetOperators(ksp, A, A, ierr)
>>>       call KSPSetType(ksp, 'preonly', ierr)
>>>       call KSPGetPC(ksp, pc, ierr)
>>>       call KSPSetUp(ksp, ierr)
>>>       call PCSetType(pc, PCGASM, ierr)
>>>
>>>
>>>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>       ! GASM, SETTING SUBDOMAINS
>>>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>
>>>       WRITE(*,*) "Setting GASM subdomains"
>>>
>>>       ! Let's create the subdomain IS and inflated_IS
>>>       ! They are equal if no overlap is present
>>>       ! They are 1: 0,1,8,9
>>>       !          2: 18,...,23, 26,...,31, ..., 58,...,63
>>>
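>>>       ! [0,1,8,9] = entries (0,0), (0,1), (1,0), (1,1) of A,
>>>       ! flattened as 8*row + col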
>>>       indices_first_domain = [0,1,8,9] ! corresponds to diag(1,1)
>>>       do I=0,5
>>>          do J=0,5
>>>             ! corresponds to diag(2,2,...,2)
>>>             indices_second_domain(I*6+1+J) = 18 + J + 8*I
>>>             !WRITE(*,*) I*6+1+J, 18 + J + 8*I
>>>          end do
>>>       end do
>>>
>>>       ! Convert into IS
>>>       ISLen = 4
>>>       call ISCreateGeneral(PETSC_COMM_WORLD,ISLen,indices_first_domain,
>>>      & PETSC_COPY_VALUES, subdomains_IS(1), ierr)
>>>       call ISCreateGeneral(PETSC_COMM_WORLD,ISLen,indices_first_domain,
>>>      & PETSC_COPY_VALUES, inflated_IS(1), ierr)
>>>       ISLen = 36
>>>       call ISCreateGeneral(PETSC_COMM_WORLD,ISLen,indices_second_domain,
>>>      & PETSC_COPY_VALUES, subdomains_IS(2), ierr)
>>>       call ISCreateGeneral(PETSC_COMM_WORLD,ISLen,indices_second_domain,
>>>      & PETSC_COPY_VALUES, inflated_IS(2), ierr)
>>>
>>>       NSub = 2
>>>       call PCGASMSetSubdomains(pc,NSub,
>>>      & subdomains_IS,inflated_IS,ierr)
>>>       call PCGASMDestroySubdomains(NSub,
>>>      & subdomains_IS,inflated_IS,ierr)
>>>
>>>
>>>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>       ! GASM: SET SUBSOLVERS
>>>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>
>>>       WRITE(*,*) "Setting subsolvers for GASM"
>>>
>>>       call PCSetUp(pc, ierr) ! should I add this?
>>>
>>>       call PetscOptionsSetValue(PETSC_NULL_OPTIONS,
>>>      & "-sub_pc_type", "lu", ierr)
>>>       call PetscOptionsSetValue(PETSC_NULL_OPTIONS,
>>>      & "-sub_ksp_type", "preonly", ierr)
>>>
>>>       call KSPSetFromOptions(ksp, ierr)
>>>       call PCSetFromOptions(pc, ierr)
>>>
>>>
>>>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>       ! DUMMY SOLUTION: DID IT WORK?
>>>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>
>>>       WRITE(*,*) "Solve"
>>>
>>>       call VecDuplicate(b,x,ierr)
>>>       call KSPSolve(ksp,b,x,ierr)
>>>
>>>       call MatDestroy(A, ierr)
>>>       call KSPDestroy(ksp, ierr)
>>>       call PetscFinalize(ierr)
>>>
>>> This code is failing in multiple points. At  call PCSetUp(pc, ierr) it
>>> produces:
>>>
>>> [0]PETSC ERROR: Argument out of range
>>> [0]PETSC ERROR: Scatter indices in ix are out of range
>>> ...
>>> [0]PETSC ERROR: #1 VecScatterCreate() at
>>> **\src\vec\is\sf\INTERF~1\vscat.c:736
>>> [0]PETSC ERROR: #2 PCSetUp_GASM() at
>>> **\src\ksp\pc\impls\gasm\gasm.c:433
>>> [0]PETSC ERROR: #3 PCSetUp() at **\src\ksp\pc\INTERF~1\precon.c:994
>>>
>>> And at call KSPSolve(ksp,b,x,ierr) it produces:
>>>
>>> forrtl: severe (157): Program Exception - access violation
>>>
>>>
>>> The index sets are set up coherently with the outputs of e.g.
>>> https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/ksp/tests/output/ex71f_1.out:
>>> in particular, each element of the matrix A corresponds to a number from 0
>>> to 63.
>>>
>>
>> This is not correct, I believe. The indices are row/col indices, not
>> indices into dense blocks, so for your example, they are all in [0, 8).
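>>
>> For instance, a rough sketch of what I mean for your two subdomains
>> (untested; declare indices_first_domain and indices_second_domain with
>> sizes 2 and 6):
>>
>>       indices_first_domain = [0, 1]              ! rows 0,1: diag(1,1)
>>       indices_second_domain = [2, 3, 4, 5, 6, 7] ! rows 2..7: diag(2,...,2)
>>       ISLen = 2
>>       call ISCreateGeneral(PETSC_COMM_WORLD, ISLen, indices_first_domain,
>>      & PETSC_COPY_VALUES, subdomains_IS(1), ierr)
>>       ISLen = 6
>>       call ISCreateGeneral(PETSC_COMM_WORLD, ISLen, indices_second_domain,
>>      & PETSC_COPY_VALUES, subdomains_IS(2), ierr)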
>>
>>   Thanks,
>>
>>      Matt
>>
>>
>>> Note that each submatrix does not represent some physical subdomain; the
>>> subdivision is just at the algebraic level.
>>> I thus have the following questions:
>>>
>>>    - is this the correct way of creating the IS objects, given my
>>>    objective at the beginning of the email? Is the ordering correct?
>>>    - what am I doing wrong that is generating the above errors?
>>>
>>> Thanks for the patience and the time.
>>> Best,
>>> Leonardo
>>>
>>> On Fri, May 5, 2023 at 6:43 PM Barry Smith <bsmith at petsc.dev>
>>> wrote:
>>>
>>>>
>>>>   Added in barry/2023-05-04/add-pcgasm-set-subdomains; see also
>>>> https://gitlab.com/petsc/petsc/-/merge_requests/6419
>>>>
>>>>   Barry
>>>>
>>>>
>>>> On May 4, 2023, at 11:23 AM, LEONARDO MUTTI <
>>>> leonardo.mutti01 at universitadipavia.it> wrote:
>>>>
>>>> Thank you for the help.
>>>> Adding to my example:
>>>>
>>>>
>>>>       call PCGASMSetSubdomains(pc, NSub, subdomains_IS, inflated_IS, ierr)
>>>>       call PCGASMDestroySubdomains(NSub, subdomains_IS, inflated_IS, ierr)
>>>> results in:
>>>>
>>>>       Error LNK2019 unresolved external symbol PCGASMDESTROYSUBDOMAINS referenced in function ...
>>>>       Error LNK2019 unresolved external symbol PCGASMSETSUBDOMAINS referenced in function ...
>>>> I'm not sure if the interfaces are missing or if I have a compilation
>>>> problem.
>>>> Thank you again.
>>>> Best,
>>>> Leonardo
>>>>
>>>> On Sat, Apr 29, 2023 at 8:30 PM Barry Smith <bsmith at petsc.dev>
>>>> wrote:
>>>>
>>>>>
>>>>>    Thank you for the test code. I have a fix in the branch
>>>>> barry/2023-04-29/fix-pcasmcreatesubdomains2d
>>>>> <https://gitlab.com/petsc/petsc/-/commits/barry/2023-04-29/fix-pcasmcreatesubdomains2d> with
>>>>> merge request https://gitlab.com/petsc/petsc/-/merge_requests/6394
>>>>>
>>>>>    The functions did not have proper Fortran stubs and interfaces so I
>>>>> had to provide them manually in the new branch.
>>>>>
>>>>>    Use
>>>>>
>>>>>    git fetch
>>>>>    git checkout barry/2023-04-29/fix-pcasmcreatesubdomains2d
>>>>>    ./configure etc
>>>>>
>>>>>    Your now working test code is in src/ksp/ksp/tests/ex71f.F90. I had
>>>>> to change things slightly, and I updated the error handling for the
>>>>> latest version.
>>>>>
>>>>>    Please let us know if you have any later questions.
>>>>>
>>>>>   Barry
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On Apr 28, 2023, at 12:07 PM, LEONARDO MUTTI <
>>>>> leonardo.mutti01 at universitadipavia.it> wrote:
>>>>>
>>>>> Hello. I am having a hard time understanding the index sets to feed
>>>>> PCGASMSetSubdomains, and I am working in Fortran (as a PETSc novice). To
>>>>> get more intuition on how the IS objects behave I tried the following
>>>>> minimal (non) working example, which should tile a 16x16 matrix into 16
>>>>> square, non-overlapping submatrices:
>>>>>
>>>>> #include <petsc/finclude/petscmat.h>
>>>>> #include <petsc/finclude/petscksp.h>
>>>>> #include <petsc/finclude/petscpc.h>
>>>>>       USE petscmat
>>>>>       USE petscksp
>>>>>       USE petscpc
>>>>>
>>>>>       Mat   :: A
>>>>>       PetscInt :: M, NSubx, dof, overlap, NSub
>>>>>       INTEGER :: I,J
>>>>>       PetscErrorCode      :: ierr
>>>>>       PetscScalar :: v
>>>>>       KSP            :: ksp
>>>>>       PC             :: pc
>>>>>       IS :: subdomains_IS, inflated_IS
>>>>>
>>>>>       call PetscInitialize(PETSC_NULL_CHARACTER , ierr)
>>>>>
>>>>> !-----Create a dummy matrix
>>>>>       M = 16
>>>>>       call MatCreateAIJ(MPI_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
>>>>>      & M, M,
>>>>>      & PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,
>>>>>      & PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,
>>>>>      & A, ierr)
>>>>>
>>>>>       DO I=1,M
>>>>>          DO J=1,M
>>>>>             v = I*J
>>>>>             CALL MatSetValue (A,I-1,J-1,v,
>>>>>      &                     INSERT_VALUES , ierr)
>>>>>          END DO
>>>>>       END DO
>>>>>
>>>>>       call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY , ierr)
>>>>>       call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY , ierr)
>>>>>
>>>>> !-----Create KSP and PC
>>>>>       call KSPCreate(PETSC_COMM_WORLD,ksp, ierr)
>>>>>       call KSPSetOperators(ksp,A,A, ierr)
>>>>>       call KSPSetType(ksp,"bcgs",ierr)
>>>>>       call KSPGetPC(ksp,pc,ierr)
>>>>>       call KSPSetUp(ksp, ierr)
>>>>>       call PCSetType(pc,PCGASM, ierr)
>>>>>       call PCSetUp(pc , ierr)
>>>>>
>>>>> !-----GASM setup
>>>>>       NSubx = 4
>>>>>       dof = 1
>>>>>       overlap = 0
>>>>>
>>>>>       call PCGASMCreateSubdomains2D(pc,
>>>>>      &      M, M,
>>>>>      &      NSubx, NSubx,
>>>>>      &      dof, overlap,
>>>>>      &      NSub, subdomains_IS, inflated_IS, ierr)
>>>>>
>>>>>       call ISView(subdomains_IS, PETSC_VIEWER_STDOUT_WORLD, ierr)
>>>>>
>>>>>       call KSPDestroy(ksp, ierr)
>>>>>       call PetscFinalize(ierr)
>>>>>
>>>>> Running this on one processor, I get NSub = 4.
>>>>> If PCASM and PCASMCreateSubdomains2D are used instead, I get NSub = 16
>>>>> as expected.
>>>>> Moreover, at the end I get "forrtl: severe (157): Program Exception -
>>>>> access violation". So:
>>>>> 1) why do I get two different results with ASM and GASM?
>>>>> 2) why do I get an access violation, and how can I solve it?
>>>>> In fact, in C, subdomains_IS and inflated_IS should be pointers to IS
>>>>> objects. As I see in the Fortran interface, the arguments to
>>>>> PCGASMCreateSubdomains2D are IS objects:
>>>>>
>>>>>        subroutine PCGASMCreateSubdomains2D(a,b,c,d,e,f,g,h,i,j,z)
>>>>>        import tPC,tIS
>>>>>        PC a ! PC
>>>>>        PetscInt b ! PetscInt
>>>>>        PetscInt c ! PetscInt
>>>>>        PetscInt d ! PetscInt
>>>>>        PetscInt e ! PetscInt
>>>>>        PetscInt f ! PetscInt
>>>>>        PetscInt g ! PetscInt
>>>>>        PetscInt h ! PetscInt
>>>>>        IS i ! IS
>>>>>        IS j ! IS
>>>>>        PetscErrorCode z
>>>>>        end subroutine PCGASMCreateSubdomains2D
>>>>> Thus:
>>>>> 3) what should be inside, e.g., subdomains_IS? I expect it to contain,
>>>>> for every created subdomain, the list of rows and columns defining the
>>>>> subblock in the matrix. Am I right?
>>>>>
>>>>> Context: I have a block-tridiagonal system arising from space-time
>>>>> finite elements, and I want to solve it with GMRES and a PCGASM
>>>>> preconditioner, where each overlapping submatrix lies on the diagonal
>>>>> and is 3x3 blocks in size (possibly spanning multiple processes). This
>>>>> is PETSc 3.17.1 on Windows.
>>>>>
>>>>> Thanks in advance,
>>>>> Leonardo
>>>>>
>>>>>
>>>>>
>>>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>