[petsc-users] Understanding index sets for PCGASM

LEONARDO MUTTI leonardo.mutti01 at universitadipavia.it
Fri May 5 01:45:11 CDT 2023


Interesting. A priori, I'm not sure this would work better, mainly because
I'd lose the compact band structure.

As for waveform relaxation: I excluded it at first since it appears to
require more CPUs than I have in order to beat sequential solvers; plus,
it is more complicated, and I have very limited time for this project.

For both suggestions, because of the way the space-time matrix is
generated, it is much more convenient for me to mess with the time
dimension than with space.

Overall GASM seems a simpler way to go before trying other things.
Please let me know if you decide to add the GASM interfaces.

Thanks again.
Best,
Leonardo

On Fri, May 5, 2023 at 12:01 AM Matthew Knepley <knepley at gmail.com> wrote:

> On Thu, May 4, 2023 at 1:43 PM LEONARDO MUTTI <
> leonardo.mutti01 at universitadipavia.it> wrote:
>
>> Of course, I'll try to explain.
>>
>> I am solving a parabolic equation with space-time FEM and I want an
>> efficient solver/preconditioner for the resulting system.
>> The corresponding matrix, call it X, has, e.g., a block bidiagonal
>> structure if the cG(1)-dG(0) method is used (i.e., implicit Euler solved
>> in batch).
>> Every block-row of X corresponds to a time instant.
>>
>> I want to introduce parallelism in time by subdividing X into overlapping
>> submatrices of, e.g., 2x2 or 3x3 blocks along the block diagonal.
>> For instance, call X_{i,j} the individual blocks. The submatrices would
>> be, for various i, (X_{i-1,i-1}, X_{i-1,i}; X_{i,i-1}, X_{i,i}).
>> I'd like each submatrix to be solved in parallel and the various results
>> combined in an ASM-like fashion.
>> Every submatrix thus has a predecessor and a successor, and it overlaps
>> with both, so, as far as I understand, GASM has to be used in place of
>> ASM.
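>>
>> For concreteness, a sketch with four time blocks (sizes illustrative):
>>
>> \[
>> X = \begin{pmatrix}
>> X_{1,1} &         &         &         \\
>> X_{2,1} & X_{2,2} &         &         \\
>>         & X_{3,2} & X_{3,3} &         \\
>>         &         & X_{4,3} & X_{4,4}
>> \end{pmatrix},
>> \qquad
>> S_i = \begin{pmatrix}
>> X_{i-1,i-1} & X_{i-1,i} \\
>> X_{i,i-1}   & X_{i,i}
>> \end{pmatrix},
>> \]
>>
>> so that consecutive submatrices S_i and S_{i+1} share the diagonal block
>> X_{i,i}.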
>>
>
> Yes, ordered that way you need GASM. I wonder if inverting the ordering
> would be useful, namely putting the time index on the inside.
> Then the blocks would cover all time but limited space, which is more in
> the spirit of ASM, I think.
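>
> A sketch of what I mean (notation illustrative, not your code): writing
> the unknowns as u_{j,n} for spatial node j and time level n, the two
> orderings are
>
> \[
> (u_{1,1}, \dots, u_{J,1},\, u_{1,2}, \dots) \quad \text{vs.} \quad
> (u_{1,1}, \dots, u_{1,N},\, u_{2,1}, \dots),
> \]
>
> and in the second one an ASM subdomain \{ u_{j,n} : j \in \mathrm{patch},\
> 1 \le n \le N \} covers all time levels for a single spatial patch.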
>
> Have you considered waveform relaxation for this problem?
>
>    Thanks,
>
>      Matt
>
>
>> Hope this helps.
>> Best,
>> Leonardo
>>
>> On Thu, May 4, 2023 at 6:05 PM Matthew Knepley <
>> knepley at gmail.com> wrote:
>>
>>> On Thu, May 4, 2023 at 11:24 AM LEONARDO MUTTI <
>>> leonardo.mutti01 at universitadipavia.it> wrote:
>>>
>>>> Thank you for the help.
>>>> Adding to my example:
>>>>
>>>>
>>>>       call PCGASMSetSubdomains(pc, NSub, subdomains_IS, inflated_IS, ierr)
>>>>       call PCGASMDestroySubdomains(NSub, subdomains_IS, inflated_IS, ierr)
>>>> results in:
>>>>
>>>>       Error LNK2019 unresolved external symbol PCGASMDESTROYSUBDOMAINS
>>>> referenced in function ...
>>>>
>>>>       Error LNK2019 unresolved external symbol PCGASMSETSUBDOMAINS
>>>> referenced in function ...
>>>> I'm not sure if the interfaces are missing or if I have a compilation
>>>> problem.
>>>>
>>>
>>> I just want to make sure you really want GASM. It sounded like you might
>>> be able to do what you want just with ASM.
>>> Can you tell me again what you want to do overall?
>>>
>>>   Thanks,
>>>
>>>     Matt
>>>
>>>
>>>> Thank you again.
>>>> Best,
>>>> Leonardo
>>>>
>>>> On Sat, Apr 29, 2023 at 8:30 PM Barry Smith <bsmith at petsc.dev>
>>>> wrote:
>>>>
>>>>>
>>>>>    Thank you for the test code. I have a fix in the branch
>>>>> barry/2023-04-29/fix-pcasmcreatesubdomains2d, with merge request
>>>>> https://gitlab.com/petsc/petsc/-/merge_requests/6394
>>>>>
>>>>>    The functions did not have proper Fortran stubs and interfaces, so I
>>>>> had to provide them manually in the new branch.
>>>>>
>>>>>    Use
>>>>>
>>>>>    git fetch
>>>>>    git checkout barry/2023-04-29/fix-pcasmcreatesubdomains2d
>>>>>    ./configure etc
>>>>>
>>>>>    Your now-working test code is in src/ksp/ksp/tests/ex71f.F90. I had
>>>>> to change things slightly, and I updated the error handling for the
>>>>> latest version.
>>>>>
>>>>>    Please let us know if you have any further questions.
>>>>>
>>>>>   Barry
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On Apr 28, 2023, at 12:07 PM, LEONARDO MUTTI <
>>>>> leonardo.mutti01 at universitadipavia.it> wrote:
>>>>>
>>>>> Hello. I am having a hard time understanding the index sets to feed
>>>>> PCGASMSetSubdomains, and I am working in Fortran (as a PETSc novice). To
>>>>> get more intuition on how the IS objects behave, I tried the following
>>>>> minimal (non-)working example, which should tile a 16x16 matrix into 16
>>>>> square, non-overlapping submatrices:
>>>>>
>>>>> #include <petsc/finclude/petscmat.h>
>>>>> #include <petsc/finclude/petscksp.h>
>>>>> #include <petsc/finclude/petscpc.h>
>>>>>       USE petscmat
>>>>>       USE petscksp
>>>>>       USE petscpc
>>>>>
>>>>>       Mat   :: A
>>>>>       PetscInt :: M, NSubx, dof, overlap, NSub
>>>>>       INTEGER :: I,J
>>>>>       PetscErrorCode      :: ierr
>>>>>       PetscScalar :: v
>>>>>       KSP            :: ksp
>>>>>       PC             :: pc
>>>>>       IS :: subdomains_IS, inflated_IS
>>>>>
>>>>>       call PetscInitialize(PETSC_NULL_CHARACTER , ierr)
>>>>>
>>>>> !-----Create a dummy matrix
>>>>>       M = 16
>>>>>       call MatCreateAIJ(MPI_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
>>>>>      & M, M,
>>>>>      & PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,
>>>>>      & PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,
>>>>>      & A, ierr)
>>>>>
>>>>>       DO I=1,M
>>>>>          DO J=1,M
>>>>>             v = I*J
>>>>>             CALL MatSetValue (A,I-1,J-1,v,
>>>>>      &                     INSERT_VALUES , ierr)
>>>>>          END DO
>>>>>       END DO
>>>>>
>>>>>       call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY , ierr)
>>>>>       call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY , ierr)
>>>>>
>>>>> !-----Create KSP and PC
>>>>>       call KSPCreate(PETSC_COMM_WORLD,ksp, ierr)
>>>>>       call KSPSetOperators(ksp,A,A, ierr)
>>>>>       call KSPSetType(ksp,"bcgs",ierr)
>>>>>       call KSPGetPC(ksp,pc,ierr)
>>>>>       call KSPSetUp(ksp, ierr)
>>>>>       call PCSetType(pc,PCGASM, ierr)
>>>>>       call PCSetUp(pc , ierr)
>>>>>
>>>>> !-----GASM setup
>>>>>       NSubx = 4
>>>>>       dof = 1
>>>>>       overlap = 0
>>>>>
>>>>>       call PCGASMCreateSubdomains2D(pc,
>>>>>      &      M, M,
>>>>>      &      NSubx, NSubx,
>>>>>      &      dof, overlap,
>>>>>      &      NSub, subdomains_IS, inflated_IS, ierr)
>>>>>
>>>>>       call ISView(subdomains_IS, PETSC_VIEWER_STDOUT_WORLD, ierr)
>>>>>
>>>>>       call KSPDestroy(ksp, ierr)
>>>>>       call PetscFinalize(ierr)
>>>>>
>>>>> Running this on one processor, I get NSub = 4.
>>>>> If PCASM and PCASMCreateSubdomains2D are used instead, I get NSub = 16
>>>>> as expected.
>>>>> Moreover, at the end I get "forrtl: severe (157): Program Exception -
>>>>> access violation". So:
>>>>> 1) why do I get two different results with ASM and GASM?
>>>>> 2) why do I get an access violation, and how can I solve it?
>>>>> In fact, in C, subdomains_IS and inflated_IS should be pointers to IS
>>>>> objects, whereas in the Fortran interface the corresponding arguments
>>>>> to PCGASMCreateSubdomains2D appear as plain IS objects:
>>>>>
>>>>>        subroutine PCGASMCreateSubdomains2D(a,b,c,d,e,f,g,h,i,j,z)
>>>>>        import tPC,tIS
>>>>>        PC a ! PC
>>>>>        PetscInt b ! PetscInt
>>>>>        PetscInt c ! PetscInt
>>>>>        PetscInt d ! PetscInt
>>>>>        PetscInt e ! PetscInt
>>>>>        PetscInt f ! PetscInt
>>>>>        PetscInt g ! PetscInt
>>>>>        PetscInt h ! PetscInt
>>>>>        IS i ! IS
>>>>>        IS j ! IS
>>>>>        PetscErrorCode z
>>>>>        end subroutine PCGASMCreateSubdomains2D
>>>>> Thus:
>>>>> 3) what should be inside, e.g., subdomains_IS? I expect it to contain,
>>>>> for every created subdomain, the list of rows and columns defining the
>>>>> subblock in the matrix; am I right?
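>>>>>
>>>>> To make my expectation concrete (this is only my guess, not actual
>>>>> output): for the first 4x4 subblock, covering global rows and columns
>>>>> 0 to 3, I would have expected ISView to print the four global row
>>>>> indices, something like
>>>>>
>>>>> IS Object: 1 MPI processes
>>>>>   type: general
>>>>> Number of indices in set 4
>>>>> 0 0
>>>>> 1 1
>>>>> 2 2
>>>>> 3 3
>>>>>
>>>>> but then I do not see where the column range would be encoded.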
>>>>>
>>>>> Context: I have a block-tridiagonal system arising from space-time
>>>>> finite elements, and I want to solve it with GMRES and a PCGASM
>>>>> preconditioner, where each overlapping submatrix lies on the diagonal,
>>>>> has size 3x3 blocks, and may span multiple processes. This is PETSc
>>>>> 3.17.1 on Windows.
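>>>>>
>>>>> Concretely, with block size nb and windows of three consecutive blocks
>>>>> advancing by, say, two blocks (so that consecutive windows share one
>>>>> block), the row sets I am after would be
>>>>>
>>>>> \[
>>>>> R_k = \{\, r : 2k \cdot nb \le r < (2k+3) \cdot nb \,\}, \qquad
>>>>> k = 0, 1, 2, \dots
>>>>> \]
>>>>>
>>>>> i.e., each index set would list the contiguous global rows of one
>>>>> window along the diagonal.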
>>>>>
>>>>> Thanks in advance,
>>>>> Leonardo
>>>>>
>>>>>
>>>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/
>>>
>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>