[petsc-users] Understanding index sets for PCGASM
Leonardo Mutti
leonardo.mutti01 at universitadipavia.it
Thu May 18 20:20:01 CDT 2023
Many thanks for the hint. As for your last two sentences, I am in a similar
situation. So, let me mention some last thoughts, in case they help locate
the issue. If not, I will embark on the debugging adventure with Windows
debuggers.
Consider again the 4x4 matrix (represented by the IS [0,1,2,3], with rows
0,1 on rank 0 and 2,3 on rank 1). If I create the non-inflated IS [0,1,2],
[3], with inflated IS [0,1,2], [1,2,3], no problems arise.
Here is a comparison with the non-working case:
- works: non-inflated IS [0,1,2], [3]; inflated IS [0,1,2], [1,2,3]
- doesn't work: non-inflated IS [0,1], [2,3]; inflated IS [0,1], [1,2,3]
One difference is that in the working case the non-inflated subdomain
[0,1,2] does not lie on a single process.
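For concreteness, here is an untested sketch of the working layout, in the
style of the code below (the communicator and NSub choices are my
reconstruction; the per-rank portion of each subdomain is noted in the
comments):

      if (myrank == 0) then
         ! my portion of subdomain 1 = [0,1,2] is [0,1]; none of subdomain 2 = [3]
         call ISCreateStride(PETSC_COMM_WORLD, 2, 0, 1, subdomains_IS(1), ierr)
         call ISCreateStride(PETSC_COMM_WORLD, 2, 0, 1, inflated_IS(1), ierr)
         call ISCreateStride(PETSC_COMM_WORLD, 0, 0, 1, subdomains_IS(2), ierr)
         ! my portion of the inflated subdomain 2 = [1,2,3] is [1]
         call ISCreateStride(PETSC_COMM_WORLD, 1, 1, 1, inflated_IS(2), ierr)
      else
         ! my portion of subdomain 1 = [0,1,2] is [2]
         call ISCreateStride(PETSC_COMM_WORLD, 1, 2, 1, subdomains_IS(1), ierr)
         call ISCreateStride(PETSC_COMM_WORLD, 1, 2, 1, inflated_IS(1), ierr)
         ! subdomain 2 = [3]; my portion of its inflation [1,2,3] is [2,3]
         call ISCreateStride(PETSC_COMM_WORLD, 1, 3, 1, subdomains_IS(2), ierr)
         call ISCreateStride(PETSC_COMM_WORLD, 2, 2, 1, inflated_IS(2), ierr)
      endif
      NSub = 2
      call PCGASMSetSubdomains(pc, NSub, subdomains_IS, inflated_IS, ierr)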
Moreover, is it possible that the issue has something to do with
PetscObjectReference, as seen in src/ksp/pc/impls/gasm/gasm.c, line 1732,
in PCGASMCreateSubdomains2D?
Thanks again
On Thu, May 18, 2023 at 4:26 AM Barry Smith <bsmith at petsc.dev> wrote:
>
> Yikes. Such huge numbers usually come from integer overflow or memory
> corruption.
>
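> For scale, a request of 9437902811936987136 bytes is about 9 * 10^18
> bytes, i.e. roughly nine exabytes, so the count is clearly a garbage
> 64-bit value rather than any plausible ismax + nstages.
>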
> The code that decides how much memory needs to be allocated is straightforward:
>
> PetscErrorCode MatCreateSubMatrices_MPIAIJ(Mat C, PetscInt ismax, const IS isrow[], const IS iscol[], MatReuse scall, Mat *submat[])
> {
>   PetscInt     nmax, nstages = 0, i, pos, max_no, nrow, ncol, in[2], out[2];
>   PetscBool    rowflag, colflag, wantallmatrix = PETSC_FALSE;
>   Mat_SeqAIJ  *subc;
>   Mat_SubSppt *smat;
>
>   PetscFunctionBegin;
>   /* Check for special case: each processor has a single IS */
>   if (C->submat_singleis) { /* flag is set in PCSetUp_ASM() to skip MPI_Allreduce() */
>     PetscCall(MatCreateSubMatrices_MPIAIJ_SingleIS(C, ismax, isrow, iscol, scall, submat));
>     C->submat_singleis = PETSC_FALSE; /* resume its default value in case C will be used for non-single IS */
>     PetscFunctionReturn(PETSC_SUCCESS);
>   }
>
>   /* Collect global wantallmatrix and nstages */
>   if (!C->cmap->N) nmax = 20 * 1000000 / sizeof(PetscInt);
>   else nmax = 20 * 1000000 / (C->cmap->N * sizeof(PetscInt));
>   if (!nmax) nmax = 1;
>
>   if (scall == MAT_INITIAL_MATRIX) {
>     /* Collect global wantallmatrix and nstages */
>     if (ismax == 1 && C->rmap->N == C->cmap->N) {
>       PetscCall(ISIdentity(*isrow, &rowflag));
>       PetscCall(ISIdentity(*iscol, &colflag));
>       PetscCall(ISGetLocalSize(*isrow, &nrow));
>       PetscCall(ISGetLocalSize(*iscol, &ncol));
>       if (rowflag && colflag && nrow == C->rmap->N && ncol == C->cmap->N) {
>         wantallmatrix = PETSC_TRUE;
>
>         PetscCall(PetscOptionsGetBool(((PetscObject)C)->options, ((PetscObject)C)->prefix, "-use_fast_submatrix", &wantallmatrix, NULL));
>       }
>     }
>
>     /* Determine the number of stages through which submatrices are done
>        Each stage will extract nmax submatrices.
>        nmax is determined by the matrix column dimension.
>        If the original matrix has 20M columns, only one submatrix per stage is allowed, etc.
>     */
>     nstages = ismax / nmax + ((ismax % nmax) ? 1 : 0); /* local nstages */
>
>     in[0] = -1 * (PetscInt)wantallmatrix;
>     in[1] = nstages;
>     PetscCall(MPIU_Allreduce(in, out, 2, MPIU_INT, MPI_MAX, PetscObjectComm((PetscObject)C)));
>     wantallmatrix = (PetscBool)(-out[0]);
>     nstages       = out[1]; /* Make sure every processor loops through the global nstages */
>
>   } else { /* MAT_REUSE_MATRIX */
>     if (ismax) {
>       subc = (Mat_SeqAIJ *)(*submat)[0]->data;
>       smat = subc->submatis1;
>     } else { /* (*submat)[0] is a dummy matrix */
>       smat = (Mat_SubSppt *)(*submat)[0]->data;
>     }
>     if (!smat) {
>       /* smat is not generated by MatCreateSubMatrix_MPIAIJ_All(...,MAT_INITIAL_MATRIX,...) */
>       wantallmatrix = PETSC_TRUE;
>     } else if (smat->singleis) {
>       PetscCall(MatCreateSubMatrices_MPIAIJ_SingleIS(C, ismax, isrow, iscol, scall, submat));
>       PetscFunctionReturn(PETSC_SUCCESS);
>     } else {
>       nstages = smat->nstages;
>     }
>   }
>
>   if (wantallmatrix) {
>     PetscCall(MatCreateSubMatrix_MPIAIJ_All(C, MAT_GET_VALUES, scall, submat));
>     PetscFunctionReturn(PETSC_SUCCESS);
>   }
>
>   /* Allocate memory to hold all the submatrices and dummy submatrices */
>   if (scall == MAT_INITIAL_MATRIX) PetscCall(PetscCalloc1(ismax + nstages, submat));
>
>   for (i = 0, pos = 0; i < nstages; i++) {
>     if (pos + nmax <= ismax) max_no = nmax;
>     else if (pos >= ismax) max_no = 0;
>     else max_no = ismax - pos;
>
>     PetscCall(MatCreateSubMatrices_MPIAIJ_Local(C, max_no, isrow + pos, iscol + pos, scall, *submat + pos));
>     if (!max_no) {
>       if (scall == MAT_INITIAL_MATRIX) { /* submat[pos] is a dummy matrix */
>         smat          = (Mat_SubSppt *)(*submat)[pos]->data;
>         smat->nstages = nstages;
>       }
>       pos++; /* advance to next dummy matrix if any */
>     } else pos += max_no;
>   }
>
>   if (ismax && scall == MAT_INITIAL_MATRIX) {
>     /* save nstages for reuse */
>     subc          = (Mat_SeqAIJ *)(*submat)[0]->data;
>     smat          = subc->submatis1;
>     smat->nstages = nstages;
>   }
>   PetscFunctionReturn(PETSC_SUCCESS);
> }
>
> The easiest way to debug would be to put a breakpoint in
> MatCreateSubMatrices_MPIAIJ on MPI rank zero and step ("next") through the
> subroutine to see where the crazy number first appears before being passed
> down to the line "if (scall == MAT_INITIAL_MATRIX)
> PetscCall(PetscCalloc1(ismax + nstages, submat));", where either ismax or
> nstages has a crazy value.
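>
> For instance, once the debugger is attached on rank 0 (see the options
> below), a session along these lines (illustrative only) would expose the
> culprit:
>
>    (gdb) break MatCreateSubMatrices_MPIAIJ
>    (gdb) continue
>    ... "next" through the function until just before the PetscCalloc1() line ...
>    (gdb) print ismax
>    (gdb) print nstages
>
> Whichever variable prints the absurd value tells you which branch above to
> inspect.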
>
> If you are using the GNU compilers, you can use the command line options
> -start_in_debugger noxterm -debugger_ranks 0 to start the GNU debugger. If
> you are using the Microsoft Windows compilers, you will need to use their
> debugger; I don't know how to do that (and shudder at the thought :-).
>
> On May 17, 2023, at 7:51 PM, Leonardo Mutti <
> leonardo.mutti01 at universitadipavia.it> wrote:
>
> Thanks for the reply. Even without Valgrind (which I can't use since I'm
> on Windows), by further simplifying the example I was able to get PETSc
> to display a more informative message.
> It is still unclear to me, though, what I am doing wrong and what should
> be done differently.
>
> The simplified code runs on 2 processors and builds a 4x4 matrix.
> The subdomains are now [0,1] and [2,3], with [2,3] inflated to
> [0,1,2,3].
>
> Thank you again.
>
> Error:
>
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Out of memory. This could be due to allocating
> [0]PETSC ERROR: too large an object or bleeding by not properly
> [0]PETSC ERROR: destroying unneeded objects.
> [0]PETSC ERROR: Memory allocated 0 Memory used by process 0
> [0]PETSC ERROR: Try running with -malloc_dump or -malloc_view for info.
> [0]PETSC ERROR: Memory requested 9437902811936987136
> [0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could
> be the program crashed before they were used or a spelling mistake, etc!
> [0]PETSC ERROR: Option left: name:-pc_gasm_view_subdomains value: 1
> source: code
> [0]PETSC ERROR: Option left: name:-sub_ksp_type value: gmres source: code
> [0]PETSC ERROR: Option left: name:-sub_pc_type value: none source: code
> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.19.1-234-g43977f8d16
> GIT Date: 2023-05-08 14:50:03 +0000
> [...]
> [0]PETSC ERROR: #1 PetscMallocAlign() at
> ...\Sources\Git\Test-FV\PETSC-~1\src\sys\memory\mal.c:66
> [0]PETSC ERROR: #2 PetscMallocA() at
> ...\Sources\Git\Test-FV\PETSC-~1\src\sys\memory\mal.c:411
> [0]PETSC ERROR: #3 MatCreateSubMatrices_MPIAIJ() at
> ...\Sources\Git\Test-FV\PETSC-~1\src\mat\impls\aij\mpi\mpiov.c:2025
> [0]PETSC ERROR: #4 MatCreateSubMatricesMPI_MPIXAIJ() at
> ...\Sources\Git\Test-FV\PETSC-~1\src\mat\impls\aij\mpi\mpiov.c:3136
> [0]PETSC ERROR: #5 MatCreateSubMatricesMPI_MPIAIJ() at
> ...\Sources\Git\Test-FV\PETSC-~1\src\mat\impls\aij\mpi\mpiov.c:3208
> [0]PETSC ERROR: #6 MatCreateSubMatricesMPI() at
> ...\Sources\Git\Test-FV\PETSC-~1\src\mat\INTERF~1\matrix.c:7071
> [0]PETSC ERROR: #7 PCSetUp_GASM() at
> ...\Sources\Git\Test-FV\PETSC-~1\src\ksp\pc\impls\gasm\gasm.c:556
> [0]PETSC ERROR: #8 PCSetUp() at
> ...\Sources\Git\Test-FV\PETSC-~1\src\ksp\pc\INTERF~1\precon.c:994
> [0]PETSC ERROR: #9 KSPSetUp() at
> ...\Sources\Git\Test-FV\PETSC-~1\src\ksp\ksp\INTERF~1\itfunc.c:406
>
>
> Code (the important bit is after "! GASM, SETTING SUBDOMAINS"):
>
> Mat :: A
> Vec :: b
> PetscInt :: M,N_blocks,block_size,NSub,I
> PetscErrorCode :: ierr
> PetscScalar :: v
> KSP :: ksp
> PC :: pc
> IS :: subdomains_IS(2), inflated_IS(2)
> PetscInt :: NMPI,MYRANK,IERMPI
> INTEGER :: start
>
> call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
> call PetscLogDefaultBegin(ierr)
> call MPI_COMM_SIZE(MPI_COMM_WORLD, NMPI, IERMPI)
> CALL MPI_COMM_RANK(MPI_COMM_WORLD, MYRANK,IERMPI)
>
> N_blocks = 2
> block_size = 2
> M = N_blocks * block_size
>
>
> ! INTRO: create matrix and right hand side, create IS
>
> call MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
> & M, M, PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,
> & PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,A, ierr)
> call VecCreate(PETSC_COMM_WORLD,b,ierr)
> call VecSetSizes(b, PETSC_DECIDE, M,ierr)
> call VecSetFromOptions(b,ierr)
>
> DO I=(MYRANK*(M/NMPI)),((MYRANK+1)*(M/NMPI)-1)
>
> ! Set matrix
> v=1
> call MatSetValue(A, I, I, v, INSERT_VALUES, ierr)
> IF (I-block_size .GE. 0) THEN
> v=-1
> call MatSetValue(A, I, I-block_size, v, INSERT_VALUES, ierr)
> ENDIF
> ! Set rhs
> v = I
> call VecSetValue(b,I,v, INSERT_VALUES,ierr)
>
> END DO
>
> call MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY, ierr)
> call MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY, ierr)
> call VecAssemblyBegin(b,ierr)
> call VecAssemblyEnd(b,ierr)
>
> ! FIRST KSP/PC SETUP
>
> call KSPCreate(PETSC_COMM_WORLD, ksp, ierr)
> call KSPSetOperators(ksp, A, A, ierr)
> call KSPSetType(ksp, 'preonly', ierr)
> call KSPGetPC(ksp, pc, ierr)
> call PCSetType(pc, PCGASM, ierr)
>
>
> ! GASM, SETTING SUBDOMAINS
>
> if (myrank == 0) then
>    call ISCreateStride(PETSC_COMM_SELF, 2, 0, 1, subdomains_IS(1), ierr)
>    call ISCreateStride(PETSC_COMM_WORLD, 0, 0, 1, subdomains_IS(2), ierr)
>    call ISCreateStride(PETSC_COMM_SELF, 2, 0, 1, inflated_IS(1), ierr)
>    call ISCreateStride(PETSC_COMM_WORLD, 2, 0, 1, inflated_IS(2), ierr)
>    start = 1
>    NSub = 2
> else
>    call ISCreateStride(PETSC_COMM_WORLD, 2, 2, 1, subdomains_IS(2), ierr)
>    call ISCreateStride(PETSC_COMM_WORLD, 2, 2, 1, inflated_IS(2), ierr)
>    start = 2
>    NSub = 1
> endif
>
> call PCGASMSetSubdomains(pc, NSub, subdomains_IS(start:2), inflated_IS(start:2), ierr)
> call PCGASMDestroySubdomains(NSub, subdomains_IS(start:2), inflated_IS(start:2), ierr)
>
> ! GASM: SET SUBSOLVERS
>
> call PetscOptionsSetValue(PETSC_NULL_OPTIONS, "-sub_ksp_type", "gmres", ierr)
> call PetscOptionsSetValue(PETSC_NULL_OPTIONS, "-sub_pc_type", "none", ierr)
> call PetscOptionsSetValue(PETSC_NULL_OPTIONS, "-pc_gasm_view_subdomains", "1", ierr)
>
> call KSPSetUp(ksp, ierr)
> call PCSetUp(pc, ierr)
> call KSPSetFromOptions(ksp, ierr)
> call PCSetFromOptions(pc, ierr)
> call KSPView(ksp,PETSC_VIEWER_STDOUT_WORLD, ierr)
>
> call MatDestroy(A, ierr)
> call PetscFinalize(ierr)
>
>
> On Wed, May 17, 2023 at 9:55 PM Barry Smith <bsmith at petsc.dev> wrote:
>
>>
>>
>> On May 17, 2023, at 11:10 AM, Leonardo Mutti <
>> leonardo.mutti01 at universitadipavia.it> wrote:
>>
>> Dear developers, let me kindly ask for your help again.
>> In the following snippet, a bi-diagonal matrix A is set up. It measures
>> 8x8 blocks, where each block is 2x2 elements. I would like to create the
>> correct IS objects for PCGASM.
>> The non-overlapping IS should be: [0,1], [2,3], [4,5], ..., [14,15]. The
>> overlapping IS should be: [0,1], [0,1,2,3], [2,3,4,5], ...,
>> [12,13,14,15].
>> I am running the code with 4 processors. For some reason, after calling
>> PCGASMDestroySubdomains the code crashes with "severe (157): Program
>> Exception - access violation". A visual inspection of the indices using
>> ISView looks good.
>>
>>
>> Likely memory corruption or use of an object or an array that was
>> already freed. Best to use Valgrind to find the exact location of the mess.
>>
>>
>> Thanks again,
>> Leonardo
>>
>> Mat :: A
>> Vec :: b
>> PetscInt :: M,N_blocks,block_size,I,J,NSub,converged_reason,srank,erank,color,subcomm
>> PetscMPIInt :: size
>> PetscErrorCode :: ierr
>> PetscScalar :: v
>> KSP :: ksp
>> PC :: pc
>> IS,ALLOCATABLE :: subdomains_IS(:), inflated_IS(:)
>> PetscInt :: NMPI,MYRANK,IERMPI
>> INTEGER :: IS_counter, is_start, is_end
>>
>> call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
>> call PetscLogDefaultBegin(ierr)
>> call MPI_COMM_SIZE(MPI_COMM_WORLD, NMPI, IERMPI)
>> CALL MPI_COMM_RANK(MPI_COMM_WORLD, MYRANK,IERMPI)
>>
>> N_blocks = 8
>> block_size = 2
>> M = N_blocks * block_size
>>
>> ALLOCATE(subdomains_IS(N_blocks))
>> ALLOCATE(inflated_IS(N_blocks))
>>
>> !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>> ! ASSUMPTION: no block spans more than one rank (the inflated blocks can)
>> !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>
>> !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>> ! INTRO: create matrix and right hand side
>> !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>
>> ! How many inflated blocks span more than one rank? NMPI-1 !
>>
>> call MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
>> & M, M, PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,
>> & PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,A, ierr)
>> call VecCreate(PETSC_COMM_WORLD,b,ierr)
>> call VecSetSizes(b, PETSC_DECIDE, M,ierr)
>> call VecSetFromOptions(b,ierr)
>>
>> DO I=(MYRANK*(M/NMPI)),((MYRANK+1)*(M/NMPI)-1)
>>
>> ! Set matrix
>> v=1
>> call MatSetValue(A, I, I, v, INSERT_VALUES, ierr)
>> IF (I-block_size .GE. 0) THEN
>> v=-1
>> call MatSetValue(A, I, I-block_size, v, INSERT_VALUES, ierr)
>> ENDIF
>> ! Set rhs
>> v = I
>> call VecSetValue(b,I,v, INSERT_VALUES,ierr)
>>
>> END DO
>>
>> call MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY, ierr)
>> call MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY, ierr)
>> call VecAssemblyBegin(b,ierr)
>> call VecAssemblyEnd(b,ierr)
>>
>>
>> !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>> ! FIRST KSP/PC SETUP
>> !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>
>> call KSPCreate(PETSC_COMM_WORLD, ksp, ierr)
>> call KSPSetOperators(ksp, A, A, ierr)
>> call KSPSetType(ksp, 'preonly', ierr)
>> call KSPGetPC(ksp, pc, ierr)
>> call PCSetType(pc, PCGASM, ierr)
>>
>>
>> !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>> !! GASM, SETTING SUBDOMAINS
>> !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>
>> DO IS_COUNTER=1,N_blocks
>>
>>    ! start rank reached by inflated block
>>    srank = MAX(((IS_COUNTER-2)*block_size)/(M/NMPI),0)
>>    ! end rank reached by inflated block; coincides with the rank
>>    ! containing the non-inflated block
>>    erank = MIN(((IS_COUNTER-1)*block_size)/(M/NMPI),NMPI-1)
>>
>>    ! Create subcomms
>>    color = MPI_UNDEFINED
>>    IF (myrank == srank .or. myrank == erank) THEN
>>       color = 1
>>    ENDIF
>>    call MPI_Comm_split(MPI_COMM_WORLD,color,MYRANK,subcomm,ierr)
>>
>>    ! Create IS
>>    IF (srank .EQ. erank) THEN ! Block and overlap are on the same rank
>>       IF (MYRANK .EQ. srank) THEN
>>          call ISCreateStride(PETSC_COMM_SELF,block_size,(IS_COUNTER-1)*block_size,1,subdomains_IS(IS_COUNTER),ierr)
>>          IF (IS_COUNTER .EQ. 1) THEN ! the first block is not inflated
>>             call ISCreateStride(PETSC_COMM_SELF,block_size,(IS_COUNTER-1)*block_size,1,inflated_IS(IS_COUNTER),ierr)
>>          ELSE
>>             call ISCreateStride(PETSC_COMM_SELF,2*block_size,(IS_COUNTER-2)*block_size,1,inflated_IS(IS_COUNTER),ierr)
>>          ENDIF
>>       ENDIF
>>    else ! Block and overlap not on the same rank
>>       if (myrank == erank) then ! the block
>>          call ISCreateStride(subcomm,block_size,(IS_COUNTER-1)*block_size,1,subdomains_IS(IS_COUNTER),ierr)
>>          call ISCreateStride(subcomm,block_size,(IS_COUNTER-1)*block_size,1,inflated_IS(IS_COUNTER),ierr)
>>       endif
>>       if (myrank == srank) then ! the overlap
>>          call ISCreateStride(subcomm,block_size,(IS_COUNTER-2)*block_size,1,inflated_IS(IS_COUNTER),ierr)
>>          call ISCreateStride(subcomm,0,(IS_COUNTER-1)*block_size,1,subdomains_IS(IS_COUNTER),ierr)
>>       endif
>>    endif
>>
>>    call MPI_Comm_free(subcomm, ierr)
>> END DO
>>
>> ! Set the domains/subdomains
>> NSub = N_blocks/NMPI
>> is_start = 1 + myrank * NSub
>> is_end = min(is_start + NSub, N_blocks)
>> if (myrank + 1 < NMPI) then
>> NSub = NSub + 1
>> endif
>>
>> call PCGASMSetSubdomains(pc,NSub,subdomains_IS(is_start:is_end),inflated_IS(is_start:is_end),ierr)
>> call PCGASMDestroySubdomains(NSub,subdomains_IS(is_start:is_end),inflated_IS(is_start:is_end),ierr)
>>
>> call PetscOptionsSetValue(PETSC_NULL_OPTIONS,"-sub_ksp_type", "gmres", ierr)
>> call PetscOptionsSetValue(PETSC_NULL_OPTIONS,"-sub_pc_type", "none", ierr)
>> call PetscOptionsSetValue(PETSC_NULL_OPTIONS,"-pc_gasm_view_subdomains", "1", ierr)
>>
>> call KSPSetUp(ksp, ierr)
>> call PCSetUp(pc, ierr)
>> call KSPSetFromOptions(ksp, ierr)
>> call PCSetFromOptions(pc, ierr)
>>
>> call KSPView(ksp,PETSC_VIEWER_STDOUT_WORLD, ierr)
>>
>>
>> On Wed, May 10, 2023 at 3:02 AM Barry Smith <bsmith at petsc.dev> wrote:
>>
>>>
>>>
>>> On May 9, 2023, at 4:58 PM, LEONARDO MUTTI <
>>> leonardo.mutti01 at universitadipavia.it> wrote:
>>>
>>> In my notation, diag(1,1) means the diagonal 2x2 matrix with 1,1 on the
>>> diagonal, a submatrix of the 8x8 diagonal matrix diag(1,1,2,2,...,2).
>>> Am I then correct that the IS representing diag(1,1) is 0,1, and that
>>> diag(2,2,...,2) is represented by 2,3,4,5,6,7?
>>>
>>>
>>> I believe so
>>>
>>> Thanks,
>>> Leonardo
>>>
>>> On Tue, May 9, 2023 at 8:45 PM Barry Smith <bsmith at petsc.dev> wrote:
>>>
>>>>
>>>> It is simpler than you are making it out to be. Each IS[] is a list
>>>> of rows (and columns) in the sub (domain) matrix. In your case, with the
>>>> 144 by 144 matrix, the indices will go from 0 to 143.
>>>>
>>>> In your simple Fortran code you have a completely different problem:
>>>> a matrix with 8 rows and columns. In that case, if you want the first IS
>>>> to represent just the first row (and column) of the matrix, then it
>>>> should contain only 0. The second submatrix, which is all rows (but the
>>>> first), should have 1,2,3,4,5,6,7.
>>>>
>>>> I do not understand why your code has
>>>>
>>>> indices_first_domain = [0,1,8,9] ! corresponds to diag(1,1)
>>>> it should just be 0
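>>>>
>>>> For example (an untested sketch; idx1 and idx2 are hypothetical array
>>>> names), those two index sets could be created as:
>>>>
>>>>    PetscInt :: idx1(1), idx2(7)
>>>>    idx1(1) = 0
>>>>    do I=1,7
>>>>       idx2(I) = I
>>>>    end do
>>>>    call ISCreateGeneral(PETSC_COMM_WORLD,1,idx1,
>>>>   &     PETSC_COPY_VALUES,subdomains_IS(1),ierr)
>>>>    call ISCreateGeneral(PETSC_COMM_WORLD,7,idx2,
>>>>   &     PETSC_COPY_VALUES,subdomains_IS(2),ierr)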
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> On May 9, 2023, at 12:44 PM, LEONARDO MUTTI <
>>>> leonardo.mutti01 at universitadipavia.it> wrote:
>>>>
>>>> Correcting a typo in my previous message: I expect 9x(16+16) numbers to
>>>> be stored in subdomain_IS: #subdomains x (row indices of the submatrix +
>>>> col indices of the submatrix).
>>>>
>>>> On Tue, May 9, 2023 at 6:31 PM LEONARDO MUTTI <
>>>> leonardo.mutti01 at universitadipavia.it> wrote:
>>>>
>>>>>
>>>>>
>>>>> ---------- Forwarded message ---------
>>>>> From: LEONARDO MUTTI <leonardo.mutti01 at universitadipavia.it>
>>>>> Date: Tue, May 9, 2023 at 6:29 PM
>>>>> Subject: Re: [petsc-users] Understanding index sets for PCGASM
>>>>> To: Matthew Knepley <knepley at gmail.com>
>>>>>
>>>>>
>>>>> Thank you for your answer, but I am still confused, sorry.
>>>>> Consider
>>>>> https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/ksp/tests/ex71f.F90 on
>>>>> one processor.
>>>>> Let M=12 for the sake of simplicity, i.e. we deal with a 12x12 2D
>>>>> grid and hence a 144x144 matrix.
>>>>> Let NSubx = 3, so that on the grid we make 3 vertical and 3 horizontal
>>>>> subdivisions.
>>>>> We should obtain 9 subdomains that are grids of 4x4 nodes each, thus
>>>>> corresponding to 9 submatrices of size 16x16.
>>>>> In my run I obtain NSub = 9 (great), and subdomain_IS(i), i=1,...,9,
>>>>> read:
>>>>>
>>>>> IS Object: 1 MPI process
>>>>>   type: general
>>>>> Number of indices in set 16
>>>>> 0 0
>>>>> 1 1
>>>>> 2 2
>>>>> 3 3
>>>>> 4 12
>>>>> 5 13
>>>>> 6 14
>>>>> 7 15
>>>>> 8 24
>>>>> 9 25
>>>>> 10 26
>>>>> 11 27
>>>>> 12 36
>>>>> 13 37
>>>>> 14 38
>>>>> 15 39
>>>>> IS Object: 1 MPI process
>>>>>   type: general
>>>>> Number of indices in set 16
>>>>> 0 4
>>>>> 1 5
>>>>> 2 6
>>>>> 3 7
>>>>> 4 16
>>>>> 5 17
>>>>> 6 18
>>>>> 7 19
>>>>> 8 28
>>>>> 9 29
>>>>> 10 30
>>>>> 11 31
>>>>> 12 40
>>>>> 13 41
>>>>> 14 42
>>>>> 15 43
>>>>> IS Object: 1 MPI process
>>>>>   type: general
>>>>> Number of indices in set 16
>>>>> 0 8
>>>>> 1 9
>>>>> 2 10
>>>>> 3 11
>>>>> 4 20
>>>>> 5 21
>>>>> 6 22
>>>>> 7 23
>>>>> 8 32
>>>>> 9 33
>>>>> 10 34
>>>>> 11 35
>>>>> 12 44
>>>>> 13 45
>>>>> 14 46
>>>>> 15 47
>>>>> IS Object: 1 MPI process
>>>>>   type: general
>>>>> Number of indices in set 16
>>>>> 0 48
>>>>> 1 49
>>>>> 2 50
>>>>> 3 51
>>>>> 4 60
>>>>> 5 61
>>>>> 6 62
>>>>> 7 63
>>>>> 8 72
>>>>> 9 73
>>>>> 10 74
>>>>> 11 75
>>>>> 12 84
>>>>> 13 85
>>>>> 14 86
>>>>> 15 87
>>>>> IS Object: 1 MPI process
>>>>>   type: general
>>>>> Number of indices in set 16
>>>>> 0 52
>>>>> 1 53
>>>>> 2 54
>>>>> 3 55
>>>>> 4 64
>>>>> 5 65
>>>>> 6 66
>>>>> 7 67
>>>>> 8 76
>>>>> 9 77
>>>>> 10 78
>>>>> 11 79
>>>>> 12 88
>>>>> 13 89
>>>>> 14 90
>>>>> 15 91
>>>>> IS Object: 1 MPI process
>>>>>   type: general
>>>>> Number of indices in set 16
>>>>> 0 56
>>>>> 1 57
>>>>> 2 58
>>>>> 3 59
>>>>> 4 68
>>>>> 5 69
>>>>> 6 70
>>>>> 7 71
>>>>> 8 80
>>>>> 9 81
>>>>> 10 82
>>>>> 11 83
>>>>> 12 92
>>>>> 13 93
>>>>> 14 94
>>>>> 15 95
>>>>> IS Object: 1 MPI process
>>>>>   type: general
>>>>> Number of indices in set 16
>>>>> 0 96
>>>>> 1 97
>>>>> 2 98
>>>>> 3 99
>>>>> 4 108
>>>>> 5 109
>>>>> 6 110
>>>>> 7 111
>>>>> 8 120
>>>>> 9 121
>>>>> 10 122
>>>>> 11 123
>>>>> 12 132
>>>>> 13 133
>>>>> 14 134
>>>>> 15 135
>>>>> IS Object: 1 MPI process
>>>>>   type: general
>>>>> Number of indices in set 16
>>>>> 0 100
>>>>> 1 101
>>>>> 2 102
>>>>> 3 103
>>>>> 4 112
>>>>> 5 113
>>>>> 6 114
>>>>> 7 115
>>>>> 8 124
>>>>> 9 125
>>>>> 10 126
>>>>> 11 127
>>>>> 12 136
>>>>> 13 137
>>>>> 14 138
>>>>> 15 139
>>>>> IS Object: 1 MPI process
>>>>>   type: general
>>>>> Number of indices in set 16
>>>>> 0 104
>>>>> 1 105
>>>>> 2 106
>>>>> 3 107
>>>>> 4 116
>>>>> 5 117
>>>>> 6 118
>>>>> 7 119
>>>>> 8 128
>>>>> 9 129
>>>>> 10 130
>>>>> 11 131
>>>>> 12 140
>>>>> 13 141
>>>>> 14 142
>>>>> 15 143
>>>>>
>>>>> As you said, no number here reaches 144.
>>>>> But the numbers stored in subdomain_IS are 9x16 = #subdomains x 16,
>>>>> whereas I would expect, also given your latest reply, 9x16x16x2 =
>>>>> #subdomains x submatrix height x submatrix width x length of a
>>>>> (row,column) pair.
>>>>> It would really help me if you could briefly explain how the output
>>>>> above encodes the subdivision into subdomains.
>>>>> Many thanks again,
>>>>> Leonardo
>>>>>
>>>>>
>>>>>
>>>>> On Tue, May 9, 2023 at 4:24 PM Matthew Knepley <knepley at gmail.com>
>>>>> wrote:
>>>>>
>>>>>> On Tue, May 9, 2023 at 10:05 AM LEONARDO MUTTI <
>>>>>> leonardo.mutti01 at universitadipavia.it> wrote:
>>>>>>
>>>>>>> Many thanks! I can now successfully run
>>>>>>> https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/ksp/tests/ex71f.F90
>>>>>>> .
>>>>>>>
>>>>>>> Going forward with my experiments, let me post a new code snippet
>>>>>>> (very similar to ex71f.F90) that I cannot get to work; probably I am
>>>>>>> setting up the IS objects incorrectly.
>>>>>>>
>>>>>>> I have an 8x8 matrix A=diag(1,1,2,2,...,2) and a
>>>>>>> vector b=(0.5,...,0.5). We have only one processor, and I want to solve
>>>>>>> Ax=b using GASM. In particular, KSP is set to preonly, GASM is the
>>>>>>> preconditioner, and it uses an LU direct solver on each submatrix
>>>>>>> (sub_ksp = preonly, sub_pc = lu).
>>>>>>>
>>>>>>> For the GASM algorithm, I divide A into diag(1,1) and
>>>>>>> diag(2,2,...,2). For simplicity I set 0 overlap. Now I want to use GASM to
>>>>>>> solve Ax=b. The code follows.
>>>>>>>
>>>>>>> #include <petsc/finclude/petscmat.h>
>>>>>>> #include <petsc/finclude/petscksp.h>
>>>>>>> #include <petsc/finclude/petscpc.h>
>>>>>>> USE petscmat
>>>>>>> USE petscksp
>>>>>>> USE petscpc
>>>>>>> USE MPI
>>>>>>>
>>>>>>> Mat :: A
>>>>>>> Vec :: b, x
>>>>>>> PetscInt :: M, I, J, ISLen, NSub
>>>>>>> PetscMPIInt :: size
>>>>>>> PetscErrorCode :: ierr
>>>>>>> PetscScalar :: v
>>>>>>> KSP :: ksp
>>>>>>> PC :: pc
>>>>>>> IS :: subdomains_IS(2), inflated_IS(2)
>>>>>>> PetscInt,DIMENSION(4) :: indices_first_domain
>>>>>>> PetscInt,DIMENSION(36) :: indices_second_domain
>>>>>>>
>>>>>>> call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
>>>>>>> call MPI_Comm_size(PETSC_COMM_WORLD, size, ierr)
>>>>>>>
>>>>>>>
>>>>>>> !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>>>>> ! INTRO: create matrix and right hand side
>>>>>>> !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>>>>>
>>>>>>> WRITE(*,*) "Assembling A,b"
>>>>>>>
>>>>>>> M = 8
>>>>>>> call MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
>>>>>>> & M, M, PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,
>>>>>>> & PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,A, ierr)
>>>>>>> DO I=1,M
>>>>>>> DO J=1,M
>>>>>>> IF ((I .EQ. J) .AND. (I .LE. 2 )) THEN
>>>>>>> v = 1
>>>>>>> ELSE IF ((I .EQ. J) .AND. (I .GT. 2 )) THEN
>>>>>>> v = 2
>>>>>>> ELSE
>>>>>>> v = 0
>>>>>>> ENDIF
>>>>>>> call MatSetValue(A, I-1, J-1, v, INSERT_VALUES, ierr)
>>>>>>> END DO
>>>>>>> END DO
>>>>>>>
>>>>>>> call MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY, ierr)
>>>>>>> call MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY, ierr)
>>>>>>>
>>>>>>> call VecCreate(PETSC_COMM_WORLD,b,ierr)
>>>>>>> call VecSetSizes(b, PETSC_DECIDE, M,ierr)
>>>>>>> call VecSetFromOptions(b,ierr)
>>>>>>>
>>>>>>> do I=1,M
>>>>>>> v = 0.5
>>>>>>> call VecSetValue(b,I-1,v, INSERT_VALUES,ierr)
>>>>>>> end do
>>>>>>>
>>>>>>> call VecAssemblyBegin(b,ierr)
>>>>>>> call VecAssemblyEnd(b,ierr)
>>>>>>>
>>>>>>>
>>>>>>> !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>>>>> ! FIRST KSP/PC SETUP
>>>>>>> !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>>>>>
>>>>>>> WRITE(*,*) "KSP/PC first setup"
>>>>>>>
>>>>>>> call KSPCreate(PETSC_COMM_WORLD, ksp, ierr)
>>>>>>> call KSPSetOperators(ksp, A, A, ierr)
>>>>>>> call KSPSetType(ksp, 'preonly', ierr)
>>>>>>> call KSPGetPC(ksp, pc, ierr)
>>>>>>> call KSPSetUp(ksp, ierr)
>>>>>>> call PCSetType(pc, PCGASM, ierr)
>>>>>>>
>>>>>>>
>>>>>>> !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>>>>> ! GASM, SETTING SUBDOMAINS
>>>>>>> !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>>>>>
>>>>>>> WRITE(*,*) "Setting GASM subdomains"
>>>>>>>
>>>>>>> ! Let's create the subdomain IS and inflated_IS
>>>>>>> ! They are equal if no overlap is present
>>>>>>> ! They are 1: 0,1,8,9
>>>>>>> ! 2: 10,...,15,18,...,23,...,58,...,63
>>>>>>>
>>>>>>> indices_first_domain = [0,1,8,9] ! corresponds to diag(1,1)
>>>>>>> do I=0,5
>>>>>>> do J=0,5
>>>>>>> indices_second_domain(I*6+1+J) = 18 + J + 8*I ! corresponds to diag(2,2,...,2)
>>>>>>> !WRITE(*,*) I*6+1+J, 18 + J + 8*I
>>>>>>> end do
>>>>>>> end do
>>>>>>>
>>>>>>> ! Convert into IS
>>>>>>> ISLen = 4
>>>>>>> call ISCreateGeneral(PETSC_COMM_WORLD,ISLen,indices_first_domain,
>>>>>>> & PETSC_COPY_VALUES, subdomains_IS(1), ierr)
>>>>>>> call ISCreateGeneral(PETSC_COMM_WORLD,ISLen,indices_first_domain,
>>>>>>> & PETSC_COPY_VALUES, inflated_IS(1), ierr)
>>>>>>> ISLen = 36
>>>>>>> call ISCreateGeneral(PETSC_COMM_WORLD,ISLen,indices_second_domain,
>>>>>>> & PETSC_COPY_VALUES, subdomains_IS(2), ierr)
>>>>>>> call ISCreateGeneral(PETSC_COMM_WORLD,ISLen,indices_second_domain,
>>>>>>> & PETSC_COPY_VALUES, inflated_IS(2), ierr)
>>>>>>>
>>>>>>> NSub = 2
>>>>>>> call PCGASMSetSubdomains(pc,NSub,
>>>>>>> & subdomains_IS,inflated_IS,ierr)
>>>>>>> call PCGASMDestroySubdomains(NSub,
>>>>>>> & subdomains_IS,inflated_IS,ierr)
>>>>>>>
>>>>>>>
>>>>>>> !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>>>>> ! GASM: SET SUBSOLVERS
>>>>>>> !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>>>>>
>>>>>>> WRITE(*,*) "Setting subsolvers for GASM"
>>>>>>>
>>>>>>> call PCSetUp(pc, ierr) ! should I add this?
>>>>>>>
>>>>>>> call PetscOptionsSetValue(PETSC_NULL_OPTIONS,
>>>>>>> & "-sub_pc_type", "lu", ierr)
>>>>>>> call PetscOptionsSetValue(PETSC_NULL_OPTIONS,
>>>>>>> & "-sub_ksp_type", "preonly", ierr)
>>>>>>>
>>>>>>> call KSPSetFromOptions(ksp, ierr)
>>>>>>> call PCSetFromOptions(pc, ierr)
>>>>>>>
>>>>>>>
>>>>>>> !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>>>>> ! DUMMY SOLUTION: DID IT WORK?
>>>>>>> !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>>>>>
>>>>>>> WRITE(*,*) "Solve"
>>>>>>>
>>>>>>> call VecDuplicate(b,x,ierr)
>>>>>>> call KSPSolve(ksp,b,x,ierr)
>>>>>>>
>>>>>>> call MatDestroy(A, ierr)
>>>>>>> call KSPDestroy(ksp, ierr)
>>>>>>> call PetscFinalize(ierr)
>>>>>>>
>>>>>>> This code fails at multiple points. At call PCSetUp(pc, ierr)
>>>>>>> it produces:
>>>>>>>
>>>>>>> [0]PETSC ERROR: Argument out of range
>>>>>>> [0]PETSC ERROR: Scatter indices in ix are out of range
>>>>>>> ...
>>>>>>> [0]PETSC ERROR: #1 VecScatterCreate() at **\src\vec\is\sf\INTERF~1\vscat.c:736
>>>>>>> [0]PETSC ERROR: #2 PCSetUp_GASM() at **\src\ksp\pc\impls\gasm\gasm.c:433
>>>>>>> [0]PETSC ERROR: #3 PCSetUp() at **\src\ksp\pc\INTERF~1\precon.c:994
>>>>>>>
>>>>>>> And at call KSPSolve(ksp,b,x,ierr) it produces:
>>>>>>>
>>>>>>> forrtl: severe (157): Program Exception - access violation
>>>>>>>
>>>>>>>
>>>>>>> The index sets are set up coherently with the outputs of e.g.
>>>>>>> https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/ksp/tests/output/ex71f_1.out:
>>>>>>> in particular, each element of the matrix A corresponds to a number
>>>>>>> from 0 to 63.
>>>>>>>
>>>>>>
>>>>>> This is not correct, I believe. The indices are row/col indices, not
>>>>>> indices into dense blocks, so for your example they are all in the
>>>>>> range 0 to 7.
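>>>>>>
>>>>>> (To connect this with the ex71f output you posted: numbering the node
>>>>>> in row r, column c of the 12x12 grid as 12*r + c, each IS simply lists
>>>>>> the 16 global row numbers of one 4x4 patch, and the same indices serve
>>>>>> as both row and column indices of the submatrix. A sketch, with a
>>>>>> hypothetical idx array, of how the first set {0,1,2,3,12,...,39}
>>>>>> arises:
>>>>>>
>>>>>>    k = 1
>>>>>>    do r = 0, 3
>>>>>>       do c = 0, 3
>>>>>>          idx(k) = 12*r + c   ! rows 0-3, columns 0-3 of the grid
>>>>>>          k = k + 1
>>>>>>       end do
>>>>>>    end do
>>>>>>
>>>>>> That is why each subdomain is encoded by 16 numbers rather than
>>>>>> 16x16x2.)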
>>>>>>
>>>>>> Thanks,
>>>>>>
>>>>>> Matt
>>>>>>
>>>>>>
>>>>>>> Note that each submatrix does not represent some physical subdomain;
>>>>>>> the subdivision is just at the algebraic level.
>>>>>>> I thus have the following questions:
>>>>>>>
>>>>>>> - is this the correct way of creating the IS objects, given my
>>>>>>> objective at the beginning of the email? Is the ordering correct?
>>>>>>> - what am I doing wrong that is generating the above errors?
>>>>>>>
>>>>>>> Thanks for the patience and the time.
>>>>>>> Best,
>>>>>>> Leonardo
>>>>>>>
>>>>>>> On Fri, May 5, 2023 at 6:43 PM Barry Smith <bsmith at petsc.dev>
>>>>>>> wrote:
>>>>>>>
>>>>>>>>
>>>>>>>> Added in barry/2023-05-04/add-pcgasm-set-subdomains; see also
>>>>>>>> https://gitlab.com/petsc/petsc/-/merge_requests/6419
>>>>>>>>
>>>>>>>> Barry
>>>>>>>>
>>>>>>>>
>>>>>>>> On May 4, 2023, at 11:23 AM, LEONARDO MUTTI <
>>>>>>>> leonardo.mutti01 at universitadipavia.it> wrote:
>>>>>>>>
>>>>>>>> Thank you for the help.
>>>>>>>> Adding to my example:
>>>>>>>>
>>>>>>>>
>>>>>>>> call PCGASMSetSubdomains(pc, NSub, subdomains_IS, inflated_IS, ierr)
>>>>>>>> call PCGASMDestroySubdomains(NSub, subdomains_IS, inflated_IS, ierr)
>>>>>>>> results in:
>>>>>>>>
>>>>>>>> Error LNK2019 unresolved external symbol PCGASMDESTROYSUBDOMAINS
>>>>>>>> referenced in function ...
>>>>>>>>
>>>>>>>> Error LNK2019 unresolved external symbol PCGASMSETSUBDOMAINS
>>>>>>>> referenced in function ...
>>>>>>>> I'm not sure if the interfaces are missing or if I have a
>>>>>>>> compilation problem.
>>>>>>>> Thank you again.
>>>>>>>> Best,
>>>>>>>> Leonardo
>>>>>>>>
>>>>>>>> On Sat, Apr 29, 2023 at 8:30 PM Barry Smith <bsmith at petsc.dev>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>>
>>>>>>>>> Thank you for the test code. I have a fix in the branch
>>>>>>>>> barry/2023-04-29/fix-pcasmcreatesubdomains2d
>>>>>>>>> <https://gitlab.com/petsc/petsc/-/commits/barry/2023-04-29/fix-pcasmcreatesubdomains2d> with
>>>>>>>>> merge request https://gitlab.com/petsc/petsc/-/merge_requests/6394
>>>>>>>>>
>>>>>>>>> The functions did not have proper Fortran stubs and interfaces
>>>>>>>>> so I had to provide them manually in the new branch.
>>>>>>>>>
>>>>>>>>> Use
>>>>>>>>>
>>>>>>>>> git fetch
>>>>>>>>> git checkout barry/2023-04-29/fix-pcasmcreatesubdomains2d
>>>>>>>>> ./configure etc
>>>>>>>>>
>>>>>>>>> Your now-working test code is in src/ksp/ksp/tests/ex71f.F90. I had
>>>>>>>>> to change things slightly, and I updated the error handling for the
>>>>>>>>> latest version.
>>>>>>>>>
>>>>>>>>> Please let us know if you have any further questions.
>>>>>>>>>
>>>>>>>>> Barry
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Apr 28, 2023, at 12:07 PM, LEONARDO MUTTI <
>>>>>>>>> leonardo.mutti01 at universitadipavia.it> wrote:
>>>>>>>>>
>>>>>>>>> Hello. I am having a hard time understanding the index sets to
>>>>>>>>> feed PCGASMSetSubdomains, and I am working in Fortran (as a PETSc novice).
>>>>>>>>> To get more intuition on how the IS objects behave, I tried the
>>>>>>>>> following minimal (non-)working example, which should tile a 16x16
>>>>>>>>> matrix into 16 square, non-overlapping submatrices:
>>>>>>>>>
>>>>>>>>> #include <petsc/finclude/petscmat.h>
>>>>>>>>> #include <petsc/finclude/petscksp.h>
>>>>>>>>> #include <petsc/finclude/petscpc.h>
>>>>>>>>> USE petscmat
>>>>>>>>> USE petscksp
>>>>>>>>> USE petscpc
>>>>>>>>>
>>>>>>>>> Mat :: A
>>>>>>>>> PetscInt :: M, NSubx, dof, overlap, NSub
>>>>>>>>> INTEGER :: I,J
>>>>>>>>> PetscErrorCode :: ierr
>>>>>>>>> PetscScalar :: v
>>>>>>>>> KSP :: ksp
>>>>>>>>> PC :: pc
>>>>>>>>> IS :: subdomains_IS, inflated_IS
>>>>>>>>>
>>>>>>>>> call PetscInitialize(PETSC_NULL_CHARACTER , ierr)
>>>>>>>>>
>>>>>>>>> !-----Create a dummy matrix
>>>>>>>>> M = 16
>>>>>>>>> call MatCreateAIJ(MPI_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
>>>>>>>>> & M, M,
>>>>>>>>> & PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,
>>>>>>>>> & PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,
>>>>>>>>> & A, ierr)
>>>>>>>>>
>>>>>>>>> DO I=1,M
>>>>>>>>> DO J=1,M
>>>>>>>>> v = I*J
>>>>>>>>> CALL MatSetValue (A,I-1,J-1,v,
>>>>>>>>> & INSERT_VALUES , ierr)
>>>>>>>>> END DO
>>>>>>>>> END DO
>>>>>>>>>
>>>>>>>>> call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY , ierr)
>>>>>>>>> call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY , ierr)
>>>>>>>>>
>>>>>>>>> !-----Create KSP and PC
>>>>>>>>> call KSPCreate(PETSC_COMM_WORLD,ksp, ierr)
>>>>>>>>> call KSPSetOperators(ksp,A,A, ierr)
>>>>>>>>> call KSPSetType(ksp,"bcgs",ierr)
>>>>>>>>> call KSPGetPC(ksp,pc,ierr)
>>>>>>>>> call KSPSetUp(ksp, ierr)
>>>>>>>>> call PCSetType(pc,PCGASM, ierr)
>>>>>>>>> call PCSetUp(pc , ierr)
>>>>>>>>>
>>>>>>>>> !-----GASM setup
>>>>>>>>> NSubx = 4
>>>>>>>>> dof = 1
>>>>>>>>> overlap = 0
>>>>>>>>>
>>>>>>>>> call PCGASMCreateSubdomains2D(pc,
>>>>>>>>> & M, M,
>>>>>>>>> & NSubx, NSubx,
>>>>>>>>> & dof, overlap,
>>>>>>>>> & NSub, subdomains_IS, inflated_IS, ierr)
>>>>>>>>>
>>>>>>>>> call ISView(subdomains_IS, PETSC_VIEWER_STDOUT_WORLD, ierr)
>>>>>>>>>
>>>>>>>>> call KSPDestroy(ksp, ierr)
>>>>>>>>> call PetscFinalize(ierr)
>>>>>>>>>
>>>>>>>>> Running this on one processor, I get NSub = 4.
>>>>>>>>> If PCASM and PCASMCreateSubdomains2D are used instead, I get NSub
>>>>>>>>> = 16 as expected.
>>>>>>>>> Moreover, at the end I get "forrtl: severe (157): Program
>>>>>>>>> Exception - access violation". So:
>>>>>>>>> 1) why do I get two different results with ASM and GASM?
>>>>>>>>> 2) why do I get an access violation, and how can I solve this?
>>>>>>>>> In fact, in C, subdomains_IS and inflated_IS should be pointers to
>>>>>>>>> IS objects. As I see in the Fortran interface, the arguments to
>>>>>>>>> PCGASMCreateSubdomains2D are IS objects:
>>>>>>>>>
>>>>>>>>> subroutine PCGASMCreateSubdomains2D(a,b,c,d,e,f,g,h,i,j,z)
>>>>>>>>> import tPC,tIS
>>>>>>>>> PC a ! PC
>>>>>>>>> PetscInt b ! PetscInt
>>>>>>>>> PetscInt c ! PetscInt
>>>>>>>>> PetscInt d ! PetscInt
>>>>>>>>> PetscInt e ! PetscInt
>>>>>>>>> PetscInt f ! PetscInt
>>>>>>>>> PetscInt g ! PetscInt
>>>>>>>>> PetscInt h ! PetscInt
>>>>>>>>> IS i ! IS
>>>>>>>>> IS j ! IS
>>>>>>>>> PetscErrorCode z
>>>>>>>>> end subroutine PCGASMCreateSubdomains2D
>>>>>>>>> Thus:
>>>>>>>>> 3) what should be inside, e.g., subdomains_IS? I expect it to
>>>>>>>>> contain, for every created subdomain, the list of rows and columns
>>>>>>>>> defining the subblock in the matrix; am I right?
>>>>>>>>>
>>>>>>>>> Context: I have a block-tridiagonal system arising from space-time
>>>>>>>>> finite elements, and I want to solve it with GMRES and a PCGASM
>>>>>>>>> preconditioner, where each overlapping submatrix lies on the
>>>>>>>>> diagonal, is 3x3 blocks in size, and may span multiple processes.
>>>>>>>>> This is PETSc 3.17.1 on Windows.
>>>>>>>>>
>>>>>>>>> Thanks in advance,
>>>>>>>>> Leonardo
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>
>>>>>> --
>>>>>> What most experimenters take for granted before they begin their
>>>>>> experiments is infinitely more interesting than any results to which their
>>>>>> experiments lead.
>>>>>> -- Norbert Wiener
>>>>>>
>>>>>> https://www.cse.buffalo.edu/~knepley/
>>>>>> <http://www.cse.buffalo.edu/~knepley/>
>>>>>>
>>>>>
>>>>
>>>
>>
>