[petsc-users] Mat created by DMStag cannot access ghost points
Barry Smith
bsmith at petsc.dev
Wed Jun 1 13:06:02 CDT 2022
This appears to be a bug in the DMStag/Mat preallocator code. If, after the DMCreateMatrix() line in your code, you add

    PetscCall(MatSetOption(A, MAT_NO_OFF_PROC_ENTRIES, PETSC_FALSE));

your code will run correctly.
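In context, a minimal sketch of the workaround (assuming A is a plain Mat; in your code, where A appears to be a Mat *, pass *A to MatSetOption()):

    Mat A;
    PetscCall(DMCreateMatrix(s_ctx->dm_P, &A));
    /* Workaround: the preallocator left MAT_NO_OFF_PROC_ENTRIES set to
       PETSC_TRUE; turn it back off so a rank may set values in rows it
       does not own (those are communicated in MatAssemblyBegin/End). */
    PetscCall(MatSetOption(A, MAT_NO_OFF_PROC_ENTRIES, PETSC_FALSE));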
Patrick and Matt,
MatPreallocatorPreallocate_Preallocator() has

    PetscCall(MatSetOption(A, MAT_NO_OFF_PROC_ENTRIES, p->nooffproc));

to make the assembly of the stag matrix from the preallocator matrix a little faster, but it never "undoes" this call. Hence the matrix is left in a state where it will error if someone sets values from a different rank (which they certainly can, using DMStagMatSetValuesStencil()).
I think you need to clear MAT_NO_OFF_PROC_ENTRIES at the end of MatPreallocatorPreallocate_Preallocator(): just because the preallocation process never needed communication does not mean that someone putting real values into the matrix will never use communication; they can put in values any dang way they please.
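Something along these lines at the end of MatPreallocatorPreallocate_Preallocator() would do it (an untested sketch; A is the matrix being preallocated, as in the call above):

    /* Restore the default: p->nooffproc described only the preallocation
       phase, so let the caller's matrix accept off-process entries again. */
    PetscCall(MatSetOption(A, MAT_NO_OFF_PROC_ENTRIES, PETSC_FALSE));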
I don't know why this bug has not come up before.
Barry
> On May 31, 2022, at 11:08 PM, Ye Changqing <Ye_Changqing at outlook.com> wrote:
>
> Dear all,
>
> [BugReport.c] is a sample code, [BugReportParallel.output] is the output when BugReport is executed with mpiexec, and [BugReportSerial.output] is the output from serial execution.
>
> Best,
> Changqing
>
> From: Dave May <dave.mayhem23 at gmail.com>
> Sent: May 31, 2022, 22:55
> To: Ye Changqing <Ye_Changqing at outlook.com>
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] Mat created by DMStag cannot access ghost points
>
>
>
> On Tue 31. May 2022 at 16:28, Ye Changqing <Ye_Changqing at outlook.com> wrote:
> Dear developers of PETSc,
>
> I encountered a problem when using the DMStag module. The program runs perfectly in serial, while errors are thrown in parallel (using mpiexec). Some rows of the Mat cannot be accessed from local processes when looping over all elements of the DMStag. The DM object I use has only one DOF per element, so I could switch to the DMDA module easily, and the program is now back to normal.
>
> Some snippets are below.
>
> Initialise a DMStag object:
> PetscCall(DMStagCreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, M, N, PETSC_DECIDE, PETSC_DECIDE, 0, 0, 1, DMSTAG_STENCIL_BOX, 1, NULL, NULL, &(s_ctx->dm_P)));
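> (The 0, 0, 1 arguments are the DOF counts per vertex, per face, and per element, respectively; DMSTAG_STENCIL_BOX with width 1 gives one layer of ghost elements.)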
> Create a Mat:
> PetscCall(DMCreateMatrix(s_ctx->dm_P, A));
> Loop:
> PetscCall(DMStagGetCorners(s_ctx->dm_V, &startx, &starty, &startz, &nx, &ny, &nz, &extrax, &extray, &extraz));
> for (ey = starty; ey < starty + ny; ++ey)
>   for (ex = startx; ex < startx + nx; ++ex)
>   {
>     ...
>     PetscCall(DMStagMatSetValuesStencil(s_ctx->dm_P, *A, 2, &row[0], 2, &col[0], &val_A[0][0], ADD_VALUES)); // The traceback shows the problem is here.
>   }
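> For reference, a sketch of illustrative stencil entries and the assembly that would follow the loop (not the actual BugReport.c):
>
>     /* Element-based entries; ex + 1 may lie on a neighbouring rank,
>        which is exactly the off-process insertion that fails here. */
>     row[0].loc = DMSTAG_ELEMENT; row[0].i = ex;     row[0].j = ey; row[0].c = 0;
>     row[1].loc = DMSTAG_ELEMENT; row[1].i = ex + 1; row[1].j = ey; row[1].c = 0;
>
>     /* After the loop, communicate and finalize any off-process values. */
>     PetscCall(MatAssemblyBegin(*A, MAT_FINAL_ASSEMBLY));
>     PetscCall(MatAssemblyEnd(*A, MAT_FINAL_ASSEMBLY));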
>
> In addition to the code or MWE, please forward us the complete stack trace / error thrown to stdout.
>
> Thanks,
> Dave
>
>
>
> Best,
> Changqing
>
> <BugReport.c><BugReportParallel.output><BugReportSerial.output>