[petsc-users] error with version 3.17.1

Pierre Jolivet pierre at joliv.et
Tue May 3 13:36:02 CDT 2022



> On 3 May 2022, at 7:33 PM, Randall Mackie <rlmackie862 at gmail.com> wrote:
> 
> Hi Pierre,
> 
> I do not discount that I might be doing something wrong.
> 
> However, this code has not changed and worked fine for versions up to and including 3.16.

What are you implying?
The fact that a code runs does not mean it is bug-free; that would be too simple.

> Can you please explain what changed in 3.17 that is causing this?

First, note that MatSetBlockSize() and ISSetBlockSize() now behave rather differently. The latter is strict and performs many checks, while the former is looser and does not verify the nonzero pattern of the matrix.
What you are seeing is a side effect of two commits.
1) https://gitlab.com/petsc/petsc/-/commit/30f30de87881d4fa2944681a7d91a8e1d429a77c check for valid block size of IS
2) https://gitlab.com/petsc/petsc/-/commit/9dd3beda480d3fbab2ab495f81e1e366d04021c6 propagation of Mat bs to overlapping IS
PCASM uses MatIncreaseOverlap(). You use this PC with a Mat whose block size you told PETSc is 3.
We do not verify that claim on the Mat, but when the block size is propagated to the overlapping IS, it is checked there, and that check errors out.
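To make the failing check concrete, here is a small illustrative Python sketch (not PETSc's actual source) of the kind of validity test ISSetBlockSize() performs, assuming `indices` is the flat index list of the IS:

```python
def valid_block_size(indices, bs):
    """An IS can carry block size bs only if its length is a multiple
    of bs and every group of bs entries is consecutive."""
    if len(indices) % bs != 0:
        return False
    for b in range(0, len(indices), bs):
        for i in range(b, b + bs - 1):
            # Any gap inside a block, like 153055 -> 153124 in the
            # error message, invalidates the block size.
            if indices[i + 1] != indices[i] + 1:
                return False
    return True

print(valid_block_size([0, 1, 2, 9, 10, 11], 3))       # True: two consecutive triples
print(valid_block_size([153054, 153055, 153124], 3))   # False: gap inside a block
```

An overlapping IS produced by MatIncreaseOverlap() on a Mat that is not truly blocked will contain such gaps, which is exactly what the error reports.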

> Are you saying that now you have to explicitly set each 3x3 dense block, even if they are not used and that was not the case before?

That was always the case before; perhaps you have misinterpreted the meaning of a Mat block size?
Again, with version 3.XY (XY < 17), try MatConvert(A,MATBAIJ,MAT_INITIAL_MATRIX,C,ierr) and you will still get an error.
You have two options: either do not set a block size, or set one but make sure that the nonzero pattern of the Mat is indeed made only of dense bs x bs blocks.
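Under the second option, every coupling between grid points would be inserted as a full 3x3 block, explicit zeros included. A sketch in the same Fortran style as the code further down the thread; `row`, `col`, `vals`, `axx`, `ayy`, `azz`, and `i1` (a PetscInt equal to 1) are placeholder names, not from the original code:

```fortran
      ! Sketch only: insert one full 3x3 block at block-row row(1) and
      ! block-column col(1) (zero-based block indices). All 9 entries
      ! are passed, explicit zeros included, so the AIJ nonzero pattern
      ! is truly made of dense 3x3 blocks.
      vals = 0.0
      vals(1) = axx   ! couplings you actually have...
      vals(5) = ayy
      vals(9) = azz   ! ...the remaining entries stay as explicit zeros
      call MatSetValuesBlocked(A,i1,row,i1,col,vals,INSERT_VALUES,ierr)
```

With MatSetValuesBlocked(), preallocation and insertion then agree with the declared block size, and MatConvert() to MATBAIJ should succeed.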

Thanks,
Pierre

> Randy
> 
>> On May 3, 2022, at 10:09 AM, Pierre Jolivet <pierre at joliv.et> wrote:
>> 
>> 
>> 
>>> On 3 May 2022, at 6:54 PM, Randall Mackie <rlmackie862 at gmail.com> wrote:
>>> 
>>> Hi Pierre,
>>> 
>>> Here is how I create the matrix, and how I’ve done it for many years:
>>> 
>>>       ! Create global matrix to hold system of equations resulting from finite discretization
>>>       ! of the Maxwell equations. 
>>>       ngrow3=ginfo%nx*ginfo%ny*ginfo%nz*3
>>>       call MatCreate(comm,A,ierr)
>>>       call MatSetSizes(A,mloc3,mloc3,ngrow3,ngrow3,ierr)
>>>       call MatSetBlockSize(A,i3,ierr)
>> 
>> I don’t know enough about your discretization stencil, but again, since a simple call to MatConvert(A,MATBAIJ,MAT_INITIAL_MATRIX,C,ierr) fails in your MWE, I doubt that this line is correct.
>> 
>>>       call MatSetType(A,MATAIJ,ierr)
>>>       call MatSeqAIJSetPreallocation(A,i15,PETSC_NULL_INTEGER,ierr)
>>>       call MatMPIAIJSetPreallocation(A,i15,PETSC_NULL_INTEGER,i7,PETSC_NULL_INTEGER,ierr)
>>>       call MatSetOption(A,MAT_NEW_NONZERO_ALLOCATION_ERR,PETSC_FALSE,ierr)
>>> 
>>> This is a staggered grid formulation (but derived many years before PETSc had a DMStag class), so I use a regular DMDA. For those locations that are not involved, I put a 1 on the diagonal.
>> 
>> If you want to use a block size of 3, you also need to explicitly set zeros (via MatSetValue(), MatSetValues(), or MatSetValuesBlocked()) for all entries of each 3x3 dense block.
>> 
>> Thanks,
>> Pierre 
>> 
>>> Randy
>>> 
>>>> On May 3, 2022, at 9:37 AM, Pierre Jolivet <pierre at joliv.et> wrote:
>>>> 
>>>> Thanks for the reproducer.
>>>> My guess is that your AIJ matrix does not have a block size of 3.
>>>> A simple call such as call MatConvert(A,MATBAIJ,MAT_INITIAL_MATRIX,C,ierr) also fails, which it should not if your AIJ Mat is truly made of 3x3 dense blocks.
>>>> How did you determine the block size of your Mat?
>>>> Are you allocating 3x3 dense blocks everywhere or are you skipping zero coefficients in your AIJ Mat?
>>>> In the meantime, you can bypass the issue by not setting a block size of 3 on your Mat, or by setting different block sizes for the row and column distributions, see MatSetBlockSizes().
>>>> 
>>>> Thanks,
>>>> Pierre
>>>> 
>>>>> On 3 May 2022, at 5:39 PM, Randall Mackie <rlmackie862 at gmail.com> wrote:
>>>>> 
>>>>> Dear PETSc team:
>>>>> 
>>>>> A part of our code that has worked for years and previous versions is now failing with the latest version 3.17.1, on the KSP solve with the following error:
>>>>> 
>>>>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>>>> [0]PETSC ERROR: Invalid argument
>>>>> [0]PETSC ERROR: Block size 3 is incompatible with the indices: non consecutive indices 153055 153124
>>>>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>>>>> [0]PETSC ERROR: Petsc Release Version 3.17.1, Apr 28, 2022 
>>>>> [0]PETSC ERROR: ./test on a linux-gfortran-complex-debug named rmackie-VirtualBox by rmackie Tue May  3 08:12:15 2022
>>>>> [0]PETSC ERROR: Configure options --with-clean=1 --with-scalar-type=complex --with-debugging=1 --with-fortran=1 --download-mpich=../external/mpich-4.0.1.tar.gz
>>>>> [0]PETSC ERROR: #1 ISSetBlockSize() at /home/rmackie/PETSc/petsc-3.17.1/src/vec/is/is/interface/index.c:1898
>>>>> [0]PETSC ERROR: #2 MatIncreaseOverlap() at /home/rmackie/PETSc/petsc-3.17.1/src/mat/interface/matrix.c:7086
>>>>> [0]PETSC ERROR: #3 PCSetUp_ASM() at /home/rmackie/PETSc/petsc-3.17.1/src/ksp/pc/impls/asm/asm.c:238
>>>>> [0]PETSC ERROR: #4 PCSetUp() at /home/rmackie/PETSc/petsc-3.17.1/src/ksp/pc/interface/precon.c:990
>>>>> [0]PETSC ERROR: #5 KSPSetUp() at /home/rmackie/PETSc/petsc-3.17.1/src/ksp/ksp/interface/itfunc.c:407
>>>>> [0]PETSC ERROR: #6 KSPSolve_Private() at /home/rmackie/PETSc/petsc-3.17.1/src/ksp/ksp/interface/itfunc.c:843
>>>>> [0]PETSC ERROR: #7 KSPSolve() at /home/rmackie/PETSc/petsc-3.17.1/src/ksp/ksp/interface/itfunc.c:1078
>>>>> 
>>>>> 
>>>>> I have a small test program with a binary matrix and right-hand side that will demonstrate the problem, and I can send it as a zip file; please advise what email address to use or where to send it.
>>>>> 
>>>>> Thanks, 
>>>>> 
>>>>> Randy M.
>>>> 
>>> 
>> 
> 


