[petsc-users] Trying to run https://petsc.org/release/src/ksp/ksp/tutorials/ex72.c.html

Barry Smith bsmith at petsc.dev
Wed Jul 31 17:21:21 CDT 2024


  Take a look at src/ksp/ksp/tutorials/ex71.c

  To have your code below not crash at this point call MatSetBlockSize(A,2) before MatSetUp()
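
  Roughly, an untested sketch using the names from your snippet below:

    MatCreate(PETSC_COMM_WORLD, &A);
    MatSetType(A, MATIS);
    MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, numberDof_global, numberDof_global);
    MatSetBlockSize(A, 2);                  /* must come before MatSetUp() */
    MatSetUp(A);
    ISLocalToGlobalMapping ltogm;
    DMGetLocalToGlobalMapping(dm, &ltogm);
    MatSetLocalToGlobalMapping(A, ltogm, ltogm);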


> On Jul 31, 2024, at 6:01 PM, neil liu <liufield at gmail.com> wrote:
> 
> Hi, all, 
> Following Stefano's advice, my code is reorganized as follows, 
> 
>   MatCreate(PETSC_COMM_WORLD, &A);
>   MatSetType(A, MATIS);
>   MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, numberDof_global, numberDof_global);
>   MatSetUp(A);
>   ISLocalToGlobalMapping ltogm;
>   DMGetLocalToGlobalMapping(dm, &ltogm);
>   MatSetLocalToGlobalMapping(A, ltogm, ltogm);
> 
> Then I just ran the above code snippet, which gave me the following errors (Local size 67 not compatible with block size 2).
> It doesn't seem it is actually calling my routine, but I could be wrong about this.
> Can anyone give me some ideas for debugging this issue?
> I am just coding vector FEM, assigning 2 dofs to each edge and 2 dofs to each face.
> 
> The ltogm seems normal. 
> ISLocalToGlobalMapping Object: 2 MPI processes
>   type not yet set
> [0] 0:2 0:2
> [0] 2:4 2:4
> [0] 4:6 54:56
> [0] 6:8 4:6
> [0] 8:10 6:8
> [0] 10:12 8:10
> [0] 12:14 10:12
> [0] 14:16 56:58
> [0] 16:18 12:14
> [0] 18:20 14:16
> [0] 20:22 16:18
> [0] 22:24 58:60
> [0] 24:26 18:20
> [0] 26:28 20:22
> [0] 28:30 22:24
> [0] 30:32 24:26
> [0] 32:34 60:62
> [0] 34:36 26:28
> [0] 36:38 28:30
> [0] 38:40 30:32
> [0] 40:42 32:34
> [0] 42:44 92:94
> [0] 44:46 94:96
> [0] 46:48 34:36
> [0] 48:50 96:98
> [0] 50:52 36:38
> [0] 52:54 98:100
> [0] 54:56 38:40
> [0] 56:58 40:42
> [0] 58:60 100:102
> [0] 60:62 42:44
> [0] 62:64 102:104
> [0] 64:66 44:46
> [0] 66:68 104:106
> [0] 68:70 46:48
> [0] 70:72 106:108
> [0] 72:74 48:50
> [0] 74:76 50:52
> [0] 76:78 108:110
> [0] 78:80 52:54
> [1] 0:2 54:56
> [1] 2:4 56:58
> [1] 4:6 58:60
> [1] 6:8 60:62
> [1] 8:10 62:64
> [1] 10:12 64:66
> [1] 12:14 66:68
> [1] 14:16 68:70
> [1] 16:18 70:72
> [1] 18:20 72:74
> [1] 20:22 74:76
> [1] 22:24 76:78
> [1] 24:26 78:80
> [1] 26:28 80:82
> [1] 28:30 82:84
> [1] 30:32 84:86
> [1] 32:34 86:88
> [1] 34:36 88:90
> [1] 36:38 90:92
> [1] 38:40 92:94
> [1] 40:42 94:96
> [1] 42:44 96:98
> [1] 44:46 98:100
> [1] 46:48 100:102
> [1] 48:50 102:104
> [1] 50:52 104:106
> [1] 52:54 106:108
> [1] 54:56 108:110
> [1] 56:58 110:112
> [1] 58:60 112:114
> [1] 60:62 114:116
> [1] 62:64 116:118
> [1] 64:66 118:120
> [1] 66:68 120:122
> [1] 68:70 122:124
> [1] 70:72 124:126
> [1] 72:74 126:128
> [1] 74:76 128:130
> [1] 76:78 130:132
> [1] 78:80 132:134
> 
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Arguments are incompatible
> [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> Local size 67 not compatible with block size 2
> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.21.1, unknown
> [0]PETSC ERROR: [1]PETSC ERROR: Arguments are incompatible
> [1]PETSC ERROR: Local size 67 not compatible with block size 2
> [1]PETSC ERROR: ./app on a arch-linux-c-debug by xiaodong.liu Wed Jul 31 17:43:28 2024
> [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=gfortran --with-cxx=g++ --download-fblaslapack --download-mpich --with-scalar-type=complex --download-triangle
> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [1]PETSC ERROR: Petsc Release Version 3.21.1, unknown
> [1]PETSC ERROR: ./app on a arch-linux-c-debug  by xiaodong.liu Wed Jul 31 17:43:28 2024
> [1]PETSC ERROR: #1 PetscLayoutSetBlockSize() at /home/xiaodong.liu/Documents/petsc-3.21.1/petsc/src/vec/is/utils/pmap.c:473
> [0]PETSC ERROR: #2 MatSetLocalToGlobalMapping_IS() at /home/xiaodong.liu/Documents/petsc-3.21.1/petsc/src/mat/impls/is/matis.c:2831
> [0]PETSC ERROR: #3 MatSetLocalToGlobalMapping() at /home/xiaodong.liu/Documents/petsc-3.21.1/petsc/src/mat/interface/matrix.c:2252
> Configure options --with-cc=gcc --with-fc=gfortran --with-cxx=g++ --download-fblaslapack --download-mpich --with-scalar-type=complex --download-triangle
> [1]PETSC ERROR: #1 PetscLayoutSetBlockSize() at /home/xiaodong.liu/Documents/petsc-3.21.1/petsc/src/vec/is/utils/pmap.c:473
> [1]PETSC ERROR: After Mat set local to global mapping!
> #2 MatSetLocalToGlobalMapping_IS() at /home/xiaodong.liu/Documents/petsc-3.21.1/petsc/src/mat/impls/is/matis.c:2831
> [1]PETSC ERROR: #3 MatSetLocalToGlobalMapping() at /home/xiaodong.liu/Documents/petsc-3.21.1/petsc/src/mat/interface/matrix.c:2252
> After Mat set local to global mapping!
> 
> Thanks,
> 
> On Tue, Jul 30, 2024 at 2:51 PM neil liu <liufield at gmail.com> wrote:
>> Hi, Stefano,
>> 
>> I am trying to understand the example you mentioned there. I have a question:
>> the example always uses DMDA. Does BDDC also work with DMPLEX?
>> 
>> Thanks,
>> 
>> On Tue, Jul 30, 2024 at 1:47 PM neil liu <liufield at gmail.com> wrote:
>>> Thanks,  Stefano, 
>>> 
>>> I am trying to modify the code as follows, 
>>> MatCreate(PETSC_COMM_WORLD, &A);
>>> MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, numberDof_global, numberDof_global);
>>> MatSetType(A, MATIS);
>>> MatSetUp(A);
>>> MatZeroEntries(A);
>>> VecCreate(PETSC_COMM_WORLD, &b);
>>> VecSetSizes(b, PETSC_DECIDE, numberDof_global);
>>> VecSetUp(b);
>>> VecSet(b,0.0);
>>> VecDuplicate(b, &x);
>>> 
>>> const PetscInt *g_idx;
>>> ISLocalToGlobalMapping ltogm;
>>> DMGetLocalToGlobalMapping(dm, &ltogm);
>>> ISLocalToGlobalMappingGetIndices(ltogm, &g_idx);
>>> 
>>> //Build idxm_global and Set LHS
>>> idxm_Global[ idxDofLocal ] = g_idx[ numdofPerFace*idxm[idxDofLocal]];
>>> MatSetValues(A, numberDof_local, idxm_Global.data(), numberDof_local, idxm_Global.data(), MatrixLocal.data(), ADD_VALUES);
>>> 
>>> //Set RHS
>>> PetscScalar valueDiag = 1.0 ;
>>> MatZeroRows(A, objGeometryInfo.numberDof_Dirichlet, (objGeometryInfo.arrayDofSeqGlobal_Dirichlet).data(), valueDiag, 0, 0);
>>> 
>>> VecSetValues(b, objGeometryInfo.numberDof_Dirichlet, (objGeometryInfo.arrayDofSeqGlobal_Dirichlet).data(), (objGeometryInfo.dof_Dirichlet).data(), INSERT_VALUES);
>>> VecSetValues(x, objGeometryInfo.numberDof_Dirichlet, (objGeometryInfo.arrayDofSeqGlobal_Dirichlet).data(), (objGeometryInfo.dof_Dirichlet).data(), INSERT_VALUES);
>>> ISLocalToGlobalMappingRestoreIndices(ltogm, &g_idx);
>>> VecAssemblyBegin(b);
>>> VecAssemblyEnd(b);
>>> VecAssemblyBegin(x);
>>> VecAssemblyEnd(x);
>>> It shows the attached error when I run the code. It seems something is wrong with setting the RHS.
>>> Could you please help me double-check the above code for setting up the RHS?
>>> Thanks,
>>> 
>>> On Tue, Jul 30, 2024 at 11:56 AM Stefano Zampini <stefano.zampini at gmail.com> wrote:
>>>> BDDC needs the matrix in MATIS format. Using MatConvert will give you back the right format, but the subdomain matrices are wrong. You need to assemble directly in MATIS format, something like
>>>> 
>>>> MatCreate(comm,&A)
>>>> MatSetType(A,MATIS)
>>>> MatSetLocalToGlobalMapping(A,l2gmap, l2gmap)
>>>> for e in local_elements:
>>>>    E = compute_element_matrix(e)
>>>>    MatSetValues(A,local_element_dofs,local_element_dofs,....)
>>>> 
>>>> l2gmap is an ISLocalToGlobalMapping that stores the global dof numbers of the dofs that are local to the mesh
>>>> 
>>>> See e.g. https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/ksp/tutorials/ex59.c?ref_type=heads or https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/ksp/tutorials/ex71.c?ref_type=heads
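>>>> 
>>>> A rough, untested C sketch of that assembly pattern (nglobal, nel_local, NDOF, get_element_local_dofs, and compute_element_matrix are hypothetical placeholders for your own sizes and element routines):
>>>> 
>>>>   Mat A;
>>>>   MatCreate(PETSC_COMM_WORLD, &A);
>>>>   MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, nglobal, nglobal);
>>>>   MatSetType(A, MATIS);
>>>>   MatSetLocalToGlobalMapping(A, l2gmap, l2gmap);   /* l2gmap: local dof -> global dof */
>>>>   for (PetscInt e = 0; e < nel_local; e++) {
>>>>     PetscInt    edofs[NDOF];        /* LOCAL dof numbers of element e */
>>>>     PetscScalar Ke[NDOF * NDOF];    /* dense element matrix, row major */
>>>>     get_element_local_dofs(e, edofs);
>>>>     compute_element_matrix(e, Ke);
>>>>     MatSetValuesLocal(A, NDOF, edofs, NDOF, edofs, Ke, ADD_VALUES);
>>>>   }
>>>>   MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
>>>>   MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
>>>> 
>>>> The point is that the matrix is assembled directly in MATIS format, element by element, so each process ends up with its own unassembled subdomain matrix for BDDC; MatSetValuesLocal takes the element dofs in LOCAL numbering and lets l2gmap translate them.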
>>>> 
>>>>> On Tue, Jul 30, 2024 at 17:50 neil liu <liufield at gmail.com> wrote:
>>>>> Hi, 
>>>>> I am trying to use PCBDDC for vector-based FEM (complex system, double precision).
>>>>> My code works well with ASM,
>>>>> petsc-3.21.1/petsc/arch-linux-c-opt/bin/mpirun -n 8 ./app -pc_type asm -pc_asm_overlap 6  -ksp_converged_reason -ksp_view -ksp_gmres_modifiedgramschmidt  -ksp_gmres_restart 1500 -ksp_rtol 1e-8 -ksp_monitor -ksp_max_it 100000
>>>>> 
>>>>> When I tried BDDC, it got stuck solving the linear system (it does not print anything for -ksp_monitor). I did the matrix conversion as follows,
>>>>> 
>>>>>    Mat J;
>>>>>    MatConvert(A, MATIS, MAT_INITIAL_MATRIX, &J);
>>>>>    KSPSetOperators(ksp, A, J);
>>>>>    MatDestroy(&J);
>>>>>    KSPSetInitialGuessNonzero(ksp, PETSC_TRUE);
>>>>>    KSPSetFromOptions(ksp);
>>>>> 
>>>>> petsc-3.21.1/petsc/arch-linux-c-debug/bin/mpirun -n 2 ./app -ksp_type cg -pc_type bddc -ksp_monitor  -mat_type is 
>>>>> 
>>>>> Do you have any suggestions? 
>>>>> 
>>>>> Thanks ,
>>>>> Xiaodong 
>>>>> 
>>>>> 
>>>>> On Mon, Jul 29, 2024 at 6:19 PM neil liu <liufield at gmail.com> wrote:
>>>>>> When I compile PETSc for real scalars (to match the data), 
>>>>>> it shows the attached error.
>>>>>> 
>>>>>> The data file is in binary format, right? 
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> On Mon, Jul 29, 2024 at 5:36 PM Stefano Zampini <stefano.zampini at gmail.com> wrote:
>>>>>>> Your PETSc installation is for complex, the data is for real
>>>>>>> 
>>>>>>> On Mon, Jul 29, 2024, 23:14 neil liu <liufield at gmail.com> wrote:
>>>>>>>>  
>>>>>>>> I compiled PETSc with single precision. However, it does not converge with the data.
>>>>>>>> 
>>>>>>>> Please see the attached file. 
>>>>>>>> 
>>>>>>>> On Mon, Jul 29, 2024 at 4:25 PM Barry Smith <bsmith at petsc.dev> wrote:
>>>>>>>>> 
>>>>>>>>>    This can happen if the data was stored in single precision and PETSc was built for double.
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>>> On Jul 29, 2024, at 3:55 PM, neil liu <liufield at gmail.com> wrote:
>>>>>>>>>> 
>>>>>>>>>> Dear PETSc developers,
>>>>>>>>>> 
>>>>>>>>>> I am trying to run 
>>>>>>>>>> https://petsc.org/release/src/ksp/ksp/tutorials/ex72.c.html
>>>>>>>>>> with 
>>>>>>>>>> 
>>>>>>>>>> petsc-3.21.1/petsc/arch-linux-c-opt/bin/mpirun -n 2 ./ex72  -f /Documents/PetscData/poisson_DMPLEX_32x32_16.dat -pc_type bddc -ksp_type cg -ksp_norm_type natural -ksp_error_if_not_converged -mat_type is
>>>>>>>>>> 
>>>>>>>>>> The file was downloaded and put in the directory PetscData. 
>>>>>>>>>> 
>>>>>>>>>> The error is shown as follows,
>>>>>>>>>> 
>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>>>>>>>>> [0]PETSC ERROR: Read from file failed
>>>>>>>>>> [0]PETSC ERROR: Read past end of file
>>>>>>>>>> [0]PETSC ERROR: WARNING! There are unused option(s) set! Could be the program crashed before usage or a spelling mistake, etc!
>>>>>>>>>> [0]PETSC ERROR:   Option left: name:-ksp_error_if_not_converged (no value) source: command line
>>>>>>>>>> [0]PETSC ERROR:   Option left: name:-ksp_norm_type value: natural source: command line
>>>>>>>>>> [0]PETSC ERROR:   Option left: name:-ksp_type value: cg source: command line
>>>>>>>>>> [0]PETSC ERROR:   Option left: name:-pc_type value: bddc source: command line
>>>>>>>>>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.21.1, unknown 
>>>>>>>>>> [0]PETSC ERROR: ./ex72 on a arch-linux-c-opt named
>>>>>>>>>>  Mon Jul 29 15:50:04 2024
>>>>>>>>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=gfortran --with-cxx=g++ --download-fblaslapack --download-mpich --with-scalar-type=complex --download-triangle --with-debugging=no
>>>>>>>>>> [0]PETSC ERROR: #1 PetscBinaryRead() at /home/xxxxxx/Documents/petsc-3.21.1/petsc/src/sys/fileio/sysio.c:327
>>>>>>>>>> [0]PETSC ERROR: #2 PetscViewerBinaryWriteReadAll() at /home/xiaodong.liu/Documents/petsc-3.21.1/petsc/src/sys/classes/viewer/impls/binary/binv.c:1077
>>>>>>>>>> [0]PETSC ERROR: #3 PetscViewerBinaryReadAll() at /home/xiaodong.liu/Documents/petsc-3.21.1/petsc/src/sys/classes/viewer/impls/binary/binv.c:1119
>>>>>>>>>> [0]PETSC ERROR: #4 MatLoad_MPIAIJ_Binary() at Documents/petsc-3.21.1/petsc/src/mat/impls/aij/mpi/mpiaij.c:3093
>>>>>>>>>> [0]PETSC ERROR: #5 MatLoad_MPIAIJ() at /Documents/petsc-3.21.1/petsc/src/mat/impls/aij/mpi/mpiaij.c:3035
>>>>>>>>>> [0]PETSC ERROR: #6 MatLoad() at /Documents/petsc-3.21.1/petsc/src/mat/interface/matrix.c:1344
>>>>>>>>>> [0]PETSC ERROR: #7 MatLoad_IS() at /Documents/petsc-3.21.1/petsc/src/mat/impls/is/matis.c:2575
>>>>>>>>>> [0]PETSC ERROR: #8 MatLoad() at /home/Documents/petsc-3.21.1/petsc/src/mat/interface/matrix.c:1344
>>>>>>>>>> [0]PETSC ERROR: #9 main() at ex72.c:105
>>>>>>>>>> [0]PETSC ERROR: PETSc Option Table entries:
>>>>>>>>>> [0]PETSC ERROR: -f 
>>>>>>>>>> /Documents/PetscData/poisson_DMPLEX_32x32_16.dat (source: command line)
>>>>>>>>>> [0]PETSC ERROR: -ksp_error_if_not_converged (source: command line)
>>>>>>>>>> [0]PETSC ERROR: -ksp_norm_type natural (source: command line)
>>>>>>>>>> [0]PETSC ERROR: -ksp_type cg (source: command line)
>>>>>>>>>> [0]PETSC ERROR: -mat_type is (source: command line)
>>>>>>>>>> [0]PETSC ERROR: -pc_type bddc (source: command line)
>>>>>>>>>> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
>>>>>>>>>> application called MPI_Abort(MPI_COMM_SELF, 66) - process 0
>>>>>>>>> 
>>>> 
>>>> 
>>>> --
>>>> Stefano
