[petsc-users] Can I assign different local number of row and column to the parallel matrix in different process?

Barry Smith bsmith at petsc.dev
Sun Apr 25 22:53:30 CDT 2021



> On Apr 25, 2021, at 10:04 PM, 王 杰 <wangjieneu at hotmail.com> wrote:
> 
> Thanks very much for your reply!
> 1.
> In the first process, the global index starts from 0 and the local index also starts from 0, so MatCreateAIJ and MatSetValues work well.
> However, in the second process the minimum global index is 601, which doesn't coincide with the local index that starts from 0. Should I apply a global offset of 601 when preparing the parameters d_nnz and o_nnz?

  No, d_nnz and o_nnz have nothing to do with global indices (the global numbering is determined completely by the local sizes set on each rank).
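
  For illustration, a minimal sketch (a fragment with made-up nonzero counts, assumed to sit inside a function with PetscErrorCode ierr declared) of filling d_nnz and o_nnz on the second process; the loop runs over local rows 0..m-1 and no offset of 601 appears anywhere:

    PetscInt m = 900;           /* local number of rows on this rank          */
    PetscInt i, *d_nnz, *o_nnz; /* arrays of length m, indexed by LOCAL row   */
    Mat      A;

    ierr = PetscMalloc2(m, &d_nnz, m, &o_nnz);CHKERRQ(ierr);
    for (i = 0; i < m; i++) {   /* i is the local row index, not a global one */
      d_nnz[i] = 5;             /* nonzeros in columns owned by this rank     */
      o_nnz[i] = 2;             /* nonzeros in columns owned by other ranks   */
    }
    ierr = MatCreateAIJ(PETSC_COMM_WORLD, m, m, PETSC_DETERMINE, PETSC_DETERMINE,
                        0, d_nnz, 0, o_nnz, &A);CHKERRQ(ierr);
    ierr = PetscFree2(d_nnz, o_nnz);CHKERRQ(ierr);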

> 2.
> In the first process, m and n are 1600, but in the second process they are 900. Is it OK to give different m and n in the two processes for the same matrix d_A? For example, (1600,1600) for the first process and (900,900) for the second process?

  Yes
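
  A self-contained sketch, using the hypothetical sizes from your question (1600 local rows on rank 0, 900 on rank 1), that passes a different local size on each rank and then checks which global rows each rank ended up owning:

    #include <petscmat.h>

    int main(int argc, char **argv)
    {
      PetscErrorCode ierr;
      Mat            A;
      PetscMPIInt    rank;
      PetscInt       m, rstart, rend;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
      ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);

      m = (rank == 0) ? 1600 : 900;  /* different local size on each rank */

      ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
      ierr = MatSetSizes(A, m, m, PETSC_DETERMINE, PETSC_DETERMINE);CHKERRQ(ierr);
      ierr = MatSetType(A, MATAIJ);CHKERRQ(ierr);
      ierr = MatSetUp(A);CHKERRQ(ierr);

      /* With two ranks: rank 0 owns global rows [0,1600), rank 1 owns [1600,2500) */
      ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
      ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD, "[%d] owns rows %D to %D\n",
                                     rank, rstart, rend);CHKERRQ(ierr);
      ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);CHKERRQ(ierr);

      ierr = MatDestroy(&A);CHKERRQ(ierr);
      ierr = PetscFinalize();
      return ierr;
    }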

> 3.
> The global indices in both processes are not contiguous. Is it necessary to use the Application Orderings routines to make them contiguous?

  PETSc global indices for matrices and vectors are always contiguous; how you choose to manage the numbering of your own internal objects may or may not relate to the ordering PETSc uses. When you use MatSetValues() and VecSetValues() you must use PETSc's contiguous ordering. You can call MatSetLocalToGlobalMapping() and VecSetLocalToGlobalMapping() to allow you to use MatSetValuesLocal() and VecSetValuesLocal() with arbitrary local orderings.
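
  A hedged sketch of that last point (a fragment, assuming a Mat A already created with MatCreateAIJ() and PetscErrorCode ierr in scope; the index list gidx[] is made up and would normally hold this rank's owned nodes followed by its ghost nodes, in whatever local order your application uses):

    ISLocalToGlobalMapping ltog;
    PetscInt               nlocal  = 3;                  /* hypothetical length       */
    PetscInt               gidx[3] = {1600, 1601, 2001}; /* PETSc global index of each
                                                            local entry               */
    PetscScalar            v       = 1.0;
    PetscInt               lrow    = 0, lcol = 2;        /* local indices only        */

    ierr = ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, nlocal, gidx,
                                        PETSC_COPY_VALUES, &ltog);CHKERRQ(ierr);
    ierr = MatSetLocalToGlobalMapping(A, ltog, ltog);CHKERRQ(ierr);
    ierr = ISLocalToGlobalMappingDestroy(&ltog);CHKERRQ(ierr); /* A keeps a reference */

    /* Inserts into global row 1600, column 2001 without using global indices here */
    ierr = MatSetValuesLocal(A, 1, &lrow, 1, &lcol, &v, ADD_VALUES);CHKERRQ(ierr);
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);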

> 4.
> Is it necessary to define the ghost nodes explicitly with the routine DMDACreate2d(MPI_Comm comm, DMBoundaryType bx, DMBoundaryType by, DMDAStencilType stype, PetscInt M, PetscInt N, PetscInt m, PetscInt n, PetscInt dof, PetscInt s, const PetscInt lx[], const PetscInt ly[], DM *da)?
> Or do I just assign the nodes with the same global indices to both processes, and PETSc will generate the ghost nodes automatically?

  If you use DMDACreate2d(), which only makes sense for 2d structured grids, the arguments determine the exact ghosting: basically the stencil width s and the type of boundary conditions determine the ghosting. You cannot use another ghosting. But global indexing has nothing to do with ghosting; any process can use any global indices it would like and the Mat/Vec handle the needed communication.
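
  For example, a sketch with made-up grid dimensions (a fragment, assuming PetscErrorCode ierr is in scope): a 50x40 global grid, one dof per node, and a star stencil of width 1; the ghost layer is fixed entirely by the stencil type and the width s, not by any global numbering you use elsewhere.

    DM  da;
    Mat A;

    ierr = DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                        DMDA_STENCIL_STAR, 50, 40, PETSC_DECIDE, PETSC_DECIDE,
                        1 /* dof */, 1 /* stencil width s */, NULL, NULL, &da);CHKERRQ(ierr);
    ierr = DMSetUp(da);CHKERRQ(ierr);
    ierr = DMCreateMatrix(da, &A);CHKERRQ(ierr); /* already preallocated to match the stencil */
    ierr = MatDestroy(&A);CHKERRQ(ierr);
    ierr = DMDestroy(&da);CHKERRQ(ierr);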

  Barry

> 
> From: Barry Smith <bsmith at petsc.dev>
> Sent: April 25, 2021 17:22
> To: 王 杰 <wangjieneu at hotmail.com>
> Cc: petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
> 主题: Re: [petsc-users] Can I assign different local number of row and column to the parallel matrix in different process?
>  
> 
>   Yes; but note that for square matrices using n == m almost always makes sense, otherwise the input vector of the matrix-vector product will have a different layout than the output vector. Such code is difficult to manage.
> 
>> On Apr 23, 2021, at 9:29 PM, 王 杰 <wangjieneu at hotmail.com <mailto:wangjieneu at hotmail.com>> wrote:
>> 
>> hello,
>> 
>> I use petsc 3.14.5 with OpenMPI 3.1.6 in Ubuntu 18.04.5.
>> 
>> I don't fully understand the routine MatCreateAIJ(MPI_Comm comm, PetscInt m, PetscInt n, PetscInt M, PetscInt N, PetscInt d_nz, PetscInt *d_nnz, PetscInt o_nz, PetscInt *o_nnz, Mat *A).
>> 
>> I have two processes. The total number of global indices is 2188. The first one deals with global indices 0~1400, with ghost nodes 1401~1500 and 2001~2188. The second process deals with global indices 1400~2100, with ghost nodes 601~700 and 701~800. This is dynamically assigned by a load balancer.
>>  
>> I want to assign the first 1400 rows to the first process and the remaining 788 rows to the second process. Can I give different values to m and n in the routine MatCreateAIJ in different processes?
>> 
>> Thanks.
>>        Wang Jie


