[petsc-users] FEM Matrix Assembly of Overlapped Mesh
袁煕
yuanxi at advancesoft.jp
Sun Jul 23 10:50:04 CDT 2023
Hi,
I used PETSc to assemble the FEM stiffness matrix of an overlapped (overlap=2)
DMPlex and used the MUMPS solver to solve it. But I got different solutions
when running on 1 CPU and with MPI parallel computation. I am wondering if I
missed some necessary step or setting in my implementation.
My calling process is as follows:
1. Generate matrix pkij from DMPlex dm_mesh
```
PetscCallA(DMCreateMatrix(dm_mesh, pkij, ierr))
PetscCallA(MatSetBlockSize(pkij, 1, ierr))
if (overlap > 0) call MatSetOption(pkij, MAT_IGNORE_OFF_PROC_ENTRIES, PETSC_TRUE, ierr)
PetscCallA(MatSetFromOptions(pkij, ierr))
PetscCallA(MatGetLocalSize(pkij, m, n, ierr))
PetscCallA(MatGetSize(pkij, gm, gn, ierr))
PetscCallA(MatGetBlockSize(pkij, bs, ierr))
```
2. Do the matrix preallocation
```
!
! Create a preallocator matrix with sizes (local and global) matching the jacobian pkij.
!
PetscCallA(MatCreate(PETSC_COMM_WORLD, preallocator, ierr))
PetscCallA(MatSetType(preallocator, MATPREALLOCATOR, ierr))
PetscCallA(MatSetSizes(preallocator, m, n, gm, gn, ierr))
PetscCallA(MatSetBlockSize(preallocator, bs, ierr))
PetscCallA(MatSetUp(preallocator, ierr))
.......
PetscCallA(MatPreallocatorPreallocate(preallocator, PETSC_TRUE, pkij, ierr))
PetscCallA(MatDestroy(preallocator, ierr))
```
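The "......." part above is where the nonzero pattern is inserted into the preallocator before it is copied to pkij. A minimal sketch of that part, using nelem, eltdof and eltidx as illustrative placeholder names (not the actual variables in my code) for the number of local elements, the dofs per element and the global dof indices of each element:
```
! Sketch only: register the element sparsity pattern in the preallocator
! by inserting zero blocks, then assemble the preallocator.
PetscInt    :: ie
PetscScalar :: zeros(eltdof, eltdof)

zeros = 0.0
do ie = 1, nelem
   PetscCallA(MatSetValues(preallocator, eltdof, eltidx(:, ie), eltdof, eltidx(:, ie), zeros, INSERT_VALUES, ierr))
end do
PetscCallA(MatAssemblyBegin(preallocator, MAT_FINAL_ASSEMBLY, ierr))
PetscCallA(MatAssemblyEnd(preallocator, MAT_FINAL_ASSEMBLY, ierr))
```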
3. Assemble the matrix pkij by calling MatSetValue for all overlapped elements (a sketch of this step is given below).
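A minimal sketch of the assembly loop, assuming the same placeholder names as above plus eltmat(:,:,ie) for the (symmetric) dense element stiffness of element ie:
```
! Sketch only: every rank adds the contributions of all elements it stores,
! including the overlap (ghost) elements, using ADD_VALUES.
do ie = 1, nelem
   PetscCallA(MatSetValues(pkij, eltdof, eltidx(:, ie), eltdof, eltidx(:, ie), eltmat(:, :, ie), ADD_VALUES, ierr))
end do
PetscCallA(MatAssemblyBegin(pkij, MAT_FINAL_ASSEMBLY, ierr))
PetscCallA(MatAssemblyEnd(pkij, MAT_FINAL_ASSEMBLY, ierr))
```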
In my implementation above, I used
MatSetOption(pkij, MAT_IGNORE_OFF_PROC_ENTRIES, PETSC_TRUE, ierr). Is that
correct, or are other options needed as well?
Thanks for your help.