[petsc-users] Field split degree of freedom ordering
Jed Brown
jed at jedbrown.org
Tue Nov 1 23:57:23 CDT 2022
In most circumstances, you can and should interlace in some form such that each block in fieldsplit is distributed across all ranks. If you interlace at scalar granularity as described, then each block needs to support that, i.e., every field must have a degree of freedom at each interlaced location. So for the Stokes equations with equal-order elements (like P1-P1 stabilized), you can interlace (u,v,w,p), but for mixed elements (like Q2-P1^discontinuous) you can't interlace in that way. You can still distribute pressure and velocity over all processes, but you will need index sets to identify the velocity-pressure splits.
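For the mixed-element case, a minimal sketch of the index-set approach might look like the following. The layout here is a hypothetical placeholder: it pretends the first three quarters of each rank's locally owned rows are velocity and the rest pressure, and assembles a 40x40 identity matrix just so the example runs end to end; a real code would obtain the velocity and pressure rows from its discretization (e.g. the libMesh DofMap).

#include <petscksp.h>

int main(int argc, char **argv)
{
  KSP      ksp;
  PC       pc;
  Mat      A;
  Vec      x, b;
  IS       is_u, is_p;
  PetscInt rstart, rend, nlocal, nu, np;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* Placeholder operator: a 40x40 identity so the sketch runs.
     A real application assembles the coupled velocity-pressure matrix here. */
  PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
  PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 40, 40));
  PetscCall(MatSetFromOptions(A));
  PetscCall(MatSetUp(A));
  PetscCall(MatGetOwnershipRange(A, &rstart, &rend));
  for (PetscInt i = rstart; i < rend; i++) PetscCall(MatSetValue(A, i, i, 1.0, INSERT_VALUES));
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

  /* Hypothetical per-rank layout: the first nu locally owned rows are velocity,
     the remaining np are pressure.  In practice these index sets come from the
     discretization, and the two fields need not have the same number of rows. */
  nlocal = rend - rstart;
  nu     = 3 * nlocal / 4;
  np     = nlocal - nu;
  PetscCall(ISCreateStride(PETSC_COMM_WORLD, nu, rstart, 1, &is_u));
  PetscCall(ISCreateStride(PETSC_COMM_WORLD, np, rstart + nu, 1, &is_p));

  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCFIELDSPLIT));
  /* Name the splits; options then use the prefixes -fieldsplit_u_ and -fieldsplit_p_ */
  PetscCall(PCFieldSplitSetIS(pc, "u", is_u));
  PetscCall(PCFieldSplitSetIS(pc, "p", is_p));
  PetscCall(KSPSetFromOptions(ksp));

  PetscCall(MatCreateVecs(A, &x, &b));
  PetscCall(VecSet(b, 1.0));
  PetscCall(KSPSolve(ksp, b, x));

  PetscCall(ISDestroy(&is_u));
  PetscCall(ISDestroy(&is_p));
  PetscCall(VecDestroy(&x));
  PetscCall(VecDestroy(&b));
  PetscCall(MatDestroy(&A));
  PetscCall(KSPDestroy(&ksp));
  PetscCall(PetscFinalize());
  return 0;
}

For the equal-order, scalar-interlaced case you don't need index sets at all: you can set the block size (PCFieldSplitSetBlockSize, or -pc_fieldsplit_block_size 4 for (u,v,w,p)) and let fieldsplit define strided splits.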
Alexander Lindsay <alexlindsay239 at gmail.com> writes:
> In the block matrices documentation, it's stated: "Note that for interlaced
> storage the number of rows/columns of each block must be the same size" Is
> interlacing defined in a global sense, or a process-local sense? So
> explicitly, if I don't want the same size restriction, do I need to ensure
> that globally all of my block 1 dofs are numbered after my block 0 dofs? Or
> do I need to follow that on a process-local level? Essentially in libMesh
> we always follow rank-major ordering. I'm asking whether, for unequal row
> sizes, in order to split, we would need to strictly follow variable-major
> ordering (splitting here meaning splitting by variable).
>
> Alex