[petsc-users] Field split degree of freedom ordering
Alexander Lindsay
alexlindsay239 at gmail.com
Wed Nov 2 07:52:01 CDT 2022
So, in the latter case, IIUC we can keep how we distribute data among the
processes (our partitioning of elements) such that, with respect to
`-ksp_view_pmat`, nothing changes and our velocity and pressure dofs stay
interlaced on a global scale (i.e. each process owns some velocity and some
pressure dofs) ... but to leverage field split we need those index sets in
order to avoid the equal-size constraint?
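Concretely, I'm imagining something like the sketch below (untested; nv/np
and vel_idx/p_idx are made-up placeholders for the locally owned global
velocity/pressure dof indices that our DofMap would hand us):

#include <petscksp.h>

/* Sketch: tell PCFIELDSPLIT which rows belong to which field via index
   sets, so the velocity and pressure blocks need not be equal size.
   Each rank passes only its locally owned dof indices, so both splits
   remain distributed across all ranks. */
PetscErrorCode SetupVelocityPressureSplits(KSP ksp, PetscInt nv,
                                           const PetscInt vel_idx[],
                                           PetscInt np,
                                           const PetscInt p_idx[])
{
  PC pc;
  IS is_vel, is_p;

  PetscFunctionBeginUser;
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCFIELDSPLIT));
  PetscCall(ISCreateGeneral(PETSC_COMM_WORLD, nv, vel_idx,
                            PETSC_COPY_VALUES, &is_vel));
  PetscCall(ISCreateGeneral(PETSC_COMM_WORLD, np, p_idx,
                            PETSC_COPY_VALUES, &is_p));
  PetscCall(PCFieldSplitSetIS(pc, "u", is_vel));
  PetscCall(PCFieldSplitSetIS(pc, "p", is_p));
  PetscCall(ISDestroy(&is_vel));
  PetscCall(ISDestroy(&is_p));
  PetscFunctionReturn(0);
}

with options like -pc_fieldsplit_type schur, -fieldsplit_u_..., and
-fieldsplit_p_... then addressing the two splits as usual.
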
On Tue, Nov 1, 2022 at 11:57 PM Jed Brown <jed at jedbrown.org> wrote:
> In most circumstances, you can and should interlace in some form such that
> each block in fieldsplit is distributed across all ranks. If you interlace
> at scalar granularity as described, then every block needs to support that
> granularity. So for the Stokes equations with equal-order elements (like
> P1-P1 stabilized), you can interlace (u,v,w,p), but for mixed elements
> (like Q2-P1^discontinuous) you can't interlace in that way. You can still
> distribute pressure and velocity over all processes, but you will need
> index sets to identify the velocity-pressure splits.
>
> Alexander Lindsay <alexlindsay239 at gmail.com> writes:
>
> > In the block matrices documentation, it's stated: "Note that for
> > interlaced storage the number of rows/columns of each block must be the
> > same size." Is interlacing defined in a global sense, or a process-local
> > sense? Explicitly: if I don't want the same-size restriction, do I need
> > to ensure that globally all of my block 1 dofs are numbered after my
> > block 0 dofs, or do I need to follow that only on a process-local level?
> > Essentially, in libMesh we always follow rank-major ordering. I'm asking
> > whether, for unequal row sizes, we would need to strictly follow
> > variable-major ordering in order to split (splitting here meaning
> > splitting by variable)?
> >
> > Alex
>
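To put concrete numbers on the Q2-P1^discontinuous case above: in 2D, each
quadrilateral element has 9 velocity nodes (18 scalar dofs for (u,v)) but
only 3 element-local pressure dofs, so there is no per-node (u,v,p) pattern
to interlace at scalar granularity; the splits have different sizes and must
be identified by index sets, as in the sketch above.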