[petsc-users] Field split degree of freedom ordering

Jed Brown jed at jedbrown.org
Wed Nov 2 08:06:40 CDT 2022


Yes, the normal approach is to partition your mesh once, then, for each field, resolve ownership of any interface dofs with respect to the element partition (so a velocity dof at a shared vertex can land on any process that owns an adjacent element, though even this isn't strictly necessary).
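A minimal sketch of that ownership rule, assuming a made-up element-to-dof map and element partition; `resolve_dof_ownership` and the lowest-rank tie-break are illustrative choices, not PETSc API:

```python
# Hypothetical sketch: assign each dof an owner consistent with the
# element partition. A dof shared by elements on several ranks may go to
# any of those ranks; here we break ties with the lowest rank.

def resolve_dof_ownership(element_dofs, element_rank):
    """element_dofs[e] = dofs touched by element e;
    element_rank[e] = rank owning element e in the partition.
    Returns {dof: owning_rank}."""
    owner = {}
    for e, dofs in enumerate(element_dofs):
        rank = element_rank[e]
        for dof in dofs:
            # Interface dofs appear in several elements; keep the lowest
            # adjacent rank (any adjacent rank would be equally valid).
            owner[dof] = min(owner.get(dof, rank), rank)
    return owner

# Two 1D P1 elements sharing vertex dof 1, split across ranks 0 and 1.
owners = resolve_dof_ownership([(0, 1), (1, 2)], [0, 1])
print(owners)  # {0: 0, 1: 0, 2: 1} -- the shared dof 1 lands on rank 0
```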

Alexander Lindsay <alexlindsay239 at gmail.com> writes:

> So, in the latter case, IIUC we can maintain how we distribute data among
> the processes (partitioning of elements) such that, with respect to
> `-ksp_view_pmat`, nothing changes and our velocity and pressure dofs are
> interlaced on a global scale (e.g. each process has some velocity and
> pressure dofs) ... but in order to leverage field split we need those
> index sets to avoid the equal-size constraint?
>
> On Tue, Nov 1, 2022 at 11:57 PM Jed Brown <jed at jedbrown.org> wrote:
>
>> In most circumstances, you can and should interlace in some form such that
>> each block in fieldsplit is distributed across all ranks. If you interlace
>> at scalar granularity as described, then each block needs to be able to do
>> that. So for the Stokes equations with equal order elements (like P1-P1
>> stabilized), you can interlace (u,v,w,p), but for mixed elements (like
>> Q2-P1^discontinuous) you can't interlace in that way. You can still
>> distribute pressure and velocity over all processes, but will need index
>> sets to identify the velocity-pressure splits.
>>
>> Alexander Lindsay <alexlindsay239 at gmail.com> writes:
>>
>> > In the block matrices documentation, it's stated: "Note that for
>> > interlaced storage the number of rows/columns of each block must be
>> > the same size." Is interlacing defined in a global sense, or a
>> > process-local sense? Explicitly: if I don't want the same-size
>> > restriction, do I need to ensure that globally all of my block 1
>> > dofs are numbered after my block 0 dofs, or do I need to follow that
>> > on a process-local level? In libMesh we always follow rank-major
>> > ordering. I'm asking whether, for unequal row sizes, we would need
>> > to strictly follow variable-major ordering in order to split
>> > (splitting here meaning splitting by variable).
>> >
>> > Alex
>>
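The index-set approach discussed above can be sketched in plain Python. The field tags and the `split_field_indices` helper are made up for illustration; with PETSc you would wrap each returned index list in an IS (e.g. via `ISCreateGeneral`) and hand it to `PCFieldSplitSetIS` to define the splits.

```python
# Hypothetical sketch: build per-field global index lists for fields of
# unequal size (e.g. Q2 velocity vs. discontinuous P1 pressure), where
# equal-size blockwise interlacing is impossible but explicit index sets
# can still define the fieldsplit blocks.

def split_field_indices(owned_dofs):
    """owned_dofs: iterable of (global_index, field_name) pairs owned by
    this rank, in any order. Returns {field_name: sorted index list}."""
    splits = {}
    for idx, field in owned_dofs:
        splits.setdefault(field, []).append(idx)
    return {field: sorted(ids) for field, ids in splits.items()}

# A rank owning four velocity dofs and one pressure dof, interlaced in
# the global numbering rather than arranged in equal-size blocks.
local = [(0, "u"), (1, "u"), (2, "p"), (3, "u"), (4, "u")]
print(split_field_indices(local))  # {'u': [0, 1, 3, 4], 'p': [2]}
```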

