[petsc-users] Vecs and Mats with non-contiguous parallel layout
Dave Lee
davelee2804 at gmail.com
Thu Jan 31 19:30:32 CST 2019
Whoops, thanks Matt, well spotted. Looks a lot better now.
Cheers, Dave.
On Thu, Jan 31, 2019 at 10:08 PM Matthew Knepley <knepley at gmail.com> wrote:
> On Thu, Jan 31, 2019 at 12:22 AM Dave Lee <davelee2804 at gmail.com> wrote:
>
>> Hi Barry and Matt,
>>
>> This seems to work for the outer (composite multiplicative) field split, but fails for the inner (Schur) field split (output error file attached).
>>
>
> The inner FS numbering is with respect to the reduced problem, not the
> original global problem (it can't know you did one FS on the outside).
>
> Thanks,
>
> Matt
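In other words (a minimal sketch, assuming nDofs_u + nDofs_p == nDofsSlice and that each slice's reduced problem is numbered locally from zero on the rank that owns it), the inner ISs would be built with offsets relative to the slice rather than to the global system:

    if (Geometry::procID() == slice_i) {
        /* offsets are within the slice's reduced (sub)problem, not the global system */
        ISCreateStride(MPI_COMM_SELF, nDofs_u, 0, 1, &is_u[slice_i]);
        ISCreateStride(MPI_COMM_SELF, nDofs_p, nDofs_u, 1, &is_p[slice_i]);
    } else {
        ISCreateStride(MPI_COMM_SELF, 0, 0, 1, &is_u[slice_i]);
        ISCreateStride(MPI_COMM_SELF, 0, 0, 1, &is_p[slice_i]);
    }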
>
>
>> I'm setting up the nested field splits as:
>>
>>   SNESGetKSP(snes, &ksp);
>>   KSPGetPC(ksp, &pc);
>>   PCSetType(pc, PCFIELDSPLIT);
>>   PCFieldSplitSetType(pc, PC_COMPOSITE_MULTIPLICATIVE);
>>
>>   for(int slice_i = 0; slice_i < nSlice; slice_i++) {
>>     if(Geometry::procID() == slice_i) {
>>       ISCreateStride(MPI_COMM_SELF, nDofsSlice, slice_i * nDofsSlice, 1, &is_s[slice_i]);
>>     } else {
>>       ISCreateStride(MPI_COMM_SELF, 0, 0, 1, &is_s[slice_i]);
>>     }
>>     MPI_Barrier(MPI_COMM_WORLD);
>>     PCFieldSplitSetIS(pc, NULL, is_s[slice_i]);
>>   }
>>   PCSetUp(pc);
>>
>>   PCFieldSplitGetSubKSP(pc, &n_split, &ksp_i);
>>
>>   for(int slice_i = 0; slice_i < nSlice; slice_i++) {
>>     KSPGetPC(ksp_i[slice_i], &pc_i);
>>     PCSetType(pc_i, PCFIELDSPLIT);
>>
>>     if(Geometry::procID() == slice_i) {
>>       ISCreateStride(MPI_COMM_SELF, nDofs_u, slice_i * nDofsSlice, 1, &is_u[slice_i]);
>>       ISCreateStride(MPI_COMM_SELF, nDofs_p, slice_i * nDofsSlice + nDofs_u, 1, &is_p[slice_i]);
>>     } else {
>>       ISCreateStride(MPI_COMM_SELF, 0, 0, 1, &is_u[slice_i]);
>>       ISCreateStride(MPI_COMM_SELF, 0, 0, 1, &is_p[slice_i]);
>>     }
>>     MPI_Barrier(MPI_COMM_WORLD);
>>     PCFieldSplitSetIS(pc_i, "u", is_u[slice_i]);
>>     PCFieldSplitSetIS(pc_i, "p", is_p[slice_i]);
>>
>>     PCFieldSplitSetType(pc_i, PC_COMPOSITE_SCHUR);
>>     PCSetUp(pc_i);
>>
>>     if(Geometry::procID() == slice_i) {
>>       PCFieldSplitGetSubKSP(pc_i, &m_split, &ksp_j);
>>       KSPSetType(ksp_j[0], KSPGMRES);
>>       KSPSetType(ksp_j[1], KSPGMRES);
>>       KSPGetPC(ksp_j[0], &pc_j);
>>       PCSetType(pc_j, PCJACOBI);
>>       KSPGetPC(ksp_j[1], &pc_j);
>>       PCSetType(pc_j, PCJACOBI);
>>     }
>>     MPI_Barrier(MPI_COMM_WORLD);
>>   }
>>
>>
>> I could re-arrange the indexing and then use PCFieldSplitSetBlockSize(); however, this would undermine the existing block-diagonal structure of the system.
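For comparison, a minimal sketch of the block-size route, assuming (purely hypothetically) that the unknowns were re-arranged into an interleaved [u, p, u, p, ...] ordering with two fields per point:

    PetscInt ufield = 0, pfield = 1;   /* field indices within each interleaved block */
    PCSetType(pc, PCFIELDSPLIT);
    PCFieldSplitSetBlockSize(pc, 2);   /* two interleaved fields per point */
    PCFieldSplitSetFields(pc, "u", 1, &ufield, &ufield);
    PCFieldSplitSetFields(pc, "p", 1, &pfield, &pfield);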
>>
>> Cheers, Dave.
>>
>> On Thu, Jan 31, 2019 at 1:28 PM Smith, Barry F. <bsmith at mcs.anl.gov>
>> wrote:
>>
>>>
>>>
>>> > On Jan 30, 2019, at 7:36 PM, Dave Lee via petsc-users <
>>> petsc-users at mcs.anl.gov> wrote:
>>> >
>>> > Thanks Matt,
>>> >
>>> > I'm currently working through the details of the nested field split. I've decided to re-organise my application data layout so that the local portions are contiguous in the global space for consistency with the default PETSc layout.
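A minimal sketch of that default layout (nLocal is an illustrative name for whatever each rank owns; PETSc then gives every rank one contiguous block of the global numbering):

    Vec      x;
    PetscInt rstart, rend;
    VecCreateMPI(PETSC_COMM_WORLD, nLocal, PETSC_DETERMINE, &x);
    VecGetOwnershipRange(x, &rstart, &rend);   /* this rank owns global rows [rstart, rend) */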
>>> >
>>> > Does the IndexSet need to be the same size on each processor in order for this IndexSet to be supplied to a FieldSplit PC?
>>>
>>>    No; since it is the local size, each process can have a different one.
>>> >
>>> > Moreover, can an IndexSet have 0 entries on a particular processor and still be part of a global FieldSplit PC?
>>>
>>>    Yes (at least in theory, hopefully there are not bugs for this corner case).
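A minimal sketch of both points, using illustrative names: each rank passes only the indices it owns for the split, and that count may legitimately be zero:

    /* on the rank that owns the slice the IS holds nDofsSlice indices;
       on every other rank it is empty                                  */
    PetscInt nLocal = (Geometry::procID() == slice_i) ? nDofsSlice           : 0;
    PetscInt offset = (Geometry::procID() == slice_i) ? slice_i * nDofsSlice : 0;
    ISCreateStride(MPI_COMM_SELF, nLocal, offset, 1, &is_s[slice_i]);
    PCFieldSplitSetIS(pc, NULL, is_s[slice_i]);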
>>> >
>>> > Cheers, Dave.
>>> >
>>> >
>>> > On Wed, Jan 30, 2019 at 11:10 PM Matthew Knepley <knepley at gmail.com>
>>> wrote:
>>> > On Tue, Jan 29, 2019 at 7:58 PM Dave Lee via petsc-users <
>>> petsc-users at mcs.anl.gov> wrote:
>>> > Hi,
>>> >
>>> > just wondering if it's possible to manually specify the parallel decomposition of data within PETSc Vecs and Mats? And if so, will the FieldSplit preconditioning handle this also?
>>> >
>>> > I have an application (non-PETSc) data layout as:
>>> > DATA[16][4][num_procs][3472]
>>> >
>>> > and if possible I would like to use FieldSplit preconditioning without re-arranging this layout.
>>> >
>>> > Yes. You need to call
>>> >
>>> > https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCFieldSplitSetIS.html
>>> >
>>> > in order to define the splits. Note that if you have nested splits, you will have to also call it on the nested PC, which stinks, but we have no other way of getting the information.
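A minimal sketch of that call for a non-contiguous layout like the one above (names and the local count are illustrative; each rank lists only the global indices it owns for one field):

    IS        isField;
    PetscInt *idx, nLocal = 4 * 3472;               /* illustrative per-rank count */
    PetscMalloc1(nLocal, &idx);
    /* ... fill idx[] with the global rows this rank owns for the field ... */
    ISCreateGeneral(PETSC_COMM_WORLD, nLocal, idx, PETSC_OWN_POINTER, &isField);
    PCFieldSplitSetIS(pc, "field0", isField);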
>>> >
>>> > Nested splits can also be handled by implementing the DM interface, but that is a lot of work. It is, however, the way we handle it internally because it makes things much easier and more flexible.
>>> >
>>> > Thanks,
>>> >
>>> > Matt
>>> >
>>> > Currently I have a nested FieldSplit preconditioner with 16 outer field splits, and a Schur complement field split within each. However, this throws up an indexing error, as FieldSplit assumes that the parallel decomposition is contiguous (I think).
>>> >
>>> > [0]PETSC ERROR: #1 ISComplement() line 750 in /home/dlee0035/soft/petsc-3.9.3/src/vec/is/is/utils/iscoloring.c
>>> > [0]PETSC ERROR: #2 PCSetUp_FieldSplit() line 703 in /home/dlee0035/soft/petsc-3.9.3/src/ksp/pc/impls/fieldsplit/fieldsplit.c
>>> > [0]PETSC ERROR: #3 PCSetUp() line 923 in /home/dlee0035/soft/petsc-3.9.3/src/ksp/pc/interface/precon.c
>>> >
>>> > Cheers, Dave.
>>> >
>>> >
>>> >
>>> >
>>> > --
>>> > What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> > -- Norbert Wiener
>>> >
>>> > https://www.cse.buffalo.edu/~knepley/
>>>
>>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>