[petsc-dev] Preconditioner for Schur complement (pcfieldsplit)

Thomas Witkowski thomas.witkowski at tu-dresden.de
Thu Apr 7 02:42:17 CDT 2011


Jed Brown wrote:
> On Wed, Apr 6, 2011 at 11:36, Thomas Witkowski 
> <thomas.witkowski at tu-dresden.de> wrote:
>
>     This brings me back to the starting point of my current problem:
>     how to set splits for MatNest correctly? PCFieldSplitSetFields
>     seems not to work in my case.
>
>
> PCFieldSplitSetFields() is a special case that only works when the 
> fields are interlaced. I would like to get rid of it, but there are 
> still cases where it provides a faster implementation, so we'll wait 
> until that faster implementation can be accessed through the generic 
> interface.
>  
>
>     When I want to make use of PCFieldSplitSetIS for splitting
>     MatNest, which indices should I provide? The MatNest I've created
>     consists of the two main diagonal blocks with indices 0 to n and 0
>     to m.
>
>
> This does not make sense because these index sets overlap. Maybe you 
> want to use (0:m) and (m:m+n), then these are the index sets you pass 
> to PCFieldSplitSetIS(). With two processes, you would usually have
>
> is0 = [(0:m0), (m0+n0:m0+n0+m1)]
> is1 = [(m0:m0+n0), (m0+n0+m1:m0+n0+m1+n1)]
>
> This is the decomposition that is set up automatically by 
> MatCreateNest when you pass PETSC_NULL for the index sets.
>
> You should be able to retrieve the index sets that MatNest created 
> using MatNestGetSubMats(), but this is not implemented (Dave and I 
> discussed it and agree, but a patch has not been pushed yet).
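Following the is0/is1 layout above, I set up the splits roughly like this 
(a minimal serial sketch; m and n stand for the sizes of my two diagonal 
blocks, and I use the PCFieldSplitSetIS() variant that takes a split name, 
as in current petsc-dev):

  IS is0, is1;

  /* first split: rows 0 .. m-1, second split: rows m .. m+n-1 */
  ISCreateStride(PETSC_COMM_WORLD, m, 0, 1, &is0);
  ISCreateStride(PETSC_COMM_WORLD, n, m, 1, &is1);

  PCFieldSplitSetIS(pc, "0", is0);
  PCFieldSplitSetIS(pc, "1", is1);

In parallel, each split would instead be the union of the per-process 
strides as in the is0/is1 above.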
Doing it this way seems to work correctly, at least for the 
initialization. When solving the system, however, I get a segfault. Here 
is the stack trace (I am using the current petsc-dev):

#0  0x00000000008ce6b3 in PetscMemcpy (a=0x188bba0, b=0x2, n=128) at 
/fastfs/witkowsk/software/petsc-dev/include/petscsys.h:1777
#1  0x00000000008d0935 in VecScatterBegin_1 (ctx=0x18a4460, 
xin=0x17b1d70, yin=0x1889840, addv=INSERT_VALUES, mode=SCATTER_FORWARD) 
at /fastfs/witkowsk/software/petsc-dev/src/vec/vec/utils/vpscat.h:114
#2  0x00000000008c6953 in VecScatterBegin (inctx=0x18a4460, x=0x17b1d70, 
y=0x1889840, addv=INSERT_VALUES, mode=SCATTER_FORWARD) at 
/fastfs/witkowsk/software/petsc-dev/src/vec/vec/utils/vscat.c:1575
#3  0x00000000007160b6 in PCApply_FieldSplit_Schur (pc=0x182f4e0, 
x=0x17b1d70, y=0x17c2640) at 
/fastfs/witkowsk/software/petsc-dev/src/ksp/pc/impls/fieldsplit/fieldsplit.c:565
#4  0x0000000000c7015c in PCApply (pc=0x182f4e0, x=0x17b1d70, 
y=0x17c2640) at 
/fastfs/witkowsk/software/petsc-dev/src/ksp/pc/interface/precon.c:384
#5  0x00000000007ff6e3 in KSPSolve_PREONLY (ksp=0x17fe730) at 
/fastfs/witkowsk/software/petsc-dev/src/ksp/ksp/impls/preonly/preonly.c:26
#6  0x000000000077b246 in KSPSolve (ksp=0x17fe730, b=0x17b1d70, 
x=0x17c2640) at 
/fastfs/witkowsk/software/petsc-dev/src/ksp/ksp/interface/itfunc.c:426
....

The problem is that the data field (vec->data) of the nested vectors is 
empty. I created the nested vectors in exactly the same way as in 
src/ksp/ksp/examples/tests/ex22.c. When I call VecGetArray(myvec, &vecptr) 
on a nested vector myvec, vecptr is not a valid pointer (see the small 
check below). This is the reason for the segfault. Is this a bug in 
PETSc or a problem with my usage of PETSc's interface?
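
Roughly, this is the check I did (myvec and vecptr are just names from my 
test code):

  PetscScalar *vecptr;

  /* myvec is the nested vector that is passed to KSPSolve() */
  VecGetArray(myvec, &vecptr);
  /* vecptr cannot be dereferenced here, which matches the crash
     in PetscMemcpy() in the stack trace above */
  VecRestoreArray(myvec, &vecptr);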

Thomas


