[petsc-users] Set saddle-point structure in parallel

Karin&NiKo niko.karin at gmail.com
Fri Dec 2 12:13:11 CST 2016


Thank you Barry.
If I understand correctly, each process needs to provide, for each field, an IS
containing the global numbers of only those rows of the field that it owns
locally. Right?
This is what I tried to code. I am going to check my implementation.
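
In rough Fortran, what I am aiming for is something like the sketch below (this
is not my actual routine: the include lines are only indicative and depend on
the PETSc version, the matrix A and the PC pc are assumed to be already
created, and the test marking a row as a pressure DOF is just a placeholder for
this 38-DOF example):

#include <petsc/finclude/petscmat.h>
#include <petsc/finclude/petscpc.h>

      Mat            A            ! global matrix, assumed already assembled
      PC             pc           ! fieldsplit PC, assumed already created
      IS             is0, is1     ! velocity / pressure index sets
      PetscInt       rstart, rend, row, n0, n1
      PetscInt, allocatable :: idx0(:), idx1(:)
      PetscErrorCode ierr

!     keep only the rows owned by this process
      call MatGetOwnershipRange(A, rstart, rend, ierr)
      allocate(idx0(rend-rstart), idx1(rend-rstart))
      n0 = 0
      n1 = 0
      do row = rstart, rend-1
!        placeholder test: in this example the pressure DOFs are the
!        global rows 2, 5, 8, 11, 14, 17
         if (row .le. 17 .and. mod(row, 3) .eq. 2) then
            n1 = n1 + 1
            idx1(n1) = row
         else
            n0 = n0 + 1
            idx0(n0) = row
         endif
      enddo
!     each rank passes only its locally owned entries for each field;
!     the communicator is kept as in my current calls, only the contents change
      call ISCreateGeneral(PETSC_COMM_SELF, n0, idx0, PETSC_COPY_VALUES, is0, ierr)
      call ISCreateGeneral(PETSC_COMM_SELF, n1, idx1, PETSC_COPY_VALUES, is1, ierr)
      call PCFieldSplitSetIS(pc, '0', is0, ierr)
      call PCFieldSplitSetIS(pc, '1', is1, ierr)
      deallocate(idx0, idx1)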

Nicolas

2016-12-02 18:34 GMT+01:00 Barry Smith <bsmith at mcs.anl.gov>:

>
>    Each process needs to provide ISs that contain only the entries that are
> local to that process.
>
>   It looks like you might be doing the opposite.
>
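>    Concretely, with rows 0 to 18 owned by process 0 and rows 19 to 37 owned
> by process 1 (as in the example below), the sets would need to be something
> like
>
>       process 0:  is0 = {0,1,3,4,6,7,9,10,12,13,15,16,18}   is1 = {2,5,8,11,14,17}
>       process 1:  is0 = {19,20,...,37}                      is1 = {} (empty)
>
> i.e. exactly the contents that are currently being created on the other rank.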
>
> > On Dec 2, 2016, at 10:36 AM, Karin&NiKo <niko.karin at gmail.com> wrote:
> >
> > Dear all,
> >
> > Thanks to Matt's help, I have been able to set up a fieldsplit
> preconditioner for a Stokes-like problem. But that was in sequential! Now I
> am facing new issues when trying to set up the saddle-point structure in
> parallel.
> >
> > Well, I have a matrix with 38 DOFs. In the global numbering, the pressure
> DOFs are numbered 2, 5, 8, 11, 14, 17 and the velocity DOFs are the others.
> The matrix is distributed over 2 procs: rows 0 to 18 on proc0 and rows 19 to
> 37 on proc1.
> > I have set up the following ISs in order to pass them to the PCFieldSplit:
> > call ISCreateGeneral(PETSC_COMM_SELF, nbddl0, vec_ddl0, PETSC_COPY_VALUES, is0, ierr)
> > call ISCreateGeneral(PETSC_COMM_SELF, nbddl1, vec_ddl1, PETSC_COPY_VALUES, is1, ierr)
> >
> > This is what they contain:
> >
> > is0 on proc0 :
> > -------------------
> > IS Object: 1 MPI processes
> >   type: general
> > Number of indices in set 19
> > 0 19
> > 1 20
> > 2 21
> > 3 22
> > 4 23
> > 5 24
> > 6 25
> > 7 26
> > 8 27
> > 9 28
> > 10 29
> > 11 30
> > 12 31
> > 13 32
> > 14 33
> > 15 34
> > 16 35
> > 17 36
> > 18 37
> >
> > is1 on proc0 :
> > -------------------
> > IS Object: 1 MPI processes
> >   type: general
> > Number of indices in set 0
> >
> > is0 on proc1 :
> > -------------------
> > IS Object: 1 MPI processes
> >   type: general
> > Number of indices in set 13
> > 0 0
> > 1 1
> > 2 3
> > 3 4
> > 4 6
> > 5 7
> > 6 9
> > 7 10
> > 8 12
> > 9 13
> > 10 15
> > 11 16
> > 12 18
> >
> > is1 on proc1 :
> > -------------------
> > IS Object: 1 MPI processes
> >   type: general
> > Number of indices in set 6
> > 0 2
> > 1 5
> > 2 8
> > 3 11
> > 4 14
> > 5 17
> >
> > Then I pass them to the FieldSplit:
> > call PCFieldSplitSetIS(pc,'0',is0, ierr)
> > call PCFieldSplitSetIS(pc,'1',is1, ierr)
> >
> >
> > But when the PC is set up, PETSc complains:
> >
> > [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > [1]PETSC ERROR: Nonconforming object sizes
> > [1]PETSC ERROR: Local column sizes 32 do not add up to total number of columns 19
> > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > [1]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
> > [1]PETSC ERROR: \C0\E3o on a arch-linux2-c-debug named dsp0780450 by B07947 Fri Dec  2 17:07:54 2016
> > [1]PETSC ERROR: Configure options --prefix=/home/B07947/dev/
> codeaster-prerequisites/petsc-3.7.2/Install --with-mpi=yes --with-x=yes
> --download-ml=/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/ml-6.2-p3.tar.gz
> --with-mumps-lib="-L/home/B07947/dev/codeaster-prerequisites/v13/
> prerequisites/Mumps-502_consortium_aster1/MPI/lib -lzmumps -ldmumps
> -lmumps_common -lpord -L/home/B07947/dev/codeaster-prerequisites/v13/
> prerequisites/Scotch_aster-604_aster6/MPI/lib -lesmumps -lptscotch
> -lptscotcherr -lptscotcherrexit -lscotch -lscotcherr -lscotcherrexit
> -L/home/B07947/dev/codeaster-prerequisites/v13/
> prerequisites/Parmetis_aster-403_aster/lib -lparmetis
> -L/home/B07947/dev/codeaster-prerequisites/v13/
> prerequisites/Metis_aster-510_aster1/lib -lmetis -L/usr/lib
> -lscalapack-openmpi -L/usr/lib -lblacs-openmpi -lblacsCinit-openmpi
> -lblacsF77init-openmpi -L/usr/lib/x86_64-linux-gnu -lgomp "
> --with-mumps-include=/home/B07947/dev/codeaster-prerequisites/v13/
> prerequisites/Mumps-502_consortium_aster1/MPI/include
> --with-scalapack-lib="-L/usr/lib -lscalapack-openmpi"
> --with-blacs-lib="-L/usr/lib -lblacs-openmpi -lblacsCinit-openmpi
> -lblacsF77init-openmpi" --with-blas-lib="-L/usr/lib -lopenblas -lcblas"
> --with-lapack-lib="-L/usr/lib -llapack"
> > [1]PETSC ERROR: #1 MatGetSubMatrix_MPIAIJ_Private() line 3181 in /home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > [1]PETSC ERROR: #2 MatGetSubMatrix_MPIAIJ() line 3100 in /home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > [1]PETSC ERROR: #3 MatGetSubMatrix() line 7825 in /home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/src/mat/interface/matrix.c
> > [1]PETSC ERROR: #4 PCSetUp_FieldSplit() line 560 in /home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c
> > [1]PETSC ERROR: #5 PCSetUp() line 968 in /home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/src/ksp/pc/interface/precon.c
> > [1]PETSC ERROR: #6 KSPSetUp() line 390 in /home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/src/ksp/ksp/interface/itfunc.c
> >
> >
> > I am doing something wrong but I cannot see how I should specify the
> layout of my fields.
> >
> > Thanks in advance,
> > Nicolas
> >
> >
> >
> >
> >
> > <image.png>
> >
> > <Matrix38.ascii>
>
>