[petsc-dev] [petsc-maint #47934] fieldsplit

Barry Smith bsmith at mcs.anl.gov
Thu Jun 17 12:04:39 CDT 2010



On Jun 17, 2010, at 11:44 AM, Matthew Knepley wrote:

> On Thu, Jun 17, 2010 at 11:27 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> 
>    Matt,
> 
>      Jed is not on petsc-maint. He is supposed to be writing his thesis :-)
> 
> Cool, I only asked because I knew he did stuff on MatGetSubMatrix and its interaction
> with FS.
>  
>       MatGetSubMatrices() was written for doing ASM, hence it only works in that limited way. 
> 
> Can't this be fixed by making these submatrices have block size one by default, specified from
> FS? That would leave the ASM behavior intact. This moves MatGetSubMatrices() more in the
> direction of MatLoad(), where you can pass in constructed Mats if you want.
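A minimal sketch of the pattern Matt describes, assuming the MatLoad(Mat, PetscViewer) calling sequence (newer than the 3.1 one), where the caller constructs the Mat and picks its format before loading; the file name is hypothetical:

    Mat         A;
    PetscViewer viewer;

    /* the caller builds the Mat and chooses its format up front;
       MatLoad() then fills in the given object */
    PetscViewerBinaryOpen(PETSC_COMM_WORLD, "system.dat", FILE_MODE_READ, &viewer);
    MatCreate(PETSC_COMM_WORLD, &A);
    MatSetType(A, MATMPIAIJ);
    MatLoad(A, viewer);
    PetscViewerDestroy(&viewer);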

   Yes, eventually we should provide more general support for the format of the submatrices. It is just that ALL the logic for MatGetSubMatrix.... for BAIJ matrices (which is pretty nastily complicated) is built around always getting all of a block, so a good amount of complex code needs to be written to provide the general support; it won't happen today.

  Barry
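To make the whole-block assumption concrete, here is a minimal sketch of the kind of strided index set FieldSplit builds for the pressure field of the interlaced u,v,w,p layout discussed below; nloc is a hypothetical local count:

    IS       isp;
    PetscInt nloc = 63686;   /* hypothetical: number of local nodes */

    /* field 3 (pressure) of an interlaced u,v,w,p layout with bs = 4:
       every 4th index, starting at offset 3 */
    ISCreateStride(PETSC_COMM_WORLD, nloc, 3, 4, &isp);

Such an IS never contains a complete 4-wide block, which is exactly what the BAIJ MatGetSubMatrix() code assumes it will be given.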

> 
>    Matt
>  
>       This guy should just be using AIJ matrices; he gets no advantage from using BAIJ anyway.
> 
>    Barry
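A sketch of Barry's suggestion, using the PETSc 3.1-era MatCreateMPIAIJ() calling sequence with the sizes reported below; the per-row preallocation bound is an assumption (4 unknowns times up to 116 neighbouring nodes plus the node itself):

    Mat A;

    /* the same system in pointwise AIJ format; FieldSplit can then extract
       arbitrary fields without the whole-block restriction */
    MatCreateMPIAIJ(PETSC_COMM_WORLD,
                    PETSC_DECIDE, PETSC_DECIDE,  /* local sizes */
                    254744, 254744,              /* global rows and columns */
                    468, PETSC_NULL,             /* assumed worst-case row nnz, diagonal block */
                    468, PETSC_NULL,             /* assumed worst-case row nnz, off-diagonal block */
                    &A);
    /* the interlaced layout is then declared to FieldSplit separately,
       e.g. with -pc_fieldsplit_block_size 4 */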
> 
> On Jun 17, 2010, at 9:47 AM, Matthew Knepley wrote:
> 
>> Barry and Jed,
>> 
>>  I tried to replicate this, and the problem seems to be with
>> MatGetSubMatrix() and BAIJ
>> matrices. It is assumed that the submatrix has the same block size, which
>> will not be
>> true for FieldSplit. How does this not screw up for the DA version?
>> 
>>   Matt
>> 
>> On Thu, Jun 17, 2010 at 8:14 AM, Laurent Michel <laurent.michel at epfl.ch> wrote:
>> 
>>> Hi,
>>> 
>>> Matthew Knepley wrote:
>>>> On Thu, Jun 17, 2010 at 5:43 AM, Laurent Michel
>>>> <laurent.michel at epfl.ch <mailto:laurent.michel at epfl.ch>> wrote:
>>>> 
>>>>    Hi again,
>>>> 
>>>>    one more thing I am a bit worried about. Following your advice, I am
>>>>    intending to use the fieldsplit preconditioner in my code. I have
>>>>    browsed the web a bit and I have not been able to find much
>>>>    information
>>>>    about it. Even your documentation is not that clear to me, hence my
>>>>    question.
>>>> 
>>>>    I am solving a Stokes problem in 3d (P1-P1 stabilized elements).
>>>>    Originally, the matrix of the linear system is ordered in the
>>>>    following
>>>>    basis (for local elements):
>>>> 
>>>>    {(phi0, 0, 0), ..., (phi3, 0, 0), (0, phi0, 0), ..., (0, phi3, 0),
>>>>    (0, 0, phi0), ..., (0, 0, phi3), phi0, phi1, phi2, phi3}
>>>> 
>>>>    I am using the MATMPIBAIJ matrix format, and I have chosen a
>>>>    block size of 4, because I have 4 degrees of freedom (following
>>>>    your documentation). Anyway, if I wanted to choose a block size
>>>>    of 16 (which would have been a natural choice for me), it wouldn't
>>>>    work because, if I have understood correctly, the number of rows
>>>>    must be divisible by the block size.
>>>> 
>>>> 
>>>> Let's start here. There is some misunderstanding. I cannot understand
>>>> what you mean by degree of freedom, or we might say "unknown". In P1-P1,
>>>> on what I am assuming is a tetrahedral grid, you have 4 unknowns per
>>>> vertex, and thus 16 unknowns per element. You would use a block size of 4
>>>> (indicating a vertex with u,v,w,p). Then you could split apart the
>>>> fields if you wanted.
>>>> 
>>>>    Matt
>>> Ok. My degrees of freedom are as you say, a 3d velocity (vx, vy, vz) and
>>> a pressure (p). I have an unstructured mesh with
>>> 
>>> 337983 elements
>>> 63686 nodes
>>> 
>>> I have penalised the velocity where it was set to 0, so that I can use
>>> MPIBAIJ. My system consists of a matrix of
>>> 
>>> 254744 = 4*63686 rows
>>> 14490112 non-zeros
>>> 
>>> The maximal number of neighbours of a node is 116.
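A minimal sketch of the penalty treatment Laurent mentions, assuming it is implemented by adding a large value to the diagonal of each constrained velocity row so the 4x4 block structure stays intact; the node index, component, and penalty magnitude are all hypothetical:

    PetscScalar penalty = 1.0e12;      /* assumed magnitude */
    PetscInt    node = 0, comp = 0;    /* hypothetical constrained node and component */
    PetscInt    row  = 4*node + comp;  /* interlaced u,v,w,p numbering */

    /* pin the unknown to (approximately) zero without deleting the row */
    MatSetValue(A, row, row, penalty, ADD_VALUES);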
>>> 
>>> Given that matrix, created with
>>> 
>>> MatCreateMPIBAIJ(4,
>>>                 PETSC_DECIDE,
>>>                 PETSC_DECIDE,
>>>                 254744,
>>>                 254744,
>>>                 116,
>>>                 PETSC_DECIDE,
>>>                 &A);
>>> 
>>> if I now run my program with the following options (on any number of
>>> procs, but here with 8 procs)
>>> 
>>> -pc_type fieldsplit -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3
>>> 
>>> I get the error below (when the program calls KSPSolve()):
>>> 
>>> [7]PETSC ERROR: --------------------- Error Message
>>> ------------------------------------
>>> [7]PETSC ERROR: Nonconforming object sizes!
>>> [7]PETSC ERROR: Local column sizes 47760 do not add up to total number
>>> of columns 63686!
>>> [7]PETSC ERROR:
>>> ------------------------------------------------------------------------
>>> [7]PETSC ERROR: Petsc Release Version 3.1.0, Patch 1, Thu Apr  8
>>> 14:16:50 CDT 2010
>>> [7]PETSC ERROR: See docs/changes/index.html for recent updates.
>>> [7]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>> [7]PETSC ERROR: See docs/index.html for manual pages.
>>> [7]PETSC ERROR:
>>> ------------------------------------------------------------------------
>>> [7]PETSC ERROR: stokes-5.3b01 on a linux-gnu named iacspc3 by lmichel
>>> Thu Jun 17 15:05:51 2010
>>> [7]PETSC ERROR: Libraries linked from
>>> /data/progr/petsc-3.1-openmpi-1.4.2/lib
>>> [7]PETSC ERROR: Configure run at Sat May 29 14:01:23 2010
>>> [7]PETSC ERROR: Configure options
>>> --prefix=/data/progr/petsc-3.1-openmpi-1.4.2
>>> --PETSC_ARCH=linux-gnu-cxx-openmpi-1.4.2-opt --with-clanguage=C++
>>> --with-debugging=0 --with-blas-lib=-lblas --with-lapack-lib=-llapack
>>> --with-shared --with-mpi-dir=/data/progr/openmpi-1.4.2
>>> --download-umfpack=1 --with-umfpack=1 --download-parmetis=1
>>> --with-parmetis=1 --download-prometheus=1 --with-prometheus=1
>>> [7]PETSC ERROR:
>>> ------------------------------------------------------------------------
>>> [7]PETSC ERROR: MatGetSubMatrix_MPIBAIJ() line 1927 in
>>> src/mat/impls/baij/mpi/mpibaij.c
>>> [7]PETSC ERROR: MatGetSubMatrix_MPIBAIJ() line 1851 in
>>> src/mat/impls/baij/mpi/mpibaij.c
>>> [7]PETSC ERROR: MatGetSubMatrix() line 6425 in src/mat/interface/matrix.c
>>> [7]PETSC ERROR: PCSetUp_FieldSplit() line 296 in
>>> src/ksp/pc/impls/fieldsplit/fieldsplit.c
>>> [7]PETSC ERROR: PCSetUp() line 795 in src/ksp/pc/interface/precon.c
>>> [7]PETSC ERROR: KSPSetUp() line 237 in src/ksp/ksp/interface/itfunc.c
>>> [7]PETSC ERROR: KSPSolve() line 353 in src/ksp/ksp/interface/itfunc.c
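For reference, a sketch of setting the same splits in code rather than via the options above; this assumes the split-name-based PCFieldSplitSetFields() calling sequence of later PETSc versions, which may not match 3.1 exactly:

    PC       pc;
    PetscInt vel[] = {0, 1, 2}, pre[] = {3};

    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCFIELDSPLIT);
    PCFieldSplitSetBlockSize(pc, 4);                    /* interlaced u,v,w,p */
    PCFieldSplitSetFields(pc, "velocity", 3, vel, vel); /* split 0: fields 0,1,2 */
    PCFieldSplitSetFields(pc, "pressure", 1, pre, pre); /* split 1: field 3 */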
>>> 
>>> thanks for your help,
>>> 
>>> L.
>>> 
>>> --
>>> Laurent Michel
>>> PhD candidate
>>> EPFL SB IACS ASN
>>> MA C2 644
>>> +41 21 693 42 46
>>> +41 77 433 38 94
>>> 
>>> 
>>> 
>> 
>> 
>> -- 
>> What most experimenters take for granted before they begin their experiments
>> is infinitely more interesting than any results to which their experiments
>> lead.
>> -- Norbert Wiener
>> 
> 
> 
> 
> 
> -- 
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
