[petsc-users] Setting up a PETSc section for field-split

Luc Berger-Vergiat lb2653 at columbia.edu
Tue Jan 28 10:11:52 CST 2014


What I don't really understand is why the sizes of all the sub-fields
are zero.
As you can see, all the matrix objects in my fieldsplit preconditioner
have total: nonzeros=0, allocated nonzeros=0.

This seems strange since I issued the following command:
    call DMSetDefaultSection(FSDM, FSSection, ierr)

with the following FSSection:
PetscSection with 4 fields
   field 0 with 1 components
Process 0:
   (   0) dim  0 offset   0
   (   1) dim  1 offset   0
   (   2) dim  0 offset   0
   (   3) dim  0 offset   0
   (   4) dim  1 offset   0
   (   5) dim  0 offset   0
   (   6) dim  0 offset   0
   (   7) dim  0 offset   0
   (   8) dim  0 offset   0
   (   9) dim  0 offset   0
   (  10) dim  0 offset   0
   (  11) dim  0 offset   0
   (  12) dim  0 offset   0
   (  13) dim  0 offset   0
   (  14) dim  0 offset   0
   (  15) dim  0 offset   0
   field 1 with 1 components
Process 0:
   (   0) dim  1 offset   0
   (   1) dim  0 offset   1
   (   2) dim  1 offset   0
   (   3) dim  1 offset   0
   (   4) dim  0 offset   1
   (   5) dim  1 offset   0
   (   6) dim  0 offset   0
   (   7) dim  0 offset   0
   (   8) dim  0 offset   0
   (   9) dim  0 offset   0
   (  10) dim  0 offset   0
   (  11) dim  0 offset   0
   (  12) dim  0 offset   0
   (  13) dim  0 offset   0
   (  14) dim  0 offset   0
   (  15) dim  0 offset   0
   field 2 with 1 components
Process 0:
   (   0) dim  0 offset   1
   (   1) dim  0 offset   1
   (   2) dim  0 offset   1
   (   3) dim  0 offset   1
   (   4) dim  0 offset   1
   (   5) dim  0 offset   1
   (   6) dim  1 offset   0
   (   7) dim  1 offset   0
   (   8) dim  1 offset   0
   (   9) dim  1 offset   0
   (  10) dim  1 offset   0
   (  11) dim  1 offset   0
   (  12) dim  0 offset   0
   (  13) dim  0 offset   0
   (  14) dim  0 offset   0
   (  15) dim  0 offset   0
   field 3 with 1 components
Process 0:
   (   0) dim  0 offset   1
   (   1) dim  0 offset   1
   (   2) dim  0 offset   1
   (   3) dim  0 offset   1
   (   4) dim  0 offset   1
   (   5) dim  0 offset   1
   (   6) dim  0 offset   1
   (   7) dim  0 offset   1
   (   8) dim  0 offset   1
   (   9) dim  0 offset   1
   (  10) dim  0 offset   1
   (  11) dim  0 offset   1
   (  12) dim  1 offset   0
   (  13) dim  1 offset   0
   (  14) dim  1 offset   0
   (  15) dim  1 offset   0

I thought that by using DMSetDefaultSection() I would be done setting
up the fields, and that fieldsplit would detect that section and use it
to construct the splits.
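
For reference, here is roughly how I build and attach that section (a
simplified sketch of my Fortran code, not the exact attachment; FSDM is
my DM, and ifield(p) is an illustrative stand-in for however the
point-to-field map is stored; the assignment follows the dump above):

      PetscSection   FSSection
      PetscInt       p
      PetscErrorCode ierr

      call PetscSectionCreate(PETSC_COMM_WORLD, FSSection, ierr)
      ! the number of fields must be set before the chart
      call PetscSectionSetNumFields(FSSection, 4, ierr)
      ! points 0..15, one dof per point
      call PetscSectionSetChart(FSSection, 0, 16, ierr)
      do p = 0, 15
         call PetscSectionSetDof(FSSection, p, 1, ierr)
         ! ifield(p) is the field owning point p, as in the dump:
         ! field 0: points 1,4        field 1: points 0,2,3,5
         ! field 2: points 6..11      field 3: points 12..15
         call PetscSectionSetFieldDof(FSSection, p, ifield(p), 1, ierr)
      end do
      call PetscSectionSetUp(FSSection, ierr)
      call DMSetDefaultSection(FSDM, FSSection, ierr)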

Should I use another command to tell the PC to use the DM section?
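
In case it is useful, this is essentially how I hand things to the
solver (again only a sketch: ksp, pc, A, rhs and sol stand in for my
KSP, PC, assembled matrix and vectors, and I assumed KSPSetDM is what
lets the fieldsplit PC see the DM and its section):

      KSP ksp
      PC  pc

      call KSPCreate(PETSC_COMM_WORLD, ksp, ierr)
      ! hand the DM (and, I assumed, its default section) to the solver
      call KSPSetDM(ksp, FSDM, ierr)
      ! I assemble the operator myself; the DM is only for the splits
      call KSPSetDMActive(ksp, PETSC_FALSE, ierr)
      call KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN, ierr)
      call KSPGetPC(ksp, pc, ierr)
      call PCSetType(pc, PCFIELDSPLIT, ierr)
      call KSPSetFromOptions(ksp, ierr)
      call KSPSolve(ksp, rhs, sol, ierr)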

Best,
Luc

On 01/28/2014 10:25 AM, Matthew Knepley wrote:
> On Mon, Jan 27, 2014 at 1:35 PM, Luc Berger-Vergiat
> <lb2653 at columbia.edu> wrote:
>
>     Thanks Matt,
>     indeed, setting the number of fields earlier solved my issue!
>
>     I have now made some progress getting my DM set up, but I am having
>     trouble setting my fieldsplit options.
>     I ran my problem with the -ksp_view option to see what is going on,
>     and I guess that somehow the section I define in my DM is not being
>     used by the preconditioner to split the fields.
>     Here is the output from PETSc:
>
>     KSP Object: 1 MPI processes
>       type: gmres
>         GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
>         GMRES: happy breakdown tolerance 1e-30
>       maximum iterations=10000, initial guess is zero
>       tolerances:  relative=1e-08, absolute=1e-16, divergence=1e+16
>       left preconditioning
>       using PRECONDITIONED norm type for convergence test
>     PC Object: 1 MPI processes
>       type: fieldsplit
>         FieldSplit with MULTIPLICATIVE composition: total splits = 4
>         Solver info for each split is in the following KSP objects:
>         Split number 0 Defined by IS
>
>
> There are 4 splits here and they are defined by an IS. Why do you 
> think that is not what you asked for?
>
>   Thanks,
>
>     Matt
>
>         KSP Object:    (fieldsplit_Field_0_)     1 MPI processes
>           type: preonly
>           maximum iterations=10000, initial guess is zero
>           tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
>           left preconditioning
>           using NONE norm type for convergence test
>         PC Object:    (fieldsplit_Field_0_)     1 MPI processes
>           type: ilu
>             ILU: out-of-place factorization
>             0 levels of fill
>             tolerance for zero pivot 2.22045e-14
>             using diagonal shift on blocks to prevent zero pivot
>             matrix ordering: natural
>             factor fill ratio given 1, needed 1
>               Factored matrix follows:
>                 Matrix Object:             1 MPI processes
>                   type: seqaij
>                   rows=0, cols=0
>                   package used to perform factorization: petsc
>                   total: nonzeros=0, allocated nonzeros=0
>                   total number of mallocs used during MatSetValues calls =0
>                     not using I-node routines
>           linear system matrix = precond matrix:
>           Matrix Object:       1 MPI processes
>             type: seqaij
>             rows=0, cols=0
>             total: nonzeros=0, allocated nonzeros=0
>             total number of mallocs used during MatSetValues calls =0
>               not using I-node routines
>         Split number 1 Defined by IS
>         KSP Object:    (fieldsplit_Field_1_)     1 MPI processes
>           type: preonly
>           maximum iterations=10000, initial guess is zero
>           tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
>           left preconditioning
>           using NONE norm type for convergence test
>         PC Object:    (fieldsplit_Field_1_)     1 MPI processes
>           type: ilu
>             ILU: out-of-place factorization
>             0 levels of fill
>             tolerance for zero pivot 2.22045e-14
>             using diagonal shift on blocks to prevent zero pivot
>             matrix ordering: natural
>             factor fill ratio given 1, needed 1
>               Factored matrix follows:
>                 Matrix Object:             1 MPI processes
>                   type: seqaij
>                   rows=0, cols=0
>                   package used to perform factorization: petsc
>                   total: nonzeros=0, allocated nonzeros=0
>                   total number of mallocs used during MatSetValues calls =0
>                     not using I-node routines
>           linear system matrix = precond matrix:
>           Matrix Object:       1 MPI processes
>             type: seqaij
>             rows=0, cols=0
>             total: nonzeros=0, allocated nonzeros=0
>             total number of mallocs used during MatSetValues calls =0
>               not using I-node routines
>         Split number 2 Defined by IS
>         KSP Object:    (fieldsplit_Field_2_)     1 MPI processes
>           type: preonly
>           maximum iterations=10000, initial guess is zero
>           tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
>           left preconditioning
>           using NONE norm type for convergence test
>         PC Object:    (fieldsplit_Field_2_)     1 MPI processes
>           type: ilu
>             ILU: out-of-place factorization
>             0 levels of fill
>             tolerance for zero pivot 2.22045e-14
>             using diagonal shift on blocks to prevent zero pivot
>             matrix ordering: natural
>             factor fill ratio given 1, needed 1
>               Factored matrix follows:
>                 Matrix Object:             1 MPI processes
>                   type: seqaij
>                   rows=0, cols=0
>                   package used to perform factorization: petsc
>                   total: nonzeros=0, allocated nonzeros=0
>                   total number of mallocs used during MatSetValues calls =0
>                     not using I-node routines
>           linear system matrix = precond matrix:
>           Matrix Object:       1 MPI processes
>             type: seqaij
>             rows=0, cols=0
>             total: nonzeros=0, allocated nonzeros=0
>             total number of mallocs used during MatSetValues calls =0
>               not using I-node routines
>         Split number 3 Defined by IS
>         KSP Object:    (fieldsplit_Field_3_)     1 MPI processes
>           type: preonly
>           maximum iterations=10000, initial guess is zero
>           tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
>           left preconditioning
>           using NONE norm type for convergence test
>         PC Object:    (fieldsplit_Field_3_)     1 MPI processes
>           type: ilu
>             ILU: out-of-place factorization
>             0 levels of fill
>             tolerance for zero pivot 2.22045e-14
>             using diagonal shift on blocks to prevent zero pivot
>             matrix ordering: natural
>             factor fill ratio given 1, needed 1
>               Factored matrix follows:
>                 Matrix Object:             1 MPI processes
>                   type: seqaij
>                   rows=0, cols=0
>                   package used to perform factorization: petsc
>                   total: nonzeros=0, allocated nonzeros=0
>                   total number of mallocs used during MatSetValues calls =0
>                     not using I-node routines
>           linear system matrix = precond matrix:
>           Matrix Object:       1 MPI processes
>             type: seqaij
>             rows=0, cols=0
>             total: nonzeros=0, allocated nonzeros=0
>             total number of mallocs used during MatSetValues calls =0
>               not using I-node routines
>       linear system matrix = precond matrix:
>       Matrix Object:   1 MPI processes
>         type: seqaij
>         rows=16, cols=16
>         total: nonzeros=256, allocated nonzeros=256
>         total number of mallocs used during MatSetValues calls =0
>           using I-node routines: found 4 nodes, limit used is 5
>
>     I am also attaching part of my code which I use to generate the
>     DM, the KSP and the PC objects.
>
>     Best,
>     Luc
>
>     On 01/25/2014 10:31 AM, Matthew Knepley wrote:
>>     On Fri, Jan 24, 2014 at 5:10 PM, Luc Berger-Vergiat
>>     <lb2653 at columbia.edu> wrote:
>>
>>         Hi all,
>>         I want to use PETSc as a solver for an FEM problem:
>>         modeling of shear bands using a four-field mixed formulation.
>>
>>         So far I am just trying to set up two fields in a
>>         PetscSection, assign dofs to each field, and then pass the
>>         section to a DM. I am taking this approach because, in
>>         general, I want fieldsplit to work purely at the algebraic
>>         level, without knowledge of boundary conditions or geometry.
>>
>>         As of now I have issues when I try to assign a point and
>>         its associated degree of freedom to a field using
>>         PetscSectionSetFieldDof.
>>         Is this the correct way to associate a point's dof with a field?
>>
>>
>>     You have to set the number of fields before the chart. I am
>>     updating the docs.
>>
>>      Thanks,
>>
>>         Matt
>>
>>         I attached an example code and its makefile.
>>
>>         -- 
>>         Best,
>>         Luc
>>
>>
>>
>>
>>     -- 
>>     What most experimenters take for granted before they begin their
>>     experiments is infinitely more interesting than any results to
>>     which their experiments lead.
>>     -- Norbert Wiener
>
>
>
>
> -- 
> What most experimenters take for granted before they begin their 
> experiments is infinitely more interesting than any results to which 
> their experiments lead.
> -- Norbert Wiener
