[petsc-users] Setting up a PETSc section for field-split
Matthew Knepley
knepley at gmail.com
Tue Jan 28 09:25:14 CST 2014
On Mon, Jan 27, 2014 at 1:35 PM, Luc Berger-Vergiat <lb2653 at columbia.edu> wrote:
> Thanks Matt,
> indeed, setting the number of fields earlier solved my issue!
>
> I have now made some progress getting my DM set up, but I am having trouble
> setting my field-split options.
> I ran my problem with the -ksp_view option to see what is going on, and it
> seems that the section I define in my DM is not being used by the
> preconditioner to split the fields.
> Here is the PETSc output:
>
> KSP Object: 1 MPI processes
> type: gmres
> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
> Orthogonalization with no iterative refinement
> GMRES: happy breakdown tolerance 1e-30
> maximum iterations=10000, initial guess is zero
> tolerances: relative=1e-08, absolute=1e-16, divergence=1e+16
> left preconditioning
> using PRECONDITIONED norm type for convergence test
> PC Object: 1 MPI processes
> type: fieldsplit
> FieldSplit with MULTIPLICATIVE composition: total splits = 4
> Solver info for each split is in the following KSP objects:
> Split number 0 Defined by IS
>
There are 4 splits here and they are defined by an IS. Why do you think
that is not what you asked for?
Thanks,
Matt
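(For reference: each split in the view above carries an options prefix such as fieldsplit_Field_0_, so the individual splits can be configured from the command line. A sketch using standard PETSc options, with the prefix names taken from the -ksp_view output above:)

```shell
-pc_type fieldsplit -pc_fieldsplit_type multiplicative \
  -fieldsplit_Field_0_ksp_type preonly -fieldsplit_Field_0_pc_type ilu \
  -fieldsplit_Field_1_pc_type jacobi
```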
> KSP Object: (fieldsplit_Field_0_) 1 MPI processes
> type: preonly
> maximum iterations=10000, initial guess is zero
> tolerances: relative=1e-05, absolute=1e-50, divergence=10000
> left preconditioning
> using NONE norm type for convergence test
> PC Object: (fieldsplit_Field_0_) 1 MPI processes
> type: ilu
> ILU: out-of-place factorization
> 0 levels of fill
> tolerance for zero pivot 2.22045e-14
> using diagonal shift on blocks to prevent zero pivot
> matrix ordering: natural
> factor fill ratio given 1, needed 1
> Factored matrix follows:
> Matrix Object: 1 MPI processes
> type: seqaij
> rows=0, cols=0
> package used to perform factorization: petsc
> total: nonzeros=0, allocated nonzeros=0
> total number of mallocs used during MatSetValues calls =0
> not using I-node routines
> linear system matrix = precond matrix:
> Matrix Object: 1 MPI processes
> type: seqaij
> rows=0, cols=0
> total: nonzeros=0, allocated nonzeros=0
> total number of mallocs used during MatSetValues calls =0
> not using I-node routines
> Split number 1 Defined by IS
> KSP Object: (fieldsplit_Field_1_) 1 MPI processes
> type: preonly
> maximum iterations=10000, initial guess is zero
> tolerances: relative=1e-05, absolute=1e-50, divergence=10000
> left preconditioning
> using NONE norm type for convergence test
> PC Object: (fieldsplit_Field_1_) 1 MPI processes
> type: ilu
> ILU: out-of-place factorization
> 0 levels of fill
> tolerance for zero pivot 2.22045e-14
> using diagonal shift on blocks to prevent zero pivot
> matrix ordering: natural
> factor fill ratio given 1, needed 1
> Factored matrix follows:
> Matrix Object: 1 MPI processes
> type: seqaij
> rows=0, cols=0
> package used to perform factorization: petsc
> total: nonzeros=0, allocated nonzeros=0
> total number of mallocs used during MatSetValues calls =0
> not using I-node routines
> linear system matrix = precond matrix:
> Matrix Object: 1 MPI processes
> type: seqaij
> rows=0, cols=0
> total: nonzeros=0, allocated nonzeros=0
> total number of mallocs used during MatSetValues calls =0
> not using I-node routines
> Split number 2 Defined by IS
> KSP Object: (fieldsplit_Field_2_) 1 MPI processes
> type: preonly
> maximum iterations=10000, initial guess is zero
> tolerances: relative=1e-05, absolute=1e-50, divergence=10000
> left preconditioning
> using NONE norm type for convergence test
> PC Object: (fieldsplit_Field_2_) 1 MPI processes
> type: ilu
> ILU: out-of-place factorization
> 0 levels of fill
> tolerance for zero pivot 2.22045e-14
> using diagonal shift on blocks to prevent zero pivot
> matrix ordering: natural
> factor fill ratio given 1, needed 1
> Factored matrix follows:
> Matrix Object: 1 MPI processes
> type: seqaij
> rows=0, cols=0
> package used to perform factorization: petsc
> total: nonzeros=0, allocated nonzeros=0
> total number of mallocs used during MatSetValues calls =0
> not using I-node routines
> linear system matrix = precond matrix:
> Matrix Object: 1 MPI processes
> type: seqaij
> rows=0, cols=0
> total: nonzeros=0, allocated nonzeros=0
> total number of mallocs used during MatSetValues calls =0
> not using I-node routines
> Split number 3 Defined by IS
> KSP Object: (fieldsplit_Field_3_) 1 MPI processes
> type: preonly
> maximum iterations=10000, initial guess is zero
> tolerances: relative=1e-05, absolute=1e-50, divergence=10000
> left preconditioning
> using NONE norm type for convergence test
> PC Object: (fieldsplit_Field_3_) 1 MPI processes
> type: ilu
> ILU: out-of-place factorization
> 0 levels of fill
> tolerance for zero pivot 2.22045e-14
> using diagonal shift on blocks to prevent zero pivot
> matrix ordering: natural
> factor fill ratio given 1, needed 1
> Factored matrix follows:
> Matrix Object: 1 MPI processes
> type: seqaij
> rows=0, cols=0
> package used to perform factorization: petsc
> total: nonzeros=0, allocated nonzeros=0
> total number of mallocs used during MatSetValues calls =0
> not using I-node routines
> linear system matrix = precond matrix:
> Matrix Object: 1 MPI processes
> type: seqaij
> rows=0, cols=0
> total: nonzeros=0, allocated nonzeros=0
> total number of mallocs used during MatSetValues calls =0
> not using I-node routines
> linear system matrix = precond matrix:
> Matrix Object: 1 MPI processes
> type: seqaij
> rows=16, cols=16
> total: nonzeros=256, allocated nonzeros=256
> total number of mallocs used during MatSetValues calls =0
> using I-node routines: found 4 nodes, limit used is 5
>
> I am also attaching part of my code which I use to generate the DM, the
> KSP and the PC objects.
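(Luc's attachment is not reproduced in the archive. A minimal sketch of this kind of setup, assuming a DM `dm` that already carries the PetscSection and an assembled operator `A`; `KSPSetOperators` took an extra MatStructure flag in petsc-3.4 and earlier:)

```c
/* Sketch, not Luc's attached code: hand the KSP a DM that carries the
 * section, keep our own assembled operator, and let PCFIELDSPLIT pull
 * the field ISes out of the DM. */
KSP ksp;
PC  pc;
KSPCreate(PETSC_COMM_WORLD, &ksp);
KSPSetDM(ksp, dm);                 /* dm carries the 4-field PetscSection */
KSPSetDMActive(ksp, PETSC_FALSE);  /* we assemble A ourselves */
KSPSetOperators(ksp, A, A);        /* system matrix = preconditioner matrix */
KSPGetPC(ksp, &pc);
PCSetType(pc, PCFIELDSPLIT);       /* splits come from the DM's section */
KSPSetFromOptions(ksp);
```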
>
> Best,
> Luc
>
> On 01/25/2014 10:31 AM, Matthew Knepley wrote:
>
> On Fri, Jan 24, 2014 at 5:10 PM, Luc Berger-Vergiat <lb2653 at columbia.edu> wrote:
>
>> Hi all,
>> I want to use PETSc as a solver for an FEM problem: modeling of shear
>> bands using a four-field mixed formulation.
>>
>> So far I am just trying to set up two fields in a PetscSection, assign
>> dofs to each field, and then pass the section to a DM. I am taking this
>> approach because, in general, I want fieldsplit to work purely at the
>> algebraic level, without knowledge of boundary conditions or geometry.
>>
>> As of now I have issues when I try to assign a point and its associated
>> degrees of freedom to a field using PetscSectionSetFieldDof.
>> Is this the correct way to associate a dof/point with a field?
>>
>
> You have to set the number of fields before the chart. I am updating the
> docs.
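(For reference, the ordering Matt describes; a minimal sketch in which the point range pStart/pEnd, the dof counts, and the receiving DM `dm` are placeholders, not taken from the thread:)

```c
/* Sketch: number of fields must be set before the chart. */
PetscSection s;
PetscInt     p, f;
PetscSectionCreate(PETSC_COMM_WORLD, &s);
PetscSectionSetNumFields(s, 4);           /* must precede SetChart */
PetscSectionSetChart(s, pStart, pEnd);    /* pStart/pEnd: placeholder point range */
for (p = pStart; p < pEnd; ++p) {
  PetscSectionSetDof(s, p, 4);            /* total dofs on point p */
  for (f = 0; f < 4; ++f)
    PetscSectionSetFieldDof(s, p, f, 1);  /* one dof per field on p */
}
PetscSectionSetUp(s);
DMSetDefaultSection(dm, s);               /* renamed DMSetLocalSection in later PETSc */
```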
>
> Thanks,
>
> Matt
>
>
>> I attached an example code and its makefile.
>>
>> --
>> Best,
>> Luc
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
>
>