[petsc-users] Roles of ksp_type and pc_type
Massimiliano Leoni
leoni.massimiliano1 at gmail.com
Mon Nov 17 16:27:57 CST 2014
Hi again,
I've been experimenting further with fieldsplit, following your slides, but
for some reason -pc_fieldsplit_type multiplicative and schur segfault at
KSPSolve. Additive seems to work fine.
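For concreteness, the preconditioner options I am switching between are the following (only the fieldsplit type changes; everything else is identical):

```
-pc_type fieldsplit -pc_fieldsplit_type additive        # works
-pc_type fieldsplit -pc_fieldsplit_type multiplicative  # segfaults at KSPSolve
-pc_type fieldsplit -pc_fieldsplit_type schur           # segfaults at KSPSolve
```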
I could not reproduce ex62 as in your slides because I don't have the header
file, and the script PetscGenerateFEMQuadrature.py, which I downloaded from the
GitHub repo, complains that the PETSC.FEM module is missing. So I used my own
code, which is supposed to assemble the same matrix, and then I call
KSPSetOperators with the Stokes matrix as both the matrix and the preconditioner.
I use the same options as in slide 115: the "block Jacobi, inexact" version
works, while the "block Gauss-Seidel, inexact" one doesn't and segfaults.
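For reference, the two option sets from the slides look roughly like this (reconstructed from memory, so the inner solver choices under the fieldsplit_0/fieldsplit_1 prefixes are my best guess at what the slide used and should be treated as assumptions):

```
# "block Jacobi, inexact" -- works for me
-ksp_type fgmres -pc_type fieldsplit -pc_fieldsplit_type additive
  -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type ml
  -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_type jacobi

# "block Gauss-Seidel, inexact" -- segfaults at KSPSolve
-ksp_type fgmres -pc_type fieldsplit -pc_fieldsplit_type multiplicative
  -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type ml
  -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_type jacobi
```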
I attach backtrace and log files, according to what I was told last time.
Thanks for any help,
Massimiliano
In data giovedì 13 novembre 2014 15:07:40, Massimiliano Leoni ha scritto:
> In data giovedì 13 novembre 2014 06:39:38, Matthew Knepley ha scritto:
> > On Thu, Nov 13, 2014 at 6:24 AM, Massimiliano Leoni <
> >
> > This is not exactly right because here you are just using additive
> > fieldsplit.
>
> As far as I understood from the user manual, this should be right because my
> preconditioner is block diagonal, so I want to use block Jacobi.
>
> If I used the other one, P = [[C,0],[B,Mp/nu]], then I would need a
> multiplicative fieldsplit because this would be lower triangular.
>
> Is this correct?
>
> > The mass matrix
> > is a good preconditioner for the Schur complement, not the zero matrix.
>
> > Take a look at these slides:
> What do you mean?
> I know from theory that the preconditioner I am using is optimal [iteration
> number independent of grid size].
>
> I read the slides and the manual, and the Schur complement is what I was
> talking about earlier, so it's not what I want to do now -- even though
> knowing that it's so easy to implement may be very useful.
>
> Anyway, thanks for the slides, they have been enlightening on the power of
> command line options!
>
> > Thanks,
> >
> > Matt
>
> Thanks again,
> Massimiliano
-------------- next part --------------
(gdb) bt full
#0 0x00007ffff3c1d8d8 in PCApply_FieldSplit(_p_PC*, _p_Vec*, _p_Vec*) () from /usr/lib/petscdir/3.4.2/linux-gnu-c-opt/lib/libpetsc.so.3.4.2
No symbol table info available.
#1 0x00007ffff3ec407e in PCApply () from /usr/lib/petscdir/3.4.2/linux-gnu-c-opt/lib/libpetsc.so.3.4.2
No symbol table info available.
#2 0x00007ffff3c17a25 in KSPFGMRESCycle(int*, _p_KSP*) () from /usr/lib/petscdir/3.4.2/linux-gnu-c-opt/lib/libpetsc.so.3.4.2
No symbol table info available.
#3 0x00007ffff3c18478 in KSPSolve_FGMRES(_p_KSP*) () from /usr/lib/petscdir/3.4.2/linux-gnu-c-opt/lib/libpetsc.so.3.4.2
No symbol table info available.
#4 0x00007ffff3ccab51 in KSPSolve () from /usr/lib/petscdir/3.4.2/linux-gnu-c-opt/lib/libpetsc.so.3.4.2
No symbol table info available.
#5 0x00007ffff7bc52e9 in SSolver::solve (this=this at entry=0x7fffffffd9e0) at .../src/SSolver.cpp:57
solVec = 0xf12400
bVec = 0x12efb40
#6 0x0000000000402a60 in main (argc=<optimized out>, argv=<optimized out>) at .../apps/Stokes/cavity.cpp:111
resolution = 50
ufile = {_mpi_comm = 0x100000000, file = std::unique_ptr<dolfin::GenericFile> containing 0x100002b9d}
u1 = <optimized out>
mesh = <incomplete type>
zero_vector = <incomplete type>
noslip_domain = {<dolfin::SubDomain> = {<No data fields>}, <No data fields>}
f = <incomplete type>
p1 = <optimized out>
cavity = <incomplete type>
u_top = {<dolfin::Expression> = {<No data fields>}, <No data fields>}
top_domain = {<dolfin::SubDomain> = {<No data fields>}, <No data fields>}
p = <incomplete type>
bcu = std::vector of length 2, capacity 2 = {{first = 0x7fffffffd0c0, second = 0x7fffffffce80}, {first = 0x7fffffffcf50, second = 0x7fffffffce60}}
nu = <incomplete type>
solver = {_vptr.SSolver = 0x4037d0 <vtable for SSolver+16>, mesh = <incomplete type>, W = 0x6ece60, V = 0xb35980, bcu = std::vector of length 2, capacity 2 = {0xf13520, 0xf13b00}, a = 0xf11850, F = 0xf119e0, forcing = 0xf0b020,
viscosity = 0xf11260, A = std::shared_ptr (count 2, weak 0) 0xf11bd0, b = 0xf11d40, P = std::shared_ptr (count 2, weak 0) 0xf11bd0, solution = 0xf12030, Prec = 0x0, genericSolver = 0xee2550}
pfile = {_mpi_comm = 0x100000000, file = std::unique_ptr<dolfin::GenericFile> containing 0x0}
-------------- next part --------------
A non-text attachment was scrubbed...
Name: make.log
Type: text/x-log
Size: 52173 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20141117/98bf484b/attachment-0001.bin>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: configure.log.zip
Type: application/zip
Size: 211738 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20141117/98bf484b/attachment-0001.zip>