<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
</head>
<body text="#000000" bgcolor="#FFFFFF">
<p>Hi Matthew,</p>
<p>Which database option are you referring to?</p>
<p>
</p>
<p>I tried adding -fieldsplit_mg_levels_ksp_type gmres (and, in a
second run, -fieldsplit_mg_levels_ksp_max_it 4) to my options
(cf. below). The iterations do start, but PETSc takes 1 hour to
do 13 of them, so something must be wrong.</p>
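<p>For completeness, the combined option set for those runs should read roughly as follows, assuming the two smoother flags are simply appended to the baseline (-fieldsplit_mg_levels_ksp_max_it 4 only in the second run):</p>
<p><tt>-ksp_view_pre -ksp_monitor -ksp_converged_reason \<br>
-ksp_rtol 1.0e-8 -ksp_gmres_restart 300 \<br>
-ksp_type fgmres \<br>
-pc_type fieldsplit \<br>
-pc_fieldsplit_type multiplicative \<br>
-pc_fieldsplit_block_size 3 \<br>
-pc_fieldsplit_0_fields 0 \<br>
-pc_fieldsplit_1_fields 1 \<br>
-pc_fieldsplit_2_fields 2 \<br>
-fieldsplit_pc_type gamg \<br>
-fieldsplit_ksp_type gmres \<br>
-fieldsplit_ksp_rtol 1.0e-8 \<br>
-fieldsplit_mg_levels_ksp_type gmres \<br>
-fieldsplit_mg_levels_ksp_max_it 4</tt></p>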
<p>Reminder: my baseline database options line reads</p>
<p><tt>-ksp_view_pre -ksp_monitor -ksp_converged_reason \<br>
-ksp_rtol 1.0e-8 -ksp_gmres_restart 300 \<br>
-ksp_type fgmres \<br>
-pc_type fieldsplit \<br>
-pc_fieldsplit_type multiplicative \<br>
-pc_fieldsplit_block_size 3 \<br>
-pc_fieldsplit_0_fields 0 \<br>
-pc_fieldsplit_1_fields 1 \<br>
-pc_fieldsplit_2_fields 2 \<br>
-fieldsplit_pc_type gamg \<br>
-fieldsplit_ksp_type gmres \<br>
-fieldsplit_ksp_rtol 1.0e-8</tt></p>
<p>which gives</p>
<p><font size="-1"><tt>KSP Object: 1 MPI processes</tt><tt><br>
</tt><tt> type: fgmres</tt><tt><br>
</tt><tt> restart=300, using Classical (unmodified)
Gram-Schmidt Orthogonalization with no iterative refinement</tt><tt><br>
</tt><tt> happy breakdown tolerance 1e-30</tt><tt><br>
</tt><tt> maximum iterations=10000, initial guess is zero</tt><tt><br>
</tt><tt> tolerances: relative=1e-08, absolute=1e-50,
divergence=10000.</tt><tt><br>
</tt><tt> left preconditioning</tt><tt><br>
</tt><tt> using DEFAULT norm type for convergence test</tt><tt><br>
</tt><tt>PC Object: 1 MPI processes</tt><tt><br>
</tt><tt> type: fieldsplit</tt><tt><br>
</tt><tt> PC has not been set up so information may be
incomplete</tt><tt><br>
</tt><tt> FieldSplit with MULTIPLICATIVE composition: total
splits = 3, blocksize = 3</tt><tt><br>
</tt><tt> Solver info for each split is in the following KSP
objects:</tt><tt><br>
</tt><tt> Split number 0 Fields 0</tt><tt><br>
</tt><tt> KSP Object: (fieldsplit_0_) 1 MPI processes</tt><tt><br>
</tt><tt> type: preonly</tt><tt><br>
</tt><tt> maximum iterations=10000, initial guess is zero</tt><tt><br>
</tt><tt> tolerances: relative=1e-05, absolute=1e-50,
divergence=10000.</tt><tt><br>
</tt><tt> left preconditioning</tt><tt><br>
</tt><tt> using DEFAULT norm type for convergence test</tt><tt><br>
</tt><tt> PC Object: (fieldsplit_0_) 1 MPI processes</tt><tt><br>
</tt><tt> type not yet set</tt><tt><br>
</tt><tt> PC has not been set up so information may be
incomplete</tt><tt><br>
</tt><tt> Split number 1 Fields 1</tt><tt><br>
</tt><tt> KSP Object: (fieldsplit_1_) 1 MPI processes</tt><tt><br>
</tt><tt> type: preonly</tt><tt><br>
</tt><tt> maximum iterations=10000, initial guess is zero</tt><tt><br>
</tt><tt> tolerances: relative=1e-05, absolute=1e-50,
divergence=10000.</tt><tt><br>
</tt><tt> left preconditioning</tt><tt><br>
</tt><tt> using DEFAULT norm type for convergence test</tt><tt><br>
</tt><tt> PC Object: (fieldsplit_1_) 1 MPI processes</tt><tt><br>
</tt><tt> type not yet set</tt><tt><br>
</tt><tt> PC has not been set up so information may be
incomplete</tt><tt><br>
</tt><tt> Split number 2 Fields 2</tt><tt><br>
</tt><tt> KSP Object: (fieldsplit_2_) 1 MPI processes</tt><tt><br>
</tt><tt> type: preonly</tt><tt><br>
</tt><tt> maximum iterations=10000, initial guess is zero</tt><tt><br>
</tt><tt> tolerances: relative=1e-05, absolute=1e-50,
divergence=10000.</tt><tt><br>
</tt><tt> left preconditioning</tt><tt><br>
</tt><tt> using DEFAULT norm type for convergence test</tt><tt><br>
</tt><tt> PC Object: (fieldsplit_2_) 1 MPI processes</tt><tt><br>
</tt><tt> type not yet set</tt><tt><br>
</tt><tt> PC has not been set up so information may be
incomplete</tt><tt><br>
</tt><tt> linear system matrix = precond matrix:</tt><tt><br>
</tt><tt> Mat Object: 1 MPI processes</tt><tt><br>
</tt><tt> type: seqaij</tt><tt><br>
</tt><tt> rows=52500, cols=52500</tt><tt><br>
</tt><tt> total: nonzeros=1127079, allocated nonzeros=1128624</tt><tt><br>
</tt><tt> total number of mallocs used during MatSetValues
calls =0</tt><tt><br>
</tt><tt> not using I-node routines</tt><tt><br>
</tt><tt> 0 KSP Residual norm 3.583290589961e+00 </tt><tt><br>
</tt><tt>[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------</tt><tt><br>
</tt><tt>[0]PETSC ERROR: Petsc has generated inconsistent data</tt><tt><br>
</tt><tt>[0]PETSC ERROR: Eigen estimator failed:
DIVERGED_NANORINF at iteration 0</tt><tt><br>
</tt><tt>[0]PETSC ERROR: Petsc Release Version 3.10.2, unknown </tt><tt><br>
</tt><tt>[0]PETSC ERROR: Configure options
--PETSC_ARCH=msi_cplx_debug --with-scalar-type=complex
--with-precision=double --with-debugging=1 --with-valgrind=1
--with-debugger=gdb --with-fortran-kernels=1 --download-mpich
--download-hwloc --download-fblaslapack --download-scalapack
--download-metis --download-parmetis --download-ptscotch
--download-mumps --download-slepc</tt><tt><br>
</tt><tt>[0]PETSC ERROR: #1 KSPSolve_Chebyshev() line 381 in
/home/thibaut/Packages/petsc/src/ksp/ksp/impls/cheby/cheby.c</tt><tt><br>
</tt><tt>[0]PETSC ERROR: #2 KSPSolve() line 780 in
/home/thibaut/Packages/petsc/src/ksp/ksp/interface/itfunc.c</tt><tt><br>
</tt><tt>[0]PETSC ERROR: #3 PCMGMCycle_Private() line 20 in
/home/thibaut/Packages/petsc/src/ksp/pc/impls/mg/mg.c</tt><tt><br>
</tt><tt>[0]PETSC ERROR: #4 PCApply_MG() line 377 in
/home/thibaut/Packages/petsc/src/ksp/pc/impls/mg/mg.c</tt><tt><br>
</tt><tt>[0]PETSC ERROR: #5 PCApply() line 462 in
/home/thibaut/Packages/petsc/src/ksp/pc/interface/precon.c</tt><tt><br>
</tt><tt>[0]PETSC ERROR: #6 KSP_PCApply() line 281 in
/home/thibaut/Packages/petsc/include/petsc/private/kspimpl.h</tt><tt><br>
</tt><tt>[0]PETSC ERROR: #7 KSPInitialResidual() line 67 in
/home/thibaut/Packages/petsc/src/ksp/ksp/interface/itres.c</tt><tt><br>
</tt><tt>[0]PETSC ERROR: #8 KSPSolve_GMRES() line 233 in
/home/thibaut/Packages/petsc/src/ksp/ksp/impls/gmres/gmres.c</tt><tt><br>
</tt><tt>[0]PETSC ERROR: #9 KSPSolve() line 780 in
/home/thibaut/Packages/petsc/src/ksp/ksp/interface/itfunc.c</tt><tt><br>
</tt><tt>[0]PETSC ERROR: #10 PCApply_FieldSplit() line 1107 in
/home/thibaut/Packages/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c</tt><tt><br>
</tt><tt>[0]PETSC ERROR: #11 PCApply() line 462 in
/home/thibaut/Packages/petsc/src/ksp/pc/interface/precon.c</tt><tt><br>
</tt><tt>[0]PETSC ERROR: #12 KSP_PCApply() line 281 in
/home/thibaut/Packages/petsc/include/petsc/private/kspimpl.h</tt><tt><br>
</tt><tt>[0]PETSC ERROR: #13 KSPFGMRESCycle() line 166 in
/home/thibaut/Packages/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c</tt><tt><br>
</tt><tt>[0]PETSC ERROR: #14 KSPSolve_FGMRES() line 291 in
/home/thibaut/Packages/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c</tt><tt><br>
</tt><tt>[0]PETSC ERROR: #15 KSPSolve() line 780 in
/home/thibaut/Packages/petsc/src/ksp/ksp/interface/itfunc.c</tt></font></p>
<p><font size="-1"><tt><br>
</tt></font></p>
<p>Thibaut</p>
<p><font size="-1"><tt><br>
</tt></font></p>
<p><font size="-1"><tt><br>
</tt></font></p>
<div class="moz-cite-prefix">On 30/10/2018 23:12, Matthew Knepley
wrote:<br>
</div>
<blockquote type="cite"
cite="mid:CAMYG4GnRdMDs=3eaw589o7hYvi60wShHrO_EzY9q6pmHFiOpgQ@mail.gmail.com">
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<div dir="ltr">
<div class="gmail_quote">
<div dir="ltr">On Tue, Oct 30, 2018 at 5:21 PM Appel, Thibaut
via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov"
moz-do-not-send="true">petsc-users@mcs.anl.gov</a>>
wrote:<br>
</div>
<blockquote class="gmail_quote" style="margin:0 0 0
.8ex;border-left:1px #ccc solid;padding-left:1ex">Dear
users,<br>
<br>
Following a suggestion from Matthew Knepley, I’ve been trying
to apply fieldsplit/gamg to my set of PDEs, but I’m still
encountering issues despite various tests: pc_gamg simply
won’t start.<br>
Note that direct solvers always yield the correct, physical
result.<br>
Removing the fieldsplit to focus on the gamg bit </blockquote>
<div><br>
</div>
<div>I think this is a mistake, because you want a simpler
operator to try GAMG on.</div>
<div><br>
</div>
<div>Can you go back to splitting the system and try using
GMRES as the smoother? It’s important to see</div>
<div>whether the smoother makes no progress or whether the coarse
correction stinks.</div>
<div><br>
</div>
<div> Thanks,</div>
<div><br>
</div>
<div> Matt</div>
<div> </div>
<blockquote class="gmail_quote" style="margin:0 0 0
.8ex;border-left:1px #ccc solid;padding-left:1ex">and trying
to solve the linear system on a modest-size problem still
gives, with<br>
<br>
'-ksp_monitor -ksp_rtol 1.0e-10 -ksp_gmres_restart 300
-ksp_type gmres -pc_type gamg'<br>
<br>
[3]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------<br>
[3]PETSC ERROR: Petsc has generated inconsistent data<br>
[3]PETSC ERROR: Have un-symmetric graph (apparently). Use
'-(null)pc_gamg_sym_graph true' to symetrize the graph or
'-(null)pc_gamg_threshold -1' if the matrix is structurally
symmetric.<br>
<br>
And since then, after adding '-pc_gamg_sym_graph true' I
have been getting<br>
[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------<br>
[0]PETSC ERROR: Petsc has generated inconsistent data<br>
[0]PETSC ERROR: Eigen estimator failed: DIVERGED_NANORINF at
iteration<br>
<br>
-ksp_chebyshev_esteig_noisy 0/1 does not change anything<br>
<br>
Knowing that the Chebyshev eigen estimator needs a positive
spectrum, I tried ‘-mg_levels_ksp_type gmres’, but the
iterations would just go on endlessly.<br>
<br>
It does seem that there are eigenvalues of rather high
magnitude in the spectrum of my operator, although I have not
been able to determine the reason.<br>
The corresponding eigenvectors look like small artifacts at the
wall-inflow or wall-outflow corners, with zeros everywhere else,
but I do not know how to interpret this.<br>
The equations are the time-harmonic linearized Navier-Stokes
equations to which a forcing is applied; there’s no time-marching.<br>
<br>
The matrix is formed with an MPIAIJ type. The formulation is
incompressible and in complex arithmetic, and the 2D physical
domain is mapped to a logically rectangular, regular
collocated grid with a high-order finite-difference method.<br>
I determine the ownership of the rows/degrees of freedom of
the matrix with PetscSplitOwnership, and I’m not using DMDA.<br>
<br>
The Fortran application code is memory-leak free and has
undergone a strict verification/validation procedure for
different variations of the PDEs.<br>
<br>
If there’s a problem with the matrix, what could help diagnose
it? At this point I’m running out of ideas, so I would really
appreciate additional suggestions and discussion.<br>
<br>
Thanks for your continued support,<br>
<br>
<br>
Thibaut</blockquote>
</div>
<br clear="all">
<div><br>
</div>
-- <br>
<div dir="ltr" class="gmail_signature"
data-smartmail="gmail_signature">
<div dir="ltr">
<div>
<div dir="ltr">
<div>
<div dir="ltr">
<div>What most experimenters take for granted before
they begin their experiments is infinitely more
interesting than any results to which their
experiments lead.<br>
-- Norbert Wiener</div>
<div><br>
</div>
<div><a href="http://www.cse.buffalo.edu/~knepley/"
target="_blank" moz-do-not-send="true">https://www.cse.buffalo.edu/~knepley/</a><br>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</blockquote>
</body>
</html>