<html>
<head>
<meta content="text/html; charset=ISO-8859-1"
http-equiv="Content-Type">
</head>
<body bgcolor="#FFFFFF" text="#000000">
<font face="Ubuntu">Barry,<br>
<br>
I run with: -pc_type mg -pc_mg_galerkin -da_refine 4 -ksp_view -options_left<br>
<br>
For the test I use a 64^3 grid and 4 processors.<br>
<br>
The output is the following (all four ranks fail inside the SOR smoother with the same "Zero diagonal on row 0" error; the trace from rank 2 is shown):<br>
<br>
[2]PETSC ERROR: --------------------- Error Message ------------------------------------<br>
[2]PETSC ERROR: Arguments are incompatible!<br>
[2]PETSC ERROR: Zero diagonal on row 0!<br>
[2]PETSC ERROR: ------------------------------------------------------------------------<br>
[2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013<br>
[2]PETSC ERROR: See docs/changes/index.html for recent updates.<br>
[2]PETSC ERROR: See docs/faq.html for hints about trouble shooting.<br>
[2]PETSC ERROR: See docs/index.html for manual pages.<br>
[2]PETSC ERROR: ------------------------------------------------------------------------<br>
[2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013<br>
[2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib<br>
[2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013<br>
[2]PETSC ERROR: Configure options<br>
[2]PETSC ERROR: ------------------------------------------------------------------------<br>
[2]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c<br>
[2]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c<br>
[2]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c<br>
[2]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c<br>
[2]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c<br>
[2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c<br>
[2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h<br>
[2]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c<br>
[2]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c<br>
[2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c<br>
[2]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c<br>
[2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c<br>
[2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c<br>
[2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c<br>
[2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c<br>
[2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h<br>
[2]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c<br>
[2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c<br>
#PETSc Option Table entries:<br>
-da_refine 4<br>
-ksp_view<br>
-options_left<br>
-pc_mg_galerkin<br>
-pc_type mg<br>
#End of PETSc Option Table entries<br>
There is one unused database option. It is:<br>
Option left: name:-da_refine value: 4<br>
<br>
<br>
Here is the code I use to set up the DMDA and the KSP:<br>
<br>
call DMDACreate3d( PETSC_COMM_WORLD,                                  &<br>
     &             DMDA_BOUNDARY_PERIODIC, DMDA_BOUNDARY_PERIODIC,    &<br>
     &             DMDA_BOUNDARY_PERIODIC, DMDA_STENCIL_STAR,         &<br>
     &             N_Z, N_Y, N_X, N_B3, N_B2, 1_ip, 1_ip, 1_ip,       &<br>
     &             int(NNZ,ip), int(NNY,ip), NNX, da, ierr)<br>
<br>
! Create global vectors<br>
call DMCreateGlobalVector(da, b, ierr)<br>
call VecDuplicate(b, x, ierr)<br>
<br>
! Set the initial guess to zero for the first use of the module<br>
call VecSet(x, 0.0_rp, ierr)<br>
<br>
! Create the matrix<br>
call DMCreateMatrix(da, MATMPIAIJ, A, ierr)<br>
<br>
! Create the solver<br>
call KSPCreate(PETSC_COMM_WORLD, ksp, ierr)<br>
call KSPSetDM(ksp, da, ierr)<br>
call KSPSetOperators(ksp, A, A, DIFFERENT_NONZERO_PATTERN, ierr)<br>
! call KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN, ierr)<br>
call KSPSetType(ksp, KSPCG, ierr)<br>
call KSPSetNormType(ksp, KSP_NORM_UNPRECONDITIONED, ierr)  ! true (unpreconditioned) residual<br>
call KSPSetInitialGuessNonzero(ksp, PETSC_TRUE, ierr)<br>
call KSPSetTolerances(ksp, tol, PETSC_DEFAULT_DOUBLE_PRECISION,       &<br>
     &                PETSC_DEFAULT_DOUBLE_PRECISION, PETSC_DEFAULT_INTEGER, ierr)<br>
<br>
! Allow options to be set from the command line<br>
call KSPSetFromOptions(ksp, ierr)<br>
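<br>
As discussed in the thread below, I do not currently register callbacks for the operator and the right-hand side. If that turns out to be the missing piece, I would change the setup to something along these lines. This is only an untested sketch: ComputeRHS and ComputeMatrix are placeholder names for subroutines I would still have to write, following the pattern of the Fortran DMDA/KSP tutorials (e.g. ex22f.F):<br>
<br>
! Sketch only: let KSP/PCMG build the matrix and the RHS on every level through<br>
! callbacks instead of handing it a single pre-assembled fine-grid matrix.<br>
external ComputeRHS, ComputeMatrix<br>
<br>
call KSPCreate(PETSC_COMM_WORLD, ksp, ierr)<br>
call KSPSetDM(ksp, da, ierr)<br>
call KSPSetComputeRHS(ksp, ComputeRHS, PETSC_NULL_OBJECT, ierr)<br>
call KSPSetComputeOperators(ksp, ComputeMatrix, PETSC_NULL_OBJECT, ierr)<br>
call KSPSetType(ksp, KSPCG, ierr)<br>
call KSPSetFromOptions(ksp, ierr)<br>
! No KSPSetOperators(ksp, A, A, ...) here: ComputeMatrix fills the matrix that<br>
! KSP/DMDA provide for each level.<br>
call KSPSolve(ksp, PETSC_NULL_OBJECT, PETSC_NULL_OBJECT, ierr)<br>
call KSPGetSolution(ksp, x, ierr)<br>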
<br>
<br>
Michele<br>
<br>
<br>
<br>
<br>
</font>
<div class="moz-cite-prefix">On 08/01/2013 01:04 PM, Barry Smith
wrote:<br>
</div>
<blockquote
cite="mid:A8D17A9E-08ED-427F-B22B-FCA1F35A5AA0@mcs.anl.gov"
type="cite">
<pre wrap="">
   You can use the option -pc_mg_galerkin and then MG will compute the coarser matrices with a sparse matrix-matrix-matrix product, so you should not need to change your code to try it out. You still need to use KSPSetDM() and -da_refine n to get it working.
If it doesn't work, send us all the output.
Barry
On Aug 1, 2013, at 2:47 PM, Michele Rosso <a class="moz-txt-link-rfc2396E" href="mailto:mrosso@uci.edu"><mrosso@uci.edu></a> wrote:
</pre>
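<pre wrap="">(For reference, the Galerkin computation mentioned above means PCMG builds each coarse-level operator from the next finer one as the triple product A_coarse = P^T A_fine P, where P is the interpolation; this is done internally, so no user code is required. A rough sketch of the equivalent explicit call, with hypothetical Mat names Afine, P and Ac, would be

      ! Illustration only: form the Galerkin coarse operator Ac = P^T * Afine * P.
      call MatPtAP(Afine, P, MAT_INITIAL_MATRIX, 1.0d0, Ac, ierr)
)</pre>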
<blockquote type="cite">
<pre wrap="">Barry,
you are correct, I did not use it. I think I now see where the problem is. Correct me if I am wrong, but for the
geometric multigrid to work, the KSP must be provided with subroutines that compute the matrix and the RHS on every level, through
KSPSetComputeOperators and KSPSetComputeRHS.
I do not do that; I simply build an RHS vector and a matrix and then solve the system.
If you confirm what I just wrote, I will try to modify my code accordingly and get back to you.
Thank you,
Michele
On 08/01/2013 11:48 AM, Barry Smith wrote:
</pre>
<blockquote type="cite">
 Do you use">
<pre wrap="">   Do you use KSPSetDM(ksp,da)?  See src/ksp/ksp/examples/tutorials/ex19.c
Barry
On Aug 1, 2013, at 1:35 PM, Michele Rosso
<a class="moz-txt-link-rfc2396E" href="mailto:mrosso@uci.edu"><mrosso@uci.edu></a>
wrote:
</pre>
<blockquote type="cite">
<pre wrap="">Barry,
I am using a finite-difference uniform Cartesian grid with DMDA, and so far it has not given me any problems.
I am using a KSP solver (not SNES). In a previous thread, I was told that an odd number of grid points was needed for the geometric multigrid; is that correct?
I tried to run my case with
-pc_type mg -da_refine 4
but it does not seem to use the -da_refine option:
mpiexec -np 4 ./test -pc_type mg -da_refine 4 -ksp_view -options_left
KSP Object: 4 MPI processes
type: cg
maximum iterations=10000
tolerances: relative=1e-08, absolute=1e-50, divergence=10000
left preconditioning
using nonzero initial guess
using UNPRECONDITIONED norm type for convergence test
PC Object: 4 MPI processes
type: mg
MG: type is MULTIPLICATIVE, levels=1 cycles=v
Cycles per PCApply=1
Not using Galerkin computed coarse grid matrices
Coarse grid solver -- level -------------------------------
KSP Object: (mg_levels_0_) 4 MPI processes
type: chebyshev
Chebyshev: eigenvalue estimates: min = 0.134543, max = 1.47998
Chebyshev: estimated using: [0 0.1; 0 1.1]
KSP Object: (mg_levels_0_est_) 4 MPI processes
type: gmres
GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
GMRES: happy breakdown tolerance 1e-30
maximum iterations=10, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using NONE norm type for convergence test
PC Object: (mg_levels_0_) 4 MPI processes
type: sor
SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1
linear system matrix = precond matrix:
Matrix Object: 4 MPI processes
type: mpiaij
rows=262144, cols=262144
total: nonzeros=1835008, allocated nonzeros=1835008
total number of mallocs used during MatSetValues calls =0
maximum iterations=1, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using NONE norm type for convergence test
PC Object: (mg_levels_0_) 4 MPI processes
type: sor
SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1
linear system matrix = precond matrix:
Matrix Object: 4 MPI processes
type: mpiaij
rows=262144, cols=262144
total: nonzeros=1835008, allocated nonzeros=1835008
total number of mallocs used during MatSetValues calls =0
linear system matrix = precond matrix:
Matrix Object: 4 MPI processes
type: mpiaij
rows=262144, cols=262144
total: nonzeros=1835008, allocated nonzeros=1835008
total number of mallocs used during MatSetValues calls =0
Solution = 1.53600013 sec
#PETSc Option Table entries:
-da_refine 4
-ksp_view
-options_left
-pc_type mg
#End of PETSc Option Table entries
There is one unused database option. It is:
Option left: name:-da_refine value: 4
Michele
On 08/01/2013 11:21 AM, Barry Smith wrote:
</pre>
<blockquote type="cite">
 What kind">
<pre wrap="">   What kind of mesh are you using? Are you using DMDA? If you are using DMDA (and have written your code to use it "correctly"), then it should be trivial to run with geometric multigrid, and geometric multigrid should be a bit faster.
   For example, on src/snes/examples/tutorials/ex19.c I run with ./ex19 -pc_type mg -da_refine 4, and it refines the original DMDA 4 times and uses geometric multigrid with 5 levels.
Barry
On Aug 1, 2013, at 1:14 PM, Michele Rosso
<a class="moz-txt-link-rfc2396E" href="mailto:mrosso@uci.edu"><mrosso@uci.edu></a>
wrote:
</pre>
<blockquote type="cite">
<pre wrap="">Hi,
I am successfully using PETSc (v3.4.2) to solve a 3D Poisson equation with CG + GAMG, as was suggested to me in a previous thread.
So far I am using GAMG with the default settings, i.e.
-pc_type gamg -pc_gamg_agg_nsmooths 1
The speed of the solution is satisfactory, but I would like to know if you have any suggestions to speed it up further, in particular
whether there are any parameters worth tuning to get an even faster solution, for example the number of levels and so on.
So far I am using Dirichlet BCs for my test case, but I will soon have periodic conditions: in that case, does GAMG require any particular settings?
Finally, I did not try geometric multigrid: do you think it is worth a shot?
Here are my current settings:
I run with
-pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_view -options_left
and the output is:
KSP Object: 4 MPI processes
type: cg
maximum iterations=10000
tolerances: relative=1e-08, absolute=1e-50, divergence=10000
left preconditioning
using nonzero initial guess
using UNPRECONDITIONED norm type for convergence test
PC Object: 4 MPI processes
type: gamg
MG: type is MULTIPLICATIVE, levels=3 cycles=v
Cycles per PCApply=1
Using Galerkin computed coarse grid matrices
Coarse grid solver -- level -------------------------------
KSP Object: (mg_coarse_) 4 MPI processes
type: preonly
maximum iterations=1, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using NONE norm type for convergence test
PC Object: (mg_coarse_) 4 MPI processes
type: bjacobi
block Jacobi: number of blocks = 4
Local solve info for each block is in the following KSP and PC objects:
[0] number of local blocks = 1, first local block number = 0
[0] local block number 0
KSP Object: (mg_coarse_sub_) 1 MPI processes
type: preonly
maximum iterations=1, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
KSP Object: (mg_coarse_sub_) left preconditioning
using NONE norm type for convergence test
PC Object: (mg_coarse_sub_) 1 MPI processes
type: preonly
1 MPI processes
type: lu
maximum iterations=1, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
LU: out-of-place factorization
left preconditioning
using NONE norm type for convergence test
PC Object: (mg_coarse_sub_) 1 MPI processes
type: lu
tolerance for zero pivot 2.22045e-14
using diagonal shift on blocks to prevent zero pivot
matrix ordering: nd
LU: out-of-place factorization
tolerance for zero pivot 2.22045e-14
using diagonal shift on blocks to prevent zero pivot
matrix ordering: nd
factor fill ratio given 5, needed 0
Factored matrix follows:
factor fill ratio given 5, needed 4.13207
Factored matrix follows:
Matrix Object: Matrix Object: 1 MPI processes
type: seqaij
rows=395, cols=395
package used to perform factorization: petsc
total: nonzeros=132379, allocated nonzeros=132379
total number of mallocs used during MatSetValues calls =0
not using I-node routines
1 MPI processes
type: seqaij
linear system matrix = precond matrix:
rows=0, cols=0
package used to perform factorization: petsc
total: nonzeros=1, allocated nonzeros=1
total number of mallocs used during MatSetValues calls =0
not using I-node routines
linear system matrix = precond matrix:
Matrix Object: 1 MPI processes
type: seqaij
Matrix Object:KSP Object: 1 MPI processes
type: seqaij
rows=0, cols=0
total: nonzeros=0, allocated nonzeros=0
total number of mallocs used during MatSetValues calls =0
not using I-node routines
rows=395, cols=395
total: nonzeros=32037, allocated nonzeros=32037
total number of mallocs used during MatSetValues calls =0
not using I-node routines
- - - - - - - - - - - - - - - - - -
KSP Object: (mg_coarse_sub_) 1 MPI processes
type: preonly
maximum iterations=1, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using NONE norm type for convergence test
PC Object: (mg_coarse_sub_) 1 MPI processes
type: lu
LU: out-of-place factorization
tolerance for zero pivot 2.22045e-14
using diagonal shift on blocks to prevent zero pivot
matrix ordering: nd
factor fill ratio given 5, needed 0
Factored matrix follows:
Matrix Object: 1 MPI processes
type: seqaij
rows=0, cols=0
package used to perform factorization: petsc
total: nonzeros=1, allocated nonzeros=1
total number of mallocs used during MatSetValues calls =0
not using I-node routines
linear system matrix = precond matrix:
Matrix Object: 1 MPI processes
type: seqaij
rows=0, cols=0
total: nonzeros=0, allocated nonzeros=0
total number of mallocs used during MatSetValues calls =0
not using I-node routines
(mg_coarse_sub_) 1 MPI processes
type: preonly
maximum iterations=1, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using NONE norm type for convergence test
PC Object: (mg_coarse_sub_) 1 MPI processes
type: lu
LU: out-of-place factorization
tolerance for zero pivot 2.22045e-14
using diagonal shift on blocks to prevent zero pivot
matrix ordering: nd
factor fill ratio given 5, needed 0
Factored matrix follows:
Matrix Object: 1 MPI processes
type: seqaij
rows=0, cols=0
package used to perform factorization: petsc
total: nonzeros=1, allocated nonzeros=1
total number of mallocs used during MatSetValues calls =0
not using I-node routines
linear system matrix = precond matrix:
Matrix Object: 1 MPI processes
type: seqaij
rows=0, cols=0
total: nonzeros=0, allocated nonzeros=0
total number of mallocs used during MatSetValues calls =0
not using I-node routines
[1] number of local blocks = 1, first local block number = 1
[1] local block number 0
- - - - - - - - - - - - - - - - - -
[2] number of local blocks = 1, first local block number = 2
[2] local block number 0
- - - - - - - - - - - - - - - - - -
[3] number of local blocks = 1, first local block number = 3
[3] local block number 0
- - - - - - - - - - - - - - - - - -
linear system matrix = precond matrix:
Matrix Object: 4 MPI processes
type: mpiaij
rows=395, cols=395
total: nonzeros=32037, allocated nonzeros=32037
total number of mallocs used during MatSetValues calls =0
not using I-node (on process 0) routines
Down solver (pre-smoother) on level 1 -------------------------------
KSP Object: (mg_levels_1_) 4 MPI processes
type: chebyshev
Chebyshev: eigenvalue estimates: min = 0.0636225, max = 1.33607
maximum iterations=2
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using nonzero initial guess
using NONE norm type for convergence test
PC Object: (mg_levels_1_) 4 MPI processes
type: jacobi
linear system matrix = precond matrix:
Matrix Object: 4 MPI processes
type: mpiaij
rows=23918, cols=23918
total: nonzeros=818732, allocated nonzeros=818732
total number of mallocs used during MatSetValues calls =0
not using I-node (on process 0) routines
Up solver (post-smoother) same as down solver (pre-smoother)
Down solver (pre-smoother) on level 2 -------------------------------
KSP Object: (mg_levels_2_) 4 MPI processes
type: chebyshev
Chebyshev: eigenvalue estimates: min = 0.0971369, max = 2.03987
maximum iterations=2
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using nonzero initial guess
using NONE norm type for convergence test
PC Object: (mg_levels_2_) 4 MPI processes
type: jacobi
linear system matrix = precond matrix:
Matrix Object: 4 MPI processes
type: mpiaij
rows=262144, cols=262144
total: nonzeros=1835008, allocated nonzeros=1835008
total number of mallocs used during MatSetValues calls =0
Up solver (post-smoother) same as down solver (pre-smoother)
linear system matrix = precond matrix:
Matrix Object: 4 MPI processes
type: mpiaij
rows=262144, cols=262144
total: nonzeros=1835008, allocated nonzeros=1835008
total number of mallocs used during MatSetValues calls =0
#PETSc Option Table entries:
-ksp_view
-options_left
-pc_gamg_agg_nsmooths 1
-pc_type gamg
#End of PETSc Option Table entries
There are no unused options.
Thank you,
Michele
</pre>
</blockquote>
</blockquote>
</blockquote>
<pre wrap="">
</pre>
</blockquote>
<pre wrap="">
</pre>
</blockquote>
<pre wrap="">
</pre>
</blockquote>
<br>
</body>
</html>