<html>
<head>
<meta content="text/html; charset=utf-8" http-equiv="Content-Type">
</head>
<body bgcolor="#FFFFFF" text="#000000">
<div class="moz-cite-prefix">Sure, so with gamg (which was my first
choice too), I get the following:<br>
<br>
<font face="Courier New, Courier, monospace">[0]PETSC ERROR:
--------------------- Error Message
--------------------------------------------------------------<br>
[0]PETSC ERROR: <br>
[0]PETSC ERROR: Must call DMShellSetGlobalVector() or
DMShellSetCreateGlobalVector()<br>
[0]PETSC ERROR: See
<a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble
shooting.<br>
[0]PETSC ERROR: Petsc Release Version 3.5.2, Sep, 08, 2014 <br>
[0]PETSC ERROR:
/home/luc/research/feap_repo/ShearBands/parfeap/feap on a
arch-opt named euler by luc Thu Nov 20 16:19:29 2014<br>
[0]PETSC ERROR: Configure options --with-cc=gcc
--with-fc=gfortran --with-cxx=g++ --with-debugging=0
--with-shared-libraries=0 --download-fblaslapack
--download-mpich --download-parmetis --download-metis
--download-ml=yes --download-hypre --download-superlu_dist
--download-mumps --download-scalapack<br>
[0]PETSC ERROR: #145 DMCreateGlobalVector_Shell() line 245 in
/home/luc/research/petsc-3.5.2/src/dm/impls/shell/dmshell.c<br>
[0]PETSC ERROR: #146 DMCreateGlobalVector() line 681 in
/home/luc/research/petsc-3.5.2/src/dm/interface/dm.c<br>
[0]PETSC ERROR: #147 DMGetGlobalVector() line 154 in
/home/luc/research/petsc-3.5.2/src/dm/interface/dmget.c<br>
[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------<br>
[0]PETSC ERROR: <br>
[0]PETSC ERROR: Must call DMShellSetGlobalVector() or
DMShellSetCreateGlobalVector()<br>
[0]PETSC ERROR: See
<a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble
shooting.<br>
[0]PETSC ERROR: Petsc Release Version 3.5.2, Sep, 08, 2014 <br>
[0]PETSC ERROR:
/home/luc/research/feap_repo/ShearBands/parfeap/feap on a
arch-opt named euler by luc Thu Nov 20 16:19:29 2014<br>
[0]PETSC ERROR: Configure options --with-cc=gcc
--with-fc=gfortran --with-cxx=g++ --with-debugging=0
--with-shared-libraries=0 --download-fblaslapack
--download-mpich --download-parmetis --download-metis
--download-ml=yes --download-hypre --download-superlu_dist
--download-mumps --download-scalapack<br>
[0]PETSC ERROR: #148 DMCreateGlobalVector_Shell() line 245 in
/home/luc/research/petsc-3.5.2/src/dm/impls/shell/dmshell.c<br>
[0]PETSC ERROR: #149 DMCreateGlobalVector() line 681 in
/home/luc/research/petsc-3.5.2/src/dm/interface/dm.c<br>
[0]PETSC ERROR: #150 DMGetGlobalVector() line 154 in
/home/luc/research/petsc-3.5.2/src/dm/interface/dmget.c<br>
KSP Object: 1 MPI processes<br>
type: gmres<br>
GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
Orthogonalization with no iterative refinement<br>
GMRES: happy breakdown tolerance 1e-30<br>
maximum iterations=10000, initial guess is zero<br>
tolerances: relative=1e-08, absolute=1e-16, divergence=1e+16<br>
left preconditioning<br>
using PRECONDITIONED norm type for convergence test<br>
PC Object: 1 MPI processes<br>
type: fieldsplit<br>
FieldSplit with Schur preconditioner, factorization FULL<br>
Preconditioner for the Schur complement formed from Sp, an
assembled approximation to S, which uses (the lumped) A00's
diagonal's inverse<br>
Split info:<br>
Split number 0 Defined by IS<br>
Split number 1 Defined by IS<br>
KSP solver for A00 block<br>
KSP Object: (fieldsplit_0_) 1 MPI processes<br>
type: preonly<br>
maximum iterations=10000, initial guess is zero<br>
tolerances: relative=1e-05, absolute=1e-50,
divergence=10000<br>
left preconditioning<br>
using NONE norm type for convergence test<br>
PC Object: (fieldsplit_0_) 1 MPI processes<br>
type: hypre<br>
HYPRE Euclid preconditioning<br>
HYPRE Euclid: number of levels 1<br>
linear system matrix = precond matrix:<br>
Mat Object: (fieldsplit_0_) 1 MPI
processes<br>
type: seqaij<br>
rows=2000, cols=2000<br>
total: nonzeros=40000, allocated nonzeros=40000<br>
total number of mallocs used during MatSetValues calls
=0<br>
using I-node routines: found 400 nodes, limit used
is 5<br>
KSP solver for S = A11 - A10 inv(A00) A01 <br>
KSP Object: (fieldsplit_1_) 1 MPI processes<br>
type: gmres<br>
GMRES: restart=30, using Classical (unmodified)
Gram-Schmidt Orthogonalization with no iterative refinement<br>
GMRES: happy breakdown tolerance 1e-30<br>
maximum iterations=10000, initial guess is zero<br>
tolerances: relative=1e-05, absolute=1e-50,
divergence=10000<br>
left preconditioning<br>
using PRECONDITIONED norm type for convergence test<br>
PC Object: (fieldsplit_1_) 1 MPI processes<br>
type: gamg<br>
MG: type is MULTIPLICATIVE, levels=2 cycles=v<br>
Cycles per PCApply=1<br>
Using Galerkin computed coarse grid matrices<br>
Coarse grid solver -- level
-------------------------------<br>
KSP Object:
(fieldsplit_1_mg_coarse_) 1 MPI processes<br>
type: preonly<br>
maximum iterations=1, initial guess is zero<br>
tolerances: relative=1e-05, absolute=1e-50,
divergence=10000<br>
left preconditioning<br>
using NONE norm type for convergence test<br>
PC Object:
(fieldsplit_1_mg_coarse_) 1 MPI processes<br>
type: bjacobi<br>
block Jacobi: number of blocks = 1<br>
Local solve is same for all blocks, in the
following KSP and PC objects:<br>
KSP Object:
(fieldsplit_1_mg_coarse_sub_) 1 MPI processes<br>
type: preonly<br>
maximum iterations=1, initial guess is zero<br>
tolerances: relative=1e-05, absolute=1e-50,
divergence=10000<br>
left preconditioning<br>
using NONE norm type for convergence test<br>
PC Object:
(fieldsplit_1_mg_coarse_sub_) 1 MPI processes<br>
type: lu<br>
LU: out-of-place factorization<br>
tolerance for zero pivot 2.22045e-14<br>
using diagonal shift on blocks to prevent zero
pivot [INBLOCKS]<br>
matrix ordering: nd<br>
factor fill ratio given 5, needed 1.22785<br>
Factored matrix follows:<br>
Mat Object: 1 MPI
processes<br>
type: seqaij<br>
rows=13, cols=13<br>
package used to perform factorization:
petsc<br>
total: nonzeros=97, allocated
nonzeros=97<br>
total number of mallocs used during
MatSetValues calls =0<br>
using I-node routines: found 10 nodes,
limit used is 5<br>
linear system matrix = precond matrix:<br>
Mat Object: 1 MPI processes<br>
type: seqaij<br>
rows=13, cols=13<br>
total: nonzeros=79, allocated nonzeros=79<br>
total number of mallocs used during
MatSetValues calls =0<br>
not using I-node routines<br>
linear system matrix = precond matrix:<br>
Mat Object: 1 MPI processes<br>
type: seqaij<br>
rows=13, cols=13<br>
total: nonzeros=79, allocated nonzeros=79<br>
total number of mallocs used during MatSetValues
calls =0<br>
not using I-node routines<br>
Down solver (pre-smoother) on level 1
-------------------------------<br>
KSP Object:
(fieldsplit_1_mg_levels_1_) 1 MPI processes<br>
type: chebyshev<br>
Chebyshev: eigenvalue estimates: min = 0.0979919,
max = 2.05783<br>
maximum iterations=2<br>
tolerances: relative=1e-05, absolute=1e-50,
divergence=10000<br>
left preconditioning<br>
using nonzero initial guess<br>
using NONE norm type for convergence test<br>
PC Object:
(fieldsplit_1_mg_levels_1_) 1 MPI processes<br>
type: sor<br>
SOR: type = local_symmetric, iterations = 1, local
iterations = 1, omega = 1<br>
linear system matrix followed by preconditioner
matrix:<br>
Mat Object: (fieldsplit_1_) 1
MPI processes<br>
type: schurcomplement<br>
rows=330, cols=330<br>
Schur complement A11 - A10 inv(A00) A01<br>
A11<br>
Mat Object:
(fieldsplit_1_) 1 MPI processes<br>
type: seqaij<br>
rows=330, cols=330<br>
total: nonzeros=7642, allocated
nonzeros=7642<br>
total number of mallocs used during
MatSetValues calls =0<br>
using I-node routines: found 121 nodes,
limit used is 5<br>
A10<br>
Mat Object: 1 MPI processes<br>
type: seqaij<br>
rows=330, cols=2000<br>
total: nonzeros=22800, allocated
nonzeros=22800<br>
total number of mallocs used during
MatSetValues calls =0<br>
using I-node routines: found 121 nodes,
limit used is 5<br>
KSP of A00<br>
KSP Object:
(fieldsplit_0_) 1 MPI processes<br>
type: preonly<br>
maximum iterations=10000, initial guess is
zero<br>
tolerances: relative=1e-05, absolute=1e-50,
divergence=10000<br>
left preconditioning<br>
using NONE norm type for convergence test<br>
PC Object:
(fieldsplit_0_) 1 MPI processes<br>
type: hypre<br>
HYPRE Euclid preconditioning<br>
HYPRE Euclid: number of levels 1<br>
linear system matrix = precond matrix:<br>
Mat Object:
(fieldsplit_0_) 1 MPI processes<br>
type: seqaij<br>
rows=2000, cols=2000<br>
total: nonzeros=40000, allocated
nonzeros=40000<br>
total number of mallocs used during
MatSetValues calls =0<br>
using I-node routines: found 400 nodes,
limit used is 5<br>
A01<br>
Mat Object: 1 MPI processes<br>
type: seqaij<br>
rows=2000, cols=330<br>
total: nonzeros=22800, allocated
nonzeros=22800<br>
total number of mallocs used during
MatSetValues calls =0<br>
using I-node routines: found 400 nodes,
limit used is 5<br>
Mat Object: 1 MPI processes<br>
type: seqaij<br>
rows=330, cols=330<br>
total: nonzeros=7642, allocated nonzeros=7642<br>
total number of mallocs used during MatSetValues
calls =0<br>
using I-node routines: found 121 nodes, limit
used is 5<br>
Up solver (post-smoother) same as down solver
(pre-smoother)<br>
linear system matrix followed by preconditioner matrix:<br>
Mat Object: (fieldsplit_1_) 1 MPI
processes<br>
type: schurcomplement<br>
rows=330, cols=330<br>
Schur complement A11 - A10 inv(A00) A01<br>
A11<br>
Mat Object:
(fieldsplit_1_) 1 MPI processes<br>
type: seqaij<br>
rows=330, cols=330<br>
total: nonzeros=7642, allocated nonzeros=7642<br>
total number of mallocs used during MatSetValues
calls =0<br>
using I-node routines: found 121 nodes, limit
used is 5<br>
A10<br>
Mat Object: 1 MPI processes<br>
type: seqaij<br>
rows=330, cols=2000<br>
total: nonzeros=22800, allocated nonzeros=22800<br>
total number of mallocs used during MatSetValues
calls =0<br>
using I-node routines: found 121 nodes, limit
used is 5<br>
KSP of A00<br>
KSP Object:
(fieldsplit_0_) 1 MPI processes<br>
type: preonly<br>
maximum iterations=10000, initial guess is zero<br>
tolerances: relative=1e-05, absolute=1e-50,
divergence=10000<br>
left preconditioning<br>
using NONE norm type for convergence test<br>
PC Object:
(fieldsplit_0_) 1 MPI processes<br>
type: hypre<br>
HYPRE Euclid preconditioning<br>
HYPRE Euclid: number of levels 1<br>
linear system matrix = precond matrix:<br>
Mat Object:
(fieldsplit_0_) 1 MPI processes<br>
type: seqaij<br>
rows=2000, cols=2000<br>
total: nonzeros=40000, allocated
nonzeros=40000<br>
total number of mallocs used during
MatSetValues calls =0<br>
using I-node routines: found 400 nodes,
limit used is 5<br>
A01<br>
Mat Object: 1 MPI processes<br>
type: seqaij<br>
rows=2000, cols=330<br>
total: nonzeros=22800, allocated nonzeros=22800<br>
total number of mallocs used during MatSetValues
calls =0<br>
using I-node routines: found 400 nodes, limit
used is 5<br>
Mat Object: 1 MPI processes<br>
type: seqaij<br>
rows=330, cols=330<br>
total: nonzeros=7642, allocated nonzeros=7642<br>
total number of mallocs used during MatSetValues calls
=0<br>
using I-node routines: found 121 nodes, limit used
is 5<br>
linear system matrix = precond matrix:<br>
Mat Object: 1 MPI processes<br>
type: seqaij<br>
rows=2330, cols=2330<br>
total: nonzeros=93242, allocated nonzeros=93242<br>
total number of mallocs used during MatSetValues calls =0<br>
using I-node routines: found 521 nodes, limit used is 5</font><br>
<br>
I am still missing the same call to <font face="Courier New,
Courier, monospace">DMShellSetCreateGlobalVector()/DMShellSetGlobalVector()</font>;
the only difference is that I now get the error message twice : )<br>
<br>
FYI, this is how I set up my DMShell, KSP and PC:<br>
<br>
c Set up the number of fields, the section dofs and the fields'
section dofs<br>
call PetscSectionCreate(PETSC_COMM_WORLD, FSSection, ierr)<br>
call PetscSectionSetNumFields(FSSection, numFields, ierr)<br>
call PetscSectionSetChart(FSSection, 1, numpeq+1, ierr)<br>
call PetscSectionGetChart(FSSection, NF, NE, ierr)<br>
<br>
do some loop over nodes and dofs<br>
call PetscSectionSetDof(FSSection, kk, 1,
ierr)<br>
call PetscSectionSetFieldDof(FSSection, kk,
jj-1,1, ierr)<br>
enddo<br>
<br>
call PetscSectionSetUp(FSSection, ierr)<br>
c Create the DM and set the default section<br>
if(DM_flg) then<br>
call DMShellCreate(PETSC_COMM_WORLD,FSDM, ierr)<br>
call DMSetDefaultSection(FSDM, FSSection, ierr)<br>
call DMSetUp(FSDM, ierr)<br>
endif<br>
<br>
call KSPCreate (PETSC_COMM_WORLD, kspsol ,ierr)<br>
<br>
call PetscOptionsGetString(PETSC_NULL_CHARACTER,'-pc_type',<br>
& pctype,chk,ierr)<br>
c Check for fieldsplit preconditioner to assign a DM to the
KSP<br>
if (chk .eqv. PETSC_TRUE) then<br>
if(pctype.eq.'fieldsplit') then<br>
call KSPSetDM(kspsol, FSDM, ierr)<br>
call KSPSetDMActive(kspsol, PETSC_FALSE, ierr)<br>
endif<br>
endif<br>
c call KSPSetOperators (kspsol, Kmat, Kmat,<br>
c & DIFFERENT_NONZERO_PATTERN
,ierr)<br>
call KSPSetOperators(kspsol, Kmat, Kmat, ierr)<br>
<br>
c Options from command line<br>
<br>
call KSPSetFromOptions(kspsol,ierr)<br>
<br>
call KSPGetPC (kspsol, pc , ierr)<br>
call PCSetCoordinates(pc,ndm,numpn,hr(np(43)),ierr)<br>
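<br>
For what it's worth, I imagine the missing piece would be
something along these lines (an untested sketch, not what I
currently do; tmpvec is a hypothetical work vector sized like
the system, obtained here from Kmat):<br>
<br>
c     Give the DMShell a template global vector so that<br>
c     PCMG/GAMG can call DMCreateGlobalVector() on it<br>
      call MatGetVecs(Kmat, tmpvec, PETSC_NULL_OBJECT, ierr)<br>
      call DMShellSetGlobalVector(FSDM, tmpvec, ierr)<br>
      call VecDestroy(tmpvec, ierr)<br>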
<br>
<pre class="moz-signature" cols="72">Best,
Luc</pre>
On 11/21/2014 11:14 AM, Matthew Knepley wrote:<br>
</div>
<blockquote
cite="mid:CAMYG4GmLMXqiU=aHHsTEo7AHVzr-0XJytLF0jC2nWNW=3Lti3g@mail.gmail.com"
type="cite">
<div dir="ltr">
<div class="gmail_extra">
<div class="gmail_quote">On Fri, Nov 21, 2014 at 9:55 AM, Luc
Berger-Vergiat <span dir="ltr"><<a
moz-do-not-send="true" href="mailto:lb2653@columbia.edu"
target="_blank">lb2653@columbia.edu</a>></span>
wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0
.8ex;border-left:1px #ccc solid;padding-left:1ex">
<div bgcolor="#FFFFFF" text="#000000"> Hi all,<br>
I am using a DMShell to create a fieldsplit
preconditioner.<br>
I would like to try some of PETSc's multigrid options:
mg and gamg.<br>
</div>
</blockquote>
<div><br>
</div>
<div>Let's start with gamg, since that does not need anything
else from the user. If you want to use mg,</div>
<div>then you will need to supply the
interpolation/restriction operators since we cannot
calculate them</div>
<div>without an idea of the discretization.</div>
<div><br>
</div>
<div> Thanks,</div>
<div><br>
</div>
<div> Matt</div>
<div> </div>
<blockquote class="gmail_quote" style="margin:0 0 0
.8ex;border-left:1px #ccc solid;padding-left:1ex">
<div bgcolor="#FFFFFF" text="#000000"> When I call my
preconditioner:<br>
<blockquote><font face="Courier New, Courier, monospace">-ksp_type
gmres -pc_type fieldsplit -pc_fieldsplit_type schur
-pc_fieldsplit_schur_factorization_type full
-pc_fieldsplit_schur_precondition selfp
-pc_fieldsplit_0_fields 2,3 -pc_fieldsplit_1_fields
0,1 -fieldsplit_0_ksp_type preonly
-fieldsplit_0_pc_type hypre
-fieldsplit_0_pc_hypre_type euclid
-fieldsplit_1_ksp_type gmres -fieldsplit_1_pc_type
mg -malloc_log mlog -log_summary time.log -ksp_view</font><br>
</blockquote>
it returns the following error message and ksp_view output:<br>
<br>
<font face="Courier New, Courier, monospace">[0]PETSC
ERROR: --------------------- Error Message
--------------------------------------------------------------<br>
[0]PETSC ERROR: <br>
[0]PETSC ERROR: Must call DMShellSetGlobalVector() or
DMShellSetCreateGlobalVector()<br>
[0]PETSC ERROR: See <a moz-do-not-send="true"
href="http://www.mcs.anl.gov/petsc/documentation/faq.html"
target="_blank">http://www.mcs.anl.gov/petsc/documentation/faq.html</a>
for trouble shooting.<br>
[0]PETSC ERROR: Petsc Release Version 3.5.2, Sep, 08,
2014 <br>
[0]PETSC ERROR:
/home/luc/research/feap_repo/ShearBands/parfeap/feap
on a arch-opt named euler by luc Fri Nov 21 10:12:53
2014<br>
[0]PETSC ERROR: Configure options --with-cc=gcc
--with-fc=gfortran --with-cxx=g++ --with-debugging=0
--with-shared-libraries=0 --download-fblaslapack
--download-mpich --download-parmetis --download-metis
--download-ml=yes --download-hypre
--download-superlu_dist --download-mumps
--download-scalapack<br>
[0]PETSC ERROR: #259 DMCreateGlobalVector_Shell() line
245 in
/home/luc/research/petsc-3.5.2/src/dm/impls/shell/dmshell.c<br>
[0]PETSC ERROR: #260 DMCreateGlobalVector() line 681
in
/home/luc/research/petsc-3.5.2/src/dm/interface/dm.c<br>
[0]PETSC ERROR: #261 DMGetGlobalVector() line 154 in
/home/luc/research/petsc-3.5.2/src/dm/interface/dmget.c<br>
KSP Object: 1 MPI processes<br>
type: gmres<br>
GMRES: restart=30, using Classical (unmodified)
Gram-Schmidt Orthogonalization with no iterative
refinement<br>
GMRES: happy breakdown tolerance 1e-30<br>
maximum iterations=10000, initial guess is zero<br>
tolerances: relative=1e-08, absolute=1e-16,
divergence=1e+16<br>
left preconditioning<br>
using PRECONDITIONED norm type for convergence test<br>
PC Object: 1 MPI processes<br>
type: fieldsplit<br>
FieldSplit with Schur preconditioner,
factorization FULL<br>
Preconditioner for the Schur complement formed
from Sp, an assembled approximation to S, which uses
(the lumped) A00's diagonal's inverse<br>
Split info:<br>
Split number 0 Defined by IS<br>
Split number 1 Defined by IS<br>
KSP solver for A00 block<br>
KSP Object: (fieldsplit_0_) 1 MPI
processes<br>
type: preonly<br>
maximum iterations=10000, initial guess is
zero<br>
tolerances: relative=1e-05, absolute=1e-50,
divergence=10000<br>
left preconditioning<br>
using NONE norm type for convergence test<br>
PC Object: (fieldsplit_0_) 1 MPI
processes<br>
type: hypre<br>
HYPRE Euclid preconditioning<br>
HYPRE Euclid: number of levels 1<br>
linear system matrix = precond matrix:<br>
Mat Object: (fieldsplit_0_) 1
MPI processes<br>
type: seqaij<br>
rows=2000, cols=2000<br>
total: nonzeros=40000, allocated
nonzeros=40000<br>
total number of mallocs used during
MatSetValues calls =0<br>
using I-node routines: found 400 nodes,
limit used is 5<br>
KSP solver for S = A11 - A10 inv(A00) A01 <br>
KSP Object: (fieldsplit_1_) 1 MPI
processes<br>
type: gmres<br>
GMRES: restart=30, using Classical
(unmodified) Gram-Schmidt Orthogonalization with no
iterative refinement<br>
GMRES: happy breakdown tolerance 1e-30<br>
maximum iterations=10000, initial guess is
zero<br>
tolerances: relative=1e-05, absolute=1e-50,
divergence=10000<br>
left preconditioning<br>
using PRECONDITIONED norm type for convergence
test<br>
PC Object: (fieldsplit_1_) 1 MPI
processes<br>
type: mg<br>
MG: type is MULTIPLICATIVE, levels=1
cycles=v<br>
Cycles per PCApply=1<br>
Not using Galerkin computed coarse grid
matrices<br>
Coarse grid solver -- level
-------------------------------<br>
KSP Object:
(fieldsplit_1_mg_levels_0_) 1 MPI processes<br>
type: chebyshev<br>
Chebyshev: eigenvalue estimates: min =
1.70057, max = 18.7063<br>
Chebyshev: estimated using: [0 0.1; 0
1.1]<br>
KSP Object:
(fieldsplit_1_mg_levels_0_est_) 1 MPI
processes<br>
type: gmres<br>
GMRES: restart=30, using Classical
(unmodified) Gram-Schmidt Orthogonalization with no
iterative refinement<br>
GMRES: happy breakdown tolerance
1e-30<br>
maximum iterations=10, initial guess
is zero<br>
tolerances: relative=1e-05,
absolute=1e-50, divergence=10000<br>
left preconditioning<br>
using NONE norm type for convergence
test<br>
PC Object:
(fieldsplit_1_mg_levels_0_) 1 MPI
processes<br>
type: sor<br>
SOR: type = local_symmetric,
iterations = 1, local iterations = 1, omega = 1<br>
linear system matrix followed by
preconditioner matrix:<br>
Mat Object:
(fieldsplit_1_) 1 MPI processes<br>
type: schurcomplement<br>
rows=330, cols=330<br>
Schur complement A11 - A10
inv(A00) A01<br>
A11<br>
Mat Object:
(fieldsplit_1_) 1 MPI processes<br>
type: seqaij<br>
rows=330, cols=330<br>
total: nonzeros=7642,
allocated nonzeros=7642<br>
total number of mallocs used
during MatSetValues calls =0<br>
using I-node routines: found
121 nodes, limit used is 5<br>
A10<br>
Mat
Object: 1 MPI processes<br>
type: seqaij<br>
rows=330, cols=2000<br>
total: nonzeros=22800,
allocated nonzeros=22800<br>
total number of mallocs used
during MatSetValues calls =0<br>
using I-node routines: found
121 nodes, limit used is 5<br>
KSP of A00<br>
KSP Object:
(fieldsplit_0_) 1 MPI processes<br>
type: preonly<br>
maximum iterations=10000,
initial guess is zero<br>
tolerances: relative=1e-05,
absolute=1e-50, divergence=10000<br>
left preconditioning<br>
using NONE norm type for
convergence test<br>
PC Object:
(fieldsplit_0_) 1 MPI processes<br>
type: hypre<br>
HYPRE Euclid preconditioning<br>
HYPRE Euclid: number of
levels 1<br>
linear system matrix = precond
matrix:<br>
Mat
Object:
(fieldsplit_0_) 1 MPI
processes<br>
type: seqaij<br>
rows=2000, cols=2000<br>
total: nonzeros=40000,
allocated nonzeros=40000<br>
total number of mallocs used
during MatSetValues calls =0<br>
using I-node routines:
found 400 nodes, limit used is 5<br>
A01<br>
Mat
Object: 1 MPI processes<br>
type: seqaij<br>
rows=2000, cols=330<br>
total: nonzeros=22800,
allocated nonzeros=22800<br>
total number of mallocs used
during MatSetValues calls =0<br>
using I-node routines: found
400 nodes, limit used is 5<br>
Mat Object: 1 MPI
processes<br>
type: seqaij<br>
rows=330, cols=330<br>
total: nonzeros=7642, allocated
nonzeros=7642<br>
total number of mallocs used during
MatSetValues calls =0<br>
using I-node routines: found 121
nodes, limit used is 5<br>
maximum iterations=1, initial guess is
zero<br>
tolerances: relative=1e-05,
absolute=1e-50, divergence=10000<br>
left preconditioning<br>
using NONE norm type for convergence test<br>
PC Object:
(fieldsplit_1_mg_levels_0_) 1 MPI processes<br>
type: sor<br>
SOR: type = local_symmetric, iterations
= 1, local iterations = 1, omega = 1<br>
linear system matrix followed by
preconditioner matrix:<br>
Mat Object:
(fieldsplit_1_) 1 MPI processes<br>
type: schurcomplement<br>
rows=330, cols=330<br>
Schur complement A11 - A10 inv(A00)
A01<br>
A11<br>
Mat Object:
(fieldsplit_1_) 1 MPI processes<br>
type: seqaij<br>
rows=330, cols=330<br>
total: nonzeros=7642, allocated
nonzeros=7642<br>
total number of mallocs used
during MatSetValues calls =0<br>
using I-node routines: found 121
nodes, limit used is 5<br>
A10<br>
Mat Object: 1 MPI
processes<br>
type: seqaij<br>
rows=330, cols=2000<br>
total: nonzeros=22800, allocated
nonzeros=22800<br>
total number of mallocs used
during MatSetValues calls =0<br>
using I-node routines: found 121
nodes, limit used is 5<br>
KSP of A00<br>
KSP Object:
(fieldsplit_0_) 1 MPI processes<br>
type: preonly<br>
maximum iterations=10000, initial
guess is zero<br>
tolerances: relative=1e-05,
absolute=1e-50, divergence=10000<br>
left preconditioning<br>
using NONE norm type for
convergence test<br>
PC Object:
(fieldsplit_0_) 1 MPI processes<br>
type: hypre<br>
HYPRE Euclid preconditioning<br>
HYPRE Euclid: number of levels 1<br>
linear system matrix = precond
matrix:<br>
Mat Object:
(fieldsplit_0_) 1 MPI processes<br>
type: seqaij<br>
rows=2000, cols=2000<br>
total: nonzeros=40000, allocated
nonzeros=40000<br>
total number of mallocs used
during MatSetValues calls =0<br>
using I-node routines: found
400 nodes, limit used is 5<br>
A01<br>
Mat Object: 1 MPI
processes<br>
type: seqaij<br>
rows=2000, cols=330<br>
total: nonzeros=22800, allocated
nonzeros=22800<br>
total number of mallocs used
during MatSetValues calls =0<br>
using I-node routines: found 400
nodes, limit used is 5<br>
Mat Object: 1 MPI processes<br>
type: seqaij<br>
rows=330, cols=330<br>
total: nonzeros=7642, allocated
nonzeros=7642<br>
total number of mallocs used during
MatSetValues calls =0<br>
using I-node routines: found 121
nodes, limit used is 5<br>
linear system matrix followed by
preconditioner matrix:<br>
Mat Object: (fieldsplit_1_) 1
MPI processes<br>
type: schurcomplement<br>
rows=330, cols=330<br>
Schur complement A11 - A10 inv(A00) A01<br>
A11<br>
Mat Object:
(fieldsplit_1_) 1 MPI processes<br>
type: seqaij<br>
rows=330, cols=330<br>
total: nonzeros=7642, allocated
nonzeros=7642<br>
total number of mallocs used during
MatSetValues calls =0<br>
using I-node routines: found 121
nodes, limit used is 5<br>
A10<br>
Mat Object: 1 MPI
processes<br>
type: seqaij<br>
rows=330, cols=2000<br>
total: nonzeros=22800, allocated
nonzeros=22800<br>
total number of mallocs used during
MatSetValues calls =0<br>
using I-node routines: found 121
nodes, limit used is 5<br>
KSP of A00<br>
KSP Object:
(fieldsplit_0_) 1 MPI processes<br>
type: preonly<br>
maximum iterations=10000, initial
guess is zero<br>
tolerances: relative=1e-05,
absolute=1e-50, divergence=10000<br>
left preconditioning<br>
using NONE norm type for convergence
test<br>
PC Object:
(fieldsplit_0_) 1 MPI processes<br>
type: hypre<br>
HYPRE Euclid preconditioning<br>
HYPRE Euclid: number of levels 1<br>
linear system matrix = precond matrix:<br>
Mat Object:
(fieldsplit_0_) 1 MPI processes<br>
type: seqaij<br>
rows=2000, cols=2000<br>
total: nonzeros=40000, allocated
nonzeros=40000<br>
total number of mallocs used during
MatSetValues calls =0<br>
using I-node routines: found 400
nodes, limit used is 5<br>
A01<br>
Mat Object: 1 MPI
processes<br>
type: seqaij<br>
rows=2000, cols=330<br>
total: nonzeros=22800, allocated
nonzeros=22800<br>
total number of mallocs used during
MatSetValues calls =0<br>
using I-node routines: found 400
nodes, limit used is 5<br>
Mat Object: 1 MPI processes<br>
type: seqaij<br>
rows=330, cols=330<br>
total: nonzeros=7642, allocated
nonzeros=7642<br>
total number of mallocs used during
MatSetValues calls =0<br>
using I-node routines: found 121 nodes,
limit used is 5<br>
linear system matrix = precond matrix:<br>
Mat Object: 1 MPI processes<br>
type: seqaij<br>
rows=2330, cols=2330<br>
total: nonzeros=93242, allocated nonzeros=93242<br>
total number of mallocs used during MatSetValues
calls =0<br>
using I-node routines: found 521 nodes, limit
used is 5</font><br>
<br>
I am not completely surprised by this, since the
multigrid algorithm might want to know the structure of
my vector in order to do a good job at restrictions and
prolongations, etc. I am just not sure when I should
call <font face="Courier New, Courier, monospace">DMShellSetGlobalVector()
or DMShellSetCreateGlobalVector()</font>.<br>
If I call <font face="Courier New, Courier, monospace">DMShellSetGlobalVector()</font>
I assume that I need to generate my vector somehow
beforehand, but I don't know what is required to make it
a valid "template" vector.<br>
If I call <font face="Courier New, Courier, monospace">DMShellSetCreateGlobalVector()
<font face="sans-serif">I need to pass it a function
that computes a vector, but what kind of vector? A
residual or a "template"? This is not very clear to
me...<span class="HOEnZb"><font color="#888888"><br>
<br>
</font></span></font></font><span class="HOEnZb"><font
color="#888888">
<pre cols="72">--
Best,
Luc</pre>
</font></span></div>
</blockquote>
</div>
<br>
<br clear="all">
<div><br>
</div>
-- <br>
<div class="gmail_signature">What most experimenters take for
granted before they begin their experiments is infinitely
more interesting than any results to which their experiments
lead.<br>
-- Norbert Wiener</div>
</div>
</div>
</blockquote>
<br>
</body>
</html>