[petsc-users] KSPsolver
Jin, Shuangshuang
Shuangshuang.Jin at pnnl.gov
Tue Apr 16 13:29:00 CDT 2013
Hi, this is what I got by running mpiexec -n 2 ./ex12 -ksp_view:
[d3m956 at olympus ksp]$ mpiexec -n 2 ./ex12 -ksp_view
0, 56
0, 56
KSP Object: 1 MPI processes
  type: gmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=10000, initial guess is zero
  tolerances: relative=1e-05, absolute=1e-50, divergence=10000
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: 1 MPI processes
  type: ourjacobi
  linear system matrix = precond matrix:
KSP Object: 1 MPI processes
  type: gmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=10000, initial guess is zero
  tolerances: relative=1e-05, absolute=1e-50, divergence=10000
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: 1 MPI processes
  type: ourjacobi
  linear system matrix = precond matrix:
  Matrix Object: 1 MPI processes
    type: seqaij
    rows=56, cols=56
    total: nonzeros=250, allocated nonzeros=280
    total number of mallocs used during MatSetValues calls =0
      not using I-node routines
Norm of error 2.10144e-06 iterations 14
  Matrix Object: 1 MPI processes
    type: seqaij
    rows=56, cols=56
    total: nonzeros=250, allocated nonzeros=280
    total number of mallocs used during MatSetValues calls =0
      not using I-node routines
Norm of error 2.10144e-06 iterations 14
Thanks,
Shuangshuang
From: petsc-users-bounces at mcs.anl.gov [mailto:petsc-users-bounces at mcs.anl.gov] On Behalf Of Hong Zhang
Sent: Tuesday, April 16, 2013 11:26 AM
To: PETSc users list
Subject: Re: [petsc-users] KSPsolver
What do you get by running
mpiexec -n 2 ./ex12 -ksp_view
I get
mpiexec -n 2 ./ex12 -ksp_view
KSP Object: 2 MPI processes
  type: gmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=10000, initial guess is zero
  tolerances: relative=1e-05, absolute=1e-50, divergence=10000
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: 2 MPI processes
  type: ourjacobi
  linear system matrix = precond matrix:
  Matrix Object: 2 MPI processes
    type: mpiaij
    rows=56, cols=56
    total: nonzeros=250, allocated nonzeros=560
    total number of mallocs used during MatSetValues calls =0
      not using I-node (on process 0) routines
Norm of error 2.10144e-06 iterations 14
Hong
On Tue, Apr 16, 2013 at 1:17 PM, Jin, Shuangshuang <Shuangshuang.Jin at pnnl.gov> wrote:
Hello, everyone, I'm trying to run the following example: http://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examples/tutorials/ex12.c.html
According to its description, this example solves a linear system in parallel with KSP.
However, when I ran it on my cluster, I always got the same serial copy of the program running multiple times. I printed the Istart and Iend values right after the MatGetOwnershipRange call and noticed that the ownership range is identical on every processor, namely [0, 56].
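For reference, the check itself is tiny; this is roughly what I added (a minimal sketch, not the exact lines from ex12; I am assuming the matrix is the Mat A that ex12 assembles and that ierr is already declared as in the example):

  PetscInt Istart, Iend;
  /* each rank reports the block of rows it owns; cast to int for printf */
  ierr = MatGetOwnershipRange(A, &Istart, &Iend);CHKERRQ(ierr);
  printf("%d, %d\n", (int)Istart, (int)Iend);

With a properly distributed matrix on 2 processes I would expect complementary ranges such as 0, 28 and 28, 56, but instead every rank prints 0, 56.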
The following is the printout from the runs:
[d3m956 at olympus ksp]$ mpiexec -n 1 ./ex12
0, 56
Norm of error 2.10144e-06 iterations 14
[d3m956 at olympus ksp]$ mpiexec -n 2 ./ex12
0, 56
0, 56
Norm of error 2.10144e-06 iterations 14
Norm of error 2.10144e-06 iterations 14
[d3m956 at olympus ksp]$ mpiexec -n 4 ./ex12
0, 56
0, 56
0, 56
0, 56
Norm of error 2.10144e-06 iterations 14
Norm of error 2.10144e-06 iterations 14
Norm of error 2.10144e-06 iterations 14
Norm of error 2.10144e-06 iterations 14
Can anyone explain to me why I ran into this issue? Anything I missed?
Thanks,
Shuangshuang