[petsc-users] KSPsolver

Hong Zhang hzhang at mcs.anl.gov
Tue Apr 16 13:25:56 CDT 2013


What do you get by running
mpiexec -n 2 ./ex12 -ksp_view

I get
mpiexec -n 2 ./ex12 -ksp_view
KSP Object: 2 MPI processes
  type: gmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=10000, initial guess is zero
  tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: 2 MPI processes
  type: ourjacobi
  linear system matrix = precond matrix:
  Matrix Object:   2 MPI processes
    type: mpiaij
    rows=56, cols=56
    total: nonzeros=250, allocated nonzeros=560
    total number of mallocs used during MatSetValues calls =0
      not using I-node (on process 0) routines
Norm of error 2.10144e-06 iterations 14

Hong



On Tue, Apr 16, 2013 at 1:17 PM, Jin, Shuangshuang <
Shuangshuang.Jin at pnnl.gov> wrote:

>  Hello, everyone, I'm trying to run the following example:
> http://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examples/tutorials/ex12.c.html
>
> According to its description, this example solves a linear system in
> parallel with KSP.
>
> However, when I ran it on my cluster, I always got the same copy running
> multiple times. I printed the Istart and Iend values right after
> MatGetOwnershipRange, and noticed that the range on every processor is the
> same, namely [0, 56).
>
> The following is the printout from the runs:
>
> [d3m956 at olympus ksp]$ mpiexec -n 1 ./ex12
> 0, 56
> Norm of error 2.10144e-06 iterations 14
> [d3m956 at olympus ksp]$ mpiexec -n 2 ./ex12
> 0, 56
> 0, 56
> Norm of error 2.10144e-06 iterations 14
> Norm of error 2.10144e-06 iterations 14
> [d3m956 at olympus ksp]$ mpiexec -n 4 ./ex12
> 0, 56
> 0, 56
> 0, 56
> 0, 56
> Norm of error 2.10144e-06 iterations 14
> Norm of error 2.10144e-06 iterations 14
> Norm of error 2.10144e-06 iterations 14
> Norm of error 2.10144e-06 iterations 14
>
> Can anyone explain why I'm running into this issue? Is there anything I
> missed?
>
> Thanks,
> Shuangshuang
>
>
>

