PETSc is slowing down in C++? again

Matthew Knepley knepley at gmail.com
Sun Sep 6 21:07:29 CDT 2009


On Sun, Sep 6, 2009 at 8:39 PM, xiaoyin ji <sapphire.jxy at gmail.com> wrote:

> Hi,
>
> I cannot pass runtime options like -ksp_monitor to my program, since it
> is not a simple PETSc code. However, after testing the KSP examples I
> have found a similar problem.
>
> Here is the code I'm testing: src/ksp/ksp/examples/tutorials/ex2.c
>
> What I've done is add a loop of 10000 steps between MatCreate and
> MatDestroy (right before PetscFinalize), and print the time for each
> loop. The time increases exponentially, just like in my program.
> Moreover, if I narrow the loop so that only the KSP create and destroy
> are included, the solving time does not change. The -ksp_monitor option
> shows that the KSP loop is running fine, but I cannot use this option
> together with the timing test, since the printing changes the loop time
> significantly. (A sketch of this test loop appears right after this
> quoted message.)
>
> It seems to me that either MatDestroy or VecDestroy does not clean
> everything up properly in C++ code (in Fortran code they work well).
> Besides, instead of calling the PETSc functions directly, I have also
> created a class that wraps the PETSc Mat and KSP utilities, and I
> create/destroy an object of this class in each loop. The problem still
> exists, however.
>
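
For concreteness, the test described in the message above amounts to
something like the following. This is a minimal sketch, not the poster's
actual code: the 1-D Laplacian, loop bounds, and variable names are
assumptions, and it is written against a recent PETSc API (the 2009-era
API differs slightly, e.g. MatDestroy(A) rather than MatDestroy(&A), and
PetscGetTime() rather than PetscTime()).

/* Rebuild the Mat/Vec/KSP objects inside a loop and print the wall time
   per pass; if nothing is left behind, the per-pass time should stay flat. */
#include <petscksp.h>

int main(int argc, char **argv)
{
  PetscInt       n = 1000, i, Istart, Iend, step, nsteps = 10000;
  PetscLogDouble t0, t1;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  for (step = 0; step < nsteps; step++) {
    Mat A; Vec x, b; KSP ksp;

    PetscCall(PetscTime(&t0));

    /* assemble a toy tridiagonal (1-D Laplacian) system */
    PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
    PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n));
    PetscCall(MatSetFromOptions(A));
    PetscCall(MatSetUp(A));
    PetscCall(MatGetOwnershipRange(A, &Istart, &Iend));
    for (i = Istart; i < Iend; i++) {
      if (i > 0)   PetscCall(MatSetValue(A, i, i-1, -1.0, INSERT_VALUES));
      if (i < n-1) PetscCall(MatSetValue(A, i, i+1, -1.0, INSERT_VALUES));
      PetscCall(MatSetValue(A, i, i, 2.0, INSERT_VALUES));
    }
    PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
    PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

    PetscCall(MatCreateVecs(A, &x, &b));
    PetscCall(VecSet(b, 1.0));

    PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
    PetscCall(KSPSetOperators(ksp, A, A));
    PetscCall(KSPSetFromOptions(ksp));
    PetscCall(KSPSolve(ksp, b, x));

    /* destroy everything so each pass starts from scratch */
    PetscCall(KSPDestroy(&ksp));
    PetscCall(VecDestroy(&x));
    PetscCall(VecDestroy(&b));
    PetscCall(MatDestroy(&A));

    PetscCall(PetscTime(&t1));
    PetscCall(PetscPrintf(PETSC_COMM_WORLD, "step %d: %g s\n",
                          (int)step, (double)(t1 - t0)));
  }
  PetscCall(PetscFinalize());
  return 0;
}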

1) I cannot reproduce this bug. However, the description is not very clear.

2) The Fortran and C++ interfaces are just wrappers; they do not handle
   memory allocation or computation.

3) This must be something specific to your computer.

  Matt


> Best,
> Xiaoyin Ji
>
> On Thu, Sep 3, 2009 at 8:34 AM, xiaoyin ji<sapphire.jxy at gmail.com> wrote:
> > Hi,
> >
> > Here are the printouts.
> >
> > At the very beginning, the average time is about 0.8 sec for the KSP solve:
> > KSP Object:
> >  type: bicg
> >  maximum iterations=10000, initial guess is zero
> >  tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
> >  left preconditioning
> > PC Object:
> >  type: bjacobi
> >    block Jacobi: number of blocks = 16
> >    Local solve is same for all blocks, in the following KSP and PC objects:
> >  KSP Object:(sub_)
> >    type: preonly
> >    maximum iterations=10000, initial guess is zero
> >    tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
> >    left preconditioning
> >  PC Object:(sub_)
> >    type: ilu
> >      ILU: 0 levels of fill
> >      ILU: factor fill ratio allocated 1
> >      ILU: tolerance for zero pivot 1e-12
> >      ILU: using diagonal shift to prevent zero pivot
> >      ILU: using diagonal shift on blocks to prevent zero pivot
> >           out-of-place factorization
> >           matrix ordering: natural
> >      ILU: factor fill ratio needed 1
> >           Factored matrix follows
> >          Matrix Object:
> >            type=seqaij, rows=5672, cols=5672
> >            package used to perform factorization: petsc
> >            total: nonzeros=39090, allocated nonzeros=39704
> >              not using I-node routines
> >    linear system matrix = precond matrix:
> >    Matrix Object:
> >      type=seqaij, rows=5672, cols=5672
> >      total: nonzeros=39090, allocated nonzeros=39704
> >        not using I-node routines
> >  linear system matrix = precond matrix:
> >  Matrix Object:
> >    type=mpiaij, rows=90746, cols=90746
> >    total: nonzeros=636378, allocated nonzeros=1279114
> >      not using I-node (on process 0) routines
> > Norm of error 48.144, Iterations 137
> >
> > After 4000 steps, the solve takes 7.5 sec:
> >
> > KSP Object:
> >  type: bicg
> >  maximum iterations=10000, initial guess is zero
> >  tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
> >  left preconditioning
> > PC Object:
> >  type: bjacobi
> >    block Jacobi: number of blocks = 16
> >    Local solve is same for all blocks, in the following KSP and PC objects:
> >  KSP Object:(sub_)
> >    type: preonly
> >    maximum iterations=10000, initial guess is zero
> >    tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
> >    left preconditioning
> >  PC Object:(sub_)
> >    type: ilu
> >      ILU: 0 levels of fill
> >      ILU: factor fill ratio allocated 1
> >      ILU: tolerance for zero pivot 1e-12
> >      ILU: using diagonal shift to prevent zero pivot
> >      ILU: using diagonal shift on blocks to prevent zero pivot
> >           out-of-place factorization
> >           matrix ordering: natural
> >      ILU: factor fill ratio needed 1
> >           Factored matrix follows
> >          Matrix Object:
> >            type=seqaij, rows=5672, cols=5672
> >            package used to perform factorization: petsc
> >            total: nonzeros=39090, allocated nonzeros=39704
> >              not using I-node routines
> >    linear system matrix = precond matrix:
> >    Matrix Object:
> >      type=seqaij, rows=5672, cols=5672
> >      total: nonzeros=39090, allocated nonzeros=39704
> >        not using I-node routines
> >  linear system matrix = precond matrix:
> >  Matrix Object:
> >    type=mpiaij, rows=90746, cols=90746
> >    total: nonzeros=636378, allocated nonzeros=1279114
> >      not using I-node (on process 0) routines
> > Norm of error 48.7467, Iterations 132
> >
> >
> > The iteration counts are similar, yet the solve time is increasing
> > exponentially, and the matrix should not be too complicated here, since
> > the Fortran PETSc code solved it in about 1 sec.
> >
> > By the way, is there a way to set a PETSc vector directly into a
> > preconditioner for the KSP solver? (One possible approach is sketched
> > after this quoted message.)
> >
> > Thanks!
> >
> > Best,
> > Xiaoyin Ji
> >
> > Department of Materials Science and Engineering
> > North Carolina State University
> >
>
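
On the question above about setting a PETSc vector directly into the
preconditioner: one way to do this is a shell preconditioner (PCSHELL)
whose apply routine multiplies pointwise by a stored vector, i.e. a
user-supplied diagonal (Jacobi-like) preconditioner. The sketch below is
illustrative only: the function names are made up, the context vector is
assumed to already hold the reciprocal diagonal, and it is written
against a recent PETSc API (the shell-PC interface existed in 2009, but
some signatures have changed since).

#include <petscksp.h>

/* Apply y = D x, where D is stored as a Vec in the PC's shell context
   (here assumed to hold the reciprocal diagonal of the operator). */
static PetscErrorCode MyDiagApply(PC pc, Vec x, Vec y)
{
  Vec d;

  PetscFunctionBeginUser;
  PetscCall(PCShellGetContext(pc, &d));
  PetscCall(VecPointwiseMult(y, d, x)); /* y[i] = d[i] * x[i] */
  PetscFunctionReturn(PETSC_SUCCESS);
}

/* Attach the vector 'diag' (the reciprocal diagonal) to an existing KSP
   as a shell preconditioner. */
static PetscErrorCode UseVecAsPreconditioner(KSP ksp, Vec diag)
{
  PC pc;

  PetscFunctionBeginUser;
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCSHELL));
  PetscCall(PCShellSetContext(pc, diag));
  PetscCall(PCShellSetApply(pc, MyDiagApply));
  PetscFunctionReturn(PETSC_SUCCESS);
}

An alternative, if the vector really is a diagonal, is to place it on the
diagonal of the preconditioning matrix with MatDiagonalSet and use the
built-in PCJACOBI instead of a shell PC.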



-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener