PETSc is slowing down in C++? continued
Matthew Knepley
knepley at gmail.com
Thu Sep 3 07:53:37 CDT 2009
On Thu, Sep 3, 2009 at 7:34 AM, xiaoyin ji <sapphire.jxy at gmail.com> wrote:
> Hi,
>
> Here are the print outs
>
> At the very beginning, the average time is about 0.8 sec per KSP solve:
> KSP Object:
>   type: bicg
>   maximum iterations=10000, initial guess is zero
>   tolerances: relative=1e-07, absolute=1e-50, divergence=10000
>   left preconditioning
> PC Object:
>   type: bjacobi
>     block Jacobi: number of blocks = 16
>     Local solve is same for all blocks, in the following KSP and PC objects:
>   KSP Object:(sub_)
>     type: preonly
>     maximum iterations=10000, initial guess is zero
>     tolerances: relative=1e-05, absolute=1e-50, divergence=10000
>     left preconditioning
>   PC Object:(sub_)
>     type: ilu
>       ILU: 0 levels of fill
>       ILU: factor fill ratio allocated 1
>       ILU: tolerance for zero pivot 1e-12
>       ILU: using diagonal shift to prevent zero pivot
>       ILU: using diagonal shift on blocks to prevent zero pivot
>       out-of-place factorization
>       matrix ordering: natural
>       ILU: factor fill ratio needed 1
>       Factored matrix follows
>         Matrix Object:
>           type=seqaij, rows=5672, cols=5672
>           package used to perform factorization: petsc
>           total: nonzeros=39090, allocated nonzeros=39704
>           not using I-node routines
>     linear system matrix = precond matrix:
>     Matrix Object:
>       type=seqaij, rows=5672, cols=5672
>       total: nonzeros=39090, allocated nonzeros=39704
>       not using I-node routines
> linear system matrix = precond matrix:
> Matrix Object:
>   type=mpiaij, rows=90746, cols=90746
>   total: nonzeros=636378, allocated nonzeros=1279114
>   not using I-node (on process 0) routines
> Norm of error 48.144, Iterations 137
>
> After 4000 steps, the solver takes 7.5 sec:
>
> KSP Object:
>   type: bicg
>   maximum iterations=10000, initial guess is zero
>   tolerances: relative=1e-07, absolute=1e-50, divergence=10000
>   left preconditioning
> PC Object:
>   type: bjacobi
>     block Jacobi: number of blocks = 16
>     Local solve is same for all blocks, in the following KSP and PC objects:
>   KSP Object:(sub_)
>     type: preonly
>     maximum iterations=10000, initial guess is zero
>     tolerances: relative=1e-05, absolute=1e-50, divergence=10000
>     left preconditioning
>   PC Object:(sub_)
>     type: ilu
>       ILU: 0 levels of fill
>       ILU: factor fill ratio allocated 1
>       ILU: tolerance for zero pivot 1e-12
>       ILU: using diagonal shift to prevent zero pivot
>       ILU: using diagonal shift on blocks to prevent zero pivot
>       out-of-place factorization
>       matrix ordering: natural
>       ILU: factor fill ratio needed 1
>       Factored matrix follows
>         Matrix Object:
>           type=seqaij, rows=5672, cols=5672
>           package used to perform factorization: petsc
>           total: nonzeros=39090, allocated nonzeros=39704
>           not using I-node routines
>     linear system matrix = precond matrix:
>     Matrix Object:
>       type=seqaij, rows=5672, cols=5672
>       total: nonzeros=39090, allocated nonzeros=39704
>       not using I-node routines
> linear system matrix = precond matrix:
> Matrix Object:
>   type=mpiaij, rows=90746, cols=90746
>   total: nonzeros=636378, allocated nonzeros=1279114
>   not using I-node (on process 0) routines
> Norm of error 48.7467, Iterations 132
>
>
> The iteration counts are similar, yet the solve time keeps growing as the
> run progresses, and the matrix should not be too complicated here, since
> the Fortran version of the same PETSc code solves it in about 1 sec.
>
You did not send the output of -log_summary. You will probably have to
segregate the solves into different stages in order to see the difference.
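For example, here is a minimal sketch of wrapping the repeated solve in its
own logging stage so that -log_summary reports it separately (the function
and stage names are illustrative, not from your code, and the calling
sequence assumes a petsc-3.0-style build):

  #include <petscksp.h>

  /* Wrap KSPSolve() in a named logging stage so -log_summary breaks the
     solve time out from the rest of the time step. */
  PetscErrorCode SolveWithStage(KSP ksp, Vec b, Vec x)
  {
    static PetscLogStage stage = -1;
    PetscErrorCode       ierr;

    if (stage < 0) {  /* register the stage once, after PetscInitialize() */
      ierr = PetscLogStageRegister("TimeStepSolve", &stage);CHKERRQ(ierr);
    }
    ierr = PetscLogStagePush(stage);CHKERRQ(ierr);
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);  /* the solve being timed */
    ierr = PetscLogStagePop();CHKERRQ(ierr);
    return 0;
  }

Comparing this stage between early and late steps will show whether the time
is going into the solver itself or into the surrounding code.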
I suspect:

a) you do not have something preallocated correctly (a sketch for this
follows the list),
b) you are not freeing something (run with -malloc_dump) and are thus
clogging memory, or
c) something apart from PETSc is taking the time.
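For (a), here is a minimal sketch of preallocating a parallel MPIAIJ matrix
like the one in your -ksp_view output; the function name and the per-row
counts (7 diagonal, 3 off-diagonal) are illustrative placeholders, so derive
the real counts from your stencil:

  #include <petscmat.h>

  /* Create an n x n MPIAIJ matrix with preallocated nonzero storage so
     that MatSetValues() inside the time-stepping loop never mallocs. */
  PetscErrorCode CreatePreallocatedMatrix(MPI_Comm comm, PetscInt n, Mat *A)
  {
    PetscErrorCode ierr;

    ierr = MatCreate(comm, A);CHKERRQ(ierr);
    ierr = MatSetSizes(*A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
    ierr = MatSetType(*A, MATMPIAIJ);CHKERRQ(ierr);
    /* scalar per-row estimates; exact d_nnz/o_nnz arrays are better */
    ierr = MatMPIAIJSetPreallocation(*A, 7, PETSC_NULL, 3, PETSC_NULL);CHKERRQ(ierr);
    return 0;
  }

For (b), make sure every object created inside the time-stepping loop is
destroyed there as well; running with -malloc_dump will list anything still
allocated at PetscFinalize().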
> By the way, will there be a way to set a PETSc vector directly into a
> preconditioner for the KSP solver?
>
I do not know what you mean here.
Matt
> Thanks!
>
> Best,
> Xiaoyin Ji
>
> Department of Materials Science and Engineering
> North Carolina State University
>
--
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener