[petsc-users] Scaling/Preconditioners for Poisson equation
Jed Brown
jed at jedbrown.org
Wed Oct 1 07:23:03 CDT 2014
Filippo Leonardi <filippo.leonardi at sam.math.ethz.ch> writes:
> I am actually having a hard time figuring out where I am spending my time.
>
> Reading the report, I am spending time in KSPSolve and PCApply (e+02). Since
> the number of those operations is well under control, I guess some
> communication is the bottleneck.
>
> The lines:
> VecScatterBegin 4097 1.0 2.5168e+01 3.2 0.00e+00 0.0 2.9e+09 3.7e+01 0.0e+00  3  0 87 39  0  10  0 100 100  0     0
> VecScatterEnd   4097 1.0 1.7736e+02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 25  0  0  0  0  88  0   0   0  0     0
> are probably what is slowing down the solution.
How large is your problem?
> Also, times do not add up properly, especially in KSPSolve.
Timings are inclusive, not exclusive: the time reported for KSPSolve already
contains the time spent inside PCApply and the VecScatters it calls.
I don't know what is going on with having the same event appear many times
within a stage, but I remember fixing an issue years ago that might have
caused it, and I'd appreciate it if you would upgrade to the current version
of PETSc.
What is your problem size and how many processors are you running on?
> PS: until now I was writing output in VTK. I guess it is better to output in
> PETSc binary?
Yes. You can use the binary-appended VTK viewer (please upgrade to the
current version of PETSc) or the PETSc binary viewer.
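
For reference, a minimal sketch of writing a DMDA vector with the
binary-appended VTK viewer (not from the original thread; the grid sizes and
file name are placeholders, and error checking with CHKERRQ is omitted for
brevity):

  #include <petscdmda.h>

  int main(int argc, char **argv)
  {
    DM          da;
    Vec         u;
    PetscViewer vtk;

    PetscInitialize(&argc, &argv, NULL, NULL);
    /* Placeholder 3D structured grid standing in for the Poisson mesh. */
    DMDACreate3d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                 DMDA_STENCIL_STAR, 32, 32, 32, PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                 1, 1, NULL, NULL, NULL, &da);
    DMSetFromOptions(da);
    DMSetUp(da);
    DMCreateGlobalVector(da, &u);
    VecSet(u, 1.0);

    /* The .vts extension selects the binary-appended XML structured-grid format. */
    PetscViewerVTKOpen(PETSC_COMM_WORLD, "solution.vts", FILE_MODE_WRITE, &vtk);
    VecView(u, vtk);
    PetscViewerDestroy(&vtk);

    VecDestroy(&u);
    DMDestroy(&da);
    PetscFinalize();
    return 0;
  }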
> Is it better to output from PETSC_COMM_SELF (i.e. each processor
> individually)?
No, use collective I/O.
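
As a sketch of the collective path (again not from the original mail; "u" is
assumed to be an existing distributed Vec), open a single viewer on
PETSC_COMM_WORLD so every rank participates in one write, and read the file
back later with VecLoad:

  PetscViewer viewer;

  /* All ranks write collectively to one file. */
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, "u.dat", FILE_MODE_WRITE, &viewer);
  VecView(u, viewer);
  PetscViewerDestroy(&viewer);

  /* Read it back, possibly on a different number of processes. */
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, "u.dat", FILE_MODE_READ, &viewer);
  VecLoad(u, viewer);
  PetscViewerDestroy(&viewer);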