[petsc-users] Enquiry regarding log summary results

TAY wee-beng zonexo at gmail.com
Sun Oct 7 16:45:04 CDT 2012


Hi,

I have attached 3 results using 12, 24 and 32 processors. I am using a
completely different cluster, to test whether the earlier problems were
due to the cluster configuration. It seems that VecScatterEnd does not
scale well from 12 to 24 to 32 processors. Do these results show that
there are still problems with the partition? My partition is clustered
closely at the center; I'm wondering if this has a great effect on scaling...
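For reference, the claim can be quantified directly from the attached logs. The sketch below (pure Python; `efficiency` is just an illustrative helper, not a PETSc routine, and the numbers are copied from the "Time (sec)" and VecScatterEnd lines of the 12- and 24-process logs) computes strong-scaling efficiency for the whole run versus VecScatterEnd alone:

```python
# Strong-scaling efficiency from the attached -log_summary output.
# Numbers are copied from the 12- and 24-process logs below; `efficiency`
# is an illustrative helper, not a PETSc routine.
def efficiency(t_base, p_base, t_new, p_new):
    # Ideal scaling keeps time * processes constant; < 1.0 means lost time.
    return (t_base * p_base) / (t_new * p_new)

overall = efficiency(1.199e3, 12, 5.807e2, 24)    # total "Time (sec)"
scatter = efficiency(6.0687e1, 12, 4.9513e1, 24)  # max VecScatterEnd time
print(f"overall: {overall:.2f}  VecScatterEnd: {scatter:.2f}")
```

The run as a whole scales almost perfectly (about 1.03) from 12 to 24 processes, while VecScatterEnd reaches only about 0.61, and its max/min time ratio grows from 3.1 to 5.3, which would be consistent with load imbalance from the partition.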

Tks!

Yours sincerely,

TAY wee-beng

On 6/10/2012 11:54 PM, Jed Brown wrote:
> On Sat, Oct 6, 2012 at 4:51 PM, TAY wee-beng <zonexo at gmail.com 
> <mailto:zonexo at gmail.com>> wrote:
>
>     This line happens rather late in the assembly, so it is not the
>     main matrix which I've created, right? And hence this doesn't matter?
>
>
> Look at all the lines talking about mallocs. If the first assembly 
> uses lots of mallocs, the first assembly will be slow.
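A toy model of that effect (pure Python; an analogy only, not PETSc's actual AIJ storage): each row gets a preallocated number of slots, and every insertion past that capacity forces a reallocation. In actual PETSc code the cure is to supply per-row nonzero counts up front, e.g. with MatMPIAIJSetPreallocation().

```python
# Toy model of matrix assembly with preallocation (an analogy, not PETSc's
# real AIJ storage): inserting past the preallocated capacity of a row
# forces a "malloc" (reallocate and copy), which is what makes the first
# assembly slow when preallocation is wrong.
def assemble_row(nnz, prealloc):
    mallocs, capacity, used = 0, prealloc, 0
    for _ in range(nnz):
        if used == capacity:   # out of preallocated space
            capacity += 1      # grow; the reallocation/copy is the slow part
            mallocs += 1
        used += 1
    return mallocs

print(assemble_row(7, 5))  # under-preallocated: 2 mallocs
print(assemble_row(7, 7))  # exact preallocation: 0 mallocs
```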

-------------- next part --------------


---------------------------------------------- PETSc Performance Summary: ----------------------------------------------

./a.out on a petsc-3.3-dev_atlas6_rel named atlas6-c39 with 12 processors, by g0306332 Mon Oct  8 03:58:56 2012
Using Petsc Development HG revision: 0b92fc173218fc24fd69e1ee041f811d50d4766c  HG Date: Fri Oct 05 15:20:44 2012 -0500

                         Max       Max/Min        Avg      Total 
Time (sec):           1.201e+03      1.00258   1.199e+03
Objects:              5.160e+02      1.00000   5.160e+02
Flops:                2.977e+11      1.11376   2.739e+11  3.287e+12
Flops/sec:            2.478e+08      1.11115   2.285e+08  2.742e+09
MPI Messages:         5.879e+04      2.96007   4.346e+04  5.215e+05
MPI Message Lengths:  4.194e+09      2.02856   8.822e+04  4.601e+10
MPI Reductions:       6.296e+03      1.00000

Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
                            e.g., VecAXPY() for real vectors of length N --> 2N flops
                            and VecAXPY() for complex vectors of length N --> 8N flops

Summary of Stages:   ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                        Avg     %Total     Avg     %Total   counts   %Total     Avg         %Total   counts   %Total 
 0:      Main Stage: 1.3933e+02  11.6%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00        0.0%  2.000e+01   0.3% 
 1:     poisson_eqn: 9.8694e+02  82.3%  3.1635e+12  96.3%  5.206e+05  99.8%  8.705e+04       98.7%  5.977e+03  94.9% 
 2:    momentum_eqn: 7.2420e+01   6.0%  1.2324e+11   3.7%  8.800e+02   0.2%  1.170e+03        1.3%  2.980e+02   4.7% 

------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
   Count: number of times phase was executed
   Time and Flops: Max - maximum over all processors
                   Ratio - ratio of maximum to minimum over all processors
   Mess: number of messages sent
   Avg. len: average message length
   Reduct: number of global reductions
   Global: entire computation
   Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
      %T - percent time in this phase         %f - percent flops in this phase
      %M - percent messages in this phase     %L - percent message lengths in this phase
      %R - percent reductions in this phase
   Total Mflop/s: 1e-6 * (sum of flops over all processors)/(max time over all processors)
------------------------------------------------------------------------------------------------------------------------
Event                Count      Time (sec)     Flops                             --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %f %M %L %R  %T %f %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage


--- Event Stage 1: poisson_eqn

MatMult            21340 1.0 5.8555e+02 1.1 1.91e+11 1.2 4.0e+05 1.1e+05 0.0e+00 47 62 77 93  0  58 65 77 94  0  3503
MatMultAdd          3280 1.0 4.0142e+01 1.2 1.26e+10 1.2 5.7e+04 1.9e+04 0.0e+00  3  4 11  2  0   4  4 11  2  0  3205
MatMultTranspose    3280 1.0 4.4090e+01 1.3 1.26e+10 1.2 5.7e+04 1.9e+04 0.0e+00  3  4 11  2  0   4  4 11  2  0  2918
MatSolve            1640 0.0 5.8434e-03 0.0 1.42e+06 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0   243
MatLUFactorSym         1 1.0 3.1948e-05 2.1 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatLUFactorNum         1 1.0 1.7881e-05 6.2 5.53e+03 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0   309
MatConvert             4 1.0 2.3882e-01 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 4.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatScale              12 1.0 2.8265e-01 1.1 5.26e+07 1.2 7.8e+01 1.0e+05 0.0e+00  0  0  0  0  0   0  0  0  0  0  1960
MatAssemblyBegin      69 1.0 1.6842e+00 2.8 0.00e+00 0.0 2.3e+02 1.8e+04 7.4e+01  0  0  0  0  1   0  0  0  0  1     0
MatAssemblyEnd        69 1.0 1.8641e+00 1.0 0.00e+00 0.0 1.0e+03 1.6e+04 2.0e+02  0  0  0  0  3   0  0  0  0  3     0
MatGetRow        3544348 1.0 4.4062e-01 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetRowIJ            1 0.0 6.9141e-06 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetOrdering         1 0.0 4.5061e-05 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.7e-01  0  0  0  0  0   0  0  0  0  0     0
MatCoarsen             4 1.0 6.5285e-01 1.2 0.00e+00 0.0 4.3e+02 1.4e+05 9.2e+01  0  0  0  0  1   0  0  0  0  2     0
MatAXPY                4 1.0 6.9771e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatMatMult             4 1.0 1.3333e+00 1.0 3.72e+07 1.2 4.9e+02 4.9e+04 9.6e+01  0  0  0  0  2   0  0  0  0  2   298
MatMatMultSym          4 1.0 1.0190e+00 1.0 0.00e+00 0.0 4.1e+02 3.9e+04 8.8e+01  0  0  0  0  1   0  0  0  0  1     0
MatMatMultNum          4 1.0 3.1801e-01 1.0 3.72e+07 1.2 7.8e+01 1.0e+05 8.0e+00  0  0  0  0  0   0  0  0  0  0  1249
MatPtAP                4 1.0 4.5583e+00 1.0 1.16e+09 1.8 8.4e+02 1.1e+05 1.1e+02  0  0  0  0  2   0  0  0  0  2  1881
MatPtAPSymbolic        4 1.0 2.8222e+00 1.0 0.00e+00 0.0 7.5e+02 9.3e+04 1.0e+02  0  0  0  0  2   0  0  0  0  2     0
MatPtAPNumeric         4 1.0 1.7362e+00 1.0 1.16e+09 1.8 8.9e+01 2.7e+05 8.0e+00  0  0  0  0  0   0  0  0  0  0  4939
MatTrnMatMult          4 1.0 1.2843e+01 1.0 2.04e+09 1.9 4.7e+02 6.6e+05 1.2e+02  1  1  0  1  2   1  1  0  1  2  1775
MatGetLocalMat        20 1.0 7.1283e-01 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 2.4e+01  0  0  0  0  0   0  0  0  0  0     0
MatGetBrAoCol         12 1.0 8.6687e-02 1.6 0.00e+00 0.0 5.5e+02 1.4e+05 1.6e+01  0  0  0  0  0   0  0  0  0  0     0
MatGetSymTrans         8 1.0 9.0438e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
KSPGMRESOrthog       860 1.0 8.2049e-01 1.1 3.90e+08 1.0 0.0e+00 0.0e+00 8.6e+02  0  0  0  0 14   0  0  0  0 14  5608
KSPSetUp              11 1.0 1.3758e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.0e+01  0  0  0  0  0   0  0  0  0  0     0
KSPSolve              20 1.0 9.8386e+02 1.0 2.88e+11 1.1 5.2e+05 8.7e+04 5.8e+03 82 96 100 99 93 100 100 100 100 98  3215
VecDot               800 1.0 1.2949e+01 2.7 2.57e+09 1.0 0.0e+00 0.0e+00 8.0e+02  1  1  0  0 13   1  1  0  0 13  2355
VecDotNorm2          400 1.0 1.3116e+01 2.2 5.14e+09 1.0 0.0e+00 0.0e+00 1.2e+03  1  2  0  0 19   1  2  0  0 20  4650
VecMDot              860 1.0 4.3056e-01 1.2 1.95e+08 1.0 0.0e+00 0.0e+00 8.6e+02  0  0  0  0 14   0  0  0  0 14  5343
VecNorm             2104 1.0 1.0177e+01 5.3 1.39e+09 1.0 0.0e+00 0.0e+00 2.1e+03  0  1  0  0 33   0  1  0  0 35  1618
VecScale           14804 1.0 2.2892e+01 1.1 5.83e+09 1.0 0.0e+00 0.0e+00 0.0e+00  2  2  0  0  0   2  2  0  0  0  3007
VecCopy             4144 1.0 7.4420e+00 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  1  0  0  0  0   1  0  0  0  0     0
VecSet             14075 1.0 6.1308e+00 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   1  0  0  0  0     0
VecAXPY            27064 1.0 7.9051e+01 1.2 2.33e+10 1.0 0.0e+00 0.0e+00 0.0e+00  6  8  0  0  0   8  9  0  0  0  3472
VecAYPX            26240 1.0 7.8797e+01 1.1 1.45e+10 1.0 0.0e+00 0.0e+00 0.0e+00  6  5  0  0  0   8  5  0  0  0  2176
VecAXPBYCZ           800 1.0 1.2601e+01 1.0 5.14e+09 1.0 0.0e+00 0.0e+00 0.0e+00  1  2  0  0  0   1  2  0  0  0  4840
VecWAXPY             800 1.0 1.2598e+01 1.0 2.57e+09 1.0 0.0e+00 0.0e+00 0.0e+00  1  1  0  0  0   1  1  0  0  0  2420
VecMAXPY            1684 1.0 4.7353e-01 1.0 2.30e+08 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0  5742
VecAssemblyBegin      87 1.0 4.7785e-01 2.4 0.00e+00 0.0 0.0e+00 0.0e+00 2.6e+02  0  0  0  0  4   0  0  0  0  4     0
VecAssemblyEnd        87 1.0 9.9659e-05 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecPointwiseMult   19724 1.0 8.5318e+01 1.1 8.74e+09 1.0 0.0e+00 0.0e+00 0.0e+00  7  3  0  0  0   8  3  0  0  0  1209
VecScatterBegin    27955 1.0 3.7788e+00 2.3 0.00e+00 0.0 5.2e+05 8.7e+04 0.0e+00  0  0 99 98  0   0  0 100 99  0     0
VecScatterEnd      27955 1.0 6.0687e+01 3.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  4  0  0  0  0   4  0  0  0  0     0
VecSetRandom           4 1.0 2.6621e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecNormalize        1684 1.0 8.1183e+00 52.0 5.86e+07 1.0 0.0e+00 0.0e+00 1.7e+03  0  0  0  0 27   0  0  0  0 28    85
PCSetUp                2 1.0 2.7046e+01 1.0 3.60e+09 1.1 4.4e+03 1.6e+05 9.6e+02  2  1  1  1 15   3  1  1  2 16  1555
PCSetUpOnBlocks      820 1.0 6.9976e-04 1.3 5.53e+03 0.0 0.0e+00 0.0e+00 5.0e+00  0  0  0  0  0   0  0  0  0  0     8
PCApply              820 1.0 8.5955e+02 1.0 2.52e+11 1.1 5.0e+05 8.1e+04 2.5e+03 71 83 96 88 39  86 86 96 89 41  3171
PCGAMGgraph_AGG        4 1.0 2.9757e+00 1.0 3.72e+07 1.2 2.3e+02 5.1e+04 7.6e+01  0  0  0  0  1   0  0  0  0  1   133
PCGAMGcoarse_AGG       4 1.0 1.4145e+01 1.0 2.04e+09 1.9 1.3e+03 3.4e+05 2.9e+02  1  1  0  1  5   1  1  0  1  5  1612
PCGAMGProl_AGG         4 1.0 1.0395e+00 1.0 0.00e+00 0.0 5.5e+02 6.4e+04 1.1e+02  0  0  0  0  2   0  0  0  0  2     0
PCGAMGPOpt_AGG         4 1.0 4.1134e+00 1.0 9.11e+08 1.1 1.3e+03 8.0e+04 2.1e+02  0  0  0  0  3   0  0  0  0  4  2503

--- Event Stage 2: momentum_eqn

MatMult               38 1.0 8.5567e+00 1.0 2.35e+09 1.0 8.4e+02 7.2e+05 0.0e+00  1  1  0  1  0  12 23 95 99  0  3260
MatSolve              57 1.0 1.3324e+01 1.1 3.51e+09 1.0 0.0e+00 0.0e+00 0.0e+00  1  1  0  0  0  18 34  0  0  0  3123
MatLUFactorNum        19 1.0 1.3298e+01 1.1 1.97e+09 1.0 0.0e+00 0.0e+00 0.0e+00  1  1  0  0  0  18 19  0  0  0  1753
MatILUFactorSym        1 1.0 5.8045e-01 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 1.0e+00  0  0  0  0  0   1  0  0  0  0     0
MatAssemblyBegin      19 1.0 3.3103e+00 51.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.8e+01  0  0  0  0  1   3  0  0  0 13     0
MatAssemblyEnd        19 1.0 4.1241e+00 1.0 0.00e+00 0.0 4.4e+01 1.8e+05 8.0e+00  0  0  0  0  0   6  0  5  1  3     0
MatGetRowIJ            1 1.0 1.9073e-06 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetOrdering         1 1.0 5.7246e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00  0  0  0  0  0   0  0  0  0  1     0
KSPSetUp              38 1.0 1.5740e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
KSPSolve              19 1.0 4.4298e+01 1.0 1.04e+10 1.0 8.4e+02 7.2e+05 1.4e+02  4  4  0  1  2  61 100 95 99 46  2782
VecDot                38 1.0 1.3129e+00 1.6 3.66e+08 1.0 0.0e+00 0.0e+00 3.8e+01  0  0  0  0  1   1  4  0  0 13  3310
VecDotNorm2           19 1.0 1.3210e+00 1.6 7.32e+08 1.0 0.0e+00 0.0e+00 5.7e+01  0  0  0  0  1   1  7  0  0 19  6579
VecNorm               38 1.0 1.5438e+00 4.7 3.66e+08 1.0 0.0e+00 0.0e+00 3.8e+01  0  0  0  0  1   1  4  0  0 13  2815
VecCopy               38 1.0 8.6146e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   1  0  0  0  0     0
VecSet               115 1.0 2.0613e+00 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   3  0  0  0  0     0
VecAXPBYCZ            38 1.0 1.8039e+00 1.0 7.32e+08 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   2  7  0  0  0  4818
VecWAXPY              38 1.0 1.8099e+00 1.0 3.66e+08 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   2  4  0  0  0  2401
VecAssemblyBegin      38 1.0 1.0454e-01 16.1 0.00e+00 0.0 0.0e+00 0.0e+00 1.1e+02  0  0  0  0  2   0  0  0  0 38     0
VecAssemblyEnd        38 1.0 4.7684e-05 1.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecScatterBegin       38 1.0 7.1190e-02 2.4 0.00e+00 0.0 8.4e+02 7.2e+05 0.0e+00  0  0  0  1  0   0  0 95 99  0     0
VecScatterEnd         38 1.0 2.8875e-01 2.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
PCSetUp               38 1.0 1.3929e+01 1.1 1.97e+09 1.0 0.0e+00 0.0e+00 5.0e+00  1  1  0  0  0  19 19  0  0  2  1674
PCSetUpOnBlocks       19 1.0 1.3929e+01 1.1 1.97e+09 1.0 0.0e+00 0.0e+00 3.0e+00  1  1  0  0  0  19 19  0  0  1  1674
PCApply               57 1.0 1.4274e+01 1.1 3.51e+09 1.0 0.0e+00 0.0e+00 0.0e+00  1  1  0  0  0  19 34  0  0  0  2915
------------------------------------------------------------------------------------------------------------------------

Memory usage is given in bytes:

Object Type          Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

              Matrix     6             48   2093863816     0
       Krylov Solver     2              9        27392     0
              Vector     4             63    485882816     0
      Vector Scatter     0             10        10360     0
           Index Set     0             10     19278416     0
      Preconditioner     0              9         8868     0
              Viewer     1              0            0     0

--- Event Stage 1: poisson_eqn

              Matrix   117             76   1488820380     0
      Matrix Coarsen     4              4         2448     0
       Krylov Solver    10              4       120512     0
              Vector   225            175    387306216     0
      Vector Scatter    31             22        22792     0
           Index Set    81             74       807916     0
      Preconditioner    11              4         3424     0
         PetscRandom     4              4         2432     0

--- Event Stage 2: momentum_eqn

              Matrix     1              0            0     0
       Krylov Solver     1              0            0     0
              Vector    10              1         1496     0
      Vector Scatter     1              0            0     0
           Index Set     5              2       361680     0
      Preconditioner     2              0            0     0
========================================================================================================================
Average time to get PetscTime(): 9.53674e-08
Average time for MPI_Barrier(): 9.20296e-06
Average time for zero size MPI_Send(): 2.16564e-06
#PETSc Option Table entries:
-log_summary
-poisson_pc_gamg_agg_nsmooths 1
-poisson_pc_type gamg
#End of PETSc Option Table entries
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
Configure run at: Sun Oct  7 16:51:24 2012
Configure options: --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-mpi-dir=/app1/mvapich2/current/ --with-blas-lapack-dir=/app1/intel/mkl/lib/intel64/ --with-batch=1 --with-debugging=0 --download-hypre=1 --prefix=/home/svu/g0306332/lib/petsc-3.3-dev_atlas6_rel --known-mpi-shared-libraries=0
-----------------------------------------
Libraries compiled on Sun Oct  7 16:51:24 2012 on atlas6-c01 
Machine characteristics: Linux-2.6.18-274.7.1.el5-x86_64-with-redhat-5.8-Final
Using PETSc directory: /home/svu/g0306332/codes/petsc-dev
Using PETSc arch: petsc-3.3-dev_atlas6_rel
-----------------------------------------

Using C compiler: /app1/mvapich2/current/bin/mpicc  -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -O  ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: /app1/mvapich2/current/bin/mpif90  -O3   ${FOPTFLAGS} ${FFLAGS} 
-----------------------------------------

Using include paths: -I/home/svu/g0306332/codes/petsc-dev/petsc-3.3-dev_atlas6_rel/include -I/home/svu/g0306332/codes/petsc-dev/include -I/home/svu/g0306332/codes/petsc-dev/include -I/home/svu/g0306332/codes/petsc-dev/petsc-3.3-dev_atlas6_rel/include -I/app1/mvapich2/current/include -I/app1/mvapich2/1.4/include
-----------------------------------------

Using C linker: /app1/mvapich2/current/bin/mpicc
Using Fortran linker: /app1/mvapich2/current/bin/mpif90
Using libraries: -Wl,-rpath,/home/svu/g0306332/codes/petsc-dev/petsc-3.3-dev_atlas6_rel/lib -L/home/svu/g0306332/codes/petsc-dev/petsc-3.3-dev_atlas6_rel/lib -lpetsc -lX11 -Wl,-rpath,/home/svu/g0306332/codes/petsc-dev/petsc-3.3-dev_atlas6_rel/lib -L/home/svu/g0306332/codes/petsc-dev/petsc-3.3-dev_atlas6_rel/lib -lHYPRE -L/app1/mvapich2/1.4/lib -L/nfs/app1/intel_2011/composer_xe_2011_sp1.6.233/compiler/lib/intel64 -L/app1/intel_2011/composer_xe_2011_sp1.6.233/compiler/lib/intel64 -L/app1/intel_2011/composer_xe_2011_sp1.6.233/ipp/lib/intel64 -L/app1/intel_2011/composer_xe_2011_sp1.6.233/mkl/lib/intel64 -L/app1/intel_2011/composer_xe_2011_sp1.6.233/tbb/lib/intel64/cc4.1.0_libc2.4_kernel2.6.16.21 -L/usr/lib/gcc/x86_64-redhat-linux/4.1.2 -lmpichcxx -lstdc++ -lpthread -Wl,-rpath,/app1/intel/mkl/lib/intel64 -L/app1/intel/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_intel_thread -lmkl_core -liomp5 -lpthread -lmpichf90 -lifport -lifcore -limf -lsvml -lm -lipgo -lirc -lirc_s -lm -lmpichcxx -lstdc++ -lmpichcxx -lstdc++ -ldl -lmpich -lpthread -lrdmacm -libverbs -libumad -lrt -lgcc_s -ldl 
-----------------------------------------

Job  /app1/common/lsf7/7.0/linux2.6-glibc2.3-x86_64/bin/mvapich_wrapper ./a.out -poisson_pc_gamg_agg_nsmooths 1 -poisson_pc_type gamg -log_summary

TID   HOST_NAME   COMMAND_LINE            STATUS            TERMINATION_TIME
===== ========== ================  =======================  ===================
00000 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 03:58:59
00001 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 03:58:59
00002 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 03:58:59
00003 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 03:58:59
00004 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 03:58:59
00005 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 03:58:59
00006 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 03:58:59
00007 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 03:58:59
00008 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 03:58:59
00009 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 03:58:59
00010 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 03:58:59
00011 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 03:58:59
-------------- next part --------------

************************************************************************************************************************
***             WIDEN YOUR WINDOW TO 120 CHARACTERS.  Use 'enscript -r -fCourier9' to print this document            ***
************************************************************************************************************************

---------------------------------------------- PETSc Performance Summary: ----------------------------------------------

./a.out on a petsc-3.3-dev_atlas6_rel named atlas6-c14 with 24 processors, by g0306332 Mon Oct  8 04:17:33 2012
Using Petsc Development HG revision: 0b92fc173218fc24fd69e1ee041f811d50d4766c  HG Date: Fri Oct 05 15:20:44 2012 -0500

                         Max       Max/Min        Avg      Total 
Time (sec):           5.824e+02      1.00339   5.807e+02
Objects:              5.160e+02      1.00000   5.160e+02
Flops:                1.380e+11      1.23808   1.148e+11  2.756e+12
Flops/sec:            2.370e+08      1.23418   1.978e+08  4.746e+09
MPI Messages:         6.066e+04      3.66281   3.892e+04  9.341e+05
MPI Message Lengths:  3.500e+09      2.02046   8.586e+04  8.021e+10
MPI Reductions:       5.548e+03      1.00000

Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
                            e.g., VecAXPY() for real vectors of length N --> 2N flops
                            and VecAXPY() for complex vectors of length N --> 8N flops

Summary of Stages:   ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                        Avg     %Total     Avg     %Total   counts   %Total     Avg         %Total   counts   %Total 
 0:      Main Stage: 1.1861e+02  20.4%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00        0.0%  2.000e+01   0.4% 
 1:     poisson_eqn: 4.2596e+02  73.4%  2.6333e+12  95.5%  9.323e+05  99.8%  8.450e+04       98.4%  5.229e+03  94.3% 
 2:    momentum_eqn: 3.6084e+01   6.2%  1.2276e+11   4.5%  1.840e+03   0.2%  1.366e+03        1.6%  2.980e+02   5.4% 

------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
   Count: number of times phase was executed
   Time and Flops: Max - maximum over all processors
                   Ratio - ratio of maximum to minimum over all processors
   Mess: number of messages sent
   Avg. len: average message length
   Reduct: number of global reductions
   Global: entire computation
   Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
      %T - percent time in this phase         %f - percent flops in this phase
      %M - percent messages in this phase     %L - percent message lengths in this phase
      %R - percent reductions in this phase
   Total Mflop/s: 1e-6 * (sum of flops over all processors)/(max time over all processors)
------------------------------------------------------------------------------------------------------------------------
Event                Count      Time (sec)     Flops                             --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %f %M %L %R  %T %f %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage


--- Event Stage 1: poisson_eqn

MatMult            17700 1.0 2.6405e+02 1.2 9.02e+10 1.3 7.1e+05 1.0e+05 0.0e+00 42 62 76 92  0  58 65 76 94  0  6455
MatMultAdd          2720 1.0 1.9278e+01 1.5 6.25e+09 1.5 1.1e+05 1.7e+04 0.0e+00  3  4 11  2  0   3  4 11  2  0  5553
MatMultTranspose    2720 1.0 2.1702e+01 1.5 6.25e+09 1.5 1.1e+05 1.7e+04 0.0e+00  3  4 11  2  0   4  4 11  2  0  4933
MatSolve            1360 0.0 4.2832e-03 0.0 1.30e+06 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0   303
MatLUFactorSym         1 1.0 3.8147e-05 2.3 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatLUFactorNum         1 1.0 2.1935e-05 7.7 6.48e+03 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0   295
MatConvert             4 1.0 1.2682e-01 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 4.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatScale              12 1.0 1.4481e-01 1.1 3.05e+07 1.4 1.7e+02 9.8e+04 0.0e+00  0  0  0  0  0   0  0  0  0  0  3835
MatAssemblyBegin      69 1.0 2.6819e+00 2.5 0.00e+00 0.0 5.3e+02 1.6e+04 7.4e+01  0  0  0  0  1   0  0  0  0  1     0
MatAssemblyEnd        69 1.0 1.0519e+00 1.0 0.00e+00 0.0 2.6e+03 1.3e+04 2.0e+02  0  0  0  0  4   0  0  0  0  4     0
MatGetRow        1821330 1.1 2.2507e-01 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetRowIJ            1 0.0 8.1062e-06 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetOrdering         1 0.0 5.1022e-05 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 8.3e-02  0  0  0  0  0   0  0  0  0  0     0
MatCoarsen             4 1.0 3.9578e-01 1.3 0.00e+00 0.0 3.1e+03 4.7e+04 1.8e+02  0  0  0  0  3   0  0  0  0  4     0
MatAXPY                4 1.0 3.4627e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatMatMult             4 1.0 7.5009e-01 1.0 2.13e+07 1.3 1.1e+03 4.4e+04 9.6e+01  0  0  0  0  2   0  0  0  0  2   530
MatMatMultSym          4 1.0 5.8047e-01 1.0 0.00e+00 0.0 9.7e+02 3.5e+04 8.8e+01  0  0  0  0  2   0  0  0  0  2     0
MatMatMultNum          4 1.0 1.7238e-01 1.0 2.13e+07 1.3 1.7e+02 9.8e+04 8.0e+00  0  0  0  0  0   0  0  0  0  0  2308
MatPtAP                4 1.0 3.0428e+00 1.0 8.45e+08 2.7 2.2e+03 9.0e+04 1.1e+02  1  0  0  0  2   1  0  0  0  2  2823
MatPtAPSymbolic        4 1.0 1.8848e+00 1.0 0.00e+00 0.0 2.0e+03 7.5e+04 1.0e+02  0  0  0  0  2   0  0  0  0  2     0
MatPtAPNumeric         4 1.0 1.1581e+00 1.0 8.45e+08 2.7 2.4e+02 2.1e+05 8.0e+00  0  0  0  0  0   0  0  0  0  0  7418
MatTrnMatMult          4 1.0 6.8553e+00 1.0 1.03e+09 3.2 1.1e+03 5.9e+05 1.2e+02  1  1  0  1  2   2  1  0  1  2  3304
MatGetLocalMat        20 1.0 3.6288e-01 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 2.4e+01  0  0  0  0  0   0  0  0  0  0     0
MatGetBrAoCol         12 1.0 7.6919e-02 1.6 0.00e+00 0.0 1.2e+03 1.4e+05 1.6e+01  0  0  0  0  0   0  0  0  0  0     0
MatGetSymTrans         8 1.0 4.5255e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
KSPGMRESOrthog       720 1.0 4.8121e-01 1.2 2.00e+08 1.1 0.0e+00 0.0e+00 7.2e+02  0  0  0  0 13   0  0  0  0 14  9566
KSPSetUp              11 1.0 7.0659e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.0e+01  0  0  0  0  0   0  0  0  0  0     0
KSPSolve              20 1.0 4.2393e+02 1.0 1.33e+11 1.3 9.3e+05 8.5e+04 5.1e+03 73 96 100 98 92 100 100 100 100 98  6212
VecDot               660 1.0 1.2440e+01 5.9 1.08e+09 1.0 0.0e+00 0.0e+00 6.6e+02  1  1  0  0 12   1  1  0  0 13  2022
VecDotNorm2          330 1.0 1.2494e+01 4.8 2.16e+09 1.0 0.0e+00 0.0e+00 9.9e+02  1  2  0  0 18   2  2  0  0 19  4027
VecMDot              720 1.0 2.8871e-01 1.4 1.00e+08 1.1 0.0e+00 0.0e+00 7.2e+02  0  0  0  0 13   0  0  0  0 14  7972
VecNorm             1754 1.0 1.3658e+01 19.0 5.93e+08 1.0 0.0e+00 0.0e+00 1.8e+03  1  1  0  0 32   1  1  0  0 34  1011
VecScale           12284 1.0 8.2996e+00 1.2 2.49e+09 1.1 0.0e+00 0.0e+00 0.0e+00  1  2  0  0  0   2  2  0  0  0  6885
VecCopy             3444 1.0 3.1909e+00 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  1  0  0  0  0   1  0  0  0  0     0
VecSet             11695 1.0 2.7659e+00 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   1  0  0  0  0     0
VecAXPY            22444 1.0 3.1521e+01 1.3 9.91e+09 1.1 0.0e+00 0.0e+00 0.0e+00  5  8  0  0  0   7  9  0  0  0  7223
VecAYPX            21760 1.0 3.1916e+01 1.2 6.19e+09 1.1 0.0e+00 0.0e+00 0.0e+00  5  5  0  0  0   7  5  0  0  0  4458
VecAXPBYCZ           660 1.0 5.2597e+00 1.1 2.16e+09 1.0 0.0e+00 0.0e+00 0.0e+00  1  2  0  0  0   1  2  0  0  0  9566
VecWAXPY             660 1.0 5.2671e+00 1.1 1.08e+09 1.0 0.0e+00 0.0e+00 0.0e+00  1  1  0  0  0   1  1  0  0  0  4776
VecMAXPY            1404 1.0 2.3818e-01 1.1 1.18e+08 1.1 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0 11420
VecAssemblyBegin     110 1.0 8.9732e-01 3.7 0.00e+00 0.0 0.0e+00 0.0e+00 3.2e+02  0  0  0  0  6   0  0  0  0  6     0
VecAssemblyEnd       110 1.0 1.0490e-04 1.6 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecPointwiseMult   16364 1.0 3.4136e+01 1.2 3.73e+09 1.1 0.0e+00 0.0e+00 0.0e+00  5  3  0  0  0   7  3  0  0  0  2507
VecScatterBegin    23218 1.0 3.1578e+00 2.3 0.00e+00 0.0 9.3e+05 8.4e+04 0.0e+00  0  0 99 97  0   1  0 99 99  0     0
VecScatterEnd      23218 1.0 4.9513e+01 5.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  5  0  0  0  0   7  0  0  0  0     0
VecSetRandom           4 1.0 1.3620e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecNormalize        1404 1.0 1.2207e+01 120.4 3.01e+07 1.1 0.0e+00 0.0e+00 1.4e+03  1  0  0  0 25   1  0  0  0 27    57
PCSetUp                2 1.0 1.5744e+01 1.0 1.83e+09 1.2 1.2e+04 1.2e+05 1.1e+03  3  2  1  2 19   4  2  1  2 20  2664
PCSetUpOnBlocks      680 1.0 6.5613e-04 1.6 6.48e+03 0.0 0.0e+00 0.0e+00 5.0e+00  0  0  0  0  0   0  0  0  0  0    10
PCApply              680 1.0 3.6728e+02 1.1 1.18e+11 1.3 8.9e+05 7.9e+04 2.1e+03 62 82 95 87 37  84 86 95 89 39  6165
PCGAMGgraph_AGG        4 1.0 1.5585e+00 1.0 2.13e+07 1.3 4.9e+02 4.9e+04 7.6e+01  0  0  0  0  1   0  0  0  0  1   255
PCGAMGcoarse_AGG       4 1.0 7.5525e+00 1.0 1.03e+09 3.2 5.2e+03 1.8e+05 3.8e+02  1  1  1  1  7   2  1  1  1  7  2999
PCGAMGProl_AGG         4 1.0 1.3036e+00 1.0 0.00e+00 0.0 1.3e+03 5.5e+04 1.1e+02  0  0  0  0  2   0  0  0  0  2     0
PCGAMGPOpt_AGG         4 1.0 2.1967e+00 1.0 4.93e+08 1.2 2.8e+03 7.6e+04 2.1e+02  0  0  0  0  4   1  0  0  0  4  4693

--- Event Stage 2: momentum_eqn

MatMult               38 1.0 4.4312e+00 1.1 1.20e+09 1.1 1.7e+03 7.2e+05 0.0e+00  1  1  0  2  0  12 23 95 99  0  6295
MatSolve              57 1.0 6.1924e+00 1.1 1.78e+09 1.1 0.0e+00 0.0e+00 0.0e+00  1  2  0  0  0  17 34  0  0  0  6680
MatLUFactorNum        19 1.0 6.1136e+00 1.1 9.93e+08 1.1 0.0e+00 0.0e+00 0.0e+00  1  1  0  0  0  16 19  0  0  0  3776
MatILUFactorSym        1 1.0 2.6631e-01 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 1.0e+00  0  0  0  0  0   1  0  0  0  0     0
MatAssemblyBegin      19 1.0 2.0705e+00 34.4 0.00e+00 0.0 0.0e+00 0.0e+00 3.8e+01  0  0  0  0  1   4  0  0  0 13     0
MatAssemblyEnd        19 1.0 2.1199e+00 1.0 0.00e+00 0.0 9.2e+01 1.8e+05 8.0e+00  0  0  0  0  0   6  0  5  1  3     0
MatGetRowIJ            1 1.0 2.1458e-06 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetOrdering         1 1.0 2.9305e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00  0  0  0  0  0   0  0  0  0  1     0
KSPSetUp              38 1.0 8.0219e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
KSPSolve              19 1.0 2.1292e+01 1.0 5.28e+09 1.1 1.7e+03 7.2e+05 1.4e+02  4  4  0  2  2  59 100 95 99 46  5766
VecDot                38 1.0 7.3666e-01 1.9 1.87e+08 1.0 0.0e+00 0.0e+00 3.8e+01  0  0  0  0  1   1  4  0  0 13  5899
VecDotNorm2           19 1.0 7.0598e-01 1.7 3.73e+08 1.0 0.0e+00 0.0e+00 5.7e+01  0  0  0  0  1   2  7  0  0 19 12310
VecNorm               38 1.0 9.6422e-01 6.6 1.87e+08 1.0 0.0e+00 0.0e+00 3.8e+01  0  0  0  0  1   1  4  0  0 13  4507
VecCopy               38 1.0 4.3037e-01 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   1  0  0  0  0     0
VecSet               115 1.0 1.0355e+00 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   3  0  0  0  0     0
VecAXPBYCZ            38 1.0 9.1814e-01 1.1 3.73e+08 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   2  7  0  0  0  9466
VecWAXPY              38 1.0 9.1795e-01 1.1 1.87e+08 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   2  4  0  0  0  4734
VecAssemblyBegin      38 1.0 8.9392e-0211.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.1e+02  0  0  0  0  2   0  0  0  0 38     0
VecAssemblyEnd        38 1.0 4.3392e-05 2.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecScatterBegin       38 1.0 5.8024e-02 2.2 0.00e+00 0.0 1.7e+03 7.2e+05 0.0e+00  0  0  0  2  0   0  0 95 99  0     0
VecScatterEnd         38 1.0 3.1747e-01 3.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
PCSetUp               38 1.0 6.4099e+00 1.1 9.93e+08 1.1 0.0e+00 0.0e+00 5.0e+00  1  1  0  0  0  17 19  0  0  2  3602
PCSetUpOnBlocks       19 1.0 6.4097e+00 1.1 9.93e+08 1.1 0.0e+00 0.0e+00 3.0e+00  1  1  0  0  0  17 19  0  0  1  3602
PCApply               57 1.0 6.6614e+00 1.1 1.78e+09 1.1 0.0e+00 0.0e+00 0.0e+00  1  2  0  0  0  18 34  0  0  0  6210
------------------------------------------------------------------------------------------------------------------------

Memory usage is given in bytes:

Object Type          Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

              Matrix     6             48   1076255464     0
       Krylov Solver     2              9        27392     0
              Vector     4             63    248561024     0
      Vector Scatter     0             10        10360     0
           Index Set     0             10      9824824     0
      Preconditioner     0              9         8868     0
              Viewer     1              0            0     0

--- Event Stage 1: poisson_eqn

              Matrix   117             76    734590032     0
      Matrix Coarsen     4              4         2448     0
       Krylov Solver    10              4       120512     0
              Vector   225            175    200939176     0
      Vector Scatter    31             22        22792     0
           Index Set    81             74       764972     0
      Preconditioner    11              4         3424     0
         PetscRandom     4              4         2432     0

--- Event Stage 2: momentum_eqn

              Matrix     1              0            0     0
       Krylov Solver     1              0            0     0
              Vector    10              1         1496     0
      Vector Scatter     1              0            0     0
           Index Set     5              2       361680     0
      Preconditioner     2              0            0     0
========================================================================================================================
Average time to get PetscTime(): 0
Average time for MPI_Barrier(): 2.36034e-05
Average time for zero size MPI_Send(): 2.29478e-06
#PETSc Option Table entries:
-log_summary
-poisson_pc_gamg_agg_nsmooths 1
-poisson_pc_type gamg
#End of PETSc Option Table entries
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
Configure run at: Sun Oct  7 16:51:24 2012
Configure options: --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-mpi-dir=/app1/mvapich2/current/ --with-blas-lapack-dir=/app1/intel/mkl/lib/intel64/ --with-batch=1 --with-debugging=0 --download-hypre=1 --prefix=/home/svu/g0306332/lib/petsc-3.3-dev_atlas6_rel --known-mpi-shared-libraries=0
-----------------------------------------
Libraries compiled on Sun Oct  7 16:51:24 2012 on atlas6-c01 
Machine characteristics: Linux-2.6.18-274.7.1.el5-x86_64-with-redhat-5.8-Final
Using PETSc directory: /home/svu/g0306332/codes/petsc-dev
Using PETSc arch: petsc-3.3-dev_atlas6_rel
-----------------------------------------

Using C compiler: /app1/mvapich2/current/bin/mpicc  -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -O  ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: /app1/mvapich2/current/bin/mpif90  -O3   ${FOPTFLAGS} ${FFLAGS} 
-----------------------------------------

Using include paths: -I/home/svu/g0306332/codes/petsc-dev/petsc-3.3-dev_atlas6_rel/include -I/home/svu/g0306332/codes/petsc-dev/include -I/home/svu/g0306332/codes/petsc-dev/include -I/home/svu/g0306332/codes/petsc-dev/petsc-3.3-dev_atlas6_rel/include -I/app1/mvapich2/current/include -I/app1/mvapich2/1.4/include
-----------------------------------------

Using C linker: /app1/mvapich2/current/bin/mpicc
Using Fortran linker: /app1/mvapich2/current/bin/mpif90
Using libraries: -Wl,-rpath,/home/svu/g0306332/codes/petsc-dev/petsc-3.3-dev_atlas6_rel/lib -L/home/svu/g0306332/codes/petsc-dev/petsc-3.3-dev_atlas6_rel/lib -lpetsc -lX11 -Wl,-rpath,/home/svu/g0306332/codes/petsc-dev/petsc-3.3-dev_atlas6_rel/lib -L/home/svu/g0306332/codes/petsc-dev/petsc-3.3-dev_atlas6_rel/lib -lHYPRE -L/app1/mvapich2/1.4/lib -L/nfs/app1/intel_2011/composer_xe_2011_sp1.6.233/compiler/lib/intel64 -L/app1/intel_2011/composer_xe_2011_sp1.6.233/compiler/lib/intel64 -L/app1/intel_2011/composer_xe_2011_sp1.6.233/ipp/lib/intel64 -L/app1/intel_2011/composer_xe_2011_sp1.6.233/mkl/lib/intel64 -L/app1/intel_2011/composer_xe_2011_sp1.6.233/tbb/lib/intel64/cc4.1.0_libc2.4_kernel2.6.16.21 -L/usr/lib/gcc/x86_64-redhat-linux/4.1.2 -lmpichcxx -lstdc++ -lpthread -Wl,-rpath,/app1/intel/mkl/lib/intel64 -L/app1/intel/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_intel_thread -lmkl_core -liomp5 -lpthread -lmpichf90 -lifport -lifcore -limf -lsvml -lm -lipgo -lirc -lirc_s -lm -lmpichcxx -lstdc++ -lmpichcxx -lstdc++ -ldl -lmpich -lpthread -lrdmacm -libverbs -libumad -lrt -lgcc_s -ldl 
-----------------------------------------

Job  /app1/common/lsf7/7.0/linux2.6-glibc2.3-x86_64/bin/mvapich_wrapper ./a.out -poisson_pc_gamg_agg_nsmooths 1 -poisson_pc_type gamg -log_summary

TID   HOST_NAME   COMMAND_LINE            STATUS            TERMINATION_TIME
===== ========== ================  =======================  ===================
00000 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:17:35
00001 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:17:35
00002 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:17:35
00003 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:17:35
00004 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:17:35
00005 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:17:35
00006 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:17:35
00007 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:17:35
00008 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:17:35
00009 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:17:35
00010 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:17:35
00011 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:17:35
00012 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:17:35
00013 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:17:35
00014 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:17:35
00015 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:17:35
00016 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:17:35
00017 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:17:35
00018 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:17:35
00019 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:17:35
00020 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:17:35
00021 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:17:35
00022 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:17:35
00023 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:17:35
-------------- next part --------------


---------------------------------------------- PETSc Performance Summary: ----------------------------------------------

./a.out on a petsc-3.3-dev_atlas6_rel named atlas6-c14 with 32 processors, by g0306332 Mon Oct  8 04:38:44 2012
Using Petsc Development HG revision: 0b92fc173218fc24fd69e1ee041f811d50d4766c  HG Date: Fri Oct 05 15:20:44 2012 -0500

                         Max       Max/Min        Avg      Total 
Time (sec):           5.793e+02      1.00264   5.781e+02
Objects:              5.160e+02      1.00000   5.160e+02
Flops:                1.280e+11      1.26041   1.076e+11  3.442e+12
Flops/sec:            2.210e+08      1.25710   1.861e+08  5.955e+09
MPI Messages:         8.329e+04      3.19963   5.696e+04  1.823e+06
MPI Message Lengths:  4.386e+09      2.01368   7.429e+04  1.354e+11
MPI Reductions:       6.648e+03      1.00000

Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
                            e.g., VecAXPY() for real vectors of length N --> 2N flops
                            and VecAXPY() for complex vectors of length N --> 8N flops

Summary of Stages:   ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                        Avg     %Total     Avg     %Total   counts   %Total     Avg         %Total   counts   %Total 
 0:      Main Stage: 1.4747e+02  25.5%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00        0.0%  2.000e+01   0.3% 
 1:     poisson_eqn: 4.0301e+02  69.7%  3.3200e+12  96.4%  1.820e+06  99.9%  7.334e+04       98.7%  6.329e+03  95.2% 
 2:    momentum_eqn: 2.7602e+01   4.8%  1.2245e+11   3.6%  2.480e+03   0.1%  9.434e+02        1.3%  2.980e+02   4.5% 

------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
   Count: number of times phase was executed
   Time and Flops: Max - maximum over all processors
                   Ratio - ratio of maximum to minimum over all processors
   Mess: number of messages sent
   Avg. len: average message length
   Reduct: number of global reductions
   Global: entire computation
   Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
      %T - percent time in this phase         %f - percent flops in this phase
      %M - percent messages in this phase     %L - percent message lengths in this phase
      %R - percent reductions in this phase
   Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors)
------------------------------------------------------------------------------------------------------------------------
Event                Count      Time (sec)     Flops                             --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %f %M %L %R  %T %f %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage


--- Event Stage 1: poisson_eqn

MatMult            22328 1.0 2.4333e+02 1.1 8.47e+10 1.3 1.5e+06 8.6e+04 0.0e+00 40 63 80 93  0  57 65 80 94  0  8859
MatMultAdd          3432 1.0 1.7052e+01 1.5 5.96e+09 1.5 1.7e+05 1.8e+04 0.0e+00  2  4 10  2  0   4  4 10  2  0  7952
MatMultTranspose    3432 1.0 2.4190e+01 2.0 5.96e+09 1.5 1.7e+05 1.8e+04 0.0e+00  3  4 10  2  0   4  4 10  2  0  5605
MatSolve            1716 0.0 6.4962e-03 0.0 1.62e+06 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0   249
MatLUFactorSym         1 1.0 3.4094e-05 2.1 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatLUFactorNum         1 1.0 2.0027e-05 7.0 6.24e+03 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0   311
MatConvert             4 1.0 9.4268e-02 1.5 0.00e+00 0.0 0.0e+00 0.0e+00 4.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatScale              12 1.0 1.0746e-01 1.6 2.28e+07 1.4 2.7e+02 8.1e+04 0.0e+00  0  0  0  0  0   0  0  0  0  0  5182
MatAssemblyBegin      69 1.0 2.4791e+00 3.8 0.00e+00 0.0 7.1e+02 1.6e+04 7.4e+01  0  0  0  0  1   0  0  0  0  1     0
MatAssemblyEnd        69 1.0 9.1389e-01 1.1 0.00e+00 0.0 3.8e+03 1.2e+04 2.0e+02  0  0  0  0  3   0  0  0  0  3     0
MatGetRow        1358404 1.1 1.6604e-01 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetRowIJ            1 0.0 7.1526e-06 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetOrdering         1 0.0 4.6015e-05 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 6.2e-02  0  0  0  0  0   0  0  0  0  0     0
MatCoarsen             4 1.0 3.2097e-01 1.3 0.00e+00 0.0 5.8e+03 3.5e+04 2.2e+02  0  0  0  0  3   0  0  0  0  3     0
MatAXPY                4 1.0 2.6504e-02 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatMatMult             4 1.0 5.9873e-01 1.0 1.59e+07 1.4 1.8e+03 3.8e+04 9.6e+01  0  0  0  0  1   0  0  0  0  2   666
MatMatMultSym          4 1.0 4.7245e-01 1.0 0.00e+00 0.0 1.5e+03 3.1e+04 8.8e+01  0  0  0  0  1   0  0  0  0  1     0
MatMatMultNum          4 1.0 1.2957e-01 1.0 1.59e+07 1.4 2.7e+02 8.1e+04 8.0e+00  0  0  0  0  0   0  0  0  0  0  3078
MatPtAP                4 1.0 2.4452e+00 1.0 6.76e+08 3.0 3.4e+03 7.9e+04 1.1e+02  0  0  0  0  2   1  0  0  0  2  3527
MatPtAPSymbolic        4 1.0 1.5169e+00 1.1 0.00e+00 0.0 3.1e+03 6.5e+04 1.0e+02  0  0  0  0  2   0  0  0  0  2     0
MatPtAPNumeric         4 1.0 9.2885e-01 1.0 6.76e+08 3.0 3.4e+02 2.0e+05 8.0e+00  0  0  0  0  0   0  0  0  0  0  9285
MatTrnMatMult          4 1.0 5.3296e+00 1.0 7.66e+08 3.9 1.5e+03 5.5e+05 1.2e+02  1  1  0  1  2   1  1  0  1  2  4241
MatGetLocalMat        20 1.0 2.7215e-01 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 2.4e+01  0  0  0  0  0   0  0  0  0  0     0
MatGetBrAoCol         12 1.0 9.9058e-02 2.2 0.00e+00 0.0 1.9e+03 1.2e+05 1.6e+01  0  0  0  0  0   0  0  0  0  0     0
MatGetSymTrans         8 1.0 3.3771e-02 1.6 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
KSPGMRESOrthog       898 1.0 4.8392e-01 1.4 1.50e+08 1.1 0.0e+00 0.0e+00 9.0e+02  0  0  0  0 14   0  0  0  0 14  9516
KSPSetUp              11 1.0 5.3839e-02 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 1.0e+01  0  0  0  0  0   0  0  0  0  0     0
KSPSolve              20 1.0 4.0076e+02 1.0 1.24e+11 1.3 1.8e+06 7.3e+04 6.2e+03 69 96100 99 93  99100100100 98  8284
VecDot               838 1.0 1.9926e+0110.1 1.02e+09 1.1 0.0e+00 0.0e+00 8.4e+02  1  1  0  0 13   2  1  0  0 13  1603
VecDotNorm2          419 1.0 1.9897e+01 8.4 2.03e+09 1.1 0.0e+00 0.0e+00 1.3e+03  1  2  0  0 19   2  2  0  0 20  3211
VecMDot              898 1.0 3.8118e-01 1.9 7.48e+07 1.1 0.0e+00 0.0e+00 9.0e+02  0  0  0  0 14   0  0  0  0 14  6040
VecNorm             2199 1.0 1.4204e+0118.3 5.47e+08 1.1 0.0e+00 0.0e+00 2.2e+03  1  0  0  0 33   1  1  0  0 35  1211
VecScale           15488 1.0 9.1676e+00 2.1 2.34e+09 1.1 0.0e+00 0.0e+00 0.0e+00  1  2  0  0  0   2  2  0  0  0  7861
VecCopy             4334 1.0 3.1940e+00 1.7 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   1  0  0  0  0     0
VecSet             14721 1.0 2.4871e+00 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   1  0  0  0  0     0
VecAXPY            28318 1.0 3.1641e+01 1.9 9.33e+09 1.1 0.0e+00 0.0e+00 0.0e+00  5  8  0  0  0   7  9  0  0  0  9083
VecAYPX            27456 1.0 3.1435e+01 1.6 5.83e+09 1.1 0.0e+00 0.0e+00 0.0e+00  5  5  0  0  0   7  5  0  0  0  5713
VecAXPBYCZ           838 1.0 4.9286e+00 1.3 2.03e+09 1.1 0.0e+00 0.0e+00 0.0e+00  1  2  0  0  0   1  2  0  0  0 12962
VecWAXPY             838 1.0 4.9716e+00 1.3 1.02e+09 1.1 0.0e+00 0.0e+00 0.0e+00  1  1  0  0  0   1  1  0  0  0  6425
VecMAXPY            1760 1.0 1.7820e-01 1.5 8.84e+07 1.1 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0 15270
VecAssemblyBegin     118 1.0 1.9584e+00 6.7 0.00e+00 0.0 0.0e+00 0.0e+00 3.5e+02  0  0  0  0  5   0  0  0  0  5     0
VecAssemblyEnd       118 1.0 1.0204e-04 1.6 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecPointwiseMult   20636 1.0 3.3645e+01 1.7 3.50e+09 1.1 0.0e+00 0.0e+00 0.0e+00  5  3  0  0  0   7  3  0  0  0  3210
VecScatterBegin    29278 1.0 4.1807e+00 2.5 0.00e+00 0.0 1.8e+06 7.3e+04 0.0e+00  1  0 99 98  0   1  0100 99  0     0
VecScatterEnd      29278 1.0 1.0144e+0214.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  6  0  0  0  0   8  0  0  0  0     0
VecSetRandom           4 1.0 1.0141e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecNormalize        1760 1.0 1.1541e+01158.2 2.25e+07 1.1 0.0e+00 0.0e+00 1.8e+03  1  0  0  0 26   1  0  0  0 28    60
PCSetUp                2 1.0 1.2079e+01 1.0 1.36e+09 1.2 2.0e+04 9.8e+04 1.1e+03  2  1  1  1 16   3  1  1  1 17  3473
PCSetUpOnBlocks      858 1.0 8.4829e-04 1.7 6.24e+03 0.0 0.0e+00 0.0e+00 5.0e+00  0  0  0  0  0   0  0  0  0  0     7
PCApply              858 1.0 3.4936e+02 1.1 1.11e+11 1.3 1.7e+06 6.8e+04 2.6e+03 59 83 96 88 39  84 86 96 89 41  8195
PCGAMGgraph_AGG        4 1.0 1.1721e+00 1.0 1.59e+07 1.4 7.4e+02 4.4e+04 7.6e+01  0  0  0  0  1   0  0  0  0  1   340
PCGAMGcoarse_AGG       4 1.0 5.8811e+00 1.0 7.66e+08 3.9 8.7e+03 1.4e+05 4.1e+02  1  1  0  1  6   1  1  0  1  7  3843
PCGAMGProl_AGG         4 1.0 8.9037e-01 1.0 0.00e+00 0.0 1.8e+03 5.4e+04 1.1e+02  0  0  0  0  2   0  0  0  0  2     0
PCGAMGPOpt_AGG         4 1.0 1.6679e+00 1.0 3.64e+08 1.2 4.5e+03 6.4e+04 2.1e+02  0  0  0  0  3   0  0  0  0  3  6189

--- Event Stage 2: momentum_eqn

MatMult               38 1.0 3.3929e+00 1.3 8.88e+08 1.1 2.4e+03 7.2e+05 0.0e+00  1  1  0  1  0  12 23 95 99  0  8221
MatSolve              57 1.0 4.6989e+00 1.4 1.31e+09 1.1 0.0e+00 0.0e+00 0.0e+00  1  1  0  0  0  16 34  0  0  0  8769
MatLUFactorNum        19 1.0 4.5038e+00 1.1 7.31e+08 1.1 0.0e+00 0.0e+00 0.0e+00  1  1  0  0  0  16 19  0  0  0  5093
MatILUFactorSym        1 1.0 1.9833e-01 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 1.0e+00  0  0  0  0  0   1  0  0  0  0     0
MatAssemblyBegin      19 1.0 3.3288e+0018.8 0.00e+00 0.0 0.0e+00 0.0e+00 3.8e+01  0  0  0  0  1   5  0  0  0 13     0
MatAssemblyEnd        19 1.0 1.6178e+00 1.2 0.00e+00 0.0 1.2e+02 1.8e+05 8.0e+00  0  0  0  0  0   6  0  5  1  3     0
MatGetRowIJ            1 1.0 2.1458e-06 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetOrdering         1 1.0 2.2255e-02 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00  0  0  0  0  0   0  0  0  0  1     0
KSPSetUp              38 1.0 5.9859e-02 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
KSPSolve              19 1.0 1.6160e+01 1.0 3.90e+09 1.1 2.4e+03 7.2e+05 1.4e+02  3  4  0  1  2  59100 95 99 46  7577
VecDot                38 1.0 1.3321e+00 4.0 1.38e+08 1.1 0.0e+00 0.0e+00 3.8e+01  0  0  0  0  1   2  4  0  0 13  3262
VecDotNorm2           19 1.0 1.2704e+00 3.6 2.76e+08 1.1 0.0e+00 0.0e+00 5.7e+01  0  0  0  0  1   2  7  0  0 19  6841
VecNorm               38 1.0 1.5091e+00 9.8 1.38e+08 1.1 0.0e+00 0.0e+00 3.8e+01  0  0  0  0  1   2  4  0  0 13  2879
VecCopy               38 1.0 3.3406e-01 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   1  0  0  0  0     0
VecSet               115 1.0 7.8406e-01 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   3  0  0  0  0     0
VecAXPBYCZ            38 1.0 6.9826e-01 1.4 2.76e+08 1.1 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   2  7  0  0  0 12446
VecWAXPY              38 1.0 6.8325e-01 1.3 1.38e+08 1.1 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   2  4  0  0  0  6360
VecAssemblyBegin      38 1.0 4.2442e-0143.9 0.00e+00 0.0 0.0e+00 0.0e+00 1.1e+02  0  0  0  0  2   0  0  0  0 38     0
VecAssemblyEnd        38 1.0 4.1008e-05 2.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecScatterBegin       38 1.0 5.7228e-02 2.5 0.00e+00 0.0 2.4e+03 7.2e+05 0.0e+00  0  0  0  1  0   0  0 95 99  0     0
VecScatterEnd         38 1.0 4.9354e-01 4.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   1  0  0  0  0     0
PCSetUp               38 1.0 4.7225e+00 1.1 7.31e+08 1.1 0.0e+00 0.0e+00 5.0e+00  1  1  0  0  0  16 19  0  0  2  4857
PCSetUpOnBlocks       19 1.0 4.7223e+00 1.1 7.31e+08 1.1 0.0e+00 0.0e+00 3.0e+00  1  1  0  0  0  16 19  0  0  1  4857
PCApply               57 1.0 5.0658e+00 1.4 1.31e+09 1.1 0.0e+00 0.0e+00 0.0e+00  1  1  0  0  0  17 34  0  0  0  8133
------------------------------------------------------------------------------------------------------------------------

Memory usage is given in bytes:

Object Type          Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

              Matrix     6             48    802816820     0
       Krylov Solver     2              9        27392     0
              Vector     4             63    184694312     0
      Vector Scatter     0             10        10360     0
           Index Set     0             10      7279624     0
      Preconditioner     0              9         8868     0
              Viewer     1              0            0     0

--- Event Stage 1: poisson_eqn

              Matrix   117             76    536993068     0
      Matrix Coarsen     4              4         2448     0
       Krylov Solver    10              4       120512     0
              Vector   225            175    150834928     0
      Vector Scatter    31             22        22792     0
           Index Set    81             74       749444     0
      Preconditioner    11              4         3424     0
         PetscRandom     4              4         2432     0

--- Event Stage 2: momentum_eqn

              Matrix     1              0            0     0
       Krylov Solver     1              0            0     0
              Vector    10              1         1496     0
      Vector Scatter     1              0            0     0
           Index Set     5              2       361680     0
      Preconditioner     2              0            0     0
========================================================================================================================
Average time to get PetscTime(): 9.53674e-08
Average time for MPI_Barrier(): 2.57969e-05
Average time for zero size MPI_Send(): 2.34693e-06
#PETSc Option Table entries:
-log_summary
-poisson_pc_gamg_agg_nsmooths 1
-poisson_pc_type gamg
#End of PETSc Option Table entries
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
Configure run at: Sun Oct  7 16:51:24 2012
Configure options: --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-mpi-dir=/app1/mvapich2/current/ --with-blas-lapack-dir=/app1/intel/mkl/lib/intel64/ --with-batch=1 --with-debugging=0 --download-hypre=1 --prefix=/home/svu/g0306332/lib/petsc-3.3-dev_atlas6_rel --known-mpi-shared-libraries=0
-----------------------------------------
Libraries compiled on Sun Oct  7 16:51:24 2012 on atlas6-c01 
Machine characteristics: Linux-2.6.18-274.7.1.el5-x86_64-with-redhat-5.8-Final
Using PETSc directory: /home/svu/g0306332/codes/petsc-dev
Using PETSc arch: petsc-3.3-dev_atlas6_rel
-----------------------------------------

Using C compiler: /app1/mvapich2/current/bin/mpicc  -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -O  ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: /app1/mvapich2/current/bin/mpif90  -O3   ${FOPTFLAGS} ${FFLAGS} 
-----------------------------------------

Using include paths: -I/home/svu/g0306332/codes/petsc-dev/petsc-3.3-dev_atlas6_rel/include -I/home/svu/g0306332/codes/petsc-dev/include -I/home/svu/g0306332/codes/petsc-dev/include -I/home/svu/g0306332/codes/petsc-dev/petsc-3.3-dev_atlas6_rel/include -I/app1/mvapich2/current/include -I/app1/mvapich2/1.4/include
-----------------------------------------

Using C linker: /app1/mvapich2/current/bin/mpicc
Using Fortran linker: /app1/mvapich2/current/bin/mpif90
Using libraries: -Wl,-rpath,/home/svu/g0306332/codes/petsc-dev/petsc-3.3-dev_atlas6_rel/lib -L/home/svu/g0306332/codes/petsc-dev/petsc-3.3-dev_atlas6_rel/lib -lpetsc -lX11 -Wl,-rpath,/home/svu/g0306332/codes/petsc-dev/petsc-3.3-dev_atlas6_rel/lib -L/home/svu/g0306332/codes/petsc-dev/petsc-3.3-dev_atlas6_rel/lib -lHYPRE -L/app1/mvapich2/1.4/lib -L/nfs/app1/intel_2011/composer_xe_2011_sp1.6.233/compiler/lib/intel64 -L/app1/intel_2011/composer_xe_2011_sp1.6.233/compiler/lib/intel64 -L/app1/intel_2011/composer_xe_2011_sp1.6.233/ipp/lib/intel64 -L/app1/intel_2011/composer_xe_2011_sp1.6.233/mkl/lib/intel64 -L/app1/intel_2011/composer_xe_2011_sp1.6.233/tbb/lib/intel64/cc4.1.0_libc2.4_kernel2.6.16.21 -L/usr/lib/gcc/x86_64-redhat-linux/4.1.2 -lmpichcxx -lstdc++ -lpthread -Wl,-rpath,/app1/intel/mkl/lib/intel64 -L/app1/intel/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_intel_thread -lmkl_core -liomp5 -lpthread -lmpichf90 -lifport -lifcore -limf -lsvml -lm -lipgo -lirc -lirc_s -lm -lmpichcxx -lstdc++ -lmpichcxx -lstdc++ -ldl -lmpich -lpthread -lrdmacm -libverbs -libumad -lrt -lgcc_s -ldl 
-----------------------------------------

Job  /app1/common/lsf7/7.0/linux2.6-glibc2.3-x86_64/bin/mvapich_wrapper ./a.out -poisson_pc_gamg_agg_nsmooths 1 -poisson_pc_type gamg -log_summary

TID   HOST_NAME   COMMAND_LINE            STATUS            TERMINATION_TIME
===== ========== ================  =======================  ===================
00000 atlas6-c43 ./a.out -poisson  Done                     10/08/2012 04:38:45
00001 atlas6-c43 ./a.out -poisson  Done                     10/08/2012 04:38:45
00002 atlas6-c43 ./a.out -poisson  Done                     10/08/2012 04:38:45
00003 atlas6-c43 ./a.out -poisson  Done                     10/08/2012 04:38:45
00004 atlas6-c43 ./a.out -poisson  Done                     10/08/2012 04:38:45
00005 atlas6-c43 ./a.out -poisson  Done                     10/08/2012 04:38:45
00006 atlas6-c43 ./a.out -poisson  Done                     10/08/2012 04:38:45
00007 atlas6-c43 ./a.out -poisson  Done                     10/08/2012 04:38:45
00008 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:38:45
00009 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:38:45
00010 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:38:45
00011 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:38:45
00012 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:38:45
00013 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:38:45
00014 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:38:45
00015 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:38:45
00016 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:38:45
00017 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:38:45
00018 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:38:45
00019 atlas6-c39 ./a.out -poisson  Done                     10/08/2012 04:38:45
00020 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:38:45
00021 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:38:45
00022 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:38:45
00023 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:38:45
00024 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:38:45
00025 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:38:45
00026 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:38:45
00027 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:38:45
00028 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:38:45
00029 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:38:45
00030 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:38:45
00031 atlas6-c14 ./a.out -poisson  Done                     10/08/2012 04:38:45