[petsc-users] KSP solver for single process

Alan zhenglun.wei at gmail.com
Tue Aug 6 14:56:35 CDT 2013


    Thanks for the replies. I have attached the -log_summary output for the 
large (600 x 240) and small (300 x 120) problems; the large problem has 4 
times as many DoFs as the small one. A few observations:
1. the total number of iterations changes very little from the small 
problem to the large one (between 32 and 36 in all four solves);
2. the time spent in KSPSolve() for the large problem is less than 4 times 
that for the small problem (about 4.5 s vs. 1.3 s per solve, roughly 3.6x);
3. the time spent in PCSetUp() for the large problem is more than 10 times 
that for the small problem (about 53 s vs. 4.5 s, roughly 12x);
4. the time spent in PCGAMGProl_AGG for the large problem is more than 20 
times that for the small problem (about 41 s vs. 1.5 s, roughly 28x).
    In my code, the Poisson equation is solved twice with different RHS 
vectors; the observations above are consistent across both solves. (Note 
that both runs used a debug build, as the warning in the logs points out, 
so the absolute timings are inflated.)
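    Incidentally, the logs show PCSetUp() taking roughly 88% of each solve 
stage, and the setup is repeated for the second solve even though the 
operator is unchanged. A minimal sketch of how the setup could be paid 
once and reused (not my actual code; A, b1, b2, and x are illustrative 
placeholders assembled elsewhere):

#include <petscksp.h>

/* Sketch: pay the GAMG setup cost once, reuse it for the second RHS. */
PetscErrorCode SolveTwice(Mat A, Vec b1, Vec b2, Vec x)
{
  KSP            ksp;
  PetscErrorCode ierr;

  ierr = KSPCreate(PETSC_COMM_SELF,&ksp);CHKERRQ(ierr);
  /* petsc-dev of this vintage takes a MatStructure flag;
     PETSc >= 3.5 drops the last argument */
  ierr = KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); /* honors -pc_type gamg etc. */
  ierr = KSPSolve(ksp,b1,x);CHKERRQ(ierr);     /* PCSetUp runs here         */
  ierr = KSPSolve(ksp,b2,x);CHKERRQ(ierr);     /* setup reused, not redone  */
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  return 0;
}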
    Do these observations indicate that I should switch my PC from GAMG to 
geometric MG for solving the Poisson equation in a single process?
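    If geometric multigrid is worth trying, this is the sort of option set 
I would start from (a sketch only: the value of -pc_mg_levels and the use 
of -pc_mg_galerkin are my guesses, and the grid dimensions must be 
compatible with the number of coarsenings):

./ex29 -ksp_type cg -ksp_rtol 1.0e-7 \
       -pc_type mg -pc_mg_levels 4 -pc_mg_galerkin \
       -mg_levels_ksp_type richardson -mg_levels_ksp_max_it 1

Since ex29 attaches a DMDA to the KSP, -pc_type mg should construct the 
interpolation geometrically from the grid rather than algebraically, 
avoiding the PCGAMGProl_AGG cost entirely.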

best,
Alan

> On Tue, Aug 6, 2013 at 2:31 PM, Karl Rupp <rupp at mcs.anl.gov> wrote:
>
>     Hi Alan,
>
>     please use -log_summary to get profiling information on the run.
>     What is the bottleneck? Is it the number of solver iterations
>     increasing significantly? If so, consider changing the
>     preconditioner options (more levels!). I don't expect a direct
>     solver to be any faster in the 180k case for a Poisson problem.
>
>
> Mudpack is geometric multigrid: 
> http://www2.cisl.ucar.edu/resources/legacy/mudpack
> This should be faster.
>
>    Matt
>
>     Best regards,
>     Karli
>
>
>     On 08/06/2013 02:22 PM, Alan wrote:
>     > Dear all,
>     > I hope you're having a nice day.
>     > I have a quick question on solving the Poisson equation with the
>     > KSP solvers (src/ksp/ksp/examples/tutorials/ex29.c). Currently, I
>     > run this solver with:
>     > -pc_type gamg -ksp_type cg -pc_gamg_agg_nsmooths 1
>     > -mg_levels_ksp_max_it 1 -mg_levels_ksp_type richardson
>     > -ksp_rtol 1.0e-7
>     > It performs very well in parallel computation, and scalability is
>     > fine. However, if I run it with a single process, the KSP solver
>     > is much slower than direct solvers, e.g. Mudpack. Briefly, the
>     > speed difference between the KSP solver and the direct solver is
>     > negligible for small problems (e.g. 36k DoFs) but becomes very
>     > large for moderately large problems (e.g. 180k DoFs). Although the
>     > direct solver inherently performs better on moderately large
>     > problems in a single process, I wonder whether any setup or
>     > approach could improve the performance of this KSP Poisson solver
>     > in a single process, or even make it competitive in speed (a
>     > little slower is fine) with direct solvers.
>     >
>     > thanks in advance,
>     > Alan
>     >
>
>
>
>
> -- 
> What most experimenters take for granted before they begin their 
> experiments is infinitely more interesting than any results to which 
> their experiments lead.
> -- Norbert Wiener

-------------- next part --------------
[Attachment 1: output and -log_summary for the large problem, 600 x 240 grid]
node = 0 mx = 600  my = 240  mm = 1  nn = 1  
node = 0 xs = 0  ys = 0  xw = 600  yw = 240  
rank = 0:  left BC = 1,  rightBC = 1, bottom BC = 1, top BC = 1
rank = 0:  xc = 599, yc = 240, xw = 600, yw = 240
rank 0 Cylinder: neighbor = left: 128 right: 152  bottom: 48  top: 72 
Current Computation Time Step = 0, Total Computation Time step number = 1
Heave_dH = 0.001841, PhysVel.y = 0.169646 
Rank = 0, Average velocity on IB points: U-ib = 1.000000, V-ib = 0.000000
  0 KSP Residual norm 1.766720535653e+01 
  1 KSP Residual norm 8.320548736317e+00 
  2 KSP Residual norm 3.497137130771e+00 
  3 KSP Residual norm 1.003445717739e+00 
  4 KSP Residual norm 3.869439823358e-01 
  5 KSP Residual norm 2.475103239062e-01 
  6 KSP Residual norm 2.281611491375e-01 
  7 KSP Residual norm 1.084138785055e-01 
  8 KSP Residual norm 5.141423441920e-02 
  9 KSP Residual norm 3.347748825553e-02 
 10 KSP Residual norm 2.628353299859e-02 
 11 KSP Residual norm 2.096532648662e-02 
 12 KSP Residual norm 8.618284456392e-03 
 13 KSP Residual norm 5.565127181073e-03 
 14 KSP Residual norm 6.314337218164e-03 
 15 KSP Residual norm 3.131142787500e-03 
 16 KSP Residual norm 3.068804030745e-03 
 17 KSP Residual norm 2.349857536588e-03 
 18 KSP Residual norm 8.503110026710e-04 
 19 KSP Residual norm 7.687867061945e-04 
 20 KSP Residual norm 4.742409404804e-04 
 21 KSP Residual norm 5.672769845689e-04 
 22 KSP Residual norm 4.808829820485e-04 
 23 KSP Residual norm 2.857419449644e-04 
 24 KSP Residual norm 1.438427631790e-04 
 25 KSP Residual norm 4.860115232885e-05 
 26 KSP Residual norm 3.225934842340e-05 
 27 KSP Residual norm 1.397147991245e-05 
 28 KSP Residual norm 1.260818970574e-05 
 29 KSP Residual norm 1.047031075597e-05 
 30 KSP Residual norm 5.920108684271e-06 
 31 KSP Residual norm 3.051540274780e-06 
 32 KSP Residual norm 1.496508480442e-06 
Pressure Check Iteration = 1, Error = 8.812422e-09, Max Pressure = 1.377380 @ (134,61) 
Pressure Corrector RHS Calculated!!!!!
Pressure Corrector RHS Calculated!!!!!
  0 KSP Residual norm 1.055813500392e+01 
  1 KSP Residual norm 5.111549490211e+00 
  2 KSP Residual norm 2.431687757980e+00 
  3 KSP Residual norm 7.380477067726e-01 
  4 KSP Residual norm 2.649467853279e-01 
  5 KSP Residual norm 1.581771354806e-01 
  6 KSP Residual norm 1.813719861751e-01 
  7 KSP Residual norm 8.239048633455e-02 
  8 KSP Residual norm 3.243134600574e-02 
  9 KSP Residual norm 2.196187959685e-02 
 10 KSP Residual norm 1.990772184469e-02 
 11 KSP Residual norm 1.408691713651e-02 
 12 KSP Residual norm 5.553521214754e-03 
 13 KSP Residual norm 3.627141705183e-03 
 14 KSP Residual norm 4.194181361457e-03 
 15 KSP Residual norm 2.317138149101e-03 
 16 KSP Residual norm 2.352605973283e-03 
 17 KSP Residual norm 1.884396223781e-03 
 18 KSP Residual norm 8.511003652931e-04 
 19 KSP Residual norm 5.716104892684e-04 
 20 KSP Residual norm 3.455582593757e-04 
 21 KSP Residual norm 3.943724766808e-04 
 22 KSP Residual norm 3.558747195132e-04 
 23 KSP Residual norm 2.440517227429e-04 
 24 KSP Residual norm 1.220784864643e-04 
 25 KSP Residual norm 4.312636138662e-05 
 26 KSP Residual norm 2.491805473018e-05 
 27 KSP Residual norm 1.092946022663e-05 
 28 KSP Residual norm 9.586541934346e-06 
 29 KSP Residual norm 8.338858229099e-06 
 30 KSP Residual norm 5.256635417170e-06 
 31 KSP Residual norm 2.650948822714e-06 
 32 KSP Residual norm 1.273360000962e-06 
 33 KSP Residual norm 1.159289546119e-06 
 34 KSP Residual norm 6.729897324730e-07 
Rank#0, Max dp = 7.959984e-01 @ (134, 66)
Rank#0, time step = 0, continuity = 2.829245e-07 @ (134, 90)
Rank = 0, Computation for time step 0 is done!! 0 time steps left

Rank = 0, W time = 122.641372
************************************************************************************************************************
***             WIDEN YOUR WINDOW TO 120 CHARACTERS.  Use 'enscript -r -fCourier9' to print this document            ***
************************************************************************************************************************

---------------------------------------------- PETSc Performance Summary: ----------------------------------------------

./ex29 on a arch-linux2-c-debug named l2118a-linux.soecs.ku.edu with 1 processor, by zlwei Tue Aug  6 14:45:13 2013
Using Petsc Development GIT revision: 7a0108da53bbe8dff949efa7a5ab1303a7fb1560  GIT Date: 2013-06-20 10:11:56 +0200

                         Max       Max/Min        Avg      Total 
Time (sec):           1.238e+02      1.00000   1.238e+02
Objects:              4.010e+02      1.00000   4.010e+02
Flops:                8.165e+08      1.00000   8.165e+08  8.165e+08
Flops/sec:            6.596e+06      1.00000   6.596e+06  6.596e+06
Memory:               8.291e+07      1.00000              8.291e+07
MPI Messages:         0.000e+00      0.00000   0.000e+00  0.000e+00
MPI Message Lengths:  0.000e+00      0.00000   0.000e+00  0.000e+00
MPI Reductions:       3.629e+03      1.00000

Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
                            e.g., VecAXPY() for real vectors of length N --> 2N flops
                            and VecAXPY() for complex vectors of length N --> 8N flops

Summary of Stages:   ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                        Avg     %Total     Avg     %Total   counts   %Total     Avg         %Total   counts   %Total 
 0:      Main Stage: 1.6329e+00   1.3%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00        0.0%  6.000e+00   0.2% 
 1:      DMMG Setup: 9.4195e-02   0.1%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00        0.0%  6.000e+01   1.7% 
 2: Pressure RHS Setup: 6.3622e-02   0.1%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00        0.0%  6.000e+01   1.7% 
 3:  Pressure Solve: 6.0762e+01  49.1%  3.9881e+08  48.8%  0.000e+00   0.0%  0.000e+00        0.0%  1.695e+03  46.7% 
 4: Corrector RHS Setup: 4.9592e-02   0.0%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00        0.0%  6.000e+01   1.7% 
 5: Corrector Solve: 6.1190e+01  49.4%  4.1768e+08  51.2%  0.000e+00   0.0%  0.000e+00        0.0%  1.747e+03  48.1% 

------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
   Count: number of times phase was executed
   Time and Flops: Max - maximum over all processors
                   Ratio - ratio of maximum to minimum over all processors
   Mess: number of messages sent
   Avg. len: average message length (bytes)
   Reduct: number of global reductions
   Global: entire computation
   Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
      %T - percent time in this phase         %f - percent flops in this phase
      %M - percent messages in this phase     %L - percent message lengths in this phase
      %R - percent reductions in this phase
   Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors)
------------------------------------------------------------------------------------------------------------------------


      ##########################################################
      #                                                        #
      #                          WARNING!!!                    #
      #                                                        #
      #   This code was compiled with a debugging option,      #
      #   To get timing results run ./configure                #
      #   using --with-debugging=no, the performance will      #
      #   be generally two or three times faster.              #
      #                                                        #
      ##########################################################


Event                Count      Time (sec)     Flops                             --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %f %M %L %R  %T %f %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage

VecSet                 1 1.0 4.1831e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecScatterBegin        1 1.0 2.2550e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0

--- Event Stage 1: DMMG Setup

ThreadCommRunKer       1 1.0 7.1526e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
ThreadCommBarrie       1 1.0 3.8147e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecSet                 1 1.0 4.5800e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   5  0  0  0  0     0

--- Event Stage 2: Pressure RHS Setup

VecSet                 1 1.0 4.2260e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   7  0  0  0  0     0

--- Event Stage 3: Pressure Solve

KSPGMRESOrthog        30 1.0 6.1212e-02 1.0 3.64e+07 1.0 0.0e+00 0.0e+00 1.6e+02  0  4  0  0  5   0  9  0  0 10   595
KSPSetUp              10 1.0 1.4705e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.3e+02  0  0  0  0  4   0  0  0  0  8     0
KSPSolve               1 1.0 4.4591e+00 1.0 3.08e+08 1.0 0.0e+00 0.0e+00 8.8e+02  4 38  0  0 24   7 77  0  0 52    69
VecMDot               30 1.0 2.5030e-02 1.0 1.82e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  5  0  0  0   727
VecTDot               64 1.0 4.8102e-02 1.0 1.84e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  5  0  0  0   383
VecNorm               66 1.0 2.3875e-02 1.0 1.31e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  3  0  0  0   551
VecScale              33 1.0 3.5070e-02 1.0 1.82e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0    52
VecCopy              104 1.0 1.2174e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecSet               414 1.0 4.0807e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecAXPY              265 1.0 1.7370e-01 1.0 4.06e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  5  0  0  0   0 10  0  0  0   234
VecAYPX              229 1.0 1.0965e-01 1.0 1.99e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  5  0  0  0   181
VecMAXPY              33 1.0 4.1362e-02 1.0 2.15e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  3  0  0  0   0  5  0  0  0   520
VecAssemblyBegin       5 1.0 2.2888e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecAssemblyEnd         5 1.0 2.0027e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecPointwiseMult     231 1.0 9.4288e-02 1.0 1.27e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  3  0  0  0   135
VecSetRandom           3 1.0 2.9329e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecNormalize          33 1.0 4.2642e-02 1.0 5.46e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  1  0  0  0   0  1  0  0  0   128
MatMult              260 1.0 9.8725e-01 1.0 1.74e+08 1.0 0.0e+00 0.0e+00 0.0e+00  1 21  0  0  0   2 44  0  0  0   176
MatMultAdd            99 1.0 2.1511e-01 1.0 2.64e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  3  0  0  0   0  7  0  0  0   123
MatMultTranspose      99 1.0 2.3953e-01 1.0 2.64e+07 1.0 0.0e+00 0.0e+00 9.9e+01  0  3  0  0  3   0  7  0  0  6   110
MatSolve              33 1.0 1.2226e-03 1.0 7.49e+04 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0    61
MatLUFactorSym         1 1.0 5.2905e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 5.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatLUFactorNum         1 1.0 2.8586e-04 1.0 1.42e+04 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0    50
MatConvert             3 1.0 2.4556e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 9.0e+00  0  0  0  0  0   0  0  0  0  1     0
MatScale               9 1.0 2.1022e-02 1.0 2.71e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  1  0  0  0   129
MatAssemblyBegin      29 1.0 1.4186e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatAssemblyEnd        29 1.0 1.9785e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetRow         496479 1.0 2.0033e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  2  0  0  0  0   3  0  0  0  0     0
MatGetRowIJ            1 1.0 3.5048e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetOrdering         1 1.0 4.7588e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatCoarsen             3 1.0 1.1538e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01  0  0  0  0  0   0  0  0  0  1     0
MatAXPY                3 1.0 8.1131e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatMatMult             3 1.0 2.1595e-01 1.0 2.31e+06 1.0 0.0e+00 0.0e+00 1.8e+01  0  0  0  0  0   0  1  0  0  1    11
MatMatMultSym          3 1.0 1.7298e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01  0  0  0  0  0   0  0  0  0  1     0
MatMatMultNum          3 1.0 4.2782e-02 1.0 2.31e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  1  0  0  0    54
MatPtAP                3 1.0 3.9002e-01 1.0 8.24e+06 1.0 0.0e+00 0.0e+00 1.8e+01  0  1  0  0  0   1  2  0  0  1    21
MatPtAPSymbolic        3 1.0 1.2223e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01  0  0  0  0  0   0  0  0  0  1     0
MatPtAPNumeric         3 1.0 2.6774e-01 1.0 8.24e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  1  0  0  0   0  2  0  0  0    31
MatTrnMatMult          3 1.0 8.6899e-01 1.0 1.25e+07 1.0 0.0e+00 0.0e+00 3.6e+01  1  2  0  0  1   1  3  0  0  2    14
MatTrnMatMultSym       3 1.0 6.8294e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.6e+01  1  0  0  0  1   1  0  0  0  2     0
MatTrnMatMultNum       3 1.0 1.8598e-01 1.0 1.25e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  3  0  0  0    67
MatGetSymTrans         6 1.0 2.9898e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
PCGAMGgraph_AGG        3 1.0 7.7426e+00 1.0 1.91e+06 1.0 0.0e+00 0.0e+00 4.5e+01  6  0  0  0  1  13  0  0  0  3     0
PCGAMGcoarse_AGG       3 1.0 1.1194e+00 1.0 1.25e+07 1.0 0.0e+00 0.0e+00 5.1e+01  1  2  0  0  1   2  3  0  0  3    11
PCGAMGProl_AGG         3 1.0 4.1053e+01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01 33  0  0  0  0  68  0  0  0  1     0
PCGAMGPOpt_AGG         3 1.0 2.9895e+00 1.0 6.79e+07 1.0 0.0e+00 0.0e+00 4.9e+02  2  8  0  0 13   5 17  0  0 29    23
PCSetUp                2 1.0 5.3339e+01 1.0 9.05e+07 1.0 0.0e+00 0.0e+00 8.0e+02 43 11  0  0 22  88 23  0  0 47     2
PCSetUpOnBlocks       33 1.0 2.3496e-03 1.0 1.42e+04 1.0 0.0e+00 0.0e+00 1.0e+01  0  0  0  0  0   0  0  0  0  1     6
PCApply               33 1.0 3.3544e+00 1.0 2.12e+08 1.0 0.0e+00 0.0e+00 7.8e+02  3 26  0  0 22   6 53  0  0 46    63

--- Event Stage 4: Corrector RHS Setup

VecSet                 1 1.0 3.9101e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   1  0  0  0  0     0

--- Event Stage 5: Corrector Solve

KSPGMRESOrthog        30 1.0 6.1215e-02 1.0 3.64e+07 1.0 0.0e+00 0.0e+00 1.6e+02  0  4  0  0  5   0  9  0  0  9   595
KSPSetUp              10 1.0 4.3111e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.3e+02  0  0  0  0  4   0  0  0  0  7     0
KSPSolve               1 1.0 4.7401e+00 1.0 3.27e+08 1.0 0.0e+00 0.0e+00 9.3e+02  4 40  0  0 26   8 78  0  0 53    69
VecMDot               30 1.0 2.5080e-02 1.0 1.82e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  4  0  0  0   726
VecTDot               68 1.0 5.2035e-02 1.0 1.96e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  5  0  0  0   376
VecNorm               68 1.0 2.4876e-02 1.0 1.37e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  3  0  0  0   552
VecScale              33 1.0 3.5061e-02 1.0 1.82e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0    52
VecCopy              110 1.0 1.2474e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecSet               432 1.0 2.8626e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecAXPY              281 1.0 1.8523e-01 1.0 4.31e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  5  0  0  0   0 10  0  0  0   233
VecAYPX              243 1.0 1.1632e-01 1.0 2.11e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  3  0  0  0   0  5  0  0  0   181
VecMAXPY              33 1.0 4.1347e-02 1.0 2.15e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  3  0  0  0   0  5  0  0  0   520
VecAssemblyBegin       5 1.0 2.2888e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecAssemblyEnd         5 1.0 1.9073e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecPointwiseMult     243 1.0 9.9632e-02 1.0 1.34e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  3  0  0  0   135
VecSetRandom           3 1.0 2.9260e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecNormalize          33 1.0 4.2624e-02 1.0 5.46e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  1  0  0  0   0  1  0  0  0   128
MatMult              274 1.0 1.0408e+00 1.0 1.83e+08 1.0 0.0e+00 0.0e+00 0.0e+00  1 22  0  0  0   2 44  0  0  0   176
MatMultAdd           105 1.0 2.2814e-01 1.0 2.80e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  3  0  0  0   0  7  0  0  0   123
MatMultTranspose     105 1.0 2.5417e-01 1.0 2.80e+07 1.0 0.0e+00 0.0e+00 1.0e+02  0  3  0  0  3   0  7  0  0  6   110
MatSolve              35 1.0 1.2560e-03 1.0 7.95e+04 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0    63
MatLUFactorSym         1 1.0 2.4891e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 5.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatLUFactorNum         1 1.0 2.6894e-04 1.0 1.42e+04 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0    53
MatConvert             3 1.0 6.8882e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 9.0e+00  0  0  0  0  0   0  0  0  0  1     0
MatScale               9 1.0 2.1209e-02 1.0 2.71e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  1  0  0  0   128
MatAssemblyBegin      29 1.0 1.4400e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatAssemblyEnd        29 1.0 1.9717e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetRow         496479 1.0 2.0019e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  2  0  0  0  0   3  0  0  0  0     0
MatGetRowIJ            1 1.0 2.5988e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetOrdering         1 1.0 2.5821e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatCoarsen             3 1.0 1.0246e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01  0  0  0  0  0   0  0  0  0  1     0
MatAXPY                3 1.0 8.0140e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatMatMult             3 1.0 2.1582e-01 1.0 2.31e+06 1.0 0.0e+00 0.0e+00 1.8e+01  0  0  0  0  0   0  1  0  0  1    11
MatMatMultSym          3 1.0 1.7282e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01  0  0  0  0  0   0  0  0  0  1     0
MatMatMultNum          3 1.0 4.2815e-02 1.0 2.31e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  1  0  0  0    54
MatPtAP                3 1.0 3.8948e-01 1.0 8.24e+06 1.0 0.0e+00 0.0e+00 1.8e+01  0  1  0  0  0   1  2  0  0  1    21
MatPtAPSymbolic        3 1.0 1.2175e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01  0  0  0  0  0   0  0  0  0  1     0
MatPtAPNumeric         3 1.0 2.6768e-01 1.0 8.24e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  1  0  0  0   0  2  0  0  0    31
MatTrnMatMult          3 1.0 8.6146e-01 1.0 1.25e+07 1.0 0.0e+00 0.0e+00 3.6e+01  1  2  0  0  1   1  3  0  0  2    14
MatTrnMatMultSym       3 1.0 6.7687e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.6e+01  1  0  0  0  1   1  0  0  0  2     0
MatTrnMatMultNum       3 1.0 1.8455e-01 1.0 1.25e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  3  0  0  0    68
MatGetSymTrans         6 1.0 2.7702e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
PCGAMGgraph_AGG        3 1.0 7.7286e+00 1.0 1.91e+06 1.0 0.0e+00 0.0e+00 4.5e+01  6  0  0  0  1  13  0  0  0  3     0
PCGAMGcoarse_AGG       3 1.0 1.0950e+00 1.0 1.25e+07 1.0 0.0e+00 0.0e+00 5.1e+01  1  2  0  0  1   2  3  0  0  3    11
PCGAMGProl_AGG         3 1.0 4.1285e+01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01 33  0  0  0  0  67  0  0  0  1     0
PCGAMGPOpt_AGG         3 1.0 2.9775e+00 1.0 6.79e+07 1.0 0.0e+00 0.0e+00 4.9e+02  2  8  0  0 13   5 16  0  0 28    23
PCSetUp                2 1.0 5.3523e+01 1.0 9.05e+07 1.0 0.0e+00 0.0e+00 8.0e+02 43 11  0  0 22  87 22  0  0 46     2
PCSetUpOnBlocks       35 1.0 1.3256e-03 1.0 1.42e+04 1.0 0.0e+00 0.0e+00 1.0e+01  0  0  0  0  0   0  0  0  0  1    11
PCApply               35 1.0 3.5666e+00 1.0 2.24e+08 1.0 0.0e+00 0.0e+00 8.3e+02  3 27  0  0 23   6 54  0  0 48    63
------------------------------------------------------------------------------------------------------------------------

Memory usage is given in bytes:

Object Type          Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

       Krylov Solver     0             13        50296     0
     DMKSP interface     0              2         1296     0
              Vector     1             56     34403968     0
      Vector Scatter     0             12         7632     0
              Matrix     0             22     32698344     0
    Distributed Mesh     0              6      8669184     0
     Bipartite Graph     0             12         9504     0
           Index Set     0              6         5440     0
   IS L to G Mapping     0              9      5189220     0
      Preconditioner     0             12        12360     0

--- Event Stage 1: DMMG Setup

       Krylov Solver     1              0            0     0
              Vector     5              4         6048     0
      Vector Scatter     4              0            0     0
    Distributed Mesh     2              0            0     0
     Bipartite Graph     4              0            0     0
           Index Set    10             10      1159584     0
   IS L to G Mapping     3              0            0     0

--- Event Stage 2: Pressure RHS Setup

       Krylov Solver     1              0            0     0
     DMKSP interface     1              0            0     0
              Vector     5              4         6048     0
      Vector Scatter     4              0            0     0
    Distributed Mesh     3              1         4376     0
     Bipartite Graph     6              2         1584     0
           Index Set    10             10      1159584     0
   IS L to G Mapping     3              0            0     0
      Preconditioner     1              0            0     0
              Viewer     1              0            0     0

--- Event Stage 3: Pressure Solve

       Krylov Solver     8              3        90576     0
              Vector   100             74     28259312     0
              Matrix    23             12     29559028     0
      Matrix Coarsen     3              3         1884     0
           Index Set     6              3         2280     0
      Preconditioner     8              3         3024     0
         PetscRandom     3              3         1872     0

--- Event Stage 4: Corrector RHS Setup

       Krylov Solver     1              0            0     0
     DMKSP interface     1              0            0     0
              Vector     5              4         6048     0
      Vector Scatter     4              0            0     0
    Distributed Mesh     3              1         4376     0
     Bipartite Graph     6              2         1584     0
           Index Set    10             10      1159584     0
   IS L to G Mapping     3              0            0     0
      Preconditioner     1              0            0     0

--- Event Stage 5: Corrector Solve

       Krylov Solver     8              3        90576     0
              Vector   100             74     28259312     0
              Matrix    23             12     29559028     0
      Matrix Coarsen     3              3         1884     0
           Index Set     6              3         2280     0
      Preconditioner     8              3         3024     0
         PetscRandom     3              3         1872     0
========================================================================================================================
Average time to get PetscTime(): 1.90735e-07
#PETSc Option Table entries:
-ksp_monitor
-ksp_rtol 1.0e-7
-ksp_type cg
-log_summary
-mg_levels_ksp_max_it 1
-mg_levels_ksp_type richardson
-pc_gamg_agg_nsmooths 1
-pc_type gamg
#End of PETSc Option Table entries
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
Configure run at: Mon Jun 24 19:50:10 2013
Configure options: --download-f-blas-lapack --download-hypre --download-mpich --with-cc=gcc --with-fc=gfortran PETSC_ARCH=arch-linux2-c-debug
-----------------------------------------
Libraries compiled on Mon Jun 24 19:50:10 2013 on l2118a-linux.soecs.ku.edu 
Machine characteristics: Linux-2.6.18-128.el5-x86_64-with-redhat-5.3-Tikanga
Using PETSc directory: /home/zlwei/soft/mercurial/petsc-dev
Using PETSc arch: arch-linux2-c-debug
-----------------------------------------

Using C compiler: /home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/bin/mpicc  -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -g3 -fno-inline -O0  ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: /home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/bin/mpif90  -fPIC  -Wall -Wno-unused-variable -g  ${FOPTFLAGS} ${FFLAGS} 
-----------------------------------------

Using include paths: -I/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/include -I/home/zlwei/soft/mercurial/petsc-dev/include -I/home/zlwei/soft/mercurial/petsc-dev/include -I/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/include
-----------------------------------------

Using C linker: /home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/bin/mpicc
Using Fortran linker: /home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/bin/mpif90
Using libraries: -Wl,-rpath,/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/lib -L/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/lib -lpetsc -Wl,-rpath,/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/lib -L/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/lib -lHYPRE -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.1.2 -L/usr/lib/gcc/x86_64-redhat-linux/4.1.2 -lmpichcxx -lstdc++ -lflapack -lfblas -lpthread -lmpichf90 -lgfortran -lm -lm -lmpichcxx -lstdc++ -lmpichcxx -lstdc++ -ldl -lmpich -lopa -lmpl -lrt -lpthread -lgcc_s -ldl 
-----------------------------------------

-------------- next part --------------
[Attachment 2: output and -log_summary for the small problem, 300 x 120 grid]
node = 0 mx = 300  my = 120  mm = 1  nn = 1  
node = 0 xs = 0  ys = 0  xw = 300  yw = 120  
rank = 0:  left BC = 1,  rightBC = 1, bottom BC = 1, top BC = 1
rank = 0:  xc = 299, yc = 120, xw = 300, yw = 120
rank 0 Cylinder: neighbor = left: 128 right: 152  bottom: 48  top: 72 
Current Computation Time Step = 0, Total Computation Time step number = 1
Heave_dH = 0.001841, PhysVel.y = 0.169646 
Rank = 0, Average velocity on IB points: U-ib = 1.000000, V-ib = 0.000000
  0 KSP Residual norm 3.219859842275e+01 
  1 KSP Residual norm 1.212793919099e+01 
  2 KSP Residual norm 1.000475318395e+00 
  3 KSP Residual norm 1.048356151684e+00 
  4 KSP Residual norm 2.417191252714e-01 
  5 KSP Residual norm 1.614749788884e-01 
  6 KSP Residual norm 5.607699469795e-02 
  7 KSP Residual norm 5.015849450869e-02 
  8 KSP Residual norm 3.767902711948e-02 
  9 KSP Residual norm 1.331120203189e-02 
 10 KSP Residual norm 1.486268056233e-02 
 11 KSP Residual norm 5.251536657590e-03 
 12 KSP Residual norm 4.794291514649e-03 
 13 KSP Residual norm 2.460495800806e-03 
 14 KSP Residual norm 2.248817042552e-03 
 15 KSP Residual norm 2.211309778295e-03 
 16 KSP Residual norm 2.287471668574e-03 
 17 KSP Residual norm 1.262579985084e-03 
 18 KSP Residual norm 4.163719864597e-04 
 19 KSP Residual norm 2.326361502572e-04 
 20 KSP Residual norm 2.841935932373e-04 
 21 KSP Residual norm 3.047482003586e-04 
 22 KSP Residual norm 3.582477628286e-04 
 23 KSP Residual norm 2.822803681240e-04 
 24 KSP Residual norm 1.256577194451e-04 
 25 KSP Residual norm 6.006337667087e-05 
 26 KSP Residual norm 5.482035386006e-05 
 27 KSP Residual norm 4.716042773817e-05 
 28 KSP Residual norm 3.438093185462e-05 
 29 KSP Residual norm 2.216599861020e-05 
 30 KSP Residual norm 1.188485020621e-05 
 31 KSP Residual norm 5.332709179154e-06 
 32 KSP Residual norm 5.571699960743e-06 
 33 KSP Residual norm 4.072911342628e-06 
 34 KSP Residual norm 2.429372703445e-06 
Pressure Check Iteration = 1, Error = 2.108138e-08, Max Pressure = 1.528494 @ (134,61) 
Pressure Corrector RHS Calculated!!!!!
Pressure Corrector RHS Calculated!!!!!
  0 KSP Residual norm 1.829819563087e+01 
  1 KSP Residual norm 7.909965123659e+00 
  2 KSP Residual norm 1.074505340791e+00 
  3 KSP Residual norm 7.169735199209e-01 
  4 KSP Residual norm 2.244554419387e-01 
  5 KSP Residual norm 1.754655965410e-01 
  6 KSP Residual norm 4.825693892894e-02 
  7 KSP Residual norm 5.251256707979e-02 
  8 KSP Residual norm 3.804056366048e-02 
  9 KSP Residual norm 1.200096281517e-02 
 10 KSP Residual norm 1.103128679453e-02 
 11 KSP Residual norm 3.997720166314e-03 
 12 KSP Residual norm 3.655688571252e-03 
 13 KSP Residual norm 1.543345959344e-03 
 14 KSP Residual norm 1.292885619415e-03 
 15 KSP Residual norm 1.248963602650e-03 
 16 KSP Residual norm 4.801522102369e-04 
 17 KSP Residual norm 5.672996517558e-04 
 18 KSP Residual norm 2.999258586999e-04 
 19 KSP Residual norm 2.844401232531e-04 
 20 KSP Residual norm 2.100112658645e-04 
 21 KSP Residual norm 1.057637855557e-04 
 22 KSP Residual norm 5.175131849164e-05 
 23 KSP Residual norm 5.284783973564e-05 
 24 KSP Residual norm 2.617328421458e-05 
 25 KSP Residual norm 2.066100891352e-05 
 26 KSP Residual norm 1.682497454183e-05 
 27 KSP Residual norm 1.172461379204e-05 
 28 KSP Residual norm 7.434528247622e-06 
 29 KSP Residual norm 7.721209094005e-06 
 30 KSP Residual norm 7.748882349033e-06 
 31 KSP Residual norm 8.982990172645e-06 
 32 KSP Residual norm 1.114273786849e-05 
 33 KSP Residual norm 7.667014380348e-06 
 34 KSP Residual norm 3.720460282534e-06 
 35 KSP Residual norm 2.203328100797e-06 
 36 KSP Residual norm 1.590063516358e-06 
Rank#0, Max dp = 8.902003e-01 @ (134, 66)
Rank#0, time step = 0, continuity = 7.809215e-07 @ (145, 66)
Rank = 0, Computation for time step 0 is done!! 0 time steps left

Rank = 0, W time = 13.260636
************************************************************************************************************************
***             WIDEN YOUR WINDOW TO 120 CHARACTERS.  Use 'enscript -r -fCourier9' to print this document            ***
************************************************************************************************************************

---------------------------------------------- PETSc Performance Summary: ----------------------------------------------

./ex29 on a arch-linux2-c-debug named l2118a-linux.soecs.ku.edu with 1 processor, by zlwei Tue Aug  6 14:46:54 2013
Using Petsc Development GIT revision: 7a0108da53bbe8dff949efa7a5ab1303a7fb1560  GIT Date: 2013-06-20 10:11:56 +0200

                         Max       Max/Min        Avg      Total 
Time (sec):           1.355e+01      1.00000   1.355e+01
Objects:              3.150e+02      1.00000   3.150e+02
Flops:                2.138e+08      1.00000   2.138e+08  2.138e+08
Flops/sec:            1.577e+07      1.00000   1.577e+07  1.577e+07
Memory:               2.090e+07      1.00000              2.090e+07
MPI Messages:         0.000e+00      0.00000   0.000e+00  0.000e+00
MPI Message Lengths:  0.000e+00      0.00000   0.000e+00  0.000e+00
MPI Reductions:       2.797e+03      1.00000

Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
                            e.g., VecAXPY() for real vectors of length N --> 2N flops
                            and VecAXPY() for complex vectors of length N --> 8N flops

Summary of Stages:   ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                        Avg     %Total     Avg     %Total   counts   %Total     Avg         %Total   counts   %Total 
 0:      Main Stage: 4.5979e-01   3.4%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00        0.0%  6.000e+00   0.2% 
 1:      DMMG Setup: 2.7542e-02   0.2%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00        0.0%  6.000e+01   2.1% 
 2: Pressure RHS Setup: 1.9010e-02   0.1%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00        0.0%  6.000e+01   2.1% 
 3:  Pressure Solve: 6.5101e+00  48.0%  1.0454e+08  48.9%  0.000e+00   0.0%  0.000e+00        0.0%  1.285e+03  45.9% 
 4: Corrector RHS Setup: 1.4123e-02   0.1%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00        0.0%  6.000e+01   2.1% 
 5: Corrector Solve: 6.5235e+00  48.1%  1.0926e+08  51.1%  0.000e+00   0.0%  0.000e+00        0.0%  1.325e+03  47.4% 

------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
   Count: number of times phase was executed
   Time and Flops: Max - maximum over all processors
                   Ratio - ratio of maximum to minimum over all processors
   Mess: number of messages sent
   Avg. len: average message length (bytes)
   Reduct: number of global reductions
   Global: entire computation
   Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
      %T - percent time in this phase         %f - percent flops in this phase
      %M - percent messages in this phase     %L - percent message lengths in this phase
      %R - percent reductions in this phase
   Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors)
------------------------------------------------------------------------------------------------------------------------


      ##########################################################
      #                                                        #
      #                          WARNING!!!                    #
      #                                                        #
      #   This code was compiled with a debugging option,      #
      #   To get timing results run ./configure                #
      #   using --with-debugging=no, the performance will      #
      #   be generally two or three times faster.              #
      #                                                        #
      ##########################################################


Event                Count      Time (sec)     Flops                             --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %f %M %L %R  %T %f %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage

VecSet                 1 1.0 1.4708e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecScatterBegin        1 1.0 5.1093e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0

--- Event Stage 1: DMMG Setup

ThreadCommRunKer       1 1.0 6.9141e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
ThreadCommBarrie       1 1.0 5.0068e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecSet                 1 1.0 1.3750e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   5  0  0  0  0     0

--- Event Stage 2: Pressure RHS Setup

VecSet                 1 1.0 9.4509e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   5  0  0  0  0     0

--- Event Stage 3: Pressure Solve

KSPGMRESOrthog        20 1.0 1.4830e-02 1.0 9.05e+06 1.0 0.0e+00 0.0e+00 1.1e+02  0  4  0  0  4   0  9  0  0  9   610
KSPSetUp               8 1.0 5.8279e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 9.6e+01  0  0  0  0  3   0  0  0  0  7     0
KSPSolve               1 1.0 1.2430e+00 1.0 8.23e+07 1.0 0.0e+00 0.0e+00 7.2e+02  9 38  0  0 26  19 79  0  0 56    66
VecMDot               20 1.0 6.0251e-03 1.0 4.52e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  4  0  0  0   751
VecTDot               68 1.0 1.5597e-02 1.0 4.90e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  5  0  0  0   314
VecNorm               57 1.0 6.5038e-03 1.0 3.42e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  3  0  0  0   527
VecScale              22 1.0 9.2158e-03 1.0 4.52e+05 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0    49
VecCopy               74 1.0 2.9335e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecSet               327 1.0 1.6136e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecAXPY              210 1.0 5.2465e-02 1.0 1.07e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  5  0  0  0   1 10  0  0  0   205
VecAYPX              173 1.0 2.9601e-02 1.0 5.25e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  5  0  0  0   177
VecMAXPY              22 1.0 9.6974e-03 1.0 5.34e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  5  0  0  0   551
VecAssemblyBegin       4 1.0 2.0027e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecAssemblyEnd         4 1.0 1.6212e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecPointwiseMult     162 1.0 2.5438e-02 1.0 3.33e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  3  0  0  0   131
VecSetRandom           2 1.0 7.2423e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  1  0  0  0  0   1  0  0  0  0     0
VecNormalize          22 1.0 1.1368e-02 1.0 1.36e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  1  0  0  0   0  1  0  0  0   119
MatMult              194 1.0 2.4963e-01 1.0 4.50e+07 1.0 0.0e+00 0.0e+00 0.0e+00  2 21  0  0  0   4 43  0  0  0   180
MatMultAdd            70 1.0 5.5392e-02 1.0 6.93e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  3  0  0  0   1  7  0  0  0   125
MatMultTranspose      70 1.0 6.2724e-02 1.0 6.93e+06 1.0 0.0e+00 0.0e+00 7.0e+01  0  3  0  0  3   1  7  0  0  5   110
MatSolve              35 1.0 5.5137e-03 1.0 1.05e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  1  0  0  0   190
MatLUFactorSym         1 1.0 1.7319e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 5.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatLUFactorNum         1 1.0 3.7150e-03 1.0 4.26e+05 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0   115
MatConvert             2 1.0 7.7617e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 6.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatScale               6 1.0 5.1129e-03 1.0 6.65e+05 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  1  0  0  0   130
MatAssemblyBegin      20 1.0 9.2030e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatAssemblyEnd        20 1.0 5.3781e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   1  0  0  0  0     0
MatGetRow         123345 1.0 4.9806e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  4  0  0  0  0   8  0  0  0  0     0
MatGetRowIJ            1 1.0 9.7990e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetOrdering         1 1.0 1.0302e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatCoarsen             2 1.0 3.7993e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 8.0e+00  0  0  0  0  0   1  0  0  0  1     0
MatAXPY                2 1.0 1.9550e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatMatMult             2 1.0 5.3621e-02 1.0 5.66e+05 1.0 0.0e+00 0.0e+00 1.2e+01  0  0  0  0  0   1  1  0  0  1    11
MatMatMultSym          2 1.0 4.3117e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01  0  0  0  0  0   1  0  0  0  1     0
MatMatMultNum          2 1.0 1.0380e-02 1.0 5.66e+05 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  1  0  0  0    55
MatPtAP                2 1.0 9.5418e-02 1.0 2.02e+06 1.0 0.0e+00 0.0e+00 1.2e+01  1  1  0  0  0   1  2  0  0  1    21
MatPtAPSymbolic        2 1.0 2.9883e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01  0  0  0  0  0   0  0  0  0  1     0
MatPtAPNumeric         2 1.0 6.5507e-02 1.0 2.02e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  1  0  0  0   1  2  0  0  0    31
MatTrnMatMult          2 1.0 2.3324e-01 1.0 2.96e+06 1.0 0.0e+00 0.0e+00 2.4e+01  2  1  0  0  1   4  3  0  0  2    13
MatTrnMatMultSym       2 1.0 1.8737e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.4e+01  1  0  0  0  1   3  0  0  0  2     0
MatTrnMatMultNum       2 1.0 4.5826e-02 1.0 2.96e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  1  0  0  0   1  3  0  0  0    65
MatGetSymTrans         4 1.0 1.0321e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
PCGAMGgraph_AGG        2 1.0 1.9121e+00 1.0 4.67e+05 1.0 0.0e+00 0.0e+00 3.0e+01 14  0  0  0  1  29  0  0  0  2     0
PCGAMGcoarse_AGG       2 1.0 3.0292e-01 1.0 2.96e+06 1.0 0.0e+00 0.0e+00 3.4e+01  2  1  0  0  1   5  3  0  0  3    10
PCGAMGProl_AGG         2 1.0 1.4573e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01 11  0  0  0  0  22  0  0  0  1     0
PCGAMGPOpt_AGG         2 1.0 7.4503e-01 1.0 1.68e+07 1.0 0.0e+00 0.0e+00 3.2e+02  5  8  0  0 12  11 16  0  0 25    23
PCSetUp                2 1.0 4.5365e+00 1.0 2.27e+07 1.0 0.0e+00 0.0e+00 5.5e+02 33 11  0  0 20  70 22  0  0 43     5
PCSetUpOnBlocks       35 1.0 7.3206e-03 1.0 4.26e+05 1.0 0.0e+00 0.0e+00 1.0e+01  0  0  0  0  0   0  0  0  0  1    58
PCApply               35 1.0 9.3794e-01 1.0 5.67e+07 1.0 0.0e+00 0.0e+00 6.2e+02  7 27  0  0 22  14 54  0  0 48    60

--- Event Stage 4: Corrector RHS Setup

VecSet                 1 1.0 1.0395e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   1  0  0  0  0     0

--- Event Stage 5: Corrector Solve

KSPGMRESOrthog        20 1.0 1.4795e-02 1.0 9.05e+06 1.0 0.0e+00 0.0e+00 1.1e+02  0  4  0  0  4   0  8  0  0  8   611
KSPSetUp               8 1.0 2.1520e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 9.6e+01  0  0  0  0  3   0  0  0  0  7     0
KSPSolve               1 1.0 1.3155e+00 1.0 8.70e+07 1.0 0.0e+00 0.0e+00 7.6e+02 10 41  0  0 27  20 80  0  0 57    66
VecMDot               20 1.0 6.0053e-03 1.0 4.52e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  4  0  0  0   753
VecTDot               72 1.0 1.6134e-02 1.0 5.18e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  5  0  0  0   321
VecNorm               59 1.0 6.7673e-03 1.0 3.57e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  3  0  0  0   527
VecScale              22 1.0 9.1701e-03 1.0 4.52e+05 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0    49
VecCopy               78 1.0 3.1137e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecSet               341 1.0 1.0112e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecAXPY              222 1.0 5.5460e-02 1.0 1.14e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  5  0  0  0   1 10  0  0  0   205
VecAYPX              183 1.0 3.1363e-02 1.0 5.56e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  3  0  0  0   0  5  0  0  0   177
VecMAXPY              22 1.0 9.7005e-03 1.0 5.34e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  5  0  0  0   551
VecAssemblyBegin       4 1.0 1.8835e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecAssemblyEnd         4 1.0 1.5974e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecPointwiseMult     170 1.0 2.6755e-02 1.0 3.49e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  3  0  0  0   131
VecSetRandom           2 1.0 7.2606e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  1  0  0  0  0   1  0  0  0  0     0
VecNormalize          22 1.0 1.1306e-02 1.0 1.36e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  1  0  0  0   0  1  0  0  0   120
MatMult              204 1.0 2.6262e-01 1.0 4.74e+07 1.0 0.0e+00 0.0e+00 0.0e+00  2 22  0  0  0   4 43  0  0  0   180
MatMultAdd            74 1.0 5.8598e-02 1.0 7.33e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  3  0  0  0   1  7  0  0  0   125
MatMultTranspose      74 1.0 6.6470e-02 1.0 7.33e+06 1.0 0.0e+00 0.0e+00 7.4e+01  0  3  0  0  3   1  7  0  0  6   110
MatSolve              37 1.0 5.7158e-03 1.0 1.11e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  1  0  0  0   0  1  0  0  0   193
MatLUFactorSym         1 1.0 1.5109e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 5.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatLUFactorNum         1 1.0 3.5410e-03 1.0 4.26e+05 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0   120
MatConvert             2 1.0 2.0061e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 6.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatScale               6 1.0 5.1038e-03 1.0 6.65e+05 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  1  0  0  0   130
MatAssemblyBegin      20 1.0 9.3222e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatAssemblyEnd        20 1.0 4.8141e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   1  0  0  0  0     0
MatGetRow         123345 1.0 4.9688e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  4  0  0  0  0   8  0  0  0  0     0
MatGetRowIJ            1 1.0 9.5129e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetOrdering         1 1.0 8.9407e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatCoarsen             2 1.0 2.8723e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 8.0e+00  0  0  0  0  0   0  0  0  0  1     0
MatAXPY                2 1.0 1.9500e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatMatMult             2 1.0 5.3714e-02 1.0 5.66e+05 1.0 0.0e+00 0.0e+00 1.2e+01  0  0  0  0  0   1  1  0  0  1    11
MatMatMultSym          2 1.0 4.3223e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01  0  0  0  0  0   1  0  0  0  1     0
MatMatMultNum          2 1.0 1.0369e-02 1.0 5.66e+05 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  1  0  0  0    55
MatPtAP                2 1.0 9.5119e-02 1.0 2.02e+06 1.0 0.0e+00 0.0e+00 1.2e+01  1  1  0  0  0   1  2  0  0  1    21
MatPtAPSymbolic        2 1.0 2.9528e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01  0  0  0  0  0   0  0  0  0  1     0
MatPtAPNumeric         2 1.0 6.5562e-02 1.0 2.02e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  1  0  0  0   1  2  0  0  0    31
MatTrnMatMult          2 1.0 2.1705e-01 1.0 2.96e+06 1.0 0.0e+00 0.0e+00 2.4e+01  2  1  0  0  1   3  3  0  0  2    14
MatTrnMatMultSym       2 1.0 1.7135e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.4e+01  1  0  0  0  1   3  0  0  0  2     0
MatTrnMatMultNum       2 1.0 4.5670e-02 1.0 2.96e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  1  0  0  0   1  3  0  0  0    65
MatGetSymTrans         4 1.0 6.6414e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
PCGAMGgraph_AGG        2 1.0 1.8934e+00 1.0 4.67e+05 1.0 0.0e+00 0.0e+00 3.0e+01 14  0  0  0  1  29  0  0  0  2     0
PCGAMGcoarse_AGG       2 1.0 2.7539e-01 1.0 2.96e+06 1.0 0.0e+00 0.0e+00 3.4e+01  2  1  0  0  1   4  3  0  0  3    11
PCGAMGProl_AGG         2 1.0 1.4554e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01 11  0  0  0  0  22  0  0  0  1     0
PCGAMGPOpt_AGG         2 1.0 7.4179e-01 1.0 1.68e+07 1.0 0.0e+00 0.0e+00 3.2e+02  5  8  0  0 12  11 15  0  0 24    23
PCSetUp                2 1.0 4.4874e+00 1.0 2.27e+07 1.0 0.0e+00 0.0e+00 5.5e+02 33 11  0  0 20  69 21  0  0 42     5
PCSetUpOnBlocks       37 1.0 6.4981e-03 1.0 4.26e+05 1.0 0.0e+00 0.0e+00 1.0e+01  0  0  0  0  0   0  0  0  0  1    66
PCApply               37 1.0 9.9311e-01 1.0 5.99e+07 1.0 0.0e+00 0.0e+00 6.5e+02  7 28  0  0 23  15 55  0  0 49    60
------------------------------------------------------------------------------------------------------------------------

Memory usage is given in bytes:

Object Type          Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

       Krylov Solver     0             11        47784     0
     DMKSP interface     0              2         1296     0
              Vector     1             44      8630368     0
      Vector Scatter     0             12         7632     0
              Matrix     0             16      8531872     0
    Distributed Mesh     0              6      2189184     0
     Bipartite Graph     0             12         9504     0
           Index Set     0              6         9664     0
   IS L to G Mapping     0              9      1301220     0
      Preconditioner     0             10        10408     0

--- Event Stage 1: DMMG Setup

       Krylov Solver     1              0            0     0
              Vector     5              4         6048     0
      Vector Scatter     4              0            0     0
    Distributed Mesh     2              0            0     0
     Bipartite Graph     4              0            0     0
           Index Set    10             10       295584     0
   IS L to G Mapping     3              0            0     0

--- Event Stage 2: Pressure RHS Setup

       Krylov Solver     1              0            0     0
     DMKSP interface     1              0            0     0
              Vector     5              4         6048     0
      Vector Scatter     4              0            0     0
    Distributed Mesh     3              1         4376     0
     Bipartite Graph     6              2         1584     0
           Index Set    10             10       295584     0
   IS L to G Mapping     3              0            0     0
      Preconditioner     1              0            0     0
              Viewer     1              0            0     0

--- Event Stage 3: Pressure Solve

       Krylov Solver     6              2        60384     0
              Vector    71             51      7082504     0
              Matrix    16              8      7271944     0
      Matrix Coarsen     2              2         1256     0
           Index Set     5              2         1520     0
      Preconditioner     6              2         2016     0
         PetscRandom     2              2         1248     0

--- Event Stage 4: Corrector RHS Setup

       Krylov Solver     1              0            0     0
     DMKSP interface     1              0            0     0
              Vector     5              4         6048     0
      Vector Scatter     4              0            0     0
    Distributed Mesh     3              1         4376     0
     Bipartite Graph     6              2         1584     0
           Index Set    10             10       295584     0
   IS L to G Mapping     3              0            0     0
      Preconditioner     1              0            0     0

--- Event Stage 5: Corrector Solve

       Krylov Solver     6              2        60384     0
              Vector    71             51      7082504     0
              Matrix    16              8      7271944     0
      Matrix Coarsen     2              2         1256     0
           Index Set     5              2         1520     0
      Preconditioner     6              2         2016     0
         PetscRandom     2              2         1248     0
========================================================================================================================
Average time to get PetscTime(): 9.53674e-08
#PETSc Option Table entries:
-ksp_monitor
-ksp_rtol 1.0e-7
-ksp_type cg
-log_summary
-mg_levels_ksp_max_it 1
-mg_levels_ksp_type richardson
-pc_gamg_agg_nsmooths 1
-pc_type gamg
#End of PETSc Option Table entries
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
Configure run at: Mon Jun 24 19:50:10 2013
Configure options: --download-f-blas-lapack --download-hypre --download-mpich --with-cc=gcc --with-fc=gfortran PETSC_ARCH=arch-linux2-c-debug
-----------------------------------------
Libraries compiled on Mon Jun 24 19:50:10 2013 on l2118a-linux.soecs.ku.edu 
Machine characteristics: Linux-2.6.18-128.el5-x86_64-with-redhat-5.3-Tikanga
Using PETSc directory: /home/zlwei/soft/mercurial/petsc-dev
Using PETSc arch: arch-linux2-c-debug
-----------------------------------------

Using C compiler: /home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/bin/mpicc  -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -g3 -fno-inline -O0  ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: /home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/bin/mpif90  -fPIC  -Wall -Wno-unused-variable -g  ${FOPTFLAGS} ${FFLAGS} 
-----------------------------------------

Using include paths: -I/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/include -I/home/zlwei/soft/mercurial/petsc-dev/include -I/home/zlwei/soft/mercurial/petsc-dev/include -I/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/include
-----------------------------------------

Using C linker: /home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/bin/mpicc
Using Fortran linker: /home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/bin/mpif90
Using libraries: -Wl,-rpath,/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/lib -L/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/lib -lpetsc -Wl,-rpath,/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/lib -L/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/lib -lHYPRE -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.1.2 -L/usr/lib/gcc/x86_64-redhat-linux/4.1.2 -lmpichcxx -lstdc++ -lflapack -lfblas -lpthread -lmpichf90 -lgfortran -lm -lm -lmpichcxx -lstdc++ -lmpichcxx -lstdc++ -ldl -lmpich -lopa -lmpl -lrt -lpthread -lgcc_s -ldl 
-----------------------------------------


