    F I N I T E   E L E M E N T   A N A L Y S I S   P R O G R A M

           FEAP (C) Regents of the University of California
                          All Rights Reserved.
                       VERSION: Release 8.4.1d
                          DATE: 01 January 2014

         Files are set as:   Status  Filename

           Input   (read ) : Exists  Icube05_0001
           Output  (write) : Exists  Ocube05_0001
           Restart (read ) : New     Rcube05_0001
           Restart (write) : New     Rcube05_0001
           Plots   (write) : New     Pcube05_0001

         Caution, existing write files will be overwritten.

         Are filenames correct?( y or n; r = redefine all, s = stop) :

           R U N N I N G   F E A P   P R O B L E M   N O W

            --> Please report errors by e-mail to:
                feap@ce.berkeley.edu

  0 KSP Residual norm 9.439652885243e+00
  1 KSP Residual norm 3.403063701552e+00
  2 KSP Residual norm 1.525697956373e+00
  3 KSP Residual norm 8.144627985555e-01
  4 KSP Residual norm 4.758378188059e-01
  5 KSP Residual norm 3.236515587468e-01
  6 KSP Residual norm 2.418029669033e-01
  7 KSP Residual norm 1.827549847375e-01
  8 KSP Residual norm 1.398550438409e-01
  9 KSP Residual norm 1.030953264507e-01
 10 KSP Residual norm 7.958349903481e-02
 11 KSP Residual norm 6.681382994268e-02
 12 KSP Residual norm 5.378646980721e-02
 13 KSP Residual norm 4.220237149545e-02
 14 KSP Residual norm 3.303837304179e-02
 15 KSP Residual norm 2.557132558799e-02
 16 KSP Residual norm 2.093525127921e-02
 17 KSP Residual norm 1.861266291328e-02
 18 KSP Residual norm 1.634213327175e-02
 19 KSP Residual norm 1.437070315205e-02
 20 KSP Residual norm 1.178709971799e-02
 21 KSP Residual norm 9.474395684516e-03
 22 KSP Residual norm 8.369069535611e-03
 23 KSP Residual norm 7.620223784957e-03
 24 KSP Residual norm 6.389080851040e-03
 25 KSP Residual norm 5.276196499511e-03
 26 KSP Residual norm 4.335046465707e-03
 27 KSP Residual norm 3.750611304400e-03
 28 KSP Residual norm 3.259185417943e-03
 29 KSP Residual norm 2.808249246871e-03
 30 KSP Residual norm 2.457833154274e-03
 31 KSP Residual norm 2.186664726291e-03
 32 KSP Residual norm 1.937136024635e-03
 33 KSP Residual norm 1.721976232709e-03
 34 KSP Residual norm 1.464158213424e-03
 35 KSP Residual norm 1.298585967033e-03
 36 KSP Residual norm 1.169575240801e-03
 37 KSP Residual norm 1.026696505196e-03
 38 KSP Residual norm 9.079709939868e-04
 39 KSP Residual norm 8.166319180328e-04
 40 KSP Residual norm 7.470600885892e-04
 41 KSP Residual norm 6.892567054166e-04
 42 KSP Residual norm 6.484617749663e-04
 43 KSP Residual norm 6.230476455431e-04
 44 KSP Residual norm 5.738402788128e-04
 45 KSP Residual norm 5.310135793268e-04
 46 KSP Residual norm 5.154535519238e-04
 47 KSP Residual norm 4.968032977467e-04
 48 KSP Residual norm 4.815325364385e-04
 49 KSP Residual norm 4.941996597962e-04
 50 KSP Residual norm 5.108562130903e-04
 51 KSP Residual norm 4.967589544583e-04
 52 KSP Residual norm 4.404771417974e-04
 53 KSP Residual norm 3.687217055351e-04
 54 KSP Residual norm 3.291983649095e-04
 55 KSP Residual norm 3.160232303045e-04
 56 KSP Residual norm 3.180990021444e-04
 57 KSP Residual norm 3.123014542553e-04
 58 KSP Residual norm 2.762730799731e-04
 59 KSP Residual norm 2.533996612866e-04
 60 KSP Residual norm 2.552796306331e-04
 61 KSP Residual norm 2.446220670284e-04
 62 KSP Residual norm 2.374604473269e-04
 63 KSP Residual norm 2.487770499660e-04
 64 KSP Residual norm 2.640767533499e-04
 65 KSP Residual norm 2.685518541047e-04
 66 KSP Residual norm 2.532234554421e-04
 67 KSP Residual norm 2.402945288685e-04
 68 KSP Residual norm 2.496944822326e-04
 69 KSP Residual norm 2.457193897406e-04
 70 KSP Residual norm 2.396016294140e-04
 71 KSP Residual norm 2.329005574641e-04
 72 KSP Residual norm 2.198894936691e-04
 73 KSP Residual norm 2.076044633968e-04
 74 KSP Residual norm 1.894470548351e-04
 75 KSP Residual norm 1.718685881487e-04
 76 KSP Residual norm 1.649582839233e-04
 77 KSP Residual norm 1.548846805505e-04
 78 KSP Residual norm 1.491415006273e-04
 79 KSP Residual norm 1.432842248379e-04
 80 KSP Residual norm 1.278723801749e-04
 81 KSP Residual norm 1.210019730351e-04
 82 KSP Residual norm 1.193546400747e-04
 83 KSP Residual norm 1.070800581326e-04
 84 KSP Residual norm 9.891584858046e-05
 85 KSP Residual norm 9.164811979279e-05
 86 KSP Residual norm 8.219453597555e-05
 87 KSP Residual norm 8.790336149528e-05
 88 KSP Residual norm 8.972092711087e-05
 89 KSP Residual norm 7.956662183585e-05
 90 KSP Residual norm 7.031519599323e-05
 91 KSP Residual norm 6.867457523335e-05
 92 KSP Residual norm 7.057082052274e-05
 93 KSP Residual norm 7.656308002962e-05
 94 KSP Residual norm 8.412920365916e-05
 95 KSP Residual norm 8.728112735826e-05
 96 KSP Residual norm 8.679380006335e-05
 97 KSP Residual norm 8.686623853054e-05
 98 KSP Residual norm 8.564689141166e-05
 99 KSP Residual norm 8.488699583384e-05
100 KSP Residual norm 8.784253263284e-05
101 KSP Residual norm 9.318290997681e-05
102 KSP Residual norm 9.859438663377e-05
103 KSP Residual norm 9.557682227574e-05
104 KSP Residual norm 8.740767899693e-05
105 KSP Residual norm 8.753169141607e-05
106 KSP Residual norm 9.012906866983e-05
107 KSP Residual norm 8.778052938259e-05
108 KSP Residual norm 8.048897278465e-05
109 KSP Residual norm 7.423277897684e-05
110 KSP Residual norm 6.949183991506e-05
111 KSP Residual norm 6.400520795468e-05
112 KSP Residual norm 5.910963869750e-05
113 KSP Residual norm 5.454040394160e-05
114 KSP Residual norm 5.307561515357e-05
115 KSP Residual norm 5.108361024699e-05
116 KSP Residual norm 4.335409816022e-05
117 KSP Residual norm 3.532285015084e-05
118 KSP Residual norm 2.907740987186e-05
119 KSP Residual norm 2.525730414687e-05
120 KSP Residual norm 2.282597799947e-05
121 KSP Residual norm 2.017666182899e-05
122 KSP Residual norm 1.832657506330e-05
123 KSP Residual norm 1.590360811292e-05
124 KSP Residual norm 1.367463252077e-05
125 KSP Residual norm 1.228239405852e-05
126 KSP Residual norm 1.103928815710e-05
127 KSP Residual norm 1.043097194232e-05
128 KSP Residual norm 1.063297402546e-05
129 KSP Residual norm 9.548264233101e-06
130 KSP Residual norm 7.530007938872e-06
131 KSP Residual norm 6.635748529879e-06
132 KSP Residual norm 6.042896503857e-06
133 KSP Residual norm 5.317042895642e-06
134 KSP Residual norm 4.576206689761e-06
135 KSP Residual norm 3.835388568057e-06
136 KSP Residual norm 3.476327976903e-06
137 KSP Residual norm 3.089941536758e-06
138 KSP Residual norm 2.821925579504e-06
139 KSP Residual norm 2.401777674790e-06
140 KSP Residual norm 2.000950008898e-06
141 KSP Residual norm 1.899805926153e-06
142 KSP Residual norm 1.729710561343e-06
143 KSP Residual norm 1.563152467691e-06
144 KSP Residual norm 1.400868931097e-06
145 KSP Residual norm 1.182247052517e-06
146 KSP Residual norm 9.562555781470e-07
147 KSP Residual norm 8.387168604392e-07
148 KSP Residual norm 7.574911287033e-07
149 KSP Residual norm 6.644732390416e-07
150 KSP Residual norm 6.048672599573e-07
151 KSP Residual norm 5.742627661852e-07
152 KSP Residual norm 5.618826890609e-07
153 KSP Residual norm 4.825028996475e-07
154 KSP Residual norm 3.890683833814e-07
155 KSP Residual norm 3.356201907012e-07
156 KSP Residual norm 3.041841083137e-07
157 KSP Residual norm 2.829106510821e-07
158 KSP Residual norm 2.561543398855e-07
159 KSP Residual norm 2.141807062983e-07
160 KSP Residual norm 1.803934120725e-07
161 KSP Residual norm 1.518454814450e-07
162 KSP Residual norm 1.325124208719e-07
163 KSP Residual norm 1.289716447707e-07
164 KSP Residual norm 1.125336132151e-07
165 KSP Residual norm 9.405816292229e-08
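[Editor's note: the monitor history above is easiest to inspect with a short post-processing script. A minimal sketch follows, assuming the transcript was saved to a file named solver.log (a placeholder, not a file produced by this run). It extracts each "<it> KSP Residual norm <value>" line and reports the overall relative drop, which here goes from 9.44e+00 to 9.41e-08, i.e. just past the rtol=1e-08 target shown by -ksp_view below.]

# Minimal sketch: extract the -ksp_monitor history from a solver transcript.
# "solver.log" is a placeholder name; any file containing lines of the form
# "  N KSP Residual norm X" will work.
import re

pattern = re.compile(r"^\s*(\d+)\s+KSP Residual norm\s+(\S+)")

history = []
with open("solver.log") as f:
    for line in f:
        m = pattern.match(line)
        if m:
            history.append((int(m.group(1)), float(m.group(2))))

for it, rnorm in history:
    print(f"{it:4d}  {rnorm:.6e}")

# Relative drop of the preconditioned residual; the run above reaches
# ~1e-8 after 165 iterations, matching the rtol=1e-08 convergence test.
if history:
    print("relative drop:", history[-1][1] / history[0][1])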
KSP Object: 24 MPI processes
  type: cg
  maximum iterations=10000, initial guess is zero
  tolerances:  relative=1e-08, absolute=1e-16, divergence=1e+16
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: 24 MPI processes
  type: gamg
    MG: type is MULTIPLICATIVE, levels=4 cycles=v
      Cycles per PCApply=1
      Using Galerkin computed coarse grid matrices
  Coarse grid solver -- level -------------------------------
    KSP Object: (mg_coarse_) 24 MPI processes
      type: preonly
      maximum iterations=1, initial guess is zero
      tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_coarse_) 24 MPI processes
      type: bjacobi
        block Jacobi: number of blocks = 24
        Local solve is same for all blocks, in the following KSP and PC objects:
      KSP Object: (mg_coarse_sub_) 1 MPI processes
        type: preonly
        maximum iterations=1, initial guess is zero
        tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
        left preconditioning
        using NONE norm type for convergence test
      PC Object: (mg_coarse_sub_) 1 MPI processes
        type: lu
          LU: out-of-place factorization
            tolerance for zero pivot 2.22045e-14
            using diagonal shift on blocks to prevent zero pivot [INBLOCKS]
            matrix ordering: nd
            factor fill ratio given 5, needed 2.04016
              Factored matrix follows:
                Matrix Object: 1 MPI processes
                  type: seqaij
                  rows=450, cols=450, bs=6
                  package used to perform factorization: petsc
                  total: nonzeros=111564, allocated nonzeros=111564
                  total number of mallocs used during MatSetValues calls =0
                    using I-node routines: found 140 nodes, limit used is 5
        linear system matrix = precond matrix:
        Matrix Object: 1 MPI processes
          type: seqaij
          rows=450, cols=450, bs=6
          total: nonzeros=54684, allocated nonzeros=54684
          total number of mallocs used during MatSetValues calls =0
            using I-node routines: found 150 nodes, limit used is 5
      linear system matrix = precond matrix:
      Matrix Object: 24 MPI processes
        type: mpiaij
        rows=450, cols=450, bs=6
        total: nonzeros=54684, allocated nonzeros=54684
        total number of mallocs used during MatSetValues calls =0
          using I-node (on process 0) routines: found 150 nodes, limit used is 5
  Down solver (pre-smoother) on level 1 -------------------------------
    KSP Object: (mg_levels_1_) 24 MPI processes
      type: richardson
        Richardson: damping factor=1
      maximum iterations=2
      tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
      left preconditioning
      using nonzero initial guess
      using NONE norm type for convergence test
    PC Object: (mg_levels_1_) 24 MPI processes
      type: sor
        SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1
      linear system matrix = precond matrix:
      Matrix Object: 24 MPI processes
        type: mpiaij
        rows=4602, cols=4602, bs=6
        total: nonzeros=291924, allocated nonzeros=291924
        total number of mallocs used during MatSetValues calls =0
          using I-node (on process 0) routines: found 86 nodes, limit used is 5
  Up solver (post-smoother) same as down solver (pre-smoother)
  Down solver (pre-smoother) on level 2 -------------------------------
    KSP Object: (mg_levels_2_) 24 MPI processes
      type: richardson
        Richardson: damping factor=1
      maximum iterations=2
      tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
      left preconditioning
      using nonzero initial guess
      using NONE norm type for convergence test
    PC Object: (mg_levels_2_) 24 MPI processes
      type: sor
        SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1
      linear system matrix = precond matrix:
      Matrix Object: 24 MPI processes
        type: mpiaij
        rows=145944, cols=145944, bs=6
        total: nonzeros=19588968, allocated nonzeros=19588968
        total number of mallocs used during MatSetValues calls =0
          using I-node (on process 0) routines: found 2215 nodes, limit used is 5
  Up solver (post-smoother) same as down solver (pre-smoother)
  Down solver (pre-smoother) on level 3 -------------------------------
    KSP Object: (mg_levels_3_) 24 MPI processes
      type: richardson
        Richardson: damping factor=1
      maximum iterations=2
      tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
      left preconditioning
      using nonzero initial guess
      using NONE norm type for convergence test
    PC Object: (mg_levels_3_) 24 MPI processes
      type: sor
        SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1
      linear system matrix = precond matrix:
      Matrix Object: 24 MPI processes
        type: mpiaij
        rows=2080983, cols=2080983, bs=3
        total: nonzeros=132617061, allocated nonzeros=132617061
        total number of mallocs used during MatSetValues calls =0
          using I-node (on process 0) routines: found 28852 nodes, limit used is 5
  Up solver (post-smoother) same as down solver (pre-smoother)
  linear system matrix = precond matrix:
  Matrix Object: 24 MPI processes
    type: mpiaij
    rows=2080983, cols=2080983, bs=3
    total: nonzeros=132617061, allocated nonzeros=132617061
    total number of mallocs used during MatSetValues calls =0
      using I-node (on process 0) routines: found 28852 nodes, limit used is 5
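[Editor's note: nothing in this solver stack is hard-wired in FEAP; it is assembled entirely from the PETSc options listed at the end of this log (-ksp_type cg, -pc_type gamg, -pc_gamg_type agg, -pc_gamg_agg_nsmooths 1, -mg_levels_ksp_type richardson, -mg_levels_pc_type sor). The sketch below reproduces the same configuration with petsc4py, which is an assumption for illustration only: the actual run drives PETSc from FEAP's Fortran side, and the toy 1-D Laplacian here stands in for the 2,080,983-dof FEAP stiffness matrix.]

# Minimal petsc4py sketch of the CG + GAMG setup reported by -ksp_view.
import sys
import petsc4py
petsc4py.init(sys.argv)        # forward PETSc options from the command line
from petsc4py import PETSc

# Toy SPD system (1-D Laplacian) standing in for the FEAP stiffness matrix.
n = 1000
A = PETSc.Mat().createAIJ([n, n], nnz=3)
rstart, rend = A.getOwnershipRange()
for i in range(rstart, rend):
    if i > 0:
        A.setValue(i, i - 1, -1.0)
    A.setValue(i, i, 2.0)
    if i < n - 1:
        A.setValue(i, i + 1, -1.0)
A.assemble()

x, b = A.createVecs()
b.set(1.0)

ksp = PETSc.KSP().create(PETSc.COMM_WORLD)
ksp.setOperators(A)
ksp.setType('cg')              # -ksp_type cg
ksp.getPC().setType('gamg')    # -pc_type gamg
# Tolerances matching the -ksp_view output above.
ksp.setTolerances(rtol=1e-8, atol=1e-16, divtol=1e16, max_it=10000)
# The GAMG details (-pc_gamg_type agg, -pc_gamg_agg_nsmooths 1,
# -mg_levels_ksp_type richardson, -mg_levels_pc_type sor) are picked up
# from the command line here, exactly as in the FEAP run.
ksp.setFromOptions()
ksp.solve(b, x)
print('converged in', ksp.getIterationNumber(), 'iterations')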
************************************************************************************************************************
***             WIDEN YOUR WINDOW TO 120 CHARACTERS.  Use 'enscript -r -fCourier9' to print this document            ***
************************************************************************************************************************

---------------------------------------------- PETSc Performance Summary: ----------------------------------------------

/usr2/tgross/parFEAP/parFEAP84_mod/FEAP84/ver84/parfeap/feap on a linux-gnu-c named sandy.ilsb.tuwien.ac.at with 24 processors, by tgross Mon Jan 27 21:23:47 2014
Using Petsc Release Version 3.4.3, Oct, 15, 2013

                         Max       Max/Min        Avg      Total
Time (sec):           1.076e+02      1.00006   1.075e+02
Objects:              1.446e+03      1.00139   1.444e+03
Flops:                2.199e+10      1.13802   2.079e+10  4.989e+11
Flops/sec:            2.045e+08      1.13799   1.933e+08  4.639e+09
MPI Messages:         3.182e+04      4.96327   1.739e+04  4.174e+05
MPI Message Lengths:  3.596e+07      4.36215   1.309e+03  5.463e+08
MPI Reductions:       1.699e+03      1.00118

Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
                            e.g., VecAXPY() for real vectors of length N --> 2N flops
                            and VecAXPY() for complex vectors of length N --> 8N flops

Summary of Stages:   ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                        Avg     %Total     Avg     %Total   counts   %Total     Avg         %Total   counts   %Total
 0:      Main Stage: 1.0755e+02 100.0%  4.9894e+11 100.0%  4.174e+05 100.0%  1.309e+03      100.0%  1.696e+03  99.8%

------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
   Count: number of times phase was executed
   Time and Flops: Max - maximum over all processors
                   Ratio - ratio of maximum to minimum over all processors
   Mess: number of messages sent
   Avg. len: average message length (bytes)
   Reduct: number of global reductions
   Global: entire computation
   Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
      %T - percent time in this phase         %f - percent flops in this phase
      %M - percent messages in this phase     %L - percent message lengths in this phase
      %R - percent reductions in this phase
   Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors)
------------------------------------------------------------------------------------------------------------------------
Event                Count      Time (sec)     Flops                             --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %f %M %L %R  %T %f %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage

MatMult              693 1.0 2.1918e+01 1.5 4.28e+09 1.1 9.2e+04 1.8e+03 0.0e+00 16 19 22 30  0  16 19 22 30  0  4412
MatMultAdd           498 1.0 3.1974e+00 1.6 6.61e+08 1.1 4.7e+04 3.8e+02 0.0e+00  2  3 11  3  0   2  3 11  3  0  4783
MatMultTranspose     498 1.0 2.9672e+00 1.6 6.61e+08 1.1 4.7e+04 3.8e+02 0.0e+00  2  3 11  3  0   2  3 11  3  0  5154
MatSolve             166 0.0 3.6094e-02 0.0 3.70e+07 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0  1024
MatSOR               996 1.0 7.2820e+01 1.2 1.55e+10 1.1 2.0e+05 1.4e+03 0.0e+00 64 71 48 52  0  64 71 48 52  0  4841
MatLUFactorSym         1 1.0 3.1500e-03314.6 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatLUFactorNum         1 1.0 1.0341e-023614.5 1.77e+07 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0  1710
MatScale               9 1.0 4.1145e-02 1.8 5.34e+06 1.1 4.0e+02 4.0e+02 0.0e+00  0  0  0  0  0   0  0  0  0  0  2982
MatAssemblyBegin      50 1.0 8.4690e-01 3.9 0.00e+00 0.0 7.4e+02 2.7e+03 5.4e+01  0  0  0  0  3   0  0  0  0  3     0
MatAssemblyEnd        50 1.0 3.1580e-01 1.1 0.00e+00 0.0 5.3e+03 1.4e+02 1.7e+02  0  0  1  0 10   0  0  1  0 10     0
MatGetRow         344796 1.0 8.8606e-02 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetRowIJ            1 0.0 9.2030e-05 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetOrdering         1 0.0 2.3603e-04 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.7e-01  0  0  0  0  0   0  0  0  0  0     0
MatCoarsen             3 1.0 4.9957e-02 1.1 0.00e+00 0.0 2.5e+03 6.0e+02 6.3e+01  0  0  1  0  4   0  0  1  0  4     0
MatZeroEntries         1 1.0 9.1708e-02 2.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatView                7 1.4 5.2977e-03 1.5 0.00e+00 0.0 0.0e+00 0.0e+00 5.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatAXPY                3 1.0 2.1172e-02 3.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatMatMult             3 1.0 8.0084e-01 1.0 7.97e+07 1.1 2.4e+03 3.7e+03 7.2e+01  1  0  1  2  4   1  0  1  2  4  2268
MatMatMultSym          3 1.0 4.7355e-01 1.0 0.00e+00 0.0 2.0e+03 2.7e+03 6.6e+01  0  0  0  1  4   0  0  0  1  4     0
MatMatMultNum          3 1.0 3.3007e-01 1.0 7.97e+07 1.1 4.0e+02 8.5e+03 6.0e+00  0  0  0  1  0   0  0  0  1  0  5503
MatPtAP                3 1.0 3.3866e+00 1.0 5.06e+08 1.2 4.0e+03 7.8e+03 7.5e+01  3  2  1  6  4   3  2  1  6  4  3327
MatPtAPSymbolic        3 1.0 1.6790e+00 1.0 0.00e+00 0.0 2.4e+03 9.5e+03 4.5e+01  2  0  1  4  3   2  0  1  4  3     0
MatPtAPNumeric         3 1.0 1.7104e+00 1.0 5.06e+08 1.2 1.6e+03 5.1e+03 3.0e+01  2  2  0  1  2   2  2  0  1  2  6588
MatTrnMatMult          3 1.0 7.0875e-01 1.0 3.21e+07 1.2 2.5e+03 4.1e+03 8.7e+01  1  0  1  2  5   1  0  1  2  5   998
MatGetLocalMat        15 1.0 1.5222e-01 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01  0  0  0  0  1   0  0  0  0  1     0
MatGetBrAoCol          9 1.0 7.3409e-02 3.7 0.00e+00 0.0 2.8e+03 9.9e+03 1.2e+01  0  0  1  5  1   0  0  1  5  1     0
MatGetSymTrans         6 1.0 2.7187e-02 2.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecDot                 1 1.0 3.2051e-03 6.3 1.76e+05 1.0 0.0e+00 0.0e+00 1.0e+00  0  0  0  0  0   0  0  0  0  0  1299
VecMDot               30 1.0 9.8450e-02 2.5 1.04e+07 1.0 0.0e+00 0.0e+00 3.0e+01  0  0  0  0  2   0  0  0  0  2  2493
VecTDot              330 1.0 1.5574e+00 2.6 5.80e+07 1.0 0.0e+00 0.0e+00 3.3e+02  1  0  0  0 19   1  0  0  0 19   882
VecNorm              199 1.0 5.1133e+00 6.8 3.13e+07 1.0 0.0e+00 0.0e+00 2.0e+02  2  0  0  0 12   2  0  0  0 12   145
VecScale            1527 1.0 2.1687e-02 3.2 3.39e+06 2.2 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0  2760
VecCopy                5 1.0 3.6001e-03 3.7 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecSet              2045 1.0 6.2646e-02 2.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecAXPY              333 1.0 3.3573e-01 1.6 5.82e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0  4104
VecAYPX              662 1.0 2.3592e-01 1.8 4.46e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0  4463
VecMAXPY              33 1.0 2.3414e-02 1.6 1.23e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0 12390
VecAssemblyBegin     125 1.0 2.4747e-01 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 3.7e+02  0  0  0  0 22   0  0  0  0 22     0
VecAssemblyEnd       125 1.0 1.8072e-04 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecPointwiseMult      33 1.0 7.2753e-03 2.0 1.04e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0  3374
VecScatterBegin     3310 1.0 5.7059e-01 2.9 0.00e+00 0.0 4.1e+05 1.2e+03 0.0e+00  0  0 97 90  0   0  0 97 90  0     0
VecScatterEnd       3310 1.0 2.8178e+01 3.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 11  0  0  0  0  11  0  0  0  0     0
VecSetRandom           3 1.0 2.6548e-03 1.7 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecNormalize          33 1.0 7.5054e-02 2.7 3.12e+06 1.0 0.0e+00 0.0e+00 3.3e+01  0  0  0  0  2   0  0  0  0  2   981
KSPGMRESOrthog        30 1.0 1.1442e-01 2.1 2.08e+07 1.0 0.0e+00 0.0e+00 3.0e+01  0  0  0  0  2   0  0  0  0  2  4291
KSPSetUp              10 1.0 2.5187e-02 2.6 0.00e+00 0.0 0.0e+00 0.0e+00 8.0e+00  0  0  0  0  0   0  0  0  0  0     0
KSPSolve               1 1.0 1.0096e+02 1.0 2.20e+10 1.1 4.2e+05 1.3e+03 1.7e+03 94100100100 98  94100100100 98  4942
PCSetUp                2 1.0 7.4426e+00 1.0 7.84e+08 1.2 3.6e+04 1.9e+03 1.2e+03  7  4  9 13 69   7  4  9 13 69  2364
PCSetUpOnBlocks      166 1.0 1.3941e-0267.9 1.77e+07 0.0 0.0e+00 0.0e+00 5.1e+00  0  0  0  0  0   0  0  0  0  0  1268
PCApply              166 1.0 8.3635e+01 1.0 1.91e+10 1.1 3.6e+05 1.1e+03 5.1e+00 77 87 86 75  0  77 87 86 75  0  5187
PCGAMGgraph_AGG        3 1.0 1.0902e+00 1.0 1.36e+06 1.1 2.0e+03 1.6e+02 1.1e+02  1  0  0  0  6   1  0  0  0  6    28
PCGAMGcoarse_AGG       3 1.0 8.0904e-01 1.0 3.21e+07 1.2 7.3e+03 1.9e+03 2.1e+02  1  0  2  2 12   1  0  2  2 12   874
PCGAMGProl_AGG         3 1.0 4.6260e-01 1.0 0.00e+00 0.0 1.6e+04 6.3e+02 5.2e+02  0  0  4  2 31   0  0  4  2 31     0
PCGAMGPOpt_AGG         3 1.0 1.6766e+00 1.0 2.44e+08 1.1 6.4e+03 2.3e+03 1.7e+02  2  1  2  3 10   2  1  2  3 10  3325
------------------------------------------------------------------------------------------------------------------------
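[Editor's note: as a quick reading aid, the solve itself (KSPSolve, 1.0096e+02 s) is dominated by the SOR smoother, MatSOR, at 64 %T / 71 %f, while VecScatterEnd shows roughly 11 % of the run waiting on messages. A minimal sketch of the same comparison against the Max-time column, transcribed from the table above; note the %T column is based on per-process averages over total runtime, so these max-over-max ratios come out slightly larger.]

# Quick, approximate breakdown using the Max-time column above. VecScatterEnd
# time is nested inside the Mat* events, so these fractions overlap rather
# than add up.
ksp_solve   = 1.0096e+02   # KSPSolve      (s)
mat_sor     = 7.2820e+01   # MatSOR        (s) - the SOR smoother
scatter_end = 2.8178e+01   # VecScatterEnd (s) - waiting on communication

for name, t in [("MatSOR", mat_sor), ("VecScatterEnd", scatter_end)]:
    print("%-14s %5.1f%% of KSPSolve (max) time" % (name, 100.0 * t / ksp_solve))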
Memory usage is given in bytes:

Object Type          Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

              Matrix    87             87      207310004     0
      Matrix Coarsen     3              3           1884     0
              Vector  1249           1249      277672928     0
      Vector Scatter    22             22          23144     0
           Index Set    60             60          69864     0
       Krylov Solver    10             10         116296     0
      Preconditioner    10             10           9836     0
              Viewer     2              1            728     0
         PetscRandom     3              3           1872     0
========================================================================================================================
Average time to get PetscTime(): 9.53674e-08
Average time for MPI_Barrier(): 0.000283575
Average time for zero size MPI_Send(): 2.18352e-05
#PETSc Option Table entries:
-ksp_monitor
-ksp_type cg
-ksp_view
-log_summary
-mg_levels_ksp_type richardson
-mg_levels_pc_type sor
-options_left
-pc_gamg_agg_nsmooths 1
-pc_gamg_type agg
-pc_type gamg
#End of PETSc Option Table entries
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
Configure run at: Thu Jan 23 19:10:29 2014
Configure options: --download-parmetis --download-superlu_dist --download-mpich --download-hypre --download-metis --download-ml --download-mumps --download-scalapack --download-blacs --download-cmake --download-f-blas-lapack=1 --with-debugging=0
-----------------------------------------
Libraries compiled on Thu Jan 23 19:10:29 2014 on ilfb35.ilsb.tuwien.ac.at
Machine characteristics: Linux-2.6.32-358.2.1.el6.x86_64-x86_64-with-redhat-6.4-Carbon
Using PETSc directory: /usr2/tgross/parFEAP/parFEAP84_mod/petsc-3.4.3
Using PETSc arch: linux-gnu-c
-----------------------------------------
Using C compiler: /usr2/tgross/parFEAP/parFEAP84_mod/petsc-3.4.3/linux-gnu-c/bin/mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -O ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: /usr2/tgross/parFEAP/parFEAP84_mod/petsc-3.4.3/linux-gnu-c/bin/mpif90 -fPIC -Wall -Wno-unused-variable -O ${FOPTFLAGS} ${FFLAGS}
-----------------------------------------
Using include paths: -I/usr2/tgross/parFEAP/parFEAP84_mod/petsc-3.4.3/linux-gnu-c/include -I/usr2/tgross/parFEAP/parFEAP84_mod/petsc-3.4.3/include -I/usr2/tgross/parFEAP/parFEAP84_mod/petsc-3.4.3/include -I/usr2/tgross/parFEAP/parFEAP84_mod/petsc-3.4.3/linux-gnu-c/include
-----------------------------------------
Using C linker: /usr2/tgross/parFEAP/parFEAP84_mod/petsc-3.4.3/linux-gnu-c/bin/mpicc
Using Fortran linker: /usr2/tgross/parFEAP/parFEAP84_mod/petsc-3.4.3/linux-gnu-c/bin/mpif90
Using libraries: -Wl,-rpath,/usr2/tgross/parFEAP/parFEAP84_mod/petsc-3.4.3/linux-gnu-c/lib -L/usr2/tgross/parFEAP/parFEAP84_mod/petsc-3.4.3/linux-gnu-c/lib -lpetsc -Wl,-rpath,/usr2/tgross/parFEAP/parFEAP84_mod/petsc-3.4.3/linux-gnu-c/lib -L/usr2/tgross/parFEAP/parFEAP84_mod/petsc-3.4.3/linux-gnu-c/lib -lHYPRE -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.4.7 -L/usr/lib/gcc/x86_64-redhat-linux/4.4.7 -lmpichcxx -lstdc++ -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lscalapack -lml -lmpichcxx -lstdc++ -lsuperlu_dist_3.3 -lflapack -lfblas -lX11 -lparmetis -lmetis -lpthread -lmpichf90 -lgfortran -lm -lm -lmpichcxx -lstdc++ -lmpichcxx -lstdc++ -ldl -lmpich -lopa -lmpl -lrt -lpthread -lgcc_s -ldl
-----------------------------------------
#PETSc Option Table entries:
-ksp_monitor
-ksp_type cg
-ksp_view
-log_summary
-mg_levels_ksp_type richardson
-mg_levels_pc_type sor
-options_left
-pc_gamg_agg_nsmooths 1
-pc_gamg_type agg
-pc_type gamg
#End of PETSc Option Table entries
There are no unused options.