    F I N I T E   E L E M E N T   A N A L Y S I S   P R O G R A M

          FEAP (C) Regents of the University of California
                       All Rights Reserved.
                    VERSION: Release 8.4.1d
                       DATE: 01 January 2014

         Files are set as:   Status  Filename

           Input   (read ) : Exists  Icube05_0001
           Output  (write) : Exists  Ocube05_0001
           Restart (read ) : New     Rcube05_0001
           Restart (write) : New     Rcube05_0001
           Plots   (write) : New     Pcube05_0001

         Caution, existing write files will be overwritten.

         Are filenames correct? ( y or n; r = redefine all, s = stop) :

    R U N N I N G    F E A P    P R O B L E M    N O W

     --> Please report errors by e-mail to:
         feap@ce.berkeley.edu

  0 KSP Residual norm 1.118758556745e+01
  1 KSP Residual norm 2.890013144667e+00
  2 KSP Residual norm 1.165893810678e+00
  3 KSP Residual norm 6.591722377279e-01
  4 KSP Residual norm 3.843380043884e-01
  5 KSP Residual norm 2.634932807275e-01
  6 KSP Residual norm 2.004072742000e-01
  7 KSP Residual norm 1.540003966027e-01
  8 KSP Residual norm 1.169955654840e-01
  9 KSP Residual norm 9.341195155319e-02
 10 KSP Residual norm 7.311582684346e-02
 11 KSP Residual norm 5.324110527447e-02
 12 KSP Residual norm 4.618988336018e-02
 13 KSP Residual norm 4.134962507264e-02
 14 KSP Residual norm 3.415475636888e-02
 15 KSP Residual norm 2.800687896196e-02
 16 KSP Residual norm 2.422072063135e-02
 17 KSP Residual norm 2.132008019287e-02
 18 KSP Residual norm 1.691827130183e-02
 19 KSP Residual norm 1.422892570478e-02
 20 KSP Residual norm 1.297205219180e-02
 21 KSP Residual norm 1.186674911083e-02
 22 KSP Residual norm 1.035748314096e-02
 23 KSP Residual norm 9.466003050405e-03
 24 KSP Residual norm 8.279104481272e-03
 25 KSP Residual norm 6.840586846477e-03
 26 KSP Residual norm 5.914440593892e-03
 27 KSP Residual norm 5.497079506339e-03
 28 KSP Residual norm 5.030747321327e-03
 29 KSP Residual norm 4.261921474672e-03
 30 KSP Residual norm 3.646073925818e-03
 31 KSP Residual norm 3.263395418182e-03
 32 KSP Residual norm 2.967994808803e-03
 33 KSP Residual norm 2.691607538673e-03
 34 KSP Residual norm 2.355222004284e-03
 35 KSP Residual norm 1.988554686821e-03
 36 KSP Residual norm 1.782108602812e-03
 37 KSP Residual norm 1.675236493393e-03
 38 KSP Residual norm 1.535243242444e-03
 39 KSP Residual norm 1.348672213615e-03
 40 KSP Residual norm 1.185260405955e-03
 41 KSP Residual norm 1.064442581294e-03
 42 KSP Residual norm 9.474732022638e-04
 43 KSP Residual norm 8.213309655935e-04
 44 KSP Residual norm 7.322251902205e-04
 45 KSP Residual norm 7.040290076518e-04
 46 KSP Residual norm 6.625374001658e-04
 47 KSP Residual norm 5.894146495507e-04
 48 KSP Residual norm 5.192421851912e-04
 49 KSP Residual norm 4.697625292620e-04
 50 KSP Residual norm 4.432061435960e-04
 51 KSP Residual norm 4.128206299508e-04
 52 KSP Residual norm 3.759271762205e-04
 53 KSP Residual norm 3.610973796933e-04
 54 KSP Residual norm 3.528401432544e-04
 55 KSP Residual norm 3.357613400859e-04
 56 KSP Residual norm 3.136031253747e-04
 57 KSP Residual norm 3.161711793318e-04
 58 KSP Residual norm 3.157118985560e-04
 59 KSP Residual norm 3.153095251024e-04
 60 KSP Residual norm 3.209456518937e-04
 61 KSP Residual norm 3.116018219654e-04
 62 KSP Residual norm 2.785131808914e-04
 63 KSP Residual norm 2.462891463924e-04
 64 KSP Residual norm 2.346673119665e-04
 65 KSP Residual norm 2.219723902101e-04
 66 KSP Residual norm 2.108755827486e-04
 67 KSP Residual norm 1.947682398724e-04
 68 KSP Residual norm 1.844531185384e-04
 69 KSP Residual norm 1.735069704669e-04
 70 KSP Residual norm 1.594502629068e-04
 71 KSP Residual norm 1.504175756568e-04
 72 KSP Residual norm 1.492390028777e-04
 73 KSP Residual norm 1.449435033185e-04
 74 KSP Residual norm 1.355039103665e-04
 75 KSP Residual norm 1.308110244322e-04
 76 KSP Residual norm 1.321039041753e-04
 77 KSP Residual norm 1.314657526282e-04
 78 KSP Residual norm 1.342013441168e-04
 79 KSP Residual norm 1.398114214911e-04
 80 KSP Residual norm 1.366009413844e-04
 81 KSP Residual norm 1.289536176862e-04
 82 KSP Residual norm 1.233309182374e-04
 83 KSP Residual norm 1.247074177170e-04
 84 KSP Residual norm 1.335714492321e-04
 85 KSP Residual norm 1.375602973910e-04
 86 KSP Residual norm 1.312403286889e-04
 87 KSP Residual norm 1.257720544456e-04
 88 KSP Residual norm 1.265519318415e-04
 89 KSP Residual norm 1.275789976103e-04
 90 KSP Residual norm 1.250722218445e-04
 91 KSP Residual norm 1.265661505790e-04
 92 KSP Residual norm 1.174366698468e-04
 93 KSP Residual norm 1.193522160832e-04
 94 KSP Residual norm 1.313378759220e-04
 95 KSP Residual norm 1.245845790105e-04
 96 KSP Residual norm 1.124211545864e-04
 97 KSP Residual norm 1.056414486256e-04
 98 KSP Residual norm 1.057092838596e-04
 99 KSP Residual norm 1.139361144899e-04
100 KSP Residual norm 1.083524982080e-04
101 KSP Residual norm 9.876707059929e-05
102 KSP Residual norm 1.124034736376e-04
103 KSP Residual norm 1.170610540744e-04
104 KSP Residual norm 1.004910534758e-04
105 KSP Residual norm 8.338462276242e-05
106 KSP Residual norm 7.713758661914e-05
107 KSP Residual norm 7.972887220129e-05
108 KSP Residual norm 7.953829602650e-05
109 KSP Residual norm 7.226842935944e-05
110 KSP Residual norm 6.340962981368e-05
111 KSP Residual norm 5.829426821849e-05
112 KSP Residual norm 5.723479103588e-05
113 KSP Residual norm 5.554636608256e-05
114 KSP Residual norm 5.498507881979e-05
115 KSP Residual norm 5.582246554979e-05
116 KSP Residual norm 5.490285070625e-05
117 KSP Residual norm 5.274604971748e-05
118 KSP Residual norm 5.046492648841e-05
119 KSP Residual norm 5.113732385413e-05
120 KSP Residual norm 5.266810432932e-05
121 KSP Residual norm 5.180238694953e-05
122 KSP Residual norm 5.152629867492e-05
123 KSP Residual norm 5.258427668515e-05
124 KSP Residual norm 5.399186383041e-05
125 KSP Residual norm 5.457970738419e-05
126 KSP Residual norm 5.410992190714e-05
127 KSP Residual norm 5.368860870829e-05
128 KSP Residual norm 5.330688041805e-05
129 KSP Residual norm 5.381172952660e-05
130 KSP Residual norm 5.999537754939e-05
131 KSP Residual norm 7.009924873335e-05
132 KSP Residual norm 6.917697923045e-05
133 KSP Residual norm 5.841405440009e-05
134 KSP Residual norm 5.515688772427e-05
135 KSP Residual norm 5.669869159974e-05
136 KSP Residual norm 5.526226759551e-05
137 KSP Residual norm 5.500292437154e-05
138 KSP Residual norm 5.489523449946e-05
139 KSP Residual norm 5.504139633629e-05
140 KSP Residual norm 5.467773509969e-05
141 KSP Residual norm 5.275159328359e-05
142 KSP Residual norm 5.118371583965e-05
143 KSP Residual norm 4.752251708403e-05
144 KSP Residual norm 4.419302168534e-05
145 KSP Residual norm 4.118399699122e-05
146 KSP Residual norm 3.790496643296e-05
147 KSP Residual norm 3.663602742388e-05
148 KSP Residual norm 3.350100186569e-05
149 KSP Residual norm 3.181968799614e-05
150 KSP Residual norm 3.341888857493e-05
151 KSP Residual norm 3.124177350146e-05
152 KSP Residual norm 2.601726197776e-05
153 KSP Residual norm 2.180361284337e-05
154 KSP Residual norm 1.968235005153e-05
155 KSP Residual norm 1.921586554347e-05
156 KSP Residual norm 1.996482298952e-05
157 KSP Residual norm 1.945014913649e-05
158 KSP Residual norm 1.630512384480e-05
159 KSP Residual norm 1.303478803048e-05
160 KSP Residual norm 1.118516026084e-05
161 KSP Residual norm 9.885931666358e-06
162 KSP Residual norm 8.847176060525e-06
163 KSP Residual norm 8.111019636974e-06
164 KSP Residual norm 7.557388177046e-06
165 KSP Residual norm 7.140795539691e-06
166 KSP Residual norm 6.798486437235e-06
167 KSP Residual norm 6.449982351677e-06
168 KSP Residual norm 5.679425296475e-06
169 KSP Residual norm 4.779325734425e-06
170 KSP Residual norm 4.262898688540e-06
171 KSP Residual norm 4.096822892025e-06
172 KSP Residual norm 3.903270478160e-06
173 KSP Residual norm 3.656322682505e-06
174 KSP Residual norm 3.503077425893e-06
175 KSP Residual norm 3.193262731011e-06
176 KSP Residual norm 2.776632598243e-06
177 KSP Residual norm 2.388433933132e-06
178 KSP Residual norm 2.116359040153e-06
179 KSP Residual norm 1.901230090081e-06
180 KSP Residual norm 1.674902684500e-06
181 KSP Residual norm 1.598865799224e-06
182 KSP Residual norm 1.616591544106e-06
183 KSP Residual norm 1.511269769347e-06
184 KSP Residual norm 1.292425024949e-06
185 KSP Residual norm 1.042110427794e-06
186 KSP Residual norm 8.511407499240e-07
187 KSP Residual norm 7.712909398171e-07
188 KSP Residual norm 7.782612177755e-07
189 KSP Residual norm 7.733304399710e-07
190 KSP Residual norm 6.744084011210e-07
191 KSP Residual norm 5.470958384221e-07
192 KSP Residual norm 4.851341582544e-07
193 KSP Residual norm 4.563706681800e-07
194 KSP Residual norm 4.099402187321e-07
195 KSP Residual norm 3.736408390283e-07
196 KSP Residual norm 3.544855009857e-07
197 KSP Residual norm 3.160865467592e-07
198 KSP Residual norm 2.812729337608e-07
199 KSP Residual norm 2.752339608142e-07
200 KSP Residual norm 2.446851090547e-07
201 KSP Residual norm 1.981112095145e-07
202 KSP Residual norm 1.797649041557e-07
203 KSP Residual norm 1.692441314349e-07
204 KSP Residual norm 1.608012037858e-07
205 KSP Residual norm 1.688177063721e-07
206 KSP Residual norm 1.695692007928e-07
207 KSP Residual norm 1.334879502710e-07
208 KSP Residual norm 1.095395675400e-07
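[The trace above is produced by the -ksp_monitor option. The stopping point is consistent with the tolerances reported in the view below: the preconditioned residual has dropped by 1.0954e-07 / 1.1188e+01 = 9.8e-09, just under the relative tolerance of 1e-08. For reference, a minimal sketch of how the same per-iteration output could be produced with a user-installed monitor instead of -ksp_monitor (PETSc 3.4-era C API; the function name and the registration point are illustrative, not part of this run):

  #include <petscksp.h>

  /* Hypothetical user monitor: prints one line per CG iteration in the
     same "it KSP Residual norm r" format that -ksp_monitor uses. */
  static PetscErrorCode ResidualMonitor(KSP ksp, PetscInt it, PetscReal rnorm, void *ctx)
  {
    PetscPrintf(PETSC_COMM_WORLD, "%3D KSP Residual norm %14.12e\n", it, (double)rnorm);
    return 0;
  }

  /* Registration, somewhere after KSPCreate(): */
  ierr = KSPMonitorSet(ksp, ResidualMonitor, NULL, NULL);CHKERRQ(ierr);
]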
KSP Object: 24 MPI processes
  type: cg
  maximum iterations=10000, initial guess is zero
  tolerances: relative=1e-08, absolute=1e-16, divergence=1e+16
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: 24 MPI processes
  type: gamg
    MG: type is MULTIPLICATIVE, levels=4 cycles=v
      Cycles per PCApply=1
      Using Galerkin computed coarse grid matrices
  Coarse grid solver -- level -------------------------------
    KSP Object: (mg_coarse_) 24 MPI processes
      type: preonly
      maximum iterations=1, initial guess is zero
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_coarse_) 24 MPI processes
      type: bjacobi
        block Jacobi: number of blocks = 24
        Local solve is same for all blocks, in the following KSP and PC objects:
      KSP Object: (mg_coarse_sub_) 1 MPI processes
        type: preonly
        maximum iterations=1, initial guess is zero
        tolerances: relative=1e-05, absolute=1e-50, divergence=10000
        left preconditioning
        using NONE norm type for convergence test
      PC Object: (mg_coarse_sub_) 1 MPI processes
        type: lu
          LU: out-of-place factorization
          tolerance for zero pivot 2.22045e-14
          using diagonal shift on blocks to prevent zero pivot [INBLOCKS]
          matrix ordering: nd
          factor fill ratio given 5, needed 2.04016
            Factored matrix follows:
              Matrix Object: 1 MPI processes
                type: seqaij
                rows=450, cols=450, bs=6
                package used to perform factorization: petsc
                total: nonzeros=111564, allocated nonzeros=111564
                total number of mallocs used during MatSetValues calls =0
                  using I-node routines: found 140 nodes, limit used is 5
        linear system matrix = precond matrix:
        Matrix Object: 1 MPI processes
          type: seqaij
          rows=450, cols=450, bs=6
          total: nonzeros=54684, allocated nonzeros=54684
          total number of mallocs used during MatSetValues calls =0
            using I-node routines: found 150 nodes, limit used is 5
      linear system matrix = precond matrix:
      Matrix Object: 24 MPI processes
        type: mpiaij
        rows=450, cols=450, bs=6
        total: nonzeros=54684, allocated nonzeros=54684
        total number of mallocs used during MatSetValues calls =0
          using I-node (on process 0) routines: found 150 nodes, limit used is 5
  Down solver (pre-smoother) on level 1 -------------------------------
    KSP Object: (mg_levels_1_) 24 MPI processes
      type: richardson
        Richardson: damping factor=1
      maximum iterations=1
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000
      left preconditioning
      using nonzero initial guess
      using NONE norm type for convergence test
    PC Object: (mg_levels_1_) 24 MPI processes
      type: sor
        SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1
      linear system matrix = precond matrix:
      Matrix Object: 24 MPI processes
        type: mpiaij
        rows=4602, cols=4602, bs=6
        total: nonzeros=291924, allocated nonzeros=291924
        total number of mallocs used during MatSetValues calls =0
          using I-node (on process 0) routines: found 86 nodes, limit used is 5
  Up solver (post-smoother) same as down solver (pre-smoother)
  Down solver (pre-smoother) on level 2 -------------------------------
    KSP Object: (mg_levels_2_) 24 MPI processes
      type: richardson
        Richardson: damping factor=1
      maximum iterations=1
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000
      left preconditioning
      using nonzero initial guess
      using NONE norm type for convergence test
    PC Object: (mg_levels_2_) 24 MPI processes
      type: sor
        SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1
      linear system matrix = precond matrix:
      Matrix Object: 24 MPI processes
        type: mpiaij
        rows=145944, cols=145944, bs=6
        total: nonzeros=19588968, allocated nonzeros=19588968
        total number of mallocs used during MatSetValues calls =0
          using I-node (on process 0) routines: found 2215 nodes, limit used is 5
  Up solver (post-smoother) same as down solver (pre-smoother)
  Down solver (pre-smoother) on level 3 -------------------------------
    KSP Object: (mg_levels_3_) 24 MPI processes
      type: richardson
        Richardson: damping factor=1
      maximum iterations=1
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000
      left preconditioning
      using nonzero initial guess
      using NONE norm type for convergence test
    PC Object: (mg_levels_3_) 24 MPI processes
      type: sor
        SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1
      linear system matrix = precond matrix:
      Matrix Object: 24 MPI processes
        type: mpiaij
        rows=2080983, cols=2080983, bs=3
        total: nonzeros=132617061, allocated nonzeros=132617061
        total number of mallocs used during MatSetValues calls =0
          using I-node (on process 0) routines: found 28852 nodes, limit used is 5
  Up solver (post-smoother) same as down solver (pre-smoother)
  linear system matrix = precond matrix:
  Matrix Object: 24 MPI processes
    type: mpiaij
    rows=2080983, cols=2080983, bs=3
    total: nonzeros=132617061, allocated nonzeros=132617061
    total number of mallocs used during MatSetValues calls =0
      using I-node (on process 0) routines: found 28852 nodes, limit used is 5
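[The view shows a textbook smoothed-aggregation setup: CG preconditioned by a 4-level GAMG V-cycle with one local symmetric SOR sweep per level as smoother and a block-Jacobi/LU coarse solve. The Galerkin hierarchy coarsens 2,080,983 -> 145,944 -> 4,602 -> 450 unknowns, i.e. reduction factors of roughly 14x, 32x, and 10x per level. A minimal sketch of the equivalent programmatic setup (PETSc 3.4-era signatures; A, b, x are assumed to be an already assembled MPIAIJ system, which is not shown in this log):

  #include <petscksp.h>

  /* Hedged sketch: drive the CG + GAMG solve reported by -ksp_view above.
     Assumes A, b, x are assembled elsewhere; error handling abbreviated. */
  static PetscErrorCode solve_cg_gamg(Mat A, Vec b, Vec x)
  {
    KSP            ksp;
    PC             pc;
    PetscErrorCode ierr;

    PetscFunctionBegin;
    ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);CHKERRQ(ierr); /* 3.4 signature */
    ierr = KSPSetType(ksp, KSPCG);CHKERRQ(ierr);
    /* rtol, abstol, dtol, maxits exactly as reported in the view above */
    ierr = KSPSetTolerances(ksp, 1e-8, 1e-16, 1e+16, 10000);CHKERRQ(ierr);
    ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
    ierr = PCSetType(pc, PCGAMG);CHKERRQ(ierr);
    /* -pc_gamg_type agg, -pc_gamg_agg_nsmooths 1 and the -mg_levels_*
       smoother options are picked up from the options database here */
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
    ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }
]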
************************************************************************************************************************
***             WIDEN YOUR WINDOW TO 120 CHARACTERS.  Use 'enscript -r -fCourier9' to print this document            ***
************************************************************************************************************************

---------------------------------------------- PETSc Performance Summary: ----------------------------------------------

/usr2/tgross/parFEAP/compile_test/parFEAP84_mod/FEAP84/ver84/parfeap/feap on a linux-gnu-c named sandy.ilsb.tuwien.ac.at with 24 processors, by tgross Thu Jan 30 10:15:29 2014
Using Petsc Release Version 3.4.3, Oct, 15, 2013

                         Max       Max/Min        Avg      Total
Time (sec):           5.937e+01      1.00007   5.937e+01
Objects:              1.077e+03      1.00186   1.075e+03
Flops:                1.629e+10      1.13433   1.541e+10  3.699e+11
Flops/sec:            2.744e+08      1.13428   2.596e+08  6.231e+09
MPI Messages:         2.635e+04      4.87123   1.454e+04  3.489e+05
MPI Message Lengths:  2.838e+07      4.35483   1.240e+03  4.326e+08
MPI Reductions:       1.828e+03      1.00110

Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
                            e.g., VecAXPY() for real vectors of length N --> 2N flops
                            and VecAXPY() for complex vectors of length N --> 8N flops

Summary of Stages:   ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                        Avg     %Total     Avg     %Total   counts   %Total     Avg         %Total   counts   %Total
 0:      Main Stage: 5.9366e+01 100.0%  3.6989e+11 100.0%  3.489e+05 100.0%  1.240e+03      100.0%  1.825e+03  99.8%

------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
   Count: number of times phase was executed
   Time and Flops: Max - maximum over all processors
                   Ratio - ratio of maximum to minimum over all processors
   Mess: number of messages sent
   Avg. len: average message length (bytes)
   Reduct: number of global reductions
   Global: entire computation
   Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
      %T - percent time in this phase         %f - percent flops in this phase
      %M - percent messages in this phase     %L - percent message lengths in this phase
      %R - percent reductions in this phase
   Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors)
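[The legend's pointer to PetscLogStagePush()/PetscLogStagePop() is worth following up: everything in the table below lands in the single "Main Stage", so assembly and solve time cannot be separated. A minimal sketch of registering an extra stage (stage name is illustrative; error checking omitted):

  PetscLogStage assembly_stage;                    /* hypothetical stage handle */
  PetscLogStageRegister("FEAP assembly", &assembly_stage);
  PetscLogStagePush(assembly_stage);
  /* ... element integration and MatSetValues()/VecSetValues() calls ... */
  PetscLogStagePop();                              /* returns to the enclosing stage */

With this, -log_summary would report a separate "1: FEAP assembly" block, making it easier to see how much of the 59 s run is solver time versus FEAP overhead.]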
------------------------------------------------------------------------------------------------------------------------
Event                Count      Time (sec)     Flops                             --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %f %M %L %R  %T %f %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage

MatMult              865 1.0 1.6095e+01    1.3 5.36e+09 1.1 1.1e+05 1.8e+03 0.0e+00 23 33 33 47  0  23 33 33 47  0  7520
MatMultAdd           627 1.0 2.5194e+00    1.4 8.32e+08 1.1 5.9e+04 3.8e+02 0.0e+00  4  5 17  5  0   4  5 17  5  0  7642
MatMultTranspose     627 1.0 5.2505e+00    2.4 8.32e+08 1.1 5.9e+04 3.8e+02 0.0e+00  6  5 17  5  0   6  5 17  5  0  3667
MatSolve             209 0.0 3.4584e-02    0.0 4.65e+07 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0  1346
MatSOR              1254 1.0 2.7833e+01    1.3 8.38e+09 1.1 8.4e+04 1.4e+03 0.0e+00 40 51 24 27  0  40 51 24 27  0  6829
MatLUFactorSym         1 1.0 3.1981e-03  462.6 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatLUFactorNum         1 1.0 9.8441e-03 3440.8 1.77e+07 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0  1796
MatScale               9 1.0 4.1231e-02    1.5 5.34e+06 1.1 4.0e+02 4.0e+02 0.0e+00  0  0  0  0  0   0  0  0  0  0  2976
MatAssemblyBegin      50 1.0 9.7262e-01   21.0 0.00e+00 0.0 7.4e+02 2.7e+03 5.4e+01  1  0  0  0  3   1  0  0  0  3     0
MatAssemblyEnd        50 1.0 1.5998e-01    1.1 0.00e+00 0.0 5.3e+03 1.4e+02 1.7e+02  0  0  2  0  9   0  0  2  0  9     0
MatGetRow         344796 1.0 6.6909e-02    1.7 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetRowIJ            1 0.0 5.3883e-05    0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetOrdering         1 0.0 1.6499e-04    0.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.7e-01  0  0  0  0  0   0  0  0  0  0     0
MatCoarsen             3 1.0 8.4591e-03    1.2 0.00e+00 0.0 2.5e+03 6.0e+02 6.3e+01  0  0  1  0  3   0  0  1  0  3     0
MatZeroEntries         1 1.0 6.3216e-02    2.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatView                7 1.4 7.2742e-04    1.6 0.00e+00 0.0 0.0e+00 0.0e+00 5.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatAXPY                3 1.0 2.0059e-02    1.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatMatMult             3 1.0 6.3202e-01    1.0 7.97e+07 1.1 2.4e+03 3.7e+03 7.2e+01  1  0  1  2  4   1  0  1  2  4  2874
MatMatMultSym          3 1.0 4.1000e-01    1.0 0.00e+00 0.0 2.0e+03 2.7e+03 6.6e+01  1  0  1  1  4   1  0  1  1  4     0
MatMatMultNum          3 1.0 2.2282e-01    1.0 7.97e+07 1.1 4.0e+02 8.5e+03 6.0e+00  0  0  0  1  0   0  0  0  1  0  8152
MatPtAP                3 1.0 2.6009e+00    1.0 5.06e+08 1.2 4.0e+03 7.8e+03 7.5e+01  4  3  1  7  4   4  3  1  7  4  4332
MatPtAPSymbolic        3 1.0 1.5594e+00    1.0 0.00e+00 0.0 2.4e+03 9.5e+03 4.5e+01  3  0  1  5  2   3  0  1  5  2     0
MatPtAPNumeric         3 1.0 1.0416e+00    1.0 5.06e+08 1.2 1.6e+03 5.1e+03 3.0e+01  2  3  0  2  2   2  3  0  2  2 10818
MatTrnMatMult          3 1.0 5.7367e-01    1.0 3.21e+07 1.2 2.5e+03 4.1e+03 8.7e+01  1  0  1  2  5   1  0  1  2  5  1233
MatGetLocalMat        15 1.0 9.0400e-02    1.3 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01  0  0  0  0  1   0  0  0  0  1     0
MatGetBrAoCol          9 1.0 2.7844e-02    5.0 0.00e+00 0.0 2.8e+03 9.9e+03 1.2e+01  0  0  1  6  1   0  0  1  6  1     0
MatGetSymTrans         6 1.0 2.3436e-02    2.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecDot                 1 1.0 4.7708e-04    1.1 1.76e+05 1.0 0.0e+00 0.0e+00 1.0e+00  0  0  0  0  0   0  0  0  0  0  8724
VecMDot               30 1.0 1.3670e-01   15.6 1.04e+07 1.0 0.0e+00 0.0e+00 3.0e+01  0  0  0  0  2   0  0  0  0  2  1796
VecTDot              416 1.0 2.1278e+00    3.6 7.32e+07 1.0 0.0e+00 0.0e+00 4.2e+02  2  0  0  0 23   2  0  0  0 23   814
VecNorm              242 1.0 4.5301e+00    4.6 3.88e+07 1.0 0.0e+00 0.0e+00 2.4e+02  5  0  0  0 13   5  0  0  0 13   203
VecScale             660 1.0 3.5834e-03    2.2 2.01e+06 1.6 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0 10985
VecCopy                5 1.0 3.0966e-03    2.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecSet              2561 1.0 3.9828e-02    2.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecAXPY              419 1.0 2.4792e-01    1.9 7.34e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0  7002
VecAYPX              834 1.0 1.4833e-01    2.0 5.62e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0  8953
VecMAXPY              33 1.0 2.5291e-02    3.0 1.23e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0 11471
VecAssemblyBegin     125 1.0 3.1623e-02    7.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.7e+02  0  0  0  0 20   0  0  0  0 20     0
VecAssemblyEnd       125 1.0 8.6784e-05    2.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecPointwiseMult      33 1.0 5.7311e-03    3.0 1.04e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0  4283
VecScatterBegin     2873 1.0 1.0596e-01    3.2 0.00e+00 0.0 3.4e+05 1.1e+03 0.0e+00  0  0 96 88  0   0  0 96 88  0     0
VecScatterEnd       2873 1.0 9.3338e+00    3.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 10  0  0  0  0  10  0  0  0  0     0
VecSetRandom           3 1.0 2.1038e-03    1.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecNormalize          33 1.0 2.4078e-02    5.3 3.12e+06 1.0 0.0e+00 0.0e+00 3.3e+01  0  0  0  0  2   0  0  0  0  2  3058
KSPGMRESOrthog        30 1.0 1.5212e-01    5.4 2.08e+07 1.0 0.0e+00 0.0e+00 3.0e+01  0  0  0  0  2   0  0  0  0  2  3227
KSPSetUp              10 1.0 8.2645e-03    1.5 0.00e+00 0.0 0.0e+00 0.0e+00 8.0e+00  0  0  0  0  0   0  0  0  0  0     0
KSPSolve               1 1.0 5.2659e+01    1.0 1.63e+10 1.1 3.5e+05 1.2e+03 1.8e+03 89 100 100 100 98  89 100 100 100 98  7024
PCSetUp                2 1.0 5.4096e+00    1.0 7.84e+08 1.2 3.6e+04 1.9e+03 1.2e+03  9  5 10 16 64   9  5 10 16 64  3253
PCSetUpOnBlocks      209 1.0 1.3349e-02  107.9 1.77e+07 0.0 0.0e+00 0.0e+00 5.1e+00  0  0  0  0  0   0  0  0  0  0  1324
PCApply              209 1.0 3.9139e+01    1.1 1.29e+10 1.1 2.9e+05 9.9e+02 5.1e+00 63 79 82 65  0  63 79 82 65  0  7470
PCGAMGgraph_AGG        3 1.0 9.1326e-01    1.0 1.36e+06 1.1 2.0e+03 1.6e+02 1.1e+02  2  0  1  0  6   2  0  1  0  6    33
PCGAMGcoarse_AGG       3 1.0 5.9106e-01    1.0 3.21e+07 1.2 7.3e+03 1.9e+03 2.1e+02  1  0  2  3 11   1  0  2  3 12  1196
PCGAMGProl_AGG         3 1.0 1.0461e-01    1.0 0.00e+00 0.0 1.6e+04 6.3e+02 5.2e+02  0  0  5  2 28   0  0  5  2 28     0
PCGAMGPOpt_AGG         3 1.0 1.1951e+00    1.0 2.44e+08 1.1 6.4e+03 2.3e+03 1.7e+02  2  2  2  3  9   2  2  2  3  9  4665
------------------------------------------------------------------------------------------------------------------------
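[Reading the table: the smoother and the operator dominate. MatSOR (2.78e+01 s, 40% of time, 51% of flops) and MatMult (1.61e+01 s, 23%, 33%) together account for roughly 63% of wall time and 84% of the flops, essentially all inside KSPSolve (89% of the 59.4 s run, about 7.0 GFlop/s aggregate). The max/min time ratios of 3.2 to 4.6 on VecScatterEnd, VecTDot and VecNorm likely indicate load imbalance and synchronization cost in the halo exchanges and dot products rather than a problem in the local kernels.]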
Memory usage is given in bytes:

Object Type          Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

              Matrix    87             87    207310700        0
      Matrix Coarsen     3              3         1908        0
              Vector   880            880    185081632        0
      Vector Scatter    22             22        23672        0
           Index Set    60             60        70344        0
       Krylov Solver    10             10       116376        0
      Preconditioner    10             10         9916        0
              Viewer     2              1          736        0
         PetscRandom     3              3         1896        0

========================================================================================================================
Average time to get PetscTime(): 0
Average time for MPI_Barrier(): 1.12057e-05
Average time for zero size MPI_Send(): 1.00334e-06
#PETSc Option Table entries:
-ksp_monitor
-ksp_type cg
-ksp_view
-log_summary
-mg_levels_ksp_max_it 1
-mg_levels_ksp_type richardson
-mg_levels_pc_type sor
-options_left
-pc_gamg_agg_nsmooths 1
-pc_gamg_type agg
-pc_type gamg
#End of PETSc Option Table entries
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2  sizeof(int) 4  sizeof(long) 8  sizeof(void*) 8  sizeof(PetscScalar) 8  sizeof(PetscInt) 4
Configure run at: Tue Jan 28 19:33:43 2014
Configure options: --with-mpi-dir=/usr/local/openmpi/1.5.4/gcc/x86_64 --download-parmetis --download-superlu_dist --download-hypre --download-metis --download-cmake --download-spooles --download-f-blas-lapack=1 --with-debugging=0 --with-shared-libraries=0 COPTFLAGS=-O3 FOPTFLAGS=-O3
-----------------------------------------
Libraries compiled on Tue Jan 28 19:33:43 2014 on ilfb35.ilsb.tuwien.ac.at
Machine characteristics: Linux-2.6.32-358.2.1.el6.x86_64-x86_64-with-redhat-6.4-Carbon
Using PETSc directory: /usr2/tgross/parFEAP/compile_test/parFEAP84_mod/petsc-3.4.3
Using PETSc arch: linux-gnu-c
-----------------------------------------
Using C compiler: /usr/local/openmpi/1.5.4/gcc/x86_64/bin/mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -O3 ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: /usr/local/openmpi/1.5.4/gcc/x86_64/bin/mpif90 -Wall -Wno-unused-variable -O3 ${FOPTFLAGS} ${FFLAGS}
-----------------------------------------
Using include paths: -I/usr2/tgross/parFEAP/compile_test/parFEAP84_mod/petsc-3.4.3/linux-gnu-c/include -I/usr2/tgross/parFEAP/compile_test/parFEAP84_mod/petsc-3.4.3/include -I/usr2/tgross/parFEAP/compile_test/parFEAP84_mod/petsc-3.4.3/include -I/usr2/tgross/parFEAP/compile_test/parFEAP84_mod/petsc-3.4.3/linux-gnu-c/include -I/usr/local/openmpi/1.5.4/gcc/x86_64/include -I/usr/local/include
-----------------------------------------
Using C linker: /usr/local/openmpi/1.5.4/gcc/x86_64/bin/mpicc
Using Fortran linker: /usr/local/openmpi/1.5.4/gcc/x86_64/bin/mpif90
Using libraries: -Wl,-rpath,/usr2/tgross/parFEAP/compile_test/parFEAP84_mod/petsc-3.4.3/linux-gnu-c/lib -L/usr2/tgross/parFEAP/compile_test/parFEAP84_mod/petsc-3.4.3/linux-gnu-c/lib -lpetsc -Wl,-rpath,/usr2/tgross/parFEAP/compile_test/parFEAP84_mod/petsc-3.4.3/linux-gnu-c/lib -L/usr2/tgross/parFEAP/compile_test/parFEAP84_mod/petsc-3.4.3/linux-gnu-c/lib -lHYPRE -L/usr/local/lib64 -L/usr/local/lib64/openmpi -L/usr/local/openmpi/1.5.4/gcc/x86_64/lib64 -L/usr/lib/gcc/x86_64-redhat-linux/4.4.7 -lmpi_cxx -lstdc++ -lsuperlu_dist_3.3 -lflapack -lfblas -lX11 -lparmetis -lmetis -lpthread -lmpi_f90 -lmpi_f77 -lgfortran -lm -lm -lm -lm -lmpi_cxx -lstdc++ -lmpi_cxx -lstdc++ -ldl -lmpi -lnsl -lutil -lgcc_s -lpthread -ldl
-----------------------------------------
#PETSc Option Table entries:
-ksp_monitor
-ksp_type cg
-ksp_view
-log_summary
-mg_levels_ksp_max_it 1
-mg_levels_ksp_type richardson
-mg_levels_pc_type sor
-options_left
-pc_gamg_agg_nsmooths 1
-pc_gamg_type agg
-pc_type gamg
#End of PETSc Option Table entries
There are no unused options.
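[For reference, a run with this configuration could presumably be reproduced by passing the recorded options to the 24-process job, e.g. (the mpirun launcher and working directory are assumptions; the option list is verbatim from the table above):

  mpirun -np 24 ./feap -ksp_type cg -pc_type gamg -pc_gamg_type agg -pc_gamg_agg_nsmooths 1 \
         -mg_levels_ksp_type richardson -mg_levels_ksp_max_it 1 -mg_levels_pc_type sor \
         -ksp_monitor -ksp_view -log_summary -options_left
]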