[petsc-users] Obtaining compiling and building information from a.out
TAY wee-beng
zonexo at gmail.com
Tue Mar 27 21:17:45 CDT 2018
Hi Dave,
I looked at the output using -log_view and re-compiled. However, although
I used the same options "-xHost -g -O3 -openmp" (some file and directory
names are now different, but they are effectively the same), I still get
different timings. I have attached both logs, fast and slow. So what
else can I do to pinpoint the differences?
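One way I could narrow this down is to compare the per-event "Time (sec) Max" column of the two -log_view event tables and rank the events by their slow/fast ratio. A minimal sketch (the parsing and the slowdowns helper are my own, not a PETSc tool):

```python
# Minimal sketch (my own parsing, not part of PETSc): compare the
# "Time (sec) Max" column of the event tables from two -log_view
# logs to see which phases the slow build loses time in.
import re

# An event line looks like:
#   MatSOR  8204 1.0 5.4194e+01 1.0 ...
# i.e. name, count, count-ratio, then max time in seconds
# (PETSc prints times with a two-digit exponent).
_EVENT = re.compile(r"(\w+)\s+\d+\s+[\d.]+\s+(\d\.\d+e[+-]\d{2})")

def event_times(text):
    """Map event name -> max time (s) for every event line in the log text."""
    times = {}
    for line in text.splitlines():
        m = _EVENT.match(line.strip())
        if m:
            times[m.group(1)] = float(m.group(2))
    return times

def slowdowns(fast_log, slow_log):
    """Events present in both logs, sorted by slow/fast time ratio, worst first."""
    fast, slow = event_times(fast_log), event_times(slow_log)
    common = [e for e in fast if e in slow and fast[e] > 0]
    return sorted(((slow[e] / fast[e], e) for e in common), reverse=True)
```

Running slowdowns() on the contents of the two attached logs would show whether the extra time is concentrated in a few events (e.g. MatSOR, MatMult) or spread uniformly, which hints at vectorization/compiler differences versus, say, MPI or machine noise.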
Thank you very much.
Yours sincerely,
================================================
TAY Wee-Beng (Zheng Weiming) 郑伟明
Personal research webpage: http://tayweebeng.wixsite.com/website
Youtube research showcase: https://www.youtube.com/channel/UC72ZHtvQNMpNs2uRTSToiLA
linkedin: www.linkedin.com/in/tay-weebeng
================================================
On 27/3/2018 5:22 PM, Dave May wrote:
>
>
> On 27 March 2018 at 10:16, TAY wee-beng <zonexo at gmail.com> wrote:
>
> Hi,
>
>     I have been compiling and building different versions of my CFD
>     code with the Intel 2016 and 2018 compilers, and also different
>     compile options.
>
>     I tested one version of my a.out and it is much faster than the
>     other, taking only 3 min instead of more than 10 min to solve a
>     certain case using GAMG.
>
>     However, I can't recall how it was compiled. I only know that I
>     used the Intel 2016 compiler.
>
> So is there any way I can find out how the a.out was compiled?
> Like what options were used?
>
>
> Since you posted to the list I presume "a.out" links against PETSc...
> If so, run your code with
> -log_view
>
> Upon calling PetscFinalize(), you will get all the options given to
> PETSc configure, plus the CFLAGS, link lines, etc.
>
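
Following Dave's suggestion, the build-related lines can also be pulled out of a saved -log_view log mechanically, which makes diffing two builds easier. A minimal sketch (the helper and the file name are my own, not part of PETSc):

```python
# Minimal sketch (my own helper, not a PETSc utility): extract the
# lines of a saved -log_view log that describe how PETSc was built,
# so two builds can be diffed side by side.

def build_info(path):
    """Return the build-description lines from a -log_view log file."""
    keys = ("Configure options:", "Using C compiler:",
            "Using Fortran compiler:", "Using Petsc Release")
    with open(path) as f:
        return [line.strip() for line in f if line.strip().startswith(keys)]
```

For example, running build_info() on each of the two logs and diffing the results would immediately show any difference in CFLAGS/FFLAGS or configure options between the fast and slow builds.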
-------------- next part --------------
0.400000000000000 0.400000000000000 0.000000000000000E+000
0.000000000000000E+000 0.000000000000000E+000 0.000000000000000E+000
1.00000000000000 0.400000000000000 0 -400000
AB,AA,BB,CC -3.11950005317340 2.72450003441190
2.95400004531257 2.97600004635751
size_x,size_y,size_z 62x140x388
total grid size = 3367840
recommended cores (50k / core) = 67.3568000000000
myid,jsta,jend,ksta,kend 0 1 70 1
97
myid,jsta,jend,ksta,kend 7 71 140 292
388
myid,jsta,jend,ksta,kend 4 1 70 195
291
myid,jsta,jend,ksta,kend 5 71 140 195
291
myid,jsta,jend,ksta,kend 6 1 70 292
388
myid,jsta,jend,ksta,kend 1 71 140 1
97
myid,jsta,jend,ksta,kend 2 1 70 98
194
myid,jsta,jend,ksta,kend 3 71 140 98
194
min_area,max_area,min_grid_area,ratio 2.491427578150726E-003
8.513928284063485E-003 6.250000000000001E-004 13.6222852545016
ratio bet max_area,min_grid_area not ideal
max element length should be 3.535533905932738E-002
body_cg_ini 0.117923968064102 -1.965876114406727E-005
6.83221612299529
Warning - length difference between element and cell
max_element_length,min_element_length,min_delta
0.157615376756988 5.913693179946990E-002 2.500000000000000E-002
maximum ngh_surfaces and ngh_vertics are 2 1
minimum ngh_surfaces and ngh_vertics are 1 1
min IIB_cell_no 66
max IIB_cell_no 316
IIB_cell_no_sum 1647
min equal_size 1800
max equal_size 2000
min I_cell_no 149
max I_cell_no 1955
I_cell_no_sum 7996
size(IIB_cell_u),size(I_cell_u),size(IIB_equal_cell_u),size(I_equal_cell_u),size(IIB_global_cell_u),size(I_global_cell_u)
316 1955 316 1955 1647 7996
IIB_equal_cell_no_u1_max 316
I_equal_cell_no_u1_max 1955
IIB_I_cell_no_uvw_total1 1647 0 0 7996
0 0
size(IIB_cell_u),IIB_cell_no_max_cur 316 824
local IIB_cells size exceed, to increase size
size(I_cell_u),I_cell_no_max_cur 1955 4004
local I_cells size exceed, to increase size
IIB_cell_no_u1_max,I_cell_no_u1_max 1236 6006
size(IIB_cell_u),IIB_cell_no_max_cur 316 317
IIB_cells size exceed, to increase size
IIB_equal_cell_no_u1_max 475
size(I_cell_u),I_cell_no_max_cur 1955 1959
I_cells size exceed, to increase size
I_equal_cell_no_u1_max 2938
size(IIB_global_cell_u1),IIB_global_cell_no_u1_max_cur 1647 1649
IIB global cells size exceed, to increase size
size(I_global_cell_u1),I_global_cell_no_u1_max_cur 7996 8001
I global cells size exceed, to increase size
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1236 6006
time,IIB_I_cell_no_uvw_total1 21 1649 0 0
8001 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
size(I_cell_u),I_cell_no_max_cur 6006 6210
local I_cells size exceed, to increase size
IIB_cell_no_u1_max,I_cell_no_u1_max 1236 9315
time,IIB_I_cell_no_uvw_total1 22 1643 0 0
7988 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
size(IIB_cell_u),IIB_cell_no_max_cur 1236 1319
local IIB_cells size exceed, to increase size
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 23 1649 0 0
8003 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 24 1645 0 0
7984 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 25 1646 0 0
7997 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 26 1649 0 0
8001 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 27 1646 0 0
7987 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 28 1645 0 0
7984 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 29 1644 0 0
7991 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 30 1648 0 0
7992 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 31 1648 0 0
8001 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 32 1648 0 0
7998 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 33 1647 0 0
7997 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 34 1648 0 0
8003 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 35 1647 0 0
7990 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 36 1642 0 0
7990 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 37 1646 0 0
7982 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 38 1646 0 0
7991 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 39 1648 0 0
8000 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 40 1646 0 0
7998 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 41 1645 0 0
7987 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 42 1650 0 0
8008 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 43 1644 0 0
7990 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 44 1647 0 0
7998 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 45 1647 0 0
7996 0 0
Recommended no for final_IIB_equal_no1,final_IIB_no1,final_IIB_global_no1,final_I_equal_no1,final_I_no1,final_I_global_no1
593 2472 3091 3672 11643 15001
0.400000000000000 0.400000000000000 0.000000000000000E+000
0.000000000000000E+000 0.000000000000000E+000 0.000000000000000E+000
1.00000000000000 0.400000000000000 0 -400000
AB,AA,BB,CC -3.11950005317340 2.72450003441190
2.95400004531257 2.97600004635751
size_x,size_y,size_z 62x140x388
total grid size = 3367840
recommended cores (50k / core) = 67.3568000000000
myid,jsta,jend,ksta,kend 0 1 70 1
97
myid,jsta,jend,ksta,kend 1 71 140 1
97
myid,jsta,jend,ksta,kend 2 1 70 98
194
myid,jsta,jend,ksta,kend 3 71 140 98
194
myid,jsta,jend,ksta,kend 4 1 70 195
291
myid,jsta,jend,ksta,kend 5 71 140 195
291
myid,jsta,jend,ksta,kend 6 1 70 292
388
myid,jsta,jend,ksta,kend 7 71 140 292
388
min_area,max_area,min_grid_area,ratio 2.491427578150726E-003
8.513928284063485E-003 6.250000000000001E-004 13.6222852545016
ratio bet max_area,min_grid_area not ideal
max element length should be 3.535533905932738E-002
body_cg_ini 0.117923968064102 -1.965876114406727E-005
6.83221612299529
Warning - length difference between element and cell
max_element_length,min_element_length,min_delta
0.157615376756988 5.913693179946990E-002 2.500000000000000E-002
maximum ngh_surfaces and ngh_vertics are 2 1
minimum ngh_surfaces and ngh_vertics are 1 1
size(IIB_cell_u),size(I_cell_u),size(IIB_equal_cell_u),size(I_equal_cell_u),size(IIB_global_cell_u),size(I_global_cell_u)
2472 11643 593 3672 3091 15001
IIB_I_cell_no_uvw_total1 1647 1699 1668 7996
8318 7986
1 0.01500000 0.55521996 0.50167194 1.49132917 -0.68647491E+03 -0.20372981E+00 0.33615663E+07
2 0.01500000 0.64585142 0.60619409 1.40141482 -0.11012069E+04 -0.16753715E+01 0.33575645E+07
3 0.01500000 0.73593671 0.72189474 1.44197488 -0.13166126E+04 -0.36947098E+01 0.33552444E+07
4 0.01500000 0.76027431 0.77393780 1.46636155 -0.14542153E+04 -0.47010091E+01 0.33537201E+07
5 0.01500000 0.75915757 0.76204613 1.47797514 -0.15453247E+04 -0.60296962E+01 0.33526627E+07
6 0.01500000 0.74742340 0.75211120 1.46445276 -0.16119725E+04 -0.70224586E+01 0.33518669E+07
7 0.01500000 0.72469489 0.73253807 1.45611639 -0.16628567E+04 -0.75633417E+01 0.33512386E+07
8 0.01500000 0.70620928 0.70750345 1.44491710 -0.17038804E+04 -0.81789866E+01 0.33507166E+07
9 0.01500000 0.68737201 0.68840472 1.43060449 -0.17383828E+04 -0.86491557E+01 0.33502642E+07
10 0.01500000 0.67066958 0.67201885 1.41666375 -0.17682751E+04 -0.90679314E+01 0.33498601E+07
11 0.01500000 0.65356651 0.65614140 1.40694393 -0.17948878E+04 -0.94375640E+01 0.33494895E+07
12 0.01500000 0.63784263 0.64006177 1.39565411 -0.18190659E+04 -0.98032901E+01 0.33491430E+07
13 0.01500000 0.62354446 0.62707936 1.38463801 -0.18413690E+04 -0.10146395E+02 0.33488144E+07
14 0.01500000 0.60511977 0.60854016 1.37434517 -0.18621890E+04 -0.10476429E+02 0.33484996E+07
15 0.01500000 0.59483432 0.59700016 1.36652361 -0.18818065E+04 -0.10798095E+02 0.33481953E+07
16 0.01500000 0.58195301 0.58443368 1.35889657 -0.19004238E+04 -0.11113175E+02 0.33478997E+07
17 0.01500000 0.56770020 0.57081075 1.35100984 -0.19181939E+04 -0.11422851E+02 0.33476109E+07
18 0.01500000 0.55880217 0.56091360 1.34350797 -0.19352312E+04 -0.11726948E+02 0.33473280E+07
19 0.01500000 0.54887411 0.55141669 1.33653850 -0.19516281E+04 -0.12027016E+02 0.33470500E+07
20 0.01500000 0.53716931 0.53993030 1.33090692 -0.19674569E+04 -0.12323269E+02 0.33467762E+07
cd_cl_cs_mom_implicit1
0.128886132854489 5.139386380557665E-004 0.344932288820398
-2.214355720498197E-004 -3.982915522187946E-002 8.883606198822986E-005
************************************************************************************************************************
*** WIDEN YOUR WINDOW TO 120 CHARACTERS. Use 'enscript -r -fCourier9' to print this document ***
************************************************************************************************************************
---------------------------------------------- PETSc Performance Summary: ----------------------------------------------
./a.out on a petsc-3.8.3_intel2016_rel named vis01 with 8 processors, by tsltaywb Tue Mar 27 22:28:37 2018
Using Petsc Release Version 3.8.3, Dec, 09, 2017
Max Max/Min Avg Total
Time (sec): 1.347e+02 1.00000 1.347e+02
Objects: 9.030e+02 1.00111 9.021e+02
Flop: 1.177e+11 1.00745 1.173e+11 9.387e+11
Flop/sec: 8.740e+08 1.00745 8.710e+08 6.968e+09
MPI Messages: 5.214e+04 1.69428 4.224e+04 3.379e+05
MPI Message Lengths: 9.316e+08 1.42825 1.874e+04 6.333e+09
MPI Reductions: 2.178e+03 1.00000
Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
e.g., VecAXPY() for real vectors of length N --> 2N flop
and VecAXPY() for complex vectors of length N --> 8N flop
Summary of Stages: ----- Time ------ ----- Flop ----- --- Messages --- -- Message Lengths -- -- Reductions --
Avg %Total Avg %Total counts %Total Avg %Total counts %Total
0: Main Stage: 1.3471e+02 100.0% 9.3868e+11 100.0% 3.379e+05 100.0% 1.874e+04 100.0% 2.177e+03 100.0%
------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
Count: number of times phase was executed
Time and Flop: Max - maximum over all processors
Ratio - ratio of maximum to minimum over all processors
Mess: number of messages sent
Avg. len: average message length (bytes)
Reduct: number of global reductions
Global: entire computation
Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
%T - percent time in this phase %F - percent flop in this phase
%M - percent messages in this phase %L - percent message lengths in this phase
%R - percent reductions in this phase
Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time over all processors)
------------------------------------------------------------------------------------------------------------------------
Event Count Time (sec) Flop --- Global --- --- Stage --- Total
Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------
--- Event Stage 0: Main Stage
BuildTwoSided 4 1.0 2.7359e-0310.9 0.00e+00 0.0 6.4e+01 4.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecDot 76 1.0 2.7495e-01 2.1 1.92e+08 1.0 0.0e+00 0.0e+00 7.6e+01 0 0 0 0 3 0 0 0 0 3 5585
VecDotNorm2 38 1.0 2.1468e-01 3.0 1.92e+08 1.0 0.0e+00 0.0e+00 3.8e+01 0 0 0 0 2 0 0 0 0 2 7154
VecMDot 400 1.0 1.5063e+00 1.4 2.39e+09 1.0 0.0e+00 0.0e+00 4.0e+02 1 2 0 0 18 1 2 0 0 18 12703
VecNorm 485 1.0 2.7930e-01 1.6 4.51e+08 1.0 0.0e+00 0.0e+00 4.8e+02 0 0 0 0 22 0 0 0 0 22 12906
VecScale 428 1.0 6.0537e-02 1.0 1.53e+08 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 20259
VecCopy 1426 1.0 4.1480e-01 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecSet 5357 1.0 5.1100e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecAXPY 28 1.0 1.4019e-02 1.0 1.87e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 10664
VecAYPX 10880 1.0 1.8162e+00 1.1 1.57e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 6920
VecAXPBYCZ 5516 1.0 1.5153e+00 1.0 3.53e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 18616
VecWAXPY 76 1.0 2.2895e-01 1.0 1.92e+08 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 6708
VecMAXPY 428 1.0 1.3232e+00 1.0 2.68e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 2 0 0 0 1 2 0 0 0 16202
VecAssemblyBegin 91 1.0 7.6561e-02 5.5 0.00e+00 0.0 0.0e+00 0.0e+00 2.7e+02 0 0 0 0 12 0 0 0 0 12 0
VecAssemblyEnd 91 1.0 3.3951e-04 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecPointwiseMult 44 1.0 1.1086e-02 1.1 5.09e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 3668
VecScatterBegin 11498 1.0 4.6959e-01 1.3 0.00e+00 0.0 3.3e+05 1.9e+04 0.0e+00 0 0 98 98 0 0 0 98 98 0 0
VecScatterEnd 11498 1.0 4.4420e+00 7.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 2 0 0 0 0 2 0 0 0 0 0
VecSetRandom 4 1.0 1.0390e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecNormalize 428 1.0 2.0914e-01 1.2 4.60e+08 1.0 0.0e+00 0.0e+00 4.3e+02 0 0 0 0 20 0 0 0 0 20 17592
MatMult 8636 1.0 3.5918e+01 1.0 5.13e+10 1.0 2.6e+05 2.2e+04 0.0e+00 26 43 77 90 0 26 43 77 90 0 11366
MatMultAdd 1360 1.0 2.4898e+00 1.0 2.46e+09 1.0 3.5e+04 3.1e+03 0.0e+00 2 2 10 2 0 2 2 10 2 0 7877
MatMultTranspose 1360 1.0 3.1336e+00 1.3 2.46e+09 1.0 3.5e+04 3.1e+03 0.0e+00 2 2 10 2 0 2 2 10 2 0 6258
MatSolve 435 4.6 5.8790e+00 1.0 9.06e+09 1.0 0.0e+00 0.0e+00 0.0e+00 4 8 0 0 0 4 8 0 0 0 12332
MatSOR 8204 1.0 5.4194e+01 1.0 4.00e+10 1.0 0.0e+00 0.0e+00 0.0e+00 40 34 0 0 0 40 34 0 0 0 5883
MatLUFactorSym 1 1.0 2.3842e-05 2.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatLUFactorNum 2 1.0 6.1511e-01 1.0 4.25e+08 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 5510
MatILUFactorSym 1 1.0 2.5613e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatConvert 4 1.0 4.8567e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatScale 13 1.0 9.8135e-02 1.0 7.62e+07 1.0 1.2e+02 1.9e+04 0.0e+00 0 0 0 0 0 0 0 0 0 0 6192
MatResidual 1360 1.0 4.8659e+00 1.0 6.88e+09 1.0 4.1e+04 1.9e+04 0.0e+00 4 6 12 13 0 4 6 12 13 0 11222
MatAssemblyBegin 81 1.0 2.0317e-01 1.8 0.00e+00 0.0 6.0e+02 8.2e+04 5.6e+01 0 0 0 1 3 0 0 0 1 3 0
MatAssemblyEnd 81 1.0 4.8165e-01 1.1 0.00e+00 0.0 1.1e+03 4.7e+03 1.7e+02 0 0 0 0 8 0 0 0 0 8 0
MatGetRow 1386912 1.0 5.0331e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 4 0 0 0 0 4 0 0 0 0 0
MatGetRowIJ 2 2.0 1.0014e-05 5.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatCreateSubMat 2 1.0 4.9806e-04 1.0 0.00e+00 0.0 4.9e+01 8.4e+01 3.6e+01 0 0 0 0 2 0 0 0 0 2 0
MatGetOrdering 2 2.0 1.7617e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatCoarsen 4 1.0 4.1207e-02 1.0 0.00e+00 0.0 9.0e+02 7.1e+03 1.4e+01 0 0 0 0 1 0 0 0 0 1 0
MatZeroEntries 4 1.0 9.6283e-03 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatAXPY 4 1.0 3.4464e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 8.0e+00 3 0 0 0 0 3 0 0 0 0 0
MatMatMult 4 1.0 3.3812e-01 1.0 2.02e+07 1.0 7.6e+02 9.5e+03 6.5e+01 0 0 0 0 3 0 0 0 0 3 475
MatMatMultSym 4 1.0 2.7510e-01 1.0 0.00e+00 0.0 6.4e+02 7.6e+03 5.6e+01 0 0 0 0 3 0 0 0 0 3 0
MatMatMultNum 4 1.0 6.2469e-02 1.0 2.02e+07 1.0 1.2e+02 1.9e+04 8.0e+00 0 0 0 0 0 0 0 0 0 0 2571
MatPtAP 4 1.0 2.3125e+00 1.0 5.63e+08 1.0 1.5e+03 6.8e+04 6.9e+01 2 0 0 2 3 2 0 0 2 3 1913
MatPtAPSymbolic 4 1.0 1.6609e+00 1.0 0.00e+00 0.0 7.7e+02 6.8e+04 2.9e+01 1 0 0 1 1 1 0 0 1 1 0
MatPtAPNumeric 4 1.0 6.5213e-01 1.0 5.63e+08 1.0 7.2e+02 6.8e+04 4.0e+01 0 0 0 1 2 0 0 0 1 2 6783
MatTrnMatMult 1 1.0 7.2597e-01 1.0 4.10e+07 1.0 1.4e+02 6.9e+04 1.9e+01 1 0 0 0 1 1 0 0 0 1 450
MatTrnMatMultSym 1 1.0 4.6429e-01 1.0 0.00e+00 0.0 1.2e+02 3.5e+04 1.7e+01 0 0 0 0 1 0 0 0 0 1 0
MatTrnMatMultNum 1 1.0 2.6166e-01 1.0 4.10e+07 1.0 2.0e+01 2.8e+05 2.0e+00 0 0 0 0 0 0 0 0 0 0 1249
MatGetLocalMat 14 1.0 7.4680e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatGetBrAoCol 12 1.0 1.6888e-02 1.2 0.00e+00 0.0 8.5e+02 5.6e+04 0.0e+00 0 0 0 1 0 0 0 0 1 0 0
SFSetGraph 4 1.0 4.2915e-06 2.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
SFBcastBegin 22 1.0 3.9790e-03 1.9 0.00e+00 0.0 9.0e+02 7.1e+03 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
SFBcastEnd 22 1.0 2.1598e-03 3.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
KSPGMRESOrthog 400 1.0 2.6764e+00 1.2 4.78e+09 1.0 0.0e+00 0.0e+00 4.0e+02 2 4 0 0 18 2 4 0 0 18 14298
KSPSetUp 17 1.0 3.4259e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+01 0 0 0 0 1 0 0 0 0 1 0
KSPSolve 39 1.0 1.2240e+02 1.0 1.18e+11 1.0 3.3e+05 1.8e+04 1.5e+03 91100 99 96 69 91100 99 96 69 7666
PCGAMGGraph_AGG 4 1.0 7.0450e+00 1.0 2.02e+07 1.0 3.5e+02 8.4e+03 4.8e+01 5 0 0 0 2 5 0 0 0 2 23
PCGAMGCoarse_AGG 4 1.0 8.7772e-01 1.0 4.10e+07 1.0 1.2e+03 2.0e+04 4.9e+01 1 0 0 0 2 1 0 0 0 2 372
PCGAMGProl_AGG 4 1.0 1.4617e-01 1.0 0.00e+00 0.0 5.8e+02 8.1e+03 9.6e+01 0 0 0 0 4 0 0 0 0 4 0
PCGAMGPOpt_AGG 4 1.0 4.0676e+00 1.0 3.57e+08 1.0 2.0e+03 1.5e+04 1.9e+02 3 0 1 0 9 3 0 1 0 9 699
GAMG: createProl 4 1.0 1.2139e+01 1.0 4.18e+08 1.0 4.1e+03 1.5e+04 3.8e+02 9 0 1 1 18 9 0 1 1 18 275
Graph 8 1.0 7.0439e+00 1.0 2.02e+07 1.0 3.5e+02 8.4e+03 4.8e+01 5 0 0 0 2 5 0 0 0 2 23
MIS/Agg 4 1.0 4.1251e-02 1.0 0.00e+00 0.0 9.0e+02 7.1e+03 1.4e+01 0 0 0 0 1 0 0 0 0 1 0
SA: col data 4 1.0 2.4046e-02 1.1 0.00e+00 0.0 2.6e+02 1.6e+04 4.0e+01 0 0 0 0 2 0 0 0 0 2 0
SA: frmProl0 4 1.0 1.1588e-01 1.0 0.00e+00 0.0 3.2e+02 1.8e+03 4.0e+01 0 0 0 0 2 0 0 0 0 2 0
SA: smooth 4 1.0 3.8085e+00 1.0 2.75e+07 1.0 7.6e+02 9.5e+03 8.1e+01 3 0 0 0 4 3 0 0 0 4 57
GAMG: partLevel 4 1.0 2.3136e+00 1.0 5.63e+08 1.0 1.6e+03 6.5e+04 1.2e+02 2 0 0 2 6 2 0 0 2 6 1912
repartition 1 1.0 1.0395e-04 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 6.0e+00 0 0 0 0 0 0 0 0 0 0 0
Invert-Sort 1 1.0 1.0014e-04 2.0 0.00e+00 0.0 0.0e+00 0.0e+00 4.0e+00 0 0 0 0 0 0 0 0 0 0 0
Move A 1 1.0 3.2687e-04 1.0 0.00e+00 0.0 3.5e+01 1.1e+02 1.9e+01 0 0 0 0 1 0 0 0 0 1 0
Move P 1 1.0 3.0112e-04 1.0 0.00e+00 0.0 1.4e+01 2.6e+01 1.9e+01 0 0 0 0 1 0 0 0 0 1 0
PCSetUp 4 1.0 1.5343e+01 1.0 1.41e+09 1.0 5.7e+03 2.9e+04 5.3e+02 11 1 2 3 24 11 1 2 3 24 726
PCSetUpOnBlocks 359 1.0 8.8877e-01 1.0 4.25e+08 1.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 3813
PCApply 435 1.0 9.6567e+01 1.0 9.92e+10 1.0 3.2e+05 1.6e+04 8.4e+01 72 84 95 79 4 72 84 95 79 4 8190
------------------------------------------------------------------------------------------------------------------------
Memory usage is given in bytes:
Object Type Creations Destructions Memory Descendants' Mem.
Reports information only for process 0.
--- Event Stage 0: Main Stage
Vector 359 359 423296120 0.
Vector Scatter 67 67 27572128 0.
Matrix 118 118 2032943620 0.
Matrix Coarsen 4 4 2512 0.
Distributed Mesh 35 35 181440 0.
Index Set 137 137 33646184 0.
IS L to G Mapping 35 35 14511960 0.
Star Forest Graph 74 74 63344 0.
Discrete System 35 35 31500 0.
Krylov Solver 17 17 270136 0.
Preconditioner 13 13 13484 0.
PetscRandom 8 8 5104 0.
Viewer 1 0 0 0.
========================================================================================================================
Average time to get PetscTime(): 1.09673e-06
Average time for MPI_Barrier(): 1.81198e-06
Average time for zero size MPI_Send(): 7.7486e-06
#PETSc Option Table entries:
-log_view
-poisson_pc_gamg_agg_nsmooths 1
-poisson_pc_type gamg
#End of PETSc Option Table entries
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
Configure options: --with-mpi-dir=/app/intel/compilers_and_libraries_2016.1.150/linux/mpi/intel64 --with-blaslapack-dir=/app/intel/compilers_and_libraries_2016.1.150/linux/mkl/lib/intel64 --download-hypre=/home/users/nus/tsltaywb/source/git.hypre.tar.gz --with-debugging=0 --prefix=/home/users/nus/tsltaywb/lib/petsc-3.8.3_intel2016_rel --with-shared-libraries=0 --known-mpi-shared-libraries=0 --with-fortran-interfaces=1 --CFLAGS="-xHost -g -O3 -openmp" --CXXFLAGS="-xHost -g -O3 -openmp" --FFLAGS="-xHost -g -O3 -openmp"
-----------------------------------------
Libraries compiled on Sat Feb 24 21:14:23 2018 on nus03
Machine characteristics: Linux-2.6.32-696.18.7.el6.x86_64-x86_64-with-redhat-6.9-Santiago
Using PETSc directory: /home/users/nus/tsltaywb/source/petsc-3.8.3
Using PETSc arch: petsc-3.8.3_intel2016_rel
-----------------------------------------
Using C compiler: /app/intel/compilers_and_libraries_2016.1.150/linux/mpi/intel64/bin/mpicc -xHost -g -O3 -openmp ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: /app/intel/compilers_and_libraries_2016.1.150/linux/mpi/intel64/bin/mpif90 -xHost -g -O3 -openmp ${FOPTFLAGS} ${FFLAGS}
-----------------------------------------
Using include paths: -I/home/users/nus/tsltaywb/source/petsc-3.8.3/petsc-3.8.3_intel2016_rel/include -I/home/users/nus/tsltaywb/source/petsc-3.8.3/include -I/home/users/nus/tsltaywb/source/petsc-3.8.3/include -I/home/users/nus/tsltaywb/source/petsc-3.8.3/petsc-3.8.3_intel2016_rel/include -I/home/users/nus/tsltaywb/lib/petsc-3.8.3_intel2016_rel/include -I/app/intel/compilers_and_libraries_2016.1.150/linux/mpi/intel64/include
-----------------------------------------
Using C linker: /app/intel/compilers_and_libraries_2016.1.150/linux/mpi/intel64/bin/mpicc
Using Fortran linker: /app/intel/compilers_and_libraries_2016.1.150/linux/mpi/intel64/bin/mpif90
Using libraries: -Wl,-rpath,/home/users/nus/tsltaywb/source/petsc-3.8.3/petsc-3.8.3_intel2016_rel/lib -L/home/users/nus/tsltaywb/source/petsc-3.8.3/petsc-3.8.3_intel2016_rel/lib -lpetsc -Wl,-rpath,/home/users/nus/tsltaywb/lib/petsc-3.8.3_intel2016_rel/lib -L/home/users/nus/tsltaywb/lib/petsc-3.8.3_intel2016_rel/lib -Wl,-rpath,/app/intel/compilers_and_libraries_2016.1.150/linux/mkl/lib/intel64 -L/app/intel/compilers_and_libraries_2016.1.150/linux/mkl/lib/intel64 -L/app/intel/compilers_and_libraries_2016.1.150/linux/mpi/intel64/lib/debug_mt -L/app/intel/compilers_and_libraries_2016.1.150/linux/mpi/intel64/lib -L/app/intel/compilers_and_libraries_2016.1.150/linux/ipp/lib/intel64 -L/app/intel/compilers_and_libraries_2016.1.150/linux/compiler/lib/intel64 -L/app/intel/compilers_and_libraries_2016.1.150/linux/tbb/lib/intel64/gcc4.4 -L/app/intel/compilers_and_libraries_2016.1.150/linux/daal/lib/intel64_lin -L/app/intel/compilers_and_libraries_2016.1.150/linux/tbb/lib/intel64_lin/gcc4.4 -L/app/intel/compilers_and_libraries_2016.1.150/linux/compiler/lib/intel64_lin -L/usr/lib/gcc/x86_64-redhat-linux/4.4.7 -Wl,-rpath,/app/intel/compilers_and_libraries_2016.1.150/linux/mpi/intel64/lib/debug_mt -Wl,-rpath,/app/intel/compilers_and_libraries_2016.1.150/linux/mpi/intel64/lib -Wl,-rpath,/opt/intel/mpi-rt/5.1/intel64/lib/debug_mt -Wl,-rpath,/opt/intel/mpi-rt/5.1/intel64/lib -Wl,-rpath,/opt/intel/mpi-rt/5.0/intel64/lib/debug_mt -Wl,-rpath,/opt/intel/mpi-rt/5.0/intel64/lib -lHYPRE -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lX11 -lifport -lifcoremt -lmpicxx -ldl -lmpifort -lmpi -lmpigi -lrt -lpthread -limf -lsvml -lirng -lm -lipgo -ldecimal -liomp5 -lcilkrts -lstdc++ -lgcc_s -lirc -lirc_s -ldl
-----------------------------------------
======================================================================================
Resource Usage on 2018-03-27 22:28:37.108982:
JobId: 6700582.wlm01
Project: 11000314
Exit Status: 0
NCPUs Requested: 8 NCPUs Used: 8
CPU Time Used: 00:17:57
Memory Requested: 32gb Memory Used: 19402196kb
Vmem Used: 21906792kb
Walltime requested: 00:10:00 Walltime Used: 00:02:27
Execution Nodes Used: (vis01:ncpus=8:mem=33554432kb)
======================================================================================
-------------- next part --------------
0.400000000000000 0.400000000000000 0.000000000000000E+000
0.000000000000000E+000 0.000000000000000E+000 0.000000000000000E+000
1.00000000000000 0.400000000000000 0 -400000
AB,AA,BB,CC -3.11950005317340 2.72450003441190
2.95400004531257 2.97600004635751
size_x,size_y,size_z 62x140x388
total grid size = 3367840
recommended cores (50k / core) = 67.3568000000000
myid,jsta,jend,ksta,kend 0 1 70 1
97
myid,jsta,jend,ksta,kend 1 71 140 1
97
myid,jsta,jend,ksta,kend 2 1 70 98
194
myid,jsta,jend,ksta,kend 3 71 140 98
194
myid,jsta,jend,ksta,kend 4 1 70 195
291
myid,jsta,jend,ksta,kend 5 71 140 195
291
myid,jsta,jend,ksta,kend 6 1 70 292
388
myid,jsta,jend,ksta,kend 7 71 140 292
388
min_area,max_area,min_grid_area,ratio 2.491427578150726E-003
8.513928284063485E-003 6.250000000000001E-004 13.6222852545016
ratio bet max_area,min_grid_area not ideal
max element length should be 3.535533905932738E-002
body_cg_ini 0.117923968064102 -1.965876114406727E-005
6.83221612299529
Warning - length difference between element and cell
max_element_length,min_element_length,min_delta
0.157615376756988 5.913693179946990E-002 2.500000000000000E-002
maximum ngh_surfaces and ngh_vertics are 2 1
minimum ngh_surfaces and ngh_vertics are 1 1
min IIB_cell_no 66
max IIB_cell_no 316
IIB_cell_no_sum 1647
min equal_size 1800
max equal_size 2000
min I_cell_no 149
max I_cell_no 1955
I_cell_no_sum 7996
size(IIB_cell_u),size(I_cell_u),size(IIB_equal_cell_u),size(I_equal_cell_u),size(IIB_global_cell_u),size(I_global_cell_u)
316 1955 316 1955 1647 7996
IIB_equal_cell_no_u1_max 316
I_equal_cell_no_u1_max 1955
IIB_I_cell_no_uvw_total1 1647 0 0 7996
0 0
size(IIB_cell_u),IIB_cell_no_max_cur 316 824
local IIB_cells size exceed, to increase size
size(I_cell_u),I_cell_no_max_cur 1955 4004
local I_cells size exceed, to increase size
IIB_cell_no_u1_max,I_cell_no_u1_max 1236 6006
size(IIB_cell_u),IIB_cell_no_max_cur 316 317
IIB_cells size exceed, to increase size
IIB_equal_cell_no_u1_max 475
size(I_cell_u),I_cell_no_max_cur 1955 1959
I_cells size exceed, to increase size
I_equal_cell_no_u1_max 2938
size(IIB_global_cell_u1),IIB_global_cell_no_u1_max_cur 1647 1649
IIB global cells size exceed, to increase size
size(I_global_cell_u1),I_global_cell_no_u1_max_cur 7996 8001
I global cells size exceed, to increase size
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1236 6006
time,IIB_I_cell_no_uvw_total1 21 1649 0 0
8001 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
size(I_cell_u),I_cell_no_max_cur 6006 6210
local I_cells size exceed, to increase size
IIB_cell_no_u1_max,I_cell_no_u1_max 1236 9315
time,IIB_I_cell_no_uvw_total1 22 1643 0 0
7988 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
size(IIB_cell_u),IIB_cell_no_max_cur 1236 1319
local IIB_cells size exceed, to increase size
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 23 1649 0 0
8003 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
time,IIB_I_cell_no_uvw_total1 24 1645 0 0
7984 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
 time,IIB_I_cell_no_uvw_total1 25 1646 0 0 7997 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
 time,IIB_I_cell_no_uvw_total1 26 1649 0 0 8001 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
 time,IIB_I_cell_no_uvw_total1 27 1646 0 0 7987 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
 time,IIB_I_cell_no_uvw_total1 28 1645 0 0 7984 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
 time,IIB_I_cell_no_uvw_total1 29 1644 0 0 7991 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
 time,IIB_I_cell_no_uvw_total1 30 1648 0 0 7992 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
 time,IIB_I_cell_no_uvw_total1 31 1648 0 0 8001 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
 time,IIB_I_cell_no_uvw_total1 32 1648 0 0 7998 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
 time,IIB_I_cell_no_uvw_total1 33 1647 0 0 7997 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
 time,IIB_I_cell_no_uvw_total1 34 1648 0 0 8003 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
 time,IIB_I_cell_no_uvw_total1 35 1647 0 0 7990 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
 time,IIB_I_cell_no_uvw_total1 36 1642 0 0 7990 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
 time,IIB_I_cell_no_uvw_total1 37 1646 0 0 7982 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
 time,IIB_I_cell_no_uvw_total1 38 1646 0 0 7991 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
 time,IIB_I_cell_no_uvw_total1 39 1648 0 0 8000 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
 time,IIB_I_cell_no_uvw_total1 40 1646 0 0 7998 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
 time,IIB_I_cell_no_uvw_total1 41 1645 0 0 7987 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
 time,IIB_I_cell_no_uvw_total1 42 1650 0 0 8008 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
 time,IIB_I_cell_no_uvw_total1 43 1644 0 0 7990 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
 time,IIB_I_cell_no_uvw_total1 44 1647 0 0 7998 0 0
IIB_equal_cell_no_u1_max 475
I_equal_cell_no_u1_max 2938
IIB_global_cell_no_u1_max,I_global_cell_no_u1_max 2473 12001
IIB_cell_no_u1_max,I_cell_no_u1_max 1978 9315
 time,IIB_I_cell_no_uvw_total1 45 1647 0 0 7996 0 0
 Recommended no for final_IIB_equal_no1,final_IIB_no1,final_IIB_global_no1,final_I_equal_no1,final_I_no1,final_I_global_no1
 593 2472 3091 3672 11643 15001
0.400000000000000 0.400000000000000 0.000000000000000E+000
0.000000000000000E+000 0.000000000000000E+000 0.000000000000000E+000
1.00000000000000 0.400000000000000 0 -400000
AB,AA,BB,CC -3.11950005317340 2.72450003441190
2.95400004531257 2.97600004635751
size_x,size_y,size_z 62x140x388
total grid size = 3367840
recommended cores (50k / core) = 67.3568000000000
 myid,jsta,jend,ksta,kend 0 1 70 1 97
 myid,jsta,jend,ksta,kend 1 71 140 1 97
 myid,jsta,jend,ksta,kend 2 1 70 98 194
 myid,jsta,jend,ksta,kend 3 71 140 98 194
 myid,jsta,jend,ksta,kend 4 1 70 195 291
 myid,jsta,jend,ksta,kend 5 71 140 195 291
 myid,jsta,jend,ksta,kend 6 1 70 292 388
 myid,jsta,jend,ksta,kend 7 71 140 292 388
 min_area,max_area,min_grid_area,ratio 2.491427578150726E-003 8.513928284063485E-003 6.250000000000001E-004 13.6222852545016
ratio bet max_area,min_grid_area not ideal
max element length should be 3.535533905932738E-002
 body_cg_ini 0.117923968064102 -1.965876114406727E-005 6.83221612299529
Warning - length difference between element and cell
 max_element_length,min_element_length,min_delta 0.157615376756988 5.913693179946990E-002 2.500000000000000E-002
maximum ngh_surfaces and ngh_vertics are 2 1
minimum ngh_surfaces and ngh_vertics are 1 1
size(IIB_cell_u),size(I_cell_u),size(IIB_equal_cell_u),size(I_equal_cell_u),size(IIB_global_cell_u),size(I_global_cell_u)
2472 11643 593 3672 3091 15001
 IIB_I_cell_no_uvw_total1 1647 1699 1668 7996 8318 7986
1 0.01500000 0.55521952 0.50166514 1.49131893 -0.68670439E+03 -0.20604374E+00 0.33615662E+07
2 0.01500000 0.64585026 0.60619564 1.40140471 -0.11009478E+04 -0.10911172E+01 0.33575644E+07
3 0.01500000 0.73593463 0.72189795 1.44197265 -0.13161091E+04 -0.28042672E+01 0.33552438E+07
4 0.01500000 0.76027325 0.77393896 1.46636140 -0.14536130E+04 -0.36603995E+01 0.33537189E+07
5 0.01500000 0.75915510 0.76204667 1.47797502 -0.15446388E+04 -0.49307434E+01 0.33526611E+07
6 0.01500000 0.74742136 0.75211276 1.46445227 -0.16112041E+04 -0.59483248E+01 0.33518653E+07
7 0.01500000 0.72469437 0.73253845 1.45611669 -0.16620039E+04 -0.65738576E+01 0.33512372E+07
8 0.01500000 0.70620865 0.70750419 1.44491755 -0.17029461E+04 -0.73025142E+01 0.33507155E+07
9 0.01500000 0.68737227 0.68840507 1.43060525 -0.17373751E+04 -0.78836160E+01 0.33502633E+07
10 0.01500000 0.67066988 0.67201926 1.41666490 -0.17672055E+04 -0.83903321E+01 0.33498595E+07
11 0.01500000 0.65356708 0.65614218 1.40694522 -0.17937703E+04 -0.88125441E+01 0.33494891E+07
12 0.01500000 0.63784284 0.64006217 1.39565551 -0.18179146E+04 -0.91924434E+01 0.33491426E+07
13 0.01500000 0.62354467 0.62707978 1.38463971 -0.18401961E+04 -0.95159393E+01 0.33488141E+07
14 0.01500000 0.60512073 0.60854059 1.37434643 -0.18610036E+04 -0.98021312E+01 0.33484991E+07
15 0.01500000 0.59483523 0.59700107 1.36652434 -0.18806142E+04 -0.10067043E+02 0.33481948E+07
16 0.01500000 0.58195385 0.58443459 1.35889875 -0.18992269E+04 -0.10322942E+02 0.33478989E+07
17 0.01500000 0.56770101 0.57081164 1.35101132 -0.19169918E+04 -0.10578851E+02 0.33476101E+07
18 0.01500000 0.55880304 0.56091451 1.34350841 -0.19340220E+04 -0.10838995E+02 0.33473270E+07
19 0.01500000 0.54887498 0.55141759 1.33653898 -0.19504092E+04 -0.11106005E+02 0.33470490E+07
20 0.01500000 0.53716976 0.53993119 1.33090820 -0.19662259E+04 -0.11378638E+02 0.33467752E+07
cd_cl_cs_mom_implicit1
0.128886335282393 5.139605941217318E-004 0.344932516978139
-2.214409645676409E-004 -3.982912633936039E-002 8.883873705382712E-005
************************************************************************************************************************
*** WIDEN YOUR WINDOW TO 120 CHARACTERS. Use 'enscript -r -fCourier9' to print this document ***
************************************************************************************************************************
---------------------------------------------- PETSc Performance Summary: ----------------------------------------------
./a.out on a petsc-3.8.3_intel2016_O3_rel named vis01 with 8 processors, by tsltaywb Wed Mar 28 00:37:22 2018
Using Petsc Release Version 3.8.3, Dec, 09, 2017
Max Max/Min Avg Total
Time (sec): 6.984e+02 1.00000 6.984e+02
Objects: 8.800e+02 1.00114 8.791e+02
Flop: 6.874e+11 1.00870 6.847e+11 5.478e+12
Flop/sec: 9.842e+08 1.00870 9.804e+08 7.843e+09
MPI Messages: 3.604e+05 1.70000 2.916e+05 2.333e+06
MPI Message Lengths: 5.939e+09 1.42866 1.730e+04 4.036e+10
MPI Reductions: 3.933e+03 1.00000
Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
e.g., VecAXPY() for real vectors of length N --> 2N flop
and VecAXPY() for complex vectors of length N --> 8N flop
Summary of Stages: ----- Time ------ ----- Flop ----- --- Messages --- -- Message Lengths -- -- Reductions --
Avg %Total Avg %Total counts %Total Avg %Total counts %Total
0: Main Stage: 6.9842e+02 100.0% 5.4778e+12 100.0% 2.333e+06 100.0% 1.730e+04 100.0% 3.932e+03 100.0%
------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
Count: number of times phase was executed
Time and Flop: Max - maximum over all processors
Ratio - ratio of maximum to minimum over all processors
Mess: number of messages sent
Avg. len: average message length (bytes)
Reduct: number of global reductions
Global: entire computation
Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
%T - percent time in this phase %F - percent flop in this phase
%M - percent messages in this phase %L - percent message lengths in this phase
%R - percent reductions in this phase
Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time over all processors)
------------------------------------------------------------------------------------------------------------------------
Event Count Time (sec) Flop --- Global --- --- Stage --- Total
Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------
--- Event Stage 0: Main Stage
BuildTwoSided          4 1.0 3.0739e-03 13.3 0.00e+00 0.0 6.4e+01 4.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecDot 76 1.0 2.7252e-01 2.1 1.92e+08 1.0 0.0e+00 0.0e+00 7.6e+01 0 0 0 0 2 0 0 0 0 2 5635
VecDotNorm2 38 1.0 2.1477e-01 2.9 1.92e+08 1.0 0.0e+00 0.0e+00 3.8e+01 0 0 0 0 1 0 0 0 0 1 7151
VecMDot 80 1.0 7.3801e-02 1.4 1.02e+08 1.0 0.0e+00 0.0e+00 8.0e+01 0 0 0 0 2 0 0 0 0 2 11020
VecNorm 2606 1.0 2.5490e+00 6.4 2.24e+09 1.0 0.0e+00 0.0e+00 2.6e+03 0 0 0 0 66 0 0 0 0 66 7019
VecScale 88 1.0 4.2443e-03 1.1 1.02e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 19161
VecCopy 7329 1.0 3.1297e-01 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecSet 34131 1.0 6.2618e-01 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecAXPY 8 1.0 1.4107e-03 1.0 1.85e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 10481
VecAYPX 79973 1.0 1.4727e+01 1.1 1.23e+10 1.0 0.0e+00 0.0e+00 0.0e+00 2 2 0 0 0 2 2 0 0 0 6654
VecAXPBYCZ 38852 1.0 9.4528e+00 1.1 2.29e+10 1.0 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 19331
VecWAXPY 76 1.0 2.3002e-01 1.0 1.92e+08 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 6676
VecMAXPY 88 1.0 5.7092e-02 1.0 1.20e+08 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 16835
VecAssemblyBegin 91 1.0 7.6395e-02 5.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.7e+02 0 0 0 0 7 0 0 0 0 7 0
VecAssemblyEnd 91 1.0 3.1996e-04 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecPointwiseMult 44 1.0 1.0860e-02 1.0 5.09e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 3744
VecScatterBegin    80251 1.0 2.4074e+00 1.4 0.00e+00 0.0 2.3e+06 1.7e+04 0.0e+00  0  0 100 100  0   0  0 100 100  0     0
VecScatterEnd 80251 1.0 3.0498e+01 8.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 2 0 0 0 0 2 0 0 0 0 0
VecSetRandom 4 1.0 1.0388e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecNormalize 88 1.0 2.0471e-02 1.7 3.05e+07 1.0 0.0e+00 0.0e+00 8.8e+01 0 0 0 0 2 0 0 0 0 2 11918
MatMult 60741 1.0 2.3056e+02 1.1 3.21e+11 1.0 1.8e+06 2.1e+04 0.0e+00 32 46 78 95 0 32 46 78 95 0 11048
MatMultAdd 9684 1.0 1.7790e+01 1.0 1.75e+10 1.0 2.5e+05 3.1e+03 0.0e+00 3 3 11 2 0 3 3 11 2 0 7849
MatMultTranspose 9684 1.0 2.1507e+01 1.3 1.75e+10 1.0 2.5e+05 3.1e+03 0.0e+00 3 3 11 2 0 3 3 11 2 0 6493
MatSolve            2516 26.5 5.8798e+00 1.0 9.06e+09 1.0 0.0e+00 0.0e+00 0.0e+00  1  1  0  0  0   1  1  0  0  0 12331
MatSOR 58208 1.0 3.8408e+02 1.0 2.84e+11 1.0 0.0e+00 0.0e+00 0.0e+00 54 41 0 0 0 54 41 0 0 0 5897
MatLUFactorSym 1 1.0 2.7895e-05 2.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatLUFactorNum 2 1.0 6.1062e-01 1.0 4.25e+08 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 5550
MatILUFactorSym 1 1.0 2.5299e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatConvert 4 1.0 4.8461e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatScale 13 1.0 9.7709e-02 1.0 7.62e+07 1.0 1.2e+02 1.9e+04 0.0e+00 0 0 0 0 0 0 0 0 0 0 6219
MatResidual 9704 1.0 3.4509e+01 1.0 4.92e+10 1.0 3.0e+05 1.9e+04 0.0e+00 5 7 13 14 0 5 7 13 14 0 11318
MatAssemblyBegin 81 1.0 2.8049e-01 9.1 0.00e+00 0.0 6.0e+02 8.2e+04 5.6e+01 0 0 0 0 1 0 0 0 0 1 0
MatAssemblyEnd 81 1.0 5.1077e-01 1.2 0.00e+00 0.0 1.1e+03 4.7e+03 1.7e+02 0 0 0 0 4 0 0 0 0 4 0
MatGetRow 1386912 1.0 5.0990e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0
MatGetRowIJ 2 2.0 7.1526e-06 3.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatCreateSubMat 2 1.0 5.5003e-04 1.0 0.00e+00 0.0 4.9e+01 8.4e+01 3.6e+01 0 0 0 0 1 0 0 0 0 1 0
MatGetOrdering 2 2.0 1.8149e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatCoarsen 4 1.0 4.0846e-02 1.0 0.00e+00 0.0 9.0e+02 7.1e+03 1.4e+01 0 0 0 0 0 0 0 0 0 0 0
MatZeroEntries 4 1.0 9.5308e-03 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatAXPY 4 1.0 3.4579e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 8.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatMatMult 4 1.0 3.3728e-01 1.0 2.02e+07 1.0 7.6e+02 9.5e+03 6.5e+01 0 0 0 0 2 0 0 0 0 2 476
MatMatMultSym 4 1.0 2.7397e-01 1.0 0.00e+00 0.0 6.4e+02 7.6e+03 5.6e+01 0 0 0 0 1 0 0 0 0 1 0
MatMatMultNum 4 1.0 6.0991e-02 1.0 2.02e+07 1.0 1.2e+02 1.9e+04 8.0e+00 0 0 0 0 0 0 0 0 0 0 2633
MatPtAP 4 1.0 2.2907e+00 1.0 5.63e+08 1.0 1.5e+03 6.8e+04 6.9e+01 0 0 0 0 2 0 0 0 0 2 1931
MatPtAPSymbolic 4 1.0 1.6347e+00 1.0 0.00e+00 0.0 7.7e+02 6.8e+04 2.9e+01 0 0 0 0 1 0 0 0 0 1 0
MatPtAPNumeric 4 1.0 6.5637e-01 1.0 5.63e+08 1.0 7.2e+02 6.8e+04 4.0e+01 0 0 0 0 1 0 0 0 0 1 6739
MatTrnMatMult 1 1.0 7.3789e-01 1.0 4.10e+07 1.0 1.4e+02 6.9e+04 1.9e+01 0 0 0 0 0 0 0 0 0 0 443
MatTrnMatMultSym 1 1.0 4.7426e-01 1.0 0.00e+00 0.0 1.2e+02 3.5e+04 1.7e+01 0 0 0 0 0 0 0 0 0 0 0
MatTrnMatMultNum 1 1.0 2.6366e-01 1.0 4.10e+07 1.0 2.0e+01 2.8e+05 2.0e+00 0 0 0 0 0 0 0 0 0 0 1240
MatGetLocalMat 14 1.0 7.4357e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatGetBrAoCol 12 1.0 1.6210e-02 1.1 0.00e+00 0.0 8.5e+02 5.6e+04 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
SFSetGraph 4 1.0 6.6757e-06 1.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
SFBcastBegin 22 1.0 4.3759e-03 2.0 0.00e+00 0.0 9.0e+02 7.1e+03 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
SFBcastEnd 22 1.0 2.5189e-03 3.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
KSPGMRESOrthog 80 1.0 1.2113e-01 1.2 2.03e+08 1.0 0.0e+00 0.0e+00 8.0e+01 0 0 0 0 2 0 0 0 0 2 13428
KSPSetUp 17 1.0 3.4320e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.4e+01 0 0 0 0 1 0 0 0 0 1 0
KSPSolve              39 1.0 6.8603e+02 1.0 6.87e+11 1.0 2.3e+06 1.7e+04 3.3e+03 98 100 100 99 83  98 100 100 99 83  7984
PCGAMGGraph_AGG 4 1.0 7.0555e+00 1.0 2.02e+07 1.0 3.5e+02 8.4e+03 4.8e+01 1 0 0 0 1 1 0 0 0 1 23
PCGAMGCoarse_AGG 4 1.0 8.9008e-01 1.0 4.10e+07 1.0 1.2e+03 2.0e+04 4.9e+01 0 0 0 0 1 0 0 0 0 1 367
PCGAMGProl_AGG 4 1.0 1.3883e-01 1.0 0.00e+00 0.0 5.8e+02 8.1e+03 9.6e+01 0 0 0 0 2 0 0 0 0 2 0
PCGAMGPOpt_AGG 4 1.0 4.0805e+00 1.0 3.57e+08 1.0 2.0e+03 1.5e+04 1.9e+02 1 0 0 0 5 1 0 0 0 5 697
GAMG: createProl 4 1.0 1.2167e+01 1.0 4.18e+08 1.0 4.1e+03 1.5e+04 3.8e+02 2 0 0 0 10 2 0 0 0 10 274
Graph 8 1.0 7.0544e+00 1.0 2.02e+07 1.0 3.5e+02 8.4e+03 4.8e+01 1 0 0 0 1 1 0 0 0 1 23
MIS/Agg 4 1.0 4.0890e-02 1.0 0.00e+00 0.0 9.0e+02 7.1e+03 1.4e+01 0 0 0 0 0 0 0 0 0 0 0
SA: col data 4 1.0 2.4671e-02 1.1 0.00e+00 0.0 2.6e+02 1.6e+04 4.0e+01 0 0 0 0 1 0 0 0 0 1 0
SA: frmProl0 4 1.0 1.0792e-01 1.0 0.00e+00 0.0 3.2e+02 1.8e+03 4.0e+01 0 0 0 0 1 0 0 0 0 1 0
SA: smooth 4 1.0 3.8192e+00 1.0 2.75e+07 1.0 7.6e+02 9.5e+03 8.1e+01 1 0 0 0 2 1 0 0 0 2 57
GAMG: partLevel 4 1.0 2.2918e+00 1.0 5.63e+08 1.0 1.6e+03 6.5e+04 1.2e+02 0 0 0 0 3 0 0 0 0 3 1930
repartition 1 1.0 1.0419e-04 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 6.0e+00 0 0 0 0 0 0 0 0 0 0 0
Invert-Sort 1 1.0 1.0705e-04 2.3 0.00e+00 0.0 0.0e+00 0.0e+00 4.0e+00 0 0 0 0 0 0 0 0 0 0 0
Move A 1 1.0 3.4499e-04 1.0 0.00e+00 0.0 3.5e+01 1.1e+02 1.9e+01 0 0 0 0 0 0 0 0 0 0 0
Move P 1 1.0 3.2592e-04 1.0 0.00e+00 0.0 1.4e+01 2.6e+01 1.9e+01 0 0 0 0 0 0 0 0 0 0 0
PCSetUp 4 1.0 1.5343e+01 1.0 1.41e+09 1.0 5.7e+03 2.9e+04 5.3e+02 2 0 0 0 13 2 0 0 0 13 726
PCSetUpOnBlocks 2440 1.0 8.8707e-01 1.0 4.25e+08 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 3821
PCApply 60724 1.0 3.9030e+02 1.0 2.93e+11 1.0 0.0e+00 0.0e+00 0.0e+00 55 43 0 0 0 55 43 0 0 0 5988
------------------------------------------------------------------------------------------------------------------------
Memory usage is given in bytes:
Object Type Creations Destructions Memory Descendants' Mem.
Reports information only for process 0.
--- Event Stage 0: Main Stage
Vector 336 336 345797896 0.
Vector Scatter 67 67 27572128 0.
Matrix 118 118 2032943620 0.
Matrix Coarsen 4 4 2512 0.
Distributed Mesh 35 35 181440 0.
Index Set 137 137 33646184 0.
IS L to G Mapping 35 35 14511960 0.
Star Forest Graph 74 74 63344 0.
Discrete System 35 35 31500 0.
Krylov Solver 17 17 252912 0.
Preconditioner 13 13 13484 0.
PetscRandom 8 8 5104 0.
Viewer 1 0 0 0.
========================================================================================================================
Average time to get PetscTime(): 1.40667e-06
Average time for MPI_Barrier(): 1.81198e-06
Average time for zero size MPI_Send(): 7.51019e-06
#PETSc Option Table entries:
-log_view
-poisson_pc_gamg_agg_nsmooths 1
-poisson_pc_type gamg
#End of PETSc Option Table entries
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
Configure options: --with-mpi-dir=/app/intel/compilers_and_libraries_2016.1.150/linux/mpi/intel64 --with-blaslapack-dir=/app/intel/compilers_and_libraries_2016.1.150/linux/mkl/lib/intel64 --download-hypre=/home/users/nus/tsltaywb/source/git.hypre_org.tar.gz --download-ml=/home/users/nus/tsltaywb/source/ml-v6.2-p4.tar.gz --with-debugging=0 --prefix=/home/users/nus/tsltaywb/lib/petsc-3.8.3_intel2016_O3_rel --with-shared-libraries=0 --known-mpi-shared-libraries=0 --with-fortran-interfaces=1 --CFLAGS="-xHost -g -O3 -openmp" --CXXFLAGS="-xHost -g -O3 -openmp" --FFLAGS="-xHost -g -O3 -openmp"
-----------------------------------------
Libraries compiled on Tue Mar 20 04:26:04 2018 on nus01
Machine characteristics: Linux-2.6.32-696.18.7.el6.x86_64-x86_64-with-redhat-6.9-Santiago
Using PETSc directory: /home/users/nus/tsltaywb/source/petsc-3.8.3
Using PETSc arch: petsc-3.8.3_intel2016_O3_rel
-----------------------------------------
Using C compiler: /app/intel/compilers_and_libraries_2016.1.150/linux/mpi/intel64/bin/mpicc -xHost -g -O3 -openmp ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: /app/intel/compilers_and_libraries_2016.1.150/linux/mpi/intel64/bin/mpif90 -xHost -g -O3 -openmp ${FOPTFLAGS} ${FFLAGS}
-----------------------------------------
Using include paths: -I/home/users/nus/tsltaywb/source/petsc-3.8.3/petsc-3.8.3_intel2016_O3_rel/include -I/home/users/nus/tsltaywb/source/petsc-3.8.3/include -I/home/users/nus/tsltaywb/source/petsc-3.8.3/include -I/home/users/nus/tsltaywb/source/petsc-3.8.3/petsc-3.8.3_intel2016_O3_rel/include -I/home/users/nus/tsltaywb/lib/petsc-3.8.3_intel2016_O3_rel/include -I/app/intel/compilers_and_libraries_2016.1.150/linux/mpi/intel64/include
-----------------------------------------
Using C linker: /app/intel/compilers_and_libraries_2016.1.150/linux/mpi/intel64/bin/mpicc
Using Fortran linker: /app/intel/compilers_and_libraries_2016.1.150/linux/mpi/intel64/bin/mpif90
Using libraries: -Wl,-rpath,/home/users/nus/tsltaywb/source/petsc-3.8.3/petsc-3.8.3_intel2016_O3_rel/lib -L/home/users/nus/tsltaywb/source/petsc-3.8.3/petsc-3.8.3_intel2016_O3_rel/lib -lpetsc -Wl,-rpath,/home/users/nus/tsltaywb/lib/petsc-3.8.3_intel2016_O3_rel/lib -L/home/users/nus/tsltaywb/lib/petsc-3.8.3_intel2016_O3_rel/lib -Wl,-rpath,/app/intel/compilers_and_libraries_2016.1.150/linux/mkl/lib/intel64 -L/app/intel/compilers_and_libraries_2016.1.150/linux/mkl/lib/intel64 -L/app/intel/compilers_and_libraries_2016.1.150/linux/mpi/intel64/lib/debug_mt -L/app/intel/compilers_and_libraries_2016.1.150/linux/mpi/intel64/lib -L/app/intel/compilers_and_libraries_2016.1.150/linux/ipp/lib/intel64 -L/app/intel/compilers_and_libraries_2016.1.150/linux/compiler/lib/intel64 -L/app/intel/compilers_and_libraries_2016.1.150/linux/tbb/lib/intel64/gcc4.4 -L/app/intel/compilers_and_libraries_2016.1.150/linux/daal/lib/intel64_lin -L/app/intel/compilers_and_libraries_2016.1.150/linux/tbb/lib/intel64_lin/gcc4.4 -L/app/intel/compilers_and_libraries_2016.1.150/linux/compiler/lib/intel64_lin -L/usr/lib/gcc/x86_64-redhat-linux/4.4.7 -Wl,-rpath,/app/intel/compilers_and_libraries_2016.1.150/linux/mpi/intel64/lib/debug_mt -Wl,-rpath,/app/intel/compilers_and_libraries_2016.1.150/linux/mpi/intel64/lib -Wl,-rpath,/opt/intel/mpi-rt/5.1/intel64/lib/debug_mt -Wl,-rpath,/opt/intel/mpi-rt/5.1/intel64/lib -Wl,-rpath,/opt/intel/mpi-rt/5.0/intel64/lib/debug_mt -Wl,-rpath,/opt/intel/mpi-rt/5.0/intel64/lib -lHYPRE -lml -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lX11 -lifport -lifcoremt -lmpicxx -ldl -lmpifort -lmpi -lmpigi -lrt -lpthread -limf -lsvml -lirng -lm -lipgo -ldecimal -liomp5 -lcilkrts -lstdc++ -lgcc_s -lirc -lirc_s -ldl
-----------------------------------------
======================================================================================
Resource Usage on 2018-03-28 00:37:23.076186:
JobId: 6701296.wlm01
Project: 11000314
Exit Status: 0
NCPUs Requested: 8 NCPUs Used: 8
CPU Time Used: 01:33:02
Memory Requested: 32gb Memory Used: 18812572kb
Vmem Used: 21312508kb
Walltime requested: 00:25:00 Walltime Used: 00:11:51
Execution Nodes Used: (vis01:ncpus=8:mem=33554432kb)
======================================================================================
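Since the open question in this thread is how to pinpoint where the fast and slow builds differ in timing, a small helper like the one below can compare the event tables of two -log_view reports side by side. This is a sketch of my own, not a PETSc tool: the regular expression assumes the plain rows of the default ASCII event table (rows where a wide ratio has fused into a neighbouring column are simply skipped), and the two report fragments at the bottom use invented timings purely for illustration.

```python
import re

# Matches "EventName  Count Ratio TimeMax ..." rows of a PETSc -log_view
# event table and captures the event name and the max time in seconds.
ROW = re.compile(r"^([A-Za-z][\w:/ ]*?)\s+(\d+)\s+[\d.]+\s+([\d.]+e[+-]\d+)")

def event_times(report: str) -> dict:
    """Map event name -> max time (sec) for one -log_view event table."""
    out = {}
    for line in report.splitlines():
        m = ROW.match(line)
        if m:
            out[m.group(1).rstrip()] = float(m.group(3))
    return out

def slowest_diffs(fast: str, slow: str, n: int = 5):
    """Events present in both reports, sorted by (slow - fast) max time,
    largest slowdown first; these are the first events to investigate."""
    tf, ts = event_times(fast), event_times(slow)
    common = set(tf) & set(ts)
    return sorted(((e, ts[e] - tf[e]) for e in common),
                  key=lambda p: -p[1])[:n]

# Tiny illustrative fragments (timings invented for the example; in practice
# read the two saved -log_view outputs from files instead):
fast_report = "MatMult 60741 1.0 2.3056e+02 1.1\nMatSOR 58208 1.0 3.8408e+02 1.0\n"
slow_report = "MatMult 60741 1.0 5.1000e+02 1.1\nMatSOR 58208 1.0 9.9000e+02 1.0\n"
print(slowest_diffs(fast_report, slow_report))
```

Running this on the real fast and slow reports shows whether the extra minutes sit in a few events such as MatSOR or MatMult (pointing at compiler flags or memory bandwidth) or are spread evenly (pointing at the whole build configuration).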