[petsc-users] DMMG KSP Solve time with random initialization
khalid ashraf
khalid_eee at yahoo.com
Tue Apr 12 20:33:23 CDT 2011
Hi,
I use DMMG to solve Ax=b. In the initialization part I assign either a
predetermined value or a random value to the vectors, as shown in the code below.
With the same system size and number of processors, the random initialization takes
significantly more time than the predetermined value. I am attaching the log
summary for both cases. Could you please suggest why the time requirement is so
large (especially KSPSolve) with the random initialization, and how I can improve it?
Thanks in advance.
###Code without random value assignment to vectors:
u_localptr[k][j][i] = 0.7e-0;
v_localptr[k][j][i] = 0.81e-0;
w_localptr[k][j][i] = -54e-1;
###Code with random value assignment to vectors:
PetscRandomCreate(PETSC_COMM_WORLD,&pRandom);
PetscRandomSetFromOptions(pRandom);
PetscRandomSetType(pRandom,PETSCRAND);
PetscRandomSetInterval(pRandom,0.1e-8,1.0e-8);   /* small positive random values for u */
VecSetRandom(u,pRandom);
PetscRandomSetInterval(pRandom,-1.e-8,-0.1e-8);  /* small negative random values for v */
VecSetRandom(v,pRandom);
/* VecSetRandom(w,pRandom); */                   /* w keeps its predetermined value */
PetscRandomDestroy(pRandom);
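Since the log with random initialization shows about 80,000 GMRES orthogonalizations over the 4 solves, the first thing I can check is how many iterations each solve actually takes in the two cases. A minimal sketch of that check (assuming a DMMG context named dmmg; the variable names here are only illustrative):

    KSP                ksp;
    PetscInt           its;
    KSPConvergedReason reason;
    PetscErrorCode     ierr;

    ierr = DMMGSolve(dmmg);CHKERRQ(ierr);                     /* the solve being timed */
    ksp  = DMMGGetKSP(dmmg);                                  /* KSP on the finest DMMG level */
    ierr = KSPGetIterationNumber(ksp,&its);CHKERRQ(ierr);     /* iterations used by this solve */
    ierr = KSPGetConvergedReason(ksp,&reason);CHKERRQ(ierr);  /* why the solve stopped */
    ierr = PetscPrintf(PETSC_COMM_WORLD,"KSP iterations %D, converged reason %d\n",its,(int)reason);CHKERRQ(ierr);

The same information is available at run time, without changing the code, via the options -ksp_monitor and -ksp_converged_reason.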
###log_summary without random value assignment to vectors:
                         Max       Max/Min        Avg      Total
Time (sec):           6.210e-01      1.00071   6.208e-01
Objects:              1.060e+02      1.00000   1.060e+02
Flops:                5.325e+04      1.00000   5.325e+04  1.065e+05
Flops/sec:            8.581e+04      1.00071   8.578e+04  1.716e+05
Memory:               1.412e+06      1.00582              2.815e+06
MPI Messages:         7.600e+01      1.00000   7.600e+01  1.520e+02
MPI Message Lengths:  3.078e+05      1.00000   4.051e+03  6.157e+05
MPI Reductions:       1.250e+02      1.00000
Event                Count      Time (sec)     Flops/sec                         --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %F %M %L %R  %T %F %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------
VecView               16 1.0 1.9195e-01 1.0 0.00e+00 0.0 3.6e+01 8.2e+03 7.0e+00 30  0 24 48  6  30  0 24 48  8     0
VecNorm                4 1.0 5.9933e-05 1.4 1.64e+04 1.0 0.0e+00 0.0e+00 4.0e+00  0 31  0  0  3   0 31  0  0  4   547
VecScale               9 1.0 4.0106e-06 1.9 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecCopy               12 1.0 5.3243e-05 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecSet                 7 1.0 3.0005e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecAXPY                9 1.0 1.3322e-04 1.3 3.69e+04 1.0 0.0e+00 0.0e+00 0.0e+00  0 69  0  0  0   0 69  0  0  0   553
VecScatterBegin       53 1.0 1.1030e-03 1.1 0.00e+00 0.0 7.4e+01 4.1e+03 0.0e+00  0  0 49 49  0   0  0 49 49  0     0
VecScatterEnd         53 1.0 3.6766e-03 10.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0  0  0  0  0   0  0  0  0  0     0
MatAssemblyBegin       2 1.0 2.6521e-04 2.8 0.00e+00 0.0 0.0e+00 0.0e+00 4.0e+00  0  0  0  0  3   0  0  0  0  4     0
MatAssemblyEnd         2 1.0 1.0279e-03 1.0 0.00e+00 0.0 4.0e+00 1.0e+03 1.1e+01  0  0  3  1  9   0  0  3  1 12     0
KSPSetup               2 1.0 5.8801e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
KSPSolve               4 1.0 4.1139e-03 1.0 1.64e+04 1.0 0.0e+00 0.0e+00 1.0e+01  1 31  0  0  8   1 31  0  0 11     8
PCSetUp                1 1.0 3.5669e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00  1  0  0  0  2   1  0  0  0  2     0
-----------------------------------------------------------------------------------------------------------------------
###log_summary with random initialization to vectors:
                         Max       Max/Min        Avg      Total
Time (sec):           4.456e+01      1.00002   4.456e+01
Objects:              1.690e+02      1.00000   1.690e+02
Flops:                1.086e+10      1.00000   1.086e+10  2.172e+10
Flops/sec:            2.437e+08      1.00002   2.437e+08  4.875e+08
Memory:               2.709e+06      1.00302              5.410e+06
MPI Messages:         8.141e+04      1.00000   8.141e+04  1.628e+05
MPI Message Lengths:  3.335e+08      1.00000   4.096e+03  6.669e+08
MPI Reductions:       4.028e+05      1.00000
Event                Count      Time (sec)     Flops/sec                         --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %F %M %L %R  %T %F %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------
VecView               16 1.0 2.0461e-01 1.0 0.00e+00 0.0 3.6e+01 8.2e+03 7.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecMDot            80000 1.0 2.7737e+00 1.0 2.70e+09 1.0 0.0e+00 0.0e+00 8.0e+04  6 25  0  0 20   6 25  0  0 20  1948
VecNorm           121336 1.0 3.8669e+00 1.0 4.97e+08 1.0 0.0e+00 0.0e+00 1.2e+05  9  5  0  0 30   9  5  0  0 30   257
VecScale          121345 1.0 8.6525e-01 1.0 2.48e+08 1.0 0.0e+00 0.0e+00 0.0e+00  2  2  0  0  0   2  2  0  0  0   574
VecCopy            40012 1.0 9.5324e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecSet            201343 1.0 3.3050e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  1  0  0  0  0   1  0  0  0  0     0
VecAXPY            41345 1.0 4.0391e-01 1.0 1.69e+08 1.0 0.0e+00 0.0e+00 0.0e+00  1  2  0  0  0   1  2  0  0  0   839
VecWAXPY            1336 1.0 1.5288e-02 1.0 2.74e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0   358
VecMAXPY          121336 1.0 4.6148e+00 1.0 3.03e+09 1.0 0.0e+00 0.0e+00 0.0e+00 10 28  0  0  0  10 28  0  0  0  1313
VecScatterBegin    81389 1.0 6.2763e-01 1.0 0.00e+00 0.0 1.6e+05 4.1e+03 0.0e+00  1  0 100 100 0   1  0 100 100 0     0
VecScatterEnd      81389 1.0 7.0998e-01 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  2  0  0  0  0   2  0  0  0  0     0
VecSetRandom           2 1.0 2.7497e-04 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecNormalize       80000 1.0 3.2196e+00 1.0 4.92e+08 1.0 0.0e+00 0.0e+00 8.0e+04  7  5  0  0 20   7  5  0  0 20   305
MatMult            81336 1.0 1.8218e+01 1.0 2.17e+09 1.0 1.6e+05 4.1e+03 0.0e+00 41 20 100 100 0  41 20 100 100 0   238
MatSolve           80000 1.0 9.1123e+00 1.0 2.05e+09 1.0 0.0e+00 0.0e+00 0.0e+00 20 19  0  0  0  20 19  0  0  0   450
MatLUFactorNum         1 1.0 7.6804e-04 1.0 3.74e+04 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0    97
MatILUFactorSym        1 1.0 7.0408e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatAssemblyBegin       2 1.0 5.7212e-04 8.6 0.00e+00 0.0 0.0e+00 0.0e+00 4.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatAssemblyEnd         2 1.0 1.1452e-03 1.0 0.00e+00 0.0 4.0e+00 1.0e+03 1.1e+01  0  0  0  0  0   0  0  0  0  0     0
MatGetRowIJ            1 1.0 1.0453e-06 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetOrdering         1 1.0 5.4405e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00  0  0  0  0  0   0  0  0  0  0     0
KSPGMRESOrthog     80000 1.0 6.8689e+00 1.0 5.40e+09 1.0 0.0e+00 0.0e+00 8.0e+04 15 50  0  0 20  15 50  0  0 20  1573
KSPSetup               3 1.0 5.9501e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
KSPSolve               4 1.0 4.3836e+01 1.0 1.09e+10 1.0 1.6e+05 4.1e+03 4.0e+05 98 100 100 100 100 98 100 100 100 100 496
PCSetUp                2 1.0 5.8231e-03 1.0 3.74e+04 1.0 0.0e+00 0.0e+00 7.0e+00  0  0  0  0  0   0  0  0  0  0    13
PCSetUpOnBlocks    40000 1.0 2.4762e-02 1.0 3.74e+04 1.0 0.0e+00 0.0e+00 5.0e+00  0  0  0  0  0   0  0  0  0  0     3
PCApply            40000 1.0 2.6536e+01 1.0 4.26e+09 1.0 8.0e+04 4.1e+03 3.2e+05 60 39 49 49 79  60 39 49 49 79   321
------------------------------------------------------------------------------------------------------------------------