[petsc-users] Big discrepancy between machines

Timothée Nicolas timothee.nicolas at gmail.com
Mon Dec 14 01:09:46 CST 2015


Hi,

I have noticed I have a VERY big difference in behaviour between two
machines in my problem, solved with SNES. I can't explain it, because I
have tested my operators which give the same result. I also checked that
the vectors fed to the SNES are the same. The problem happens only with my
shell preconditioner. When I don't use it, and simply solve using -snes_mf,
I don't see anymore than the usual 3-4 changing digits at the end of the
residuals. However, when I use my pcshell, the results are completely
different between the two machines.
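
For reference, the structure of the solve is essentially the following
(a minimal sketch in C, not my actual MHD code: FormFunction and
ApplyShellPC are toy placeholders standing in for the real residual and
the real shell preconditioner):

/* Hedged sketch: SNES solved matrix-free (-snes_mf) with a user shell
 * preconditioner attached to the inner KSP. PETSc 3.6-era C API.       */
#include <petscsnes.h>

static PetscErrorCode FormFunction(SNES snes,Vec x,Vec f,void *ctx)
{
  PetscErrorCode ierr;
  PetscFunctionBeginUser;
  ierr = VecCopy(x,f);CHKERRQ(ierr);     /* toy residual F(x) = x        */
  PetscFunctionReturn(0);
}

static PetscErrorCode ApplyShellPC(PC pc,Vec r,Vec z)
{
  PetscErrorCode ierr;
  PetscFunctionBeginUser;
  ierr = VecCopy(r,z);CHKERRQ(ierr);     /* identity preconditioner      */
  PetscFunctionReturn(0);
}

int main(int argc,char **argv)
{
  SNES           snes;
  KSP            ksp;
  PC             pc;
  Vec            x,r;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;
  ierr = VecCreate(PETSC_COMM_WORLD,&x);CHKERRQ(ierr);
  ierr = VecSetSizes(x,PETSC_DECIDE,100);CHKERRQ(ierr);
  ierr = VecSetFromOptions(x);CHKERRQ(ierr);
  ierr = VecDuplicate(x,&r);CHKERRQ(ierr);

  ierr = SNESCreate(PETSC_COMM_WORLD,&snes);CHKERRQ(ierr);
  ierr = SNESSetFunction(snes,r,FormFunction,NULL);CHKERRQ(ierr);

  /* Attach the shell preconditioner to the KSP inside the SNES; with
     -snes_mf the Jacobian action is finite-difference matrix-free.     */
  ierr = SNESGetKSP(snes,&ksp);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,PCSHELL);CHKERRQ(ierr);
  ierr = PCShellSetApply(pc,ApplyShellPC);CHKERRQ(ierr);

  /* Picks up -snes_mf -snes_monitor -ksp_monitor from the command line */
  ierr = SNESSetFromOptions(snes);CHKERRQ(ierr);
  ierr = VecSet(x,1.0);CHKERRQ(ierr);
  ierr = SNESSolve(snes,NULL,x);CHKERRQ(ierr);

  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&r);CHKERRQ(ierr);
  ierr = SNESDestroy(&snes);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}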

I have attached output_SuperComputer.txt and output_DesktopComputer.txt,
which correspond to the output from the exact same code and options (and
of course the same input data file!). More precisely:

output_SuperComputer.txt: output on a supercomputer called Helios (sorry,
I don't know the exact specs).
In this case, the SNES norms are reduced successively:
0 SNES Function norm 4.867111712420e-03
1 SNES Function norm 5.632325929998e-08
2 SNES Function norm 7.427800084502e-15

output_DesktopComputer.txt: output on a Mac running OS X Yosemite,
3.4 GHz Intel Core i5, 16 GB 1600 MHz DDR3 (the same happens on another
laptop running OS X Mavericks).
In this case, I obtain the following SNES norms instead:
0 SNES Function norm 4.867111713544e-03
1 SNES Function norm 1.560094052222e-03
2 SNES Function norm 1.552118650943e-03
3 SNES Function norm 1.552106297094e-03
4 SNES Function norm 1.552106277949e-03
which I can't explain, because otherwise the KSP residuals (with the same
operator, which I checked) behave well.

As you can see, the first time the preconditioner is applied (the DB_,
DP_, Drho_ and PS_ solves), the two outputs coincide (except for the last
few digits, up to 9 of them actually, which is more than I would expect),
and everything starts to diverge at the first print of the main KSP
residual norms (the KSP stemming from the SNES).
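
(As an aside, one way I know of to check bit for bit what actually enters
the preconditioner on each machine is to dump the vector to a PETSc binary
file and compare the two files. A small sketch, with a hypothetical helper
name and file name:)

#include <petscvec.h>

PetscErrorCode DumpVecForComparison(Vec v,const char *path)
{
  PetscViewer    viewer;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = PetscViewerBinaryOpen(PetscObjectComm((PetscObject)v),path,
                               FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
  ierr = VecView(v,viewer);CHKERRQ(ierr);   /* raw binary, no rounding    */
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

/* e.g. DumpVecForComparison(x,"pcshell_input.bin"); then compare the two
   files with cmp, or load both and take VecAXPY/VecNorm of the difference. */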

Do you have an idea what may cause such strange behaviour?

Best

Timothee
-------------- next part --------------
  0 SNES Function norm 4.867111713544e-03 
  Residual norms for DB_ solve.
  0 KSP Residual norm 2.366556734874e-07 
  1 KSP Residual norm 1.599134162118e-11 
  2 KSP Residual norm 1.379636472597e-15 
  Residual norms for DP_ solve.
  0 KSP Residual norm 4.268082421881e-09 
  1 KSP Residual norm 3.147853422474e-14 
  Residual norms for Drho_ solve.
  0 KSP Residual norm 1.548251260747e-08 
  1 KSP Residual norm 5.430867597212e-18 
  Residual norms for PS_ solve.
  0 KSP Residual norm 4.867111515741e-03 
  1 KSP Residual norm 1.907710093690e-04 
  2 KSP Residual norm 7.658106671754e-06 
  3 KSP Residual norm 3.357005929454e-07 
  4 KSP Residual norm 1.454080306567e-08 
    0 KSP Residual norm 5.139313706047e-03 
  Residual norms for DB_ solve.
  0 KSP Residual norm 5.440635448768e-05 
  1 KSP Residual norm 5.050541808202e-09 
  2 KSP Residual norm 4.288718765754e-13 
  Residual norms for DP_ solve.
  0 KSP Residual norm 8.017413031372e-03 
  1 KSP Residual norm 1.517037052583e-08 
  Residual norms for Drho_ solve.
  0 KSP Residual norm 3.019014439771e-01 
  1 KSP Residual norm 1.698740458365e-10 
  Residual norms for PS_ solve.
  0 KSP Residual norm 9.470339285118e-01 
  1 KSP Residual norm 3.711941719696e-02 
  2 KSP Residual norm 1.490076291921e-03 
  3 KSP Residual norm 6.531888110917e-05 
  4 KSP Residual norm 2.829271710054e-06 
    1 KSP Residual norm 1.076619979170e-05 
  Residual norms for DB_ solve.
  0 KSP Residual norm 1.455733053736e-02 
  1 KSP Residual norm 8.558119750766e-07 
  2 KSP Residual norm 6.221252391682e-11 
  Residual norms for DP_ solve.
  0 KSP Residual norm 1.904297391721e-02 
  1 KSP Residual norm 8.633819203365e-08 
  Residual norms for Drho_ solve.
  0 KSP Residual norm 9.771004397981e-01 
  1 KSP Residual norm 3.797547005550e-10 
  Residual norms for PS_ solve.
  0 KSP Residual norm 2.100138029120e-01 
  1 KSP Residual norm 8.008499816230e-03 
  2 KSP Residual norm 3.188390248422e-04 
  3 KSP Residual norm 1.392814641696e-05 
  4 KSP Residual norm 6.017324204570e-07 
    2 KSP Residual norm 1.378553492814e-08 
  Linear solve converged due to CONVERGED_RTOL iterations 2
  1 SNES Function norm 1.560094052222e-03 
  Residual norms for DB_ solve.
  0 KSP Residual norm 1.305901856652e-09 
  1 KSP Residual norm 1.016323286480e-13 
  2 KSP Residual norm 8.697452307336e-18 
  Residual norms for DP_ solve.
  0 KSP Residual norm 4.141600765836e-05 
  1 KSP Residual norm 7.836080325484e-11 
  Residual norms for Drho_ solve.
  0 KSP Residual norm 1.559532533569e-03 
  1 KSP Residual norm 8.775290484669e-13 
  Residual norms for PS_ solve.
  0 KSP Residual norm 7.939953978965e-08 
  1 KSP Residual norm 3.091105675950e-09 
  2 KSP Residual norm 1.281590573300e-10 
  3 KSP Residual norm 5.678754805485e-12 
  4 KSP Residual norm 2.357983561332e-13 
    0 KSP Residual norm 7.983926898450e-06 
  Residual norms for DB_ solve.
  0 KSP Residual norm 1.635766726710e-04 
  1 KSP Residual norm 1.273119897835e-08 
  2 KSP Residual norm 1.089516309227e-12 
  Residual norms for DP_ solve.
  0 KSP Residual norm 2.656200473798e-02 
  1 KSP Residual norm 5.025904537481e-08 
  Residual norms for Drho_ solve.
  0 KSP Residual norm 1.000150398902e+00 
  1 KSP Residual norm 5.627979304770e-10 
  Residual norms for PS_ solve.
  0 KSP Residual norm 9.944957775471e-03 
  1 KSP Residual norm 3.871513600798e-04 
  2 KSP Residual norm 1.605196431847e-05 
  3 KSP Residual norm 7.112462681111e-07 
  4 KSP Residual norm 2.953388825860e-08 
    1 KSP Residual norm 2.200468803859e-10 
  Residual norms for DB_ solve.
  0 KSP Residual norm 1.307412819177e-02 
  1 KSP Residual norm 9.578697048783e-07 
  2 KSP Residual norm 8.340922260300e-11 
  Residual norms for DP_ solve.
  0 KSP Residual norm 2.721965031182e-02 
  1 KSP Residual norm 1.401958230926e-07 
  Residual norms for Drho_ solve.
  0 KSP Residual norm 8.302667215672e-01 
  1 KSP Residual norm 1.116675867061e-09 
  Residual norms for PS_ solve.
  0 KSP Residual norm 5.598451080918e-01 
  1 KSP Residual norm 2.202634919872e-02 
  2 KSP Residual norm 9.217499213474e-04 
  3 KSP Residual norm 4.080246263216e-05 
  4 KSP Residual norm 1.686626638957e-06 
    2 KSP Residual norm 1.603543671001e-13 
  Linear solve converged due to CONVERGED_RTOL iterations 2
  2 SNES Function norm 1.552118650943e-03 
  Residual norms for DB_ solve.
  0 KSP Residual norm 2.793650509227e-15 
  1 KSP Residual norm 2.146724819226e-19 
  2 KSP Residual norm 1.937772738260e-23 
  Residual norms for DP_ solve.
  0 KSP Residual norm 4.120426859540e-05 
  1 KSP Residual norm 7.796021317362e-11 
  Residual norms for Drho_ solve.
  0 KSP Residual norm 1.551559786739e-03 
  1 KSP Residual norm 8.730428515318e-13 
  Residual norms for PS_ solve.
  0 KSP Residual norm 1.428033305389e-12 
  1 KSP Residual norm 5.807655876136e-14 
  2 KSP Residual norm 2.674763604785e-15 
  3 KSP Residual norm 1.196747820651e-16 
  4 KSP Residual norm 5.334942027622e-18 
    0 KSP Residual norm 1.237286805551e-08 
  Residual norms for DB_ solve.
  0 KSP Residual norm 2.271987261067e-07 
  1 KSP Residual norm 1.742498472575e-11 
  2 KSP Residual norm 1.573543263716e-15 
  Residual norms for DP_ solve.
  0 KSP Residual norm 2.654741283900e-02 
  1 KSP Residual norm 5.022943319228e-08 
  Residual norms for Drho_ solve.
  0 KSP Residual norm 9.996528036365e-01 
  1 KSP Residual norm 5.624851138340e-10 
  Residual norms for PS_ solve.
  0 KSP Residual norm 1.154281723143e-04 
  1 KSP Residual norm 4.693593763434e-06 
  2 KSP Residual norm 2.161231018351e-07 
  3 KSP Residual norm 9.671206291572e-09 
  4 KSP Residual norm 4.311554921541e-10 
    1 KSP Residual norm 3.016343311594e-14 
  Linear solve converged due to CONVERGED_RTOL iterations 1
  3 SNES Function norm 1.552106297094e-03 
  Residual norms for DB_ solve.
  0 KSP Residual norm 6.025886869455e-17 
  1 KSP Residual norm 3.683671084034e-21 
  2 KSP Residual norm 2.756059908370e-25 
  Residual norms for DP_ solve.
  0 KSP Residual norm 4.120394063681e-05 
  1 KSP Residual norm 7.795959265451e-11 
  Residual norms for Drho_ solve.
  0 KSP Residual norm 1.551547437335e-03 
  1 KSP Residual norm 8.730359027070e-13 
  Residual norms for PS_ solve.
  0 KSP Residual norm 2.415967494271e-14 
  1 KSP Residual norm 1.029996000145e-15 
  2 KSP Residual norm 4.850784439943e-17 
  3 KSP Residual norm 2.034713843148e-18 
  4 KSP Residual norm 8.770156385840e-20 
    0 KSP Residual norm 1.917411615790e-11 
  Residual norms for DB_ solve.
  0 KSP Residual norm 3.137430130807e-06 
  1 KSP Residual norm 1.918603503891e-10 
  2 KSP Residual norm 1.435073680190e-14 
  Residual norms for DP_ solve.
  0 KSP Residual norm 2.654319443621e-02 
  1 KSP Residual norm 5.002189211387e-08 
  Residual norms for Drho_ solve.
  0 KSP Residual norm 9.996525441184e-01 
  1 KSP Residual norm 5.623748967611e-10 
  Residual norms for PS_ solve.
  0 KSP Residual norm 1.258077372494e-03 
  1 KSP Residual norm 5.363662191604e-05 
  2 KSP Residual norm 2.525980612687e-06 
  3 KSP Residual norm 1.059553063015e-07 
  4 KSP Residual norm 4.567015737839e-09 
    1 KSP Residual norm 1.130508373028e-16 
  Linear solve converged due to CONVERGED_RTOL iterations 1
  4 SNES Function norm 1.552106277949e-03 
Nonlinear solve converged due to CONVERGED_SNORM_RELATIVE iterations 4
 Total CPU time since PetscInitialize: 1.3750E+01 
 CPU time used for SNESSolve: 1.3354E+01 
 Number of linear iterations :  6 
 Number of function evaluations : 15 
 Kinetic Energy =  8.283578E-11 
 Magnetic Energy =  2.278864E-12 
 
 
 Exiting the main MHD Loop 
 
 Deallocating remaining arrays 
 
 Destroying remaining Petsc elements 
 
************************************************************************************************************************
***             WIDEN YOUR WINDOW TO 120 CHARACTERS.  Use 'enscript -r -fCourier9' to print this document            ***
************************************************************************************************************************

---------------------------------------------- PETSc Performance Summary: ----------------------------------------------

./miips on a arch-darwin-c-debug named iMac27Nicolas.nifs.ac.jp with 1 processor, by timotheenicolas Mon Dec 14 15:55:59 2015
Using Petsc Release Version 3.6.1, Jul, 22, 2015 

                         Max       Max/Min        Avg      Total 
Time (sec):           1.382e+01      1.00000   1.382e+01
Objects:              2.590e+02      1.00000   2.590e+02
Flops:                7.271e+08      1.00000   7.271e+08  7.271e+08
Flops/sec:            5.262e+07      1.00000   5.262e+07  5.262e+07
MPI Messages:         0.000e+00      0.00000   0.000e+00  0.000e+00
MPI Message Lengths:  0.000e+00      0.00000   0.000e+00  0.000e+00
MPI Reductions:       0.000e+00      0.00000

Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
                            e.g., VecAXPY() for real vectors of length N --> 2N flops
                            and VecAXPY() for complex vectors of length N --> 8N flops

Summary of Stages:   ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                        Avg     %Total     Avg     %Total   counts   %Total     Avg         %Total   counts   %Total 
 0:      Main Stage: 1.3818e+01 100.0%  7.2707e+08 100.0%  0.000e+00   0.0%  0.000e+00        0.0%  0.000e+00   0.0% 

------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
   Count: number of times phase was executed
   Time and Flops: Max - maximum over all processors
                   Ratio - ratio of maximum to minimum over all processors
   Mess: number of messages sent
   Avg. len: average message length (bytes)
   Reduct: number of global reductions
   Global: entire computation
   Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
      %T - percent time in this phase         %F - percent flops in this phase
      %M - percent messages in this phase     %L - percent message lengths in this phase
      %R - percent reductions in this phase
   Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors)
------------------------------------------------------------------------------------------------------------------------
Event                Count      Time (sec)     Flops                             --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %F %M %L %R  %T %F %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage

SNESSolve              1 1.0 1.3345e+01 1.0 7.06e+08 1.0 0.0e+00 0.0e+00 0.0e+00 97 97  0  0  0  97 97  0  0  0    53
SNESFunctionEval      15 1.0 2.5123e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 18  0  0  0  0  18  0  0  0  0     0
SNESJacobianEval       4 1.0 3.0715e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
SNESLineSearch         4 1.0 1.3934e+00 1.0 7.14e+07 1.0 0.0e+00 0.0e+00 0.0e+00 10 10  0  0  0  10 10  0  0  0    51
VecView                2 1.0 8.8236e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecDot                 4 1.0 5.5161e-03 1.0 8.92e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  1  0  0  0   0  1  0  0  0  1617
VecMDot               86 1.0 5.5805e-02 1.0 1.32e+08 1.0 0.0e+00 0.0e+00 0.0e+00  0 18  0  0  0   0 18  0  0  0  2368
VecNorm              141 1.0 5.4996e-02 1.0 1.25e+08 1.0 0.0e+00 0.0e+00 0.0e+00  0 17  0  0  0   0 17  0  0  0  2271
VecScale             200 1.0 5.6164e-02 1.0 8.92e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0 12  0  0  0   0 12  0  0  0  1588
VecCopy              180 1.0 6.0994e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecSet               242 1.0 1.5345e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  1  0  0  0  0   1  0  0  0  0     0
VecAXPY              119 1.0 7.6371e-02 1.0 1.20e+08 1.0 0.0e+00 0.0e+00 0.0e+00  1 17  0  0  0   1 17  0  0  0  1577
VecWAXPY              17 1.0 3.4016e-02 1.0 3.01e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  4  0  0  0   0  4  0  0  0   885
VecMAXPY             130 1.0 8.4107e-02 1.0 2.01e+08 1.0 0.0e+00 0.0e+00 0.0e+00  1 28  0  0  0   1 28  0  0  0  2393
VecAssemblyBegin     245 1.0 7.4085e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecAssemblyEnd       245 1.0 2.5799e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecPointwiseMult       2 1.0 3.4161e-03 1.0 2.23e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0   653
VecScatterBegin      260 1.0 2.5876e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  2  0  0  0  0   2  0  0  0  0     0
VecReduceArith         8 1.0 8.0670e-03 1.0 1.78e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  2  0  0  0  2212
VecReduceComm          4 1.0 2.7520e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecNormalize         130 1.0 7.2848e-02 1.0 1.51e+08 1.0 0.0e+00 0.0e+00 0.0e+00  1 21  0  0  0   1 21  0  0  0  2067
MatMult MF            10 1.0 1.7229e+00 1.0 5.58e+07 1.0 0.0e+00 0.0e+00 0.0e+00 12  8  0  0  0  12  8  0  0  0    32
MatMult              110 1.0 1.1690e+01 1.0 1.06e+08 1.0 0.0e+00 0.0e+00 0.0e+00 85 15  0  0  0  85 15  0  0  0     9
MatMultAdd            40 1.0 2.7459e+00 1.0 3.35e+07 1.0 0.0e+00 0.0e+00 0.0e+00 20  5  0  0  0  20  5  0  0  0    12
MatAssemblyBegin       4 1.0 2.9000e-07 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatAssemblyEnd         4 1.0 3.0516e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
KSPGMRESOrthog        86 1.0 1.0825e-01 1.0 2.64e+08 1.0 0.0e+00 0.0e+00 0.0e+00  1 36  0  0  0   1 36  0  0  0  2441
KSPSetUp               8 1.0 2.7075e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
KSPSolve               4 1.0 1.1773e+01 1.0 6.32e+08 1.0 0.0e+00 0.0e+00 0.0e+00 85 87  0  0  0  85 87  0  0  0    54
PCSetUp                8 1.0 1.6327e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
PCApply               10 1.0 1.0665e+01 1.0 5.07e+08 1.0 0.0e+00 0.0e+00 0.0e+00 77 70  0  0  0  77 70  0  0  0    48
------------------------------------------------------------------------------------------------------------------------

Memory usage is given in bytes:

Object Type          Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

                SNES     1              1         1324     0
      SNESLineSearch     1              1          856     0
              DMSNES     2              2         1312     0
              Vector   169            169    710843024     0
      Vector Scatter     8              8         5184     0
             MatMFFD     1              1          768     0
              Matrix     8              8        19376     0
    Distributed Mesh     8              8        39152     0
Star Forest Bipartite Graph    16             16        13328     0
     Discrete System     8              8         6720     0
           Index Set    16             16      4045944     0
   IS L to G Mapping     7              7      4037968     0
       Krylov Solver     5              5        91760     0
     DMKSP interface     1              1          640     0
      Preconditioner     5              5         4312     0
              Viewer     3              2         1520     0
========================================================================================================================
Average time to get PetscTime(): 2.82002e-08
#PETSc Option Table entries:
-DB_ksp_monitor
-DB_pc_type none
-DP_ksp_monitor
-DP_pc_type none
-Drho_ksp_monitor
-Drho_pc_type none
-PS_ksp_monitor
-PS_pc_type none
-ksp_converged_reason
-ksp_monitor
-log_summary
-mult_lu
-nts 1
-snes_converged_reason
-snes_mf
-snes_monitor
#End of PETSc Option Table entries
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
Configure options: --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack --download-mpich --with-debugging=no
-----------------------------------------
Libraries compiled on Mon Dec 14 14:57:32 2015 on iMac27Nicolas.nifs.ac.jp 
Machine characteristics: Darwin-14.5.0-x86_64-i386-64bit
Using PETSc directory: /Users/timotheenicolas/PETSC/petsc-3.6.1
Using PETSc arch: arch-darwin-c-debug
-----------------------------------------

Using C compiler: /Users/timotheenicolas/PETSC/petsc-3.6.1/arch-darwin-c-debug/bin/mpicc  -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -O  ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: /Users/timotheenicolas/PETSC/petsc-3.6.1/arch-darwin-c-debug/bin/mpif90  -fPIC  -Wall -Wno-unused-variable -ffree-line-length-0 -Wno-unused-dummy-argument -O  ${FOPTFLAGS} ${FFLAGS} 
-----------------------------------------

Using include paths: -I/Users/timotheenicolas/PETSC/petsc-3.6.1/arch-darwin-c-debug/include -I/Users/timotheenicolas/PETSC/petsc-3.6.1/include -I/Users/timotheenicolas/PETSC/petsc-3.6.1/include -I/Users/timotheenicolas/PETSC/petsc-3.6.1/arch-darwin-c-debug/include -I/opt/X11/include
-----------------------------------------

Using C linker: /Users/timotheenicolas/PETSC/petsc-3.6.1/arch-darwin-c-debug/bin/mpicc
Using Fortran linker: /Users/timotheenicolas/PETSC/petsc-3.6.1/arch-darwin-c-debug/bin/mpif90
Using libraries: -Wl,-rpath,/Users/timotheenicolas/PETSC/petsc-3.6.1/arch-darwin-c-debug/lib -L/Users/timotheenicolas/PETSC/petsc-3.6.1/arch-darwin-c-debug/lib -lpetsc -Wl,-rpath,/Users/timotheenicolas/PETSC/petsc-3.6.1/arch-darwin-c-debug/lib -L/Users/timotheenicolas/PETSC/petsc-3.6.1/arch-darwin-c-debug/lib -lflapack -lfblas -Wl,-rpath,/opt/X11/lib -L/opt/X11/lib -lX11 -lssl -lcrypto -Wl,-rpath,/Library/Developer/CommandLineTools/usr/lib/clang/7.0.0/lib/darwin -L/Library/Developer/CommandLineTools/usr/lib/clang/7.0.0/lib/darwin -lmpifort -lgfortran -Wl,-rpath,/usr/local/lib/gcc/x86_64-apple-darwin14.0.0/5.0.0 -L/usr/local/lib/gcc/x86_64-apple-darwin14.0.0/5.0.0 -Wl,-rpath,/usr/local/lib -L/usr/local/lib -lgfortran -lgcc_ext.10.5 -lquadmath -lm -lclang_rt.osx -lmpicxx -lc++ -Wl,-rpath,/Library/Developer/CommandLineTools/usr/bin/../lib/clang/7.0.0/lib/darwin -L/Library/Developer/CommandLineTools/usr/bin/../lib/clang/7.0.0/lib/darwin -lclang_rt.osx -Wl,-rpath,/Users/timotheenicolas/PETSC/petsc-3.6.1/arch-darwin-c-debug/lib -L/Users/timotheenicolas/PETSC/petsc-3.6.1/arch-darwin-c-debug/lib -ldl -lmpi -lpmpi -lSystem -Wl,-rpath,/Library/Developer/CommandLineTools/usr/bin/../lib/clang/7.0.0/lib/darwin -L/Library/Developer/CommandLineTools/usr/bin/../lib/clang/7.0.0/lib/darwin -lclang_rt.osx -ldl 
-----------------------------------------
-------------- next part --------------
  0 SNES Function norm 4.867111712420e-03 
  Residual norms for DB_ solve.
  0 KSP Residual norm 2.366556734390e-07 
  1 KSP Residual norm 1.599134161658e-11 
  2 KSP Residual norm 1.379636472384e-15 
  Residual norms for DP_ solve.
  0 KSP Residual norm 4.268082420930e-09 
  1 KSP Residual norm 3.147853421751e-14 
  Residual norms for Drho_ solve.
  0 KSP Residual norm 1.548251260747e-08 
  1 KSP Residual norm 5.430867597213e-18 
  Residual norms for PS_ solve.
  0 KSP Residual norm 4.867111514618e-03 
  1 KSP Residual norm 1.907717649118e-04 
  2 KSP Residual norm 7.658144026212e-06 
  3 KSP Residual norm 3.357066761239e-07 
  4 KSP Residual norm 1.454205241131e-08 
    0 KSP Residual norm 4.899341386700e-03 
  Residual norms for DB_ solve.
  0 KSP Residual norm 5.707119370325e-05 
  1 KSP Residual norm 5.297917198531e-09 
  2 KSP Residual norm 4.498782273058e-13 
  Residual norms for DP_ solve.
  0 KSP Residual norm 8.711633299940e-07 
  1 KSP Residual norm 6.424911624338e-12 
  Residual norms for Drho_ solve.
  0 KSP Residual norm 3.160121194373e-06 
  1 KSP Residual norm 1.108489319887e-15 
  Residual norms for PS_ solve.
  0 KSP Residual norm 9.934191997238e-01 
  1 KSP Residual norm 3.893765235821e-02 
  2 KSP Residual norm 1.563067404860e-03 
  3 KSP Residual norm 6.851945301781e-05 
  4 KSP Residual norm 2.968105385402e-06 
    1 KSP Residual norm 1.520860261048e-07 
  Residual norms for DB_ solve.
  0 KSP Residual norm 1.029574455279e+00 
  1 KSP Residual norm 6.052311862765e-05 
  2 KSP Residual norm 4.397928200182e-09 
  Residual norms for DP_ solve.
  0 KSP Residual norm 7.778726792599e-05 
  1 KSP Residual norm 4.659723031268e-10 
  Residual norms for Drho_ solve.
  0 KSP Residual norm 3.464866626155e-07 
  1 KSP Residual norm 4.466560238078e-16 
  Residual norms for PS_ solve.
  0 KSP Residual norm 2.922206828301e-01 
  1 KSP Residual norm 1.405945327578e-02 
  2 KSP Residual norm 7.875288422188e-04 
  3 KSP Residual norm 3.296403312814e-05 
  4 KSP Residual norm 1.378743485812e-06 
    2 KSP Residual norm 1.553474427381e-12 
  Linear solve converged due to CONVERGED_RTOL iterations 2
  1 SNES Function norm 5.632325929998e-08 
  Residual norms for DB_ solve.
  0 KSP Residual norm 1.302167182121e-09 
  1 KSP Residual norm 1.016337630022e-13 
  2 KSP Residual norm 8.676582604839e-18 
  Residual norms for DP_ solve.
  0 KSP Residual norm 6.913957320690e-11 
  1 KSP Residual norm 4.966469692746e-16 
  Residual norms for Drho_ solve.
  0 KSP Residual norm 1.132396259485e-09 
  1 KSP Residual norm 3.945452134707e-20 
  Residual norms for PS_ solve.
  0 KSP Residual norm 5.629471166161e-08 
  1 KSP Residual norm 2.221140747995e-09 
  2 KSP Residual norm 9.103520624251e-11 
  3 KSP Residual norm 3.881846850794e-12 
  4 KSP Residual norm 1.617926366497e-13 
    0 KSP Residual norm 5.656401015719e-08 
  Residual norms for DB_ solve.
  0 KSP Residual norm 2.302174239526e-02 
  1 KSP Residual norm 1.796923989546e-06 
  2 KSP Residual norm 1.534054142323e-10 
  Residual norms for DP_ solve.
  0 KSP Residual norm 1.222421046956e-03 
  1 KSP Residual norm 8.780879980160e-09 
  Residual norms for Drho_ solve.
  0 KSP Residual norm 2.001976946159e-02 
  1 KSP Residual norm 6.975153185817e-13 
  Residual norms for PS_ solve.
  0 KSP Residual norm 9.952378186456e-01 
  1 KSP Residual norm 3.926701064580e-02 
  2 KSP Residual norm 1.609382459345e-03 
  3 KSP Residual norm 6.862589195784e-05 
  4 KSP Residual norm 2.860278802804e-06 
    1 KSP Residual norm 2.921998658056e-12 
  Residual norms for DB_ solve.
  0 KSP Residual norm 6.653463668565e-01 
  1 KSP Residual norm 3.822008488207e-05 
  2 KSP Residual norm 2.781508132254e-09 
  Residual norms for DP_ solve.
  0 KSP Residual norm 3.871253757817e-03 
  1 KSP Residual norm 2.691108885444e-08 
  Residual norms for Drho_ solve.
  0 KSP Residual norm 9.846149611801e-03 
  1 KSP Residual norm 8.090481654281e-13 
  Residual norms for PS_ solve.
  0 KSP Residual norm 7.827483124782e-01 
  1 KSP Residual norm 3.575117584186e-02 
  2 KSP Residual norm 1.584099327764e-03 
  3 KSP Residual norm 6.603898369937e-05 
  4 KSP Residual norm 2.829819432667e-06 
    2 KSP Residual norm 1.006906168481e-16 
  Linear solve converged due to CONVERGED_RTOL iterations 2
  2 SNES Function norm 7.427800084502e-15 
Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 2
 Total CPU time since PetscInitialize: 5.2341E+00 
 CPU time used for SNESSolve: 4.7887E+00 
 Number of linear iterations :  4 
 Number of function evaluations :  9 
 Kinetic Energy =  8.283578E-11 
 Magnetic Energy =  2.278864E-12 
 
 
 Exiting the main MHD Loop 
 
 Deallocating remaining arrays 
 
 Destroying remaining Petsc elements 
 
************************************************************************************************************************
***             WIDEN YOUR WINDOW TO 120 CHARACTERS.  Use 'enscript -r -fCourier9' to print this document            ***
************************************************************************************************************************

---------------------------------------------- PETSc Performance Summary: ----------------------------------------------

./miips on a arch-linux2-c-opt named helios87 with 1 processor, by tnicolas Mon Dec 14 15:56:00 2015
Using Petsc Release Version 3.6.0, Jun, 09, 2015 

                         Max       Max/Min        Avg      Total 
Time (sec):           5.319e+00      1.00000   5.319e+00
Objects:              2.270e+02      1.00000   2.270e+02
Flops:                4.461e+08      1.00000   4.461e+08  4.461e+08
Flops/sec:            8.387e+07      1.00000   8.387e+07  8.387e+07
MPI Messages:         0.000e+00      0.00000   0.000e+00  0.000e+00
MPI Message Lengths:  0.000e+00      0.00000   0.000e+00  0.000e+00
MPI Reductions:       0.000e+00      0.00000

Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
                            e.g., VecAXPY() for real vectors of length N --> 2N flops
                            and VecAXPY() for complex vectors of length N --> 8N flops

Summary of Stages:   ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                        Avg     %Total     Avg     %Total   counts   %Total     Avg         %Total   counts   %Total 
 0:      Main Stage: 5.3187e+00 100.0%  4.4605e+08 100.0%  0.000e+00   0.0%  0.000e+00        0.0%  0.000e+00   0.0% 

------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
   Count: number of times phase was executed
   Time and Flops: Max - maximum over all processors
                   Ratio - ratio of maximum to minimum over all processors
   Mess: number of messages sent
   Avg. len: average message length (bytes)
   Reduct: number of global reductions
   Global: entire computation
   Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
      %T - percent time in this phase         %F - percent flops in this phase
      %M - percent messages in this phase     %L - percent message lengths in this phase
      %R - percent reductions in this phase
   Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors)
------------------------------------------------------------------------------------------------------------------------
Event                Count      Time (sec)     Flops                             --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %F %M %L %R  %T %F %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage

SNESSolve              1 1.0 4.7775e+00 1.0 4.25e+08 1.0 0.0e+00 0.0e+00 0.0e+00 90 95  0  0  0  90 95  0  0  0    89
SNESFunctionEval       9 1.0 6.5902e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 12  0  0  0  0  12  0  0  0  0     0
SNESJacobianEval       2 1.0 4.2868e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
SNESLineSearch         2 1.0 3.1556e-01 1.0 3.57e+07 1.0 0.0e+00 0.0e+00 0.0e+00  6  8  0  0  0   6  8  0  0  0   113
VecView                2 1.0 2.0043e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecDot                 2 1.0 1.6801e-03 1.0 4.46e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  1  0  0  0   0  1  0  0  0  2655
VecMDot               52 1.0 3.4153e-02 1.0 8.20e+07 1.0 0.0e+00 0.0e+00 0.0e+00  1 18  0  0  0   1 18  0  0  0  2400
VecNorm               85 1.0 1.8394e-02 1.0 7.58e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0 17  0  0  0   0 17  0  0  0  4122
VecScale             120 1.0 2.6920e-02 1.0 5.35e+07 1.0 0.0e+00 0.0e+00 0.0e+00  1 12  0  0  0   1 12  0  0  0  1988
VecCopy              110 1.0 5.4078e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  1  0  0  0  0   1  0  0  0  0     0
VecSet               174 1.0 2.7492e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  5  0  0  0  0   5  0  0  0  0     0
VecAXPY               73 1.0 5.1098e-02 1.0 7.58e+07 1.0 0.0e+00 0.0e+00 0.0e+00  1 17  0  0  0   1 17  0  0  0  1484
VecWAXPY              11 1.0 2.1958e-02 1.0 1.90e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  4  0  0  0   0  4  0  0  0   863
VecMAXPY              78 1.0 4.1620e-02 1.0 1.24e+08 1.0 0.0e+00 0.0e+00 0.0e+00  1 28  0  0  0   1 28  0  0  0  2987
VecAssemblyBegin     149 1.0 6.8188e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecAssemblyEnd       149 1.0 4.5538e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecPointwiseMult       2 1.0 3.2470e-03 1.0 2.23e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  1  0  0  0   0  1  0  0  0   687
VecScatterBegin      158 1.0 1.5226e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  3  0  0  0  0   3  0  0  0  0     0
VecReduceArith         4 1.0 3.0560e-03 1.0 8.92e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  2  0  0  0   0  2  0  0  0  2919
VecReduceComm          2 1.0 9.0599e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecNormalize          78 1.0 2.8533e-02 1.0 9.03e+07 1.0 0.0e+00 0.0e+00 0.0e+00  1 20  0  0  0   1 20  0  0  0  3166
MatMult MF             6 1.0 4.6223e-01 1.0 3.35e+07 1.0 0.0e+00 0.0e+00 0.0e+00  9  8  0  0  0   9  8  0  0  0    72
MatMult               66 1.0 4.0039e+00 1.0 6.36e+07 1.0 0.0e+00 0.0e+00 0.0e+00 75 14  0  0  0  75 14  0  0  0    16
MatMultAdd            24 1.0 7.7939e-01 1.0 2.01e+07 1.0 0.0e+00 0.0e+00 0.0e+00 15  5  0  0  0  15  5  0  0  0    26
MatAssemblyBegin       2 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatAssemblyEnd         2 1.0 4.2679e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
KSPGMRESOrthog        52 1.0 6.1561e-02 1.0 1.64e+08 1.0 0.0e+00 0.0e+00 0.0e+00  1 37  0  0  0   1 37  0  0  0  2663
KSPSetUp               6 1.0 4.1640e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  1  0  0  0  0   1  0  0  0  0     0
KSPSolve               2 1.0 4.3720e+00 1.0 3.87e+08 1.0 0.0e+00 0.0e+00 0.0e+00 82 87  0  0  0  82 87  0  0  0    89
PCSetUp                6 1.0 1.6212e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
PCApply                6 1.0 4.0072e+00 1.0 3.04e+08 1.0 0.0e+00 0.0e+00 0.0e+00 75 68  0  0  0  75 68  0  0  0    76
------------------------------------------------------------------------------------------------------------------------

Memory usage is given in bytes:

Object Type          Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

                SNES     1              1         1332     0
      SNESLineSearch     1              1          864     0
              DMSNES     2              2         1328     0
              Vector   137            137    568058072     0
      Vector Scatter     8              8         5248     0
             MatMFFD     1              1          784     0
              Matrix     8              8        19440     0
    Distributed Mesh     8              8        39216     0
Star Forest Bipartite Graph    16             16        13584     0
     Discrete System     8              8         6784     0
           Index Set    16             16      4046072     0
   IS L to G Mapping     7              7      4038024     0
       Krylov Solver     5              5        91800     0
     DMKSP interface     1              1          648     0
      Preconditioner     5              5         4352     0
              Viewer     3              2         1536     0
========================================================================================================================
Average time to get PetscTime(): 0
#PETSc Option Table entries:
-DB_ksp_monitor
-DB_pc_type none
-DP_ksp_monitor
-DP_pc_type none
-Drho_ksp_monitor
-Drho_pc_type none
-PS_ksp_monitor
-PS_pc_type none
-ksp_converged_reason
-ksp_monitor
-log_summary
-mult_lu
-nts 1
-snes_converged_reason
-snes_mf
-snes_monitor
#End of PETSc Option Table entries
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
Configure options: --prefix=/csc/softs/anl/petsc-3.6.0/intel-15.0.0.090/bullxmpi-1.2.8.2/real --with-debugging=0 --with-x=0 --with-cc=mpicc --with-fc=mpif90 --with-cxx=mpicxx --with-fortran --known-mpi-shared-libraries=1 --with-scalar-type=real --with-precision=double --CFLAGS="-g -O3 -mavx -mkl" --CXXFLAGS="-g -O3 -mavx -mkl" --FFLAGS="-g -O3 -mavx -mkl"
-----------------------------------------
Libraries compiled on Mon Sep 28 20:22:47 2015 on helios85 
Machine characteristics: Linux-2.6.32-573.1.1.el6.Bull.80.x86_64-x86_64-with-redhat-6.4-Santiago
Using PETSc directory: /csc/releases/buildlog/anl/petsc-3.6.0/intel-15.0.0.090/bullxmpi-1.2.8.2/real/petsc-3.6.0
Using PETSc arch: arch-linux2-c-opt
-----------------------------------------

Using C compiler: mpicc -g -O3 -mavx -mkl -fPIC  ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: mpif90 -g -O3 -mavx -mkl -fPIC   ${FOPTFLAGS} ${FFLAGS} 
-----------------------------------------

Using include paths: -I/csc/releases/buildlog/anl/petsc-3.6.0/intel-15.0.0.090/bullxmpi-1.2.8.2/real/petsc-3.6.0/arch-linux2-c-opt/include -I/csc/releases/buildlog/anl/petsc-3.6.0/intel-15.0.0.090/bullxmpi-1.2.8.2/real/petsc-3.6.0/include -I/csc/releases/buildlog/anl/petsc-3.6.0/intel-15.0.0.090/bullxmpi-1.2.8.2/real/petsc-3.6.0/include -I/csc/releases/buildlog/anl/petsc-3.6.0/intel-15.0.0.090/bullxmpi-1.2.8.2/real/petsc-3.6.0/arch-linux2-c-opt/include -I/opt/mpi/bullxmpi/1.2.8.2/include
-----------------------------------------

Using C linker: mpicc
Using Fortran linker: mpif90
Using libraries: -Wl,-rpath,/csc/releases/buildlog/anl/petsc-3.6.0/intel-15.0.0.090/bullxmpi-1.2.8.2/real/petsc-3.6.0/arch-linux2-c-opt/lib -L/csc/releases/buildlog/anl/petsc-3.6.0/intel-15.0.0.090/bullxmpi-1.2.8.2/real/petsc-3.6.0/arch-linux2-c-opt/lib -lpetsc -lhwloc -lxml2 -lssl -lcrypto -Wl,-rpath,/opt/mpi/bullxmpi/1.2.8.2/lib -L/opt/mpi/bullxmpi/1.2.8.2/lib -Wl,-rpath,/opt/intel/composer_xe_2015.0.090/mkl/lib/intel64 -L/opt/intel/composer_xe_2015.0.090/mkl/lib/intel64 -Wl,-rpath,/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -L/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.4.7 -L/usr/lib/gcc/x86_64-redhat-linux/4.4.7 -lmpi_f90 -lmpi_f77 -lm -lifport -lifcore -lm -lmpi_cxx -ldl -Wl,-rpath,/opt/mpi/bullxmpi/1.2.8.2/lib -L/opt/mpi/bullxmpi/1.2.8.2/lib -lmpi -lnuma -lrt -lnsl -lutil -Wl,-rpath,/opt/mpi/bullxmpi/1.2.8.2/lib -L/opt/mpi/bullxmpi/1.2.8.2/lib -Wl,-rpath,/opt/intel/composer_xe_2015.0.090/mkl/lib/intel64 -L/opt/intel/composer_xe_2015.0.090/mkl/lib/intel64 -Wl,-rpath,/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -L/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -Wl,-rpath,/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -L/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.4.7 -L/usr/lib/gcc/x86_64-redhat-linux/4.4.7 -Wl,-rpath,/opt/intel/composer_xe_2015.0.090/mkl/lib/intel64 -L/opt/intel/composer_xe_2015.0.090/mkl/lib/intel64 -Wl,-rpath,/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -L/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -lmkl_intel_lp64 -lmkl_intel_thread -lmkl_core -liomp5 -Wl,-rpath,/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -L/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -limf -Wl,-rpath,/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -L/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -Wl,-rpath,/opt/intel/composer_xe_2015.0.090/mkl/lib/intel64 -L/opt/intel/composer_xe_2015.0.090/mkl/lib/intel64 -lsvml -lirng -lipgo -ldecimal -lcilkrts -lstdc++ -lgcc_s -lirc -lpthread -lirc_s -Wl,-rpath,/opt/mpi/bullxmpi/1.2.8.2/lib -L/opt/mpi/bullxmpi/1.2.8.2/lib -Wl,-rpath,/opt/intel/composer_xe_2015.0.090/mkl/lib/intel64 -L/opt/intel/composer_xe_2015.0.090/mkl/lib/intel64 -Wl,-rpath,/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -L/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -Wl,-rpath,/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -L/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.4.7 -L/usr/lib/gcc/x86_64-redhat-linux/4.4.7 -Wl,-rpath,/opt/intel/composer_xe_2015.0.090/mkl/lib/intel64 -L/opt/intel/composer_xe_2015.0.090/mkl/lib/intel64 -Wl,-rpath,/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -L/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -Wl,-rpath,/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -L/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -Wl,-rpath,/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -L/opt/intel/composer_xe_2015.0.090/compiler/lib/intel64 -Wl,-rpath,/opt/intel/composer_xe_2015.0.090/mkl/lib/intel64 -L/opt/intel/composer_xe_2015.0.090/mkl/lib/intel64 -ldl 
-----------------------------------------

