<html><head><meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1"></head><body><div><div>
</div>
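<div><br>For anyone following the same upgrade path, the PETSc 3.4 to 3.5 interface changes quoted in the thread below can be summarized as the following Fortran sketch. It is a minimal sketch based only on the calls quoted in the thread: KSP1, MATGLOB, TOL, and IER are the original poster's variables, the fragment needs a PETSc installation to compile, and the MatSetOption call is the one the poster added alongside the upgrade rather than a documented replacement for the removed flag:<br><pre>
C --- PETSc 3.4 form (no longer compiles under 3.5):
C       CALL KSPSetOperators(KSP1,MATGLOB,MATGLOB,
C    &amp;                     SAME_NONZERO_PATTERN,IER)
C       CALL KSPSetTolerances(KSP1,TOL,PETSC_DEFAULT_DOUBLE_PRECISION,
C    &amp;      PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,IER)

C --- PETSc 3.5 form: KSPSetOperators lost its MatStructure argument,
C     and PETSC_DEFAULT_DOUBLE_PRECISION was renamed PETSC_DEFAULT_REAL.
      CALL MatSetOption(MATGLOB,MAT_KEEP_NONZERO_PATTERN,
     &amp;                 PETSC_TRUE,IER)
      CALL KSPSetOperators(KSP1,MATGLOB,MATGLOB,IER)
      CALL KSPSetTolerances(KSP1,TOL,PETSC_DEFAULT_REAL,
     &amp;      PETSC_DEFAULT_REAL,PETSC_DEFAULT_INTEGER,IER)
</pre>Neither change should, by itself, alter the computed solution, which is why the thread turns to MUMPS defaults instead.<br></div>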
<br><div><div>On 11 Dec 2014, at 18:01, Barry Smith <<a href="mailto:bsmith@mcs.anl.gov">bsmith@mcs.anl.gov</a>> wrote:</div><blockquote type="cite"><br>Please run both with -ksp_monitor -ksp_type gmres and send the output.<br><br>Barry<br><br><blockquote type="cite">On Dec 11, 2014, at 10:07 AM, Marc MEDALE <<a href="mailto:marc.medale@univ-amu.fr">marc.medale@univ-amu.fr</a>> wrote:<br><br>Dear Matt,<br><br>The output files obtained with the PETSc-3.4p4 and 3.5p1 versions using the following command line:<br>-ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps -mat_mumps_icntl_8 0 -ksp_monitor -ksp_view<br><br>are attached below. Leaving aside flops and memory usage per core, a diff between the two output files reduces to:<br><br>diff Output_3.4p4.txt Output_3.5p1.txt<br>14c14<br>&lt; Matrix Object: 64 MPI processes<br>---<br>&gt; Mat Object: 64 MPI processes<br>18c18<br>&lt; total: nonzeros=481059588, allocated nonzeros=481059588<br>---<br>&gt; total: nonzeros=4.8106e+08, allocated nonzeros=4.8106e+08<br>457c457<br>&lt; INFOG(10) (total integer space store the matrix factors after factorization): 26149876<br>---<br>&gt; INFOG(10) (total integer space store the matrix factors after factorization): 26136333<br>461c461<br>&lt; INFOG(14) (number of memory compress after factorization): 54<br>---<br>&gt; INFOG(14) (number of memory compress after factorization): 48<br>468,469c468,469<br>&lt; INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 338<br>&lt; INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 19782<br>---<br>&gt; INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 334<br>&gt; INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 19779<br>472a473,478<br>&gt; INFOG(28) (after factorization: number of null pivots encountered): 0<br>&gt; INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 470143172<br>&gt; INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 202, 10547<br>&gt; INFOG(32) (after analysis: type of analysis done): 1<br>&gt; INFOG(33) (value used for ICNTL(8)): 0<br>&gt; INFOG(34) (exponent of the determinant if determinant is requested): 0<br>474c480<br>&lt; Matrix Object: 64 MPI processes<br>---<br>&gt; Mat Object: 64 MPI processes<br>477c483<br>&lt; total: nonzeros=63720324, allocated nonzeros=63720324<br>---<br>&gt; total: nonzeros=6.37203e+07, allocated nonzeros=6.37203e+07<br>481c487<br>&lt; Norme de U 1 7.37266E-02, L 1 1.00000E+00<br>---<br>&gt; Norme de U 1 1.61172E-02, L 1 1.00000E+00<br>483c489<br>&lt; Temps total d execution : 198.373291969299<br>---<br>&gt; Temps total d execution : 216.934082031250<br><br>This does not reveal any striking difference, except in the L2 norm of the solution vectors.<br><br>I would appreciate any assistance in overcoming this rather bizarre behavior.<br><br>Thank you.<br><br>Marc MEDALE<br><br>=========================================================<br>Université Aix-Marseille, Polytech'Marseille, Dépt Mécanique Energétique<br>Laboratoire IUSTI, UMR 7343 CNRS-Université Aix-Marseille<br>Technopole de Chateau-Gombert, 5 rue Enrico Fermi<br>13453 MARSEILLE, Cedex 13, FRANCE<br>---------------------------------------------------------------------------------------------------<br>Tel : +33 (0)4.91.10.69.14 ou 38<br>Fax : +33 (0)4.91.10.69.69<br>e-mail : <a 
href="mailto:marc.medale@univ-amu.fr">marc.medale@univ-amu.fr</a><br>=========================================================<br><br>On 11 Dec 2014, at 11:43, Matthew Knepley <<a href="mailto:knepley@gmail.com">knepley@gmail.com</a>> wrote:<br><br><blockquote type="cite">On Thu, Dec 11, 2014 at 4:38 AM, Marc MEDALE <<a href="mailto:marc.medale@univ-amu.fr">marc.medale@univ-amu.fr</a>> wrote:<br>Dear PETSc Users,<br><br>I have just updated my research code, which has used PETSc for a while, to PETSc-3.5, but I am facing an astonishing difference between the PETSc-3.4 and 3.5 versions when solving a very ill-conditioned algebraic system with MUMPS (4.10.0 in both cases).<br><br>The only differences in my Fortran source code are the following:<br><br>Loma1-medale% diff ../version_3.5/solvEFL_MAN_SBIF.F ../version_3.4/solvEFL_MAN_SBIF.F<br>336,337d335<br>&lt; CALL MatSetOption(MATGLOB,MAT_KEEP_NONZERO_PATTERN,<br>&lt; &amp; PETSC_TRUE,IER)<br>749,750c747,748<br>&lt; CALL KSPSetTolerances(KSP1,TOL,PETSC_DEFAULT_REAL,<br>&lt; &amp; PETSC_DEFAULT_REAL,PETSC_DEFAULT_INTEGER,IER)<br>---<br>&gt; CALL KSPSetTolerances(KSP1,TOL,PETSC_DEFAULT_DOUBLE_PRECISION,<br>&gt; &amp; PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,IER)<br>909c907,908<br>&lt; CALL KSPSetOperators(KSP1,MATGLOB,MATGLOB,IER)<br>---<br>&gt; CALL KSPSetOperators(KSP1,MATGLOB,MATGLOB,<br>&gt; &amp; SAME_NONZERO_PATTERN,IER)<br><br>When I run the corresponding program versions on 128 cores of our cluster with the same input data and the following command-line arguments:<br>-ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps -mat_mumps_icntl_8 0<br><br>I get the following outputs:<br>a) with PETSc-3.4p4:<br>L2 norm of solution vector: 7.39640E-02,<br><br>b) with PETSc-3.5p1:<br>L2 norm of solution vector: 1.61325E-02<br><br>Do I have to change anything else when updating my KSP-based code from PETSc-3.4 to 3.5?<br>Have any default values in the PETSc-MUMPS interface been changed from PETSc-3.4 to 3.5?<br>Any hints or suggestions that would help me recover the correct results (obtained with PETSc-3.4) are welcome.<br><br>Send the output from -ksp_monitor -ksp_view for both runs. I am guessing that a MUMPS default changed between versions.<br><br>Thanks,<br><br>Matt<br><br>Thank you very much.<br><br>Marc MEDALE.<br><br><br>-- <br>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener<br></blockquote><br>&lt;Output_3.4p4.txt&gt;&lt;Output_3.5p1.txt&gt;<br></blockquote><br></blockquote></div><br></div></body></html>