[petsc-users] Troubles updating my code from PETSc-3.4 to 3.5 Using MUMPS for KSPSolve()
Matthew Knepley
knepley at gmail.com
Thu Dec 11 10:16:05 CST 2014
On Thu, Dec 11, 2014 at 10:07 AM, Marc MEDALE <marc.medale at univ-amu.fr>
wrote:
> Dear Matt,
>
> The output files obtained with the PETSc-3.4p4 and 3.5p1 versions, using the
> following command line:
> -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps
> -mat_mumps_icntl_8 0 -ksp_monitor -ksp_view
>
> are attached below. Setting aside flops and memory usage per core, a diff
> between the two output files reduces to:
> diff Output_3.4p4.txt Output_3.5p1.txt
> 14c14
> < Matrix Object: 64 MPI processes
> ---
> > Mat Object: 64 MPI processes
> 18c18
> < total: nonzeros=481059588, allocated nonzeros=481059588
> ---
> > total: nonzeros=4.8106e+08, allocated nonzeros=4.8106e+08
> 457c457
> < INFOG(10) (total integer space store the matrix factors after factorization): 26149876
> ---
> > INFOG(10) (total integer space store the matrix factors after factorization): 26136333
> 461c461
> < INFOG(14) (number of memory compress after factorization): 54
> ---
> > INFOG(14) (number of memory compress after factorization): 48
> 468,469c468,469
> < INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 338
> < INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 19782
> ---
> > INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 334
> > INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 19779
> 472a473,478
> > INFOG(28) (after factorization: number of null pivots encountered): 0
> > INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 470143172
> > INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 202, 10547
> > INFOG(32) (after analysis: type of analysis done): 1
> > INFOG(33) (value used for ICNTL(8)): 0
> > INFOG(34) (exponent of the determinant if determinant is requested): 0
> 474c480
> < Matrix Object: 64 MPI processes
> ---
> > Mat Object: 64 MPI processes
> 477c483
> < total: nonzeros=63720324, allocated nonzeros=63720324
> ---
> > total: nonzeros=6.37203e+07, allocated nonzeros=6.37203e+07
> 481c487
> < Norme de U 1 7.37266E-02, L 1 1.00000E+00
> ---
> > Norme de U 1 1.61172E-02, L 1 1.00000E+00
> 483c489
> < Temps total d execution : 198.373291969299
> ---
> > Temps total d execution : 216.934082031250
>
>
These appear to be two different matrices with the same, or about the same,
structure. The factorization is proceeding differently, I am guessing due to
different pivots. Can you write the matrix to a binary file using MatView()
and load it into both versions, so that we are certain it is the same?
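For instance, a minimal sketch of the dump and reload, assuming your matrix
MATGLOB, the usual PETSc Fortran includes, and an arbitrary file name:

      PetscViewer VIEWER
      Mat         MATLOADED

!     Dump the assembled matrix to a binary file (run in both versions)
      CALL PetscViewerBinaryOpen(PETSC_COMM_WORLD,'matglob.dat',
     &     FILE_MODE_WRITE,VIEWER,IER)
      CALL MatView(MATGLOB,VIEWER,IER)
      CALL PetscViewerDestroy(VIEWER,IER)

!     Reload it, e.g. in a small test driver, and compare the two files
      CALL MatCreate(PETSC_COMM_WORLD,MATLOADED,IER)
      CALL PetscViewerBinaryOpen(PETSC_COMM_WORLD,'matglob.dat',
     &     FILE_MODE_READ,VIEWER,IER)
      CALL MatLoad(MATLOADED,VIEWER,IER)
      CALL PetscViewerDestroy(VIEWER,IER)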
Thanks,
Matt
> This diff does not reveal any striking differences, except in the L2 norm
> of the solution vectors.
>
> I would appreciate any assistance in overcoming this quite bizarre
> behavior.
>
> Thank you.
>
> Marc MEDALE
>
> =========================================================
> Université Aix-Marseille, Polytech'Marseille, Dépt Mécanique Energétique
> Laboratoire IUSTI, UMR 7343 CNRS-Université Aix-Marseille
> Technopole de Chateau-Gombert, 5 rue Enrico Fermi
> 13453 MARSEILLE, Cedex 13, FRANCE
>
> ---------------------------------------------------------------------------------------------------
> Tel : +33 (0)4.91.10.69.14 or 38
> Fax : +33 (0)4.91.10.69.69
> e-mail : marc.medale at univ-amu.fr
> =========================================================
>
> On 11 Dec 2014, at 11:43, Matthew Knepley <knepley at gmail.com> wrote:
>
> On Thu, Dec 11, 2014 at 4:38 AM, Marc MEDALE <marc.medale at univ-amu.fr>
> wrote:
>
>> Dear PETSc users,
>>
>> I have just updated my research code, which has used PETSc for a while,
>> to PETSc-3.5, but I am facing an astonishing difference between the
>> PETSc-3.4 and 3.5 versions when solving a very ill-conditioned algebraic
>> system with MUMPS (4.10.0 in both cases).
>>
>> The only differences in my Fortran source code are the following:
>> Loma1-medale% diff ../version_3.5/solvEFL_MAN_SBIF.F
>> ../version_3.4/solvEFL_MAN_SBIF.F
>> 336,337d335
>> < CALL MatSetOption(MATGLOB,MAT_KEEP_NONZERO_PATTERN,
>> < & PETSC_TRUE,IER)
>> 749,750c747,748
>> < CALL KSPSetTolerances(KSP1,TOL,PETSC_DEFAULT_REAL,
>> < & PETSC_DEFAULT_REAL,PETSC_DEFAULT_INTEGER,IER)
>> ---
>> > CALL KSPSetTolerances(KSP1,TOL,PETSC_DEFAULT_DOUBLE_PRECISION,
>> > & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,IER)
>> 909c907,908
>> < CALL KSPSetOperators(KSP1,MATGLOB,MATGLOB,IER)
>> ---
>> > CALL KSPSetOperators(KSP1,MATGLOB,MATGLOB,
>> > & SAME_NONZERO_PATTERN,IER)
>>
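Side note on this part of the port: in 3.5 the MatStructure argument of
KSPSetOperators() is gone because PETSc now tracks the nonzero state of the
matrix itself, so dropping SAME_NONZERO_PATTERN as you did is the intended
update. Explicit reuse of the preconditioner (the old SAME_PRECONDITIONER
behavior) would now be requested separately; a sketch with your names:

!     PETSc 3.5: preconditioner reuse is an explicit request, not a flag
!     on KSPSetOperators() (only needed to mimic SAME_PRECONDITIONER)
      CALL KSPSetReusePreconditioner(KSP1,PETSC_TRUE,IER)
      CALL KSPSetOperators(KSP1,MATGLOB,MATGLOB,IER)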
>> When I run the corresponding program versions on 128 cores of our cluster
>> with the same input data and the following command line arguments:
>> -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps
>> -mat_mumps_icntl_8 0
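As an aside, -mat_mumps_icntl_8 0 sets MUMPS ICNTL(8)=0, i.e. no scaling
(the INFOG(33) line in the 3.5 output above confirms the value actually
used). The same control can also be hard-wired from code; a sketch, fetching
the factored matrix after the solver package is set up (please double-check
the Fortran bindings):

      PC  PREC
      Mat FMAT

!     Fetch the MUMPS factor matrix from the LU preconditioner and set
!     ICNTL(8)=0 (no scaling) before the numerical factorization
      CALL KSPGetPC(KSP1,PREC,IER)
      CALL PCFactorSetUpMatSolverPackage(PREC,IER)
      CALL PCFactorGetMatrix(PREC,FMAT,IER)
      CALL MatMumpsSetIcntl(FMAT,8,0,IER)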
>>
>> I get the following outputs:
>> a) with PETSc-3.4p4:
>> L2 norm of solution vector: 7.39640E-02,
>>
>> b) with PETSc-3.5p1:
>> L2 norm of solution vector: 1.61325E-02
>>
>> Do I have to change anything else when updating my KSP-based code from
>> PETSc-3.4 to 3.5?
>> Have any default values in the PETSc-MUMPS interface changed from
>> PETSc-3.4 to 3.5?
>> Any hints or suggestions to help me recover the right results (those
>> obtained with PETSc-3.4) are welcome.
>>
>
> Send the output from -ksp_monitor -ksp_view for both runs. I am guessing
> that a MUMPS default changed between versions.
>
> Thanks,
>
> Matt
>
>
>> Thank you very much.
>>
>> Marc MEDALE.
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener