[petsc-users] About MatMumpsSetIcntl function
김성익
ksi2443 at gmail.com
Wed Nov 30 09:01:30 CST 2022
Following your comments, my options are as below.
mpirun -np 3 ./app -mpi_linear_solver_server -mpi_linear_solver_server_view
-pc_type mpi -ksp_type preonly -mpi_ksp_monitor -mpi_ksp_converged_reason
-mpi_pc_type lu -pc_mpi_always_use_server -ksp_monitor_true_residual
-ksp_converged_reason -ksp_view -mat_mumps_icntl_7 5
and the output is as below.
Residual norms for mpi_ solve.
0 KSP Residual norm 3.941360585078e-05
1 KSP Residual norm 3.325311951792e-20
Linear mpi_ solve converged due to CONVERGED_RTOL iterations 1
0 KSP none resid norm 2.000000000000e+00 true resid norm 5.241047207555e-16 ||r(i)||/||b|| 2.620523603778e-16
1 KSP none resid norm 5.241047207555e-16 true resid norm 5.241047207555e-16 ||r(i)||/||b|| 2.620523603778e-16
KSP Object: 1 MPI process
type: preonly
maximum iterations=10000, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
left preconditioning
using NONE norm type for convergence test
PC Object: 1 MPI process
type: mpi
Size of MPI communicator used for MPI parallel KSP solve 1
Desired minimum number of nonzeros per rank for MPI parallel solve 10000
*** Use -mpi_ksp_view to see the MPI KSP parameters ***
*** Use -mpi_linear_solver_server_view to statistics on all the solves ***
linear system matrix = precond matrix:
Mat Object: 1 MPI process
type: seqaij
rows=24, cols=24
total: nonzeros=576, allocated nonzeros=840
total number of mallocs used during MatSetValues calls=48
using I-node routines: found 5 nodes, limit used is 5
Solving Time : 0.003143sec
Assemble Time : 0.000994sec
TIME : 1.000000, TIME_STEP : 1.000000, ITER : 2, RESIDUAL : 5.889432e-09
TIME0 : 1.000000
MPI linear solver server statistics:
     Ranks   KSPSolve()s   Mats   KSPs   Avg. Size   Avg. Its
         1             1      1      1          24          1
WARNING! There are options you set that were not used!
WARNING! could be spelling mistake, etc!
There is one unused database option. It is:
Option left: name:-mat_mumps_icntl_7 value: 5
On Wed, Nov 30, 2022 at 11:54 PM Matthew Knepley <knepley at gmail.com> wrote:
> On Wed, Nov 30, 2022 at 9:31 AM 김성익 <ksi2443 at gmail.com> wrote:
>
>> After following the comment, I ran ./app -pc_type lu -ksp_type preonly
>> -ksp_monitor_true_residual -ksp_converged_reason -ksp_view
>> -mat_mumps_icntl_7 5
>>
>
> Okay, you can see that it is using METIS:
>
> INFOG(7) (ordering option effectively used after analysis): 5
>
> It looks like the server stuff was not seeing the option. Put it back in
> and send the output.
>
> Thanks,
>
> Matt
>
> The outputs are as below.
>>
>> 0 KSP none resid norm 2.000000000000e+00 true resid norm
>> 4.241815708566e-16 ||r(i)||/||b|| 2.120907854283e-16
>> 1 KSP none resid norm 4.241815708566e-16 true resid norm
>> 4.241815708566e-16 ||r(i)||/||b|| 2.120907854283e-16
>> Linear solve converged due to CONVERGED_ITS iterations 1
>> KSP Object: 1 MPI process
>> type: preonly
>> maximum iterations=10000, initial guess is zero
>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
>> left preconditioning
>> using NONE norm type for convergence test
>> PC Object: 1 MPI process
>> type: lu
>> out-of-place factorization
>> tolerance for zero pivot 2.22045e-14
>> matrix ordering: external
>> factor fill ratio given 0., needed 0.
>> Factored matrix follows:
>> Mat Object: 1 MPI process
>> type: mumps
>> rows=24, cols=24
>> package used to perform factorization: mumps
>> total: nonzeros=576, allocated nonzeros=576
>> MUMPS run parameters:
>> Use -ksp_view ::ascii_info_detail to display information
>> for all processes
>> RINFOG(1) (global estimated flops for the elimination after
>> analysis): 8924.
>> RINFOG(2) (global estimated flops for the assembly after
>> factorization): 0.
>> RINFOG(3) (global estimated flops for the elimination after
>> factorization): 8924.
>> (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant):
>> (0.,0.)*(2^0)
>> INFOG(3) (estimated real workspace for factors on all
>> processors after analysis): 576
>> INFOG(4) (estimated integer workspace for factors on all
>> processors after analysis): 68
>> INFOG(5) (estimated maximum front size in the complete
>> tree): 24
>> INFOG(6) (number of nodes in the complete tree): 1
>> INFOG(7) (ordering option effectively used after analysis):
>> 5
>> INFOG(8) (structural symmetry in percent of the permuted
>> matrix after analysis): 100
>> INFOG(9) (total real/complex workspace to store the matrix
>> factors after factorization): 576
>> INFOG(10) (total integer space store the matrix factors
>> after factorization): 68
>> INFOG(11) (order of largest frontal matrix after
>> factorization): 24
>> INFOG(12) (number of off-diagonal pivots): 0
>> INFOG(13) (number of delayed pivots after factorization): 0
>> INFOG(14) (number of memory compress after factorization): 0
>> INFOG(15) (number of steps of iterative refinement after
>> solution): 0
>> INFOG(16) (estimated size (in MB) of all MUMPS internal
>> data for factorization after analysis: value on the most memory consuming
>> processor): 0
>> INFOG(17) (estimated size of all MUMPS internal data for
>> factorization after analysis: sum over all processors): 0
>> INFOG(18) (size of all MUMPS internal data allocated during
>> factorization: value on the most memory consuming processor): 0
>> INFOG(19) (size of all MUMPS internal data allocated during
>> factorization: sum over all processors): 0
>> INFOG(20) (estimated number of entries in the factors): 576
>> INFOG(21) (size in MB of memory effectively used during
>> factorization - value on the most memory consuming processor): 0
>> INFOG(22) (size in MB of memory effectively used during
>> factorization - sum over all processors): 0
>> INFOG(23) (after analysis: value of ICNTL(6) effectively
>> used): 0
>> INFOG(24) (after analysis: value of ICNTL(12) effectively
>> used): 1
>> INFOG(25) (after factorization: number of pivots modified
>> by static pivoting): 0
>> INFOG(28) (after factorization: number of null pivots
>> encountered): 0
>> INFOG(29) (after factorization: effective number of entries
>> in the factors (sum over all processors)): 576
>> INFOG(30, 31) (after solution: size in Mbytes of memory
>> used during solution phase): 0, 0
>> INFOG(32) (after analysis: type of analysis done): 1
>> INFOG(33) (value used for ICNTL(8)): 7
>> INFOG(34) (exponent of the determinant if determinant is
>> requested): 0
>> INFOG(35) (after factorization: number of entries taking
>> into account BLR factor compression - sum over all processors): 576
>> INFOG(36) (after analysis: estimated size of all MUMPS
>> internal data for running BLR in-core - value on the most memory consuming
>> processor): 0
>> INFOG(37) (after analysis: estimated size of all MUMPS
>> internal data for running BLR in-core - sum over all processors): 0
>> INFOG(38) (after analysis: estimated size of all MUMPS
>> internal data for running BLR out-of-core - value on the most memory
>> consuming processor): 0
>> INFOG(39) (after analysis: estimated size of all MUMPS
>> internal data for running BLR out-of-core - sum over all processors): 0
>> linear system matrix = precond matrix:
>> Mat Object: 1 MPI process
>> type: seqaij
>> rows=24, cols=24
>> total: nonzeros=576, allocated nonzeros=840
>> total number of mallocs used during MatSetValues calls=48
>> using I-node routines: found 5 nodes, limit used is 5
>>
>>
>>
>> On Wed, Nov 30, 2022 at 11:26 PM Matthew Knepley <knepley at gmail.com> wrote:
>>
>>> On Wed, Nov 30, 2022 at 9:20 AM 김성익 <ksi2443 at gmail.com> wrote:
>>>
>>>> In my code there are below.
>>>> PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
>>>> PetscCall(KSPSetOperators(ksp, xGK, xGK));
>>>> PetscCall(KSPGetPC(ksp, &pc));
>>>> PetscCall(PCSetType(pc, PCLU));
>>>> PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS));
>>>> PetscCall(KSPSetFromOptions(ksp));
>>>>
>>>> and my runtime options are as below.
>>>> mpirun -np 3 ./app -mpi_linear_solver_server
>>>> -mpi_linear_solver_server_view -pc_type mpi -ksp_type preonly
>>>> -mpi_ksp_monitor -mpi_ksp_converged_reason -mpi_pc_type lu
>>>> -pc_mpi_always_use_server -mat_mumps_icntl_7 5
>>>>
>>>
>>> 1) Get rid of the all server stuff until we see what is wrong with your
>>> code
>>>
>>> 2) Always run in serial until it works
>>>
>>> ./app -pc_type lu -ksp_type preonly -ksp_monitor_true_residual
>>> -ksp_converged_reason -ksp_view -mat_mumps_icntl_7 5
>>>
>>> Send the output so we can see what the solver is.
>>>
>>> Thanks,
>>>
>>> Matt
>>>
>>> On Wed, Nov 30, 2022 at 11:16 PM Matthew Knepley <knepley at gmail.com> wrote:
>>>>
>>>>> On Wed, Nov 30, 2022 at 9:10 AM 김성익 <ksi2443 at gmail.com> wrote:
>>>>>
>>>>>> When I set the ICNTL value via a command-line option, the output is as below.
>>>>>>
>>>>>> WARNING! There are options you set that were not used!
>>>>>> WARNING! could be spelling mistake, etc!
>>>>>> There is one unused database option. It is:
>>>>>> Option left: name:-mat_mumps_icntl_7 value: 5
>>>>>>
>>>>>> Does it work?
>>>>>>
>>>>>
>>>>> Are you calling KSPSetFromOptions() after the PC is created?
>>>>>
>>>>> -pc_type lu -pc_factor_mat_solver_type mumps -mat_mumps_icntl_7 3
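A minimal sketch of that options-driven setup (a sketch only; A, b, and x are placeholder names for the assembled matrix, right-hand side, and solution, and PETSc is assumed to be built with MUMPS). KSPSetFromOptions() is called after the PC type and solver package are set, so -mat_mumps_icntl_7 given on the command line reaches MUMPS:

  KSP ksp;
  PC  pc;
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetType(ksp, KSPPREONLY));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCLU));
  PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS));
  PetscCall(KSPSetFromOptions(ksp));  /* reads -pc_type, -pc_factor_mat_solver_type, -mat_mumps_icntl_* */
  PetscCall(KSPSolve(ksp, b, x));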
>>>>>
>>>>> Thanks,
>>>>>
>>>>> Matt
>>>>>
>>>>>
>>>>>> Thanks,
>>>>>> Hyung Kim
>>>>>>
>>>>>> On Wed, Nov 30, 2022 at 11:04 PM Matthew Knepley <knepley at gmail.com> wrote:
>>>>>>
>>>>>>> On Wed, Nov 30, 2022 at 8:58 AM 김성익 <ksi2443 at gmail.com> wrote:
>>>>>>>
>>>>>>>> I'm working on FEM.
>>>>>>>> When I used MUMPS alone, I found it efficient to use MUMPS with
>>>>>>>> METIS.
>>>>>>>> So my goal is to use the MUMPS solver with METIS.
>>>>>>>>
>>>>>>>> I tried to set METIS (via ICNTL(7) = 5) after global matrix assembly
>>>>>>>> and just before KSPSolve.
>>>>>>>> However, there is an error from 'PCFactorGetMatrix' and
>>>>>>>> 'MatMumpsSetIcntl'.
>>>>>>>>
>>>>>>>> How can I fix this?
>>>>>>>>
>>>>>>>
>>>>>>> Give the Icntl as an option.
>>>>>>>
>>>>>>> Thanks,
>>>>>>>
>>>>>>> Matt
>>>>>>>
>>>>>>>
>>>>>>>> Thanks,
>>>>>>>> Hyung Kim
>>>>>>>>
>>>>>>>> On Wed, Nov 30, 2022 at 10:44 PM Matthew Knepley <knepley at gmail.com> wrote:
>>>>>>>>
>>>>>>>>> On Wed, Nov 30, 2022 at 8:40 AM 김성익 <ksi2443 at gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> Following your comments,
>>>>>>>>>>
>>>>>>>>>> After matrix assembly end,
>>>>>>>>>> PetscCall(KSPGetPC(ksp,&pc));
>>>>>>>>>> PetscCall(KSPSetFromOptions(ksp));
>>>>>>>>>> PetscCall(KSPSetUp(ksp));
>>>>>>>>>> PetscCall(PCFactorGetMatrix(pc,&xGK));
>>>>>>>>>>
>>>>>>>>>> However, there is another error as below.
>>>>>>>>>> [0]PETSC ERROR: Object is in wrong state
>>>>>>>>>> [0]PETSC ERROR: Not for factored matrix
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>> The error message is telling you that you cannot alter values in
>>>>>>>>> the factored matrix. This is because
>>>>>>>>> the direct solvers use their own internal storage formats which we
>>>>>>>>> cannot alter, and you should probably
>>>>>>>>> not alter them either.
>>>>>>>>>
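As a sketch of how that separation might look (b and x are assumed names for the right-hand side and solution): keep a separate Mat handle, say F, for the factored matrix returned by PCFactorGetMatrix(), so xGK stays the assembled matrix and is the only one that gets zeroed and refilled each step:

  Mat F;
  PetscCall(KSPSetUp(ksp));
  PetscCall(PCFactorGetMatrix(pc, &F));  /* F is the factored matrix; do not modify its values */
  PetscCall(KSPSolve(ksp, b, x));
  PetscCall(MatZeroEntries(xGK));        /* zero the assembled matrix for the next step, never F */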
>>>>>>>>> What are you trying to do?
>>>>>>>>>
>>>>>>>>> Thanks,
>>>>>>>>>
>>>>>>>>> Matt
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble
>>>>>>>>>> shooting.
>>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown
>>>>>>>>>> [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by
>>>>>>>>>> ksi2443 Wed Nov 30 05:37:52 2022
>>>>>>>>>> [0]PETSC ERROR: Configure options -download-mumps
>>>>>>>>>> -download-scalapack -download-parmetis -download-metis
>>>>>>>>>> [0]PETSC ERROR: #1 MatZeroEntries() at
>>>>>>>>>> /home/ksi2443/petsc/src/mat/interface/matrix.c:6024
>>>>>>>>>> [0]PETSC ERROR: #2 main() at
>>>>>>>>>> /home/ksi2443/Downloads/coding/a1.c:339
>>>>>>>>>> [0]PETSC ERROR: No PETSc Option Table entries
>>>>>>>>>>
>>>>>>>>>> How can I fix this?
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> Thanks,
>>>>>>>>>> Hyung Kim
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Wed, Nov 30, 2022 at 4:18 PM Jose E. Roman <jroman at dsic.upv.es> wrote:
>>>>>>>>>>
>>>>>>>>>>> You have to call PCFactorGetMatrix() first. See any of the
>>>>>>>>>>> examples that use MatMumpsSetIcntl(), for instance
>>>>>>>>>>> https://petsc.org/release/src/ksp/ksp/tutorials/ex52.c.html
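Roughly the pattern from that example, adapted to the names used earlier in this thread (ksp, pc, xGK; b and x are assumed placeholders), as a sketch only:

  Mat F;
  PetscCall(KSPSetOperators(ksp, xGK, xGK));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCLU));
  PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS));
  PetscCall(PCFactorSetUpMatSolverType(pc));   /* creates the internal MUMPS matrix */
  PetscCall(PCFactorGetMatrix(pc, &F));
  PetscCall(MatMumpsSetIcntl(F, 7, 5));        /* ICNTL(7)=5: METIS ordering */
  PetscCall(KSPSolve(ksp, b, x));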
>>>>>>>>>>>
>>>>>>>>>>> Jose
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> > On Nov 30, 2022, at 6:52, 김성익 <ksi2443 at gmail.com> wrote:
>>>>>>>>>>> >
>>>>>>>>>>> > Hello,
>>>>>>>>>>> >
>>>>>>>>>>> >
>>>>>>>>>>> > I tried to enable the METIS ordering option in MUMPS by using
>>>>>>>>>>> > ' PetscCall(MatMumpsSetIcntl(Mat, 7, 5));'
>>>>>>>>>>> >
>>>>>>>>>>> > However, there is an error as follows
>>>>>>>>>>> >
>>>>>>>>>>> > [0]PETSC ERROR: Object is in wrong state
>>>>>>>>>>> > [0]PETSC ERROR: Only for factored matrix
>>>>>>>>>>> > [0]PETSC ERROR: See https://petsc.org/release/faq/ for
>>>>>>>>>>> trouble shooting.
>>>>>>>>>>> > [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown
>>>>>>>>>>> > [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by
>>>>>>>>>>> ksi2443 Tue Nov 29 21:12:41 2022
>>>>>>>>>>> > [0]PETSC ERROR: Configure options -download-mumps
>>>>>>>>>>> -download-scalapack -download-parmetis -download-metis
>>>>>>>>>>> > [0]PETSC ERROR: #1 MatMumpsSetIcntl() at
>>>>>>>>>>> /home/ksi2443/petsc/src/mat/impls/aij/mpi/mumps/mumps.c:2478
>>>>>>>>>>> > [0]PETSC ERROR: #2 main() at
>>>>>>>>>>> /home/ksi2443/Downloads/coding/a1.c:149
>>>>>>>>>>> > [0]PETSC ERROR: No PETSc Option Table entries
>>>>>>>>>>> >
>>>>>>>>>>> > How can I fix this error?
>>>>>>>>>>> >
>>>>>>>>>>> > Thank you for your help.
>>>>>>>>>>> >
>>>>>>>>>>> >
>>>>>>>>>>> > Hyung Kim
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> <http://www.cse.buffalo.edu/~knepley/>
>