<div dir="ltr">Following your comments my options are as below.<div>mpirun -np 3 ./app -mpi_linear_solver_server -mpi_linear_solver_server_view -pc_type mpi -ksp_type preonly -mpi_ksp_monitor -mpi_ksp_converged_reason -mpi_pc_type lu -pc_mpi_always_use_server -ksp_monitor_true_residual -ksp_converged_reason -ksp_view -mat_mumps_icntl_7 5<br></div><div><br></div><div>and the outputs are as below.</div><div> Residual norms for mpi_ solve.<br> 0 KSP Residual norm 3.941360585078e-05 <br> 1 KSP Residual norm 3.325311951792e-20 <br>Linear mpi_ solve converged due to CONVERGED_RTOL iterations 1<br> 0 KSP none resid norm 2.000000000000e+00 true resid norm 5.241047207555e-16 ||r(i)||/||b|| 2.620523603778e-16<br> 1 KSP none resid norm 5.241047207555e-16 true resid norm 5.241047207555e-16 ||r(i)||/||b|| 2.620523603778e-16<br>KSP Object: 1 MPI process<br> type: preonly<br> maximum iterations=10000, initial guess is zero<br> tolerances: relative=1e-05, absolute=1e-50, divergence=10000.<br> left preconditioning<br> using NONE norm type for convergence test<br>PC Object: 1 MPI process<br> type: mpi<br> Size of MPI communicator used for MPI parallel KSP solve 1<br> Desired minimum number of nonzeros per rank for MPI parallel solve 10000<br> *** Use -mpi_ksp_view to see the MPI KSP parameters ***<br> *** Use -mpi_linear_solver_server_view to statistics on all the solves ***<br> linear system matrix = precond matrix:<br> Mat Object: 1 MPI process<br> type: seqaij<br> rows=24, cols=24<br> total: nonzeros=576, allocated nonzeros=840<br> total number of mallocs used during MatSetValues calls=48<br> using I-node routines: found 5 nodes, limit used is 5<br>Solving Time : 0.003143sec<br>Assemble Time : 0.000994sec<br><br> TIME : 1.000000, TIME_STEP : 1.000000, ITER : 2, RESIDUAL : 5.889432e-09<br> TIME0 : 1.000000<br>MPI linear solver server statistics:<br> Ranks KSPSolve()s Mats KSPs Avg. Size Avg. Its<br> 1 1 1 1 24 1<br>WARNING! There are options you set that were not used!<br>WARNING! could be spelling mistake, etc!<br>There is one unused database option. It is:<br>Option left: name:-mat_mumps_icntl_7 value: 5<br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">2022년 11월 30일 (수) 오후 11:54, Matthew Knepley <<a href="mailto:knepley@gmail.com">knepley@gmail.com</a>>님이 작성:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr">On Wed, Nov 30, 2022 at 9:31 AM 김성익 <<a href="mailto:ksi2443@gmail.com" target="_blank">ksi2443@gmail.com</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">After folloing the comment,
./app -pc_type lu -ksp_type preonly -ksp_monitor_true_residual -ksp_converged_reason -ksp_view -mat_mumps_icntl_7 5

Okay, you can see that it is using METIS:

  INFOG(7) (ordering option effectively used after analysis): 5

It looks like the server stuff was not seeing the option. Put it back in and send the output.

  Thanks,

    Matt

The outputs are as below.

  0 KSP none resid norm 2.000000000000e+00 true resid norm 4.241815708566e-16 ||r(i)||/||b|| 2.120907854283e-16
  1 KSP none resid norm 4.241815708566e-16 true resid norm 4.241815708566e-16 ||r(i)||/||b|| 2.120907854283e-16
Linear solve converged due to CONVERGED_ITS iterations 1
KSP Object: 1 MPI process
  type: preonly
  maximum iterations=10000, initial guess is zero
  tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
  left preconditioning
  using NONE norm type for convergence test
PC Object: 1 MPI process
  type: lu
    out-of-place factorization
    tolerance for zero pivot 2.22045e-14
    matrix ordering: external
    factor fill ratio given 0., needed 0.
      Factored matrix follows:
        Mat Object: 1 MPI process
          type: mumps
          rows=24, cols=24
          package used to perform factorization: mumps
          total: nonzeros=576, allocated nonzeros=576
          MUMPS run parameters:
            Use -ksp_view ::ascii_info_detail to display information for all processes
            RINFOG(1) (global estimated flops for the elimination after analysis): 8924.
            RINFOG(2) (global estimated flops for the assembly after factorization): 0.
            RINFOG(3) (global estimated flops for the elimination after factorization): 8924.
            (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
            INFOG(3) (estimated real workspace for factors on all processors after analysis): 576
            INFOG(4) (estimated integer workspace for factors on all processors after analysis): 68
            INFOG(5) (estimated maximum front size in the complete tree): 24
            INFOG(6) (number of nodes in the complete tree): 1
            INFOG(7) (ordering option effectively used after analysis): 5
            INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
            INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 576
            INFOG(10) (total integer space store the matrix factors after factorization): 68
            INFOG(11) (order of largest frontal matrix after factorization): 24
            INFOG(12) (number of off-diagonal pivots): 0
            INFOG(13) (number of delayed pivots after factorization): 0
            INFOG(14) (number of memory compress after factorization): 0
            INFOG(15) (number of steps of iterative refinement after solution): 0
            INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 0
            INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 0
            INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 0
            INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 0
            INFOG(20) (estimated number of entries in the factors): 576
            INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 0
            INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 0
            INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0
            INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1
            INFOG(25) (after factorization: number of pivots modified by static pivoting): 0
            INFOG(28) (after factorization: number of null pivots encountered): 0
            INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 576
            INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0
            INFOG(32) (after analysis: type of analysis done): 1
            INFOG(33) (value used for ICNTL(8)): 7
            INFOG(34) (exponent of the determinant if determinant is requested): 0
            INFOG(35) (after factorization: number of entries taking into account BLR factor compression - sum over all processors): 576
            INFOG(36) (after analysis: estimated size of all MUMPS internal data for running BLR in-core - value on the most memory consuming processor): 0
            INFOG(37) (after analysis: estimated size of all MUMPS internal data for running BLR in-core - sum over all processors): 0
            INFOG(38) (after analysis: estimated size of all MUMPS internal data for running BLR out-of-core - value on the most memory consuming processor): 0
            INFOG(39) (after analysis: estimated size of all MUMPS internal data for running BLR out-of-core - sum over all processors): 0
  linear system matrix = precond matrix:
  Mat Object: 1 MPI process
    type: seqaij
    rows=24, cols=24
    total: nonzeros=576, allocated nonzeros=840
    total number of mallocs used during MatSetValues calls=48
      using I-node routines: found 5 nodes, limit used is 5

On Wed, Nov 30, 2022 at 11:26 PM Matthew Knepley <knepley@gmail.com> wrote:

On Wed, Nov 30, 2022 at 9:20 AM 김성익 <ksi2443@gmail.com> wrote:

In my code there is the following:

PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
PetscCall(KSPSetOperators(ksp, xGK, xGK));
PetscCall(KSPGetPC(ksp, &pc));
PetscCall(PCSetType(pc, PCLU));
PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS));
PetscCall(KSPSetFromOptions(ksp));

and my runtime options are as below:

mpirun -np 3 ./app -mpi_linear_solver_server -mpi_linear_solver_server_view -pc_type mpi -ksp_type preonly -mpi_ksp_monitor -mpi_ksp_converged_reason -mpi_pc_type lu -pc_mpi_always_use_server -mat_mumps_icntl_7 5

1) Get rid of all the server stuff until we see what is wrong with your code.

2) Always run in serial until it works.

  ./app -pc_type lu -ksp_type preonly -ksp_monitor_true_residual -ksp_converged_reason -ksp_view -mat_mumps_icntl_7 5

Send the output so we can see what the solver is.
  Thanks,

    Matt

On Wed, Nov 30, 2022 at 11:16 PM Matthew Knepley <knepley@gmail.com> wrote:

On Wed, Nov 30, 2022 at 9:10 AM 김성익 <ksi2443@gmail.com> wrote:

When I adopt the ICNTL by using the option, the outputs are as below:

WARNING! There are options you set that were not used!
WARNING! could be spelling mistake, etc!
There is one unused database option. It is:
Option left: name:-mat_mumps_icntl_7 value: 5

Does it work?

Are you calling KSPSetFromOptions() after the PC is created?

  -pc_type lu -pc_factor_mat_solver_type mumps -mat_mumps_icntl_7 3
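That is, the setup would look roughly like this (a sketch; A, b, and x stand in for your assembled matrix and vectors):

KSP ksp;
PC  pc;
PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
PetscCall(KSPSetOperators(ksp, A, A));
PetscCall(KSPGetPC(ksp, &pc));
PetscCall(PCSetType(pc, PCLU));
PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS));
PetscCall(KSPSetFromOptions(ksp)); /* after the PC is created and configured, so the -pc_ and -mat_mumps_ options are picked up */
PetscCall(KSPSolve(ksp, b, x));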
  Thanks,

    Matt

Thanks,
Hyung Kim

On Wed, Nov 30, 2022 at 11:04 PM Matthew Knepley <knepley@gmail.com> wrote:

On Wed, Nov 30, 2022 at 8:58 AM 김성익 <ksi2443@gmail.com> wrote:

I'm working on FEM. When I used MUMPS alone, I found it efficient to use MUMPS with METIS, so my purpose is using the MUMPS solver with METIS.

I tried to set METIS (by ICNTL(7) = 5) after global matrix assembly and just before KSPSolve. However, there is an error because of 'PCFactorGetMatrix' and 'MatMumpsSetIcntl'.

How can I fix this?

Give the ICNTL as an option.

  Thanks,

    Matt

Thanks,
Hyung Kim

On Wed, Nov 30, 2022 at 10:44 PM Matthew Knepley <knepley@gmail.com> wrote:

On Wed, Nov 30, 2022 at 8:40 AM 김성익 <ksi2443@gmail.com> wrote:

Following your comments, after matrix assembly ends:

PetscCall(KSPGetPC(ksp, &pc));
PetscCall(KSPSetFromOptions(ksp));
PetscCall(KSPSetUp(ksp));
PetscCall(PCFactorGetMatrix(pc, &xGK));

However, there is another error as below:

[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Not for factored matrix

The error message is telling you that you cannot alter values in the factored matrix. This is because the direct solvers use their own internal storage formats, which we cannot alter and which you should probably not alter either.
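Note that PCFactorGetMatrix(pc, &xGK) above also overwrites xGK, your handle to the assembled system, so a later MatZeroEntries(xGK) would be operating on the MUMPS factor; that would explain the MatZeroEntries() line in the trace below. A minimal sketch that keeps the two matrices separate (F is a new variable for the factor):

Mat F; /* the factor, owned by the PC; query it or set MUMPS parameters on it, but do not modify its entries */
PetscCall(PCFactorGetMatrix(pc, &F));
/* xGK still refers to the assembled matrix, so MatZeroEntries(xGK) before the next assembly stays valid */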
What are you trying to do?

  Thanks,

    Matt

[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.18.1, unknown
[0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by ksi2443 Wed Nov 30 05:37:52 2022
[0]PETSC ERROR: Configure options -download-mumps -download-scalapack -download-parmetis -download-metis
[0]PETSC ERROR: #1 MatZeroEntries() at /home/ksi2443/petsc/src/mat/interface/matrix.c:6024
[0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:339
[0]PETSC ERROR: No PETSc Option Table entries

How can I fix this?

Thanks,
Hyung Kim

On Wed, Nov 30, 2022 at 4:18 PM Jose E. Roman <jroman@dsic.upv.es> wrote:

You have to call PCFactorGetMatrix() first. See any of the examples that use MatMumpsSetIcntl(), for instance https://petsc.org/release/src/ksp/ksp/tutorials/ex52.c.html
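The relevant sequence there is roughly (a sketch; F is the factored matrix, and xGK, b, x are your assembled system and vectors):

Mat F;
PetscCall(KSPSetOperators(ksp, xGK, xGK));
PetscCall(KSPGetPC(ksp, &pc));
PetscCall(PCSetType(pc, PCLU));
PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS));
PetscCall(PCFactorSetUpMatSolverType(pc)); /* creates the MUMPS factor matrix F without factoring */
PetscCall(PCFactorGetMatrix(pc, &F));
PetscCall(MatMumpsSetIcntl(F, 7, 5));      /* ICNTL(7)=5: METIS ordering, set before the factorization happens */
PetscCall(KSPSolve(ksp, b, x));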
Jose

> On 30 Nov 2022, at 6:52, 김성익 <ksi2443@gmail.com> wrote:
>
> Hello,
>
> I tried to adopt the METIS option in MUMPS by using
> 'PetscCall(MatMumpsSetIcntl(Mat, 7, 5));'
>
> However, there is an error as follows:
>
> [0]PETSC ERROR: Object is in wrong state
> [0]PETSC ERROR: Only for factored matrix
> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown
> [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by ksi2443 Tue Nov 29 21:12:41 2022
> [0]PETSC ERROR: Configure options -download-mumps -download-scalapack -download-parmetis -download-metis
> [0]PETSC ERROR: #1 MatMumpsSetIcntl() at /home/ksi2443/petsc/src/mat/impls/aij/mpi/mumps/mumps.c:2478
> [0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:149
> [0]PETSC ERROR: No PETSc Option Table entries
>
> How can I fix this error?
>
> Thank you for your help.
>
> Hyung Kim