[petsc-users] ILUTP in PETSc
Qin Lu
lu_qin_2000 at yahoo.com
Wed May 14 09:48:32 CDT 2014
It turns out that I cannot set the PC side to right when the KSP type is set to preonly. After I fixed that, it works fine.
This raises a question: why do I have to set the KSP type to preonly when SuperLU's ILUTP is used as the preconditioner? Can I still set the KSP type to KSPBCGS (which seems to be the fastest with PETSc's ILU for my cases)?
Thanks,
Qin
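[For context: SuperLU's ILU enters PETSc as the factorization backend of the ILU preconditioner, so in principle it can be paired with any KSP, not only preonly. A hedged sketch of the combination asked about above, reusing the flags that appear later in this thread; `ex10` and the matrix file name are placeholders, not options confirmed for this specific setup:]

```shell
# Sketch: pair BiCGStab (-ksp_type bcgs, i.e. KSPBCGS) with SuperLU's
# ILUTP as the preconditioner. The -pc_factor_* and -mat_superlu_* flags
# are the ones used elsewhere in this thread.
./ex10 -f0 mymatrix \
       -ksp_type bcgs \
       -pc_type ilu \
       -pc_factor_mat_solver_package superlu \
       -mat_superlu_ilu_droptol 1.e-8 \
       -ksp_monitor_true_residual
```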
________________________________
From: Barry Smith <bsmith at mcs.anl.gov>
To: Qin Lu <lu_qin_2000 at yahoo.com>
Cc: Xiaoye S. Li <xsli at lbl.gov>; "petsc-users at mcs.anl.gov" <petsc-users at mcs.anl.gov>
Sent: Tuesday, May 13, 2014 8:31 PM
Subject: Re: [petsc-users] ILUTP in PETSc
Works fine for me. Please please please ALWAYS cut and paste the entire error message that is printed. We print the information for a reason, because it provides clues as to what went wrong.
./ex10 -f0 ~/Datafiles/Matrices/arco1 -pc_type ilu -pc_factor_mat_solver_package superlu -mat_superlu_ilu_droptol 1.e-8 -ksp_monitor_true_residual -ksp_rtol 1.e-12 -ksp_view
0 KSP preconditioned resid norm 2.544968580491e+03 true resid norm 7.410897708964e+00 ||r(i)||/||b|| 1.000000000000e+00
1 KSP preconditioned resid norm 2.467110329809e-06 true resid norm 1.439993537311e-07 ||r(i)||/||b|| 1.943075716143e-08
2 KSP preconditioned resid norm 1.522204461523e-12 true resid norm 2.699724724531e-11 ||r(i)||/||b|| 3.642911871885e-12
KSP Object: 1 MPI processes
  type: gmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=10000, initial guess is zero
  tolerances: relative=1e-12, absolute=1e-50, divergence=10000
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: 1 MPI processes
  type: ilu
    ILU: out-of-place factorization
    0 levels of fill
    tolerance for zero pivot 2.22045e-14
    using diagonal shift on blocks to prevent zero pivot [INBLOCKS]
    matrix ordering: natural
    factor fill ratio given 0, needed 0
      Factored matrix follows:
        Mat Object: 1 MPI processes
          type: seqaij
          rows=1501, cols=1501
          package used to perform factorization: superlu
          total: nonzeros=0, allocated nonzeros=0
          total number of mallocs used during MatSetValues calls =0
            SuperLU run parameters:
              Equil: YES
              ColPerm: 3
              IterRefine: 0
              SymmetricMode: NO
              DiagPivotThresh: 0.1
              PivotGrowth: NO
              ConditionNumber: NO
              RowPerm: 1
              ReplaceTinyPivot: NO
              PrintStat: NO
              lwork: 0
              ILU_DropTol: 1e-08
              ILU_FillTol: 0.01
              ILU_FillFactor: 10
              ILU_DropRule: 9
              ILU_Norm: 2
              ILU_MILU: 0
  linear system matrix = precond matrix:
  Mat Object: 1 MPI processes
    type: seqaij
    rows=1501, cols=1501
    total: nonzeros=26131, allocated nonzeros=26131
    total number of mallocs used during MatSetValues calls =0
      using I-node routines: found 501 nodes, limit used is 5
Number of iterations = 2
Residual norm 2.69972e-11
On May 13, 2014, at 12:17 PM, Qin Lu <lu_qin_2000 at yahoo.com> wrote:
> I tried to use command line options as the example suggested ('-ksp_type preonly -pc_type ilu -pc_factor_mat_solver_package superlu -mat_superlu_ilu_droptol 1.e-8') without changing my source code, but then the call to KSPSetUp returned error number 56.
>
> Does this mean I still need to change the source code (such as adding calls to PCFactorSetMatSolverPackage, PCFactorGetMatrix, etc.) in addition to the command line options?
>
> I ask because using SuperLU seems to differ from using Hypre, which can be invoked purely through command-line options without changing source code.
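
[For reference, the programmatic route asked about above would look roughly as follows. This is a hedged sketch against the PETSc 3.4-era C API that was current when this thread was written; the function name PCFactorSetMatSolverPackage was renamed PCFactorSetMatSolverType in later releases, and the helper function name here is invented for illustration:]

```c
#include <petscksp.h>

/* Sketch: select SuperLU's ILU factorization from source instead of
   relying on -pc_factor_mat_solver_package on the command line. */
PetscErrorCode UseSuperLUILU(KSP ksp)
{
  PC             pc;
  PetscErrorCode ierr;

  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,PCILU);CHKERRQ(ierr);
  ierr = PCFactorSetMatSolverPackage(pc,MATSOLVERSUPERLU);CHKERRQ(ierr);
  /* SuperLU-specific options such as -mat_superlu_ilu_droptol are still
     read from the options database during KSPSetFromOptions()/KSPSetUp(). */
  return 0;
}
```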
>
> Thanks a lot,
> Qin
>
>
> ----- Original Message -----
> From: Barry Smith <bsmith at mcs.anl.gov>
> To: Qin Lu <lu_qin_2000 at yahoo.com>
> Cc: Xiaoye S. Li <xsli at lbl.gov>; "petsc-users at mcs.anl.gov" <petsc-users at mcs.anl.gov>
> Sent: Monday, May 12, 2014 5:11 PM
> Subject: Re: [petsc-users] ILUTP in PETSc
>
>
> See for example: http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MATSOLVERSUPERLU.html
>
>
>
>
>
> On May 12, 2014, at 4:54 PM, Qin Lu <lu_qin_2000 at yahoo.com> wrote:
>
>> Hello,
>>
>> I have built PETSc with SuperLU, but what are PETSc's command line options to invoke SuperLU's ILUTP preconditioner and to set the dropping tolerance? (-mat_superlu_ilu_droptol for the latter?)
>>
>> Do I need to do some programming in order to call SuperLU's preconditioner, or will the command line options alone work?
>>
>> Many thanks,
>> Qin
>>
>>
>> From: Xiaoye S. Li <xsli at lbl.gov>
>> To: Barry Smith <bsmith at mcs.anl.gov>
>> Cc: Qin Lu <lu_qin_2000 at yahoo.com>; "petsc-users at mcs.anl.gov" <petsc-users at mcs.anl.gov>
>> Sent: Friday, May 2, 2014 3:40 PM
>> Subject: Re: [petsc-users] ILUTP in PETSc
>>
>>
>>
>> The sequential SuperLU has an ILUTP implementation; it is not in the parallel versions. PETSc already supports using SuperLU, so you should be able to try it easily.
>>
>> In SuperLU distribution:
>>
>> EXAMPLE/zitersol.c : an example that uses GMRES with the ILUTP preconditioner (returned from the driver SRC/zgsisx.c)
>>
>> SRC/zgsitrf.c : the actual ILUTP factorization routine
>>
>>
>> Sherry Li
>>
>>
>>
>> On Fri, May 2, 2014 at 12:25 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>
>>
>>> At http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html there are two listed. ./configure --download-hypre
>>>
>>> mpiexec -n 23 ./yourprogram -pc_type hypre -pc_hypre_type pilut (or euclid)
>>>
>>> you can also add -help to see what options are available.
>>>
>>> Both pretty much suck and I can’t imagine much reason for using them.
>>>
>>> Barry
>>>
>>>
>>>
>>> On May 2, 2014, at 10:27 AM, Qin Lu <lu_qin_2000 at yahoo.com> wrote:
>>>
>>>> Hello,
>>>>
>>>> I am interested in using the ILUTP preconditioner with the PETSc linear solver. There is an online doc https://fs.hlrs.de/projects/par/par_prog_ws/pdf/petsc_nersc01_short.pdf that mentions it is available in PETSc with other packages (pages 62-63). Are there any instructions or examples on how to use it?
>>>>
>>>> Many thanks,
>>>> Qin
>>>
>>>