[petsc-users] Performance of PETSc TS solver
Shri
abhyshr at mcs.anl.gov
Tue Aug 13 13:00:41 CDT 2013
90% of the time is spent in your Jacobian evaluation routine, which is clearly the bottleneck.
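One quick sanity check of how much the hand-coded Jacobian costs (assuming your TS uses a SNES underneath, which TSTheta does) is to let PETSc build the Jacobian by finite differences with coloring and compare the timings. The executable name "./power_dae" below is a hypothetical stand-in for your application:

```shell
# -snes_fd_color replaces the hand-coded Jacobian with a colored
# finite-difference approximation (slower per call, but a useful
# baseline for judging the cost of your own routine).
mpiexec -n 4 ./power_dae -ts_type theta -ts_theta_theta 0.5 \
    -snes_fd_color -log_summary
```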
On Aug 13, 2013, at 12:34 PM, Jin, Shuangshuang wrote:
> Hello, Jed and Barry, thanks for your reply.
>
> We are solving a power system dynamic simulation problem. We set up the DAE equations and its Jacobian matrix, and would like to use the Trapezoid method to solve it.
>
> That's also the reason why we chose TSTHETA. From the PETSc manual, we read that:
>
> "-ts_type theta -ts_theta_theta 0.5 -ts_theta_endpoint corresponds to Crank-Nicholson (TSCN). This method can be applied to DAE.
> For the default Theta=0.5, this is the trapezoid rule (also known as Crank-Nicolson, see TSCN)."
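For reference, the two spellings in the manual excerpt select the same scheme; assuming a hypothetical executable name "./power_dae", these two invocations should be equivalent:

```shell
# Trapezoid rule spelled via the generic theta method:
mpiexec -n 4 ./power_dae -ts_type theta -ts_theta_theta 0.5 -ts_theta_endpoint
# Same scheme via the dedicated Crank-Nicolson type:
mpiexec -n 4 ./power_dae -ts_type cn
```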
>
> I haven't heard of ARKIMEX or ROSW before. Are they external packages, or DAE solvers that implement the trapezoid method?
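ARKIMEX and ROSW are built-in PETSc TS types (additive Runge-Kutta IMEX and Rosenbrock-W methods), not external packages. A minimal way to try them from the command line, again with "./power_dae" as a hypothetical executable name:

```shell
# Additive Runge-Kutta IMEX; "2e" is one of the stiffly accurate choices:
mpiexec -n 4 ./power_dae -ts_type arkimex -ts_arkimex_type 2e
# Rosenbrock-W alternative:
mpiexec -n 4 ./power_dae -ts_type rosw -ts_rosw_type ra34pw2
```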
>
> I have also tried the -ksp_type preonly -pc_type lu options you suggested, but it failed. The PETSc ERROR messages are: No support for this operation for this object type! Matrix format mpiaij does not have a built-in PETSc LU!
PETSc does not have a native parallel direct solver. You can use MUMPS (--download-mumps) or superlu_dist (--download-superlu_dist)
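A sketch of what that looks like in practice, using option names from the petsc-3.4 era and the hypothetical executable name "./power_dae" (MUMPS also needs ScaLAPACK, hence the extra flag):

```shell
# Configure PETSc with a parallel direct solver (pick one):
./configure --download-mumps --download-scalapack
# or: ./configure --download-superlu_dist

# Then select the parallel LU at run time:
mpiexec -n 4 ./power_dae -ksp_type preonly -pc_type lu \
    -pc_factor_mat_solver_package mumps
```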
>
> Attached please find the -log_summary output from running TSTHETA with -ts_theta_theta 0.5 and the default KSP solver. Please help me evaluate the performance and identify the bottleneck behind the slow computation.
>
> Thanks a lot!
>
> Shuangshuang
>
>
>
>
>
>
> -----Original Message-----
> From: Barry Smith [mailto:bsmith at mcs.anl.gov]
> Sent: Monday, August 12, 2013 6:39 PM
> To: Jed Brown
> Cc: Jin, Shuangshuang; petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] Performance of PETSc TS solver
>
>
> Also always send the output from running with -log_summary whenever you ask performance questions so we know what kind of performance it is getting.
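Concretely, that just means appending the option to whatever run you are timing (executable name hypothetical; in later PETSc releases the option was renamed -log_view):

```shell
mpiexec -n 32 ./power_dae -ts_type theta -ts_theta_theta 0.5 -log_summary
```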
>
> Barry
>
> On Aug 12, 2013, at 8:14 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
>
>> "Jin, Shuangshuang" <Shuangshuang.Jin at pnnl.gov> writes:
>>
>>> Hello, PETSc developers,
>>> I have a question regarding the performance of PETSc TS solver
>>> especially TSTHETA. I used it to solve my DAE equations.
>>
>> TSTHETA is not L-stable and not stiffly accurate, so it's not normally
>> something that you'd want to use for a DAE. Make sure you're getting
>> meaningful results and try switching to something like an ARKIMEX or
>> ROSW since those are likely better for your problem.
>>
>>> I have recorded the solution times when different numbers of processors are used:
>>>
>>> 2 processors: 1021 seconds,
>>> 4 processors: 587.244 seconds,
>>> 8 processors: 421.565 seconds,
>>> 16 processors: 355.594 seconds,
>>> 32 processors: 322.28 seconds,
>>> 64 processors: 382.967 seconds.
>>>
>>> It seems like with 32 processors, it reaches the best performance.
>>> However, 322.28 seconds to solve such DAE equations is slower than
>>> I expected.
>>
>> The number of equations (1152) is quite small, so I'm not surprised
>> there is no further speedup. Can you explain more about your equations?
>>
>>>
>>> I have the following questions based on the above results:
>>> 1. Is this the usual DAE solving time in PETSc for a problem of this dimension?
>>
>> That depends what your function is.
>>
>>> 2. I was told that in TS, by default, ksp uses GMRES, and the
>>> preconditioner is ILU(0), is there any other alterative ksp solver or
>>> options I should use in the command line to solve the problem much
>>> faster?
>>
>> I would use -ksp_type preonly -pc_type lu for such small problems. Is
>> the system dense?
>>
>>> 3. Do you have any other suggestion for me to speed up the DAE computation in PETSc?
>>
>> Can you describe what sort of problem you're dealing with, what causes
>> the stiffness in your equations, what accuracy you want, etc.?
>
> <job.out.summary>