[petsc-users] Mat_CheckInode

Debao Shao Debao.Shao at brion.com
Mon Dec 5 20:24:23 CST 2011


Hi, Hong:

Thanks a lot for your suggestion.
The matrix size in my case may vary from 10*10 to 1000000*1000000, so I need both a direct solver and an iterative solver.
For example (a rough size-based switch is sketched below):
1) <100*100: use a direct dense solver;
2) <10000*10000: use a direct sparse solver;
3) otherwise: use an iterative solver.
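
A rough sketch of what I have in mind, assuming a hypothetical helper ChooseSolver and the thresholds above (the routine name and cutoffs are only illustrative):

    #include <petscksp.h>

    PetscErrorCode ChooseSolver(KSP ksp, PetscInt N)
    {
      PC             pc;
      PetscErrorCode ierr;

      PetscFunctionBegin;
      ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
      if (N < 100) {                         /* tiny: direct dense LU (matrix stored as MATDENSE) */
        ierr = KSPSetType(ksp, KSPPREONLY);CHKERRQ(ierr);
        ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
      } else if (N < 10000) {                /* medium: direct sparse LU */
        ierr = KSPSetType(ksp, KSPPREONLY);CHKERRQ(ierr);
        ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
      } else {                               /* large: GMRES + ILU(1) */
        ierr = KSPSetType(ksp, KSPGMRES);CHKERRQ(ierr);
        ierr = PCSetType(pc, PCILU);CHKERRQ(ierr);
        ierr = PCFactorSetLevels(pc, 1);CHKERRQ(ierr);
      }
      ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);  /* command-line options still override */
      PetscFunctionReturn(0);
    }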

The iterative solver (ILU(1) & GMRES) is very efficient on many large-scale problems; the issue I encounter now is that its runtime is unstable. For a 50000*50000 matrix it may need only dozens of seconds, but sometimes it takes thousands of seconds.
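
To compare a fast run with a slow one, I can turn on the standard PETSc monitoring options, e.g.:

    -ksp_monitor_true_residual   (residual norm per iteration)
    -ksp_converged_reason        (why each solve stopped)
    -log_summary                 (per-event timings, as in the log below)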

Do you have any suggestion?

Thanks,
Debao

-----Original Message-----
From: petsc-users-bounces at mcs.anl.gov [mailto:petsc-users-bounces at mcs.anl.gov] On Behalf Of Hong Zhang
Sent: Monday, December 05, 2011 11:35 PM
To: PETSc users list
Subject: Re: [petsc-users] Mat_CheckInode

For
>The matrix size is around 50000*50000, nnz is ~2000000

You may also try a direct solver, '-pc_type lu', with either PETSc's LU (sequential),
MUMPS, or SuperLU_DIST.
They may be faster than ILU(1).
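
For example, assuming PETSc was configured with those packages (e.g. --download-mumps or --download-superlu_dist):

    -pc_type lu -pc_factor_mat_solver_package mumps
    -pc_type lu -pc_factor_mat_solver_package superlu_dist

(the -pc_factor_mat_solver_package option selects which package performs the factorization).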

Hong

On Mon, Dec 5, 2011 at 8:54 AM, Matthew Knepley <knepley at gmail.com> wrote:
> On Mon, Dec 5, 2011 at 8:46 AM, Debao Shao <Debao.Shao at brion.com> wrote:
>>
>> DA,
>>
>> 1. Will check_inode affect the runtime performance? For example, in my
>> case I see quite different logs:
>>
>>      [0] Mat_CheckInode(): Found 18602 nodes out of 18609 rows. Not using
>> Inode routines
>>
>>      [0] Mat_CheckInode(): Found 0 nodes of 0. Limit used: 5. Using Inode
>> routines
>>
>>      [0] Mat_CheckInode(): Found 14020 nodes out of 14020 rows. Not using
>> Inode routines
>
>
> This is intended to "discover" block structure. If you have it, MatMult
> should be faster.
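>
> For example, if the block size is known up front, the same effect can be made
> explicit with a BAIJ matrix instead of relying on the inode check (a sketch;
> bs, n, and nz below are placeholders for your actual sizes):
>
>     Mat A;
>     MatCreateSeqBAIJ(PETSC_COMM_SELF, bs, n, n, nz, PETSC_NULL, &A);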
>
>>
>> 2. Here is the log_summary; it looks like PCSetUp is the big time-consuming
>> function. The method I'm using is ILU(1) & GMRES; can I improve it?
>>
>>      MatMult            30733 1.0 8.9553e+01 1.0 3.41e+10 1.0 0.0e+00
>> 0.0e+00 0.0e+00  2  4  0  0  0   2  4  0  0  0   380
>>
>>      MatMultAdd          9552 1.0 6.6610e+00 1.0 2.42e+09 1.0 0.0e+00
>> 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0   363
>>
>>      MatSolve           36236 1.0 2.4699e+02 1.0 9.44e+10 1.0 0.0e+00
>> 0.0e+00 0.0e+00  6 10  0  0  0   6 10  0  0  0   382
>>
>>      MatLUFactorNum      2838 1.0 2.4328e+03 1.0 7.65e+11 1.0 0.0e+00
>> 0.0e+00 0.0e+00 61 85  0  0  0  61 85  0  0  0   315
>>
>>      MatILUFactorSym      173 1.0 1.3486e+02 1.0 0.00e+00 0.0 0.0e+00
>> 0.0e+00 0.0e+00  3  0  0  0  0   3  0  0  0  0     0
>>
>>      MatAssemblyBegin    5787 1.0 2.0547e-03 1.0 0.00e+00 0.0 0.0e+00
>> 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
>>
>>      MatAssemblyEnd      5787 1.0 5.3559e+00 1.0 0.00e+00 0.0 0.0e+00
>> 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
>>
>>      KSPGMRESOrthog     30679 1.0 5.5968e+00 1.0 4.06e+09 1.0 0.0e+00
>> 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0   725
>>
>>      KSPSetup            2838 1.0 1.8219e-02 1.0 0.00e+00 0.0 0.0e+00
>> 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
>>
>>      KSPSolve            5503 1.0 2.9136e+03 1.0 8.99e+11 1.0 0.0e+00
>> 0.0e+00 0.0e+00 73100  0  0  0  73100  0  0  0   308
>>
>>      PCSetUp             2838 1.0 2.5682e+03 1.0 7.65e+11 1.0 0.0e+00
>> 0.0e+00 0.0e+00 64 85  0  0  0  64 85  0  0  0   298
>>
>>      PCApply            36236 1.0 2.4709e+02 1.0 9.44e+10 1.0 0.0e+00
>> 0.0e+00 0.0e+00  6 10  0  0  0   6 10  0  0  0   382
>>
>> The matrix size is around 50000*50000, nnz is ~2000000
>
>
> ILU is expensive, and levels make it much more expensive. Maybe try AMG?
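>
> For example (assuming PETSc was configured with hypre):
>
>     -pc_type hypre -pc_hypre_type boomeramg
>
> or, with the built-in algebraic multigrid in petsc-dev:
>
>     -pc_type gamg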
>>
>> 3. Will the fill of ILU affect the runtime performance?
>>
>>      [0] MatILUFactorSymbolic_SeqAIJ(): Reallocs 0 Fill ratio:given 1
>> needed 1.11111
>>
>>      [0] MatILUFactorSymbolic_SeqAIJ(): Reallocs 1 Fill ratio:given 1
>> needed 5.05
>>
>>      [0] MatILUFactorSymbolic_SeqAIJ(): Reallocs 1 Fill ratio:given 1
>> needed 8.3410
>>
>> How do I set a proper value for fill (>8)?
>
>
> http://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/PC/PCFactorSetFill.html
>
> Use -pc_factor_fill
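>
> For the worst case above (needed fill ~8.34), something like
>
>     -pc_factor_fill 9
>
> (or PCFactorSetFill(pc, 9.0) in code) should be enough to avoid the
> reallocations during the symbolic factorization; 9 is just a guess slightly
> above the reported ratio.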
>
>   Matt
>
>>
>> Thanks,
>>
>> Debao
>>
>>
>
>
>
>
> --
> What most experimenters take for granted before they begin their experiments
> is infinitely more interesting than any results to which their experiments
> lead.
> -- Norbert Wiener
