[petsc-users] Obtaining compiling and building information from a.out

Matthew Knepley knepley at gmail.com
Wed Apr 4 06:55:51 CDT 2018


On Thu, Mar 29, 2018 at 1:48 AM, TAY wee-beng <zonexo at gmail.com> wrote:

>
> On 28/3/2018 5:45 PM, Matthew Knepley wrote:
>
> On Tue, Mar 27, 2018 at 10:17 PM, TAY wee-beng <zonexo at gmail.com> wrote:
>
>> Hi Dave,
>>
>> I looked at the output using -log_view and recompiled. However, although I
>> use the same options "-xHost -g -O3 -openmp" (some file and directory names
>> are now different, but they are effectively the same), I still get different
>> timings. I have attached both the fast and the slow output. So what else can
>> I do to pinpoint the differences?
>>
> The solver options must be different. In Fast, there is almost no time in
> LUFactor, but in Slow it is half the time.
>
>    Matt
>
> Hi Matt,
>
> I finally found that it was because I was using KSPRICHARDSON in the new
> code. KSPGMRES in the old code is much faster.
>
> I also tried CG, LSQR, and FGMRES. CG and GMRES seem similar in terms of
> best performance. Are there any other recommendations to try? The Poisson
> eqn is symmetric.
>

Krylov solvers fundamentally do not matter for solving elliptic problems.
It comes down to your preconditioner.
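
Since your Poisson matrix is symmetric (and, with standard boundary
conditions, positive definite), CG is the natural Krylov choice, and the
effort should go into the preconditioner. For example, keeping the "poisson_"
prefix and the GAMG options you already use:

  -poisson_ksp_type cg -poisson_pc_type gamg -poisson_pc_gamg_agg_nsmooths 1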


> I also tried -pc_type mg -pc_mg_nlevels 2 instead of -poisson_pc_type gamg
> -poisson_pc_gamg_agg_nsmooths 1.
>
> It reduces the runtime from 2.25 min to 1.46 min.
>

Yes, GMG has faster setup time than AMG. The solve time should be similar.
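You can see the split in the -log_view event summary: setup cost appears
under the PCSetUp event, while the iterations appear under KSPSolve.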


> However, I found that my Poisson subroutine is very similar to my momentum
> subroutine, which is based on KSPSolve:
>
> 0. DMDACreate, DMDACreate3d, etc.
> 1. Assemble matrix
> 2. KSPSetOperators, KSPSetType, etc.
> 3. KSPSolve
>
> However, the 2D Poisson example in ex5f.F90 uses SNESSetDM, SNESSolve, etc.
> So does it matter whether I use SNESSolve or KSPSolve if I'm using -pc_type
> mg?
>

Not really.
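
If you stick with KSPSolve, the piece that -pc_type mg actually needs is the
grid hierarchy, and you can hand it that with KSPSetDM() instead of going
through SNES. A minimal sketch in C (the Fortran calls are analogous; da, A,
b, and x stand in for the objects in your own code, and since only the
fine-grid matrix is supplied, you would also pass something like
-pc_mg_galerkin so the coarse operators are formed by projection):

#include <petscdm.h>
#include <petscksp.h>

PetscErrorCode SolvePoisson(DM da, Mat A, Vec b, Vec x)
{
  KSP            ksp;
  PetscErrorCode ierr;

  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
  ierr = KSPSetDM(ksp,da);CHKERRQ(ierr);                /* gives PCMG the grid to coarsen */
  ierr = KSPSetDMActive(ksp,PETSC_FALSE);CHKERRQ(ierr); /* we assemble A ourselves */
  ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);          /* picks up -pc_type mg, -pc_mg_levels, ... */
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  return 0;
}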

  Thanks,

     Matt


> Thank you very much.
>>
>> Yours sincerely,
>>
>> ================================================
>> TAY Wee-Beng (Zheng Weiming) 郑伟明
>> Personal research webpage: http://tayweebeng.wixsite.com/website
>> Youtube research showcase: https://www.youtube.com/channel/UC72ZHtvQNMpNs2uRTSToiLA
>> linkedin: www.linkedin.com/in/tay-weebeng
>> ================================================
>>
>> On 27/3/2018 5:22 PM, Dave May wrote:
>>
>>
>>
>> On 27 March 2018 at 10:16, TAY wee-beng <zonexo at gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I have been compiling and building different versions of my CFD code with
>>> the Intel 2016 and 2018 compilers, and also with different compile options.
>>>
>>> I tested one version of my a.out and it is much faster than the other,
>>> taking only 3 min instead of more than 10 min to solve a certain case
>>> using GAMG.
>>>
>>> However, I can't recall how it was compiled. I only know that I used the
>>> Intel 2016 compiler.
>>>
>>> So is there any way I can find out how the a.out was compiled? Like what
>>> options were used?
>>
>>
>> Since you posted to the list I presume "a.out" links against PETSc...
>> If so, run your code with
>>   -log_view
>>
>> Upon calling PetscFinalize(), you will get all the options given to PETSc
>> configure, plus the CFLAGS, link lines, etc.
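>>
>> For example (mpiexec shown; any MPI launcher works):
>>
>>   mpiexec -n 4 ./a.out -log_view
>>
>> The configure options and compiler flags are printed at the end of that
>> report.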


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/