[petsc-users] Scaling problem when cores > 600

TAY wee-beng zonexo at gmail.com
Wed Feb 28 22:59:02 CST 2018


On 1/3/2018 10:07 AM, Matthew Knepley wrote:
> On Wed, Feb 28, 2018 at 9:01 PM, TAY wee-beng <zonexo at gmail.com> wrote:
>
>
>     On 1/3/2018 12:10 AM, Matthew Knepley wrote:
>>     On Wed, Feb 28, 2018 at 10:45 AM, TAY wee-beng <zonexo at gmail.com> wrote:
>>
>>         Hi,
>>
>>         I have a CFD code which uses PETSc and HYPRE. I found that
>>         for a certain case with a grid size of 192,570,048, I encounter
>>         a scaling problem when I use more than 600 cores. At 600 cores,
>>         the code took 10 min for 100 time steps. At 960, 1440 and 2880
>>         cores, it still takes around 10 min. At 360 cores, it took 15 min.
>>
>>         So how can I find the bottleneck? Any recommended steps?
>>
>>
>>     For any performance question, we need to see the output of
>>     -log_view for all test cases.
>     Hi,
>
>     To be more specific, I use PETSc KSPBCGS and HYPRE geometric
>     multigrid (entirely based on HYPRE, no PETSc) for the momentum and
>     Poisson eqns in my code.
>
>     So can -log_view be used in this case to give meaningful output,
>     since part of the code uses HYPRE?
>
>
> Make an event to time the HYPRE solve. It only takes a few lines of code.
Hi,

I checked PETSc and found some routines that can be used to time the 
HYPRE solve, such as PetscGetTime and PetscGetCPUTime.

They would then be used like:

PetscLogDouble t1, t2;

     ierr = PetscGetCPUTime(&t1);CHKERRQ(ierr);   /* CPU time before the solve */
     ... code to time ...
     ierr = PetscGetCPUTime(&t2);CHKERRQ(ierr);   /* CPU time after the solve */
     ierr = PetscPrintf(PETSC_COMM_WORLD, "Code took %f CPU seconds\n", t2 - t1);CHKERRQ(ierr);

Are these 2 routines suitable? Which one should I use?
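
Alternatively, if I follow your suggestion of making an event around the 
HYPRE solve, I guess it would look roughly like this (just my own sketch 
of the logging calls, with the names "MyApp", "HypreSolve" and HYPRE_SOLVE 
made up by me):

PetscLogEvent HYPRE_SOLVE;
PetscClassId  classid;

     ierr = PetscClassIdRegister("MyApp", &classid);CHKERRQ(ierr);
     ierr = PetscLogEventRegister("HypreSolve", classid, &HYPRE_SOLVE);CHKERRQ(ierr);

     ierr = PetscLogEventBegin(HYPRE_SOLVE, 0, 0, 0, 0);CHKERRQ(ierr);  /* start timing */
     ... HYPRE solve ...
     ierr = PetscLogEventEnd(HYPRE_SOLVE, 0, 0, 0, 0);CHKERRQ(ierr);    /* stop timing */

so that the HYPRE solve shows up as its own line in the -log_view summary?
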
>
>     I also wrote another subroutine in the past which uses PETSc to
>     solve the Poisson eqn. It uses either HYPRE's BoomerAMG, KSPBCGS
>     or KSPGMRES.
>
>     If I use BoomerAMG, can -log_view be used in this case?
>
>
> Yes, it's automatic.
>
>     Or do I have to use KSPBCGS or KSPGMRES, which are directly from
>     PETSc? However, I ran KSPGMRES yesterday with the Poisson eqn and
>     the solution didn't converge.
>
>
> Plain GMRES is not good for Poisson. You would be better off with 
> GMRES/GAMG.
>
>   Thanks,
>
>      Matt
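
Ok, I will try GMRES/GAMG for the Poisson eqn then. Just to confirm, I 
suppose that means running with options along the lines of (my guess):

-ksp_type gmres -pc_type gamg
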
>
>     Thanks.
>>
>>         I must also mention that I partition my grid only in the x
>>         and y directions. There is no partitioning in the z direction
>>         due to limited code development. I wonder if this has a
>>         strong effect in this case.
>>
>>
>>     Maybe. Usually what happens is you fill up memory with a z-column
>>     and cannot scale further.
>>
>>       Thanks,
>>
>>          Matt
>>
>>
>>         -- 
>>         Thank you very much
>>
>>         Yours sincerely,
>>
>>         ================================================
>>         TAY Wee-Beng 郑伟明 (Zheng Weiming)
>>         Personal research webpage:
>>         http://tayweebeng.wixsite.com/website
>>         Youtube research showcase:
>>         https://www.youtube.com/channel/UC72ZHtvQNMpNs2uRTSToiLA
>>         linkedin: www.linkedin.com/in/tay-weebeng
>>         ================================================
>>
>>
>>
>>
>>     -- 
>>     What most experimenters take for granted before they begin their
>>     experiments is infinitely more interesting than any results to
>>     which their experiments lead.
>>     -- Norbert Wiener
>>
>>     https://www.cse.buffalo.edu/~knepley/
>
>
>
>
> -- 
> What most experimenters take for granted before they begin their 
> experiments is infinitely more interesting than any results to which 
> their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
