[petsc-users] Problem with solving Poisson eqn for some cases

TAY wee-beng zonexo at gmail.com
Mon Mar 26 00:15:37 CDT 2018


On 24/3/2018 6:00 AM, Matthew Knepley wrote:
> On Fri, Mar 23, 2018 at 5:23 AM, TAY wee-beng <zonexo at gmail.com> wrote:
>
>
>     On 21/3/2018 7:47 PM, Matthew Knepley wrote:
>>     On Wed, Mar 21, 2018 at 4:59 AM, TAY wee-beng <zonexo at gmail.com> wrote:
>>
>>
>>         On 19/3/2018 6:32 PM, Matthew Knepley wrote:
>>>         On Mon, Mar 19, 2018 at 5:19 AM, TAY wee-beng <zonexo at gmail.com> wrote:
>>>
>>>
>>>             On 17/3/2018 1:15 AM, Matthew Knepley wrote:
>>>>             On Fri, Mar 16, 2018 at 12:54 PM, TAY wee-beng <zonexo at gmail.com> wrote:
>>>>
>>>>
>>>>                 On 15/3/2018 6:21 PM, Matthew Knepley wrote:
>>>>>                 On Thu, Mar 15, 2018 at 3:51 PM, TAY wee-beng <zonexo at gmail.com> wrote:
>>>>>
>>>>>                     Hi,
>>>>>
>>>>>                     I'm running a CFD code which solves the
>>>>>                     momentum and Poisson equations.
>>>>>
>>>>>                     Due to poor scaling with HYPRE at higher CPU
>>>>>                     counts, I decided to try PETSc with
>>>>>                     BoomerAMG and GAMG.
>>>>>
>>>>>                     I tested some small cases and they worked
>>>>>                     well. However, for the large problem with the
>>>>>                     poor scaling, I get an error when I change
>>>>>                     my Poisson solver from pure HYPRE to PETSc
>>>>>                     with BoomerAMG or GAMG.
>>>>>
>>>>>                     The error is:
>>>>>
>>>>>                     Caught signal number 11 SEGV: Segmentation
>>>>>                     Violation, probably memory access out of range
>>>>>
>>>>>                     I tried using:
>>>>>
>>>>>                     -poisson_ksp_type richardson -poisson_pc_type
>>>>>                     hypre -poisson_pc_type_hypre boomeramg
>>>>>
>>>>>                     -poisson_ksp_type gmres -poisson_pc_type hypre
>>>>>                     -poisson_pc_type_hypre boomeramg
>>>>>
>>>>>                     -poisson_pc_type gamg
>>>>>                     -poisson_pc_gamg_agg_nsmooths 1
>>>>>
>>>>>                     but they all gave a similar error.
>>>>>
>>>>>                     Why is this happening, and how should I
>>>>>                     troubleshoot it? I am now running a debug
>>>>>                     version of PETSc to check the error message.
>>>>>
>>>>>
>>>>>                 1) For anything like this, we would like to see a
>>>>>                 stack trace from the debugger or valgrind output
>>>>>                 (see the example after this list).
>>>>>
>>>>>                 2) We do have several Poisson examples. Does it
>>>>>                 fail for you on those?
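>>>>>
>>>>>                 For example (a sketch; the exact mpiexec/mpirun
>>>>>                 syntax depends on your MPI), with a debug build of
>>>>>                 PETSc you could run something like
>>>>>
>>>>>                   mpiexec -n 4 ./a.out -poisson_pc_type gamg \
>>>>>                       -on_error_attach_debugger
>>>>>
>>>>>                 or run a single rank under gdb and capture the
>>>>>                 backtrace at the SEGV.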
>>>>                 Hi,
>>>>
>>>>                 Can you recommend some suitable examples, especially in Fortran?
>>>>
>>>>
>>>>             Here is a 2D Poisson example:
>>>>
>>>>             https://bitbucket.org/petsc/petsc/src/4b6141395f14f0c7d1415a2ff0158eec75a27d63/src/snes/examples/tutorials/ex5f.F90?at=master&fileviewer=file-view-default
>>>>
>>>>>
>>         Hi,
>>
>>         I tried different options such as ML, MG and BoomerAMG, and
>>         while they worked for the small problem, they failed for
>>         the large problem. I also compared the 2D Poisson
>>         example with my 3D Poisson subroutine. I found that my Poisson
>>         subroutine is very similar to my momentum subroutine, which
>>         is based on KSPSolve (a minimal Fortran sketch of the sequence
>>         follows after the list):
>>
>>         0. DMDACreate, DMDACreate3d etc
>>         1. Assemble matrix
>>         2. KSPSetOperators, KSPSetType etc
>>         3. KSPSolve
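>>
>>         In Fortran, a minimal sketch of that sequence (with placeholder
>>         names such as nx, b_rhs and x_sol, not my actual variables)
>>         would be something like:
>>
>>           ! assuming "use petscksp" and "use petscdmda" (PETSc 3.8 Fortran modules)
>>           DM             da
>>           Mat            A_mat
>>           KSP            ksp
>>           Vec            b_rhs, x_sol
>>           PetscInt       nx, ny, nz      ! global grid sizes, set elsewhere
>>           PetscErrorCode ierr
>>
>>           call DMDACreate3d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE,       &
>>                DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DMDA_STENCIL_STAR, &
>>                nx, ny, nz, PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,  &
>>                1, 1, PETSC_NULL_INTEGER, PETSC_NULL_INTEGER,          &
>>                PETSC_NULL_INTEGER, da, ierr)
>>           call DMSetUp(da, ierr)
>>           call DMCreateMatrix(da, A_mat, ierr)
>>           call DMCreateGlobalVector(da, b_rhs, ierr)
>>           call VecDuplicate(b_rhs, x_sol, ierr)
>>           ! ... assemble A_mat and fill b_rhs here ...
>>           call KSPCreate(PETSC_COMM_WORLD, ksp, ierr)
>>           call KSPSetOperators(ksp, A_mat, A_mat, ierr)
>>           call KSPSetFromOptions(ksp, ierr)
>>           call KSPSolve(ksp, b_rhs, x_sol, ierr)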
>>
>>         However, the 2D Poisson example uses SNESSetDM, SNESSolve
>>         etc. So does it matter if I use SNESSolve or KSPSolve?
>>
>>
>>     Not if you are using Algebraic Multigrid. "It failed for the
>>     large problem" is not descriptive enough. Please send the output of
>>
>>       -ksp_view -ksp_monitor_true_residual
>>
>>         I also didn't set any nullspace. Is this required for a
>>         Poisson equation solve?
>>
>>
>>     If you only have the constant null space, no. Most AMG packages
>>     do that one by default.
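>>
>>     If you ever do need to tell PETSc about the constant null space
>>     explicitly, a sketch (assuming the matrix is called A_mat, as in
>>     your code) would be:
>>
>>       MatNullSpace nullsp
>>
>>       call MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0,       &
>>            PETSC_NULL_VEC, nullsp, ierr)
>>       call MatSetNullSpace(A_mat, nullsp, ierr)
>>       call MatNullSpaceDestroy(nullsp, ierr)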
>>
>>       Matt
>     Hi,
>
>     I tried to use
>
>     1. -ksp_view -ksp_monitor_true_residual
>     2. -poisson_ksp_view -poisson_ksp_monitor_true_residual
>
>     However, this did not give any output. May I know why?
>
>
> You need to call
>
>   KSPSetFromOptions()
Hi Matt,

But I thought I already called:

     call KSPSetOperators(ksp,A_mat,A_mat,ierr)
     call KSPGetPC(ksp,pc,ierr)
     call KSPSetOptionsPrefix(ksp,"poisson_",ierr)
     call KSPSetType(ksp,ksptype,ierr)
     ksptype=KSPRICHARDSON
     call KSPSetType(ksp,ksptype,ierr)
     call PCSetType(pc,'hypre',ierr)
     call KSPSetFromOptions(ksp,ierr)

...

Is my calling sequence wrong?

Thanks.
>
> after you create it. When you launch valgrind you need
>
>   valgrind --trace-children=yes <exec>
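>
> For example (a sketch; the exact launcher syntax depends on your MPI),
> either trace the children of the launcher:
>
>   valgrind --trace-children=yes mpirun -np 4 ./a.out -poisson_pc_type gamg
>
> or, often simpler, put valgrind after mpirun so it wraps each rank
> directly rather than the launcher and the shell:
>
>   mpirun -np 4 valgrind --tool=memcheck -q --num-callers=20 ./a.out -poisson_pc_type gamg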
>
>    Matt
>
>     I also tried valgrind:
>
>     ==32157== Memcheck, a memory error detector
>     ==32157== Copyright (C) 2002-2012, and GNU GPL'd, by Julian Seward
>     et al.
>     ==32157== Using Valgrind-3.8.1 and LibVEX; rerun with -h for
>     copyright info
>     ==32157== Command:
>     /app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/bin/mpirun
>     ./a.out -poisson_ksp_type richardson -poisson_pc_type hypre
>     -poisson_pc_type_hypre boomeramg
>     ==32157==
>     ==32159==
>     ==32159== HEAP SUMMARY:
>     ==32159==     in use at exit: 49,981 bytes in 1,137 blocks
>     ==32159==   total heap usage: 1,542 allocs, 405 frees, 72,030
>     bytes allocated
>     ==32159==
>     ==32159== LEAK SUMMARY:
>     ==32159==    definitely lost: 0 bytes in 0 blocks
>     ==32159==    indirectly lost: 0 bytes in 0 blocks
>     ==32159==      possibly lost: 0 bytes in 0 blocks
>     ==32159==    still reachable: 49,981 bytes in 1,137 blocks
>     ==32159==         suppressed: 0 bytes in 0 blocks
>     ==32159== Reachable blocks (those to which a pointer was found)
>     are not shown.
>     ==32159== To see them, rerun with: --leak-check=full
>     --show-reachable=yes
>     ==32159==
>     ==32159== For counts of detected and suppressed errors, rerun with: -v
>     ==32159== ERROR SUMMARY: 0 errors from 0 contexts (suppressed: 8
>     from 6)
>     [362]PETSC ERROR:
>     ------------------------------------------------------------------------
>     [362]PETSC ERROR: Caught signal number 11 SEGV: Segmentation
>     Violation, probably memory access out of range
>     [362]PETSC ERROR: Try option -start_in_debugger or
>     -on_error_attach_debugger
>     [362]PETSC ERROR: or see
>     http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>     [362]PETSC ERROR: or try http://valgrind.org on GNU/linux and
>     Apple Mac OS X to find memory corruption errors
>     [362]PETSC ERROR: configure using --with-debugging=yes, recompile,
>     link, and run
>     [362]PETSC ERROR: to get more information on the crash.
>     [362]PETSC ERROR: --------------------- Error Message
>     --------------------------------------------------------------
>     [362]PETSC ERROR: Signal received
>     [362]PETSC ERROR: See
>     http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
>     shooting.
>     [362]PETSC ERROR: Petsc Release Version 3.8.3, Dec, 09, 2017
>     [362]PETSC ERROR: ./a.out on a petsc-3.8.3_intel_rel named std1055
>     by tsltaywb Thu Mar 22 17:43:54 2018
>     [362]PETSC ERROR: Configure options
>     --with-mpi-dir=/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64
>     --with-blaslapack-dir=/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mkl/lib/intel64
>     --download-hypre=/home/users/nus/tsltaywb/source/git.hypre.tar.gz
>     --with-debugging=0
>     --prefix=/home/users/nus/tsltaywb/lib/petsc-3.8.3_intel_rel
>     --with-shared-libraries=0 --known-mpi-shared-libraries=0
>     --with-fortran-interfaces=1 --CFLAGS="-xHost -g -O2"
>     --CXXFLAGS="-xHost -g -O2" --FFLAGS="-xHost -g -O2"
>     [362]PETSC ERROR: #1 User provided function() line 0 in unknown file
>     application called MPI_Abort(MPI_COMM_WORLD, 59) - process 362
>     ==32409==
>     ==32409== HEAP SUMMARY:
>     ==32409==     in use at exit: 52,168 bytes in 1,223 blocks
>     ==32409==   total heap usage: 3,055 allocs, 1,832 frees, 122,188
>     bytes allocated
>     ==32409==
>     ==32409== 10 bytes in 1 blocks are definitely lost in loss record
>     89 of 319
>     ==32409==    at 0x4C28A2E: malloc (vg_replace_malloc.c:270)
>     ==32409==    by 0x466742: xmalloc (in /bin/bash)
>     ==32409==    by 0x42EC78: ??? (in /bin/bash)
>     ==32409==    by 0x430462: execute_command_internal (in /bin/bash)
>     ==32409==    by 0x43320D: ??? (in /bin/bash)
>     ==32409==    by 0x433629: ??? (in /bin/bash)
>     ==32409==    by 0x4303BC: execute_command_internal (in /bin/bash)
>     ==32409==    by 0x43110D: execute_command (in /bin/bash)
>     ==32409==    by 0x41D6D5: reader_loop (in /bin/bash)
>     ==32409==    by 0x41CEBB: main (in /bin/bash)
>     ==32409==
>     ==32409== LEAK SUMMARY:
>     ==32409==    definitely lost: 10 bytes in 1 blocks
>     ==32409==    indirectly lost: 0 bytes in 0 blocks
>     ==32409==      possibly lost: 0 bytes in 0 blocks
>     ==32409==    still reachable: 52,158 bytes in 1,222 blocks
>     ==32409==         suppressed: 0 bytes in 0 blocks
>     ==32409== Reachable blocks (those to which a pointer was found)
>     are not shown.
>     ==32409== To see them, rerun with: --leak-check=full
>     --show-reachable=yes
>     ==32409==
>     ==32409== For counts of detected and suppressed errors, rerun with: -v
>     ==32409== ERROR SUMMARY: 1 errors from 1 contexts (suppressed: 8
>     from 6)
>     ==32157==
>     ==32157== HEAP SUMMARY:
>     ==32157==     in use at exit: 55,861 bytes in 1,202 blocks
>     ==32157==   total heap usage: 3,249 allocs, 2,047 frees, 125,857
>     bytes allocated
>     ==32157==
>     ==32157== LEAK SUMMARY:
>     ==32157==    definitely lost: 0 bytes in 0 blocks
>     ==32157==    indirectly lost: 0 bytes in 0 blocks
>     ==32157==      possibly lost: 0 bytes in 0 blocks
>     ==32157==    still reachable: 55,861 bytes in 1,202 blocks
>     ==32157==         suppressed: 0 bytes in 0 blocks
>     ==32157== Reachable blocks (those to which a pointer was found)
>     are not shown.
>     ==32157== To see them, rerun with: --leak-check=full
>     --show-reachable=yes
>     ==32157==
>     ==32157== For counts of detected and suppressed errors, rerun with: -v
>     ==32157== ERROR SUMMARY: 0 errors from 0 contexts (suppressed: 8
>     from 6)
>
>     Is this the output I'm supposed to get?
>
>     Thanks!
>
>
>>         Thanks
>>>
>>>>>                 3) You can also try ML, which is the same type of
>>>>>                 MG as GAMG. (--download-ml).
>>>>>
>>>             I have recompiled PETSc with ML. Are there example
>>>             command line options which I can use for ML?
>>>
>>>
>>>         -pc_type ml
>>>
>>>             Another question: generally speaking, is geometric
>>>             multigrid (GMG) faster than algebraic multigrid?
>>>
>>>
>>>         No, only for the setup time.
>>>
>>>             I tested on a small problem and the time taken varies
>>>             from 1.15 min (HYPRE, geometric) to 3.25 min (GAMG).
>>>             BoomerAMG takes 1.45 min.
>>>
>>>
>>>         I am not sure what you are running when you say Hypre geometric.
>>>
>>>             Besides HYPRE, is there any other GMG I can use?
>>>
>>>
>>>         As I said above, it does not converge any faster and the
>>>         solve is not faster; it is all setup time, so small problems
>>>         will look faster.
>>>         You should be doing 10-20 iterations. If you are doing more,
>>>         MG is not working.
>>>
>>>         If you truly have a structured grid, then use DMDA in PETSc
>>>         and you can use GMG with
>>>
>>>           -pc_type mg -pc_mg_levels <n>
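>>>
>>>         A sketch of how that can look if you keep assembling your own
>>>         fine-grid matrix (the DMDA then supplies the grid hierarchy
>>>         and interpolation, and Galerkin products build the coarse
>>>         operators; the names below are placeholders):
>>>
>>>           call KSPCreate(PETSC_COMM_WORLD, ksp, ierr)
>>>           call KSPSetDM(ksp, da, ierr)
>>>           ! keep assembling the fine-grid operator yourself
>>>           call KSPSetDMActive(ksp, PETSC_FALSE, ierr)
>>>           call KSPSetOperators(ksp, A_mat, A_mat, ierr)
>>>           call KSPSetOptionsPrefix(ksp, "poisson_", ierr)
>>>           call KSPSetFromOptions(ksp, ierr)
>>>           call KSPSolve(ksp, b_rhs, x_sol, ierr)
>>>
>>>         run with something like
>>>
>>>           -poisson_pc_type mg -poisson_pc_mg_levels 4 -poisson_pc_mg_galerkin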
>>>
>>>            Matt
>>>
>>>>                 My cluster can't connect to the internet. Where can
>>>>                 I download it first?
>>>>
>>>>                 Similarly, how can I find out the download locations
>>>>                 of the external software by myself?
>>>>
>>>>
>>>>             The locations are all in the configure Python modules:
>>>>
>>>>             https://bitbucket.org/petsc/petsc/src/4b6141395f14f0c7d1415a2ff0158eec75a27d63/config/BuildSystem/config/packages/ml.py?at=master&fileviewer=file-view-default
>>>>
>>>>               Thanks,
>>>>
>>>>                 Matt
>>>>
>>>>>                   Thanks,
>>>>>
>>>>>                      Matt
>>>>>
>>>>>
>>>>>                     -- 
>>>>>                     Thank you very much.
>>>>>
>>>>>                     Yours sincerely,
>>>>>
>>>>>                     ================================================
>>>>>                     TAY Wee-Beng (Zheng Weiming) 郑伟明
>>>>>                     Personal research webpage:
>>>>>                     http://tayweebeng.wixsite.com/website
>>>>>                     Youtube research showcase:
>>>>>                     https://www.youtube.com/channel/UC72ZHtvQNMpNs2uRTSToiLA
>>>>>                     linkedin: www.linkedin.com/in/tay-weebeng
>>>>>                     ================================================
>>>>>
> -- 
> What most experimenters take for granted before they begin their 
> experiments is infinitely more interesting than any results to which 
> their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/ <http://www.caam.rice.edu/%7Emk51/>
