[petsc-users] Problem with solving Poisson eqn for some cases
Matthew Knepley
knepley at gmail.com
Wed Mar 21 06:47:50 CDT 2018
On Wed, Mar 21, 2018 at 4:59 AM, TAY wee-beng <zonexo at gmail.com> wrote:
>
> On 19/3/2018 6:32 PM, Matthew Knepley wrote:
>
> On Mon, Mar 19, 2018 at 5:19 AM, TAY wee-beng <zonexo at gmail.com> wrote:
>
>>
>> On 17/3/2018 1:15 AM, Matthew Knepley wrote:
>>
>> On Fri, Mar 16, 2018 at 12:54 PM, TAY wee-beng <zonexo at gmail.com> wrote:
>>
>>>
>>> On 15/3/2018 6:21 PM, Matthew Knepley wrote:
>>>
>>> On Thu, Mar 15, 2018 at 3:51 PM, TAY wee-beng <zonexo at gmail.com> wrote:
>>>
>>>> Hi,
>>>>
>>>> I'm running a CFD code which solves the momentum and Poisson equations.
>>>>
>>>> Due to poor scaling with HYPRE at higher CPU counts, I decided to try
>>>> PETSc with BoomerAMG and GAMG.
>>>>
>>>> I tested some small cases and they worked well. However, for the large
>>>> problem which has poor scaling, it gives an error when I change my
>>>> Poisson solver from pure HYPRE to PETSc with BoomerAMG and GAMG.
>>>>
>>>> The error is:
>>>>
>>>> Caught signal number 11 SEGV: Segmentation Violation, probably memory
>>>> access out of range
>>>>
>>>> I tried using:
>>>>
>>>> -poisson_ksp_type richardson -poisson_pc_type hypre
>>>> -poisson_pc_hypre_type boomeramg
>>>>
>>>> -poisson_ksp_type gmres -poisson_pc_type hypre
>>>> -poisson_pc_hypre_type boomeramg
>>>>
>>>> -poisson_pc_type gamg -poisson_pc_gamg_agg_nsmooths 1
>>>>
>>>> but they all gave a similar error.
>>>>
>>>> So why is this so? How should I troubleshoot? I am now running a debug
>>>> version of PETSc to check the error message.
>>>
>>>
>>> 1) For anything like this, we would like to see a stack trace from the
>>> debugger or valgrind output (example command below).
>>>
>>> 2) We do have several Poisson examples. Does it fail for you on those?
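>>>
>>> For example, a sketch (assuming your executable is ./a.out; adjust the
>>> name, process count, and options to your actual run):
>>>
>>>   mpiexec -n 4 valgrind -q --tool=memcheck --num-callers=20 ./a.out \
>>>       -poisson_pc_type gamg
>>>
>>> Alternatively, run with the PETSc options -on_error_attach_debugger or
>>> -start_in_debugger to get a stack trace from gdb when the SEGV occurs.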
>>>
>>> Hi,
>>>
>>> Can you recommend some suitable examples, especially in Fortran?
>>>
>>
>> Here is a 2D Poisson example:
>>
>> https://bitbucket.org/petsc/petsc/src/4b6141395f14f0c7d1415a2ff0158eec75a27d63/src/snes/examples/tutorials/ex5f.F90?at=master&fileviewer=file-view-default
>>
>
> Hi,
>
> I tried the different options like ML, MG, and BoomerAMG, and while they
> worked for the small problem, they failed for the large problem. I checked
> and compared the 2D Poisson example with my 3D Poisson subroutine. I found
> that my subroutine is very similar to my momentum subroutine, which is
> based on KSPSolve (a sketch of this workflow is below):
>
> 0. DMDACreate, DMDACreate3d etc
> 1. Assemble matrix
> 2. KSPSetOperators, KSPSetType etc
> 3. KSPSolve
>
> However, the 2D Poisson example uses SNESSetDM, SNESSolve etc. So does it
> matter if I use SNESSolve or KSPSolve?
>
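> A minimal, self-contained sketch of this workflow in Fortran is below.
> This is hypothetical illustration code, not my actual subroutine: the
> grid size (24^3), the 7-point Laplacian with identity rows on the
> Dirichlet boundary, and all variable names are placeholder assumptions.
>
> ! poisson_sketch.F90 -- illustrative sketch of steps 0-3 above
> program poisson_sketch
> #include <petsc/finclude/petsc.h>
>   use petsc
>   implicit none
>
>   DM                  :: da
>   Mat                 :: A
>   Vec                 :: b, x
>   KSP                 :: ksp
>   PetscErrorCode      :: ierr
>   PetscInt            :: i, j, k, xs, ys, zs, xm, ym, zm, d
>   PetscInt, parameter :: n = 24, ione = 1, iseven = 7
>   PetscInt            :: di(6), dj(6), dk(6)
>   MatStencil          :: row(4), col(4,7)
>   PetscScalar         :: v(7), one
>
>   call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
>
>   ! 0. DMDACreate3d: structured n^3 grid, 1 dof, star stencil of width 1
>   call DMDACreate3d(PETSC_COMM_WORLD,                                   &
>        DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,            &
>        DMDA_STENCIL_STAR, n, n, n, PETSC_DECIDE, PETSC_DECIDE,          &
>        PETSC_DECIDE, ione, ione, PETSC_NULL_INTEGER,                    &
>        PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, da, ierr)
>   call DMSetUp(da, ierr)
>
>   ! 1. Assemble a 7-point Laplacian; boundary rows are identity
>   call DMCreateMatrix(da, A, ierr)
>   call DMDAGetCorners(da, xs, ys, zs, xm, ym, zm, ierr)
>   row = 0; col = 0              ! zero the unused component (c) slots
>   di = (/-1, 1, 0, 0, 0, 0/)
>   dj = (/ 0, 0,-1, 1, 0, 0/)
>   dk = (/ 0, 0, 0, 0,-1, 1/)
>   do k = zs, zs + zm - 1
>     do j = ys, ys + ym - 1
>       do i = xs, xs + xm - 1
>         row(MatStencil_i) = i
>         row(MatStencil_j) = j
>         row(MatStencil_k) = k
>         if (i == 0 .or. j == 0 .or. k == 0 .or.                         &
>             i == n-1 .or. j == n-1 .or. k == n-1) then
>           v(1) = 1.0            ! Dirichlet boundary row
>           call MatSetValuesStencil(A, ione, row, ione, row, v,          &
>                                    INSERT_VALUES, ierr)
>         else
>           v(1) = 6.0            ! diagonal of the interior stencil
>           col(MatStencil_i,1) = i
>           col(MatStencil_j,1) = j
>           col(MatStencil_k,1) = k
>           do d = 1, 6           ! the six off-diagonal neighbours
>             v(d+1) = -1.0
>             col(MatStencil_i,d+1) = i + di(d)
>             col(MatStencil_j,d+1) = j + dj(d)
>             col(MatStencil_k,d+1) = k + dk(d)
>           end do
>           call MatSetValuesStencil(A, ione, row, iseven, col, v,        &
>                                    INSERT_VALUES, ierr)
>         end if
>       end do
>     end do
>   end do
>   call MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY, ierr)
>   call MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY, ierr)
>
>   ! A simple right-hand side so the solve is well defined
>   call DMCreateGlobalVector(da, b, ierr)
>   call VecDuplicate(b, x, ierr)
>   one = 1.0
>   call VecSet(b, one, ierr)
>
>   ! 2.+3. KSPSetOperators / KSPSolve; the poisson_ prefix makes the
>   ! -poisson_* options from this thread apply to this solver
>   call KSPCreate(PETSC_COMM_WORLD, ksp, ierr)
>   call KSPSetOperators(ksp, A, A, ierr)
>   call KSPSetOptionsPrefix(ksp, 'poisson_', ierr)
>   call KSPSetFromOptions(ksp, ierr)
>   call KSPSolve(ksp, b, x, ierr)
>
>   call KSPDestroy(ksp, ierr)
>   call VecDestroy(x, ierr)
>   call VecDestroy(b, ierr)
>   call MatDestroy(A, ierr)
>   call DMDestroy(da, ierr)
>   call PetscFinalize(ierr)
> end program poisson_sketch
>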
Not if you are using Algebraic Multigrid. "It failed for the large problem"
is not descriptive enough. Please send the output of
-ksp_view -ksp_monitor_true_residual
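(with your solver prefix that would presumably be -poisson_ksp_view
-poisson_ksp_monitor_true_residual on your command line).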
> I also didn't set any nullspace. Is this required for a Poisson eqn solve?
>
If you only have the constant null space, no. Most AMG packages do that one
by default.
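
If you ever do need to attach it explicitly (for example, a pure Neumann
problem), a minimal Fortran sketch would be something like this, where A
stands for your assembled Poisson matrix:

  MatNullSpace :: nullsp
  PetscInt     :: izero

  izero = 0
  ! the constant vector spans the null space of the Neumann Laplacian
  call MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, izero,          &
       PETSC_NULL_VEC, nullsp, ierr)
  call MatSetNullSpace(A, nullsp, ierr)
  call MatNullSpaceDestroy(nullsp, ierr)

No extra basis vectors are needed here; the PETSC_TRUE flag marks the
constant vector as the null space.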
Matt
> Thanks
>
>>> 3) You can also try ML, which is the same type of MG as GAMG
>>> (--download-ml).
>>>
>>> I have recompiled PETSc with ML. Are there example command-line
>>> options which I can use for ML?
>>
>
> -pc_type ml
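> (with your solver prefix, presumably -poisson_pc_type ml)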
>
>
>> Another question: generally speaking, is geometric multigrid (GMG)
>> faster than algebraic multigrid?
>>
>
> No, only for the setup time.
>
>
>> I tested on a small problem and the time taken varies from 1.15 min
>> (HYPRE, geometric) to 3.25 min (GAMG). BoomerAMG takes 1.45 min.
>>
>
> I am not sure what you are running when you say Hypre geometric.
>
>
>> Besides HYPRE, is there any other GMG I can use?
>>
>
> As I said above, it does not converge any faster and the solve is not
> faster; it's all setup time, so small problems will look faster.
> You should be doing 10-20 iterates. If you are doing more, MG is not
> working.
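>
> (A sketch of how to check this, assuming the poisson_ prefix: run with
> -poisson_ksp_converged_reason or -poisson_ksp_monitor, or call
> KSPGetIterationNumber after KSPSolve.)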
>
> If you truly have a structured grid, then use DMDA in PETSc and you can
> use GMG with
>
> -pc_type mg -pc_mg_levels <n>
>
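> A hypothetical sketch of wiring that up, assuming the DMDA (da), matrix
> (A), and KSP (ksp) from the workflow sketched earlier in this thread:
>
>   call KSPSetDM(ksp, da, ierr)
>   ! we assemble A ourselves; the DM is used only for the grid hierarchy
>   call KSPSetDMActive(ksp, PETSC_FALSE, ierr)
>   call KSPSetOperators(ksp, A, A, ierr)
>
> and then run with -poisson_pc_type mg -poisson_pc_mg_levels <n>, possibly
> adding -poisson_pc_mg_galerkin so the coarse operators are formed
> algebraically from A.
>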
> Matt
>
>> My cluster can't connect to the internet. Where can I download it first?
>>>
>>> Similarly, how can I find out the location of the external software by
>>> myself?
>>>
>>
>> The locations are all in the configure Python modules:
>>
>> https://bitbucket.org/petsc/petsc/src/4b6141395f14f0c7d1415a2ff0158eec75a27d63/config/BuildSystem/config/packages/ml.py?at=master&fileviewer=file-view-default
>>
>> Thanks,
>>
>> Matt
>>
>>
>>> Thanks,
>>>
>>> Matt
>>>
>>>
>>>>
>>>> --
>>>> Thank you very much.
>>>>
>>>> Yours sincerely,
>>>>
>>>> ================================================
>>>> TAY Wee-Beng (Zheng Weiming) 郑伟明
>>>> Personal research webpage: http://tayweebeng.wixsite.com/website
>>>> Youtube research showcase: https://www.youtube.com/channel/UC72ZHtvQNMpNs2uRTSToiLA
>>>> linkedin: www.linkedin.com/in/tay-weebeng
>>>> ================================================
>>>>
>>>>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/