On Fri, Jan 22, 2016 at 12:17 PM, Hom Nath Gharti <hng.email@gmail.com> wrote:
> Thanks, Matt, for the great suggestion. One last question: do you know
> whether the GPU capability of the current PETSc version is mature enough
> to try for my problem?

The only thing that would really make sense to do on the GPU is the SEM
integration, which would not be part of PETSc. This is what SPECFEM has
optimized.

  Thanks,

     Matt

> Thanks again for your help.
> Hom Nath
>
> On Fri, Jan 22, 2016 at 1:07 PM, Matthew Knepley <knepley@gmail.com> wrote:
> > On Fri, Jan 22, 2016 at 11:47 AM, Hom Nath Gharti <hng.email@gmail.com>
> > wrote:
> >>
> >> Thanks a lot.
> >>
> >> With AMG it did not converge within the iteration limit of 3000.
> >>
>> In solid: elastic wave equation with added gravity term \rho \nabla\phi
>> In fluid: acoustic wave equation with added gravity term \rho \nabla\phi
>> Both solid and fluid: Poisson's equation for gravity
>> Outer space: Laplace's equation for gravity
>>
>> We combine the so-called mapped infinite element with the spectral-element
>> method (a higher-order FEM that uses nodal quadrature) and solve in the
>> frequency domain.
>
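For reference, written out schematically from the description above (the
precise form of the gravity coupling and of the density perturbation are
assumptions on my part, not stated in this thread):

    solid (frequency domain):  -\rho \omega^2 u = \nabla\cdot\sigma(u) - \rho \nabla\phi
    fluid:                     acoustic equation in (\xi, p) with the same
                               gravity term \rho \nabla\phi
    solid and fluid:           \nabla^2 \phi = 4 \pi G \delta\rho,
                               with \delta\rho = -\nabla\cdot(\rho u) assumed
    outer space:               \nabla^2 \phi = 0
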
> 1) The Poisson and Laplace equations should be using MG. However, you are
> using SEM, so you would need to use a low-order PC for the high-order
> problem, also called p-MG (Paul Fischer); see
>
> http://epubs.siam.org/doi/abs/10.1137/110834512
>
> (A minimal setup sketch for this follows after this message.)
>
> 2) The acoustic wave equation is Helmholtz to us, and that needs special
> MG tweaks that are still research material, so I can understand using ASM.
>
> 3) Same thing for the elastic wave equations. Some people say they have
> this solved using hierarchical matrix methods, something like
>
> http://portal.nersc.gov/project/sparse/strumpack/
>
> However, I think the jury is still out.
>
> If you can do 100 iterations of plain vanilla solvers, that seems like a
> win right now. You might improve the time using FS (fieldsplit), but I am
> not sure about the iterations on the smaller problem.
>
> Thanks,
>
>     Matt
>
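Here is a minimal sketch of the low-order-preconditioner idea from point 1,
using PETSc's separate Amat/Pmat: the Krylov method applies the high-order
SEM operator, while the preconditioner is built from a low-order
rediscretization on the same nodes. The matrix names and the surrounding
routine are assumptions for illustration; only the PETSc calls themselves
are real API.

    #include <petscksp.h>

    /* Sketch: precondition a high-order SEM operator with a low-order
       rediscretization (the p-MG idea).  A_sem and P_low are hypothetical
       matrices assembled elsewhere by the application. */
    PetscErrorCode SolveWithLowOrderPC(Mat A_sem, Mat P_low, Vec b, Vec x)
    {
      KSP            ksp;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
      /* Amat = high-order SEM operator applied by the Krylov method,
         Pmat  = low-order matrix used to build the preconditioner. */
      ierr = KSPSetOperators(ksp, A_sem, P_low);CHKERRQ(ierr);
      ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
      ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
      ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }

One would then run with, say, -pc_type gamg so that the multigrid hierarchy
is built from the low-order matrix while the Krylov iteration still applies
the true SEM operator.
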
>>
>> Hom Nath
>>
>> On Fri, Jan 22, 2016 at 12:16 PM, Matthew Knepley <knepley@gmail.com>
>> wrote:
>> > On Fri, Jan 22, 2016 at 11:10 AM, Hom Nath Gharti <hng.email@gmail.com>
>> > wrote:
>> >>
>> >> Thanks Matt.
>> >>
>> >> Attached is detailed info on the KSP for a much smaller test. This is a
>> >> multiphysics problem.
>> >
>> >
>> > You are using FGMRES/ASM(ILU0). From your description below, this sounds
>> > like an elliptic system. I would at least try AMG (-pc_type gamg) to see
>> > how it does. Any other advice would have to be based on seeing the
>> > equations.
>> >
>> > Thanks,
>> >
>> > Matt
>> >
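A minimal option set for that experiment could look like the following
(these are standard PETSc options; the particular combination is just a
suggestion, not something prescribed in this thread):

    -ksp_type fgmres -pc_type gamg \
    -ksp_monitor_true_residual -ksp_converged_reason -ksp_view
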
>> >>
>> >> Hom Nath
>> >>
>> >> On Fri, Jan 22, 2016 at 12:01 PM, Matthew Knepley <knepley@gmail.com>
>> >> wrote:
>> >> > On Fri, Jan 22, 2016 at 10:52 AM, Hom Nath Gharti
>> >> > <hng.email@gmail.com> wrote:
>> >> >>
>> >> >> Dear all,
>> >> >>
>> >> >> I take this opportunity to ask for your suggestion.
>> >> >>
>> >> >> I am solving an elastic-acoustic-gravity equation on the planet. I
>> >> >> have the displacement vector (ux,uy,uz) in the solid region, the
>> >> >> displacement potential (\xi) and pressure (p) in the fluid region,
>> >> >> and the gravitational potential (\phi) in all of space. All these
>> >> >> variables are coupled.
>> >> >>
>> >> >> Currently, I am using MATMPIAIJ and form a single global matrix.
>> >> >> Does using MATMPIBAIJ or MATNEST improve the convergence/efficiency
>> >> >> in this case? For your information, the total number of degrees of
>> >> >> freedom is about a billion.
>> >> >
>> >> >
>> >> > 1) For any solver question, we need to see the output of -ksp_view,
>> >> > and we would also like
>> >> >
>> >> > -ksp_monitor_true_residual -ksp_converged_reason
>> >> >
>> >> > 2) MATNEST does not affect convergence, and MATMPIBAIJ affects it
>> >> > only through the block size, which you could set without that format.
>> >> >
>> >> > 3) However, you might see benefit from using something like
>> >> > PCFIELDSPLIT if you have multiphysics here. (A setup sketch follows
>> >> > after this message.)
>> >> >
>> >> > Matt
>> >> >
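To make the PCFIELDSPLIT suggestion concrete, here is a minimal sketch of
one way to define splits by index sets for the three groups of unknowns.
The index sets (is_u, is_xi_p, is_phi), the split names, and the wrapper
routine are hypothetical; only the PETSc calls themselves are real API.

    #include <petscksp.h>

    /* Sketch: split the coupled system into solid-displacement, fluid
       (xi, p), and gravity-potential blocks.  The index sets would be
       built from the application's dof numbering. */
    PetscErrorCode SetupFieldSplit(KSP ksp, IS is_u, IS is_xi_p, IS is_phi)
    {
      PC             pc;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
      ierr = PCSetType(pc, PCFIELDSPLIT);CHKERRQ(ierr);
      ierr = PCFieldSplitSetIS(pc, "u",   is_u);CHKERRQ(ierr);    /* solid displacement        */
      ierr = PCFieldSplitSetIS(pc, "fp",  is_xi_p);CHKERRQ(ierr); /* fluid potential, pressure */
      ierr = PCFieldSplitSetIS(pc, "phi", is_phi);CHKERRQ(ierr);  /* gravitational potential   */
      /* Per-split solvers can then be chosen on the command line, e.g.
         -pc_fieldsplit_type multiplicative
         -fieldsplit_phi_pc_type gamg -fieldsplit_u_pc_type asm         */
      PetscFunctionReturn(0);
    }
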
>> >> >>
>> >> >> Any suggestion would be greatly appreciated.
>> >> >>
>> >> >> Thanks,
>> >> >> Hom Nath
>> >> >>
>> >> >> On Fri, Jan 22, 2016 at 10:32 AM, Matthew Knepley
>> >> >> <knepley@gmail.com> wrote:
>> >> >> > On Fri, Jan 22, 2016 at 9:27 AM, Mark Adams <mfadams@lbl.gov>
>> >> >> > wrote:
>> >> >> >>>
>> >> >> >>> I said the Hypre setup cost is not scalable,
>> >> >> >>
>> >> >> >>
>> >> >> >> I'd be a little careful here. Scaling for the matrix triple
>> >> >> >> product is hard, and hypre does put effort into scaling. I don't
>> >> >> >> have any data, however. Do you?
>> >> >> >
>> >> >> >
>> >> >> > I used it for PyLith and saw this. I did not think any AMG had
>> >> >> > scalable setup time.
>> >> >> >
>> >> >> > Matt
>> >> >> >
>> >> >> >>>
>> >> >> >>> but it can be amortized over the iterations. You can quantify
>> >> >> >>> this just by looking at the PCSetUp time as you increase the
>> >> >> >>> number of processes. I don't think they have a good model for
>> >> >> >>> the memory usage, and if they do, I do not know what it is.
>> >> >> >>> However, Hypre generally takes more memory than agglomeration
>> >> >> >>> MG like ML or GAMG.
>> >> >> >>>
>> >> >> >>
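One way to quantify the setup cost as suggested above (the workflow and the
executable name ./app are assumptions, not taken from this thread) is to run
the same problem at increasing process counts with PETSc logging enabled and
compare the time reported for the PCSetUp event:

    mpiexec -n 128  ./app -pc_type hypre -log_summary
    mpiexec -n 1024 ./app -pc_type hypre -log_summary

(-log_summary was the option name at the time; later PETSc releases call it
-log_view.) The PCSetUp row of the event table gives the setup time to
compare across runs.
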
>> >> >> >> Agglomeration methods tend to have lower "grid complexity", that
>> >> >> >> is, smaller coarse grids, than classic AMG like in hypre. This is
>> >> >> >> more of a constant complexity and not a scaling issue, though.
>> >> >> >> You can address this with parameters to some extent. But for
>> >> >> >> elasticity, you want to at least try, if not start with, GAMG or
>> >> >> >> ML.
>> >> >> >>
>> >> >> >>>
>> >> >> >>> Thanks,
>> >> >> >>>
>> >> >> >>> Matt
>> >> >> >>>
>> >> >> >>>>
>> >> >> >>>>
>> >> >> >>>> Giang
>> >> >> >>>>
>> >> >> >>>> On Mon, Jan 18, 2016 at 5:25 PM, Jed Brown <jed@jedbrown.org>
>> >> >> >>>> wrote:
>> >> >> >>>>>
>> >> >> >>>>> Hoang Giang Bui <hgbk2008@gmail.com> writes:
>> >> >> >>>>>
>> >> >> >>>>> > Why is P2/P2 not for co-located discretization?
>> >> >> >>>>>
>> >> >> >>>>> Matt typed "P2/P2" when he meant "P2/P1".

-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener