<div dir="ltr"><div dir="ltr"><br></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Tue, Jul 21, 2020 at 12:06 PM Pierpaolo Minelli <<a href="mailto:pierpaolo.minelli@cnr.it">pierpaolo.minelli@cnr.it</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div style="overflow-wrap: break-word;"><br><div><br><blockquote type="cite"><div>Il giorno 21 lug 2020, alle ore 16:56, Mark Adams <<a href="mailto:mfadams@lbl.gov" target="_blank">mfadams@lbl.gov</a>> ha scritto:</div><br><div><div dir="ltr"><div dir="ltr"><br></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Tue, Jul 21, 2020 at 9:46 AM Matthew Knepley <<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr">On Tue, Jul 21, 2020 at 9:35 AM Pierpaolo Minelli <<a href="mailto:pierpaolo.minelli@cnr.it" target="_blank">pierpaolo.minelli@cnr.it</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div>Thanks for your reply.<div>As I wrote before, I use these settings:</div><div><br><span>-dm_mat_type hypre -pc_type hypre -pc_hypre_type boomeramg -pc_hypre_boomeramg_relax_type_all SOR/Jacobi -pc_hypre_boomeramg_coarsen_type PMIS -pc_hypre_boomeramg_interp_type FF1 -ksp_type richardson</span></div><div><font><span><br></span></font></div><div><font>Is there a way to emulate this features also with GAMG?</font></div></div></blockquote><div><br></div><div>Smoothers: You have complete control here</div><div><br></div><div> -mg_levels_pc_type sor (the default is Chebyshev which you could also try)</div></div></div></blockquote><div><br></div><div>And you set the KSP type. You have -ksp_type richardson above but that is not used for Hypre. It is for GAMG. Chebyshev is a ksp type (-ksp_type chebyshev).</div></div></div></div></blockquote><div><br></div><blockquote type="cite"><div><div dir="ltr"><div class="gmail_quote"><div><br></div><div>Hypre is very good on Poisson. THe grid complexity (cost per iteration) can be high but the convergence rate will be better than GAMG.</div><div><br></div><div>But, you should be able to get hypre to work. </div></div></div></div></blockquote><div><br></div><div>Yes it is very good for Poisson, and on a smaller case, at the beginning of my code development, I have tried Hypre, ML, and GAMG (without adding more options I have to admit) and hypre was faster without losing in precision and accurateness of results (I have checked them with -ksp_monitor_true_residual).</div><div>I left -kps_type Richardson instead of default gmres only because from residuals it seems more accurate.</div></div></div></blockquote><div><br></div><div>Whoops, I made a mistake. I was talking about the smoother. 
> So first, I will try again to see whether hypre (with 64-bit integers) is able to work on a smaller case, as suggested by Stefano.
> Then I will investigate the GAMG options and give you feedback.
> The problem is that I need 64-bit integers because of my problem size, so I have to follow both paths, but I hope I will be able to continue using hypre.
>
> Thanks
>
> Pierpaolo
>
>>> Coarsening: This is much different in agglomeration AMG. There is a discussion here:
>>>
>>>   https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCGAMGSetThreshold.html
>>>   https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCGAMGSetSquareGraph.html
>>>
>>> Interpolation: This is built-in for agglomeration AMG.
>>>
>>>> It would be better to use only native PETSc implementations, but these settings, aside from the 32-bit integer indexing, gave me optimal performance.
>>>> For this reason I also asked whether it was possible to configure hypre (inside PETSc) with 64-bit integers.
>>>
>>> Yes. That happened when you reconfigured for 64 bits. You may have encountered a Hypre bug.
>>>
>>>   Thanks,
>>>
>>>      Matt
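
(For what it's worth, the two configure options that matter for that are the ones already in the script quoted below; a minimal sketch, with everything else omitted, would be

  ./configure --with-64-bit-indices=1 --download-hypre

and, as far as I know, --download-hypre then builds hypre with matching 64-bit integers, so no separate hypre flag should be needed.)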
>>>> Pierpaolo
>>>>
>>>> On 21 Jul 2020, at 13:36, Dave May <dave.mayhem23@gmail.com> wrote:
>>>>
>>>>> On Tue, 21 Jul 2020 at 12:32, Pierpaolo Minelli <pierpaolo.minelli@cnr.it> wrote:
>>>>>
>>>>>> Hi,
>>>>>>
>>>>>> I have asked to have an updated PETSc version compiled with 64-bit indices.
>>>>>> Now I have version 3.13.3, and these are the configure options that were used:
>>>>>>
>>>>>>   #!/bin/python
>>>>>>   if __name__ == '__main__':
>>>>>>       import sys
>>>>>>       import os
>>>>>>       sys.path.insert(0, os.path.abspath('config'))
>>>>>>       import configure
>>>>>>       configure_options = [
>>>>>>           '--CC=mpiicc',
>>>>>>           '--CXX=mpiicpc',
>>>>>>           '--download-hypre',
>>>>>>           '--download-metis',
>>>>>>           '--download-mumps=yes',
>>>>>>           '--download-parmetis',
>>>>>>           '--download-scalapack',
>>>>>>           '--download-superlu_dist',
>>>>>>           '--known-64-bit-blas-indices',
>>>>>>           '--prefix=/cineca/prod/opt/libraries/petsc/3.13.3_int64/intelmpi--2018--binary',
>>>>>>           '--with-64-bit-indices=1',
>>>>>>           '--with-blaslapack-dir=/cineca/prod/opt/compilers/intel/pe-xe-2018/binary/mkl',
>>>>>>           '--with-cmake-dir=/cineca/prod/opt/tools/cmake/3.12.0/none',
>>>>>>           '--with-debugging=0',
>>>>>>           '--with-fortran-interfaces=1',
>>>>>>           '--with-fortran=1',
>>>>>>           'FC=mpiifort',
>>>>>>           'PETSC_ARCH=arch-linux2-c-opt',
>>>>>>       ]
>>>>>>       configure.petsc_configure(configure_options)
>>>>>>
>>>>>> Now I receive an error from hypre:
>>>>>>
>>>>>>   forrtl: error (78): process killed (SIGTERM)
>>>>>>   Image              PC                Routine            Line     Source
>>>>>>   libHYPRE-2.18.2.s  00002B33CF465D3F  for__signal_handl  Unknown  Unknown
>>>>>>   libpthread-2.17.s  00002B33D5BFD370  Unknown            Unknown  Unknown
>>>>>>   libpthread-2.17.s  00002B33D5BF96D3  pthread_cond_wait  Unknown  Unknown
>>>>>>   libiomp5.so        00002B33DBA14E07  Unknown            Unknown  Unknown
>>>>>>   libiomp5.so        00002B33DB98810C  Unknown            Unknown  Unknown
>>>>>>   libiomp5.so        00002B33DB990578  Unknown            Unknown  Unknown
>>>>>>   libiomp5.so        00002B33DB9D9659  Unknown            Unknown  Unknown
>>>>>>   libiomp5.so        00002B33DB9D8C39  Unknown            Unknown  Unknown
>>>>>>   libiomp5.so        00002B33DB993BCE  __kmpc_fork_call   Unknown  Unknown
>>>>>>   PIC_3D             00000000004071C0  Unknown            Unknown  Unknown
>>>>>>   PIC_3D             0000000000490299  Unknown            Unknown  Unknown
>>>>>>   PIC_3D             0000000000492C17  Unknown            Unknown  Unknown
>>>>>>   PIC_3D             000000000040562E  Unknown            Unknown  Unknown
>>>>>>   libc-2.17.so       00002B33DC5BEB35  __libc_start_main  Unknown  Unknown
>>>>>>   PIC_3D             0000000000405539  Unknown            Unknown  Unknown
>>>>>>
>>>>>> Is it possible that I also need to ask for hypre to be compiled with an option for 64-bit indices?
>>>>>> Is it possible to request this through the PETSc configure?
>>>>>> Alternatively, is it possible to use a different multigrid PC inside PETSc that accepts 64-bit indices?
>>>>>
>>>>> You can use
>>>>>
>>>>>   -pc_type gamg
>>>>>
>>>>> All native PETSc implementations support 64-bit indices.
>>>>>
>>>>>> Thanks in advance
>>>>>>
>>>>>> Pierpaolo
>>>>>>
>>>>>> On 27 May 2020, at 11:26, Stefano Zampini <stefano.zampini@gmail.com> wrote:
>>>>>>
>>>>>>> You need a version of PETSc compiled with 64-bit indices, since the message indicates that the number of dofs in this case is larger than INT_MAX:
>>>>>>>
>>>>>>>   2501 × 3401 × 1601 = 13617947501
>>>>>>>
>>>>>>> I also suggest you upgrade to a newer version; 3.8.3 is quite old, as the error message reports.
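>>>>>>>
>>>>>>> For reference:
>>>>>>>
>>>>>>>   13617947501 > 2^31 - 1 = 2147483647  (the largest value a signed 32-bit PetscInt can hold, exceeded here by roughly a factor of six)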
>>>>>>>
>>>>>>> On Wed, 27 May 2020 at 11:50, Pierpaolo Minelli <pierpaolo.minelli@cnr.it> wrote:
>>>>>>>
>>>>>>>> Hi,
>>>>>>>>
>>>>>>>> I am trying to solve a Poisson equation on this grid:
>>>>>>>>
>>>>>>>>   Nx = 2501
>>>>>>>>   Ny = 3401
>>>>>>>>   Nz = 1601
>>>>>>>>
>>>>>>>> I received this error:
>>>>>>>>
>>>>>>>>   [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>>>>>>>   [0]PETSC ERROR: Overflow in integer operation: http://www.mcs.anl.gov/petsc/documentation/faq.html#64-bit-indices
>>>>>>>>   [0]PETSC ERROR: Mesh of 2501 by 3401 by 1 (dof) is too large for 32 bit indices
>>>>>>>>   [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>>>>>>>   [0]PETSC ERROR: Petsc Release Version 3.8.3, Dec, 09, 2017
>>>>>>>>   [0]PETSC ERROR: /marconi_scratch/userexternal/pminelli/PIC3D/2500_3400_1600/./PIC_3D on a arch-linux2-c-opt named r129c09s02 by pminelli Tue May 26 20:16:34 2020
>>>>>>>>   [0]PETSC ERROR: Configure options --prefix=/cineca/prod/opt/libraries/petsc/3.8.3/intelmpi--2018--binary CC=mpiicc FC=mpiifort CXX=mpiicpc F77=mpiifort F90=mpiifort --with-debugging=0 --with-blaslapack-dir=/cineca/prod/opt/compilers/intel/pe-xe-2018/binary/mkl --with-fortran=1 --with-fortran-interfaces=1 --with-cmake-dir=/cineca/prod/opt/tools/cmake/3.5.2/none --with-mpi-dir=/cineca/prod/opt/compilers/intel/pe-xe-2018/binary/impi/2018.4.274 --download-scalapack --download-mumps=yes --download-hypre --download-superlu_dist --download-parmetis --download-metis
>>>>>>>>   [0]PETSC ERROR: #1 DMSetUp_DA_3D() line 218 in /marconi/prod/build/libraries/petsc/3.8.3/intelmpi--2018--binary/BA_WORK/petsc-3.8.3/src/dm/impls/da/da3.c
>>>>>>>>   [0]PETSC ERROR: #2 DMSetUp_DA() line 25 in /marconi/prod/build/libraries/petsc/3.8.3/intelmpi--2018--binary/BA_WORK/petsc-3.8.3/src/dm/impls/da/dareg.c
>>>>>>>>   [0]PETSC ERROR: #3 DMSetUp() line 720 in /marconi/prod/build/libraries/petsc/3.8.3/intelmpi--2018--binary/BA_WORK/petsc-3.8.3/src/dm/interface/dm.c
>>>>>>>>   forrtl: error (76): Abort trap signal
>>>>>>>>
>>>>>>>> I am on an HPC facility, and after loading the PETSc module I saw that it is configured with INTEGER size = 32.
>>>>>>>>
>>>>>>>> I solve my problem with these options, and it works perfectly with smaller grids:
>>>>>>>>
>>>>>>>>   -dm_mat_type hypre -pc_type hypre -pc_hypre_type boomeramg -pc_hypre_boomeramg_relax_type_all SOR/Jacobi -pc_hypre_boomeramg_coarsen_type PMIS -pc_hypre_boomeramg_interp_type FF1 -ksp_type richardson
>>>>>>>>
>>>>>>>> Is it possible to overcome this if I ask them to install a version with INTEGER SIZE = 64?
>>>>>>>> Alternatively, is it possible to overcome this using Intel compiler options?
>>>>>>>>
>>>>>>>> Thanks in advance
>>>>>>>>
>>>>>>>> Pierpaolo Minelli
>>>>>>>
>>>>>>> --
>>>>>>> Stefano
>>>
>>> --
>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/