I assume PIC_3D is your code and you are using OpenMP?

Are you calling hypre from inside your OpenMP parallelism, from inside PIC_3D?

The SIGTERM is confusing to me. Are you using signals in any way? Usually a SIGTERM comes from outside a process; it is not produced by a process or thread crash.

I assume for__signal_handl... is a Fortran signal handler.

forrtl: error (78): process killed (SIGTERM)
Image              PC                Routine            Line     Source
libHYPRE-2.18.2.s  00002B33CF465D3F  for__signal_handl  Unknown  Unknown
libpthread-2.17.s  00002B33D5BFD370  Unknown            Unknown  Unknown
libpthread-2.17.s  00002B33D5BF96D3  pthread_cond_wait  Unknown  Unknown
libiomp5.so        00002B33DBA14E07  Unknown            Unknown  Unknown
libiomp5.so        00002B33DB98810C  Unknown            Unknown  Unknown
libiomp5.so        00002B33DB990578  Unknown            Unknown  Unknown
libiomp5.so        00002B33DB9D9659  Unknown            Unknown  Unknown
libiomp5.so        00002B33DB9D8C39  Unknown            Unknown  Unknown
libiomp5.so        00002B33DB993BCE  __kmpc_fork_call   Unknown  Unknown
PIC_3D             00000000004071C0  Unknown            Unknown  Unknown
PIC_3D             0000000000490299  Unknown            Unknown  Unknown
PIC_3D             0000000000492C17  Unknown            Unknown  Unknown
PIC_3D             000000000040562E  Unknown            Unknown  Unknown
libc-2.17.so       00002B33DC5BEB35  __libc_start_main  Unknown  Unknown
PIC_3D             0000000000405539  Unknown            Unknown  Unknown


On Jul 21, 2020, at 6:32 AM, Pierpaolo Minelli <pierpaolo.minelli@cnr.it> wrote:

Hi,

I have asked to compile an updated PETSc version with 64-bit indices.
Now I have version 3.13.3, and these are the configure options used:

#!/bin/python
if __name__ == '__main__':
  import sys
  import os
  sys.path.insert(0, os.path.abspath('config'))
  import configure
  configure_options = [
    '--CC=mpiicc',
    '--CXX=mpiicpc',
    '--download-hypre',
    '--download-metis',
    '--download-mumps=yes',
    '--download-parmetis',
    '--download-scalapack',
    '--download-superlu_dist',
    '--known-64-bit-blas-indices',
    '--prefix=/cineca/prod/opt/libraries/petsc/3.13.3_int64/intelmpi--2018--binary',
    '--with-64-bit-indices=1',
    '--with-blaslapack-dir=/cineca/prod/opt/compilers/intel/pe-xe-2018/binary/mkl',
    '--with-cmake-dir=/cineca/prod/opt/tools/cmake/3.12.0/none',
    '--with-debugging=0',
    '--with-fortran-interfaces=1',
    '--with-fortran=1',
    'FC=mpiifort',
    'PETSC_ARCH=arch-linux2-c-opt',
  ]
  configure.petsc_configure(configure_options)
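As a quick sanity check of the resulting build, a minimal sketch along these lines (assuming the install prefix from the options above, and that configure generated include/petscconf.h there) can confirm that 64-bit indices and hypre support really went into the install:

# Sketch: inspect the generated PETSc configuration header for the
# 64-bit-indices and hypre macros. The prefix is the one passed to configure.
import os

prefix = '/cineca/prod/opt/libraries/petsc/3.13.3_int64/intelmpi--2018--binary'
with open(os.path.join(prefix, 'include', 'petscconf.h')) as f:
    conf = f.read()

print('64-bit indices:', '#define PETSC_USE_64BIT_INDICES 1' in conf)
print('hypre enabled: ', '#define PETSC_HAVE_HYPRE 1' in conf)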
Now, I receive an error on hypre:

forrtl: error (78): process killed (SIGTERM)
Image              PC                Routine            Line     Source
libHYPRE-2.18.2.s  00002B33CF465D3F  for__signal_handl  Unknown  Unknown
libpthread-2.17.s  00002B33D5BFD370  Unknown            Unknown  Unknown
libpthread-2.17.s  00002B33D5BF96D3  pthread_cond_wait  Unknown  Unknown
libiomp5.so        00002B33DBA14E07  Unknown            Unknown  Unknown
libiomp5.so        00002B33DB98810C  Unknown            Unknown  Unknown
libiomp5.so        00002B33DB990578  Unknown            Unknown  Unknown
libiomp5.so        00002B33DB9D9659  Unknown            Unknown  Unknown
libiomp5.so        00002B33DB9D8C39  Unknown            Unknown  Unknown
libiomp5.so        00002B33DB993BCE  __kmpc_fork_call   Unknown  Unknown
PIC_3D             00000000004071C0  Unknown            Unknown  Unknown
PIC_3D             0000000000490299  Unknown            Unknown  Unknown
PIC_3D             0000000000492C17  Unknown            Unknown  Unknown
PIC_3D             000000000040562E  Unknown            Unknown  Unknown
libc-2.17.so       00002B33DC5BEB35  __libc_start_main  Unknown  Unknown
PIC_3D             0000000000405539  Unknown            Unknown  Unknown

Is it possible that I also need to ask them to compile hypre with an option for 64-bit indices?
Is it possible to instruct this inside the PETSc configure?
Alternatively, is it possible to use a different multigrid PC inside PETSc that accepts 64-bit indices?

Thanks in advance

Pierpaolo


On May 27, 2020, at 11:26 AM, Stefano Zampini <stefano.zampini@gmail.com> wrote:

You need a version of PETSc compiled with 64-bit indices, since the message indicates that the number of dofs in this case is larger than INT_MAX:
2501 × 3401 × 1601 = 13617947501

I also suggest you upgrade to a newer version; 3.8.3 is quite old, as the error message reports.
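The arithmetic is easy to confirm; a tiny plain-Python check (only the grid sizes from the message below are assumed):

# Total number of unknowns on the 2501 x 3401 x 1601 grid, compared with the
# largest value a signed 32-bit integer (PetscInt without
# --with-64-bit-indices) can hold.
ndofs = 2501 * 3401 * 1601
int_max = 2**31 - 1            # 2147483647
print(ndofs)                   # 13617947501
print(ndofs > int_max)         # True -> 32-bit indices overflow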
On Wed, May 27, 2020 at 11:50 AM Pierpaolo Minelli <pierpaolo.minelli@cnr.it> wrote:

Hi,

I am trying to solve a Poisson equation on this grid:

Nx = 2501
Ny = 3401
Nz = 1601

I received this error:

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Overflow in integer operation: http://www.mcs.anl.gov/petsc/documentation/faq.html#64-bit-indices
[0]PETSC ERROR: Mesh of 2501 by 3401 by 1 (dof) is too large for 32 bit indices
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.8.3, Dec, 09, 2017
[0]PETSC ERROR: /marconi_scratch/userexternal/pminelli/PIC3D/2500_3400_1600/./PIC_3D on a arch-linux2-c-opt named r129c09s02 by pminelli Tue May 26 20:16:34 2020
[0]PETSC ERROR: Configure options --prefix=/cineca/prod/opt/libraries/petsc/3.8.3/intelmpi--2018--binary CC=mpiicc FC=mpiifort CXX=mpiicpc F77=mpiifort F90=mpiifort --with-debugging=0 --with-blaslapack-dir=/cineca/prod/opt/compilers/intel/pe-xe-2018/binary/mkl --with-fortran=1 --with-fortran-interfaces=1 --with-cmake-dir=/cineca/prod/opt/tools/cmake/3.5.2/none --with-mpi-dir=/cineca/prod/opt/compilers/intel/pe-xe-2018/binary/impi/2018.4.274 --download-scalapack --download-mumps=yes --download-hypre --download-superlu_dist --download-parmetis --download-metis
[0]PETSC ERROR: #1 DMSetUp_DA_3D() line 218 in /marconi/prod/build/libraries/petsc/3.8.3/intelmpi--2018--binary/BA_WORK/petsc-3.8.3/src/dm/impls/da/da3.c
[0]PETSC ERROR: #2 DMSetUp_DA() line 25 in /marconi/prod/build/libraries/petsc/3.8.3/intelmpi--2018--binary/BA_WORK/petsc-3.8.3/src/dm/impls/da/dareg.c
[0]PETSC ERROR: #3 DMSetUp() line 720 in /marconi/prod/build/libraries/petsc/3.8.3/intelmpi--2018--binary/BA_WORK/petsc-3.8.3/src/dm/interface/dm.c
forrtl: error (76): Abort trap signal

I am on an HPC facility, and after loading the PETSc module I saw that it is configured with INTEGER size = 32.

I solve my problem with these options, and it works perfectly with smaller grids:

-dm_mat_type hypre -pc_type hypre -pc_hypre_type boomeramg -pc_hypre_boomeramg_relax_type_all SOR/Jacobi -pc_hypre_boomeramg_coarsen_type PMIS -pc_hypre_boomeramg_interp_type FF1 -ksp_type richardson
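For illustration, a minimal petsc4py sketch of how this solver configuration might be driven programmatically; this is an assumed, reduced setup, not the actual PIC_3D code, and the grid is shrunk so the sketch stays cheap:

# Minimal petsc4py sketch (assumed reduced setup, not PIC_3D itself):
# a 3-D DMDA with the same hypre/BoomerAMG options as on the command line.
from petsc4py import PETSc

opts = PETSc.Options()
opts['dm_mat_type'] = 'hypre'
opts['pc_hypre_type'] = 'boomeramg'
opts['pc_hypre_boomeramg_relax_type_all'] = 'SOR/Jacobi'
opts['pc_hypre_boomeramg_coarsen_type'] = 'PMIS'
opts['pc_hypre_boomeramg_interp_type'] = 'FF1'

da = PETSc.DMDA().create(sizes=[33, 33, 33], stencil_width=1)
da.setFromOptions()              # picks up -dm_mat_type hypre
A = da.createMatrix()            # the Poisson operator would be assembled here
b = da.createGlobalVec()
x = da.createGlobalVec()

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setType('richardson')
ksp.getPC().setType('hypre')
ksp.setFromOptions()             # picks up the -pc_hypre_* options above
# ksp.solve(b, x)                # once A and b are actually assembled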

Is it possible to overcome this if I ask them to install a version with INTEGER SIZE = 64?
Alternatively, is it possible to overcome this using Intel compiler options?

Thanks in advance

Pierpaolo Minelli

-- 
Stefano