[petsc-users] Error on INTEGER SIZE using DMDACreate3d
Pierpaolo Minelli
pierpaolo.minelli at cnr.it
Fri Jul 31 05:46:33 CDT 2020
Hi,
I want to thank you for all your useful suggestions, but luckily for me it was not a problem related to PETSc, as I had wrongly supposed from reading this error. There was a problem on the GPFS filesystem that CINECA support solved, and also a problem in my code, which was writing too many files when using 512 nodes. Once these two problems were solved, I was able to continue using Hypre without any problem.
Sorry for the trouble and thanks again for your suggestions.
Pierpaolo
> On 22 Jul 2020, at 02:51, Barry Smith <bsmith at petsc.dev> wrote:
>
>
> I assume PIC_3D is your code and you are using OpenMP?
>
> Are you calling hypre from inside your OpenMP parallelism? From inside PIC_3D?
>
> The SIGTERM is confusing to me. Are you using signals in any way? Usually a SIGTERM comes from outside a process, not from a process or thread crash.
>
> I assume for__signal_handl... is a Fortran signal handler
>
> forrtl: error (78): process killed (SIGTERM)
> Image PC Routine Line Source
> libHYPRE-2.18.2.s 00002B33CF465D3F for__signal_handl Unknown Unknown
> libpthread-2.17.s 00002B33D5BFD370 Unknown Unknown Unknown
> libpthread-2.17.s 00002B33D5BF96D3 pthread_cond_wait Unknown Unknown
> libiomp5.so 00002B33DBA14E07 Unknown Unknown Unknown
> libiomp5.so 00002B33DB98810C Unknown Unknown Unknown
> libiomp5.so 00002B33DB990578 Unknown Unknown Unknown
> libiomp5.so 00002B33DB9D9659 Unknown Unknown Unknown
> libiomp5.so 00002B33DB9D8C39 Unknown Unknown Unknown
> libiomp5.so 00002B33DB993BCE __kmpc_fork_call Unknown Unknown
> PIC_3D 00000000004071C0 Unknown Unknown Unknown
> PIC_3D 0000000000490299 Unknown Unknown Unknown
> PIC_3D 0000000000492C17 Unknown Unknown Unknown
> PIC_3D 000000000040562E Unknown Unknown Unknown
> libc-2.17.so 00002B33DC5BEB35 __libc_start_main Unknown Unknown
> PIC_3D 0000000000405539 Unknown Unknown Unknown
>
>
>
>> On Jul 21, 2020, at 6:32 AM, Pierpaolo Minelli <pierpaolo.minelli at cnr.it> wrote:
>>
>> Hi,
>>
>> I have asked them to compile an updated PETSc version with 64-bit indices.
>> Now I have version 3.13.3, and these are the configure options that were used:
>>
>> #!/bin/python
>> if __name__ == '__main__':
>>   import sys
>>   import os
>>   sys.path.insert(0, os.path.abspath('config'))
>>   import configure
>>   configure_options = [
>>     '--CC=mpiicc',
>>     '--CXX=mpiicpc',
>>     '--download-hypre',
>>     '--download-metis',
>>     '--download-mumps=yes',
>>     '--download-parmetis',
>>     '--download-scalapack',
>>     '--download-superlu_dist',
>>     '--known-64-bit-blas-indices',
>>     '--prefix=/cineca/prod/opt/libraries/petsc/3.13.3_int64/intelmpi--2018--binary',
>>     '--with-64-bit-indices=1',
>>     '--with-blaslapack-dir=/cineca/prod/opt/compilers/intel/pe-xe-2018/binary/mkl',
>>     '--with-cmake-dir=/cineca/prod/opt/tools/cmake/3.12.0/none',
>>     '--with-debugging=0',
>>     '--with-fortran-interfaces=1',
>>     '--with-fortran=1',
>>     'FC=mpiifort',
>>     'PETSC_ARCH=arch-linux2-c-opt',
>>   ]
>>   configure.petsc_configure(configure_options)
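>>
>> As a side note, here is a minimal sketch of how one might verify that the installed build really defines 64-bit indices; the install path below is only assumed from the --prefix above:
>>
>> # illustrative check only; path assumed from the --prefix above
>> conf = ('/cineca/prod/opt/libraries/petsc/3.13.3_int64/'
>>         'intelmpi--2018--binary/include/petscconf.h')
>> with open(conf) as f:
>>     text = f.read()
>> # PETSC_USE_64BIT_INDICES is defined in petscconf.h when --with-64-bit-indices is used
>> print('64-bit indices' if 'PETSC_USE_64BIT_INDICES' in text else '32-bit indices')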
>>
>> Now, I receive an error on hypre:
>>
>> forrtl: error (78): process killed (SIGTERM)
>> Image PC Routine Line Source
>> libHYPRE-2.18.2.s 00002B33CF465D3F for__signal_handl Unknown Unknown
>> libpthread-2.17.s 00002B33D5BFD370 Unknown Unknown Unknown
>> libpthread-2.17.s 00002B33D5BF96D3 pthread_cond_wait Unknown Unknown
>> libiomp5.so 00002B33DBA14E07 Unknown Unknown Unknown
>> libiomp5.so 00002B33DB98810C Unknown Unknown Unknown
>> libiomp5.so 00002B33DB990578 Unknown Unknown Unknown
>> libiomp5.so 00002B33DB9D9659 Unknown Unknown Unknown
>> libiomp5.so 00002B33DB9D8C39 Unknown Unknown Unknown
>> libiomp5.so 00002B33DB993BCE __kmpc_fork_call Unknown Unknown
>> PIC_3D 00000000004071C0 Unknown Unknown Unknown
>> PIC_3D 0000000000490299 Unknown Unknown Unknown
>> PIC_3D 0000000000492C17 Unknown Unknown Unknown
>> PIC_3D 000000000040562E Unknown Unknown Unknown
>> libc-2.17.so 00002B33DC5BEB35 __libc_start_main Unknown Unknown
>> PIC_3D 0000000000405539 Unknown Unknown Unknown
>>
>> Is it possible that I also need to ask them to compile hypre with an option for 64-bit indices?
>> Is it possible to request this inside the PETSc configure?
>> Alternatively, is it possible to use a different multigrid PC inside PETSc that accepts 64-bit indices?
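>>
>> (For the last question, a possible fallback I could try -- just a guess at a starting point, not tuned for my problem -- would be PETSc's native algebraic multigrid, GAMG, which should not depend on hypre's integer size, e.g.:
>>
>> -pc_type gamg -ksp_type cg
>>
>> instead of the BoomerAMG options quoted below.)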
>>
>> Thanks in advance
>>
>> Pierpaolo
>>
>>
>>> On 27 May 2020, at 11:26, Stefano Zampini <stefano.zampini at gmail.com> wrote:
>>>
>>> You need a version of PETSc compiled with 64-bit indices, since the message indicates that the number of dofs in this case is larger than INT_MAX:
>>> 2501×3401×1601 = 13617947501
>>>
>>> I also suggest you upgrade to a newer version; as the error message reports, 3.8.3 is quite old.
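>>>
>>> Just to make the arithmetic explicit, a quick check in Python (for illustration only, nothing PETSc-specific):
>>>
>>> nx, ny, nz = 2501, 3401, 1601
>>> print(nx * ny * nz)              # 13617947501
>>> print(2**31 - 1)                 # 2147483647, i.e. INT_MAX for 32-bit signed integers
>>> print(nx * ny * nz > 2**31 - 1)  # True: the global index range overflows 32-bit indices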
>>>
>>> On Wed, 27 May 2020 at 11:50, Pierpaolo Minelli <pierpaolo.minelli at cnr.it> wrote:
>>> Hi,
>>>
>>> I am trying to solve a Poisson equation on this grid:
>>>
>>> Nx = 2501
>>> Ny = 3401
>>> Nz = 1601
>>>
>>> I received this error:
>>>
>>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>> [0]PETSC ERROR: Overflow in integer operation: http://www.mcs.anl.gov/petsc/documentation/faq.html#64-bit-indices
>>> [0]PETSC ERROR: Mesh of 2501 by 3401 by 1 (dof) is too large for 32 bit indices
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>> [0]PETSC ERROR: Petsc Release Version 3.8.3, Dec, 09, 2017
>>> [0]PETSC ERROR: /marconi_scratch/userexternal/pminelli/PIC3D/2500_3400_1600/./PIC_3D on a arch-linux2-c-opt named r129c09s02 by pminelli Tue May 26 20:16:34 2020
>>> [0]PETSC ERROR: Configure options --prefix=/cineca/prod/opt/libraries/petsc/3.8.3/intelmpi--2018--binary CC=mpiicc FC=mpiifort CXX=mpiicpc F77=mpiifort F90=mpiifort --with-debugging=0 --with-blaslapack-dir=/cineca/prod/opt/compilers/intel/pe-xe-2018/binary/mkl --with-fortran=1 --with-fortran-interfaces=1 --with-cmake-dir=/cineca/prod/opt/tools/cmake/3.5.2/none --with-mpi-dir=/cineca/prod/opt/compilers/intel/pe-xe-2018/binary/impi/2018.4.274 --download-scalapack --download-mumps=yes --download-hypre --download-superlu_dist --download-parmetis --download-metis
>>> [0]PETSC ERROR: #1 DMSetUp_DA_3D() line 218 in /marconi/prod/build/libraries/petsc/3.8.3/intelmpi--2018--binary/BA_WORK/petsc-3.8.3/src/dm/impls/da/da3.c
>>> [0]PETSC ERROR: #2 DMSetUp_DA() line 25 in /marconi/prod/build/libraries/petsc/3.8.3/intelmpi--2018--binary/BA_WORK/petsc-3.8.3/src/dm/impls/da/dareg.c
>>> [0]PETSC ERROR: #3 DMSetUp() line 720 in /marconi/prod/build/libraries/petsc/3.8.3/intelmpi--2018--binary/BA_WORK/petsc-3.8.3/src/dm/interface/dm.c
>>> forrtl: error (76): Abort trap signal
>>>
>>>
>>> I am on an HPC facility, and after loading the PETSc module I saw that it is configured with INTEGER size = 32.
>>>
>>> I solve my problem with these options and it works perfectly with smaller grids:
>>>
>>> -dm_mat_type hypre -pc_type hypre -pc_hypre_type boomeramg -pc_hypre_boomeramg_relax_type_all SOR/Jacobi -pc_hypre_boomeramg_coarsen_type PMIS -pc_hypre_boomeramg_interp_type FF1 -ksp_type richardson
>>>
>>> Is it possible to overcome this if I ask them to install a version with INTEGER SIZE = 64?
>>> Alternatively, is it possible to overcome this using Intel compiler options?
>>>
>>> Thanks in advance
>>>
>>> Pierpaolo Minelli
>>>
>>>
>>> --
>>> Stefano
>>
>