[petsc-users] Error on INTEGER SIZE using DMDACreate3d

Mark Adams mfadams at lbl.gov
Tue Jul 21 12:10:57 CDT 2020


On Tue, Jul 21, 2020 at 12:06 PM Pierpaolo Minelli <pierpaolo.minelli at cnr.it>
wrote:

>
>
> On 21 Jul 2020, at 16:56, Mark Adams <mfadams at lbl.gov> wrote:
>
>
>
> On Tue, Jul 21, 2020 at 9:46 AM Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Tue, Jul 21, 2020 at 9:35 AM Pierpaolo Minelli <
>> pierpaolo.minelli at cnr.it> wrote:
>>
>>> Thanks for your reply.
>>> As I wrote before, I use these settings:
>>>
>>> -dm_mat_type hypre -pc_type hypre -pc_hypre_type boomeramg
>>> -pc_hypre_boomeramg_relax_type_all SOR/Jacobi
>>> -pc_hypre_boomeramg_coarsen_type PMIS -pc_hypre_boomeramg_interp_type FF1
>>> -ksp_type richardson
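>>>
>>> For illustration, a minimal petsc4py sketch of the same setup (the DMDA
>>> sizes below are placeholders; in the Fortran application these options are
>>> passed on the command line or through PetscOptionsSetValue):
>>>
>>> import sys, petsc4py
>>> petsc4py.init(sys.argv)
>>> from petsc4py import PETSc
>>>
>>> opts = PETSc.Options()
>>> opts['dm_mat_type'] = 'hypre'
>>> opts['pc_type'] = 'hypre'
>>> opts['pc_hypre_type'] = 'boomeramg'
>>> opts['pc_hypre_boomeramg_relax_type_all'] = 'SOR/Jacobi'
>>> opts['pc_hypre_boomeramg_coarsen_type'] = 'PMIS'
>>> opts['pc_hypre_boomeramg_interp_type'] = 'FF1'
>>> opts['ksp_type'] = 'richardson'
>>>
>>> # Placeholder grid; the real one is 2501 x 3401 x 1601.
>>> da = PETSc.DMDA().create(dim=3, sizes=(65, 65, 65), dof=1, stencil_width=1)
>>> ksp = PETSc.KSP().create()
>>> ksp.setDM(da)
>>> ksp.setFromOptions()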
>>>
>>> Is there a way to emulate these features with GAMG as well?
>>>
>>
>> Smoothers: You have complete control here
>>
>>   -mg_levels_pc_type sor   (the default is Chebyshev, which you could also
>> try)
>>
>
> And you set the KSP type. You have -ksp_type richardson above, but that is
> not used for Hypre; it is for GAMG. Chebyshev is a KSP type (-ksp_type
> chebyshev).
>
>
>
> Hypre is very good on Poisson. The grid complexity (cost per iteration)
> can be high, but the convergence rate will be better than GAMG's.
>
> But, you should be able to get hypre to work.
>
>
> Yes, it is very good for Poisson. On a smaller case, at the beginning
> of my code development, I tried Hypre, ML, and GAMG (without adding
> more options, I have to admit), and hypre was faster without losing
> precision or accuracy in the results (I checked them with
> -ksp_monitor_true_residual).
> I kept -ksp_type richardson instead of the default gmres only because the
> residuals seemed more accurate.
>

Whoops, I made a mistake. I was talking about the smoother. So
-mg_levels_pc_type sor -mg_levels_ksp_type chebyshev
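
As an illustration only, a minimal petsc4py sketch of wiring up those
smoother options (your application is Fortran, so in practice you would pass
them on the command line or via PetscOptionsSetValue):

import sys, petsc4py
petsc4py.init(sys.argv)
from petsc4py import PETSc

opts = PETSc.Options()
opts['pc_type'] = 'gamg'
# Per-level smoother: Chebyshev iterations preconditioned by SOR.
opts['mg_levels_ksp_type'] = 'chebyshev'
opts['mg_levels_pc_type'] = 'sor'

ksp = PETSc.KSP().create()
ksp.setFromOptions()   # picks up the options set above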


>
> So first, I will try again to see if hypre (with 64bit integers) is able
> to work on a smaller case, as suggested by Stefano.
> Then I will investigate the GAMG options and give you feedback.
> The problem is that I need 64bit integers because of my problem size, so I
> have to follow both paths, but I hope that I will be able to continue using
> hypre.
>
> Thanks
>
> Pierpaolo
>
>
>
>
>>
>> Coarsening: This is much different in agglomeration AMG. There is a
>> discussion here:
>>
>>
>> https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCGAMGSetThreshold.html
>>
>> https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCGAMGSetSquareGraph.html
>>
>> Interpolation: This is built-in for agglomeration AMG.
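>>
>> As a sketch, those coarsening knobs map onto options like the following
>> (petsc4py here for illustration; the threshold value is made up, see the
>> pages above for the defaults):
>>
>> from petsc4py import PETSc
>>
>> opts = PETSc.Options()
>> opts['pc_gamg_threshold'] = 0.01    # drop tolerance for weak graph edges
>> opts['pc_gamg_square_graph'] = 1    # square the graph on this many levels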
>>
>>
>>> It would be better to use only native PETSc implementations, but these
>>> settings, as long as integer indexing stayed 32bit, gave me optimal
>>> performance.
>>> For this reason I also asked whether it was possible to configure hypre
>>> (inside PETSc) with 64bit integers.
>>>
>>
>> Yes. That happened when you reconfigured for 64 bits. You may have
>> encountered a Hypre bug.
>>
>>   Thanks,
>>
>>     Matt
>>
>>
>>> Pierpaolo
>>>
>>>
>>> On 21 Jul 2020, at 13:36, Dave May <dave.mayhem23 at gmail.com> wrote:
>>>
>>>
>>>
>>> On Tue, 21 Jul 2020 at 12:32, Pierpaolo Minelli <
>>> pierpaolo.minelli at cnr.it> wrote:
>>>
>>>> Hi,
>>>>
>>>> I have asked for an updated PETSc version compiled with 64bit indices.
>>>> I now have version 3.13.3, and these are the configure options used:
>>>>
>>>> #!/bin/python
>>>> if __name__ == '__main__':
>>>>   import sys
>>>>   import os
>>>>   sys.path.insert(0, os.path.abspath('config'))
>>>>   import configure
>>>>   configure_options = [
>>>>     '--CC=mpiicc',
>>>>     '--CXX=mpiicpc',
>>>>     '--download-hypre',
>>>>     '--download-metis',
>>>>     '--download-mumps=yes',
>>>>     '--download-parmetis',
>>>>     '--download-scalapack',
>>>>     '--download-superlu_dist',
>>>>     '--known-64-bit-blas-indices',
>>>>     '--prefix=/cineca/prod/opt/libraries/petsc/3.13.3_int64/intelmpi--2018--binary',
>>>>     '--with-64-bit-indices=1',
>>>>     '--with-blaslapack-dir=/cineca/prod/opt/compilers/intel/pe-xe-2018/binary/mkl',
>>>>     '--with-cmake-dir=/cineca/prod/opt/tools/cmake/3.12.0/none',
>>>>     '--with-debugging=0',
>>>>     '--with-fortran-interfaces=1',
>>>>     '--with-fortran=1',
>>>>     'FC=mpiifort',
>>>>     'PETSC_ARCH=arch-linux2-c-opt',
>>>>   ]
>>>>   configure.petsc_configure(configure_options)
>>>>
>>>> Now I receive an error from hypre:
>>>>
>>>> forrtl: error (78): process killed (SIGTERM)
>>>> Image              PC                Routine            Line     Source
>>>> libHYPRE-2.18.2.s  00002B33CF465D3F  for__signal_handl  Unknown  Unknown
>>>> libpthread-2.17.s  00002B33D5BFD370  Unknown            Unknown  Unknown
>>>> libpthread-2.17.s  00002B33D5BF96D3  pthread_cond_wait  Unknown  Unknown
>>>> libiomp5.so        00002B33DBA14E07  Unknown            Unknown  Unknown
>>>> libiomp5.so        00002B33DB98810C  Unknown            Unknown  Unknown
>>>> libiomp5.so        00002B33DB990578  Unknown            Unknown  Unknown
>>>> libiomp5.so        00002B33DB9D9659  Unknown            Unknown  Unknown
>>>> libiomp5.so        00002B33DB9D8C39  Unknown            Unknown  Unknown
>>>> libiomp5.so        00002B33DB993BCE  __kmpc_fork_call   Unknown  Unknown
>>>> PIC_3D             00000000004071C0  Unknown            Unknown  Unknown
>>>> PIC_3D             0000000000490299  Unknown            Unknown  Unknown
>>>> PIC_3D             0000000000492C17  Unknown            Unknown  Unknown
>>>> PIC_3D             000000000040562E  Unknown            Unknown  Unknown
>>>> libc-2.17.so       00002B33DC5BEB35  __libc_start_main  Unknown  Unknown
>>>> PIC_3D             0000000000405539  Unknown            Unknown  Unknown
>>>>
>>>> Is it possible that I also need to ask for hypre to be compiled with an
>>>> option for 64bit indices?
>>>> Is it possible to specify this inside the PETSc configure?
>>>> Alternatively, is it possible to use a different multigrid PC inside
>>>> PETSc that accepts 64bit indices?
>>>>
>>>
>>> You can use
>>>   -pc_type gamg
>>> All native PETSc implementations support 64bit indices.
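>>>
>>> A quick way to confirm a build really has 64bit indices, as a petsc4py
>>> sketch (assuming petsc4py was built against the same PETSc):
>>>
>>> import numpy, sys, petsc4py
>>> petsc4py.init(sys.argv)
>>> from petsc4py import PETSc
>>>
>>> # itemsize 8 means PetscInt is 64bit; 4 means 32bit.
>>> print(numpy.dtype(PETSc.IntType).itemsize)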
>>>
>>>
>>>>
>>>> Thanks in advance
>>>>
>>>> Pierpaolo
>>>>
>>>>
>>>> On 27 May 2020, at 11:26, Stefano Zampini <stefano.zampini at gmail.com>
>>>> wrote:
>>>>
>>>> You need a version of PETSc compiled with 64bit indices, since the
>>>> message indicates that the number of dofs in this case is larger than
>>>> INT_MAX: 2501×3401×1601 = 13617947501
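>>>>
>>>> You can verify the overflow with one line of Python (INT_MAX for 32bit
>>>> signed integers is 2**31 - 1 = 2147483647):
>>>>
>>>> print(2501 * 3401 * 1601 > 2**31 - 1)   # True: 13617947501 overflows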
>>>>
>>>> I also suggest you upgrade to a newer version; as the error message
>>>> reports, 3.8.3 is quite old.
>>>>
>>>> On Wed, 27 May 2020 at 11:50, Pierpaolo Minelli <
>>>> pierpaolo.minelli at cnr.it> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> I am trying to solve a Poisson equation on this grid:
>>>>>
>>>>> Nx = 2501
>>>>> Ny = 3401
>>>>> Nz = 1601
>>>>>
>>>>> I received this error:
>>>>>
>>>>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>>>> [0]PETSC ERROR: Overflow in integer operation:
>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#64-bit-indices
>>>>> [0]PETSC ERROR: Mesh of 2501 by 3401 by 1 (dof) is too large for 32 bit indices
>>>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>>>> [0]PETSC ERROR: Petsc Release Version 3.8.3, Dec, 09, 2017
>>>>> [0]PETSC ERROR: /marconi_scratch/userexternal/pminelli/PIC3D/2500_3400_1600/./PIC_3D on a arch-linux2-c-opt named r129c09s02 by pminelli Tue May 26 20:16:34 2020
>>>>> [0]PETSC ERROR: Configure options --prefix=/cineca/prod/opt/libraries/petsc/3.8.3/intelmpi--2018--binary CC=mpiicc FC=mpiifort CXX=mpiicpc F77=mpiifort F90=mpiifort --with-debugging=0 --with-blaslapack-dir=/cineca/prod/opt/compilers/intel/pe-xe-2018/binary/mkl --with-fortran=1 --with-fortran-interfaces=1 --with-cmake-dir=/cineca/prod/opt/tools/cmake/3.5.2/none --with-mpi-dir=/cineca/prod/opt/compilers/intel/pe-xe-2018/binary/impi/2018.4.274 --download-scalapack --download-mumps=yes --download-hypre --download-superlu_dist --download-parmetis --download-metis
>>>>> [0]PETSC ERROR: #1 DMSetUp_DA_3D() line 218 in /marconi/prod/build/libraries/petsc/3.8.3/intelmpi--2018--binary/BA_WORK/petsc-3.8.3/src/dm/impls/da/da3.c
>>>>> [0]PETSC ERROR: #2 DMSetUp_DA() line 25 in /marconi/prod/build/libraries/petsc/3.8.3/intelmpi--2018--binary/BA_WORK/petsc-3.8.3/src/dm/impls/da/dareg.c
>>>>> [0]PETSC ERROR: #3 DMSetUp() line 720 in /marconi/prod/build/libraries/petsc/3.8.3/intelmpi--2018--binary/BA_WORK/petsc-3.8.3/src/dm/interface/dm.c
>>>>> forrtl: error (76): Abort trap signal
>>>>>
>>>>>
>>>>> I am on an HPC facility, and after loading the PETSc module, I saw
>>>>> that it is configured with INTEGER size = 32.
>>>>>
>>>>> I solve my problem with these options and it works perfectly with
>>>>> smaller grids:
>>>>>
>>>>> -dm_mat_type hypre -pc_type hypre -pc_hypre_type boomeramg
>>>>> -pc_hypre_boomeramg_relax_type_all SOR/Jacobi
>>>>> -pc_hypre_boomeramg_coarsen_type PMIS -pc_hypre_boomeramg_interp_type FF1
>>>>> -ksp_type richardson
>>>>>
>>>>> Is it possible to overcome this if I ask them to install a version
>>>>> with INTEGER SIZE = 64?
>>>>> Alternatively, is it possible to overcome this using Intel compiler
>>>>> options?
>>>>>
>>>>> Thanks in advance
>>>>>
>>>>> Pierpaolo Minelli
>>>>
>>>>
>>>>
>>>> --
>>>> Stefano
>>>>
>>>>
>>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>
>