[petsc-users] build PETSc with 64-bit indices
hexioafeng
hexiaofeng at buaa.edu.cn
Wed Apr 30 03:11:38 CDT 2025
I found that the preallocation check for dense matrices was removed in v3.16.3. I will try updating to that version first.
Thanks.
Xiaofeng
> On Apr 30, 2025, at 15:35, Pierre Jolivet <pierre at joliv.et> wrote:
>
>
>
>> On 30 Apr 2025, at 9:31 AM, hexioafeng <hexiaofeng at buaa.edu.cn> wrote:
>>
>> Dear sir,
>>
>> I ran the case with -bv_type mat again, and got a similar error:
>>
>>
>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>> [0]PETSC ERROR: No support for this operation for this object type
>> [0]PETSC ERROR: Product of two integer 4633044 1925 overflow, you must ./configure PETSc with --with-64-bit-indices for the case you are running
>> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>> [0]PETSC ERROR: Petsc Release Version 3.14.3, Jan 09, 2021
>> [0]PETSC ERROR: Unknown Name on a named DESKTOP-74R6I4M by ibe Wed Apr 30 15:08:43 2025
>> [0]PETSC ERROR: Configure options --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpifort --with-scalar-type=real --with-precision=double --prefix=/home/sd/petsc/petsc-3.14.3/..//rd CXXFLAGS=-fno-stack-protector CFLAGS=-fno-stack-protector FFLAGS=" -O2 -fallow-argument-mismatch -fallow-invalid-boz" --with-debugging=0 COPTFLAGS="-O3 -mtune=generic" CXXOPTFLAGS="-O3 -mtune=generic" FOPTFLAGS="-O3 -mtune=generic" --known-64-bit-blas-indices=0 --with-cxx-dialect=C++11 --with-ssl=0 --with-x=0 --with-fortran-bindings=0 --with-cudac=0 --with-shared-libraries=0 --with-mpi-lib=/c/Windows/System32/msmpi.dll --with-mpi-include=/home/sd/petsc/thirdparty/MPI/Include --with-mpiexec="/C/Program Files/Microsoft MPI/Bin/mpiexec" --with-blaslapack-lib="-L/home/sd/petsc/thirdparty/openblas/lib -llibopenblas -lopenblas" --with-metis-include=/home/sd/petsc/scalapack-mumps-dll/metis --with-metis-lib=/home/sd/petsc/scalapack-mumps-dll/metis/libmetis.dll --with-parmetis-include=/home/sd/petsc/scalapack-mumps-dll/metis --with-parmetis-lib=/home/sd/petsc/scalapack-mumps-dll/metis/libparmetis.dll --download-slepc --with-scalapack-lib=/home/sd/petsc/scalapack-mumps-dll/scalapack/libscalapack.dll --download-hypre --download-mumps --download-hypre-configure-arguments="--build=x86_64-linux-gnu --host=x86_64-linux-gnu" PETSC_ARCH=rd
>> [0]PETSC ERROR: #1 PetscIntMultError() line 2309 in C:/msys64/home/sd/petsc/petsc-3.14.3/include/petscsys.h
>> [0]PETSC ERROR: #2 MatSeqDenseSetPreallocation_SeqDense() line 2785 in C:/msys64/home/sd/petsc/petsc-3.14.3/src/mat/impls/dense/seq/dense.
>> [0]PETSC ERROR: #3 MatSeqDenseSetPreallocation() line 2767 in C:/msys64/home/sd/petsc/petsc-3.14.3/src/mat/impls/dense/seq/dense.c
>> [0]PETSC ERROR: #4 MatCreateDense() line 2416 in C:/msys64/home/sd/petsc/petsc-3.14.3/src/mat/impls/dense/mpi/mpidense.c
>> [0]PETSC ERROR: #5 BVCreate_Mat() line 455 in C:/msys64/home/sd/petsc/petsc-3.14.3/rd/externalpackages/git.slepc/src/sys/classes/bv/impls/mat/bvmat.c
>> [0]PETSC ERROR: #6 BVSetSizesFromVec() line 186 in C:/msys64/home/sd/petsc/petsc-3.14.3/rd/externalpackages/git.slepc/src/sys/classes/bv/interface/bvbasic.c
>> [0]PETSC ERROR: #7 EPSAllocateSolution() line 687 in C:/msys64/home/sd/petsc/petsc-3.14.3/rd/externalpackages/git.slepc/src/eps/interface/epssetup.c
>> [0]PETSC ERROR: #8 EPSSetUp_KrylovSchur() line 159 in C:/msys64/home/sd/petsc/petsc-3.14.3/rd/externalpackages/git.slepc/src/eps/impls/krylov/krylovschur/krylovschur.c
>> [0]PETSC ERROR: #9 EPSSetUp() line 315 in C:/msys64/home/sd/petsc/petsc-3.14.3/rd/externalpackages/git.slepc/src/eps/interface/epssetup.c
>>
>>
>>
>> Does it mean that I have to run with 64-bit PETSc/SLEPc?
>
> Please always keep the list in CC.
> I’d rather update PETSc/SLEPc than just reconfigure with 64-bit PetscInt.
> This (preallocation error) has been fixed in PETSc.
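> (For reference, the failing product in the message is 1925 * 4633044 = 8,918,609,700, which exceeds 2^31 - 1 = 2,147,483,647, the largest value a 32-bit PetscInt can hold; hence the overflow check fires.)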
>
> Thanks,
> Pierre
>
>> Thanks.
>> Xiaofeng
>>
>>
>>
>>> On Apr 30, 2025, at 14:10, hexioafeng <hexiaofeng at buaa.edu.cn> wrote:
>>>
>>> Thank you, sir. I will try it.
>>>
>>>
>>> Sincerely,
>>> Xiaofeng
>>>
>>>
>>>
>>>> On Apr 30, 2025, at 14:01, Pierre Jolivet <pierre at joliv.et> wrote:
>>>>
>>>> Just use -bv_type mat and the error will go away.
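>>>> If you prefer to set this from code instead of the command line, a minimal sketch (assuming eps is your already-created EPS object and ierr a PetscErrorCode) would be:
>>>>
>>>>   BV bv;
>>>>   ierr = EPSGetBV(eps, &bv);CHKERRQ(ierr);   /* get the BV (basis vectors) object owned by the EPS */
>>>>   ierr = BVSetType(bv, BVMAT);CHKERRQ(ierr); /* store the basis as a dense Mat rather than one contiguous Vec */
>>>>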
>>>> Note: you are highly advised to update to a new PETSc/SLEPc version.
>>>>
>>>> Thanks,
>>>> Pierre
>>>>
>>>>> On 30 Apr 2025, at 7:58 AM, hexioafeng <hexiaofeng at buaa.edu.cn> wrote:
>>>>>
>>>>> Dear Sir,
>>>>>
>>>>> Thank you for your kind reply. Below is the full backtrace:
>>>>>
>>>>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>>>> [0]PETSC ERROR: No support for this operation for this object type
>>>>> [0]PETSC ERROR: Product of two integer 1925 4633044 overflow, you must ./configure PETSc with --with-64-bit-indices for the case you are running
>>>>> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>>>> [0]PETSC ERROR: Petsc Release Version 3.14.3, Jan 09, 2021
>>>>> [0]PETSC ERROR: Unknown Name on a named DESKTOP-74R6I4M by ibe Sun Apr 27 16:10:08 2025
>>>>> [0]PETSC ERROR: Configure options --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpifort --with-scalar-type=real --with-precision=double --prefix=/home/sd/petsc/petsc-3.14.3/..//rd CXXFLAGS=-fno-stack-protector CFLAGS=-fno-stack-protector FFLAGS=" -O2 -fallow-argument-mismatch -fallow-invalid-boz" --with-debugging=0 COPTFLAGS="-O3 -mtune=generic" CXXOPTFLAGS="-O3 -mtune=generic" FOPTFLAGS="-O3 -mtune=generic" --known-64-bit-blas-indices=0 --with-cxx-dialect=C++11 --with-ssl=0 --with-x=0 --with-fortran-bindings=0 --with-cudac=0 --with-shared-libraries=0 --with-mpi-lib=/c/Windows/System32/msmpi.dll --with-mpi-include=/home/sd/petsc/thirdparty/MPI/Include --with-mpiexec="/C/Program Files/Microsoft MPI/Bin/mpiexec" --with-blaslapack-lib="-L/home/sd/petsc/thirdparty/openblas/lib -llibopenblas -lopenblas" --with-metis-include=/home/sd/petsc/scalapack-mumps-dll/metis --with-metis-lib=/home/sd/petsc/scalapack-mumps-dll/metis/libmetis.dll --with-parmetis-include=/home/sd/petsc/scalapack-mumps-dll/metis --with-parmetis-lib=/home/sd/petsc/scalapack-mumps-dll/metis/libparmetis.dll --download-slepc --with-scalapack-lib=/home/sd/petsc/scalapack-mumps-dll/scalapack/libscalapack.dll --download-hypre --download-mumps --download-hypre-configure-arguments="--build=x86_64-linux-gnu --host=x86_64-linux-gnu" PETSC_ARCH=rd
>>>>> [0]PETSC ERROR: #1 PetscIntMultError() line 2309 in C:/msys64/home/sd/petsc/rd/include/petscsys.h
>>>>> [0]PETSC ERROR: #2 BVCreate_Svec() line 452 in C:/msys64/home/sd/petsc/petsc-3.14.3/rd/externalpackages/git.slepc/src/sys/classes/bv/impls/svec/svec.c
>>>>> [0]PETSC ERROR: #3 BVSetSizesFromVec() line 186 in C:/msys64/home/sd/petsc/petsc-3.14.3/rd/externalpackages/git.slepc/src/sys/classes/bv/interface/bvbasic.c
>>>>> [0]PETSC ERROR: #4 EPSAllocateSolution() line 687 in C:/msys64/home/sd/petsc/petsc-3.14.3/rd/externalpackages/git.slepc/src/eps/interface/epssetup.c
>>>>> [0]PETSC ERROR: #5 EPSSetUp_KrylovSchur() line 159 in C:/msys64/home/sd/petsc/petsc-3.14.3/rd/externalpackages/git.slepc/src/eps/impls/krylov/krylovschur/krylovschur.c
>>>>> [0]PETSC ERROR: #6 EPSSetUp() line 315 in C:/msys64/home/sd/petsc/petsc-3.14.3/rd/externalpackages/git.slepc/src/eps/interface/epssetup.c
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> Sincerely,
>>>>>
>>>>> Xiaofeng
>>>>>
>>>>>
>>>>>
>>>>>> On Apr 30, 2025, at 13:08, Pierre Jolivet <pierre at joliv.et> wrote:
>>>>>>
>>>>>> Could you please provide the full back trace?
>>>>>> Depending on your set of options, it may be as simple as switching -bv_type to make your code run (if you are using svec, this would explain such an error but could be circumvented with something else, like mat).
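>>>>>> For example (the executable name here is only illustrative):
>>>>>>
>>>>>>   mpiexec -n 4 ./my_eps_solver -bv_type mat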
>>>>>>
>>>>>> Thanks,
>>>>>> Pierre
>>>>>>
>>>>>>> On 30 Apr 2025, at 6:27 AM, Satish Balay <balay.anl at fastmail.org> wrote:
>>>>>>>
>>>>>>>> On Wed, 30 Apr 2025, hexioafeng via petsc-users wrote:
>>>>>>>
>>>>>>>> Dear PETSc developers,
>>>>>>>>
>>>>>>>> I use PETSc and SLEPc to solve generalized eigenproblems. When solving an eigenproblem over an interval with a matrix of size about 5 million, I got the error message: "product of two integer xx xx overflow, you must ./configure PETSc with --with-64-bit-indices for the case you are running".
>>>>>>>>
>>>>>>>> I use some prebuilt third-party packages when building PETSc, namely OpenBLAS, METIS, ParMETIS, and ScaLAPACK. I wonder whether I should also use 64-bit prebuilt packages when configuring PETSc with the --with-64-bit-indices flag? What about MUMPS and MPI? Do I have to use 64-bit versions of those as well?
>>>>>>>
>>>>>>> Hm - METIS/ParMETIS would need a rebuild [with the -DMETIS_USE_LONGINDEX=1 option]. The others should be unaffected.
>>>>>>>
>>>>>>> You could also let PETSc's configure build these packages to ensure compatibility, i.e. use --download-metis --download-parmetis etc.
>>>>>>>
>>>>>>> Note - there is a difference between --with-64-bit-indices (PetscInt) and --with-64-bit-blas-indices (PetscBlasInt) [and ILP64 - aka the Fortran '-i8' model]
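>>>>>>> A minimal configure sketch along those lines (options are illustrative, adapt to your toolchain):
>>>>>>>
>>>>>>>   ./configure --with-64-bit-indices \
>>>>>>>     --download-metis --download-parmetis \
>>>>>>>     --download-mumps --download-scalapack --download-slepc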
>>>>>>>
>>>>>>> Satish
>>>>>>>
>>>>>>> ----
>>>>>>>
>>>>>>> $ grep defaultIndexSize config/BuildSystem/config/packages/*.py
>>>>>>> config/BuildSystem/config/packages/hypre.py: if self.defaultIndexSize == 64:
>>>>>>> config/BuildSystem/config/packages/metis.py: if self.defaultIndexSize == 64:
>>>>>>> config/BuildSystem/config/packages/mkl_cpardiso.py: elif self.blasLapack.has64bitindices and not self.defaultIndexSize == 64:
>>>>>>> config/BuildSystem/config/packages/mkl_cpardiso.py: elif not self.blasLapack.has64bitindices and self.defaultIndexSize == 64:
>>>>>>> config/BuildSystem/config/packages/mkl_pardiso.py: elif self.blasLapack.has64bitindices and not self.defaultIndexSize == 64:
>>>>>>> config/BuildSystem/config/packages/mkl_sparse_optimize.py: if not self.blasLapack.mkl or (not self.blasLapack.has64bitindices and self.defaultIndexSize == 64):
>>>>>>> config/BuildSystem/config/packages/mkl_sparse.py: if not self.blasLapack.mkl or (not self.blasLapack.has64bitindices and self.defaultIndexSize == 64):
>>>>>>> config/BuildSystem/config/packages/SuperLU_DIST.py: if self.defaultIndexSize == 64:
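>>>>>>> [i.e. these are the packages whose PETSc build scripts automatically adjust their index size when --with-64-bit-indices is in effect]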
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>>
>>>>>>>> Looking forward to your reply, thanks.
>>>>>>>>
>>>>>>>> Xiaofeng
>>>>>>>
>>>>>
>>>
>>