[petsc-users] Problem compiling with 64bit PETSc
TAY wee-beng
zonexo at gmail.com
Wed Sep 5 22:56:40 CDT 2018
Hi,
My code has some problems now after converting to 64-bit indices.
After debugging, I realised that I'm using:
call MPI_ALLGATHER(counter,1,MPI_INTEGER,counter_global,1,MPI_INTEGER,MPI_COMM_WORLD,ierr)
but now counter and counter_global are both 64-bit integers. So should I
change MPI_INTEGER to MPI_INTEGER8 in all my MPI calls?
But if I switch back to the 32-bit PETSc, do I have to change them back
again? In that case, does it mean I need to keep 2 copies of my code -
one to compile with 32-bit PETSc, another to compile with 64-bit PETSc?
Is there an easier way?
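A minimal sketch of one portable pattern, assuming your PETSc build exposes
MPIU_INTEGER to Fortran (it is meant to always match PetscInt); the variable
names and values are only illustrative:

      program allgather_petscint
#include <petsc/finclude/petscsys.h>
      use petscsys
      implicit none
      PetscInt              :: counter
      PetscInt, allocatable :: counter_global(:)
      PetscMPIInt           :: rank, nprocs
      PetscErrorCode        :: ierr

      call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
      call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
      call MPI_Comm_size(MPI_COMM_WORLD, nprocs, ierr)
      allocate(counter_global(nprocs))

      counter = rank + 1   ! illustrative local value

      ! MPIU_INTEGER resolves to MPI_INTEGER for a 32-bit-index build and to
      ! MPI_INTEGER8 for a 64-bit-index build, so this call compiles against
      ! either PETSc without editing the datatype.
      call MPI_Allgather(counter, 1, MPIU_INTEGER, counter_global, 1, MPIU_INTEGER, &
                         MPI_COMM_WORLD, ierr)

      call PetscFinalize(ierr)
      end program allgather_petscint

With counter and counter_global declared as PetscInt, switching between the
32-bit and 64-bit PETSc builds should only require recompiling, not keeping
two copies of the source.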
Thank you very much.
Yours sincerely,
================================================
TAY Wee-Beng (Zheng Weiming) 郑伟明
Personal research webpage: http://tayweebeng.wixsite.com/website
Youtube research showcase: https://www.youtube.com/channel/UC72ZHtvQNMpNs2uRTSToiLA
linkedin: www.linkedin.com/in/tay-weebeng
================================================
On 5/9/2018 6:25 PM, Matthew Knepley wrote:
> On Wed, Sep 5, 2018 at 3:27 AM TAY wee-beng <zonexo at gmail.com> wrote:
>
>
> On 31/8/2018 10:43 AM, Smith, Barry F. wrote:
> >
> >> On Aug 30, 2018, at 9:40 PM, TAY wee-beng <zonexo at gmail.com> wrote:
> >>
> >>
> >> On 31/8/2018 10:38 AM, Smith, Barry F. wrote:
> >>> PetscReal is by default real(8); you can leave those alone.
> >>>
> >>> Any integer you pass to a PETSc routine needs to be declared as
> >>> PetscInt (not integer), otherwise the 64-bit indices stuff won't work.
> >>>
> >>> Barry
> >>>
> >> Hi,
> >>
> >> Ok, I got it. Btw, is it advisable to change all the integers in my
> >> code to PetscInt?
> >>
> >> Will it cause any conflict or waste a lot of memory?
> >>
> >> Or should I only change those related to PETSc?
> > That is up to you. Since you probably pass values between the PETSc
> > and non-PETSc parts of the code, it is probably easier just to make
> > all the integers PetscInt instead. There is no performance difference
> > that you can measure from keeping a few integers around.
> >
> > Barry
> Hi,
>
> For some small parts of the code, it is preferable to use integer
> instead. Btw, to force a variable to integer, I can use int(aa).
> However, I tried to force a variable to PetscInt using PetscInt(aa),
> but it doesn't work.
>
> Is there any way I can make it work?
>
>
> I think you just define a PetscInt variable and use assignment.
>
> Matt
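A minimal sketch of that assignment approach (names are illustrative);
there is no PetscInt(aa) conversion function in Fortran, but ordinary
assignment or int() converts between the integer kinds:

      program petscint_assign
#include <petsc/finclude/petscsys.h>
      use petscsys
      implicit none
      integer  :: aa
      PetscInt :: aa_petsc

      aa = 10
      ! Ordinary assignment converts between integer kinds.
      aa_petsc = aa
      ! Going back to a default integer; beware of overflow if the
      ! PetscInt value does not fit in 32 bits.
      aa = int(aa_petsc)
      print *, aa_petsc, aa
      end program petscint_assign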
>
> Thanks.
> >> Thanks!
> >>>> On Aug 30, 2018, at 9:35 PM, TAY wee-beng <zonexo at gmail.com> wrote:
> >>>>
> >>>>
> >>>> On 31/8/2018 10:21 AM, Matthew Knepley wrote:
> >>>>> On Thu, Aug 30, 2018 at 10:17 PM TAY wee-beng <zonexo at gmail.com> wrote:
> >>>>> Hi,
> >>>>>
> >>>>> Due to my increased grid size, I have to go 64-bit. I compiled the
> >>>>> 64-bit PETSc without error. However, when I tried to compile my code
> >>>>> using the 64-bit PETSc, I got the error below. May I know why this is so?
> >>>>>
> >>>>> What changes should I make?
> >>>>>
> >>>>> Is it possible that you did not declare some inputs as PetscInt,
> >>>>> so the interface check is failing?
> >>>>>
> >>>>> Matt
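For reference, a minimal sketch of a DMDACreate3d call in which every
integer argument is a PetscInt, so the generic interface can resolve under
both 32-bit and 64-bit indices (grid size, dof and stencil width are
illustrative; argument list per PETSc 3.9 Fortran usage):

      program dmda3d_example
#include <petsc/finclude/petscdmda.h>
      use petscdmda
      implicit none
      DM             :: da
      PetscErrorCode :: ierr
      PetscInt       :: size_x, size_y, size_z, dof, sw

      call PetscInitialize(PETSC_NULL_CHARACTER, ierr)

      size_x = 64; size_y = 64; size_z = 64   ! illustrative grid size
      dof = 1; sw = 1                         ! one field, stencil width 1

      ! Because size_x, size_y, size_z, dof and sw are PetscInt, the same
      ! call matches the interface for both 32-bit and 64-bit PETSc builds.
      call DMDACreate3d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, &
                        DMDA_STENCIL_STAR, size_x, size_y, size_z, &
                        PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, dof, sw, &
                        PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, da, ierr)
      call DMSetUp(da, ierr)

      call DMDestroy(da, ierr)
      call PetscFinalize(ierr)
      end program dmda3d_example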
> >>>> Hi,
> >>>>
> >>>> I'm using the standard
> >>>>
> >>>> integer ::
> >>>>
> >>>> real(8) ::
> >>>>
> >>>> for some variables. For some others relating to PETSc, I use PetscInt.
> >>>>
> >>>> Should I change all to PetscInt and PetscReal?
> >>>>
> >>>> Currently, I use real(8) for all real values. If I change all to
> >>>> PetscReal, will PetscReal be real or real(8) by default?
> >>>>
> >>>> Thanks!
> >>>>>
> >>>>> [tsltaywb at nus02 ibm3d_IIB_mpi]$ make -f makefile_2018
> >>>>> /app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/bin/mpif90 -g -ip -ipo -O3 -c -fPIC -save kinefunc.F90
> >>>>> /app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/bin/mpif90 -g -ip -ipo -O3 -c -fPIC -save -w -I/home/users/nus/tsltaywb/propeller/lib/petsc-3.9.3_intel_2018_64bit_rel/include -I/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/include global.F90
> >>>>> global.F90(979): error #6285: There is no matching specific subroutine for this generic subroutine call. [DMDACREATE3D]
> >>>>> call DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> >>>>> -----^
> >>>>> global.F90(989): error #6285: There is no matching specific subroutine for this generic subroutine call. [DMDACREATE3D]
> >>>>> call DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> >>>>> ---------^
> >>>>> global.F90(997): error #6285: There is no matching specific subroutine for this generic subroutine call. [DMDACREATE3D]
> >>>>> call DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> >>>>> ---------^
> >>>>> global.F90(1005): error #6285: There is no matching specific subroutine for this generic subroutine call. [DMDACREATE3D]
> >>>>> call DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> >>>>> ---------^
> >>>>> global.F90(1013): error #6285: There is no matching specific subroutine for this generic subroutine call. [DMDACREATE3D]
> >>>>> call DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> >>>>> ---------^
> >>>>> global.F90(1021): error #6285: There is no matching specific subroutine for this generic subroutine call. [DMDACREATE3D]
> >>>>> call DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> >>>>> ---------^
> >>>>> global.F90(1029): error #6285: There is no matching specific subroutine for this generic subroutine call. [DMDACREATE3D]
> >>>>> call DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> >>>>> ---------^
> >>>>> compilation aborted for global.F90 (code 1)
> >>>>>
> >>>>> --
> >>>>> Thank you very much.
> >>>>>
> >>>>> Yours sincerely,
> >>>>>
> >>>>> ================================================
> >>>>> TAY Wee-Beng (Zheng Weiming) 郑伟明
> >>>>> Personal research webpage: http://tayweebeng.wixsite.com/website
> >>>>> Youtube research showcase: https://www.youtube.com/channel/UC72ZHtvQNMpNs2uRTSToiLA
> >>>>> linkedin: www.linkedin.com/in/tay-weebeng
> >>>>> ================================================
> >>>>>
> >>>>>
> >>>>>
> >>>>> --
> >>>>> What most experimenters take for granted before they begin their
> >>>>> experiments is infinitely more interesting than any results to which
> >>>>> their experiments lead.
> >>>>> -- Norbert Wiener
> >>>>>
> >>>>> https://www.cse.buffalo.edu/~knepley/
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/