[petsc-users] Compiling problem after upgrading to PETSc 3.8.3

Smith, Barry F. bsmith at mcs.anl.gov
Thu Feb 22 11:40:05 CST 2018


  First, run under valgrind to look for memory issues.
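
   A typical invocation is sketched below (the process count, log-file name, and executable are placeholders; -malloc off makes PETSc use plain malloc so valgrind's reports are cleaner):

      mpiexec -n 2 valgrind --tool=memcheck -q --num-callers=20 --log-file=valgrind.log.%p ./your_app -malloc off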

   Second, I would change one thing at a time. So use the older Intel (2016) compiler with PETSc 3.8.3, so that the only change is the source updates needed for 3.8.3 and not also a compiler change.

   I am not sure what numbers you are printing below, but changing optimization levels can and often will change numerical values slightly, so a change in the values may not indicate anything is wrong (or it may, depending on how different the values are).

   Barry


> On Feb 21, 2018, at 10:23 PM, TAY wee-beng <zonexo at gmail.com> wrote:
> 
> 
> On 21/2/2018 11:44 AM, Smith, Barry F. wrote:
>>   Did you follow the directions in the changes file for 3.8?
>> 
>> - Replace calls to DMDACreateXd() with DMDACreateXd(), [DMSetFromOptions()] DMSetUp()
>> - DMDACreateXd() can no longer take negative values for dimensions; instead pass positive values and call DMSetFromOptions() immediately after
>> 
>> I suspect you are not calling DMSetUp() and this is causing the problem.
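>> 
>> For reference, a minimal 3.8-style creation sequence is sketched below (the boundary types, stencil, grid sizes, dof, and stencil width are placeholders to adapt to your da_u):
>> 
>>    call DMDACreate3d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, &
>>         DMDA_STENCIL_STAR, size_x, size_y, size_z, PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, &
>>         1, 1, PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, da_u, ierr)
>>    call DMSetFromOptions(da_u, ierr)   ! optional, applies -da_* runtime options
>>    call DMSetUp(da_u, ierr)            ! required in 3.8 before creating vectors from the DM
>>    call DMCreateGlobalVector(da_u, u_global, ierr)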
>> 
>>   Barry
> Oops, sorry, indeed I didn't change that part. Got it compiled now.
> 
> However, I have run into a new problem. Previously, I was using Intel 2016 with PETSc 3.7.6. During compilation, I used -O3 for all modules except one, which gave an error (due to DMDAVecGetArrayF90 and DMDAVecRestoreArrayF90); for that one I had to use -O1.
> 
> Now, I'm using Intel 2018 with PETSc 3.8.3 and I get the error:
> 
> M Diverged but why?, time =            2
>  reason =           -9
> 
> I tried changing all *.F90 from -O3 to -O1 and, although no diverged error is printed, my values are different:
> 
> 1      0.01600000      0.46655767      0.46310378      1.42427154     -0.81598016E+02 -0.11854431E-01  0.42046197E+06
> 2      0.00956350      0.67395693      0.64698638      1.44166606     -0.12828928E+03  0.12179394E-01  0.41961824E+06
> 
> vs
> 
> 1      0.01600000      0.49096543      0.46259333      1.41828130     -0.81561221E+02 -0.16146574E-01  0.42046335E+06
> 2      0.00956310      0.68342495      0.63682485      1.44353571     -0.12813998E+03  0.24226242E+00  0.41962121E+06
> 
> The latter values were obtained using the debug build, and they compare correctly with those from another cluster, which uses GNU.
> 
> What's going on and how should I troubleshoot?
> Thanks
>> 
>> 
>>> On Feb 20, 2018, at 7:35 PM, TAY wee-beng <zonexo at gmail.com> wrote:
>>> 
>>> 
>>> On 21/2/2018 10:47 AM, Smith, Barry F. wrote:
>>>>   Try setting
>>>> 
>>>>   u_global = tVec(1)
>>>> 
>>>>   immediately before the call to DMCreateGlobalVector()
>>>> 
>>>> 
>>> Hi,
>>> 
>>> I added the line in but still got the same error below. Btw, my code is organised as:
>>> 
>>> module global_data
>>> 
>>> #include "petsc/finclude/petsc.h"
>>> use petsc
>>> use kdtree2_module
>>> implicit none
>>> save
>>> ...
>>> Vec u_local,u_global ...
>>> ...
>>> contains
>>> 
>>> subroutine allo_var
>>> ...
>>> u_global = tVec(1)
>>> call DMCreateGlobalVector(da_u,u_global,ierr)
>>> ...
>>> 
>>> 
>>> 
>>> 
>>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>> [0]PETSC ERROR: Null argument, when expecting valid pointer
>>> [0]PETSC ERROR: Null Object: Parameter # 2
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>> [0]PETSC ERROR: Petsc Release Version 3.8.3, Dec, 09, 2017
>>> [0]PETSC ERROR: C:\Obj_tmp\ibm3d_IIB_mpi\Debug\ibm3d_IIB_mpi.exe on a petsc-3.8.3_win64_msmpi_vs2008 named 1C3YYY1-PC by tsltaywb Wed Feb 21 11:18:20 2018
>>> [0]PETSC ERROR: Configure options --with-cc="win32fe icl" --with-fc="win32fe ifort" --with-cxx="win32fe icl" --download-fblaslapack --with-mpi-include="[/cygdrive/c/Program Files (x86)/Microsoft SDKs/MPI/Include,/cygdrive/c/Program Files (x86)/Microsoft SDKs/MPI/Include/x64]" --with-mpi-mpiexec="/cygdrive/c/Program Files/Microsoft MPI/Bin/mpiexec.exe" --with-debugging=1 --with-file-create-pause=1 --prefix=/cygdrive/c/wtay/Lib/petsc-3.8.3_win64_msmpi_vs2008 --with-mpi-lib="[/cygdrive/c/Program Files (x86)/Microsoft SDKs/MPI/Lib/x64/msmpifec.lib,/cygdrive/c/Program Files (x86)/Microsoft SDKs/MPI/Lib/x64/msmpi.lib]" --with-shared-libraries=0
>>> [0]PETSC ERROR: #1 VecSetLocalToGlobalMapping() line 78 in C:\Source\PETSC-~2.3\src\vec\vec\INTERF~1\vector.c
>>> [0]PETSC ERROR: #2 DMCreateGlobalVector_DA() line 41 in C:\Source\PETSC-~2.3\src\dm\impls\da\dadist.c
>>> [0]PETSC ERROR: #3 DMCreateGlobalVector() line 844 in C:\Source\PETSC-~2.3\src\dm\INTERF~1\dm.c
>>> 
>>> Thanks.
>>>>> On Feb 20, 2018, at 6:40 PM, TAY wee-beng <zonexo at gmail.com> wrote:
>>>>> 
>>>>> Hi,
>>>>> 
>>>>> Indeed, replacing tvec with t_vec solves the problem. Now I'm trying to debug step by step. I ran into a problem when calling:
>>>>> 
>>>>> call DMCreateGlobalVector(da_u,u_global,ierr)
>>>>> 
>>>>> The error is:
>>>>> 
>>>>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>>>> [0]PETSC ERROR: Null argument, when expecting valid pointer
>>>>> [0]PETSC ERROR: Null Object: Parameter # 2
>>>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>>>> [0]PETSC ERROR: Petsc Release Version 3.8.3, Dec, 09, 2017
>>>>> [0]PETSC ERROR: C:\Obj_tmp\ibm3d_IIB_mpi\Debug\ibm3d_IIB_mpi.exe on a petsc-3.8.3_win64_msmpi_vs2008 named 1C3YYY1-PC by tsltaywb Wed Feb 21 10:20:20 2018
>>>>> [0]PETSC ERROR: Configure options --with-cc="win32fe icl" --with-fc="win32fe ifort" --with-cxx="win32fe icl" --download-fblaslapack --with-mpi-include="[/cygdrive/c/Program Files (x86)/Microsoft SDKs/MPI/Include,/cygdrive/c/Program Files (x....
>>>>> 
>>>>> But all I changed is from:
>>>>> 
>>>>> module global_data
>>>>> #include "petsc/finclude/petsc.h"
>>>>> use petsc
>>>>> use kdtree2_module
>>>>> implicit none
>>>>> save
>>>>> !grid variables
>>>>> 
>>>>> integer :: size_x,s....
>>>>> 
>>>>> ...
>>>>> 
>>>>> to
>>>>> 
>>>>> module global_data
>>>>> use kdtree2_module
>>>>> implicit none
>>>>> save
>>>>> #include "petsc/finclude/petsc.h90"
>>>>> !grid variables
>>>>> integer :: size_x,s...
>>>>> 
>>>>> ...
>>>>> 
>>>>> da_u, u_global were declared thru:
>>>>> 
>>>>> DM  da_u,da_v,...
>>>>> DM  da_cu_types ...
>>>>> Vec u_local,u_global,v_local...
>>>>> 
>>>>> So what could be the problem?
>>>>> 
>>>>> 
>>>>> Thank you very much.
>>>>> 
>>>>> Yours sincerely,
>>>>> 
>>>>> ================================================
>>>>> TAY Wee-Beng (Zheng Weiming) 郑伟明
>>>>> Personal research webpage: http://tayweebeng.wixsite.com/website
>>>>> Youtube research showcase: https://www.youtube.com/channel/UC72ZHtvQNMpNs2uRTSToiLA
>>>>> linkedin: www.linkedin.com/in/tay-weebeng
>>>>> ================================================
>>>>> 
>>>>> On 20/2/2018 10:46 PM, Jose E. Roman wrote:
>>>>>> Probably the first error is produced by using a variable (mpi_comm) with the same name as an MPI type.
>>>>>> 
>>>>>> The second error, I guess, is due to the variable tvec, since a Fortran type tVec is now defined in src/vec/f90-mod/petscvec.h
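>>>>>> 
>>>>>> A minimal sketch of the workaround is to rename the clashing variables; the new names below are arbitrary, anything not exported by "use petsc" will do:
>>>>>> 
>>>>>>    integer(8) :: hypre_comm              ! renamed from mpi_comm, which now clashes
>>>>>>    hypre_comm = MPI_COMM_WORLD
>>>>>>    call HYPRE_StructGridCreate(hypre_comm, 3, grid_hypre, ierr)
>>>>>> 
>>>>>> Likewise, renaming the local tvec (e.g. to t_vec) in math_routine.F90 should avoid the clash with the tVec type.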
>>>>>> 
>>>>>> Jose
>>>>>> 
>>>>>> 
>>>>>>> El 20 feb 2018, a las 15:35, Smith, Barry F. <bsmith at mcs.anl.gov> escribió:
>>>>>>> 
>>>>>>> 
>>>>>>>   Please run a clean compile of everything and cut and paste all the output. This will make it much easier to debug than trying to understand your snippets of what is going wrong.
>>>>>>> 
>>>>>>>> On Feb 20, 2018, at 1:56 AM, TAY Wee Beng <tsltaywb at nus.edu.sg> wrote:
>>>>>>>> 
>>>>>>>> Hi,
>>>>>>>> 
>>>>>>>> I was previously using PETSc 3.7.6 on different clusters with both Intel
>>>>>>>> Fortran and GNU Fortran. After upgrading, I ran into some problems when
>>>>>>>> trying to compile:
>>>>>>>> 
>>>>>>>> On Intel Fortran:
>>>>>>>> 
>>>>>>>> Previously, I was using:
>>>>>>>> 
>>>>>>>> #include "petsc/finclude/petsc.h90"
>>>>>>>> 
>>>>>>>> in *.F90 files that require the use of PETSc.
>>>>>>>> 
>>>>>>>> I read in the change log that the h90 include is no longer there, so I
>>>>>>>> replaced it with #include "petsc/finclude/petsc.h"
>>>>>>>> 
>>>>>>>> It worked. But I also have some *.F90 files which do not use PETSc directly;
>>>>>>>> however, they use some modules which use PETSc.
>>>>>>>> 
>>>>>>>> Now I can't compile them. The error is :
>>>>>>>> 
>>>>>>>> math_routine.f90(3): error #7002: Error in opening the compiled module file.  Check INCLUDE paths.   [PETSC]
>>>>>>>> use mpi_subroutines
>>>>>>>> 
>>>>>>>> mpi_subroutines is a module which uses PETSc, and it compiled w/o problem.
>>>>>>>> 
>>>>>>>> The solution is that I have to compile e.g. math_routine.F90 as if it
>>>>>>>> used PETSc, by including the PETSc include and lib files.
>>>>>>>> 
>>>>>>>> May I know why this is so? It was not necessary before.
>>>>>>>> 
>>>>>>>> Anyway, it managed to compile until it reached hypre.F90.
>>>>>>>> 
>>>>>>>> Previously, due to some bugs, I had to compile hypre with the -r8
>>>>>>>> option. Also, I had to use:
>>>>>>>> 
>>>>>>>> integer(8) mpi_comm
>>>>>>>> 
>>>>>>>> mpi_comm = MPI_COMM_WORLD
>>>>>>>> 
>>>>>>>> to make my codes work with HYPRE.
>>>>>>>> 
>>>>>>>> But now, compiling gives the error:
>>>>>>>> 
>>>>>>>> hypre.F90(11): error #6401: The attributes of this name conflict with those made accessible by a USE statement.   [MPI_COMM]
>>>>>>>> integer(8) mpi_comm
>>>>>>>> --------------------------------------^
>>>>>>>> hypre.F90(84): error #6478: A type-name must not be used as a variable.   [MPI_COMM]
>>>>>>>>    mpi_comm = MPI_COMM_WORLD
>>>>>>>> ----^
>>>>>>>> hypre.F90(84): error #6303: The assignment operation or the binary expression operation is invalid for the data types of the two operands.   [1140850688]
>>>>>>>>    mpi_comm = MPI_COMM_WORLD
>>>>>>>> ---------------^
>>>>>>>> hypre.F90(100): error #6478: A type-name must not be used as a variable.   [MPI_COMM]
>>>>>>>>        call HYPRE_StructGridCreate(mpi_comm, 3, grid_hypre, ierr)
>>>>>>>> ...
>>>>>>>> 
>>>>>>>> What's actually happening? Why can't I compile now?
>>>>>>>> 
>>>>>>>> On GNU gfortran:
>>>>>>>> 
>>>>>>>> I tried the same approach as above here. However, when compiling
>>>>>>>> math_routine.F90, I got the error:
>>>>>>>> 
>>>>>>>> math_routine.F90:1333:21:
>>>>>>>> 
>>>>>>>> call subb(orig,vert1,tvec)
>>>>>>>>                    1
>>>>>>>> Error: Invalid procedure argument at (1)
>>>>>>>> math_routine.F90:1339:18:
>>>>>>>> 
>>>>>>>> qvec = cross_pdt2(tvec,edge1)
>>>>>>>>                 1
>>>>>>>> Error: Invalid procedure argument at (1)
>>>>>>>> math_routine.F90:1345:21:
>>>>>>>> 
>>>>>>>>    uu = dot_product(tvec,pvec)
>>>>>>>>                    1
>>>>>>>> Error: ‘vector_a’ argument of ‘dot_product’ intrinsic at (1) must be numeric or LOGICAL
>>>>>>>> math_routine.F90:1371:21:
>>>>>>>> 
>>>>>>>>    uu = dot_product(tvec,pvec)
>>>>>>>> 
>>>>>>>> These errors were not present before. My variables are mostly vectors:
>>>>>>>> 
>>>>>>>> real(8), intent(in) :: orig(3),infinity(3),vert1(3),vert2(3),vert3(3),normal(3)
>>>>>>>> 
>>>>>>>> real(8) :: uu,vv,dir(3)
>>>>>>>> 
>>>>>>>> real(8) :: edge1(3),edge2(3),tvec(3),pvec(3),qvec(3),det,inv_det,epsilon,d,t
>>>>>>>> 
>>>>>>>> I wonder what happened?
>>>>>>>> 
>>>>>>>> Please advise.
>>>>>>>> 
>>>>>>>> 
>>>>>>>> --
>>>>>>>> Thank you very much.
>>>>>>>> 
>>>>>>>> Yours sincerely,
>>>>>>>> 
>>>>>>>> ================================================
>>>>>>>> TAY Wee-Beng 郑伟明
>>>>>>>> Research Scientist
>>>>>>>> Experimental AeroScience Group
>>>>>>>> Temasek Laboratories
>>>>>>>> National University of Singapore
>>>>>>>> T-Lab Building
>>>>>>>> 5A, Engineering Drive 1, #02-02
>>>>>>>> Singapore 117411
>>>>>>>> Phone: +65 65167330
>>>>>>>> E-mail: tsltaywb at nus.edu.sg
>>>>>>>> http://www.temasek-labs.nus.edu.sg/program/program_aeroexperimental_tsltaywb.php
>>>>>>>> Personal research webpage: http://tayweebeng.wixsite.com/website
>>>>>>>> Youtube research showcase: https://www.youtube.com/channel/UC72ZHtvQNMpNs2uRTSToiLA
>>>>>>>> linkedin: www.linkedin.com/in/tay-weebeng
>>>>>>>> ================================================
>>>>>>>> 
>>>>>>>> 
> 


