problem with put_att (maximum number of attributes)
Wei-keng Liao
wkliao at ece.northwestern.edu
Tue Feb 24 10:16:42 CST 2009
Hi, Lex,
Can you try the attached program? I removed the serial netCDF part.
Since I don't have any problem running it, I suggest you run mpiexec
in debug mode and find the exact error location in PnetCDF.
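
(With MPICH, one common way to do this, assuming an X display is
available, is

    mpiexec -n 4 xterm -e gdb ./mpi_setup_netCDFout

then type "run" in each gdb window and inspect the backtrace once the
job aborts.)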
Please let us know.
Wei-keng
-------------- next part --------------
A non-text attachment was scrubbed...
Name: mpi_setup_netCDFout.F90
Type: application/octet-stream
Size: 11294 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/parallel-netcdf/attachments/20090224/90a60e49/attachment-0001.obj>
-------------- next part --------------
On Feb 24, 2009, at 3:33 AM, Alex Hoffmann wrote:
> Dear Wei-keng,
>
> The following are the MPI errors I get, even if I subdivide the line
> (surely it can't be necessary to divide lines, though?). I never had
> any compilation problems...
>
> The job aborts at the nfmpi_put_vara_real_all() call, so my serial
> netCDF subfiles are still assigned and the parallel netCDF file still
> created.
>
> Cheers,
>
> Lex
>
> aborting job:
> Fatal error in MPI_Waitall: Other MPI error, error stack:
> MPI_Waitall(273): MPI_Waitall(count=5, req_array=0xa11220, status_array=0xa1dc70) failed
> MPIDI_CH3_Progress_wait(207): an error occurred while handling an event returned by MPIDU_Sock_Wait()
> MPIDI_CH3I_Progress_handle_sock_event(540): ch3|sock|immedread 0xa07e50 0xa116e0 0xa0cd70
> MPIDU_Sock_readv(331): the supplied buffer contains invalid memory (set=0,sock=3,errno=14:Bad address)
> aborting job:
> Fatal error in MPI_Alltoall: Other MPI error, error stack:
> MPI_Alltoall(814): MPI_Alltoall(sbuf=0xa1b180, scount=1, MPI_INT, rbuf=0xa1b160, rcount=1, MPI_INT, comm=0x84000001) failed
> MPI_Waitall(273): MPI_Waitall(count=8, req_array=0xa1b240, status_array=0xa1b270) failed
> MPIDI_CH3_Progress_wait(207): an error occurred while handling an event returned by MPIDU_Sock_Wait()
> MPIDI_CH3I_Progress_handle_sock_event(492):
> connection_recv_fail(1728):
> MPIDU_Socki_handle_read(590): connection closed by peer (set=0,sock=1)
> aborting job:
> Fatal error in MPI_Allreduce: Other MPI error, error stack:
> MPI_Allreduce(1058): MPI_Allreduce(sbuf=0x7fff2f18140c, rbuf=0x7fff2f181408, count=1, MPI_INT, MPI_MAX, comm=0x84000000) failed
> MPIR_Allreduce(317):
> MPIC_Sendrecv(161):
> MPIC_Wait(308):
> MPIDI_CH3_Progress_wait(207): an error occurred while handling an event returned by MPIDU_Sock_Wait()
> MPIDI_CH3I_Progress_handle_sock_event(492):
> connection_recv_fail(1728):
> MPIDU_Socki_handle_read(614): connection failure (set=0,sock=2,errno=104:Connection reset by peer)
> rank 0 in job 37 Herzog131_50855 caused collective abort of all ranks
> exit status of rank 0: return code 13
>
>
> Wei-keng Liao wrote:
>> Hi, Lex,
>>
>> I got a compile error of "Syntax error in array constructor" from
>> the line calling nfmpi_put_vara_real_all(), presumably because the
>> single line exceeded the 132-character limit on free-form Fortran
>> source lines. But after splitting this line with continuations:
>>
>>    call handle_err_nc_p(nfmpi_put_vara_real_all(NCIDP, array_varIDP, &
>>         (/istart,jstart/), (/nx_mpi,ny_mpi/),                        &
>>         reshape(array,(/nx_mpi,ny_mpi/))), 10)
>>
>> everything compiles and runs OK.
>> Please tell us what MPI errors you have got.
>>
>> The following is from the command-line output and ncmpidump:
>>
>> ----------------------------------------------------------------------------------------
>>
>> % mpiexec -n 4 mpi_setup_netCDFout
>> Myrank 0: 0
>> MPI initialized!
>> Number of processes [nprocs]: 4
>> Coords of process 0 : 0 0
>> whole 4*4 array on each process:
>> 11.00000 12.00000 13.00000 14.00000
>> 21.00000 22.00000 23.00000 24.00000
>> 31.00000 32.00000 33.00000 34.00000
>> 41.00000 42.00000 43.00000 44.00000
>>
>> subgrid array on process with coords 0 0 :
>> shape: 2 2
>> 11.00000 12.00000
>> 21.00000 22.00000
>>
>>
>> array in nc_out_p subroutine
>> 11.00000 21.00000 12.00000 22.00000
>>
>> (/nx,ny/) 2 2
>> (/istart,jstart/) 1 1
>>
>> pnetCDF var dimensions in xy
>> 4 4
>> ntx_mpi 4
>> ntx 4
>> nx_mpi 2
>> nx 2
>>
>>
>> -------------------------------------------------------------------------------------------------------
>>
>> % ncmpidump out_glb.nc
>> netcdf out_glb {
>> dimensions:
>> x = 4 ;
>> y = 4 ;
>> variables:
>> float x(x) ;
>> float y(y) ;
>> float array(y, x) ;
>> data:
>>
>> x = 0, 0, 0, 0 ;
>>
>> y = 0, 0, 0, 0 ;
>>
>> array =
>> 11, 21, 31, 41,
>> 12, 22, 32, 42,
>> 13, 23, 33, 43,
>> 14, 24, 34, 44 ;
>> }
>>
>> --------------------------------------------------------------------------------------------------------
>>
>>
>> On Feb 23, 2009, at 5:57 AM, Alex Hoffmann wrote:
>>
>>> Dear Wei-keng,
>>>
>>> below is the script section in collective data mode:
>>>
>>> -------
>>> ! this should have been possible in collective data mode, but did
>>> ! not seem to work !!!
>>> ! call handle_err_nc_p(nfmpi_begin_indep_data(NCIDP),9)
>>>
>>> ! call handle_err_nc_p(nfmpi_put_vara_real(NCIDP, array_varIDP, &
>>> !      (/istart,jstart/), (/nx_mpi,ny_mpi/),                    &
>>> !      reshape(array,(/nx_mpi,ny_mpi/))), 10)
>>>
>>> call handle_err_nc_p(nfmpi_put_vara_real_all(NCIDP, array_varIDP, &
>>>      (/istart,jstart/), (/nx_mpi,ny_mpi/),                       &
>>>      reshape(array,(/nx_mpi,ny_mpi/))), 10)
>>>
>>>
>>> ! call handle_err_nc_p(nfmpi_end_indep_data(NCIDP),11)
>>>
>>> call handle_err_nc_p(nfmpi_close(NCIDP),12)
>>> -------
>>>
>>> Basically, I replaced the independent call with the collective call
>>> for all processes, and commented out the calls into/out of
>>> independent data mode. This produces MPI error messages...
>>>
>>> Cheers,
>>>
>>> Lex
>>>
>>> Wei-keng Liao wrote:
>>>> Hi, Lex,
>>>>
>>>> Have you tried nfmpi_put_vara_real_all() in collective mode?
>>>> Collective mode is the default mode.
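>>>>
>>>> As a minimal sketch of the difference (argument names assumed, not
>>>> taken from your code):
>>>>
>>>>    ! collective: every process in the communicator must make the call
>>>>    call handle_err_nc_p(nfmpi_put_vara_real_all(ncid, varid, &
>>>>         start, count, buf), 10)
>>>>
>>>>    ! independent: bracketed by begin/end_indep_data; only the ranks
>>>>    ! that actually have data to write need to call the put
>>>>    call handle_err_nc_p(nfmpi_begin_indep_data(ncid), 9)
>>>>    call handle_err_nc_p(nfmpi_put_vara_real(ncid, varid, &
>>>>         start, count, buf), 10)
>>>>    call handle_err_nc_p(nfmpi_end_indep_data(ncid), 11)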
>>>>
>>>> If yes, can you show me how you call it?
>>>>
>>>> Wei-keng
>>>>
>>>>
>>>> On Feb 20, 2009, at 1:24 PM, Alex Hoffmann wrote:
>>>>
>>>>> Thank you for the clarifications.
>>>>>
>>>>> I included the MPI setup and netCDF output test code and the
>>>>> corresponding Makefile as attachments. Maybe it can be of use as a
>>>>> simple example.
>>>>>
>>>>> I am currently running my code with the bugs in the output file
>>>>> fixed, and it seems to be working just fine.
>>>>>
>>>>> Thanks again,
>>>>>
>>>>> Lex
>>>>>
>>>>> Wei-keng Liao wrote:
>>>>>> Hi, Lex, Please read below.
>>>>>>
>>>>>> On Feb 20, 2009, at 5:13 AM, Alex Hoffmann wrote:
>>>>>>
>>>>>>> Many thanks, Wei-keng, the issue seems to have been tracked
>>>>>>> down...
>>>>>>>
>>>>>>> For reasons that are still not completely clear to me, using
>>>>>>> trim() for trimming the coordinate variable names and units
>>>>>>> (lines 480/81, 485/86) seemed to leave a strange character (such
>>>>>>> as a y diacritic or Y accent) trailing on e.g. the units (such as
>>>>>>> degrees_east). Though this did not affect the run on its own, any
>>>>>>> subsequent attribute definition made the code crash. I fixed this
>>>>>>> by removing the trim().
>>>>>>
>>>>>> If you call trim() on a string argument in a function call, the
>>>>>> trailing characters will not be passed to the function. So if the
>>>>>> string-length argument has a larger value than the trimmed length,
>>>>>> the function will include undefined trailing characters as part of
>>>>>> the string. Glad to know you solved the problem.
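>>>>>>
>>>>>> A small sketch of the pitfall (variable names assumed):
>>>>>>
>>>>>>    character(len=13) :: units
>>>>>>    units = 'degrees_east'   ! 12 characters plus 1 trailing blank
>>>>>>    ! the hard-coded length 13 exceeds len_trim(units) == 12, so the
>>>>>>    ! library reads one undefined byte past what trim() passed in:
>>>>>>    STATUS = nfmpi_put_att_text(NCID, varID, 'units', &
>>>>>>             13_MPI_OFFSET_KIND, trim(units))
>>>>>>    ! safer: derive the length from the trimmed string itself
>>>>>>    STATUS = nfmpi_put_att_text(NCID, varID, 'units', &
>>>>>>             int(len_trim(units), MPI_OFFSET_KIND), trim(units))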
>>>>>>
>>>>>>
>>>>>>> If I may, I'd like to clarify one further issue with PnetCDF.
>>>>>>> I initially implemented the code (also a small test program) in
>>>>>>> collective data mode. This did not work: the different
>>>>>>> processes/Cartesian subdomains did not write their data into the
>>>>>>> netCDF file. I then changed the data write statements to
>>>>>>> independent data mode, with which it worked immediately. Am I
>>>>>>> right to use independent data mode for this purpose?
>>>>>>
>>>>>> I am not sure what you meant here.
>>>>>> Can you post your small test program to the list?
>>>>>>
>>>>>> Wei-keng
>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> Thank you very much for your support.
>>>>>>>
>>>>>>> Cheers
>>>>>>>
>>>>>>> Lex
>>>>>>>
>>>>>>> Wei-keng Liao wrote:
>>>>>>>> Hi, Lex,
>>>>>>>>
>>>>>>>> If this error message "NC definations on multiprocesses
>>>>>>>> conflict" is the only one you got before every crash, it is most
>>>>>>>> likely that the attribute values are not the same across all
>>>>>>>> processes. Please check that first, so you can rule out that
>>>>>>>> possibility.
>>>>>>>>
>>>>>>>> Also, can you check the attributes in the output file for the
>>>>>>>> case that ran successfully (using ncmpidump -c filename)? This is
>>>>>>>> just to see if the attributes are the ones you expected. Also,
>>>>>>>> please make sure the number of attributes created is less than 4K
>>>>>>>> global or per variable, as Rob indicated.
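>>>>>>>>
>>>>>>>> (For example, "ncmpidump -c out_glb.nc" prints the header plus the
>>>>>>>> coordinate-variable values, mirroring ncdump -c, so you can inspect
>>>>>>>> every attribute without dumping the full data arrays.)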
>>>>>>>>
>>>>>>>> Wei-keng
>>>>>>>>
>>>>>>>>
>>>>>>>> On Feb 19, 2009, at 5:53 PM, Alex Hoffmann wrote:
>>>>>>>>
>>>>>>>>> Dear Wei-keng,
>>>>>>>>>
>>>>>>>>> I understand that the attribute define arguments must be the
>>>>>>>>> same across all processes. I am also almost certain that the
>>>>>>>>> arguments you cited are identical across processes, but I will
>>>>>>>>> perform a check.
>>>>>>>>>
>>>>>>>>> Be this as it may, I get the same error message if I uncomment
>>>>>>>>> one of the text attributes specifying the unit, or one of the
>>>>>>>>> real attributes specifying the fill value, for instance. There
>>>>>>>>> is no possibility that these are different across processes, as
>>>>>>>>> they are introduced straight into the function call. Any check
>>>>>>>>> I performed seemed to indicate that the problem arises from the
>>>>>>>>> number of attributes I defined, and not from the attributes
>>>>>>>>> themselves, since they all work fine individually.
>>>>>>>>>
>>>>>>>>> I was actually thinking of writing the attributes ad hoc, once
>>>>>>>>> the dataset is defined, using regular netCDF functions. This,
>>>>>>>>> however, seemed an inelegant solution that defeats the purpose
>>>>>>>>> of using PnetCDF.
>>>>>>>>>
>>>>>>>>> Thanks,
>>>>>>>>>
>>>>>>>>> Lex
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Wei-keng Liao wrote:
>>>>>>>>>> Hi, Lex,
>>>>>>>>>>
>>>>>>>>>> PnetCDF requires the attribute define arguments to be the same
>>>>>>>>>> across all processes. This is why you got the error message for
>>>>>>>>>> defining the attribute "history" in line 400, if its value
>>>>>>>>>> "conflicts" among processes. I think you have got this.
>>>>>>>>>>
>>>>>>>>>> I checked your code. Starting from line 562, you have a few
>>>>>>>>>> text attributes from des_tgas_picture(i), des_trac_picture(i),
>>>>>>>>>> des_tpas_picture(i), and des. Similar for out_type == 2. Are
>>>>>>>>>> you sure their values are all the same across all the
>>>>>>>>>> processes? Maybe you can use MPI_Bcast to check if any process
>>>>>>>>>> has a different value.
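>>>>>>>>>>
>>>>>>>>>> A quick sketch of such a check (ref and ierr are assumed helper
>>>>>>>>>> variables; the other names are from your module):
>>>>>>>>>>
>>>>>>>>>>    character(len=50) :: ref
>>>>>>>>>>    integer           :: ierr
>>>>>>>>>>    ref = des_tgas_picture(i)
>>>>>>>>>>    ! broadcast rank 0's value, then compare with the local copy
>>>>>>>>>>    call MPI_Bcast(ref, len(ref), MPI_CHARACTER, 0, my_cart, ierr)
>>>>>>>>>>    if (ref /= des_tgas_picture(i)) &
>>>>>>>>>>       print *, 'rank', myrank, ': text attribute', i, 'differs'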
>>>>>>>>>>
>>>>>>>>>> Wei-keng
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Feb 19, 2009, at 11:14 AM, Alex Hoffmann wrote:
>>>>>>>>>>
>>>>>>>>>>> Dear Rob
>>>>>>>>>>> Thanks a lot for your answer.
>>>>>>>>>>> Isolating the PnetCDF output module from the code as a
>>>>>>>>>>> stand-alone would require a substantial effort, and I do not
>>>>>>>>>>> encounter this error in my small test programs. I attached
>>>>>>>>>>> the output module for reference.
>>>>>>>>>>>
>>>>>>>>>>> The attribute section that causes trouble is between lines
>>>>>>>>>>> 369-405 (global) and 468-613 (variables). I am aware that the
>>>>>>>>>>> attribute history (line 400) may be problematic, as every
>>>>>>>>>>> process may have a slightly different value (for the time
>>>>>>>>>>> element taken from the Fortran function date_and_time). Yet
>>>>>>>>>>> this is not the only issue.
>>>>>>>>>>>
>>>>>>>>>>> If I prematurely exit at line 492, uncommenting any further
>>>>>>>>>>> attribute definition between 468 and 489 causes a crash with
>>>>>>>>>>> the multiprocesses conflict.
>>>>>>>>>>>
>>>>>>>>>>> Note also that the attribute definitions between 558 and 613
>>>>>>>>>>> currently produce no crash only because of the print
>>>>>>>>>>> statements on lines 560 and 591, which made me think that
>>>>>>>>>>> maybe there is an optimization problem. What is strange is
>>>>>>>>>>> that if these are not included, the crash occurs only when I
>>>>>>>>>>> start writing the first file (movie file, out_type=2), AND if
>>>>>>>>>>> I leave the first block (if out_type==1) uncommented, although
>>>>>>>>>>> this block is not even entered in that particular case, and
>>>>>>>>>>> vice versa. Hope this is more or less clear...
>>>>>>>>>>>
>>>>>>>>>>> Cheers,
>>>>>>>>>>> Lex
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> Robert Latham wrote:
>>>>>>>>>>>> On Thu, Feb 19, 2009 at 10:09:37AM +0000, Alex Hoffmann
>>>>>>>>>>>> wrote:
>>>>>>>>>>>>> Hello,
>>>>>>>>>>>>> I am getting the same error message as had already been
>>>>>>>>>>>>> inquired on in this list, i.e.:
>>>>>>>>>>>>>
>>>>>>>>>>>>> NC definations on multiprocesses conflict.
>>>>>>>>>>>>> NOTE: Definitions across all process
>>>>>>>>>>>>>
>>>>>>>>>>>>> and this seems to be related, somehow randomly, to the
>>>>>>>>>>>>> total number of attributes (global or for each variable)
>>>>>>>>>>>>> that I intend to write during the definition phase (and,
>>>>>>>>>>>>> though I am not sure about this, to the number of processes
>>>>>>>>>>>>> used to run my F90 parallel code).
>>>>>>>>>>>>>
>>>>>>>>>>>>> More precisely, if I uncomment one put_att to replace
>>>>>>>>>>>>> another, the program still runs; if I do not comment out
>>>>>>>>>>>>> another one, the program crashes with the above error
>>>>>>>>>>>>> message.
>>>>>>>>>>>>>
>>>>>>>>>>>>> Is there any way of identifying where this problem comes
>>>>>>>>>>>>> from, or is there a maximum number of attributes (though I
>>>>>>>>>>>>> don't write an excessive number of them)? There is no
>>>>>>>>>>>>> problem running it on one process(or).
>>>>>>>>>>>>
>>>>>>>>>>>> Maybe you can show us with a small test program how you are
>>>>>>>>>>>> encountering this error? The maximum number of attributes is
>>>>>>>>>>>> 4k (the constant NC_MAX_ATTRS).
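>>>>>>>>>>>>
>>>>>>>>>>>> (A quick sanity check, sketched: the Fortran interface provides
>>>>>>>>>>>> nfmpi_inq_natts(ncid, natts) for the global attribute count and
>>>>>>>>>>>> nfmpi_inq_varnatts(ncid, varid, natts) per variable, so you can
>>>>>>>>>>>> print how close you actually get to the limit.)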
>>>>>>>>>>>>
>>>>>>>>>>>> ==rob
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> --
>>>>>>>>>>> ______________________________________________________________________
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> Alex HOFFMANN PhD Candidate
>>>>>>>>>>> Centre for Atmospheric Science
>>>>>>>>>>> Department of Geography University of Cambridge
>>>>>>>>>>> Downing Place, Cambridge CB2 3EN, UK
>>>>>>>>>>> e-mail: ah519 at cam.ac.uk tel: +44 (0)1223 766581
>>>>>>>>>>> www.geog.cam.ac.uk/people/hoffmann/
>>>>>>>>>>> ______________________________________________________________________
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> !-*- F90 -*-
>>>>>>>>>>> module atham_pnetcdf
>>>>>>>>>>>
>>>>>>>>>>> !--------------------------------------------------------------------!
>>>>>>>>>>> ! author:  Alex HOFFMANN                                              !
>>>>>>>>>>> ! email:   ah519 at cam.ac.uk                                         !
>>>>>>>>>>> ! date:    February 2009                                              !
>>>>>>>>>>> ! version: v0.1                                                       !
>>>>>>>>>>> !                                                                     !
>>>>>>>>>>> ! collection of routines for ATHAM PnetCDF output                     !
>>>>>>>>>>> !--------------------------------------------------------------------!
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> ! module is to replace existing atham_netcdf.mod (GL) to write
>>>>>>>>>>> ! ATHAM output as netCDF files
>>>>>>>>>>> ! same as atham_netcdf.mod (AH) but adapted to run with the
>>>>>>>>>>> ! Parallel NetCDF API, developed by Northwestern University and
>>>>>>>>>>> ! Argonne National Laboratory, which is built on top of MPI-IO
>>>>>>>>>>> ! (tested with MPICH 2.1)
>>>>>>>>>>> use precision, only : kint, kreal, real4, my_real
>>>>>>>>>>>
>>>>>>>>>>> use atham_module, only : nx, ny, nz, ntx, nty, npx, npy
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> use atham_module, only : ntgas_picture, ntrac_picture, &
>>>>>>>>>>>                          ntpas_picture,                &
>>>>>>>>>>>                          ntgas_movie, ntrac_movie, ntpas_movie
>>>>>>>>>>>
>>>>>>>>>>> implicit none
>>>>>>>>>>>
>>>>>>>>>>> private
>>>>>>>>>>> public :: netcdf_data
>>>>>>>>>>>
>>>>>>>>>>> ! 'pnetcdf.inc' file includes cpp directives that were not
>>>>>>>>>>> ! understood at compile-time
>>>>>>>>>>> ! this is to be changed and the hard-wired link to be replaced
>>>>>>>>>>> ! by the include 'pnetcdf.inc' command
>>>>>>>>>>> !include 'pnetcdf.inc'
>>>>>>>>>>> #include "/usr/local/parallel-netcdf-1.0.1/include/pnetcdf.inc"
>>>>>>>>>>> include 'mpif.h'
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> ! ######### tried saving netCDF variable IDs here so that they
>>>>>>>>>>> ! ######### must not be retrieved by inquiry
>>>>>>>>>>> ! ######### if this creates conflict, especially between picture
>>>>>>>>>>> ! ######### and movie file, this must be changed
>>>>>>>>>>> integer(kint), save :: x_dimID, y_dimID, z_dimID, t_dimID
>>>>>>>>>>> integer(kint), save :: x_varID, y_varID, z_varID, t_varID
>>>>>>>>>>> integer(kint), save :: u_varID, v_varID, w_varID
>>>>>>>>>>> integer(kint), save :: pnew_varID, tetnew_varID, &
>>>>>>>>>>>                        tempnew_varID, density_varID
>>>>>>>>>>> integer(kint), save :: turbhor_varID, turbver_varID, turblen_varID
>>>>>>>>>>> integer(kint), dimension(99), save :: tgas_varID, trac_varID, &
>>>>>>>>>>>                                       tpas_varID, d_trac_varID
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> integer(MPI_OFFSET_KIND), save :: tcounter
>>>>>>>>>>>
>>>>>>>>>>> real(real4), parameter :: undef = -0.99e33_real4
>>>>>>>>>>> ! make sure precision of _FillValue and output (NF90_FLOAT) is the same !!!
>>>>>>>>>>>
>>>>>>>>>>> contains
>>>>>>>>>>> ! ====================================================================
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> subroutine netcdf_data(outfile,out_type,timetot)
>>>>>>>>>>> ! IF: outfile  = picture_file or movie_file (path/filename for
>>>>>>>>>>> !                output created in atham_module)
>>>>>>>>>>> !     out_type = 1 (picture_file) or 2 (movie_file)
>>>>>>>>>>> !     timetot  = total simulation time since simulation start
>>>>>>>>>>>
>>>>>>>>>>> use atham_module, only : unew, vnew, wnew, pnew, tetnew, &
>>>>>>>>>>>                          tempnew,                        &
>>>>>>>>>>>                          density, turbhor, turbver, turblen
>>>>>>>>>>>
>>>>>>>>>>> use atham_module, only : tgasnew, tracnew, tpasnew, tracdep
>>>>>>>>>>>
>>>>>>>>>>> use atham_module, only : itgas_picture, itrac_picture, &
>>>>>>>>>>>                          itpas_picture,                &
>>>>>>>>>>>                          itgas_movie, itrac_movie, itpas_movie
>>>>>>>>>>>
>>>>>>>>>>> use atham_module, only : mycoor, my_cart, myrank
>>>>>>>>>>>
>>>>>>>>>>> character(*), intent(in ) :: outfile
>>>>>>>>>>> integer(kint), intent(in ) :: out_type
>>>>>>>>>>> real(kreal), intent(in ) :: timetot
>>>>>>>>>>>
>>>>>>>>>>> !------------------------------------------------------------------!
>>>>>>>>>>> ! local variables                                                   !
>>>>>>>>>>> !------------------------------------------------------------------!
>>>>>>>>>>> integer(MPI_OFFSET_KIND) :: nx_mpi, ny_mpi, nz_mpi, y_cl_mpi
>>>>>>>>>>> integer(MPI_OFFSET_KIND) :: istart, jstart
>>>>>>>>>>> integer(kint)            :: STATUS
>>>>>>>>>>> character(len=50)        :: OP ! debugging variable corresponding
>>>>>>>>>>>                                ! to the called netCDF OPeration
>>>>>>>>>>> integer(kint)            :: NCID
>>>>>>>>>>> integer(kint)            :: i
>>>>>>>>>>> logical                  :: DynMovie_flag, bounds_control
>>>>>>>>>>> integer(kint)            :: y_st, y_sp, y_cl
>>>>>>>>>>>
>>>>>>>>>>> ! replace DynMovie precompiler flag by program flag for more
>>>>>>>>>>> ! convenient code structure
>>>>>>>>>>> DynMovie_flag = .false.
>>>>>>>>>>> #ifdef DynMovie
>>>>>>>>>>> DynMovie_flag = .true.
>>>>>>>>>>> #endif
>>>>>>>>>>>
>>>>>>>>>>> ! re-assign dimension lengths as an MPI_Offset type
>>>>>>>>>>> ! (necessary for the PnetCDF interface)
>>>>>>>>>>> ! cut-off subdomain borders
>>>>>>>>>>> ! cut-off subdomain borders
>>>>>>>>>>> nx_mpi = nx-2
>>>>>>>>>>> ny_mpi = ny-2
>>>>>>>>>>> nz_mpi = nz
>>>>>>>>>>>
>>>>>>>>>>> ! assign coordinates of first element of each process in the
>>>>>>>>>>> ! netCDF file
>>>>>>>>>>> istart = mycoor(1)*nx_mpi + 1_MPI_OFFSET_KIND
>>>>>>>>>>> jstart = mycoor(2)*ny_mpi + 1_MPI_OFFSET_KIND
>>>>>>>>>>>
>>>>>>>>>>> bounds_control = .true. ! debugging flag
>>>>>>>>>>> tcounter = tcounter+1_MPI_OFFSET_KIND
>>>>>>>>>>>
>>>>>>>>>>> ! === open netCDF dataset: enter data mode ------------------------!
>>>>>>>>>>> ! if dataset does not yet exist (STATUS /= nf_noerr), create file
>>>>>>>>>>> ! and enter data mode
>>>>>>>>>>> ! else continue
>>>>>>>>>>> OP = 'open existing netCDF file'
>>>>>>>>>>> STATUS = nfmpi_open(my_cart, trim(outfile), NF_WRITE,
>>>>>>>>>>> MPI_INFO_NULL, NCID)
>>>>>>>>>>> if ( STATUS /= nf_noerr ) then
>>>>>>>>>>> if (myrank==0) then
>>>>>>>>>>> print *, ""
>>>>>>>>>>> print *, "P-NetCDF operation: ",trim(OP)
>>>>>>>>>>> print *, trim(nfmpi_strerror(STATUS))
>>>>>>>>>>> print *, "atham_pnetcdf netcdf_data(): new netCDF file created: ", outfile
>>>>>>>>>>> print *, ""
>>>>>>>>>>> end if
>>>>>>>>>>> call netcdf_define(outfile,out_type,timetot,NCID)
>>>>>>>>>>> tcounter = 1_MPI_OFFSET_KIND ! reset time counter for output
>>>>>>>>>>>                              ! if new file is created
>>>>>>>>>>> end if
>>>>>>>>>>>
>>>>>>>>>>> ! --- write data values --------------------!
>>>>>>>>>>> ! unlimited record dimension time (only one process writes the
>>>>>>>>>>> ! whole dimension, i.e. across all processes)
>>>>>>>>>>> OP = 'write data values - unlimited dimension t'
>>>>>>>>>>>
>>>>>>>>>>> call handle_err(nfmpi_begin_indep_data(NCID))
>>>>>>>>>>>
>>>>>>>>>>> if (myrank==0) then
>>>>>>>>>>>    call handle_err(nfmpi_put_vara_real(NCID, t_varID, (/tcounter/), &
>>>>>>>>>>>         (/1_MPI_OFFSET_KIND/), real(timetot,real4)),OP)
>>>>>>>>>>> end if
>>>>>>>>>>>
>>>>>>>>>>> call handle_err(nfmpi_end_indep_data(NCID))
>>>>>>>>>>>
>>>>>>>>>>> ! --- write data values --------------------!
>>>>>>>>>>> ! discriminate between 2D and 3D case for output generation
>>>>>>>>>>> if (nty>4) then
>>>>>>>>>>> y_st = 2 ! y index start
>>>>>>>>>>> y_sp = ny-1 ! y index stop
>>>>>>>>>>> y_cl = ny-2 ! y number of columns
>>>>>>>>>>> else
>>>>>>>>>>> y_st = ny-1 ! 3 in 2D case
>>>>>>>>>>> y_sp = ny-1 ! 3 in 2D case
>>>>>>>>>>> y_cl = 1
>>>>>>>>>>> end if
>>>>>>>>>>> y_cl_mpi = y_cl
>>>>>>>>>>>
>>>>>>>>>>> ! dynamics data
>>>>>>>>>>> OP = 'write data values - data'
>>>>>>>>>>> if (out_type == 1 .OR. DynMovie_flag) then
>>>>>>>>>>> call write_data(real(unew,real4),u_varID)
>>>>>>>>>>> bounds_control = .false.
>>>>>>>>>>> call write_data(real(vnew, real4),v_varID)
>>>>>>>>>>> call write_data(real(wnew, real4),w_varID)
>>>>>>>>>>> call write_data(real(pnew, real4),pnew_varID)
>>>>>>>>>>> call write_data(real(tetnew, real4),tetnew_varID)
>>>>>>>>>>> call write_data(real(tempnew,real4),tempnew_varID)
>>>>>>>>>>> call write_data(real(density,real4),density_varID)
>>>>>>>>>>> call write_data(real(turbhor,real4),turbhor_varID)
>>>>>>>>>>> call write_data(real(turbver,real4),turbver_varID)
>>>>>>>>>>> call write_data(real(turblen,real4),turblen_varID)
>>>>>>>>>>> end if
>>>>>>>>>>>
>>>>>>>>>>> ! tracer data
>>>>>>>>>>> if (out_type == 1) then
>>>>>>>>>>>    do i=1,ntgas_picture
>>>>>>>>>>>       call write_data(real(tgasnew(:,:,:,itgas_picture(i)),real4),tgas_varID(i))
>>>>>>>>>>>    enddo
>>>>>>>>>>>    do i=1,ntrac_picture
>>>>>>>>>>>       call write_data(real(tracnew(:,:,:,itrac_picture(i)),real4),trac_varID(i))
>>>>>>>>>>>    enddo
>>>>>>>>>>>    do i=1,ntpas_picture
>>>>>>>>>>>       call write_data(real(tpasnew(:,:,:,itpas_picture(i)),real4),tpas_varID(i))
>>>>>>>>>>>    enddo
>>>>>>>>>>>    do i=1,ntrac_picture
>>>>>>>>>>>       call handle_err(nfmpi_begin_indep_data(NCID))
>>>>>>>>>>>       call handle_err(nfmpi_put_vara_real(NCID, d_trac_varID(i),       &
>>>>>>>>>>>            (/istart,jstart,tcounter/),                                 &
>>>>>>>>>>>            (/nx_mpi,y_cl_mpi,1_MPI_OFFSET_KIND/),                      &
>>>>>>>>>>>            real(reshape(tracdep(2:nx-1,y_st:y_sp,itrac_picture(i)),    &
>>>>>>>>>>>                 shape=(/nx-2,y_cl,1/)),real4)),OP)
>>>>>>>>>>>       call handle_err(nfmpi_end_indep_data(NCID))
>>>>>>>>>>>    enddo
>>>>>>>>>>> else if (out_type == 2) then
>>>>>>>>>>>    do i=1,ntgas_movie
>>>>>>>>>>>       call write_data(real(tgasnew(:,:,:,itgas_movie(i)),real4),tgas_varID(i))
>>>>>>>>>>>    enddo
>>>>>>>>>>>    do i=1,ntrac_movie
>>>>>>>>>>>       call write_data(real(tracnew(:,:,:,itrac_movie(i)),real4),trac_varID(i))
>>>>>>>>>>>    enddo
>>>>>>>>>>>    do i=1,ntpas_movie
>>>>>>>>>>>       call write_data(real(tpasnew(:,:,:,itpas_movie(i)),real4),tpas_varID(i))
>>>>>>>>>>>    enddo
>>>>>>>>>>> endif
>>>>>>>>>>>
>>>>>>>>>>> ! === close netCDF dataset: exit data mode & save -----------------!
>>>>>>>>>>> OP = 'close netCDF file'
>>>>>>>>>>> call handle_err(nfmpi_close(NCID))
>>>>>>>>>>>
>>>>>>>>>>> if (myrank==0) then
>>>>>>>>>>> print *, ''
>>>>>>>>>>> print *, 'wrote netCDF output to file ', outfile
>>>>>>>>>>> print *, ''
>>>>>>>>>>> end if
>>>>>>>>>>>
>>>>>>>>>>> contains
>>>>>>>>>>> !----------------------------------------------------------------
>>>>>>>>>>>
>>>>>>>>>>> subroutine write_data(array_in,varID)
>>>>>>>>>>> ! IF: array_in = data array (3D) to be written to netCDF
>>>>>>>>>>> ! varID = data variable netCDF ID
>>>>>>>>>>>
>>>>>>>>>>> use atham_module, only : iflgs ! topography mask: topography
>>>>>>>>>>>                                ! (iflgs=0) remains undef
>>>>>>>>>>>
>>>>>>>>>>> use atham_module, only : myrank
>>>>>>>>>>>
>>>>>>>>>>> real(real4), dimension(nx,ny,nz), intent(in ) :: array_in
>>>>>>>>>>> integer(kint),                    intent(in ) :: varID
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> !--------------------------------------------------------------!
>>>>>>>>>>> ! local variables                                               !
>>>>>>>>>>> !--------------------------------------------------------------!
>>>>>>>>>>>
>>>>>>>>>>> real(real4), dimension(nx,ny,nz) :: array_masked
>>>>>>>>>>>
>>>>>>>>>>> ! dimensions & bounds control
>>>>>>>>>>> integer(MPI_OFFSET_KIND) :: dimlenx, dimleny, dimlenz, dimlent
>>>>>>>>>>> integer(kint), dimension(NF_max_var_dims) :: dimIDs
>>>>>>>>>>>
>>>>>>>>>>> ! dimensions & bounds control
>>>>>>>>>>> if (myrank == 0) then
>>>>>>>>>>>    if (bounds_control) then
>>>>>>>>>>>       print *, ''
>>>>>>>>>>>       print *, 'atham var in xyz           ', shape(array_in)
>>>>>>>>>>>       print *, 'atham var reshaped to xyzt ',                  &
>>>>>>>>>>>          shape(reshape(array_in(2:nx-1,y_st:y_sp,1:nz),        &
>>>>>>>>>>>                shape=(/nx-2,y_cl,nz,1/)))
>>>>>>>>>>>       call handle_err(nfmpi_inq_vardimid(NCID, u_varID, dimIDs)) ! taken unew as ref
>>>>>>>>>>>       call handle_err(nfmpi_inq_dimlen(NCID, dimIDs(1), dimlenx))
>>>>>>>>>>>       call handle_err(nfmpi_inq_dimlen(NCID, dimIDs(2), dimleny))
>>>>>>>>>>>       call handle_err(nfmpi_inq_dimlen(NCID, dimIDs(3), dimlenz))
>>>>>>>>>>>       call handle_err(nfmpi_inq_dimlen(NCID, dimIDs(4), dimlent))
>>>>>>>>>>>       print *, 'netCDF var dimensions in xyzt ', dimlenx, dimleny, dimlenz, dimlent
>>>>>>>>>>>    end if
>>>>>>>>>>> end if
>>>>>>>>>>>
>>>>>>>>>>> ! mask topography
>>>>>>>>>>> where (iflgs == 1)
>>>>>>>>>>> array_masked = array_in
>>>>>>>>>>> elsewhere
>>>>>>>>>>> array_masked = undef
>>>>>>>>>>> endwhere
>>>>>>>>>>>
>>>>>>>>>>> ! mask control
>>>>>>>>>>> !print *, ''
>>>>>>>>>>> !print *, 'topography mask'
>>>>>>>>>>> !print *, iflgs(1:5,3,3)
>>>>>>>>>>> !print *, iflgs(1:5,3,2)
>>>>>>>>>>> !print *, iflgs(1:5,3,1)
>>>>>>>>>>> !print *, ''
>>>>>>>>>>> !print *, 'masked variable array'
>>>>>>>>>>> !print *, array_masked(:,3,3)
>>>>>>>>>>> !print *, array_masked(:,3,2)
>>>>>>>>>>> !print *, array_masked(:,3,1)
>>>>>>>>>>> !print *, ''
>>>>>>>>>>>
>>>>>>>>>>> call handle_err(nfmpi_begin_indep_data(NCID))
>>>>>>>>>>> call handle_err(nfmpi_put_vara_real(NCID, varID,          &
>>>>>>>>>>>      (/istart,jstart,1_MPI_OFFSET_KIND,tcounter/),        &
>>>>>>>>>>>      (/nx_mpi,y_cl_mpi,nz_mpi,1_MPI_OFFSET_KIND/),        &
>>>>>>>>>>>      reshape(array_masked(2:nx-1,y_st:y_sp,1:nz),         &
>>>>>>>>>>>              shape=(/nx-2,y_cl,nz,1/))),OP)
>>>>>>>>>>> call handle_err(nfmpi_end_indep_data(NCID))
>>>>>>>>>>>
>>>>>>>>>>> end subroutine write_data
>>>>>>>>>>> !----------------------------------------------------------------
>>>>>>>>>>> end subroutine netcdf_data
>>>>>>>>>>> ! ====================================================================
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> subroutine netcdf_define(outfile,out_type,timetot,ncid_def)
>>>>>>>>>>> ! IF: outfile  = picture_file or movie_file (path/filename for
>>>>>>>>>>> !                output created in atham_module)
>>>>>>>>>>> !     out_type = 1 (picture_file) or 2 (movie_file)
>>>>>>>>>>> !     timetot  = simulation time since simulation start (from
>>>>>>>>>>> !                atham/atham_out)
>>>>>>>>>>> !     ncid_def = netCDF ID assigned to file during creation
>>>>>>>>>>>
>>>>>>>>>>> use atham_module, only : var_tgas_picture, var_tgas_movie, &
>>>>>>>>>>>                          des_tgas_picture, des_tgas_movie, &
>>>>>>>>>>>                          var_trac_picture, var_trac_movie, &
>>>>>>>>>>>                          des_trac_picture, des_trac_movie, &
>>>>>>>>>>>                          var_tpas_picture, var_tpas_movie, &
>>>>>>>>>>>                          des_tpas_picture, des_tpas_movie
>>>>>>>>>>>
>>>>>>>>>>> use atham_module, only : iindex, spinup, nrep, periodt
>>>>>>>>>>>
>>>>>>>>>>> use atham_module, only : volcano_setup, coignimbrite_setup, &
>>>>>>>>>>>                          nuclear_setup, convection_setup,   &
>>>>>>>>>>>                          procsconfig_setup
>>>>>>>>>>>
>>>>>>>>>>> use atham_module, only : x, y, z, xg, yg
>>>>>>>>>>>
>>>>>>>>>>> use atham_module, only : grid2geo, sim_d_YYYY, sim_d_MM, sim_d_DD, &
>>>>>>>>>>>                          sim_t_hh, sim_t_mm, sim_t_ss
>>>>>>>>>>>
>>>>>>>>>>> use atham_module, only : my_cart, myrank
>>>>>>>>>>>
>>>>>>>>>>> character(*), intent(in ) :: outfile
>>>>>>>>>>> integer(kint), intent(in ) :: out_type
>>>>>>>>>>> real(kreal), intent(in ) :: timetot
>>>>>>>>>>> integer(kint), intent( out) :: ncid_def
>>>>>>>>>>>
>>>>>>>>>>> !------------------------------------------------------------------!
>>>>>>>>>>> ! local variables                                                   !
>>>>>>>>>>> !------------------------------------------------------------------!
>>>>>>>>>>> integer(MPI_OFFSET_KIND) :: ntx_mpi, nty_mpi, nz_mpi
>>>>>>>>>>> character(len=50)        :: OP ! debugging variable corresponding
>>>>>>>>>>>                                ! to the called netCDF OPeration
>>>>>>>>>>> integer(kint)            :: NCID
>>>>>>>>>>> integer(kint)            :: i
>>>>>>>>>>> character(len=50)        :: var, des
>>>>>>>>>>> real(kreal)              :: toffset
>>>>>>>>>>> real(real4)              :: coord_scale, coord_x_offset, &
>>>>>>>>>>>                             coord_y_offset, coord_t_offset
>>>>>>>>>>> logical                  :: DynMovie_flag
>>>>>>>>>>> character(len=3)         :: coord_x_name, coord_y_name, coord_z_name
>>>>>>>>>>> character(len=4)         :: coord_t_name
>>>>>>>>>>> character(len=9)         :: coord_x_lname, coord_y_lname
>>>>>>>>>>> character(len=13)        :: coord_x_units, coord_y_units
>>>>>>>>>>> character(len=33)        :: coord_t_units
>>>>>>>>>>> character(len=8)         :: date
>>>>>>>>>>> character(len=10)        :: time
>>>>>>>>>>> character(len=19)        :: simdatetime
>>>>>>>>>>> character(len=10)        :: simdate
>>>>>>>>>>> character(len=20)        :: login
>>>>>>>>>>> character(len=63)        :: hist_string
>>>>>>>>>>> integer(kint)            :: isetup
>>>>>>>>>>> character(len=12)        :: model_setup(6)
>>>>>>>>>>> data model_setup /'volcano     ', 'coignimbrite', 'nuclear     ', &
>>>>>>>>>>>                   'convection  ', 'procsconfig ', 'unknown     '/
>>>>>>>>>>>
>>>>>>>>>>> ! replace DynMovie precompiler flag by program flag for more
>>>>>>>>>>> ! convenient code structure
>>>>>>>>>>> DynMovie_flag = .false.
>>>>>>>>>>> #ifdef DynMovie
>>>>>>>>>>> DynMovie_flag = .true.
>>>>>>>>>>> #endif
>>>>>>>>>>>
>>>>>>>>>>> ! re-assign dimension lengths as an MPI_Offset type
>>>>>>>>>>> ! (necessary for the PnetCDF interface)
>>>>>>>>>>> ! cut-off domain borders
>>>>>>>>>>> ntx_mpi = ntx-2
>>>>>>>>>>> nty_mpi = nty-2
>>>>>>>>>>> nz_mpi = nz
>>>>>>>>>>>
>>>>>>>>>>> ! convert date and time integers into string 'YYYY-MM-DD hh:mm:ss'
>>>>>>>>>>> write (simdatetime,10) sim_d_YYYY, sim_d_MM, sim_d_DD, &
>>>>>>>>>>>                        sim_t_hh, sim_t_mm, sim_t_ss
>>>>>>>>>>> 10 format (I4,'-',I2,'-',I2,1X,I2,':',I2,':',I2)
>>>>>>>>>>> simdate = simdatetime(1:10)
>>>>>>>>>>>
>>>>>>>>>>> ! switch between cartesian axes/time (default) and geographic
>>>>>>>>>>> ! axes/date (linear approx)
>>>>>>>>>>> coord_scale = 1._real4
>>>>>>>>>>> coord_x_offset = 0._real4
>>>>>>>>>>> coord_x_name = 'x '
>>>>>>>>>>> coord_x_lname = 'x_coor '
>>>>>>>>>>> coord_x_units = 'm '
>>>>>>>>>>> coord_y_offset = 0._real4
>>>>>>>>>>> coord_y_name = 'y '
>>>>>>>>>>> coord_y_lname = 'y_coor '
>>>>>>>>>>> coord_y_units = 'm '
>>>>>>>>>>> coord_z_name = 'z '
>>>>>>>>>>> coord_t_offset = 0._real4
>>>>>>>>>>> coord_t_name = 'time'
>>>>>>>>>>> coord_t_units = 'sec '
>>>>>>>>>>> if (grid2geo) then
>>>>>>>>>>>    call gridconvert(coord_scale,                                    &
>>>>>>>>>>>         coord_x_offset, coord_x_name, coord_x_lname, coord_x_units, &
>>>>>>>>>>>         coord_y_offset, coord_y_name, coord_y_lname, coord_y_units, &
>>>>>>>>>>>         coord_z_name, coord_t_offset, coord_t_name, coord_t_units)
>>>>>>>>>>> end if
>>>>>>>>>>>
>>>>>>>>>>> ! === create netCDF dataset: enter define mode --------------------!
>>>>>>>>>>>
>>>>>>>>>>> OP = 'enter define mode'
>>>>>>>>>>> call handle_err(nfmpi_create(my_cart, trim(outfile), 0, &
>>>>>>>>>>>      MPI_INFO_NULL, NCID),OP)
>>>>>>>>>>> ncid_def = NCID
>>>>>>>>>>>
>>>>>>>>>>> ! === create netCDF dataset: assign metadata ----------------------!
>>>>>>>>>>>
>>>>>>>>>>> ! --- set global attributes ----------------!
>>>>>>>>>>> OP = 'set global attributes'
>>>>>>>>>>> if (out_type == 1) then
>>>>>>>>>>>    toffset = timetot - spinup
>>>>>>>>>>>    call handle_err(nfmpi_put_att_text(NCID, NF_GLOBAL, 'title', 13_MPI_OFFSET_KIND, 'atham_picture'),OP)
>>>>>>>>>>> else
>>>>>>>>>>>    toffset = timetot - spinup - nrep*periodt
>>>>>>>>>>>    call handle_err(nfmpi_put_att_text(NCID, NF_GLOBAL, 'title', 11_MPI_OFFSET_KIND, 'atham_movie'),OP)
>>>>>>>>>>> end if
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, NF_GLOBAL, 'content', 223_MPI_OFFSET_KIND, &
>>>>>>>>>>>      'This data corresponds to a simulation run of the mesoscale-beta/gamma type non-hydrostatic Active Tracer High-resolution Atmospheric Model ATHAM ' &
>>>>>>>>>>>      // '(Oberhuber et al., 1998, Herzog, 1998), for a setup as specified under history'),OP) ! ######### add further description as necessary
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, NF_GLOBAL, 'version', &
>>>>>>>>>>>      21_MPI_OFFSET_KIND, 'no current versioning'),OP) ! ######### change once available
>>>>>>>>>>>
>>>>>>>>>>> call date_and_time(date, time)
>>>>>>>>>>> call getlog(login)
>>>>>>>>>>> if (volcano_setup) then
>>>>>>>>>>> isetup=1
>>>>>>>>>>> else if (coignimbrite_setup) then
>>>>>>>>>>> isetup=2
>>>>>>>>>>> else if (nuclear_setup) then
>>>>>>>>>>> isetup=3
>>>>>>>>>>> else if (convection_setup) then
>>>>>>>>>>> isetup=4
>>>>>>>>>>> else if (procsconfig_setup) then
>>>>>>>>>>> isetup=5
>>>>>>>>>>> else
>>>>>>>>>>> isetup=6
>>>>>>>>>>> end if
>>>>>>>>>>> hist_string = trim(date//' '//time//' '//trim(login)// &
>>>>>>>>>>>               ' atham '//model_setup(isetup))
>>>>>>>>>>>
>>>>>>>>>>> ! call handle_err(nfmpi_put_att_text(NCID, NF_GLOBAL, 'history', &
>>>>>>>>>>> !      63_MPI_OFFSET_KIND, hist_string),OP) ! 1 line/modification: - cmd arg
>>>>>>>>>>> ! call handle_err(nfmpi_put_att_text(NCID, NF_GLOBAL, 'Conventions', &
>>>>>>>>>>> !      28_MPI_OFFSET_KIND, 'COARDS, //cf-pcmdi.llnl.gov/'),OP)
>>>>>>>>>>>
>>>>>>>>>>> ! call handle_err(nfmpi_put_att_text(NCID, NF_GLOBAL, 'sim_onset_date_time', &
>>>>>>>>>>> !      19_MPI_OFFSET_KIND, simdatetime),OP)
>>>>>>>>>>> ! call handle_err(nfmpi_put_att_real(NCID, NF_GLOBAL, 'sim_time_offset', &
>>>>>>>>>>> !      NF_FLOAT, 1_MPI_OFFSET_KIND, real(toffset,real4)),OP)
>>>>>>>>>>> ! call handle_err(nfmpi_put_att_real(NCID, NF_GLOBAL, 'x_zoom_location', &
>>>>>>>>>>> !      NF_FLOAT, 1_MPI_OFFSET_KIND, real(x(max(1,iindex)),real4)),OP)
>>>>>>>>>>>
>>>>>>>>>>> ! --- define dimensions --------------------!
>>>>>>>>>>> OP = 'define dimensions'
>>>>>>>>>>> call handle_err(nfmpi_def_dim(NCID, trim(coord_x_name), ntx_mpi, x_dimID),OP)
>>>>>>>>>>> if (nty>4) then
>>>>>>>>>>>    call handle_err(nfmpi_def_dim(NCID, trim(coord_y_name), nty_mpi, y_dimID),OP)
>>>>>>>>>>> else
>>>>>>>>>>>    call handle_err(nfmpi_def_dim(NCID, trim(coord_y_name), 1_MPI_OFFSET_KIND, y_dimID),OP)
>>>>>>>>>>> end if
>>>>>>>>>>> call handle_err(nfmpi_def_dim(NCID, trim(coord_z_name), nz_mpi, z_dimID),OP)
>>>>>>>>>>> call handle_err(nfmpi_def_dim(NCID, coord_t_name, nfmpi_unlimited, t_dimID),OP)
>>>>>>>>>>>
>>>>>>>>>>> ! --- define variables ---------------------!
>>>>>>>>>>> ! dimensions
>>>>>>>>>>> OP = 'define variables - dimensions'
>>>>>>>>>>> call handle_err(nfmpi_def_var(NCID, trim(coord_x_name), NF_FLOAT, 1, x_dimID, x_varID),OP)
>>>>>>>>>>> call handle_err(nfmpi_def_var(NCID, trim(coord_y_name), NF_FLOAT, 1, y_dimID, y_varID),OP)
>>>>>>>>>>> call handle_err(nfmpi_def_var(NCID, trim(coord_z_name), NF_FLOAT, 1, z_dimID, z_varID),OP)
>>>>>>>>>>> call handle_err(nfmpi_def_var(NCID, coord_t_name, NF_FLOAT, 1, t_dimID, t_varID),OP)
>>>>>>>>>>>
>>>>>>>>>>> ! dynamics data
>>>>>>>>>>> OP = 'define variables - data'
>>>>>>>>>>> if (out_type == 1 .OR. DynMovie_flag) then
>>>>>>>>>>>    call handle_err(nfmpi_def_var(NCID, 'u', NF_FLOAT, 4,       &
>>>>>>>>>>>         (/x_dimID,y_dimID,z_dimID,t_dimID/), u_varID),OP)
>>>>>>>>>>>    call handle_err(nfmpi_def_var(NCID, 'v', NF_FLOAT, 4,       &
>>>>>>>>>>>         (/x_dimID,y_dimID,z_dimID,t_dimID/), v_varID),OP)
>>>>>>>>>>>    call handle_err(nfmpi_def_var(NCID, 'w', NF_FLOAT, 4,       &
>>>>>>>>>>>         (/x_dimID,y_dimID,z_dimID,t_dimID/), w_varID),OP)
>>>>>>>>>>>    call handle_err(nfmpi_def_var(NCID, 'pnew', NF_FLOAT, 4,    &
>>>>>>>>>>>         (/x_dimID,y_dimID,z_dimID,t_dimID/), pnew_varID),OP)
>>>>>>>>>>>    call handle_err(nfmpi_def_var(NCID, 'tetnew', NF_FLOAT, 4,  &
>>>>>>>>>>>         (/x_dimID,y_dimID,z_dimID,t_dimID/), tetnew_varID),OP)
>>>>>>>>>>>    call handle_err(nfmpi_def_var(NCID, 'tempnew', NF_FLOAT, 4, &
>>>>>>>>>>>         (/x_dimID,y_dimID,z_dimID,t_dimID/), tempnew_varID),OP)
>>>>>>>>>>>    call handle_err(nfmpi_def_var(NCID, 'density', NF_FLOAT, 4, &
>>>>>>>>>>>         (/x_dimID,y_dimID,z_dimID,t_dimID/), density_varID),OP)
>>>>>>>>>>>    call handle_err(nfmpi_def_var(NCID, 'turbhor', NF_FLOAT, 4, &
>>>>>>>>>>>         (/x_dimID,y_dimID,z_dimID,t_dimID/), turbhor_varID),OP)
>>>>>>>>>>>    call handle_err(nfmpi_def_var(NCID, 'turbver', NF_FLOAT, 4, &
>>>>>>>>>>>         (/x_dimID,y_dimID,z_dimID,t_dimID/), turbver_varID),OP)
>>>>>>>>>>>    call handle_err(nfmpi_def_var(NCID, 'turblen', NF_FLOAT, 4, &
>>>>>>>>>>>         (/x_dimID,y_dimID,z_dimID,t_dimID/), turblen_varID),OP)
>>>>>>>>>>> end if
>>>>>>>>>>>
>>>>>>>>>>> ! tracer data
>>>>>>>>>>> if (out_type == 1) then
>>>>>>>>>>>    do i=1,ntgas_picture
>>>>>>>>>>>       call handle_err(nfmpi_def_var(NCID, var_tgas_picture(i), NF_FLOAT, 4, &
>>>>>>>>>>>            (/x_dimID,y_dimID,z_dimID,t_dimID/), tgas_varID(i)),OP)
>>>>>>>>>>>    enddo
>>>>>>>>>>>    do i=1,ntrac_picture
>>>>>>>>>>>       call handle_err(nfmpi_def_var(NCID, var_trac_picture(i), NF_FLOAT, 4, &
>>>>>>>>>>>            (/x_dimID,y_dimID,z_dimID,t_dimID/), trac_varID(i)),OP)
>>>>>>>>>>>    enddo
>>>>>>>>>>>    do i=1,ntpas_picture
>>>>>>>>>>>       call handle_err(nfmpi_def_var(NCID, var_tpas_picture(i), NF_FLOAT, 4, &
>>>>>>>>>>>            (/x_dimID,y_dimID,z_dimID,t_dimID/), tpas_varID(i)),OP)
>>>>>>>>>>>    enddo
>>>>>>>>>>>    do i=1,ntrac_picture
>>>>>>>>>>>       var='d_'//trim(var_trac_picture(i))
>>>>>>>>>>>       call handle_err(nfmpi_def_var(NCID, var, NF_FLOAT, 3, &
>>>>>>>>>>>            (/x_dimID,y_dimID,t_dimID/), d_trac_varID(i)),OP)
>>>>>>>>>>>    enddo
>>>>>>>>>>> else if (out_type == 2) then
>>>>>>>>>>>    do i=1,ntgas_movie
>>>>>>>>>>>       call handle_err(nfmpi_def_var(NCID, var_tgas_movie(i), NF_FLOAT, 4, &
>>>>>>>>>>>            (/x_dimID,y_dimID,z_dimID,t_dimID/), tgas_varID(i)),OP)
>>>>>>>>>>>    enddo
>>>>>>>>>>>    do i=1,ntrac_movie
>>>>>>>>>>>       call handle_err(nfmpi_def_var(NCID, var_trac_movie(i), NF_FLOAT, 4, &
>>>>>>>>>>>            (/x_dimID,y_dimID,z_dimID,t_dimID/), trac_varID(i)),OP)
>>>>>>>>>>>    enddo
>>>>>>>>>>>    do i=1,ntpas_movie
>>>>>>>>>>>       call handle_err(nfmpi_def_var(NCID, var_tpas_movie(i), NF_FLOAT, 4, &
>>>>>>>>>>>            (/x_dimID,y_dimID,z_dimID,t_dimID/), tpas_varID(i)),OP)
>>>>>>>>>>>    enddo
>>>>>>>>>>> endif
>>>>>>>>>>>
>>>>>>>>>>> ! --- assign attributes --------------------!
>>>>>>>>>>> ! common attributes: long_name, units, valid_range, scale_factor,
>>>>>>>>>>> ! add_offset, _FillValue, history, conventions
>>>>>>>>>>> ! dimensions
>>>>>>>>>>>
>>>>>>>>>>> OP = 'assign attributes - dimensions'
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, x_varID, 'long_name', 9_MPI_OFFSET_KIND, trim(coord_x_lname)),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, x_varID, 'units', 13_MPI_OFFSET_KIND, trim(coord_x_units)),OP)
>>>>>>>>>>> ! call handle_err(nfmpi_put_att_real(NCID, x_varID, 'scale_factor', NF_FLOAT, 1_MPI_OFFSET_KIND, coord_scale),OP)
>>>>>>>>>>> ! call handle_err(nfmpi_put_att_real(NCID, x_varID, 'add_offset', NF_FLOAT, 1_MPI_OFFSET_KIND, coord_x_offset),OP)
>>>>>>>>>>>
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, y_varID, 'long_name', 9_MPI_OFFSET_KIND, trim(coord_y_lname)),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, y_varID, 'units', 13_MPI_OFFSET_KIND, trim(coord_y_units)),OP)
>>>>>>>>>>> ! call handle_err(nfmpi_put_att_real(NCID, y_varID, 'scale_factor', NF_FLOAT, 1_MPI_OFFSET_KIND, coord_scale),OP)
>>>>>>>>>>> ! call handle_err(nfmpi_put_att_real(NCID, y_varID, 'add_offset', NF_FLOAT, 1_MPI_OFFSET_KIND, coord_y_offset),OP)
>>>>>>>>>>>
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, z_varID, 'long_name', 6_MPI_OFFSET_KIND, 'height'),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, z_varID, 'units', 1_MPI_OFFSET_KIND, 'm'),OP)
>>>>>>>>>>> ! call handle_err(nfmpi_put_att_text(NCID, z_varID, 'positive', 2_MPI_OFFSET_KIND, 'up'),OP)
>>>>>>>>>>>
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, t_varID, 'long_name', 4_MPI_OFFSET_KIND, coord_t_name),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, t_varID, 'units', 33_MPI_OFFSET_KIND, trim(coord_t_units)),OP)
>>>>>>>>>>> ! call handle_err(nfmpi_put_att_real(NCID, t_varID, 'add_offset', NF_FLOAT, 1_MPI_OFFSET_KIND, coord_t_offset),OP)
>>>>>>>>>>>
>>>>>>>>>>> ! OP = 'exit define mode NOW for testing ***********************' ! ah_testing ********************************
>>>>>>>>>>> ! call handle_err(nfmpi_enddef(NCID),OP)                          ! ah_testing ********************************
>>>>>>>>>>>
>>>>>>>>>>> ! dynamics data
>>>>>>>>>>> OP = 'assign attributes - data'
>>>>>>>>>>> if (out_type == 1 .OR. DynMovie_flag) then
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_text(NCID, u_varID, 'long_name', 6_MPI_OFFSET_KIND, 'x_wind'),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_text(NCID, u_varID, 'units', 7_MPI_OFFSET_KIND, 'm sec-1'),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, u_varID, 'scale_factor', NF_FLOAT, 1_MPI_OFFSET_KIND, 1.),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, u_varID, 'add_offset', NF_FLOAT, 1_MPI_OFFSET_KIND, 0.0),OP)
>>>>>>>>>>>    call handle_err(nfmpi_put_att_real(NCID, u_varID, '_FillValue', NF_FLOAT, 1_MPI_OFFSET_KIND, undef),OP)
>>>>>>>>>>>
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_text(NCID, v_varID, 'long_name', 6_MPI_OFFSET_KIND, 'y_wind'),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_text(NCID, v_varID, 'units', 7_MPI_OFFSET_KIND, 'm sec-1'),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, v_varID, 'scale_factor', NF_FLOAT, 1_MPI_OFFSET_KIND, 1.),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, v_varID, 'add_offset', NF_FLOAT, 1_MPI_OFFSET_KIND, 0.0),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, v_varID, '_FillValue', NF_FLOAT, 1_MPI_OFFSET_KIND, undef),OP)
>>>>>>>>>>>
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_text(NCID, w_varID, 'long_name', 19_MPI_OFFSET_KIND, 'upward_air_velocity'),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_text(NCID, w_varID, 'units', 7_MPI_OFFSET_KIND, 'm sec-1'),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, w_varID, 'scale_factor', NF_FLOAT, 1_MPI_OFFSET_KIND, 1.),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, w_varID, 'add_offset', NF_FLOAT, 1_MPI_OFFSET_KIND, 0.0),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, w_varID, '_FillValue', NF_FLOAT, 1_MPI_OFFSET_KIND, undef),OP)
>>>>>>>>>>>
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_text(NCID, pnew_varID, 'long_name', 20_MPI_OFFSET_KIND, 'air_pressure_anomaly'),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_text(NCID, pnew_varID, 'units', 2_MPI_OFFSET_KIND, 'Pa'),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, pnew_varID, 'scale_factor', NF_FLOAT, 1_MPI_OFFSET_KIND, 1.),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, pnew_varID, 'add_offset', NF_FLOAT, 1_MPI_OFFSET_KIND, 0.0),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, pnew_varID, '_FillValue', NF_FLOAT, 1_MPI_OFFSET_KIND, undef),OP)
>>>>>>>>>>>
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_text(NCID, tetnew_varID, 'long_name', 25_MPI_OFFSET_KIND, 'air_potential_temperature'),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_text(NCID, tetnew_varID, 'units', 1_MPI_OFFSET_KIND, 'K'),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, tetnew_varID, 'scale_factor', NF_FLOAT, 1_MPI_OFFSET_KIND, 1.),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, tetnew_varID, 'add_offset', NF_FLOAT, 1_MPI_OFFSET_KIND, 0.0),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, tetnew_varID, '_FillValue', NF_FLOAT, 1_MPI_OFFSET_KIND, undef),OP)
>>>>>>>>>>>
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_text(NCID, tempnew_varID, 'long_name', 15_MPI_OFFSET_KIND, 'air_temperature'),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_text(NCID, tempnew_varID, 'units', 7_MPI_OFFSET_KIND, 'Celsius'),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, tempnew_varID, 'scale_factor', NF_FLOAT, 1_MPI_OFFSET_KIND, 1.),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, tempnew_varID, 'add_offset', NF_FLOAT, 1_MPI_OFFSET_KIND, -273.15),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, tempnew_varID, '_FillValue', NF_FLOAT, 1_MPI_OFFSET_KIND, undef),OP)
>>>>>>>>>>>
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_text(NCID, density_varID, 'long_name', 11_MPI_OFFSET_KIND, 'air_density'),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_text(NCID, density_varID, 'units', 6_MPI_OFFSET_KIND, 'kg m-3'),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, density_varID, 'scale_factor', NF_FLOAT, 1_MPI_OFFSET_KIND, 1.),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, density_varID, 'add_offset', NF_FLOAT, 1_MPI_OFFSET_KIND, 0.0),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, density_varID, '_FillValue', NF_FLOAT, 1_MPI_OFFSET_KIND, undef),OP)
>>>>>>>>>>>
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_text(NCID, turbhor_varID, 'long_name', 15_MPI_OFFSET_KIND, 'hor_turb_energy'),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_text(NCID, turbhor_varID, 'units', 8_MPI_OFFSET_KIND, 'm2 sec-2'),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, turbhor_varID, 'scale_factor', NF_FLOAT, 1_MPI_OFFSET_KIND, 1.),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, turbhor_varID, 'add_offset', NF_FLOAT, 1_MPI_OFFSET_KIND, 0.0),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, turbhor_varID, '_FillValue', NF_FLOAT, 1_MPI_OFFSET_KIND, undef),OP)
>>>>>>>>>>>
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_text(NCID, turbver_varID, 'long_name', 15_MPI_OFFSET_KIND, 'ver_turb_energy'),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_text(NCID, turbver_varID, 'units', 8_MPI_OFFSET_KIND, 'm2 sec-2'),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, turbver_varID, 'scale_factor', NF_FLOAT, 1_MPI_OFFSET_KIND, 1.),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, turbver_varID, 'add_offset', NF_FLOAT, 1_MPI_OFFSET_KIND, 0.0),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, turbver_varID, '_FillValue', NF_FLOAT, 1_MPI_OFFSET_KIND, undef),OP)
>>>>>>>>>>>
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_text(NCID, turblen_varID, 'long_name', 17_MPI_OFFSET_KIND, 'turb_length_scale'),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_text(NCID, turblen_varID, 'units', 1_MPI_OFFSET_KIND, 'm'),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, turblen_varID, 'scale_factor', NF_FLOAT, 1_MPI_OFFSET_KIND, 1.),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, turblen_varID, 'add_offset', NF_FLOAT, 1_MPI_OFFSET_KIND, 0.0),OP)
>>>>>>>>>>> !  call handle_err(nfmpi_put_att_real(NCID, turblen_varID, '_FillValue', NF_FLOAT, 1_MPI_OFFSET_KIND, undef),OP)
>>>>>>>>>>> end if
>>>>>>>>>>>
>>>>>>>>>>> ! tracer data
>>>>>>>>>>> if (out_type == 1) then
>>>>>>>>>>> print *, 'NetCDF output type (1-picture,2-movie)
>>>>>>>>>>> ',out_type !
>>>>>>>>>>> ##### DO NOT REMOVE!!! optimization problem, trace source
>>>>>>>>>>> do i=1,ntgas_picture
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, tgas_varID(i),
>>>>>>>>>>> 'long_name', 25_MPI_OFFSET_KIND,
>>>>>>>>>>> des_tgas_picture(i)),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, tgas_varID(i),
>>>>>>>>>>> 'units', 9_MPI_OFFSET_KIND, 'gram
>>>>>>>>>>> kg-1'),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_real(NCID, tgas_varID(i),
>>>>>>>>>>> 'scale_factor',NF_FLOAT, 1_MPI_OFFSET_KIND, 1000.),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_real(NCID, tgas_varID(i),
>>>>>>>>>>> 'add_offset', NF_FLOAT, 1_MPI_OFFSET_KIND, 0.0),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_real(NCID, tgas_varID(i),
>>>>>>>>>>> '_FillValue', NF_FLOAT, 1_MPI_OFFSET_KIND, undef),OP)
>>>>>>>>>>> enddo
>>>>>>>>>>> do i=1,ntrac_picture
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, trac_varID(i),
>>>>>>>>>>> 'long_name', 25_MPI_OFFSET_KIND,
>>>>>>>>>>> des_trac_picture(i)),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, trac_varID(i),
>>>>>>>>>>> 'units', 9_MPI_OFFSET_KIND, 'gram
>>>>>>>>>>> kg-1'),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_real(NCID, trac_varID(i),
>>>>>>>>>>> 'scale_factor',NF_FLOAT, 1_MPI_OFFSET_KIND, 1000.),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_real(NCID, trac_varID(i),
>>>>>>>>>>> 'add_offset', NF_FLOAT, 1_MPI_OFFSET_KIND, 0.0),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_real(NCID, trac_varID(i),
>>>>>>>>>>> '_FillValue', NF_FLOAT, 1_MPI_OFFSET_KIND, undef),OP)
>>>>>>>>>>> enddo
>>>>>>>>>>> do i=1,ntpas_picture
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, tpas_varID(i),
>>>>>>>>>>> 'long_name', 25_MPI_OFFSET_KIND,
>>>>>>>>>>> des_tpas_picture(i)),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, tpas_varID(i),
>>>>>>>>>>> 'units', 1_MPI_OFFSET_KIND, '%'),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_real(NCID, tpas_varID(i),
>>>>>>>>>>> 'scale_factor',NF_FLOAT, 1_MPI_OFFSET_KIND, 100.),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_real(NCID, tpas_varID(i),
>>>>>>>>>>> 'add_offset', NF_FLOAT, 1_MPI_OFFSET_KIND, 0.0),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_real(NCID, tpas_varID(i),
>>>>>>>>>>> '_FillValue', NF_FLOAT, 1_MPI_OFFSET_KIND, undef),OP)
>>>>>>>>>>> enddo
>>>>>>>>>>> do i=1,ntrac_picture
>>>>>>>>>>> des='deposited_'//trim(des_trac_picture(i))
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, d_trac_varID(i),
>>>>>>>>>>> 'long_name', 35_MPI_OFFSET_KIND, des),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, d_trac_varID(i),
>>>>>>>>>>> 'units', 9_MPI_OFFSET_KIND, 'gram m-2'),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_real(NCID, d_trac_varID(i),
>>>>>>>>>>> 'scale_factor',NF_FLOAT, 1_MPI_OFFSET_KIND, 1000.),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_real(NCID, d_trac_varID(i),
>>>>>>>>>>> 'add_offset', NF_FLOAT, 1_MPI_OFFSET_KIND, 0.0),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_real(NCID, d_trac_varID(i),
>>>>>>>>>>> '_FillValue', NF_FLOAT, 1_MPI_OFFSET_KIND, undef),OP)
>>>>>>>>>>> enddo
>>>>>>>>>>> else if (out_type == 2) then
>>>>>>>>>>> print *, 'NetCDF output type (1-picture,2-movie)
>>>>>>>>>>> ',out_type !
>>>>>>>>>>> ##### DO NOT REMOVE!!! optimization problem, trace source
>>>>>>>>>>> do i=1,ntgas_movie
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, tgas_varID(i),
>>>>>>>>>>> 'long_name', 25_MPI_OFFSET_KIND,
>>>>>>>>>>> des_tgas_movie(i)),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, tgas_varID(i),
>>>>>>>>>>> 'units', 9_MPI_OFFSET_KIND, 'gram
>>>>>>>>>>> kg-1'),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_real(NCID, tgas_varID(i),
>>>>>>>>>>> 'scale_factor',NF_FLOAT, 1_MPI_OFFSET_KIND, 1000.),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_real(NCID, tgas_varID(i),
>>>>>>>>>>> 'add_offset', NF_FLOAT, 1_MPI_OFFSET_KIND, 0.0),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_real(NCID, tgas_varID(i),
>>>>>>>>>>> '_FillValue', NF_FLOAT, 1_MPI_OFFSET_KIND, undef),OP)
>>>>>>>>>>> enddo
>>>>>>>>>>> do i=1,ntrac_movie
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, trac_varID(i), &
>>>>>>>>>>> 'long_name', 25_MPI_OFFSET_KIND, des_trac_movie(i)),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, trac_varID(i), &
>>>>>>>>>>> 'units', 9_MPI_OFFSET_KIND, 'gram kg-1'),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_real(NCID, trac_varID(i), &
>>>>>>>>>>> 'scale_factor',NF_FLOAT, 1_MPI_OFFSET_KIND, 1000.),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_real(NCID, trac_varID(i), &
>>>>>>>>>>> 'add_offset', NF_FLOAT, 1_MPI_OFFSET_KIND, 0.0),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_real(NCID, trac_varID(i), &
>>>>>>>>>>> '_FillValue', NF_FLOAT, 1_MPI_OFFSET_KIND, undef),OP)
>>>>>>>>>>> enddo
>>>>>>>>>>> do i=1,ntpas_movie
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, tpas_varID(i), &
>>>>>>>>>>> 'long_name', 25_MPI_OFFSET_KIND, des_tpas_movie(i)),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_text(NCID, tpas_varID(i), &
>>>>>>>>>>> 'units', 1_MPI_OFFSET_KIND, '%'),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_real(NCID, tpas_varID(i), &
>>>>>>>>>>> 'scale_factor',NF_FLOAT, 1_MPI_OFFSET_KIND, 100.),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_real(NCID, tpas_varID(i), &
>>>>>>>>>>> 'add_offset', NF_FLOAT, 1_MPI_OFFSET_KIND, 0.0),OP)
>>>>>>>>>>> call handle_err(nfmpi_put_att_real(NCID, tpas_varID(i), &
>>>>>>>>>>> '_FillValue', NF_FLOAT, 1_MPI_OFFSET_KIND, undef),OP)
>>>>>>>>>>> enddo
>>>>>>>>>>> end if
>>>>>>>>>>>
>>>>>>>>>>> ! === create netCDF dataset: exit define mode, enter data mode ----!
>>>>>>>>>>> OP = 'exit define mode'
>>>>>>>>>>> call handle_err(nfmpi_enddef(NCID),OP)
>>>>>>>>>>>
>>>>>>>>>>> ! --- write data values --------------------!
>>>>>>>>>>> ! dimensions (only one process writes the whole dimension,
>>>>>>>>>>> ! i.e. across all processes)
>>>>>>>>>>> OP = 'write data values - dimensions'
>>>>>>>>>>>
>>>>>>>>>>> call handle_err(nfmpi_begin_indep_data(NCID))
>>>>>>>>>>>
>>>>>>>>>>> if (myrank==0) then
>>>>>>>>>>> call handle_err(nfmpi_put_vara_real(NCID, x_varID, &
>>>>>>>>>>> (/1_MPI_OFFSET_KIND/), (/ntx_mpi/), real(xg(2:ntx-1),real4)),OP)
>>>>>>>>>>> if (nty>4) then
>>>>>>>>>>> call handle_err(nfmpi_put_vara_real(NCID, y_varID, &
>>>>>>>>>>> (/1_MPI_OFFSET_KIND/), (/nty_mpi/), real(yg(2:nty-1),real4)),OP)
>>>>>>>>>>> else
>>>>>>>>>>> call handle_err(nfmpi_put_vara_real(NCID, y_varID, &
>>>>>>>>>>> (/1_MPI_OFFSET_KIND/), (/1_MPI_OFFSET_KIND/), real(yg(nty-1),real4)),OP)
>>>>>>>>>>> end if
>>>>>>>>>>> call handle_err(nfmpi_put_vara_real(NCID, z_varID, &
>>>>>>>>>>> (/1_MPI_OFFSET_KIND/), (/nz_mpi/), real(z,real4)),OP)
>>>>>>>>>>> end if
>>>>>>>>>>>
>>>>>>>>>>> call handle_err(nfmpi_end_indep_data(NCID))
>>>>>>>>>>>
>>>>>>>>>>> contains
>>>>>>>>>>> ! ----------------------------------------------------------------
>>>>>>>>>>> subroutine gridconvert(cd_scale,cd_x_offset,cd_x_name,cd_x_lname,cd_x_units, &
>>>>>>>>>>> cd_y_offset,cd_y_name,cd_y_lname,cd_y_units, &
>>>>>>>>>>> cd_z_name,cd_t_offset,cd_t_name,cd_t_units)
>>>>>>>>>>> ! IF: cd_* = coordinate parameters
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> !--------------------------------------------------------------!
>>>>>>>>>>> ! linear conversion from cartesian grid into lon/lat system    !
>>>>>>>>>>> ! only valid as first approximation over small domains         !
>>>>>>>>>>> ! to be replaced once a dyn core in a lat/lon system available !
>>>>>>>>>>> !--------------------------------------------------------------!
>>>>>>>>>>>
>>>>>>>>>>> use atham_module, only : deglat_start, deglon_start
>>>>>>>>>>>
>>>>>>>>>>> use phys_constants, only : pi
>>>>>>>>>>>
>>>>>>>>>>> real(real4), intent(inout) :: cd_scale, cd_x_offset, cd_y_offset, cd_t_offset
>>>>>>>>>>> character(len=3), intent(inout) :: cd_x_name, cd_y_name, cd_z_name
>>>>>>>>>>> character(len=4), intent(inout) :: cd_t_name
>>>>>>>>>>> character(len=9), intent(inout) :: cd_x_lname, cd_y_lname
>>>>>>>>>>> character(len=13), intent(inout) :: cd_x_units, cd_y_units
>>>>>>>>>>> character(len=33), intent(inout) :: cd_t_units
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> !--------------------------------------------------------------!
>>>>>>>>>>> ! local variables                                              !
>>>>>>>>>>> !--------------------------------------------------------------!
>>>>>>>>>>>
>>>>>>>>>>> real(kreal) :: r
>>>>>>>>>>> real(kreal), parameter :: a = 6378137._kreal     ! (WGS 84)
>>>>>>>>>>> real(kreal), parameter :: b = 6356752.3142_kreal ! (WGS 84)
>>>>>>>>>>>
>>>>>>>>>>> ! ellipsoid radius at ATHAM domain start
>>>>>>>>>>> r = a*b / sqrt((b*b*(cos(deglat_start*pi/180._kreal))**2 ) &
>>>>>>>>>>>     +(a*a*(sin(deglat_start*pi/180._kreal))**2 ))
>>>>>>>>>>>
>>>>>>>>>>> cd_scale = real((1/r) * (180._kreal/pi),real4)
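>>>>>>>>>>> ! worked example (illustrative values only): at deglat_start = 0,
>>>>>>>>>>> ! r ~ a = 6378137 m, so cd_scale ~ (180/pi)/6378137 ~ 8.98e-6
>>>>>>>>>>> ! degrees per metre, i.e. roughly 111 km per degree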
>>>>>>>>>>>
>>>>>>>>>>> cd_x_offset = deglon_start
>>>>>>>>>>> cd_x_name = 'lon'
>>>>>>>>>>> cd_x_lname = 'longitude'
>>>>>>>>>>> cd_x_units = 'degrees_east '
>>>>>>>>>>>
>>>>>>>>>>> cd_y_offset = deglat_start
>>>>>>>>>>> cd_y_name = 'lat'
>>>>>>>>>>> cd_y_lname = 'latitude '
>>>>>>>>>>> cd_y_units = 'degrees_north'
>>>>>>>>>>>
>>>>>>>>>>> cd_z_name = 'lev'
>>>>>>>>>>>
>>>>>>>>>>> cd_t_offset = real((sim_t_hh*3600 + sim_t_mm*60 + sim_t_ss),real4)
>>>>>>>>>>> cd_t_name = 'date'
>>>>>>>>>>> cd_t_units = 'seconds since '//simdate//' 00:00:00 [local]'
>>>>>>>>>>> ! COARDS conventions require time zone from UTC to be specified
>>>>>>>>>>> ! not necessary for local area version of ATHAM
>>>>>>>>>>>
>>>>>>>>>>> end subroutine gridconvert
>>>>>>>>>>> ! ----------------------------------------------------------------
>>>>>>>>>>>
>>>>>>>>>>> end subroutine netcdf_define
>>>>>>>>>>> ! ==================================================================
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> ! ==================================================================
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> subroutine handle_err(status,op)
>>>>>>>>>>>
>>>>>>>>>>> use atham_module, only : myrank
>>>>>>>>>>>
>>>>>>>>>>> integer, intent (in ) :: status
>>>>>>>>>>> character(len=*), optional, intent (in ) :: op
>>>>>>>>>>>
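>>>>>>>>>>> ! op is optional, so it is printed only when present; the stop
>>>>>>>>>>> ! below is taken by every rank so that no rank is left blocked
>>>>>>>>>>> ! in a later collective call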
>>>>>>>>>>> if ( status /= nf_noerr ) then
>>>>>>>>>>> if (myrank==0) then
>>>>>>>>>>> print *, ""
>>>>>>>>>>> if (present(op)) print *, "P-NetCDF operation: ",trim(op)
>>>>>>>>>>> print *, trim(nfmpi_strerror(status))
>>>>>>>>>>> print *, "atham_pnetcdf error: program execution stopped"
>>>>>>>>>>> print *, ""
>>>>>>>>>>> end if
>>>>>>>>>>> stop
>>>>>>>>>>> end if
>>>>>>>>>>>
>>>>>>>>>>> return
>>>>>>>>>>>
>>>>>>>>>>> end subroutine handle_err
>>>>>>>>>>> ! ==================================================================
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> end module atham_pnetcdf
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>> !-*- F90 -*-
>>>>> program mpi_setup_netCDFout
>>>>>
>>>>> !--------------------------------------------------------------------!
>>>>> ! author:  Alex HOFFMANN                                             !
>>>>> ! email:   ah519 at cam.ac.uk                                        !
>>>>> ! date:    February 2009                                             !
>>>>> ! version: development                                               !
>>>>> !                                                                    !
>>>>> ! simple program to test mpi runs & netCDF output (serial & parallel)!
>>>>> !--------------------------------------------------------------------!
>>>>>
>>>>> ! test code designed to be used with the variables from the mesoscale
>>>>> ! atmospheric model ATHAM
>>>>> ! re-structured as stand-alone
>>>>> ! reads and writes as netCDF an index matrix for a 4x4 matrix in real4
>>>>>
>>>>> ! use precision
>>>>> ! use atham_module, only : nx,ny,ntx,nty,npx,npy
>>>>>
>>>>> implicit none
>>>>>
>>>>> !private
>>>>> !public
>>>>>
>>>>> ! --- mpi interface for Fortran 77 only, use header files-----------
>>>>> include 'mpif.h'
>>>>>
>>>>>
>>>>> ! --- precision ----------------------------------------------------
>>>>> integer, parameter :: real4 = selected_real_kind(6),  &
>>>>>                       real8 = selected_real_kind(12), &
>>>>>                       kreal = real8
>>>>> integer, parameter :: int4 = selected_int_kind(7),  &
>>>>>                       int8 = selected_int_kind(13), &
>>>>>                       kint = int4
>>>>> ! --- atham_module -------------------------------------------------
>>>>> integer(kint) :: nx,ny,ntx,nty,npx,npy
>>>>> integer(kint) :: my_ranks, nprocs, &
>>>>>                  my_cart, myrank, mycoor(2)
>>>>> logical :: mpi_initialized, mpi
>>>>> ! --- other --------------------------------------------------------
>>>>> integer(kint) :: ierror
>>>>> real(real4), allocatable, dimension(:,:) :: array_g, array
>>>>>
>>>>>
>>>>>
>>>>> mpi_initialized=.false.
>>>>> ! --- for initial testing--------------------------------------------
>>>>> nx = 4_kint ! user-specified number of grid points along x
>>>>> ny = 4_kint
>>>>> npx = 2_kint ! number of processes along x
>>>>> npy = 2_kint
>>>>>
>>>>> ntx = nx ! re-adjusted total number of grid points along x
>>>>> nty = ny
>>>>> nx = nx/npx ! number of grid points along x per process
>>>>> ny = ny/npy
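>>>>> ! (with the values above: ntx = nty = 4 globally,
>>>>> ! and nx = ny = 2 per process on the 2x2 process grid)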
>>>>>
>>>>>
>>>>> ! --- MAIN
>>>>> ! --- mpi setup calls ----------------------------------------------
>>>>> call initialize_mpi
>>>>> call setup_mpi
>>>>>
>>>>> ! --- netCDF out calls ---------------------------------------------
>>>>> call nc_out_s
>>>>> call nc_out_p
>>>>>
>>>>> ! --- mpi finalize call --------------------------------------------
>>>>> call mpi_Finalize(ierror)
>>>>> ! --- END MAIN
>>>>>
>>>>> contains
>>>>> ! ==================================================================
>>>>> subroutine initialize_mpi()
>>>>>
>>>>> ! use atham_module, only : mpi, mpi_initialized
>>>>> ! use atham_module, only : myrank, nprocs
>>>>>
>>>>> integer(kint) :: ierror, errorclass
>>>>> character(len=100) :: message
>>>>>
>>>>> ! assign variables for serial version
>>>>> ierror=0
>>>>> myrank=0
>>>>> nprocs=1
>>>>> mpi=.false.
>>>>> ! initialize MPI
>>>>> mpi=.true.
>>>>> if (.not. mpi_initialized) call mpi_init(ierror)
>>>>> call MPI_Comm_rank(mpi_comm_world,myrank,ierror)
>>>>> call MPI_Comm_size(mpi_comm_world,nprocs,ierror)
>>>>> mpi=.true.
>>>>> mpi_initialized=.true.
>>>>>
>>>>> if (myrank==0) then
>>>>> print *, 'Myrank 0: ',myrank
>>>>> print *, 'MPI initialized!'
>>>>> print *, 'Number of processes [nprocs]: ',nprocs
>>>>> endif
>>>>>
>>>>> if (ierror>0) then
>>>>> call MPI_Error_string(ierror,message,100)
>>>>> call MPI_Error_class(ierror,errorclass)
>>>>> call exec_stop(message,errorclass)
>>>>> endif
>>>>>
>>>>> end subroutine initialize_mpi
>>>>> ! ==================================================================
>>>>> subroutine setup_mpi()
>>>>>
>>>>> ! use atham_module, only : mpi, mpi_initialized
>>>>> ! use atham_module, only : myrank,nprocs,my_cart,mycoor
>>>>>
>>>>> integer(kint) :: ierror, errorclass
>>>>> character(len=100) :: message
>>>>> integer(kint) :: idims(2),shape_array(2)
>>>>> logical :: reorder
>>>>> integer(kint) :: i,j,istart,jstart
>>>>>
>>>>> if (nprocs/=npx*npy) then
>>>>> print *, 'nprocs /= npx*npy - execution aborted'
>>>>> call mpi_abort(mpi_comm_world,ierror)
>>>>> stop
>>>>> endif
>>>>> ! create virtual topology
>>>>> idims(1:2) = (/npx,npy/)
>>>>> reorder = .true.
>>>>>
>>>>> call MPI_Cart_create(mpi_comm_world,2,idims, &
>>>>>      (/.false.,.false./),reorder,my_cart,ierror)
>>>>>
>>>>>
>>>>> call MPI_Comm_rank(my_cart,myrank,ierror) ! ranks on new communicator
>>>>> call MPI_Cart_coords(my_cart,myrank,2,mycoor,ierror)
>>>>>
>>>>> if(myrank==0) then
>>>>> print *, 'Coords of process ',myrank,' :',mycoor
>>>>> endif
>>>>>
>>>>> if (ierror>0) then
>>>>> call MPI_Error_string(ierror,message,100)
>>>>> call MPI_Error_class(ierror,errorclass)
>>>>> call exec_stop(message,errorclass)
>>>>> endif
>>>>>
>>>>> ! allocate and assign global grid on each process
>>>>> allocate(array_g(ntx,nty))
>>>>>
>>>>> array_g(1,:) = (/ 11.0,12.0,13.0,14.0 /)
>>>>> array_g(2,:) = (/ 21.0,22.0,23.0,24.0 /)
>>>>> array_g(3,:) = (/ 31.0,32.0,33.0,34.0 /)
>>>>> array_g(4,:) = (/ 41.0,42.0,43.0,44.0 /)
>>>>> if (myrank==0) then
>>>>> print *, 'whole 4*4 array on each process:'
>>>>> print *, array_g(1,:)
>>>>> print *, array_g(2,:)
>>>>> print *, array_g(3,:)
>>>>> print *, array_g(4,:)
>>>>> endif
>>>>>
>>>>> ! allocate and assign subgrid portions on various processes
>>>>> allocate(array(nx,ny))
>>>>> istart = mycoor(1)*nx
>>>>> jstart = mycoor(2)*ny
>>>>>
>>>>> do i=1,nx
>>>>> do j=1,ny
>>>>> array(i,j) = array_g(istart+i,jstart+j)
>>>>> enddo
>>>>> enddo
>>>>>
>>>>> ! output results from one process
>>>>> if (myrank==0) then
>>>>> print *, ''
>>>>> print *, 'subgrid array on process with coords ',mycoor,':'
>>>>> shape_array = shape(array)
>>>>> print *, 'shape: ',shape_array
>>>>> do i=1,shape_array(1)
>>>>> print *, array(i,:)
>>>>> enddo
>>>>> print *, ''
>>>>> endif
>>>>>
>>>>> end subroutine setup_mpi
>>>>> ! ==================================================================
>>>>> subroutine nc_out_s()
>>>>>
>>>>> use netcdf
>>>>>
>>>>> ! use atham_module, only : makename
>>>>> ! use atham_module, only : mycoor
>>>>> ! use atham_module, only : nx,ny
>>>>>
>>>>> integer(kint) :: NCID
>>>>> character(len=10) :: name
>>>>> integer(kint) :: x_dimID,y_dimID,x_varID,y_varID,array_varID
>>>>>
>>>>>
>>>>> call makename('out',mycoor,name)
>>>>> name = trim(name)//'.nc'
>>>>>
>>>>> call handle_err_nc_s(nf90_create(path = './'//trim(name), cmode = 0, ncid = NCID))
>>>>> call handle_err_nc_s(nf90_def_dim(ncid = NCID, name = 'x', len = nx, dimid = x_dimID))
>>>>> call handle_err_nc_s(nf90_def_dim(ncid = NCID, name = 'y', len = ny, dimid = y_dimID))
>>>>> call handle_err_nc_s(nf90_def_var(ncid = NCID, name = 'x', xtype = NF90_FLOAT, &
>>>>>      dimids = x_dimID, varid = x_varID))
>>>>> call handle_err_nc_s(nf90_def_var(ncid = NCID, name = 'y', xtype = NF90_FLOAT, &
>>>>>      dimids = y_dimID, varid = y_varID))
>>>>> call handle_err_nc_s(nf90_def_var(ncid = NCID, name = 'array', xtype = NF90_FLOAT, &
>>>>>      dimids = (/x_dimID,y_dimID/), varid = array_varID))
>>>>> call handle_err_nc_s(nf90_enddef(ncid = NCID))
>>>>> call handle_err_nc_s(nf90_put_var(ncid = NCID, varid = array_varID, values = array))
>>>>> call handle_err_nc_s(nf90_close(ncid = NCID))
>>>>>
>>>>> end subroutine nc_out_s
>>>>> ! ==================================================================
>>>>> subroutine nc_out_p
>>>>>
>>>>> ! use atham_module, only : mycoor,my_cart,myrank
>>>>> ! use atham_module, only : nx,ny
>>>>>
>>>>> ! precompiler problem: include file had to be hardwired into code
>>>>> ! include 'pnetcdf.inc'
>>>>> #include "/usr/local/parallel-netcdf-1.0.1/include/pnetcdf.inc"
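>>>>> ! (assumption: with preprocessing enabled, e.g. the -fpp flag left
>>>>> ! commented out in the Makefile below, a plain
>>>>> ! #include "pnetcdf.inc" should be found via -I$(PNETCDF)/include)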
>>>>>
>>>>>
>>>>> integer(kint) :: NCIDP
>>>>> character(len=10) :: name
>>>>> integer(kint) :: x_dimIDP,y_dimIDP,x_varIDP,y_varIDP,array_varIDP
>>>>> integer(kint) :: dimIDPs(2)
>>>>> integer(MPI_OFFSET_KIND) :: dimlenx,dimleny
>>>>> integer(MPI_OFFSET_KIND) :: istart,jstart
>>>>> integer(MPI_OFFSET_KIND) :: ntx_mpi,nty_mpi,nx_mpi,ny_mpi
>>>>>
>>>>> ntx_mpi = ntx
>>>>> nty_mpi = nty
>>>>> nx_mpi = nx
>>>>> ny_mpi = ny
>>>>>
>>>>> name = 'out_glb.nc'
>>>>> istart = mycoor(1)*nx_mpi + 1
>>>>> jstart = mycoor(2)*ny_mpi + 1
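>>>>> ! note: the Fortran pnetCDF calls take 1-based start indices,
>>>>> ! hence the "+ 1" (the C interface counts from 0)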
>>>>>
>>>>>
>>>>> call handle_err_nc_p(nfmpi_create(my_cart,'./'//trim(name), &
>>>>>      0,MPI_INFO_NULL,NCIDP),0)
>>>>>
>>>>>
>>>>>
>>>>> call handle_err_nc_p(nfmpi_def_dim(NCIDP,'x',ntx_mpi,x_dimIDP),1)
>>>>>
>>>>> call handle_err_nc_p(nfmpi_def_dim(NCIDP,'y',nty_mpi,y_dimIDP),2)
>>>>>
>>>>> call handle_err_nc_p(nfmpi_def_var(NCIDP,'x',NF_FLOAT,1,x_dimIDP,x_varIDP),3)
>>>>>
>>>>> call handle_err_nc_p(nfmpi_def_var(NCIDP,'y',NF_FLOAT,1,y_dimIDP,y_varIDP),4)
>>>>>
>>>>> call handle_err_nc_p(nfmpi_def_var(NCIDP,'array',NF_FLOAT,2, &
>>>>>      (/x_dimIDP,y_dimIDP/),array_varIDP),5)
>>>>>
>>>>>
>>>>>
>>>>> call handle_err_nc_p(nfmpi_enddef(NCIDP),6)
>>>>>
>>>>> if (myrank==0) then
>>>>> print *, ''
>>>>> print *, 'array in nc_out_p subroutine'
>>>>> print *, array
>>>>> print *, ''
>>>>> print *, '(/nx,ny/) ',(/nx,ny/)
>>>>> print *, '(/istart,jstart/) ',(/istart,jstart/)
>>>>> endif
>>>>>
>>>>> if (myrank==0) then
>>>>> print *, ''
>>>>> call handle_err_nc_p(nfmpi_inq_dimlen(NCIDP,x_dimIDP,dimlenx),7)
>>>>> call handle_err_nc_p(nfmpi_inq_dimlen(NCIDP,y_dimIDP,dimleny),8)
>>>>> print *, 'pnetCDF var dimensions in xy ',dimlenx,dimleny
>>>>> print *, 'ntx_mpi ',ntx_mpi
>>>>> print *, 'ntx ',ntx
>>>>> print *, 'nx_mpi ',nx_mpi
>>>>> print *, 'nx ',nx
>>>>> endif
>>>>>
>>>>> ! this should have been possible in collective data mode, but did
>>>>> ! not seem to work !!!
>>>>> call handle_err_nc_p(nfmpi_begin_indep_data(NCIDP),9)
>>>>>
>>>>> call handle_err_nc_p(nfmpi_put_vara_real(NCIDP,array_varIDP, &
>>>>>      (/istart,jstart/),(/nx_mpi,ny_mpi/), &
>>>>>      reshape(array,(/nx_mpi,ny_mpi/))),10)
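>>>>> ! for reference, the collective form (a sketch, untested in this
>>>>> ! test program) would drop the begin/end_indep_data pair and call
>>>>> ! the _all variant with identical arguments:
>>>>> ! call handle_err_nc_p(nfmpi_put_vara_real_all(NCIDP,array_varIDP, &
>>>>> !      (/istart,jstart/),(/nx_mpi,ny_mpi/), &
>>>>> !      reshape(array,(/nx_mpi,ny_mpi/))),10)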
>>>>>
>>>>>
>>>>>
>>>>> call handle_err_nc_p(nfmpi_end_indep_data(NCIDP),11)
>>>>>
>>>>> call handle_err_nc_p(nfmpi_close(NCIDP),12)
>>>>>
>>>>> end subroutine nc_out_p
>>>>> ! ==================================================================
>>>>> subroutine handle_err_nc_s(status)
>>>>>
>>>>> use netcdf
>>>>>
>>>>> integer, intent (in ) :: status
>>>>>
>>>>> if ( status /= nf90_noerr ) then
>>>>> print *, ""
>>>>> print *, trim(nf90_strerror(status))
>>>>> print *, "atham_netcdf error: program execution stopped"
>>>>> print *, ""
>>>>> stop
>>>>> end if
>>>>> return
>>>>> end subroutine handle_err_nc_s
>>>>> ! ==================================================================
>>>>> subroutine handle_err_nc_p(status,callID)
>>>>>
>>>>> ! use atham_module, only : myrank
>>>>>
>>>>> #include "/usr/local/parallel-netcdf-1.0.1/include/pnetcdf.inc"
>>>>>
>>>>> integer, intent (in ) :: status
>>>>> integer, intent (in ) :: callID
>>>>>
>>>>> if ( status /= nf_noerr ) then
>>>>> if (myrank==0) then
>>>>> print *, ""
>>>>> print *, trim(nfmpi_strerror(status))
>>>>> print *, "atham_pnetcdf error: program execution stopped
>>>>> ",callID
>>>>> print *, ""
>>>>> ! stop
>>>>> end if
>>>>> endif
>>>>> return
>>>>> end subroutine handle_err_nc_p
>>>>> ! ==================================================================
>>>>> subroutine exec_stop(MESSAGE,ERRORCLASS)
>>>>>
>>>>> character(len=*), intent(in ) :: MESSAGE
>>>>> integer(kint), intent(in ) :: ERRORCLASS
>>>>>
>>>>> integer(kint) :: ierror
>>>>>
>>>>> print *, 'ERROR: MPI execution stopped'
>>>>> print *, 'error class: ', ERRORCLASS
>>>>> print *, 'message: ', trim(MESSAGE)
>>>>> call mpi_abort(mpi_comm_world,ierror)
>>>>> stop
>>>>>
>>>>> end subroutine exec_stop
>>>>> ! ==================================================================
>>>>> subroutine makename(prefix,icoor,name)
>>>>> !------------------------------------------------------------------!
>>>>> ! create character name=prefix_(icoor(1)+1)_(icoor(2)+1)           !
>>>>> !------------------------------------------------------------------!
>>>>> character(len=*), intent(in) :: prefix
>>>>> integer(kint), dimension(2), intent(in) :: icoor
>>>>> character(len=*), intent(out) :: name
>>>>> !------------------------------------------------------------------!
>>>>> ! local variables                                                  !
>>>>> !------------------------------------------------------------------!
>>>>> character(len=25) :: numx,numy
>>>>> integer(kint) :: lenx,leny,lenf
>>>>>
>>>>> write(numx,'(i5)') icoor(1)+1
>>>>> numx=adjustl(numx)
>>>>> lenx=len_trim(numx)
>>>>>
>>>>> write(numy,'(i5)') icoor(2)+1
>>>>> numy=adjustl(numy)
>>>>> leny=len_trim(numy)
>>>>>
>>>>> name=adjustl(prefix)
>>>>> lenf=len_trim(name)
>>>>>
>>>>> name=name(1:lenf)//'_'//numx(1:lenx)//'_'//numy(1:leny)
>>>>>
>>>>> end subroutine makename
>>>>> ! ==================================================================
>>>>>
>>>>> end program mpi_setup_netCDFout
>>>>> # MAKEFILE used for development module testing
>>>>> # Alex Hoffmann
>>>>>
>>>>> # set netCDF path
>>>>> NETCDF = /usr/local/netcdf-3.6.2-ifort
>>>>> PNETCDF= /usr/local/parallel-netcdf-1.0.1
>>>>>
>>>>> # set compiler
>>>>> F90 = mpif90 # mpif90 refers to a shell script that includes links to the libraries
>>>>>
>>>>>
>>>>> # set compiler flags
>>>>> OFLAGS = -g
>>>>> # set include flags
>>>>> IFLAGS = -I$(NETCDF)/include/ -I$(PNETCDF)/include/
>>>>> # set pre-processing flags
>>>>> #PFLAGS = -fpp
>>>>> # concatenate flags
>>>>> FLAGS = $(OFLAGS) $(IFLAGS) $(PFLAGS)
>>>>> # set library flags
>>>>> LFLAGS = -L$(NETCDF)/lib -lnetcdf -L$(PNETCDF)/lib -lpnetcdf
>>>>>
>>>>> RM = rm -f
>>>>>
>>>>> #MAIN =
>>>>> MAIN = mpi_setup_netCDFout
>>>>> OBJECTS =
>>>>>
>>>>>
>>>>> all: $(MAIN)
>>>>>
>>>>>
>>>>> $(MAIN): $(MAIN).F90 $(OBJECTS)
>>>>> $(F90) $(FLAGS) -o $@ $^ $(LFLAGS)
>>>>>
>>>>> clean:
>>>>> $(RM) *.o *.mod
>>>>
>>>
>>
>
> --
> ______________________________________________________________________
>
> Alex HOFFMANN PhD Candidate
> Centre for Atmospheric Science
> Department of Geography University of Cambridge
> Downing Place, Cambridge CB2 3EN, UK
> e-mail: ah519 at cam.ac.uk tel: +44 (0)1223 766581
> www.geog.cam.ac.uk/people/hoffmann/
> ______________________________________________________________________
>