regression harness in pre-release of PnetCDF 1.9.0

Wei-keng Liao wkliao at eecs.northwestern.edu
Tue Nov 21 22:16:11 CST 2017


Hi, Carl

I have modified the test scripts, so the screen output of "make check"
looks closer to what you would like. You are welcome to try the latest
PnetCDF code in the SVN repo at:
https://svn.mcs.anl.gov/repos/parallel-netcdf/trunk

Wei-keng

On Nov 21, 2017, at 12:13 PM, Wei-keng Liao wrote:

> 
> PnetCDF uses a script-based test mechanism because, on some machines,
> MPI programs must be run through commands such as mpiexec or srun.
> In addition, the scripts let users specify the output folder for all
> tests, which can be a parallel file system on large machines, since
> writing the output files to local home folders can be very slow there.
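> 
> For illustration, a wrapper of this kind might look like the following
> (a simplified sketch; the variable and program names are illustrative,
> not the exact ones used in the PnetCDF scripts):
> 
>   #!/bin/sh
>   # TESTOUTDIR: folder for the test output files (e.g. a parallel file system)
>   # TESTMPIRUN: launcher command, e.g. "mpiexec -n 4" or "srun -n 4"
>   TESTOUTDIR=${TESTOUTDIR:-.}
>   TESTMPIRUN=${TESTMPIRUN:-mpiexec -n 4}
>   ${TESTMPIRUN} ./tst_program "${TESTOUTDIR}/testfile.nc" || exit 1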
> 
> In fact, NetCDF also uses scripts. Check this line from your screenshot:
>> PASS: run_par_tests.sh
> 
> 
> The script run_par_tests.sh under the h5_test folder contains three runs of ./tst_h_par.
> Similar scripts can be found in other folders, such as nc_test4, nc_test, etc.
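> 
> Conceptually such a script is only a few lines, e.g. (a sketch, not the
> actual netCDF script):
> 
>   #!/bin/sh
>   # run the same parallel test under several MPI process counts
>   for np in 1 2 4 ; do
>       mpiexec -n $np ./tst_h_par || exit 1
>   done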
> 
> Please note this is how automake reports test results when scripts are used.
> 
> Wei-keng
> 
> On Nov 21, 2017, at 9:32 AM, Carl Ponder wrote:
> 
>> On 11/21/2017 02:33 AM, Wei-keng Liao wrote:
>>> The "make check" error reporting mechanism has been changed after migrating to automake.
>>> It is now using the script-based style of automake. See automake guide in
>>> 
>>> http://www.gnu.org/software/automake/manual/html_node/Scripts_002dbased-Testsuites.html
>>> 
>>> 
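>>> The key point is that automake counts one test per entry in the TESTS
>>> variable of each folder's Makefile.am. Roughly (a sketch, with made-up
>>> program names for the test binaries):
>>> 
>>>   # Makefile.am sketch: the programs are built for "make check", but
>>>   # only the entries listed in TESTS are counted as tests
>>>   check_PROGRAMS = tst_a tst_b tst_c
>>>   TESTS = seq_runs.sh
>>> 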
>>> After running "make check", the output (stdout and stderr) of the test programs is stored
>>> in test/*/seq_runs.sh.log. There you will see the familiar output from 1.8.1.
>>> In the new design, the output of all test programs in the same folder goes to the
>>> same log file.
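>>> 
>>> For example, you can inspect those logs directly:
>>> 
>>>   # view the combined output of all tests in one folder
>>>   cat test/nc_test/seq_runs.sh.log
>>>   # or search every folder's log for failures
>>>   grep -i fail test/*/*.log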
>>> 
>>> Running "make check" stops immediately at the first failed test and
>>> prints the failure messages on screen. An example of a failed "make check" with error messages
>>> is given below. If you do not see any error/failure message at the end, then all tests
>>> passed. To make that more obvious, I will add a print statement at the end of
>>> "make" and "make check" to indicate whether the libraries have been built and whether all
>>> test programs have passed, respectively.
>>> 
>>> 
>> I always use
>> make -i -k check
>> to force all the tests to run. This way I get a full accounting of what worked and what didn't.
>> There have been cases with apps where I've been assured that some tests don't matter for various reasons, so I want to see if these are the only ones failing, or if the failures went away with the latest version of the source base.
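>> 
>> One way to still get the full accounting from a script-based harness is to capture the output and pull out the per-test lines, e.g. (plain make/grep usage, nothing PnetCDF-specific):
>> 
>>   make -i -k check 2>&1 | tee check.log
>>   grep -E '^(PASS|FAIL|SKIP|XFAIL|XPASS|ERROR):' check.log
>> 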
>> As far as the tool goes, I get this kind of output from NetCDF-C for example:
>> PASS: tst_h_strings2
>> PASS: tst_h_ints
>> FAIL: tst_h_dimscales
>> PASS: tst_h_dimscales1
>> PASS: tst_h_dimscales2
>> PASS: tst_h_dimscales3
>> PASS: tst_h_enums
>> PASS: tst_h_dimscales4
>> PASS: run_par_tests.sh
>> ============================================================================
>> Testsuite summary for netCDF 4.5.0
>> ============================================================================
>> # TOTAL: 27
>> # PASS:  26
>> # SKIP:  0
>> # XFAIL: 0
>> # FAIL:  1
>> # XPASS: 0
>> # ERROR: 0
>> so while it's running a battery of sub-tests here, it's giving me individual counts for them rather than just a total of one pass.
>> Are you using the same toolkit that they are?
>> 
>>        -- Carl
>> 
>> On 11/21/2017 02:33 AM, Wei-keng Liao wrote:
>>> Example screen output for a failed "make check":
>>> ...
>>> Making check in test
>>> Making check in common
>>> Making check in C
>>> PASS: seq_runs.sh
>>> ============================================================================
>>> Testsuite summary for parallel-netcdf 1.9.0.pre1
>>> ============================================================================
>>> # TOTAL: 1
>>> # PASS:  1
>>> # SKIP:  0
>>> # XFAIL: 0
>>> # FAIL:  0
>>> # XPASS: 0
>>> # ERROR: 0
>>> ============================================================================
>>> Making check in fandc
>>> Making check in nc_test
>>> FAIL: seq_runs.sh
>>> ============================================================================
>>> Testsuite summary for parallel-netcdf 1.9.0.pre1
>>> ============================================================================
>>> # TOTAL: 1
>>> # PASS:  0
>>> # SKIP:  0
>>> # XFAIL: 0
>>> # FAIL:  1
>>> # XPASS: 0
>>> # ERROR: 0
>>> ============================================================================
>>> See test/nc_test/test-suite.log
>>> Please report to 
>>> parallel-netcdf at mcs.anl.gov
>>> 
>>> ============================================================================
>>> make[5]: *** [test-suite.log] Error 1
>>> make[4]: *** [check-TESTS] Error 2
>>> make[3]: *** [check-am] Error 2
>>> make[2]: *** [check] Error 2
>>> make[1]: *** [check-recursive] Error 1
>>> make: *** [check-recursive] Error 1
>>> 
>> On Nov 21, 2017, at 1:38 AM, Carl Ponder wrote:
>>> A question on the regression tests: in this pre-release version, I see 10 regression tests being reported:
>>> # TOTAL: 1
>>> # TOTAL: 1
>>> # TOTAL: 1
>>> # TOTAL: 1
>>> # TOTAL: 1
>>> # TOTAL: 1
>>> # TOTAL: 0
>>> # TOTAL: 1
>>> # TOTAL: 1
>>> # TOTAL: 1
>>> # TOTAL: 1
>>> whereas in 1.8.1, there were 86 tests reported:
>>> *** TESTING C   nc_test for format CDF-1
>>> *** TESTING C   nc_test for format CDF-2
>>> *** TESTING C   nc_test for format CDF-5
>>> *** TESTING C   t_nc for emulating netCDF t_nc                     ------ pass
>>> *** TESTING C   tst_misc for emulating netCDF t_misc               ------ pass
>>> *** TESTING C   tst_norm for emulating netCDF tst_norm             ------ pass
>>> *** TESTING C   tst_small for emulating netCDF tst_small           ------ pass
>>> *** TESTING C   tst_names for emulating netCDF tst_names           ------ pass
>>> *** TESTING C   tst_atts3 for emulating netCDF tst_atts3           ------ pass
>>> *** TESTING C   tst_atts for emulating netCDF tst_atts             ------ pass
>>> *** TESTING C   tst_nofill for fill/nofill modes                   ------ pass
>>> *** TESTING C   pres_temp_4D_wr for writing file                   ------ pass
>>> *** TESTING C   pres_temp_4D_rd for reading file                   ------ pass
>>> *** TESTING C   mcoll_perf for mput/iput APIs                      ------ pass
>>> *** TESTING C   test_bput for bput API                             ------ pass
>>> *** TESTING C   interleaved for writing interleaved fileviews      ------ pass
>>> *** TESTING C   i_varn_int64 for iput/iget varn                    ------ pass
>>> *** TESTING C   flexible_bput for flexible bput_varm               ------ pass
>>> *** TESTING C   wait_after_indep for ncmpi_end_indep_data          ------ pass
>>> *** TESTING C   req_all for NC_REQ_ALL                             ------ pass
>>> *** TESTING C   i_varn_indef for iput/iget varn in define mode     ------ pass
>>> *** TESTING C   bput_varn for bput_varn                            ------ pass
>>> *** TESTING C   column_wise for iput/iget interleaved access       ------ pass
>>> *** TESTING F77 ./mcoll_testf77 for iput API                       ------ pass
>>> *** TESTING F77 ./test_bputf77 for bput_varm_real API              ------ pass
>>> *** TESTING F90 ./mcoll_testf for nf90mpi_iput_var API             ------ pass
>>> *** TESTING F90 ./test_bputf for bput_var                          ------ pass
>>> *** TESTING C   test_inq_format for inquiring CDF file formats     ------ pass
>>> *** TESTING C   cdf_type for CDF-5 type in CDF-1 and 2             ------ pass
>>> *** TESTING C   dim_cdf12 for defining dim in CDF-1/2 format       ------ pass
>>> *** TESTING C   ncmpi_vars_null_stride for NULL stride             ------ pass
>>> *** TESTING C   vectors for put_vara/get_vara                      ------ pass
>>> *** TESTING C   collective_error for collective abort              ------ pass
>>> *** TESTING C   test_varm for get/put varm                         ------ pass
>>> *** TESTING C   alignment_test for alignment                       ------ pass
>>> *** TESTING C   flexible for flexible put and get                  ------ pass
>>> *** TESTING C   flexible2 for flexible APIs                        ------ pass
>>> *** TESTING C   flexible_varm for flexible varm APIs               ------ pass
>>> *** TESTING C   nonblocking for using ncmpi_iput_vara_int()        ------ pass
>>> *** TESTING C   noclobber for NC_NOCLOBBER and NC_EEXIST           ------ pass
>>> *** TESTING C   record for write records in reversed order         ------ pass
>>> *** TESTING C   inq_num_vars for no. record/fixed variables        ------ pass
>>> *** TESTING C   varn_int for ncmpi_put_varn_int_all()              ------ pass
>>> *** TESTING C   modes for file create/open modes
>>> *** TESTING C   one_record for only one record variable            ------ pass
>>> *** TESTING C   inq_recsize for inquiring record size              ------ pass
>>> *** TESTING C   test_vard for flexible put and get                 ------ pass
>>> *** TESTING C   varn_contig for put_varn with contig fileview      ------ pass
>>> *** TESTING C   ivarn for ncmpi_iput_varn_<type>()                 ------ pass
>>> *** TESTING C   check_striping for striping info                   ------ pass
>>> *** TESTING C   add_var for checking offsets of new variables      ------ pass
>>> *** TESTING C   buftype_free for free buftype in flexible API      ------ pass
>>> *** TESTING C   last_large_var for last large var in CDF-1/2       ------ pass
>>> *** TESTING C   check_type for checking for type conflict          ------ pass
>>> *** TESTING C   test_erange for checking for NC_ERANGE             ------ pass
>>> *** TESTING C   scalar for get/put scalar variables                ------ pass
>>> *** TESTING C   redef1 for entering re-define mode                 ------ pass
>>> *** TESTING F77 ./varn_intf for varn API                           ------ pass
>>> *** TESTING F77 ./attrf for attribute overflow                     ------ pass
>>> *** TESTING F77 ./buftype_freef for flexible API                   ------ pass
>>> *** TESTING F77 ./put_parameter for using immutable write buf      ------ pass
>>> *** TESTING F77 ./test_vardf for vard API                          ------ pass
>>> *** TESTING F90 ./inq_num_varsf for no. record/fixed variables     ------ pass
>>> *** TESTING F90 ./inq_recsizef for inquiring record size           ------ pass
>>> *** TESTING F90 ./test_vardf90 for vard API                        ------ pass
>>> *** TESTING F90 ./varn_real for varn API                           ------ pass
>>> *** TESTING C   erange_fill for checking for type conflict         ------ pass
>>> *** TESTING C   redef1 for entering re-define mode                 ------ pass
>>> *** TESTING C++ nctst for APIs with different netCDF formats       ------ pass
>>> *** TESTING C++ test_classic for creation of classic format file   ------ pass
>>> *** TESTING F77 ./nf_test for CDF-1                                ------ pass
>>> *** TESTING F77 ./nf_test for CDF-2                                ------ pass
>>> *** TESTING F77 ./nf_test for CDF-5                                ------ pass
>>> *** TESTING F90 ./nf90_test for CDF-1                              ------ pass
>>> *** TESTING F90 ./nf90_test for CDF-2                              ------ pass
>>> *** TESTING F90 ./nf90_test for CDF-5                              ------ pass
>>> *** TESTING F90 ./test_intent for INTENT modifier                  ------ pass
>>> *** TESTING F90 ./tst_f90                                          ------ pass
>>> *** TESTING F90 ./f90tst_vars for def_var API                      ------ pass
>>> *** TESTING F90 ./tst_types2 for 64-bit integer types              ------ pass
>>> *** TESTING F90 ./tst_f90_cdf5                                     ------ pass
>>> *** TESTING F90 ./f90tst_vars2 for def_var API                     ------ pass
>>> *** TESTING F90 ./f90tst_vars3 for def_var API                     ------ pass
>>> *** TESTING F90 ./f90tst_vars4 for def_var API                     ------ pass
>>> *** TESTING F90 ./tst_flarge for large files                       ------ pass
>>> *** TESTING F90 ./tst_io                                           ------ pass
>>> In the 1.9.0 list, are you aggregating the results of more than one test into each "# TOTAL: 1"?
>>> For example, it looks like you're compiling several tests here but running a single driver, and reporting it as a single test:
>>> Making check in cdf_format
>>> make[2]: Entering directory `/cm/extra/apps/PNetCDF/1.9.0.pre1/PGI-17.10_OpenMPI-1.10.7_CUDA-9.0.176_Power/distro/test/cdf_format'
>>> make  test_inq_format cdf_type dim_cdf12 tst_open_cdf5 tst_corrupt
>>> make[3]: Entering directory `/cm/extra/apps/PNetCDF/1.9.0.pre1/PGI-17.10_OpenMPI-1.10.7_CUDA-9.0.176_Power/distro/test/cdf_format'
>>>  CC       test_inq_format.o
>>>  CCLD     test_inq_format
>>>  CC       cdf_type.o
>>>  CCLD     cdf_type
>>>  CC       dim_cdf12.o
>>>  CCLD     dim_cdf12
>>>  CC       tst_open_cdf5.o
>>>  CCLD     tst_open_cdf5
>>>  CC       tst_corrupt.o
>>>  CCLD     tst_corrupt
>>> make[3]: Leaving directory `/cm/extra/apps/PNetCDF/1.9.0.pre1/PGI-17.10_OpenMPI-1.10.7_CUDA-9.0.176_Power/distro/test/cdf_format'
>>> make  check-TESTS
>>> make[3]: Entering directory `/cm/extra/apps/PNetCDF/1.9.0.pre1/PGI-17.10_OpenMPI-1.10.7_CUDA-9.0.176_Power/distro/test/cdf_format'
>>> make[4]: Entering directory `/cm/extra/apps/PNetCDF/1.9.0.pre1/PGI-17.10_OpenMPI-1.10.7_CUDA-9.0.176_Power/distro/test/cdf_format'
>>> PASS: seq_runs.sh
>>> ============================================================================
>>> Testsuite summary for parallel-netcdf 1.9.0.pre1
>>> ============================================================================
>>> # TOTAL: 1
>>> # PASS:  1
>>> # SKIP:  0
>>> # XFAIL: 0
>>> # FAIL:  0
>>> # XPASS: 0
>>> # ERROR: 0
>>> ============================================================================
>>> This would make it difficult to identify individual failures.
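>>> 
>>> If I understand the automake convention correctly, the counts come from the TESTS variable in each folder's Makefile.am, so the difference amounts to something like this (my sketch of the two styles, using the program names above):
>>> 
>>>   # one script entry: automake reports "# TOTAL: 1"
>>>   TESTS = seq_runs.sh
>>> 
>>>   # one entry per program: automake would report "# TOTAL: 5"
>>>   TESTS = test_inq_format cdf_type dim_cdf12 tst_open_cdf5 tst_corrupt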
>>> 
> 


