[petsc-dev] every test example runs in a new directory with new test harness

Barry Smith bsmith at mcs.anl.gov
Sun Feb 5 21:57:42 CST 2017


  Easier just to have the same example reload the file it generated and run with that.
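
  Something like the following, as a sketch (it assumes the example is
  changed so that a -restart run first views sol.h5 and then reloads it
  within the same execution, so no -f pointing at another test's output
  is needed):

  test:
    suffix: restart_0
    requires: hdf5
    args: -run_type test -refinement_limit 0.0 -bc_type dirichlet -interpolate 1 -petscspace_order 1 -dm_view hdf5:sol.h5 -vec_view hdf5:sol.h5::append -restart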

> On Feb 5, 2017, at 9:52 PM, Satish Balay <balay at mcs.anl.gov> wrote:
> 
> Probably not [yet]. One potential issue is that 'nsize' is listed separately [from mpiexec]...
> 
> Satish
> 
> On Sun, 5 Feb 2017, Barry Smith wrote:
> 
>> 
>>  Does the test harness support this? If so, it sounds like an OK solution.
>> 
>> 
>>> On Feb 5, 2017, at 9:41 PM, Satish Balay <balay at mcs.anl.gov> wrote:
>>> 
>>> Perhaps the following could be converted into a single run script with
>>> two mpiexec commands (see the sketch after the block).
>>> 
>>>>>>>>>>>> 
>>> test:
>>>   suffix: restart_0
>>>   requires: hdf5
>>>   args: -run_type test -refinement_limit 0.0    -bc_type dirichlet -interpolate 1 -petscspace_order 1 -dm_view hdf5:sol.h5 -vec_view hdf5:sol.h5::append
>>>   args: -run_type test -refinement_limit 0.0    -bc_type dirichlet -interpolate 1 -petscspace_order 1 -f sol.h5 -restart
>>> <<<<<<<
>>> 
>>> Satish
>>> 
>>> On Sun, 5 Feb 2017, Barry Smith wrote:
>>> 
>>>> 
>>>> test:
>>>>   suffix: restart_0
>>>>   requires: hdf5
>>>>   args: -run_type test -refinement_limit 0.0    -bc_type dirichlet -interpolate 1 -petscspace_order 1 -dm_view hdf5:sol.h5 -vec_view hdf5:sol.h5::append
>>>> 
>>>> test:
>>>>   suffix: restart_1
>>>>   requires: hdf5
>>>>   args: -run_type test -refinement_limit 0.0    -bc_type dirichlet -interpolate 1 -petscspace_order 1 -f sol.h5 -restart
>>>> 
>>>> See a problem?
>>>> 
>>>> Should the same run of the example view the files and then load them back in? The alternative is trying to read in a data file from another run that may not even have been created yet, and even if it was, the file was definitely created in a different directory.
>>>> 
>>>> I think that if you want to do this kind of test, the same example must first write the file and then read it back in. With parallel scheduling, we cannot assume any ordering relationship between different test runs.
>>>> 
>>>> Barry
>>>> 
>>>> 
>>> 
>> 
>> 
> 



