[petsc-dev] VecGetArrayAndMemType

Mark Adams mfadams at lbl.gov
Thu Jun 24 06:00:21 CDT 2021


OK, this is what I get with bisect:

(base) 06:54 (c3b2925bfb...)|BISECTING ~/Codes/petsc2$ git bisect good
The merge base c3b2925bfbe1c0a0b3a69c9a76295d0747e33d47 is bad.
This means the bug has been fixed between c3b2925bfbe1c0a0b3a69c9a76295d0747e33d47 and [09da24df01e50defd94bc4f7396f866a808ecea5 c3b2925bfbe1c0a0b3a69c9a76295d0747e33d47].

(base) 06:56 (v3.15.1) ~/Codes/petsc2$ git checkout c3b2925bfbe1c0a0b3a69c9a76295d0747e33d47
Previous HEAD position was 09da24df01 Increase patchlevel to 3.15.1
HEAD is now at c3b2925bfb Merge branch 'knepley-main-patch-78504' into
'release'
(base) 06:56 (c3b2925bfb...) ~/Codes/petsc2$

On Wed, Jun 23, 2021 at 11:14 PM Junchao Zhang <junchao.zhang at gmail.com>
wrote:

> Mark,
>   I am not sure what your problem is.  If it is a regression, can you
> bisect it?
> --Junchao Zhang
>
>
> On Wed, Jun 23, 2021 at 4:04 PM Mark Adams <mfadams at lbl.gov> wrote:
>
>> I also tried commenting out the second VecView, so there should be just
>> one step in the file, but the .h5 file is only 8 bytes smaller and the
>> .xmf file goes from 5373 bytes to 3090 bytes.
>>
>> On Wed, Jun 23, 2021 at 4:01 PM Mark Adams <mfadams at lbl.gov> wrote:
>>
>>> It is not a device issue but it is a regression.
>>>
>>> Landau ex1 is tiny and just calls VecView before and after the TSSolve,
>>> which takes one time step. If you add "-dm_view hdf5:f.h5 -vec_view
>>> hdf5:f.h5::append -dm_landau_Ez 10." to landau/ex1 (see below), you get
>>> an .h5 file with two time steps, as it should be.
>>> This is a huge electric field, Ez=10, which visibly pulls the electron
>>> distribution (u_e) off center.
>>> In VisIt, however, both time steps have identical data that is clearly
>>> from after the solve and not from the initial condition (see attached).
>>>
>>> I ran this again with -ex1_ts_max_steps 0 and got the expected result:
>>> two steps/frames with the symmetric initial condition in both. This is
>>> correct behavior.
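>>>
>>> For reference, the I/O pattern those options drive is essentially the
>>> following (a minimal sketch, not the actual ex1 code; X and ts are the
>>> solution Vec and TS, and the timestep sequencing that HDF5 viewing does
>>> through the DM is omitted):
>>>
>>>   PetscViewer    viewer;
>>>   PetscErrorCode ierr;
>>>   ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD,"f.h5",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
>>>   ierr = VecView(X,viewer);CHKERRQ(ierr); /* frame 0: initial condition */
>>>   ierr = TSSolve(ts,X);CHKERRQ(ierr);     /* take the one time step */
>>>   ierr = VecView(X,viewer);CHKERRQ(ierr); /* frame 1: should be the solved state */
>>>   ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);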
>>>
>>> Any ideas?
>>> Thanks
>>>
>>> diff --git a/src/ts/utils/dmplexlandau/tutorials/ex1.c b/src/ts/utils/dmplexlandau/tutorials/ex1.c
>>> index 9e4c8f1b61..31dfda2fad 100644
>>> --- a/src/ts/utils/dmplexlandau/tutorials/ex1.c
>>> +++ b/src/ts/utils/dmplexlandau/tutorials/ex1.c
>>> @@ -66,6 +66,6 @@ int main(int argc, char **argv)
>>>    test:
>>>      suffix: 0
>>>      requires: p4est !complex
>>> -    args: -petscspace_degree 3 -petscspace_poly_tensor 1
>>> -dm_landau_type p4est -dm_landau_ion_masses 2,4 -dm_landau_ion_charges 1,18
>>> -dm_landau_thermal_temps 5,5,.5 -dm_landau_n 1.00018,1,1e-5 -dm_landau_n_0
>>> 1e20 -ex1_ts_monitor -ex1_snes_rtol 1.e-14 -ex1_snes_stol 1.e-14
>>> -ex1_snes_monitor -ex1_snes_converged_reason -ex1_ts_type arkimex
>>> -ex1_ts_arkimex_type 1bee -ex1_ts_max_snes_failures -1 -ex1_ts_rtol 1e-1
>>> -ex1_ts_dt 1.e-1 -ex1_ts_max_time 1 -ex1_ts_adapt_clip .5,1.25
>>> -ex1_ts_adapt_scale_solve_failed 0.75
>>> -ex1_ts_adapt_time_step_increase_delay 5 -ex1_ts_max_steps 1 -ex1_pc_type
>>> lu -ex1_ksp_type preonly -dm_landau_amr_levels_max 7
>>> -dm_landau_domain_radius 5 -dm_landau_amr_re_levels 0 -dm_landau_re_radius
>>> 1 -dm_landau_amr_z_refine1 1 -dm_landau_amr_z_refine2 0
>>> -dm_landau_amr_post_refine 0 -dm_landau_z_radius1 .1 -dm_landau_z_radius2
>>> .1 -dm_refine 1 -dm_landau_gpu_assembly false
>>> +    args: -petscspace_degree 3 -petscspace_poly_tensor 1
>>> -dm_landau_type p4est -dm_landau_ion_masses 2,4 -dm_landau_ion_charges 1,18
>>> -dm_landau_thermal_temps 5,5,.5 -dm_landau_n 1.00018,1,1e-5 -dm_landau_n_0
>>> 1e20 -ex1_ts_monitor -ex1_snes_rtol 1.e-14 -ex1_snes_stol 1.e-14
>>> -ex1_snes_monitor -ex1_snes_converged_reason -ex1_ts_type arkimex
>>> -ex1_ts_arkimex_type 1bee -ex1_ts_max_snes_failures -1 -ex1_ts_rtol 1e-1
>>> -ex1_ts_dt 1.e-1 -ex1_ts_max_time 1 -ex1_ts_adapt_clip .5,1.25
>>> -ex1_ts_adapt_scale_solve_failed 0.75
>>> -ex1_ts_adapt_time_step_increase_delay 5 -ex1_ts_max_steps 1 -ex1_pc_type
>>> lu -ex1_ksp_type preonly -dm_landau_amr_levels_max 7
>>> -dm_landau_domain_radius 5 -dm_landau_amr_re_levels 0 -dm_landau_re_radius
>>> 1 -dm_landau_amr_z_refine1 1 -dm_landau_amr_z_refine2 0
>>> -dm_landau_amr_post_refine 0 -dm_landau_z_radius1 .1 -dm_landau_z_radius2
>>> .1 -dm_refine 1 -dm_landau_gpu_assembly false -dm_view hdf5:f.h5
>>> -vec_view hdf5:f.h5::append -dm_landau_Ez 10.
>>>
>>>  TEST*/
>>>
>>> On Wed, Jun 23, 2021 at 1:38 PM Mark Adams <mfadams at lbl.gov> wrote:
>>>
>>>> Landau ex1 should work. I will test.
>>>>
>>>> On Wed, Jun 23, 2021 at 10:47 AM Matthew Knepley <knepley at gmail.com>
>>>> wrote:
>>>>
>>>>> On Wed, Jun 23, 2021 at 10:44 AM Junchao Zhang <junchao.zhang at gmail.com> wrote:
>>>>>
>>>>>> Use VecGetArrayRead/Write() to get up-to-date host pointers to the
>>>>>> vector array.
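>>>>>>
>>>>>> For example (a minimal sketch; v stands for any Vec whose up-to-date
>>>>>> data may live only on the device):
>>>>>>
>>>>>>   const PetscScalar *a;
>>>>>>   PetscErrorCode     ierr;
>>>>>>   ierr = VecGetArrayRead(v,&a);CHKERRQ(ierr);   /* syncs device data to the host if needed */
>>>>>>   /* ... read a[] on the host, e.g. hand it to the HDF5 writer ... */
>>>>>>   ierr = VecRestoreArrayRead(v,&a);CHKERRQ(ierr);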
>>>>>>
>>>>>
>>>>> I think Mark is saying that those are not working. We do call
>>>>> VecGetArrayRead() in the HDF5 code.
>>>>>
>>>>> Mark, it seems like a small broken code example is necessary.
>>>>>
>>>>>   Thanks,
>>>>>
>>>>>     Matt
>>>>>
>>>>>
>>>>>> --Junchao Zhang
>>>>>>
>>>>>>
>>>>>> On Wed, Jun 23, 2021 at 9:15 AM Mark Adams <mfadams at lbl.gov> wrote:
>>>>>>
>>>>>>> First, there seem to be two documentation pages for
>>>>>>> VecGetArrayAndMemType (one points to the other).
>>>>>>>
>>>>>>> So I need to get a CPU array for HDF5 viewing, which is currently
>>>>>>> totally broken for devices.
>>>>>>>
>>>>>>> I can't find a VecGetArrayCpu[HOST] variant that does the right thing.
>>>>>>>
>>>>>>> Perhaps have VecGetArrayAndMemType return a valid CPU pointer when
>>>>>>> "mtype==NULL"?
>>>>>>>
>>>>>>> Mark
>>>>>>>
>>>>>>
>>>>>
>>>>> --
>>>>> What most experimenters take for granted before they begin their
>>>>> experiments is infinitely more interesting than any results to which their
>>>>> experiments lead.
>>>>> -- Norbert Wiener
>>>>>
>>>>> https://www.cse.buffalo.edu/~knepley/
>>>>>
>>>>