[petsc-dev] ASM for each field solve on GPUs

Matthew Knepley knepley at gmail.com
Thu Dec 31 10:03:32 CST 2020


On Thu, Dec 31, 2020 at 8:42 AM Mark Adams <mfadams at lbl.gov> wrote:

> Humm, I have two fields (Nf) here:
>

I never test the default mode of splitting all fields apart, so that might be
broken. You can always use the explicit specification:

  -sub_fieldsplit_0_fields 0 -sub_fieldsplit_1_fields 1

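If it helps, here is a rough, untested sketch of the same configuration through
the C API rather than the options database. The helper name SetupASMFieldSplit
is made up, and it assumes the subdomain matrices know about the two fields
(e.g. through a block size or a DM); the per-field LU would still be set with
-sub_fieldsplit_0_pc_type lu or through PCFieldSplitGetSubKSP().

  #include <petscksp.h>

  /* Rough sketch (untested): ASM with an additive FieldSplit on each
     subdomain, mirroring
       -pc_type asm -sub_pc_type fieldsplit -sub_pc_fieldsplit_type additive
       -sub_fieldsplit_0_fields 0 -sub_fieldsplit_1_fields 1               */
  static PetscErrorCode SetupASMFieldSplit(KSP ksp)
  {
    PC             pc, subpc;
    KSP           *subksp;
    PetscInt       nlocal, first, i;
    const PetscInt f0 = 0, f1 = 1;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
    ierr = PCSetType(pc, PCASM);CHKERRQ(ierr);
    /* the subdomain KSPs only exist after setup */
    ierr = KSPSetUp(ksp);CHKERRQ(ierr);
    ierr = PCASMGetSubKSP(pc, &nlocal, &first, &subksp);CHKERRQ(ierr);
    for (i = 0; i < nlocal; ++i) {
      ierr = KSPGetPC(subksp[i], &subpc);CHKERRQ(ierr);
      ierr = PCSetType(subpc, PCFIELDSPLIT);CHKERRQ(ierr);
      ierr = PCFieldSplitSetType(subpc, PC_COMPOSITE_ADDITIVE);CHKERRQ(ierr);
      /* explicit field specification, as in the options above */
      ierr = PCFieldSplitSetFields(subpc, "0", 1, &f0, &f0);CHKERRQ(ierr);
      ierr = PCFieldSplitSetFields(subpc, "1", 1, &f1, &f1);CHKERRQ(ierr);
    }
    PetscFunctionReturn(0);
  }
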
  Thanks,

     Matt


> /usr/local/Cellar/mpich/3.3.2_1/bin/mpiexec -n 1 ./ex2 -ex2_test_type
> spitzer -ex2_connor_e_field_units -dm_landau_Ez 0 -petscspace_degree 3
> -dm_landau_type p4est -dm_landau_ion_masses 1 -dm_landau_ion_charges 1
> -dm_landau_thermal_temps 4,4 -dm_landau_n 1,1 -ts_monitor -ts_adapt_monitor
> -snes_rtol 1.e-14 -snes_stol 1.e-14 -snes_monitor -snes_converged_reason
> -snes_max_it 15 -ts_type arkimex -ts_exact_final_time stepover
> -ts_arkimex_type 1bee -ts_max_snes_failures -1 -ts_rtol 1e-3 -ts_dt .1e-1
> -ts_adapt_clip .25,1.1 -ts_max_time 100 -ts_max_steps 1
> -ts_adapt_scale_solve_failed 0.75 -ts_adapt_time_step_increase_delay 5
> -pc_type lu -ksp_type preonly -dm_landau_amr_levels_max 8
> -dm_landau_amr_post_refine 0 -dm_landau_domain_radius 5 -ex2_plot_dt 1
> -info :dm -ts_adapt_dt_min 1e-4 -ts_adapt_dt_max 1. -dm_preallocate_only
> -ex2_impurity_source_type pulse -ex2_pulse_start_time 32
> -ex2_pulse_width_time 15 -ex2_pulse_rate 2e+0 -ex2_t_cold 1
> -dm_landau_device_type cpu -pc_type asm -sub_pc_type fieldsplit
> -sub_pc_fieldsplit_type additive -sub_fieldsplit_0_pc_type lu -ksp_monitor
> [0] ProcessOptions(): num_species set to number of thermal temps provided
> (2)
> masses:        e= 9.109e-31; ions in proton mass units:    1.000e+00
>  0.000e+00 ...
> charges:       e=-1.602e-19; charges in elementary units:  1.000e+00
>  0.000e+00
> thermal T (K): e= 4.642e+07 i= 4.642e+07 imp= 0.000e+00. v_0= 2.652e+07
> n_0= 1.000e+20 t_0= 4.630e-05 domain= 5.000e+00
> [0] ProcessREOptions(): Num electrons from ions=1., T_cold= 1.160e+07, ion
> potential= 1.500e+01, E_z= 0.000e+00 v_0= 2.652e+07
> CalculateE j0=0. Ec = 0.0509908
> [0] DMGetDMSNES(): Creating new DMSNES
> [0] DMGetDMTS(): Creating new DMTS
> [0] DMGetDMSNESWrite(): Copying DMSNES due to write
> 0 TS dt 0.01 time 0.
>
> 0) species-0: charge density= -1.6022862392985e+01 z-momentum= -1.4067414979736e-19 energy=  9.6063873494138e+04
> 0) species-1: charge density=  1.6029890912678e+01 z-momentum= -8.1631830255319e-18 energy=  9.6333618295472e+04
>  0) Total: charge density=  7.0285196929767e-03, momentum=
> -8.3038571753292e-18, energy=  1.9239749178961e+05 (m_i[0]/m_e = 1835.47,
> 50 cells)
> testSpitzer    0) time= 0.000e+00 n_e=  1.000e+00 E=  0.000e+00 J=
> -1.495e-07 J_re= -0.000e+00 0 % Te_kev=  3.995e+00 Z_eff=1. E/J to eta
> ratio=-0. (diff=1e+10) constant E
> [0] DMGetDMKSP(): Creating new DMKSP
> [0]FormLandau: 800 IPs, 50 cells, totDim=32, Nb=16, Nq=16,
> elemMatSize=1024, dim=2, Tab: Nb=16 *Nf=2 *Np=16 cdim=2 N=896 N+hang=1176
>     0 SNES Function norm 4.974988592014e-03
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Petsc has generated inconsistent data
>
> *[0]PETSC ERROR: Unhandled case, must have at least two fields, not 1*
> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for
> trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.14.2-353-gc57b1f101b
>  GIT Date: 2020-12-27 11:51:24 -0600
> [0]PETSC ERROR: ./ex2 on a arch-macosx-gnu-g named MarksMac-302.local by
> markadams Thu Dec 31 08:35:49 2020
> [0]PETSC ERROR: Configure options
> --with-mpi-dir=/usr/local/Cellar/mpich/3.3.2_1 COPTFLAGS="-g -O0"
> CXXOPTFLAGS="-g -O0" --download-metis=1 --download-parmetis=1
> --download-kokkos=1 --download-kokkos-kernels=1 --download-p4est=1
> --with-zlib=1 --with-make-np=4 --download-hdf5=1 -with-cuda=0 --with-x=0
> --with-debugging=1 PETSC_ARCH=arch-macosx-gnu-g --with-64-bit-indices=0
> --with-openmp=0 --with-ctable=0
> [0]PETSC ERROR: #1 PCFieldSplitSetDefaults() line 571 in
> /Users/markadams/Codes/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c
> [0]PETSC ERROR: #2 PCSetUp_FieldSplit() line 614 in
> /Users/markadams/Codes/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c
> [0]PETSC ERROR: #3 PCSetUp() line 1015 in
> /Users/markadams/Codes/petsc/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: #4 KSPSetUp() line 406 in
> /Users/markadams/Codes/petsc/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #5 PCSetUpOnBlocks_ASM() line 437 in
> /Users/markadams/Codes/petsc/src/ksp/pc/impls/asm/asm.c
> [0]PETSC ERROR: #6 PCSetUpOnBlocks() line 1046 in
> /Users/markadams/Codes/petsc/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: #7 KSPSetUpOnBlocks() line 214 in
> /Users/markadams/Codes/petsc/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #8 KSPSolve_Private() line 727 in
> /Users/markadams/Codes/petsc/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #9 KSPSolve() line 963 in
> /Users/markadams/Codes/petsc/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #10 SNESSolve_NEWTONLS() line 225 in
> /Users/markadams/Codes/petsc/src/snes/impls/ls/ls.c
> [0]PETSC ERROR: #11 SNESSolve() line 4563 in
> /Users/markadams/Codes/petsc/src/snes/interface/snes.c
> [0]PETSC ERROR: #12 TSStep_ARKIMEX() line 845 in
> /Users/markadams/Codes/petsc/src/ts/impls/arkimex/arkimex.c
> [0]PETSC ERROR: #13 TSStep() line 3771 in
> /Users/markadams/Codes/petsc/src/ts/interface/ts.c
> [0]PETSC ERROR: #14 TSSolve() line 4168 in
> /Users/markadams/Codes/petsc/src/ts/interface/ts.c
> [0]PETSC ERROR: #15 main() line 774 in ex2.c
> [0]PETSC ERROR: PETSc Option Table entries:
> [0]PETSC ERROR: -dm_landau_amr_levels_max 8
> [0]PETSC ERROR: -dm_landau_amr_post_refine 0
> [0]PETSC ERROR: -dm_landau_device_type cpu
> [0]PETSC ERROR: -dm_landau_domain_radius 5
> [0]PETSC ERROR: -dm_landau_Ez 0
> [0]PETSC ERROR: -dm_landau_ion_charges 1
> [0]PETSC ERROR: -dm_landau_ion_masses 1
> [0]PETSC ERROR: -dm_landau_n 1,1
> [0]PETSC ERROR: -dm_landau_thermal_temps 4,4
> [0]PETSC ERROR: -dm_landau_type p4est
> [0]PETSC ERROR: -dm_preallocate_only
> [0]PETSC ERROR: -ex2_connor_e_field_units
> [0]PETSC ERROR: -ex2_impurity_source_type pulse
> [0]PETSC ERROR: -ex2_plot_dt 1
> [0]PETSC ERROR: -ex2_pulse_rate 2e+0
> [0]PETSC ERROR: -ex2_pulse_start_time 32
> [0]PETSC ERROR: -ex2_pulse_width_time 15
> [0]PETSC ERROR: -ex2_t_cold 1
> [0]PETSC ERROR: -ex2_test_type spitzer
> [0]PETSC ERROR: -info :dm
> [0]PETSC ERROR: -ksp_monitor
> [0]PETSC ERROR: -ksp_type preonly
> [0]PETSC ERROR: -pc_type asm
> [0]PETSC ERROR: -petscspace_degree 3
> [0]PETSC ERROR: -snes_converged_reason
> [0]PETSC ERROR: -snes_max_it 15
> [0]PETSC ERROR: -snes_monitor
> [0]PETSC ERROR: -snes_rtol 1.e-14
> [0]PETSC ERROR: -snes_stol 1.e-14
> [0]PETSC ERROR: -sub_fieldsplit_0_pc_type lu
> [0]PETSC ERROR: -sub_pc_fieldsplit_type additive
> [0]PETSC ERROR: -sub_pc_type fieldsplit
> [0]PETSC ERROR: -ts_adapt_clip .25,1.1
> [0]PETSC ERROR: -ts_adapt_dt_max 1.
> [0]PETSC ERROR: -ts_adapt_dt_min 1e-4
> [0]PETSC ERROR: -ts_adapt_monitor
> [0]PETSC ERROR: -ts_adapt_scale_solve_failed 0.75
> [0]PETSC ERROR: -ts_adapt_time_step_increase_delay 5
> [0]PETSC ERROR: -ts_arkimex_type 1bee
> [0]PETSC ERROR: -ts_dt .1e-1
> [0]PETSC ERROR: -ts_exact_final_time stepover
> [0]PETSC ERROR: -ts_max_snes_failures -1
> [0]PETSC ERROR: -ts_max_steps 1
> [0]PETSC ERROR: -ts_max_time 100
> [0]PETSC ERROR: -ts_monitor
> [0]PETSC ERROR: -ts_rtol 1e-3
> [0]PETSC ERROR: -ts_type arkimex
> [0]PETSC
>
> On Thu, Dec 31, 2020 at 12:30 AM Jed Brown <jed at jedbrown.org> wrote:
>
>> Mark Adams <mfadams at lbl.gov> writes:
>>
>> > I think the first thing that I need to do is get ASM working on the CPU in
>> > my code as a direct solver.
>> >
>> > FieldSplit was mentioned as the way to do this. How would I do that?
>>
>> -pc_type asm -sub_pc_type fieldsplit -sub_pc_fieldsplit_type additive
>> -sub_fieldsplit_0_pc_type lu
>>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/