[petsc-dev] problem with aijcusparse and DMForest

Mark Adams mfadams at lbl.gov
Fri Nov 20 13:45:38 CST 2020


I am running src/ts/utils/dmplexlandau/tutorials/ex1.c, which uses DMPlex.
I am adding GPU assembly, and I need a cusparse matrix for that because
that is where I put the CUDA assembly code.
(Maybe device assembly should not be folded into aijcusparse; it actually
has nothing to do with cusparse.)

It looks like DMForest is not carrying these Mat and Vec types over to the
new DM in DMAdaptLabel. The code runs fine without p4est, but with it I get
the error below.
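
As a quick check of the symptom, something like the following is what I
would try (just a sketch; dmAdapt is a placeholder name for the DM that
DMAdaptLabel hands back). If the types are not carried over, the vector
the adapted DM creates should come back as plain "seq" rather than
seqcuda/mpicuda, which is what VecCUDAGetArray() then complains about:

  #include <petsc.h>

  /* Sketch only: dmAdapt stands for the DM returned by DMAdaptLabel(). */
  static PetscErrorCode CheckAdaptedVecType(DM dmAdapt)
  {
    Vec            v;
    VecType        vtype;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = DMCreateGlobalVector(dmAdapt, &v);CHKERRQ(ierr);
    ierr = VecGetType(v, &vtype);CHKERRQ(ierr);
    /* Expected to print plain "seq" here if the type was dropped. */
    ierr = PetscPrintf(PetscObjectComm((PetscObject)dmAdapt),
                       "adapted DM Vec type: %s\n", vtype);CHKERRQ(ierr);
    ierr = VecDestroy(&v);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }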

I had to manually get the preallocate_only flag passed along in p4est, and
I probably did not do that in the best way. Is there a more systematic way
of getting DM flags propagated through DMForest?
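
For reference, what I mean by "passing the types along" is roughly the
following application-side workaround (a sketch only, not what the example
currently does; dmOld/dmNew are placeholder names):

  #include <petsc.h>

  /* Sketch: copy the GPU Mat/Vec types (and the preallocate-only flag)
   * from the original DM to the adapted one by hand. */
  static PetscErrorCode CopyDMGPUTypes(DM dmOld, DM dmNew)
  {
    MatType        mtype;
    VecType        vtype;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = DMGetMatType(dmOld, &mtype);CHKERRQ(ierr);
    ierr = DMSetMatType(dmNew, mtype);CHKERRQ(ierr);
    ierr = DMGetVecType(dmOld, &vtype);CHKERRQ(ierr);
    ierr = DMSetVecType(dmNew, vtype);CHKERRQ(ierr);
    /* I do not know of a public getter for prealloc_only, so re-set it
     * explicitly here if the application turned it on. */
    ierr = DMSetMatrixPreallocateOnly(dmNew, PETSC_TRUE);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }

Doing this by hand everywhere a DMForest gets converted or adapted seems
fragile, hence the question.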

Thanks,
Mark

14:32 adams/landau-gpu-assembly=
~/petsc/src/ts/utils/dmplexlandau/tutorials$ make
PETSC_DIR=/ccs/home/adams/petsc PETSC_ARCH=arch-summit-opt-gnu-cuda -f
mymake runex1 ORDER=1 EXTRA='-dm_mat_type aijcusparse -dm_vec_type cuda
-mat_type aijcusparse -vec_type cuda'
jsrun -n 1 -a 1 -c 1 -g 1 ./ex1 -petscspace_degree 4
-petscspace_poly_tensor 1 -dm_landau_type p4est -info :dm,tsadapt
-dm_landau_ion_masses 2,4 -dm_landau_ion_charges 1,18
-dm_landau_thermal_temps 5,5,.5 -dm_landau_n 1.00018,1,1e-5 -dm_landau_n_0
1e20 -ex1_ts_monitor -ex1_snes_rtol 1.e-6 -ex1_snes_monitor
-ex1_snes_converged_reason -ex1_ts_type arkimex -ex1_ts_arkimex_type 1bee
-ex1_ts_max_snes_failures -1 -ex1_ts_rtol 1e-4 -ex1_ts_dt 1.e-6
-ex1_ts_max_time 1 -ex1_ts_adapt_clip .5,1.25
-ex1_ts_adapt_scale_solve_failed 0.75
-ex1_ts_adapt_time_step_increase_delay 5 -ex1_ts_max_steps 1 -ex1_pc_type
lu -ex1_ksp_type preonly -dm_landau_amr_levels_max 7
-dm_landau_domain_radius 5 -dm_landau_amr_re_levels 0 -dm_landau_re_radius
1 -dm_landau_amr_z_refine1 1 -dm_landau_amr_z_refine2 0
-dm_landau_amr_post_refine 0 -dm_landau_z_radius1 .1 -dm_landau_z_radius2
.1 -dm_refine 1 -dm_mat_type aijcusparse -dm_vec_type cuda -mat_type
aijcusparse -vec_type cuda
[0] ProcessOptions(): num_species set to number of thermal temps provided
(3)
masses:        e= 9.109e-31; ions in proton mass units:    2.000e+00
 4.000e+00 ...
charges:       e=-1.602e-19; charges in elementary units:  1.000e+00
 1.800e+01
thermal T (K): e= 5.802e+07 i= 5.802e+07 imp= 5.802e+06. v_0= 2.965e+07
n_0= 1.000e+20 t_0= 6.469e-05 domain= 5.000e+00
  0) species-0: charge density= -1.6024250714143e+01 z-momentum=
 3.0934072854942e-19 energy=  1.2018267949397e+05
  0) species-1: charge density=  1.6021676664193e+01 z-momentum=
 1.0972563727970e-17 energy=  1.2017223194844e+05
  0) species-2: charge density=  2.8822941386019e-03 z-momentum=
 2.6930279969350e-22 energy=  1.2030899911623e-01
 0) Total: charge density=  3.0824418842895e-04, momentum=
 1.1282173759319e-17, energy=  2.4035503175141e+05 (m_i[0]/m_e = 3670.94,
56 cells)
[0] DMGetDMSNES(): Creating new DMSNES
[0] DMGetDMTS(): Creating new DMTS
[0] DMGetDMSNESWrite(): Copying DMSNES due to write
0 TS dt 1e-06 time 0.
[0] DMGetDMKSP(): Creating new DMKSP
[0]FormLandau: 1400 IPs, 56 cells, totDim=75, Nb=25, Nq=25,
elemMatSize=5625, dim=2, Tab: Nb=25 Nf=3 Np=25 cdim=2 N=2667 N+hang=3339
[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[0]PETSC ERROR: Invalid argument
[0]PETSC ERROR: Object (seq) is not seqcuda or mpicuda
[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html
for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.14.1-204-g530525b  GIT
Date: 2020-11-19 21:52:41 -0500
[0]PETSC ERROR: ./ex1 on a arch-summit-opt-gnu-cuda named h10n08 by adams
Fri Nov 20 14:32:27 2020
[0]PETSC ERROR: Configure options --with-fc=0 --CFLAGS=-fPIC
--CXXFLAGS=-fPIC --COPTFLAGS="-g -O0" --CXXOPTFLAGS="-g -O0"
--CUDAFLAGS="-arch=sm_70 -Xcompiler -rdynamic -lineinfo" --CUDAOPTFLAGS=-g
--with-ctable=0 --with-ssl=0 --with-batch=0 --with-cxx=mpicxx
--with-mpiexec="jsrun -g1 --smpiargs "-gpu"" --with-cuda=1
--with-cudac=nvcc --download-p4est=1 --download-zlib --download-hdf5=1
--download-metis --download-parmetis --download-triangle
--with-blaslapack-lib="-L/autofs/nccs-svm1_sw/summit/.swci/1-compute/opt/spack/20180914/linux-rhel7-ppc64le/gcc-6.4.0/netlib-lapack-3.8.0-wcabdyqhdi5rooxbkqa6x5d7hxyxwdkm/lib64
-lblas -llapack" --with-cc=mpicc --with-shared-libraries=1 --with-x=0
--with-64-bit-indices=1 --with-debugging=1
PETSC_ARCH=arch-summit-opt-gnu-cuda --with-openmp=0
PETSC_DIR=/ccs/home/adams/petsc
[0]PETSC ERROR: #1 VecCUDAGetArray() line 2171 in
/autofs/nccs-svm1_home1/adams/petsc/src/vec/vec/interface/rvector.c
[0]PETSC ERROR: #2 VecCUDAGetArrayRead() line 2249 in
/autofs/nccs-svm1_home1/adams/petsc/src/vec/vec/interface/rvector.c
[0]PETSC ERROR: #3 MatMultAddKernel_SeqAIJCUSPARSE() line 2261 in
/autofs/nccs-svm1_home1/adams/petsc/src/mat/impls/aij/seq/seqcusparse/
aijcusparse.cu
[0]PETSC ERROR: #4 MatMult_SeqAIJCUSPARSE() line 2180 in
/autofs/nccs-svm1_home1/adams/petsc/src/mat/impls/aij/seq/seqcusparse/
aijcusparse.cu
[0]PETSC ERROR: #5 MatMult() line 2424 in
/autofs/nccs-svm1_home1/adams/petsc/src/mat/interface/matrix.c
[0]PETSC ERROR: #6 LandauIFunction() line 1956 in
/autofs/nccs-svm1_home1/adams/petsc/src/ts/utils/dmplexlandau/plexland.c
[0]PETSC ERROR: #7 TSComputeIFunction() line 857 in
/autofs/nccs-svm1_home1/adams/petsc/src/ts/interface/ts.c
[0]PETSC ERROR: #8 SNESTSFormFunction_ARKIMEX() line 1034 in
/autofs/nccs-svm1_home1/adams/petsc/src/ts/impls/arkimex/arkimex.c
[0]PETSC ERROR: #9 SNESTSFormFunction() line 5014 in
/autofs/nccs-svm1_home1/adams/petsc/src/ts/interface/ts.c
[0]PETSC ERROR: #10 SNESComputeFunction() line 2376 in
/autofs/nccs-svm1_home1/adams/petsc/src/snes/interface/snes.c
[0]PETSC ERROR: #11 SNESSolve_NEWTONLS() line 175 in
/autofs/nccs-svm1_home1/adams/petsc/src/snes/impls/ls/ls.c
[0]PETSC ERROR: #12 SNESSolve() line 4563 in
/autofs/nccs-svm1_home1/adams/petsc/src/snes/interface/snes.c
[0]PETSC ERROR: #13 TSStep_ARKIMEX() line 811 in
/autofs/nccs-svm1_home1/adams/petsc/src/ts/impls/arkimex/arkimex.c
[0]PETSC ERROR: #14 TSStep() line 3757 in
/autofs/nccs-svm1_home1/adams/petsc/src/ts/interface/ts.c
[0]PETSC ERROR: #15 TSSolve() line 4154 in
/autofs/nccs-svm1_home1/adams/petsc/src/ts/interface/ts.c
[0]PETSC ERROR: #16 main() line 49 in ex1.c
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -dm_landau_amr_levels_max 7
[0]PETSC ERROR: -dm_landau_amr_post_refine 0
[0]PETSC ERROR: -dm_landau_amr_re_levels 0
[0]PETSC ERROR: -dm_landau_amr_z_refine1 1
[0]PETSC ERROR: -dm_landau_amr_z_refine2 0
[0]PETSC ERROR: -dm_landau_domain_radius 5
[0]PETSC ERROR: -dm_landau_ion_charges 1,18
[0]PETSC ERROR: -dm_landau_ion_masses 2,4
[0]PETSC ERROR: -dm_landau_n 1.00018,1,1e-5
[0]PETSC ERROR: -dm_landau_n_0 1e20
[0]PETSC ERROR: -dm_landau_re_radius 1
[0]PETSC ERROR: -dm_landau_thermal_temps 5,5,.5
[0]PETSC ERROR: -dm_landau_type p4est
[0]PETSC ERROR: -dm_landau_z_radius1 .1
[0]PETSC ERROR: -dm_landau_z_radius2 .1
[0]PETSC ERROR: -dm_mat_type aijcusparse
[0]PETSC ERROR: -dm_refine 1
[0]PETSC ERROR: -dm_vec_type cuda
[0]PETSC ERROR: -ex1_ksp_type preonly
[0]PETSC ERROR: -ex1_pc_type lu
[0]PETSC ERROR: -ex1_snes_converged_reason
[0]PETSC ERROR: -ex1_snes_monitor
[0]PETSC ERROR: -ex1_snes_rtol 1.e-6
[0]PETSC ERROR: -ex1_ts_adapt_clip .5,1.25
[0]PETSC ERROR: -ex1_ts_adapt_scale_solve_failed 0.75
[0]PETSC ERROR: -ex1_ts_adapt_time_step_increase_delay 5
[0]PETSC ERROR: -ex1_ts_arkimex_type 1bee
[0]PETSC ERROR: -ex1_ts_dt 1.e-6
[0]PETSC ERROR: -ex1_ts_max_snes_failures -1
[0]PETSC ERROR: -ex1_ts_max_steps 1
[0]PETSC ERROR: -ex1_ts_max_time 1
[0]PETSC ERROR: -ex1_ts_monitor
[0]PETSC ERROR: -ex1_ts_rtol 1e-4
[0]PETSC ERROR: -ex1_ts_type arkimex
[0]PETSC ERROR: -info :dm,tsadapt
[0]PETSC ERROR: -mat_type aijcusparse
[0]PETSC ERROR: -petscspace_degree 4
[0]PETSC ERROR: -petscspace_poly_tensor 1
[0]PETSC ERROR: -vec_type cuda