[petsc-users] Correlation between da_refine and pc_mg_levels

Barry Smith bsmith at mcs.anl.gov
Thu Mar 30 17:11:29 CDT 2017


   You should always work with the master git branch. Tar balls and releases are for people who learned Unix on Digital Vaxen. And never try to jump back and forth between master and releases ("just because the release was already installed on some machine"); that will drive you nuts.

   There were some changes in the handling of options for DM (and DMDA) after the 3.7 release. Previously, if you passed
negative numbers for the grid size arguments, DMDA would check the options database and allow the values to be changed, but if you passed positive values the options would be ignored.

   This was recently fixed to use the standard PETSc paradigm of calling DMSetFromOptions() and DMSetUp() after the call to DMDACreate...().
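
   For example, the creation sequence now looks roughly like this (a minimal sketch; the 5 x 5 x 3 grid, single degree of freedom, box stencil, and stencil width of 1 are illustrative, chosen to match the run quoted below):

    #include <petscdmda.h>

    DM             da;
    PetscErrorCode ierr;
    ierr = DMDACreate3d(PETSC_COMM_WORLD,
                        DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                        DMDA_STENCIL_BOX,
                        5, 5, 3,                                   /* global grid size M, N, P */
                        PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,  /* process layout */
                        1, 1, NULL, NULL, NULL, &da);CHKERRQ(ierr);
    ierr = DMSetFromOptions(da);CHKERRQ(ierr);  /* -da_refine, -da_grid_x, etc. take effect here */
    ierr = DMSetUp(da);CHKERRQ(ierr);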

  Barry



> On Mar 30, 2017, at 4:27 PM, Justin Chang <jychang48 at gmail.com> wrote:
> 
> -dm_refine didn't work either
> 
> On Thu, Mar 30, 2017 at 4:21 PM, Matthew Knepley <knepley at gmail.com> wrote:
> I think it's now -dm_refine 
> 
>    Matt
> 
> On Thu, Mar 30, 2017 at 4:17 PM, Justin Chang <jychang48 at gmail.com> wrote:
> Also, this was the output before the error message:
> 
> Level 0 domain size (m)    1e+04 x    1e+04 x    1e+03, num elements 5 x 5 x 3 (75), size (m) 2000. x 2000. x 500.
> Level -1 domain size (m)    1e+04 x    1e+04 x    1e+03, num elements 2 x 2 x 2 (8), size (m) 5000. x 5000. x 1000.
> Level -2 domain size (m)    1e+04 x    1e+04 x    1e+03, num elements 1 x 1 x 1 (1), size (m) 10000. x 10000. x inf.
> 
> This tells me '-da_refine 4' is not registering.
> 
> On Thu, Mar 30, 2017 at 4:15 PM, Justin Chang <jychang48 at gmail.com> wrote:
> Okay I'll give it a shot.
> 
> Somewhat unrelated, but I tried running this on Cori's Haswell node (loaded the module 'petsc/3.7.4-64'). But I get these errors:
> 
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Argument out of range
> [0]PETSC ERROR: Partition in y direction is too fine! 0 1
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.7.4, Oct, 02, 2016 
> [0]PETSC ERROR: /global/u1/j/jychang/Icesheet/./ex48 on a arch-cori-opt64-INTEL-3.7.4-64 named nid00020 by jychang Thu Mar 30 14:04:35 2017
> [0]PETSC ERROR: Configure options --known-sizeof-void-p=8 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-size_t=8 --known-mpi-int64_t=1 --known-has-attribute-aligned=1 --prefix=/global/common/cori/software/petsc/3.7.4-64/hsw/intel PETSC_ARCH=arch-cori-opt64-INTEL-3.7.4-64 --COPTFLAGS="-mkl -O2 -no-ipo -g -axMIC-AVX512,CORE-AVX2,AVX" --CXXOPTFLAGS="-mkl -O2 -no-ipo -g -axMIC-AVX512,CORE-AVX2,AVX" --FOPTFLAGS="-mkl -O2 -no-ipo -g -axMIC-AVX512,CORE-AVX2,AVX" --with-hdf5-dir=/opt/cray/pe/hdf5-parallel/1.8.16/INTEL/15.0 --with-hwloc-dir=/global/common/cori/software/hwloc/1.11.4/hsw --with-scalapack-include=/opt/intel/compilers_and_libraries_2017.1.132/linux/mkl/include --with-scalapack-lib= --LIBS="-mkl -L/global/common/cori/software/petsc/3.7.4-64/hsw/intel/lib -I/global/common/cori/software/petsc/3.7.4-64/hsw/intel/include -L/global/common/cori/software/xz/5.2.2/hsw/lib -I/global/common/cori/software/xz/5.2.2/hsw/include -L/global/common/cori/software/zlib/1.2.8/hsw/intel/lib -I/global/common/cori/software/zlib/1.2.8/hsw/intel/include -L/global/common/cori/software/libxml2/2.9.4/hsw/lib -I/global/common/cori/software/libxml2/2.9.4/hsw/include -L/global/common/cori/software/numactl/2.0.11/hsw/lib -I/global/common/cori/software/numactl/2.0.11/hsw/include -L/global/common/cori/software/hwloc/1.11.4/hsw/lib -I/global/common/cori/software/hwloc/1.11.4/hsw/include -L/global/common/cori/software/openssl/1.1.0a/hsw/lib -I/global/common/cori/software/openssl/1.1.0a/hsw/include -L/global/common/cori/software/subversion/1.9.4/hsw/lib -I/global/common/cori/software/subversion/1.9.4/hsw/include -lhwloc -lpciaccess -lxml2 -lz -llzma -Wl,--start-group /opt/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64/libmkl_scalapack_lp64.a /opt/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64/libmkl_core.a /opt/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64/libmkl_intel_thread.a /opt/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64/libmkl_blacs_intelmpi_lp64.a -Wl,--end-group -lstdc++" --download-parmetis --download-metis --with-ssl=0 --with-batch --known-mpi-shared-libraries=0 --with-clib-autodetect=0 --with-cxxlib-autodetect=0 --with-debugging=0 --with-fortranlib-autodetect=0 --with-mpiexec=srun --with-shared-libraries=0 --with-x=0 --known-mpi-int64-t=0 --known-bits-per-byte=8 --known-sdot-returns-double=0 --known-snrm2-returns-double=0 --known-level1-dcache-assoc=0 --known-level1-dcache-linesize=32 --known-level1-dcache-size=32768 --known-memcmp-ok=1 --known-mpi-c-double-complex=1 --known-mpi-long-double=1 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-sizeof-char=1 --known-sizeof-double=8 CC=cc MPICC=cc CXX=CC MPICXX=CC FC=ftn F77=ftn F90=ftn MPIF90=ftn MPIF77=ftn CFLAGS=-axMIC-AVX512,CORE-AVX2,AVX CXXFLAGS=-axMIC-AVX512,CORE-AVX2,AVX FFLAGS=-axMIC-AVX512,CORE-AVX2,AVX CC=cc MPICC=cc CXX=CC MPICXX=CC FC=ftn F77=ftn F90=ftn MPIF90=ftn MPIF77=ftn CFLAGS=-fPIC FFLAGS=-fPIC LDFLAGS=-fPIE --download-hypre --with-64-bit-indices
> [0]PETSC ERROR: #1 DMSetUp_DA_3D() line 298 in /global/cscratch1/sd/swowner/sleak/petsc-3.7.4/src/dm/impls/da/da3.c
> [0]PETSC ERROR: #2 DMSetUp_DA() line 27 in /global/cscratch1/sd/swowner/sleak/petsc-3.7.4/src/dm/impls/da/dareg.c
> [0]PETSC ERROR: #3 DMSetUp() line 744 in /global/cscratch1/sd/swowner/sleak/petsc-3.7.4/src/dm/interface/dm.c
> [0]PETSC ERROR: #4 DMCoarsen_DA() line 1196 in /global/cscratch1/sd/swowner/sleak/petsc-3.7.4/src/dm/impls/da/da.c
> [0]PETSC ERROR: #5 DMCoarsen() line 2371 in /global/cscratch1/sd/swowner/sleak/petsc-3.7.4/src/dm/interface/dm.c
> [0]PETSC ERROR: #6 PCSetUp_MG() line 616 in /global/cscratch1/sd/swowner/sleak/petsc-3.7.4/src/ksp/pc/impls/mg/mg.c
> [0]PETSC ERROR: #7 PCSetUp() line 968 in /global/cscratch1/sd/swowner/sleak/petsc-3.7.4/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: #8 KSPSetUp() line 390 in /global/cscratch1/sd/swowner/sleak/petsc-3.7.4/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #9 KSPSolve() line 599 in /global/cscratch1/sd/swowner/sleak/petsc-3.7.4/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #10 SNESSolve_NEWTONLS() line 230 in /global/cscratch1/sd/swowner/sleak/petsc-3.7.4/src/snes/impls/ls/ls.c
> [0]PETSC ERROR: #11 SNESSolve() line 4005 in /global/cscratch1/sd/swowner/sleak/petsc-3.7.4/src/snes/interface/snes.c
> [0]PETSC ERROR: #12 main() line 1548 in /global/homes/j/jychang/Icesheet/ex48.c
> [0]PETSC ERROR: PETSc Option Table entries:
> [0]PETSC ERROR: -da_refine 4
> [0]PETSC ERROR: -ksp_rtol 1e-7
> [0]PETSC ERROR: -M 5
> [0]PETSC ERROR: -N 5
> [0]PETSC ERROR: -P 3
> [0]PETSC ERROR: -pc_mg_levels 5
> [0]PETSC ERROR: -pc_type mg
> [0]PETSC ERROR: -thi_mat_type baij
> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
> Rank 0 [Thu Mar 30 14:04:35 2017] [c0-0c0s5n0] application called MPI_Abort(MPI_COMM_WORLD, 63) - process 0
> srun: error: nid00020: task 0: Aborted
> srun: Terminating job step 4363145.1z
> 
> It seems to me the PETSc from this module is not registering the '-da_refine' option. This is strange because I have no issue with this on the latest petsc-dev version. Does anyone know about this error and/or why it happens?
> 
> On Thu, Mar 30, 2017 at 3:39 PM, Matthew Knepley <knepley at gmail.com> wrote:
> On Thu, Mar 30, 2017 at 3:38 PM, Justin Chang <jychang48 at gmail.com> wrote:
> Okay, got it. What are the options for setting GAMG as the coarse solver?
> 
> -mg_coarse_pc_type gamg I think
>  
> On Thu, Mar 30, 2017 at 3:37 PM, Matthew Knepley <knepley at gmail.com> wrote:
> On Thu, Mar 30, 2017 at 3:04 PM, Justin Chang <jychang48 at gmail.com> wrote:
> Yeah, based on my experiments, setting pc_mg_levels to $DAREFINE + 1 seems to give decent performance. 
> 
> 1) Is there ever a case where you'd want $MGLEVELS <= $DAREFINE? In some of the PETSc tutorial slides (e.g., http://www.mcs.anl.gov/petsc/documentation/tutorials/TutorialCEMRACS2016.pdf on slide 203/227) they say to use $MGLEVELS = 4 and $DAREFINE = 5, but when I ran this it was almost twice as slow as when $MGLEVELS >= $DAREFINE.
> 
> Depending on how big the initial grid is, you may want this. There is a balance between coarse grid and fine grid work.
>  
> 2) So I understand that one cannot coarsen below one grid point in each direction, but is there any benefit to having $MGLEVELS exceed $DAREFINE by a lot?
> 
> Again, it depends on the size of the initial grid.
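> 
> (For a concrete sense of that limit: the Level 0/-1/-2 output quoted earlier in this message shows a 5 x 5 x 3 grid reaching 1 x 1 x 1 after only two coarsenings, so levels beyond that have nothing left to coarsen.)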
> 
> On really large problems, you want to use GAMG as the coarse solver, which will move the problem onto a smaller number of nodes
> so that you can coarsen further.
> 
>    Matt
>  
> Thanks,
> Justin
> 
> On Thu, Mar 30, 2017 at 2:35 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> 
>   -da_refine $DAREFINE  determines how large the final problem will be.
> 
>   By default, if you don't supply -pc_mg_levels, it uses $DAREFINE + 1 as the number of levels of MG to use; for example, -da_refine 1 would result in 2 levels of multigrid.
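> 
>   For instance, with the concrete values used elsewhere in this thread, -da_refine 4 would default to 5 levels, so adding -pc_mg_levels 5 simply matches that default (a sketch; the process count here is illustrative):
> 
>     mpirun -n 8 ./ex48 -M 5 -N 5 -P 3 -thi_mat_type baij -pc_type mg -da_refine 4 -pc_mg_levels 5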
> 
> 
> > On Mar 30, 2017, at 2:17 PM, Justin Chang <jychang48 at gmail.com> wrote:
> >
> > Hi all,
> >
> > Just a general conceptual question: say I am tinkering around with SNES ex48.c and am running the program with these options:
> >
> > mpirun -n $NPROCS ./ex48 -pc_type mg -M $XSEED -N $YSEED -P $ZSEED -thi_mat_type baij -da_refine $DAREFINE -pc_mg_levels $MGLEVELS
> >
> > I am not too familiar with mg, but it seems to me there is a very strong correlation between $MGLEVELS and $DAREFINE, as well as perhaps the initial coarse grid size (provided by $X/Y/ZSEED).
> >
> > Is there a rule of thumb for how these parameters should be set? I am guessing it is probably also hardware/architecture dependent?
> >
> > Thanks,
> > Justin
> 
> 
> 
> 
> 
> -- 
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
> 


