<div><div dir="auto">Henrik,</div><div dir="auto"><br></div><br><div class="gmail_quote"><div dir="auto">On Wed, 10 Jan 2018 at 16:39, Smith, Barry F. <<a href="mailto:bsmith@mcs.anl.gov">bsmith@mcs.anl.gov</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><br>
>    DMDA requires that there be at least 1 grid point in each direction on each process (this simplified the implementation a huge amount but gives up flexibility). In your case you have 16 processes in a particular direction but a total of only 12 grid points, hence not enough grid points to have at least 1 per process.

This particular implementation limitation can be overcome using PCTELESCOPE. It allows you to repartition the coarse levels onto fewer ranks.
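For illustration: with 12288 cells in x and 11 MG levels (10 coarsenings), the coarsest DMDA has only 12288 / 2^10 = 12 points in x, while 16 of the 64 ranks are laid out along x, which is exactly the "12 16" in the error message below. A rough sketch of what attaching PCTELESCOPE to the coarse level of PCMG could look like (the reduction factor and the coarse solver are only illustrative choices; please verify the exact option names against your PETSc version with -help):

  -pc_type mg
  -pc_mg_levels 11
  -mg_coarse_pc_type telescope
  -mg_coarse_pc_telescope_reduction_factor 64
  -mg_coarse_telescope_pc_type lu

With a reduction factor of 64 the coarse problem (12 x 1 x 2 cells at refinement level 10) is gathered onto a single rank, where a direct solve is cheap; the solver on the reduced communicator is configured through the -mg_coarse_telescope_ prefix.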
Cheers,
  Dave


>
>    Barry
>
> On Jan 10, 2018, at 5:25 AM, Buesing, Henrik <hbuesing@eonerc.rwth-aachen.de> wrote:
>
> Dear all,
>
> I am doing a weak scaling test using geometric multigrid. As I increase the number of cells and the number of processes, I also increase the number of multigrid levels. With 64 cores, 12288 cells in x-direction and 11 multigrid levels, I see error message [1].
>
> Could you help me understand what is happening here?
>
> The characteristics of the weak scaling test are summarized in table [2]. Refinement levels 7-9 went through fine.
>
> Thank you!
> Henrik
>
> [1]
>
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Argument out of range
> [0]PETSC ERROR: Partition in x direction is too fine! 12 16
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.8.2-48-g851ec02 GIT Date: 2017-12-05 09:52:17 -0600
> [0]PETSC ERROR: shem_fw64gnu_const.x on a gnu_openmpi named linuxihfc033.rz.RWTH-Aachen.DE by hb111949 Wed Jan 10 11:48:09 2018
> [0]PETSC ERROR: Configure options --download-fblaslapack --with-cc=mpicc -with-fc=mpif90 --with-cxx=mpicxx --download-hypre --download-superlu_dist --download-suitesparse --download-scalapack --download-blacs --download-hdf5 --download-parmetis --download-metis --with-debugging=0 --download-mumps
> [0]PETSC ERROR: #1 DMSetUp_DA_3D() line 299 in /rwthfs/rz/cluster/work/hb111949/Code/petsc/src/dm/impls/da/da3.c
> [0]PETSC ERROR: #2 DMSetUp_DA() line 25 in /rwthfs/rz/cluster/work/hb111949/Code/petsc/src/dm/impls/da/dareg.c
> [0]PETSC ERROR: #3 DMSetUp() line 720 in /rwthfs/rz/cluster/work/hb111949/Code/petsc/src/dm/interface/dm.c
> [0]PETSC ERROR: #4 DMCoarsen_DA() line 1203 in /rwthfs/rz/cluster/work/hb111949/Code/petsc/src/dm/impls/da/da.c
> [0]PETSC ERROR: #5 DMCoarsen() line 2427 in /rwthfs/rz/cluster/work/hb111949/Code/petsc/src/dm/interface/dm.c
> [0]PETSC ERROR: #6 PCSetUp_MG() line 618 in /rwthfs/rz/cluster/work/hb111949/Code/petsc/src/ksp/pc/impls/mg/mg.c
> [0]PETSC ERROR: #7 PCSetUp() line 924 in /rwthfs/rz/cluster/work/hb111949/Code/petsc/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: #8 KSPSetUp() line 381 in /rwthfs/rz/cluster/work/hb111949/Code/petsc/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #9 KSPSolve() line 612 in /rwthfs/rz/cluster/work/hb111949/Code/petsc/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #10 SNESSolve_NEWTONLS() line 224 in /rwthfs/rz/cluster/work/hb111949/Code/petsc/src/snes/impls/ls/ls.c
> [0]PETSC ERROR: #11 SNESSolve() line 4179 in /rwthfs/rz/cluster/work/hb111949/Code/petsc/src/snes/interface/snes.c
>
>
> [2]
>
> # refinement level | # cores | # cells in x | # cells in y | # cells in z | # mg levels
> 7                  | 1       | 1536         | 1            | 256          | 8
> 8                  | 4       | 3072         | 1            | 512          | 9
> 9                  | 16      | 6144         | 1            | 1024         | 10
> 10                 | 64      | 12288        | 1            | 2048         | 11
>
> --
> Dipl.-Math. Henrik Büsing
> Institute for Applied Geophysics and Geothermal Energy
> E.ON Energy Research Center
> RWTH Aachen University
> ------------------------------------------------------
> Mathieustr. 10 | Tel +49 (0)241 80 49907
> 52074 Aachen, Germany | Fax +49 (0)241 80 49889
> ------------------------------------------------------
> http://www.eonerc.rwth-aachen.de/GGE
> hbuesing@eonerc.rwth-aachen.de
> ------------------------------------------------------