<div dir="ltr">Ok many thanks Barry,<div><br></div><div>For the cpu:sockets binding i get an ugly error:</div><div><br></div><div> <span style="font-family:menlo;font-size:11px">[valera@ocean petsc]$ make streams NPMAX=4 MPI_BINDING="--binding cpu:sockets"</span></div>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">cd src/benchmarks/streams; /usr/bin/gmake --no-print-directory PETSC_DIR=/home/valera/petsc PETSC_ARCH=arch-linux2-c-debug streams</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">/home/valera/petsc/arch-linux2-c-debug/bin/mpicc -o MPIVersion.o -c -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O -I/home/valera/petsc/include -I/home/valera/petsc/arch-linux2-c-debug/include `pwd`/MPIVersion.c</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">Running streams with '/home/valera/petsc/arch-linux2-c-debug/bin/mpiexec --binding cpu:sockets' using 'NPMAX=4' </span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] handle_bitmap_binding (tools/topo/hwloc/topo_hwloc.c:203): unrecognized binding string "cpu:sockets"</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] HYDT_topo_hwloc_init (tools/topo/hwloc/topo_hwloc.c:415): error binding with bind "cpu:sockets" and map "(null)"</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] HYDT_topo_init (tools/topo/topo.c:62): unable to initialize hwloc</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] launch_procs (pm/pmiserv/pmip_cb.c:515): unable to initialize process topology</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:892): launch_procs returned error</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] main (pm/pmiserv/pmip.c:206): demux engine error waiting for event</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[mpiexec@ocean] control_cb (pm/pmiserv/pmiserv_cb.c:200): assert (!closed) failed</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[mpiexec@ocean] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[mpiexec@ocean] HYD_pmci_wait_for_completion (pm/pmiserv/pmiserv_pmci.c:198): error waiting for event</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[mpiexec@ocean] main (ui/mpich/mpiexec.c:344): process manager error waiting for completion</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] handle_bitmap_binding (tools/topo/hwloc/topo_hwloc.c:203): unrecognized binding string "cpu:sockets"</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] HYDT_topo_hwloc_init (tools/topo/hwloc/topo_hwloc.c:415): error binding with bind "cpu:sockets" and map "(null)"</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] HYDT_topo_init (tools/topo/topo.c:62): unable to initialize hwloc</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] launch_procs (pm/pmiserv/pmip_cb.c:515): unable to initialize process topology</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:892): launch_procs returned error</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] main (pm/pmiserv/pmip.c:206): demux engine error waiting for event</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[mpiexec@ocean] control_cb (pm/pmiserv/pmiserv_cb.c:200): assert (!closed) failed</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[mpiexec@ocean] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[mpiexec@ocean] HYD_pmci_wait_for_completion (pm/pmiserv/pmiserv_pmci.c:198): error waiting for event</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[mpiexec@ocean] main (ui/mpich/mpiexec.c:344): process manager error waiting for completion</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] handle_bitmap_binding (tools/topo/hwloc/topo_hwloc.c:203): unrecognized binding string "cpu:sockets"</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] HYDT_topo_hwloc_init (tools/topo/hwloc/topo_hwloc.c:415): error binding with bind "cpu:sockets" and map "(null)"</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] HYDT_topo_init (tools/topo/topo.c:62): unable to initialize hwloc</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] launch_procs (pm/pmiserv/pmip_cb.c:515): unable to initialize process topology</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:892): launch_procs returned error</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] main (pm/pmiserv/pmip.c:206): demux engine error waiting for event</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[mpiexec@ocean] control_cb (pm/pmiserv/pmiserv_cb.c:200): assert (!closed) failed</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[mpiexec@ocean] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[mpiexec@ocean] HYD_pmci_wait_for_completion (pm/pmiserv/pmiserv_pmci.c:198): error waiting for event</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[mpiexec@ocean] main (ui/mpich/mpiexec.c:344): process manager error waiting for completion</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] handle_bitmap_binding (tools/topo/hwloc/topo_hwloc.c:203): unrecognized binding string "cpu:sockets"</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] HYDT_topo_hwloc_init (tools/topo/hwloc/topo_hwloc.c:415): error binding with bind "cpu:sockets" and map "(null)"</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] HYDT_topo_init (tools/topo/topo.c:62): unable to initialize hwloc</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] launch_procs (pm/pmiserv/pmip_cb.c:515): unable to initialize process topology</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:892): launch_procs returned error</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[proxy:0:0@ocean] main (pm/pmiserv/pmip.c:206): demux engine error waiting for event</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[mpiexec@ocean] control_cb (pm/pmiserv/pmiserv_cb.c:200): assert (!closed) failed</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[mpiexec@ocean] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[mpiexec@ocean] HYD_pmci_wait_for_completion (pm/pmiserv/pmiserv_pmci.c:198): error waiting for event</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">[mpiexec@ocean] main (ui/mpich/mpiexec.c:344): process manager error waiting for completion</span></p>
<p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures">------------------------------------------------</span></p><p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures"><br></span></p><p style="margin:0px;font-size:11px;line-height:normal;font-family:menlo"><span style="font-variant-ligatures:no-common-ligatures"><br></span></p><p style="margin:0px;line-height:normal"><span style="font-variant-ligatures:no-common-ligatures"><font face="arial, helvetica, sans-serif">Im sending the binary file for the other list in a separate mail next, </font></span></p><p style="margin:0px;line-height:normal"><span style="font-variant-ligatures:no-common-ligatures"><font face="arial, helvetica, sans-serif"><br></font></span></p><p style="margin:0px;line-height:normal"><font face="arial, helvetica, sans-serif">Regards,</font></p></div><div class="gmail_extra"><br><div class="gmail_quote">On Sun, Jan 8, 2017 at 4:05 PM, Barry Smith <span dir="ltr"><<a href="mailto:bsmith@mcs.anl.gov" target="_blank">bsmith@mcs.anl.gov</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><br>
Manuel,

   Ok, there are two (actually three) distinct things you need to deal with to get any kind of performance out of this machine.

0) When running on the machine you cannot share it with other people's jobs or you will get timings all over the place, so run streams and benchmarks of your code only when no one else has jobs running (the Unix top command helps).
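   For example, a quick (Linux-specific) sketch for checking that the machine is idle before you time anything:

      uptime                  # load averages should be near zero on an idle machine
      top -b -n 1 | head -20  # one batch-mode snapshot; look for other users' running processes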

1) mpiexec is making bad decisions about process binding (which MPI processes are bound/assigned to which CPU cores).

From streams you have:

np  speedup
1   1.0
2   1.95
3   0.57
4   0.6
5   2.79
6   2.8
7   2.74
8   2.67
9   2.55
10  2.68
.....

This is nuts. When going from 2 to 3 processes the performance goes WAY down. If the machine is empty and MPI did a good assignment of processes to cores, the speedup should not go down as you add cores; it should just stagnate.

So you need to find out how to do process binding with MPI; see http://www.mcs.anl.gov/petsc/documentation/faq.html#computers and the links from there. You can run the streams test with binding by, for example, make streams NPMAX=4 MPI_BINDING="--binding cpu:sockets".
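   The exact binding option names vary across MPI implementations and versions, so if your mpiexec rejects "cpu:sockets", check mpiexec -h for the keywords it accepts. A sketch of alternatives, assuming a newer launcher:

      make streams NPMAX=4 MPI_BINDING="-bind-to socket"     # newer MPICH (Hydra) syntax
      make streams NPMAX=4 MPI_BINDING="--bind-to socket"    # Open MPI syntax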

   Once you have a good binding for your MPI, make sure you always run mpiexec with that binding when running your code.
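   One way to verify the binding actually took effect is to have each rank report the cores it is allowed to run on; a Linux-specific sketch (replace <binding-options> with whatever binding flags your mpiexec accepts):

      mpiexec -n 4 <binding-options> grep Cpus_allowed_list /proc/self/status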

2) Both preconditioners you have tried for your problem are terrible. With block Jacobi the count went from 156 linear iterations (for 5 linear solves) to 546 iterations; with AMG it went from 1463!! iterations to 1760. These are huge numbers of iterations for algebraic multigrid!

   For some reason AMG doesn't like your pressure matrix (even though AMG generally loves pressure matrices). What do you have for boundary conditions for your pressure?

   Please run with -ksp_view_mat binary -ksp_view_rhs binary, then send the resulting file binaryoutput to petsc-maint@mcs.anl.gov and we'll see if we can figure out why AMG doesn't like it.
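   For example, something like the following (assuming the ./ucmsMR executable name from your earlier error output; the binary viewer writes the matrix and right-hand side to a file named binaryoutput in the current directory):

      mpiexec -n 2 ./ucmsMR -pc_type hypre -ksp_view_mat binary -ksp_view_rhs binary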
<div><div class="h5"><br>
<br>
<br>
<br>
<br>
<br>
<br>
> On Jan 8, 2017, at 4:41 PM, Manuel Valera <mvalera@mail.sdsu.edu> wrote:
>
> Ok, I just did the streams and -log_summary tests; I'm attaching the output for each run, with NPMAX=4 and NPMAX=32, plus -log_summary runs with -pc_type hypre and without it, on 1 and 2 cores, all of this with debugging turned off.
>
> The matrix is 200,000 x 200,000: full curvilinear 3D meshes, non-hydrostatic pressure solver.
>
> Thanks a lot for your insight,
>
> Manuel
>
> On Sun, Jan 8, 2017 at 9:48 AM, Barry Smith <bsmith@mcs.anl.gov> wrote:
>
> We need to see the -log_summary with hypre on 1 and 2 processes (with debugging turned off); also we need to see the output from
>
> make streams NPMAX=4
>
> run in the PETSc directory.
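> For the hypre timings, for example (./yourcode is just a placeholder for your executable; -log_summary takes an optional output file name):
>
>    mpiexec -n 1 ./yourcode -pc_type hypre -log_summary log_hypre_np1.txt
>    mpiexec -n 2 ./yourcode -pc_type hypre -log_summary log_hypre_np2.txt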
>
> > On Jan 7, 2017, at 7:38 PM, Manuel Valera <mvalera@mail.sdsu.edu> wrote:
> >
> > Ok great, I tried those command-line args and this is the result:
> >
> > When I use -pc_type gamg:
> >
> > [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > [1]PETSC ERROR: Petsc has generated inconsistent data
> > [1]PETSC ERROR: Have un-symmetric graph (apparently). Use '-pc_gamg_sym_graph true' to symetrize the graph or '-pc_gamg_threshold -1.0' if the matrix is structurally symmetric.
> > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > [1]PETSC ERROR: Petsc Release Version 3.7.4, unknown
> > [1]PETSC ERROR: ./ucmsMR on a arch-linux2-c-debug named ocean by valera Sat Jan 7 17:35:05 2017
> > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack --download-mpich --download-hdf5 --download-netcdf --download-hypre --download-metis --download-parmetis --download-trillinos
> > [1]PETSC ERROR: #1 smoothAggs() line 462 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c
> > [1]PETSC ERROR: #2 PCGAMGCoarsen_AGG() line 998 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c
> > [1]PETSC ERROR: #3 PCSetUp_GAMG() line 571 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c
> > [1]PETSC ERROR: #4 PCSetUp() line 968 in /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c
> > [1]PETSC ERROR: #5 KSPSetUp() line 390 in /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c
> > application called MPI_Abort(comm=0x84000002, 77) - process 1
> >
> >
> > When I use -pc_type gamg and -pc_gamg_sym_graph true:
> >
> > ------------------------------------------------------------------------
> > [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point Exception,probably divide by zero
> > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> > [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> > [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> > [1]PETSC ERROR: ------------------------------------------------------------------------
> > [1]PETSC ERROR: --------------------- Stack Frames ------------------------------------
> > [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> > [1]PETSC ERROR: INSTEAD the line number of the start of the function
> > [1]PETSC ERROR: is given.
> > [1]PETSC ERROR: [1] LAPACKgesvd line 42 /usr/dataC/home/valera/petsc/src/ksp/ksp/impls/gmres/gmreig.c
> > [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues_GMRES line 24 /usr/dataC/home/valera/petsc/src/ksp/ksp/impls/gmres/gmreig.c
> > [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues line 51 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c
> > [1]PETSC ERROR: [1] PCGAMGOptProlongator_AGG line 1187 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c
> > [1]PETSC ERROR: [1] PCSetUp_GAMG line 472 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c
> > [1]PETSC ERROR: [1] PCSetUp line 930 /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c
> > [1]PETSC ERROR: [1] KSPSetUp line 305 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c
> > [0] PCGAMGOptProlongator_AGG line 1187 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c
> > [0]PETSC ERROR: [0] PCSetUp_GAMG line 472 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c
> > [0]PETSC ERROR: [0] PCSetUp line 930 /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c
> > [0]PETSC ERROR: [0] KSPSetUp line 305 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c
> > [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> >
> > When I use -pc_type hypre, it actually shows something different in -ksp_view:
> >
> > KSP Object: 2 MPI processes
> >   type: gcr
> >     GCR: restart = 30
> >     GCR: restarts performed = 37
> >   maximum iterations=10000, initial guess is zero
> >   tolerances:  relative=1e-14, absolute=1e-50, divergence=10000.
> >   right preconditioning
> >   using UNPRECONDITIONED norm type for convergence test
> > PC Object: 2 MPI processes
> >   type: hypre
> >     HYPRE BoomerAMG preconditioning
> >     HYPRE BoomerAMG: Cycle type V
> >     HYPRE BoomerAMG: Maximum number of levels 25
> >     HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1
> >     HYPRE BoomerAMG: Convergence tolerance PER hypre call 0.
> >     HYPRE BoomerAMG: Threshold for strong coupling 0.25
> >     HYPRE BoomerAMG: Interpolation truncation factor 0.
> >     HYPRE BoomerAMG: Interpolation: max elements per row 0
> >     HYPRE BoomerAMG: Number of levels of aggressive coarsening 0
> >     HYPRE BoomerAMG: Number of paths for aggressive coarsening 1
> >     HYPRE BoomerAMG: Maximum row sums 0.9
> >     HYPRE BoomerAMG: Sweeps down 1
> >     HYPRE BoomerAMG: Sweeps up 1
> >     HYPRE BoomerAMG: Sweeps on coarse 1
> >     HYPRE BoomerAMG: Relax down symmetric-SOR/Jacobi
> >     HYPRE BoomerAMG: Relax up symmetric-SOR/Jacobi
> >     HYPRE BoomerAMG: Relax on coarse Gaussian-elimination
> >     HYPRE BoomerAMG: Relax weight (all) 1.
> >     HYPRE BoomerAMG: Outer relax weight (all) 1.
> >     HYPRE BoomerAMG: Using CF-relaxation
> >     HYPRE BoomerAMG: Not using more complex smoothers.
> >     HYPRE BoomerAMG: Measure type local
> >     HYPRE BoomerAMG: Coarsen type Falgout
> >     HYPRE BoomerAMG: Interpolation type classical
> >     HYPRE BoomerAMG: Using nodal coarsening (with HYPRE_BOOMERAMGSetNodal() 1
> >     HYPRE BoomerAMG: HYPRE_BoomerAMGSetInterpVecVariant() 1
> >   linear system matrix = precond matrix:
> >   Mat Object: 2 MPI processes
> >     type: mpiaij
> >     rows=200000, cols=200000
> >     total: nonzeros=3373340, allocated nonzeros=3373340
> >     total number of mallocs used during MatSetValues calls =0
> >       not using I-node (on process 0) routines
> >
> > But still the timing is terrible.
> >
> > On Sat, Jan 7, 2017 at 5:28 PM, Jed Brown <jed@jedbrown.org> wrote:
> > Manuel Valera <mvalera@mail.sdsu.edu> writes:
> >
> > > Awesome Matt and Jed,
> > >
> > > The GCR is used because the matrix is not invertible and because this was the algorithm that the previous library used.
> > >
> > > The preconditioner I'm aiming to use is multigrid; I thought I configured the hypre BoomerAMG solver for this, but I agree that it doesn't show in the log anywhere. How can I be sure it is being used? I sent the -ksp_view log earlier in this thread.
> >
> > Did you run with -pc_type hypre?
> >
> > > I had a problem with the matrix block sizes, so I couldn't make the PETSc native multigrid solver work.
> >
> > What block sizes? If the only variable is pressure, the block size would be 1 (the default).
> >
> > > This is a non-hydrostatic pressure solver; it is an elliptic problem, so multigrid is a must.
> >
> > Yes, multigrid should work well.
>
> <logsumm1hypre.txt> <logsumm1jacobi.txt> <logsumm2hypre.txt> <logsumm2jacobi.txt> <steams4.txt> <steams32.txt>