Ok, I just did the streams and -log_summary tests; I'm attaching the output for each run, with NPMAX=4 and NPMAX=32, plus -log_summary runs with -pc_type hypre and without it, on 1 and 2 cores, all of this with debugging turned off.

The matrix is 200,000 x 200,000, on full curvilinear 3D meshes, for the non-hydrostatic pressure solver.

Thanks a lot for your insight,

Manuel

On Sun, Jan 8, 2017 at 9:48 AM, Barry Smith <bsmith@mcs.anl.gov> wrote:

We need to see the -log_summary output with hypre on 1 and 2 processes (with debugging turned off). We also need to see the output from

    make stream NPMAX=4

run in the PETSc directory.
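
For the -log_summary runs, something along these lines would do (the executable name is taken from the error trace quoted below; adjust the launcher and options to your setup):

    mpiexec -n 1 ./ucmsMR -pc_type hypre -log_summary
    mpiexec -n 2 ./ucmsMR -pc_type hypre -log_summary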

> On Jan 7, 2017, at 7:38 PM, Manuel Valera <mvalera@mail.sdsu.edu> wrote:
>
> Ok great, I tried those command-line args and this is the result:
>
> When I use -pc_type gamg:
>
> [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [1]PETSC ERROR: Petsc has generated inconsistent data
> [1]PETSC ERROR: Have un-symmetric graph (apparently). Use '-pc_gamg_sym_graph true' to symetrize the graph or '-pc_gamg_threshold -1.0' if the matrix is structurally symmetric.
> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [1]PETSC ERROR: Petsc Release Version 3.7.4, unknown
> [1]PETSC ERROR: ./ucmsMR on a arch-linux2-c-debug named ocean by valera Sat Jan 7 17:35:05 2017
> [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack --download-mpich --download-hdf5 --download-netcdf --download-hypre --download-metis --download-parmetis --download-trillinos
> [1]PETSC ERROR: #1 smoothAggs() line 462 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c
> [1]PETSC ERROR: #2 PCGAMGCoarsen_AGG() line 998 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c
> [1]PETSC ERROR: #3 PCSetUp_GAMG() line 571 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c
> [1]PETSC ERROR: #4 PCSetUp() line 968 in /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c
> [1]PETSC ERROR: #5 KSPSetUp() line 390 in /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c
> application called MPI_Abort(comm=0x84000002, 77) - process 1
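>
> For reference, that run was launched roughly like this (the launcher and exact arguments in my job script may differ):
>
>     mpiexec -n 2 ./ucmsMR -pc_type gamg -ksp_view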
>
> When I use -pc_type gamg and -pc_gamg_sym_graph true:
>
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point Exception,probably divide by zero
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [1]PETSC ERROR: ------------------------------------------------------------------------
> [1]PETSC ERROR: --------------------- Stack Frames ------------------------------------
> [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [1]PETSC ERROR: INSTEAD the line number of the start of the function
> [1]PETSC ERROR: is given.
> [1]PETSC ERROR: [1] LAPACKgesvd line 42 /usr/dataC/home/valera/petsc/src/ksp/ksp/impls/gmres/gmreig.c
> [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues_GMRES line 24 /usr/dataC/home/valera/petsc/src/ksp/ksp/impls/gmres/gmreig.c
> [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues line 51 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c
> [1]PETSC ERROR: [1] PCGAMGOptProlongator_AGG line 1187 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c
> [1]PETSC ERROR: [1] PCSetUp_GAMG line 472 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c
> [1]PETSC ERROR: [1] PCSetUp line 930 /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c
> [1]PETSC ERROR: [1] KSPSetUp line 305 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c
> [0] PCGAMGOptProlongator_AGG line 1187 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c
> [0]PETSC ERROR: [0] PCSetUp_GAMG line 472 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c
> [0]PETSC ERROR: [0] PCSetUp line 930 /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: [0] KSPSetUp line 305 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
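>
> As the message suggests, this case could be rerun under a debugger to locate the divide by zero, roughly like this (the executable and process count are just an example):
>
>     mpiexec -n 2 ./ucmsMR -pc_type gamg -pc_gamg_sym_graph true -start_in_debugger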
>
> When I use -pc_type hypre, it actually shows something different in -ksp_view:
>
> KSP Object: 2 MPI processes
>   type: gcr
>     GCR: restart = 30
>     GCR: restarts performed = 37
>   maximum iterations=10000, initial guess is zero
>   tolerances:  relative=1e-14, absolute=1e-50, divergence=10000.
>   right preconditioning
>   using UNPRECONDITIONED norm type for convergence test
> PC Object: 2 MPI processes
>   type: hypre
>     HYPRE BoomerAMG preconditioning
>     HYPRE BoomerAMG: Cycle type V
>     HYPRE BoomerAMG: Maximum number of levels 25
>     HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1
>     HYPRE BoomerAMG: Convergence tolerance PER hypre call 0.
>     HYPRE BoomerAMG: Threshold for strong coupling 0.25
>     HYPRE BoomerAMG: Interpolation truncation factor 0.
>     HYPRE BoomerAMG: Interpolation: max elements per row 0
>     HYPRE BoomerAMG: Number of levels of aggressive coarsening 0
>     HYPRE BoomerAMG: Number of paths for aggressive coarsening 1
>     HYPRE BoomerAMG: Maximum row sums 0.9
>     HYPRE BoomerAMG: Sweeps down 1
>     HYPRE BoomerAMG: Sweeps up 1
>     HYPRE BoomerAMG: Sweeps on coarse 1
>     HYPRE BoomerAMG: Relax down symmetric-SOR/Jacobi
>     HYPRE BoomerAMG: Relax up symmetric-SOR/Jacobi
>     HYPRE BoomerAMG: Relax on coarse Gaussian-elimination
>     HYPRE BoomerAMG: Relax weight (all) 1.
>     HYPRE BoomerAMG: Outer relax weight (all) 1.
>     HYPRE BoomerAMG: Using CF-relaxation
>     HYPRE BoomerAMG: Not using more complex smoothers.
>     HYPRE BoomerAMG: Measure type local
>     HYPRE BoomerAMG: Coarsen type Falgout
>     HYPRE BoomerAMG: Interpolation type classical
>     HYPRE BoomerAMG: Using nodal coarsening (with HYPRE_BOOMERAMGSetNodal() 1
>     HYPRE BoomerAMG: HYPRE_BoomerAMGSetInterpVecVariant() 1
>   linear system matrix = precond matrix:
>   Mat Object: 2 MPI processes
>     type: mpiaij
>     rows=200000, cols=200000
>     total: nonzeros=3373340, allocated nonzeros=3373340
>     total number of mallocs used during MatSetValues calls =0
>       not using I-node (on process 0) routines
>
> But still the timing is terrible.
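>
> To see where the time actually goes, I can rerun with profiling enabled, for example (launcher and process count again just as an example):
>
>     mpiexec -n 2 ./ucmsMR -pc_type hypre -log_summary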
>
> On Sat, Jan 7, 2017 at 5:28 PM, Jed Brown <jed@jedbrown.org> wrote:
> Manuel Valera <mvalera@mail.sdsu.edu> writes:
>
> > Awesome, Matt and Jed.
> >
> > GCR is used because the matrix is not invertible and because this was
> > the algorithm that the previous library used.
> >
> > The preconditioner I'm aiming to use is multigrid; I thought I had configured
> > the hypre BoomerAMG solver for this, but I agree that it doesn't show up in
> > the log anywhere. How can I be sure it is being used? I sent the -ksp_view log
> > earlier in this thread.
>
> Did you run with -pc_type hypre?
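>
> Selecting it on the command line would look something like this (the executable
> name and process count here are just placeholders):
>
>     mpiexec -n 2 ./ucmsMR -ksp_type gcr -pc_type hypre -pc_hypre_type boomeramg -ksp_view
>
> and the -ksp_view output should then show a PC of type hypre.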
>
> > I had a problem with the matrix block sizes, so I couldn't make the PETSc
> > native multigrid solver work.
>
> What block sizes? If the only variable is pressure, the block size
> would be 1 (the default).
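>
> If the block size is being set explicitly somewhere in the code, for a scalar
> pressure unknown it would just be, roughly,
>
>     MatSetBlockSize(A, 1);   /* A stands in for whatever the matrix is called */
>
> and leaving it unset gives the same default of 1.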
>
> > This is a non-hydrostatic pressure solver; it is an elliptic problem, so
> > multigrid is a must.
>
> Yes, multigrid should work well.
>