On Aug 9, 2013, at 1:08 PM, Bishesh Khanal <bisheshkh@gmail.com> wrote:

> On Fri, Aug 9, 2013 at 6:58 PM, Matthew Knepley <knepley@gmail.com> wrote:
>
>> On Fri, Aug 9, 2013 at 11:52 AM, Bishesh Khanal <bisheshkh@gmail.com> wrote:
>>
>>> Dear all,
>>> I was experimenting with my Stokes problem on a 3D staggered grid with a high viscosity jump, using -pc_fieldsplit of type Schur complement. Using the hypre pilut preconditioner for the KSP of the A00 block seemed to give nice results for smaller sizes. With the following options, it WORKS fine on my laptop and on the cluster I'm using with ONE node and multiple cores:
>>>
>>> -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view -fieldsplit_0_ksp_type gcr -fieldsplit_0_ksp_rtol 1.0e-5 -fieldsplit_0_pc_type hypre -fieldsplit_0_pc_hypre_type pilut
>>>
>>> But when I submit jobs with multiple nodes, the process never seems to end! When using gamg instead of hypre, the same program works with multiple nodes on the same cluster, but gamg gave much slower convergence than hypre, so I wanted to use hypre.
>>
>> Did you give a near null space to GAMG (probably the 3 translational and 3 rotational modes for this problem)? Without these, convergence can be quite slow.
>
> I have enforced zero velocity on the boundary, i.e. on all the faces of the cube, by changing the corresponding rows of the system matrix. With this I think the null space would just correspond to the constant pressure. For this I set the null space using MatNullSpace, for both the outer-level KSP and the KSP object for the Schur complement. Using MatNullSpaceTest seemed to return a good (true) value for both of these null spaces.
>
> Please correct me if I'm doing something wrong or something not preferred!

We are talking about the 00 block solver (velocity), and there are 6 (near) null space vectors for it. GAMG does want to be told that you have 3 dof/node.

Should the DM do this!!!

If you tell GAMG there is only one null vector (the (1,1,1) vector), then that will mess it up. If you set the block size on the matrix (to 3), which should be done by PETSc, and do not set a null space, GAMG will construct the default null space: three constant functions. This should work OK. As Matt pointed out, you want to give it all six (the 3 constant vectors and the 3 rotational null space vectors) to get an optimal solver; a sketch of one way to do this is below.

GAMG is an AMG method like hypre's BoomerAMG. BoomerAMG is very robust and I would recommend using it to start. GAMG might be better for vector-valued problems, but hypre is probably pretty good (especially for low-order discretizations).

Mark
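To make that concrete, here is a minimal, untested sketch of handing GAMG the rigid body modes. "A00" and "coords" are stand-ins for your assembled velocity block and a nodal-coordinate Vec with block size 3, laid out like the velocity unknowns:

    /* Untested sketch: attach the 6 rigid body modes (3 translations +
       3 rotations) to the velocity block as GAMG's near null space.
       "A00" and "coords" are placeholders for your velocity matrix and
       coordinate Vec. */
    MatNullSpace   nearnull;
    PetscErrorCode ierr;

    ierr = MatSetBlockSize(A00, 3);CHKERRQ(ierr);            /* 3 dof/node */
    ierr = MatNullSpaceCreateRigidBody(coords, &nearnull);CHKERRQ(ierr);
    ierr = MatSetNearNullSpace(A00, nearnull);CHKERRQ(ierr); /* GAMG reads this */
    ierr = MatNullSpaceDestroy(&nearnull);CHKERRQ(ierr);

If the velocity field lives on a DM, DMGetCoordinates() should give you a suitable coordinate vector. And to try BoomerAMG with your current options, I believe only the hypre type option changes: -fieldsplit_0_pc_hypre_type boomeramg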
>>> When I kill the job and look at the error file, this is the error it reports:
>>
>> It looks like PILUT is just slow.
>
> But using PILUT gives me results for smaller sizes with a single node, and the failure with multiple nodes occurs for the same problem size!! So if it were just slower, it would probably have given results on my laptop too, right?
>>
>>    Matt
>>
>>> [8]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
>>> [8]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
>>> [8]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>>> [8]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
>>> [8]PETSC ERROR: likely location of problem given in stack below
>>> [8]PETSC ERROR: --------------------- Stack Frames ------------------------------------
>>> [8]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
>>> [8]PETSC ERROR:       INSTEAD the line number of the start of the function
>>> [8]PETSC ERROR:       is given.
>>> [8]PETSC ERROR: [8] HYPRE_SetupXXX line 130 /tmp/petsc-3.4.1/src/ksp/pc/impls/hypre/hypre.c
>>> [8]PETSC ERROR: [8] PCSetUp_HYPRE line 94 /tmp/petsc-3.4.1/src/ksp/pc/impls/hypre/hypre.c
>>> [8]PETSC ERROR: [8] PCSetUp line 868 /tmp/petsc-3.4.1/src/ksp/pc/interface/precon.c
>>> [8]PETSC ERROR: [8] KSPSetUp line 192 /tmp/petsc-3.4.1/src/ksp/ksp/interface/itfunc.c
>>> [8]PETSC ERROR: [8] KSPSolve line 356 /tmp/petsc-3.4.1/src/ksp/ksp/interface/itfunc.c
>>> [8]PETSC ERROR: [8] MatMult_SchurComplement line 75 /tmp/petsc-3.4.1/src/ksp/ksp/utils/schurm.c
>>> [8]PETSC ERROR: [8] MatNullSpaceTest line 408 /tmp/petsc-3.4.1/src/mat/interface/matnull.c
>>> [8]PETSC ERROR: [8] solveModel line 133 "unknowndirectory/"/epi/asclepios2/bkhanal/works/AdLemModel/src/PetscAdLemTaras3D.cxx
>>> [8]PETSC ERROR: --------------------- Error Message ------------------------------------
>>> [8]PETSC ERROR: Signal received!
>>> [8]PETSC ERROR: ------------------------------------------------------------------------
>>> [8]PETSC ERROR: Petsc Release Version 3.4.1, Jun, 10, 2013
>>> [8]PETSC ERROR: See docs/changes/index.html for recent updates.
>>> [8]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>> [8]PETSC ERROR: See docs/index.html for manual pages.
>>> [8]PETSC ERROR: ------------------------------------------------------------------------
>>> [8]PETSC ERROR: /epi/asclepios2/bkhanal/works/AdLemModel/build/src/AdLemMain on a arch-linux2-cxx-debug named nef002 by bkhanal Fri Aug 9 18:00:22 2013
>>> [8]PETSC ERROR: Libraries linked from /home/bkhanal/petsc/lib
>>> [8]PETSC ERROR: Configure run at Mon Jul 1 13:44:30 2013
>>> [8]PETSC ERROR: Configure options --with-mpi-dir=/opt/openmpi-gcc/current/ --with-shared-libraries --prefix=/home/bkhanal/petsc -download-f-blas-lapack=1 --download-hypre --with-clanguage=cxx
>>> [8]PETSC ERROR: ------------------------------------------------------------------------
>>> [8]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
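For what it's worth, the stack above shows the segfault happening inside hypre's setup (PCSetUp_HYPRE), reached through MatNullSpaceTest: testing a null space against the Schur complement applies S (MatMult_SchurComplement), which triggers an inner KSPSolve on the 00 block, where pilut then gets set up. That is consistent with a crash in pilut's parallel setup rather than just slowness. A rough, untested sketch of the call being exercised, with "S" as a placeholder for your Schur complement Mat and the constant-pressure null space you described:

    /* Untested sketch: testing a constant-pressure null space against
       the Schur complement. "S" is a placeholder for the Schur Mat. */
    MatNullSpace   nsp;
    PetscBool      isNull;
    PetscErrorCode ierr;

    ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp);CHKERRQ(ierr);
    ierr = MatNullSpaceTest(nsp, S, &isNull);CHKERRQ(ierr); /* applies S, hence the inner A00 solve */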
>>
>> --
>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>> -- Norbert Wiener