<div dir="ltr">Good, sorry for the confusion, I thought this stuff was squared away in v3.6 and certainly in v3.6.2. (its a little disconcerting that it failed. it would be interesting to see if master fails if you set gmres for the coarse grid solver, ie, was gmres really the problem?)<div><div>Mark</div></div></div><div class="gmail_extra"><br><div class="gmail_quote">On Mon, Oct 5, 2015 at 3:53 PM, Gil Forsyth <span dir="ltr"><<a href="mailto:gforsyth@gwu.edu" target="_blank">gforsyth@gwu.edu</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr">Hi Barry and Mark, <div><br></div><div>Everything is now working as expected on PETSc master, both in serial and parallel. Many thanks for all of your help.</div><span class="HOEnZb"><font color="#888888"><div><br></div><div>Gil Forsyth</div></font></span></div><div class="HOEnZb"><div class="h5"><div class="gmail_extra"><br><div class="gmail_quote">On Mon, Oct 5, 2015 at 2:51 PM, Barry Smith <span dir="ltr"><<a href="mailto:bsmith@mcs.anl.gov" target="_blank">bsmith@mcs.anl.gov</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><br>
Looks like the bug of using gmres on the coarse mesh is still there in the latest patch release.

If you switch to PETSc master (http://www.mcs.anl.gov/petsc/developers/index.html) it will not use gmres.
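
In the meantime, a possible workaround sketch from code (assuming ksp2 is your Poisson KSP with GAMG attached, as in the snippets quoted below; untested): after KSPSetFromOptions(), force the coarse grid KSP back to preonly yourself.

#+BEGIN_SRC
  PC pc;
  KSP coarseksp;
  ierr = KSPSetUp(ksp2); CHKERRQ(ierr);                      /* build the GAMG hierarchy first */
  ierr = KSPGetPC(ksp2, &pc); CHKERRQ(ierr);
  ierr = PCMGGetCoarseSolve(pc, &coarseksp); CHKERRQ(ierr);  /* coarse-grid KSP of the MG hierarchy */
  ierr = KSPSetType(coarseksp, KSPPREONLY); CHKERRQ(ierr);
#+END_SRC

Check with -poisson_ksp_view that the coarse solver really ends up as preonly.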

Barry

> On Oct 5, 2015, at 9:06 AM, Gil Forsyth <gforsyth@gwu.edu> wrote:
>
> Hi Mark,
>
> I've lost it, too. I was bisecting to find the change that started returning the indefinite PC error in our code that had previously worked -- but this was using KSPSetNullSpace.
> Increasing the number of steps was an effort to see whether it impacted the error, partially based on this thread from petsc-users (http://lists.mcs.anl.gov/pipermail/petsc-users/2014-November/023653.html) with one of the previous authors of our code.
>
> Updating to PETSc master briefly eliminated the error in the Poisson solver, but only in serial; it still failed with an indefinite PC error in parallel.
>
> I'll confess that I'm not sure what to bisect between, as we don't have a "good" version after the switch from KSPSetNullSpace -> MatSetNullSpace. That's what prompted the initial bisection search in and around the 3.5.4 commit range. I'm going to take another crack at that today in a more automated fashion, since I expect I introduced some human error somewhere along the way.
>
> Compiled against PETSc v3.6.2, the run again shows the coarse grid solver using GMRES, even with -poisson_mg_coarse_ksp_type preonly. Logs are attached.
>
> $PETSC_ARCH/bin/petibm2d -directory examples/2d/lidDrivenCavity/Re100 -poisson_mg_coarse_ksp_type preonly -poisson_pc_type gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 0 -poisson_ksp_view -poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason -info
>
> On Sun, Oct 4, 2015 at 8:57 AM, Mark Adams <mfadams@lbl.gov> wrote:
> I've lost this thread a bit, but you seem to be bisecting to find where a problem started, and you are noticing the gmres coarse grid solver. We fixed a bug where PETSc was resetting the coarse grid solver to GMRES when it should not, so older versions have this problem. The current version does not reset the coarse grid solver type; I believe the fix has been in place for all of v3.6, though it might have missed v3.6.1. GAMG sets the coarse grid solver type to preonly, but you can override it. Let me know if I'm missing something here.
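>
> For example, with the option prefix from your runs you can set the coarse solver explicitly; a sketch (the second line just illustrates overriding, it is not a recommendation):
>
>   -poisson_mg_coarse_ksp_type preonly
>   -poisson_mg_coarse_ksp_type richardson -poisson_mg_coarse_ksp_max_it 2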
>
> I also see that you are setting -pc_gamg_agg_nsmooths to 1, 2, 3, 4. This is the number of smoothing steps applied to the prolongation operator, and you should not use more than 1. In fact, for CFD you should probably use no smoothing (0).
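>
> (For reference, a sketch of what this parameter controls, in the standard smoothed-aggregation form rather than anything read out of the GAMG source: each smoothing step applies one damped-Jacobi sweep to the tentative prolongator, P = (I - \omega D^{-1} A) P_tent with \omega \approx 4/(3 \rho(D^{-1} A)), so nsmooths 0 keeps the plain aggregation prolongator P = P_tent.)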
>
> Mark
>
> On Thu, Oct 1, 2015 at 4:27 PM, Gil Forsyth <gforsyth@gwu.edu> wrote:
> Hi Mark,
>
> I just noticed that the previous commit, 7743f89, also uses GMRES in the multigrid solve but doesn't complain until the 2nd timestep, so my bisection criterion was off: I was giving commits a PASS if they made it through one timestep without complaining about the indefinite PC. I think I'm still close to the problem commit, but it's probably a little before 25a145a. Apologies for the goose chase.
>
> Thanks,
> Gil Forsyth
>
> On Thu, Oct 1, 2015 at 4:21 PM, Gil Forsyth <gforsyth@gwu.edu> wrote:
> I ran for one timestep against 3.5.4 with
> #+BEGIN_SRC
> petibm-3.5.4/bin/petibm2d -directory examples/2d/lidDrivenCavity/Re100 -poisson_pc_type gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view -poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason > kspview_3.5.4.log
> #+END_SRC
>
> and then against 25a145a with the same inputs. I notice that the Poisson multigrid solve in 25a145a is using GMRES again, while 3.5.4 uses preonly.
>
> Logs from both runs are attached.
>
> On Thu, Oct 1, 2015 at 3:52 PM, Mark Adams <mfadams@lbl.gov> wrote:
> Can you also please send a good log, with the ksp_view?
> Mark
>
> On Wed, Sep 30, 2015 at 3:11 PM, Gil Forsyth <gforsyth@gwu.edu> wrote:
> Using the PETSc master branch solved the problem in serial, but I'm still seeing the same KSP_DIVERGED_INDEFINITE_PC error when I run with MPI. This runs to completion when I don't use GAMG. A log is attached for the following run.
>
> $PETSC_DIR/$PETSC_ARCH/bin/mpirun -n 2 $PETIBM_DIR/petibm-git/bin/petibm2d -directory . -poisson_pc_type gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view -poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason
>
> Thanks again,
> Gil Forsyth
>
> On Tue, Sep 29, 2015 at 1:12 PM, Gil Forsyth <gforsyth@gwu.edu> wrote:
> Ah, got it. I'll check out the master branch and see if the behavior persists.
>
> Many thanks,
> Gil
>
> On Tue, Sep 29, 2015 at 1:10 PM, Matthew Knepley <knepley@gmail.com> wrote:
> On Tue, Sep 29, 2015 at 12:08 PM, Gil Forsyth <gforsyth@gwu.edu> wrote:
> PETSc is version 3.6.1 -- I just included a log from 3.5.4 to show that the behavior seems to have changed between versions. The only difference in our code between 3.5.4 and 3.6.1 is the change from KSPSetNullSpace to MatSetNullSpace.
>
> Mark made some GAMG changes which were later reversed because they had unintended consequences like this.
> I think what Barry means is, "you should get the behavior you expect using the master branch from PETSc development."
>
> Thanks,
>
> Matt
>
> On Tue, Sep 29, 2015 at 1:04 PM, Barry Smith <bsmith@mcs.anl.gov> wrote:
>
> Update your PETSc.
>
> > On Sep 29, 2015, at 12:00 PM, Gil Forsyth <gforsyth@gwu.edu> wrote:
> >
> > Hi Barry,
> >
> > We aren't explicitly setting GMRES anywhere in the code, and I'm not sure why it's being used. Running our 3.5.4 code, which uses KSPSetNullSpace, works with:
> >
> > $PETIBM_DIR/petibm3.5/bin/petibm2d -directory . -poisson_pc_type gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view -poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason > kspview3.5.4
> >
> > and shows that the coarse grid solver is of type preonly.
> >
> > Running the newer version, which uses MatSetNullSpace in its stead, and adding -poisson_mg_coarse_ksp_type preonly,
> >
> > $PETIBM_DIR/petibm3.6/bin/petibm2d -directory . -poisson_pc_type gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1 -poisson_mg_coarse_ksp_type preonly -poisson_ksp_view -poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason > kspview3.6.1
> >
> > still shows
> >
> > KSP Object:(poisson_) 1 MPI processes
> >   type: cg
> >   maximum iterations=10000
> >   tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
> >   left preconditioning
> >   using nonzero initial guess
> >   using PRECONDITIONED norm type for convergence test
> > PC Object:(poisson_) 1 MPI processes
> >   type: gamg
> >     MG: type is MULTIPLICATIVE, levels=3 cycles=v
> >       Cycles per PCApply=1
> >       Using Galerkin computed coarse grid matrices
> >       GAMG specific options
> >         Threshold for dropping small values from graph 0
> >         AGG specific options
> >           Symmetric graph false
> >   Coarse grid solver -- level -------------------------------
> >     KSP Object: (poisson_mg_coarse_) 1 MPI processes
> >       type: gmres
> >         GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
> >         GMRES: happy breakdown tolerance 1e-30
> >       maximum iterations=1, initial guess is zero
> >       tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
> >       left preconditioning
> >       using NONE norm type for convergence test
> >
> > Both logs are attached.
> >
> > On Tue, Sep 29, 2015 at 12:37 PM, Barry Smith <bsmith@mcs.anl.gov> wrote:
> >
> > This can't work: you can't use GMRES inside CG, since CG needs a fixed symmetric positive definite preconditioner and an inner GMRES sweep makes the preconditioner nonsymmetric and nonlinear. Try changing to -poisson_mg_coarse_ksp_type preonly
> >
> > KSP Object:(poisson_) 1 MPI processes
> >   type: cg
> >
> >     KSP Object: (poisson_mg_coarse_) 1 MPI processes
> >       type: gmres
> >         GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
> >         GMRES: happy breakdown tolerance 1e-30
> >       maximum iterations=1, initial guess is zero
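> >
> > (If an iterative coarse solve were ever actually wanted, the outer Krylov method would have to be one that tolerates a variable preconditioner, e.g. -poisson_ksp_type fgmres instead of cg; for this symmetric problem, preonly on the coarse grid is the right choice.)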
> >
> > > On Sep 29, 2015, at 10:53 AM, Gil Forsyth <gforsyth@gwu.edu> wrote:
> > >
> > > On Tue, Sep 29, 2015 at 11:42 AM, Matthew Knepley <knepley@gmail.com> wrote:
> > > On Tue, Sep 29, 2015 at 10:28 AM, Gil Forsyth <gforsyth@gwu.edu> wrote:
> > > Hi all,
> > >
> > > I've been having some trouble with what should be a relatively simple update of an immersed boundary CFD solver from PETSc 3.5.4 to 3.6.1.
> > >
> > > I'm getting indefinite PC errors for a simple lid-driven cavity test problem (32x32 at Re 100).
> > >
> > > Under PETSc 3.5.4, we used the following to set the null space via KSPSetNullSpace. This is for a 2D Poisson system with no immersed boundary, so the null space is the constant vector.
> > >
> > > MatNullSpace nsp;
> > > ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp); CHKERRQ(ierr);
> > > ierr = KSPSetNullSpace(ksp2, nsp); CHKERRQ(ierr);
> > > ierr = MatNullSpaceDestroy(&nsp); CHKERRQ(ierr);
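> > >
> > > (PETSC_TRUE here is MatNullSpaceCreate's has_cnst argument, so nsp contains just the constant vector; no additional basis vectors are passed.)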
> > >
> > > Clearly this has to happen in the reverse order, since ksp2 would not be created yet.
> > >
> > > For questions about solvers, we HAVE to see the complete output of -ksp_view so we
> > > know what we are dealing with. It's also nice to have -ksp_monitor_true_residual -ksp_converged_reason
> > >
> > > Matt
> > >
> > > Yes -- sorry, those are both in inline files and are called in the reverse order from how I wrote them out.
> > >
> > > I've attached the output of
> > >
> > > $PETIBM_DIR/petibm3.6/bin/petibm2d -directory . -poisson_pc_type gamg -poisson_pc_gamg_type agg -poisson_gamg_agg_nsmooths 1 -poisson_ksp_view -poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason > kspview.log
> > >
> > > And then set up the KSP with
> > > ierr = KSPCreate(PETSC_COMM_WORLD, &ksp2); CHKERRQ(ierr);
> > > ierr = KSPSetOptionsPrefix(ksp2, "poisson_"); CHKERRQ(ierr);
> > > ierr = KSPSetOperators(ksp2, QTBNQ, QTBNQ); CHKERRQ(ierr);
> > > ierr = KSPSetInitialGuessNonzero(ksp2, PETSC_TRUE); CHKERRQ(ierr);
> > > ierr = KSPSetType(ksp2, KSPCG); CHKERRQ(ierr);
> > > ierr = KSPSetReusePreconditioner(ksp2, PETSC_TRUE); CHKERRQ(ierr);
> > > ierr = KSPSetFromOptions(ksp2); CHKERRQ(ierr);
> > >
> > > The matrix QTBNQ does not change; only the RHS of the system is updated.
> > >
> > > We run this with `-pc_type gamg -pc_gamg_type agg -pc_gamg_agg_nsmooths 1` and everything seems to work as expected.
> > >
> > > Under PETSc 3.6.1, we change only the KSPSetNullSpace line, to
> > >
> > > ierr = MatSetNullSpace(QTBNQ, nsp); CHKERRQ(ierr);
> > >
> > > and the same code diverges after 1 timestep and returns a -8 KSP_DIVERGED_INDEFINITE_PC.
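> > >
> > > For reference, the full null-space setup under 3.6.1 is then the same four lines with only the KSP call swapped out:
> > >
> > > MatNullSpace nsp;
> > > ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp); CHKERRQ(ierr);
> > > ierr = MatSetNullSpace(QTBNQ, nsp); CHKERRQ(ierr);
> > > ierr = MatNullSpaceDestroy(&nsp); CHKERRQ(ierr);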
> > >
> > > This is weird, especially because if we change nsmooths to 2, it runs for 264 timesteps and then returns the same error. But we have explicitly set KSPSetReusePreconditioner, so it should be using the same PC, right?
> > >
> > > Change nsmooths to 3 and it again diverges after 1 timestep.
> > >
> > > Change nsmooths to 4 and it runs to completion.
> > >
> > > It seems like either GAMG's behavior has changed, or KSPSetNullSpace was doing something implicitly that we now need to do explicitly in addition to MatSetNullSpace?
> > >
> > > Thanks,
> > > Gil Forsyth
> > >
> > > --
> > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> > > -- Norbert Wiener
> > >
> > > <kspview.log>
> >
> > <kspview3.5.4><kspview3.6.1>
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> <kspview_arch-3264318.log>
<br>