[petsc-users] Singular values of the GMRES Hessenberg matrix
Matthew Knepley
knepley at gmail.com
Fri May 24 07:48:51 CDT 2019
On Fri, May 24, 2019 at 8:38 AM Dave Lee <davelee2804 at gmail.com> wrote:
> Thanks Matt, great suggestion.
>
> I did indeed find a transpose error this way. The matrix reconstructed from
> the SVD as U S V^T now matches the input Hessenberg matrix extracted via
> the *HES(row,col) macro, and all the singular values are non-zero. However,
> the solution to example src/ksp/ksp/examples/tutorials/ex1.c, as determined
> via the expansion over the singular vectors, is still not correct. I suspect
> I'm doing something wrong with regard to the expansion over the vec array
> VEC_VV(), which I assume holds the orthonormal vectors (the columns of Q_k)
> from the Arnoldi iteration.
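(For reference, in the usual GMRES notation the vectors stored in VEC_VV() are
the Arnoldi basis Q_k, which satisfies

    A Q_k = Q_{k+1} H~_k,    H~_k the (k+1) x k Hessenberg matrix,

and the approximate solution is expanded over that basis as

    x_k = x_0 + Q_k y_k,

so the weights y_k obtained from the small Hessenberg problem are exactly the
coefficients applied to the VEC_VV() vectors, at least in the unpreconditioned
case.)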
>
Here we are building the solution:
https://bitbucket.org/petsc/petsc/src/7c23e6aa64ffbff85a2457e1aa154ec3d7f238e3/src/ksp/ksp/impls/gmres/gmres.c#lines-331
There are some subtleties if you have a nonzero initial guess or a
preconditioner.
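Roughly, the tail of that routine assembles the update as in the sketch below.
This is a paraphrase using the macro names from gmresimpl.h, not the exact
code; the real routine also handles a few corner cases:

  /* combine the Arnoldi basis vectors with the weights nrs[] from the
     small Hessenberg solve */
  ierr = VecSet(VEC_TEMP,0.0);CHKERRQ(ierr);
  ierr = VecMAXPY(VEC_TEMP,it+1,nrs,&VEC_VV(0));CHKERRQ(ierr);
  /* undoes right preconditioning if it is in use */
  ierr = KSPUnwindPreconditioner(ksp,VEC_TEMP,VEC_TEMP_MATOP);CHKERRQ(ierr);
  /* add the correction to the (possibly nonzero) initial guess */
  if (vdest != vguess) {ierr = VecCopy(vguess,vdest);CHKERRQ(ierr);}
  ierr = VecAXPY(vdest,1.0,VEC_TEMP);CHKERRQ(ierr);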
Thanks,
Matt
> Thanks again for your advice, I'll keep digging.
>
> Cheers, Dave.
>
> On Thu, May 23, 2019 at 8:20 PM Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Thu, May 23, 2019 at 5:09 AM Dave Lee via petsc-users <
>> petsc-users at mcs.anl.gov> wrote:
>>
>>> Hi PETSc,
>>>
>>> I'm trying to add a "hook step" to the SNES trust region solver (at the
>>> end of the function KSPGMRESBuildSoln()).
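(For context: the hook step replaces the unconstrained least-squares problem
min_y ||beta e_1 - H~ y||_2 with the constrained problem ||y|| <= delta.
Writing the SVD as H~ = U S V^T and p = U^T (beta e_1), the constrained
solution is typically expressed as

    y(mu) = sum_i  s_i p_i / (s_i^2 + mu)  v_i,    mu >= 0 chosen so that ||y(mu)|| <= delta,

and mu = 0 recovers the ordinary GMRES solution, which is what the "\mu = 0"
comment in the code quoted further down refers to.)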
>>>
>>> I'm testing this using the (linear) example:
>>> src/ksp/ksp/examples/tutorials/ex1.c
>>> as
>>> gdb --args ./test -snes_mf -snes_type newtontr -ksp_rtol 1.0e-12
>>> -snes_stol 1.0e-12 -ksp_converged_reason -snes_converged_reason
>>> -ksp_monitor -snes_monitor
>>> (Ignore the SNES stuff; that is for when I test nonlinear examples.)
>>>
>>> When I call the LAPACK SVD routine via PETSc as
>>> PetscStackCallBLAS("LAPACKgesvd",LAPACKgesvd_(...))
>>> I get the following singular values:
>>>
>>> 0 KSP Residual norm 7.071067811865e-01
>>> 1 KSP Residual norm 3.162277660168e-01
>>> 2 KSP Residual norm 1.889822365046e-01
>>> 3 KSP Residual norm 1.290994448736e-01
>>> 4 KSP Residual norm 9.534625892456e-02
>>> 5 KSP Residual norm 8.082545620881e-16
>>>
>>> 1 0.5 -7.85046e-16 1.17757e-15
>>> 0.5 1 0.5 1.7271e-15
>>> 0 0.5 1 0.5
>>> 0 0 0.5 1
>>> 0 0 0 0.5
>>>
>>> singular values: 2.36264 0.409816 1.97794e-15 6.67632e-16
>>>
>>> Linear solve converged due to CONVERGED_RTOL iterations 5
>>>
>>> The lines above the singular values are the Hessenberg matrix that I'm
>>> doing the SVD on.
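(A quick check on the printed matrix: solving H~ y = 0 from the bottom row up
gives 0.5*y4 = 0, then 0.5*y3 + y4 = 0, then 0.5*y2 + y3 + 0.5*y4 = 0, then
0.5*y1 + y2 + 0.5*y3 = 0, so the only solution is y = 0. The 5 x 4 Hessenberg
therefore has full column rank, and all four singular values should be
non-zero, which is consistent with the transpose error reported at the top of
this thread.)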
>>>
>>
>> First, write out all the SVD matrices you get and make sure that they
>> reconstruct the input matrix (that
>> you do not have something transposed somewhere).
>>
>> Matt
>>
>>
>>> When I build the solution in terms of the two leading right singular
>>> vectors (and correspondingly the first two orthonormal basis vectors in
>>> VEC_VV), I get an error norm of:
>>> Norm of error 3.16228, Iterations 5
>>>
>>> My suspicion is that I'm creating the Hessenberg incorrectly, as I would
>>> have thought that this problem should have more than two non-zero leading
>>> singular values.
>>>
>>> Within my modified version of the GMRES build-solution function
>>> (attached), I'm creating the Hessenberg and passing it to LAPACK as:
>>>
>>>   nRows = gmres->it+1;
>>>   nCols = nRows-1;
>>>
>>>   ierr = PetscBLASIntCast(nRows,&nRows_blas);CHKERRQ(ierr);
>>>   ierr = PetscBLASIntCast(nCols,&nCols_blas);CHKERRQ(ierr);
>>>   ierr = PetscBLASIntCast(5*nRows,&lwork);CHKERRQ(ierr);
>>>   ierr = PetscMalloc1(5*nRows,&work);CHKERRQ(ierr);
>>>   ierr = PetscMalloc1(nRows*nCols,&R);CHKERRQ(ierr);
>>>   ierr = PetscMalloc1(nRows*nCols,&H);CHKERRQ(ierr);
>>>   for (jj = 0; jj < nRows; jj++) {
>>>     for (ii = 0; ii < nCols; ii++) {
>>>       R[jj*nCols+ii] = *HES(jj,ii);
>>>     }
>>>   }
>>>   // Duplicate the Hessenberg matrix, as the one passed to the SVD solver is destroyed
>>>   for (ii=0; ii<nRows*nCols; ii++) H[ii] = R[ii];
>>>
>>>   ierr = PetscMalloc1(nRows*nRows,&U);CHKERRQ(ierr);
>>>   ierr = PetscMalloc1(nCols*nCols,&VT);CHKERRQ(ierr);
>>>   ierr = PetscMalloc1(nRows*nRows,&UT);CHKERRQ(ierr);
>>>   ierr = PetscMalloc1(nCols*nCols,&V);CHKERRQ(ierr);
>>>   ierr = PetscMalloc1(nRows,&p);CHKERRQ(ierr);
>>>   ierr = PetscMalloc1(nCols,&q);CHKERRQ(ierr);
>>>   ierr = PetscMalloc1(nCols,&y);CHKERRQ(ierr);
>>>
>>>   // Perform an SVD on the Hessenberg matrix - note: this call destroys the input Hessenberg
>>>   ierr = PetscFPTrapPush(PETSC_FP_TRAP_OFF);CHKERRQ(ierr);
>>>   PetscStackCallBLAS("LAPACKgesvd",LAPACKgesvd_("A","A",&nRows_blas,&nCols_blas,R,&nRows_blas,S,UT,&nRows_blas,V,&nCols_blas,work,&lwork,&lierr));
>>>   if (lierr) SETERRQ1(PETSC_COMM_SELF,PETSC_ERR_LIB,"Error in SVD Lapack routine %d",(int)lierr);
>>>   ierr = PetscFPTrapPop();CHKERRQ(ierr);
>>>
>>>   // Find the number of non-zero singular values
>>>   for (nnz=0; nnz<nCols; nnz++) {
>>>     if (fabs(S[nnz]) < 1.0e-8) break;
>>>   }
>>>   printf("number of nonzero singular values: %d\n",nnz);
>>>
>>>   trans(nRows,nRows,UT,U);
>>>   trans(nCols,nCols,V,VT);
>>>
>>>   // Compute p = ||r_0|| U^T e_1
>>>   beta = gmres->res_beta;
>>>   for (ii=0; ii<nCols; ii++) {
>>>     p[ii] = beta*UT[ii*nRows];
>>>   }
>>>   p[nCols] = 0.0;
>>>
>>>   // Original GMRES solution (\mu = 0)
>>>   for (ii=0; ii<nnz; ii++) {
>>>     q[ii] = p[ii]/S[ii];
>>>   }
>>>
>>>   // Expand y in terms of the right singular vectors as y = V q
>>>   for (jj=0; jj<nnz; jj++) {
>>>     y[jj] = 0.0;
>>>     for (ii=0; ii<nCols; ii++) {
>>>       y[jj] += V[jj*nCols+ii]*q[ii]; // transpose of the transpose
>>>     }
>>>   }
>>>
>>>   // Pass the orthonormalized Krylov vector weights back out
>>>   for (ii=0; ii<nnz; ii++) {
>>>     nrs[ii] = y[ii];
>>>   }
>>>
>>> I just wanted to check that this is the correct way to extract the
>>> Hessenberg from the KSP_GMRES structure, and to pass it to LAPACK, and if
>>> so, should I really be expecting only two non-zero singular values in
>>> return for this problem?
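For comparison, a minimal sketch of a column-major copy (the layout
LAPACKgesvd_ expects), assuming, as in PETSc's gmresimpl.h, that *HES(row,col)
addresses element (row,col) of the stored Hessenberg:

  /* hypothetical sketch: column-major copy with leading dimension nRows */
  for (ii = 0; ii < nCols; ii++) {      /* column index */
    for (jj = 0; jj < nRows; jj++) {    /* row index    */
      R[ii*nRows + jj] = *HES(jj,ii);
    }
  }

The loop in the code above fills R row-major, so LAPACK, which reads the
buffer column-major, does not see the intended matrix. Note also that with
jobu = jobvt = "A", the LAPACK *gesvd routines return U and V^T (not U^T and V).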
>>>
>>> Cheers, Dave.
>>>
>>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/