[petsc-users] Matrix free generalized eigenvalue problem
Миша Сальников
rolekst095 at gmail.com
Sun May 15 12:15:31 CDT 2022
OK, thank you very much for your help and for the links to the articles.
Sun, 15 May 2022 at 20:02, Jose E. Roman <jroman at dsic.upv.es>:
> Preconditioned eigensolvers avoid an explicit inversion. You can try
> Davidson methods https://doi.org/10.1145/2543696 or LOBPCG if the problem
> is symmetric-definite. However, for reasonable convergence these methods
> usually require a good preconditioner, so if your matrix is implicit you
> still need to build some explicit approximation of it from which the
> preconditioner can be constructed.
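>
> For reference, a minimal sketch of selecting such a solver might look like
> the following (eps is assumed to be an already created EPS object holding
> the shell operators; PCNONE is just a placeholder until an explicit
> approximation is available for the preconditioner):
>
>   ST  st;
>   KSP ksp;
>   PC  pc;
>   EPSSetType(eps,EPSGD);     /* Generalized Davidson; EPSLOBPCG for symmetric-definite */
>   EPSGetST(eps,&st);
>   STSetType(st,STPRECOND);   /* preconditioned ST, no factorization is attempted */
>   STGetKSP(st,&ksp);
>   KSPGetPC(ksp,&pc);
>   PCSetType(pc,PCNONE);      /* replace with a PC built from the explicit approximation */
>
> The same configuration can be requested at run time with
> -eps_type gd -st_type precond -st_pc_type none.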
>
> Jose
>
>
> > On 15 May 2022, at 18:54, Миша Сальников <rolekst095 at gmail.com>
> > wrote:
> >
> > Thanks a lot for your answer. I found a solution in the documentation.
> > So, if I understand correctly, during the eigenvalue iterations for
> > Ax = k Bx SLEPc needs the inverse of B, i.e. it has to solve systems of
> > the form By = x. My question is: do you know whether there are methods
> > for generalized eigenproblems that avoid inverting B? Again, thanks a
> > lot for the answer.
> >
> > Sun, 15 May 2022 at 17:52, Jose E. Roman <jroman at dsic.upv.es>:
> > You cannot compute the LU factorization of a shell matrix. You will have
> to use a preconditioned iterative linear solver, see section 3.4.1 of the
> SLEPc users guide.
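> >
> > A minimal sketch of that, assuming the EPS object is called eps (the
> > equivalent runtime options would be -st_ksp_type gmres -st_pc_type none):
> >
> >   ST  st;
> >   KSP ksp;
> >   PC  pc;
> >   EPSGetST(eps,&st);
> >   STGetKSP(st,&ksp);          /* inner linear solver used by the spectral transformation */
> >   KSPSetType(ksp,KSPGMRES);   /* iterative solver instead of the default direct solve */
> >   KSPGetPC(ksp,&pc);
> >   PCSetType(pc,PCNONE);       /* or a preconditioner built from an explicit approximation */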
> >
> > Jose
> >
> >
> > > On 15 May 2022, at 14:39, Миша Сальников <rolekst095 at gmail.com>
> > > wrote:
> > >
> > > Hi, I am developing a program to solve the generalized eigenvalue
> > > problem Ax = k Bx in which both operators A and B are matrix-free. In
> > > the SLEPc documentation I found an example where the standard problem
> > > Ax = k x is solved matrix-free, so I changed this example slightly and
> > > added the operator B as a matrix-free identity operator. However, I am
> > > now getting an error. My source code is attached and the error message
> > > is below.
> > > Can you please explain to me what I am doing wrong?
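> > >
> > > A setup of this kind might look roughly like the sketch below (this is
> > > not the full attached source; nlocal and N are hypothetical local and
> > > global problem sizes, and the body of the A multiplication is elided):
> > >
> > >   #include <slepceps.h>
> > >
> > >   /* user-defined y = A*x and y = B*x (B acts as the identity here) */
> > >   PetscErrorCode MatMult_A(Mat A,Vec x,Vec y) { /* apply A to x */ return 0; }
> > >   PetscErrorCode MatMult_B(Mat B,Vec x,Vec y) { return VecCopy(x,y); }
> > >
> > >   Mat A,B; EPS eps;
> > >   PetscInt nlocal = PETSC_DECIDE, N = 100;   /* hypothetical sizes */
> > >   MatCreateShell(PETSC_COMM_WORLD,nlocal,nlocal,N,N,NULL,&A);
> > >   MatShellSetOperation(A,MATOP_MULT,(void(*)(void))MatMult_A);
> > >   MatCreateShell(PETSC_COMM_WORLD,nlocal,nlocal,N,N,NULL,&B);
> > >   MatShellSetOperation(B,MATOP_MULT,(void(*)(void))MatMult_B);
> > >   EPSCreate(PETSC_COMM_WORLD,&eps);
> > >   EPSSetOperators(eps,A,B);            /* generalized problem Ax = k Bx */
> > >   EPSSetProblemType(eps,EPS_GNHEP);
> > >   EPSSetFromOptions(eps);
> > >   EPSSolve(eps);
> > >
> > > Even with the operators registered this way, the default spectral
> > > transformation still needs to solve linear systems with B, which is what
> > > triggers the LU error below for a shell matrix.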
> > >
> > > [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > > [0]PETSC ERROR: See
> https://petsc.org/release/overview/linear_solve_table/ for possible LU
> and Cholesky solvers
> > > [0]PETSC ERROR: Could not locate a solver type for factorization type
> LU and matrix type shell.
> > > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble
> shooting.
> > > [0]PETSC ERROR: Petsc Release Version 3.15.5, Sep 29, 2021
> > > [0]PETSC ERROR: ./exe_eps_impl_matr on a named LAPTOP-7DN2DH7N by
> misha Sun May 15 15:21:59 2022
> > > [0]PETSC ERROR: Configure options --build=x86_64-linux-gnu
> --prefix=/usr --includedir=${prefix}/include --mandir=${prefix}/share/man
> --infodir=${prefix}/share/info --sysconfdir=/etc --localstatedir=/var
> --with-option-checking=0 --with-silent-rules=0
> --libdir=${prefix}/lib/x86_64-linux-gnu --runstatedir=/run
> --with-maintainer-mode=0 --with-dependency-tracking=0 --with-64-bit-indices
> --with-debugging=0 --shared-library-extension=64_real
> --with-shared-libraries --with-pic=1 --with-cc=mpicc --with-cxx=mpicxx
> --with-fc=mpif90 --with-cxx-dialect=C++11 --with-opencl=1
> --with-blas-lib=-lblas --with-lapack-lib=-llapack --with-scalapack=1
> --with-scalapack-lib=-lscalapack-openmpi --with-ptscotch=1
> --with-ptscotch-include=/usr/include/scotch
> --with-ptscotch-lib="-lptesmumps -lptscotch -lptscotcherr" --with-fftw=1
> --with-fftw-include="[]" --with-fftw-lib="-lfftw3 -lfftw3_mpi"
> --with-yaml=1 --with-hdf5-include=/usr/include/hdf5/openmpi
> --with-hdf5-lib="-L/usr/lib/x86_64-linux-gnu/hdf5/openmpi
> -L/usr/lib/x86_64-linux-gnu/openmpi/lib -lhdf5 -lmpi"
> --CXX_LINKER_FLAGS=-Wl,--no-as-needed --with-hypre=1
> --with-hypre-include=/usr/include/hypre64m --with-hypre-lib=-lHYPRE64m
> --with-mumps=1 --with-mumps-include="[]" --with-mumps-lib="-ldmumps_64
> -lzmumps_64 -lsmumps_64 -lcmumps_64 -lmumps_common_64 -lpord_64"
> --with-suitesparse=1 --with-suitesparse-include=/usr/include/suitesparse
> --with-suitesparse-lib="-lumfpack -lamd -lcholmod -lklu" --with-zoltan=1
> --with-zoltan-include=/usr/include/trilinos
> --with-zoltan-lib=-ltrilinos_zoltan
> --prefix=/usr/lib/petscdir/petsc64-3.15/x86_64-linux-gnu-real
> --PETSC_ARCH=x86_64-linux-gnu-real-64 CFLAGS="-g -O2
> -ffile-prefix-map=/build/petsc-tFq6Xk/petsc-3.15.5+dfsg1=. -flto=auto
> -ffat-lto-objects -flto=auto -ffat-lto-objects -fstack-protector-strong
> -Wformat -Werror=format-security -fPIC" CXXFLAGS="-g -O2
> -ffile-prefix-map=/build/petsc-tFq6Xk/petsc-3.15.5+dfsg1=. -flto=auto
> -ffat-lto-objects -flto=auto -ffat-lto-objects -fstack-protector-strong
> -Wformat -Werror=format-security -fPIC" FCFLAGS="-g -O2
> -ffile-prefix-map=/build/petsc-tFq6Xk/petsc-3.15.5+dfsg1=. -flto=auto
> -ffat-lto-objects -flto=auto -ffat-lto-objects -fstack-protector-strong
> -fPIC -ffree-line-length-0" FFLAGS="-g -O2
> -ffile-prefix-map=/build/petsc-tFq6Xk/petsc-3.15.5+dfsg1=. -flto=auto
> -ffat-lto-objects -flto=auto -ffat-lto-objects -fstack-protector-strong
> -fPIC -ffree-line-length-0" CPPFLAGS="-Wdate-time -D_FORTIFY_SOURCE=2"
> LDFLAGS="-Wl,-Bsymbolic-functions -flto=auto -ffat-lto-objects -flto=auto
> -Wl,-z,relro -fPIC" MAKEFLAGS=w
> > > [0]PETSC ERROR: #1 MatGetFactor() at ./src/mat/interface/matrix.c:4756
> > > [0]PETSC ERROR: #2 PCSetUp_LU() at ./src/ksp/pc/impls/factor/lu/lu.c:82
> > > [0]PETSC ERROR: #3 PCSetUp() at ./src/ksp/pc/interface/precon.c:1017
> > > [0]PETSC ERROR: #4 KSPSetUp() at ./src/ksp/ksp/interface/itfunc.c:406
> > > [0]PETSC ERROR: #5 STSetUp_Shift() at
> ./src/sys/classes/st/impls/shift/shift.c:107
> > > [0]PETSC ERROR: #6 STSetUp() at
> ./src/sys/classes/st/interface/stsolve.c:582
> > > [0]PETSC ERROR: #7 EPSSetUp() at ./src/eps/interface/epssetup.c:350
> > > [0]PETSC ERROR: #8 EPSSolve() at ./src/eps/interface/epssolve.c:136
> > > [0]PETSC ERROR: #9 main() at ex_implicit_matr.cpp:95
> > > [0]PETSC ERROR: No PETSc Option Table entries
> > > [0]PETSC ERROR: ----------------End of Error Message -------send
> entire error message to petsc-maint at mcs.anl.gov----------
> > >
> --------------------------------------------------------------------------
> > > MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> > > with errorcode 92.
> > >
> > > NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> > > You may or may not see output from other processes, depending on
> > > exactly when Open MPI kills them.
> > > <ex_implicit_matr.cpp>
> >
>
>