[petsc-users] Newbie question: Something is wrong with this Slepc simple example for generalised eigenvalue problem

Jose E. Roman jroman at dsic.upv.es
Sun Mar 28 11:42:23 CDT 2021


You should run in debug mode until your code is correct, otherwise you may not see some error messages. It is also recommended to add error checking after every call to PETSc; otherwise execution continues after a failure and may get blocked. See the CHKERRA macro in https://slepc.upv.es/documentation/current/src/eps/tutorials/ex1f90.F90.html
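For instance (a minimal sketch, following the style of that tutorial), every call would be followed by the check:

      call EPSCreate(PETSC_COMM_WORLD,eps,ierr);CHKERRA(ierr)
      call EPSSetOperators(eps,A,B,ierr);CHKERRA(ierr)
      call EPSSolve(eps,ierr);CHKERRA(ierr)

This way, if a call fails on some process the program aborts with an error message instead of continuing and possibly hanging.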

The problem you are probably having is that you are running with several MPI processes, so you need a parallel LU solver. See FAQ #10 https://slepc.upv.es/documentation/faq.htm
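For example, assuming your PETSc was configured with MUMPS (e.g. with --download-mumps --download-scalapack), you could select it at runtime with something like

      $ mpiexec -n 2 ./yourprogram -st_pc_factor_mat_solver_type mumps

(./yourprogram standing for your executable; in older PETSc versions the option was spelled -st_pc_factor_mat_solver_package). Alternatively, you can run on a single process while you are testing.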

Jose


> On 28 Mar 2021, at 11:25, dazza simplythebest <sayosale at hotmail.com> wrote:
> 
> Dear All,
>             I am seeking to use slepc/petsc to solve a generalised eigenvalue problem
> that arises in a  hydrodynamic stability problem. The code is a parallelisation of an existing 
> serial Fortran code. Before I get to grips with this target problem, I would of course like to get
> some relevant examples working. I have installed petsc/slepc seemingly without any problem,
> and the provided slepc example fortran program ex1f.F, which solves a regular eigenvalue
> problem Ax = lambda x, seemed to compile and run correctly.
> I have now written a short program to instead solve the complex generalised
> problem Ax = lambda B x (see below). This code compiles and runs without errors but
> for some reason hangs when calling EPSSolve - we enter EPSSolve but never leave.
> The matrices appear to be correctly assembled - all the values are correct in the MatView
> printout - so I am not quite sure where I have gone wrong. Can anyone spot my mistake?
> (Note that for the actual problem I wish to solve I have already written the code to construct the matrix,
> which distributes the rows across the processes, and it is fully tested and working. Hence I want to specify
> the distribution of rows myself and not leave it up to PETSC_DECIDE.)
> I would be very grateful if someone can point out what is wrong with this small example code (see below),
>    Many thanks,
>                              Dan.
> 
> !  this program MUST be run with NGLOBAL = 10, MY_NO_ROWS = 5
> !  and two MPI processes, since the row distribution is hard-coded
> !
>       program main
> #include <slepc/finclude/slepceps.h>
>       use slepceps
>       implicit none
> 
> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
> !     Declarations
> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
> !
> !  Variables:
> !     A , B    double complex operator matrices
> !     eps   eigenproblem solver context
> 
>       Mat            A,B
>       EPS            eps
>       EPSType        tname
>       PetscReal      tol, error
>       PetscScalar    kr, ki
>       Vec            xr, xi
>       PetscInt       NGLOBAL ,  MY_NO_ROWS, NL3, owner
>       PetscInt       nev, maxit, its, nconv
>       PetscInt       i,j,ic,jc
>       PetscReal      reala, imaga, realb, imagb, di, dj
>       PetscScalar    a_entry, b_entry
>       PetscMPIInt    rank
>       PetscErrorCode ierr
>       PetscInt,parameter :: zero = 0, one = 1, two = 2, three = 3  
>       
>       PetscInt   M1, N1, mm1, nn1
> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
> !     Beginning of program
> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
> 
>       call SlepcInitialize(PETSC_NULL_CHARACTER,ierr)
>       call MPI_Comm_rank(PETSC_COMM_WORLD,rank,ierr)
> !     make sure you set NGLOBAL = 10, MY_NO_ROWS = 5 and run with two processes
>       NGLOBAL = 10
>       MY_NO_ROWS = 5
> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
> !     Compute the operator matrices that define the eigensystem, Ax=kBx
> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
> !!!!!!!!!  Setup  A matrix
> 
>       call MatCreate(PETSC_COMM_WORLD,A,ierr) 
>       call MatSetSizes(A,MY_NO_ROWS,MY_NO_ROWS,NGLOBAL,NGLOBAL,ierr)
>       call MatSetFromOptions(A,ierr)
>       call MatGetSize(A,M1,N1,ierr)
>       write(*,*)'Rank [',rank,']: global size of A is ',M1, N1
>       call MatGetLocalSize(A,mm1,nn1,ierr)
>       write(*,*)'Rank [',rank,']: my local size of A is ',mm1, nn1
>       call MatMPIAIJSetPreallocation(A,three, PETSC_NULL_INTEGER,one,    &
>      &      PETSC_NULL_INTEGER,ierr)  !parallel (MPI) allocation
> 
> !!!!!!!!!  Setup  B matrix
>       call MatCreate(PETSC_COMM_WORLD,B,ierr) 
>       call MatSetSizes(B,MY_NO_ROWS,MY_NO_ROWS,NGLOBAL,NGLOBAL,ierr)
>       call MatSetFromOptions(B,ierr)
>       call MatGetSize(B,M1,N1,ierr)
>       write(*,*)'Rank [',rank,']: global size of B is ',M1, N1
>       call MatGetLocalSize(B,mm1,nn1,ierr)
>       write(*,*)'Rank [',rank,']: my local size of B is ',mm1, nn1
>   
>       call MatMPIAIJSetPreallocation(B,three, PETSC_NULL_INTEGER,one,    &
>      &      PETSC_NULL_INTEGER,ierr)  !parallel (MPI) allocation
>  
> ! initialise
>       call MatZeroEntries(A,ierr)
>       call MatZeroEntries(B,ierr)
>  
> !  Fill in values of A, B and assemble matrices
> !  Both matrices are tridiagonal with
> !  Aij = cmplx( (i-j)**2, (i+j)**2)
> !  Bij = cmplx( i*j/(i+j), (i/j)**2 )
> !  (numbering from 1 )
>  
>       do i = 1, NGLOBAL
>         ! a rather crude way to distribute rows      
>         if (i < 6) owner = 0
>         if (i >= 6) owner = 1
>         if (rank /= owner) cycle
>         do j = 1, NGLOBAL
>             if ( abs(i-j) < 2 ) then
>                  write(*,*)rank,' : Setting ',i,j
>                  di = dble(i)         ; dj = dble(j)
>                
>                  reala = (di - dj)**2 ; imaga = (di + dj)**2
>                  a_entry = dcmplx(reala, imaga)
>                  realb = (di*dj)/(di + dj) ; imagb = di**2/dj**2
>                  b_entry = dcmplx(realb, imagb)              
>                
>                  ic = i -1 ; jc = j-1  ! convert to C indexing
>                  call MatSetValue(A, ic, jc, a_entry, ADD_VALUES,ierr)
>                  call MatSetValue(B, ic, jc, b_entry, ADD_VALUES,ierr)
>             endif
>         enddo
>       enddo
> 
>       call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr)  
>       call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr)
>       call MatAssemblyBegin(B,MAT_FINAL_ASSEMBLY,ierr) 
>       call MatAssemblyEnd(B,MAT_FINAL_ASSEMBLY,ierr)
>  
> !     Check matrices 
>       write(*,*)'A matrix ... '
>       call MatView(A,PETSC_VIEWER_STDOUT_WORLD,ierr)
>       write(*,*)'B matrix ... '
>       call MatView(B,PETSC_VIEWER_STDOUT_WORLD,ierr)
>  
>       call MatCreateVecs(A,xr,PETSC_NULL_VEC,ierr)
>       call MatCreateVecs(A,xi,PETSC_NULL_VEC,ierr)
> 
> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
> !     Create the eigensolver and display info
> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
> !     ** Create eigensolver context
>       call EPSCreate(PETSC_COMM_WORLD,eps,ierr)
> 
> !     ** Set operators for the generalised problem Ax = lambda B x
>       call EPSSetOperators(eps,A, B, ierr)   
>       call EPSSetProblemType(eps,EPS_GNHEP,ierr)
> 
> !     ** Set solver parameters at runtime
>       call EPSSetFromOptions(eps,ierr)
>  
> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
> !     Solve the eigensystem
> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
> 
>       write(*,*)'rank',rank, 'entering solver ...'
>       call EPSSolve(eps,ierr)
>   
> !     ** Free work space
>       call EPSDestroy(eps,ierr)
>       call MatDestroy(A,ierr)
>       call MatDestroy(B,ierr)
>       
>       call VecDestroy(xr,ierr)
>       call VecDestroy(xi,ierr)
> 
>       call SlepcFinalize(ierr)
>       end


