[petsc-users] SLEPc: Convergence Problems

Christopher Pierce cmpierce at WPI.EDU
Tue Oct 18 13:52:46 CDT 2016


Actually I don't.  Sorry, I'm fairly new to using SLEPc.  That would
explain why convergence is extremely slow when I use Krylov methods
(~15,000 iterations for the first eigenpair), but much faster when I use
methods such as lobpcg and rqcg, which I've heard apply preconditioners
automatically.
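
For reference, here is the kind of setup I gather from the SLEPc manual for
a preconditioned solve: a shift-and-invert spectral transformation with an
iterative linear solver underneath.  This is only an untested sketch; the
target 0.0, the GMRES/GAMG choices, and the matrices A and B are
placeholders for my actual problem.

  EPS eps;
  ST  st;
  KSP ksp;
  PC  pc;

  EPSCreate(PETSC_COMM_WORLD, &eps);
  EPSSetOperators(eps, A, B);              /* generalized problem A x = lambda B x */
  EPSSetProblemType(eps, EPS_GHEP);
  EPSSetWhichEigenpairs(eps, EPS_TARGET_REAL);
  EPSSetTarget(eps, 0.0);                  /* look for eigenvalues near the target */

  EPSGetST(eps, &st);
  STSetType(st, STSINVERT);                /* shift-and-invert around the target */
  STGetKSP(st, &ksp);
  KSPSetType(ksp, KSPGMRES);               /* iterative inner solve ... */
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCGAMG);                   /* ... with an algebraic multigrid preconditioner */

  EPSSetFromOptions(eps);
  EPSSolve(eps);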

I'm using the MPIAIJ format to store my matrix, which is mostly block
diagonal but has a significant number of non-zero entries outside of
those regions.  I'm not running a multi-physics simulation, so I don't
really have blocks in that sense.  I'm trying to solve the Schrodinger
equation in 2D/3D using the Finite Element Method.
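
In case it matters, the matrices are created more or less like this
(simplified sketch; nlocal and the d_nnz/o_nnz preallocation arrays stand
in for my actual mesh bookkeeping):

  Mat A, B;

  /* per-process row count; d_nnz/o_nnz give per-row nonzero counts for the
     local diagonal block and the off-process part, respectively */
  MatCreateAIJ(PETSC_COMM_WORLD, nlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE,
               0, d_nnz, 0, o_nnz, &A);
  MatCreateAIJ(PETSC_COMM_WORLD, nlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE,
               0, d_nnz, 0, o_nnz, &B);

  /* ... assemble FEM element contributions with MatSetValues ... */

  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyBegin(B, MAT_FINAL_ASSEMBLY);  MatAssemblyEnd(B, MAT_FINAL_ASSEMBLY);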

Thanks,

Chris




On 10/17/16 12:39, Julian Andrej wrote:
> Do you precondition your eigenvalue problem? If not, you should. Let
> us know what structure your matrix has and which blocks (if there are
> any) include which physics.
>
> Regards
> Julian
>
> On Mon, Oct 17, 2016 at 5:30 PM, Christopher Pierce <cmpierce at wpi.edu> wrote:
>> I've implemented my application using MatGetSubMatrix, and the solvers
>> appear to be converging correctly now, just slowly.  I assume that this
>> is due to the clustering of eigenvalues inherent to the problem I'm
>> solving.  I think that this should be enough to get me on track to
>> solving problems with it.
>>
>> Thanks,
>>
>> Chris
>>
>>
>> On 10/14/16 01:43, Christopher Pierce wrote:
>>> Thank You,
>>>
>>> That looks like what I need to do if the highly degenerate eigenpairs
>>> are my problem.  I'll try that out this week and see if that helps.
>>>
>>> Chris
>>>
>>>
>>>
>>>
>>> On 10/13/16 20:01, Barry Smith wrote:
>>>>   I would use MatGetSubMatrix() to pull out the part of the matrix you care about and hand that matrix off to SLEPc.
>>>>
>>>>   Others prefer to remove the Dirichlet boundary value locations while doing the finite element assembly; that way those locations never appear in the matrix.
>>>>
>>>>    The end result is the same: you have the slightly smaller matrix of interest from which to compute the eigenvalues.
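>>>>
>>>>    Roughly, with an index set holding the global rows/columns you want to keep (untested sketch; nkeep and keep_idx stand in for whatever index bookkeeping you already have):
>>>>
>>>>       IS  is;
>>>>       Mat Asub, Bsub;
>>>>
>>>>       /* keep_idx[] lists the global rows/columns that are NOT Dirichlet-constrained */
>>>>       ISCreateGeneral(PETSC_COMM_WORLD, nkeep, keep_idx, PETSC_COPY_VALUES, &is);
>>>>       MatGetSubMatrix(A, is, is, MAT_INITIAL_MATRIX, &Asub);
>>>>       MatGetSubMatrix(B, is, is, MAT_INITIAL_MATRIX, &Bsub);
>>>>       EPSSetOperators(eps, Asub, Bsub);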
>>>>
>>>>
>>>> Barry
>>>>
>>>>> On Oct 13, 2016, at 5:48 PM, Christopher Pierce <cmpierce at WPI.EDU> wrote:
>>>>>
>>>>> Hello All,
>>>>>
>>>>> As there isn't a SLEPc-specific list, it was recommended that I bring my
>>>>> question here.  I am using SLEPc to solve a generalized eigenvalue
>>>>> problem generated as part of the Finite Element Method, but am having
>>>>> difficulty getting the eigensolver to converge.  I am worried that the
>>>>> method used to set boundary conditions in the matrix is creating the
>>>>> problem and am looking for other people's input.
>>>>>
>>>>> In order to set the boundary conditions, I find the list of IDs that
>>>>> should be zero in the resulting eigenvectors and then use
>>>>> MatZeroRowsColumns to zero those rows and columns.  In the matrix A I
>>>>> insert a large value such as 1E10 on each diagonal element that was
>>>>> zeroed, and likewise for the B matrix except with the value 1.0.  That
>>>>> way the eigenvalues resulting from those constrained entries are on the
>>>>> order of 1E10 and lie outside of the region of interest for my problem.
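>>>>>
>>>>> Concretely, the calls look roughly like this (simplified sketch; nbc and
>>>>> bc_rows stand in for my bookkeeping of the constrained global indices):
>>>>>
>>>>>     /* Zero the constrained rows and columns.  A gets a large diagonal
>>>>>        value and B gets 1.0 so the spurious eigenvalues land near 1E10,
>>>>>        far outside the part of the spectrum I care about. */
>>>>>     MatZeroRowsColumns(A, nbc, bc_rows, 1.0e10, NULL, NULL);
>>>>>     MatZeroRowsColumns(B, nbc, bc_rows, 1.0,    NULL, NULL);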
>>>>>
>>>>> When I tried to diagonalize the matrices, I could only get converged
>>>>> solutions from the rqcg method, which I have found does not scale well
>>>>> with my problem.  When using any other method, the approximate error of
>>>>> the eigenpairs hovers between 1E00 and 1E01 until it reaches the maximum
>>>>> number of iterations.  Could having so many identical eigenvalues
>>>>> (~1,000) in the spectrum be causing this to happen, even though they are
>>>>> far outside of the range of interest?
>>>>>
>>>>> Thanks,
>>>>>
>>>>> Chris Pierce
>>>>> WPI Center for Computation Nano-Science
>>>>>
>>>>>
>>

