[petsc-users] Monte Carlo for eigenvalue problem in parallel

Hong hzhang at mcs.anl.gov
Wed Jan 21 10:41:54 CST 2015


Luc,
SLEPc is for eigenvalue problems with sparse matrices; it interfaces to
LAPACK for sequential dense problems.
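
If the global matrices can be assembled as sparse parallel Mats, the SLEPc
route Luc mentions would look roughly like the sketch below. This is only a
sketch: the names K and M, the wrapper function, and the options in the
comments are placeholders, and error checking (ierr / CHKERRQ) is omitted
for brevity.

#include <slepceps.h>

/* Solve K x = lambda M x for a sparse symmetric-definite pencil
   (the sparse analogue of what LAPACKsygvx does for dense matrices). */
PetscErrorCode SolveSparseGEVP(Mat K,Mat M)
{
  EPS         eps;
  PetscInt    i,nconv;
  PetscScalar kr,ki;

  EPSCreate(PETSC_COMM_WORLD,&eps);
  EPSSetOperators(eps,K,M);
  EPSSetProblemType(eps,EPS_GHEP);   /* generalized Hermitian-definite */
  EPSSetFromOptions(eps);            /* e.g. -eps_nev 20 -eps_smallest_real */
  EPSSolve(eps);
  EPSGetConverged(eps,&nconv);
  for (i = 0; i < nconv; i++) {
    EPSGetEigenvalue(eps,i,&kr,&ki);
    PetscPrintf(PETSC_COMM_WORLD,"lambda[%D] = %g\n",i,(double)PetscRealPart(kr));
  }
  EPSDestroy(&eps);
  return 0;
}

Keep in mind that SLEPc is geared towards computing a subset of the spectrum;
for all eigenvalues of a dense problem, Elemental is the more natural fit, as
discussed below.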

I'm adding MatElementalHermitianGenDefiniteEig() to the PETSc/Elemental
interface for a PETSc user (see
https://bitbucket.org/petsc/petsc/branch/hzhang/elemental-matconvert).

You may take a look at Elemental and check which eigenvalue routine would
work for your problem. We may add it to the interface if it does not take
much effort.
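
Regarding the "different communicators" idea in the question below: since
LAPACKsygvx operates on sequential dense matrices anyway, a common way to
parallelize a Monte Carlo study is to keep every eigensolve sequential and
distribute the samples over the MPI ranks, each rank working on
PETSC_COMM_SELF. A rough sketch follows; SolveDenseGEVP() is a hypothetical
stand-in for your existing LAPACKsygvx-based routine, the problem sizes are
made up, and error checking is omitted.

#include <petscmat.h>

/* hypothetical wrapper around the existing sequential LAPACKsygvx call */
extern PetscErrorCode SolveDenseGEVP(Mat K,Mat M);

int main(int argc,char **argv)
{
  PetscMPIInt rank,size;
  PetscInt    s,nsamples = 1000,n = 200;   /* made-up sizes */
  Mat         K,M;

  PetscInitialize(&argc,&argv,NULL,NULL);
  MPI_Comm_rank(PETSC_COMM_WORLD,&rank);
  MPI_Comm_size(PETSC_COMM_WORLD,&size);

  /* round-robin split of the Monte Carlo samples over the ranks */
  for (s = rank; s < nsamples; s += size) {
    /* each rank builds its own sequential dense matrices */
    MatCreateSeqDense(PETSC_COMM_SELF,n,n,NULL,&K);
    MatCreateSeqDense(PETSC_COMM_SELF,n,n,NULL,&M);
    /* ... fill K and M for sample s (random parameters, etc.) ... */
    MatAssemblyBegin(K,MAT_FINAL_ASSEMBLY); MatAssemblyEnd(K,MAT_FINAL_ASSEMBLY);
    MatAssemblyBegin(M,MAT_FINAL_ASSEMBLY); MatAssemblyEnd(M,MAT_FINAL_ASSEMBLY);

    SolveDenseGEVP(K,M);     /* unchanged sequential eigensolve */

    MatDestroy(&K);
    MatDestroy(&M);
  }
  PetscFinalize();
  return 0;
}

This only works as long as a single dense problem fits on one process; if it
does not, that is where Elemental (or the interface above) comes in.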

Hong

On Wed, Jan 21, 2015 at 9:55 AM, Luc Berger-Vergiat <lb2653 at columbia.edu>
wrote:

>  You can also look into SLEPc to compute eigenvalues in parallel for the
> global system.
> SLEPc is based on PETSc so it should not be too hard to use in your code.
>
> Best,
> Luc
>
> On 01/21/2015 10:53 AM, Matthew Knepley wrote:
>
>  On Wed, Jan 21, 2015 at 9:51 AM, siddhesh godbole <
> siddhesh4godbole at gmail.com> wrote:
>
>>  Hello,
>>
>>  I want to run Monte Carlo simulations of an eigenvalue problem in
>> parallel with PETSc. I am using the LAPACKsygvx function to compute all
>> the eigenvalues. The problem I am facing is that, as far as I can tell,
>> LAPACKsygvx requires the matrices to be declared as MATSEQDENSE, and this
>> sequential type does not let me run the program on multiple processors.
>>
>>  How should I go about this problem? Can I use different communicators
>> or something? Please help!
>>
>
>  Do you need all the eigenvalues? If so, you can try to use Elemental with
> the development version of PETSc.
>
>    Thanks,
>
>      Matt
>
>
>>
>>   *Siddhesh M Godbole*
>>
>>  5th year Dual Degree,
>> Civil Eng & Applied Mech.
>> IIT Madras
>>
>
>
>
>  --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
>
>