[petsc-users] SLEPc generalized eigenvalue problem question?

John Chludzinski jchludzinski at gmail.com
Mon Aug 1 13:36:40 CDT 2011


These are definitely not dense matrices (>99% zeros).

The way I described storing them just happened to be a way I found that
worked (for EPS = LAPACK).  But I want to use methods intended for sparse
matrices + MPI.

Which is the best canonical PETSc form (for the matrices) and best EPS?

---John
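
For reference, a minimal, untested sketch of what that could look like: it
mirrors what ex7 does internally, reads the same PETSc binary files (k.dat
and m.dat from the run quoted below) into sparse AIJ matrices, sets up the
generalized Hermitian problem, and leaves the choice of solver to -eps_type.
On one process the matrices come up as SeqAIJ; under mpiexec they become
MPIAIJ automatically, so the same files work for the sparse + MPI case.

static char help[] = "Loads K and M from PETSc binary files and solves K x = lambda M x.\n";

#include <slepceps.h>

int main(int argc, char **argv)
{
  Mat            A, B;      /* stiffness-like and mass-like matrices */
  EPS            eps;       /* SLEPc eigensolver context */
  PetscViewer    viewer;
  PetscErrorCode ierr;

  ierr = SlepcInitialize(&argc, &argv, NULL, help);if (ierr) return ierr;

  /* Load the first matrix; MATAIJ is SeqAIJ on one process, MPIAIJ on several */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "k.dat", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetType(A, MATAIJ);CHKERRQ(ierr);
  ierr = MatLoad(A, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  /* Load the second matrix the same way */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "m.dat", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
  ierr = MatCreate(PETSC_COMM_WORLD, &B);CHKERRQ(ierr);
  ierr = MatSetType(B, MATAIJ);CHKERRQ(ierr);
  ierr = MatLoad(B, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  /* Generalized Hermitian eigenproblem K x = lambda M x; solver picked via -eps_type */
  ierr = EPSCreate(PETSC_COMM_WORLD, &eps);CHKERRQ(ierr);
  ierr = EPSSetOperators(eps, A, B);CHKERRQ(ierr);
  ierr = EPSSetProblemType(eps, EPS_GHEP);CHKERRQ(ierr);
  ierr = EPSSetFromOptions(eps);CHKERRQ(ierr);
  ierr = EPSSolve(eps);CHKERRQ(ierr);

  ierr = EPSDestroy(&eps);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = MatDestroy(&B);CHKERRQ(ierr);
  ierr = SlepcFinalize();
  return ierr;
}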


On Mon, Aug 1, 2011 at 2:27 PM, Matthew Knepley <knepley at gmail.com> wrote:

> On Mon, Aug 1, 2011 at 6:22 PM, John Chludzinski <jchludzinski at gmail.com> wrote:
>
>> I'm a newbie with both PETSc & SLEPc and have had some trouble finding
>> examples/tutorials for newbies.  I've looked through the examples in the
>> PETSc and SLEPc directories but still am having "issues" seeing how to set
>> this up for the type of problem I have.
>>
>> SLEPc ex7.c is a good place to start, but the question remains how best to
>> store the matrices and which EPS to use (besides LAPACK).
>>
>> I found a PDF, "MATRICES IN PETSc" (after much googling), but I'm not sure
>> which of the many forms will work and which is best.
>>
>
> 1) PETSc and SLEPc are designed to be efficient for sparse matrices. If you
> want eigenvalues of dense matrices, use Elemental (as I pointed out in a
> previous message).
>
> 2) If you generate matrices with C code, why not just call MatSetValues()
> for each row in that code?
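>
> For illustration, a rough (untested) sketch of that per-row assembly into a
> sparse SeqAIJ matrix, written out in PETSc binary form so it can be loaded
> later with MatLoad(); here get_row() and MAX_ROW_NNZ are hypothetical
> stand-ins for whatever the generating C code already knows about each row:
>
> Mat            A;
> PetscViewer    viewer;
> PetscErrorCode ierr;
> PetscInt       i, ncols, cols[MAX_ROW_NNZ];  /* MAX_ROW_NNZ: assumed per-row bound */
> PetscScalar    vals[MAX_ROW_NNZ];
>
> /* Preallocate roughly 5 nonzeros per row; tune this to the actual pattern */
> ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, n, n, 5, NULL, &A);CHKERRQ(ierr);
> for (i = 0; i < n; i++) {
>   get_row(i, &ncols, cols, vals);  /* hypothetical: one row from your generator */
>   ierr = MatSetValues(A, 1, &i, ncols, cols, vals, INSERT_VALUES);CHKERRQ(ierr);
> }
> ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
> ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
>
> /* Write in PETSc binary form; only the nonzeros are stored */
> ierr = PetscViewerBinaryOpen(PETSC_COMM_SELF, "k.dat", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
> ierr = MatView(A, viewer);CHKERRQ(ierr);
> ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);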
>
>    Matt
>
>
>> ---John
>>
>>
>> On Mon, Aug 1, 2011 at 2:12 PM, John Chludzinski <jchludzinski at gmail.com> wrote:
>>
>>> I have 2 files (matrices) in simple binary form (IEEE-754, generated by
>>> some C code) and wanted to get them into canonical "PETSc binary form". So I
>>> did:
>>>
>>> Mat A;
>>> PetscScalar *a;
>>>
>>> ierr = PetscMalloc(SIZE*SIZE*sizeof(PetscScalar),&a);CHKERRQ(ierr);
>>> // stored the file into the space malloc'ed for 'a'.
>>> ierr = MatCreateSeqDense(PETSC_COMM_SELF, n, n, a, &A);CHKERRQ(ierr);
>>> ierr = MatView(A, PETSC_VIEWER_BINARY_(PETSC_COMM_WORLD));CHKERRQ(ierr);
>>>
>>> This works when I use -eps_type lapack, as long as I store the matrix in
>>> column-major order.
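>>>
>>> That matches PETSc's dense storage: the user array handed to
>>> MatCreateSeqDense() is interpreted in column-major (LAPACK-style) order, so
>>> entry (i,j) goes at a[i + j*n]. A rough sketch of filling it from a
>>> hypothetical row-major C array K:
>>>
>>> for (j = 0; j < n; j++)
>>>   for (i = 0; i < n; i++)
>>>     a[i + j*n] = K[i*n + j];  /* K: hypothetical row-major source array */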
>>>
>>> ---John
>>>
>>> On Mon, Aug 1, 2011 at 1:40 PM, Matthew Knepley <knepley at gmail.com> wrote:
>>>
>>>> On Mon, Aug 1, 2011 at 5:27 PM, John Chludzinski <jchludzinski at gmail.com> wrote:
>>>>
>>>>> I create 2 matrices using:
>>>>>
>>>>> MatCreateSeqDense(PETSC_COMM_SELF, n, n, Ka, &A);
>>>>> MatCreateSeqDense(PETSC_COMM_SELF, n, n, Kb, &B);
>>>>>
>>>>> These matrices are 99% zeros (16,016,004 entries and 18,660 non-zeros).
>>>>> They are symmetric and real; the non-zeros lie on the tridiagonal band plus
>>>>> a few other entries.
>>>>>
>>>>
>>>> Please give some justification for doing this. On the surface, it just
>>>> seems perverse.
>>>>
>>>>    Matt
>>>>
>>>>
>>>>> I tried to use ex7 for the generalized eigenvalue problem:
>>>>>
>>>>> ./ex7.exe -f1 k.dat -f2 m.dat -eps_gen_hermitian -eps_smallest_real >
>>>>> x.out 2>&1
>>>>>
>>>>> without specifying an EPS and get:
>>>>>
>>>>> Generalized eigenproblem stored in file.
>>>>>
>>>>> Reading REAL matrices from binary files...
>>>>> Number of iterations of the method: 500
>>>>>  Number of linear iterations of the method: 4009
>>>>> Solution method: krylovschur
>>>>>
>>>>> Number of requested eigenvalues: 1
>>>>> Stopping condition: tol=1e-07, maxit=500
>>>>> Number of converged approximate eigenpairs: 0
>>>>>
>>>>> Is krylovschur inappropriate for this problem, or have I set up the
>>>>> problem incorrectly by using MatCreateSeqDense(...) to create the matrix
>>>>> input files in PETSc binary form?
>>>>>
>>>>> ---John
>>>>>
>>>>
>>>>
>>>> --
>>>> What most experimenters take for granted before they begin their
>>>> experiments is infinitely more interesting than any results to which their
>>>> experiments lead.
>>>> -- Norbert Wiener
>>>>
>>>
>>>
>>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>