[petsc-users] Matrix Construction Question
Jack Poulson
jack.poulson at gmail.com
Tue Jun 28 17:06:05 CDT 2011
Adam,
What operations do you need to perform with the inverses of your Hamiltonian
matrices?
The inverse of a sparse matrix is still, in general, dense; that is what
Matt was getting at, and it makes a sparse matrix format a poor choice for
storing the inverse. If your operator were close to Hermitian
positive-definite, you could look into the H-matrix/semi-separable
literature for ways of forming the inverse in roughly O(n log^2(n))
arithmetic, but Hamiltonian matrices are not as well-behaved and I'm not
aware of readily available techniques for forming approximations to their
inverses. With the unstructured dense linear algebra approach, forming the
inverse will cost you O(n^3) operations even if your sparse matrix contains
only O(n) entries.
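To put numbers on it: your 3600x3600 operator with 4 nonzeros per row
stores about 1.4 x 10^4 entries, but its inverse will generically have all
3600^2 ~ 1.3 x 10^7 entries, and forming it densely costs on the order of
3600^3 ~ 4.7 x 10^10 operations every time you repeat the inversion.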
If I were you, I would think carefully about how to avoid forming the
inverse.
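If you only need selected entries or columns of the Green's function, the
usual alternative is to factor your matrix once and then solve against unit
vectors, reusing the factorization; since you said the nonzero structure
stays fixed, the symbolic factorization can also be reused across your
repeated "inversions". A rough, untested sketch using PETSc's KSP (calls as
in the current 3.1 series; ColumnOfInverse is just a name I made up, and
error checking is dropped for brevity):

#include <petscksp.h>

/* Fill g with column j of inv(A) by solving A g = e_j;
   A is assumed to be assembled already. */
void ColumnOfInverse(Mat A, PetscInt j, Vec *g)
{
  KSP ksp;
  PC  pc;
  Vec e;

  MatGetVecs(A, g, &e);                   /* work vectors shaped like A */
  VecSet(e, 0.0);
  VecSetValue(e, j, 1.0, INSERT_VALUES);  /* e := j-th unit vector */
  VecAssemblyBegin(e);
  VecAssemblyEnd(e);

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  /* SAME_NONZERO_PATTERN lets later solves reuse the symbolic
     factorization when only the values of A have changed */
  KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);
  KSPSetType(ksp, KSPPREONLY);            /* apply the factorization directly */
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCLU);                    /* parallel LU needs an external
                                             package, e.g. run with
                                             -pc_factor_mat_solver_package mumps */
  KSPSolve(ksp, e, *g);                   /* *g := inv(A) * e_j */

  VecDestroy(e);
  KSPDestroy(ksp);
}

In practice you would create the KSP once, loop it over whichever columns
you need, and call KSPSetOperators again after each change to the matrix
values so that only the numerical factorization is redone.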
Regards,
Jack Poulson
On Tue, Jun 28, 2011 at 5:01 PM, Adam Byrd <adam1.byrd at gmail.com> wrote:
> Actually, it's quite sparse: in the 3600x3600 matrix there are only 4
> nonzero entries in each row, which means it's 99.9% empty. My smaller 6x6
> example is dense, but it's only for practicing building and manipulating
> matrices.
>
> Respectfully,
> Adam
>
>
> On Tue, Jun 28, 2011 at 5:55 PM, Matthew Knepley <knepley at gmail.com> wrote:
>
>> It sounds like you have a dense matrix (from your example). Is this true?
>> If so, you should use Elemental (on Google Code).
>>
>> Thanks,
>>
>> Matt
>>
>> On Tue, Jun 28, 2011 at 8:55 AM, Adam Byrd <adam1.byrd at gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I'm rather new to PETSc and trying to work out the best way to create
>>> and fill a large sparse matrix distributed over many processors.
>>> Currently, my goal is to create a 3600x3600 matrix built from 12x12
>>> blocks, with several blocks on any given node. I'd like to create the
>>> matrix in such a way that each node holds only the information in its
>>> handful of blocks and not the entire matrix. Eventually, this matrix is
>>> to be inverted (I know, inversion should be avoided, but as this is a
>>> Hamiltonian matrix from which I need the Green's function, I'm unaware
>>> of a way to forgo the inversion). Additionally, the values will be
>>> changed slightly and the matrix repeatedly inverted; its structure will
>>> remain the same. In order to learn how to do this, I am starting with a
>>> small 6x6 matrix broken into four 3x3 blocks, distributed one block per
>>> node. I've been able to create a local 3x3 matrix on each node, with its
>>> own values, and with the global row/column IDs correctly set to
>>> [0, 1, 2] or [3, 4, 5] depending on where the block sits in the matrix.
>>> My problem manifests when I try to build the larger matrix from the
>>> individual smaller ones: I use MatSetValues and have each node pass in
>>> its 3x3 block, and I get an error that the sum of the local lengths (12)
>>> does not match the global length (6). It appears the four 3x3s are being
>>> interpreted as a 12x12 instead of as a 6x6 with the blocks in a grid.
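>>>
>>> Stripped to essentials, each of the four processes is doing roughly the
>>> following (simplified from the attached example; exact names may differ
>>> from my actual code):
>>>
>>> Mat A;
>>> MatCreate(PETSC_COMM_WORLD, &A);
>>> MatSetSizes(A, 3, 3, 6, 6);   /* local 3x3 block, global 6x6 */
>>> MatSetFromOptions(A);
>>> /* ... MatSetValues with the block's global row/column IDs ... */
>>> MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
>>> MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
>>>
>>> which, I assume, is where the mismatch arises: PETSc sums the local row
>>> counts across processes (4 x 3 = 12) and compares that against the
>>> global size of 6.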
>>>
>>> My question is then: is it possible to fill a matrix as a grid of blocks,
>>> or can I only fill it in groups of rows or columns? Also, am I approaching
>>> this problem the correct way, or are there more efficient ways of building
>>> this matrix with the ultimate goal of inverting it?
>>>
>>> I have included my modified copy of an example in case it helps. I do
>>> apologize if this is answered somewhere in the documentation; I have
>>> been unable to find a solution.
>>>
>>> Respectfully,
>>> Adam
>>>
>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>
>