[petsc-users] Storage space for symmetric (SBAIJ) matrix
Daniel Langr
daniel.langr at gmail.com
Tue Sep 21 14:20:49 CDT 2010
On 21.9.2010 20:53, Jed Brown wrote:
> On Tue, Sep 21, 2010 at 20:44, Daniel Langr <daniel.langr at gmail.com> wrote:
>> we do not need and do not want to use PETSc for writing a matrix into a
>> file. Such a file should be independent of any particular solver. That's
>> why we want to use the HDF5 library with parallel I/O capabilities. I can
>> simply store the CSR (or COO or any other scheme) arrays for the upper
>> triangular part of the matrix in the file, together with some supporting
>> information such as the number of rows and nonzeros.
>
> You are tying yourself much closer to the solver if you write the
> matrix out in partitioned split arrays (a PETSc-specific format, plus
> a very specific decomposition).
That's right, thanks. At first I thought that partitioned split arrays
were simply the CSR arrays of each partitioned submatrix.
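To make the HDF5 part concrete, below is a rough sketch of what we have
in mind (it assumes an HDF5 build with MPI-IO support; the dataset name
"val" and the way each rank obtains its offset and count are
placeholders, and the row pointer and column index arrays of the upper
triangular part would be written the same way, e.g. with offsets taken
from an MPI_Exscan over the local counts):

#include <hdf5.h>
#include <mpi.h>

/* Each rank writes its contiguous chunk of the nonzero values of the
   upper triangular part into one shared HDF5 file using collective
   MPI-IO. */
void write_values(MPI_Comm comm, const char *fname,
                  hsize_t global_nnz, hsize_t my_offset,
                  hsize_t my_nnz, const double *val)
{
  /* open the file collectively with the MPI-IO driver */
  hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
  H5Pset_fapl_mpio(fapl, comm, MPI_INFO_NULL);
  hid_t file = H5Fcreate(fname, H5F_ACC_TRUNC, H5P_DEFAULT, fapl);

  /* one global 1-D dataset; each rank selects its own hyperslab */
  hid_t fspace = H5Screate_simple(1, &global_nnz, NULL);
  hid_t dset   = H5Dcreate(file, "val", H5T_NATIVE_DOUBLE, fspace,
                           H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
  H5Sselect_hyperslab(fspace, H5S_SELECT_SET, &my_offset, NULL,
                      &my_nnz, NULL);
  hid_t mspace = H5Screate_simple(1, &my_nnz, NULL);

  /* collective write of this rank's values */
  hid_t dxpl = H5Pcreate(H5P_DATASET_XFER);
  H5Pset_dxpl_mpio(dxpl, H5FD_MPIO_COLLECTIVE);
  H5Dwrite(dset, H5T_NATIVE_DOUBLE, mspace, fspace, dxpl, val);

  H5Pclose(dxpl); H5Sclose(mspace); H5Sclose(fspace);
  H5Dclose(dset); H5Fclose(file); H5Pclose(fapl);
}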
>> As Jed mentioned, we can write only the state to the file. But parallel
>> I/O is also part of our project and research, that's why we bother :).
>> Also, when we want to compare different solution methods, we would need
>> to construct the matrix multiple times instead of just reading it from
>> the file, which can be quicker.
>
> Benchmark it; reading the matrix in will always be slower than
> assembling it in parallel (a factor > 100 would not be surprising).
> How long does it take to write 100 TB (on any file system in the world,
> with any number of I/O nodes)? Compare that to the couple of seconds it
> would take to assemble the 100 TB matrix with 100k procs.
>
> Jed
Certainly true, but computing the matrix elements will take much more
time than the assembly process. To tell the truth, we do not know yet
how much, because the new method for computing the Hamiltonian is still
being developed. Within competing methods this takes about 20 to 30
percent of the iterative eigensolver run time, which is not negligible.
If the elements are not stored, they would need to be evaluated
multiple times.
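For completeness, the direct-assembly alternative Jed describes would
look roughly like the sketch below; the preallocation numbers are made
up and compute_row() is only a stand-in for the real Hamiltonian
element generator:

#include <petscmat.h>

/* stand-in for the real Hamiltonian code: fills the nonzeros of row i
   in the upper triangle (here just a toy tridiagonal stencil) */
static void compute_row(PetscInt i, PetscInt n, PetscInt *ncols,
                        PetscInt cols[], PetscScalar vals[])
{
  *ncols = 0;
  cols[*ncols] = i; vals[*ncols] = 2.0; (*ncols)++;
  if (i + 1 < n) { cols[*ncols] = i + 1; vals[*ncols] = -1.0; (*ncols)++; }
}

PetscErrorCode AssembleUpper(MPI_Comm comm, PetscInt n, Mat *A)
{
  PetscErrorCode ierr;
  PetscInt       rstart, rend, i;

  ierr = MatCreate(comm, A);CHKERRQ(ierr);
  ierr = MatSetSizes(*A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
  ierr = MatSetType(*A, MATSBAIJ);CHKERRQ(ierr); /* only the upper triangle is stored */
  ierr = MatSetFromOptions(*A);CHKERRQ(ierr);
  /* rough preallocation guess: 32 nonzeros per row in the diagonal and
     off-diagonal blocks; real code would use proper estimates */
  ierr = MatMPISBAIJSetPreallocation(*A, 1, 32, NULL, 32, NULL);CHKERRQ(ierr);
  ierr = MatSeqSBAIJSetPreallocation(*A, 1, 32, NULL);CHKERRQ(ierr);

  ierr = MatGetOwnershipRange(*A, &rstart, &rend);CHKERRQ(ierr);
  for (i = rstart; i < rend; i++) {
    PetscInt    ncols, cols[64];
    PetscScalar vals[64];
    compute_row(i, n, &ncols, cols, vals);
    ierr = MatSetValues(*A, 1, &i, ncols, cols, vals, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(*A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(*A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  return 0;
}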
Daniel