Newbie question about synchronization

Diego Nehab diego.nehab at gmail.com
Thu Mar 29 02:42:50 CDT 2007


Hi,

I am very new to PETSc, and I am not very knowledgeable in
numerical methods. Therefore, please forgive my ignorance.

I need to solve sparse (at most 7 non-zero elements per row)
overconstrained linear systems, in the least squares sense.
The systems are very large (typically 3M x 500k), but have a
reasonably strong diagonal component that apparently makes
them solvable.

In the past I have directly used the C version of
C. C. Paige and M. A. Saunders' LSQR library, with no problems.
My interest in PETSc is the eternal quest for speed. That,
I hope, could come in the form of better methods, a better
matrix representation and, naturally, parallelism.

1) Am I right to assume that the only solver that accepts
overconstrained systems is LSQR?

So far, using only one process, everything is simple and
works (it took me longer to compile and test MPI and PETSc
than to write the code that solves my problem :)).
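
For reference, the core of my single-process setup looks roughly
like this (a minimal sketch: error checking is omitted, A, b and x
are assumed to be already created and assembled, and the exact
KSPSetOperators argument list depends on the PETSc version):

    #include "petscksp.h"

    /* A is the m x n (3M x 500k) aij matrix, b has length m and
       x has length n; both vectors are already assembled. */
    KSP ksp;
    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetType(ksp, KSPLSQR);
    KSPSetOperators(ksp, A, A, DIFFERENT_NONZERO_PATTERN);
    KSPSetFromOptions(ksp);
    KSPSolve(ksp, b, x);  /* minimizes ||Ax - b|| in the least-squares sense */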

When I move to multiple processes, the solver still works, but I
couldn't figure out how to select one of the processes to
consolidate the solution vector and save it to a file on disk.
I always get an error of the form

    [0]PETSC ERROR: Argument out of range!
    [0]PETSC ERROR: Can only get local values, trying xxx!

I assume I must bring the solution vector back from the other
processes, right?

2) If so, how do I do this?
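
From skimming the manual pages, I suspect something like
VecScatterCreateToZero is what I am missing. Here is a sketch of
what I have in mind (I am not sure this is the recommended way,
and I believe the argument order of VecScatterBegin/End differs
between PETSc versions, with the scatter context last in older
releases):

    /* Gather the distributed solution x into a sequential vector
       that lives entirely on process 0, then write it out there. */
    Vec         xseq;
    VecScatter  scatter;
    PetscMPIInt rank;

    MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
    VecScatterCreateToZero(x, &scatter, &xseq);
    VecScatterBegin(scatter, x, xseq, INSERT_VALUES, SCATTER_FORWARD);
    VecScatterEnd(scatter, x, xseq, INSERT_VALUES, SCATTER_FORWARD);
    if (rank == 0) {
        PetscScalar *a;
        VecGetArray(xseq, &a);
        /* ... write the n entries of a[] to a file ... */
        VecRestoreArray(xseq, &a);
    }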

As it turns out, the normal equations (A^T A x = A^T b) are
still very sparse in my case. So I used MatMatMultTranspose
and MatMultTranspose to form them and then solved the square
system.
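
Concretely, on one process I do roughly the following (a sketch;
I am assuming that MatMatMultTranspose(A, A, ...) computes A^T A,
that 1.0 is a reasonable fill estimate, and that ksp is the solver
context from above; in newer PETSc the same operation appears to
be called MatTransposeMatMult):

    /* Form the normal equations N x = c, with N = A^T A and
       c = A^T b, then solve the square system. */
    Mat N;
    Vec c;
    MatMatMultTranspose(A, A, MAT_INITIAL_MATRIX, 1.0, &N); /* N = A^T A */
    VecDuplicate(x, &c);        /* c has the column size of A */
    MatMultTranspose(A, b, c);  /* c = A^T b */
    KSPSetOperators(ksp, N, N, DIFFERENT_NONZERO_PATTERN);
    KSPSetType(ksp, KSPCG);     /* or -ksp_type cg -pc_type ilu */
    KSPSolve(ksp, c, x);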

Using -ksp_type cg and the default ILU settings, the solver
converges much faster than with LSQR, which is great. However,
MatMatMultTranspose doesn't seem to support the mpiaij matrix
format.

3) If I want to try the normal-equations approach with multiple
processes, do I have to perform the product myself, or is there
another function that does this for me?

Thanks in advance,
Diego.



