Running sequential and parallel linear solvers in the same code?
Ben Tay
zonexo at gmail.com
Tue Jan 23 20:18:45 CST 2007
Hi Barry,
Thanks for your recommendation. I've managed to use -mat_view to view the
matrix, but there's something I don't understand. My linear equations
involve 36 points, so I see 36 rows, numbered 0 to 35.

With one processor, I see rows 0-35, which show the coefficients of my
matrix. Then rows 0-35 are printed again. What are these second values? I
don't understand them.

With 4 processors, the first set of rows 0-35 is exactly the same. After
that I see rows 0-8 repeated 4 times. These values are now different, even
for the same first row 0, and some values are missing. I guess that's where
my mistake is.
Thank you.
On 1/23/07, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
>
> Ben,
>
> You can use -pc_type redundant -redundant_pc_type lu to solve the
> linear system with a sequential LU to test if that improves the situation.
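>
> For example, something like the following on the command line (here ./ex
> is just a stand-in for your own executable):
>
>    mpirun -np 2 ./ex -pc_type redundant -redundant_pc_type lu   # ./ex = your code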
>
> Likely your generation of the matrix on 2 processes is wrong; you can run
> with -mat_view on a small problem with both 1 and 2 processes and see if
> they are the same in both cases.
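>
> Something like this (again, ./ex is a stand-in for your executable) makes
> the comparison straightforward:
>
>    mpirun -np 1 ./ex -mat_view > mat.1   # ./ex = your code
>    mpirun -np 2 ./ex -mat_view > mat.2
>    diff mat.1 mat.2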
>
> I don't think what you propose to do below makes sense.
>
> Barry
>
>
> On Tue, 23 Jan 2007, Ben Tay wrote:
>
> > Hi,
> >
> > In my MPI CFD code, I have to solve 2 linear equations - the momentum
> > equation and the Poisson equation. There's an error in the code and I
> > don't know which equation causes it. Is it possible to solve one equation
> > in sequential mode and the other in MPI, to pinpoint the error, since my
> > code works fine in sequential mode?
> >
> > I tried to use this for one of the equations:
> >
> >
> >   call MatCreateSeqAIJ(PETSC_COMM_SELF, size_x*size_y, size_x*size_y, 9, &
> >                        PETSC_NULL_INTEGER, A_mat, ierr)
> >
> >   call VecCreateSeq(PETSC_COMM_SELF, size_x*size_y, b_rhs, ierr)
> >
> > but there's an error.
> >
> > Thank you.
> >
>
>