Some questions about the parallel implementation of PETSc

Ben Tay zonexo at gmail.com
Sat Jan 13 01:44:24 CST 2007


hi,

actually it's a C-grid which forms an airfoil. Can I use DA & multigrid
too? Btw, regarding my previous question, is the answer on each process
the local values only, or is it global?

thanks!

On 1/13/07, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
>   Ben,
>
>   Sounds like you are using a logically rectangular grid in two dimensions
> (or three)? If so, I highly recommend using the DMMG infrastructure in
> PETSc: it handles all the decomposition of the domain into subrectangles,
> preallocates the matrix, handles ghost point updates, and even lets you use
> multigrid. Check src/ksp/ksp/examples/tutorials/ex29.c and ex22.c.
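>
>   For a logically rectangular grid the setup might look roughly like the
> untested Fortran sketch below (pre-3.0 "DA" naming; m, n, dof and the
> stencil width s are placeholders for your own grid sizes):
>
>      DA   da
>      Mat  A
>      Vec  x, xlocal
>
> !    create a distributed array; PETSc chooses the processor decomposition
>      call DACreate2d(PETSC_COMM_WORLD,DA_NONPERIODIC,DA_STENCIL_STAR,  &
>                      m,n,PETSC_DECIDE,PETSC_DECIDE,dof,s,              &
>                      PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,da,ierr)
>
> !    matrix and vectors laid out (and preallocated) to match the DA
>      call DAGetMatrix(da,MATMPIAIJ,A,ierr)
>      call DACreateGlobalVector(da,x,ierr)
>      call DACreateLocalVector(da,xlocal,ierr)
>
> !    fill the ghost points of the local work vector from the global vector
>      call DAGlobalToLocalBegin(da,x,INSERT_VALUES,xlocal,ierr)
>      call DAGlobalToLocalEnd(da,x,INSERT_VALUES,xlocal,ierr)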
>
>   Good luck,
>
>    Barry
>
>
> On Sat, 13 Jan 2007, Ben Tay wrote:
>
> > Hi,
> >
> > I've got the sequential version of PETSc working and I have some questions
> > about the parallel implementation.
> >
> > My code solves the NS equations. Mainly it has to solve two linear
> > equations: momentum & Poisson. Solving the Poisson equation takes the
> > largest share of the time, so it is going to be solved in parallel. Most
> > likely the momentum equation will be as well. The other parts of the code
> > may be parallelized later using MPI.
> >
> >
> >
> > The parallel part is coded as follows:
> >
> > call MatCreate(PETSC_COMM_WORLD,A,ierr)
> > call MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,m*n,m*n,ierr)
> > call MatSetFromOptions(A,ierr)
> > call MatGetOwnershipRange(A,Istart,Iend,ierr)
> >
> > call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,m*n,b,ierr)
> > call VecSetFromOptions(b,ierr)
> > call VecDuplicate(b,x,ierr)
> >
> > ... insert matrix in parallel making use of Istart, Iend
> > ... assembly
> > ... solve
> >
> > Since I've used PETSC_DECIDE, PETSc will determine the local dimensions of
> > A, b and x automatically.
> >
> > Is the local number of rows of A, b and x equal to Iend-Istart+1?
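> >
> > (Or, since the manual says the Iend returned by MatGetOwnershipRange is one
> > past the last locally owned row, should it be Iend-Istart? A small untested
> > check to print the local sizes directly, with nlocal just a new integer:)
> >
> > ! query the local sizes directly instead of inferring them
> > call MatGetOwnershipRange(A,Istart,Iend,ierr)
> > call VecGetLocalSize(b,nlocal,ierr)
> > print *, 'local rows:', Iend-Istart, ' local vector length:', nlocal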
> >
> > After the equation is solved, I need to obtain the answer x.
> >
> > I've used:
> >
> > call VecGetArray(x,ppv,i_vec,ierr)
> >
> > do j=1,size_y
> >    do i=1,size_x
> >       k=(j-1)*size_x+i
> >       p(i,j)=ppv(k+i_vec)
> >    end do
> > end do
> >
> > call VecRestoreArray(x,ppv,i_vec,ierr)
> >
> > I'm trying to get the address of the 1st value of x and map the values
> > onto a rectangular grid to update p. Is this correct?
> >
> > Or am I only able to get the local values, as in the local part of x? Do I
> > need to use some MPI routines if I need to update the p values on the whole
> > global grid?
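> >
> > (Would something like this untested sketch be the right way to get the
> > whole x onto every process? It assumes a PETSc version that provides
> > VecScatterCreateToAll, and the calling sequence follows recent releases,
> > so it may need adjusting; x_all and ctx are new variables:)
> >
> > Vec        x_all
> > VecScatter ctx
> >
> > ! gather the parallel vector x into a sequential vector x_all that
> > ! holds all m*n entries on every process
> > call VecScatterCreateToAll(x,ctx,x_all,ierr)
> > call VecScatterBegin(ctx,x,x_all,INSERT_VALUES,SCATTER_FORWARD,ierr)
> > call VecScatterEnd(ctx,x,x_all,INSERT_VALUES,SCATTER_FORWARD,ierr)
> >
> > ! calling VecGetArray on x_all (instead of x) then lets the i,j loop
> > ! above see the full grid; afterwards clean up
> > call VecScatterDestroy(ctx,ierr)
> > call VecDestroy(x_all,ierr)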
> >
> > thanks a lot!
> >
>
>



