Parallel use of Prometheus
Nicolas Tardieu
niko.karin at gmail.com
Wed Jan 16 15:24:13 CST 2008
I did the same in my software, and it shows terrific performance in serial.
My problem may come from the default distribution of the matrix in PETSc: N
rows on proc 0, N rows on proc 1, and so on.
How did you distribute the matrix in your program?
Nicolas
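[For reference: PETSc's default layout (PETSC_DECIDE) gives each process a contiguous block of rows, split as evenly as possible, with the first N mod P processes receiving one extra row. A minimal plain-Python sketch of that split follows; the helper names are illustrative, not PETSc API.]

```python
# Sketch of PETSc's default contiguous row distribution (PETSC_DECIDE):
# each of the P processes owns a contiguous block of rows, and the
# first N % P processes get one extra row so the split is as even
# as possible.
def local_rows(N, size, rank):
    """Number of rows owned by process `rank` out of `size` processes."""
    return N // size + (1 if rank < N % size else 0)

def ownership_range(N, size, rank):
    """Half-open [start, end) row range owned by process `rank`."""
    start = sum(local_rows(N, size, r) for r in range(rank))
    return start, start + local_rows(N, size, rank)

# e.g. 10 rows over 3 processes -> blocks of 4, 3, 3 rows
for rank in range(3):
    print(rank, ownership_range(10, 3, rank))
```

If your assembly loop assumes a different partitioning than this, rows inserted on the "wrong" process are communicated during MatAssemblyBegin/End, which works but can be slow.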
2008/1/16, Sanjay Govindjee <sanjay at ce.berkeley.edu>:
>
> I have not tried this with any of the examples in the distribution, only
> with my own program.
>
> The steps are relatively straightforward, but you must know them already
> since you have it working in serial. Nonetheless, here is the order of
> calls I use:
>
> call MatCreateMPIBAIJ(... ! must be blocked
> call MatAssemblyBegin (... ! Assemble the matrix
> call MatAssemblyEnd (...
>
> call VecAssemblyBegin (... !Assemble the RHS
> call VecAssemblyEnd (...
>
> call KSPCreate (...
> call KSPSetOperators (...
> call KSPGetPC (...
> call PCSetCoordinates(...  ! I'm not fully certain, but your problem may
>                            ! have to be defined over R^3 for this to work well
> call KSPSetFromOptions(...
>
> call KSPSolve (...
>
>
> Nicolas Tardieu wrote:
> > Hi Sanjay,
> >
> > Have you tried it on simple examples from the PETSc distribution?
> >
> > Nicolas
> >
> > 2008/1/16, Sanjay Govindjee <sanjay at ce.berkeley.edu
> > <mailto:sanjay at ce.berkeley.edu>>:
> >
> > We have used it extensively with good success in the past.
> > -sanjay
> >
> > Nicolas Tardieu wrote:
> > > Hi everyone,
> > >
> > > I have been using Prometheus for scalar and vector problems with
> > great
> > > success on sequential computers.
> > > But when I try to use it in parallel, Prometheus crashes.
> > > Then I tried it on some PETSc tests
> > > (src/ksp/ksp/examples/tests/ex3.c for instance) in parallel,
> > > and they crash as well...
> > > Has anyone used Prometheus in parallel?
> > >
> > > Thanks,
> > >
> > > Nicolas
> > >
> >
> >
>
>
More information about the petsc-users mailing list