[petsc-dev] Non-scalable matrix operations
Matthew Knepley
knepley at gmail.com
Fri Dec 23 10:54:34 CST 2011
On Fri, Dec 23, 2011 at 10:48 AM, Mark F. Adams <mark.adams at columbia.edu> wrote:
>
> On Dec 23, 2011, at 10:53 AM, Jed Brown wrote:
>
> On Fri, Dec 23, 2011 at 09:50, Mark F. Adams <mark.adams at columbia.edu> wrote:
>
>> Hmm, my G-S is not in PETSc, and it is perfectly scalable. It does have
>> more complex communication patterns, but they are O(1) in latency and
>> bandwidth. I'm not sure I understand your description above.
>
>
> It was more like: here is something that we perhaps want to put in PETSc;
> what rich communication pattern does it use, such that, if provided, the
> implementation would be simple?
>
>
> There is the implementation in Prometheus, which uses my own C++ linked
> lists and hash tables; I would like to reimplement it with STL containers.
> I also hack into MPIAIJ matrices to provide a primitive that applies G-S
> on an index set of local vertices, which the algorithm requires. This
> should be rethought. I would guess that it would take about a week or two
> to move this into PETSc.
>
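For concreteness, here is a minimal sketch of such a primitive (this is not
the Prometheus code): one forward G-S sweep over an index set of local rows,
applied to the sequential (diagonal) block of an MPIAIJ matrix, assuming
off-process couplings have already been folded into the right-hand side b.
The routine name and that convention are assumptions; MatGetRow() and
VecGetArray() are the actual PETSc calls.

#include <petscmat.h>

/* One forward Gauss-Seidel sweep over the local rows listed in rows[],
   using only the local (diagonal) block Aloc of the parallel matrix. */
PetscErrorCode GSSweepOnIndexSet(Mat Aloc,Vec b,Vec x,const PetscInt rows[],PetscInt nrows)
{
  PetscErrorCode     ierr;
  PetscScalar        *xa;
  const PetscScalar  *ba,*vals;
  const PetscInt     *cols;
  PetscInt           i,j,ncols;

  PetscFunctionBegin;
  ierr = VecGetArray(x,&xa);CHKERRQ(ierr);
  ierr = VecGetArrayRead(b,&ba);CHKERRQ(ierr);
  for (i=0; i<nrows; i++) {
    const PetscInt r    = rows[i];
    PetscScalar    diag = 1.0,sum = ba[r];
    ierr = MatGetRow(Aloc,r,&ncols,&cols,&vals);CHKERRQ(ierr);
    for (j=0; j<ncols; j++) {
      if (cols[j] == r) diag = vals[j];             /* diagonal entry */
      else              sum -= vals[j]*xa[cols[j]]; /* off-diagonal couplings */
    }
    ierr = MatRestoreRow(Aloc,r,&ncols,&cols,&vals);CHKERRQ(ierr);
    xa[r] = sum/diag;
  }
  ierr = VecRestoreArrayRead(b,&ba);CHKERRQ(ierr);
  ierr = VecRestoreArray(x,&xa);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

Rows outside the index set are left untouched, which is presumably what lets
sweeps on interior vertices overlap with communication for boundary ones.
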
> The complex communication required makes this code work much better with
> large subdomains, so, as currently written, it is getting less attractive
> in a flat MPI model. If I do this I would like to think about doing it in
> the next programming model of PETSc (pthreads?). Anyway, this would take
> enough work that I'd like to think a bit about its design, and even the
> algorithm, in a non-flat MPI model.
>
I think we should give at least some thought to how this would look in
Thrust/OpenCL.
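One common way to map G-S onto a data-parallel framework like Thrust is
multicoloring: rows of one color share no couplings, so each color becomes
one parallel pass and the serial dependence moves into the loop over colors.
A small self-contained sketch along those lines (the names and CSR layout
are illustrative assumptions, not PETSc or Prometheus code), using red-black
coloring on a 1-D Laplacian:

#include <thrust/device_vector.h>
#include <thrust/host_vector.h>
#include <thrust/for_each.h>
#include <cstdio>

/* Relax one CSR row r in place: x[r] <- (b[r] - sum of off-diagonals)/diag. */
struct GSRowUpdate {
  const int    *rowptr,*colind;
  const double *vals,*b;
  double       *x;
  __host__ __device__ void operator()(int r) const {
    double diag = 1.0,sum = b[r];
    for (int k = rowptr[r]; k < rowptr[r+1]; ++k) {
      if (colind[k] == r) diag = vals[k];
      else                sum -= vals[k]*x[colind[k]];
    }
    x[r] = sum/diag;
  }
};

int main()
{
  const int n = 8;  /* tridiag(-1,2,-1): a red-black (2-color) split suffices */
  thrust::host_vector<int>    hrp(n+1),hci;
  thrust::host_vector<double> hv;
  hrp[0] = 0;
  for (int i = 0; i < n; ++i) {        /* assemble the CSR arrays on the host */
    if (i > 0)   { hci.push_back(i-1); hv.push_back(-1.0); }
    hci.push_back(i); hv.push_back(2.0);
    if (i < n-1) { hci.push_back(i+1); hv.push_back(-1.0); }
    hrp[i+1] = (int)hci.size();
  }
  thrust::host_vector<int> heven,hodd;  /* the two color classes */
  for (int i = 0; i < n; ++i) (i%2 ? hodd : heven).push_back(i);

  thrust::device_vector<int>    rowptr = hrp,colind = hci,even = heven,odd = hodd;
  thrust::device_vector<double> vals = hv,b(n,1.0),x(n,0.0);

  GSRowUpdate up = {thrust::raw_pointer_cast(rowptr.data()),
                    thrust::raw_pointer_cast(colind.data()),
                    thrust::raw_pointer_cast(vals.data()),
                    thrust::raw_pointer_cast(b.data()),
                    thrust::raw_pointer_cast(x.data())};

  for (int sweep = 0; sweep < 20; ++sweep) {       /* colors in sequence,   */
    thrust::for_each(even.begin(),even.end(),up);  /* rows within a color   */
    thrust::for_each(odd.begin(), odd.end(), up);  /* in parallel           */
  }
  printf("x[n/2] = %g\n",(double)x[n/2]);
  return 0;
}

The coloring quality then controls how much parallelism each pass exposes,
which is the part worth thinking about before committing to a design.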
Matt
> Note, I see the win with G-S over Chebyshev in highly unsymmetric
> (convection, hyperbolic) problems, where Chebyshev is not very good.
>
> Mark
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener