[petsc-dev] Non-scalable matrix operations
Mark F. Adams
mark.adams at columbia.edu
Fri Dec 23 12:19:16 CST 2011
On Dec 23, 2011, at 12:02 PM, Jed Brown wrote:
> On Fri, Dec 23, 2011 at 10:48, Mark F. Adams <mark.adams at columbia.edu> wrote:
> There is the implementation in Prometheus that uses my C++ linked lists and hash tables. I would like to reimplement this with the STL. I also hack into MPIAIJ matrices to provide a primitive for applying G-S on an index set of local vertices, which the algorithm requires. This should be rethought. I would guess that it would take about a week or two to move this into PETSc.
>
> The complex communication required makes this code work much better with large subdomains, so it is getting less attractive in a flat MPI mode as it is currently written. If I do this, I would like to think about doing it in the next programming model of PETSc (pthreads?). Anyway, this would take enough work that I'd like to think a bit about its design, and even the algorithm, in a non-flat MPI model.
>
> Note, I see the win with G-S over Cheby in highly nonsymmetric (convection, hyperbolic) problems, where Cheby is not very good.
>
> Can we mix with Eisenstat to apply multiple cycles?
Eisenstat is not computing with zero data if there is no initial guess, right? I'm not sure I understand you here.
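
Not from the thread, just a minimal sketch of what selecting the Eisenstat (SSOR) trick on a level smoother looks like in PETSc; the KSP name "smoother" and the omega value are purely illustrative, and whether this composes well with multiple Chebyshev cycles is exactly the open question here:

#include <petscksp.h>

/* Sketch only: put the Eisenstat SSOR trick on an existing smoother KSP. */
PetscErrorCode UseEisenstatSmoother(KSP smoother)
{
  PC             pc;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = KSPGetPC(smoother, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCEISENSTAT);CHKERRQ(ierr);
  ierr = PCEisenstatSetOmega(pc, 1.0);CHKERRQ(ierr); /* omega = 1 -> undamped SSOR */
  PetscFunctionReturn(0);
}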
> I know the theory doesn't argue for it, but G-S with Cheby sometimes wins over everything else I've tried.
People damp G-S for convection, etc., which is effectively what Cheby/G-S does. Do you do this for SPD systems?
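
For concreteness, a hedged sketch of the Cheby/G-S combination being discussed: Chebyshev wrapped around SOR (G-S) on a level smoother, using the current KSPCHEBYSHEV name. The function name and the idea of passing omega in are illustrative; omega < 1 is the kind of under-relaxation people use for convection-dominated problems:

#include <petscksp.h>

/* Sketch only: Chebyshev around (damped) G-S/SOR on a level smoother. */
PetscErrorCode UseChebyGS(KSP smoother, PetscReal omega)
{
  PC             pc;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = KSPSetType(smoother, KSPCHEBYSHEV);CHKERRQ(ierr);
  ierr = KSPGetPC(smoother, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCSOR);CHKERRQ(ierr);
  ierr = PCSORSetOmega(pc, omega);CHKERRQ(ierr); /* omega < 1 damps G-S for convection */
  PetscFunctionReturn(0);
}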
> Is there any hope of doing nonlinear G-S where the user can provide something moderately simple?
Nonlinear should actually be very simple: just replace my hacks into (MPI)AIJ matrices with your own operator (see the sketch below).
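
A minimal sketch (not Prometheus code) of what "your own operator" could look like: register a user G-S/SOR routine on a MATSHELL via MATOP_SOR, and have that routine sweep only over the index set it cares about (the primitive I hack into MPIAIJ above). MySweep, MyCtx, and the rowis member are hypothetical placeholders:

#include <petscmat.h>

typedef struct {
  IS rowis; /* hypothetical: the local vertices to sweep over */
} MyCtx;

/* Sketch only: user-supplied G-S sweep, called wherever PETSc would call MatSOR. */
static PetscErrorCode MySweep(Mat A, Vec b, PetscReal omega, MatSORType flag,
                              PetscReal shift, PetscInt its, PetscInt lits, Vec x)
{
  MyCtx          *ctx;
  PetscErrorCode  ierr;

  PetscFunctionBegin;
  ierr = MatShellGetContext(A, (void **)&ctx);CHKERRQ(ierr);
  /* ... relax x from b over the vertices in ctx->rowis ... */
  PetscFunctionReturn(0);
}

/* Registration, with sizes and ctx assumed set up elsewhere:
 *   ierr = MatCreateShell(comm, m, m, M, M, &ctx, &A);CHKERRQ(ierr);
 *   ierr = MatShellSetOperation(A, MATOP_SOR, (void (*)(void))MySweep);CHKERRQ(ierr);
 */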
Mark