[Petsc-trilinos-discussion] Scope and requirements

Barry Smith bsmith at mcs.anl.gov
Wed Nov 20 17:25:39 CST 2013


On Nov 20, 2013, at 12:46 PM, Heroux, Michael A <maherou at sandia.gov> wrote:

> It seems to me that we should start this discussion with scope and requirements.
> 
> What needs are we addressing?  What are the requirements?
> 
> Are we restricting our efforts to solvers?  Linear, nonlinear, eigen, transient, more?

   I do NOT like it when users use only the PETSc linear solvers. The further up the software stack they start, the better off they are and the better off we are. So if they are solving a time-dependent problem, I want them to use our time-dependent solver interface (TS).
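
   For example, here is a minimal sketch of what starting at the TS level looks like; the toy right-hand side (du/dt = -u) merely stands in for a real application's function, and error checking is omitted for brevity:

    #include <petscts.h>

    /* toy right-hand side for du/dt = -u; a real application supplies its own */
    static PetscErrorCode MyRHSFunction(TS ts, PetscReal t, Vec u, Vec f, void *ctx)
    {
      VecCopy(u, f);
      VecScale(f, -1.0);
      return 0;
    }

    int main(int argc, char **argv)
    {
      TS  ts;
      Vec u;

      PetscInitialize(&argc, &argv, NULL, NULL);
      VecCreate(PETSC_COMM_WORLD, &u);
      VecSetSizes(u, PETSC_DECIDE, 100);
      VecSetFromOptions(u);
      VecSet(u, 1.0);                                  /* initial condition */

      TSCreate(PETSC_COMM_WORLD, &ts);
      TSSetRHSFunction(ts, NULL, MyRHSFunction, NULL);
      TSSetFromOptions(ts);   /* integrator, SNES, KSP, PC all selectable at run time */
      TSSolve(ts, u);

      TSDestroy(&ts);
      VecDestroy(&u);
      PetscFinalize();
      return 0;
    }

   Everything below TS (the nonlinear solver, the linear solver, the preconditioner) then remains reachable through the options database without the user ever writing a time-stepping loop.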

   In addition, I much prefer that they use our DM abstraction (think of it as encapsulating the information about the mesh that the solvers need or want in order to get good performance) rather than manually providing all the needed information to the solvers themselves.
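
   For example (a sketch using a structured-grid DMDA and the current DMCreateMatrix() calling sequence), the DM hands correctly laid out vectors and correctly preallocated matrices to the user and to the solvers:

    #include <petscdmda.h>

    int main(int argc, char **argv)
    {
      DM  da;
      Vec x;
      Mat A;

      PetscInitialize(&argc, &argv, NULL, NULL);
      /* a structured 2D grid, 65 x 65, one unknown per node, stencil width 1;
         the DM carries the sizes, the stencil, and the parallel layout */
      DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                   DMDA_STENCIL_STAR, 65, 65, PETSC_DECIDE, PETSC_DECIDE,
                   1, 1, NULL, NULL, &da);
      DMSetFromOptions(da);
      DMSetUp(da);

      DMCreateGlobalVector(da, &x);   /* correctly sized and distributed vector */
      DMCreateMatrix(da, &A);         /* matrix preallocated for the stencil    */

      MatDestroy(&A);
      VecDestroy(&x);
      DMDestroy(&da);
      PetscFinalize();
      return 0;
    }

   A solver handed the same DM (KSPSetDM(), SNESSetDM(), TSSetDM()) can use that information itself, for example to build a geometric multigrid hierarchy.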

   In particular, if they are, say, solving two coupled sets of time-dependent PDEs, I want them to hand the entire system off to us, not manage the coupling themselves and call our solvers separately for each of the two sub-solves at every time step.
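
   As a toy sketch of what I mean: pack both fields into one DM (here a DMDA with two degrees of freedom per node, with a placeholder residual), give it to one SNES, and the splitting then happens inside the solver, e.g. via -pc_type fieldsplit:

    #include <petscsnes.h>
    #include <petscdmda.h>

    /* toy residual for the packed two-field system, F(x) = x;
       a real application supplies the coupled PDE residual here */
    static PetscErrorCode FormFunction(SNES snes, Vec x, Vec f, void *ctx)
    {
      VecCopy(x, f);
      return 0;
    }

    int main(int argc, char **argv)
    {
      DM   da;
      SNES snes;
      Vec  x;

      PetscInitialize(&argc, &argv, NULL, NULL);
      /* one grid carrying both fields (dof = 2); the coupling lives inside
         the single system handed to the library */
      DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                   DMDA_STENCIL_STAR, 33, 33, PETSC_DECIDE, PETSC_DECIDE,
                   2, 1, NULL, NULL, &da);
      DMSetFromOptions(da);
      DMSetUp(da);

      SNESCreate(PETSC_COMM_WORLD, &snes);
      SNESSetDM(snes, da);                      /* whole coupled system, one solver */
      SNESSetFunction(snes, NULL, FormFunction, NULL);
      SNESSetFromOptions(snes);  /* e.g. -pc_type fieldsplit splits the fields inside the solve */

      DMCreateGlobalVector(da, &x);
      SNESSolve(snes, NULL, x);   /* for this toy the Jacobian is left to PETSc's finite-difference default */

      VecDestroy(&x);
      SNESDestroy(&snes);
      DMDestroy(&da);
      PetscFinalize();
      return 0;
    }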

   As with Trilinos, we offer lots and lots of user control over the two sub-solves, the nonlinear solves, the linear solves, the matrix formats, and the vector formats. This makes switching between Trilinos and PETSc time-consuming because both expose an enormous API (the entire stack). One can roughly map between the two APIs, but there are no exact mappings, and only experts (or no one) understand the details of both. We all know the problems and drawbacks of trying to come up with a common API. But perhaps we could try to document the possible mappings??? For example: here is the PETSc Vec API, and here is the Trilinos equivalent. Regardless of the outcome of this discussion, we would likely benefit enormously from this mapping, because we would see other ways of doing things, things we missed, and things we could improve and simplify.

   This is not really the place for it, but since this is a free-form mailing list I’ll start it here anyway.

   Topic                        PETSc                                        Trilinos

   Communication and            MPI_Comm (with thread comm attribute);       (I know you guys have
   computation contexts         operations such as PetscCommGetNewTag()      something here)
                                plus the usual MPI_Comm operations

   Indexing                     IS, PetscSection

   Vectors                      Vec

   Communication                PetscSF, VecScatter

   Matrices                     Mat

   Linear solvers               PC, KSP

   Nonlinear solvers            SNES

   ODE integrators              TS

   Solver factories             DM

   Mesh stuff                   DM

   “Coupling”                   Handled within each solver object (and
                                Mat/Vec object) by splitting etc.; no
                                special objects for it

   Differentiation              Only finite differences within Mat, no
                                cool AD stuff
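
   To make the Vec row concrete, here is a tiny toy example on the PETSc side, with my rough guesses at the Epetra_Vector counterparts in the comments (the Trilinos folks should correct or sharpen these; they are approximations, not exact equivalents):

    #include <petscvec.h>

    int main(int argc, char **argv)
    {
      Vec         x, y;
      PetscScalar dot;
      PetscReal   nrm;

      PetscInitialize(&argc, &argv, NULL, NULL);
      VecCreate(PETSC_COMM_WORLD, &x);    /* roughly: Epetra_Map map(100, 0, comm);  */
      VecSetSizes(x, PETSC_DECIDE, 100);  /*          Epetra_Vector x(map);          */
      VecSetFromOptions(x);
      VecDuplicate(x, &y);                /* roughly: Epetra_Vector y(x.Map())       */

      VecSet(x, 1.0);                     /* roughly: x.PutScalar(1.0)               */
      VecSet(y, 2.0);
      VecAXPY(y, 3.0, x);                 /* y = 3*x + y; roughly y.Update(3.0, x, 1.0) */
      VecDot(x, y, &dot);                 /* roughly: x.Dot(y, &dot)                 */
      VecNorm(y, NORM_2, &nrm);           /* roughly: y.Norm2(&nrm)                  */

      VecDestroy(&x);
      VecDestroy(&y);
      PetscFinalize();
      return 0;
    }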



   Big differences I see in our designs:

      1) Trilinos uses explicit factories even for simple objects, while PETSc’s factories are (through delegation) internal to the abstract object interface; see the sketch after this list.

      2) Trilinos uses templates that permeate the public interface (which seems to make it difficult to mix with some Fortran code?).
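
   On the PETSc side, 1) looks like the following few lines: the object is created abstractly and the concrete implementation is delegated to afterwards (the Trilinos contrast in the comment is only my rough characterization):

    #include <petscksp.h>

    int main(int argc, char **argv)
    {
      KSP ksp;
      PC  pc;

      PetscInitialize(&argc, &argv, NULL, NULL);
      /* one abstract handle; the concrete implementation is chosen (and can be
         changed) after creation -- the "factory" is hidden inside the object */
      KSPCreate(PETSC_COMM_WORLD, &ksp);
      KSPSetType(ksp, KSPGMRES);     /* delegate to the GMRES implementation */
      KSPGetPC(ksp, &pc);
      PCSetType(pc, PCJACOBI);       /* likewise for the preconditioner */
      KSPSetFromOptions(ksp);        /* -ksp_type bcgs -pc_type ilu can override at run time */

      /* (contrast: on the Trilinos side one typically asks an explicit factory,
         e.g. a Stratimikos builder driven by a Teuchos::ParameterList, to
         construct the solver object up front -- roughly speaking) */

      KSPDestroy(&ksp);
      PetscFinalize();
      return 0;
    }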



    

> 
> What usage models are we supporting?
> 
> Mike


