[petsc-dev] getting ready for PETSc release

Jed Brown jed at jedbrown.org
Sun Jun 8 12:20:13 CDT 2014


Matthew Knepley <knepley at gmail.com> writes:

> On Sun, Jun 8, 2014 at 3:41 AM, Jed Brown <jed at jedbrown.org> wrote:
>
>> Where is your Riemann solver interface?  I see
>> DMPlexTSSetRHSFunctionLocal which has nothing in its name indicating
>> that it is a Riemann solver, has no support for non-conservative systems
>> (for which the Riemann solver returns a wave decomposition because
>> "flux" is not defined), and cannot support a transverse solve (to double
>> or triple the stable time step).  Note that this interface has a
>> context, unlike your FE quadrature functions.  (Interface inconsistency
>> is evil.)  Also, ex11 does not offer characteristic reconstruction.
>>
>> I meant for ex11 to be a demonstration of one way to solve these
>> problems, but certainly not THE CANONICAL WAY.  I do not agree with
>> moving it into PETSc, especially not using a name like PetscProblem.
>> I'd rather it be in a separate project (that was the idea with the
>> parmod repository, but I don't think you ever used that).
>
>
> Fine, but what in the FV interface is inappropriate for this? Also, I
> distrust the argument here because some of what you describe is
> incredibly special-purpose, namely it will only work on Cartesian
> grids, and thus should not determine large parts of the top-level
> interface.

What only works on Cartesian grids?  Wave decomposition and the like
work on unstructured grids.  PetscProblem is the special-purpose thing
that excludes large classes of good methods.  I think it is
inappropriate to use the PetscProblem namespace for one specific class
of methods.
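
Concretely, the kind of callback I have in mind looks something like
this hypothetical sketch (none of these names exist in PETSc; the point
is the shape of the interface):

  /* Hypothetical: return a wave decomposition rather than a flux, which
     remains well-defined for non-conservative systems.  Note the user
     context, unlike the FE quadrature functions. */
  typedef PetscErrorCode (*RiemannWaveFunction)(
      PetscInt dim, PetscInt Nc,                       /* dim, components */
      const PetscReal x[], const PetscReal n[],        /* face point, normal */
      const PetscScalar uL[], const PetscScalar uR[],  /* left/right states */
      PetscInt Nw, PetscScalar waves[],                /* Nw waves, Nc each */
      PetscScalar speeds[],                            /* Nw wave speeds */
      void *ctx);                                      /* user context */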

>> >> >> > Maybe MOOSE does. However, neither of those has good integration
>> >> >> > with the solver (I have discussed this personally with Roy and
>> >> >> > Derek and they agree).
>> >> >>
>> >> >> You have to be able to formulate a similar scope of problems before
>> >> >> you get to argue fine points of supporting geometric multigrid and
>> >> >> the like.  FWIW, it does not support the solve structure we use in
>> >> >> pTatin so gives up significant performance relative to existing
>> >> >> packages that use solver algorithms you want.
>> >> >>
>> >> >
>> >> > I asked you to help with this.
>> >>
>> >> I pointed out that your interface can't do that algorithm and I showed
>> >> you code that can.  I'm not sure what you want me to do.  Change your
>> >> interface to be able to do that method fast, implying other data
>> >> structure changes and possible loss of functionality that you want?
>> >
>> >
>> > Of course, there is nothing else to do but point out shortcomings. That
>> > makes things easy.
>>
>> Some problems are not appropriate for a given level of abstraction.  I
>> believe you are creating a specialized high-level PDE interface, which I
>> consider to be out of scope for PETSc.  Imagine if KSP was only capable
>> of solving scalar SPD problems.  "It's so easy to point out
>> shortcomings" and yet those shortcomings define a scope that is
>> incompatible with the name and not in the spirit of the software
>> package.
>
>
> KSP is of course only for Krylov solvers, and we dump everything else in PC.
> There is nothing wrong with limiting the scope of an abstraction to a useful
> subset. This is the essence of design, and a small-minded commitment to
> absorbing everything will dilute the interface to nothing.

PetscProblem is a name that claims far more than it delivers.  You can't
compose these things because the scope is so limited, and you can't even
commit to using them because you'll end up spending lots of time
extending the interface (which is not currently user-extensible and
perhaps should not become user-extensible).

Why can't you develop this in a separate package?

>> >> How else would you describe it?  I suggested
>> >> DMMattsCoolDiscretizationTSSetIFunctionVolumetricQuadrature().
>> >
>> >
>> > It's not a DM.
>>
>> You access it via DMGetProblem() and your interfaces are analogous to
>> the DM-specific interfaces we have used in the past.
>>
>> You might recall that we had a discussion a couple years ago about
>> whether DM should be a "function space" or should hold the problem
>> definition (via callbacks).  I argued for separation and still think
>> that unadorned function spaces are useful, but others (especially Barry)
>> wanted to limit the number of objects.  And in fact, the physics
>> influences things like coarse-space homogenization and grid transfer
>> within solver algorithms, so now I have mixed opinions about it.  But in any
>> case, we made a decision and built code around the DM owning that stuff.
>> If you want to fundamentally change that model, it needs to be a
>> project-wide decision, implemented consistently across projects.
>>
>> From the two examples that use PetscProblem, I think the issue is that
>> you want it to work with DMDA and DMPlex because the formulation does
>> not depend on grid topology, but the naming convention for
>> implementation-specific functions uses different namespaces.  But we can
>> have DM functions that are implemented by multiple implementations.  So
>> maybe this maps to
>> DMTSSetIFunctionMattsCoolDiscretizationVolumetricQuadrature().
>
>
> 1) Volumetric here is inappropriate because it supports boundary integrals

Through a different function, PetscProblemSetBdResidual().  (I will also
complain that this does not support different boundary types.)

> 2) It is supposed to support both FEM and FVM

But you need new functions for FV because PetscProblemSetResidual
definitely does not support it.
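
The mismatch is structural: a pointwise FE residual sees one state and
its gradient at a quadrature point, while an FV method needs two
reconstructed states meeting at a face.  Roughly (these signatures are
illustrative, not the ones in your branch):

  /* FE-style pointwise residual: one state at a quadrature point. */
  void f0(const PetscScalar u[], const PetscScalar gradU[],
          const PetscReal x[], PetscScalar f0[]);

  /* FV-style numerical flux: two states across a face with normal n;
     different arguments, no gradient, and it needs a context. */
  void riemann(const PetscScalar uL[], const PetscScalar uR[],
               const PetscReal x[], const PetscReal n[],
               PetscScalar flux[], void *ctx);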

> 3) It is a sub-object, just like PetscLayout, meant to operate in service
> of a DM. It's never used apart from one, just like PetscLayout and Vec/Mat.
> Layout is also a generic name, but Layout only supports the simplest kind
> of partition, and thus cannot do more complex things, yet we did not have
> a huge discussion about it. I still fail to see how this is a big deal.

PetscLayout contains a nontrivial amount of data and can be shared
between multiple objects (and we should do a better job of that sharing,
because PetscLayout storage is not trivial at a million MPI ranks if
each object has its own copy).
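
The sharing mechanism already exists; a minimal sketch (error checking
elided, N assumed to be the global size):

  PetscLayout map, rmapA = NULL, rmapB = NULL;
  PetscLayoutCreate(PETSC_COMM_WORLD, &map);
  PetscLayoutSetSize(map, N);
  PetscLayoutSetUp(map);
  /* Both holders reference the same layout, so the O(P) range array on
     P ranks is stored once instead of once per object. */
  PetscLayoutReference(map, &rmapA);
  PetscLayoutReference(map, &rmapB);
  PetscLayoutDestroy(&map);  /* storage lives until the last reference */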

We added the DMTS, DMSNES, etc., to hold functions of this sort.  Those
are shared between nested solvers via a DM.  Now you're adding a new
interface that does not explicitly name a problem level (TS, SNES, etc.)
or a class of discretizations.  (I'm using "discretization" here in a
general sense, independent of mesh topology.)  This interface
fragmentation is inconsistent and forces the user to deal with yet
another object.
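
Compare with the existing pattern, where the callback lives on the DM's
DMTS, names its problem level, and is seen by every nested solver that
holds the same DM (a minimal sketch, with dm, ts, and user assumed in
scope):

  extern PetscErrorCode MyIFunction(TS, PetscReal, Vec, Vec, Vec, void *);

  TSSetDM(ts, dm);
  /* Attached to the DMTS: the TS, its inner SNES (via SNESGetDM), and
     any nested levels all resolve the same problem definition. */
  DMTSSetIFunction(dm, MyIFunction, &user);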