[petsc-dev] ugliness due to missing lapack routines

Jed Brown jedbrown at mcs.anl.gov
Thu Feb 7 23:28:54 CST 2013


On Thu, Feb 7, 2013 at 11:01 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>  No! The tools are going to DO type inference! Why not?
>

"tools", "compiler", "preprocessor". I don't care what name you use; they
all mean the same thing. You're not manipulating plain C because the
semantics aren't plain C; they involve the output of the preprocessor.


> >
> > 2. Somewhere we have to decide between static and dynamic typing. Will
> > your language have a JIT?
>
>    Pretty much all static I think is fine.
>

We do a lot of dynamic stuff now, modifying types at run time (injecting
methods, etc.).
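
For concreteness, here is a minimal sketch of the kind of run-time method
injection meant here (not actual PETSc source; Mat/MatOps below are
stand-ins for the real structs): behavior lives in a table of function
pointers that any later code can overwrite.

  /* Minimal sketch, not PETSc source: run-time method injection through a
   * function-pointer table.  Mat/MatOps are stand-ins, not the real types. */
  #include <stdio.h>

  typedef struct {
    int (*mult)(void *mat, const double *x, double *y);
  } MatOps;

  typedef struct {
    MatOps ops;  /* dispatch table, filled in and overridden at run time */
    int    n;
  } Mat;

  static int MatMult_Basic(void *m, const double *x, double *y)
  {
    Mat *A = (Mat *)m;
    for (int i = 0; i < A->n; i++) y[i] = 2.0 * x[i];
    return 0;
  }

  static int MatMult_Shifted(void *m, const double *x, double *y)
  {
    Mat *A = (Mat *)m;
    for (int i = 0; i < A->n; i++) y[i] = 2.0 * x[i] + 1.0;
    return 0;
  }

  int main(void)
  {
    double x[2] = {1.0, 2.0}, y[2];
    Mat A = {{MatMult_Basic}, 2};
    A.ops.mult(&A, x, y);            /* dispatches to MatMult_Basic   */
    A.ops.mult = MatMult_Shifted;    /* "inject" a different method   */
    A.ops.mult(&A, x, y);            /* same call site, new behavior  */
    printf("y = (%g, %g)\n", y[0], y[1]);
    return 0;
  }

No static tool looking at the first call site can know which implementation
will run; that is the dynamic part a purely static translator would have to
give up or reconstruct.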


>    So today you use two languages, one absolutely horrible.  The only way
> to represent something together in them is as ASCII text! There is no AST
> for CPP + C, nor will there ever be. That means there is no way to
> manipulate them except as ASCII text (which is why the only manipulation we
> do on PETSc source code is using an editor or using regular expressions)
> (and in this circumstance using regular expressions is pretty limited).
> This means the only improvement and expansion of our code base we can do is
> by programmers manually typing away. This is not good enough.
>

1. "metaprogramming" is still "programming"

2. Your tool is producing new semantics. If you're writing code so that it
transforms in some way when a tool is applied, you're writing in a new
language.


>
>   Here is an explicit example, the 2 by 2 SNES problem. If PETSc were
> coded without any dynamic polymorphism we could essentially inline
> everything so the Newton loop and the direct solve inside it were all in
> one block of code and the compiler could do the best it could on it.
> Because of the polymorphism through function pointers in PETSc this could
> not happen today. So what could I do? I could open an editor, cut and paste
> the source code for the MatLUFactor_Dense() (just pretend for this example
> it is source code and doesn't call LAPACK) and drop it directly into
> MatLUFactor(), I could do the same thing for MatSolve_Dense(), for
> PCApply_LU() into PCApply(), VecAXPY_Seq() into VecAXPY(), SNESSolve_LS()
> into SNESSolve() etc. Now I want to go in and hardwire the loop size to two
> in the various places, maybe even strip out the loop part of the code
> completely in cases where the compiler optimizer sucks.
>

1. What language will you write all these "desired transformations" in?

2. I think that fusion belongs in a JIT so that you can still interact with
it using the normal run-time interface.

3. I think the JIT should be used _only_ for optimization. I would implement
it by using a macro (yeah, yeah) to record the function name/file/line-number
associated with each dynamically registered function pointer (which would
then be stored in a struct alongside the function pointer). Then I would always use
a special dispatch construct when I call or assign such "location-aware
function pointers". The JIT would then take a function of this sort and
chase down all the function pointers, gathering the call graph into one
compilation unit and replacing each dynamic call with a static call. We
have to _control_ use of CPP for this to work, but many uses of CPP are
harmless for this sort of thing.

The function-pointer resolution could be done using Clang or even a tool
like pycparser.
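
A rough sketch of what such a registration macro and dispatch construct could
look like (PetscFnPtr, PetscRegisterFn, and PetscDispatch are invented names
for illustration, not an existing PETSc API):

  #include <stdio.h>

  typedef void (*GenericFn)(void);

  typedef struct {
    GenericFn   fptr;  /* the registered implementation                */
    const char *name;  /* symbol name, so a tool can locate its source */
    const char *file;
    int         line;
  } PetscFnPtr;

  /* CPP used only for bookkeeping: remember what was registered and where. */
  #define PetscRegisterFn(slot, fn) \
    do { (slot).fptr = (GenericFn)(fn); (slot).name = #fn; \
         (slot).file = __FILE__;        (slot).line = __LINE__; } while (0)

  /* The dispatch construct: an ordinary indirect call at run time, but a
   * tool that recognizes this macro can rewrite it as a direct call.     */
  #define PetscDispatch(slot, fntype, ...) (((fntype)(slot).fptr)(__VA_ARGS__))

  typedef int (*SolveFn)(int);

  static int SNESSolve_LS_Demo(int n) { printf("solving, n = %d\n", n); return 0; }

  int main(void)
  {
    PetscFnPtr solve;
    PetscRegisterFn(solve, SNESSolve_LS_Demo);
    /* A JIT could read solve.name/file/line, pull the callee's source into
     * the same compilation unit, and replace this with a static call to
     * SNESSolve_LS_Demo(2).                                               */
    return PetscDispatch(solve, SolveFn, 2);
  }

The run-time behavior is unchanged; the extra name/file/line fields exist
only so an offline tool or JIT can locate the callee's source and specialize
the call site.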


> Now there would be no function pointer polymorphism or complicated code
> and (in theory) the compiler could clump everything together and output
> good object code.
>
>   Now please tell me a better way today, one that is more productive in
> programmers' time, to do this. I could try regular expressions, but they are
> not likely to work and would surely be fragile.
>
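
For reference, here is a self-contained sketch of what the fully specialized
version of the 2 by 2 example might look like once the SNES/KSP/PC/Mat/Vec
layers are flattened and the size is hardwired (the residual F and Jacobian J
below are invented purely for illustration, and no pivoting is done):

  /* Hand-specialized 2x2 Newton iteration in the spirit of the example:
   * the LU factorization, triangular solves, AXPY, and Newton loop are
   * all inlined into one block with the problem size hardwired to 2.   */
  #include <math.h>
  #include <stdio.h>

  static void F(const double x[2], double f[2])   /* made-up residual  */
  {
    f[0] = x[0] * x[0] + x[1] - 3.0;
    f[1] = x[0] - x[1] * x[1] + 1.0;
  }

  static void J(const double x[2], double a[2][2]) /* made-up Jacobian */
  {
    a[0][0] = 2.0 * x[0]; a[0][1] = 1.0;
    a[1][0] = 1.0;        a[1][1] = -2.0 * x[1];
  }

  int main(void)
  {
    double x[2] = {1.0, 1.0};
    for (int it = 0; it < 20; it++) {
      double f[2], a[2][2];
      F(x, f);
      if (hypot(f[0], f[1]) < 1e-12) break;
      J(x, a);
      /* LU factor + solve of J dx = f, unrolled for n = 2 */
      double l10 = a[1][0] / a[0][0];
      double u11 = a[1][1] - l10 * a[0][1];
      double y1  = f[1] - l10 * f[0];
      double dx1 = y1 / u11;
      double dx0 = (f[0] - a[0][1] * dx1) / a[0][0];
      /* Newton update: the inlined VecAXPY with alpha = -1 */
      x[0] -= dx0; x[1] -= dx1;
    }
    printf("x = (%g, %g)\n", x[0], x[1]);
    return 0;
  }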