<html>
<head>
<meta content="text/html; charset=utf-8" http-equiv="Content-Type">
</head>
<body bgcolor="#FFFFFF" text="#000000">
Based on your points here and earlier, Tim, I agree with going with
option #1, so count this as a +1 for that.<br>
<br>
Function references give us something we cannot otherwise do, while
overloaded functions are just a convenience (which not everyone will
like).<br>
<br>
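To make that distinction concrete - a rough sketch in C++ rather than
Swift/T, with made-up names (max2, f): once a name is overloaded, the
bare name no longer denotes a single function value, so passing it to a
higher-order function such as a fold needs explicit disambiguation,
which is exactly what a first-class function reference is supposed to
give us directly.<br>
<pre>
#include &lt;numeric&gt;
#include &lt;vector&gt;

// Two overloads sharing one name (stand-ins for an overloaded max()).
int    max2(int a, int b)       { return a > b ? a : b; }
double max2(double a, double b) { return a > b ? a : b; }

int main() {
    std::vector&lt;int&gt; v{3, 1, 4, 1, 5};
    // auto f = max2;            // error: which overload does the name mean?
    int (*f)(int, int) = max2;   // OK only once a target type picks one
    int m = std::accumulate(v.begin() + 1, v.end(), v.front(), f);
    return m == 5 ? 0 : 1;
}
</pre>
Tim's reduce/max composition below runs into the same thing, presumably
because once max names an overload set rather than a single function,
there is no longer one obvious thing a reference to it means.<br>
<br>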
How does this apply to intrinsic functions in the library, though -
e.g., strcat(), which you could view as an overloaded function?<br>
<br>
Can we address this more cleanly by introducing a type "var"? (I
can't recall where we left the prior discussion on that issue.) Also,
are we dipping into the tangentially related issue of var-args
(a varying number of arguments) here?<br>
<br>
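On the var-args point, a rough sketch of why it is a separate mechanism
from overloading (again C++, and cat_all is a made-up name - I'm not
claiming this is how the library declares strcat): a single variadic
declaration is one function that accepts any number of arguments,
whereas overloading is several distinct functions sharing a name.<br>
<pre>
#include &lt;sstream&gt;
#include &lt;string&gt;

// One variadic declaration covers every arity; no overload set needed.
template &lt;typename... Args&gt;
std::string cat_all(const Args&amp;... args) {
    std::ostringstream out;
    (out &lt;&lt; ... &lt;&lt; args);   // C++17 fold expression over the arguments
    return out.str();
}

int main() {
    return cat_all("a", "b", "c") == "abc" ? 0 : 1;
}
</pre>
<br>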
- Mike<br>
<br>
P.S. I prefer the approach of a "simple and consistent but powerful
set of core rules that are orthogonal and compose well together", in
the C, Haskell, Python vein.<br>
<br>
<br>
<div class="moz-cite-prefix">On 1/28/15 9:01 AM, Tim Armstrong
wrote:<br>
</div>
<blockquote
cite="mid:CAC0jiV5GuoX7ttBA5pC+wJdm1mRkD29_ft+Uqg9zM1VWp=uMSA@mail.gmail.com"
type="cite">
<div dir="ltr">
<div>Mike - that's technically possible to implement but I don't
see the cost/benefit stacking up that well - I think any new
syntactic constructs come with a non-negligible
implementation/documentation/user attention cost that can
quickly accumulate if we're not selective about what we put
into the language (see C++). Tangentially, that particular
syntax is difficult to parse unambiguously - you could be
trying to index a variable called int and pass it into max. <br>
<br>
</div>
<div>The problem I see with #3 is that we're allowing overloaded
functions so that the standard library seems slightly
neater/more friendly. If that results in a bunch of
exceptions where users can't do something that "ought to" work
- composing two standard library functions such as reduce and max,
say - then I just don't see enough of a net benefit (if any) to the
programmer experience to justify the implementation burden. <br>
<br>
This of course is somewhat subjective - I tend to like
languages where there is a simple and consistent but powerful
set of core rules that are orthogonal and compose well
together - you can learn the basic rules and don't have to
worry about many exceptions. Other language designs will
special-case common constructs so that use-cases that the
language designer expects to be common will be shorter or
"cuter", or add features that don't compose well with other
features and come with caveats/exceptions. I think C, Haskell
and Python are examples of the former, and Perl, R and C++ are
examples of the latter. I'm not sure which camp we see
ourselves in.<br>
<br>
</div>
<div>I haven't found complete documentation on Apple Swift's
type inference, but it doesn't seem like it does anything very
sophisticated - Swift/T already does similar local type
inference.<br>
<br>
</div>
<div>- Tim<br>
</div>
</div>
</blockquote>
<br>
<pre class="moz-signature" cols="72">--
Michael Wilde
Mathematics and Computer Science        Computation Institute
Argonne National Laboratory             The University of Chicago
</pre>
</body>
</html>