[Swift-devel] numeric types still

Ben Clifford benc at hawaga.org.uk
Fri Jun 20 05:41:32 CDT 2008


Numeric types bother me still. Substantially more so now that my GSoC 
student Milena has been working on typechecking and has reached the point 
where the somewhat vague definitions of numerical types and implicit 
casting are causing trouble.

Option A:

I'm very much in favour of a single numeric type, represented internally 
by a java.lang.Double.

This is very much in line with what the implementation does at the moment 
(for example the thread Tibi started the other day, where @extractint 
actually uses java.lang.Double and can be used in either an int or a float 
context in SwiftScript).
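
For illustration, a minimal sketch of that approach (the class and method 
names here are invented; this is not the actual @extractint code):

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    class Extract {
        // Hypothetical name, for illustration only.
        static Double extractNumber(String path) throws IOException {
            BufferedReader r = new BufferedReader(new FileReader(path));
            try {
                // One internal representation (Double) serves both the
                // int and the float contexts on the SwiftScript side.
                return Double.parseDouble(r.readLine().trim());
            } finally {
                r.close();
            }
        }
    }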

This would change (though not necessarily make worse) what happens when 
integers grow beyond the range in which they can be represented exactly.

With integer representation, max_int + 1 rolls round to min_int, those 
being 2^31 - 1 and -2^31 respectively.
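
That wraparound is easy to demonstrate in Java itself:

    class IntWrap {
        public static void main(String[] args) {
            System.out.println(Integer.MAX_VALUE);     // 2147483647 = 2^31 - 1
            System.out.println(Integer.MAX_VALUE + 1); // -2147483648 = -2^31
        }
    }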

With double representation (assuming IEEE754), every integer up to 2^53 
is exactly representable - larger than an int can store, though smaller 
than a long's range. Past that point the failure mode changes: 
2^53 + 1 = 2^53, so incrementing gets stuck rather than rolling round.
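
Concretely, the sticking point looks like this:

    class DoubleStuck {
        public static void main(String[] args) {
            double d = Math.pow(2, 53);     // 9007199254740992
            System.out.println(d + 1 == d); // true: the increment is lost
            System.out.println(d + 2 == d); // false: 2^53 + 2 is representable
        }
    }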

These numbers are large enough that they are unlikely to cause problems 
in real usage of SwiftScript any time soon; should accuracy prove a 
problem, there is a clear path from double to some other more accurate 
general-purpose numerical representation that does not massively change 
the language (for example, switching to java.lang.BigDecimal).
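
For example, the BigDecimal route removes the sticking point entirely 
(just an illustration of the arithmetic, not a proposed API):

    import java.math.BigDecimal;

    class Exact {
        public static void main(String[] args) {
            // Arbitrary precision: no analogue of the 2^53 limit.
            BigDecimal big = new BigDecimal(2).pow(53);
            System.out.println(big.add(BigDecimal.ONE)); // 9007199254740993
        }
    }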

On the downside, it changes type names - float and int both become 
'number' or some such. However, float and int could be deprecated and 
internally aliased to 'number'.

Given how numbers are actually used in SwiftScript - as counters and 
things like that - I think a rich numerical type system is not needed, 
and this option requires only a straightforward code change to implement.

Option B:

We could instead define when it is appropriate to implicitly cast between 
different types of numbers and in which direction.
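
To make the cost concrete, here is the kind of rule someone would have to 
write down and defend, even for just two types (names invented here; this 
mirrors Java's own binary numeric promotion rather than anything agreed 
for SwiftScript):

    enum NumType { INT, FLOAT }

    class Promote {
        // Mixed int/float arithmetic widens to float, as in Java.
        static NumType promote(NumType a, NumType b) {
            if (a == NumType.FLOAT || b == NumType.FLOAT) return NumType.FLOAT;
            return NumType.INT;
        }
    }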

I think this is more difficult to implement - it needs someone interested 
enough to think through and define the casting rules, and I don't believe 
anyone is.

I think this also would increase, rather than decrease, the complexity of 
the code base.

As you can probably tell, I am not in favour of option B.

I'm going offline this weekend, but Milena is lurking on this list and 
might even come out of hiding to discuss further.
