Wow, that takes me back. Back to a conference room where we were talking about Integers in Java. If you made it a class, an Integer could carry along all this other stuff about how big it was, what the operators were, etc. But generating code for it was painful, because your code had to do all of these checks when 99% of the time you probably just wanted the native integer implementation of the CPU. And Booleans: were they their own type, or just a 1-bit Integer? And did that make an enum {foo, bar, baz, bletch, blech, barf, bingo} just a 3-bit integer?

Integers as types can compile quickly, but then you need multiple types to handle the multiple cases. Essentially you have pre-decoded the size by making it into a type (see the first sketch below).

At one point you had class Number, subclasses Real, Cardinal, and Complex, and within those a constructor which defined their precision; I've sketched what I remember of it below. But I think everyone agreed it wasn't going to replace Fortran.

The scripting languages get pretty close to making this a non-visible thing, at the cost of some execution speed. Swift took it to an extreme, which I understand, but I probably wouldn't have gone there myself. The old char, short, long types seem so quaint now.
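For what it's worth, Java ended up shipping both halves of that argument, and the tradeoff is easy to show in plain Java (nothing hypothetical here):

    public class IntegerTradeoff {
        public static void main(String[] args) {
            // The "pre-decoded" sizes, each its own primitive type:
            byte  b = 1;   // 8 bits
            short s = 2;   // 16 bits
            int   i = 3;   // 32 bits: the native width you wanted 99% of the time
            long  l = 4L;  // 64 bits

            // The class version really does carry along how big it is:
            System.out.println(Integer.SIZE);       // 32
            System.out.println(Integer.MAX_VALUE);  // 2147483647

            // But every boxed value is a heap object, so arithmetic on
            // Integer quietly unboxes, computes on the native int, and
            // reboxes; those are exactly the checks that made codegen painful.
            Integer a = 1_000_000;
            Integer c = a + a;  // unbox, native add, box again
        }
    }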
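And the Number hierarchy might have looked something like the following. The class names (Number, Real, Cardinal, Complex) and the precision-in-the-constructor idea are from that design; the fields and methods are my reconstruction, and I've renamed Number to Num so it doesn't collide with java.lang.Number:

    abstract class Num {
        protected final int precision;  // bits of precision, fixed at construction
        protected Num(int precision) { this.precision = precision; }
        abstract Num add(Num other);    // every operator as a method: my guess at the shape
    }

    class Real extends Num {
        final double value;
        Real(int precision, double value) { super(precision); this.value = value; }
        Num add(Num other) { return new Real(precision, value + ((Real) other).value); }
    }

    class Cardinal extends Num {        // non-negative integers (Java has no unsigned, so intent only)
        final long value;
        Cardinal(int precision, long value) { super(precision); this.value = value; }
        Num add(Num other) { return new Cardinal(precision, value + ((Cardinal) other).value); }
    }

    class Complex extends Num {
        final double re, im;
        Complex(int precision, double re, double im) { super(precision); this.re = re; this.im = im; }
        Num add(Num other) {
            Complex o = (Complex) other;
            return new Complex(precision, re + o.re, im + o.im);
        }
    }

Every add() here goes through dynamic dispatch plus a cast check, which is exactly why generating code for it was painful next to a single native add instruction.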