A thought on dependent types:<p>Can a dependent type system catch <i>all</i> type errors at compile time? For example, suppose I write the following (in pseudo-code):<p><pre><code> // Variable x is an integer greater than or equal to 0 and less than 256.
int x (>=0, <256)
x = 128 // This is valid.
x = x * 3 // This violates the type.
</code></pre>
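For reference, here's roughly how I'd approximate that type in ordinary Haskell today (all names are mine): a newtype whose only way in is a checking function. The check runs at runtime, though, which is exactly what a dependent type system would move to compile time.<p><pre><code> -- Holds only values in 0..255, because mkByte is the only way to build one.
 newtype Byte = Byte Int deriving Show

 mkByte :: Int -> Maybe Byte
 mkByte n
   | n >= 0 && n < 256 = Just (Byte n)
   | otherwise         = Nothing

 ok, bad :: Maybe Byte
 ok  = mkByte 128        -- Just (Byte 128)
 bad = mkByte (128 * 3)  -- Nothing: 384 is out of range, but only discovered at runtime
</code></pre>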
I can imagine how a compiler could catch that kind of error. But that's trivial. What happens in programs like this:<p><pre><code> int x (>= 0, <=10)
x = parseInt(getKeyboardInput)
</code></pre>
Now the compiler can't know for sure whether the type has been violated, because the value of getKeyboardInput could be anything. To take a page from Haskell, you could do something like this (which is still pseudocode, not valid Haskell):<p><pre><code> // x is a value that is either 1) an int from 0 to 10, or 2) nothing at all.
maybe (int (>= 0, <=10)) x
// applyConstraint recognizes that parseInt may return a value violating x's constraints.
// Thus it transforms the return type of parseInt from int to maybe (int (>= 0, <=10)).
x = applyConstraint(parseInt(getKeyboardInput))
</code></pre>
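In real Haskell, the applyConstraint idea comes out as a smart constructor, with Maybe covering the parse step as well. Here's a sketch (all names are mine; readMaybe stands in for parseInt and getLine for getKeyboardInput):<p><pre><code> import Text.Read (readMaybe)

 -- The constrained type: applyConstraint is the only way to build one.
 newtype SmallInt = SmallInt Int deriving Show   -- intended range: 0..10

 applyConstraint :: Int -> Maybe SmallInt
 applyConstraint n
   | n >= 0 && n <= 10 = Just (SmallInt n)
   | otherwise         = Nothing

 main :: IO ()
 main = do
   line <- getLine                              -- "getKeyboardInput"
   let x = readMaybe line >>= applyConstraint   -- parseInt, then the range check
   print x                                      -- Just (SmallInt n) or Nothing
</code></pre>
Because x is a Maybe, the compiler forces every later use of it to handle the Nothing case, so a violation shows up as a type obligation rather than a bad value flowing through the program.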
Or perhaps applyConstraint wouldn't have to be called explicitly, but would be implicitly added by the compiler as needed. I'm not sure which is better stylistically.<p>Either way, applyConstraint would be required any time a computation could return an invalid value. That would get tricky, because the compiler would have to track the constraints on every variable, even where those constraints aren't declared. For example:<p><pre><code> int w (>= 0, <= 10)
int x (>= 0, <= 2)
int y
int z (>= 0, <= 20)
y = w * x
z = y
</code></pre>
Here, the compiler would have to infer from the assignment "y = w * x" that y is always between 0 and 20 (I sketch below what that bookkeeping might look like).<p>Do any languages currently take the idea this far (or further)?
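To make that last bit concrete, here's the kind of interval bookkeeping I'm imagining, sketched in plain Haskell (all names are mine, and it runs at runtime here, whereas the compiler would do it during type checking):<p><pre><code> -- The range of values a variable can take.
 data Range = Range { lo :: Int, hi :: Int } deriving Show

 -- The range of a product, given the ranges of its operands.
 mulRange :: Range -> Range -> Range
 mulRange (Range a b) (Range c d) = Range (minimum ps) (maximum ps)
   where ps = [a * c, a * d, b * c, b * d]

 -- Does every value in the first range satisfy the second range's constraint?
 fitsIn :: Range -> Range -> Bool
 fitsIn inner outer = lo inner >= lo outer && hi inner <= hi outer

 main :: IO ()
 main = do
   let w = Range 0 10
       x = Range 0 2
       y = mulRange w x       -- inferred for "y = w * x": Range 0 20
       z = Range 0 20         -- z's declared constraint
   print y                    -- Range {lo = 0, hi = 20}
   print (y `fitsIn` z)       -- True, so "z = y" is safe
</code></pre>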