I'm working with interval arithmetic.

Intervals have the annoying arithmetic behavior of tending to get wider. So [1..3]+[2..5] ends up as [3..8].

On the other hand, if I'm trying to compute the "cover" of an area, this tendency to cover more area is what I want.

On the third hand, it seems that singleton intervals mirror ordinary arithmetic. So [1..1]+[2..2] ends up as [3..3]. Since I have addition, it seems I get multiplication.

Which leads me to the question... is type theory Turing complete?
Can I compute any result using only types?
Can I create a "type computer"?
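A minimal sketch of the endpoint-wise interval addition described above (Interval and addI are illustrative names, not from any particular library):

```haskell
-- Closed intervals represented by their endpoints.
data Interval = Interval { lo :: Double, hi :: Double }
  deriving (Eq, Show)

-- Endpoint-wise addition: widths add, so sums tend to get wider
-- unless both operands are singletons.
addI :: Interval -> Interval -> Interval
addI (Interval a b) (Interval c d) = Interval (a + c) (b + d)

main :: IO ()
main = do
  print (addI (Interval 1 3) (Interval 2 5))  -- Interval {lo = 3.0, hi = 8.0}, width 5
  print (addI (Interval 1 1) (Interval 2 2))  -- Interval {lo = 3.0, hi = 3.0}, a singleton
```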
Not if the system is strongly normalizing, which most type systems aim for as a design objective: https://en.wikipedia.org/wiki/Normal_form_(abstract_rewriting)
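For a concrete sense of what "computing with types" can look like, here is a minimal sketch using GHC's closed type families (Nat, Add, and check are illustrative names, not anything from the linked page). GHC reduces Add entirely during type checking, and without escape hatches like UndecidableInstances such reductions are required to terminate, which is the strong-normalization point:

```haskell
{-# LANGUAGE DataKinds, TypeFamilies #-}

import Data.Proxy (Proxy)

-- Type-level naturals, promoted to the kind level by DataKinds.
data Nat = Z | S Nat

-- Addition computed entirely by the type checker.
type family Add (a :: Nat) (b :: Nat) :: Nat where
  Add 'Z     b = b
  Add ('S a) b = 'S (Add a b)

type One   = 'S 'Z
type Two   = 'S ('S 'Z)
type Three = 'S ('S ('S 'Z))

-- This only type-checks because GHC reduces Add One Two
-- to Three at compile time.
check :: Proxy (Add One Two) -> Proxy Three
check = id

main :: IO ()
main = putStrLn "Add One Two reduced to Three at compile time."
```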