I found this post three years ago, and the code and math were so fascinating that it changed my life. I had not previously understood that computers could work with the abstract logic of infinite sets in any meaningful way, and it led me into the world of formalization of mathematics in the dependently typed programming language Agda, which is one of the current activities of the mathematician (Martin Escardo) who wrote this guest blog post.
The article mentions the inability to decide equality of functions. A paper [1] about proving properties of functional programs shows how to do that modulo the halting problem: e.g., if (\x -> f x == g x) halts, we can decide whether f and g are equal.

[1] http://xenon.kiam.ru/~roman/doc/2009-Klyuchnikov_Romanenko--Proving_the_Equivalence_of_Higher-Order_Terms_by_Means_of_Supercompilation--slides.pdf

Basically, if supercompilation of the expression above results in a huge nested case expression whose results are either True (termination) or an application of a function (no termination, due to infinite data structures, for example), then the functions are equivalent. Otherwise, if there is at least one False, they are not, and the path to that False provides a counterexample.
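As a toy illustration of that idea (my own example, not from the paper): take f = not . not and g = id. Supercompiling the comparison could plausibly residualize to a case expression over the argument whose leaves are all True, which is what licenses the conclusion that f and g are equivalent:

```haskell
-- Hypothetical toy example, not from the paper.
f, g :: Bool -> Bool
f = not . not
g = id

-- The expression whose supercompiled residual we would inspect.
eq :: Bool -> Bool
eq x = f x == g x

-- A plausible shape for that residual: every leaf is True,
-- so f and g are equivalent on all inputs.
eqResidual :: Bool -> Bool
eqResidual x = case x of
  True  -> True
  False -> True
```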
Can someone explain to me in plain English what this is?

I looked up Cantor sets, so I have a vague understanding... but there are many things I don't understand. Like, what does it mean to append a bit to a sequence in the Cantor set, and why is the result still in the Cantor set? What the heck is a total p - is it a function?
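From skimming the post, the relevant definitions seem to be roughly this (my own paraphrase, so the details may be off): an element of the Cantor space is an infinite bit sequence, represented as a function from naturals to bits, and the "appending" is really a cons, i.e. sticking a bit on the front, which yields another infinite bit sequence:

```haskell
-- My paraphrase of the post's definitions; details may be off.
type Natural = Integer
type Bit     = Int            -- intended to be 0 or 1
type Cantor  = Natural -> Bit -- an infinite sequence of bits

-- Prepend a bit to a sequence: the result is again an infinite
-- bit sequence, i.e. another element of the Cantor space.
(#) :: Bit -> Cantor -> Cantor
(x # a) 0 = x
(x # a) i = a (i - 1)
```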
This seems like little more than a parlor trick.

First, a true "exhaustive search" is actually impossible. The Cantor space is uncountable (which can be seen by prepending '0.' to every binary sequence, giving the real interval [0,1]). But an exhaustive search would visit all elements in some order, which provides an enumeration, so its existence would imply that the Cantor space is countable. Contradiction. The post gets around this by only looking at the first n elements of every binary sequence: f, g and h evaluate little more than the 7th digit. Sure, exhaustively searching 2^7 elements can be done in less than a second, so why is this new?

> In fact, e.g. the function type Integer -> Integer doesn't have decidable equality because of the Halting Problem, as is well known. However, common wisdom is not always correct, and, in fact, some other function types do have decidable equality, for example the type Cantor -> y for any type y with decidable equality, without contradicting Turing.

I fail to see how this can be true. We can set y = Integer, since integers have decidable equality. We can encode every integer as an element of the Cantor space by writing it in binary with the least significant digit first (using the 0th element of the sequence to distinguish positive and negative integers) and padding the sequence with zeros towards infinity. Then two functions f, g : Integer -> Integer are equal iff f . p and g . p : Cantor -> Integer are equal, where p : Cantor -> Integer is the corresponding decoding. Thus decidable equality for functions from the Cantor space would imply decidable equality for functions from the integers, which "solves" the halting problem. Again, the post gets around this by only looking at the first few elements... but equality of functions f, g : Integer -> Integer is trivially decidable if f and g are zero for integers bigger than some N, so how is this new?
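To make the encoding I'm describing concrete, here is a sketch (my own construction, not from the post) of mapping an integer to an element of the Cantor space:

```haskell
type Natural = Integer
type Bit     = Int
type Cantor  = Natural -> Bit

-- My own encoding for illustration: position 0 is the sign,
-- positions 1, 2, ... are the binary digits of the absolute value,
-- least significant first, padded with zeros forever.
encode :: Integer -> Cantor
encode n 0 = if n < 0 then 1 else 0
encode n i = fromIntegral ((abs n `div` 2 ^ (i - 1)) `mod` 2)
```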
The first thing to note is that a deterministic program that computes a boolean from an infinite bit string will either run in O(1) time for all inputs or have an input on which it doesn't halt.

Thus if you assume that the function is total, i.e. that it always halts, it must run in O(1) time.

To find an infinite input bit string on which the function returns True, we can just record which k = O(1) bits the function queries and then try all 2^k possibilities; since each evaluation of the function is O(1), the total time is also O(1). (We don't always have to try all 2^k possibilities.)
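For what it's worth, here is a sketch in Haskell of a search with that flavor. It is roughly the naive search from the post as I remember it (bits as Ints for brevity); instead of literally recording the queried bits, it branches on the first bit and recurses, which amounts to exploring exactly the bits the predicate actually inspects:

```haskell
type Natural = Integer
type Bit     = Int
type Cantor  = Natural -> Bit

-- Prepend a bit to an infinite bit sequence.
(#) :: Bit -> Cantor -> Cantor
(x # a) 0 = x
(x # a) i = a (i - 1)

-- Does some sequence satisfy p?  (Uses the witness produced by find.)
forsome :: (Cantor -> Bool) -> Bool
forsome p = p (find p)

-- Produce a sequence satisfying p, if any exists.
find :: (Cantor -> Bool) -> Cantor
find p =
  if forsome (\a -> p (0 # a))
    then 0 # find (\a -> p (0 # a))
    else 1 # find (\a -> p (1 # a))
```

Laziness is what makes this terminate for total predicates: p only ever forces finitely many bits of the sequence that find builds.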
> The Maybe type constructor is predefined by Haskell as [...]

It's hard for me to imagine a reader who understands the rest of this jargon, presented without explanation, but does not know what `Maybe` is.
Maybe I am not enough of a functional programmer, but I don't see what's impossible here. Any symbolic algebra system (like Wolfram Mathematica) can do such derivations, and much more.

Sure, this is interesting, but in the sense of "look at this emergent behavior -- such a simple system can produce unusually complex results", rather than "wow! no one could do such things with a computer before".
I find this very interesting, but I'm unable to understand it, so the result still seems impossible to me.

Can someone explain how find works if the predicate is something like: return True if and only if every other bit is 1?
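Concretely, the predicate I have in mind would be something like this (my own sketch, using the post's sequence-as-function representation):

```haskell
type Cantor = Integer -> Int  -- infinite bit sequence, bits 0 or 1

-- True iff the bit at every even position is 1.  This seems to need
-- inspecting infinitely many bits, which is exactly what confuses me.
everyOtherBit :: Cantor -> Bool
everyOtherBit a = and [ a (2 * i) == 1 | i <- [0 ..] ]
```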
Same ideas as the above: https://en.wikipedia.org/wiki/Computable_analysis