[Jan. 9th, 2012|08:10 pm]
Jason
Reading a bit about Ray Solomonoff, I was reminded of how funny the universality of computation really is.
WARNING WARNING PROBABLY NOT A VALID PROOF OF ANYTHING
For example: let a computable function f : N → 2 be given; that is, an infinite stream of bits. We're going to play the game of guessing the next bit of f, given the first n bits. So, given the first n bits (that is, the values of f on 0, ..., n-1), we simulate the n smallest Turing machines on all inputs 0, ..., n for n steps each, and take the smallest Turing machine g in that set that satisfies (a) g(n) terminates (within the n steps that we waited for it) and (b) g(0), ..., g(n-1) are consistent with the bits we've seen so far (i.e., for all i < n, either g(i) = f(i) or g(i) doesn't terminate within the budget). Then we use g(n) as our guess.
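
In case it helps, here's a rough Python sketch of the guessing game. Real Turing machines are stood in for by generators (one yield per simulated step, a return for halting with an output bit), and a machine's position in a list plays the role of its Gödel number. The names run, guess, and MACHINES are all made up for illustration, and "doesn't terminate" in condition (b) is approximated by "doesn't terminate within the same n-step budget".

    def run(machine, i, steps):
        # Simulate `machine` on input i for at most `steps` steps.
        # Returns its output bit, or None if it hasn't halted yet.
        gen = machine(i)
        try:
            for _ in range(steps):
                next(gen)
        except StopIteration as halt:
            return halt.value
        return None  # ran out of budget without halting

    def guess(history, machines):
        # history = [f(0), ..., f(n-1)]; guess f(n) per the scheme above.
        n = len(history)
        for g in machines[:n]:  # the n "smallest" machines, in order
            bit = run(g, n, n)
            if bit is None:
                continue  # fails (a): g(n) didn't halt within n steps
            # (b): g agrees with every bit seen so far, or hasn't halted on it
            if all(run(g, i, n) in (None, history[i]) for i in range(n)):
                return bit
        return None  # no candidate machine qualified at this stage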
If I haven't slipped up somewhere, this guessing game has the property that it will eventually always be right! Constructively, it is a contradiction for it to make an infinite number of errors. For let N be the Gödel number of a Turing machine that computes f. We will never choose a Turing machine with Gödel number greater than N for our guess, and each of the Turing machines below N can only fool us once before it is invalidated forever: the moment its output on some input halts and disagrees with a bit of f, condition (b) rules it out at every later stage. Any Turing machine below N that never fools us is actually never *wrong*, and so is just a witness that our original N wasn't minimal, which is fine.
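
And a toy run of the finitely-many-errors claim, using three fake machines: one that computes f(i) = i mod 2 but slowly, a fast one that always answers 0, and one that never halts. This is obviously not a real enumeration of Turing machines, just enough to watch the errors stop.

    def true_machine(i):
        # computes f(i) = i % 2, but takes 3 "steps" on every input
        for _ in range(3):
            yield
        return i % 2

    def all_zeros(i):
        yield
        return 0  # halts fast, but is wrong on every odd input

    def looper(i):
        while True:
            yield  # never halts: fails (a) forever, so it's never chosen

    MACHINES = [all_zeros, looper, true_machine]  # index ~ toy Gödel number

    def f(i):
        return i % 2

    history, errors = [], []
    for n in range(20):
        if guess(history, MACHINES) != f(n):
            errors.append(n)  # a None guess counts as an error too
        history.append(f(n))
    print("wrong at stages:", errors)  # [0, 1, 2, 3], then right forever

The early errors are exactly the stages where true_machine either isn't among the n smallest candidates yet or can't finish within the n-step budget; once it qualifies, all_zeros has already been invalidated by f(1) = 1, and the guesser is correct from then on.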