Jason (jcreed) wrote,

A crazy thing from a student of John Baez:

"My student Mike Stay did computer science before he came to UCR. When he was applying, he mentioned a result he helped prove, which relates Goedel's theorem to the Heisenberg uncertainty principle:

2) C. S. Calude and M. A. Stay, From Heisenberg to Goedel via Chaitin, International Journal of Theoretical Physics, 44 (2005), 1053-1065. Also available at http://math.ucr.edu/~mike/

Now, this particular combination of topics is classic crackpot fodder. People think "Gee, uncertainty sounds like incompleteness, they're both limitations on knowledge - they must be related!" and go off the deep end. So I got pretty suspicious until I read his paper and saw it was CORRECT... at which point I definitely wanted him around! The connection they establish is not as precise as I'd like, but it's solid math."


(from http://math.ucr.edu/home/baez/week230.html)

A summary of Calude and Stay's result is this:

Chaitin's theorem says that it's hard to compute bits of Ω, the probability that a random Turing machine will halt. (You have to pick an arbitrary notion of what exactly a Turing machine is, and what the probability distribution over them is, but neither choice essentially matters.) Moreover, any formal system can only determine what finitely many bits of Ω are, and in fact how many it can determine is roughly the number of bits it took to write down that formal system.
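To make the "halting probability" part concrete, here's a toy sketch of how Ω is built up and why you can only approximate it from below. Everything in it is invented for illustration — the self-delimiting encoding, the Collatz-flavored "computation", the step budget are all assumptions of the sketch; the real Ω is defined over a universal prefix-free Turing machine, where no step budget ever settles the halting question.

```python
from itertools import product

# Toy sketch only: a made-up prefix-free "machine" so we can sum 2^(-|p|)
# over halting programs, the way Omega is defined. The encoding, the
# Collatz-style "computation", and the step budget are assumptions of this
# sketch, not the real construction.

def decode(program: str):
    """Self-delimiting encoding: k leading 1s, a 0, then exactly k payload bits.
    Returns the payload as an int, or None if the string isn't a valid program.
    Validity forces the total length, so the set of valid programs is prefix-free."""
    k = 0
    while k < len(program) and program[k] == "1":
        k += 1
    if k >= len(program) or len(program) != 2 * k + 1:
        return None
    payload = program[k + 1:]
    return int(payload, 2) if payload else 0

def halts(program: str, budget: int = 1000) -> bool:
    """Toy 'computation': iterate the Collatz map starting from (payload + 1) and
    say the program halts if it reaches 1 within the step budget."""
    v = decode(program)
    if v is None:
        return False
    x = v + 1
    for _ in range(budget):
        if x == 1:
            return True
        x = 3 * x + 1 if x % 2 else x // 2
    return False

def omega_lower_bound(max_len: int = 15) -> float:
    """Sum 2^(-|p|) over valid programs observed to halt: a monotone lower bound
    on this toy machine's halting probability."""
    total = 0.0
    for n in range(1, max_len + 1):
        for bits in product("01", repeat=n):
            p = "".join(bits)
            if halts(p):
                total += 2.0 ** (-n)
    return total

print(omega_lower_bound())
```

For the real Ω the same dovetailing loop still produces ever-better lower bounds, but no computable function of the budget tells you how many leading bits have stabilized — which is the sense in which Ω's bits are hard to compute.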

So from the point of view of algorithmic complexity theory, Ω really is an example of a random number. What "random" means is that if you want to write a program to spit out the first n digits of Ω (call it Ω_n), about the best you can do asymptotically is write a program that has an n-bit literal hard-coded into it; there aren't any significant computable patterns in Ω that you can take advantage of to write a shorter program. If you make the definitions Δ_n = 2^(-n) and Δ_C = 2^(size of the smallest program that computes Ω_n), then there exists a constant ε so that asymptotically

Δ_n Δ_C > ε
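For concreteness, here is the one-line derivation hiding behind that inequality, assuming the usual form of Chaitin's incompressibility result: there is a constant c (depending only on the machine) such that any program printing the first n bits of Ω has size at least n − c.

```latex
% Sketch, assuming Chaitin's incompressibility bound C(\Omega_n) \ge n - c,
% where C(\Omega_n) is the size of the smallest program printing the first
% n bits of \Omega and c is a machine-dependent constant.
\[
  \Delta_n \,\Delta_C
    \;=\; 2^{-n} \cdot 2^{\,C(\Omega_n)}
    \;\ge\; 2^{-n} \cdot 2^{\,n-c}
    \;=\; 2^{-c}
    \;=:\; \varepsilon .
\]
```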

Now you're supposed to squint at this and see that it's kind of like Heisenberg's classic uncertainty inequality

δp δx > ℏ

for uncertainties of momentum and position. On the algorithmic complexity side of things, we have Δ_n, our uncertainty about the value of Ω, and Δ_C, somehow a measure of our uncertainty about which program actually computes the approximation of Ω correctly. (Because there are roughly 2^n prefix-free programs of n bits, and remember that any program that computes Ω_n has to be asymptotically of size n.) You might validly complain at this point that this is just an artificial way of casting Chaitin's theorem, or merely the fact that there exist random numbers at all, and that we're just raising 2 to these powers to sneak in a multiplication.
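That complaint can be made precise: take log base 2 of both sides and the "uncertainty relation" collapses back into the additive incompressibility statement, so the multiplicative form is just an exponentiated restatement.

```latex
% Taking log_2 of the inequality above (a sketch, writing c = -log_2(epsilon)):
\[
  \Delta_n \,\Delta_C \ge \varepsilon
  \;\Longleftrightarrow\;
  -n + C(\Omega_n) \ge \log_2 \varepsilon
  \;\Longleftrightarrow\;
  C(\Omega_n) \ge n - c .
\]
```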

However, the paper also has a statistical gedankenexperiment that is supposed to bring the example much closer in line with how uncertainty principles arise in physics. Sadly, I don't know enough physics to really get that part.
Tags: math, physics