"My student Mike Stay did computer science before he came to UCR. When he was applying, he mentioned a result he helped prove, which relates Goedel's theorem to the Heisenberg uncertainty principle:

2) C. S. Calude and M. A. Stay, From Heisenberg to Goedel via Chaitin, International Journal of Theoretical Physics, 44 (2005), 1053-1065. Also available at http://math.ucr.edu/~mike/

Now, this particular combination of topics is classic crackpot fodder. People think "Gee, uncertainty sounds like incompleteness, they're both limitations on knowledge - they must be related!" and go off the deep end. So I got pretty suspicious until I read his paper and saw it was CORRECT... at which point I definitely wanted him around! The connection they establish is not as precise as I'd like, but it's solid math."

(from http://math.ucr.edu/home/baez/week230.html)

A summary of Calude and Stay's result is this:

Chaitin's theorem that it's hard to compute bits of Ω, the probability that a random turing machine will halt. (you have to pick an arbitrary notion of what exactly a turing machine is, and what the probability distribution over them is, but neither choice essentially matters) Moreover, any formal system can only determine what finitely many bits of Ω

*are*, and in fact how many it can determine is roughly the number of bits it took to write down that formal system.
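To make the halting-probability setup concrete, here is a minimal sketch in Python. A real Ω runs programs on a universal prefix-free machine, whose halting problem is undecidable; this toy substitutes Collatz iteration as stand-in dynamics (my choice for illustration, not anything from the paper) just to show how Ω is approximated from below by dovetailing over a prefix-free program set:

```python
from fractions import Fraction

def programs(max_payload_bits):
    # Prefix-free encoding: k ones, a zero, then k payload bits.
    # No valid program is a prefix of another, so the weights 2^-|p|
    # over any subset of programs sum to at most 1 (Kraft's inequality).
    for k in range(max_payload_bits + 1):
        for payload in range(2 ** k):
            yield k, payload  # program length |p| = 2*k + 1

def halts_within(m, steps):
    # Toy stand-in for an undecidable halting problem: iterate the
    # Collatz map from m and ask whether it reaches 1 within `steps`.
    # A real Omega would instead run program p on a universal machine.
    x = m
    for _ in range(steps):
        if x == 1:
            return True
        x = 3 * x + 1 if x % 2 else x // 2
    return x == 1

def omega_lower_bound(max_payload_bits, steps):
    # Dovetail: run every program for `steps` steps and add 2^-|p|
    # for each one that has halted so far. As `steps` and the program
    # bound grow, these values increase monotonically toward "Omega".
    total = Fraction(0)
    for k, payload in programs(max_payload_bits):
        if halts_within(payload + 1, steps):
            total += Fraction(1, 2 ** (2 * k + 1))
    return total

for t in (1, 5, 50):
    print(t, float(omega_lower_bound(8, t)))
```

The catch, of course, is that for the genuine Ω you never know how long to wait, so the lower bounds converge but you can never certify how close you are.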

So from the point of view of algorithmic complexity theory, Ω really is an example of a random number. What "random" means is that if you want to write a program to spit out the first n digits of Ω (call them Ω_n), about the best you can do asymptotically is write a program that has an n-bit literal hard-coded into it; there aren't any significant computable patterns in Ω that you can take advantage of to write a shorter program. If you make the definitions

Δ_n = 2^{-n} and Δ_C = 2^{(size of the smallest program that computes Ω_n)},

then there exists a constant ε so that asymptotically

Δ_n Δ_C > ε.
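Spelling out where the constant comes from (a sketch, writing K(Ω_n) for the size of the smallest program that computes Ω_n, and using the fact above that K(Ω_n) ≥ n − c for some constant c depending only on the machine):

```latex
\[
\Delta_n \,\Delta_C
  \;=\; 2^{-n}\cdot 2^{K(\Omega_n)}
  \;\ge\; 2^{-n}\cdot 2^{\,n-c}
  \;=\; 2^{-c}
  \;=:\; \varepsilon .
\]
```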

Now you're supposed to squint at this and see that it's kind of like Heisenberg's classic uncertainty inequality

δ_p δ_x > ℏ

for uncertainties of momentum and position. On the algorithmic complexity side of things, we have Δ_n, our uncertainty about the value of Ω, while Δ_C is somehow a measure of our uncertainty about which program actually computes the approximation of Ω correctly (because there are roughly 2^n prefix-free programs of n bits, and remember that any program that computes Ω_n has to be asymptotically of size n). You might validly complain at this point that this is just an artificial way of casting Chaitin's theorem, or merely the fact that there exist random numbers at all, and that we're just raising 2 to these powers to sneak in a multiplication.

However, the paper also contains a statistical gedankenexperiment that is supposed to bring the example much closer in line with how uncertainty principles arise in physics. Sadly I don't know enough physics to really get that part.