Jason (jcreed) wrote,

In the evening I poked around in a copy of Nielsen and Chuang that the library had, and I was blown away by how much sense it made this time. I don't know what it is, but whatever made the Quantum Computation class at CMU completely baffling the two times I tried taking it has since vanished. I credit my being a lot more comfortable with linear algebra (in part because I've now heard of Penrose graph notation, so that, for example, the role of the trace in forming density operators is now more or less obvious; it's only there because of the limitations of linear notation), and the Scott Aaronson rant that rweba linked to recently about quantum essentially being about negative probabilities.
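
To make the trace point concrete, here's a tiny numpy sketch (my own toy, not anything out of N&C): forming a reduced density operator by partially tracing out half of a Bell pair. The einsum index contraction is basically the Penrose-diagram picture written out flat.

    import numpy as np

    # Bell state (|00> + |11>)/sqrt(2) as a vector in the two-qubit space
    psi = np.array([1, 0, 0, 1]) / np.sqrt(2)

    # Full density operator rho = |psi><psi|
    rho = np.outer(psi, psi.conj())

    # Partial trace over the second qubit: reshape to (2,2,2,2) and
    # contract the second subsystem's ket index against its bra index.
    rho_A = np.einsum('ijkj->ik', rho.reshape(2, 2, 2, 2))

    print(rho_A)  # 0.5 * identity: the reduced state is maximally mixed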

The "Principle of Deferred Measurement" that N&C mentions is another key insight that I half-assedly reinvented for myself while reading Altenkirch and Grattage's paper "A functional quantum programming language". They make the claim that the really dangerous structural rule about quantum programming is weakening not contraction. The argument being that we can copy around bits (in a chosen basis, at least) all we want to by using a CNOT gate, but weakening, oh no, that induces scary things like measurement and decoherence.

But I don't really buy it. Measurement itself isn't a thing that happens; you can't distinguish weakening the variable right now from simply failing to use it later. The things I'm willing to believe in are quantum circuits: a bunch of qubits starting off in some state, going through a unitary transformation built up from quantum gates, and then a final measurement according to the Born rule in the computational basis. I don't have to believe in anything else. Deferred measurement says I can replace any intermediate measurement with a CNOT onto an ancilla in the right basis and just measure everything at the end. "Being in a basis" just means unitarily transforming the computational basis there and back. Decoherence is not a weird spooky thing that suddenly happens, but a description of how and when systems are unlikely to experience interference effects. Classical bits aren't magically different from qubits (how could they be? Real classical bits are manufactured, after all, in a quantum world); they're just unboundedly entangled, decohered qubits: you could never hope to uncompute all their entanglements in order to get them to interfere with themselves again.
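
Here's a small numpy sanity check of the deferred-measurement point (again my own toy, with an arbitrary two-qubit state and a Hadamard standing in for "the rest of the circuit"): measuring a qubit mid-circuit gives exactly the same final statistics as CNOT-ing it onto a fresh ancilla and only measuring at the very end.

    import numpy as np

    rng = np.random.default_rng(0)

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    # CNOT reshaped to indices (out_ctrl, out_tgt, in_ctrl, in_tgt)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]]).reshape(2, 2, 2, 2)

    # Random two-qubit state, indexed as psi[q0, q1]
    v = rng.normal(size=4) + 1j * rng.normal(size=4)
    psi = (v / np.linalg.norm(v)).reshape(2, 2)

    # Version 1: measure q0 mid-circuit, then apply H to q1 and measure it.
    dist_now = np.zeros(2)
    for outcome in (0, 1):
        branch = psi[outcome]                     # unnormalized branch for this outcome
        dist_now += np.abs(H @ branch) ** 2       # outcome probability is folded into the norm

    # Version 2: defer it -- CNOT q0 onto an ancilla |0>, apply H to q1,
    # measure everything at the end, and marginalize onto q1.
    anc = np.array([1.0, 0.0])
    psi3 = np.einsum('ij,a->iaj', psi, anc)        # indices: q0, ancilla, q1
    psi3 = np.einsum('xyia,iaj->xyj', CNOT, psi3)  # q0 controls the ancilla
    psi3 = np.einsum('kj,xyj->xyk', H, psi3)       # H on q1
    dist_deferred = (np.abs(psi3) ** 2).sum(axis=(0, 1))

    print(np.allclose(dist_now, dist_deferred))   # True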

It almost makes me pissed off that quantum is still taught in such a "mystery-forward" way. Sure, it is weird, and that's part of what makes it a sexy topic, but the "mystery" of measurement collapsing wave-states and so on seems like a total confusion of issues. Though maybe I only think this way because I find many-worlds to be a completely plausible, serious metaphysical explanation of the whole business.
Tags: physics, quantum