James Baugh's personal rants and raves

The Problem with Understanding QM

James

The problem with understanding quantum mechanics is that there is a substantial shift in the scope of the semantics. Classically we think of a particle as an object with an objective state, and thus speak of that state without needing to refer to the empirical act of verifying it. The world is a world of objects. (An “object” in this context is a system with a complete set of properties, i.e. everything that might be observed, and, critically, each of these properties has a specific value defining the system’s state.)

Since, in classical mechanics, evolution is deterministic at the fundamental scale, we do not need to distinguish between a specific instance of, say, a (classical) electron and the class of electrons with the same objective state. As one behaves, so behave all, and we can be sloppy with the language. The language we use for classical systems is ontological (what is). We can imagine the classical object’s state as a point in a “state space” (typically phase space) spanned by the values of a complete set of observables, from which all other observables can be expressed as functions. We do refer to classes of classical systems when we begin describing uncertain knowledge about the system’s state, and for this we use probability distributions over the state space.

Now with quantum mechanics we suspend the assumptions stated above. Quantum systems (or simply quanta) also have a complete set of properties (potential observables), but we only acknowledge values for these properties at the point of observation. When we casually say “an electron with momentum p and (z component of) spin ##s_z##”, i.e. when we write down the wave-function for such an electron, we are in point of fact expressing the class of such electrons. Insofar as a measurement has actually occurred, we know that an immediate subsequent measurement of the same properties will yield the same values. So we know the measurement is meaningful. We can even predict future compatible measurements after dynamic evolution of the system.

At this point it is important to make a distinction between a single instance of a quantum and a class of quanta. This is because, without classical determinism, our description must expand to probabilistic predictions. A singular system (whether classical or quantum) will have a singular sequence of measured values. It doesn’t have a probability distribution. A class of (either type of) systems may have a predicted distribution of future behaviors, and for this we can write down a classical probability distribution over the state space or, with quanta, a wave function or other Hilbert space vector, or more generally a density operator.
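As a toy illustration of this distinction (a sketch of my own; the particular state, angle, and sample count are invented for illustration), consider a class of spin-1/2 quanta all prepared the same way. Each individual instance yields one definite measured value; only the class exhibits the Born-rule distribution:

```python
import numpy as np

rng = np.random.default_rng(0)

# Class description: a spin-1/2 state cos(t)|up> + sin(t)|down>
theta = np.pi / 6
psi = np.array([np.cos(theta), np.sin(theta)])

# Born-rule probabilities: a property of the *class* of identically
# prepared quanta, not of any one instance
p_up, p_down = np.abs(psi) ** 2

# Each *instance* just yields a single definite outcome when measured
outcomes = rng.choice([+1, -1], size=100_000, p=[p_up, p_down])

print(round(p_up, 4))                     # predicted frequency of +1
print(round(np.mean(outcomes == +1), 4))  # empirical frequency over the class
```

The empirical frequency over many instances converges to the Born-rule prediction, while no single instance "has" the distribution.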

So in the examples described as “wave function collapse” we must understand that one observes a given instance of a quantum and then, based on that measurement, classifies the system and writes down its wave-function expressing the class to which it belongs. This description comes with a system of probabilistic predictions about its future behavior. If we then later observe that specific quantum, we will observe one of those possible predicted outcomes, but having done so we update our class description. It is no different from the classical Bayesian updating of probability distributions given new information, except that we are not using classical probability distributions. The term “collapse” is unfortunate, as it is rather more a “jump”. The discrete jumping of descriptions simply reflects the fact that we are carrying out a discrete sequence of observations.
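The parallel between the two kinds of updating can be made concrete. Below is a minimal sketch (the prior, likelihood, and amplitudes are all made-up numbers): classically, new data multiplies the prior by a likelihood and we re-normalize; quantum-mechanically, a measurement outcome applies a projector to the state vector and we re-normalize. Both are discrete "jumps" of the description, not of any physical stuff:

```python
import numpy as np

# Classical: Bayesian update of a probability distribution given new data
prior = np.array([0.5, 0.5])         # P(state)
likelihood = np.array([0.9, 0.2])    # P(observed datum | state)
posterior = prior * likelihood
posterior /= posterior.sum()         # the "jump" in the classical description

# Quantum: update of the class description after measuring s_z = +1
psi = np.array([0.6, 0.8])           # amplitudes in the z basis (normalized)
P_up = np.array([[1.0, 0.0],
                 [0.0, 0.0]])        # projector onto the s_z = +1 outcome
post_psi = P_up @ psi
post_psi /= np.linalg.norm(post_psi) # re-normalized: the updated class

print(posterior)   # classical posterior over the two states
print(post_psi)    # quantum description after the "jump": [1. 0.]
```

In both cases what jumps is the description we assign given new information; the quantum case simply does not reduce to a classical probability distribution over pre-existing values.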

The “mystery” of “wave function collapse” comes when we over-reify the wave-function, thinking it is an ontological representation of the system rather than a description of the framework of predictions for the class of systems into which we categorize a given observed quantum. This mystery evaporates when one understands the description as being pragmatic and not representative. (This is the essential framework of the Copenhagen Interpretation, to which I subscribe.)

Of course there are countless books and reams of articles exploring all this and the alternative interpretations, and I’m already posting a very long comment, but let me add one last point. We can argue directly, via Bell’s inequality violations, that the probabilities associated with quantum predictions are inconsistent with a probability measure over some state space (even one including hidden variables). The additivity of probabilities just doesn’t work the same way. The invocation of locality in the various discussions is a red herring, IMNSHO. Attempts to reconcile classical ontological descriptions necessarily invoke either phenomena which allow future actions to alter past (hidden) state variables, which is just a novel way to invalidate the use of an objective description of a system, or an “all realities are equally real” stance, which renders meaningless the statement that some phenomenon did or did not occur. They are attempts to fit a square peg into a round hole by banging on it until the hole becomes square, then declaring the peg was really round all along. Again, this is my opinion, based on my Copenhagenist understanding of QM.
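The Bell point can be checked numerically with the standard CHSH setup. For the singlet state, quantum mechanics predicts the correlation E(a, b) = -cos(a - b) between spin measurements along analyzer angles a and b, while any classical probability measure over hidden variables bounds the CHSH combination by |S| ≤ 2. A quick sketch with the usual textbook angle choices:

```python
import numpy as np

# Quantum correlation for spin measurements on a singlet pair,
# with a, b the two analyzers' angles (in radians)
def E(a, b):
    return -np.cos(a - b)

# Standard CHSH angle choices
a1, a2 = 0.0, np.pi / 2
b1, b2 = np.pi / 4, 3 * np.pi / 4

# CHSH combination: any classical hidden-variable probability
# measure bounds |S| <= 2, but the quantum prediction exceeds it
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)

print(abs(S))  # 2*sqrt(2), approximately 2.828, which is > 2
```

The quantum value 2√2 cannot be reproduced by any single additive probability measure over pre-assigned outcomes, which is exactly the failure of classical probability additivity described above.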

James
Philosophy, Quantum Physics, Science