Wednesday, February 16, 2011

"Bit from It" vs. "It from Bit"

Julian Barbour presented his essay "Bit from It" at the FQXi essay contest "Is Reality Digital or Analog?".

The essay is beautiful and I agree with the conclusion "Bit from It", in a way I will try to make clear. But I disagree with the way the conclusion was reached - it seems to me that the central part of Wheeler's "It from Bit" ontology was overlooked, and this makes "It from Bit" look naive, while it is in fact very profound.

In a classical world, Wheeler's "It from Bit" would be obviously silly. When we measure something, we can write down the outcome as a string of digits, and by collecting all these digits we can determine the state. In such a world, "bit" would indeed originate from "it".

But Wheeler is discussing the quantum world. And for Wheeler, the quantum world is not just the "classical world" plus "probability". Julian Barbour said: "Crucially, even if individual quantum outcomes are unpredictable, the probabilities for them are beautifully determined by a theory based on 'its'", but this is not the whole story. If this were all, then he would indeed be right to say "I see nothing in Wheeler's arguments to suggest that we should reverse the mode of explanation that has so far served science so well". Julian Barbour tries to understand how Wheeler could make such trivial mistakes: "Wheeler's thesis mistakes abstraction for reality", and "A 'bit' has no meaning except in the context of the universe". Yet there is no such gross mistake.

Wheeler's "It from Bit" can be understood in the context of the "delayed choice experiment". He realizes that it is not enough to specify the outcome, but also what we measure - for example "which way" or "both ways" in the Mach-Zehnder experiment. But he realizes that our choice of what to measure determines how the state was (yes, in the past). This is the key problem of quantum mechanics, and this is the fundamental obstacle of all realistic interpretations of quantum mechanics: we choose "now" what to measure, and our present choice dictates how the state was, long time before we made our choice. We can think that there is an ontology behind the outcomes of our measurements, as in the classical world. But the "delayed choice experiment" shows that the "elements of reality" depend of the future choice of our measurements. And the outcomes depend of these choices too. So, it is in fact "the choice of what to measure" (Hermitian operator) plus "the outcome" (eigenvalue) that forms the "Bit" from Wheeler's "It from Bit". And the "It" is in fact the eigenstate corresponding to the obtained eigenvalue, given that the observable was that particular Hermitian operator. Wheeler was not that naive to think that eigenvalues determine eigenstates by themselves, without considering the Hermitian operator, so he accounted well for the prescription "A 'bit' has no meaning except in the context of the universe".

The central point of Wheeler's "It from Bit" is that the reality of today depends on the choices we make tomorrow, when we decide what to observe, and on the outcomes of those observations. He compares this with the game of 20 questions, in which we try to guess a word by asking 20 yes/no questions, with the twist that the word is not chosen at the beginning. The person who "knows" the word changes it at will, so long as it remains consistent with the answers she has already given to our questions. Wheeler wants to emphasize by this the similarity with the quantum state we try to determine, but which depends on what we choose to observe. This is why he was led to the idea that the state of the universe (it) results from the observations (bit).
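
The surprise version of the game is easy to simulate. The following toy sketch (the word list and the answering strategy are invented for illustration) answers each question without ever committing to a word, keeping only the constraint of consistency with the answers already given.

```python
# Toy version of the "surprise" 20 questions game (my sketch; word list invented).
import random

candidates = {"cloud", "clock", "cloak", "stone", "stove", "star"}

def answer(question):
    """Answer yes/no without committing to a word, as long as some
    candidate remains consistent with all answers given so far."""
    global candidates
    yes = {w for w in candidates if question(w)}
    no = candidates - yes
    keep = yes if len(yes) >= len(no) else no      # any consistent choice would do
    candidates = keep
    return keep is yes

# The questioner asks; only at the end is a definite word selected.
print(answer(lambda w: w.startswith("c")))
print(answer(lambda w: "o" in w))
final_word = random.choice(sorted(candidates))
print("the word 'was':", final_word)
```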

I give more credit than Julian Barbour does to the "It from Bit" philosophy - I view it as a way to present a central problem of quantum mechanics. I think, nevertheless, that it is an exaggeration to conclude from this, as many do, that the world is digital. It may be, or it may not be, but we should not force the conclusion. After all, the "It from Bit" philosophy is intended to clarify some points of a theory based on the continuum - Quantum Mechanics.

My viewpoint on "It from Bit" is that we should regard the outcomes of measurements as "delayed initial conditions" for Schrödinger's equation. I presented my view in this article, and this video. A solution of a partial differential equation like Schrödinger's is determined by a set of initial conditions. Classically, the initial conditions can be determined from future observations. In Quantum Mechanics, the future observations determine the state in both meanings of the word "determine": passive - "find out what it is" (by the selection of an eigenvalue of the observable), and active - "choose what it is" (by the choice of that observable). Another central problem is that two consecutive observations of the same quantum system are incompatible if the observables do not commute. That is, they impose incompatible initial conditions on the wavefunction. But the second measurement is not, in fact, a measurement of the same system. The system interacted with the first measurement device, and this measurement device has many degrees of freedom which are not yet determined. So the second observation measures in fact the composed system - the observed system plus the apparatuses used for the previous observations, and all the past interactions of the observed system. This may offer enough degrees of freedom to maintain the unitary evolution and to avoid a discontinuous collapse of the wavefunction.
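
As a small check of the claim about incompatible initial conditions (my illustration, with spin observables chosen as the example): two non-commuting observables such as the Pauli matrices sigma_x and sigma_z share no common eigenvector, so their outcomes cannot both serve as exact initial conditions for one and the same wavefunction.

```python
# Small check (illustration): non-commuting observables share no common
# eigenvectors, so their outcomes impose incompatible conditions on one state.
import numpy as np

sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)

commutator = sigma_x @ sigma_z - sigma_z @ sigma_x
print("[sigma_x, sigma_z] =\n", commutator)        # nonzero -> incompatible

# Each measurement outcome would fix the state to one of these eigenvectors,
# but no vector appears in both eigenbases: every overlap probability is 0.5.
_, eig_x = np.linalg.eigh(sigma_x)
_, eig_z = np.linalg.eigh(sigma_z)
print("overlaps |<x_i|z_j>|^2:\n", np.abs(eig_x.conj().T @ eig_z)**2)
```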

My interpretation comes with a realistic wavefunction, which is not yet singled out from among the possible wavefunctions, but whose "delayed initial conditions" are determined by all future and past observations. I think that we cannot avoid the idea of "delayed initial conditions", no matter what "It" we choose to consider as the underlying ontology.

My view is therefore that "It from Bit" and "Bit from It" are reciprocal: there is a set of possible "It"s (solutions to Schrödinger's equation), a set of possible "Bit"s (observations, delayed initial conditions), and the Universe is a pair (It, Bit) such that the "It" and the "Bit"s are compatible.

On the other hand, the "Bit" itself is part of the solution of Schrödinger's equation, that is, of the "It". This is why I said at the beginning that I agree with "Bit from It". But if we have some "delayed initial conditions" - the "Bit"s - the "It" that satisfies them is not necessarily unique. So, in fact, what we have is not a pair (It, Bit), but a pair (the "It"s that satisfy the observed "Bit", the observed "Bit"). There is a one-to-many relation between the "Bit" and the "It"s. The "Bit" appears to be discrete, but the "It" may very well be continuous. So, although "It from Bit" reflects an important aspect of Quantum Mechanics, it should not be taken too far.

Friday, February 11, 2011

Heisenberg's Relations and Uncertainty

Quantum Mechanics, in particular the Uncertainty Relations, indeed needs a good interpretation. Well, I think that it is more than a matter of interpretation. If its internal logic were self-consistent, no interpretation would be needed. The long discussions about interpretations actually reveal the existence of internal inconsistencies in the formalism of Quantum Mechanics. The "no interpretation" alternative, the "operational interpretation", tries to ignore the inconsistencies by avoiding any discussion of reality, focusing only on the operations we perform when making experiments in Quantum Mechanics. I think that what is really needed is to resolve the internal conflicts of Quantum Mechanics. Actually, I think that the expression "interpretation of Quantum Mechanics" is in fact used for alternative theories, which propose mechanisms by which QM is implemented. Because what we can observe is already described by QM, such mechanisms are usually hidden, practically impossible to observe. So, in my opinion, they are named "interpretations" and not "theories" because modern science demands that they be called "theories" only if they are testable. We may call them "hypotheses": they are not really interpretations - they actually propose new mechanisms - but they cannot be tested, so they don't qualify under the modern definition of the word "theory". Of course, it can be argued that the assumption (superstition?) that Nature really gave us access to all its mechanisms, as if She had the purpose of allowing us to test every statement we can make about them, should be kept open to debate.

Seeing the Uncertainty Relations as fundamental is indeed problematic, for several reasons. First, they are in fact a mix of two principles. The second of these principles is the Born rule, which gives the probability of obtaining a given state as the outcome of an observation of a quantum state. The Born rule, by specifying the probabilities, provides the probabilistic interpretation of the wavefunction. If the Born rule already contains the probabilities, I think it would be better if we could see the Heisenberg Relations separated from the probabilities.
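
For reference, the Born rule invoked here can be written explicitly (standard textbook statement, not a quotation from Barbour or from this post), together with the Born-rule variances that enter the Uncertainty Relations:

```latex
% Standard textbook form of the Born rule. For an observable A with
% eigenvalues a_i and eigenstates |a_i>, and a normalized state |psi>,
% the probability of obtaining the outcome a_i is
\[
  P(a_i) \;=\; \bigl|\langle a_i \mid \psi \rangle\bigr|^2 .
\]
% The spreads appearing in the Uncertainty Relations are the Born-rule variances,
\[
  (\Delta A)^2 \;=\; \langle \psi \mid A^2 \mid \psi \rangle
                   - \langle \psi \mid A \mid \psi \rangle^2 ,
\]
% so the relations indeed mix the wave structure with the probabilistic rule.
```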

If we take the solutions of Schrödinger's equation - that is, the wavefunctions - as fundamental, then the basic Heisenberg relations appear from their very properties. We just take the relations, known from Fourier analysis, between the spread in time (or position) and the spread in frequency (or wave vector). These relations are much more general: if we represent the same wavefunction in two different bases of the space of all possible wavefunctions, there is always such a relation between the corresponding spreads. Of course, an observable (Hermitian operator) comes with its own set of eigenfunctions, which are orthogonal, so it is natural to obtain similar relations if we refer only to the observables and their commutation relations.
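
A quick numerical check of the Fourier relation (my own sketch, with an assumed Gaussian packet; the grid and the width are arbitrary choices) gives the product of the position and wave-number spreads right at the lower bound of 1/2:

```python
# Numerical illustration (assumed Gaussian packet): the Fourier spread relation
# delta_x * delta_k >= 1/2, saturated by a Gaussian wavefunction.
import numpy as np

x = np.linspace(-50, 50, 4096)
dx = x[1] - x[0]
sigma = 2.0
psi = np.exp(-x**2 / (4 * sigma**2))            # Gaussian packet of width sigma
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)     # normalize

phi = np.fft.fftshift(np.fft.fft(psi)) * dx     # Fourier transform -> k-space
k = np.fft.fftshift(np.fft.fftfreq(x.size, d=dx)) * 2 * np.pi
phi /= np.sqrt(np.sum(np.abs(phi)**2) * (k[1] - k[0]))

def spread(q, density, dq):
    """Standard deviation of a normalized density on a grid."""
    mean = np.sum(q * density) * dq
    return np.sqrt(np.sum((q - mean)**2 * density) * dq)

delta_x = spread(x, np.abs(psi)**2, dx)
delta_k = spread(k, np.abs(phi)**2, k[1] - k[0])
print(delta_x * delta_k)   # ~0.5, the Fourier lower bound
```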

Therefore, the Uncertainty Relations come directly from the wave nature of the solutions to Schrödinger's equation, combined with the Born rule. By "Heisenberg Relations", I will refer to the relations as they appear from the wave nature of the wavefunction, reserving the names "Heisenberg Uncertainty Relations" or "Uncertainty Relations" for their probabilistic interpretation.

In a similar way, the entanglement between two or more particles is in fact a property of tensor products of wavefunctions representing single particles. When the total state cannot be represented as a single tensor product (possibly a symmetric or antisymmetric combination), but only as a superposition of such products, we have entanglement. When we appeal to the Born rule, the entanglement manifests as correlations between the possible outcomes of the observations of the particles.
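
A short illustration (mine, using the standard singlet state as the example): the Bell state cannot be factored into a tensor product - its Schmidt rank is 2 - and the Born rule turns the superposition into perfect anti-correlations of the outcomes.

```python
# Illustration: the singlet Bell state is not a tensor product, and the
# Born rule turns the superposition into outcome correlations.
import numpy as np

up = np.array([1, 0], dtype=complex)
down = np.array([0, 1], dtype=complex)

bell = (np.kron(up, down) - np.kron(down, up)) / np.sqrt(2)   # singlet state

# Schmidt rank = number of nonzero singular values of the 2x2 coefficient
# matrix; rank 1 would mean a pure tensor product, rank 2 means entanglement.
coeffs = bell.reshape(2, 2)
print("Schmidt coefficients:", np.linalg.svd(coeffs, compute_uv=False))  # [0.707, 0.707]

# Born rule: joint probabilities of measuring both spins along z.
probs = np.abs(bell)**2
print("P(uu), P(ud), P(du), P(dd) =", probs)  # [0, 0.5, 0.5, 0] -> anti-correlated
```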

The Born rule has thus been tested by all experiments in QM, whether they involve entanglement or not. Being probabilistic, its predictions can be tested only statistically, but this doesn't mean that they reveal an intrinsically probabilistic reality.

One central problem of Quantum Mechanics is to accommodate the unitary evolution described by Schrödinger's equation with the apparent collapse of the wavefunction due to observation. There is clearly a contradiction here. If we introduce an internal mechanism to explain this collapse, then we have to make this mechanism able to explain both the unitary evolution and the collapse. This is difficult, because both processes are very simple. In a vector space, what can be simpler than unitary transformations and projections? Any hidden mechanism would have to compete with them. This is why it is so difficult to explain QM in terms of hidden variables, of the multiverse, of nonlinear collapse, or of spontaneous diagonalization of the density matrix caused by the environment.
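
Just to make the contrast concrete, a trivial numerical illustration (nothing more than the two operations side by side): a unitary preserves the norm and is reversible, while a projection does neither, and any hidden mechanism has to reproduce both behaviors.

```python
# Illustration only: unitary evolution preserves the norm and is reversible;
# a projection (collapse) does neither.
import numpy as np

theta = 0.3
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])     # a unitary (here, a rotation)
P = np.array([[1, 0],
              [0, 0]])                              # projection onto the first axis

psi = np.array([0.6, 0.8])                          # normalized state
print(np.linalg.norm(U @ psi))   # 1.0 -> norm preserved, U is invertible
print(np.linalg.norm(P @ psi))   # 0.6 -> norm lost, P is not invertible
```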

On the other hand, there are already enough unknown factors even if we consider the wavefunction as the only real element. Schrödinger's equation gives us the evolution; it doesn't give us the initial conditions. The initial conditions can be partially obtained from observation. Due to the particular nature of quantum observation, our choice of what to observe is also a choice of what the initial conditions were (yes, in the past). This is why the initial conditions are delayed until the measurement is taken. To this, let us add that we do not observe the initial conditions of just a particle, but of that particle and every system with which it interacted in the past - such as the preparation device, which ensures the state of that particle at a previous time. Since such a device is large and complex, we don't really know its initial conditions, so when we observe the particle, we also observe the preparation device, and everything with which they interacted. Therefore, there are many more factors to introduce into Schrödinger's equation. These factors are complex enough to make the conclusion that the wavefunction collapse is discontinuous less compelling than it initially seemed. It is possible to have a unitary evolution leading from the state before the preparation to the state after the measurement, given that we need to account for the interaction with the preparation device, which also has much freedom in its initial conditions. I described these ideas here, and there is also a video. In this view, the wavefunctions are real, therefore the Heisenberg Relations are real too. By applying the Born rule to them, their probabilistic meaning follows: the Heisenberg Uncertainty Relations. It would be nice to have an explanation for the Born rule as well, because it is very plausible that it just follows somehow from a measure defined over the space of all possible wavefunctions.
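
The following toy calculation is a standard decoherence-style example, not the model from the article mentioned above; it only illustrates how a unitary interaction that includes apparatus degrees of freedom can make the reduced state of the particle look as if it had collapsed, without any discontinuous process.

```python
# Toy sketch (standard decoherence-style example, not the model from the
# linked article): a unitary interaction with apparatus degrees of freedom
# leaves the particle's reduced state looking "projected".
import numpy as np

# CNOT-like measurement interaction: the apparatus qubit records the
# system's z-basis value.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

system = np.array([0.6, 0.8], dtype=complex)        # superposition to be "measured"
apparatus = np.array([1, 0], dtype=complex)         # apparatus in its ready state
total = CNOT @ np.kron(system, apparatus)           # unitary evolution of the pair

# Reduced density matrix of the system alone (trace out the apparatus).
rho = np.outer(total, total.conj()).reshape(2, 2, 2, 2)
rho_system = np.trace(rho, axis1=1, axis2=3)
print(rho_system.real)   # diagonal [0.36, 0.64]: the off-diagonal terms have
                         # moved into correlations with the apparatus
```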

"Explanation" between concrete and abstract

I realized that an apparently well-understood word, "explanation", may lead to controversies in discussions about the foundations of physics. The foundations are already controversial enough, but this adds even more to the confusion. It gives you a twofold feeling: on the one hand, of being misunderstood, and on the other hand, of not understanding where your interlocutor is going.

What is an "explanation"? Probably the most usual meaning is that to explain is to reduce the unknown to the known, the unfamiliar to the familiar. When this happens, we get the sense of understanding.

Ever since childhood, we have had so many questions, and the grown-ups explained them - they reduced the unfamiliar to more familiar notions. In school, the teachers continued to provide us with explanations, and we appreciated most the teachers who managed to make unclear things more intuitive for us. When reading about the foundations of physics, we usually start with popular physics books. The most recommended such books are those providing the feeling of understanding, appealing to our intuition. When we try to read something more advanced, even if it is recommended by our favorite pop-sci books, we find ourselves in a totally different situation. Instead of finding the deeper explanations we are looking for, we find ourselves thrown into the turbulent torrents of abstract mathematics, drifting without an apparent purpose. And what is most annoying, these textbooks and articles full of equations actually claim to explain things!

Why is this happening? I think that they are guided by another meaning of the term "explanation": to give an explanation of a phenomenon is to deduce the existence of that phenomenon from hypotheses considered more fundamental. For example, when the correct value of the perihelion precession of Mercury was deduced from the principles of General Relativity, it was considered that GR explained this precession. On the other hand, the deflection of light by the Sun was considered a prediction. After its full experimental confirmation, it became an explanation. I consider that "prediction" is just a temporary status of a scientific explanation, and that the fact that many explanations are first predictions is a historical accident.

There seems to be a similarity between principles/phenomena and axioms/theorems. This similarity suggests the reason why mathematics plays such an important role in the explanation of phenomena. To deduce more from less, the complicated from the simple, the diverse from the universal - this means using logic and mathematics. And there is no limit to the difficulty of the mathematics needed, even if the principles are not that difficult.

This notion of explanation, I now understand, is not shared by all of us. The reason is simple: "explanation" usually means reducing the unfamiliar to the familiar. When somebody claims to explain a phenomenon, we expect him or her to show how this strange phenomenon can be described in more familiar, concrete terms. Instead, we find that he or she starts describing it in more abstract terms. How come such increasingly abstract terms are shamelessly named "more basic principles", "more elementary principles" and so on? Isn't this a lie?

Maybe the explanation by "reducing to concrete things" has pedagogical reasons, and the explanation by "reducing to universal principles" is in fact foundational research. But does this mean that the gap between pedagogical and scientific explanation should grow as it does nowadays? Wouldn't it be much, much better to have a mechanistic explanation? After all, Maxwell sought such an explanation of electromagnetic waves, even though he had the equations! The ether theorists of the 19th century tried to reduce electromagnetism to vibrations in a medium. This tradition still continues, and we encounter on a daily basis renowned scientists trying to explain things which other renowned scientists consider to be already explained: electromagnetism, wave-particle duality, gravity, entropy, the Unruh effect, spacetime, time, black holes and so on.

Probably it would be better to have a mechanistic explanation of everything. This would definitely help the public outreach of physics, and it would help physics to advance faster. It could have a huge impact on technology, and on our lives. But who can bet that God, when He created the world, bothered about our need to reduce things to what we already know? Why would the universe care about our limited understanding when deciding what principles to follow? Who are we, and why would we be so important? I think that, although it would be desirable to find concrete, familiar universal principles behind this complex and diverse world, we have no guarantee that this will ever happen. "You shall not make for yourself a carved image, or any likeness of anything that is in heaven above, or that is in the earth beneath, or that is in the water under the earth."

The definition of "explanation" as a reduction to universal principles has its own advantages, given that we do not take these principles as ultimate truths, but just as hypotheses. One of these advantages is that it allows us to equally appreciate theories which seem to contradict each other. We can appreciate a theory's explanatory power in the sense stated above: as its efficient encapsulation of a wide variety of phenomena in fewer, simpler, and more general principles. This doesn't mean that we should consider these principles as being "true". It is not about being "true", just about encapsulating as many phenomena as possible in as few principles as possible, even if these principles are more abstract. If we insist on becoming fans of one theory or another as the ultimate "truth", we may reduce our capacity to grasp other explanations. This would not be a problem if we could prove our theories beyond any doubt, but the truth is that we cannot, no matter how convincing they may look to us.