
The Relevance and Irrelevance of Heisenberg’s Uncertainty Principle for the Quantum Measurement Problem

Joel Hunter

November, 2005

Quantum mechanics is not something you would have guessed. The moment you juxtapose quantum mechanics and everyday experience, the mysteries of how the former relates to, much less explains, the latter seem to have no end. Scientists are predisposed to take the obviousness of the world for granted (rightfully so) while trying to explain and justify quantum mechanics. Many philosophers also take the obviousness of the world for granted (improperly so). But there are a few philosophers who have taken note that the very obviousness of the world is rather surprising. It’s surprising because that which is so obvious is at the same time so unobtrusive; it is so obvious it practically insists that we overlook it. Why does the world already make sense to us, at least in an unreflective way, the moment we turn our attention to it, before we’ve had a chance to formulate the first question about it? The child contends with and utilizes gravity long before its unceasing effects arouse curiosity. Upon a moment’s reflection, we can see that our first tentative intellectual steps toward understanding, like learning our first musical tune, are already upheld by a robust commitment to the consistency and congruity of sensuous experience. We enter the world with a basic commitment to the world, what Merleau-Ponty called “perceptual faith.”

Then why does quantum mechanics, the most empirically successful physical theory ever formulated, exhibit features that are inconsistent and incongruous with our understanding of the world, subverting our perceptual faith? Or do we exaggerate the strangeness of quantum mechanics? Is it in fact replete with everydayness? In my view, to “save the phenomena” of quantum mechanics from irrationalism, one must first recover the felt wonder of the world from its mundane unobtrusiveness. Perhaps by recovering the primordial sense of the world we would also find that quantum “mysteries” cohere with it. One of these quantum mysteries is the measurement problem. Before we turn to this problem specifically, we should briefly review the nature of measurement generally.

Measurement is a mathematical activity that constitutes the possibility of a thing not just being present in the here and now, but in a mode that is ever-present, identical for any subjective viewer. Thus, it is the primitive mathematical act. There are other such objectifying activities of a mathematical character (comparison, subordination, colligation), but measurement is that special activity whereby we are engaged concretely with a thing in the register of the sensible and we construe that thing in terms of a number. Measurement is of a higher order than the other basic mathematical activities of putting things side by side, or ordering them with respect to one another, or binding them together. These, too, involve us concretely with things, i.e., in the realm of the visible or the tangible. But measurement renders the number of a thing definitive and provides the entry (some might say the escape) to a rarefied realm of intelligibility “beyond” the sensible.

The measurement problem in quantum mechanics

The broad foundational question about the connection between quantum theory and physical reality has an attenuated form in the so-called “measurement problem.” The problem is straightforward: when juxtaposing probabilistic expectations for experimental outcomes with real experimental experience, the world “shows a unique real datum: an actual fact.” One might be inclined to think that this is no more a mystery than how an actual number turns up when rolling a fair die. The difficulty is that the theory gives no account of how an actual unique datum comes to be realized at the end of an individual measurement (whereas ordinary classical mechanics tells how this works in the case of the die). Furthermore, when measurements are repeated under the same conditions, even repeated many times over, the disjunction between the theory and the actual outcome holds every time. “The status of actual facts in the theory remains nevertheless an open and troublesome question. Where does this uniqueness and even this existence come from? This is undoubtedly the main remaining problem of quantum mechanics and the most formidable one.”1

Most scientists work under the presumption that doing more science will resolve this problem. But not all of them are willing to stride, like one of Arthur Koestler’s sleepwalkers, through such an admitted “fundamental obscurity.” John Bell is perhaps one of the more famous of these realists. He demanded that any interpretation of quantum mechanics meet the minimum condition of maintaining the Copernican perspective that displaced human beings from the center of the universe. Accordingly, he argued that concepts such as ‘observable’ and ‘measurement’ were “rather woolly,” and being anthropocentric, had no place in an authentic physical theory.

So, within the community of scientific practitioners we see fundamental disagreement over what the measurement problem means and what the conditions for its solution and explanation are. How shall we get our bearings? What is the convergent perceptual setting upon which theorists diverge conceptually? Specific examples are easy to find because the problem exists for any kind of quantum measurement, indeed all quantum measurements, whether the system is as simple as a single photon exciting an atom or as complex as the highly energetic experiments in giant accelerators searching for new particles. It seems that some measurements within this range of events should fail to have determinate outcomes. But this flies in the face of manifest perceptual experience. So how is the transition from indeterminate states to determinate states effected?

1 Roland Omnès, The Interpretation of Quantum Mechanics (Princeton, NJ: Princeton University Press, 1994), 60–61, 350.


Quantum mechanics provides a set of causal principles which describe and predict the mechanics of a quantum system. The functional cornerstone of these principles is the unitary transformation postulate, which describes how one state at some initial time evolves into another state at some later time. The problem is that the foundational deterministic equation arising from this postulate (the Schrödinger equation) seems to exclude the possibility of a measurement ever occurring. So theorists add a separate principle of measurement, which requires a rupture to the smooth, linear evolution of the quantum system. This postulate requires the theorist to “project” what was a potentially determinate value onto an actually determinate value. It is this projection postulate that has sustained the most attention and criticism because it introduces physically incomprehensible notions like “collapse of the wave function” or “reduction of the state vector.” It is an admittedly irrational worm in an otherwise lovely apple.

What are we to make of a (purportedly) sensible thing that seems to have no definite place or position? If ‘to be’ means ‘to be there’, then how are we to understand the ‘there’ of a photon or an electron that is described by a wave that propagates everywhere? (Heidegger’s meditation on nearness and annihilation in the opening section of “The Thing” is appropriate here.) Furthermore, how are we to understand an object that is not indifferent to acts of observation or measurement? How must we transform our classical view of measurement, that a pre-formed reality is open to human observation while yet remaining uninfluenced by actual measurements, in the light of quantum mechanics where measurement is an intrinsically invasive procedure?

A phenomenological analysis of the measurement problem would involve at least five questions:

1. Why has measurement become a “problem” in quantum mechanics? What is the true source of the trouble?

2. How do the entities investigated “phenomenalize”? How do they emerge into the register of the sensible, the visible?

3. How does measurement of quantum entities and processes relate to phenomenalization?

4. What is the role, if any, of human involvement, e.g., perception, in measurement? What does the measurement problem teach us about perception?

5. What is the role, if any, of human involvement in the thing measured? What does the measurement problem teach us about the (non)sensible thing?

In this paper, I shall limit our considerations to the first question. Let us begin by setting to the side any predetermination of the “reality” or “existence” of the entities in question. We need to minimize the influence of our natural predispositions to talk about and think about atoms, electrons and photons as if they were ordinary things like tables and chairs, an equivalence which is manifestly not the case. My phenomenological feint is not proposed in order to answer the same questions taken up elsewhere, only now from a “phenomenological” perspective, whatever that might mean to the hearer. What I do hope to elucidate is the nature of the watershed in physics between realists and anti-realists, its genealogy, and other possibilities that might be envisioned.

In order to do justice to the task of concrete research, I begin from a starting point atypical of most philosophical research on the subject of quantum mechanics. I want to gain some understanding of how the measurement problem ever became a problem at all. This is not meant in the sense of a question of empirical history. Rather, we will need to undertake some conceptual archeology. The measurement problem, characterized as an interaction between an observer and something observed, suffers from the obscurity created by the entrenched conceptual doublets of modernist metaphysics (nature-man, mind-body, self-other, subject-object, constituting agent-constituted thing, etc.). The first task, then, is to clarify the interaction at the root of the measurement problem on a basis that is not so conceptually hamstrung. It is true that there are many claims in the scientific and philosophical literature that the measurement problem has been solved (or that it is merely a pseudo-problem), but the proposed “solutions” entail other nonrealistic consequences (e.g., nonlocality); and, while these insights are philosophically suggestive, so far, solutions to the measurement problem have merely transposed the original problem into a different register with the same metaphysical precommitments.

Heisenberg’s uncertainty principle and the Pythagorean root of quantum mechanics

Why has measurement become a problem in quantum mechanics? Because it, more than the other presumed problems of quantum physics, is the problem of foundations. It is a philosophical problem posed by physics. Other features of quantum mechanics which have incited much philosophical reflection are not our real concern, even though they sometimes are falsely associated with the measurement problem. Chief among these is Heisenberg’s famous “Uncertainty Principle.” This tenet of quantum mechanics, which to many is so closely associated with the inherent “mystery” of quantum theory, is not, as it turns out, relevant to quantum mechanics per se. It is quite simply not a discovery or determination unique to quantum mechanics. It tells us little to nothing about the concrete aspects of microscopic phenomena and our involvement with them (despite breathless claims to the contrary in some popularizations of quantum mechanics, beginning with Heisenberg himself). The “Uncertainty” (better: “Indeterminacy”) Principle is a mathematical artifact created by a precommitment to economical priorities in the interest of simplifying calculation or computability; it does not arise from measurement disturbances. The Heisenberg indeterminacy relation takes two forms:


∆p ∆q ≥ h/4π   (1)

∆E ∆t ≥ h/4π   (2)

They express the variance of two canonically conjugate2 quantities: momentum and position in the first case and energy and time in the second case (Heisenberg’s original derivation published in 1927 describes an electron moving in empty space). The right side of the inequalities is a constant, with Planck’s constant, h, in the numerator (6.626 × 10⁻³⁴ J·s). This indeterminacy principle is as ubiquitous as potsherds in the mathematical sciences. Let us examine the ways in which it appears in different guises in the theoretical and applied sciences and attempt to trace its genealogy.
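The bound in relation (1) can be checked numerically. The sketch below is my own illustration, not part of Heisenberg’s derivation: it discretizes a Gaussian wavepacket, the state that exactly saturates the inequality, and computes the two spreads directly, with units chosen so that ħ = 1.

```python
import numpy as np

# Numerical check of relation (1) for a Gaussian wavepacket, the state
# that exactly saturates the bound: ∆q·∆p = ħ/2 (= h/4π). Units: ħ = 1.
hbar = 1.0
sigma = 0.7                              # arbitrary position spread
x = np.linspace(-20, 20, 4001)
dx = x[1] - x[0]

# Normalized Gaussian wavefunction ψ(x)
psi = (2 * np.pi * sigma**2) ** -0.25 * np.exp(-x**2 / (4 * sigma**2))

# Position spread ∆q = sqrt(<x²>); <x> = 0 by symmetry
dq = np.sqrt(np.sum(x**2 * psi**2) * dx)

# Momentum spread via <p²> = ħ² ∫ |dψ/dx|² dx (for a real, centered ψ, <p> = 0)
dpsi = np.gradient(psi, dx)
dp = hbar * np.sqrt(np.sum(dpsi**2) * dx)

print(dq * dp, hbar / 2)                 # the product sits at the minimum ħ/2
```

Any non-Gaussian state would yield a strictly larger product; the Gaussian marks the floor the inequality describes.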

In my undergraduate days in electrical engineering, I toiled long hours on signal analysis. Be it an osprey call or an FM radio transmission, all signals have two elementary features, irreducible (though transformable) to one another: time and frequency. In the case of the bird song, you can listen as the signal varies in intensity through time. You can also hear, at any given moment, the pitch or pitches of the signal, its frequency component. You cannot hear all of the frequencies simultaneously, just as you cannot see all of the colors in ordinary light; you need a tool to break up the complex signal into its components. For light, we use a prism; for signals (more precisely: for the functional representation of a signal), we use the Fourier transform. When you transform a signal from the time domain into the frequency domain you transform a signal into a spectrum with harmonics at different frequencies and different magnitudes (amplitudes). Now, a signal generated “naturally,” or “in the wild,” is sloppy; the tones aren’t pure or perfect, they’re “noisy.” The clicks, chirps, chattering, or other interruptions to the subject of the signal (speaking musically) are not the features that we want to stand out; quite the contrary, we want to filter them out so that the subject stands out more clearly.

2 ‘Canonically conjugate’ variables are “quantities that are not independent of each other,” i.e., they have some relation such that one is irreducible to the other.

In the representation of the signal, there is always some spread or variance from where the frequency is centered, the value around which it is concentrated (in statistics, this is the expectation value). This is where the indeterminacy relation enters: there is always a minimum degree of divergence between the two spreads, between the time variance and the frequency variance, and that divergence is expressed as an inequality:

s × S ≥ 1/(16π²),   (3)

where s is the time signal variance and S is the Fourier transform or frequency signal variance.

What does this particular mathematical expression mean? The purer (or clearer or more defined) the time signal, the fuzzier is the frequency signal. And vice versa: the clearer the spectrum, the more indistinct the time signal. Note well: the indeterminacy (or “fuzziness”) is not an aspect of the actual phenomenon as it is experientially manifest (e.g., the osprey call that I hear); it is a result of the abstract analysis we have applied to the signal, which is a functional representation of the phenomenon (in the case of the bird song, a representation of something audible). In other words, the mathematized expression of the bird song re-presents an irrevocable distortion of the original phenomenon, the song as it is sung or heard. But how did we generate this mathematical artifact? Hidden within the function we applied to the signal to determine the variances s and S is a simplification: it is linear. But the original signal to which we applied the function is nonlinear—there is harmonic distortion, frequency compression, clipping—and vastly more complex than we would prefer or can manage for calculative purposes. So, for economic reasons, we make a simplification, we make the math more convenient. Note well: other interests shape and guide us, practical interests, according to which we discard features or elements of the phenomenological totality for the sake of aesthetic, pragmatic or other considerations.
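Inequality (3) can likewise be illustrated numerically. The signals below are my own hypothetical examples: the time and frequency variances of a pulse are computed via the discrete Fourier transform and their product compared against the bound. A Gaussian pulse sits at the bound; a clipped, “noisier” pulse exceeds it.

```python
import numpy as np

# Check of inequality (3): the product of the time variance s and the
# frequency variance S is bounded below by 1/(16π²). A Gaussian pulse
# saturates the bound; a clipped ("distorted") pulse exceeds it.
# Illustrative signals of my own, not measured data.
n = 2**14
t = np.linspace(-50, 50, n)
dt = t[1] - t[0]

def variances(signal):
    # Treat |signal|² and |spectrum|² as probability distributions
    # over time t and frequency f, and take their second moments.
    p_t = np.abs(signal)**2
    p_t = p_t / p_t.sum()
    s = np.sum(p_t * (t - np.sum(p_t * t))**2)        # time variance

    f = np.fft.fftfreq(n, d=dt)
    p_f = np.abs(np.fft.fft(signal))**2
    p_f = p_f / p_f.sum()
    S = np.sum(p_f * (f - np.sum(p_f * f))**2)        # frequency variance
    return s, S

bound = 1 / (16 * np.pi**2)
s_gauss, S_gauss = variances(np.exp(-t**2 / 2))                 # pure pulse
s_clip, S_clip = variances(np.clip(np.exp(-t**2 / 2), 0, 0.5))  # clipped

print(s_gauss * S_gauss, bound)    # ≈ bound: the Gaussian saturates (3)
print(s_clip * S_clip > bound)     # strictly above the bound
```

The distortion introduced by clipping spreads the spectrum, so the product of variances rises above the floor; only the idealized Gaussian shape touches it.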

Now, it is no accident that the Heisenberg indeterminacy equations (1) and (2) have more than a family resemblance to the signal variance inequality (3). Structurally, they express the same relation: the product of two spreads or variances on the left side which is greater than or equal to some constant value. And, just as was the case with (3), we must keep in mind that (1) and (2) are also functional representations of abstract concepts; i.e., the indeterminacy of ‘position’ or ‘momentum’ spread expressed by the equations is not a feature of a concrete phenomenon. Furthermore, these are also ideal operations: the resolution of one variable can be varied infinitely with corresponding deterioration or improvement in the quantitative determinacy of the companion variable, without any implication that some real sound in the world approximated by one of the variations is itself sensuously indeterminate. Where theorists too often go astray is in the common assumption (since Galileo) that mathematical phenomena transparently and unproblematically map onto or correspond with the phenomena encountered and engaged in experiential manifestness, “in the wild,” if you will; that our neat, cultivated idealities must have some positive ontological status, either in themselves or as the only “true” representation of some concrete phenomenon. Obviously, this selective perspective or eidos of bird songs, electromagnetic waves and electrons means that we will contrive these as objects, and signify the world itself as object, only insofar as they are rigorously representable by linear means.

Let me give another example to reinforce my earlier claim about oft-overlooked simplifications of linearity. One of the most popular mathematical expressions formulated in the twentieth century is E=mc². This is Einstein’s famous mass-energy equivalence formulation, a follow-up to his original Special Relativity theory of 1905. The relation between energy E and mass m is modified by a constant of proportionality, c². But there are an infinite number of nonlinear terms on the right side of the equation that are not shown, terms that make a more precise determination of the desired variable (either E or m) far less manageable. Exactitude is sacrificed for elegance. It is no wonder that the trade is sought, given the high value placed on an objective sense of balance (viz., laws of conservation) and completeness and totality of representation. Thus, the determination of a number by measurement does not entail that precision or exactitude of quantity is the desired aim.
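To make the trade concrete, here is a numerical sketch of my own using the standard Taylor expansion of the full relativistic energy E = γmc². Truncating after the first term gives the famous headline E = mc²; each further “hidden” nonlinear term buys exactitude at the price of manageability.

```python
import math

# The full relativistic energy E = γ·m·c² expands as the series
#   E = m·c² + (1/2)·m·v² + (3/8)·m·v⁴/c² + ...
# Keeping only the first term gives E = mc²; each further term
# improves exactitude. Illustrative numbers of my own.
c = 299_792_458.0                  # speed of light, m/s
m = 1.0                            # kg
v = 0.3 * c                        # a body moving at 30% of light speed

gamma = 1 / math.sqrt(1 - (v / c)**2)
exact = gamma * m * c**2

rest = m * c**2                                      # E = mc²
newtonian = rest + 0.5 * m * v**2                    # + classical kinetic energy
next_order = newtonian + (3 / 8) * m * v**4 / c**2   # + first hidden term

errors = [abs(exact - e) / exact for e in (rest, newtonian, next_order)]
print(errors)                      # each truncation error smaller than the last
```

At everyday speeds the discarded terms are vanishingly small, which is precisely why the elegant truncation is, for all practical purposes, irresistible.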

Modern natural science finds the pragmatic principle of “for all practical purposes” indispensable. Analyses are condensed or abridged without noticeably relinquishing control of prediction, planning or common standards of measurement. Some conscientious theorists are uneasy with this pragmatic incursion into quantificational matters because they can find no rational basis for calling a halt to what they already know is rational, viz., the mathematical rigor of the formulation and the certitude of the calculative operations. Can we know in advance the value at which we’ve reached the threshold of mathematical “control”? Why or why not? If so, can we state or specify this a priori as clearly and distinctly as the mathematical certitude derived from it? If not, is there anything from “nature” other than experimental repetition or a posteriori empirical operations that we can point to as a basis for our decision to interrupt the infinite iterations that unfold before us? These are the questions that need to be asked and that constitute the real philosophical import of the Heisenberg indeterminacy principle.3 But, historically speaking, we could have asked these questions before quantum mechanics was formulated.

3 It is on these foundational questions that you find commendable philosophical sensitivity on the part of physics theorists in the scientific literature.

Heisenberg indeterminacy relations, both quantum and classical, arise because the world is just too complex, or, speaking mathematically, “nonlinear,” for our practical purposes. In the process of idealization from nonlinear to linear, and abstraction from phenomena “in the wild” to their more docile, cultivated mathematical representations, we simplify the representations of concrete phenomena so that we can perform linear math on them. We find ourselves in a forest out of a Brothers Grimm tale and in order to make sense of it, we raze, prune, trim, flatten, and straighten all the wildness out of it until we have a tame, formal, English garden. The “higher order” terms are ignored as Rococo excesses of nature. This sweeping approximation requires the insertion of an estimated value into what is manifest (the “knowns”), a straightening of crooked curves and wiggles, a smoothing of rough terrain, i.e., an idealization. No matter how disheveled the crown of a tree, one can always determine smoothness by arbitrarily narrowing the focus to a smaller region. What must always be borne in mind is that the “global” view of the tree manifestly differs from the linear, smooth, local view. I am not implying that we cannot thereby mathematicize the global phenomenon; I merely wish to point out that we ought to avoid recklessly transferring the “good fit” of a linear formulation from a local level to a nonlinear holistic level.
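The simplest case of trading a crooked curve for a straight line shows the pattern (my own illustrative example): near zero, sin θ is well approximated by the line θ, the standard small-angle idealization of the pendulum, but the good local fit decays as the view widens.

```python
import math

# Local linearization: near zero, sin(θ) ≈ θ. The local fit is excellent
# but degrades as the region widens, which is the caution above about
# transferring a linear fit from the local level to the global level.
# Sample angles are my own illustration.
for theta in (0.01, 0.1, 0.5, 1.5):
    exact = math.sin(theta)
    linear = theta                          # the linearized stand-in
    rel_err = abs(exact - linear) / exact
    print(f"θ = {theta:>4}: relative error {rel_err:.1%}")
```

The error grows by orders of magnitude across this range: a “good fit” is always a fit somewhere, for some narrowed focus.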

We need to trace the ancestry of indeterminacy relations still further, for we have not yet reached their origin. Indeterminacy relations are found throughout the mathematical sciences, both classical and quantum. The quantum indeterminacy inequalities (1) and (2), and my chosen example of a classical indeterminacy inequality (3), are both representations of abstract objects: variance in ‘position’, ‘energy’, ‘frequency’, etc. Once objectified, these conceptual abstractions can be thought of in some (abstract) space. How are they related to one another in this abstract space? The fundamental mathematical activities (e.g., comparison, subordination, colligation, measurement) are not available to us in a non-sensible register, so we require a higher order analysis. First, we represent magnitudes by fixing arbitrary points A and B in an abstract space. Arbitrary vectors, A and B, can then be drawn with lines from a common origin to the two points. To complete their relation, construct an orthogonal (perpendicular) projection of one line to be superimposed on the other line. This projection creates a right-triangle relationship and leads to the “normal” equations of least-squares curve fitting. This projected right triangle contains a Heisenberg indeterminacy relation, the Cauchy-Schwarz inequality, which relates the lengths of the two vectors (the product of their norms) to the absolute value of the inner (or dot) product (also called the ‘correlation’) between them:

‖A‖ × ‖B‖ ≥ |A · B|,   (4)

or

⟨x, y⟩² ≤ ⟨x, x⟩ · ⟨y, y⟩   (5)

in generalized bra-ket notation. The Heisenberg indeterminacy principle just is the quantum mechanical expression of the Cauchy-Schwarz inequality. But recall how we generated the Cauchy-Schwarz inequality: two abstract straight lines and the formation of a right triangle. This procedure allows us to “normalize” unfixed vectors and simplify the “fitness” of an unknown quantity given a minimum number of known quantities. The paradigmatic example of determining an unknown value in light of two known values is the solution of the length of the side of a right triangle or the deflection of one of its unknown angles. All of the mathematical sciences, including both quantum and classical physics, insofar as they utilize or impose the constraint of linearity, contain an indeterminacy relation whose common ancestry can be traced to the Pythagorean theorem.
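Inequalities (4) and (5) can be spot-checked directly. The sketch is my own, using ordinary real vectors; the quantum case replaces these with vectors in Hilbert space.

```python
import numpy as np

# Spot-check of the Cauchy-Schwarz inequality, (4)/(5): for any vectors,
# the product of the norms bounds the absolute value of the inner product.
# Ordinary real vectors stand in here for the Hilbert-space case.
rng = np.random.default_rng(0)
for _ in range(1000):
    a = rng.normal(size=5)
    b = rng.normal(size=5)
    assert np.linalg.norm(a) * np.linalg.norm(b) >= abs(np.dot(a, b))

# Equality holds exactly when one vector is a scalar multiple of the other,
# i.e., when the projected right triangle degenerates to a line.
a = np.array([1.0, 2.0, 3.0])
print(np.isclose(np.linalg.norm(a) * np.linalg.norm(2 * a), abs(np.dot(a, 2 * a))))
```

The degenerate equality case is the geometric signature of perfect “correlation” between the two magnitudes.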

So, the mathematical formalism of quantum mechanics—its abstract “objects” (operators) and “space” (Hilbert space)—finds its roots not only in the algebraization of geometry begun by Descartes (quantum mechanics makes extensive use of the linear algebra generalized from analytic geometry) but also in the humble beginnings of Pythagorean and Euclidean geometry. Indeterminacy relations ultimately rest on the ubiquitous Pythagorean theorem. Underlying the modern use of the Pythagorean theorem is a notion of problem-solving and optimality whereby an unknown path is inferred from known components. The theorem depends on orthogonality conditions whereby two abstract objects intersect as if they were the legs of a right triangle. The orthogonality conditions permit the easiest way to find a trend in a scattering of data points and filter some of the noise from your car radio. The application of the Pythagorean theorem outside the realm of pure geometry, the finding of an optimum direction or value, the simplest interpolation, the easiest or least calculation, all indicate the supremacy of a principle of economy. But that is most certainly not a Pythagorean or Platonic principle. A philosophical reorientation was required to make it possible to have an interest in simplifying a problem for calculative purposes.4
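The “trend in a scattering of data points” can be sketched as follows, with illustrative numbers of my own: a line is fit by solving the normal equations, and the orthogonality and Pythagorean structure described above appear explicitly in the residual.

```python
import numpy as np

# "Finding a trend in a scattering of data points" via the normal equations
# of least-squares curve fitting: project the data vector y orthogonally
# onto the column space of X, so the residual is perpendicular to the fit
# and the Pythagorean theorem applies. Sample points are my own invention.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])      # noisy samples near y = 2x + 1

X = np.column_stack([np.ones_like(x), x])    # columns: intercept, slope
beta = np.linalg.solve(X.T @ X, X.T @ y)     # normal equations: XᵀX β = Xᵀy
residual = y - X @ beta

# Orthogonality: the residual is perpendicular to every column of X.
print(X.T @ residual)                        # ≈ [0, 0]
# The Pythagorean decomposition: ‖y‖² = ‖Xβ‖² + ‖residual‖².
print(np.allclose(y @ y, (X @ beta) @ (X @ beta) + residual @ residual))
```

The fitted and residual components form the legs of a right triangle whose hypotenuse is the data vector itself: the optimum is literally Pythagorean.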

Linearization is achieved by application of the Pythagorean theorem and it enables us to focus our efforts on the elements of a system that matter for calculative control, on the determination of manageable parts. This is precisely the approach taken in quantum mechanics. The initial appearance of subatomic entities and electromagnetic radiation as classically wavy phenomena allowed theorists to study them using well-understood and relatively simple concepts of linear wave mathematics: reflection, diffraction, interference, intensity, frequency, periodicity, superposition. Formally, quantum mechanics is not about “things in the world” but about swarms of “linear operators” in a cosmos of matter waves. But how are we to understand the necessary interface between these classically derived concepts, these abstract objects, this abstract space, and the perceived world of lived experience where these formulations are confirmed, the empirical manifold, the world of manifest perceptual experience?

4 An excellent review of this tectonic shift is found in David Lachterman’s The Ethics of Geometry: A Genealogy of Modernity. See also Marc Richir’s review of this book.

Linear mathematics spawns indeterminacy relations. So, our philosophical interest is

spurred not by indeterminacy relations per se, but by their origin in linearity assumptions. If a

system is nonlinear, the parts of the linearized subsystem do not add up to the whole. The

behavior of groups cannot be sufficiently understood as the accumulation of their components’

behaviors. Even if the mathematical artifacts of linearity assumptions, the indeterminacy

relations, are somehow transferable or superimposable on phenomena, then it is possible to ask:

how can it be that a measuring instrument (or a measurer, for that matter), which is a big,

complex chunk of material, is a reliable guide for studying the finest divisions of matter? This

question remains highly controversial and the analysis of indeterminacy relations can carry us no

further. We must seek out the question where it is questionable, not in the register of the

intelligible entities of mathematical operations, but in the register of the sensible. This is the

central problematic for further phenomenological research on the measurement problem.