The theory of relativity deals with the geometric
structure of a four-dimensional spacetime. Quantum mechanics
describes properties of matter. Combining these
two theoretical edifices is a difficult proposition. For example,
there is no way of defining a relativistic proper
time for a quantum system which is spread all over
space. A proper time can in principle be defined for a
massive apparatus (‘‘observer’’) whose Compton wavelength
is so small that its center of mass has classical
coordinates and follows a continuous world line. However,
when there is more than one apparatus, there is no
role for the private proper times that might be attached
to the observers’ world lines. Therefore a physical situation
involving several observers in relative motion cannot
be described by a wave function with a relativistic
transformation law (Aharonov and Albert, 1981; Peres,
1995, and references therein). This should not be surprising
because a wave function is not a physical object.
It is only a tool for computing the probabilities of objective
macroscopic events.
Einstein’s [special] principle of relativity asserts that there are
no privileged inertial frames.
[Comment #3: Einstein's general principle of relativity is that there are no privileged local non-inertial (accelerating) frames (LNIFs). In addition, Einstein's equivalence principle says that one can always find a local inertial frame (LIF) coincident with an LNIF (over a small enough region of 4D spacetime) in which, to a good approximation, Newton's 1/r^2 force is negligible ("Einstein's happiest thought"). Therefore, Newton's universal "gravity force" is a purely inertial, fictitious pseudo-force, exactly like the Coriolis, centrifugal, and Euler forces that are artifacts of the proper acceleration of the detector, having no real effect on the test particle being measured by the detector. The latter assumes no rigid constraint between detector and test particle. For example, a test particle clamped to the edge r of a uniformly, slowly rotating disk will have a real EM force of constraint equal to m ω × (ω × r).]
This does not imply the
necessity or even the possibility of using manifestly symmetric
four-dimensional notations. This is not a peculiarity
of relativistic quantum mechanics. Likewise, in classical
canonical theories, time has a special role in the
equations of motion.
The relativity principle is extraordinarily restrictive.
For example, in ordinary classical mechanics with a finite
number of degrees of freedom, the requirement that
the canonical coordinates q have the meaning of positions,
so that particle trajectories q(t) transform like
four-dimensional world lines, implies that these lines
consist of straight segments. Long-range interactions are
forbidden; there can be only contact interactions between
point particles (Currie, Jordan, and Sudarshan,
1963; Leutwyler, 1965). Nontrivial relativistic dynamics
requires an infinite number of degrees of freedom,
which are labeled by the spacetime coordinates (this is
called a field theory).
Combining relativity and quantum theory is not only a
difficult technical question on how to formulate dynamical
laws. The ontologies of these theories are radically
different. Classical theory asserts that fields, velocities,
etc., transform in a definite way and that the equations
of motion of particles and fields behave covariantly. …
For example, if the expression for the Lorentz force is written
...in one frame, the same expression is valid
in any other frame. These symbols … have objective
values. They represent entities that really exist, according
to the theory. On the other hand, wave functions
are not defined in spacetime, but in a multidimensional
Hilbert space. They do not transform covariantly when
there are interventions by external agents, as will be
seen in Sec. III. Only the classical parameters attached
to each intervention transform covariantly. Yet, in spite
of the noncovariance of ρ, the final results of the calculations
(the probabilities of specified sets of events) must
be Lorentz invariant.
As a simple example, consider our two observers, conventionally
called Alice and Bob, holding a pair of spin-1/2
particles in a singlet state. Alice measures σ_z and finds
+1, say. This tells her what the state of Bob’s particle is,
namely, the probabilities that Bob would obtain ±1 if he
measures (or has measured, or will measure) σ along
any direction he chooses. This is purely counterfactual
information: nothing changes at Bob’s location until he
performs the experiment himself, or receives a message
from Alice telling him the result that she found. In particular,
no experiment performed by Bob can tell him
whether Alice has measured (or will measure) her half
of the singlet.
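These conditional probabilities are easy to check with a few lines of linear algebra. The sketch below is my illustration (nothing of the sort appears in the review): it prepares a singlet, projects Alice's particle on σ_z = +1, and computes the probability that Bob finds +1 along a direction tilted by an angle theta from the z axis.

```python
import numpy as np

# Singlet state of two spin-1/2 particles in the product basis
# {|++>, |+->, |-+>, |-->}: (|+-> - |-+>)/sqrt(2).
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
I2 = np.eye(2)

def proj_up(theta):
    """Projector on spin +1 along a direction at polar angle theta."""
    n_up = np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)
    return np.outer(n_up, n_up.conj())

# Alice measures sigma_z and obtains +1: condition on her projector.
post = np.kron(proj_up(0.0), I2) @ singlet
post /= np.linalg.norm(post)

def p_bob_up(theta):
    """Probability that Bob obtains +1 along theta, given Alice's +1."""
    return np.linalg.norm(np.kron(I2, proj_up(theta)) @ post) ** 2

print(p_bob_up(0.0))     # -> 0.0: Bob never gets +1 along z
print(p_bob_up(np.pi))   # -> 1.0 (up to float rounding): +1 along -z is certain
```

For a general angle the result is sin²(θ/2), the standard singlet correlation; nothing in the calculation depends on when, or whether, Bob actually performs his measurement.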
A seemingly paradoxical way of presenting these results
is to ask the following naive question. Suppose that
Alice finds that σ_z = +1 while Bob does nothing. When
does the state of Bob’s particle, far away, become the
one for which σ_z = −1 with certainty? Although this
question is meaningless, it may be given a definite answer:
Bob’s particle state changes instantaneously. In
which Lorentz frame is this instantaneous? In any
frame! Whatever frame is chosen for defining simultaneity,
the experimentally observable result is the same, as
can be shown in a formal way (Peres, 2000b). Einstein
himself was puzzled by what seemed to be the instantaneous
transmission of quantum information. In his autobiography,
he used the words ‘‘telepathically’’ and
‘‘spook’’ (Einstein, 1949). …
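Formally, the frame independence claimed here boils down to commutativity: Alice's and Bob's projectors act on different factors of the tensor product, so either time ordering of the two measurements yields the same joint probabilities. A minimal numerical check (my sketch, not the formal argument of Peres, 2000b):

```python
import numpy as np

singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
I2 = np.eye(2)

def proj_up(theta):
    n = np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)
    return np.outer(n, n.conj())

theta = 1.234                          # an arbitrary direction for Bob
PA = np.kron(proj_up(0.0), I2)        # Alice finds +1 along z
PB = np.kron(I2, proj_up(theta))      # Bob finds +1 along theta

# Joint probability computed with Alice "first" and with Bob "first".
p_ab = np.linalg.norm(PB @ PA @ singlet) ** 2
p_ba = np.linalg.norm(PA @ PB @ singlet) ** 2
print(np.isclose(p_ab, p_ba))         # True: the ordering is irrelevant
```

Since [PA, PB] = 0, any Lorentz frame's simultaneity convention gives the same observable statistics, which is the "in any frame!" of the text.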
In the laboratory, any experiment
has to be repeated many times in order to infer a
law; in a theoretical discussion, we may imagine an infinite
number of replicas of our gedanken experiment, so
as to have a genuine statistical ensemble. Yet the validity
of the statistical nature of quantum theory is not restricted
to situations in which there are a large number
of similar systems. Statistical predictions do apply to
single events. When we are told that the probability of
precipitation tomorrow is 35%, there is only one tomorrow.
This tells us that it may be advisable to carry an
umbrella. Probability theory is simply the quantitative
formulation of how to make rational decisions in the
face of uncertainty (Fuchs and Peres, 2000). A lucid
analysis of how probabilistic concepts are incorporated
into physical theories is given by Emch and Liu (2002).
[My comment #4: Peres is correct, but there is no conflict with Bohm's ontological
interpretation here. The Born probability rule is not fundamental to quantum reality
in Bohm's view, but is a limiting case when the beables are in thermal equilibrium.]
Some trends in modern quantum information theory
may be traced to security problems in quantum communication.
A very early contribution was Wiesner’s seminal
paper ‘‘Conjugate Coding,’’ which was submitted
circa 1970 to IEEE Transactions on Information Theory
and promptly rejected because it was written in a jargon
incomprehensible to computer scientists (this was actually
a paper about physics, but it had been submitted to
a computer science journal). Wiesner’s article was finally
published (Wiesner, 1983) in the newsletter of ACM
SIGACT (Association for Computing Machinery, Special
Interest Group in Algorithms and Computation
Theory). That article tacitly assumed that exact duplication
of an unknown quantum state was impossible, well
before the no-cloning theorem (Dieks, 1982; Wootters
and Zurek, 1982) became common knowledge. Another
early article, ‘‘Unforgeable Subway Tokens’’ (Bennett
et al., 1983) also tacitly assumed the same.
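The no-cloning theorem mentioned here has a one-line linearity argument worth recalling (the standard textbook derivation, not Wiesner's original reasoning). Suppose a single unitary U copied every unknown state onto a blank ancilla; then, for any two states |ψ⟩ and |φ⟩, unitarity preserves inner products:

```latex
% Hypothetical universal cloner: U|\psi\rangle|0\rangle = |\psi\rangle|\psi\rangle
% for every |\psi\rangle. Comparing inner products before and after:
\langle\phi|\psi\rangle
  = \bigl(\langle\phi|\langle 0|\bigr)\,U^{\dagger}U\,\bigl(|\psi\rangle|0\rangle\bigr)
  = \bigl(\langle\phi|\langle\phi|\bigr)\bigl(|\psi\rangle|\psi\rangle\bigr)
  = \langle\phi|\psi\rangle^{2}
```

Hence ⟨φ|ψ⟩ must equal 0 or 1: only mutually orthogonal states can be duplicated by a single device, which is exactly the property that conjugate coding exploits.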
II. THE ACQUISITION OF INFORMATION
A. The ambivalent quantum observer
Quantum mechanics is used by theorists in two different
ways. It is a tool for computing accurate relationships
between physical constants, such as energy levels,
cross sections, transition rates, etc. These calculations
are technically difficult, but they are not controversial.
In addition to this, quantum mechanics also provides
statistical predictions for results of measurements performed
on physical systems that have been prepared in a specified way.
[My comment #5: No mention of Yakir Aharonov's intermediate present "weak measurements"
with both history past pre-selection and destiny future post-selection constraints. The latter in
Wheeler delayed choice mode would force the inference of real back-from-the-future retrocausality.
This would still be consistent with Abner Shimony's "passion at a distance," i.e. "signal locality"
in that the observer at the present weak measurement would not know what the future constraint
actually will be. In contrast, with signal nonlocality (Sarfatti, 1976 MIT Tech Review (Martin Gardner) &
Antony Valentini (2002)), such spooky precognition would be possible, as in Russell Targ's reports on
CIA-funded remote-viewing (RV) experiments at SRI in the mid-1970s and 1980s.
This is, on the face of it, a gross violation of orthodox
quantum theory as laid out here in the Peres review paper.]
The quantum measuring process is the interface
of classical and quantum phenomena. The preparation
and measurement are performed by macroscopic
devices, and these are described in classical terms. The
necessity of using a classical terminology was emphasized
by Niels Bohr (1927) from the very early days of
quantum mechanics. Bohr’s insistence on a classical description
was very strict. He wrote (1949)
‘‘ . . . by the word ‘experiment’ we refer to a situation
where we can tell others what we have done and what
we have learned and that, therefore, the account of the
experimental arrangement and of the results of the observations
must be expressed in unambiguous language,
with suitable application of the terminology of
classical physics.’’
Note the words ‘‘we can tell.’’ Bohr was concerned
with information, in the broadest sense of this term. He
never said that there were classical systems or quantum
systems. There were physical systems, for which it was
appropriate to use the classical language or the quantum
language. There is no guarantee that either language
gives a perfect description, but in a well-designed experiment
it should be at least a good approximation.
Bohr’s approach divides the physical world into ‘‘endosystems’’
(Finkelstein, 1988), which are described by
quantum dynamics, and ‘‘exosystems’’ (such as measuring
apparatuses), which are not described by the dynamical
formalism of the endosystem under consideration.
A physical system is called ‘‘open’’ when parts of
the universe are excluded from its description. In different
Lorentz frames used by observers in relative motion,
different parts of the universe may be excluded. The
systems considered by these observers are then essentially
different, and no Lorentz transformation exists
that can relate them (Peres and Terno, 2002).
It is noteworthy that Bohr never described the measuring
process as a dynamical interaction between an
exophysical apparatus and the system under observation.
He was, of course, fully aware that measuring apparatuses
are made of the same kind of matter as everything
else, and they obey the same physical laws. It is
therefore tempting to use quantum theory in order to
investigate their behavior during a measurement. However,
if this is done, the quantized apparatus loses its
status as a measuring instrument. It becomes a mere intermediate
system in the measuring process, and there
must still be a final instrument that has a purely classical
description (Bohr, 1939).
Measurement was understood by Bohr as a primitive
notion. He could thereby elude questions which caused
considerable controversy among other authors. A
quantum-dynamical description of the measuring process
was first attempted by John von Neumann in his
treatise on the mathematical foundations of quantum
theory (1932). In the last section of that book, as in an
afterthought, von Neumann represented the apparatus
by a single degree of freedom, whose value was correlated
with that of the dynamical variable being measured.
Such an apparatus is not, in general, left in a definite
pure state, and it does not admit a classical
description. Therefore von Neumann introduced a second
apparatus which observes the first one, and possibly
a third apparatus, and so on, until there is a final measurement,
which is not described by quantum dynamics
and has a definite result (for which quantum mechanics
can give only statistical predictions). The essential point
that was suggested, but not proved by von Neumann, is
that the introduction of this sequence of apparatuses is
irrelevant: the final result is the same, irrespective of the
location of the ‘‘cut’’ between classical and quantum worlds.
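Von Neumann's chain can be imitated with qubits. In the toy model below (my sketch, not von Neumann's actual single-degree-of-freedom apparatus), each ‘‘apparatus’’ qubit copies the previous one in the computational basis via a CNOT gate; the readout statistics of the last apparatus are independent of how many intermediate apparatuses are inserted, i.e., of where the cut is placed.

```python
import numpy as np

def cnot(state, n, control, target):
    """Apply CNOT(control -> target) to an n-qubit state vector."""
    psi = np.moveaxis(state.reshape([2] * n), [control, target], [0, 1]).copy()
    psi[1] = psi[1][::-1].copy()          # flip target where control = 1
    return np.moveaxis(psi, [0, 1], [control, target]).reshape(-1)

alpha, beta = 0.6, 0.8                    # system amplitudes, 0.36 + 0.64 = 1

def final_readout_prob(n_apparatus):
    """P(last apparatus reads 1) after a chain of n_apparatus copies."""
    n = 1 + n_apparatus
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = alpha                      # system |0>, all apparatuses |0>
    state[2 ** (n - 1)] = beta            # system |1>, all apparatuses |0>
    for k in range(n - 1):                # each apparatus observes the previous
        state = cnot(state, n, k, k + 1)
    return np.sum(np.abs(state.reshape(-1, 2)[:, 1]) ** 2)

print(final_readout_prob(1), final_readout_prob(5))  # both give |beta|^2
```

The unitary CNOTs only correlate the qubits; nothing in the chain singles out a step at which a definite outcome occurs, which is why a final, classically described instrument is still needed.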
These different approaches of Bohr and von Neumann
were reconciled by Hay and Peres (1998), who …
[Footnote 8: At this point, von Neumann also speculated that the final
step involves the consciousness of the observer—a bizarre
statement in a mathematically rigorous monograph (von Neumann, …).]
B. The measuring process
Dirac (1947) wrote that ‘‘a measurement always
causes the system to jump into an eigenstate of the dynamical
variable being measured.’’ Here, we must be
careful: a quantum jump (also called a collapse) is something
that happens in our description of the system, not
to the system itself. Likewise, the time dependence of
the wave function does not represent the evolution of a
physical system. It only gives the evolution of probabilities
for the outcomes of potential experiments on that
system (Fuchs and Peres, 2000).
Let us examine more closely the measuring process.
First, we must refine the notion of measurement and
extend it to a more general one: an intervention. An
intervention is described by a set of parameters which
include the location of the intervention in spacetime, referred
to an arbitrary coordinate system. We also have
to specify the speed and orientation of the apparatus in
the coordinate system that we are using as well as various
other input parameters that control the apparatus,
such as the strength of a magnetic field or that of an rf
pulse used in the experiment. The input parameters are
determined by classical information received from past
interventions, or they may be chosen arbitrarily by the
observer who prepares that intervention or by a local
random device acting in lieu of the observer.
[My comment #6: Peres, in my opinion, makes another mistake:
future interventions will affect past weak measurements. See
"Back From the Future" by Zeeya Merali (August 26, 2010):
"A series of quantum experiments shows that measurements performed in the future can influence the present. Does that mean the universe has a destiny—and the laws of physics pull us inexorably toward our prewritten fate?"]
An intervention has two consequences. One is the acquisition
of information by means of an apparatus that
produces a record. This is the ‘‘measurement.’’ Its outcome,
which is in general unpredictable, is the output of
the intervention. The other consequence is a change of
the environment in which the quantum system will
evolve after completion of the intervention. For example,
the intervening apparatus may generate a new
Hamiltonian that depends on the recorded result. In particular,
classical signals may be emitted for controlling
the execution of further interventions. These signals are,
of course, limited to the velocity of light.
The experimental protocols that we consider all start
in the same way, with the same initial state ... , and the
first intervention is the same. However, later stages of
the experiment may involve different types of interventions,
possibly with different spacetime locations, depending
on the outcomes of the preceding events. Yet,
assuming that each intervention has only a finite number
of outcomes, there is for the entire experiment only a
finite number of possible records. (Here, the word
record means the complete list of outcomes that occurred
during the experiment. We do not want to use the
word history, which has acquired a different meaning in
the writings of some quantum theorists.)
Each one of these records has a definite probability in
the statistical ensemble. In the laboratory, experimenters
can observe its relative frequency among all the records
that were obtained; when the number of records tends
to infinity, this relative frequency is expected to tend to
the true probability. The aim of theory is to predict the
probability of each record, given the inputs of the various
interventions (both the inputs that are actually controlled
by the local experimenter and those determined
by the outputs of earlier interventions). Each record is
objective: everyone agrees on what happened (e.g.,
which detectors clicked). Therefore, everyone agrees on
what the various relative frequencies are, and the theoretical
probabilities are also the same for everyone.
Interventions are localized in spacetime, but quantum
systems are pervasive. In each experiment, irrespective
of its history, there is only one quantum system, which
may consist of several particles or other subsystems, created
or annihilated at the various interventions. Note
that all these properties still hold if the measurement
outcome is the absence of a detector click. It does not
matter whether this is due to an imperfection of the detector
or to a probability less than 1 that a perfect detector
would be excited. The state of the quantum system
does not remain unchanged. It has to change to
respect unitarity. The mere presence of a detector that
could have been excited implies that there has been an
interaction between that detector and the quantum system.
Even if the detector has a finite probability of remaining
in its initial state, the quantum system correlated
to the latter acquires a different state (Dicke,
1981). The absence of a click, when there could have
been one, is also an event.
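The point that even a non-click updates the state can be made quantitative with a two-outcome Kraus model (my toy example, in the spirit of Dicke, 1981; the efficiency parameter η and the operators below are my assumptions): a detector of efficiency η couples only to the |1⟩ component, and conditioning on ‘‘no click’’ reweights the amplitudes.

```python
import numpy as np

eta = 0.3                                   # assumed detector efficiency
M_click = np.diag([0.0, np.sqrt(eta)])      # Kraus operator, outcome "click"
M_none  = np.diag([1.0, np.sqrt(1 - eta)])  # Kraus operator, outcome "no click"

# Completeness: the two outcomes exhaust all possibilities (unitarity).
assert np.allclose(M_click.T @ M_click + M_none.T @ M_none, np.eye(2))

psi = np.array([0.6, 0.8])                  # system state before the detector
out = M_none @ psi
p_none = np.linalg.norm(out) ** 2           # probability of no click
post = out / np.linalg.norm(out)            # state conditioned on no click

print(p_none)          # ~0.808
print(post)            # amplitudes reweighted: the state has changed
```

The |1⟩ amplitude is suppressed by √(1−η), so post-selecting on the detector's silence is itself an event with dynamical consequences, exactly as the text asserts.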
The measuring process involves not only the physical
system under study and a measuring apparatus (which
together form the composite system C) but also their
environment, which includes unspecified degrees of freedom
of the apparatus and the rest of the world. These
unknown degrees of freedom interact with the relevant
ones, but they are not under the control of the experimenter
and cannot be explicitly described. Our partial
ignorance is not a sign of weakness. It is fundamental. If
everything were known, acquisition of information
would be a meaningless concept.
A complete description of C involves both macroscopic
and microscopic variables. The difference between
them is that the environment can be considered as
adequately isolated from the microscopic degrees of
freedom for the duration of the experiment and is not
influenced by them, while the environment is not isolated
from the macroscopic degrees of freedom. For example,
if there is a macroscopic pointer, air molecules bounce
from it in a way that depends on the position of that
pointer. Even if we can neglect the Brownian motion of
a massive pointer, its influence on the environment leads
to the phenomenon of decoherence, which is inherent to
the measuring process.
An essential property of the composite system C,
which is necessary to produce a meaningful measurement,
is that its states form a finite number of orthogonal
subspaces which are distinguishable by the observer.
[My comment #7: This is not the case for Aharonov's weak measurements, where
<A>_weak = <history|A|destiny>/<history|destiny>.
Nor is it true when Alice's orthogonal micro-states are entangled with Bob's faraway, distinguishably non-orthogonal macro-quantum Glauber coherent (and possibly squeezed) states:
|Alice,Bob> = (1/2)[|Alice +1>|Bob alpha> + |Alice -1>|Bob beta>],
<Alice +1|Alice -1> = 0,
<Bob alpha|Bob beta> =/= 0.
E.g., the partial trace over Bob's states gives |<Alice +1|Alice,Bob>|^2 = (1/2)[1 + |<Bob alpha|Bob beta>|^2] > 1/2;
this is formally like a weak measurement where the usual Born probability rule breaks down.
Complete isolation from environmental decoherence is assumed here.
This is a clear violation of "passion at a distance" no-entanglement-signaling arguments, based on axioms that are empirically false in my opinion.
"The statistics of Bob’s result are not affected at all by what Alice may simultaneously do somewhere else." (Peres)
While a logically correct formal proof is desirable in physics, Nature has ways of leapfrogging over its premises.
One can have constrained pre- and post-selected conditional probabilities that are greater than 1, negative, and even complex numbers,
all of which correspond to observable effects in the laboratory; see Aephraim Steinberg's experimental papers,
University of Toronto.]
Each macroscopically distinguishable subspace corresponds
to one of the outcomes of the intervention and
defines a POVM element E_m, given explicitly by Eq. (8).
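Eq. (8) itself is not reproduced in this excerpt, but the defining properties of a POVM are easy to state and verify numerically: every element E_m is a positive operator and the elements sum to the identity, so the probabilities p_m = tr(ρE_m) are non-negative and sum to 1. A standard illustration (my example, not from the review) is the three-element ‘‘trine’’ POVM for a qubit:

```python
import numpy as np

def trine_element(k):
    """E_k = (2/3)|n_k><n_k|, with the three n_k separated by 120 degrees."""
    th = 2 * np.pi * k / 3
    n = np.array([np.cos(th / 2), np.sin(th / 2)])
    return (2.0 / 3.0) * np.outer(n, n)

E = [trine_element(k) for k in range(3)]

print(np.allclose(sum(E), np.eye(2)))                         # True: completeness
print(all(np.linalg.eigvalsh(e).min() >= -1e-12 for e in E))  # True: positivity

psi = np.array([0.6, 0.8])                  # any pure state
probs = [float(psi @ e @ psi) for e in E]
print(np.isclose(sum(probs), 1.0))          # True: probabilities sum to 1
```

Unlike projective measurements, the three elements are not mutually orthogonal; only positivity and completeness are required.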
Up to now, quantum evolution is well defined and it is
in principle reversible. It would remain so if the environment
could be perfectly isolated from the macroscopic
degrees of freedom of the apparatus. This demand is of
course self-contradictory, since we have to read the result
of the measurement if we wish to make any use of it.
A detailed analysis of the interaction with the environment,
together with plausible hypotheses (Peres, 2000a),
shows that states of the environment that are correlated
with subspaces of C with different labels m can be treated
as if they were orthogonal. This is an excellent approximation
(physics is not an exact science, it is a science of
approximations). The resulting theoretical predictions
will almost always be correct, and if any rare small deviation
from them is ever observed, it will be considered
as a statistical quirk or an experimental error.
The density matrix of the quantum system is thus effectively
block diagonal, and all our statistical predictions
are identical to those obtained for an ordinary mixture
of (unnormalized) pure states …
This process is called decoherence. Each subspace
m is stable under decoherence—it is their relative
phase that decoheres. From this moment on, the macroscopic
degrees of freedom of C have entered into the
classical domain. We can safely observe them and ‘‘lay
on them our grubby hands’’ (Caves, 1982). In particular,
they can be used to trigger amplification mechanisms
(the so-called detector clicks) for the convenience of the experimenter.
Some authors claim that decoherence may provide a
solution of the ‘‘measurement problem,’’ with the particular
meaning that they attribute to that problem
(Zurek, 1991). Others dispute this point of view in their
comments on the above article (Zurek, 1993). A reassessment
of this issue and many important technical details
were recently published by Zurek (2002, 2003). Yet
decoherence has an essential role, as explained above. It
is essential that we distinguish decoherence, which results
from the disturbance of the environment by the
apparatus (and is a quantum effect), from noise, which
would result from the disturbance of the system or the
apparatus by the environment and would cause errors.
Noise is a mundane classical phenomenon, which we ignore
in this review.
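The block-diagonal structure described above can be demonstrated directly. In the sketch below (my illustration, with two sectors standing in for the macroscopically distinguishable subspaces m), decoherence is modeled as erasing the coherence between the sectors while leaving each block intact; the predictions tr(ρP_m) are unchanged, just as the text says.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
rho = A @ A.conj().T
rho /= np.trace(rho)                        # a generic density matrix

P0 = np.diag([1, 1, 0, 0]).astype(complex)  # projector on sector m = 0
P1 = np.eye(4) - P0                         # projector on sector m = 1

# Decoherence: keep the diagonal blocks, discard inter-sector coherence.
rho_dec = P0 @ rho @ P0 + P1 @ rho @ P1

for P in (P0, P1):                          # outcome probabilities unchanged
    assert np.isclose(np.trace(P @ rho).real, np.trace(P @ rho_dec).real)
print(abs(rho_dec[0, 2]))                   # 0.0: off-block coherence is gone
```

Each block is stable; only the relative phase between the sectors is lost, which is what licenses treating the pointer variable classically from this moment on.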
E. The no-communication theorem
We now derive a sufficient condition that no instantaneous
information transfer can result from a distant intervention.
We shall show that the condition is
[A_μm , B_νn] = 0,
where A_μm and B_νn are Kraus matrices for the observation
of outcomes m by Alice and n by Bob.
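The content of the condition can be illustrated with the simplest case in which it holds automatically: Alice's Kraus matrices act only on her subsystem and Bob's only on his, so the operators commute. The sketch below (my construction; the diagonal operators are chosen purely for readability) checks that Bob's outcome probabilities are unchanged by Alice's intervention, even for a maximally entangled state.

```python
import numpy as np

I2 = np.eye(2)

# Alice: a sigma_z measurement. Bob: a generic two-outcome POVM.
K = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]
L = [np.diag([np.sqrt(0.7), np.sqrt(0.2)]),
     np.diag([np.sqrt(0.3), np.sqrt(0.8)])]
assert np.allclose(sum(k.conj().T @ k for k in K), I2)   # completeness
assert np.allclose(sum(l.conj().T @ l for l in L), I2)

psi = np.array([0, 1, -1, 0]) / np.sqrt(2)               # singlet state
rho = np.outer(psi, psi)

def p_bob(n, alice_acts):
    """Probability of Bob's outcome n, with or without Alice acting first."""
    B = np.kron(I2, L[n])
    if alice_acts:
        return sum(np.trace(B @ A @ rho @ A.conj().T @ B.conj().T).real
                   for A in (np.kron(k, I2) for k in K))
    return np.trace(B @ rho @ B.conj().T).real

for n in range(2):
    print(p_bob(n, True), p_bob(n, False))   # identical pairs: no signaling
```

When the two sets of Kraus matrices fail to commute (for instance, when they act on the same degrees of freedom), the equality can fail, which is why commutativity is the operative sufficient condition.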
[My comment #8: "The most beautiful theory is murdered by an ugly fact." - Feynman
E.g., Libet-Radin-Bierman presponse in living-brain data, and
SRI CIA-vetted reports of remote viewing by living brains.
Peres here is only talking about von Neumann's strong measurements, not
Aharonov's weak measurements.
Begin forwarded message:
The late Asher Peres's interpretation (http://en.wikipedia.org/wiki/Asher_Peres) is the antithesis of the late David Bohm's ontological interpretation (http://en.wikipedia.org/wiki/David_Bohm), holding to a purely subjective, epistemological Bohrian interpretation of the quantum BIT potential Q. He claims that Antony Valentini's signal nonlocality beyond orthodox quantum theory would violate the Second Law of Thermodynamics.

REVIEWS OF MODERN PHYSICS, VOLUME 76, JANUARY 2004
Quantum information and relativity theory
Asher Peres, Department of Physics, Technion–Israel Institute of Technology, 32000 Haifa, Israel
Daniel R. Terno, Perimeter Institute for Theoretical Physics, Waterloo, Ontario, Canada N2J 2W9
(Published 6 January 2004)

This article discusses the intimate relationship between quantum mechanics, information theory, and relativity theory. Taken together these are the foundations of present-day theoretical physics, and their interrelationship is an essential part of the theory. The acquisition of information from a quantum system by an observer occurs at the interface of classical and quantum physics. The authors review the essential tools needed to describe this interface, i.e., Kraus matrices and positive-operator-valued measures. They then discuss how special relativity imposes severe restrictions on the transfer of information between distant systems and the implications of the fact that quantum entropy is not a Lorentz-covariant concept. This leads to a discussion of how it comes about that Lorentz transformations of reduced density matrices for entangled systems may not be completely positive maps. Quantum field theory is, of course, necessary for a consistent description of interactions. Its structure implies a fundamental tradeoff between detector reliability and localizability. Moreover, general relativity produces new and counterintuitive effects, particularly when black holes (or, more generally, event horizons) are involved. In this more general context the authors discuss how most of the current concepts in quantum information theory may require a reassessment.

CONTENTS
I. Three Inseparable Theories
 A. Relativity and information
 B. Quantum mechanics and information
 C. Relativity and quantum theory
 D. The meaning of probability
 E. The role of topology
 F. The essence of quantum information
II. The Acquisition of Information
 A. The ambivalent quantum observer
 B. The measuring process
 C. Decoherence
 D. Kraus matrices and positive-operator-valued measures (POVM’s)
 E. The no-communication theorem
III. The Relativistic Measuring Process
 A. General properties
 B. The role of relativity
 C. Quantum nonlocality?
 D. Classical analogies
IV. Quantum Entropy and Special Relativity
 A. Reduced density matrices
 B. Massive particles
 C. Photons
 D. Entanglement
 E. Communication channels
V. The Role of Quantum Field Theory
 A. General theorems
 B. Particles and localization
 C. Entanglement in quantum field theory
 D. Accelerated detectors
VI. Beyond Special Relativity
 A. Entanglement revisited
 B. The thermodynamics of black holes
 C. Open problems
Acknowledgments and Apologies
Appendix A: Relativistic State Transformations
Appendix B: Black-Hole Radiation
References

I. THREE INSEPARABLE THEORIES
Quantum theory and relativity theory emerged at the beginning of the twentieth century to give answers to unexplained issues in physics: the blackbody spectrum, the structure of atoms and nuclei, the electrodynamics of moving bodies. Many years later, information theory was developed by Claude Shannon (1948) for analyzing the efficiency of communication methods. How do these seemingly disparate disciplines relate to each other? In this review, we shall show that they are inseparably linked.

A. Relativity and information
Common presentations of relativity theory employ fictitious observers who send and receive signals. These ‘‘observers’’ should not be thought of as human beings, but rather as ordinary physical emitters and detectors. Their role is to label and locate events in spacetime. The speed of transmission of these signals is bounded by c—the velocity of light—because information needs a material carrier, and the latter must obey the laws of physics. Information is physical (Landauer, 1991).

[My comment #1: Indeed, information is physical. Contrary to Peres, in Bohm's theory Q is also physical but not material (a beable); consequently, one can have entanglement negentropy transfer without beable material propagation of a classical signal. I think Peres makes a fundamental error here.]

However, the mere existence of an upper bound on the speed of propagation of physical effects does not do justice to the fundamentally new concepts that were introduced by Albert Einstein (one could as well imagine communications limited by the speed of sound, or that of the postal service). Einstein showed that simultaneity had no absolute meaning, and that distant events might have different time orderings when referred to observers in relative motion. Relativistic kinematics is all about information transfer between observers in relative motion.

Classical information theory involves concepts such as the rates of emission and detection of signals, and the noise power spectrum. These variables have well-defined relativistic transformation properties, independent of the actual physical implementation of the communication system.