Stardrive
The theory of relativity deals with the geometric
structure of a four-dimensional spacetime. Quantum mechanics
describes properties of matter. Combining these
two theoretical edifices is a difficult proposition. For example,
there is no way of defining a relativistic proper
time for a quantum system which is spread all over
space. A proper time can in principle be defined for a
massive apparatus (‘‘observer’’) whose Compton wavelength
is so small that its center of mass has classical
coordinates and follows a continuous world line. However,
when there is more than one apparatus, there is no
role for the private proper times that might be attached
to the observers’ world lines. Therefore a physical situation
involving several observers in relative motion cannot
be described by a wave function with a relativistic
transformation law (Aharonov and Albert, 1981; Peres,
1995, and references therein). This should not be surprising
because a wave function is not a physical object.
It is only a tool for computing the probabilities of objective
macroscopic events.
 
Einstein’s [special] principle of relativity asserts that there are
no privileged inertial frames. 
 
[Comment #3: Einstein's general principle of relativity is that there are no privileged local accelerating frames (AKA LNIFs). In addition, Einstein's equivalence principle is that one can always find a local inertial frame (LIF) coincident with a LNIF (over a small enough region of 4D space-time) in which, to a good approximation, Newton's 1/r^2 force is negligible ("Einstein's happiest thought"). Therefore, Newton's universal "gravity force" is a purely inertial, fictitious pseudo-force, exactly like the Coriolis, centrifugal, and Euler forces: artifacts of the proper acceleration of the detector, having no real effect on the test particle being measured by the detector. The latter assumes no rigid constraint between detector and test particle. For example, a test particle clamped to the edge r of a uniformly, slowly rotating disk will have a real EM force of constraint equal to m ω × (ω × r).]
 
This does not imply the
necessity or even the possibility of using manifestly symmetric
four-dimensional notations. This is not a peculiarity
of relativistic quantum mechanics. Likewise, in classical
canonical theories, time has a special role in the
equations of motion.
 
The relativity principle is extraordinarily restrictive.
For example, in ordinary classical mechanics with a finite
number of degrees of freedom, the requirement that
the canonical coordinates have the meaning of positions,
so that particle trajectories q(t) transform like
four-dimensional world lines, implies that these lines
consist of straight segments. Long-range interactions are
forbidden; there can be only contact interactions between
point particles (Currie, Jordan, and Sudarshan,
1963; Leutwyler, 1965). Nontrivial relativistic dynamics
requires an infinite number of degrees of freedom,
which are labeled by the spacetime coordinates (this is
called a field theory).
 
Combining relativity and quantum theory is not only a
difficult technical question on how to formulate dynamical
laws. The ontologies of these theories are radically
different. Classical theory asserts that fields, velocities,
etc., transform in a definite way and that the equations
of motion of particles and fields behave covariantly. …
 
For example, if the expression for the Lorentz force is written
...in one frame, the same expression is valid
in any other frame. These symbols …. have objective
values. They represent entities that really exist, according
to the theory. On the other hand, wave functions
are not defined in spacetime, but in a multidimensional
Hilbert space. They do not transform covariantly when
there are interventions by external agents, as will be
seen in Sec. III. Only the classical parameters attached
to each intervention transform covariantly. Yet, in spite
of the noncovariance of ρ, the final results of the calculations
(the probabilities of specified sets of events) must
be Lorentz invariant.
 
As a simple example, consider our two observers, conventionally
called Alice and Bob, holding a pair of spin-1/2
particles in a singlet state. Alice measures s_z and finds
+1, say. This tells her what the state of Bob’s particle is,
namely, the probabilities that Bob would obtain + or - 1 if he
measures (or has measured, or will measure) s along
any direction he chooses. This is purely counterfactual
information: nothing changes at Bob’s location until he
performs the experiment himself, or receives a message
from Alice telling him the result that she found. In particular,
no experiment performed by Bob can tell him
whether Alice has measured (or will measure) her half
of the singlet.
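The singlet statistics described here can be checked numerically. The sketch below (illustrative Python, not from the paper; the function names are mine) encodes the standard joint probability P(a,b) = (1/4)(1 − ab cos(θ_A − θ_B)) for spin measurements along axes θ_A and θ_B, and verifies that Bob's marginal statistics are independent of Alice's choice of axis, which is exactly why Alice's result gives Bob only counterfactual information.

```python
import math

def singlet_prob(a, b, theta_a, theta_b):
    # Joint probability for outcomes a, b = +/-1 when Alice and Bob measure
    # spin along axes theta_a, theta_b on a singlet pair:
    # P(a, b) = (1/4) * (1 - a*b*cos(theta_a - theta_b))
    return 0.25 * (1 - a * b * math.cos(theta_a - theta_b))

def bob_marginal(b, theta_a, theta_b):
    # Bob's statistics with Alice's outcome summed out.
    return sum(singlet_prob(a, b, theta_a, theta_b) for a in (+1, -1))

# Same axis: perfect anticorrelation, as expected for the singlet.
assert abs(singlet_prob(+1, -1, 0.0, 0.0) - 0.5) < 1e-12

# Bob's marginal is 1/2 whatever axis Alice chooses:
for theta_a in (0.0, 0.7, 1.3, 2.9):
    assert abs(bob_marginal(+1, theta_a, 0.4) - 0.5) < 1e-12
```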
 
A seemingly paradoxical way of presenting these results
is to ask the following naive question. Suppose that
Alice finds that s_z = +1 while Bob does nothing. When
does the state of Bob’s particle, far away, become the
one for which s_z = -1 with certainty? Although this
question is meaningless, it may be given a definite answer:
Bob’s particle state changes instantaneously. In
which Lorentz frame is this instantaneous? In any
frame! Whatever frame is chosen for defining simultaneity,
the experimentally observable result is the same, as
can be shown in a formal way (Peres, 2000b). Einstein
himself was puzzled by what seemed to be the instantaneous
transmission of quantum information. In his autobiography,
he used the words ‘‘telepathically’’ and
‘‘spook’’ (Einstein, 1949). …
 
In the laboratory, any experiment
has to be repeated many times in order to infer a
law; in a theoretical discussion, we may imagine an infinite
number of replicas of our gedanken experiment, so
as to have a genuine statistical ensemble. Yet the validity
of the statistical nature of quantum theory is not restricted
to situations in which there are a large number
of similar systems. Statistical predictions do apply to
single events. When we are told that the probability of
precipitation tomorrow is 35%, there is only one tomorrow.
This tells us that it may be advisable to carry an
umbrella. Probability theory is simply the quantitative
formulation of how to make rational decisions in the
face of uncertainty (Fuchs and Peres, 2000). A lucid
analysis of how probabilistic concepts are incorporated
into physical theories is given by Emch and Liu (2002).
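The point about relative frequencies and single events can be made concrete with a toy simulation (Python; the 35% figure is taken from the text, everything else is my own illustration): the theoretical probability is what the relative frequency converges to as the ensemble grows, yet it is meaningful even for a single trial, as a guide to rational decision.

```python
import random

def relative_frequency(p, n, seed=0):
    # Simulate n repetitions of an intervention whose outcome ("click")
    # has theoretical probability p; return the observed relative frequency.
    rng = random.Random(seed)
    clicks = sum(1 for _ in range(n) if rng.random() < p)
    return clicks / n

p = 0.35  # "the probability of precipitation tomorrow is 35%"
# The relative frequency settles near the theoretical probability
# as the ensemble grows ...
assert abs(relative_frequency(p, 1_000_000) - p) < 0.005
# ... yet the probability remains meaningful for the single tomorrow,
# as a guide to rational decisions (carry the umbrella or not).
```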
 
[My comment #4: Peres is correct, but there is no conflict with Bohm's ontological
interpretation here. The Born probability rule is not fundamental to quantum reality
in Bohm's view, but is a limiting case when the beables are in thermal equilibrium.]
 
...
 
Some trends in modern quantum information theory
may be traced to security problems in quantum communication.
A very early contribution was Wiesner’s seminal
paper ‘‘Conjugate Coding,’’ which was submitted
circa 1970 to IEEE Transactions on Information Theory
and promptly rejected because it was written in a jargon
incomprehensible to computer scientists (this was actually
a paper about physics, but it had been submitted to
a computer science journal). Wiesner’s article was finally
published (Wiesner, 1983) in the newsletter of ACM
SIGACT (Association for Computing Machinery, Special
Interest Group in Algorithms and Computation
Theory). That article tacitly assumed that exact duplication
of an unknown quantum state was impossible, well
before the no-cloning theorem (Dieks, 1982; Wootters
and Zurek, 1982) became common knowledge. Another
early article, ‘‘Unforgeable Subway Tokens’’ (Bennett
et al., 1983) also tacitly assumed the same.
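The no-cloning theorem tacitly assumed by Wiesner and by Bennett et al. follows from little more than linearity. A hypothetical unitary cloner U|s⟩|0⟩ = |s⟩|s⟩ would have to preserve inner products, forcing ⟨ψ|φ⟩ = ⟨ψ|φ⟩², which fails for any pair of states that are neither identical nor orthogonal. A minimal check (illustrative Python, my own notation):

```python
import math

def inner(u, v):
    # Hermitian inner product <u|v> for state vectors given as lists.
    return sum(a.conjugate() * b for a, b in zip(u, v))

# A linear cloner U|s>|0> = |s>|s> would preserve inner products, forcing
# <psi|phi> = <psi|phi>**2, i.e. <psi|phi> equal to 0 or 1 exactly.
psi = [1.0, 0.0]
phi = [math.cos(0.3), math.sin(0.3)]  # non-orthogonal, non-identical

s = inner(psi, phi)
assert 0.0 < abs(s) < 1.0          # neither orthogonal nor the same state
assert abs(s - s ** 2) > 1e-6      # the cloning consistency condition fails
```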
 
II. THE ACQUISITION OF INFORMATION
A. The ambivalent quantum observer
Quantum mechanics is used by theorists in two different
ways. It is a tool for computing accurate relationships
between physical constants, such as energy levels,
cross sections, transition rates, etc. These calculations
are technically difficult, but they are not controversial.
In addition to this, quantum mechanics also provides
statistical predictions for results of measurements performed
on physical systems that have been prepared in a
specified way. 
 
[My comment #5: No mention of Yakir Aharonov's intermediate present "weak measurements"
with both history past pre-selection and destiny future post-selection constraints. The latter in
Wheeler delayed choice mode would force the inference of real back-from-the-future retrocausality.
This would still be consistent with Abner Shimony's "passion at a distance," i.e. "signal locality"
in that the observer at the present weak measurement would not know what the future constraint 
actually will be. In contrast, with signal non locality (Sarfatti  1976 MIT Tech Review (Martin Gardner) & 
Antony Valentini (2002)) such spooky precognition would be possible as in Russell Targ's reports on 
CIA funded RV experiments at SRI in the mid 70's and 80's. 
This is, on the face of it, a gross violation of orthodox
quantum theory as laid out here in the Peres review paper.]
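For readers unfamiliar with the weak measurements invoked in the comment above: the Aharonov-Albert-Vaidman weak value ⟨A⟩_w = ⟨post|A|pre⟩/⟨post|pre⟩ can lie far outside the eigenvalue spectrum when the pre- and post-selected states are nearly orthogonal. A small numeric illustration (Python; the states are my own example, and this computation does not by itself adjudicate any retrocausality claim):

```python
import math

def weak_value(pre, post, A):
    # Aharonov-Albert-Vaidman weak value <A>_w = <post|A|pre> / <post|pre>
    # (2x2 case, real amplitudes for simplicity).
    A_pre = [sum(A[i][j] * pre[j] for j in range(2)) for i in range(2)]
    num = sum(post[i] * A_pre[i] for i in range(2))
    den = sum(post[i] * pre[i] for i in range(2))
    return num / den

sz = [[1.0, 0.0], [0.0, -1.0]]  # sigma_z, eigenvalues +1 and -1

a, b = 0.8, -0.75               # nearly orthogonal pre- and post-selection
pre = [math.cos(a), math.sin(a)]
post = [math.cos(b), math.sin(b)]

w = weak_value(pre, post, sz)
# Here <sz>_w = cos(a + b) / cos(a - b), about 48: far outside [-1, +1].
assert w > 1.0
```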
 
The quantum measuring process is the interface
of classical and quantum phenomena. The preparation
and measurement are performed by macroscopic
devices, and these are described in classical terms. The
necessity of using a classical terminology was emphasized
by Niels Bohr (1927) from the very early days of
quantum mechanics. Bohr’s insistence on a classical description
was very strict. He wrote (1949)
 
‘‘ . . . by the word ‘experiment’ we refer to a situation
where we can tell others what we have done and what
we have learned and that, therefore, the account of the
experimental arrangement and of the results of the observations
must be expressed in unambiguous language,
with suitable application of the terminology of
classical physics.’’
 
Note the words ‘‘we can tell.’’ Bohr was concerned
with information, in the broadest sense of this term. He
never said that there were classical systems or quantum
systems. There were physical systems, for which it was
appropriate to use the classical language or the quantum
language. There is no guarantee that either language
gives a perfect description, but in a well-designed experiment
it should be at least a good approximation.
 
Bohr’s approach divides the physical world into ‘‘endosystems’’
(Finkelstein, 1988), which are described by
quantum dynamics, and ‘‘exosystems’’ (such as measuring
apparatuses), which are not described by the dynamical
formalism of the endosystem under consideration.
A physical system is called ‘‘open’’ when parts of
the universe are excluded from its description. In different
Lorentz frames used by observers in relative motion,
different parts of the universe may be excluded. The
systems considered by these observers are then essentially
different, and no Lorentz transformation exists
that can relate them (Peres and Terno, 2002).
It is noteworthy that Bohr never described the measuring
process as a dynamical interaction between an
exophysical apparatus and the system under observation.
He was, of course, fully aware that measuring apparatuses
are made of the same kind of matter as everything
else, and they obey the same physical laws. It is
therefore tempting to use quantum theory in order to
investigate their behavior during a measurement. However,
if this is done, the quantized apparatus loses its
status as a measuring instrument. It becomes a mere intermediate
system in the measuring process, and there
must still be a final instrument that has a purely classical
description (Bohr, 1939).
 
Measurement was understood by Bohr as a primitive
notion. He could thereby elude questions which caused
considerable controversy among other authors. A
quantum-dynamical description of the measuring process
was first attempted by John von Neumann in his
treatise on the mathematical foundations of quantum
theory (1932). In the last section of that book, as in an
afterthought, von Neumann represented the apparatus
by a single degree of freedom, whose value was correlated
with that of the dynamical variable being measured.
Such an apparatus is not, in general, left in a definite
pure state, and it does not admit a classical
description. Therefore von Neumann introduced a second
apparatus which observes the first one, and possibly
a third apparatus, and so on, until there is a final measurement,
which is not described by quantum dynamics
and has a definite result (for which quantum mechanics
can give only statistical predictions). The essential point
that was suggested, but not proved by von Neumann, is
that the introduction of this sequence of apparatuses is
irrelevant: the final result is the same, irrespective of the
location of the ‘‘cut’’ between classical and quantum
physics.8
 
These different approaches of Bohr and von Neumann
were reconciled by Hay and Peres (1998), who …

8At this point, von Neumann also speculated that the final
step involves the consciousness of the observer—a bizarre
statement in a mathematically rigorous monograph (von Neumann, 1955).
 
B. The measuring process
Dirac (1947) wrote that ‘‘a measurement always
causes the system to jump into an eigenstate of the dynamical
variable being measured.’’ Here, we must be
careful: a quantum jump (also called a collapse) is something
that happens in our description of the system, not
to the system itself. Likewise, the time dependence of
the wave function does not represent the evolution of a
physical system. It only gives the evolution of probabilities
for the outcomes of potential experiments on that
system (Fuchs and Peres, 2000).
 
Let us examine more closely the measuring process.
First, we must refine the notion of measurement and
extend it to a more general one: an intervention. An
intervention is described by a set of parameters which
include the location of the intervention in spacetime, referred
to an arbitrary coordinate system. We also have
to specify the speed and orientation of the apparatus in
the coordinate system that we are using as well as various
other input parameters that control the apparatus,
such as the strength of a magnetic field or that of a rf
pulse used in the experiment. The input parameters are
determined by classical information received from past
interventions, or they may be chosen arbitrarily by the
observer who prepares that intervention or by a local
random device acting in lieu of the observer.
 
[My comment #6: Peres, in my opinion, makes another mistake.
Future interventions will affect past weak measurements.
 

Back From the Future

A series of quantum experiments shows that measurements performed in the future can influence the present. Does that mean the universe has a destiny—and the laws of physics pull us inexorably toward our prewritten fate?

By Zeeya Merali|Thursday, August 26, 2010
http://discovermagazine.com/2010/apr/01-back-from-the-future#.UieOnhac5Hw ]
 
An intervention has two consequences. One is the acquisition
of information by means of an apparatus that
produces a record. This is the ‘‘measurement.’’ Its outcome,
which is in general unpredictable, is the output of
the intervention. The other consequence is a change of
the environment in which the quantum system will
evolve after completion of the intervention. For example,
the intervening apparatus may generate a new
Hamiltonian that depends on the recorded result. In particular,
classical signals may be emitted for controlling
the execution of further interventions. These signals are,
of course, limited to the velocity of light.
The experimental protocols that we consider all start
in the same way, with the same initial state ... , and the
first intervention is the same. However, later stages of
the experiment may involve different types of interventions,
possibly with different spacetime locations, depending
on the outcomes of the preceding events. Yet,
assuming that each intervention has only a finite number
of outcomes, there is for the entire experiment only a
finite number of possible records. (Here, the word
record means the complete list of outcomes that occurred
during the experiment. We do not want to use the
word history, which has acquired a different meaning in
the writings of some quantum theorists.)
 
Each one of these records has a definite probability in
the statistical ensemble. In the laboratory, experimenters
can observe its relative frequency among all the records
that were obtained; when the number of records tends
to infinity, this relative frequency is expected to tend to
the true probability. The aim of theory is to predict the
probability of each record, given the inputs of the various
interventions (both the inputs that are actually controlled
by the local experimenter and those determined
by the outputs of earlier interventions). Each record is
objective: everyone agrees on what happened (e.g.,
which detectors clicked). Therefore, everyone agrees on
what the various relative frequencies are, and the theoretical
probabilities are also the same for everyone.

Interventions are localized in spacetime, but quantum
systems are pervasive. In each experiment, irrespective
of its history, there is only one quantum system, which
may consist of several particles or other subsystems, created
or annihilated at the various interventions. Note
that all these properties still hold if the measurement
outcome is the absence of a detector click. It does not
matter whether this is due to an imperfection of the detector
or to a probability less than 1 that a perfect detector
would be excited. The state of the quantum system
does not remain unchanged. It has to change to
respect unitarity. The mere presence of a detector that
could have been excited implies that there has been an
interaction between that detector and the quantum system.
Even if the detector has a finite probability of remaining
in its initial state, the quantum system correlated
to the latter acquires a different state (Dicke,
1981). The absence of a click, when there could have
been one, is also an event.
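The Dicke-type null measurement described here, in which the absence of a click still changes the state, is easy to make quantitative. In a toy model (my own, not Peres's) where a detector fires with probability eta only if the system is in |1⟩, conditioning on no click suppresses the |1⟩ amplitude and renormalizes, exactly as unitarity plus Bayesian updating demand:

```python
import math

def no_click_update(alpha, beta, eta):
    # Toy model: a detector fires with probability eta only if the system
    # is in |1>. Conditioned on NO click, the state (alpha, beta) becomes
    # (alpha, beta * sqrt(1 - eta)) / sqrt(p_no_click): a genuine change.
    p_no_click = alpha ** 2 + (1 - eta) * beta ** 2
    norm = math.sqrt(p_no_click)
    return alpha / norm, beta * math.sqrt(1 - eta) / norm, p_no_click

a0 = b0 = 1 / math.sqrt(2)            # start in (|0> + |1>)/sqrt(2)
a1, b1, p = no_click_update(a0, b0, 0.5)

assert abs(p - 0.75) < 1e-12          # probability of no click
assert abs(a1 ** 2 + b1 ** 2 - 1.0) < 1e-12   # still normalized
assert b1 < a1                        # |1> suppressed: the state has changed
```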
 
 
The measuring process involves not only the physical
system under study and a measuring apparatus (which
together form the composite system C) but also their
environment, which includes unspecified degrees of freedom
of the apparatus and the rest of the world. These
unknown degrees of freedom interact with the relevant
ones, but they are not under the control of the experimenter
and cannot be explicitly described. Our partial
ignorance is not a sign of weakness. It is fundamental. If
everything were known, acquisition of information
would be a meaningless concept.
 
 
A complete description of C involves both macroscopic
and microscopic variables. The difference between
them is that the environment can be considered as
adequately isolated from the microscopic degrees of
freedom for the duration of the experiment and is not
influenced by them, while the environment is not isolated
from the macroscopic degrees of freedom. For example,
if there is a macroscopic pointer, air molecules bounce
from it in a way that depends on the position of that
pointer. Even if we can neglect the Brownian motion of
a massive pointer, its influence on the environment leads
to the phenomenon of decoherence, which is inherent to
the measuring process.
 
An essential property of the composite system C,
which is necessary to produce a meaningful measurement,
is that its states form a finite number of orthogonal
subspaces which are distinguishable by the observer.
 
[My comment #7: This is not the case for Aharonov's weak measurements where
 
<A>weak = <history|A|destiny>/<history|destiny>
 
Nor is it true when Alice's orthogonal micro-states are entangled with Bob's far away distinguishably non-orthogonal macro-quantum Glauber coherent and possibly squeezed states.
 
1. Coherent states - Wikipedia, the free encyclopedia
   en.wikipedia.org/wiki/Coherent_states
   In physics, in quantum mechanics, a coherent state is the specific quantum state of the quantum harmonic oscillator whose dynamics most closely resembles the ...
2. Review of Entangled Coherent States, by B. C. Sanders (Dec 8, 2011)
   arxiv.org › quant-ph
   Abstract: We review entangled coherent state research since its first implicit use in 1967
 
|Alice,Bob> = (1/√2)[|Alice +1>|Bob alpha> + |Alice -1>|Bob beta>]
 
<Alice +1|Alice -1> = 0
 
<Bob alpha|Bob beta> =/= 0
 
e.g. Partial trace over Bob's states: |<Alice +1|Alice,Bob>|^2 = (1/2)[1 + |<Bob alpha|Bob beta>|^2] > 1/2
 
this is formally like a weak measurement where the usual Born probability rule breaks down.
 
Complete isolation from environmental decoherence is assumed here.
 
It is clear violation of "passion at a distance" no-entanglement signaling arguments based on axioms that are empirically false in my opinion.
 
"The statistics of Bob’s result are not affected at all by what Alice may simultaneously do somewhere else. " (Peres) 
 
is false.
 
While a logically correct formal proof is desirable in physics, Nature has ways of leap frogging over their premises.
 
One can have constrained pre and post-selected conditional probabilities that are greater than 1, negative and even complex numbers. 
 
All of which correspond to observable effects in the laboratory - see Aephraim Steinberg's experimental papers
University of Toronto.]
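One standard fact underlying the ⟨Bob alpha|Bob beta⟩ ≠ 0 claim in the comment above: Glauber coherent states are never exactly orthogonal, since |⟨α|β⟩|² = exp(−|α−β|²). A one-line check (Python; this verifies only the textbook overlap formula, not the signaling argument built on it):

```python
import math

def coherent_overlap_sq(alpha, beta):
    # |<alpha|beta>|^2 = exp(-|alpha - beta|^2) for Glauber coherent states
    # with complex amplitudes alpha, beta.
    return math.exp(-abs(alpha - beta) ** 2)

# Distinct coherent states are never exactly orthogonal...
assert coherent_overlap_sq(0j, 1 + 0j) == math.exp(-1.0)
# ...but become effectively orthogonal for macroscopically distinct amplitudes.
assert coherent_overlap_sq(0j, 10 + 0j) < 1e-40
```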
 
Each macroscopically distinguishable subspace corresponds
to one of the outcomes of the intervention and
defines a POVM element E_m, given explicitly by Eq. (8)
below. …
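The POVM elements E_m mentioned here are positive operators that sum to the identity, with outcome probabilities p_m = Tr(E_m ρ). A minimal sketch (Python; the unsharp spin-z example with sharpness eta is my own illustration, not the paper's Eq. (8)):

```python
def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def trace(A):
    return sum(A[i][i] for i in range(len(A)))

def povm_probs(rho, elements):
    # Outcome probabilities p_m = Tr(E_m rho).
    return [trace(mat_mul(E, rho)) for E in elements]

eta = 0.8  # detector sharpness; eta = 1 recovers a projective measurement
E_plus = [[(1 + eta) / 2, 0.0], [0.0, (1 - eta) / 2]]
E_minus = [[(1 - eta) / 2, 0.0], [0.0, (1 + eta) / 2]]

# The two elements sum to the identity (each is diagonal with
# nonnegative entries, hence positive).
for i in range(2):
    for j in range(2):
        ident = 1.0 if i == j else 0.0
        assert abs(E_plus[i][j] + E_minus[i][j] - ident) < 1e-12

rho = [[0.5, 0.5], [0.5, 0.5]]  # the pure state |+x><+x|
p = povm_probs(rho, [E_plus, E_minus])
assert abs(p[0] - 0.5) < 1e-12 and abs(sum(p) - 1.0) < 1e-12
```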
 
C. Decoherence
Up to now, quantum evolution is well defined and it is
in principle reversible. It would remain so if the environment
could be perfectly isolated from the macroscopic
degrees of freedom of the apparatus. This demand is of
course self-contradictory, since we have to read the result
of the measurement if we wish to make any use of it.
 
A detailed analysis of the interaction with the environment,
together with plausible hypotheses (Peres, 2000a),
shows that states of the environment that are correlated
with subspaces of C with different labels m can be treated
as if they were orthogonal. This is an excellent approximation
(physics is not an exact science, it is a science of
approximations). The resulting theoretical predictions
will almost always be correct, and if any rare small deviation
from them is ever observed, it will be considered
as a statistical quirk or an experimental error.
 
The density matrix of the quantum system is thus effectively
block diagonal, and all our statistical predictions
are identical to those obtained for an ordinary mixture
of (unnormalized) pure states …
 
This process is called decoherence. Each subspace
m is stable under decoherence—it is their relative
phase that decoheres. From this moment on, the macroscopic
degrees of freedom of C have entered into the
classical domain. We can safely observe them and ‘‘lay
on them our grubby hands’’ (Caves, 1982). In particular,
they can be used to trigger amplification mechanisms
(the so-called detector clicks) for the convenience of the
experimenter.
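The block-diagonalization just described can be caricatured in a few lines (Python; a deliberately oversimplified model in which decoherence is complete and the pointer basis is the computational basis): the diagonal outcome probabilities survive untouched while the relative phases are erased.

```python
def decohere(rho):
    # Caricature of decoherence in the pointer basis: relative phases
    # (off-diagonal elements) are erased; outcome probabilities
    # (the diagonal) are untouched.
    n = len(rho)
    return [[rho[i][j] if i == j else 0.0 for j in range(n)] for i in range(n)]

rho = [[0.5, 0.5], [0.5, 0.5]]        # coherent superposition |+x><+x|
rho_dec = decohere(rho)

assert [rho_dec[i][i] for i in range(2)] == [0.5, 0.5]  # same predictions
assert rho_dec[0][1] == 0.0                             # phase information gone
```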
 
Some authors claim that decoherence may provide a
solution of the ‘‘measurement problem,’’ with the particular
meaning that they attribute to that problem
(Zurek, 1991). Others dispute this point of view in their
comments on the above article (Zurek, 1993). A reassessment
of this issue and many important technical details
were recently published by Zurek (2002, 2003). Yet
decoherence has an essential role, as explained above. It
is essential that we distinguish decoherence, which results
from the disturbance of the environment by the
apparatus (and is a quantum effect), from noise, which
would result from the disturbance of the system or the
apparatus by the environment and would cause errors.
Noise is a mundane classical phenomenon, which we ignore
in this review.
 
E. The no-communication theorem
We now derive a sufficient condition that no instantaneous
information transfer can result from a distant intervention.
We shall show that the condition is
 
[A_m, B_n] = 0

where A_m and B_n are Kraus matrices for the observation
of outcomes m by Alice and n by Bob.
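The content of this commutation condition can be checked in the simplest case: when Alice's Kraus operators act only on her factor of the composite system, they automatically commute with anything acting on Bob's factor, and Bob's reduced density matrix, averaged over Alice's unread outcomes, is exactly what it was before her intervention. A sketch for the singlet (Python, with my own minimal linear algebra):

```python
import math

# Singlet state in the product basis |00>, |01>, |10>, |11>
# (Alice's qubit is the first factor, Bob's the second).
psi = [0.0, 1 / math.sqrt(2), -1 / math.sqrt(2), 0.0]

def bob_reduced(state):
    # Bob's 2x2 reduced density matrix: trace out Alice's qubit.
    rho = [[0.0, 0.0], [0.0, 0.0]]
    for a in range(2):
        for i in range(2):
            for j in range(2):
                rho[i][j] += state[2 * a + i] * state[2 * a + j]
    return rho

def bob_after_alice_z(state):
    # Alice measures s_z; her Kraus operators are P_a (x) I, which commute
    # with any operator on Bob's factor. Bob's state, averaged over
    # Alice's (unread) outcomes:
    rho = [[0.0, 0.0], [0.0, 0.0]]
    for a in range(2):
        branch = [state[k] if k // 2 == a else 0.0 for k in range(4)]
        for i in range(2):
            for j in range(2):
                rho[i][j] += branch[2 * a + i] * branch[2 * a + j]
    return rho

before, after = bob_reduced(psi), bob_after_alice_z(psi)
# Bob's statistics are untouched by Alice's intervention: no signaling.
assert all(abs(before[i][j] - after[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```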
 
[My comment #8: "The most beautiful theory is murdered by an ugly fact." - Feynman
e.g. Libet-Radin-Bierman presponse in living brain data
SRI CIA vetted reports of remote viewing by living brains.
 
1. CIA-Initiated Remote Viewing At Stanford Research Institute
   www.biomindsuperpowers.com/Pages/CIA-InitiatedRV.html
   As if to add insult to injury, he then went on to "remote view" the interior of the apparatus, .... Figure 6 - Left to right: Christopher Green, Pat Price, and Hal Puthoff.
2. Harold E. Puthoff - Wikipedia, the free encyclopedia
   en.wikipedia.org/wiki/Harold_E._Puthoff
   H. E. Puthoff, CIA-Initiated Remote Viewing At Stanford Research Institute, ...
3. Remote viewing - Wikipedia, the free encyclopedia
   en.wikipedia.org/wiki/Remote_viewing
   ... remote viewing ... by Russell Targ and Hal Puthoff at Stanford Research Institute in the 1970s ...
4. Dr. Harold Puthoff on Remote Viewing - YouTube (Apr 28, 2011)
   www.youtube.com/watch?v=FOAfH1utUSM
   Dr. Hal Puthoff is considered the father of the US government's Remote Viewing program, which reportedly ...
5. Remoteviewed.com - Hal Puthoff
   www.remoteviewed.com/remote_viewing_halputhoff.htm
   Dr. Harold E. Puthoff is Director of the Institute for Advanced Studies at Austin. A theoretical and experimental physicist specializing in fundamental ...
 
 
On Sep 4, 2013, at 9:06 AM, JACK SARFATTI <adastra1@icloud.com> wrote:
 
Peres here is only talking about Von Neumann's strong measurements not 
Aharonov's weak measurements.

Standard textbooks on quantum mechanics
tell you that observable quantities are represented by
Hermitian operators, that their possible values are the
eigenvalues of these operators, and that the probability
of detecting eigenvalue a, corresponding to eigenvector
|a>, is |<a|psi>|^2, where |psi> is the (pure) state of the
quantum system that is observed. With a bit more sophistication
to include mixed states, the probability can
be written in a general way as <a|rho|a> …
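The textbook rule quoted here is one line of code. A sketch (Python; real amplitudes only, for brevity) computing ⟨a|rho|a⟩ for a mixed state:

```python
def born_probability(a, rho):
    # <a|rho|a>: probability of the eigenvalue whose eigenvector is |a>,
    # for a density matrix rho (real amplitudes only, for brevity).
    n = len(a)
    return sum(a[i] * rho[i][j] * a[j] for i in range(n) for j in range(n))

rho = [[0.75, 0.0], [0.0, 0.25]]  # mixed state: 3/4 spin-up, 1/4 spin-down

assert born_probability([1.0, 0.0], rho) == 0.75   # spin-up
assert born_probability([0.0, 1.0], rho) == 0.25   # spin-down
```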
 
This is nice and neat, but it does not describe what
happens in real life. Quantum phenomena do not occur
in Hilbert space; they occur in a laboratory. If you visit a
real laboratory, you will never find Hermitian operators
there. All you can see are emitters (lasers, ion guns, synchrotrons,
and the like) and appropriate detectors. In
the latter, the time required for the irreversible act of
amplification (the formation of a microscopic bubble in
a bubble chamber, or the initial stage of an electric discharge)
is extremely brief, typically of the order of an
atomic radius divided by the velocity of light. Once irreversibility
has set in, the rest of the amplification process
is essentially classical. It is noteworthy that the time and
space needed for initiating the irreversible processes are
incomparably smaller than the macroscopic resolution
of the detecting equipment.
 
The experimenter controls the emission process and
observes detection events. The theorist’s problem is to
predict the probability of response of this or that detector,
for a given emission procedure. It often happens
that the preparation is unknown to the experimenter,
and then the theory can be used for discriminating between
different preparation hypotheses, once the detection
outcomes are known.
 
<Screen Shot 2013-09-04 at 8.57.50 AM.png>
 
Many physicists, perhaps a majority, have an intuitive,
realistic worldview and consider a quantum state as a
physical entity. Its value may not be known, but in principle
the quantum state of a physical system would be
well defined. However, there is no experimental evidence
whatsoever to support this naive belief. On the
contrary, if this view is taken seriously, it may lead to
bizarre consequences, called ‘‘quantum paradoxes.’’
These so-called paradoxes originate solely from an incorrect
interpretation of quantum theory, which is thoroughly
pragmatic and, when correctly used, never yields
two contradictory answers to a well-posed question. It is
only the misuse of quantum concepts, guided by a pseudorealistic
philosophy, that leads to paradoxical results.
 
[My comment #2: Here is the basic conflict between epistemological vs ontological views of quantum reality.]
 
In this review we shall adhere to the view that ρ is
only a mathematical expression which encodes information
about the potential results of our experimental interventions.
The latter are commonly called
‘‘measurements’’—an unfortunate terminology, which
gives the impression that there exists in the real world
some unknown property that we are measuring. Even
the very existence of particles depends on the context of
our experiments. In a classic article, Mott (1929) wrote
‘‘Until the final interpretation is made, no mention
should be made of the α ray being a particle at all.’’
Drell (1978a, 1978b) provocatively asked ‘‘When is a
particle?’’ In particular, observers whose world lines are
accelerated record different numbers of particles, as will
be explained in Sec. V.D (Unruh, 1976; Wald, 1994).
 
 
1The theory of relativity did not cause as much misunderstanding
and controversy as quantum theory, because people
were careful to avoid using the same nomenclature as in nonrelativistic
physics. For example, elementary textbooks on
relativity theory distinguish ‘‘rest mass’’ from ‘‘relativistic
mass’’ (hard-core relativists call them simply ‘‘mass’’ and ‘‘energy’’).
2The ‘‘irreversible act of amplification’’ is part of quantum
folklore, but it is not essential to physics. Amplification is
needed solely to facilitate the work of the experimenter.
3Positive operators are those having the property that
⟨ψ|ρ|ψ⟩ ≥ 0 for any state ψ. These operators are always Hermitian.
 
 
 
On Sep 4, 2013, at 8:48 AM, JACK SARFATTI <adastra1@icloud.com> wrote:



Begin forwarded message:

From: JACK SARFATTI <jacksarfatti@icloud.com>
Subject: Quantum information and relativity theory
Date: September 4, 2013 8:33:48 AM PDT
To: nick herbert <quanta@mail.cruzio.com>
 

The late Asher Peres’s (http://en.wikipedia.org/wiki/Asher_Peres) interpretation is the antithesis of the late David Bohm’s ontological interpretation (http://en.wikipedia.org/wiki/David_Bohm): Peres holds to a purely subjective, epistemological Bohrian interpretation of the quantum BIT potential Q.
He claims that Antony Valentini's signal non locality beyond orthodox quantum theory would violate the Second Law of Thermodynamics.
 
REVIEWS OF MODERN PHYSICS, VOLUME 76, JANUARY 2004
Quantum information and relativity theory
Asher Peres
Department of Physics, Technion–Israel Institute of Technology, 32000 Haifa, Israel
Daniel R. Terno
Perimeter Institute for Theoretical Physics, Waterloo, Ontario, Canada N2J 2W9
(Published 6 January 2004)
This article discusses the intimate relationship between quantum mechanics, information theory, and
relativity theory. Taken together these are the foundations of present-day theoretical physics, and
their interrelationship is an essential part of the theory. The acquisition of information from a
quantum system by an observer occurs at the interface of classical and quantum physics. The authors
review the essential tools needed to describe this interface, i.e., Kraus matrices and
positive-operator-valued measures. They then discuss how special relativity imposes severe
restrictions on the transfer of information between distant systems and the implications of the fact that
quantum entropy is not a Lorentz-covariant concept. This leads to a discussion of how it comes about
that Lorentz transformations of reduced density matrices for entangled systems may not be
completely positive maps. Quantum field theory is, of course, necessary for a consistent description of
interactions. Its structure implies a fundamental tradeoff between detector reliability and
localizability. Moreover, general relativity produces new and counterintuitive effects, particularly
when black holes (or, more generally, event horizons) are involved. In this more general context the
authors discuss how most of the current concepts in quantum information theory may require a
reassessment.
CONTENTS
I. Three Inseparable Theories
A. Relativity and information
B. Quantum mechanics and information
C. Relativity and quantum theory
D. The meaning of probability
E. The role of topology
F. The essence of quantum information
II. The Acquisition of Information
A. The ambivalent quantum observer
B. The measuring process
C. Decoherence
D. Kraus matrices and positive-operator-valued measures (POVM's)
E. The no-communication theorem
III. The Relativistic Measuring Process
A. General properties
B. The role of relativity
C. Quantum nonlocality?
D. Classical analogies
IV. Quantum Entropy and Special Relativity
A. Reduced density matrices
B. Massive particles
C. Photons
D. Entanglement
E. Communication channels
V. The Role of Quantum Field Theory
A. General theorems
B. Particles and localization
C. Entanglement in quantum field theory
D. Accelerated detectors
VI. Beyond Special Relativity
A. Entanglement revisited
B. The thermodynamics of black holes
C. Open problems
Acknowledgments and Apologies
Appendix A: Relativistic State Transformations
Appendix B: Black-Hole Radiation
References
I. THREE INSEPARABLE THEORIES
Quantum theory and relativity theory emerged at the
beginning of the twentieth century to give answers to
unexplained issues in physics: the blackbody spectrum,
the structure of atoms and nuclei, the electrodynamics of
moving bodies. Many years later, information theory
was developed by Claude Shannon (1948) for analyzing
the efficiency of communication methods. How do these
seemingly disparate disciplines relate to each other? In
this review, we shall show that they are inseparably
linked.
A. Relativity and information
Common presentations of relativity theory employ
fictitious observers who send and receive signals. These
‘‘observers’’ should not be thought of as human beings,
but rather as ordinary physical emitters and detectors.
Their role is to label and locate events in spacetime. The
speed of transmission of these signals is bounded by
c—the velocity of light—because information needs a
material carrier, and the latter must obey the laws of
physics. Information is physical (Landauer, 1991).
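Landauer's "information is physical" dictum has a quantitative edge: erasing one bit of information dissipates at least k_B T ln 2 of heat. A quick sketch of that bound (my illustration, not from the review):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

# Landauer's principle: erasing one bit costs at least k_B * T * ln 2.
E_min = k_B * T * math.log(2)
print(f"Minimum energy to erase one bit at {T:.0f} K: {E_min:.2e} J")
# roughly 2.9e-21 J
```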
 
[My comment #1: Indeed, information is physical. Contrary to Peres, in Bohm's theory Q is also physical but not material (not a "beable"); consequently one can have entanglement negentropy transfer without material "beable" propagation of a classical signal. I think Peres makes a fundamental error here.]
 
However, the mere existence of an upper bound on
the speed of propagation of physical effects does not do
justice to the fundamentally new concepts that were introduced
by Albert Einstein (one could as well imagine
communications limited by the speed of sound, or that
of the postal service). Einstein showed that simultaneity
had no absolute meaning, and that distant events might
have different time orderings when referred to observers
in relative motion. Relativistic kinematics is all about
information transfer between observers in relative motion.
 
Classical information theory involves concepts such as
the rates of emission and detection of signals, and the
noise power spectrum. These variables have well defined
relativistic transformation properties, independent
of the actual physical implementation of the communication
system.
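The relativistic transformation of emission and detection rates mentioned here is governed by the standard Doppler factor. A minimal sketch for a source receding along the line of sight (my own illustration; the formula is the textbook one):

```python
import math

def received_rate(emitted_rate_hz: float, beta: float) -> float:
    """Relativistic Doppler shift of a signal rate for a source
    receding at speed beta = v/c along the line of sight."""
    return emitted_rate_hz * math.sqrt((1 - beta) / (1 + beta))

# A source emitting 1000 pulses/s while receding at 0.6 c:
# the Doppler factor is sqrt(0.4/1.6) = 0.5, so 500 pulses/s arrive.
print(received_rate(1000.0, 0.6))  # 500.0
```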


1) I intuited the connection between the Einstein-Rosen (ER) wormhole and Einstein-Podolsky-Rosen (EPR) quantum entanglement back in 1973 when I was with Abdus Salam at the International Centre of Theoretical Physics in Trieste, Italy. This idea was published in the wacky book “Space-Time and Beyond” (Dutton, 1975), described by MIT physics historian David Kaiser in his book “How the Hippies Saved Physics.” Lenny Susskind, who I worked with at Cornell in 1963-4, rediscovered this ER = EPR connection in the black hole “firewall” paradox. Lenny envisions a multi-mouthed wormhole network connecting the Hawking radiation particles to their entangled twins behind the evaporating event horizon: “each escaping particle remains connected to the black hole through a wormhole” (Dennis Overbye, “Einstein and the Black Hole,” New York Times, August 13, 2013). The no-signaling theorem corresponds to the wormhole pinching off before a light-speed-limited signal can pass from one mouth to the other. Now we know that traversable wormhole stargates are possible using amplified anti-gravity dark energy. This corresponds to signal nonlocality in post-quantum theory violating orthodox quantum theory.

2) Localizing global symmetries requires the addition of compensating gauge connections in a fiber bundle picture of the universe. Indeed, the original global symmetry group is a smaller subgroup of the local symmetry group. The gauge connections define parallel transport of tensor/spinor fields. They correspond to the interactions between the several kinds of charges of the above symmetries. I shall go into more details of this elsewhere. Indeed, localizing the above spacetime symmetries corresponds to generalizations of Einstein’s General Relativity as a local gauge theory.[i] For example, localizing the space and time global translational symmetries means that the Lie group transformations at different events (places and times) in the universe are independent of each other. If one believes in the classical special relativity postulate of locality, that there are no faster-than-light actions at a distance, then the transformations must certainly be independent of each other between pairs of spacelike separated events that cannot be connected by a light signal. However, the local gauge principle is much stronger, because it applies to pairs of events that can be connected not only by a light signal, but also by slower-than-light timelike signals. This poses a paradox when we add quantum entanglement. Aspect’s experiment, and others since then, show that faster-than-light influences do in fact exist in the conditional probabilities (aka correlations) connecting observed eigenvalues of quantum observables independently chosen by Alice and Bob when spacelike separated. I shall return to this in more detail elsewhere. However, the no-entanglement-signaling postulate is thought by many mainstream theoretical physicists to define orthodox quantum theory. It’s believed that its violation would also violate the Second Law of Thermodynamics. Note that the entanglement signal need not be faster-than-light over a spacelike separation between sender and receiver.
It could be lightlike or timelike separated as well. Indeed, it can even be retrocausal, with the message sent back from the future. John Archibald Wheeler’s “delayed choice experiment” is actually consistent with orthodox quantum theory’s no-signaling premise. The point is that one cannot decode the message encoded in the pattern of entanglement until one has a classical signal key that only propagates forward in time. What one sees before the classical key arrives and a correlation analysis is computed is only local random white noise. However, data on precognitive remote viewing as well as brain presponse data suggest that no-entanglement signaling is only true for dead matter. Nobel Prize physicist Brian Josephson first published on this. I have also suggested it using Bohm’s ontological interpretation (Lecture 8 of Michael Towler’s Cambridge University lectures on Bohm’s pilot wave). Antony Valentini has further developed this idea in several papers. Post-quantum “signal nonlocality” dispenses with the need to wait for the light-speed-limited retarded signal key propagating from past to future. Local non-random noise will be seen, in violation of the S-Matrix unitarity “conservation of information” postulate of G. ‘t Hooft, L. Susskind et al. Indeed, the distinguishable non-orthogonality of entangled Glauber macro-quantum coherent states seems to be the way to get signal nonlocality. This gets us to the “Black Hole War” between Susskind and Hawking about information loss down evaporating black holes. It seems that Hawking caved in too fast to Susskind back in Dublin in 2004.
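The no-signaling theorem invoked above can be checked directly in finite dimensions: whatever unitary Alice applies locally to her half of an entangled pair, Bob's reduced density matrix is unchanged, so no message is transmitted. A small NumPy demonstration (my illustration, not taken from any of the cited papers):

```python
import numpy as np

# Bell state |Phi+> = (|00> + |11>)/sqrt(2), ordering |ab> with a = Alice.
phi = np.zeros(4, dtype=complex)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho = np.outer(phi, phi.conj())

def bob_reduced(rho4: np.ndarray) -> np.ndarray:
    """Partial trace over Alice's qubit (sum over the repeated index a)."""
    r = rho4.reshape(2, 2, 2, 2)       # indices: a, b, a', b'
    return np.einsum('abad->bd', r)

# A random local unitary on Alice's side, identity on Bob's.
rng = np.random.default_rng(1)
X = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
Q_, _ = np.linalg.qr(X)                # random 2x2 unitary
U = np.kron(Q_, np.eye(2))

rho_after = U @ rho @ U.conj().T

# Bob's state is the maximally mixed I/2 before and after Alice acts.
assert np.allclose(bob_reduced(rho), bob_reduced(rho_after))
print(bob_reduced(rho).real)
```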



[i] Localizing the four space and time translations corresponds to Einstein’s general coordinate transformations, which are now gauge transformations defining an equivalence class of physically identical representations of the same curvature tensor field. However, the compensating gauge connection there corresponds to torsion fields, not curvature fields. The curvature field corresponds to localizing the three space-space rotations and the three space-time Lorentz boost rotations together. Einstein’s General Relativity in final form (1916) has zero torsion with non-zero curvature. However, T.W.B. Kibble of Imperial College, London showed in 1961 how to get the Einstein-Cartan torsion + curvature extension of Einstein’s 1916 curvature-only model by localizing the full 10-parameter Poincare symmetry Lie group of Einstein’s 1905 Special Relativity. The natural geometric objects to use are the four Cartan tetrads that correspond to Local Inertial Frame (LIF) detector/observers that are not rotating about their Centers of Mass (COM), which are on weightless zero-g-force timelike geodesics. Zero torsion is then imposed as an ad hoc constraint to regain Einstein’s 1916 model as a limiting case. The ten-parameter Poincare Lie group is a subgroup of the fifteen-parameter conformal group, which adds four constant proper acceleration hyperbolic Wolfgang Rindler horizon boosts and one dilation scale transformation corresponding to Hermann Weyl’s original failed attempt to unify gravity with electromagnetism. The spinor Dirac square roots of the conformal group correspond to Roger Penrose’s “twistors.”
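As a bookkeeping check on the generator counts quoted in this footnote (standard Lie group dimensions; the grouping into translations, rotations, and boosts follows the text):

```python
# Poincare group: 4 translations + 3 space-space rotations + 3 boosts = 10.
translations = 4
rotations = 3
boosts = 3
poincare = translations + rotations + boosts
assert poincare == 10

# Conformal group adds 4 special conformal (Rindler-type) transformations
# and 1 dilation, giving 15 generators in total.
special_conformal = 4
dilation = 1
conformal = poincare + special_conformal + dilation
assert conformal == 15
```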

 

My review of Jim Woodward's Making Starships book - V1 under construction
  • Jack Sarfatti Sarfatti’s Commentaries on James F. Woodward’s book 
    Making Starships and Star Gates 
    The Science of Interstellar Transport and Absurdly Benign Wormholes

    The book has many good insights except for some ambiguous statements regarding:

    1) The equivalence principle that is the foundation of Einstein’s theory of the gravitational field. This seems to be due to the author’s not clearly distinguishing between local frame invariant proper acceleration and frame dependent coordinate acceleration. Thus, the author says that Newton’s gravity force is eliminated in an “accelerating frame.” In fact, it is eliminated in a Local Inertial Frame (LIF) that has zero proper acceleration, though it has coordinate acceleration relative to the surface of Earth for example. All points of the rigid spherical surface of Earth have non-zero proper accelerations pointing radially outward. This violates common sense and confuses even some physicists as well as engineers not to mention laymen. It is a fact of the Alice in Wonderland topsy-turvy surreal world of the post-modern physics of Einstein’s relativity especially when combined with the faster-than-light and back from the future entanglement of particles and fields in quantum theory and beyond. 
    2) I find the author’s discussion of fictitious inertial pseudo forces puzzling. I include the centripetal force as a fictitious force in the limit of Newton’s particle mechanics sans Einstein’s local inertial frame dragging from rotating sources. That is, every local frame artifact that is inside the Levi-Civita connection is a fictitious inertial pseudo force. This includes, Coriolis, centrifugal, Euler, and most importantly Newton’s gravity force that is not a real force. The terms inside the Levi-Civita connection are not felt by the test particle under observation. Instead, they describe real forces acting on the observer’s local rest frame. A real force acts locally on a test particle’s accelerometer. It causes an accelerometer’s pointer to move showing a g-force. In contrast, Baron Munchausen sitting on a cannonball in free fall is weightless. This was essentially Einstein’s “happiest thought” leading him to the equivalence principle the cornerstone of his 1916 General Relativity of the Gravitational Field. 
    3) A really serious flaw in the book is the author’s dependence on Dennis Sciama’s electromagnetic equations for gravity. In fact, these equations only apply approximately in the weak field limit of Einstein’s field equations in the background-dependent case using the absolute non-dynamical globally-flat Minkowski space-time with gravity as a tiny perturbation. The author uses these equations way out of their limited domain of validity. In particular, the Sciama equations cannot describe the two cosmological horizons past and future of our dark energy accelerating expanding observable universe. What we can see with our telescopes is only a small patch (aka “causal diamond”) of a much larger “inflation bubble” corresponding to Max Tegmark’s “Level 1” in his four level classification of the use of “multiverse” and “parallel universes.” Our two cosmological horizons, past and future, that are thin spherical shells of light with us inside them at their exact centers may in fact be hologram computer screens projecting us as 3D images in a virtual reality quantum computer simulation. This is really a crazy idea emerging from Gerardus ‘t Hooft, Leonard Susskind, Seth Lloyd and others. Is it crazy enough to be true? 
  • Jack Sarfatti 4) John Cramer’s Foreword: I agree with Cramer that it’s too risky in the long run for us to be confined to the Earth and even to this solar system. British Astronomer Royal, Lord Martin Rees in his book “Our Final Hour” gives detailed reasons. Of course if a vacuum strangelet develops like Kurt Vonnegut’s “Ice-9”, then our entire observable universe can be wiped out, our causal diamond and beyond shattered, and there is no hope. That is essentially the apocalyptic worst-case scenario of the Bible’s “Revelations” and we will not dwell on it any further. Let’s hope it’s not a precognitive remote viewing like what the CIA observed in the Stanford Research Institute studies in the 1970’s.  Cramer cites the NASA-DARPA 100 Year Star Ship Project that I was involved with in the first two meetings. Cramer’s text is in quotes and italics. There is “little hope of reaching the nearby stars in a human lifetime using any conventional propulsion techniques … the universe is simply too big, and the stars are too far away. … What is needed is either trans-spatial shortcuts such as wormholes to avoid the need to traverse the enormous distances or a propulsion technique that somehow circumvents Newton’s third law and does not require the storage, transport and expulsion of large volumes of reaction mass.”
    Yes, indeed. I conjecture as a working hypothesis based on the UFO evidence that traversable wormhole stargate time travel machines are the only way to go with warp drive used only as a secondary mechanism at low speeds mainly for silent hovering near the surfaces of planets and for dogfights with conventional aerospace craft. The stargates do not have the blue shift problem that the Alcubierre warp drive has although the Natario warp drive does not have the blue shift problem (high-energy collisions with particles and radiation in the path of the starship). Newton’s third law that every force acting on a material object has an equal and opposite inertial reaction force on the source of that force is a conservation law that follows from symmetry Lie groups of transformations in parameters of the dynamical action of the entire closed system of source and material object. This is a very general organizing principle of theoretical physics known as Noether’s theorem for global symmetries in which the transformations are the same everywhere for all times in the universe. For example:
Space Translation Symmetry → Linear Momentum Conservation
Time Translation Symmetry → Energy Conservation
Space-Space Rotation Symmetry → Angular Momentum Conservation
Space-Time Rotation (Boost) Symmetry → Center-of-Energy Motion Conservation
Internal U1 EM Force Symmetry → Conservation of 1 Electric Charge
Internal SU2 Weak Force Symmetry → Conservation of 3 Weak Flavor Charges
Internal SU3 Strong Force Symmetry → Conservation of 8 Strong Color Charges
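The first entry in the list above can be illustrated numerically: with a translation-invariant two-body potential the interparticle forces cancel pairwise (Newton's third law), so total momentum is conserved along the integrated trajectory. A minimal sketch (my illustration; the masses, spring constant, and step size are arbitrary choices):

```python
import numpy as np

# Two particles on a line with translation-invariant interaction
# V(x1, x2) = 0.5 * k * (x1 - x2)**2. Noether's theorem for spatial
# translation symmetry guarantees total momentum conservation.
k, m1, m2, dt = 1.0, 1.0, 2.0, 1e-3
x = np.array([0.0, 1.5])
p = np.array([0.3, -0.7])

def force(x):
    f = -k * (x[0] - x[1])        # force on particle 1
    return np.array([f, -f])       # equal and opposite on particle 2

p_total_initial = p.sum()
for _ in range(10_000):            # leapfrog-style integration
    p = p + 0.5 * dt * force(x)
    x = x + dt * p / np.array([m1, m2])
    p = p + 0.5 * dt * force(x)

# Total momentum is unchanged up to floating-point rounding.
assert abs(p.sum() - p_total_initial) < 1e-9
```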
  • Jack Sarfatti In a propellantless propulsion system without the rocket ejection of real particles and/or radiation one must include the gravity curvature field (dynamical space-time itself) as a source and sink of linear momentum. Furthermore, if we include quantum corrections to the classical fields there is the remote possibility of using virtual particle zero point fluctuations inside the vacuum as a source and sink of linear momentum. However, the conventional wisdom is that this kind of controllable small-scale metastable vacuum phase transition is impossible in principle and to do so would violate the Second Law of Thermodynamics (extracting work from an absolute zero temperature heat reservoir). Even if we could do the seemingly impossible, propellantless propulsion while necessary is not sufficient for a true warp drive. A true warp drive must be weightless (zero g-force) timelike geodesic and without time dilation for the crew relative to the external observer outside the warp bubble that they were initially clock synchronized with. Localizing global symmetries requires the addition of compensating gauge connections in a fiber bundle picture of the universe. Indeed, the original global symmetry group is a smaller subgroup of the local symmetry group. The gauge connections define parallel transport of tensor/spinor fields. They correspond to the interactions between the several kinds of charges of the above symmetries. I shall go into more details of this elsewhere. Indeed localizing the above spacetime symmetries corresponds to generalizations of Einstein’s General Relativity as a local gauge theory. For example, localizing the space and time global translational symmetries means that the Lie group transformations at different events (places and times) in the universe are independent of each other. 
If one believes in the classical special relativity postulate of locality, that there are no faster-than-light actions at a distance, then the transformations must certainly be independent of each other between pairs of spacelike separated events that cannot be connected by a light signal. However, the local gauge principle is much stronger, because it applies to pairs of events that can be connected not only by a light signal, but also by slower-than-light timelike signals. This poses a paradox when we add quantum entanglement. Aspect’s experiment, and others since then, show that faster-than-light influences do in fact exist in the conditional probabilities (aka correlations) connecting observed eigenvalues of quantum observables independently chosen by Alice and Bob when spacelike separated. I shall return to this in more detail elsewhere. Finally, we have P. W. Anderson’s anti-reductionist “More is different” emergence of complex systems of real particles in their quantum ground states, with quasi-particles and collective mode excitations in soft condensed matter, in which the whole is greater than the sum of its parts. This corresponds, in its high-energy standard model analog, to spontaneous symmetry breaking of the quantum vacuum’s virtual particles to the Higgs-Goldstone “God Particle” now found at ~125 GeV in CERN’s LHC, which gives rest masses to leptons and quarks as well as to the three weak radioactivity force spin 1 gauge W-bosons, though not to the single spin 1 photon gauge boson and the eight spin 1 strong force gluon gauge bosons. In this quantum field theory picture, the near field non-radiating interactions among the leptons and quarks are caused by the exchange of virtual spacelike (tachyonic faster-than-light off-mass-shell) gauge bosons continuously randomly emitted and absorbed by the leptons and quarks.
To make matters more complicated, unlike the single rest-massless U1 photon, the three weak rest-massive SU2 W bosons and the eight strong rest-massless SU3 gluons carry their respective Lie algebra charges; therefore, they self-interact. A single virtual gluon can split into two gluons, for example. The SU3 quark-quark-gluon interaction gets stronger at low energy and longer separations. This is called quantum chromodynamic confinement, and it explains why we do not see free quarks in the present epoch of our causal diamond observable universe patch of the multiverse. Free quarks were there in a different quantum vacuum thermodynamic phase shortly after the Alpha Point chaotic inflation creation of the observable universe that we see with telescopes etc. Indeed, most of the rest mass of protons and neutrons comes from the confined Heisenberg uncertainty principle kinetic energy of the three real confined up and down quarks and their plasma cloud of virtual zero point gluons and virtual quark-antiquark pairs. The Higgs Yukawa interaction rest masses of the three bound real quarks are about 1/20 or less of the total hadronic rest masses.

The author, James F. Woodward (JFW), introduces Mach’s Principle, though in an ambiguous way to my mind. He says that the computation of the rest mass from local quantum field theory, as has in fact been accomplished for hadrons by MIT Nobel Laureate Frank Wilczek et al. using supercomputers, is not sufficient to explain the inertia of Newton’s Second Law of Particle Mechanics. This does sound like Occult Astrology at first glance, but we do have the 1940 Wheeler-Feynman classical electrodynamics in which radiation reaction is explained as a back-from-the-future retrocausal advanced influence of the future absorber on the past emitter in a globally self-consistent loop in time. Indeed, Feynman’s path integral quantum theory grew out of this attempt. Hoyle and Narlikar, and John Cramer, have extended the original classical Wheeler-Feynman theory to quantum theory. Indeed, the zero point virtual photons causing spontaneous emission decay of excited atomic electron states can be interpreted as a back-from-the-future effect. The electromagnetic field in the classical Wheeler-Feynman model did not have independent dynamical degrees of freedom, but in the Feynman diagram quantum theory it does. However, the retrocausal feature survives. Therefore the only way I can make sense of JFW’s fringe physics proposal is to make the following conjecture. Let m0 be the renormalized rest mass of a real particle computed in the standard model of local quantum field theory. Then the observed rest mass m0’ equals a dimensionless nonlocal coefficient C multiplied by the local renormalized rest mass m0. Mach’s Principle is then C = 0 in an empty universe of only real test particles without any sources causing spacetime to bend. Furthermore, C splits into past-history retarded and future-destiny advanced pieces. Now, is there any Popper-falsifiable test of this excess baggage?
  • Jack Sarfatti 1) Springer-Praxis Books in Space Exploration (2013)
    2) Einstein in Zurich over one hundred years ago read of a house painter falling off his ladder saying he felt weightless.
    3) I have since disassociated myself from that project, as have other hard
  • Jack Sarfatti 4) Roughly speaking, for particle mechanics, the dynamical action is the time integral of the kinetic energy minus the potential energy. The classical physics action principle is that the actual path is an extremum in the sense of the calculus of variations relative to all nearby possible paths with the same initial and final conditions. Richard P. Feynman generalized this classical idea to quantum theory where the actual extremum path corresponds to constructive interference of complex number classical action phases one for each possible path. There are more complications for velocity-dependent non-central forces and there is also the issue of initial and final conditions. The action is generalized to classical fields where one must use local kinetic and potential analog densities and integrate the field Lagrangian density over the 4D spacetime region bounded by initial history and final teleological destiny 3D hypersurfaces boundary constraints. Indeed, Yakir Aharonov has generalized this to quantum theory in which there are back-from-the-future retro causal influences on present weak quantum measurements made between the past initial and future final boundary constraints. Indeed, in our observable expanding accelerating universe causal diamond, these boundary constraints, I conjecture, are our past cosmological particle horizon from the moment of chaotic inflation leading to the hot Big Bang, together with our future dark energy de Sitter event horizon. Both of them are BIT pixelated 2D hologram computer screens with us as IT voxelated “weak measurement” 3D hologram images projected from them. The horizon pixel BIT quanta of area are of magnitude (~10^-33 cm or 10^19 Gev)^2. The interior bulk voxel IT quanta of volume are of magnitude (~10^-13 cm or 1 Gev)^3. This ensures that the number N of BIT horizon pixels equals the number of IT interior voxels in a one-to-one correspondence. 
The actually measured dark energy density is proportional to the inverse fourth power of the geometric mean of the smallest quantum gravity Planck length with the largest Hubble-sized scale of our future de Sitter causal diamond ~ 10^28 cm. This, when combined with the Unruh effect, corresponds to the Stefan-Boltzmann law of black body radiation that started quantum physics back in 1900. However, this redshifted Hawking horizon blackbody radiation must be coming back from our future de Sitter cosmological horizon not from our past particle horizon.
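The order-of-magnitude claims in these two paragraphs can be checked with a few lines of arithmetic (my sketch; the input scales are the rounded values quoted above, so the results are only good to an order of magnitude):

```python
import math

# All lengths in cm, taken from the text.
l_planck = 1.6e-33   # Planck length
l_voxel = 1.0e-13    # ~1 fm, the 1 GeV scale
R = 1.0e28           # future de Sitter horizon scale

# Horizon pixels (area in Planck-area units) vs bulk voxels
# (volume in fm^3 units): the counts agree to within an order
# of magnitude, as the text asserts.
n_pixels = (R / l_planck) ** 2
n_voxels = (R / l_voxel) ** 3
print(f"pixels ~ 10^{math.log10(n_pixels):.0f}, "
      f"voxels ~ 10^{math.log10(n_voxels):.0f}")

# Dark energy density ~ hbar*c / (geometric mean of l_planck and R)^4
#                     = hbar*c / (l_planck * R)^2
hbar_c = 3.16e-17    # erg * cm
rho_de = hbar_c / (l_planck * R) ** 2
print(f"estimated dark energy density ~ {rho_de:.1e} erg/cm^3")
# The observed value is ~6e-8 erg/cm^3, the same order of magnitude.
```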
  • Jack Sarfatti 5) Localizing the four space and time translations corresponds to Einstein’s general coordinate transformations, which are now gauge transformations defining an equivalence class of physically identical representations of the same curvature tensor field. However, the compensating gauge connection there corresponds to torsion fields, not curvature fields. The curvature field corresponds to localizing the three space-space rotations and the three space-time Lorentz boost rotations together. Einstein’s General Relativity in final form (1916) has zero torsion with non-zero curvature. However, T.W.B. Kibble of Imperial College, London showed in 1961 how to get the Einstein-Cartan torsion + curvature extension of Einstein’s 1916 curvature-only model by localizing the full 10-parameter Poincare symmetry Lie group of Einstein’s 1905 Special Relativity. The natural geometric objects to use are the four Cartan tetrads that correspond to Local Inertial Frame (LIF) detector/observers that are not rotating about their Centers of Mass (COM), which are on weightless zero-g-force timelike geodesics. Zero torsion is then imposed as an ad hoc constraint to regain Einstein’s 1916 model as a limiting case. The ten-parameter Poincare Lie group is a subgroup of the fifteen-parameter conformal group, which adds four constant proper acceleration hyperbolic Wolfgang Rindler horizon boosts and one dilation scale transformation corresponding to Hermann Weyl’s original failed attempt to unify gravity with electromagnetism. The spinor Dirac square roots of the conformal group correspond to Roger Penrose’s “twistors.”

On Jun 26, 2013, at 9:34 AM, Ruth Kastner <rekastner@hotmail.com> wrote:

"Thanks Basil for this clarification. It is true that Bohm's original motivation was a realist (as opposed to instrumentalist, Bohrian interpretation). I should have been more clear about that. But it rather quickly became a path to resolving the measurement problem -- if not for its original author(s), certainly for those who have championed it since then.
Also, regarding the quote ["What I felt to be particularly unsatisfactory was the fact that the quantum theory had no place in it for an adequate notion of an independent actuality-i.e. of an actual movement or activity by which one physical state could pass over into another".] This is a key component of the measurement problem.  Also, let me take the opportunity to note that it is not necessary to  identify a 'realist' view of qm with the existence of  'hidden variables'.  I have been proposing a realist view that does not involve hidden variables -- but it does involve an expansion of what we normally like to think of as 'real'. The usual tacit assumption is that
'real' = 'existing within spacetime'  (and that of course requires 'hidden variables' that tell us 'where' the entity lives in spacetime, or at least identifies some property compatible with spacetime existence)" (end-quote)

Me: We all seem to agree that the idea that "real" must be "local in spacetime" is false. Q is real, but it is generally not a local BIT field in 3D + 1 spacetime when there is entanglement. Oddly enough the macro-quantum coherent signal Q in spontaneous breakdown of ground state symmetry is local in 3D+1 but it is generally coupled to nonlocal micro-quantum "noise."

Ruth "In contrast, I think PTI provides us with a realist concept of an independent actuality -- a "movement or activity by which one physical state could pass over into another". "

Me: So does Bohm's ontological interpretation.

Ruth: "But that 'actuality' is rooted in potentiality, which is a natural view given the mathematical properties of quantum objects."

Me: Seems to me you are playing with nouns, replacing one vague metaphysical notion with another. What is "potentiality"? Mathematically it's Bohm's Q, perhaps extended to Yakir Aharonov's weak measurements with advanced Wheeler-Feynman back-from-the-future post-selection in a post-quantum theory with Antony Valentini's "signal nonlocality." Some think signal nonlocality violates the Second Law of Thermodynamics. However, since it only obtains in open systems, that is not so. Furthermore, our actual universe, the causal diamond bounded by both the past and future horizons, is an open system out of thermal equilibrium.

Ruth: "So one can give a  realist, physical account, but it is indeterministic -- involving a kind of spontaneous symmetry breaking. Given that we already have spontaneous symmetry breaking elsewhere in physics, I think we should allow for it in QM.

Thanks again for the clarification --"

Best
Ruth

Jack Sarfatti
David Bohm, Albert Einstein, Louis De Broglie, Wolfgang Pauli, Richard Feynman
  • Jack Sarfatti On Jun 26, 2013, at 2:26 AM, Basil Hiley wrote:

    Ruth, may I make a correction to what you wrote below. Bohm '52 work was not 'originally undertaken to solve the measurement problem.' He had a different motive. I asked him to clarify, in writing, w
  • Jack Sarfatti On Jun 26, 2013, at 10:08 AM, JACK SARFATTI <adastra1@me.com> wrote:

    Ruth wrote:

    "I don't rule out that some deeper theory might eventually be found, that could help answer ultimate questions in more specific terms. But it hasn't been demonstrated, to my knowledge, that one has to have violations of Born Rule in order to explain life." (end quote)

    To the contrary, it has been demonstrated in my opinion. First start with Brian's paper "On the biological utilization of nonlocality" with the Greek physicist whose name escapes me for the moment.

    Second: Lecture 8 of http://www.tcm.phy.cam.ac.uk/~mdt26/pilot_waves.html

    Specifically, how the Born rule depends on violation of the generalized action-reaction (relativity) principle that Q has no sources. Q pilots matter without direct back-reaction of matter on Q.

    In other words, orthodox quantum theory treats matter beables as test particles! - clearly an approximation.

    Obviously signal nonlocality violating no-signaling theorems has a Darwinian advantage. Indeed, without it, entanglement appears as static noise locally. Imagine that Alice's and Bob's minds are each represented by a giant macroscopic coherent entangled quantum potential Q(A,B). It would obviously be a survival advantage for Alice and Bob to directly send messages to each other at a distance like the Australian aborigines do in the Outback. Now use scale invariance. It's obviously an advantage for separate nerve cells in our brains to do so. Also in terms of the morphological development of the organism - signal nonlocality is an obvious plus, which I think is part of Brian Josephson's message in that paper.

    Third:

    Subquantum Information and Computation
    Antony Valentini
    (Submitted on 11 Mar 2002 (v1), last revised 12 Apr 2002 (this version, v2))
    It is argued that immense physical resources - for nonlocal communication, espionage, and exponentially-fast computation - are hidden from us by quantum noise, and that this noise is not fundamental but merely a property of an equilibrium state in which the universe happens to be at the present time. It is suggested that 'non-quantum' or nonequilibrium matter might exist today in the form of relic particles from the early universe. We describe how such matter could be detected and put to practical use. Nonequilibrium matter could be used to send instantaneous signals, to violate the uncertainty principle, to distinguish non-orthogonal quantum states without disturbing them, to eavesdrop on quantum key distribution, and to outpace quantum computation (solving NP-complete problems in polynomial time).
    Comments: 10 pages, Latex, no figures. To appear in 'Proceedings of the Second Winter Institute on Foundations of Quantum Theory and Quantum Optics: Quantum Information Processing', ed. R. Ghosh (Indian Academy of Science, Bangalore, 2002). Second version: shortened at editor's request; extra material on outpacing quantum computation (solving NP-complete problems in polynomial time)
    Subjects: Quantum Physics (quant-ph)
    Journal reference: Pramana - J. Phys. 59 (2002) 269-277
    DOI: 10.1007/s12043-002-0117-1
    Report number: Imperial/TP/1-02/15
    Cite as: arXiv:quant-ph/0203049
    (or arXiv:quant-ph/0203049v2 for this version)

It's clear that DK's scheme won't work - nor will any scheme that is based on unitary linear orthodox quantum theory using orthogonal base states.
However, concerning Valentini's, Josephson's, Weinberg's, Stapp's & my approaches, each different from and independent of DK's: while the trace operation used to get expectation values of observables from quantum density matrices is invariant under unitary transformations of the base states, which preserve orthogonality, that is not true for the transformation from an orthogonal Fock basis to the non-orthogonal Glauber coherent-state basis - clearly a non-unitary transformation that is OUTSIDE the domain of validity of orthodox quantum theory. I think many pundits have missed this point.

Hawking's former assistant Bernard Carr spells this out clearly in "Can Psychical Research Bridge the Gulf Between Matter and Mind?", Proceedings of the Society for Psychical Research, Vol. 59, Part 221, June 2008.

Begin forwarded message:

From: nick herbert <quanta@cruzio.com>
Subject: Re: AW: AW: More on the |0>|0> term
Date: June 14, 2013 11:14:57 AM PDT
To: Suda Martin <Martin.Suda.fl@ait.ac.at>


Thank you, Martin.
I finally get it.
My confusion lay in the attribution of the short calculation below.
I thought this calculation (which leads to rA) was due to Gerry.

Instead it is a calculation done by Gerry but attributed to DK.
It was not a calculation that DK ever carried out but
arose from Gerry taking Gerry's FULL CALCULATION,
applying the Kalamidas approximation
and getting an incorrect result.

The correct result is Zero
on which you and Gerry agree.

So if Kalamidas had carried out the calculation this way
he would have gotten an incorrect answer.

I hope I have now understood the situation correctly.

But Kalamidas did not carry out the calculation that Gerry displays.
DK did not start out with the FULL CALCULATION and then approximate.

DK starts with an approximation and then calculates.

DK starts with an approximation and carries out a series of steps which all seem to be valid
but whose conclusion is preposterous. Furthermore the approximation (weak coherent states)
is an approximation used in dozens of laboratories by serious quantum opticians without
as far as I am aware leading to preposterous or impossible conclusions.

Therefore it seems to me that the calculation below is another nail in the Kalamidas coffin, BUT
THE BEAST IS STILL ALIVE.

1. No one yet has started with Kalamidas's (approximate) assumptions, and discovered a mistake in his chain of logic.

2. No one yet has started with Kalamidas's (approximate) assumptions, followed a correct chain of logic and shown that FTL signaling does not happen.

Martin Suda came the closest to carrying out problem #2. He started with the Kalamidas (approximation) assumptions and decisively proved that all FTL terms are zero. But Martin's proof contains an unphysical |0>|0> term that mars his triumph.

I am certain that the Kalamidas claim is wrong. The FULL CALCULATION refutations of Ghirardi, Howell and Gerry are pretty substantial coffin nails. But unless I am blind there seems still something missing from a clean and definitive refutation of the Kalamidas claim. See problems #1 and #2 above.

I do not think that Nick is being stubborn or petty in continuing to bring these problems to your attentions. I should think it would be a matter of professional pride to be able to bring this matter to a clean and unambiguous conclusion by refuting Kalamidas on his own terms.

Thank you all for participating in this adventure whatever your opinions.

Nick Herbert


On Jun 14, 2013, at 3:29 AM, Suda Martin wrote:

Nick,

Thank you for comments!

I would still like to explain my short considerations below a bit more precisely, anyway. I feel there was perhaps something unclear as regards my email (12th June), because you wrote "you were confused".

I only considered the following:

DK disclosed a calculation (see attachment) which is completely wrong because he took a mathematical limit (see the first line, where he omitted the term ra^{+}_{a3}) that is absolutely not justifiable here (just as CG mentioned, see below), because both parts are equally important if you evaluate the expectation value properly. If you take both parts you get exactly zero: alpha^{*}(tr^{*}+rt^{*})=0.
So one does not obtain a quantity like (r alpha)^{*}.
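The cancellation Suda describes is forced by the Stokes (reciprocity) relations of any lossless beam splitter. A minimal numeric sketch, assuming the common convention with t real and r carrying a 90-degree phase (any unitary convention obeys the same relations):

```python
import math

def beam_splitter(theta):
    """Lossless beam-splitter amplitudes; convention: t real, r carries
    a 90-degree phase (any unitary convention obeys the same relations)."""
    t = math.cos(theta)
    r = 1j * math.sin(theta)
    return t, r

for theta in (0.1, math.pi / 4, 1.2):
    t, r = beam_splitter(theta)
    # Unitarity: |t|^2 + |r|^2 = 1
    assert abs(abs(t) ** 2 + abs(r) ** 2 - 1.0) < 1e-12
    # Stokes reciprocity: t r* + r t* = 0, hence alpha*(t r* + r t*) = 0 exactly
    assert abs(t * r.conjugate() + r * t.conjugate()) < 1e-12
print("t r* + r t* = 0 for every lossless beam splitter tested")
```

No approximation enters: the zero holds identically for every splitting ratio, which is why dropping one of the two terms changes the answer qualitatively.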

That’s all. There is absolutely no discrepancy between me and CG.

Nice regards,
Martin




-----Ursprüngliche Nachricht-----
Von: nick herbert [mailto:quanta@cruzio.com]
Gesendet: Mittwoch, 12. Juni 2013 23:33

Betreff: Re: AW: More on the |0>|0> term

"And again, the notion that an alleged approximate calculation (I say "alleged" because as with everything else there are correct and incorrect approximate calculations) based on a weak signal coherent state somehow trumps an exact computation valid for any value of the coherent state parameter, is, well, just insane. If you want to see where things go wrong just take more terms in the series expansions. Add up enough terms and, voilà, no effect! One can't get much more specific than that." --Christopher Gerry

Actually, Chris, one can get much more specific than that by explicitly displaying the Correct Approximation Scheme (CAS) and showing term by term that Alice's interference vanishes (to the proper order of approximation).

Absent a correct CAS and its refutation these general claims are little more than handwaving.

Produce a CAS.
Refute it.

Is anyone up to this new Kalamidas challenge?
Or does everyone on this list except me
consider deriving a CAS a waste of time?

Nick Herbert

On Jun 12, 2013, at 2:03 PM, CHRISTOPHER GERRY wrote:

We are both right: the two terms cancel each other out!  That the
whole expectation value is zero is actually exactly what's in our
paper's Eq. 9. This happens because the reciprocity relations must
hold. That Kalamidas thought (or maybe even still thinks) his
calculation is correct is at the heart of the matter; that is, he is
either unable to do the calculations or he can do them but
chooses not to because they don't get him where he wants to go.

The Kalamidas scheme will not work, on the basis of general
principles, as we showed in the first part of our paper (see also
Ghirardi's paper).

And again, the notion that an alleged approximate calculation (I say
"alleged" because as with everything else there are correct and
incorrect approximate calculations) based on a weak signal coherent
state somehow trumps an exact computation valid for any value of the
coherent state parameter, is, well, just insane. If you want to see
where things go wrong just take more terms in the series expansions.
Add up enough terms and, voilà, no effect! One can't get much more
specific than that.

Christopher C. Gerry
Professor of Physics
Lehman College
The City University of New York
718-960-8444
christopher.gerry@lehman.cuny.edu


---- Original message ----
Date: Wed, 12 Jun 2013 12:28:16 -0700
From: nick herbert <quanta@cruzio.com>
Subject: Re: AW: More on the |0>|0> term
To: Suda Martin
All--

Excuse me for being confused.
Gerry refutes Kalamidas by showing that an omitted term is large.
Suda refutes Kalamidas by showing that the same term is identically
zero.
What am I missing here?

I wish to say that I accept the general proofs. Kalamidas's scheme
will not work as claimed.
That is the bottom line. So if the general proofs say FTL will fail
for full calculation, then it will certainly fail for approximations.

The "weak coherent state" is a common approximation made in quantum
optics. And dozens of experiments have been correctly described using
this approximation. So it should be a simple matter to show if one
uses Kalamidas's approximation, that FTL terms vanish to the
appropriate level of approximation. If this did not happen we would
not be able to trust the results of approximation schemes not
involving FTL claims.

Gerry's criticism is that Kalamidas's scheme is simply WRONG--that he
has thrown away terms DK regards as small.
But in fact they are large. Therefore the scheme is flawed from the
outset.

If Gerry is correct, then it seems appropriate to ask: Is there a
CORRECT WAY of formulating the Kalamidas scheme using the "weak
coherent state" approximation, where it can be explicitly shown that
this correct scheme utterly fails?

It seems to me that there are still some loose ends in this Kalamidas
affair, if not a thorn in the side, at least an unscratched itch.

It seems to me that closure might be obtained. And the Kalamidas
affair properly put to rest if everyone can agree that 1. DK has
improperly treated his approximations; 2. Using the CORRECT
APPROXIMATION SCHEME, the scheme abjectly fails just as the exact
calculation says it must.

Why should it be so difficult to construct a correct description of
the Kalamidas proposal, with CORRECT APPROXIMATIONS, and show that it
fails to work as claimed?

As seen from the Ghirardi review, there are really not that many
serious FTL proposals in existence. And each one teaches us
something -- mostly about some simple mistakes one should not make when thinking
about quantum systems. Since these proposals are so few, it is really
not a waste of time to consider them in great detail, so we can learn
to avoid the mistakes that sloppy thinking about QM brings about.

When Ghirardi considers the Kalamidas scheme in his review, I would
consider it less than adequate if he did not include the following
information:

1. Kalamidas's scheme is WRONG because he treats approximations
incorrectly.
2. When we treat the approximations correctly, the scheme fails, just
as the general proofs say it must.

Gerry has provided the first part of this information. What is
seriously lacking here is some smart person providing the second
part.

Nick Herbert


On Jun 12, 2013, at 8:50 AM, Suda Martin wrote:

Dear all,

Yes, if one calculates precisely the Kalamidas - expression given in
the attachment of the email of CG one obtains exactly

alpha^{*}(tr^{*}+rt^{*})=0

due to the Stokes-relation of beam splitters. No approximations are
necessary. So, I am astonished about the sloppy calculations of
Demetrios.

Cheers,
Martin

________________________________________
Von: CHRISTOPHER GERRY [CHRISTOPHER.GERRY@lehman.cuny.edu]

Betreff: Re: More on the |0>|0> term

I probably shouldn't jump in on this again, but...

I can assure you that there's no thorn in the side of the quantum
optics community concerning the scheme of Kalamidas. There are only
people doing bad calculations. Despite claims to the contrary, our
paper, as with Ghirardi's, does specifically deal with the Kalamidas
proposal. It is quite clearly the case that EXACT calculations in
the Kalamidas proposal shows that the claimed effect disappears. To
suggest that it's there in the approximate result obtained by series
expansion, and therefore must be a real effect, is simply
preposterous. All it means is that the approximation is wrong, in
this case due to the dropping of important terms.

The whole business about the |00> and whatever (the beam splitter
transformations and all that) is not the issue. I'm astonished at
how the debate on this continues. The real problem, and I cannot
emphasize it enough, is this: Kalamidas cannot do quantum optical
calculations, even simple ones and therefore nothing he does should
be taken seriously. As I've said before, his calculation of our Eq.
(9), which I have attached here, is embarrassingly wrong. It's
obvious from the expression of the expectation value in the upper
left that there has to be two terms in the result both containing
the product of r and t. But Kalamidas throws away one of the terms
which is of the same order of magnitude as the one he retains. Or
maybe he thinks that term is zero via the quantum mechanical
calculation of its expectation value, which it most certainly is
not.  His limits have been taken inconsistently.  So, he not only
does not know how to do the quantum mechanical calculations, he
doesn't even know how or when the limits should be taken. There's
absolutely no point in debating the meaning of the results of incorrect
calculations. Of course, by incorrectly doing these things he gets
the result he wants, and then thinks it's the duty of those of us
who can do these calculations to spend time showing him why his
calculations are wrong, which he then dismisses anyway.
My point in again bringing this specific calculation of his is not
to say anything about his proposal per se, but to demonstrate the
abject incompetence of Kalamidas in trying to do even the most
elementary calculations.  And if anyone still wonders why I'm angry
about the whole affair, well, what should I feel if some guy unable
to do simple calculations tries to tell established quantum optics
researchers, like me and Mark Hillery, that our paper showing where
he's wrong is "irrelevant"? He doesn't even
seem to know that what he said was an insult.

And finally, the continued claim that the specific proposal of
Kalamidas has not been addressed must simply stop. It has been
repeatedly. I suspect this claim is being made because people don't
like the results of the correct calculations. That's not the problem
of those of us who can carry through quantum optical calculations.

CG


Christopher C. Gerry
Professor of Physics
Lehman College
The City University of New York
718-960-8444
christopher.gerry@lehman.cuny.edu


---- Original message ----
Date: Tue, 11 Jun 2013 14:12:19 -0700
From: nick herbert <quanta@cruzio.com>
Subject: Re: More on the |0>|0> term
To: "Demetrios Kalamidas" <dakalamidas@sci.ccny.cuny.edu>


yer right, demetrios--
the |00> term on the right is always accompanied in Suda's
calculation by a real photon on the left.

But this is entirely non-physical.
No real or virtual quantum event corresponds to this term.

Especially with the high amplitude required for
Suda-interference-destruction.

So your specific approximate FTL scheme despite many general
refutations still remains a puzzlement.

A thorn in the side
of the quantum optics community.

if any think otherwise
let them put on the table
one unambiguous refutation
OF YOUR SPECIFIC PROPOSAL--
not of their own
nor of somebody else's
totally different FTL signaling scheme,

Nick


On Jun 11, 2013, at 1:27 PM, Demetrios Kalamidas wrote:


Nick,

 The EP and CSs do derive from the same laser pulse: part of the
pulse pumps the nonlinear crystal and the other part is split off
accordingly to create the CSs.
 However, you are still missing the point: If no EP pair is
created, then you will certainly get '00' on the right
sometimes.... BUT there will be no left photon in existence. The
problem with the Suda term is that when it appears, it appears
only accompanied by a left photon in a superposition state: ie it
always appears as (10+e01)(00+11).
 Think of it this way: Suppose you just have an EP source that
creates pairs, with one photon going left and the other right.
Imagine that on the right there is a highly transparent BS with, say,
|r|^2=0.001. That means that only one out of every thousand right
photons from the EP are reflected, and 999 are transmitted. So,
this means that for every 1000 counts ON THE LEFT, there will be
999 counts tranmitted on the right. Now introduce, at the other
input of that same BS, a CS so that it has a tiny reflected
portion of amplitude |ralpha>. Allegedly then, there will arise
cases where no photon is found in the transmitted channel with
probability equal to |ralpha|^2. Since alpha is arbitrary, we can
choose |
ralpha|=0.1. This means that the probabilty of getting no
photon in
the transmitted channel will be |ralpha|^2=0.01.....Which now
means that, for every 1000 EP pairs created, we will get 1000
counts on the left, but only 900 counts in the transmitted channel
on the right! Whereas, without the CS in the other channel, there
would be
999 counts on the right for that same 1000 counts on the left.
Demetrios
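DK's counting argument can be restated numerically with his own stated values, |r|^2 = 0.001 and |ralpha| = 0.1. This is only a sketch of the claim exactly as he gives it; whether the no-photon probability in the transmitted channel really is |ralpha|^2 is precisely what the rest of the thread disputes:

```python
# Restating DK's photon accounting with the numbers from his email.
n_pairs = 1000               # EP pairs created: one photon left, one right
R = 0.001                    # intensity reflectivity of the right-hand BS
no_photon_prob = 0.1 ** 2    # DK's claimed |r alpha|^2 for an empty channel

right_without_cs = n_pairs * (1 - R)              # transmitted counts, no CS
right_with_cs = n_pairs * (1 - no_photon_prob)    # counts under DK's claim

print(round(right_without_cs), round(right_with_cs))  # 999 990
```

The discrepancy between 999 and 990 transmitted counts for the same 1000 left-hand counts is the whole content of the claimed signal.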


On Tue, 11 Jun 2013 09:44:42 -0700
nick herbert <quanta@cruzio.com> wrote:
Demetrios--
I don't know how the entangled pair (EP) and CSs are generated.
I supposed all three are created with a single PULSE in a non-linear crystal.
Now one can imagine that this pulse fails to create an EP but
does  create a CS
Then some of Bob's detectors will fire but no ES is formed.
So this kind of process could lead to lots of |0>|0> terms.
However what we need are not "lots of |0>|0> terms" but a precise
amplitude (rA) for the |0>|0> term.
Given our freedom (in the thought experiment world) to
arbitrarily  select
the efficiency of the non-linear crystal, it is hard to see why
the  elusive |0>|0>
term would have exactly the right magnitude and phase to cancel
out  the interference.
Your original FTL scheme still continues to puzzle me.
Nick
On Jun 11, 2013, at 6:54 AM, Demetrios Kalamidas wrote:
Nick,

 The 'entire experimental arrangement' is indeed where the
problem  (mystery) arises:
 When both CSs are generated it is easy to understand that '00'
will arise, simply because each CS has a non-zero vacuum term.
 However, the entire arrangement means inclusion of the
entangled  photon pair:
 Any time that pair is generated, you are guaranteed to get a
photon on the right, regardless of whether the CSs are there.
 So, when entangled pair and CSs are present, there must be at
least one photon at the right. In fact, when only one photon
emerges at the right WE KNOW both CSs were empty.

On Mon, 10 Jun 2013 10:34:30 -0700
nick herbert <quanta@cruzio.com> wrote:
Demetrios--
Sarfatti sent around a nice review of quantum optics
by Ulf Leonhardt that discusses the structure of path-uncertain
photons.
Here is an excerpt:
The interference experiments with single photons mentioned in
Sec.  4.3 have been
performed with photon pairs generated in spontaneous
parametric   downconversion
[127]. Here the quantum state (6.28) of light is essentially
|0_1> |0_2> + ζ |1_1>|1_2>.  (6.29)
In such experiments only those experimental runs count where
photons  are counted,
the time when the detectors are not firing is ignored, which
reduces  the quantum
state to the photon pair
|1_1> |1_2>.
Postselection disentangles the two-mode squeezed
vacuum.
We argued in Sec. 4.3 that the interference of the photon pair
|11> |12> at a 50:50 beam splitter generates the entangled
state   (4.24). Without postselection,
however, this state is the disentangled product of two single-
mode  squeezed vacua,
as we see from the factorization (6.6) of the S matrix. The
notion  of  entanglement
is to some extent relative.
This excerpt suggests a possible origin for Suda's |0>|0> term. In the above process, it's just
In  the above process, it's just
the inefficiency of the down converter that generates a |0>|0>
term.  That won't do the trick.
But in your more complicated situation--containing two properly
timed  coherent states--
when Bohr's "entire experimental arrangement" is considered,
the |0>|0> term may arise naturally with the proper amplitude and phase. It would
correspond to events when
the coherent states were successfully generated but there were
no   events in either upper or lower path.
If this conjecture can be shown to hold true, then the
original   Kalamidas proposal would
be refuted by Suda's calculation.
The trick would be to examine--in a thought experiment way--
exactly  how those two |A> beams
are created--looking for entanglement  with |0>|0> states in
the  part  of the experiment considered in your proposal.
Nick
ref: Ulf Leonhardt's wonderful review of quantum optics,
starting   with reflections from a window pane and concluding
with
Hawking radiation.



  A crisis for Bohm's version of quantum theory
    • Jack Sarfatti re: http://xxx.lanl.gov/pdf/1306.1576.pdf

      Where is the flaw in Valentini's argument that the Born rule is so unstable in it, that orthodox quantum theory would not even work for inanimate simple systems like spectroscopy and scattering where in fact it works so well? It seems "too cheap" (Einstein to Bohm, 1952) that de Broglie's p = gradS works and dp/dt = - grad(V + Q) does not. Q has such beautiful properties explaining spooky quantum weirdness.

      Will coupling to a gauge field help?

      p = gradS - (e/c)A ?

      even though the field harmonic oscillators are also unstable just like the hydrogen atom electron - perhaps when coupled to sources "a miracle happens"? I don't have much hope for that at the moment.

      Of course, I rejoice that the Born probability rule should be unstable - but not too unstable. It should be meta-stable to allow signal nonlocality - post-quantum voodoo "magick without magic" as in http://arxiv.org/abs/quant-ph/0203049 Valentini still seems to believe in that as well, but not with Q. What's wrong with this picture?
Jack Sarfatti
Sunday via Twitter
  • quantum heretic | research and creative discovery | Clemson University http://t.co/6695ZinRX9
    quantum heretic
    clemson.edu
    In the warm winter sunshine, a distinguished man stands on the curb outside a local bank, wearing a casual jacket, his dark, curly hair stranded with silver
  • Jack Sarfatti agreed
    his effective Hamiltonian for 4-port passive devices (beam splitters, interferometers) and for active devices like parametric down converters for making EPR pairs is useful - note the formal analogy with the BCS superconductivity effective Hamiltonian a1a2 + a1*a2*, except with light it's bosons, in BCS fermions.
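    The analogy can be made concrete in a truncated Fock space: the pairing term a1a2 + a1†a2† creates and destroys quanta two at a time, just as the BCS pairing term does for fermions. A sketch (the truncation dimension N = 6 is an arbitrary choice here):

```python
import numpy as np

def destroy(n):
    """Bosonic annihilation operator truncated to an n-level Fock space."""
    return np.diag(np.sqrt(np.arange(1, n)), k=1)

N = 6                          # truncation dimension (arbitrary for the sketch)
a = destroy(N)
I = np.eye(N)
a1 = np.kron(a, I)             # mode 1 acts on the first tensor factor
a2 = np.kron(I, a)             # mode 2 acts on the second

# Pairing term analogous to the BCS effective Hamiltonian (coupling set to 1):
H = a1 @ a2 + a1.conj().T @ a2.conj().T
assert np.allclose(H, H.conj().T)     # Hermitian, as an effective Hamiltonian must be

# Acting on the two-mode vacuum |0,0> creates exactly the pair state |1,1>:
vac = np.zeros(N * N)
vac[0] = 1.0
out = H @ vac
pair_index = 1 * N + 1                # flat index of |1,1> in kron ordering
assert abs(out[pair_index] - 1.0) < 1e-12
assert abs(np.linalg.norm(out) - 1.0) < 1e-12   # nothing but the pair term survives
```

    Acting on the vacuum, only the a1†a2† half contributes, which is the pair-creation structure of parametric down conversion.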

    ps the new Valentini paper claiming that Bohm's Q dynamics violates observation - but de Broglie's dynamics still OK is important.

    of course instability of Born rule collapsing no-signaling glass ceiling is what I am after - actually so is Valentini

    Life is that in my opinion.

    http://www.clemson.edu/glimpse/?p=1177

    http://arxiv.org/abs/1306.1576

    On Jun 10, 2013, at 10:12 AM, nick herbert <quanta@cruzio.com> wrote:

    Thanks, Jack.
    A review of quantum optics
    of astonishlng depth and breadth.
    Who is Ulf Leonhardt?
    Descendant of the Vikings
    who ran the place in the old days?

    On Jun 9, 2013, at 2:08 PM, JACK SARFATTI wrote:

    <QuantumOpticsReview0305007v2.pdf>
  • Jack Sarfatti It seems that special relativity won't save "Bohm dynamics" in Valentini's sense either.

    Valentini et-al write:

    "This is in sharp contrast with de Broglie's dynamics, where efficient relaxation to equilibrium implies that one should expect to see equilibrium at later times (except, possibly, for very long-wavelength modes in the early universe (Valentini 2007, 2008b, 2010; Colin and Valentini 2013)). It is then reasonable to conclude that, while de Broglie's dynamics is a viable physical
    theory, Bohm's dynamics is not. ...

    It might be suggested that Bohm's dynamics is only an approximation, and that corrections from a deeper theory will (in reasonable circumstances) drive the phase-space distribution to equilibrium. Such a suggestion was in fact made by Bohm (1952a, p. 179). While this may turn out to be the case, the fact remains that Bohm's dynamics as it stands is unstable and therefore (we
    claim) untenable.

    In our view Bohm's 1952 Newtonian reformulation of de Broglie's 1927 pilot wave dynamics was a mistake, and we ought to regard de Broglie's original
    formulation as the correct one. Such a preference is no longer merely a matter
    of taste: we have presented concrete physical reasons for preferring de Broglie's dynamics over Bohm's."

    "The above results provide strong evidence that there is no tendency to relax to
    quantum equilibrium in Bohm's dynamics, and that the quantum equilibrium
    state is in fact unstable. It is then reasonable to conclude that if the universe
    started in a nonequilibrium state - and if the universe were governed by Bohm's
    dynamics - then we would not see quantum equilibrium today. The Born rule
    for particle positions would fail, momenta would take non-quantum-mechanical values, and there would be no bound states such as atoms or nuclei. ... the same instability appears if one applies Bohm's dynamics to high-energy field theory. ... Similar results would be obtained for the electromagnetic field, for example, resulting in unboundedly large electric and magnetic field strengths even in the vacuum. This is grossly at variance with observation"

    On Jun 11, 2013, at 12:48 AM, Basil Hiley wrote:

    "Colin and Valentini are not addressing Bohmian non-commutative dynamics that I wrote about in arXiv 1303.6057
    They are considering what Bohm and I called the stochastic interpretation of QM. [see our paper "Non-locality and Locality in the Stochastic Interpretation of Quantum Mechanics, Phys. Reports 172, 93-122, (1989).] That was based on the earlier work of Bohm "Proof that Probability Density Approaches |Ψ|2 in Causal Interpretation of the Quantum Theory", Phys. Rev., 89, no. 2, 458-406, (1953) and the work in Bohm and Vigier, Model of the Causal Interpretation of Quantum Theory in Terms of a Fluid with Irregular Fluctuations, Phys. Rev. 96, no. 1, 208-216, (1954). These approaches add a new stochastic 'sub-quantum' field to 1952 model in order to explain the quantum probability P=|Ψ|^2 as an equilibrium condition in this stochastic background. It should be noted that de Broglie supported these approaches and conclusions in his book "Non-linear Wave Mechanics: a Causal Interpretation", Elsevier, Amsterdam, ch XIII, (1960). All these authors including de Broglie, concluded that under the right assumptions the distribution approaches quantum distribution. Bohm and I gave a brief summary of the essentials that lead to that conclusion. I have not had time to study why Colin and Valentini arrive at a contrary conclusion.

    One of the conclusions of our Phys. Reports paper was that the stochastic model adds the possibility of new features arising beyond those given by the standard QM approach. For example, in sufficiently fast processes, results different from those given by the equilibrium Ψ could result, and further investigation could potentially be useful in giving rise to new physics. We failed to find any new physics that agreed with experiment and therefore abandoned the stochastic approach.

    I find it very surprising that Colin and Valentini set up de Broglie v Bohm in view of what de Broglie himself wrote in his book "Non-linear Wave Mechanics". Just read the book!

    Basil."

    On 10 Jun 2013, at 17:32, JACK SARFATTI wrote:

    11 hours ago via Twitter
    quantum heretic | research and creative discovery | Clemson University http://t.co/6695ZinRX9

    http://arxiv.org/abs/1306.1576
    [1306.1576] Instability of quantum equilibrium in Bohm's dynamics
    arxiv.org

OK, here is a simple case - not the same as Kalamidas's, mind you - that seems to be outside the rules of orthodox quantum theory.

Alice the receiver has an ordinary orthodox quantum bit with base states |0> & |1> for a given orientation of her apparatus which never changes in the experiment. Bob the sender has two distinguishable non-orthogonal Glauber coherent eigenstates |z> and |w> of the non-Hermitian observable boson destruction operator a, where z and w are complex numbers. Right at this point we have violated one of the axioms of orthodox quantum theory in a factual way since Glauber states are facts.

Suppose we have the entangled state

|A,B> = (1/2)^1/2[|0>|z> + |1>|w>]

then using the orthodox Born probability rule in density matrix formulation gives

p(0) = p(1) = (1/2)[1 + |<z|w>|^2]

p(0) + p(1) = 1 +  |<z|w>|^2 > 1

the entanglement signal at Alice's receiver is |<z|w>|^2, violating conservation of Born-rule probability - because the observable is not Hermitian, and a closer examination actually shows a non-unitary time evolution. This is a larger theory that reduces to orthodox quantum theory in the appropriate limit.

Note:



http://en.wikipedia.org/wiki/Coherent_states


Now, we can squirm out of this by an a priori, ad hoc forcing of the non-universal normalization

|A,B>' =  [1 +  |<z|w>|^2]^-1/2|A,B>

giving

p'(0) = p'(1) = 1/2 with no signaling. Note that Bob does not need to use that normalization at all because of Alice's <0|1> = 0.

That's why I use "non-universal" above.

However, it's not clear that Nature works this way without more testing.
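The arithmetic above can be checked directly. A minimal numerical sketch, assuming the standard Glauber coherent-state overlap |<z|w>|^2 = exp(-|z - w|^2); the particular values of z and w are arbitrary illustrations:

```python
import math

def overlap_sq(z, w):
    # |<z|w>|^2 = exp(-|z - w|^2) for Glauber coherent states
    return math.exp(-abs(z - w) ** 2)

z, w = 1.0 + 0.5j, 0.3 - 0.2j
o2 = overlap_sq(z, w)

# Unnormalized |A,B>: Born rule gives p(0) = p(1) = (1/2)(1 + |<z|w>|^2)
p0 = p1 = 0.5 * (1 + o2)
print(p0 + p1)            # 1 + |<z|w>|^2 > 1

# With the ad hoc normalization factor (1 + |<z|w>|^2)^(-1/2):
pn0 = pn1 = p0 / (1 + o2)
print(pn0 + pn1)          # exactly 1
```

The "signal" is precisely the excess p0 + p1 - 1 = |<z|w>|^2, which vanishes only when z and w are far apart (orthogonal in the limit) or when the state is renormalized by hand.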

On Jun 1, 2013, at 1:04 PM, Ghirardi Giancarlo <ghirardi@ictp.it> wrote:


Il giorno 01/giu/2013, alle ore 18:38, JACK SARFATTI <adastra1@me.com> ha scritto:


Ghirardi: I do not agree at all on this. The actual situation is that there has never been a clear-cut indication that in Kalamidas' set-up something (probabilities, outcomes or whatever you want) actually changes at left as a consequence of preparing one or the other state at right, so that it can be used to send faster-than-light signals. It is his duty and not ours to prove that the effect exists. I believe I have argued against its existence, and I have also checked that for the most natural observables at left no difference occurs when you choose one or the other of the two initial states. The game is back to Kalamidas. And, sincerely, I am a little bit disturbed by all this enormous mess and the many inadequate and unjustified statements that have been put forward during the debate. I am not keen to follow the matter any more.

On Jun 1, 2013, at 1:54 PM, Suda Martin <Martin.Suda.fl@ait.ac.at> wrote:

Dear all,
thanks to everybody for emails, papers, contributions to discussion and comments. I enjoyed very much the highly interesting dialogues. I can fully agree with the arguments of CG and GG, of course.
Only a comment with respect to the question of the approximation:
As regards the approximation done in the calculation of DK, I would like to point out again - and I sent a pdf called Interf_BS_50_50_Suda.pdf two days ago - that because of this approximation the normalization of the output wave function behind the 50/50 BS has been changed to (1+2|alpha|^2+|alpha|^4), see Eq.(7), instead of being exactly 1. The probabilities for the potential "interference part" (see Eq.(6)) are (|p_10|^2+|p_01|^2)/4 = 2|alpha|^2 and the other parts together give 2(|q_10|^2+|q_01|^2)/4 = 1+|alpha|^4. One recovers therefore precisely the modified normalization of Eq.(7). One can clearly see that the "interference part" and the other parts are outcomes of an incorrect normalization.
Nice regards,
Martin
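Suda's bookkeeping can be verified numerically. A minimal sketch using only the quantities quoted in his email (the values of |alpha| are arbitrary illustrations):

```python
# Check that the "interference part" plus the remaining parts reproduce
# the modified normalization 1 + 2|alpha|^2 + |alpha|^4 of Eq.(7).
for a in (0.1, 0.5, 1.3):
    interference = 2 * a**2         # (|p_10|^2 + |p_01|^2)/4
    remainder = 1 + a**4            # 2(|q_10|^2 + |q_01|^2)/4
    norm = 1 + 2 * a**2 + a**4      # Eq.(7) normalization
    assert abs(interference + remainder - norm) < 1e-12
print("parts sum to the modified normalization")
```

As Suda notes, the apparent interference term is exactly the amount by which the approximate state fails to be normalized, so it carries no signal.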

Begin forwarded message:

From: CHRISTOPHER GERRY <CHRISTOPHER.GERRY@lehman.cuny.edu>
Subject: Re: The Kalamidas affair
Date: June 1, 2013 9:46:37 AM PDT
To: nick herbert <quanta@cruzio.com>
Cc: Ghirardi Giancarlo <ghirardi@ictp.it>, Demetrios Kalamidas <dakalamidas@sci.ccny.cuny.edu>, John Howell <howell@pas.rochester.edu>, Suda Martin <martin.suda.fl@ait.ac.at>, Ruth Kastner <rekastner@hotmail.com>, JACK SARFATTI <adastra1@me.com>, "Romano rromano@iastate.edu [MATH]" <rromano@iastate.edu>

Nick and everyone,

The specific failings of the Kalamidas proposal have, in fact, been pointed out in the papers you mentioned and elsewhere. I don't understand why anyone continues to say otherwise. To say that they have not been addressed does not make it so, and comes off as merely an act of denial. This has been an interesting episode, but I think it's time to stop beating a dead horse. Chris


On Jun 1, 2013, at 9:13 AM, nick herbert <quanta@cruzio.com> wrote:

Kalamidas fans--

NH: I believe that everyone is in agreement that general considerations prove that the Kalamidas proposal must fail.

JS: Yes

In both Ghirardi's and Gerry's papers, they emphasize these general considerations and decline to engage in the specifics of Kalamidas's calculations. Whether one wishes to engage the specifics or not is a matter of taste. But Kalamidas is asking us to engage in specifics. As he puts it: Since you know that I am wrong, it should be "easy pickins" to
point out exactly where I am mistaken.

Gerry comes closest to meeting Kalamidas's challenge to move out of the safety of generalities and deal with specifics.

In the conclusion of Gerry's paper he states "Clearly, if the exact calculation shows no interference, but the approximate calculation does, there is something wrong with the approximate calculation. Looking at Eq 6, one notes that while some terms to order rA have been kept in going from 6a to 6c, the terms labeled "vanishing" in Eq 6b are also of this order and have been discarded. Thus the approximate calculation in {1} is inconsistent and wrong."

Gerry engages in specifics. He is meeting Kalamidas on his own terms. But he neglects to specify exactly which terms of order rA Kalamidas has mistakenly labeled as "vanishing". When Gerry displays these wrongly-neglected terms (perhaps in an informal note), he will have definitively "slain the beast in his own lair" and we can all get on with the non-Kalamidas aspects of our lives.

JS: Agreed, thanks Nick :-)

Nick

PS: There is still the fascinating Martin Suda Paradox which was discovered in the context of the Kalamidas refutation, but that is a separate issue altogether.

JS: What is that Nick? Please give details.

Begin forwarded message:

From: JACK SARFATTI <adastra1@me.com>
Subject: [ExoticPhysics] Fwd: The Kalamidas affair
Date: June 1, 2013 7:45:42 AM PDT
To: Exotic Physics <exoticphysics@mail.softcafe.net>
Reply-To: Jack Sarfatti's Workshop in Advanced Physics <exoticphysics@mail.softcafe.net>

Sent from my iPad


Subject: Re: The Kalamidas affair

Yes, I agree with this.
Any attempt at signaling within the axioms of orthodox quantum theory will fail, e.g. Adrian Kent's papers.
However, Antony Valentini, myself and others (Stapp, Weinberg, Josephson) have all independently proposed several extensions giving a more general non-orthodox post-quantum theory containing orthodox quantum theory as a limiting case. In particular, the non-Hermitian boson destruction operator is a macroscopic observable with Glauber coherent eigenstates that are non-orthogonal yet distinguishable, violating orthodox quantum theory. Furthermore, they obey a non-unitary dynamics given by the c-number Landau-Ginzburg equation for the local order parameters that emerge from spontaneously broken symmetry ground/vacuum states. These order parameters entangle with others and also with orthodox qubits, so we have a new, larger theory here, analogous to general relativity in relation to special relativity.

Furthermore, there is no violation of the group structure of relativity, because intervals are frame invariant and what matters is the interval between actual irreversible detections. What is violated is the retarded causality axiom appended to relativity, which is ad hoc like Euclid's fifth axiom. Again the analogy to non-Euclidean geometry is appropriate.

Sent from my iPad

On Jun 1, 2013, at 6:40 AM, CHRISTOPHER GERRY <CHRISTOPHER.GERRY@lehman.cuny.edu> wrote:

Everyone,

I'm in total agreement with Prof. Ghirardi's assessment. The beam splitter transformations are not the essential point here: even if they are done correctly, the claimed effect goes away. We addressed the beam splitter issue in our comment to demonstrate that sloppy calculations in general are contained in the Kalamidas paper. We then assumed that the one case of his t and r parameters that would satisfy the reciprocity relations actually held, thus ensuring that his transformations did not violate unitarity (for that one case!), and from there showed via an exact calculation that the effect disappears. As I said, it will disappear even with totally correct, unitary beam splitter transformations, just as stated by Prof. Ghirardi. Chris
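Gerry's unitarity point can be illustrated with the conventional symmetric 50/50 beam splitter. This is a standard textbook convention, not Kalamidas's specific t, r choice; a minimal sketch:

```python
import numpy as np

# A lossless beam splitter must act unitarily on the two input modes.
# Standard symmetric 50/50 convention: t = 1/sqrt(2), r = i/sqrt(2).
t, r = 1 / np.sqrt(2), 1j / np.sqrt(2)
B = np.array([[t, r],
              [r, t]])

assert np.allclose(B.conj().T @ B, np.eye(2))               # unitarity
assert abs(abs(t)**2 + abs(r)**2 - 1) < 1e-12               # |t|^2 + |r|^2 = 1
assert abs(t * r.conjugate() + r * t.conjugate()) < 1e-12   # t r* + r t* = 0
print("50/50 beam splitter transformation is unitary")
```

Any t, r pair that fails these reciprocity relations describes a non-unitary (lossy or unphysical) device, which is why the choice of parameters matters in the exact calculation.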



Christopher C. Gerry
Professor of Physics
Lehman College
The City University of New York
718-960-8444
christopher.gerry@lehman.cuny.edu


---- Original message ----
Date: Sat, 1 Jun 2013 14:57:07 +0200
From: Ghirardi Giancarlo <ghirardi@ictp.it>  Subject: The Kalamidas affair  To: CHRISTOPHER GERRY <christopher.gerry@lehman.cuny.edu>, Demetrios Kalamidas <dakalamidas@sci.ccny.cuny.edu>, John Howell <howell@pas.rochester.edu>, nick herbert <quanta@cruzio.com>, Suda Martin <martin.suda.fl@ait.ac.at>, Ruth Kastner <rekastner@hotmail.com>, JACK SARFATTI <adastra1@me.com>, "Romano rromano@iastate.edu [MATH]" <rromano@iastate.edu>

Dear all,
  attached herewith you will find a letter (even though it looks like a paper, for technical reasons) that I have decided to forward to you to make clear the conceptual status of the situation. I hope I have been clear and I await your comments.

With my best regards


GianCarlo


________________
remarks.pdf (83k bytes)
________________


_______________________________________________
ExoticPhysics mailing list
ExoticPhysics@mail.softcafe.net
http://mail.softcafe.net/cgi-bin/mailman/listinfo/exoticphysics

Stephen Hawking's warning on ET Contact
  • Jack Sarfatti Stephen Hawking has warned us to keep a low profile with ET.

    Stephen Hawking's Tips for Contacting E.T.: Everyone Please Just Shut Up - Jack Loftus, gizmodo.com, Apr 25, 2010
    Stephen Hawking aliens warning: Contacting ET 'a bad idea' - Ted Thornhill, metro.co.uk, Apr 25, 2010
    Stephen Hawking says Earth should not phone ET - New Scientist, newscientist.com, Apr 26, 2010
    Stephen Hawking: "We've Been Overlooked by Advanced ET" - dailygalaxy.com, Nov 22, 2011
    Stay home ET. Stephen Hawking: Aliens may pose risks - phys.org, Apr 25, 2010
    Hawking: Aliens may pose risks to Earth - nbcnews.com, Apr 25, 2010
    If Aliens Exist, They May Come to Get Us, Stephen Hawking Says - Clara Moskowitz, space.com, Apr 26, 2010
  • Jack Sarfatti On May 17, 2013, at 11:04 AM, Adam Crowl wrote:

    "Of course no one ever discusses the theories that Eric's collaborator, Jacques Vallee, has about the nature and purpose of UFOs. Too scary? That ET might have a sinister agenda? Might want to mess with our heads for their own ends?"

    I replied:
    Yes, you are correct. However, if Jacques is correct there is even less reason to support clunky rockets for interstellar travel! Indeed, Dan Throop Smith is running with Vallee's ball in his comical eccentric way of course.

    Of course, any advanced civilization with warp-wormhole WEAPONRY will also most likely have post-quantum signal nonlocality mind-control psychotronics.
    Subquantum Information and Computation
    Antony Valentini
    (Submitted on 11 Mar 2002 (v1), last revised 12 Apr 2002 (this version, v2))
    It is argued that immense physical resources - for nonlocal communication, espionage, and exponentially-fast computation - are hidden from us by quantum noise, and that this noise is not fundamental but merely a property of an equilibrium state in which the universe happens to be at the present time. It is suggested that 'non-quantum' or nonequilibrium matter might exist today in the form of relic particles from the early universe. We describe how such matter could be detected and put to practical use. Nonequilibrium matter could be used to send instantaneous signals, to violate the uncertainty principle, to distinguish non-orthogonal quantum states without disturbing them, to eavesdrop on quantum key distribution, and to outpace quantum computation (solving NP-complete problems in polynomial time).
    Comments: 10 pages, Latex, no figures. To appear in 'Proceedings of the Second Winter Institute on Foundations of Quantum Theory and Quantum Optics: Quantum Information Processing', ed. R. Ghosh (Indian Academy of Science, Bangalore, 2002). Second version: shortened at editor's request; extra material on outpacing quantum computation (solving NP-complete problems in polynomial time)
    Subjects: Quantum Physics (quant-ph)
    Journal reference: Pramana - J. Phys. 59 (2002) 269-277
    DOI: 10.1007/s12043-002-0117-1
    Report number: Imperial/TP/1-02/15
    Cite as: arXiv:quant-ph/0203049
    (or arXiv:quant-ph/0203049v2 for this version)

    They will have solved the mind-matter problem perhaps along the lines I have suggested well described here by Michael Towler in Lecture 8

    http://www.tcm.phy.cam.ac.uk/~mdt26/pilot_waves.html
    Explaining the Paranormal with Physics - Debate with Garrett Moddel
    stardrive.org
    Stardrive, ISEP, Internet Science Education Project

    Why Study Entanglement? Live version of presentation:
http://streamer.perimeterinstitute.ca/Flash/4e43d53a-9611-4e3d-a012-89b94af61d89/viewer.html


(Flash Presentation, MP3, Windows Presentation, Windows Video File, PDF)
  Explaining the Paranormal with Physics
    • Jack Sarfatti Garret Moddel
      Professor, Electrical, Computer & Energy Engineering
      University of Colorado
      Quantum Engineering Lab: http://ecee.colorado.edu/~moddel/QEL/index.html
      PsiPhen Lab: http://psiphen.colorado.edu

      On Jan 17, 2013, at 6:17 PM, Garret Moddel wrote:

      Thank you for the respect!

      The answer is clearly not (1), but that does not mean it is (2). It could be none of the above.

      Jack: Again I strongly disagree. You are opting for no-explanation or perhaps a non-scientific supernatural explanation. It's obvious to my mind, and I think to many others that quantum entanglement when supplemented with signal nonlocality beyond orthodox quantum theory has all the properties in a natural way that the evidence demands. Now, ultimately to paraphrase Einstein - the correspondence of theory with experiment depends upon the "free invention of the human imagination" into making a coherent narrative. Either you grok it or you don't. Ultimately it comes down to intuitive judgement I suppose. That one can sense events which have not happened before they happen, but which will happen in a Novikov loop in time makes perfect sense in the coherent narrative (paradigm) of entanglement + signal nonlocality. This idea is Popper falsifiable. Without signal nonlocality the kind of evidence you say you believe could not possibly occur.

      The basic no-signal arguments of orthodox quantum theory assert that looking locally at one part B of an entangled system will only show perfectly random noise independent of how one changes the parameter settings (e.g. orientation of a Stern-Gerlach magnet) of a detector of a distantly entangled part A. With signal nonlocality that is no longer the case and a non-random signal can be detected at B's detector depending on the local time sequence of parameter settings for A's detector - without the need for a classical signal key to decrypt the entangled message as in orthodox quantum theory. Moreover, the spatio-temporal separation between the paired detections of A & B do not matter at all. Entanglement is independent of the space-time separation between the irreversible detections of A & B even if A the active sender is in the timelike future of B the passive receiver.
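The orthodox no-signal claim in the paragraph above can be checked numerically for an ordinary qubit pair. A minimal sketch for the Bell state (|00> + |11>)/sqrt(2): B's outcome statistics are independent of the orientation theta of A's Stern-Gerlach analyzer (the angles and helper name below are illustrative, not from the original):

```python
import numpy as np

SIGMA = np.array([[[0, 1], [1, 0]],        # sigma_x
                  [[0, -1j], [1j, 0]],     # sigma_y
                  [[1, 0], [0, -1]]])      # sigma_z

def b_marginal(theta):
    # Bell state (|00> + |11>)/sqrt(2), density matrix rho[i,a,j,b]
    psi = np.array([1, 0, 0, 1]) / np.sqrt(2)
    rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
    # A's projectors along an axis at angle theta in the x-z plane
    n = np.array([np.sin(theta), 0.0, np.cos(theta)])
    p_plus = (np.eye(2) + np.tensordot(n, SIGMA, axes=1)) / 2
    p_minus = np.eye(2) - p_plus
    # B's unconditioned marginal: Tr_A[(P x I) rho], summed over A's outcomes
    rho_b = (np.einsum('ik,kaib->ab', p_plus, rho)
             + np.einsum('ik,kaib->ab', p_minus, rho))
    return np.real(np.diag(rho_b))

print(b_marginal(0.0))   # [0.5 0.5]
print(b_marginal(1.2))   # [0.5 0.5] -- identical, whatever A's setting
```

Without access to A's outcome record, B sees the same 50/50 noise for every setting; "signal nonlocality" as described above would require this marginal to depend on theta, which orthodox quantum theory forbids.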

      Bottom line, you are happy not to have any explanation rooted in known physical theory. I am not happy with that, given that there is a natural explanation available that only requires a minimal extension of quantum physics analogous to extending special relativity to general relativity, or extending classical mechanics to orthodox quantum mechanics, or re-interpreting classical thermodynamics in terms of kinetic theory of gases and then beyond to classical statistical mechanics.

      Garrett: If we had been discussing solutions to the ultraviolet catastrophe in the late 19th century and you offered me (1) classical thermodynamics, or (2) natural radical conservative extensions of orthodox Maxwell equations, that would be too limited a choice. None-of-the-above would have included the Planck distribution and quantum mechanics. We may well be in a similar situation here.

      Jack: I think you are making a simple problem more complex. To my mind at least entanglement with signal nonlocality is a perfectly obvious natural explanation and why you cannot see that surprises me.

      Garrett: The only way I know of to distinguish whether natural radical conservative extensions of orthodox quantum theory do resolve the issue would be if they provided testable, and falsifiable, predictions that are then tested.

      Jack: You have put the cart before the horse. The kinds of evidence you say you believe is precisely what to expect from entanglement + signal nonlocality! Indeed, the ABSENCE of the kind of evidence you say you believe would have been the POPPER FALSIFICATION of the entanglement + signal nonlocality explanation!

      Now, in dealing with human subjects of enormous complexity with many variables we cannot control, you can't expect the kind of quantitative comparison of numerical data with equations that we get in Newtonian celestial mechanics or in the radiative corrections to quantum electrodynamics etc. If you are looking for that, you won't get it. However, given the idea that entanglement + signal nonlocality is the mechanism of consciousness itself, one may hope to mimic it in the laboratory with nano-engineering naturally conscious solid-state android brains for example - conscious computers. Such things become thinkable scientifically.
    • Jack Sarfatti BTW in case you are not aware of this:
      Subquantum Information and Computation
      Antony Valentini
      Journal reference: Pramana - J. Phys. 59 (2002) 269-277
      Cite as: arXiv:quant-ph/0203049
      Excerpts from
    • Jack Sarfatti Theoretical model of a purported empirical violation of the predictions of quantum theory

      Henry P. Stapp

      (Originally published in Physical Review A, Vol.50, No.1, July 1994)

      ABSTRACT: A generalization of Weinberg's nonlinear quantum theory is used to model a reported violation of the predictions of orthodox quantum theory.
      I. INTRODUCTION

      This work concerns the possibility of causal anomalies. By a causal anomaly I mean a theoretical or empirical situation in which the occurrence or nonoccurrence of an observable event at one time must apparently depend upon a subsequently generated (pseudo) random number, or willful human act.

      Considerations of the Einstein-Podolsky-Rosen [1] and Bell's-Theorem [2] type entail [3] -- if many-world's interpretations are excluded -- the occurrence of causal anomalies on the theoretical level, provided certain predictions of quantum theory are at least approximately valid. However, those anomalies cannot manifest on the empirical level if the quantum predictions hold exactly [4]. On the other hand, slight departures from the exact validity of the quantum predictions [5] could lead to small but observable causal anomalies [6].

      Empirical causal anomalies have been reported in the past in experiments that appear, at least superficially, to have been conducted in accordance with scientific procedures [7], and the protocols are becoming ever more stringent [8]. I do not enter into the difficult question of assessing the reliability of these reports. The scientific community generally looks upon them with skepticism. But at least part of this skepticism originates not from specific challenges to the protocols and procedures of the works of, for example, Jahn, Dobyns and Dunne [7], but from the belief that such results are not compatible with well-established principles of physics, and hence to be excluded on theoretical grounds. However, it turns out that small modifications of the standard quantum principles would allow some of the most impossible sounding of the reported phenomena to be accommodated. According to the report in Ref. [8], it would appear that in certain experimental situations willful human acts, selected by pseudorandom numbers generated at one time, can shift, relative to the randomness predicted by normal quantum theory, the timings of radioactive decays that were detected and recorded months earlier on floppy discs, but that were not observed at that time by any human observer. Such an influence of an observer backward in time on atomic events seems completely at odds with physical theory. However, a slight modification of normal quantum theory can accommodate the reported data. In the scientific study of any reported phenomena it is hard to make progress without a theoretical description that ties them in a coherent way into the rest of physics.

      The purpose of the present work is to construct, on the basis of an extension of Weinberg's nonlinear generalization of quantum theory [5], a theoretical model that would accommodate causal anomalies of the kind described above. Specifically, the present work shows that the reported phenomena, although incompatible with the main currents of contemporary scientific thought, can be theoretically modeled in a coherent and relatively simple way by combining certain ideas of von Neumann and Pauli about the interpretation of quantum theory with Weinberg's nonlinear generalization of the quantum formalism.

      II. THE THEORETICAL MODEL

      To retain the mathematical structure of quantum theory almost intact, I shall exploit the ideas of von Neumann [9] and Pauli [10], according to which the von Neumann process number 1 (reduction of the wave packet) is physically associated with the mental process of the observer. It is interesting that two of our most rigorous-minded mathematical physicists should both be inclined to favor an idea that is so contrary to our normal idea of the nature of the physical world. Most physicists have, I think, preferred to accept the common-sense idea that the world of macroscopic material properties is factual: e.g., that the Geiger counter either fires or does not fire, independently of whether any observer has witnessed it; and that the mark on the photographic plate is either there or not there, whether anyone observes it or not. Yet it is difficult to reconcile this common-sense intuition with the mathematical formalism of quantum theory. For there is in that structure no natural breakpoint in the chain of events that leads from an atomic event that initiates the chain to the brain event associated with the resulting observational experience. From the perspective of the mathematical physicist the imposition of a breakpoint at any purely physical level is arbitrary and awkward: it would break the close connection between mathematics and the physical world in a way that is mathematically unnatural, and moreover lacks any empirical or scientific justification. From a purely logical perspective it seems preferable to accept the uniformity of nature's link between the mathematical and physical worlds, rather than to inject, without any logical or empirical reason, our notoriously fallible intuitions about the nature of physical reality.
    • Jack Sarfatti Following, then, the mathematics, instead of intuition, I shall adopt the assumption that the Schrodinger equation holds uniformly in the physical world. That is, I shall adopt the view that the physical universe, represented by the quantum state of the universe, consists merely of a set of tendencies that entail statistical links between mental events.

      In fact, this point of view is not incompatible with the Copenhagen interpretation, which, although epistemological rather than ontological in character [11], rests on the central fact that in science we deal, perforce, with connections between human observations: the rest of science is a theoretical imagery whose connection to reality must remain forever uncertain.

      According to this point of view, expressed however in ontological terms, the various possibilities in regard to the detection of a radioactive decay remain in a state of "possibility" or "potentiality," even after the results are recorded on magnetic tape: no reduction of the wave packet occurs until some pertinent mental event occurs.

      By adopting this non-common-sense point of view, we shift the problem raised by the reported results from that of accounting for an influence of willful thoughts occurring at one time upon radioactive decays occurring months earlier to the simpler problem of accounting for the biasing of the probabilities for the occurrence of the thoughts themselves, i.e., a biasing relative to the probabilities predicted by orthodox quantum theory. This latter problem is manageable: Weinberg [5] has devised a nonlinear quantum mechanics that is very similar to quantum theory, but that can produce probabilities that are biased, relative to the probabilities predicted by linear quantum mechanics. Gisin [6] has already pointed out that Weinberg's theory can lead to causal anomalies.

      According to the interpretation of quantum theory adopted here, the mechanical recording of the detection of the products of a radioactive decay generates a separation of the physical world into a collection of superposed "channels" or "branches": the physical world, as represented by the wave function of the universe, divides into a superposition of channels, one for each of the different possible recorded (but unobserved) results. Contrary to common sense the recorded but unobserved numbers remain in a state of superposed "potentia," to use the word of Heisenberg. Later, when the human observer looks at the device, the state of his brain will separate into a superposition of channels corresponding to the various alternative macroscopic possibilities, in the way described by von Neumann [9]. Finally, when the psychological event of observation occurs, the state of the universe will be reduced by a projection onto those brain states that are singled out by the conscious experience of the observer [12].

      If the probabilities associated with the various alternative possibilities for the brain state are those given by orthodox quantum theory, then there can be no systematic positive bias of the kind reported: the probabilities associated with the alternative possible brain events will necessarily, according to the orthodox theory, as explained by von Neumann, agree with those that were determined earlier from the probabilities of the alternative possible detections of radioactive decays: there could be no biasing of those probabilities due to a subsequent willful intent of an observer. However, a generalization of Weinberg's nonlinear quantum mechanics allows the probabilities for the possible reductions of the state of the brain of the observer to be biased, relative to those predicted by orthodox quantum theory, by features of the state of the brain of the conscious observer. If such a feature were the activity of the brain that is associated with "intent," then the effect of the anomalous term in the Hamiltonian would be to shift the quantum probabilities corresponding to the various alternative possible conscious events toward the possibilities linked to his positive intent.

      We turn, therefore, to a description of Weinberg's theory, in the context of the problem of the shifting of the probabilities away from those predicted by orthodox quantum theory, and toward those defined by an "intent" represented by particular features of the state of the brain of the observer.

      Weinberg's nonlinear quantum theory is rooted in the fact that the quantum-mechanical equations of motion for a general quantum system are just the classical equations of motion for a very simple kind of classical system, namely a collection of classical simple harmonic oscillators. Thus a natural way to generalize quantum theory is to generalize this simple classical system.
      [ technicalities deleted... ]
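For orientation, the structure Stapp alludes to can be written compactly. A sketch of Weinberg's scheme (the notation here is mine, not from the excerpt): each amplitude psi_k evolves classically under a Hamiltonian function h(psi, psi*),

```latex
% Weinberg's framework (sketch; notation not from the excerpt):
i\,\frac{d\psi_k}{dt} = \frac{\partial h(\psi,\psi^*)}{\partial \psi_k^*},
\qquad
h_{\mathrm{bilinear}} = \sum_{k,l} \psi_k^*\, H_{kl}\, \psi_l
\;\Longrightarrow\;
i\,\dot{\psi}_k = \sum_l H_{kl}\, \psi_l .
```

For a bilinear h this is exactly the linear Schrodinger equation; any non-bilinear (or, as Stapp argues below, nonreal) choice of h yields the nonlinear generalization in which the quantum probabilities can be biased.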

      This example shows that the reported phenomena, although contrary to orthodox ideas about causality, can be modeled within a Weinberg-type nonlinear quantum theory if the Hamiltonian function h(psi, psi*) is allowed to be nonreal.

      If there are in nature nonlinear contributions of the kind indicated...then it seems likely that biological systems would develop in such a way as to exploit the biasing action. The biasing states, illustrated in the model by the state |chi>, could become tied, in the course of biological evolution, to biological desiderata, so that the statistical tendencies specified by the basic dynamics would be shifted in a way that would enhance the survival of the organism.

      The Weinberg nonlinearities were initially introduced in the present context because of Gisin's result, which showed that these nonlinearities could lead to causal anomalies of the Einstein-Podolsky-Rosen (EPR) kind. However, the considerations given above indicate that those nonlinearities alone cannot produce anomalies of the kind reported in Ref. [8]: a nonreal h is apparently needed to obtain an effect of that kind.

      Because the nonlinear aspect is not obviously needed, one could try to revert to a linear theory. Yet it is important to recognize that in the modeling of acausal effects one has available the more general nonlinear framework.

      If the purported acausal phenomenon is a real physical effect and is explainable in terms of a nonreal h that arises solely in conjunction with nonlinear terms, as in the model given above, then orthodox quantum theory could become simply the linear approximation to a more adequate nonlinear theory.
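The role of a nonreal h can be seen in a toy two-state sketch (my own construction; the technicalities of Stapp's actual model are deleted from the quoted text above). An anti-Hermitian term added to H makes h = <psi|H|psi> nonreal; the evolution is then non-unitary, and after renormalization the outcome probabilities are shifted relative to the orthodox prediction, here toward state |0>, which plays the role of the "intended" outcome.

```python
import numpy as np

# Toy two-state sketch (not Stapp's actual model): an anti-Hermitian
# term i*gamma added to H makes h(psi, psi*) = <psi|H|psi> nonreal.
# The evolution is then non-unitary, and after renormalizing the state
# the outcome probabilities are biased relative to the orthodox
# (Hermitian, gamma = 0) prediction.

H0 = np.array([[0.0, 1.0], [1.0, 0.0]], dtype=complex)  # orthodox QM
gamma = 0.3
H1 = H0 + np.diag([1j * gamma, 0.0])  # nonreal h: "bias" toward |0>

def evolve(Hmat, psi, t, steps=4000):
    """Runge-Kutta integration of dpsi/dt = -i Hmat psi."""
    dt = t / steps
    for _ in range(steps):
        k1 = -1j * Hmat @ psi
        k2 = -1j * Hmat @ (psi + 0.5 * dt * k1)
        k3 = -1j * Hmat @ (psi + 0.5 * dt * k2)
        k4 = -1j * Hmat @ (psi + dt * k3)
        psi = psi + (dt / 6) * (k1 + 2 * k2 + 2 * k3 + k4)
    return psi

psi0 = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)
t = 1.0

p_orthodox = np.abs(evolve(H0, psi0, t)) ** 2      # stays [0.5, 0.5]
psi_b = evolve(H1, psi0, t)
p_biased = np.abs(psi_b) ** 2 / np.linalg.norm(psi_b) ** 2

print("orthodox:", p_orthodox)  # equal weights, as Born's rule demands
print("biased:  ", p_biased)    # weight shifted toward state |0>
```

In the Hermitian case psi0 is an eigenvector of H0, so the probabilities stay at 50/50; with the anti-Hermitian term switched on, the renormalized weight on |0> rises to roughly 0.57 at t = 1. Both the form of the biasing term and its strength are purely illustrative.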

      [1] A. Einstein, B. Podolsky, and N. Rosen, Phys. Rev. 47, 777 (1935).
      [2] J.S. Bell, Physics 1, 195 (1964).
      [3] H.P. Stapp, Phys. Rev. A 47, 847 (1993); 46, 6860 (1992); H.P. Stapp and D. Bedford, Synthese (to be published).
      [4] P. Eberhard, Nuovo Cimento 46B, 392 (1978).
      [5] S. Weinberg, Ann. Phys. (N.Y.) 194, 336 (1989).
      [6] N. Gisin, Phys. Lett. A 143, 1 (1990).
      [7] R. Jahn, Y. Dobyns, and B. Dunne, J. Sci. Expl. 5, 205 (1991); B.J. Dunne and R.G. Jahn, ibid. 6, 311 (1992).
      [8] H. Schmidt, J. Parapsychol. 57, 351 (1993).
      [9] J. von Neumann, Mathematical Foundations of Quantum Mechanics (Princeton University Press, Princeton, 1955), Chap. VI.
      [10] W. Pauli, quoted in Mind, Matter, and Quantum Mechanics (Springer-Verlag, Berlin, 1993), Chap. 7.
      [11] H.P. Stapp, Am. J. Phys. 40, 1098 (1972).
      [12] H.P. Stapp, Mind, Matter, and Quantum Mechanics (Ref. [10]).

      http://www.fourmilab.ch/rpkp/stapp.html
    • Jack Sarfatti Garrett: I don't know of any such predictions and tests for psi phenomena. We've entered the realm of philosophy and may not be able to resolve this for now.

      Jack: Start here:

      Research papers of interest:
    • Jack Sarfatti On Jan 17, 2013, at 3:03 PM, Jack Sarfatti <sarfatti@pacbell.net> wrote:

      I respectfully disagree completely with you. A post-quantum theory for this exists. There are several alternative, independently derived, natural radical conservative extensions of orthodox quantum theory, e.g. by Stapp, Valentini, Cramer, myself, et al., that have entanglement signaling. There are only two possible interpretations of the evidence:
      1) classical electromagnetic, OR 2) quantum entanglement supplemented by non-unitary signal nonlocality. If 1) is false, then 2) is true. There is no other alternative if we accept the data as true. If you have a third rational physical alternative, what is it?

      Sent from my iPhone

      On Jan 17, 2013, at 1:25 PM, Garret Moddel wrote:

      Those examples are evidence for psi, which I have no argument with. In a number of studies my lab has also found robust evidence for psi and retrocausal effects.

      However, to conclude that these are due to quantum entanglement is speculative, and so far unsupported by the evidence. Psi shares characteristics with quantum phenomena and psi does influence quantum states (along with any other statistically fluctuating states). But no quantum theory of psi that I am aware of provides accurate predictions. Until there is a falsifiable (in the Popper sense) theory for psi that incorporates quantum entanglement I will remain skeptical of the connection between the two.

      That is the reason that I stated there is a similarity but no direct connection between psi and quantum entanglement.

      -Garret

      On Jan 14, 2013, at 1:27 PM, jack <sarfatti@pacbell.net> wrote:

      Sent from my iPad

      On Jan 14, 2013, at 11:46 AM, Garret Moddel <Moddel@Colorado.EDU> wrote:

      Chris & Jack-

      Garrett: My statement was based on the standard interpretation of quantum entanglement, in which correlation is maintained but there cannot be any information transferred between the distant particles.
      Jack: Right, but in my opinion the evidence clearly shows that the no-entanglement-signaling theorem is empirically wrong. This is the debate.

      Garrett: I know there are alternative theories, but is there solid evidence of superluminal information transfer in QE? I haven't been following these discussions. It would be great to have evidence that my statement has been shown to be false, because that really would open a lot of doors.

      Jack: Theory along the lines of Stapp, Weinberg, Josephson, myself, Cramer, Valentini, i.e. a radical conservative extension of orthodox QM to include non-unitary nonlinear effects.

      Evidence: presponse Libet, Radin, Bierman, Bem

      Puthoff & Targ SRI

      On Jan 12, 2013, at 7:53 PM, JACK SARFATTI <sarfatti@pacbell.net> wrote:

      Thanks.

      On Jan 12, 2013, at 6:35 PM, Chris W wrote:

      Hey Dr. S,

      Here is a link to Garret Moddel's interview (I was incorrect about it being a talk). The transcript of the interview is on this page. If you search for ....

      Garrett: "There’s a similarity, but there’s no direct connection. For example, quantum entanglement is a phenomenon in which two particles at a distance are inter-related. So if you measure one particle, you affect the other particle, instantly, and as far away as you like."

      Jack: I think Moddel is mistaken. It's a direct connection in my opinion provided that electromagnetic communication (both near and far field) can be excluded. Entanglement with Valentini's signal nonlocality is the only remaining explanation assuming good data.

      Chris: You will find the context of the statement also at 4:11 in the mp3 recording. The statement is not directly related to Radin's research but to PSI. I'm assuming (I'm not an expert in these areas) that the underlying phenomenon is related. The following URL contains the podcast interview.

      http://www.skeptiko.com/garret-moddel-brings-psi-to-colorado/

      Additionally, in case you are interested, I have linked the papers that are related to the Grinberg-Zylberbaum experiment.

      Jack: Yes. Fred Alan Wolf and I knew Jacobo Grinberg in Brazil in 1984. I think he was murdered in Mexico years ago.

      // 2005 Paper TL Richards et al...
      http://www.ncbi.nlm.nih.gov/pubmed/16398586

      // 2004 Paper Standish (TL Richards) et al...
      http://www.ncbi.nlm.nih.gov/pubmed/15165411

      // 2003 Paper by Jiri Wackerman (published in Neuroscience Letters)
      http://www.ncbi.nlm.nih.gov/pubmed/12493602

      Thanks!!!
      chris
      www.skeptiko.com


On Dec 29, 2012, at 2:20 PM, Paul Murad <ufoguypaul@yahoo.com> wrote:

 
This is a very pessimistic perspective.
 
Man by himself is incapable of developing morality and ethics except with God. You mention
death; well, if there is a hell, then the belief that those in it exist without God, or without
what we can assume means love for that matter, may indeed make hell a very empty, desperate place.
 
The crutch that exists may not be fully a religious point but rather a historical view that is part
of mankind's culture. These things happened, are real, and they occurred. Regarding your view about
different religions causing problems, I would have to agree, but I do not see any contradiction
in believing in God and the possibility of reincarnation...
 
To mention Jung-Pauli is child's play... Scientists are only rarely right, and on metaphysical subjects
we do not have the physical evidence to judge truth or falsehood with a clearly defined scientific
investigation.
 
Paul
Paul M,
1) Rupert Sheldrake's morphogenetic field data is direct evidence for the Jung-Pauli information field.

2) The Central Intelligence Agency Stanford Research Institute Remote Viewing data is evidence for the Jung-Pauli information field.

3) Reincarnation data is evidence for the Jung-Pauli information field.

on all of the above see in particular Russell Targ's several new books as well as Hal Puthoff's on-line report.

4) There is a solid theoretical physics basis for it

a) David Bohm's Implicate Order = world hologram screen software on both our past and future cosmic horizons - the Alpha Point past particle horizon and the Omega Point future event horizon shown in my modification of Tamara Davis's PhD fig 1.1c



For details see http://www.tcm.phy.cam.ac.uk/~mdt26/pilot_waves.html (note also Lecture 8)

The work of MIT physicist Seth Lloyd shows that these two cosmological horizons are computers.

I think they are conscious computers i.e. Hawking's Mind of God - literally

See also the papers of Antony Valentini on signal nonlocality

e.g.
Subquantum Information and Computation
Antony Valentini
(Submitted on 11 Mar 2002 (v1), last revised 12 Apr 2002 (this version, v2))
It is argued that immense physical resources - for nonlocal communication, espionage, and exponentially-fast computation - are hidden from us by quantum noise, and that this noise is not fundamental but merely a property of an equilibrium state in which the universe happens to be at the present time. It is suggested that 'non-quantum' or nonequilibrium matter might exist today in the form of relic particles from the early universe. We describe how such matter could be detected and put to practical use. Nonequilibrium matter could be used to send instantaneous signals, to violate the uncertainty principle, to distinguish non-orthogonal quantum states without disturbing them, to eavesdrop on quantum key distribution, and to outpace quantum computation (solving NP-complete problems in polynomial time).
Comments:    10 pages, Latex, no figures. To appear in 'Proceedings of the Second Winter Institute on Foundations of Quantum Theory and Quantum Optics: Quantum Information Processing', ed. R. Ghosh (Indian Academy of Science, Bangalore, 2002). Second version: shortened at editor's request; extra material on outpacing quantum computation (solving NP-complete problems in polynomial time)
Subjects:    Quantum Physics (quant-ph)
Journal reference:    Pramana - J. Phys. 59 (2002) 269-277
DOI:    10.1007/s12043-002-0117-1
Report number:    Imperial/TP/1-02/15
Cite as:    arXiv:quant-ph/0203049
     (or arXiv:quant-ph/0203049v2 for this version)
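The abstract's claim about distinguishing non-orthogonal states can be made concrete with a small numerical sketch (my own; the nonequilibrium probability below is an invented illustrative number, not a prediction of Valentini's paper). Under Born-rule statistics, single-shot discrimination of two non-orthogonal states is bounded away from certainty; nonequilibrium statistics would not be so constrained.

```python
import numpy as np

# Single-shot discrimination of two non-orthogonal states,
#   |A> = |0>   and   |B> = cos(theta)|0> + sin(theta)|1>.
# Under Born-rule ("quantum equilibrium") statistics the two cannot be
# told apart reliably; Valentini's point is that hypothetical
# nonequilibrium matter (rho != |psi|^2) would evade this bound.
theta = np.pi / 6

# Measure in the computational basis; guess A on outcome 0, B on 1.
pA0 = 1.0                   # P(outcome 0 | A)
pB0 = np.cos(theta) ** 2    # P(outcome 0 | B) = 3/4
success_eq = 0.5 * pA0 + 0.5 * (1.0 - pB0)    # = 0.625

# Even the optimal quantum measurement is bounded (Helstrom bound):
success_opt = 0.5 * (1.0 + np.sin(theta))     # = 0.75 < 1

# Hypothetical nonequilibrium statistics: suppose the hidden-variable
# distribution deviates from |psi|^2 so that B yields outcome 0 with
# probability 0.2 instead of 0.75. (Purely illustrative number, not a
# prediction of any specific model.)
pB0_noneq = 0.2
success_noneq = 0.5 * pA0 + 0.5 * (1.0 - pB0_noneq)   # = 0.9 > 0.75

print(f"equilibrium (this strategy): {success_eq:.3f}")
print(f"equilibrium (optimal):       {success_opt:.3f}")
print(f"nonequilibrium (toy):        {success_noneq:.3f}")
```

The toy nonequilibrium success rate exceeds even the optimal equilibrium bound, which is the sense in which nonequilibrium matter would permit statistics "hidden from us by quantum noise."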

Also see the 46 minute raw video of me and Dan Smith discussing this. I look like a frumpy shlepper in it, but the content is good.

www.youtube.com/watch?v=A56hT_51v7I