Stardrive


 

"Let us illustrate the problem of signalling with the assistance of the ubiquitous experimenters Alice and Bob. We will place Alice and Bob at some distance apart, and between them there will be a source emitting pairs of entangled particles. To avoid relativistic complications we will assume that Alice, Bob, their detectors, and the particle source are all mutually at rest in an inertial frame (the ‘lab’ frame). Pair after pair of particles are emitted by the source and detected by Alice and Bob's apparatuses, who record their results. Alice and Bob are free to alter the angle of their detectors with each run of the apparatus.

 
What each experimenter will record is an apparently random sequence of ups and downs, like the results of an honest coin repeatedly tossed; and yet, when they compare results afterward, they will note that certain correlations, generally sinusoidal in form, stand between their results. For example, if the particles are spin-1/2 fermions, and if Alice and Bob are measuring spin in a particular direction, then the correlation between their results will be -cos θ, where θ is the angle between Alice and Bob's detectors. Sinusoidal correlations like these readily violate mathematical inequalities such as those defined by Bell (1964). Itamar Pitowsky (1994) showed that the Bell Inequalities are examples of “conditions of possible experience” first written down by George Boole; these are consistency conditions between measurement results on the assumption that the results of one measurement, and the way it is carried out, do not influence the measurement of the other particle at the time of measurement. This means that the particular sequence of results that Alice and Bob get at their respective detectors could not have been encoded in the particles at the source; for some relative angles their results are too well correlated or anti-correlated for them to be due to local causes built into both particles when they were emitted.” (Kent Peacock, “The No-Signalling Theorems: A Nitpicking Distinction”)
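The -cos θ correlation and its violation of the Bell bound can be checked in a few lines. A minimal sketch, using the quantum singlet prediction in the CHSH form of the inequality with the standard optimal angles:

```python
import math

def E(a, b):
    """Quantum singlet correlation for detector angles a, b (radians): E = -cos(a - b)."""
    return -math.cos(a - b)

# Standard CHSH-optimal settings; any local-hidden-variable model obeys |S| <= 2
a1, a2, b1, b2 = 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ≈ 2.828 > 2: the quantum violation
```

The angles here are the textbook choice that maximizes the violation; any local model built into the particles at the source is capped at 2.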
 
Here is the setup:
 
Bob is closer to the pair source S than Alice is.
 
B — S—————A
 
Bob does not change his settings.
 
Alice, at the last moment, changes her settings in delayed-choice fashion AFTER Bob’s particles in the entangled pairs have already been detected.
 
This is done in pulse fashion so that there is a good statistical sample of particles in each pulse.
 
Each setting (ai, b), with b fixed, yields random outputs 1, 0 for each individual detection.
 
Using the statistical rules of orthodox quantum theory, Alice and Bob compare their raw data after the experiment is over; from the fraction of coincidences in each pulse, Bob can infer the sequence of settings a1, a2, …, aN for the N pulses, which is the encoded message.
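The hindsight decoding can be sketched as a toy simulation (assumed parameters: two possible settings for Alice, Bob's angle fixed at 0, 500 pairs per pulse). Note that the same-outcome fraction only becomes available once the two raw data streams are pooled:

```python
import math
import random

random.seed(42)  # reproducibility

def sample_pair(theta):
    """Sample correlated +/-1 outcomes for one singlet pair at relative detector angle theta."""
    alice = random.choice([+1, -1])        # each marginal is an honest coin
    p_same = math.sin(theta / 2) ** 2      # singlet: P(same outcome) = sin^2(theta/2)
    bob = alice if random.random() < p_same else -alice
    return alice, bob

# Alice's "message": one setting per pulse, chosen from two angles (toy assumption)
settings = [random.choice([0.0, math.pi / 2]) for _ in range(20)]
pairs_per_pulse = 500

decoded = []
for a in settings:
    pulse = [sample_pair(a - 0.0) for _ in range(pairs_per_pulse)]   # Bob fixed at b = 0
    # The same-outcome fraction exists only after BOTH raw data streams are brought together
    same_fraction = sum(x == y for x, y in pulse) / pairs_per_pulse
    decoded.append(math.pi / 2 if same_fraction > 0.25 else 0.0)

print(decoded == settings)  # True: the setting sequence is recovered in hindsight
```

Bob's local record alone is a featureless coin-flip sequence in every pulse; the setting sequence emerges only in the after-the-fact comparison.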
 
It is obvious, since Bob did nothing at all, that Alice’s free-will choices of settings a1, a2, …, aN for the N pulses (which are the message) are the active future cause of the back-from-the-future coincidences, unless you want a paranoid conspiracy theory.
 

 

Now of course this is not Valentini’s “signal nonlocality”; that is a larger theory violating orthodox quantum theory the way general relativity violates special relativity: globally, though not locally. With Valentini’s PQM extension of QM, Bob can know in advance what Alice will choose, even before she chooses it, without doing the hindsight correlation analysis. However, any attempt by Bob to cause a paradox will fail, either for reasons given by Thorne and Novikov or by David Deutsch.
Some new additions to my Stargate book

Stargate 
Making Star Trek Real
Jack Sarfatti
Internet Science Education Project
Foreword

“The future, and the future alone, is the home of explanation.”
Henry Dwight Sedgwick 

“Sarfatti's Cave is the name I'll give to the Caffe Trieste in San Francisco, where Jack Sarfatti, Ph.D. in physics, writes his poetry, evokes his mystical, miracle-working ancestors, and has conducted a several-decade-long seminar on the nature of reality … to a rapt succession of espresso scholars. ... It's Jack Sarfatti against the world, and he is indomitable. …One of his soaring theories is that things which have not happened yet can cause events in the present.” Gold, Herbert. Bohemia: Where Art, Angst, Love & Strong Coffee Meet. 

“There is now a significant body of results on quantum interactions with closed timelike curves (CTCs) in the quantum information literature … As a consequence, there is a prima facie argument exploiting entanglement that CTC interactions would enable superluminal and, indeed, effectively instantaneous signaling. … Using the consistency condition, we show that there is a procedure that allows Alice to signal to Bob in the past via relayed superluminal communications between spacelike-separated Alice and Clio, and spacelike-separated Clio and Bob. This opens the door to time travel paradoxes in the classical domain … offering a possible window on what we might expect in a future theory of quantum gravity.” (Jeffrey Bub and Allen Stairs, “Quantum interactions with closed timelike curves and superluminal signaling,” Phys. Rev. A 89, 022311 (2014))

“In this case, Bob possesses the unknown state even before Alice implements the teleportation. Causality is not violated because Bob cannot foresee Alice’s measurement result, which is completely random. But, if we could pick out only the proper result, the resulting ‘projective’ teleportation would allow us to travel along spacelike intervals, to escape from black holes, or to travel in time.” (Seth Lloyd et al.)

This is a series of blog essays about teleological destiny, quick time travel to colonize Earthlike exoplanets through stargates, and the possibility that we are three-dimensional hologram images in a virtual reality programmed by a cosmological conscious super-intelligence that is alive and well on our future two-dimensional dark-energy edge of space, the farthest region we can ever hope to see with light signals. The speculative conjecture of this book is that our idea of time and cause and effect is profoundly wrong. In particular, the “unproven theorem paradox” of time travel is not a paradox at all.
“The ‘unproved theorem’ paradox points out that if there are CTCs, then it might be possible to take a published proof of a theorem into the past and present it to someone, who then uses it to produce the very manuscript that leads to the theorem’s publication.” (Bub & Stairs, op. cit.)

Evidence on “brain presponse” (Libet, Radin, Bierman, Bem) suggests that our consciousness and creativity are such meme-self-creating strange loops. The universe does not only emerge out of the past; it is also pulled toward the future for a purpose. This idea is not new in philosophy, but it has reappeared in physics starting with the work of John Archibald Wheeler and Richard Feynman in the 1940s. This back-from-the-future effect is needed to understand the nature of both dark matter and dark energy, which make up most of the stuff in our accelerating universe, and, most importantly, to understand our own consciousness and how to reach the stars and beyond.
 
Jack Sarfatti, via Twitter:
  • Jack Sarfatti "There is now a significant body of results on quantum interactions with closed timelike curves (CTCs) in the quantum information literature, for both the Deutsch model of CTC interactions (D-CTCs) and the projective model (P-CTCs). As a consequence, there is a prima facie argument exploiting entanglement that CTC interactions would enable superluminal and, indeed, effectively instantaneous signaling. In cases of spacelike separation between the sender of a signal and the receiver, whether a receiver measures the local part of an entangled state or a disentangled state to access the signal can depend on the reference frame. We propose a consistency condition that gives priority to either an entangled perspective or a disentangled perspective in spacelike-separated scenarios. For D-CTC interactions, the consistency condition gives priority to frames of reference in which the state is disentangled, while for P-CTC interactions the condition selects the entangled state. Using the consistency condition, we show that there is a procedure that allows Alice to signal to Bob in the past via relayed superluminal communications between spacelike-separated Alice and Clio, and spacelike-separated Clio and Bob. This opens the door to time travel paradoxes in the classical domain. Ralph [T. C. Ralph, arXiv:1107.4675 [quant-ph].] first pointed this out for P-CTCs, but we show that Ralph's procedure for a “radio to the past” is flawed. Since both D-CTCs and P-CTCs allow classical information to be sent around a spacetime loop, it follows from a result by Aaronson and Watrous [S. Aaronson and J. Watrous, Proc. R. Soc. A 465, 631 (2009)] for CTC-enhanced classical computation that a quantum computer with access to P-CTCs would have the power of PSPACE, equivalent to a D-CTC-enhanced quantum computer."
  • Jack Sarfatti This is high octane fuel for my starship warp engine with Q continuum telepathic psychokinetic mind-control. 


On Sep 8, 2013, at 11:05 AM, JACK SARFATTI <jacksarfatti@icloud.com> wrote:

"Radin draws attention to the similarities between psi phenomena, where events separated in space and time appear to have a connection which can't be explained by known means of communication, and the entanglement of particles resulting in correlations measured at space-like intervals in quantum mechanics, and speculates that there may be a kind of macroscopic form of entanglement in which the mind is able to perceive information in a shared consciousness field (for lack of a better term) as well as through the senses."
I distinguish two levels of entanglement, "weak" and "strong". The former is consistent with the "no-signal" arguments of mainstream "orthodox" quantum theory. A small minority of "fringe physicists" (including me) think these arguments are circular. With weak entanglement, a third party Eve can in hindsight see patterns of parallel behavior in Alice and Bob, although neither Alice nor Bob is directly aware of what the other is thinking. With strong entanglement (aka "signal nonlocality," A. Valentini) we have what most people think of as telepathy and precognition: Alice knows directly and instantly what Bob is thinking. Indeed, Alice may know ahead of time what Bob will think but hasn't yet thought.

On Sep 8, 2013, at 10:19 AM, JACK SARFATTI <jacksarfatti@icloud.com> wrote:

http://ricochet.com/member-feed/Saturday-night-science-Entangled-Minds

"Parapsychology is small science.  There are only about 50 people in the entire world doing serious laboratory experiments in the field today, and the entire funding for parapsychology research in its first 130 years is about what present-day cancer research expends in about 43 seconds.  Some may say “What has parapsychology produced in all that time?”, but then one might ask the same of much cancer research.

Of the fifty or so people actively involved in parapsychology research, I have had the privilege to meet at least eight, including the author of the work reviewed infra, and I have found them all to be hard-headed scientists who approach the curious phenomena they study as carefully as physical scientists in any other field.  Their grasp of statistical methods is often much better than their more respectable peers in the mainstream publishing papers in the soft sciences.  Publications in parapsychology routinely use double-blind and randomisation procedures which are the exception in clinical trials of drugs.

The effect sizes in parapsychology experiments are small, but they are larger, and their probability of being due to chance is smaller, than the medical experiments which endorsed prescribing aspirin to prevent heart attacks and banning silicone breast implants.  What is interesting is that the effect size in parapsychology experiments of all kinds appears to converge upon a level which, while small, is so far above chance to indicate “something is going on”.

Before you reject this out of hand, I'd encourage you to read the book or view the videos linked below.  Many people who do this research started out to dismiss such nonsense and were enthralled when they discovered there appeared to be something there."
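For scale, the kind of small-but-persistent effect size described in the passage above can be quantified with an ordinary binomial test. A sketch with assumed numbers (a 51% hit rate where chance predicts 50%, pooled over many trials; not figures from any specific study):

```python
import math

def binom_sf(k, n, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p): normal approximation with continuity correction."""
    mean, sd = n * p, math.sqrt(n * p * (1 - p))
    z = (k - 0.5 - mean) / sd
    return 0.5 * math.erfc(z / math.sqrt(2))

n = 100_000        # pooled trials (assumed for illustration)
hits = 51_000      # a 51% hit rate where chance predicts 50%
p_value = binom_sf(hits, n)
print(p_value)     # ~1e-10: a tiny effect, yet vanishingly unlikely to be chance
```

This is the statistical point being made: a 1% deviation is invisible in any single session, but pooled over enough trials its probability of arising by chance collapses toward zero.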


see also

On Sep 7, 2013, at 8:27 PM, nick herbert <quanta@cruzio.com> wrote:

=======================================

An exploration of mind merge
using physics not chemistry
in less than 1000 words..

http://shorts.quantumlah.org/entry/bobandalice-0

The theory of relativity deals with the geometric
structure of a four-dimensional spacetime. Quantum mechanics
describes properties of matter. Combining these
two theoretical edifices is a difficult proposition. For example,
there is no way of defining a relativistic proper
time for a quantum system which is spread all over
space. A proper time can in principle be defined for a
massive apparatus (‘‘observer’’) whose Compton wavelength
is so small that its center of mass has classical
coordinates and follows a continuous world line. However,
when there is more than one apparatus, there is no
role for the private proper times that might be attached
to the observers’ world lines. Therefore a physical situation
involving several observers in relative motion cannot
be described by a wave function with a relativistic
transformation law (Aharonov and Albert, 1981; Peres,
1995, and references therein). This should not be surprising
because a wave function is not a physical object.
It is only a tool for computing the probabilities of objective
macroscopic events.
 
Einstein’s [special] principle of relativity asserts that there are
no privileged inertial frames. 
 
[Comment #3: Einstein's general principle of relativity is that there are no privileged local accelerating frames (AKA LNIFs). In addition, Einstein's equivalence principle says that one can always find a local inertial frame (LIF) coincident with an LNIF (over a small enough region of 4D space-time) in which, to a good approximation, Newton's 1/r^2 force is negligible ("Einstein's happiest thought"). Therefore, Newton's universal "gravity force" is a purely inertial, fictitious pseudo-force, exactly like the Coriolis, centrifugal, and Euler forces that are artifacts of the proper acceleration of the detector, having no real effect on the test particle being measured by the detector. The latter assumes no rigid constraint between detector and test particle. For example, a test particle clamped to the edge r of a uniformly slowly rotating disk will have a real EM force of constraint equal to m ω × (ω × r).]
 
This does not imply the
necessity or even the possibility of using manifestly symmetric
four-dimensional notations. This is not a peculiarity
of relativistic quantum mechanics. Likewise, in classical
canonical theories, time has a special role in the
equations of motion.
 
The relativity principle is extraordinarily restrictive.
For example, in ordinary classical mechanics with a finite
number of degrees of freedom, the requirement that
the canonical coordinates have the meaning of positions,
so that particle trajectories q(t) transform like
four-dimensional world lines, implies that these lines
consist of straight segments. Long-range interactions are
forbidden; there can be only contact interactions between
point particles (Currie, Jordan, and Sudarshan,
1963; Leutwyler, 1965). Nontrivial relativistic dynamics
requires an infinite number of degrees of freedom,
which are labeled by the spacetime coordinates (this is
called a field theory).
 
Combining relativity and quantum theory is not only a
difficult technical question on how to formulate dynamical
laws. The ontologies of these theories are radically
different. Classical theory asserts that fields, velocities,
etc., transform in a definite way and that the equations
of motion of particles and fields behave covariantly. …
 
For example, if the expression for the Lorentz force is written
...in one frame, the same expression is valid
in any other frame. These symbols …. have objective
values. They represent entities that really exist, according
to the theory. On the other hand, wave functions
are not defined in spacetime, but in a multidimensional
Hilbert space. They do not transform covariantly when
there are interventions by external agents, as will be
seen in Sec. III. Only the classical parameters attached
to each intervention transform covariantly. Yet, in spite
of the noncovariance of ρ, the final results of the calculations
(the probabilities of specified sets of events) must
be Lorentz invariant.
 
As a simple example, consider our two observers, conventionally
called Alice and Bob, holding a pair of spin-1/2
particles in a singlet state. Alice measures s_z and finds
+1, say. This tells her what the state of Bob’s particle is,
namely, the probabilities that Bob would obtain + or - 1 if he
measures (or has measured, or will measure) s along
any direction he chooses. This is purely counterfactual
information: nothing changes at Bob’s location until he
performs the experiment himself, or receives a message
from Alice telling him the result that she found. In particular,
no experiment performed by Bob can tell him
whether Alice has measured (or will measure) her half
of the singlet.
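Peres's claim that nothing changes at Bob's location can be made concrete: in orthodox quantum mechanics, Bob's reduced density matrix is the same whether or not Alice has measured. A minimal numpy sketch for the singlet:

```python
import numpy as np

# Singlet state |psi> = (|01> - |10>)/sqrt(2) on Alice (x) Bob
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

def bob_reduced(r):
    """Partial trace over Alice's qubit (the first tensor factor)."""
    return r.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

# Before Alice does anything:
before = bob_reduced(rho)

# Alice measures s_z: the ensemble becomes a mixture of the two projected states
P0 = np.kron(np.diag([1, 0]), np.eye(2))   # Alice finds +1
P1 = np.kron(np.diag([0, 1]), np.eye(2))   # Alice finds -1
after = bob_reduced(P0 @ rho @ P0 + P1 @ rho @ P1)

print(np.allclose(before, after))  # True: Bob's local statistics are untouched
```

Both reduced matrices come out as the maximally mixed state I/2, which is why no experiment performed by Bob alone can reveal whether Alice has measured.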
 
A seemingly paradoxical way of presenting these results
is to ask the following naive question. Suppose that
Alice finds that s_z = +1 while Bob does nothing. When
does the state of Bob’s particle, far away, become the
one for which s_z = -1 with certainty? Although this
question is meaningless, it may be given a definite answer:
Bob’s particle state changes instantaneously. In
which Lorentz frame is this instantaneous? In any
frame! Whatever frame is chosen for defining simultaneity,
the experimentally observable result is the same, as
can be shown in a formal way (Peres, 2000b). Einstein
himself was puzzled by what seemed to be the instantaneous
transmission of quantum information. In his autobiography,
he used the words ‘‘telepathically’’ and
‘‘spook’’ (Einstein, 1949). …
 
In the laboratory, any experiment
has to be repeated many times in order to infer a
law; in a theoretical discussion, we may imagine an infinite
number of replicas of our gedanken experiment, so
as to have a genuine statistical ensemble. Yet the validity
of the statistical nature of quantum theory is not restricted
to situations in which there are a large number
of similar systems. Statistical predictions do apply to
single events. When we are told that the probability of
precipitation tomorrow is 35%, there is only one tomorrow.
This tells us that it may be advisable to carry an
umbrella. Probability theory is simply the quantitative
formulation of how to make rational decisions in the
face of uncertainty (Fuchs and Peres, 2000). A lucid
analysis of how probabilistic concepts are incorporated
into physical theories is given by Emch and Liu (2002).
 
[My comment #4: Peres is correct, but there is no conflict with Bohm's ontological
interpretation here. The Born probability rule is not fundamental to quantum reality
in Bohm's view, but is a limiting case when the beables are in thermal equilibrium.]
 
...
 
Some trends in modern quantum information theory
may be traced to security problems in quantum communication.
A very early contribution was Wiesner’s seminal
paper ‘‘Conjugate Coding,’’ which was submitted
circa 1970 to IEEE Transactions on Information Theory
and promptly rejected because it was written in a jargon
incomprehensible to computer scientists (this was actually
a paper about physics, but it had been submitted to
a computer science journal). Wiesner’s article was finally
published (Wiesner, 1983) in the newsletter of ACM
SIGACT (Association for Computing Machinery, Special
Interest Group in Algorithms and Computation
Theory). That article tacitly assumed that exact duplication
of an unknown quantum state was impossible, well
before the no-cloning theorem (Dieks, 1982; Wootters
and Zurek, 1982) became common knowledge. Another
early article, ‘‘Unforgeable Subway Tokens’’ (Bennett
et al., 1983) also tacitly assumed the same.
 
II. THE ACQUISITION OF INFORMATION
A. The ambivalent quantum observer
Quantum mechanics is used by theorists in two different
ways. It is a tool for computing accurate relationships
between physical constants, such as energy levels,
cross sections, transition rates, etc. These calculations
are technically difficult, but they are not controversial.
In addition to this, quantum mechanics also provides
statistical predictions for results of measurements performed
on physical systems that have been prepared in a
specified way. 
 
[My comment #5: No mention of Yakir Aharonov's intermediate present "weak measurements"
with both history past pre-selection and destiny future post-selection constraints. The latter in
Wheeler delayed choice mode would force the inference of real back-from-the-future retrocausality.
This would still be consistent with Abner Shimony's "passion at a distance," i.e. "signal locality"
in that the observer at the present weak measurement would not know what the future constraint 
actually will be. In contrast, with signal non locality (Sarfatti  1976 MIT Tech Review (Martin Gardner) & 
Antony Valentini (2002)) such spooky precognition would be possible as in Russell Targ's reports on 
CIA funded RV experiments at SRI in the mid 70's and 80's. 
This is, on the face of it, a gross violation of orthodox
quantum theory as laid out here in the Peres review paper.]
 
The quantum measuring process is the interface
of classical and quantum phenomena. The preparation
and measurement are performed by macroscopic
devices, and these are described in classical terms. The
necessity of using a classical terminology was emphasized
by Niels Bohr (1927) from the very early days of
quantum mechanics. Bohr’s insistence on a classical description
was very strict. He wrote (1949)
 
‘‘ . . . by the word ‘experiment’ we refer to a situation
where we can tell others what we have done and what
we have learned and that, therefore, the account of the
experimental arrangement and of the results of the observations
must be expressed in unambiguous language,
with suitable application of the terminology of
classical physics.’’
 
Note the words ‘‘we can tell.’’ Bohr was concerned
with information, in the broadest sense of this term. He
never said that there were classical systems or quantum
systems. There were physical systems, for which it was
appropriate to use the classical language or the quantum
language. There is no guarantee that either language
gives a perfect description, but in a well-designed experiment
it should be at least a good approximation.
 
Bohr’s approach divides the physical world into ‘‘endosystems’’
(Finkelstein, 1988), which are described by
quantum dynamics, and ‘‘exosystems’’ (such as measuring
apparatuses), which are not described by the dynamical
formalism of the endosystem under consideration.
A physical system is called ‘‘open’’ when parts of
the universe are excluded from its description. In different
Lorentz frames used by observers in relative motion,
different parts of the universe may be excluded. The
systems considered by these observers are then essentially
different, and no Lorentz transformation exists
that can relate them (Peres and Terno, 2002).
It is noteworthy that Bohr never described the measuring
process as a dynamical interaction between an
exophysical apparatus and the system under observation.
He was, of course, fully aware that measuring apparatuses
are made of the same kind of matter as everything
else, and they obey the same physical laws. It is
therefore tempting to use quantum theory in order to
investigate their behavior during a measurement. However,
if this is done, the quantized apparatus loses its
status as a measuring instrument. It becomes a mere intermediate
system in the measuring process, and there
must still be a final instrument that has a purely classical
description (Bohr, 1939).
 
Measurement was understood by Bohr as a primitive
notion. He could thereby elude questions which caused
considerable controversy among other authors. A
quantum-dynamical description of the measuring process
was first attempted by John von Neumann in his
treatise on the mathematical foundations of quantum
theory (1932). In the last section of that book, as in an
afterthought, von Neumann represented the apparatus
by a single degree of freedom, whose value was correlated
with that of the dynamical variable being measured.
Such an apparatus is not, in general, left in a definite
pure state, and it does not admit a classical
description. Therefore von Neumann introduced a second
apparatus which observes the first one, and possibly
a third apparatus, and so on, until there is a final measurement,
which is not described by quantum dynamics
and has a definite result (for which quantum mechanics
can give only statistical predictions). The essential point
that was suggested, but not proved by von Neumann, is
that the introduction of this sequence of apparatuses is
irrelevant: the final result is the same, irrespective of the
location of the ‘‘cut’’ between classical and quantum
physics.8
 
These different approaches of Bohr and von Neumann
were reconciled by Hay and Peres (1998). …
 
[Footnote 8: At this point, von Neumann also speculated that the final
step involves the consciousness of the observer—a bizarre
statement in a mathematically rigorous monograph (von Neumann,
1955).]
 
B. The measuring process
Dirac (1947) wrote that ‘‘a measurement always
causes the system to jump into an eigenstate of the dynamical
variable being measured.’’ Here, we must be
careful: a quantum jump (also called a collapse) is something
that happens in our description of the system, not
to the system itself. Likewise, the time dependence of
the wave function does not represent the evolution of a
physical system. It only gives the evolution of probabilities
for the outcomes of potential experiments on that
system (Fuchs and Peres, 2000).
 
Let us examine more closely the measuring process.
First, we must refine the notion of measurement and
extend it to a more general one: an intervention. An
intervention is described by a set of parameters which
include the location of the intervention in spacetime, referred
to an arbitrary coordinate system. We also have
to specify the speed and orientation of the apparatus in
the coordinate system that we are using as well as various
other input parameters that control the apparatus,
such as the strength of a magnetic field or that of a rf
pulse used in the experiment. The input parameters are
determined by classical information received from past
interventions, or they may be chosen arbitrarily by the
observer who prepares that intervention or by a local
random device acting in lieu of the observer.
 
[My comment #6: Peres, in my opinion, makes another mistake.
Future interventions will affect past weak measurements.
 

Back From the Future

A series of quantum experiments shows that measurements performed in the future can influence the present. Does that mean the universe has a destiny—and the laws of physics pull us inexorably toward our prewritten fate?

By Zeeya Merali|Thursday, August 26, 2010
http://discovermagazine.com/2010/apr/01-back-from-the-future#.UieOnhac5Hw ]
 
An intervention has two consequences. One is the acquisition
of information by means of an apparatus that
produces a record. This is the ‘‘measurement.’’ Its outcome,
which is in general unpredictable, is the output of
the intervention. The other consequence is a change of
the environment in which the quantum system will
evolve after completion of the intervention. For example,
the intervening apparatus may generate a new
Hamiltonian that depends on the recorded result. In particular,
classical signals may be emitted for controlling
the execution of further interventions. These signals are,
of course, limited to the velocity of light.
The experimental protocols that we consider all start
in the same way, with the same initial state ... , and the
first intervention is the same. However, later stages of
the experiment may involve different types of interventions,
possibly with different spacetime locations, depending
on the outcomes of the preceding events. Yet,
assuming that each intervention has only a finite number
of outcomes, there is for the entire experiment only a
finite number of possible records. (Here, the word
record means the complete list of outcomes that occurred
during the experiment. We do not want to use the
word history, which has acquired a different meaning in
the writings of some quantum theorists.)
 
Each one of these records has a definite probability in
the statistical ensemble. In the laboratory, experimenters
can observe its relative frequency among all the records
that were obtained; when the number of records tends
to infinity, this relative frequency is expected to tend to
the true probability. The aim of theory is to predict the
probability of each record, given the inputs of the various
interventions (both the inputs that are actually controlled
by the local experimenter and those determined
by the outputs of earlier interventions). Each record is
objective: everyone agrees on what happened (e.g.,
which detectors clicked). Therefore, everyone agrees on
what the various relative frequencies are, and the theoretical
probabilities are also the same for everyone.
Interventions are localized in spacetime, but quantum
systems are pervasive. In each experiment, irrespective
of its history, there is only one quantum system, which
may consist of several particles or other subsystems, created
or annihilated at the various interventions. Note
that all these properties still hold if the measurement
outcome is the absence of a detector click. It does not
matter whether this is due to an imperfection of the detector
or to a probability less than 1 that a perfect detector
would be excited. The state of the quantum system
does not remain unchanged. It has to change to
respect unitarity. The mere presence of a detector that
could have been excited implies that there has been an
interaction between that detector and the quantum system.
Even if the detector has a finite probability of remaining
in its initial state, the quantum system correlated
to the latter acquires a different state (Dicke,
1981). The absence of a click, when there could have
been one, is also an event.
 
 
The measuring process involves not only the physical
system under study and a measuring apparatus (which
together form the composite system C) but also their
environment, which includes unspecified degrees of freedom
of the apparatus and the rest of the world. These
unknown degrees of freedom interact with the relevant
ones, but they are not under the control of the experimenter
and cannot be explicitly described. Our partial
ignorance is not a sign of weakness. It is fundamental. If
everything were known, acquisition of information
would be a meaningless concept.
 
 
A complete description of C involves both macroscopic
and microscopic variables. The difference between
them is that the environment can be considered as
adequately isolated from the microscopic degrees of
freedom for the duration of the experiment and is not
influenced by them, while the environment is not isolated
from the macroscopic degrees of freedomFor example,
if there is a macroscopic pointer, air molecules bounce
from it in a way that depends on the position of that
pointer. Even if we can neglect the Brownian motion of
a massive pointer, its influence on the environment leads
to the phenomenon of decoherence, which is inherent to
the measuring process.
 
An essential property of the composite system C,
which is necessary to produce a meaningful measurement,
is that its states form a finite number of orthogonal
subspaces which are distinguishable by the observer.
 
[My comment #7: This is not the case for Aharonov's weak measurements where
 
<A>weak = <history|A|destiny>/<history|destiny>
 
Nor is it true when Alice's orthogonal micro-states are entangled with Bob's far-away distinguishable non-orthogonal macro-quantum Glauber coherent (and possibly squeezed) states.
 
  1. Coherent states, Wikipedia: en.wikipedia.org/wiki/Coherent_states
     "In physics, in quantum mechanics, a coherent state is the specific quantum state of the quantum harmonic oscillator whose dynamics most closely resembles the ..."
  2. B. C. Sanders, "Review of Entangled Coherent States," arXiv (quant-ph), Dec 8, 2011.
     Abstract: "We review entangled coherent state research since its first implicit use in 1967 ..."
 
|Alice,Bob> = (1/√2)[ |Alice +1>|Bob alpha> + |Alice -1>|Bob beta> ]
 
<Alice +1|Alice -1> = 0
 
<Bob alpha|Bob beta> =/= 0  
 
 
e.g. projecting onto Bob's non-orthogonal states as if they were orthogonal pointer outcomes gives |<Bob alpha|Alice,Bob>|^2 = |<Bob beta|Alice,Bob>|^2 = (1/2)[1 + |<Bob alpha|Bob beta>|^2], so the two "outcome probabilities" sum to 1 + |<Bob alpha|Bob beta>|^2 > 1;
 
this is formally like a weak measurement where the usual Born probability rule breaks down. 
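A quick numerical sketch of the overlap arithmetic in this comment, assuming toy coherent-state amplitudes alpha = 1.0 and beta = 1.5 (my choice, purely illustrative, computed in a truncated Fock basis):

```python
import numpy as np
from math import factorial

def coherent(alpha, dim=40):
    """Truncated Fock-basis amplitudes of a Glauber coherent state |alpha>."""
    n = np.arange(dim)
    facts = np.array([float(factorial(int(k))) for k in n])
    return np.exp(-abs(alpha)**2 / 2) * alpha**n / np.sqrt(facts)

a, b = coherent(1.0), coherent(1.5)
overlap2 = abs(np.vdot(a, b))**2     # |<Bob alpha|Bob beta>|^2 = exp(-|1.0-1.5|^2)
print(overlap2)                      # ~0.7788, nonzero: distinguishable yet non-orthogonal

# Treating Bob's two non-orthogonal states *as if* they were orthogonal
# pointer outcomes gives the comment's value for each "probability":
p_alpha = p_beta = 0.5 * (1 + overlap2)
print(p_alpha + p_beta)              # 1 + overlap2 > 1, not 1
```

For contrast, the standard partial-trace rule assigns Alice's two outcomes probability 1/2 each regardless of the Bob-state overlap; the excess above 1 here is exactly the point at issue in the comment.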
 
Complete isolation from environmental decoherence is assumed here.
 
It is a clear violation of "passion at a distance" no-entanglement-signaling arguments based on axioms that are, in my opinion, empirically false.
 
"The statistics of Bob’s result are not affected at all by what Alice may simultaneously do somewhere else. " (Peres) 
 
is false.
 
While a logically correct formal proof is desirable in physics, Nature has ways of leapfrogging over its premises.
 
One can have constrained pre- and post-selected conditional probabilities that are greater than 1, negative, or even complex. 
 
All of which correspond to observable effects in the laboratory; see Aephraim Steinberg's experimental papers at the
University of Toronto.]
 
Each macroscopically distinguishable subspace corresponds
to one of the outcomes of the intervention and
defines a POVM element Em , given explicitly by Eq. (8)
below. …
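How POVM elements arise from Kraus matrices can be sketched in a few lines. The amplitude-damping Kraus pair below is an assumed standard textbook example, not the paper's Eq. (8):

```python
import numpy as np

g = 0.3                                        # assumed damping parameter (illustrative)
A0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1 - g)]])   # Kraus operator, outcome m = 0
A1 = np.array([[0.0, np.sqrt(g)], [0.0, 0.0]])       # Kraus operator, outcome m = 1

# POVM element for each outcome: E_m = A_m^dagger A_m
E = [A.conj().T @ A for A in (A0, A1)]

# Completeness: the E_m sum to the identity, so probabilities sum to 1
print(np.allclose(sum(E), np.eye(2)))          # True

# Outcome probabilities in the maximally mixed state rho = I/2
p = [np.trace(Em @ np.eye(2) / 2).real for Em in E]
print(p)                                       # sums to 1
```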
 
C. Decoherence
Up to now, quantum evolution is well defined and it is
in principle reversible. It would remain so if the environment
could be perfectly isolated from the macroscopic
degrees of freedom of the apparatus. This demand is of
course self-contradictory, since we have to read the result
of the measurement if we wish to make any use of it.
 
A detailed analysis of the interaction with the environment,
together with plausible hypotheses (Peres, 2000a),
shows that states of the environment that are correlated
with subspaces of C with different labels m can be treated
as if they were orthogonal. This is an excellent approximation
(physics is not an exact science, it is a science of
approximations). The resulting theoretical predictions
will almost always be correct, and if any rare small deviation
from them is ever observed, it will be considered
as a statistical quirk or an experimental error.
 
The density matrix of the quantum system is thus effectively
block diagonal, and all our statistical predictions
are identical to those obtained for an ordinary mixture
of (unnormalized) pure states.
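The effective block-diagonal form can be illustrated with a toy 4-dimensional density matrix split into two 2-dimensional outcome subspaces (the random matrix and the block sizes are my assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
rho = M @ M.conj().T
rho /= np.trace(rho).real                 # a generic density matrix, trace 1

# Projectors onto two macroscopically distinguishable subspaces (labels m = 0, 1)
P = [np.diag([1, 1, 0, 0]).astype(complex),
     np.diag([0, 0, 1, 1]).astype(complex)]

# Decoherence erases coherence *between* the blocks, leaving each block intact
rho_dec = sum(Pm @ rho @ Pm for Pm in P)

# Off-diagonal block is now zero...
print(np.allclose(rho_dec[:2, 2:], 0))    # True

# ...but every outcome probability Tr(P_m rho) is unchanged
for Pm in P:
    print(np.trace(Pm @ rho).real, np.trace(Pm @ rho_dec).real)
```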
 
This process is called decoherence. Each subspace
m is stable under decoherence—it is their relative
phase that decoheres. From this moment on, the macroscopic
degrees of freedom of C have entered into the
classical domain. We can safely observe them and ‘‘lay
on them our grubby hands’’ (Caves, 1982). In particular,
they can be used to trigger amplification mechanisms
(the so-called detector clicks) for the convenience of the
experimenter.
 
Some authors claim that decoherence may provide a
solution of the ‘‘measurement problem,’’ with the particular
meaning that they attribute to that problem
(Zurek, 1991). Others dispute this point of view in their
comments on the above article (Zurek, 1993). A reassessment
of this issue and many important technical details
were recently published by Zurek (2002, 2003). Yet
decoherence has an essential role, as explained above. It
is essential that we distinguish decoherence, which results
from the disturbance of the environment by the
apparatus (and is a quantum effect), from noise, which
would result from the disturbance of the system or the
apparatus by the environment and would cause errors.
Noise is a mundane classical phenomenon, which we ignore
in this review.
 
E. The no-communication theorem
We now derive a sufficient condition that no instantaneous
information transfer can result from a distant intervention.
We shall show that the condition is
 
[A_m , B_n] = 0
 
where A_m and B_n are the Kraus matrices for the observation
of outcomes m by Alice and n by Bob.
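For local interventions on different subsystems the commutation condition holds automatically, and with it Bob's statistics are untouched by Alice's intervention. A minimal two-qubit sketch (the Bell state and projective Kraus operators are assumed illustrative choices):

```python
import numpy as np

I2 = np.eye(2)
P0, P1 = np.diag([1.0, 0.0]), np.diag([0.0, 1.0])

# Alice's Kraus operators act on the first qubit, Bob's on the second
A = [np.kron(P, I2) for P in (P0, P1)]
B = [np.kron(I2, P) for P in (P0, P1)]

# [A_m, B_n] = 0 for all m, n
for Am in A:
    for Bn in B:
        assert np.allclose(Am @ Bn - Bn @ Am, 0)

# Consequence: Alice measuring (result unread by Bob) leaves Bob's state unchanged
psi = np.array([1, 0, 0, 1]) / np.sqrt(2)        # Bell state (|00> + |11>)/sqrt(2)
rho = np.outer(psi, psi)

def rho_bob(r):
    """Partial trace over Alice's qubit: sum the diagonal in her index."""
    return np.einsum('abad->bd', r.reshape(2, 2, 2, 2))

rho_after = sum(Am @ rho @ Am.T for Am in A)
print(np.allclose(rho_bob(rho), rho_bob(rho_after)))   # True: both equal I/2
```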
 
[My comment #8: "The most beautiful theory is murdered by an ugly fact." - Feynman
e.g. Libet-Radin-Bierman presponse data in living brains,
and the SRI CIA-vetted reports of remote viewing by living brains:
 
  1. CIA-Initiated Remote Viewing At Stanford Research Institute
     www.biomindsuperpowers.com/Pages/CIA-InitiatedRV.html
  2. Harold E. Puthoff, Wikipedia
     en.wikipedia.org/wiki/Harold_E._Puthoff
  3. Remote viewing, Wikipedia
     en.wikipedia.org/wiki/Remote_viewing
  4. Dr. Harold Puthoff on Remote Viewing, YouTube (Apr 28, 2011)
     www.youtube.com/watch?v=FOAfH1utUSM
  5. Remoteviewed.com, Hal Puthoff
     www.remoteviewed.com/remote_viewing_halputhoff.htm
 
 
On Sep 4, 2013, at 9:06 AM, JACK SARFATTI <adastra1@icloud.com> wrote:
 
Peres here is only talking about Von Neumann's strong measurements, not 
Aharonov's weak measurements.

Standard textbooks on quantum mechanics
tell you that observable quantities are represented by
Hermitian operators, that their possible values are the
eigenvalues of these operators, and that the probability
of detecting eigenvalue a, corresponding to eigenvector
|a>, is |<a|psi>|^2, where |psi> is the (pure) state of the
quantum system that is observed. With a bit more sophistication
to include mixed states, the probability can
be written in a general way as <a|rho|a> …
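Both forms of the probability rule are easy to check numerically for one qubit (the spin-x observable and the |up_z> state below are assumed examples):

```python
import numpy as np

Sx = np.array([[0, 1], [1, 0]], dtype=complex)   # observable (Pauli x)
evals, evecs = np.linalg.eigh(Sx)                # eigenvalues -1, +1

psi = np.array([1, 0], dtype=complex)            # pure state |psi> = |up_z>
rho = np.outer(psi, psi.conj())                  # its density matrix

for a, vec in zip(evals, evecs.T):
    p_pure  = abs(np.vdot(vec, psi))**2          # |<a|psi>|^2
    p_mixed = np.vdot(vec, rho @ vec).real       # <a|rho|a>
    print(f"eigenvalue {a:+.0f}: probability {p_pure:.2f}")   # 0.50 each way
```

The two expressions agree for any pure state, and only the <a|rho|a> form extends to mixed states.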
 
This is nice and neat, but it does not describe what
happens in real life. Quantum phenomena do not occur
in Hilbert space; they occur in a laboratory. If you visit a
real laboratory, you will never find Hermitian operators
there. All you can see are emitters (lasers, ion guns, synchrotrons,
and the like) and appropriate detectors. In
the latter, the time required for the irreversible act of
amplification (the formation of a microscopic bubble in
a bubble chamber, or the initial stage of an electric discharge)
is extremely brief, typically of the order of an
atomic radius divided by the velocity of light. Once irreversibility
has set in, the rest of the amplification process
is essentially classical. It is noteworthy that the time and
space needed for initiating the irreversible processes are
incomparably smaller than the macroscopic resolution
of the detecting equipment.
 
The experimenter controls the emission process and
observes detection events. The theorist’s problem is to
predict the probability of response of this or that detector,
for a given emission procedure. It often happens
that the preparation is unknown to the experimenter,
and then the theory can be used for discriminating between
different preparation hypotheses, once the detection
outcomes are known.
 
 
Many physicists, perhaps a majority, have an intuitive,
realistic worldview and consider a quantum state as a
physical entity. Its value may not be known, but in principle
the quantum state of a physical system would be
well defined. However, there is no experimental evidence
whatsoever to support this naive belief. On the
contrary, if this view is taken seriously, it may lead to
bizarre consequences, called ‘‘quantum paradoxes.’’
These so-called paradoxes originate solely from an incorrect
interpretation of quantum theory, which is thoroughly
pragmatic and, when correctly used, never yields
two contradictory answers to a well-posed question. It is
only the misuse of quantum concepts, guided by a pseudorealistic
philosophy, that leads to paradoxical results.
 
[My comment #2: Here is the basic conflict between epistemological vs ontological views of quantum reality.]
 
In this review we shall adhere to the view that rho is
only a mathematical expression which encodes information
about the potential results of our experimental interventions.
The latter are commonly called
‘‘measurements’’—an unfortunate terminology, which
gives the impression that there exists in the real world
some unknown property that we are measuring. Even
the very existence of particles depends on the context of
our experiments. In a classic article, Mott (1929) wrote
‘‘Until the final interpretation is made, no mention
should be made of the alpha ray being a particle at all.’’
Drell (1978a, 1978b) provocatively asked ‘‘When is a
particle?’’ In particular, observers whose world lines are
accelerated record different numbers of particles, as will
be explained in Sec. V.D (Unruh, 1976; Wald, 1994).
 
 
1The theory of relativity did not cause as much misunderstanding
and controversy as quantum theory, because people
were careful to avoid using the same nomenclature as in nonrelativistic
physics. For example, elementary textbooks on
relativity theory distinguish ‘‘rest mass’’ from ‘‘relativistic
mass’’ (hard-core relativists call them simply ‘‘mass’’ and ‘‘energy’’).
2The ‘‘irreversible act of amplification’’ is part of quantum
folklore, but it is not essential to physics. Amplification is
needed solely to facilitate the work of the experimenter.
3Positive operators are those having the property that
<psi|rho|psi> ≥ 0 for any state psi. These operators are always Hermitian.
 
 
 
On Sep 4, 2013, at 8:48 AM, JACK SARFATTI <adastra1@icloud.com> wrote:



Begin forwarded message:

From: JACK SARFATTI <jacksarfatti@icloud.com>
Subject: Quantum information and relativity theory
Date: September 4, 2013 8:33:48 AM PDT
To: nick herbert <quanta@mail.cruzio.com>
 

The late Asher Peres http://en.wikipedia.org/wiki/Asher_Peres interpretation is the antithesis of the late David Bohm's ontological interpretation http://en.wikipedia.org/wiki/David_Bohm holding to a purely subjective epistemological Bohrian interpretation of the quantum BIT potential Q.
He claims that Antony Valentini's signal nonlocality beyond orthodox quantum theory would violate the Second Law of Thermodynamics.
 
REVIEWS OF MODERN PHYSICS, VOLUME 76, JANUARY 2004
Quantum information and relativity theory
Asher Peres
Department of Physics, Technion–Israel Institute of Technology, 32000 Haifa, Israel
Daniel R. Terno
Perimeter Institute for Theoretical Physics, Waterloo, Ontario, Canada N2J 2W9
(Published 6 January 2004)
This article discusses the intimate relationship between quantum mechanics, information theory, and
relativity theory. Taken together these are the foundations of present-day theoretical physics, and
their interrelationship is an essential part of the theory. The acquisition of information from a
quantum system by an observer occurs at the interface of classical and quantum physics. The authors
review the essential tools needed to describe this interface, i.e., Kraus matrices and
positive-operator-valued measures. They then discuss how special relativity imposes severe
restrictions on the transfer of information between distant systems and the implications of the fact that
quantum entropy is not a Lorentz-covariant concept. This leads to a discussion of how it comes about
that Lorentz transformations of reduced density matrices for entangled systems may not be
completely positive maps. Quantum field theory is, of course, necessary for a consistent description of
interactions. Its structure implies a fundamental tradeoff between detector reliability and
localizability. Moreover, general relativity produces new and counterintuitive effects, particularly
when black holes (or, more generally, event horizons) are involved. In this more general context the
authors discuss how most of the current concepts in quantum information theory may require a
reassessment.
CONTENTS
I. Three Inseparable Theories 93
A. Relativity and information 93
B. Quantum mechanics and information 94
C. Relativity and quantum theory 95
D. The meaning of probability 95
E. The role of topology 96
F. The essence of quantum information 96
II. The Acquisition of Information 97
A. The ambivalent quantum observer 97
B. The measuring process 98
C. Decoherence 99
D. Kraus matrices and positive-operator-valued
measures (POVM’s) 99
E. The no-communication theorem 100
III. The Relativistic Measuring Process 102
A. General properties 102
B. The role of relativity 103
C. Quantum nonlocality? 104
D. Classical analogies 105
IV. Quantum Entropy and Special Relativity 105
A. Reduced density matrices 105
B. Massive particles 105
C. Photons 107
D. Entanglement 109
E. Communication channels 110
V. The Role of Quantum Field Theory 110
A. General theorems 110
B. Particles and localization 111
C. Entanglement in quantum field theory 112
D. Accelerated detectors 113
VI. Beyond Special Relativity 114
A. Entanglement revisited 115
B. The thermodynamics of black holes 116
C. Open problems 118
Acknowledgments and Apologies 118
Appendix A: Relativistic State Transformations 119
Appendix B: Black-Hole Radiation 119
References 120
I. THREE INSEPARABLE THEORIES
Quantum theory and relativity theory emerged at the
beginning of the twentieth century to give answers to
unexplained issues in physics: the blackbody spectrum,
the structure of atoms and nuclei, the electrodynamics of
moving bodies. Many years later, information theory
was developed by Claude Shannon (1948) for analyzing
the efficiency of communication methods. How do these
seemingly disparate disciplines relate to each other? In
this review, we shall show that they are inseparably
linked.
A. Relativity and information
Common presentations of relativity theory employ
fictitious observers who send and receive signals. These
‘‘observers’’ should not be thought of as human beings,
but rather as ordinary physical emitters and detectors.
Their role is to label and locate events in spacetime. The
speed of transmission of these signals is bounded by
c—the velocity of light—because information needs a
material carrier, and the latter must obey the laws of
physics. Information is physical (Landauer, 1991).
 
[My comment #1: Indeed information is physical. Contrary to Peres, in Bohm's theory Q is also physical but not material (a "beable"); consequently one can have entanglement negentropy transfer without beable material propagation of a classical signal. I think Peres makes a fundamental error here.]
 
However, the mere existence of an upper bound on
the speed of propagation of physical effects does not do
justice to the fundamentally new concepts that were introduced
by Albert Einstein (one could as well imagine
communications limited by the speed of sound, or that
of the postal service). Einstein showed that simultaneity
had no absolute meaning, and that distant events might
have different time orderings when referred to observers
in relative motion. Relativistic kinematics is all about
information transfer between observers in relative motion.
 
Classical information theory involves concepts such as
the rates of emission and detection of signals, and the
noise power spectrum. These variables have well defined
relativistic transformation properties, independent
of the actual physical implementation of the communication
system.


1) I intuited the connection between the Einstein-Rosen (ER) wormhole and Einstein-Podolsky-Rosen (EPR) quantum entanglement back in 1973 when I was with Abdus Salam at the International Centre of Theoretical Physics in Trieste, Italy. This idea was published in the wacky book “Space-Time and Beyond” (Dutton, 1975) described by MIT physics historian David Kaiser in his book “How the Hippies Saved Physics.” Lenny Susskind, who I worked with at Cornell 1963-4, rediscovered this ER = EPR connection in the black hole “firewall” paradox. Lenny envisions a multi-mouthed wormhole network connecting the Hawking radiation particles to their entangled twins behind the evaporating event horizon: “each escaping particle remains connected to the black hole through a wormhole” (Dennis Overbye, “Einstein and the Black Hole,” New York Times, August 13, 2013). The no-signaling theorem corresponds to the wormhole pinching off before a light-speed-limited signal can pass through one mouth to the other. Now we know that traversable wormhole stargates are possible using amplified anti-gravity dark energy. This corresponds to signal nonlocality in post-quantum theory violating orthodox quantum theory. 

2) Localizing global symmetries requires the addition of compensating gauge connections in a fiber bundle picture of the universe. Indeed, the original global symmetry group is a smaller subgroup of the local symmetry group. The gauge connections define parallel transport of tensor/spinor fields. They correspond to the interactions between the several kinds of charges of the above symmetries. I shall go into more details of this elsewhere. Indeed localizing the above spacetime symmetries corresponds to generalizations of Einstein’s General Relativity as a local gauge theory.[i] For example, localizing the space and time global translational symmetries means that the Lie group transformations at different events (places and times) in the universe are independent of each other. If one believes in the classical special relativity postulate of locality that there are no faster-than-light actions at a distance, then the transformations must certainly be independent of each other between pairs of spacelike separated events that cannot be connected by a light signal. However, the local gauge principle is much stronger, because it applies to pairs of events that can be connected not only by a light signal, but also by slower-than-light timelike signals. This poses a paradox when we add quantum entanglement. Aspect’s experiment and others since then show that faster-than-light influences do in fact exist in the conditional probabilities (aka correlations) connecting observed eigenvalues of quantum observable operators independently chosen by Alice and Bob when spacelike separated. I shall return to this in more detail elsewhere. However, the no-entanglement-signaling postulate is thought by many mainstream theoretical physicists to define orthodox quantum theory. It’s believed that its violation would also violate the Second Law of Thermodynamics. Note that the entanglement signal need not be faster-than-light over a spacelike separation between sender and receiver. 
It could be lightlike or timelike separated as well. Indeed it can even be retrocausal with the message sent back-from-the-future. John Archibald Wheeler’s “delayed choice experiment” is actually consistent with orthodox quantum theory’s no-signaling premise. The point is that one cannot decode the message encoded in the pattern of entanglement until one has a classical signal key that only propagates forward in time. What one sees before the classical key arrives and a correlation analysis is computed is only local random white noise. However, data on precognitive remote viewing as well as brain presponse data suggest that no-entanglement signaling is only true for dead matter. Nobel Prize physicist Brian Josephson first published on this. I have also suggested it using Bohm’s ontological interpretation (Lecture 8 of Michael Towler’s Cambridge University Lectures on Bohm’s Pilot Wave). Antony Valentini has further developed this idea in several papers. Post-quantum “signal nonlocality” dispenses with the need to wait for the light-speed limited retarded signal key propagating from past to future. Local non-random noise will be seen in violation of the S-Matrix unitarity “conservation of information” postulate of G. ‘t Hooft, L. Susskind et al. Indeed the distinguishable non-orthogonality of entangled Glauber macro-quantum coherent states seems to be the way to get signal nonlocality. This gets us to the “Black Hole War” between Susskind and Hawking about information loss down evaporating black holes. It seems that Hawking caved in too fast to Susskind back in Dublin in 2004.



[i] Localizing the four space and time translations corresponds to Einstein’s general coordinate transformations that are now gauge transformations defining an equivalence class of physically identical representations of the same curvature tensor field. However, the compensating gauge connection there corresponds to torsion fields, not curvature fields. The curvature field corresponds to localizing the three space-space rotations and the three space-time Lorentz boost rotations together. Einstein’s General Relativity in final form (1916) has zero torsion with non-zero curvature. However, T.W.B. Kibble from Imperial College, London in 1961 showed how to get the Einstein-Cartan torsion + curvature extension of Einstein’s 1916 curvature-only model by localizing the full 10-parameter Poincare symmetry Lie group of Einstein’s 1905 Special Relativity. The natural geometric objects to use are the four Cartan tetrads that correspond to Local Inertial Frame (LIF) detector/observers that are not rotating about their Centers of Mass (COM) that are on weightless zero g-force timelike geodesics. Zero torsion is then imposed as an ad-hoc constraint to regain Einstein’s 1916 model as a limiting case. The ten-parameter Poincare Lie group is a subgroup of the fifteen-parameter conformal group that adds four constant proper acceleration hyperbolic Wolfgang Rindler horizon boosts and one dilation scale transformation that corresponds to Hermann Weyl’s original failed attempt to unify gravity with electromagnetism. The spinor Dirac square roots of the conformal group correspond to Roger Penrose’s “twistors.”

 

My review of Jim Woodward's Making Starships book - V1 under construction
  • Jack Sarfatti Sarfatti’s Commentaries on James F. Woodward’s book 
    Making Starships and Star Gates 
    The Science of Interstellar Transport and Absurdly Benign Wormholes

    The book has many good insights except for some ambiguous statements regarding:

    1) The equivalence principle that is the foundation of Einstein’s theory of the gravitational field. This seems to be due to the author’s not clearly distinguishing between local frame invariant proper acceleration and frame dependent coordinate acceleration. Thus, the author says that Newton’s gravity force is eliminated in an “accelerating frame.” In fact, it is eliminated in a Local Inertial Frame (LIF) that has zero proper acceleration, though it has coordinate acceleration relative to the surface of Earth for example. All points of the rigid spherical surface of Earth have non-zero proper accelerations pointing radially outward. This violates common sense and confuses even some physicists as well as engineers not to mention laymen. It is a fact of the Alice in Wonderland topsy-turvy surreal world of the post-modern physics of Einstein’s relativity especially when combined with the faster-than-light and back from the future entanglement of particles and fields in quantum theory and beyond. 
    2) I find the author’s discussion of fictitious inertial pseudo forces puzzling. I include the centripetal force as a fictitious force in the limit of Newton’s particle mechanics sans Einstein’s local inertial frame dragging from rotating sources. That is, every local frame artifact that is inside the Levi-Civita connection is a fictitious inertial pseudo force. This includes, Coriolis, centrifugal, Euler, and most importantly Newton’s gravity force that is not a real force. The terms inside the Levi-Civita connection are not felt by the test particle under observation. Instead, they describe real forces acting on the observer’s local rest frame. A real force acts locally on a test particle’s accelerometer. It causes an accelerometer’s pointer to move showing a g-force. In contrast, Baron Munchausen sitting on a cannonball in free fall is weightless. This was essentially Einstein’s “happiest thought” leading him to the equivalence principle the cornerstone of his 1916 General Relativity of the Gravitational Field. 
    3) A really serious flaw in the book is the author’s dependence on Dennis Sciama’s electromagnetic equations for gravity. In fact, these equations only apply approximately in the weak field limit of Einstein’s field equations in the background-dependent case using the absolute non-dynamical globally-flat Minkowski space-time with gravity as a tiny perturbation. The author uses these equations way out of their limited domain of validity. In particular, the Sciama equations cannot describe the two cosmological horizons past and future of our dark energy accelerating expanding observable universe. What we can see with our telescopes is only a small patch (aka “causal diamond”) of a much larger “inflation bubble” corresponding to Max Tegmark’s “Level 1” in his four level classification of the use of “multiverse” and “parallel universes.” Our two cosmological horizons, past and future, that are thin spherical shells of light with us inside them at their exact centers may in fact be hologram computer screens projecting us as 3D images in a virtual reality quantum computer simulation. This is really a crazy idea emerging from Gerardus ‘t Hooft, Leonard Susskind, Seth Lloyd and others. Is it crazy enough to be true? 
  • Jack Sarfatti 4) John Cramer’s Foreword: I agree with Cramer that it’s too risky in the long run for us to be confined to the Earth and even to this solar system. British Astronomer Royal, Lord Martin Rees in his book “Our Final Hour” gives detailed reasons. Of course if a vacuum strangelet develops like Kurt Vonnegut’s “Ice-9”, then our entire observable universe can be wiped out, our causal diamond and beyond shattered, and there is no hope. That is essentially the apocalyptic worst-case scenario of the Bible’s “Revelations” and we will not dwell on it any further. Let’s hope it’s not a precognitive remote viewing like what the CIA observed in the Stanford Research Institute studies in the 1970’s.  Cramer cites the NASA-DARPA 100 Year Star Ship Project that I was involved with in the first two meetings. Cramer’s text is in quotes and italics. There is “little hope of reaching the nearby stars in a human lifetime using any conventional propulsion techniques … the universe is simply too big, and the stars are too far away. … What is needed is either trans-spatial shortcuts such as wormholes to avoid the need to traverse the enormous distances or a propulsion technique that somehow circumvents Newton’s third law and does not require the storage, transport and expulsion of large volumes of reaction mass.”
    Yes, indeed. I conjecture as a working hypothesis based on the UFO evidence that traversable wormhole stargate time travel machines are the only way to go with warp drive used only as a secondary mechanism at low speeds mainly for silent hovering near the surfaces of planets and for dogfights with conventional aerospace craft. The stargates do not have the blue shift problem that the Alcubierre warp drive has although the Natario warp drive does not have the blue shift problem (high-energy collisions with particles and radiation in the path of the starship). Newton’s third law that every force acting on a material object has an equal and opposite inertial reaction force on the source of that force is a conservation law that follows from symmetry Lie groups of transformations in parameters of the dynamical action of the entire closed system of source and material object. This is a very general organizing principle of theoretical physics known as Noether’s theorem for global symmetries in which the transformations are the same everywhere for all times in the universe. For example:
    Space Translation Symmetry → Linear Momentum Conservation
    Time Translation Symmetry → Energy Conservation
    Space-Space Rotation Symmetry → Angular Momentum Conservation
    Space-Time Rotation (Boost) Symmetry → Center-of-Energy Motion Conservation
    Internal U1 EM Force Symmetry → Conserve 1 Electric Charge
    Internal SU2 Weak Force Symmetry → Conserve 3 Weak Flavor Charges
    Internal SU3 Strong Force Symmetry → Conserve 8 Strong Color Charges
  • Jack Sarfatti In a propellantless propulsion system without the rocket ejection of real particles and/or radiation one must include the gravity curvature field (dynamical space-time itself) as a source and sink of linear momentum. Furthermore, if we include quantum corrections to the classical fields there is the remote possibility of using virtual particle zero point fluctuations inside the vacuum as a source and sink of linear momentum. However, the conventional wisdom is that this kind of controllable small-scale metastable vacuum phase transition is impossible in principle and to do so would violate the Second Law of Thermodynamics (extracting work from an absolute zero temperature heat reservoir). Even if we could do the seemingly impossible, propellantless propulsion while necessary is not sufficient for a true warp drive. A true warp drive must be weightless (zero g-force) timelike geodesic and without time dilation for the crew relative to the external observer outside the warp bubble that they were initially clock synchronized with. Localizing global symmetries requires the addition of compensating gauge connections in a fiber bundle picture of the universe. Indeed, the original global symmetry group is a smaller subgroup of the local symmetry group. The gauge connections define parallel transport of tensor/spinor fields. They correspond to the interactions between the several kinds of charges of the above symmetries. I shall go into more details of this elsewhere. Indeed localizing the above spacetime symmetries corresponds to generalizations of Einstein’s General Relativity as a local gauge theory. For example, localizing the space and time global translational symmetries means that the Lie group transformations at different events (places and times) in the universe are independent of each other. 
If one believes in the classical special relativity postulate of locality, that there are no faster-than-light actions at a distance, then the transformations must certainly be independent of each other between pairs of spacelike-separated events that cannot be connected by a light signal. However, the local gauge principle is much stronger, because it also applies to pairs of events that can be connected by a light signal, and even by slower-than-light timelike signals. This poses a paradox when we add quantum entanglement. Aspect’s experiment, and others since then, show that faster-than-light influences do in fact exist in the conditional probabilities (aka correlations) connecting observed eigenvalues of quantum observable operators independently chosen by Alice and Bob when spacelike separated. I shall return to this in more detail elsewhere. Finally, we have P.W. Anderson’s anti-reductionist “More is different” emergence of complex systems of real particles in their quantum ground states, with quasi-particles and collective mode excitations in soft condensed matter, in which the whole is greater than the sum of its parts. In its high-energy standard model analog, this corresponds to spontaneous symmetry breaking of the quantum vacuum’s virtual particles to the Higgs-Goldstone “God Particle,” now found at ~125 GeV in CERN’s LHC, which gives rest masses to leptons and quarks as well as to the three weak radioactivity force spin 1 gauge W-bosons, though not to the single spin 1 photon gauge boson and the eight spin 1 strong force gluon gauge bosons. In this quantum field theory picture, the near-field non-radiating interactions among the leptons and quarks are caused by the exchange of virtual spacelike (tachyonic faster-than-light off-mass-shell) gauge bosons continuously and randomly emitted and absorbed by the leptons and quarks. 
To make matters more complicated, unlike the single rest-massless U(1) photon, the three rest-massive weak SU(2) W bosons and the eight rest-massless strong SU(3) gluons carry their respective Lie algebra charges; therefore, they self-interact. A single virtual gluon can split into two gluons, for example. The SU(3) quark-quark-gluon interaction gets stronger at low energy and longer separations. This is called quantum chromodynamic confinement, and it explains why we do not see free quarks in the present epoch of our causal-diamond observable-universe patch of the multiverse. Free quarks were there, in a different quantum vacuum thermodynamic phase, shortly after the Alpha Point chaotic inflation creation of the observable universe that we see with telescopes etc. Indeed, most of the rest mass of protons and neutrons comes from the confined Heisenberg-uncertainty-principle kinetic energy of the three real confined up and down quarks and their plasma cloud of virtual zero-point gluons and virtual quark-antiquark pairs. The Higgs Yukawa-interaction rest masses of the three bound real quarks are about 1/20 or less of the total hadronic rest mass.
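The Aspect-type correlations invoked in the comment above can be made concrete with a short numeric sketch. This uses the standard textbook CHSH combination with the singlet correlation E(a, b) = -cos(a - b); the particular detector angles are my illustrative choice, not something specified in this thread.

```python
import math

def E(a, b):
    # Quantum prediction for the spin-1/2 singlet correlation between
    # detectors set at angles a and b: E(a, b) = -cos(a - b).
    return -math.cos(a - b)

# Standard CHSH detector settings (radians), chosen to maximize the violation.
a, a_alt = 0.0, math.pi / 2
b, b_alt = math.pi / 4, 3 * math.pi / 4

# CHSH combination; any local hidden-variable model obeys |S| <= 2.
S = abs(E(a, b) - E(a, b_alt) + E(a_alt, b) + E(a_alt, b_alt))
print(round(S, 3))  # 2.828, i.e. 2*sqrt(2) > 2, the Tsirelson maximum
```

No locally pre-encoded results at the source can reproduce S above 2, which is the sense in which the correlations are "too good" for local causes.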

    The author, James F. Woodward (JFW), introduces Mach’s Principle, though in an ambiguous way to my mind. He says that the computation of the rest mass from local quantum field theory, as has in fact been accomplished for hadrons by MIT Nobel Laureate Frank Wilczek et al. using supercomputers, is not sufficient to explain the inertia of Newton’s Second Law of particle mechanics. This does sound like occult astrology at first glance, but we do have the 1940 Wheeler-Feynman classical electrodynamics, in which radiation reaction is explained as a back-from-the-future retrocausal advanced influence of the future absorber on the past emitter in a globally self-consistent loop in time. Indeed, Feynman’s path integral quantum theory grew out of this attempt. Hoyle and Narlikar, and John Cramer, have extended the original classical Wheeler-Feynman theory to quantum theory. Indeed, the zero-point virtual photons causing spontaneous emission decay of excited atomic electron states can be interpreted as a back-from-the-future effect. The electromagnetic field in the classical Wheeler-Feynman model did not have independent dynamical degrees of freedom, but in the Feynman-diagram quantum theory it does. However, the retrocausal feature survives. Therefore, the only way I can make sense of JFW’s fringe-physics proposal is to make the following conjecture. Let m0 be the renormalized rest mass of a real particle computed in the standard model of local quantum field theory. Then the observed rest mass m0’ equals a dimensionless nonlocal coefficient C multiplied by the local renormalized rest mass m0. Mach’s Principle is then C = 0 in an empty universe of only real test particles without any sources causing spacetime to bend. Furthermore, C splits into past-history retarded and future-destiny advanced pieces. Now, is there any Popper-falsifiable test of this excess baggage?
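The conjecture in the paragraph above can be written compactly. The labels on the two pieces of C are mine, chosen only to mirror the retarded/advanced split described in the text; this is not Woodward's notation.

```latex
m_0' = C\, m_0, \qquad C = C_{\mathrm{ret}} + C_{\mathrm{adv}},
\qquad C \to 0 \ \text{(empty universe, Mach's Principle)} .
```

Here \(m_0\) is the locally computed renormalized rest mass and \(C\) is the conjectured dimensionless nonlocal coefficient.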
  • Jack Sarfatti 1) Springer-Praxis Books in Space Exploration (2013)
    2) Einstein in Zurich over one hundred years ago read of a house painter falling off his ladder saying he felt weightless.
    3) I have since disassociated myself from that project, as have other hard
    ...See More
  • Jack Sarfatti 4) Roughly speaking, for particle mechanics, the dynamical action is the time integral of the kinetic energy minus the potential energy. The classical action principle is that the actual path is an extremum, in the sense of the calculus of variations, relative to all nearby possible paths with the same initial and final conditions. Richard P. Feynman generalized this classical idea to quantum theory, where the actual extremum path corresponds to constructive interference of complex-number classical-action phases, one for each possible path. There are more complications for velocity-dependent non-central forces, and there is also the issue of initial and final conditions. The action is generalized to classical fields, where one must use local kinetic and potential analog densities and integrate the field Lagrangian density over the 4D spacetime region bounded by initial-history and final teleological-destiny 3D hypersurface boundary constraints. Indeed, Yakir Aharonov has generalized this to quantum theory, in which there are back-from-the-future retrocausal influences on present weak quantum measurements made between the past initial and future final boundary constraints. Indeed, in our observable expanding accelerating universe causal diamond, these boundary constraints, I conjecture, are our past cosmological particle horizon from the moment of chaotic inflation leading to the hot Big Bang, together with our future dark energy de Sitter event horizon. Both of them are BIT-pixelated 2D hologram computer screens with us as IT-voxelated “weak measurement” 3D hologram images projected from them. The horizon pixel BIT quanta of area are of magnitude (~10^-33 cm, i.e. 10^19 GeV)^2. The interior bulk voxel IT quanta of volume are of magnitude (~10^-13 cm, i.e. 1 GeV)^3. This ensures that the number N of BIT horizon pixels equals the number of IT interior voxels in a one-to-one correspondence. 
The actually measured dark energy density is proportional to the inverse fourth power of the geometric mean of the smallest quantum-gravity Planck length with the largest Hubble-sized scale of our future de Sitter causal diamond, ~10^28 cm. This, when combined with the Unruh effect, corresponds to the Stefan-Boltzmann law of black-body radiation that started quantum physics back in 1900. However, this redshifted Hawking horizon black-body radiation must be coming back from our future de Sitter cosmological horizon, not from our past particle horizon.
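The rounded scales quoted in the two paragraphs above can be plugged in directly. This is my own back-of-envelope sketch in cgs units; all inputs are the post's order-of-magnitude values, so agreement to within a couple of powers of ten is the most such a check can show.

```python
import math

# Rounded order-of-magnitude scales quoted above (cgs units).
L_planck = 1.6e-33   # cm, Planck length (~10^19 GeV)
L_horizon = 1.0e28   # cm, future de Sitter horizon scale
L_voxel = 1.0e-13    # cm, ~1 GeV hadronic scale
hbar_c = 3.16e-17    # erg*cm

# BIT: horizon pixels of Planck area; IT: bulk voxels of hadronic volume.
n_pixels = (L_horizon / L_planck) ** 2   # ~10^122
n_voxels = (L_horizon / L_voxel) ** 3    # ~10^123

# Dark energy density ~ hbar*c / (geometric mean of Planck and Hubble scales)^4
# = hbar*c / (L_planck * L_horizon)^2.
rho = hbar_c / (L_planck * L_horizon) ** 2   # erg/cm^3

print(f"pixels ~ 10^{math.log10(n_pixels):.0f}, voxels ~ 10^{math.log10(n_voxels):.0f}")
print(f"rho ~ {rho:.1e} erg/cm^3 (observed dark energy density ~ 6e-9 erg/cm^3)")
```

At these rounded inputs the two counts agree only to within a couple of powers of ten (~10^122 vs ~10^123), and the density estimate lands within roughly two orders of magnitude of the measured value; whether that counts as the claimed one-to-one correspondence is a judgment about the rounding, not something the arithmetic settles.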
  • Jack Sarfatti 5) Localizing the four space and time translations corresponds to Einstein’s general coordinate transformations, which are now gauge transformations defining an equivalence class of physically identical representations of the same curvature tensor field. However, the compensating gauge connection there corresponds to torsion fields, not curvature fields. The curvature field corresponds to localizing the three space-space rotations and the three space-time Lorentz boosts together. Einstein’s General Relativity in final form (1916) has zero torsion with non-zero curvature. However, T.W.B. Kibble of Imperial College, London showed in 1961 how to get the Einstein-Cartan torsion + curvature extension of Einstein’s 1916 curvature-only model by localizing the full 10-parameter Poincare symmetry Lie group of Einstein’s 1905 Special Relativity. The natural geometric objects to use are the four Cartan tetrads, which correspond to Local Inertial Frame (LIF) detector/observers that do not rotate about their Centers of Mass (COM) and whose COMs are on weightless zero g-force timelike geodesics. Zero torsion is then imposed as an ad hoc constraint to regain Einstein’s 1916 model as a limiting case. The ten-parameter Poincare Lie group is a subgroup of the fifteen-parameter conformal group, which adds four constant-proper-acceleration hyperbolic Wolfgang Rindler horizon boosts and one dilation scale transformation corresponding to Hermann Weyl’s original failed attempt to unify gravity with electromagnetism. The spinor Dirac square roots of the conformal group correspond to Roger Penrose’s “twistors.”
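The parameter counts in the comment above are easy to verify from the standard dimension formulas: the Poincare group of d-dimensional spacetime has d translations plus d(d-1)/2 Lorentz rotations/boosts, and the conformal group (isomorphic to SO(2, d) for d = 4) has (d+1)(d+2)/2 generators. A minimal check:

```python
def poincare_dim(d):
    # d translations + d(d-1)/2 rotations and boosts (Lorentz group SO(1, d-1))
    return d + d * (d - 1) // 2

def conformal_dim(d):
    # Conformal group of d-dim spacetime ~ SO(2, d): (d+1)(d+2)/2 generators
    return (d + 1) * (d + 2) // 2

d = 4
print(poincare_dim(d))                      # 10
print(conformal_dim(d))                     # 15
print(conformal_dim(d) - poincare_dim(d))   # 5 = 4 conformal boosts + 1 dilation
```

The difference of 5 is exactly the four Rindler-type conformal boosts plus the single Weyl dilation mentioned in the text.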
  2. Phys. Rev. D » Volume 87 » Issue 4
    Phys. Rev. D 87, 041301(R) (2013) [6 pages]
    Observing the multiverse with cosmic wakes
    Matthew Kleban (Department of Physics, CCPP, New York University, New York, New York 10003, USA); Thomas S. Levi and Kris Sigurdson (Department of Physics and Astronomy, University of British Columbia, Vancouver, British Columbia V6T 1Z1, Canada)
    Received 28 January 2012; revised 26 May 2012; published 21 February 2013
    Current theories of the origin of the Universe, including string theory, predict the existence of a multiverse with many bubble universes. These bubble universes may collide, and collisions with ours produce cosmic wakes that enter our Hubble volume, appear as unusually symmetric disks in the cosmic microwave background, and disturb large scale structure. There is preliminary evidence consistent with one or more of these disturbances on our sky. However, other sources can produce similar features in the cosmic microwave background, and so additional signals are needed to verify their extra-universal origin. Here we find, for the first time, the detailed three-dimensional shape, temperature, and polarization signals of the cosmic wake of a bubble collision consistent with current observations. The polarization pattern has distinct features that when correlated with the corresponding temperature pattern are a unique and striking signal of a bubble collision. These features represent a verifiable prediction of the multiverse paradigm and might be detected by current or future experiments. A detection of a bubble collision would confirm the existence of the multiverse, provide compelling evidence for the string theory landscape, and sharpen our picture of the Universe and its origins.
    • Jack Sarfatti Kuch, you are not communicating intelligibly in many of your sentences.
  • William Kuch My apologies for that; it's a habit I've been trying to break.
    • Theodore Silva I like the Multiverse idea, it leaves open the concept of a kind of "natural selection" for evolving Universes -- even a kind of sexual selection, like the exchange of genes between bacteria. Universes exchanging Constants?
    • Paul Zielinski "No Z you are confused. Tegmark's Levels 1 and 2 are a simple consequence of Einstein's GR + INFLATION." No Jack I am not confused. The mainstream view is that as things stand the existence of a Tegmark Level II multiverse is a *hypothesis*, and I agree with that view.

      The anthropic conundrum is solved in the Tegmark Level II multiverse model by random generation of new universes, in a kind of cosmic Darwinian lottery -- as discussed for example by Penrose. I see nothing in contemporary physics that *requires* the existence of such a multiverse, and the observational support at this point is rather weak. All kinds of things can be derived in theory that may or may not be realized in nature.

      Of course a Tegmark Level III multiverse (a la Everett) is another issue, and is even more conjectural than Level II, since it is based on an alternate interpretation of QM, and is thus not subject to direct empirical confirmation. So I agree with you on that.
    • William Kuch The term "Multiverse" is an oxymoron, resolvable IFF all of these alternate universes are trivial. BAM.
    • Jack Sarfatti Kuch U r babbling like a loon and do not at all understand this subject. You are way out of your depth and do not know that you do not know.
    • Jack Sarfatti Z, yes, multiverse Level II is a hypothesis that becomes a "theorem" if you accept the mainstream theory of "chaotic inflation," for which actual evidence is accumulating and more decisive tests are coming. Level I is much more certain, as it only requires Einstein's GR; this is explained in Tamara Davis's PhD thesis. There are many "causal diamonds"; we are inside one of them, and they are observer-dependent.
    • William Kuch Indeed I am, with one caveat. I do not babble like a loon. I babble as one.
    • Jack Sarfatti A moment of lucid self-awareness - good for you.
    • Jack Sarfatti OK Z, I think we agree: Level I is very probable, effectively a fact given Tamara Davis's PhD thesis; Level II is less certain, e.g. Penrose's qualms about chaotic inflation; Level III is even less certain, and I actually reject it; Level IV seems to be of no scientific value. BTW string theory is getting more testable, it seems, from Lenny Susskind's Stanford online videos.
    • Paul Zielinski OK Jack let's agree that GR + cosmic inflation strongly suggests the possibility of a Level II multiverse being realized in nature. But let's also acknowledge that the inflation model is still itself hypothetical in character. So yes if you are committed to the inflation model then it is reasonable to take the existence of a Level II multiverse seriously.
  3.
    • Jack Sarfatti On Jun 24, 2013, at 5:27 PM, JACK SARFATTI <adastra1@me.com> wrote:

      The problem is that it does no work, so we cannot apply it to fly an airplane or a spaceship; there always seems to be a Catch-22 preventing a useful application:

      "perpetual motion"? fir
      st thought "crackpot"

      second thought: "Wilczek's time crystal"

      Rotating Casimir systems: magnetic field-enhanced perpetual motion, possible realization in doped nanotubes, and laws of thermodynamics
      M. N. Chernodub
      CNRS, Laboratoire de Mathematiques et Physique Theorique, Universite Francois-Rabelais Tours,
      Federation Denis Poisson, Parc de Grandmont, 37200 Tours, France and
      Department of Physics and Astronomy, University of Gent, Krijgslaan 281, S9, B-9000 Gent, Belgium
      (Dated: August 24, 2012)

      Recently, we have demonstrated that for a certain class of Casimir-type systems ("devices") the energy of zero-point vacuum fluctuations reaches its global minimum when the device rotates about a certain axis rather than remains static. This rotational vacuum effect may lead to the emergence of permanently rotating objects provided the negative rotational energy of zero-point fluctuations cancels the positive rotational energy of the device itself. In this paper, we show that for massless electrically charged particles the rotational vacuum effect should be drastically (astronomically) enhanced in the presence of a magnetic field. As an illustration, we show that in a background of experimentally available magnetic fields the zero-point energy of massless excitations in rotating torus-shaped doped carbon nanotubes may indeed overwhelm the classical energy of rotation for certain angular frequencies so that the permanently rotating state is energetically favored. The suggested "zero-point driven" devices, which have no internally moving parts, correspond to a perpetuum mobile of a new, fourth kind: They do not produce any work despite the fact that their equilibrium (ground) state corresponds to a permanent rotation even in the presence of an external environment. We show that our proposal is consistent with the laws of thermodynamics.
      PACS numbers: 03.70.+k, 42.50.Lc, 42.50.Pq

      Sent from my iPhone

      On Jun 24, 2013, at 2:05 PM, art wagner wrote:

      http://xxx.lanl.gov/pdf/1207.3052.pdf
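The "rotational vacuum effect" in the quoted abstract can be caricatured with a toy energy function. This is my own illustration of how a negative zero-point contribution quadratic in the angular frequency can move the total-energy minimum to nonzero rotation; it is not Chernodub's actual computation, and the coefficients are arbitrary.

```python
import math

# Toy model: total rotational energy E(w) = (I/2 - k) w^2 + b w^4, where
# I is the moment of inertia, k parametrizes a negative zero-point rotational
# energy, and b > 0 stabilizes the potential. If k > I/2, the minimum sits at
# nonzero w: the ground state rotates permanently, as the abstract describes.
I_moment, k, b = 1.0, 1.5, 0.25

def energy(w):
    return (I_moment / 2 - k) * w**2 + b * w**4

# Analytic location of the minimum for k > I/2.
w_star = math.sqrt((k - I_moment / 2) / (2 * b))
print(round(w_star, 3))  # 1.414: the energetically favored state rotates
```

The point of the caricature is only the sign structure: a zero-point term that grows more negative with rotation can outcompete the positive classical rotational energy, without any work being extractable at the minimum.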
    • Dean Radin rebuts the failure to replicate Bem's "Feeling the Future": the replication was done online without proper controls, Radin says, making it a bogus rebuttal.
    • Jack Sarfatti From: Dean Radin
      Subject: Re: Possible nuclear detonation detected by anomalous mental phenomena
      Date: June 24, 2013 5:02:48 PM PDT
      To: JACK SARFATTI
      ...See More
    • Jack Sarfatti From: JACK SARFATTI <adastra1@me.com>
      Subject: Re: [ExoticPhysics] Reality of Possibility
      Date: June 25, 2013 11:08:05 AM PDT
      To: Exotic Physics <exoticphysics@mail.softcafe.net>
      Reply-To: Jack Sarfatti's Workshop in Advanced Physics <exoticphysics@mai
      ...See More
      www.tcm.phy.cam.ac.uk
      This paper is dedicated to three great thinkers who have insisted that the world is not quite the straightforward affair that our successes in describing it mathematically may have seemed to suggest: Niels Bohr, whose analyses of the problem of explaining life play a central role in the following di...
    • Jack Sarfatti On Jun 25, 2013, at 1:27 PM, JACK SARFATTI <adastra1@me.com> wrote:

      On Jun 24, 2013, at 7:49 PM, Ruth Kastner <rekastner@hotmail.com> wrote:

      See Chapter 7 of my book. One can see the usual subject/object dichotomy as the absorption/emission dichotomy in TI, and can think of 'qualia' as the subjective aspects of any absorption event.

      This is strange. You seem to say that in the simplest Feynman diagram ---< (where --- = the photon line and < = the scattered electron)

      there is a conscious experience?

      I think you go too far. First of all, quantum electrodynamics is built upon orthodox quantum theory, with linear unitary evolution, the Born probability rule, signal locality ("passion at a distance," A. Shimony), no perfect cloning of an unknown quantum state, etc. built in. David Deutsch has correctly argued that consciousness is not possible in orthodox quantum theory.

      Basically, your distinction is equivalent to Bohm's; simply a change of nouns, in my opinion.

      Your "possibility" = Bohm's "quantum potential" Q = Wheeler's BIT = Stapp's "thought like" field = David Chalmers "intrinsic mental field"

      Your "actuality" = Bohm's not so "hidden variables" i.e. material particles/classical EM-gravity field configurations that are piloted by Q i.e. "beables."

      Valentini's recent claim, that Q is unstable leading to deviations from the Born probability rule where there should be none, of course needs to be addressed. Basil Hiley did so.

      As you will see in Lecture 8 of Michael Towler's http://www.tcm.phy.cam.ac.uk/~mdt26/pilot_waves.html

      The no-signal theorems of Adrian Kent et al. only apply in the approximate limit where the generalized action-reaction principle of Einstein's relativity is violated.

      In other words, the absence of stand-alone entanglement signaling (without a classical signal key to decrypt the coded message) depends upon the lack of a direct back-reaction of Q on the beables it pilots. This is equivalent to Antony Valentini's "sub-quantal thermal equilibrium" of the beables.
      Indeed, orthodox quantum theory is not background independent, to make an analogy of Q with space-time geometry. Q is not itself a dynamical field (in configuration space); it has no sources! This violates Einstein's relativity principle in a very deep sense: no absolute fields in physics. Any field that acts on another field must suffer back-reaction. Now of course we have test particles in the gravity & EM fields that are not sources. But we all understand that is an approximation. Orthodox quantum theory depends upon beables being test particles, i.e. not sources of the Q BIT field in configuration space. Therefore, orthodox quantum theory is an approximation to a more general theory, e.g. something like Valentini's, and is not complete. The most obvious breakdown of orthodox quantum theory is living matter.

      Orthodox Quantum Theory is simply John Archibald Wheeler's

      IT FROM BIT

      It is incomplete because it does NOT have direct back-reaction

      BIT FROM IT.
  • Jack Sarfatti Consciousness is, in my view, an emergent property of very complex, highly entangled, pumped many-particle open systems which are Prigogine's "dissipative structures," corresponding to Tony Valentini's "sub-quantal non-equilibrium." The big defect in Valentini's theory is that he does not properly address pumping of the system. He only really includes closed systems relaxing to thermal equilibrium.

      Consciousness is imprinting of information directly from the classical IT material degrees of freedom, e.g. the CLASSICAL F_uv = A_u,v - A_v,u, on their (super) pilot field Q, which is intrinsically mental.


      CONSCIOUS QUALIA = IT FROM BIT + BIT FROM IT

      in a creative self-organizing loop of a nonlinear non-unitary post-quantum theory.

      We need the "More is different" (P.W. Anderson) Higgs-Goldstone spontaneous breakdown of ground state symmetry to get the Glauber coherent states that obey a nonlinear nonunitary Landau-Ginzburg equation in ordinary space - not configuration space - that replaces the linear unitary Schrodinger-Dirac equations. This is why 't Hooft's S-Matrix for black hole horizons may fail. This is why Tegmark's Level 3 may fail as well.


      In particular, as I note in the book, the 'Now' (with its attendant qualia) is a primal, irreducibly local phenomenon, defined relative to an absorption resulting in an actualized transaction. Biological organisms are very sophisticated absorption systems. Note that my model does not presume that the physical entities are mind-free Cartesian matter, so it allows for a subjective component within the interacting systems, although the model is not observer-dependent.

      RK

      From: adastra1@me.com
      Subject: Re: Reality of Possibility
      Date: Mon, 24 Jun 2013 19:26:50 -0700
      To:

      It's much more than that. I have a clear picture of qualia. What's yours?

      Sent from my iPhone

      On Jun 24, 2013, at 7:18 PM, Ruth Kastner <rekastner@hotmail.com> wrote:

      You're depending on the Bohmian model here. I'm working with a different model, so these arguments don't apply.

      RK
      Subject: Re: Reality of Possibility
      Date: Mon, 24 Jun 2013 18:34:05 -0700
      To: rekastner@hotmail.com

      I don't think u can have consciousness qualia without signal nonlocality violating quantum theory.

      Sure free will is simply the piloting of matter by Bohm's Q. However, you cannot have qualia imprinted on Q from the matter Q pilots. Quantum theory violates the generalized action-reaction principle.

      Sent from my iPhone

      On Jun 24, 2013, at 6:24 PM, Ruth Kastner wrote:

      Jack,

      Thanks for the feedback.
      My interpretation of the quantum realm as physical possibility certainly leaves room for the theory to apply to consciousness and biological systems. For example, I don't go into this in detail in my book, but 'offer waves' (i.e. the entities described by quantum states) are excitations of the relevant fields. The creation of these entities (involving 'creation operators' in QFT) is inherently unpredictable. This leaves room for things like volition and creativity within the standard theory.
      So I disagree that one needs a Valentini-type model i.e., going beyond standard QM, for these things.

      I welcome thoughts on my guest post on George Musser's Sci Am blog (http://blogs.scientificamerican.com/critical-opalescence/2013/06/21/can-we-resolve-quantum-paradoxes-by-stepping-out-of-space-and-time-guest-post/)

      Ruth

      From: adastra1@me.com
      Date: Mon, 24 Jun 2013 18:07:52 -0700
      Subject: Reality of Possibility

      To: rek

      Ruth, I disagree with your basic thesis that orthodox quantum theory is complete.
      This would deny Antony Valentini's sub-quantal non-equilibrium with signal nonlocality for example.
      My basic thesis is that orthodox quantum theory is incomplete: it cannot explain biology and consciousness.
      Both the latter depend upon signal nonlocality in strong violation of orthodox quantum theory.

      1) linear Hermitian operators for all observables

      2) orthogonal eigenfunctions for all observables

      3) unitary time evolution

      4) linear superposition of quantum states

      5) Born probability interpretation

      6) consciousness

      are incompatible

      I also accept retro-causation in mind/brain data as a working hypothesis, i.e. Libet, Radin, Bierman, Bem.
  • Jack Sarfatti "We present an exactly-solvable model for the suppression of quantum noise at large scales on expanding space. The suppression arises naturally in the de Broglie-Bohm pilot-wave formulation of quantum theory, according to which the Born probability rul...See More
  • Jack Sarfatti here is another one: Their last sentence

    "If all this is a dead end, there remains analog systems, like the ones studied in quantum information and condensed matter."

    is the most important - Frank Wilczek's anyons in 2D with fractional quantum statistics like quantum Hall effect, topological computers with sub-quantum non-equilibrium signal nonlocality?

    On Jun 12, 2013, at 8:06 AM, art wagner <wagnerart@hotmail.com> wrote:

    http://xxx.lanl.gov/pdf/1306.0967.pdf
Kalamidas entanglement signal design refuted decisively
  • Steve Schultz Well, that's no fun. Guess that means I won't need to register the radio station letters KFTL...
  • Jack Sarfatti On Jun 12, 2013, at 8:50 AM, Suda Martin wrote:

    Dear all,

    Yes, if one calculates precisely the Kalamidas expression given in the attachment of CG's email, one obtains exactly
    ...See More
  • Jack Sarfatti From: CHRISTOPHER GERRY
    Sent: Wednesday, June 12, 2013 16:18
    To: nick herbert; Demetrios Kalamidas
    Cc: John Howell; Suda Martin; ghirardi Giancarlo; Ruth Elinor Kastner; JACK SARFATTI
    Subject: Re: More on the |0>|0> term


    I probably shouldn't jump in on this again, but...

    I can assure you that there's no thorn in the side of the quantum optics community concerning the scheme of Kalamidas. There are only people doing bad calculations. Despite claims to the contrary, our paper, as with Ghirardi's, does specifically deal with the Kalamidas proposal. It is quite clearly the case that EXACT calculations in the Kalamidas proposal show that the claimed effect disappears. To suggest that it's there in the approximate result obtained by series expansion, and therefore must be a real effect, is simply preposterous. All it means is that the approximation is wrong; in this case due to dropping important terms.

    The whole business about the |00> term and whatever (the beam splitter transformations and all that) is not the issue. I'm astonished at how the debate on this continues. The real problem, and I cannot emphasize it enough, is this: Kalamidas cannot do quantum optical calculations, even simple ones, and therefore nothing he does should be taken seriously. As I've said before, his calculation of our Eq. (9), which I have attached here, is embarrassingly wrong. It's obvious from the expression of the expectation value in the upper left that there have to be two terms in the result, both containing the product of r and t. But Kalamidas throws away one of the terms, which is of the same order of magnitude as the one he retains. Or maybe he thinks that term is zero via the quantum mechanical calculation of its expectation value, which it most certainly is not. His limits have been taken inconsistently. So, he not only does not know how to do the quantum mechanical calculations, he doesn't even know how or when the limits should be taken. There's absolutely no point in debating the meaning of the results of incorrect calculations. Of course, by incorrectly doing these things he gets the result he wants, and then thinks it's the duty of those of us who can do these calculations to spend time showing him why his calculations are wrong, which he then dismisses anyway. My point in again bringing up this specific calculation of his is not to say anything about his proposal per se, but to demonstrate the abject incompetence of Kalamidas in trying to do even the most elementary calculations. And if anyone still wonders why I'm angry about the whole affair, well, what should I feel when some guy unable to do simple calculations tries to tell established quantum optics researchers, like me and Mark Hillery, that we're wrong, and dismisses our paper showing where he's wrong as "irrelevant"? He doesn't even seem to know that what he said was an insult.

    And finally, the continued claim that the specific proposal of Kalamidas has not been addressed must simply stop. It has been, repeatedly. I suspect this claim is being made because people don't like the results of the correct calculations. That's not the problem of those of us who can carry through quantum optical calculations.

    CG
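Gerry's central complaint is about inconsistent truncation rather than any subtlety of the physics. Here is a generic toy illustration (my own, not the actual Eq. (9) calculation, which is not reproduced in this thread): approximating a weak coherent state |α⟩ by |0⟩ + α|1⟩ silently discards a two-photon amplitude of order α², exactly the order of the interference terms such a calculation then tries to keep.

```python
import math

alpha = 0.1  # illustrative weak coherent amplitude

# Exact Fock-basis amplitudes of |alpha>: c_n = exp(-|alpha|^2/2) alpha^n / sqrt(n!)
c = [math.exp(-alpha**2 / 2) * alpha**n / math.sqrt(math.factorial(n))
     for n in range(4)]

dropped = abs(c[2])    # two-photon amplitude discarded by the |0> + alpha|1> truncation
kept_order = alpha**2  # size of the O(alpha^2) terms a signaling calculation retains

print(f"|c_2| = {dropped:.4f} vs alpha^2 = {kept_order:.4f}")
# The discarded amplitude is the same order as the effect being computed, so
# keeping one O(alpha^2) term while dropping another changes the answer at
# leading order; it is an inconsistent expansion, not new physics.
```

This is the generic shape of the error Gerry describes: any claimed O(α²) signaling effect must survive with all amplitudes retained consistently to that order.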
  • Keith Kenemer disappointing, but not unexpected...
  • Jack Sarfatti Yes, but here is latest from Nick Herbert - Custer's Last Stand
    On Jun 12, 2013, at 12:28 PM, nick herbert <quanta@cruzio.com> wrote:

    All--

    Excuse me for being confused.
    Gerry refutes Kalamidas by showing that an omitted term is large.
    Suda refutes Kalamidas by showing that the same term is identically zero.
    What am I missing here?

    I wish to say that I accept the general proofs. Kalamidas's scheme will not work as claimed.
    That is the bottom line. So if the general proofs say FTL will fail for the full calculation, then it will certainly fail for approximations.

    The "weak coherent state" is a common approximation made in quantum optics. And dozens of experiments have been correctly described using this approximation. So it should be a simple matter to show if one uses
    Kalamidas's approximation, that FTL terms vanish to the appropriate level of approximation. If this did not happen we would not be able to trust the results of approximation schemes not involving FTL claims.

    Gerry's criticism is that Kalamidas's scheme is simply WRONG--that he has thrown away terms DK regards as small. But in fact they are large. Therefore the scheme is flawed from the outset.

    If Gerry is correct, then it seems appropriate to ask: Is there a CORRECT WAY of formulating the Kalamidas scheme using the "weak coherent state" approximation, where it can be explicitly shown that this correct scheme utterly fails?

    It seems to me that there are still some loose ends in this Kalamidas affair, if not a thorn in the side, at least an unscratched itch.

    It seems to me that closure might be obtained. And the Kalamidas affair properly put to rest if everyone can agree that
    1. DK has improperly treated his approximations;
    2. Using the CORRECT APPROXIMATION SCHEME, the scheme abjectly fails, just as the exact calculation says it must.

    Why should it be so difficult to construct a correct description of the Kalamidas proposal, with CORRECT APPROXIMATIONS, and show that it fails to work as claimed?

    As seen from the Ghirardi review, there are really not that many serious FTL proposals in existence. And each one teaches us something-- mostly about some simple mistakes one should not make when thinking about quantum systems. Since these proposals are so few, it is really not a waste of time to consider them in great detail, so we can learn to avoid the mistakes that sloppy thinking about QM brings about.

    When Ghirardi considers the Kalamidas scheme in his review, I would consider it less than adequate if he did not include the following information:

    1. Kalamidas's scheme is WRONG because he treats approximations incorrectly.
    2. When we treat the approximations correctly, the scheme fails, just as the general proofs say it must.

    Gerry has provided the first part of this information. What is seriously lacking here is some smart person providing the second part.

    Nick Herbert
  • Jack Sarfatti On Jun 12, 2013, at 2:07 PM, JACK SARFATTI <adastra1@me.com> wrote:

    Lest anyone be confused: I am not defending Kalamidas's gedankenexperiment. Neither is Nick Herbert.
    I agree that, in contrast to Antony Valentini's strategy, any proposal for stand-alone entanglement signaling that does not violate an axiom of orthodox quantum theory will fail. Furthermore, one must show why such a violation is found in Nature. It's not clear whether John Cramer's experiment is supposed to violate quantum theory or not.
    "Going for a blast into the real past" -- Tom Paulson, seattlepi.com, Nov 14, 2006 (www.seattlepi.com/.../Going-for-a-blast-into-the-real-past-1219...)
    "An Experimental Test of Signaling using Quantum Nonlocality" -- John G. Cramer, faculty.washington.edu/jcramer/NLS/NL_signal.htm
    Begin forwarded message:

    From: ghirardi
    Date: June 12, 2013 1:33:38 PM PDT
    To: CHRISTOPHER GERRY

    To reinforce the appropriate remarks by Christopher, I want to stress that suggesting that my contributions, as well as Gerry's, do not deal with Kalamidas' proposal is an unacceptable position to take. Both of us have PROVED that precisely Kalamidas' proposal does not work and is affected by basic errors that either derive from a mistaken use of general quantum rules or from resorting to unjustified and wrong approximations. That's the story.

    GianCarlo Ghirardi

    P.S. I believe that the debate which is going on, if it becomes known to a larger community of physicists, is seriously damaging the investigations on foundational issues since it puts into clear evidence that part of the people involved is not even capable of using correctly the basic principles of quantum mechanics.

    GianCarlo Ghirardi
    Emeritus
    University of Trieste
    Italy
  • Jack Sarfatti For the record I agree with Chris Gerry below: "On Jun 12, 2013, at 2:03 PM, CHRISTOPHER GERRY <christopher.gerry@lehman.cuny.edu> wrote:

    We are both right: the two terms cancel each other out! That the whole expectation value is zero is actually exactly what's in our paper's Eq. 9. This happens because the reciprocity relations must hold. That Kalamidas thought (or maybe even still thinks) his calculation is correct is at the heart of the matter: that is, he is either unable to do the calculations, or he can do them but chooses not to because they don't get him where he wants to go.

    The Kalamidas scheme will not work, on the basis of general principles, as we showed in the first part of our paper (see also Ghirardi's paper).

    And again, the notion that an alleged approximate calculation (I say "alleged" because, as with everything else, there are correct and incorrect approximate calculations) based on a weak signal coherent state somehow trumps an exact computation valid for any value of the coherent state parameter, is, well, just insane. If you want to see where things go wrong, just take more terms in the series expansions. Add up enough terms and, voilà, no effect! One can't get much more specific than that.

    Christopher C. Gerry
    Professor of Physics
    Lehman College
    The City University of New York
    718-960-8444
    christopher.gerry@lehman.cuny.edu"
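Gerry's remark that "the reciprocity relations must hold" refers, for a lossless beam splitter, to relations equivalent to unitarity of its 2x2 mode-transformation matrix; it is exactly these relations that force the two candidate signaling terms to cancel. A minimal numerical check (the angle and the 90-degree reflection-phase convention are my illustrative assumptions, not from the thread):

```python
import numpy as np

# Lossless beam splitter: transmission t, reflection r.
# A common convention puts a 90-degree phase on reflection: r = i*sin(theta).
theta = 0.3                                  # arbitrary illustrative angle
t, r = np.cos(theta), 1j * np.sin(theta)
U = np.array([[t, r],
              [r, t]])

# Unitarity of U encodes the reciprocity (Stokes) relations:
#   |t|^2 + |r|^2 = 1   and   t*conj(r) + r*conj(t) = 0
assert np.allclose(U.conj().T @ U, np.eye(2))
energy = abs(t)**2 + abs(r)**2               # energy conservation, equals 1
cross = t * np.conj(r) + r * np.conj(t)      # cross term, vanishes
print(energy, cross)
```

A lossy or otherwise non-unitary "beam splitter" violates these relations, which is one way a mishandled approximation can smuggle spurious FTL terms into a calculation.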
  2. NICK'S REVIEW OF THE KALAMIDAS AFFAIR (JUNE 5, 2013)
    "Recently CCNY physics graduate Demetrios Kalamidas proposed a clever
    faster-than-light signaling scheme [DK1] which survived peer review and
    was recently published in Journal of the Optical Society of America. Kalamidas's FTL scheme has generated much discussion and controversy which I will attempt to summarize in this brief review."
    • Jack Sarfatti Nick Herbert continues: "I wish to emphasize that I am not a member of the quantum-optics community nor am I proficient in boson algebra. I am however familiar with devising and refuting FTL communication schemes [1]. I would appreciate comments, corrections and additions to this review.
      Kalamidas's scheme is based on a path-uncertain pair of photons shared by
      Alice and Bob. Whenever Bob's photon path is certain, then so is Alice's, and
      no path interference can occur at Alice's detectors. But if Bob erases which-path information at his detectors, so the argument goes, Alice's which-path information is also (instantly!) erased and interference ensues at Alice's detectors.
      By turning his quantum eraser on and off, Bob can send an FTL signal
      to Alice in the form of patterns of interference or no-interference.
      The beauty of Kalamidas's scheme resides in his original method of which-path
      erasure. When Bob's path info is certain, one path contains a single photon
      and the other path is empty, symbolized by |10> or |01>."
    • Jack Sarfatti "Kalamidas proposes to erase which-path info by mixing into each path a kind of light whose photon number is uncertain. The source of this number-uncertain light is a coherent state |A> which is mixed with Bob's photons via a weakly reflecting beam splitter ( r --> 0) where A is adjusted so that a "weak
      coherent state" |rA> ≈ |0> + rA |1> blends with whatever is in Bob's path. [2] This scheme leads to five possible outputs: |10>, |01>, |11>, |02> and |20>. For four of these outputs, the path Bob's photon took is not erased, but whenever Bob's counters read |11>, which path the photon took is uncertain and erasure ensues. Using this scheme, Kalamidas can demonstrate apparent FTL signaling from Bob to Alice."
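The "weak coherent state" truncation quoted above can be sanity-checked numerically: the Fock amplitudes of a coherent state |beta> are c_n = exp(-|beta|^2/2) beta^n / sqrt(n!), so for small beta = rA almost all of the norm sits in the |0> and |1> components. A short sketch (the value beta = 0.1 is an illustrative choice, not from the text):

```python
import numpy as np
from math import factorial

def coherent_amplitudes(beta, nmax=20):
    """Fock amplitudes of |beta>: c_n = exp(-|beta|^2/2) * beta^n / sqrt(n!)."""
    return np.array([np.exp(-abs(beta)**2 / 2) * beta**n / np.sqrt(factorial(n))
                     for n in range(nmax + 1)])

beta = 0.1                              # plays the role of rA; illustrative value
c = coherent_amplitudes(beta)

w01 = abs(c[0])**2 + abs(c[1])**2       # probability weight of the |0> and |1> terms
print(f"norm captured by |0> + rA|1>: {w01:.6f}")
print(f"weight in |2> and above:      {1 - w01:.2e}")
```

The neglected components carry only ~|rA|^4 of the probability, which is why the truncation looks so harmless; the dispute in this thread is over whether those formally small terms can still matter in interference calculations.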
    • Jack Sarfatti "Once I heard of this scheme, I publicized it on my blog [NH1] and hastened to refute it. I was able to invent a simpler path-erasure scheme using "Gray light" |U> instead of a coherent state (where |U> = x|0> + y|1>) which was easy to refute [NH2]. But I could not refute Kalamidas's original scheme.
      Instead of refuting DK's scheme, I actually enhanced it by showing that if he
      strengthened his "weak coherent state" by expanding it to higher powers of
      (rA), the intensity of his FTL signal would actually increase [NH3]. At about
      this same time I wrote the theme song for an opera celebrating DK's quixotic
      quest [NH4] and issued a second blog post [NH5] publicly challenging the
      physics community to refute DK's audacious scheme. The first physicist to take up the challenge was John Howell at the University of Rochester, who produced a general refutation of FTL schemes using photon-mixing of the Kalamidas type [JH1]. John's proof used Displaced Fock States (DFS) as Bob's counter outputs and suggested moreover that Kalamidas had erred by using Photon-added Coherent States (PACS) instead of DFS.
      "Everyone knows" that DFS are the correct output states for this kind of experiment, Howell insisted. This has been shown both theoretically and by experiment, for instance here [L&B] and here [W/MS/al]. Kalamidas could not see where his derivation was flawed, but it was clear that his states were of the PACS type. So if DFS was correct, he was prepared to reluctantly admit defeat. However, Martin Suda of the Austrian Institute of Technology came to the rescue with a simple proof that, at this particular stage of the beam-splitter algebra, both PACS and DFS are correct states [MS1], an astonishing result I call "the Martin Suda Paradox".
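One way to make the Suda symmetry plausible: as the coherent amplitude alpha -> 0, both the (normalized) photon-added coherent state a†|alpha> and the displaced Fock state D(alpha)|1> reduce to the one-photon state |1>, so their overlap approaches unity. The sketch below compares the two states directly in a truncated Fock space; this is my illustration of the near-coincidence of PACS and DFS at small alpha, not Suda's actual beam-splitter argument, and alpha = 0.05 is an arbitrary small value:

```python
import numpy as np

N = 30                                     # Fock-space truncation
a = np.diag(np.sqrt(np.arange(1, N)), 1)   # annihilation operator: a|n> = sqrt(n)|n-1>
adag = a.conj().T

def displace(alpha, v, terms=60):
    """Apply D(alpha) = exp(alpha*adag - conj(alpha)*a) to vector v via a Taylor series."""
    X = alpha * adag - np.conj(alpha) * a
    out, term = v.copy(), v.copy()
    for k in range(1, terms):
        term = X @ term / k
        out = out + term
    return out

alpha = 0.05
vac = np.zeros(N, complex); vac[0] = 1.0
one = np.zeros(N, complex); one[1] = 1.0

coh = displace(alpha, vac)                 # coherent state |alpha>
pacs = adag @ coh
pacs /= np.linalg.norm(pacs)               # normalized PACS: a†|alpha>
dfs = displace(alpha, one)                 # DFS: D(alpha)|1>

overlap = abs(np.vdot(dfs, pacs))
print(f"|<DFS|PACS>| at alpha = {alpha}: {overlap:.6f}")
```

The two families differ at first order in alpha (the DFS picks up a small |0> component that the PACS lacks), which is why their interchangeability at one particular stage of the beam-splitter algebra [MS1] is surprising rather than trivial.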
    • Jack Sarfatti Nick continued: "Coincidentally, GianCarlo Ghirardi had just published a review of past FTL signaling schemes [GCG1] and was drawn into the debate. Together with Raffaele Romano, Ghirardi produced a general refutation [G&R] based on "unitary operations." If the operations that Kalamidas performed on his photons were all unitary, then G & R showed that no FTL signaling would ensue.
      Then one of Kalamidas's former teachers and author of several lucid texts on
      quantum-optics, Christopher Gerry, composed a general refutation [CG/etal]
      based on PACS, the same states Kalamidas had used in his scheme. John
      Howell, at about the same time, published a slightly different refutation [JH2]
      also based on PACS.
    • Jack Sarfatti "One might imagine that, confronted with so many general refutations from all sides, Kalamidas would cave in and admit defeat. But a funny thing happened on the way to the refutation.
      Despite all the general proofs that his scheme was impossible, no one had
      been able to find a mistake in either Kalamidas's math or his physics. It was true
      that his scheme involved an APPROXIMATION, but approximations are used
      all the time in physics. DK's "weak coherent state", for instance, is a veritable
      workhorse of quantum optics, is quite well-understood and appears in numerous experiments where it causes no paradoxical behavior. Kalamidas could cite considerable precedent for using this approximation. One of the reviewers quite rightly pointed out that if the general proofs (which contain no approximations) said that DK's FTL scheme could not work, then that certainly spelled doom for all approximate schemes such as the one DK was proposing. To which DK boldly replied: since you are so certain--because of your general proofs--that I am wrong, then it should be "easy pickins" for you to discover my mistake. But no one has yet met this Kalamidas challenge."
    • Jack Sarfatti "There are two issues here: 1. the PACS vs DFS issue and 2. the EXACT vs
      APPROXIMATION issue.
      General refutations using both the PACS and DFS formulations have been
      derived but the PACS APPROXIMATION scheme has not been refuted. It
      remains a mystery why this refutation has not occurred.
      To top things off, Martin Suda formulated a Kalamidas-like scheme using
      DFS APPROXIMATION instead of PACS [MS2]. Suda's new scheme, even
      though approximate, was easily refuted--all the FTL signaling terms obligingly
      summed to zero. However, Martin's nice refutation was spoiled by the
      presence of an ugly non-physical |00> term which no one could justify or
      explain.
      What is the meaning of this impasse? Why can't Kalamidas's simple approximation be refuted when the unapproximated schemes are easily destroyed?
      Martin faintly suspects it has to do with the way the vacuum states |0> are
      treated in approximation schemes. I've always been confused whenever vacuum
      states appear in calculations mixed with "real states". Maybe Kalamidas's
      stubbornly unrefuted FTL scheme (which is certainly wrong, make no
      mistake) has something new and subtle to teach us about boson algebra."
      Nick Herbert (quanta@cruzio.com) June 5, 2013
    • Jack Sarfatti REFERENCES
      [1] Nick Herbert "Faster Than Light: Superluminal Loopholes in Physics" NAL (1989) <http://www.amazon.com/gp/product/0452263174?ie=UTF8&tag=nikkherbert-20>
      [2] A coherent state is conventionally written |alpha>, where "alpha" is a complex number. For typographical convenience, I write a coherent state as |A>, where A is understood to be the upper-case Greek "alpha".
      [DK1] Demetrios Kalamidas "A Proposal for a Feasible Quantum-Optical Experiment to Test the Validity of the No-signaling Theorem" <http://lanl.arxiv.org/abs/1110.4629> -- Kalamidas's original proposal in the physics arXiv.
      [NH1] Nick Herbert "The Kalamidas Experiment (blog)" <http://quantumtantra.blogspot.com/2013/02/the-kalamidasexperiment.html> -- Publicizing (#1) DK's FTL communication scheme; confirmation of the APPROX DK FTL scheme.
      [NH2] Nick Herbert "The Kalamidas Experiment (pdf)" <http://quantumtantra.com/KalamidasFINAL.pdf> -- Refutation of the FULL Gray-light version of the DK FTL scheme. (In these references "FULL" means NO APPROXIMATIONS.)
      [NH3] Nick Herbert "Maximizing the Kalamidas Effect (pdf)" <http://quantumtantra.com/Kalamidas1.pdf> -- Expanding & confirming the APPROX DK FTL scheme to higher powers of rA.
      [NH4] Nick Herbert "Demetrios! The Opera (blog)" <http://quantumtantra.blogspot.com/2013/02/demetrios-opera.html> -- Demetrios! The Opera.
      [NH5] Nick Herbert "FTL Signaling Made Easy (blog)" <http://quantumtantra.blogspot.com/2013/05/ftl-signaling-madeeasy.html> -- Publicizing (#2) the APPROX DK FTL signaling scheme.
      [JH1] John Howell "Refutation of the Kalamidas's Signaling" (private communication) -- Refutation of the FULL DFS version of the DK FTL scheme.
      [W/MS/al] A. Windhager, Martin Suda et al. "Quantum Interference between a Single-photon Fock State and a Coherent State" <http://arxiv.org/pdf/1009.1844.pdf> -- Derivation of the DFS output of a beam splitter with input |A, 1>.
      [L&B] A. I. Lvovsky & S. A. Babichev "Synthesis and Tomographic Characterization of the Displaced Fock State" <http://lanl.arxiv.org/abs/quant-ph/0202163> -- Production and measurement of DFS at a beam splitter output.
      [GCG1] GianCarlo Ghirardi "Entanglement, Non-locality, Superluminal Signaling and Cloning" <http://lanl.arxiv.org/pdf/1305.2305v1.pdf> -- Refutation of several historical FTL signaling schemes.
      [G&R] GianCarlo Ghirardi & Raffaele Romano "On a quite recent proposal of faster than light communication" (private communication) -- General refutation of all FULL unitary systems.
      [CG/etal] Christopher Gerry, VV, Ugur Güney & Mark Hillery "Comment on a superluminal signaling scheme" (private communication) -- Refutation of the FULL PACS version of the DK FTL scheme.
      [MS1] Martin Suda "Martin Suda Paradox" (private communication) -- Symmetry of PACS and DFS at the BS output.
      [MS2] Martin Suda "Interferometry at the 50/50 BS" (private communication) -- Refutation of the APPROX DFS version of the DK FTL scheme.
      [JH2] John Howell "Full Calculation No Approximation" (private communication) -- Refutation of the FULL PACS version of the DK FTL scheme.