Begin forwarded message:
From: JACK SARFATTI <Sarfatti@PacBell.net>
Subject: [Starfleet Command] Re: Causal Discovery Algorithms - where to draw the line in the sand on the domain of validity of orthodox quantum no entanglement signaling
Date: March 9, 2013 12:47:28 PM PST
To: Exotic Physics <exoticphysics@mail.softcafe.net>
Reply-To: SarfattiScienceSeminars@yahoogroups.com
Right on the money
where to draw the line in the sand on the domain of validity of orthodox quantum no entanglement signaling postulate
"The deBroglie-Bohm interpretation is a prominent example
of a model that seeks to provide a causal explanation
of Bell correlations using superluminal causal influences.
Consider the deBroglie-Bohm interpretation of a
relativistic theory such as the model of QED provided by
Struyve and Westman [18], or else of a nonrelativistic theory
wherein the interaction Hamiltonians are such that
there is a maximum speed at which signals can propagate.
In both cases, it is presumed that there is a preferred rest
frame that is hidden at the operational level. In a Bell
experiment, if the measurement on the left wing occurs
prior to the measurement on the right wing relative to the
preferred rest frame, then there is a superluminal causal
influence from the setting on the left wing to the outcome
on the right wing, mediated by the quantum state,
which is considered to be a part of the ontology of the
theory [19]. (Note that no causal influence from the outcome
of the first experiment to the outcome of the second
is required because the outcomes are deterministic functions
of the Bohmian configuration and the wavefunction.)
It follows from our analysis that the parameters in
the causal model posited by the deBroglie-Bohm interpretation
must be fine-tuned in order to explain the lack
of superluminal signalling.
Valentini's version of the deBroglie-Bohm interpretation
makes this fact particularly clear. In Refs. [20, 21]
he has noted that the wavefunction plays a dual role in
the deBroglie-Bohm interpretation. On the one hand,
it is part of the ontology, a pilot wave that dictates the
dynamics of the system's configuration (the positions of
the particles in the nonrelativistic theory). On the other
hand, the wavefunction has a statistical character, specifying
the distribution over the system's configurations.
In order to eliminate this dual role, Valentini suggests
that the wavefunction is only a pilot wave and that any
distribution over the configurations should be allowed as
the initial condition. It is argued that one can still recover
the standard distribution of configurations on a coarse-grained
scale as a result of dynamical evolution [22].
Within this approach, the no-signalling constraint is a
feature of a special equilibrium distribution. The tension
between Bell inequality violations and no-signalling
is resolved by abandoning the latter as a fundamental
feature of the world and asserting that it only holds as
a contingent feature. The fine-tuning is explained as the
consequence of equilibration. (It has also been noted in
the causal model literature that equilibration phenomena
might account for fine-tuning of causal parameters [23].)
Conversely, the version of the deBroglie-Bohm interpretation
espoused by Dürr, Goldstein and Zanghì [24],
which takes no-signalling to be a non-contingent feature
of the theory, does not seek to provide a dynamical explanation
of the fine-tuning. Consequently, it seems fair
to say that the fine-tuning required by the deBroglie-
Bohm interpretation is less objectionable in Valentini's
version of the theory."
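The equilibrium point in the quoted passage can be made concrete with a toy calculation. The sketch below is not Bohmian dynamics; it simply re-weights the singlet's quantum-equilibrium joint outcome distribution p(A,B|a,b) = (1 - A*B*c)/4, with c the cosine of the angle between the settings, by an assumed bias eps on one wing, and shows that the other wing's marginal then depends on the remote setting, i.e. entanglement signaling, while at eps = 0 (quantum equilibrium) no-signalling holds exactly.

```python
# Toy illustration (not actual Bohmian dynamics): for a spin singlet,
# the joint outcome probabilities in quantum equilibrium are
#   p(A, B | a, b) = (1 - A*B*c) / 4,   c = cos(angle between settings),
# with A, B = +/-1.  We model a non-equilibrium ensemble by re-weighting
# the hidden distribution with an assumed bias eps on the B wing.

def marginal_A(eps, c):
    """Left-wing marginal P(A = +1) under a non-equilibrium bias eps."""
    outcomes = [(A, B) for A in (+1, -1) for B in (+1, -1)]
    weights = {(A, B): (1 - A * B * c) / 4 * (1 + eps * B)
               for A, B in outcomes}
    Z = sum(weights.values())  # total probability (equals 1 here)
    return sum(w for (A, B), w in weights.items() if A == +1) / Z

# In equilibrium (eps = 0) the left marginal is 1/2 for every remote setting:
print(marginal_A(0.0, c=+1.0), marginal_A(0.0, c=-1.0))  # 0.5  0.5

# Out of equilibrium, flipping the *remote* setting shifts the local marginal,
# i.e. an entanglement signal: roughly 0.35 vs 0.65 for eps = 0.3.
print(marginal_A(0.3, c=+1.0), marginal_A(0.3, c=-1.0))
```

In Valentini's picture the eps = 0 case is the special equilibrium distribution that equilibration drives the ensemble toward, which is exactly the fine-tuning the excerpt describes.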
On Mar 8, 2013, at 11:53 AM, JACK SARFATTI <jacksarfatti@gmail.com> wrote:
On Mar 8, 2013, at 11:19 AM, Ruth Elinor Kastner <rkastner@umd.edu> wrote:
Jack, interpretations are generally not Popper falsifiable since they are empirically equivalent with the theory they're interpreting.
In the case of quantum theory, the main different interpretations
1) Copenhagen - epistemic
Asher Peres's as a sub-category?
2) Bohm ontologic
3) Aharonov history-destiny
4) Cramer transactions
5) Hartle consistent histories
6) variations on many-worlds (Tegmark's Level 3)
are degenerate as you say.
However, Antony Valentini has shown how Bohm's theory in particular breaks the above impasse, since it gives entanglement signal nonlocality, violating the no-cloning & no-signaling constraints, for sub-quantum non-equilibrium matter that violates the Born probability rule. This is not even thinkable in some of the above interpretations.
Bohm's theory is a different theory from standard QM to the extent that it has possible empirical non-equivalence (for particle distributions deviating from Psi^2).
right
However there is a possible empirical prediction at the relativistic level for PTI in which there could be deviations from standard QED (which possibly have already been observed). I'm working on that now.
good
RK
________________________________________
From: JACK SARFATTI [sarfatti@pacbell.net]
Sent: Friday, March 08, 2013 2:06 PM
To: Ruth Elinor Kastner
Subject: Re: Causal Discovery Algorithms - Stapp, Kastner, Cramer, Aharonov
The issue is what is the precise operational meaning of your particular distinction between "possibilities" and "actualized transactions"? How can we Popper falsify such a verbal distinction in the "informal language" (Bohm). In contrast, in Bohm's interpretation there is a clear distinction in the formalism between the "thoughtlike" (Stapp) quantum BIT potential Q and the "rocklike" (Stapp) hidden variable classical lepton-quark et-al world lines and electromagnetic-weak-strong classical field configurations.
On Mar 8, 2013, at 10:20 AM, Ruth Elinor Kastner <rkastner@umd.edu> wrote:
Thanks Jack,
My ontology takes spacetime relations as supervenient on causal relations, where the latter are relations among possibilities, and those are time-symmetrically related. The spacetime relations (i.e., sets of events resulting from actualized transactions) are only indeterministically related to the time-symmetric causal relations characterizing the underlying possibilities. So I don't see anything here that refutes anything I'm doing. Of course I welcome anyone's pointing out what I may be overlooking.
Best
Ruth
Now Available: The Transactional Interpretation of Quantum Mechanics, Ruth E. Kastner
http://www.cambridge.org/us/knowledge/discountpromotion/?site_locale=en_US&code=L2TIQM
________________________________________
From: JACK SARFATTI [sarfatti@pacbell.net]
Sent: Thursday, March 07, 2013 7:28 PM
To: art wagner
Subject: Causal Discovery Algorithms - Stapp, Kastner, Cramer, Aharonov
This very important paper will have profound impact on Henry Stapp's and Ruth E. Kastner's models - also Cramer's & Aharonov's. I am curious about their future responses to it.
On Mar 7, 2013, at 11:41 AM, art wagner <wagnerart@hotmail.com<mailto:wagnerart@hotmail.com>> wrote:
Causal Discovery Algorithms - http://xxx.lanl.gov/pdf/1208.4119.pdf
________________________________
Subject: Re: Chinese Physicists Measure Speed of "Spooky Action At a Distance" | MIT Technology Review
From: sarfatti@pacbell.net<mailto:sarfatti@pacbell.net>
Date: Thu, 7 Mar 2013 11:36:59 -0800
To: PhysicsFellows@mail.softcafe.net<mailto:PhysicsFellows@mail.softcafe.net>
" because the “spooky action” cannot be used to send information faster than the speed of light."
Don't be so sure. The Fat Lady has not yet sung on that one. ;-)
The question is whether orthodox quantum theory is complete, or is it a limiting case of a more general theory with pre-sponse entanglement signal nonlocality for living matter?
Advanced Intelligence Agency
(a private sector contractor)
Memorandum for the Record
Subject: Technological Surprise Quantum Cryptography et-al - Threat Assessment Analysis
To: The Usual Suspects ;-)
From: A "Hippy" who Saved Physics ;-)
Cut to the chase, bottom line for intelligence analysts - excerpts
In regards to the computational power of computers with access to time machines, it
is straightforward to see that, for any computation in which an efficient method for checking
the solution exists, a source of random bits sent through a checking algorithm, which then
acts as a c-not on a time machine qubit, can arrive at the correct solution immediately
if the informational content of the solution is sufficiently less than the work function of
the time machine. Since time machine bits may also act as perfectly random sources,
the information may seem to be created from nothing, but one may also think of such
`calculations' as becoming an extremely lucky guesser, due to the re-weighting of histories
by the time machine. ... Conventional cryptography would pose little obstacle to such a
computer. ... public key certification by computer would be almost useless.
Hawking famously has cited the lack of future tourists as good evidence against time machines.
UFO investigators dispute this, regarding Hawking's statement as part of a disinformation coverup, while the investigators themselves are debunked as kooks, cranks, crackpots, and paranoid conspiracy theorists by the opposition, sometimes called "MAJIC", whose front org is the Committee to Investigate Claims of the Paranormal. However, rising above this din of factional wars of rivals inside the intelligence agencies of the major powers and their agents of influence, cut-outs, and useful and useless idiots ;-), it is prudent to assume that whoever is really behind "UFOs" has such literally advanced super-technology at their disposal.
Thermodynamics of Time Machines
Michael Devin
Abstract
In this note, a brief review of the consistent state approach to systems containing closed timelike
curves [CTCs] or similar devices is given, and applied to the well-known thermodynamic problem of
Maxwell's demon. The 'third party paradox' for acausal systems is defined and applied to closed
timelike curve censorship and black hole evaporation. Some traditional arguments for chronology
protection are re-examined ...
Since the original version of this paper in 2001 [1], there has been a renewed interest in
time machine calculations, springing from a duality between acausal systems constrained
by a quantum analog of the Novikov [2] consistency principle, and the formalism of
post-selected ensembles developed by Aharonov and many others [3-5]. Interest has also grown in
the applications of such systems to computation theory, following in the footsteps of Deutsch [6],
who employed a different superselection criterion leading to different physics. ...
In the past ten years the body of work on post-selected ensembles has grown to become the
standard approach to time machines [3]. ... Some material has been added to reflect the more recent developments in black hole physics and post-selected systems.
Suppose we take the time looping bit and entangle it with the position of a particle in
a box. The box is divided by a partition into two unequal sections. In the case of a classic
Szilard engine we measure which side of the box the particle is on and then adiabatically
expand the section with the particle to the size of the original box, performing work. By
Landauer's principle we generate one bit of entropy resetting the measurement apparatus,
which is exactly equal to the maximum work we can extract from the engine by placing the
partition in the center. When the particle is entangled with the time machine qubit, the
probability distribution is no longer uniform and net work can be extracted. ...
As the noise drops to zero for a time machine, the work extractable per bit diverges. A perfectly reliable time machine can therefore violate the second law of thermodynamics by an arbitrarily large amount, but a noisy one has an effective limit. ...
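A minimal numerical sketch of the Szilard/Landauer bookkeeping above, assuming the standard result that resetting a measurement record with bias p costs kT ln 2 * H2(p), where H2 is the binary entropy, while each measured cycle returns at most kT ln 2. The net work per stored bit is then kT ln 2 * (1 - H2(p)): zero for the uniform distribution, positive for the time-machine-skewed one. (On my reading, the per-bit divergence quoted above refers to the time machine supplying an unbounded stream of such low-entropy bits as its noise vanishes; the per-stored-bit figure itself saturates at kT ln 2.)

```python
from math import log2, log

def H2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def net_work_per_bit(p, kT=1.0):
    """Net extractable work per Szilard cycle for position bias p.

    Extraction returns kT*ln2 per measured bit; by Landauer's principle,
    resetting the apparatus costs kT*ln2*H2(p) on average, where p is the
    bias the time-machine qubit imposes on the particle's position.
    """
    return kT * log(2) * (1.0 - H2(p))

print(net_work_per_bit(0.5))  # uniform distribution: net work 0
print(net_work_per_bit(0.9))  # skewed distribution: positive net work
print(net_work_per_bit(1.0))  # noiseless limit: the full kT*ln2 per bit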
In some of the cases considered in general relativity, with back reactions ignored, we find that CTCs and other time machines act like systems that do not need to borrow from anywhere to have energy. The number of accessible states grows exponentially with energy, and with all microstates equally probable, we naturally arrive at a negative temperature. A similar argument may be used for particle number to give negative chemical potential to the occupation numbers of each field mode. If the number of particles
or energy is not somehow bounded then a divergence can result. This is especially the case when we have the internal entropy naturally maximized by eliminating the interaction of the time machine with its environment due to ignoring back reaction.
The appearance of these divergences is often cited as support for Hawking's chronology protection conjecture[7, 8]. It is assumed that the fluctuations must destroy the time machine before anything improper can occur. However, if this is the case, then it provides the very mechanism for making time machines well behaved entities with positive temperature. The higher the energy or occupation number of a particular field mode in a time machine, the more it is suppressed by the re-weighting of histories by the amplitude for such a high energy state to scatter onto the same state. In post-selected language the sample of high energy states acceptable to post selection is small because high energy modes tend to decay, and high particle number states tend to dissipate, with exponential probability. ...
The system is capable of extracting arbitrarily large amounts of work from an entangled system. In general we can imagine that systems with very large values of time machine negentropy will behave quite strangely, as the probability of exotic events could be exponentially amplified. ...
In regards to the computational power of computers with access to time machines, it
is straightforward to see that, for any computation in which an efficient method for checking
the solution exists, a source of random bits sent through a checking algorithm, which then
acts as a c-not on a time machine qubit, can arrive at the correct solution immediately
if the informational content of the solution is sufficiently less than the work function of
the time machine. Since time machine bits may also act as perfectly random sources,
the information may seem to be created from nothing, but one may also think of such
`calculations' as becoming an extremely lucky guesser, due to the re-weighting of histories
by the time machine.
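The "lucky guesser" admits a simple classical stand-in: post-select on histories in which a checking bit reports success. The sketch below uses an assumed, simplified noise model in which a wrong guess survives post-selection with probability eps; this eps is illustrative and is not the paper's work-function bound.

```python
# Illustrative classical stand-in for the "lucky guesser": a uniformly random
# n-bit guess is checked, and the check bit drives a noisy time-machine qubit
# whose consistent histories are post-selected on "correct".

def posterior_correct(n_bits, eps):
    """P(guess is correct | history survives post-selection)."""
    p_right = 2.0 ** (-n_bits)   # prior mass of the one correct guess
    p_wrong = 1.0 - p_right      # prior mass of all wrong guesses
    return p_right / (p_right + eps * p_wrong)

print(posterior_correct(128, 0.0))    # noiseless: 1.0 -- always "lucky"
print(posterior_correct(128, 1e-45))  # tiny noise: still nearly certain
print(posterior_correct(128, 1.0))    # pure noise: back to the 2**-128 prior
```

The re-weighting of histories does all the work here: the surviving ensemble is dominated by the correct guess until the noise floor eps becomes comparable to the prior 2**-n_bits.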
Essentially time machines are entropy pumps, similar to a classical heat engine. Instead of
transporting heat, they transport entropy, pushing a system of particles or a random message
in message space into a lower entropy state, but increasing the entropy of the environment
in some way not yet understood. The computations, like those of a true quantum computer,
are essentially analog computations, in this case effectively physically reversing classical
hard problems in time. Conventional cryptography would pose little obstacle to such a
computer. Instead one would have to devise ambiguous codes, which could be decoded into
a number of intelligible but incorrect messages, leaving the computer with a problem of
interpreting which was significant, a task made hard for entirely different reasons. A `brute
force' entropy attack assisted by a time machine would then more likely generate one of
the red-herring messages. Other unusual protocols might be used to increase security, but
public key certification by computer would be almost useless. ...
Hawking famously has cited the lack of future tourists as good evidence against time
machines. Although no one disputes this, it is an interesting case to consider for the would be
time theorist. One possible explanation for the lack of such `tourist' fields on the microscopic
scale could be something like the quantum zeno effect. The atom is locked in its state and
the cat never dies because we generally have good records of whether or not time travelers
have appeared. For such a traveler, our present would be his past, and such records in that
future of a lack of visitors from the future may act as a similar lock on using tunneling or
entanglement type phenomena as time machines for that purpose. Different possible tourists
may destructively interfere with each other, just as highly non-classical paths for macroscopic
systems do in path integral theory. Consider that the weight of a particular time tourist
scenario is determined by the amplitude for the tourist's state to scatter onto itself at his
later departure. For any large number of bits of the tourist, as those bits decohere with the
environment that weight should decrease exponentially.
A physical example of how one might look for such `tourists' could be realized by exploring
the third party paradox where the receiving channel is measured well before the time machine
exists. The spin measurements of that channel should be random, but if tourism is allowed,
then they may contain a message. If we consider ensembles that may or may not contain a
time machine, it is helpful to note that the weight factor for a particular history is an inner
product of two unit vectors, as well as a noise coefficient. Both of these factors are less than
one, and a sampling from ensembles where the existence of a later time machine depends
on the reception of a message that enables its construction will actually be suppressed
relative to other random possible messages. A statistical `weak censorship' counteracts the
spontaneous emergence of time machines, without absolutely forbidding them. It might
make for an interesting experiment to construct a post-selection equivalent of the tourist
problem, in which selection criteria followed more complex protocols.
In order for tourists to be sufficiently rare, the chronology protection mechanism need
not be absolute. Instead it need only be exponentially difficult for tourists to visit some
location in order for the expectation value of tourists to be finite, and thus hopefully small.
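A toy numerical reading of this "weak censorship" argument. Both ingredients below are illustrative assumptions, not the paper's model: each of a tourist's n decohered bits multiplies the history weight by a survival factor s < 1, and there are at most 2**n distinguishable n-bit tourists.

```python
# Toy reading of "exponentially difficult" chronology protection: each of a
# tourist's n decohered bits contributes an assumed survival factor s < 1,
# against at most 2**n distinguishable n-bit tourists.

def expected_tourists(s, n_max):
    """Sum over tourist sizes of (number of tourists) x (per-tourist weight)."""
    return sum((2.0 * s) ** n for n in range(1, n_max + 1))

# For s < 1/2 the geometric series converges: the expectation value of
# tourists is finite even though tourism is never absolutely forbidden.
print(expected_tourists(0.25, 200))  # converges toward 2s/(1-2s) = 1.0
print(expected_tourists(0.45, 200))  # still finite, just larger
```

The exponential suppression need only beat the exponential count of possible tourists for the expectation value to stay finite, which is exactly the shape of the argument in the passage above.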
VI. CONCLUSION
In conclusion, time machines, if they exist at all, must possess fundamental limits on their
error rate and waste heat, irrespective of the exact method of construction. These limits can
be thought of as analogous to the old Carnot efficiency of classical heat engines independent
of the specific construction of the engine. Most of the standard paradoxes associated with
time travel are mitigated by considering systems operating within these limits. The study
of acausal models still has much room for development. In the case of renormalization,
badly behaved bare models may form condensates, shifting the vacuum and creating a more
well behaved dressed model. Similarly, acausal bare models may lead to better behaved
approximately causal models when various corrections are accounted for. In cosmology and
landscape theory, some physicists have sought a model for the emergence of the Lorentzian
signature of the metric, a spontaneous symmetry breaking that creates time itself. If such
ambitions are ever to succeed they surely have to entertain causality as potentially only
approximate in the resulting cosmos.
Technical Appendix for Physicists may be skipped by non-Physicists
To new students of quantum mechanics, the Bell inequalities, delayed choice, and quantum eraser experiments have seemed to almost violate causality. The fact that they cannot
is a crucial consequence of the unitary nature of quantum mechanics. One of the most
troubling aspects of the information loss paradox is the apparent loss of unitarity. Not all
non-unitary maps are created equal, and trace over models of lossy processes do generally
preserve causality. Such models seemed adequate until Hawking radiation came along. The
eventual disintegration of the hole broke the analogy of environmental decoherence, opening
up the possibility of `bad' nonunitary processes in some imagined acausal lossy theory
of quantum gravity. The aim of the remaining sections is to explore implications of this
possibility.
A quantum eraser is a system that exhibits the extreme nature of the delayed choice
experiment by measuring and then coherently erasing information about two different possible
paths for a system. By the no copy theorem a qubit that is created by measuring another
qubit can only be coherently erased by combining it with the original again. Coherent
erasure makes the erased bit `unrecoverable in principle' and thus restores interference effects
relating to any superposition of the original bit before the creation of the measurement bit.
Two concerns in the information paradox were first, that an evaporated black hole might constitute an `in principle unrecoverable' process, and second that proposed complementarity scenarios would violate the no copy theorem, providing another way to erase measurements.
Both cases lead to breakdown of unitarity and subsequently causality. Complementarity has
to ensure that the second scenario, of a bit meeting its extra twin, cannot occur. This appears to
be the primary motivation for the recent 'firewall' models of black hole evaporation.
The inherent non-unitarity of time machines can easily be seen by observing the effect
that this probability skewing has on entangled particle pairs. Consider instead of a particle
in a box, the classic spin entangled pairs of particles. If we should choose one of the entangled particles to be sent an arbitrary distance away, then use the other as a control bit in our
time machine circuit, then the state of the pair becomes in general a mixed state. If we
designate a second communication channel to act as the control of another c-not gate on
the time machine bit, then we may measure a real correlation between that channel and the
spin measurements of the distant spin partner. A single time machine as a third party in
the mutual future of two observers can apparently effect nonlocal communication between
them. Thus the non-unitary effects of a time machine may be felt arbitrarily far away, even
in the past light cone of the arrival of |out>.
Consider the equivalent case for a post-selected system where a single bit is selected
to be in a state |0> at two different times. In between these two it is acted on by two
controlled not operations, one of an EPR pair, and a second being the desired input bit. The
post-selected statistics of the distant EPR partner will now reflect those of the chosen input
bit. Any time a superselection operator acts on an entangled system of particles to enforce
a particular final state on part of the system, the potential for acausal communications
between two third parties also appears. This `third party paradox' is an important element
in understanding the interaction between time machines and nonunitary dynamics.
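The post-selected circuit described in the previous paragraph can be checked directly with a small statevector simulation: an EPR pair (e', e), an input bit i, and a loop qubit a post-selected in |0> after two CNOTs. In the surviving ensemble the distant partner e' inherits the statistics of i. The qubit labels and the input bias below are illustrative choices.

```python
import numpy as np

# Statevector sketch of the third-party paradox circuit described above.
# Big-endian qubit order: e' (distant EPR partner), e (local EPR half),
# i (input bit), a (time-loop qubit, post-selected in |0>).
N = 4

def kron_all(*vecs):
    out = np.array([1.0 + 0j])
    for v in vecs:
        out = np.kron(out, np.asarray(v, dtype=complex))
    return out

def apply_cnot(state, control, target):
    """CNOT on a 2**N statevector (big-endian qubit indices)."""
    cmask, tmask = 1 << (N - 1 - control), 1 << (N - 1 - target)
    new = state.copy()
    for idx in range(len(state)):
        if idx & cmask:
            new[idx] = state[idx ^ tmask]  # amplitude from target-flipped basis state
    return new

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)  # (|00> + |11>)/sqrt(2) on (e', e)
p1 = 0.8                                    # illustrative input bias P(i=1)
i_state = [np.sqrt(1 - p1), np.sqrt(p1)]
psi = kron_all(bell, i_state, [1, 0])       # loop qubit a starts in |0>

psi = apply_cnot(psi, control=1, target=3)  # CNOT from EPR half e onto a
psi = apply_cnot(psi, control=2, target=3)  # CNOT from input i onto a

def prob_eprime_is_1(state):
    return sum(abs(state[idx]) ** 2 for idx in range(len(state)) if idx & 8)

p_before = prob_eprime_is_1(psi)            # no post-selection: 0.5

psi[np.arange(16) & 1 == 1] = 0             # post-select a = |0>
psi /= np.linalg.norm(psi)
p_after = prob_eprime_is_1(psi)             # post-selected: equals p1 = 0.8

print(p_before, p_after)
```

Without the post-selection the distant marginal is an unbiased coin; with it, the chosen input bit shows up arbitrarily far away, which is the nonlocal communication channel the text describes.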
So far it seems that time machines skew the statistics of ensembles to create effective
nonlinear dynamics. In turn most nonlinear quantum mechanics appears to be exploitable
to create time machines. Explicitly, one time machine can be used to create another, or any
number of others, through the third party paradox. A useful exercise here is to consider
the work done by these `child' machines and how it compares to the work extractable by
the parent alone. Each child `inherits' the noise of its parent, and shares to some degree
the back reaction of its siblings. If the spawning process introduces no additional noise,
then we can shift the arrival time of |out> to an earlier time and find an equivalent system
containing only the parent time loop. This is possible since the duration of the loop is not a
factor in the work function. The maximum work performed by the entire ensemble, minus
any entropy cost for erasing extra measurement bits, should still be less than or equal to
the original work function. ...
Early in the `black hole wars' Hawking tentatively proposed that a theory of density matrices might be considered as a generalization of quantum mechanics capable of handling the apparent lack of unitary evolution in gravitational collapse[9]. This approach was heavily criticized for possible violations of locality or energy conservation[10]. Energy conservation can be maintained, but the trade-off between causality and non-unitarity remains. Any system that can act on a qubit to map orthogonal to non-orthogonal states can be added
to a quantum eraser double interferometer to break the net balance between opposing interference patterns that locally cancel for distant entangled states. It would seem though that if such transitions were possible, then vacuum fluctuations would cause them to occur to any given bit eventually, and thus nonlocal interactions would be everywhere. ...
Hawking and others have contended that all systems containing time machines should
possess entropy in accord with the number of internal states `sent' to the past[11] ...
This scenario is trivially modeled in a post-selection experiment as simply
three measurements of a random bit, in which the first and last measurements are the same
result. ...
... the importance of the relative phase information of out states is crucial to preventing entangled particle pairs from allowing non-local communication. The classic double interferometer fails to detect any local interference effects when observing only one of the photons. The other photon may be in either of two states, and that bit contains either the path information of its cousin, eliminating the interference, or the two outcomes
contribute to two separate interference patterns exactly out of phase, such that the trace over those gives no local interference.
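The cancellation described here is easy to exhibit numerically. In the sketch below (illustrative amplitudes for a 50/50 beamsplitter), the two heralded fringe patterns at one detector are exactly out of phase, and their sum, the only thing a local observer sees without the distant measurement record, is phase-independent.

```python
import numpy as np

# Double-interferometer bookkeeping: photon 1 takes a superposition of two
# paths with relative phase phi; photon 2 carries one bit of which-path
# information (|H> for path 1, |V> for path 2).  Heralding photon 2 in the
# erasure basis |+/-> = (|H> +/- |V>)/sqrt(2) yields two out-of-phase
# conditional fringe patterns whose trace shows no local interference.

def patterns(phi):
    """Return P(D0, +), P(D0, -) for detector D0 at phase phi."""
    amp_plus = (1 + np.exp(1j * phi)) / (2 * np.sqrt(2))   # herald on |+>
    amp_minus = (1 - np.exp(1j * phi)) / (2 * np.sqrt(2))  # herald on |->
    return abs(amp_plus) ** 2, abs(amp_minus) ** 2

phis = np.linspace(0, 2 * np.pi, 201)
p_plus = np.array([patterns(p)[0] for p in phis])   # fringes (1 + cos phi)/4
p_minus = np.array([patterns(p)[1] for p in phis])  # anti-fringes (1 - cos phi)/4
traced = p_plus + p_minus                           # flat at 1/2: no fringes

print(p_plus.max() - p_plus.min())  # ~0.5: strong conditional fringes
print(traced.max() - traced.min())  # ~0: local pattern is phase-independent
```

The local cancellation is exact: any device that broke the balance between the two heralded patterns would turn the pair into a nonlocal signaling channel, which is the point of the passage above.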
(not necessarily using Glauber coherent non-orthogonal entangled states - JS)
Some black holes are thought to contain ready-made time machines in the form of closed
timelike curves. The troubling behavior of the Kerr metric near the singularity was assumed
to be made safe by being behind the horizon, an early and important result supporting
the cosmic censorship hypothesis. However due to the third party effect, it would appear
that not only does the horizon fail to prevent information from leaving the CTC region,
it leads to non-local communication between points far from the hole. These secondary time
machines can then effectively 'mine' the black hole for negentropy. Some fraction of the
entropy associated with the irreducible mass of the hole should then provide a bound on
this entropy, and therefore some constraints on k, for the CTC region. For the purposes of
chronology protection, horizons alone are ineffective 'transparent censors'. ...
One proposal in resolution to the black hole information paradox is to add a boundary
condition to the singularity[12]. Some critics argue this violates causality[13]. The
argument against it can be illustrated with the following paradox. Under normal circumstances,
information, such as a volume of Shakespeare, falls into a black hole, which then evaporates
via Hawking radiation. If a boundary condition at the singularity is prescribed, then these
fields must be canceled by other contributions as they approach the singularity. These other
contributions are the in-falling components of the pairs of particles appearing from the
vacuum, the outgoing of which constitute the Hawking radiation. Since each pair is strongly
entangled, and the in-falling radiation is forced to match up with the in-falling Shakespeare
volume via the superselection of global field configurations to fit the boundary condition,
then the outgoing radiation must be simply the scrambled volume of Shakespeare. Another
way of considering it is to imagine the field modes reflect off of the singularity, becoming
negative energy time reversed modes. They then travel out of the hole and reflect off the
potential outside the main black hole, becoming again positive energy, forward modes.
The boundary condition acts as a selector of global field configurations, much like the
post-selection operator used to model acausal ensembles. The proposed mechanism `similar
to state teleportation' is in fact the third party paradox communication channel arising
in both time machine and post selected systems. We may employ the same methods of
superselection to generate a time machine via the third party problem. The picture is
complicated slightly though by the presence of the incoming part of the Hawking pairs.
This incoming part may serve as the required noise that bounds the total work extractable
by all third party time machines. If no time machines are spawned this way, the work is
expended adjusting the outgoing radiation into the form of Shakespeare. One flaw in this method of teleportation is also that there is nothing to require that the teleported states leave the black hole before the original states enter it. ...