Advanced Intelligence Agency

(a private sector contractor)

Memorandum for the Record

Subject: Technological Surprise: Quantum Cryptography et al. - Threat Assessment Analysis

To: The Usual Suspects ;-)

From: A "Hippy" who Saved Physics ;-)

Cutting to the chase - the bottom line for intelligence analysts (excerpts):

In regards to the computational power of computers with access to time machines, it is straightforward to see that any computation for which an efficient method of checking the solution exists can arrive at the correct solution immediately: a source of random bits is sent through the checking algorithm, which then acts as a CNOT on a time machine qubit, and the correct solution emerges provided the informational content of the solution is sufficiently less than the work function of the time machine. Since time machine bits may also act as perfectly random sources, the information may seem to be created from nothing, but one may also think of such `calculations' as becoming an extremely lucky guesser, due to the re-weighting of histories by the time machine. ... Conventional cryptography would pose little obstacle to such a computer. ... public key certification by computer would be almost useless.

Hawking famously has cited the lack of future tourists as good evidence against time machines.

UFO investigators dispute this, treating Hawking's statement as part of a disinformation coverup; those investigators are in turn debunked as kooks, cranks, crackpots, and paranoid conspiracy theorists by an opposition sometimes called "MAJIC", whose front organization is the Committee to Investigate Claims of the Paranormal. Rising above this din of factional wars among rivals inside the intelligence agencies of the major powers and their agents of influence, cut-outs, and useful and useless idiots ;-), it is prudent to assume that whoever is really behind "UFOs" has such literally advanced super-technology at their disposal.

Thermodynamics of Time Machines

Michael Devin

Abstract

In this note, a brief review of the consistent state approach to systems containing closed timelike curves (CTCs) or similar devices is given and applied to the well known thermodynamic problem of Maxwell's demon. The `third party paradox' for acausal systems is defined and applied to closed timelike curve censorship and black hole evaporation. Some traditional arguments for chronology protection are re-examined ...

Since the original version of this paper in 2001[1], there has been a renewed interest in time machine calculations, springing from a duality between acausal systems constrained by a quantum analog of the Novikov[2] consistency principle and the formalism of post-selected ensembles developed by Aharonov and many others[3-5]. Interest has also grown in the applications of such systems to computation theory, following the footsteps of Deutsch[6], who employed a different superselection criterion leading to different physics. ...

In the past ten years the body of work on post-selected ensembles has grown to become the standard treatment of time machines[3], .... Some material has been added to reflect the more recent developments in black hole physics and post-selected systems.

Suppose we take the time looping bit and entangle it with the position of a particle in a box. The box is divided by a partition into two unequal sections. In the case of a classic Szilard engine we measure which side of the box the particle is on and then adiabatically expand the section with the particle to the size of the original box, performing work. By Landauer's principle we generate one bit of entropy resetting the measurement apparatus, which is exactly equal to the maximum work we can extract from the engine by placing the partition in the center. When the particle is entangled with the time machine qubit, the probability distribution is no longer uniform and net work can be extracted. ...
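The bookkeeping in this cycle can be made concrete with a short numerical sketch. This is not from the paper: the partition fraction `f`, the skew probability `p`, and the use of the binary-entropy form of Landauer's bound are illustrative assumptions.

```python
import math

def net_work(p, f, kT=1.0):
    """Expected net work per Szilard cycle, in units of kT.

    p: probability the particle is found in the section of volume
       fraction f (a time-machine-skewed bit allows p != f).
    f: volume fraction of that section.
    Extraction: isothermal expansion of the occupied section back to
    the full box. Cost: Landauer erasure of the measurement record,
    kT * ln 2 times the binary entropy of that record.
    """
    extracted = kT * (p * math.log(1 / f) + (1 - p) * math.log(1 / (1 - f)))
    h2 = 0.0 if p in (0.0, 1.0) else -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
    erasure = kT * math.log(2) * h2
    return extracted - erasure

# Unskewed engine, central partition: extraction and erasure cancel.
print(net_work(0.5, 0.5))                     # ~0.0
# Skewed bit aimed at a small section: net work grows as noise drops.
print(net_work(0.9, 0.1), net_work(0.99, 0.01))
```

In this toy model the divergence mentioned below appears directly: as p approaches 1 and f approaches 0 together, the extractable work per cycle grows without bound, while any residual noise caps it.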

As the noise drops to zero for a time machine, the work extractable per bit diverges. A perfectly reliable time machine can therefore violate the second law of thermodynamics by an arbitrarily large amount, but a noisy one has an effective limit. ...

In some of the cases considered in general relativity, with back reactions ignored, we find that CTCs and other time machines act like systems that do not need to borrow from anywhere to have energy. The number of accessible states grows exponentially with energy, and with all microstates equally probable, we naturally arrive at a negative temperature. A similar argument may be used for particle number to give negative chemical potential to the occupation numbers of each field mode. If the number of particles or energy is not somehow bounded then a divergence can result. This is especially the case when the internal entropy is naturally maximized by eliminating the interaction of the time machine with its environment due to ignoring back reaction.

The appearance of these divergences is often cited as support for Hawking's chronology protection conjecture[7, 8]. It is assumed that the fluctuations must destroy the time machine before anything improper can occur. However, if this is the case, then it provides the very mechanism for making time machines well behaved entities with positive temperature. The higher the energy or occupation number of a particular field mode in a time machine, the more it is suppressed by the re-weighting of histories by the amplitude for such a high energy state to scatter onto the same state. In post-selected language the sample of high energy states acceptable to post selection is small because high energy modes tend to decay, and high particle number states tend to dissipate, with exponential probability. ...

The system is capable of extracting arbitrarily large amounts of work from an entangled system. In general we can imagine that systems with very large values of time machine negentropy will behave quite strangely, as the probability of exotic events could be exponentially amplified. ...

In regards to the computational power of computers with access to time machines, it is straightforward to see that any computation for which an efficient method of checking the solution exists can arrive at the correct solution immediately: a source of random bits is sent through the checking algorithm, which then acts as a CNOT on a time machine qubit, and the correct solution emerges provided the informational content of the solution is sufficiently less than the work function of the time machine. Since time machine bits may also act as perfectly random sources, the information may seem to be created from nothing, but one may also think of such `calculations' as becoming an extremely lucky guesser, due to the re-weighting of histories by the time machine.
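The `lucky guesser' picture can be emulated classically by treating the consistency condition as post-selection: draw random candidate bits, run the checker, and keep only histories in which the time-machine qubit (the checker's output, written by the CNOT) comes out consistent. A minimal sketch, with the checker and search space chosen purely for illustration:

```python
import random

def postselected_search(checker, n_bits, rng):
    """Emulate the CTC circuit by rejection sampling: each draw is one
    candidate history; the consistency condition discards every history
    whose checker bit fails, so the surviving 'first guess' is correct."""
    while True:
        guess = rng.getrandbits(n_bits)
        if checker(guess):
            return guess

rng = random.Random(0)
secret = 0b1011001110001101                       # 16-bit toy 'solution'
found = postselected_search(lambda x: x == secret, 16, rng)
print(found == secret)                            # True
```

The rejection loop makes the exponential cost explicit; the re-weighting of histories hides that cost inside the ensemble, which is why the bound here is the work function rather than computation time.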

Essentially time machines are entropy pumps, similar to a classical heat engine. Instead of transporting heat, they transport entropy, pushing a system of particles or a random message in message space into a lower entropy state, but increasing the entropy of the environment in some way not yet understood. The computations, like those of a true quantum computer, are essentially analog computations, in this case effectively physically reversing classically hard problems in time. Conventional cryptography would pose little obstacle to such a computer. Instead one would have to devise ambiguous codes, which could be decoded into a number of intelligible but incorrect messages, leaving the computer with the problem of interpreting which was significant, a task made hard for entirely different reasons. A `brute force' entropy attack assisted by a time machine would then more likely generate one of the red-herring messages. Other unusual protocols might be used to increase security, but public key certification by computer would be almost useless. ...
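The `ambiguous code' defense is easiest to see with a one-time pad, where every same-length plaintext is a possible decryption of a given ciphertext. The messages and keys below are invented for illustration:

```python
import os

def xor_bytes(a, b):
    """Bytewise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

msg_real  = b"ATTACK AT DAWN"
msg_decoy = b"FALL BACK NOW!"           # same length as the real message

key_real   = os.urandom(len(msg_real))
ciphertext = xor_bytes(msg_real, key_real)

# A key always exists under which the same ciphertext reads as the decoy:
key_decoy = xor_bytes(ciphertext, msg_decoy)

print(xor_bytes(ciphertext, key_real))    # b'ATTACK AT DAWN'
print(xor_bytes(ciphertext, key_decoy))   # b'FALL BACK NOW!'
```

An entropy attack that post-selects on "intelligible output" has no way to prefer the real message over a planted red herring; the ambiguity, not the key length, is what resists the time machine.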

Hawking famously has cited the lack of future tourists as good evidence against time machines. Although no one disputes this, it is an interesting case to consider for the would-be time theorist. One possible explanation for the lack of such `tourist' fields on the microscopic scale could be something like the quantum Zeno effect. The atom is locked in its state and the cat never dies because we generally have good records of whether or not time travelers have appeared. For such a traveler, our present would be his past, and such records in that future of a lack of visitors from the future may act as a similar lock on using tunneling or entanglement type phenomena as time machines for that purpose. Different possible tourists may destructively interfere with each other, just as highly non-classical paths for macroscopic systems do in path integral theory. Consider that the weight of a particular time tourist scenario is determined by the amplitude for the tourist's state to scatter onto itself at his later departure. For any large number of bits of the tourist, as those bits decohere with the environment, that weight should decrease exponentially.

A physical example of how one might look for such `tourists' could be realized by exploring the third party paradox where the receiving channel is measured well before the time machine exists. The spin measurements of that channel should be random, but if tourism is allowed, then they may contain a message. If we consider ensembles that may or may not contain a time machine, it is helpful to note that the weight factor for a particular history is an inner product of two unit vectors, as well as a noise coefficient. Both of these factors are less than one, and a sampling from ensembles where the existence of a later time machine depends on the reception of a message that enables its construction will actually be suppressed relative to other random possible messages. A statistical `weak censorship' counteracts the spontaneous emergence of time machines, without absolutely forbidding them. It might make for an interesting experiment to construct a post-selection equivalent of the tourist problem, in which selection criteria followed more complex protocols.

In order for tourists to be sufficiently rare, the chronology protection mechanism need not be absolute. Instead it need only be exponentially difficult for tourists to visit some location in order for the expectation value of tourists to be finite, and thus hopefully small.

VI. CONCLUSION

Time machines, if they exist at all, must possess fundamental limits on their error rate and waste heat, irrespective of the exact method of construction. These limits can be thought of as analogous to the Carnot efficiency of classical heat engines, which is independent of the specific construction of the engine. Most of the standard paradoxes associated with time travel are mitigated by considering systems operating within these limits. The study of acausal models still has much room for development. In the case of renormalization, badly behaved bare models may form condensates, shifting the vacuum and creating a better behaved dressed model. Similarly, acausal bare models may lead to better behaved, approximately causal models when various corrections are accounted for. In cosmology and landscape theory, some physicists have sought a model for the emergence of the Lorentzian signature of the metric, a spontaneous symmetry breaking that creates time itself. If such ambitions are ever to succeed, they surely have to entertain causality as potentially only approximate in the resulting cosmos.

Technical Appendix for Physicists (may be skipped by non-physicists)

To new students of quantum mechanics, the Bell inequalities, delayed choice, and quantum eraser experiments have seemed to almost violate causality. The fact that they cannot is a crucial consequence of the unitary nature of quantum mechanics. One of the most troubling aspects of the information loss paradox is the apparent loss of unitarity. Not all non-unitary maps are created equal, and trace-over models of lossy processes do generally preserve causality. Such models seemed adequate until Hawking radiation came along. The eventual disintegration of the hole broke the analogy of environmental decoherence, opening up the possibility of `bad' nonunitary processes in some imagined acausal lossy theory of quantum gravity. The aim of the remaining sections is to explore implications of this possibility.
A quantum eraser is a system that exhibits the extreme nature of the delayed choice experiment by measuring and then coherently erasing information about two different possible paths for a system. By the no copy theorem, a qubit that is created by measuring another qubit can only be coherently erased by combining it with the original again. Coherent erasure makes the erased bit `unrecoverable in principle' and thus restores interference effects relating to any superposition of the original bit before the creation of the measurement bit.
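The two-CNOT structure of coherent erasure can be checked in a few lines of linear algebra. This is a generic textbook calculation, not the paper's; the qubit ordering and state choices are assumptions of this sketch.

```python
import numpy as np

# CNOT with the system qubit (first) as control, ancilla (second) as target.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def system_coherence(state):
    """|off-diagonal| element of the system's reduced density matrix,
    obtained by tracing out the ancilla."""
    psi = state.reshape(2, 2)            # psi[system, ancilla]
    rho = psi @ psi.conj().T
    return abs(rho[0, 1])

plus = np.array([1.0, 1.0]) / np.sqrt(2)       # system in |+>
state = np.kron(plus, np.array([1.0, 0.0]))    # ancilla in |0>

measured = CNOT @ state      # ancilla records the path bit: coherence lost
erased   = CNOT @ measured   # recombine with the original: coherence back

print(system_coherence(state), system_coherence(measured), system_coherence(erased))
# 0.5 0.0 0.5
```

The middle step leaves the system completely mixed, exactly the trace-over situation described above; only recombining the record with the original bit, as the no copy theorem demands, restores the interference.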

Two concerns in the information paradox were, first, that an evaporated black hole might constitute an `in principle unrecoverable' process, and second, that proposed complementarity scenarios would violate the no copy theorem, providing another way to erase measurements. Both cases lead to a breakdown of unitarity and subsequently causality. Complementarity has to ensure that the second scenario, of a bit meeting its extra twin, can not occur. This appears to be the primary motivation for the recent `firewall' models of black hole evaporation.

The inherent non-unitarity of time machines can easily be seen by observing the effect that this probability skewing has on entangled particle pairs. Consider, instead of a particle in a box, the classic spin entangled pairs of particles. If we choose one of the entangled particles to be sent an arbitrary distance away, then use the other as a control bit in our time machine circuit, the state of the pair becomes in general a mixed state. If we designate a second communication channel to act as the control of another CNOT gate on the time machine bit, then we may measure a real correlation between that channel and the spin measurements of the distant spin partner. A single time machine as a third party in the mutual future of two observers can apparently effect nonlocal communication between them. Thus the non-unitary effects of a time machine may be felt arbitrarily far away, even in the past light cone of the arrival of |out>.

Consider the equivalent case for a post-selected system where a single bit is selected to be in a state |0> at two different times. In between these two it is acted on by two controlled not operations, one controlled by a member of an EPR pair, and a second by the desired input bit. The post-selected statistics of the distant EPR partner will now reflect those of the chosen input bit. Any time a superselection operator acts on an entangled system of particles to enforce a particular final state on part of the system, the potential for acausal communications between two third parties also appears. This `third party paradox' is an important element in understanding the interaction between time machines and nonunitary dynamics.
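The post-selected version of the third party paradox is small enough to simulate exactly. The qubit ordering and the use of a classical input bit are modeling choices of this sketch, not the paper's:

```python
import numpy as np

def distant_partner_P1(input_bit):
    """P(distant EPR partner B measures 1) after post-selecting the
    'time machine' bit T to |0>. Qubit index order is (B, A, T):
    B = distant partner, A = local EPR member, T starts in |0>."""
    psi = np.zeros((2, 2, 2))
    psi[0, 0, 0] = psi[1, 1, 0] = 1 / np.sqrt(2)     # EPR on (B, A), T = |0>
    psi[:, 1, :] = psi[:, 1, ::-1].copy()            # CNOT: control A, target T
    if input_bit:
        psi = psi[:, :, ::-1]                        # CNOT with classical input
    amp = psi[:, :, 0]                               # post-select T = |0>
    return (amp[1, :] ** 2).sum() / (amp ** 2).sum()

print(distant_partner_P1(0), distant_partner_P1(1))  # 0.0 1.0
```

With post-selection the distant statistics copy the input bit perfectly; without it (summing over both values of T) the partner is unbiased regardless of the input, which is the usual no-signaling result that the superselection breaks.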

So far it seems that time machines skew the statistics of ensembles to create effective nonlinear dynamics. In turn, most nonlinear quantum mechanics appears to be exploitable to create time machines. Explicitly, one time machine can be used to create another, or any number of others, through the third party paradox. A useful exercise here is to consider the work done by these `child' machines and how it compares to the work extractable by the parent alone. Each child `inherits' the noise of its parent, and shares to some degree the back reaction of its siblings. If the spawning process introduces no additional noise, then we can shift the arrival time of |out> to an earlier time and find an equivalent system containing only the parent time loop. This is possible since the duration of the loop is not a factor in the work function. The maximum work performed by the entire ensemble, minus any entropy cost for erasing extra measurement bits, should still be less than or equal to the original work function. ...

Early in the `black hole wars' Hawking tentatively proposed that a theory of density matrices might be considered as a generalization of quantum mechanics capable of handling the apparent lack of unitary evolution in gravitational collapse[9]. This approach was heavily criticized for possible violations of locality or energy conservation[10]. Energy conservation can be maintained, but the trade-off between causality and non-unitarity remains. Any system that can act on a qubit to map orthogonal states to non-orthogonal states can be added to a quantum eraser double interferometer to break the net balance between opposing interference patterns that locally cancel for distant entangled states. It would seem, though, that if such transitions were possible, then vacuum fluctuations would cause them to occur to any given bit eventually, and thus nonlocal interactions would be everywhere. ...

Hawking and others have contended that all systems containing time machines should possess entropy in accord with the number of internal states `sent' to the past[11] ... This scenario is trivially modeled in a post-selection experiment as simply three measurements of a random bit, in which the first and last measurements give the same result. ...
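That three-measurement model is simple enough to run as a Monte Carlo. Treating the three measurements as independent fair bits and post-selecting on the first and last agreeing is one reading of the setup, assumed here for illustration:

```python
import random

def three_bit_runs(n, rng):
    """Draw n triples of fair bits and keep those in which the first and
    last agree (the loop 'closes'); return the kept triples."""
    kept = []
    for _ in range(n):
        a, b, c = rng.getrandbits(1), rng.getrandbits(1), rng.getrandbits(1)
        if a == c:
            kept.append((a, b, c))
    return kept

rng = random.Random(1)
runs = three_bit_runs(100_000, rng)
frac_kept = len(runs) / 100_000                      # ~1/2: one bit of post-selection
frac_mid1 = sum(b for _, b, _ in runs) / len(runs)   # ~1/2: middle bit stays unbiased
print(frac_kept, frac_mid1)
```

The factor-of-two suppression is the one bit of entropy the argument assigns to the internal state `sent' to the past; the middle measurement stays unbiased because nothing conditions on it.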

... the importance of the relative phase information of out states is crucial to preventing entangled particle pairs from allowing non-local communication. The classic double interferometer fails to detect any local interference effects when observing only one of the photons. The other photon may be in either of two states, and that bit either contains the path information of its cousin, eliminating the interference, or the two outcomes contribute to two separate interference patterns exactly out of phase, such that the trace over those gives no local interference.

(not necessarily using Glauber coherent non-orthogonal entangled states - JS)

Some black holes are thought to contain ready-made time machines in the form of closed timelike curves. The troubling behavior of the Kerr metric near the singularity was assumed to be made safe by being behind the horizon, an early and important result supporting the cosmic censorship hypothesis. However, due to the third party effect, it would appear that not only does the horizon fail to prevent information from leaving the CTC region, it leads to non-local communication between points far from the hole. These secondary time machines can then effectively `mine' the black hole for negentropy. Some fraction of the entropy associated with the irreducible mass of the hole should then provide a bound on this entropy, and therefore some constraints on k, for the CTC region. For the purposes of chronology protection, horizons alone are ineffective `transparent censors'. ...

One proposal in resolution to the black hole information paradox is to add a boundary condition to the singularity[12]. Some critics argue this violates causality[13]. The argument against it can be illustrated with the following paradox. Under normal circumstances, information, such as a volume of Shakespeare, falls into a black hole, which then evaporates via Hawking radiation. If a boundary condition at the singularity is prescribed, then these fields must be canceled by other contributions as they approach the singularity. These other contributions are the in-falling components of the pairs of particles appearing from the vacuum, the outgoing of which constitute the Hawking radiation. Since each pair is strongly entangled, and the in-falling radiation is forced to match up with the in-falling Shakespeare volume via the superselection of global field configurations to fit the boundary condition, the outgoing radiation must be simply the scrambled volume of Shakespeare. Another way of considering it is to imagine the field modes reflect off of the singularity, becoming negative energy time reversed modes. They then travel out of the hole and reflect off the potential outside the main black hole, becoming again positive energy, forward modes.

The boundary condition acts as a selector of global field configurations, much like the post-selection operator used to model acausal ensembles. The proposed mechanism `similar to state teleportation' is in fact the third party paradox communication channel arising in both time machine and post selected systems. We may employ the same methods of superselection to generate a time machine via the third party problem. The picture is complicated slightly, though, by the presence of the incoming part of the Hawking pairs. This incoming part may serve as the required noise that bounds the total work extractable by all third party time machines. If no time machines are spawned this way, the work is expended adjusting the outgoing radiation into the form of Shakespeare. One flaw in this method of teleportation is also that there is nothing to require that the teleported states leave the black hole before the original states enter it. ...