
Stardrive

Jul 30

INSTITUTE OF PHYSICS PUBLISHING REPORTS ON PROGRESS IN PHYSICS
Rep. Prog. Phys. 68 (2005) 897–964 doi:10.1088/0034-4885/68/4/R04
The structure of the world from pure numbers
F J Tipler

"I shall show that observing the CMBR through a filter of 290 Å of graphite would yield a 39% greater flux if the CMBR were a SU(2)L gauge field than if the CMBR is an electromagnetic field."

"One might object that there is no consistent quantum gravity theory. On the contrary,
there is a qualitatively unique quantum gravity theory based on the continuum, on the metric of general relativity. In fact, this theory has been in effect independently discovered by Feynman, DeWitt and Weinberg among others, but because this theory has a ‘philosophical problem’, a problem that arises from taking the integers as fundamental rather than the continuum, these great physicists did not realize that they had solved the problem of quantizing gravity. They also did not realize that the correct quantum gravity theory is consistent only if a certain set of boundary conditions are imposed, which I shall describe. Quantum gravity stabilizes the SM, but this stabilization forces the constants of the SM to depend on cosmic time. DeWitt (1964), Salam and Strathdee (1978) and Isham et al (1971) long ago suggested that gravity might eliminate the infinities of quantum field theory. I shall argue that they were correct.


Starting from the indicated boundary conditions, I shall calculate what the initial state of
the universe must be. It is, as Kelvin and Maxwell conjectured at the end of the nineteenth
century, a state of zero entropy. This unique quantum state is consistent with the SM only
if the only field present is the SU(2)L field of the SM. I shall compute the solution to the
Yang–Mills–Einstein equations for this unique state, and show that it naturally yields, via
electroweak tunnelling, more matter than antimatter, and also the correct baryon to photon
ratio η. The baryons thus generated are the source of the perturbations from which all the structure of the universe is generated, and I shall show that the observed scale-free Harrison–Zel'dovich spectrum arises naturally from the generated baryons. The flatness, horizon
and isotropy problems are automatically resolved given the required unique initial state. In
particular, the observed flatness of the universe is a result of the familiar quantum mechanical wave packet spreading.


There remain the dark matter and the dark energy problems. I point out that these problems
have a solution if the initial SU(2)L gauge field managed to avoid thermalization in the early
universe. If it did, then necessarily this field is the cosmic microwave background radiation
(CMBR), and the dark matter would be a manifestation of an interchange of energy between
the SM Higgs field and the CMBR. The dark energy would then be the manifestation of the
residual positive cosmological constant that must exist if the SM is to be consistent with general relativity."

Jul 29
"The Question is what is The Question?" John Archibald Wheeler

Roger Penrose "started by stressing the importance of the second law of thermodynamics that states that disorder or entropy must always increase. For example, at the Big Bang the Universe began in a low entropy state as illustrated by studies of background radiation left over from shortly after that time, the cosmic microwave background. Since then entropy has always increased.

...

According to the new theory, as the Universe expands all the particles of matter are collected into black holes. These destroy the information content of matter and hence reduce entropy. The black holes radiate Hawking radiation and after a very, very, very long time disappear – for instance, for a three-million-solar-mass black hole like the one at the centre of the Milky Way the time is 10 to the power of 84 years. When all matter has disappeared and entropy has been reduced, a new Big Bang can take place. Sir Roger calls this sequence of one universe being formed after the death of the previous one conformal cyclic cosmology."

On the other hand, in my version of the world hologram theory, using Tamara Davis's computer simulation:

The area of our future de Sitter dark energy event horizon saturates in the future and is "zero" at the moment of inflation.
In the retro-causal hologram theory we are advanced 3D images from the future 2D horizon, which according to Seth Lloyd is a computer. Obviously the early universe has small entropy in this picture. The problem of why we age as the universe accelerates to its cosmological constant asymptote is trivial in this model.

Event horizons are Nature's static LNIF detectors. Hawking radiation is easily understood intuitively as energy from the gravity field at the null geodesic horizon of a black hole lifting virtual electron-positron pairs et al. from off-mass-shell to on-mass-shell. We are outside the observer-independent black hole horizon. The Unruh temperature of the real plasma at the event horizon is
T ~ (c^2/rs)(1 - rs/(rs + Lp))^-1/2 ~ (c^2/rs)(rs/Lp)^1/2   for static LNIFs just outside the horizon
However, as r >> rs,
T ---> (c^2/rs)
The Wheeler-Feynman total absorber is dual to the above! We are inside our observer-dependent de Sitter dark energy future event horizon - we are at r = 0 in the static LNIF representation of Einstein's metric tensor field.
Now here is what I suspect I am the only one so far to realize - at least James Woodward does not know it, and until recently I would have agreed with him. Also, I was completely dissatisfied with Sean Carroll's "From Eternity to Here" because its great title, like Paul Davies' "God and the New Physics", was false advertising - it did not deliver on the implicit promise. God was nowhere to be found between the covers of Davies' book, and why we age as the universe accelerates was not explained in Carroll's book. This is a vaporware trend in physics today, started by Brian Greene and Michio Kaku on TV. Lest we forget, last but not least, the application of string theory to financial derivatives on Wall Street - Sting Theory - pardon me for rubbing salt in wounds. In another relevant case, talking about signal nonlocality in extended quantum theory without talking about consciousness and the paranormal is like a loveless marriage. ;-)
The Hare and The Tortoise
The pre-dark energy FRW cosmology had Λ = 0, without any finite future event horizon and without any natural explanation for irreversibility's arrow of time - why the entropy of the early universe was so small, why we age and die as space expands. Indeed, from Wheeler and Feynman to Hoyle and Narlikar we know that retarded causality from past to future does not work when Λ = 0, but it does work for any Λ > 0. The reason is simple: eventually all real on-mass-shell quanta peter out, like the Hare. Virtual quanta are like the Tortoise, or a super Energizer Bunny - they always win out in the end. Therefore, the initial cosmological redshift from the stretching of the de Broglie wavelengths will eventually give way to the blueshifting anti-gravity repulsive field of the virtual off-mass-shell bosons on the large scale doing work on them, resulting in the Planck-scale-limited temperature at the future dS horizon that they are inside of as r ---> Λ^-1/2. The hot plasma at the future horizon is completely opaque to any real quanta getting there. This plasma is self-generated by the real quanta themselves! If no real quanta attempt to cross, that region of space will be quiet.

Intuitively, there is a competition between the Λ = 0 FRW metric cosmological redshift effect and the Λ > 0 de Sitter cosmological blueshift effect. But de Sitter wins the race in the end. FRW is the Hare, de Sitter is the Tortoise. Remember, the energy densities of real quanta keep going down as space expands - not so the virtual zero-point quanta!
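The Hare-and-Tortoise point follows from the textbook scaling laws: real matter dilutes as a^-3 and radiation as a^-4 with the cosmic scale factor a, while the vacuum (Λ) energy density stays constant, so Λ always dominates at late times. A minimal sketch, with illustrative present-day density fractions (0.3 matter, 10^-4 radiation, 0.7 vacuum, in units of today's critical density):

```python
def densities(a, omega_m=0.3, omega_r=1e-4, omega_l=0.7):
    """Energy densities vs cosmic scale factor a (a = 1 today), in units
    of the present critical density: matter dilutes as a^-3, radiation
    as a^-4, vacuum (Lambda) energy stays constant."""
    return omega_m * a ** -3, omega_r * a ** -4, omega_l

for a in (1e-5, 1e-3, 1.0, 10.0, 100.0):
    m, r, l = densities(a)
    winner = max([("matter", m), ("radiation", r), ("vacuum", l)],
                 key=lambda pair: pair[1])[0]
    print(f"a = {a:8g}: matter {m:.3g}  radiation {r:.3g}  vacuum {l:.3g}  -> {winner} dominates")
```

With these fractions the vacuum term already leads at a = 1, i.e. the Tortoise has taken the lead today, and its lead only grows as a increases.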

                                                Rocky Kolb SLAC Summer School
Remember, there is NO finite future event horizon when Λ = 0 in the old FRW standard big bang cosmology prior to the 1999 discovery of dark energy in the anomalously redshifted spectra of Type Ia supernovae.
area of future event horizon ~ Λ^-1 ~ 1/(dark energy density in past light cone)
Also remember that in Wheeler-Feynman/Hoyle-Narlikar a pure FRW cosmology gives the wrong answer for the Arrow of Time - there is no net retarded causality for Λ = 0, only for dS Λ > 0.
We need the dark energy density in order for the 2nd Law of Thermodynamics to work! Sean Carroll does not seem to know this in his book From Eternity to Here.
On Jul 26, 2010, at 1:39 PM, james f woodward wrote:
"Jack, I repeat, a photon emitted, necessarily into the future, gets
cosmologically redshifted as it travels along.  It does not get
blueshifted."
That's what I used to think also. But it's wrong. Sure, the universe is expanding, so the wavelengths stretch - that's the standard argument, and it works for Λ = 0, but it does not work as our universe approaches the de Sitter (dS) metric
gtt ~ 1 - Λr^2   asymptotically in the future
in the static LNIF representation where we are located at r = 0.
There is, of course, a competition between the cosmic redshift stretching of the wavelengths for the Λ = 0 contribution, but eventually the Λ > 0 term wins.
Remember, in Newtonian terms the repulsive gravity force (per unit test mass)
is g = +2c^2Λr    (universal anti-gravity repulsion accelerating the expansion rate of 3D space)
and this does work on the photon, increasing its energy relative to the static LNIF detectors (it actually does work on any real particle).
This is in contrast to a photon leaving a black hole where
g = - c^2rs/2r^2
here r = rs is the event horizon and we are at r ---> infinity
therefore a photon must do work to get out of the gravity well in Newtonian terms.
also a photon falling into the gravity well has work done on it by the gravity field and blue shifts
Note the duality in the effective Newtonian potentials
VdS = -c^2Λr^2   repulsive field toward the inside of the future cosmological dS event horizon that we are inside of!
we at r = 0
therefore work done on retarded test particle falling toward the dark energy dS horizon from the inside.
Vbh = - c^2rs/2r   attractive field toward black hole event horizon - we are outside black hole horizon
we at r ---> infinity
therefore work done on retarded test particle falling toward the black hole event horizon from the outside.
the inside/outside topology difference and the r^2 vs 1/r difference, and the inversion r = 0 <--> r = infinity are a kind of duality here.
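The dual pair of effective potentials above can be evaluated numerically. A hedged sketch: Λ ≈ 1.1 x 10^-52 m^-2 is the approximate observed cosmological constant, rs = 3 km is an illustrative solar-mass Schwarzschild radius, and the signs confirm the repulsive (outward, positive) vs attractive (inward, negative) reading of the two cases:

```python
c = 3.0e8       # speed of light, m/s
L = 1.1e-52     # observed cosmological constant, 1/m^2 (approximate)
rs = 3.0e3      # illustrative Schwarzschild radius (~1 solar mass), m

def V_dS(r):
    # de Sitter effective potential: repulsive, observer at r = 0 inside
    return -c ** 2 * L * r ** 2

def a_dS(r):
    # a = -dV/dr = +2 c^2 Lambda r  (outward, anti-gravity)
    return 2 * c ** 2 * L * r

def V_bh(r):
    # black hole effective potential: attractive, observer at r -> infinity
    return -c ** 2 * rs / (2 * r)

def a_bh(r):
    # a = -dV/dr = -c^2 rs / (2 r^2)  (inward, attractive)
    return -c ** 2 * rs / (2 * r ** 2)

r_H = L ** -0.5                  # dS horizon scale, ~1e26 m
print(a_dS(r_H))                 # ~ +2e-9 m/s^2, repulsive at the horizon
print(V_dS(r_H) / c ** 2)        # ~ -1: weak-field form saturates at the horizon
print(a_bh(2 * rs))              # negative: attractive just outside the hole
```

Note that |VdS|/c^2 reaches 1 exactly at r = Λ^-1/2, so the "Newtonian" form is an order-of-magnitude guide near the horizon rather than a precise metric statement.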
"So, as it approaches the horizon of the source, it cannot
excite e-p pairs out of the vacuum to act as an absorber.  If the
expansion is accelerating, I suppose you might try to argue that the
acceleration at the horizon produces a Unruh-Davies thermal bath of
photons of sufficient energy to excite the e-p pairs out of the vacuum
that would then act as an absorber.  But that's a completely different
scenario than the one you are pushing here.  And to be convincing, it
would have to be fleshed out in much greater detail.  Besides, if there
is an accelerating expansion, you automatically get a cosmic horizon
without invoking the vacuum at all."

On the contrary, to my intuition it's obvious that the future de Sitter horizon is a static LNIF detector, i.e. Kip Thorne's electrical membrane: any real on-mass-shell charged particle, or photon that couples to charged particles, has so much energy relative to it that it will excite real pairs out of the virtual vacuum sea and thus, in effect, be the desired total absorber of last resort, restoring the Arrow of Time (2nd Law of Thermodynamics) and the net past-to-future retarded causality - which is broken only when we have post-quantum signal nonlocality, as in living matter, as Fred Hoyle clearly saw back in the 1980s.

Jul 26

My Greatest Blunder? -- corrected v2

Posted by: JackSarfatti |
Tagged in: Untagged 



Both the past and future particle horizons are infinite redshift surfaces. What does that mean in terms of retarded photons?
We never see a retarded photon emitted from the past particle horizon or behind it because it is infinitely redshifted to zero frequency when it reaches the origin of the past light cone of our detector. For the moment forget Hawking blackbody radiation. The past particle horizon is the future light cone of the Alpha Point of inflation.
Next emit a retarded photon into the sky. That photon will infinitely blue shift along the future light cone of the emitter at its intersection with the future event horizon that is the past light cone of our conformal end time Omega Point (infinite metric clock proper time).
Therefore, if, in the past, I argued that our future horizon was a Wheeler-Feynman total absorber because a retarded signal has zero frequency at it - that is wrong - my greatest blunder. Not sure if I did, but I might have. In fact, our future horizon is the Wheeler-Feynman total absorber because any retarded photon hitting it will form an opaque electron-positron plasma that will absorb it with probability 1 - that is the idea here. This is essentially Lenny Susskind's black hole complementarity argument that static LNIFs at horizons need infinite covariant accelerations, and consequently infinite Unruh temperature, prior to imposing the Planck quantum gravity cutoff.
On Jul 25, 2010, at 3:01 PM, JACK SARFATTI wrote:
On Jul 25, 2010, at 11:53 AM, james f woodward wrote:
"Yes, Nick Herbert.  Recall that he pointed out that a photon approaching the event horizon will propagate through the horizon (as its local speed is still c and assuming that in the vicinity of the horizon spacetime is transparent)." 
I think Nick is mistaken. Kip Thorne has shown that the event horizon is an electrical membrane. We shoot a retarded photon into the sky. When it reaches our future event horizon it is infinitely blue-shifted, exciting real electron-positron pairs out of the vacuum of the horizon - getting totally absorbed. The horizon itself acts as a static LNIF detector with acceleration calculated in the following way.

The de Sitter "Newtonian" potential is  -c^2Λr^2

V/c^2 = - (area of concentric sphere inside our future horizon)/(area of our future horizon)

Newton's force per unit mass is
a = -dV/dr = +2c^2Λr

i.e. in Newtonian terms the repulsive anti-gravity field does work on the photon relative to static LNIF detectors.

We are at r = 0 in this observer-dependent dS metric representation

In Einsteinian terms

gtt = (1 + 2V/c^2) = 1 - 2 (area of concentric sphere inside our future horizon)/(area of our future horizon) ---> 0 at our future horizon

therefore, in the static LNIF representation, the Unruh temperature is

Tunruh ~ a = 2c^2Λr gtt^-1/2 = 2c^2Λr (1 - 2 (area of concentric sphere inside our future horizon)/(area of our future horizon))^-1/2
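To see the divergence claimed here, it is enough to sketch the blueshift factor gtt^-1/2 numerically. Assumption: the code uses the exact static de Sitter form gtt = 1 - (r/r_H)^2, with the horizon at r_H = Λ^-1/2 set to 1 in code units, in place of the weak-field area-ratio expression, since only the gtt -> 0 behaviour at the horizon matters:

```python
r_H = 1.0   # de Sitter horizon radius in code units

def gtt(r):
    # static de Sitter metric coefficient, gtt = 1 - (r/r_H)^2
    # (assumption: exact static form used here instead of the weak-field
    #  area-ratio expression; only the gtt -> 0 limit at the horizon matters)
    return 1.0 - (r / r_H) ** 2

def blueshift_factor(r):
    # gtt^(-1/2): the boost of a static-LNIF Unruh-type temperature
    # T ~ a * gtt^(-1/2); diverges as r -> r_H
    return gtt(r) ** -0.5

for r in (0.5, 0.9, 0.99, 0.9999):
    print(r, blueshift_factor(r))
```

The factor grows without bound as r -> r_H, which is the formal basis for the "infinite Unruh temperature before the Planck cutoff" statement above.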

I think this is the physical basis of Kip Thorne's membrane idea. It's also part of Lenny Susskind's black hole complementarity.
"And if it interacts with something beyond the horizon (as it must in the TI view as emission only occurs when the future absorption event is fixed), the advanced wave will propagate back through the horizon to the source -- unless the horizon is accelerating.  If the multiverse really is infinite so photons can, in principle, propagate infinitely far, then accelerating cosmic expansion is the only way to cut off interactions that produce advanced waves that return to the source from beyond the horizon. If total absorption within the event horizon is the case, then this cutoff issue is not a matter of concern.   But if total absorption is NOT the case, then it is."
On Jul 25, 2010, at 10:56 AM, james f woodward wrote:
"The paper you seek is (with loosely translated title): "A response to the argument directed by EPR against Bohr's interpretation of quantum phenomena," Comptes Rendus, vol. 236, pp. 1632-1634 (1953). As for predicting dark energy in the '50s (or, for that matter, the '60s, '70s, or '80s), the historical situation in observational and theoretical cosmology made that impossible for even very smart people like Sciama to carry that off. Had Hoyle not had such a commitment to steady state cosmology, he might have done it, I guess. By the mid '90s, any cosmologist with a real understanding of action at a distance field theory, and the cutoff issue identified by NH a while back, should have been able to make the prediction."
On Jul 24, 2010, at 10:42 PM, james f woodward wrote:
"Olivier Costa de Beauregard's first paper on the application of W-F absorber theory to QM was published in 1953.  So was Sciama's first paper on Mach's principle -- which leads to an "action at a distance" view."
Seems like Sciama could have predicted dark energy back then. The de Sitter future event horizon as a hologram total absorber is a kind of ultra-strong Mach's principle it seems to me.

Hi Creon

I will include this in Destiny Matrix 2010.

Why do you include Susskind? He has the hologram idea with 't Hooft, but I think they mean the past particle horizon, not the dark energy de Sitter asymptotic future event horizon - if they have even bothered to think about when the hologram is. Perhaps I am mistaken. Reference? See below on my new gedankenexperiment, conceived a few moments ago while increasing brain blood flow on machines at the Club.

Note - the key issue is signal nonlocality - do all the interpretations permit extensions of themselves to include it in more general theories when some of the constraining axioms of orthodox quantum theory are removed?

Gedankenexperiment

Consider both the Aspect experiment across spacelike intervals outside the relevant local light cones, as well as the timelike version of Aharonov and Vaidman et al. inside the relevant sets of light cones with pre- and post-selection (here we simplify and ignore the intermediate measurement - we are not doing weak measurements) - a composite of two timelike EPR correlations.

The quantum correlation of photon polarizations in all of these cases will be essentially sin^2(Theta). But what is Theta?

Theta = theta(Alice) - theta(Bob)

at the events A & B of actual irreversible single-photon (ideal case) detections of an individual pair. (Practically we will use short entangled laser pulses - some changes in the details.)

OK in Aspect experiment A & B are spacelike separated.

In Aharonov type experiment A & B are timelike separated.

Ignore gravity curvature.

Let Alice be an active sender of a real message like the price of Apple stock at a certain moment. Bob is a passive receiver. Since we are only using passion at a distance with signal locality (sub-quantal thermal equilibrium of the nonlocal hidden variables) Alice and Bob locally see random white noise as pairs of photons are emitted back to back to them.

Now let Alice encode the Apple stock price according to a standard protocol in the time series theta(Alice(t)), where t is the time of irreversible detection of a photon by Alice. Similarly theta(Bob(t')), where t' is the time of irreversible detection of the precise twin of Alice's photon. In general t =/= t'; also, the flight paths from the source of pairs to the detectors are different and can be adjusted with delay lines to be either spacelike or timelike at will.

This is a variation on Wheeler's delayed choice experiment.

Suppose t (Alice) > t'(Bob) in the invariant timelike sense.

Also suppose t(Alice) - t'(Bob) = 1 week!  The source emits a pair at t" < t'(Bob).

So on Sept 14 Alice sends a sequence of theta(Alice) that encodes the Apple stock price on Sept 14 - where Bob's twin photons are detected on Sept 7!

True, Alice and Bob locally see random white noise on Sept 14 and Sept 7 respectively if they look at their local outputs on their laptops.

But now on Sept 15 a computer does the correlation analysis and out comes the Sept 14 Apple stock price. The only rational conclusion is that information was transmitted backward in time from Sept 14 to Sept 7 - but that information could not be decoded until after Alice made her free-will choice to encode the Apple stock price showing on her iPhone app.
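The logic of the gedankenexperiment - each local record is pure white noise, yet the after-the-fact joint analysis recovers Alice's message - can be sketched with orthodox quantum statistics (signal locality only, no signal nonlocality). The protocol below is hypothetical: Alice encodes bit 0 as Δθ = 0 and bit 1 as Δθ = 90°, so twin photons agree with probability cos^2(Δθ) while each marginal stays 50/50; the numbers (200 pairs per bit, a 5-bit message) are arbitrary illustration:

```python
import random

def run_pairs(bits, n_per_bit=200, seed=0):
    """Simulate entangled pairs with orthodox quantum statistics: each
    marginal is a fair coin flip, but Bob's outcome equals Alice's with
    probability cos^2(theta_A - theta_B).  Alice encodes each bit by her
    polarizer angle: 0 deg for bit 0, 90 deg for bit 1; Bob stays at 0 deg,
    so P(same) is cos^2(0) = 1 for bit 0 and cos^2(90 deg) = 0 for bit 1."""
    rng = random.Random(seed)
    alice_record, bob_record = [], []
    for bit in bits:
        p_same = 1.0 if bit == 0 else 0.0
        for _ in range(n_per_bit):
            a = rng.choice([0, 1])          # Alice's marginal: pure noise
            b = a if rng.random() < p_same else 1 - a
            alice_record.append(a)
            bob_record.append(b)
    return alice_record, bob_record

def decode(alice_record, bob_record, n_per_bit=200):
    """The 'Sept 15' correlation analysis: only the joint records reveal
    the message; neither record alone carries it."""
    bits = []
    for i in range(0, len(alice_record), n_per_bit):
        same = sum(a == b for a, b in
                   zip(alice_record[i:i + n_per_bit], bob_record[i:i + n_per_bit]))
        bits.append(0 if same > n_per_bit // 2 else 1)
    return bits

message = [1, 0, 1, 1, 0]
a_rec, b_rec = run_pairs(message)
print(decode(a_rec, b_rec))       # recovers [1, 0, 1, 1, 0]
print(sum(b_rec) / len(b_rec))    # Bob alone sees ~0.5: white noise
```

Run alone, either record passes as coin flips; only the decode step, comparing the two records block by block, reveals the bits - which is exactly why, without signal nonlocality, Bob cannot read the message on Sept 7.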

Weird for sure - no alternative rational explanation consistent with free will.

But that's without signal nonlocality.

Russell Targ's report of the CIA SRI precognitive remote viewing of the Chinese nuclear bomb test, for example, is signal nonlocality violating orthodox QM. That would correspond to Bob seeing the Apple stock price of Sept 14 on Sept 7 without needing to do a hindsight correlation analysis!

On Jul 24, 2010, at 4:34 PM, Creon Levit wrote:

"The long Sciama quote at the end of Saul-Paul's post suggests extensions to what may be the only profound and decent idea I've ever had in fundamental physics: the equivalence (or complementarity) of quantum effects in different interpretations of QM.

In the DeBroglie-Bohm interpretation, it is the quantum potential that's responsible for all departures from classical mechanics. In the many-universes interpretation, it is the effects of other universes upon ours which accounts for all nonclassical effects. In the Feynman path integral approach, it is alternative paths. In the Sciama/Good/Susskind/Sarfatti scheme, it is the future boundary conditions. In the Wheeler-Feynman-Cramer picture it is also the future (absorber) boundary condition. In the Bohr-Heisenberg picture, quantum departures from classical causality are "inherent" - i.e. there is no "picture".

So the point is: The future, the quantum potential, the effect of other worlds, and "inherent quantum behavior" are all equivalent. They make equivalent predictions. They produce the equivalent quantum effects. They are complementary tools in the quantum mechanic's tool-crib. While each might prefer one tool or the other under different circumstances, anyone can, in principle, build all of quantum physics with any one of the tools."

 

From end notes of my book Destiny Matrix 2010 2nd Edition - under construction
Astrophysics and Relativity

On Jul 23, 2010, at 11:03 AM, Nick Herbert wrote:

“The notion that knowledge of a future boundary condition could eliminate quantum uncertainty was first put forth by I.J. Good (1916-2009) in "The Scientist Speculates-- an Anthology of Partially-Baked Ideas" published in 1964. As far as I am aware Good has priority in this speculation. …”

 

For the record, I was not consciously influenced by I.J. Good's book about that idea.

I think Saul-Paul showed me that book when I got to San Francisco? Saul-Paul had already cited me in his Berkeley Opera on that idea when I was with Abdus Salam in Trieste, before I met Saul-Paul. Actually, Feynman has the germ of the idea fairly explicit in his Lagrangian QM, in his NR QM paper. I do remember being influenced by Fred Hoyle, who is clear about it in his book The Intelligent Universe. Also remember Greg Benford's sci-fi novel Timescape - I was collaborating with Greg informally on some of this back at UCSD, part of the gang with Herbie Bernstein and Harry Yesian that he fictionalizes in the novel. Of course Hoyle came to La Jolla frequently back then to see the Burbidges and the beaches. Fred Wolf wrote Star Wave in the early '80s, I think influenced by Feynman's papers. I don't recall Fred even knowing about I.J. Good's book, but maybe I am mistaken?

 

It's not obvious reading Aharonov's and Vaidman's papers that they literally mean a real physical retro-causality on the intermediate measurement by post-selection, which operationally seems to be merely a way of doing the after-the-fact statistical analysis (throwing away - tracing over) final states one is not interested in - similar to how the Bell's theorem-violating correlations are extracted from the raw data. Basically post-selection is temporal EPR correlation rather than spatial.

 

My early ideas on future-to-past are already recorded in the 1973 SRI tape of my first meeting with Hal Puthoff and Russell Targ, arranged by Brendan O'Regan during their CIA remote viewing experiments with Uri Geller, Pat Price, Ingo Swann et al. - my idea was implanted in my 1953 "contact", whatever that really was. I never read the Sciama article cited below until Saul-Paul sent it on July 24, 2010. Of course, if we believe in signal nonlocality, then my getting the ideas years ago could be the past effect of the future cause of me reading Saul-Paul's message.

 

On Jul 24, 2010, at 9:47 AM, Saul-Paul Sirag wrote:

Nick & Jack,

 

            Nick is referring to I.J. Good's short article "Two-way Determinism" on pp. 314-315 of "The Scientist Speculates" (Basic Books, 1962). Here it is:

------------

            'Backward time isn't such a new thing, backward time will start long ago.' --Doog (after a popular song).

           

            G.N. Lewis* outlined a theory of light in which the present is determined as much by the future as the past.  Popper,** contradicting a familiar interpretation of Heisenberg's uncertainty principle, claimed that the position and momentum of a particle can both be determined with arbitrarily high accuracy at a single moment of time, provided one has accurately observed both its earlier position and its later momentum at two specified moments. It is natural then to raise the following question.

           

            Given a connected bounded piece of space-time, are all the elementary subatomic events within it that are classically describable (i.e. without explicit reference to quantum mechanics) fully determined by all the classically describable events outside of it?  Or, if not, is there any neat way of describing how much indeterminacy is left?  Can these questions be answered in terms of existing quantum mechanics, and do they raise interesting new mathematical problems?

 

            If the answer to the first question is yes, then we could say that we have two-way determinism, since the present would be mathematically determined jointly by the past and future, however remote.  Note however that two-way determinism is a special case of what is usually called 'indeterminism', since the past alone would not determine the present.  This merely shows that language does not always behave very well.

 

            If two-way determinism is true it raises another, more philosophical, question, namely whether we should then say that future events are contributory causes of present ones.***

 

            * Lewis, G.N.: Nature, volume 117, pages 236-8, 1926.

            ** Popper, K.R.: The Logic of Scientific Discovery: page 231, 1959. See also Sir Arthur Eddington, The Nature of the Physical World, London, 1928, chapter 14.

            *** Compare pbis Nos. 104 (Computers, Causality, and the Direction of Time), 45 (Speculations Concerning Precognition), and 59 (Precognition and Reversed Causality). 

                        -------------------------------------[end of quote from I.J. Good]

 

BTW: I gave a copy of this book to Andrija Puharich in the spring of 1973, when I was working on a story about Uri Geller for Esquire magazine (which was never published).  In that story I also mentioned the idea that we are being influenced by us in the future. (Also both Uri and Andrija had discussed this idea).

 

            As I have mentioned before Dennis Sciama discussed "two-way determinism" in the book "Determinism and Freedom" (edited by Sidney Hook, New York University, 1958; Collier Books edition, 1961).  This is also short so I (again) will type it.

            --------------------

            Determinism and the Cosmos

            Dennis W.  Sciama, Trinity College, Cambridge

 

“As a physicist I have found the following working hypothesis very useful: violent controversy about a scientific problem is a sign that some simple essential consideration is missing. The polemic, as it were, tries to substitute for the missing point, but of course it never can.  I think for instance that this has been so in discussions of Mach's principle of the origin of inertia, and also of the problem of deducing irreversible macroscopic behavior from reversible microscopic laws.

 

            Bridgman has reminded us that the physicists are conducting violent controversy about the meaning of quantum mechanics.  This situation is in striking contrast to that prevailing in classical mechanics; for although classical mechanics is known to be false, there is no dispute as to its meaning. It is only in quantum mechanics (which is known to be true!) that there is such a dispute….

 

            The basic way in which quantum mechanics differs from classical mechanics is the following: our inferences about the future must be expressed in terms of probabilities.  This introduction of probability would enable us to make the calculation.

 

            With this state of affairs in mind, let us make a new assumption.  Let us suppose that in nature systems are deterministic in the sense that we can calculate the state of a system at time t if we know enough boundary conditions referring to times other than t; but let us differ from classical mechanics by supposing that nature is so constructed that roughly speaking, half the boundary conditions must refer to the past and half to the future of the moment t.  In other words, we assume that nature is such that "mixed" boundary conditions are always needed.

 

            Presumably a system with such properties would be called deterministic.  This is a matter of definition, of course; what is really important is that the behavior of the system is as well defined and intelligible as that of a system obeying classical mechanics.  But now we must ask: How would a "mixed" system appear to an observer who himself is part of the system?

 

            Now, such an observer, for reasons that cannot be elaborated here but that have to do with the second law of thermodynamics, is acquainted only with the past.  Hence if he attempts to calculate the state of a system at a time t in his future, he will find that he cannot do so, for he does not know all the boundary conditions.  His knowledge of the past boundary will delimit the possibilities considerably, but it is clear that to the observer the system will appear to contain indeterminate elements.

 

            What sort of a theory will such an observer devise?  In effect he will be forced to average over all those future boundary conditions that are compatible with his present knowledge.  (Of course, at first he will not realize that this is what he is doing.)  That is to say, he will be forced to introduce a probability calculus to account for his observations.  The suggestion is that this probability calculus is just quantum mechanics.

 

            In this way the correctness of quantum mechanics can be reconciled with a deterministic universe.  In the language of von Neumann, there are hidden variables; they escape his ban because they refer to the future.”

 

           

 

---------------------------[end of quote from Dennis Sciama]

 

All for now;-)

 

Saul-Paul

-------------------------

Thanks Waldyr, I will cite you on this in my book Destiny Matrix 2010 - we did not know this. Yes, I think I already have Helmut Schmidt & Henry Stapp cited in old version Destiny Matrix 2002 - thanks for reminding me of Raju who I did not know about in 2002.

 

On Jul 24, 2010, at 12:09 PM, Professor Waldyr A. Rodrigues Jr. UNICAMP, Brazil wrote:

 

“Dear Jack and Fred,

1)      The famous Italian mathematician Professor Luigi Fantappié wrote a series of articles in the '50s where he explicitly proposed influence from the future.

One of his students, Professor Giuseppe Arcidiacono (now deceased, whom I knew personally when I was a visiting professor at Perugia University), wrote a very interesting book called ‘’Fantappié e gli Universi’’ (Il Fuoco editora, Roma, 1986), where Fantappié's ideas are described in detail and where references to Fantappié's papers can be found. Fantappié's ideas are very similar to your ideas.

 

Maybe it is a good idea for Jack to order and read the book (you will recall the Italian of your grandfathers). If you do not succeed in buying the book, I will ask one of my students to scan it for you.”

 

It's a good idea. Perhaps you can write something we can include, or do it in a review of the 2nd edition when it comes out - you can put it on Amazon. I am under time pressure to get this 2nd revised edition onto the market. I think I already cite Costa de Beauregard, along with Stalin's alleged spymaster Yakov Terletskii, whose tachyon book we had at SDSU.

 

“2)      Also, I mention here Professor Olivier Costa de Beauregard (a famous French physicist, now deceased). He was my guest at UNICAMP in 1974 (!). We discussed a lot about retrocausality at that time. In particular, I recall that he wrote a paper called “Einstein-Podolsky-Rosen Paradox non-separability and Feynman non-locality”, Phys. Letters A 60(2), 93-96, 1986, where he claimed that retrocausality coming from QM gives paranormal phenomena a right to real existence.”

 

For that we need signal nonlocality violating quantum theory, as Brian Josephson and I independently suggested, and as Antony Valentini formalized in the Bohm interpretation. Henry Stapp already formalized it in the Copenhagen collapse interpretation, for which he was figuratively burned at the stake by the same people who have attacked Brian Josephson for decades now, like in Les Misérables.

 

“At that time, when I got interested in the paranormal, I put Beauregard in contact with several extraordinary paranormal people in Brazil, and he was very much impressed.

 

3)      Of course, you cannot forget also Helmut Schmidt (whom you met in São Paulo), who has been talking about retrocausal influences since 1969 (see: http://www.parapsych.org/members/h_schmidt.html).

 

4)      Finally, last month I met in Pécs and Budapest the mathematician Professor C. K. Raju (see his website at: http://ckraju.net/).

He wrote some books and articles arguing for retrocausation. His ideas are very similar to Fantappié’s and yours. When I told him about this fact, he said that he had never read Fantappié or your papers. So, eventually one can conclude that these are ideas coming from the very far future, arriving free for anyone who pays attention to them…”

 

Yes, Fred Hoyle suggests that explicitly.

 

Best regards,

Waldyr

Quotes below are excerpted from:

 

Astrophysics and Relativity

Preprint Series No. 70

THE UNIVERSE: PAST AND PRESENT REFLECTIONS

FRED HOYLE

May 1981

Department of

Applied Mathematics and Astronomy

 

"Many will smile if I say that such an incident was triggered by the deciphering of a cosmic signal. It will be agreed that a sudden reordering of substantial blocks of information in the brain must have been involved, but it will be said that the initiating signal happened by chance, from a random firing of neurons. ...

 

The alternate view is that the deaf Beethoven, decisively cut-off from the distractions of the world of men, equipped as a terminal with unusual backing storage, was able to receive a particular component of the cosmic signals, and with sharply increasing clarity as the years passed by. This view would be my choice, but each of us must listen and decide. Perhaps the decision turns on whether we ourselves hear the thunder of Zeus on Mt. Olympus."

 

Hoyle died in 2001 before the meaning of dark energy was understood.

 

dark energy density ~ (area/entropy of our future horizon hologram)^-1
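The proportionality above can be checked numerically. The following is my own rough order-of-magnitude sketch, not part of the original post: taking an assumed future de Sitter event-horizon radius of roughly 16 billion light years, the Friedmann-style estimate rho ~ 3c^2/(8 pi G R^2) lands on the observed dark-energy density, and the cosmological constant in Planck units comes out proportional to the inverse horizon area in Planck units (the exact prefactor here is 12 pi by construction).

```python
import math

# Hedged order-of-magnitude sketch (not from the post): check that the
# observed dark-energy density matches ~ 1/(future horizon area in Planck units).
# R = 16 Gly is an assumed round number for our future de Sitter event horizon.

c    = 2.998e8       # speed of light, m/s
G    = 6.674e-11     # Newton's constant, m^3 kg^-1 s^-2
hbar = 1.055e-34     # reduced Planck constant, J s
Gly  = 9.461e24      # metres in one billion light years

R = 16 * Gly                                        # assumed horizon radius
rho_horizon  = 3 * c**2 / (8 * math.pi * G * R**2)  # kg/m^3, holographic estimate
rho_observed = 6.0e-27                              # kg/m^3, ~0.7 x critical density

# Same statement in Planck units: Lambda ~ 1/(horizon area).
l_p = math.sqrt(hbar * G / c**3)                    # Planck length
area_planck   = 4 * math.pi * R**2 / l_p**2         # horizon area in Planck areas
lambda_planck = (3 / R**2) * l_p**2                 # Lambda in Planck units

print(f"rho_horizon  = {rho_horizon:.1e} kg/m^3")
print(f"rho_observed = {rho_observed:.1e} kg/m^3")
print(f"Lambda ~ {lambda_planck:.1e} (Planck units), 1/area ~ {1/area_planck:.1e}")
```

Running this gives rho_horizon within a factor of ~2 of the observed value, which is as close as an estimate with an assumed horizon radius can be expected to get.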

 

"The time was the late 1960's, when Narlikar and I were struggling with the problem of the quantum mechanical signal from the future."

 

Fred Alan Wolf and I were doing the same thing at San Diego State at the same time.

 

"Let me begin with the 1964-70 period. It was then that I became a dyed-in-the-wool believer in the time symmetry of basic physics. Of course there are aspects of our experience that are not time symmetric - thermodynamics, the past-to-future propagation of radiation fields, and certain features of particle physics. In my view such asymmetries are cosmological manifestations, however, not basic physics. Here I have space only to discuss the past-to-future propagation of the electromagnetic field. In a famous demonstration, Wheeler and Feynman showed more than thirty years ago that one could have a time-symmetric electrodynamics augmented by a cosmological response from the future that reproduced exactly the same results as the classical Maxwell-Lorentz theory. For some years it was thought that a similar demonstration could not be given in quantum physics, but in the late 1960's Jayant Narlikar and I showed that, just as in the classical case of Wheeler and Feynman, it was possible to have a time-symmetric local quantum theory augmented by a cosmological response from the future that reproduced exactly all the practical results of normal quantum electrodynamics. Although there was no difference at all in its statistical predictions, the time-symmetric theory was interestingly different in its details. Because the cosmological response involved both the wavefunction and its conjugate complex, unlike normal quantum mechanics no pure-amplitude theory could be formulated. This I saw as an advantage. The pure-amplitude aspect of normal quantum mechanics involves a redundancy, because information is discarded in passing to practical results. The time-symmetric theory yields the practical results without redundancy.

 

... What I saw in 1970 or thereabouts was that von Neumann had been concerned with a finite local system. If cosmology were involved, with a response from the future, the dynamical variables in the system could be infinite, and the situation could then be different. This was the chink of light. 

 

Even so, the problem remained acutely puzzling. The future imposes a condition on a local system because a signal goes out from the local system to other material systems in the future, which respond with a return signal on account of the time symmetry. One would like a situation in which the return signal imposed a deterministic reality on the local system, forcing an explicit decision to be made in all situations of an A or not-A kind, as in the example discussed above (mushroom cloud or no-mushroom cloud). The trouble is that so long as one calculates the return signal from within quantum mechanics this does not happen, just as von Neumann claimed it could not happen. One is faced by a chicken-and-egg situation. The initial local system does not have deterministic reality because the systems in its future with which it interacts do not have deterministic reality, and this is because the systems in the further future with which the second systems interact do not have deterministic reality, and so on along an infinite chain of interactions. Yet somewhere the Gordian knot has to be cut - it must be, since our everyday experience tells us that it is! The mathematical loophole lies at the limit of the infinite chain of interactions. True, we cannot establish deterministic reality by starting within the chain and by attempting to argue in a past-to-future direction towards the limit. But if we were to start with deterministic reality at the limit, arguing backwards from future to past, there would be deterministic reality at every link of the chain. In other words, the trouble may well come from arguing the problem back-to-front instead of front-to-back."

 

Obviously the buck stops at our future dark energy dS event horizon that cuts the Gordian knot. Note Hoyle came to UCSD when I was there in the late 60's, before he really had this idea matured - did I.J. Good get it from Hoyle, or the other way round? Obviously, they knew each other.

 

"This was the stage of my thinking following the work of the 1964-70 period, before it became apparent, from the arguments given earlier, that an enormous intelligence must be abroad in the Universe. As the Americans say, this instantly creates a new ball game. ...the persistent religious conviction that the pattern of our lives is stored in the future looks as if it could quite well be correct."

 

That's the future hologram right there!

 

"At the mathematical limit discussed above. At the last trumpet! What an extraordinary way to describe the outcome of a sequence of arguments involving the condensation of the wavefunction, the need to avoid von Neumann's mathematical result for finite systems, and time-symmetric electrodynamics. Of course one can argue that the correspondences are fortuitous. Notice, however, that in time-symmetric theory influences are indeed felt “in (less than) a moment, in (even less than) the twinkling of an eye”, and that all finite events are brought together at the mathematical limit in the future. Fortuitous or not, it is curious that so many people without scientific knowledge have believed in the idea, as if they had caught a glimpse of a difficult message which they could only express in terms of an everyday analogy. Religion is an interesting but not really convincing example of the computer terminal idea."

 

[1] Tamara Davis’s PhD dissertation may be downloaded here        http://www.physics.uq.edu.au/download/tamarad/

[1] http://web.mac.com/nquebedeau/Norman_Quebedeau/Animation_files/spectra.swf

 

[1] Jack Sarfatti wrote: Indeed, Nobel Prize physicist Brian Josephson was disinvited from a Bohm physics conference in Tuscany, Italy in August of 2010 by Antony Valentini because of Brian’s research into the same kinds of paranormal experiences recorded in this memoir. Not even a Nobel Prize, a professorship at Cambridge University’s Trinity College, a membership in Newton’s Royal Society, not even invitations by The Queen to Buckingham Palace could prevent the Orwellian New Inquisitors from committing this blatant, shocking assault on freedom. Another case of the resurgence of totalitarian repression of politically incorrect ideas, reminiscent of the Gulag in the Soviet Union, is the ban from Britain of American talk radio host Michael Savage by both the Labour and the new Conservative governments, treading on the Magna Carta. Some old-fashioned liberal once wrote, “I may not agree with your ideas, but I will defend your right to speak them.” I am writing this in the midst of climate change (man-made or not does not matter), the BP oil spill in the Gulf of Mexico, in the wake of the Iceland volcano eruption that stopped air travel to Britain and Northern Europe, and in the aftermath of the world financial collapse of September 2008, allegedly brought on by string theorist physicists inventing “derivatives” on Wall Street, prostituting themselves to greedy gangsters in pinstripe suits. It sure looks like what Sir Martin Rees, British Astronomer Royal, Master of Josephson’s Trinity College and President of Newton’s Royal Society, has called “Our Final Hour.”

to be continued

 


Jul 23

Preface to my book Destiny Matrix 2010

Posted by: JackSarfatti |
Tagged in: Untagged 

 

“The future, and the future alone, is the home of explanation.”

      Henry Dwight Sedgwick, “An Apology for Old Maids” (1908)

 

You are now about to enter a real Twilight Zone X-Files in which we are 3D hologram images projected from the future Mind of God located on our 2D future event horizon associated with the dark energy accelerating the expansion of our observable universe. The UFO data suggest that we can make Star Trek real. Our Mission Impossible Quixotic objective is to create low power dark energy warp drive and stargate time travel to the past and to parallel universes next door on neighboring branes. The tales that follow are the facts, as truthfully as I and the others can recall them. Believe it or not, depending on your comfort zone in dealing with the uncanny.

 

The basic idea of the Destiny Matrix - that our future influences our present - which I have been professing since the 1970’s, is now becoming mainstream. For example, the FQXi (Foundational Questions Institute) website[1] published:

 

The Destiny of the Universe

“A radical reformulation of quantum mechanics suggests that the universe has a set destiny and its pre-existing fate reaches back in time to influence the past. It could explain the origin of life, dark energy and solve other cosmic conundrums. …

The universe has a destiny—and this set fate could be reaching backwards in time and combining with influences from the past to shape the present. It’s a mind-bending claim, but some cosmologists now believe that a radical reformulation of quantum mechanics in which the future can affect the past could solve some of the universe’s biggest mysteries, including how life arose. What’s more, the researchers claim that recent lab experiments are dramatically confirming the concepts underpinning this reformulation. …

 

‘It’s a very, very profound idea,’ says (Paul) Davies. (Yakir) Aharonov’s take on quantum mechanics can explain all the usual results that the conventional interpretations can, but with the added bonus that it also explains away nature’s apparent indeterminism. What’s more, a theory in which the future can influence the past may have huge—and much needed—repercussions for our understanding of the universe, says Davies.
Cosmologists studying the conditions of the early universe have been puzzling about why the cosmos seems so ideally suited for life. There are other mysteries too: Why is the expansion of the universe speeding up? What is the origin of the magnetic fields seen in galaxies? And why do some cosmic rays appear to have impossibly high energies? These questions cannot be answered just by looking at the past conditions of the universe. But perhaps, Davies ponders, if the cosmos has set final conditions in place—a destiny—then this, combined with the influence of the initial conditions set out at the beginning of the universe, might together perfectly explain these cosmic conundrums.”

 

by Julie Rehmeyer on July 2, 2010

 

Herbert Gold wrote in “Bohemia” (Simon & Schuster, 1993)

 

“The Bohemian physicist…contributes a balanced scientific non-establishment for this expanding society. I don’t mean to disparage the work, either…among all the blatherers there sometimes appears a breakthrough thinker. Originality has always required a fertile expanse of fumble and mistake.  That’s the beauty of the option.  Your wastrel life might turn out to be just what’s required to save the planet. …

 

Sarfatti’s Cave is the name I’ll give to the Caffe Trieste in San Francisco, where Jack Sarfatti, Ph.D. in physics, writes his poetry, evokes his mystical, miracle-working ancestors, and has conducted a several-decade-long seminar on the nature of reality …

 

One of his soaring theories is that things which have not happened yet can cause events in the present…Obviously this has consequences for prediction, the nature of causality, our conceptions of logic … He has published papers in respectable physics journals. His poetry is widely photocopied. His correspondence with the great in several fields is voluminous, recorded on computer disks. Cornell University BA, University of California Ph.D., his credentials are impeccable.

 

Following is a quotation from a lecture given to a San Francisco State University physics seminar on 30 April 1991:”

 


Causality-Violating Quantum Action-at-a-Distance?

 

The universe is created by intelligent design but the Designer lives in our far future[2] and has evolved from us [3]…Perhaps all of the works of cultural genius, from the music of Mozart to the physics of Einstein, have their real origin in the future. The genius may be a real psychic channeler whose mind is open to telepathic messages from the future.[4]  The genius must be well trained in his or her craft and intellectually disciplined with the integrity of the warrior in order to properly decode the quantum signals from the future. The purpose of our existence would then be to ensure, not only the creation of life on earth, but also the creation of the big bang itself!  We obviously cannot fail since the universe cannot have come into existence without us in this extreme example of Borgesian quantum solipsism.  Existentialism is wrong because it is an incorrect extrapolation of the old physics. Breton’s surrealism, with its Jungian idea of meaningful coincidence, is closer to the truth.  This would then be “The Final Secret of the Illuminati”[5] - that charismatic chain of adepts[6] in quixotic quest of their “Impossible Dream” of the Grail. Enough of my subjective vision, now on to the objective physics.  pp. 14-16

 

“So now I am in the first hour of one of my deaths. The thought made me dizzy. I was reminded of Jack Sarfatti, Ph.D. physicist and reincarnation of the eleventh-century mystic Rabbi Sarfatti…with rapt descriptions of how events from the future cause events in the past.” p. 115



[2] Princeton’s Richard Gott has a new book “Time Travel” (2001) with essentially this idea years after I suggested it starting around 1973 based on my contact in 1953.

[3] The influence of Harvard‘s Henry Dwight Sedgwick on my thought here is obvious.

[4] This precognitive remote viewing funded by the CIA and the DIA, as told in James Schnabel’s “Remote Viewers: The Secret History of America’s Psychic Spies”, is a violation of quantum physics but not post-quantum physics. The mathematics of this is in papers by Antony Valentini.

[5] Book by Robert Anton Wilson

[6]  Heinz Pagels in “The Cosmic Code” also talks about this, as well as his own dream of his death that came true.  Usama bin Laden, talking of his 9/11 Attack on America, mentions such precognitive dreams in the horridly evil videotape released by the Pentagon.

Seth Lloyd, Lorenzo Maccone, Raul Garcia-Patron, Vittorio Giovannetti, and Yutaka Shikano wrote:
"Einstein’s theory of general relativity allows the existence
of closed timelike curves, paths through spacetime
that, if followed, allow a time traveler – whether human
being or elementary particle – to interact with her former
self. ...
This paper explores a particular
version of closed timelike curves based on combining
quantum teleportation with post-selection. The resulting
post-selected closed timelike curves (P-CTCs) provide
a self-consistent picture of the quantum mechanics
of time-travel. ...
As in
all versions of time travel, closed timelike curves embody
apparent paradoxes, such as the grandfather paradox, in
which the time traveller inadvertently or on purpose performs
an action that causes her future self not to exist.
Einstein (a good friend of G¨odel) was himself seriously
disturbed by the discovery of CTCs [11]. Because the
theory of P-CTCs rely on post-selection, they provide
self-consistent resolutions to such paradoxes: anything
that happens in a P-CTC can also happen in conventional
quantum mechanics with some probability. Similarly, the
post-selected nature of P-CTCs allows the predictions
and retrodictions of the theory to be tested experimentally,
even in the absence of an actual general-relativistic
closed timelike curve. ...
The G¨odel universe consists of a cloud of
swirling dust, of sufficient gravitational power to support
closed timelike curves. Later, it was realized that closed
timelike curves are a generic feature of highly curved, rotating
spacetimes: the Kerr solution for a rotating black
hole contains closed timelike curves within the black hole
horizon; and massive rapidly rotating cylinders typically
are associated with closed timelike curves [2, 8, 12]. The
topic of closed timelike curves in general relativity continues
to inspire debate: Hawking’s chronology protection
postulate, for example, suggests that the conditions
needed to create closed timelike curves cannot arise in
any physically realizable spacetime ...
Hartle and Politzer pointed out that in the presence
of closed timelike curves, the ordinary correspondence
between the path-integral formulation of quantum
mechanics and the formulation in terms of unitary evolution
of states in Hilbert space breaks down [5, 7]. Morris
et al. explored the quantum prescriptions needed to construct
closed timelike curves in the presence of wormholes,
bits of spacetime geometry that, like the handle
of a coffee cup, ‘break off’ from the main body of
the universe and rejoin it in the the past [4]. Meanwhile,
Deutsch formulated a theory of closed timelike
curves in the context of Hilbert space, by postulating
self-consistency conditions for the states that enter and
exit the closed timelike curve ...
Quantum mechanics supports a variety
of counter-intuitive phenomena which might allow
time travel even in the absence of a closed timelike curve
in the geometry of spacetime. ...
We start from the prescription that time travel
effectively represents a communication channel from the
future to the past. ...
A well-known quantum communication channel is
given by quantum teleportation, in which shared entanglement
combined with quantum measurement and classical
communication allows quantum states to be transported
between sender and receiver. We show that if
quantum teleportation is combined with post-selection,
then the result is a quantum channel to the past. The entanglement
occurs between the forward- and backward going
parts of the curve, and post-selection replaces the
quantum measurement and obviates the need for classical
communication, allowing time travel to take place. ...
entanglement and projection can give rise to closed timelike
curves ...
Deutsch’s theory has recently
been critiqued by several authors as exhibiting
self-contradictory features [33–36]. By contrast, although
any quantum theory of time travel quantum mechanics
is likely to yield strange and counter-intuitive results,
P-CTCs appear to be less pathological [17]. They are
based on a different self-consistent condition that states
that self-contradictory events do not happen (Novikov
principle [29]). ...
Pegg points out that this can arise because
of destructive interference of self-contradictory histories
[22]. ...
in addition to
general-relativistic CTCs, our proposed theory can also
be seen as a theoretical elaboration of Wheeler’s assertion
to Feynman that ‘an electron is a positron moving
backward in time’ [16]. In particular, any quantum
theory which allows the nonlinear process of postselection
supports time travel even in the absence of general relativistic
closed timelike curves.
The mechanism of P-CTCs [17] can be summarized by
saying that they behave exactly as if the initial state of
the system in the P-CTC were in a maximal entangled
state (entangled with an external purification space) and
the final state were post-selected to be in the same entangled
state. When the probability amplitude for the transition
between these two states is null, we postulate that
the related event does not happen (so that the Novikov
principle [29] is enforced). ...
Note that Deutsch’s
formulation assumes that the state exiting the CTC in
the past is completely uncorrelated with the chronologypreserving
variables at that time: the time-traveler’s
‘memories’ of events in the future are no longer valid.
The primary conceptual difference between Deutsch’s
CTCs and P-CTCs lies in the self-consistency condition
imposed. ...
It seems that, based on what is currently known on
these two approaches, we cannot conclusively choose PCTCs
over Deutsch’s, or vice versa. Both arise from
reasonable physical assumptions and both are consistent
with different approaches to reconciling quantum mechanics
with closed timelike curves in general relativity.
A final decision on which of the two is “actually the case”
may have to be postponed to when a full quantum theory
of gravity is derived (which would allow to calculate
from first principles what happens in a CTC) or when
a CTC is discovered that can be tested experimentally. ...
[Aharonov's theory]
Here we briefly comment on the two-state vector formalism
of quantum mechanics [48, 51]. It is based on
post-selection of the final state and on renormalizing the
resulting transition amplitudes: it is a time-symmetrical
formulation of quantum mechanics in which not only the
initial state, but also the final state is specified. As such,
it shares many properties with our post-selection based
treatment of CTCs. In particular, in both theories it
is impossible to assign a definite quantum state at each
time: in the two-state formalism the unitary evolution
forward in time from the initial state might give a different
mid-time state with respect to the unitary evolution
backward in time from the final state. Analogously
in a P-CTC, it is impossible to assign a definite state
to the CTC system at any time, given the cyclicity of
time there ...
Another aspect that the two-state
formalism and P-CTCs share is the nonlinear renormalization
of the states and probabilities. In both cases this
arises because of the post-selection. In addition to the
two-state formalism, our approach can also be related to
weak values [48, 52], since we might be performing measurements
between when the system emerges from the
CTC and when it re-enters it. Considerations analogous
to the ones presented above apply. It would be a mistake,
however, to think that the theory of post-selected closed
timelike curves in some sense requires or even singles out
the weak value theory. Although the two are compatible
with each other, the theory of P-CTCs is essentially
a ‘free-standing’ theory that does not give preference to
one interpretation of quantum mechanics over another. ...
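For concreteness, the weak value of an observable A with pre-selected state |psi> and post-selected state |phi> is A_w = <phi|A|psi>/<phi|psi>. A short sketch with illustrative states (not drawn from the paper), showing the well-known fact that a weak value can lie far outside the observable's eigenvalue range:

```python
import numpy as np

# Weak value A_w = <phi|A|psi> / <phi|psi> for Pauli Z
# (eigenvalues +1 and -1), with nearly orthogonal pre- and
# post-selected states.  The choices of theta and the states
# are illustrative.
sz = np.diag([1.0, -1.0])                    # Pauli Z

theta = 0.7
psi = np.array([np.cos(theta), np.sin(theta)])    # pre-selection
phi = np.array([np.cos(theta), -np.sin(theta)])   # post-selection

A_w = (phi.conj() @ sz @ psi) / (phi.conj() @ psi)
# Here <phi|Z|psi> = 1 and <phi|psi> = cos(2*theta), so
# A_w = 1/cos(1.4) ~ 5.88, well outside the range [-1, 1].
print(A_w)
```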
the non-unitarity comes from the fact that, after the CTC is
closed, for the chronology-respecting system it will be forever
inaccessible. The nonlinearity of (9) is more difficult
to interpret, but is connected with the periodic boundary
conditions in the CTC. ...
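The nonlinearity can be exhibited directly. In the P-CTC formalism the chronology-respecting input rho is mapped to C rho C†/Tr(C rho C†) with C = Tr_CTC(U); the renormalizing denominator makes the map fail to commute with taking mixtures. A small numerical sketch (the interaction U, a controlled-phase gate, is an illustrative choice):

```python
import numpy as np

# P-CTC transformation: rho -> C rho C^dag / Tr(C rho C^dag),
# with C = Tr_CTC(U).  U here is a controlled-S gate on
# system (first factor) x CTC wire (second factor), chosen
# purely for illustration.
U = np.diag([1, 1, 1, 1j]).astype(complex)

# Partial trace over the CTC factor: C[i,j] = sum_k U[(i,k),(j,k)].
C = np.einsum('ikjk->ij', U.reshape(2, 2, 2, 2))   # diag(2, 1+1j)

def pctc(rho):
    out = C @ rho @ C.conj().T
    return out / np.trace(out)

rho0 = np.diag([1.0, 0.0])
rho1 = np.diag([0.0, 1.0])
mix = 0.5 * (rho0 + rho1)

# Nonlinearity: the map applied to the mixture differs from the
# mixture of the mapped states.
print(pctc(mix))                          # diag(2/3, 1/3)
print(0.5 * (pctc(rho0) + pctc(rho1)))    # diag(1/2, 1/2)
```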
when quantum fields inside a CTC interact with
external fields, linearity and unitarity are lost. ...
Hartle notes
that CTCs might necessitate abandoning not only unitarity
and linearity, but even the familiar Hilbert space
formulation of quantum mechanics [7]. Indeed, the fact
that the state of a system at a given time can be written
as a tensor product of subsystem states relies crucially
on the fact that operators corresponding to spacelike
separated regions of spacetime commute with each
other. When CTCs are introduced, the notion of ‘spacelike’
separation becomes muddied. The formulation of
closed timelike curves in terms of P-CTCs shows, however,
that the Hilbert space structure of quantum mechanics
can be retained. ...
any quantum theory that allows the nonlinear
process of projection onto some particular state, such
as the entangled states of P-CTCs, allows time travel
even when no spacetime closed timelike curve exists. ...
Although projection is a nonlinear process that cannot be implemented
deterministically in ordinary quantum mechanics, it can easily be implemented in a probabilistic fashion.
Consequently,  the effect of P-CTCs can be tested simply by
performing quantum teleportation experiments, and by
post-selecting only the results that correspond to the desired
entangled-state output. ...
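A toy numerical version of such an experiment (the interaction U and the input state are arbitrary illustrative choices): feed half of a maximally entangled pair into the "CTC" port of U and post-select the pair back onto the same entangled state; the resulting action on the system should coincide with the P-CTC transformation C = Tr_CTC(U).

```python
import numpy as np

# Post-selected teleportation as a simulation of a P-CTC.
# Qubits: A = system, B = CTC wire (half of a Bell pair), C = the
# other half.  Post-selecting B,C back onto |Phi+> should act on A
# as C = Tr_B(U), up to normalization.
rng = np.random.default_rng(0)

def random_unitary(d):
    m = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    q, r = np.linalg.qr(m)
    return q * (np.diag(r) / np.abs(np.diag(r)))   # fix column phases

U = random_unitary(4)                       # interaction on A x B
psi = np.array([0.6, 0.8], dtype=complex)   # system input state

bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)  # |Phi+> on B,C

# Build |psi>_A |Phi+>_BC, apply U on A,B, post-select B,C on |Phi+>.
state = np.kron(U, np.eye(2)) @ np.kron(psi, bell)
post = np.einsum('abc,bc->a', state.reshape(2, 2, 2),
                 bell.conj().reshape(2, 2))

# Direct P-CTC formula: apply C = Tr_B(U) to psi.
Cmat = np.einsum('ikjk->ij', U.reshape(2, 2, 2, 2))
direct = Cmat @ psi

post /= np.linalg.norm(post)
direct /= np.linalg.norm(direct)
print(np.allclose(post, direct))   # the two renormalized states coincide
```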
it might be possible
to implement time travel even in the absence of a
general-relativistic closed timelike curve. The formalism
of P-CTCs shows that such quantum time travel can be
thought of as a kind of quantum tunneling backwards
in time, which can take place even in the absence of a
classical path from future to past. ...
It has been long known that nonlinear quantum mechanics
potentially allows the rapid solution of hard problems
such as NP-complete problems [56]. The nonlinearities
in the quantum mechanics of closed timelike
curves are no exception ...

As Bennett et al. argue, the programmer who is using a
Deutschian closed timelike
curve as part of her quantum computer typically finds that
the output of the curve is completely decorrelated from
the problem she would like to solve: the curve emits random
states.
In contrast, because P-CTCs are formulated explicitly
to retain correlations with chronology preserving curves,
quantum computation using P-CTCs does not suffer from
state-preparation ambiguity. That is not to say that P-CTCs
are computationally innocuous: their nonlinear
nature typically renormalizes the probability of states in
an input superposition, yielding strange and counter-intuitive
effects. For example, any CTC can be used
to compress any computation to depth one, as shown
in Fig. 2. Indeed, it is exactly the ability of nonlinear
quantum mechanics to renormalize probabilities from
their conventional values that gives rise to the amplification
of small components of quantum superpositions
that allows the solution of hard problems. Not least
of the counter-intuitive effects of P-CTCs is that they
could still solve hard computational problems with ease!
The ‘excessive’ computational power of P-CTCs is effectively
an argument for why the types of nonlinearities
that give rise to P-CTCs, if they exist, should only
be found under highly exceptional circumstances such as
general-relativistic closed timelike curves or black-hole
singularities. ...
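The amplification effect can be seen in a two-qubit toy model (an illustrative circuit, not a specific construction from the paper): choose the interaction U to flip the CTC qubit unless the system is in |1>, so that C = Tr_CTC(U) is proportional to |1><1|. However small the |1> amplitude of the input, the renormalized output is |1> with certainty.

```python
import numpy as np

# Anti-controlled NOT: flip the CTC qubit (second factor) when the
# system (first factor) is |0>.  Basis order: |00>,|01>,|10>,|11>.
U = np.array([[0, 1, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=complex)

# C = Tr_CTC(U): the |0> block has trace 0, the |1> block trace 2,
# so C = 2|1><1|.
C = np.einsum('ikjk->ij', U.reshape(2, 2, 2, 2))

eps = 1e-6
psi = np.array([np.sqrt(1 - eps**2), eps])   # almost entirely |0>

out = C @ psi
out /= np.linalg.norm(out)
print(np.abs(out) ** 2)   # -> [0, 1]: the tiny |1> component wins
```

This is the renormalization of probabilities away from their conventional values that the text credits with the (excessive) computational power of P-CTCs.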
We have extensively argued that P-CTCs are physically
inequivalent to Deutsch’s CTCs. In Sec. II we
showed that P-CTCs are compatible with the path-integral
formulation of quantum mechanics. This formulation
is at the basis of most previous analyses
of quantum descriptions of closed timelike curves, since
it is particularly suited to calculations of quantum mechanics
in curved spacetime. P-CTCs are reminiscent of,
and consistent with, the two-state-vector and weak-value
formulation of quantum mechanics. It is important to
note, however, that P-CTCs do not in any sense require
such a formulation. ...
we have argued that, as Wheeler’s picture
of positrons as electrons moving backwards in time suggests,
P-CTCs might also allow time travel in spacetimes
without general-relativistic closed timelike curves. If nature
somehow provides the nonlinear dynamics afforded
by final-state projection, then it is possible for particles
(and, in principle, people) to tunnel from the future to
the past.
Finally, in Sec. V we have seen that P-CTCs are computationally
very powerful, though less powerful than the
Aaronson-Watrous theory of Deutsch’s CTCs.
Our hope in elaborating the theory of P-CTCs is that
this theory may prove useful in formulating a quantum
theory of gravity, by providing new insight into one of the
most perplexing consequences of general relativity, i.e.,
the possibility of time travel."