Stardrive

    • Jack Sarfatti Addressing some of Jim Woodward's key objections. (some key equation jpgs from original missing here - too lazy now to put them in)

      Let's work some elementary toy models.

      Start with the static LNIF class of detectors


      the proper acceleration is

      g ~ c^2 gtt^-1/2 dgtt/dr

      1) gtt = 1 - rs/r

      rs/r < 1

      Let the source be at r ---> infinity, therefore gtt(source) ~ 1

      1 + z = (1 - rs/r)^1/2 < 1, i.e. z < 0: BLUE SHIFT

      Both retarded and advanced radiation will seem to work in exactly the same way because the static metric is time symmetric.

      Homework problem 1
      Reverse roles of source and detector to get a red shift.
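      A quick numerical check of toy model 1 (a Python sketch; it uses the 1 + z = femit/fobsv convention defined later in this post, and the radii are illustrative assumptions, not from the original):

      import math

      def gtt(r, rs):
          # Schwarzschild metric coefficient, gtt = 1 - rs/r
          return 1.0 - rs / r

      def one_plus_z(r_emit, r_obs, rs):
          # static emitter -> static receiver:
          # 1 + z = femit/fobsv = (gtt(obs)/gtt(emit))^1/2
          return math.sqrt(gtt(r_obs, rs) / gtt(r_emit, rs))

      rs = 1.0                    # Schwarzschild radius (units of rs)
      r_det = 10.0                # static detector at r = 10 rs
      r_inf = 1.0e12              # source effectively at infinity, gtt ~ 1

      print(one_plus_z(r_inf, r_det, rs))   # ~0.949 < 1: BLUE SHIFT, as above
      print(one_plus_z(r_det, r_inf, rs))   # ~1.054 > 1: homework red shift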

      2) simple de Sitter space. Note that our future universe approaches this metric; our past universe is not at all de Sitter. You cannot model our past particle horizon with a de Sitter metric in our early universe.

      gtt = 1 - r^2/A

      this is observer-dependent.

      The detector INSIDE the horizon is at r = 0 where gtt = 1

      Let the emitter be near the horizon, a distance Lp from it, as in Lenny Susskind's stretched membrane model.

      First of all, we now see that we have a red shift, because for all r

      1 + z = (1 - r^2/A)^-1/2 > 1

      In particular, for the stretched membrane

      r ~ A^1/2 - Lp

      1 + z = (1 - (A - 2A^1/2Lp + Lp^2)/A)^-1/2

      where Lp^2/A << 1, so, dropping a factor of order unity,

      1 + z ~ (Lp/A^1/2)^-1/2 = (A^1/2/Lp)^1/2 = femit/fobsv >> 1

      Suppose further that

      femit = c/Lp

      Therefore,

      fobsv = femit(Lp/A^1/2)^1/2 = (c/Lp)(Lp/A^1/2)^1/2 = c/(LpA^1/2)^1/2

      i.e. c/(Geometric mean of shortest and longest length scales)
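      Plugging in rough numbers (a Python sketch in cgs units; Lp ~ 10^-33 cm and A^1/2 ~ 10^28 cm are assumed order-of-magnitude values, not exact figures from the original):

      import math

      c, Lp, R = 3.0e10, 1.6e-33, 1.0e28   # cm/s; Planck length; A^1/2 in cm

      f_emit = c / Lp                      # ~1.9e43 Hz on the stretched membrane
      shift = math.sqrt(Lp / R)            # (Lp/A^1/2)^1/2 ~ 4e-31 red shift factor
      print(f_emit * shift)                # ~7.5e12 Hz at the r = 0 detector
      print(c / math.sqrt(Lp * R))         # same: c / geometric mean of Lp and A^1/2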

      This red shift is for retarded radiation from a past de Sitter horizon and/or

      advanced radiation from a future de Sitter horizon.

      However, we do not have a past de Sitter horizon.

      The Unruh temperature corresponding to c/(LpA^1/2)^1/2, via the Stefan-Boltzmann law, gives precisely the observed dark energy density hc/Lp^2A.
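      An order-of-magnitude check of that claim (a Python sketch; cgs constants, with the radiation form u ~ (kT)^4/(hbar c)^3 and all factors of 2, pi, etc. dropped exactly as in the estimates above):

      import math

      hbar, c, Lp, R = 1.05e-27, 3.0e10, 1.6e-33, 1.0e28   # cgs; R = A^1/2

      kT = hbar * c / math.sqrt(Lp * R)    # Unruh kT for frequency c/(LpA^1/2)^1/2
      u = kT**4 / (hbar * c)**3            # Stefan-Boltzmann energy density ~ T^4
      print(u)                             # ~1.2e-7 erg/cm^3
      print(hbar * c / (Lp**2 * R**2))     # = hc/Lp^2A, the same number
      # Observed dark energy density ~ 0.7 x critical ~ 6e-9 erg/cm^3, so this
      # estimate lands within a couple of orders of magnitude of observation.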

      However, to get w = -1 ZPF at r = 0 and to fit the facts, this must be advanced red shifted Wheeler-Feynman Hawking-Unruh radiation of energy density hc/Lp^4 on our future horizon.

      Jim Woodward's blue shift is a different, concurrent effect: a relatively small co-moving cosmological blue shift subtraction from the dominant acceleration = gravity (EEP) red shift.

      Note that as is intuitively obvious from Tamara Davis's horizon diagram below

      (A^1/2/Lp)^1/2 ~ (10^29/10^-33)^1/2 ~ 10^31 >> anow/athen

      That is, there is no way a cosmological blue shift of the advanced radiation can overpower this huge gravity red shift on the stretched horizon.

      There are several causes of frequency shift, cosmological, peculiar velocity, gravity-acceleration.

      In the case of retarded radiation from us in the accelerating actual universe, the cosmological red shift would be superimposed on the acceleration blue shift for the static LNIF. The latter will dominate because gtt^-1/2 --> infinity classically at our future horizon's intersection with the emitter's future light cone, which happens at a finite co-moving distance.

      Also if you look at Hawking's paper and compare it with Tamara Davis's diagram, it's obvious that no retarded radiation can ever reach us from our future dark energy horizon. Yet, Hawking says we can see horizon radiation. Therefore, it would follow that the horizon radiation we see is net advanced Wheeler-Feynman radiation.

We recede from our past particle horizon, but we approach our future event horizon. Therefore, retarded radiation in our past light cone is cosmologically red shifted.

Advanced radiation in our past light cone is cosmologically blue shifted.

However, because the effective available space between us and our future horizon is contracting - even though space as a whole is speeding up in its expansion from the anti-gravity dark energy (virtual bosons with w = -1) - it follows that

Retarded radiation on our future light cone is blue shifted at our future horizon.

Advanced radiation from our future horizon to us at r = 0 is redshifted because it sees available space expanding backwards in time.

Also when you use the static LNIF representation

g00 = 1 - r^2/A

the gravity red shift of the hc/Lp^4 Hawking radiation at r = A^1/2 - Lp gives the observed dark energy density hc/Lp^2A at the r = 0 detector: the energy density scales as T^4 and T red shifts by (Lp/A^1/2)^1/2, so (hc/Lp^4)(Lp/A^1/2)^2 = hc/Lp^2A.

This is the best of all possible explanations of the dark energy rooted in the Wheeler-Feynman idea.

OK, Jim here is the answer. We are getting closer to our future horizon in co-moving distance as cosmic time from the inflation -> hot big bang goes on.

Looking at Tamara Davis's causal diamond picture: uncompensated advanced Wheeler-Feynman black body radiation back from our future horizon to us starts at a(then) ~ 8 billion light years.

It reaches us at about a(now) ~ 14 billion light years, so

anow/athen ~ 14/8 = 1.75 > 1

But this co-moving geodesic LIF cosmological redshift is still small compared to the much larger off-geodesic static LNIF gravity redshift, of order

1 + z' ~ (A^1/2/Lp)^1/2 ~ 10^31 >> 1 in frequency (a factor of A/Lp^2 ~ 10^122-123 in T^4 energy density)

http://en.wikipedia.org/wiki/Redshift



So in that sense, the co-moving distance to the future horizon is contracting for retarded radiation from us to our future horizon - hence a blue shift for it. The opposite holds for back-from-the-future Hawking radiation from our future horizon to us - hence a red shift for it.


Begin forwarded message:

From: JACK SARFATTI <sarfatti@pacbell.net>
Subject: Re: the redshift or blueshift depends on the total experimental arrangement.
Date: March 29, 2013 12:28:50 AM PDT
To: "jfwoodward@juno.com" <jfwoodward@juno.com>


On Mar 29, 2013, at 12:11 AM, "jfwoodward@juno.com" <jfwoodward@juno.com> wrote:

If that is so Jack, what you have is a compelling argument against Hawking radiation, the advanced part anyway, having anything to do with our present.

In the static LNIF rep

gtt = 1 - r^2/A

is TIME SYMMETRIC - WORKS SAME WAY FOR RETARDED & ADVANCED.

and with that metric, which holds only in our future, not in our past, we get

hc/Lp^4 on the future horizon r = A^1/2 - Lp redshifts down to the observed dark energy density hc/Lp^2A

that is simple mathematics from




For the blue shifting of advanced radiation is a consequence ONLY of the fact that the radiation passes from expanded space to more compact space in transit, causing the wavelength of the radiation to decrease.  It has nothing to do with the circumstances of emission.

In fact it's just the opposite inside our causal diamond observable patch of the multiverse.

The electron-positron pairs stuck on our relative horizon have enormous proper accelerations c^2/Lp; the horizon is not expanding away from us at all, it sits at fixed r = A^1/2 - Lp from us. It has nothing directly to do with expanding space in this conformal diagram. In fact, we are getting closer in co-moving distance to our future horizon whilst receding away from our past horizon.




The way you can salvage your argument is to claim that Hawking radiation (retarded) from our past cosmic horizon is redshifted and so on.

No, that does not work at all. Our past metric is nothing like de Sitter and hc/Lp^2A is way too big in the past because A is smaller!

A approaches a fixed asymptote (middle solid curve below)




On Mar 28, 2013, at 5:32 PM, JACK SARFATTI <adastra1@me.com> wrote:


Begin forwarded message:

From: JACK SARFATTI <sarfatti@pacbell.net>
Subject: the redshift or blueshift depends on the total experimental arrangement.
Date: March 28, 2013 5:19:43 PM PDT
To: "PhysicsFellows-request@mail.softcafe.net" <PhysicsFellows-request@mail.softcafe.net>
Bcc: james Woodward <jfwoodward@juno.com>

Jim

Bottom line is that it looks like there are two competing effects for the advanced waves.

I. Your dynamic co-moving LIF back-from-the-future blue shift

II. My static LNIF advanced red shift.

with II >> I

For the co-moving metric detectors

1 + z = femit/fobs   definition.

1 + z = anow/athen  derivation from the co-moving metric for null geodesics



k = 0

1) retarded spherical waves of positive frequency in an expanding universe

Therefore, then = emit is in our past.

now = obsv

1 + zret = anow/athen > 1  retarded co-moving LIF red shift

2) advanced spherical waves of positive frequency in an expanding universe coming back from the future to now from a co-moving emitter to a co-moving receiver

1 + zadv = femit/fobs = anow/athen < 1   advanced co-moving LIF blue shift

Which was what you said.
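A minimal sketch of cases 1 and 2 (Python; a pure exponential de Sitter scale factor is assumed purely for illustration):

import math

H = 1.0                          # Hubble rate, arbitrary units

def a(t):
    # toy de Sitter scale factor
    return math.exp(H * t)

t_now = 0.0
print(a(t_now) / a(-1.0))   # retarded, then in our past: e ~ 2.72 > 1, red shift
print(a(t_now) / a(+1.0))   # advanced, then in our future: 1/e ~ 0.37 < 1, blue shift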

The situation is different for static LNIF detectors in which the far future metric in de Sitter space for our accelerating dark energy universe is

ds^2 ~ -c^2(1 - r^2/A)dt^2 + (1 - r^2/A)^-1dr^2 + ...

we are at r = 0 and the proper acceleration of the detector at fixed r is

g(r) ~ c^2 g00^-1/2 dg00/dr

g00 ~ 1 - r^2/A

g(future horizon) -> infinity classically; in fact it is large and finite, c^2/Lp ~ 10^54 cm/sec^2, from the Planck cutoff.

Now in fact the virtual electron-positron pairs are stuck on this horizon relative to us at r = 0. They have plenty of energy from their local thermal bath of Unruh photons to become real pairs relative to us.

They will Hawking radiate advanced waves from r = A^1/2 - Lp to us at r = 0 at their local temperature of

T = hg/ckB = hc/LpkB   (dropping the factor of 2pi)


Now use the time symmetric static LNIF redshift formula starting from r = A^1/2 - Lp emission to r = 0 US reception.


The redshifted result is

T' = hc/kB(LpA^1/2)^1/2

Using the Stefan Boltzmann law this is an energy density ~ T'^4, i.e. hc/Lp^2A exactly as observed for the dark energy density.

Since we at r = 0 have zero proper acceleration, we see this energy as w = -1 virtual photons of mean frequency c/(LpA^1/2)^1/2 rather than the w = + 1/3 real photons.

So we have TWO effects simultaneously.

Yes, there will I think be a small LIF blue shift correction to the much larger static LNIF advanced redshift.

1 + zadv = femit/fobs = anow/athen < 1   advanced co-moving LIF blue shift

However,  anow/athen is of order unity, i.e. 46/55. You can see we are at about 46 billion light years from Alpha creation in Penrose conformal time. Our future light cone intersects our future event horizon at roughly 55 billion light years. We have to look at the de Sitter metric in conformal time and then do a calculation of the usual anow/athen. I need to check this more carefully of course. Right now I assumed that a(t) is linear in Penrose conformal time, but this may be mistaken.

Jack Sarfatti: Red Shift? Blue Shift? Both? Not sure of this yet.

      Fifth FQXi Essay Contest: It From Bit, or Bit From It?
      lnkd.in
      The Fifth essay contest from the Foundational Questions Institute is now underway. The topic is about whether information is more fundamental than material objects. The subject is similar to the co...
    • Jack Sarfatti IT FROM BIT + BIT FROM IT = Conscious Universe as a John Archibald Wheeler Self-Excited Self-Organizing Circuit.
    • Jack Sarfatti http://en.wikipedia.org/wiki/John_Archibald_Wheeler
    • Jack Sarfatti Michael Towler wrote about my theory: "Living matter and back-action
      In certain dark corners of the internet, can find speculation of the following nature:
      • Propose the wave function/pilot wave is intrinsically ‘mental’ and capable of qualia.
      • Equate the pilot wave with the mental aspect of the universe, generally: the particles are 'matter', and the pilot wave 'mind'.
      OK, who cares, except..
      • ‘Mental’ aspect of universe upgradeable to life/consciousness by self-organization.
      Happens when a physical system uses its own nonlocality in its organization.
      • In this case a feedback loop is created, as follows: system configures itself so as to
      set up its own pilot wave, which in turn directly affects its physical configuration,
      which then affects its non-local pilot wave, which affects the configuration etc..
      • Normally in QM this ‘back-action’ is not taken into account. The wave guides
      the particles but back-action of particle onto wave not systematically calculated.
      Of course, the back-action is physically real since particle movement determines
      initial conditions for next round of calculation. But there is no systematic way to
      characterize such feedback. One reason this works in practice is that for systems
      that are not self-organizing the back-action may not exert any systematic effect.
      Well, it’s not obviously wrong..!
      [see p. 346, Bohm and Hiley's Undivided Universe.]
    • Jack Sarfatti Towler continued: "Two-way traffic
      Important to note that pilot-wave theory does not take into account any effect of
      individual particle on its own quantum field (though Bohm and Hiley briefly sketch
      some ideas about how this might happen, see e.g. Undivided Universe pp. 345-346).
      • Idea that particles collectively affect quantum field of a single particle is contained in the standard
      notion that shape of quantum field of a particle is determined by shape of environment (which
      consists of many particles, and is part of the boundary conditions put into the Schrödinger equation
      before solving it, even in conventional QM).
      • Celebrity nutjob Jack Sarfatti (see e.g., er.. www.stardrive.org) in particular has emphasized
      the need for an explanation of how the individual particle influences its own field and has proposed
      mechanisms for such 'back-action', also emphasizing its importance in understanding the mind-matter relationship and how consciousness arises (see earlier slide).
      • Assuming that notion of such an influence of the particle on its field can be coherently developed,
      we can then have two-way traffic between the mental and the physical levels without reducing one
      to the other. Role of Bohm’s model of the quantum system then would be that it provides a kind of
      prototype that defines a more general class of systems in which a field of information is connected
      with a material body by a two-way relationship.
      • Quantum theory is currently our most fundamental theory of matter and Bohm suggests that, when
      ontologically interpreted, it reveals a proto-mental aspect of matter. This is the quantum field,
      described mathematically by the wave function, which is governed by the Schrödinger equation.
      Bohm’s suggestion is known as panprotopsychism.. so at least you learned a new word today..!"
      Jack Sarfatti You are 100% correct on this Chris.
      However, I think the FQXi version will allow comments on their website. If that is really so, then you and others should post your comments on the submissions as well as submit an essay. I will try to work on one myself -
      though I will be in London, Paris, South of France etc. during April & May.

      On Mar 27, 2013, at 12:07 PM, Chris Langan <cml325@gmail.com> wrote:

      Of course, everyone is aware that SciAm and Templeton are markedly slanted in their approaches.
      Speaking just for myself, past experience suggests that if one deviates in any way from their preferred viewpoints - respectively, atheistic physicalism and "humility theology", which essentially holds that theological truth is inaccessible and should be abandoned in favor of religious syncretism and mere "reconciliation" between science and religion - then one has approximately a snowball's chance in hell of winning the competition. (If your name has ever been mentioned by anyone at all in the same breath as, say, Intelligent Design, then your chances are somewhat worse.)
      On the other hand, if one's ideas already fall within those guidelines, then one may do just fine.

      On Wed, Mar 27, 2013 at 1:37 PM, Jack Sarfatti <sarfatti@pacbell.net> wrote:
      I think it's the same one as FQXi?

      Sent from my iPhone

      On Mar 27, 2013, at 10:04 AM, David Mathes <davidmathes8@yahoo.com> wrote:

      Jack

      John Templeton Foundation sponsors an interesting essay contest that just opened up...closes in

      http://fqxi.org/community/essay

      Topical: The theme for this Essay Contest is: "It from Bit or Bit from It?"
      The past century in fundamental physics has shown a steady progression away from thinking about physics, at its deepest level, as a description of material objects and their interactions, and towards physics as a description of the evolution of information about and in the physical world. Moreover, recent years have shown an explosion of interest at the nexus of physics and information, driven by the "information age" in which we live, and more importantly by developments in quantum information theory and computer science.

      We must ask the question, though, is information truly fundamental or not?
      Yes.
      Can we realize John Wheeler’s dream,

      Yes.
      or is it unattainable?

      No.
      We ask: ”It From Bit or Bit From It?”

      False dichotomy. It's both, forming a creative self-organizing "self-excited circuit" of conscious intent.

      Michael Towler brilliantly describes my proposal on this in his Lecture 8 http://www.tcm.phy.cam.ac.uk/~mdt26/pilot_waves.html

      Possible topics or sub-questions include, but are not limited to:
      What IS information?
      That's an easy one: the Bohm quantum potential Q in particle mechanics and its generalization to field theory.

      One must also look at the pixelated cosmological horizons, both past and future, whose area-entropies A may be the projective hologram screens, where

      N = A/Lp^2 = A^3/2/L^3 ~ 10^123 asymptotically into the far future

      L = 3D voxel scale (quantum of volume of the hologram image)
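      A quick check that the two expressions for N agree (a Python sketch; the same rough values Lp ~ 10^-33 cm and A^1/2 ~ 10^28-10^29 cm as above are assumed):

      import math

      Lp = 1.6e-33                            # cm
      for R in (1.0e28, 1.0e29):              # A^1/2 in cm
          A = R**2                            # horizon area scale
          L = (Lp**2 * math.sqrt(A))**(1.0/3.0)   # 3D voxel scale
          print(A / Lp**2, A**1.5 / L**3)     # pixel count = voxel count
      # both ~4e121 for R ~ 1e28 cm and ~4e123 for R ~ 1e29 cm,
      # i.e. ~10^122-10^123, the asymptotic number quoted above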

      We are inside these past and future cosmological 2D anyonic topological computing horizons, always at the exact center, at each point along our world line.

      Tamara Davis, Ph.D. Fig 1.1c http://dark.nbi.ku.dk/people/tamara/
      What is its relation to “Reality”?

      Depends what you mean by the word. If one means the totality of possible measurement patterns, then if one believes that the world is a quantum bit hologram image simulation, then matter is the hologram image projected both ways in time from our observer-dependent past particle and future de Sitter dark energy cosmological horizons inside the light speed limited "causal diamond" of our subjective observable universe.

      The hardware hologram screens are the horizons where g00 = 0 in the static LNIF representation of the cosmological metric.

      For example, for static LNIF observers with proper accelerations

      g(r) ~ c^2g00^-1/2dg00/dr

      g00 ~ 1 - r^2/A

      where WE are always at r = 0

      How does nature (the universe and the things therein) “store” and “process” information?
      How does understanding information help us understand physics, and vice-versa?
      (Note: While this topic is broad, successful essays will not use this breadth as an excuse to shoehorn in the author's pet topic, but will rather keep as their central focus the theme of whether information or “material” objects are more fundamental.)

      Additionally, to be consonant with FQXi's scope and goals, essays should be primarily concerned with physics (mainly quantum physics, high energy 'fundamental' physics, and gravity), cosmology (mainly of the early universe), or closely related fields (such as astrophysics, astrobiology, biophysics, mathematics, complexity and emergence, and philosophy of physics), insofar as they bear directly on questions in physics or cosmology.
      Foundational: This Contest is limited to works addressing, in one of its many facets, our understanding of the deep or "ultimate" nature of reality.

      Submission: Essays and accompanying material must be submitted online using the webform between the dates of March 25, 2013, and June 28, 2013 (until 11:59PM Eastern Time). Applicants must provide accurate contact information, an abstract of their essay, a brief biographical statement, and their essay.

  • Which of the Basic Assumptions of Modern Physics are Wrong? Announcing the 4th Foundational Question...
    lnkd.in
    There's something unnerving about unifying physics. The two theories that need to be unified, quantum field theory and Einstein's general theory of relativity, are both highly successful. ...
  • Jack Sarfatti One assumption that is wrong is the no-signaling argument in quantum theory. It is of course correct for the orthodox quantum theory of dead simple matter like we see in scattering experiments. This follows from the linearity of the Hermitian operators and the unitarity of the time evolution of the wave function. However, these assumptions are violated in complex open systems far from thermodynamic equilibrium with spontaneously broken symmetries in a ground state that has an emergent order parameter. This order parameter is a giant quantum wave in ordinary space with an enormous number of integer spin "bosons" in the same single-particle micro-quantum state. This giant quantum wave is also a Glauber coherent state that corresponds to a non-Hermitian boson destruction operator. Its time evolution is not unitary and the dynamics is highly nonlinear. These macro-quantum coherent states can be entangled with each other, and signal nonlocality without the need of a classical decryption key seems possible. Living matter is such a system. Experiments by Ben Libet, Dean Radin, Dick Bierman and Daryl Bem show a back-from-the-future presponse that can be explained as future-to-past entanglement signal nonlocality of distinguishable non-orthogonal Glauber coherent states. Memory can also be explained this way as past-to-future signal nonlocality.
Mar 09

Causal Discovery Algorithms

Posted by: JackSarfatti



Begin forwarded message:

From: JACK SARFATTI <Sarfatti@PacBell.net>
Subject: [Starfleet Command] Re: Causal Discovery Algorithms - where to draw the line in the sand on the domain of validity of orthodox quantum no entanglement signaling
Date: March 9, 2013 12:47:28 PM PST
To: Exotic Physics <exoticphysics@mail.softcafe.net>
Reply-To: SarfattiScienceSeminars@yahoogroups.com


Right on the money
where to draw the line in the sand on the domain of validity of orthodox quantum no entanglement signaling postulate

"The deBroglie-Bohm interpretation is a prominent example
of a model that seeks to provide a causal explanation
of Bell correlations using superluminal causal influences.

Consider the deBroglie-Bohm interpretation of a
relativistic theory such as the model of QED provided by
Struyve and Westman [18], or else of a nonrelativistic theory
wherein the interaction Hamiltonians are such that
there is a maximum speed at which signals can propagate.
In both cases, it is presumed that there is a preferred rest
frame that is hidden at the operational level. In a Bell
experiment, if the measurement on the left wing occurs
prior to the measurement on the right wing relative to the
preferred rest frame, then there is a superluminal causal
influence from the setting on the left wing to the outcome
on the right wing, mediated by the quantum state,
which is considered to be a part of the ontology of the
theory [19]. (Note that no causal influence from the outcome
of the first experiment to the outcome of the second
is required because the outcomes are deterministic functions
of the Bohmian configuration and the wavefunction.)
It follows from our analysis that the parameters in
the causal model posited by the deBroglie-Bohm interpretation
must be fine-tuned in order to explain the lack
of superluminal signalling.

Valentini's version of the deBroglie-Bohm interpretation
makes this fact particularly clear. In Refs. [20, 21]
he has noted that the wavefunction plays a dual role in
the deBroglie-Bohm interpretation. On the one hand,
it is part of the ontology, a pilot wave that dictates the
dynamics of the system's configuration (the positions of
the particles in the nonrelativistic theory). On the other
hand, the wavefunction has a statistical character, specifying
the distribution over the system's configurations.
In order to eliminate this dual role, Valentini suggests
that the wavefunction is only a pilot wave and that any
distribution over the configurations should be allowed as
the initial condition. It is argued that one can still recover
the standard distribution of configurations on a coarse-grained
scale as a result of dynamical evolution [22].

Within this approach, the no-signalling constraint is a
feature of a special equilibrium distribution. The tension
between Bell inequality violations and no-signalling
is resolved by abandoning the latter as a fundamental
feature of the world and asserting that it only holds as
a contingent feature. The fine-tuning is explained as the
consequence of equilibration. (It has also been noted in
the causal model literature that equilibration phenomena
might account for fine-tuning of causal parameters [23].)
Conversely, the version of the deBroglie-Bohm interpretation
espoused by Durr, Goldstein and Zanghi [24],
which takes no-signalling to be a non-contingent feature
of the theory, does not seek to provide a dynamical explanation
of the fine-tuning. Consequently, it seems fair
to say that the fine-tuning required by the deBroglie-Bohm
interpretation is less objectionable in Valentini's
version of the theory.

On Mar 8, 2013, at 11:53 AM, JACK SARFATTI <jacksarfatti@gmail.com> wrote:


On Mar 8, 2013, at 11:19 AM, Ruth Elinor Kastner <rkastner@umd.edu> wrote:

Jack, interpretations are generally not Popper falsifiable since they are empirically equivalent with the theory they're interpreting.

In the case of quantum theory, the main different interpretations

1) Copenhagen - epistemic

Asher Peres's as a sub-category?

2) Bohm ontologic

3) Aharonov history-destiny

4) Cramer transactions

5) Hartle consistent histories

6) variations on many-worlds (Tegmark's Level 3)

are degenerate as you say.

However, Antony Valentini has shown how Bohm's theory in particular breaks the above impasse since it gives entanglement signal nonlocality violating no-cloning & no-signaling constraints for sub-quantum non-equilibrium violation of the Born probability rule. This is not even thinkable in some of the above interpretations.

Bohm's theory is a different theory from standard QM to the extent that it has possible empirical non-equivalence (for particle distributions deviating from Psi^2).

right

However there is a possible empirical prediction at the relativistic level for PTI in which there could be deviations from standard QED (which possibly have already been observed). I'm working on that now.
good

RK
________________________________________
From: JACK SARFATTI [sarfatti@pacbell.net]
Sent: Friday, March 08, 2013 2:06 PM
To: Ruth Elinor Kastner

Subject: Re: Causal Discovery Algorithms -  Stapp, Kastner, Cramer, Aharonov

The issue is what is the precise operational meaning of your particular distinction between "possibilities" and "actualized transactions"? How can we Popper falsify such a verbal distinction in the "informal language" (Bohm). In contrast, in Bohm's interpretation there is a clear distinction in the formalism between the "thoughtlike" (Stapp) quantum BIT potential Q and the "rocklike" (Stapp) hidden variable classical lepton-quark et-al world lines and electromagnetic-weak-strong classical field configurations.

On Mar 8, 2013, at 10:20 AM, Ruth Elinor Kastner <rkastner@umd.edu> wrote:


Thanks Jack,

My ontology takes spacetime relations as supervenient on causal relations, where the latter are relations among possibilities, and those are time-symmetrically related. The spacetime relations (i.e. sets of events resulting from actualized transactions) are only indeterministically related to the time-symmetric causal relations characterizing the underlying possibilities. So I don't see anything here that refutes anything I'm doing. Of course I welcome anyone's pointing out what I may be overlooking.

Best
Ruth

Now Available: The Transactional Interpretation of Quantum Mechanics, Ruth E. Kastner
http://www.cambridge.org/us/knowledge/discountpromotion/?site_locale=en_US&code=L2TIQM
________________________________________
From: JACK SARFATTI [sarfatti@pacbell.net]
Sent: Thursday, March 07, 2013 7:28 PM
To: art wagner

Subject: Causal Discovery Algorithms -  Stapp, Kastner, Cramer, Aharonov

This very important paper will have profound impact on Henry Stapp's and Ruth E. Kastner's models - also Cramer's & Aharonov's. I am curious about their future responses to it.

On Mar 7, 2013, at 11:41 AM, art wagner <wagnerart@hotmail.com<mailto:wagnerart@hotmail.com>> wrote:

Causal Discovery Algorithms -  http://xxx.lanl.gov/pdf/1208.4119.pdf

________________________________
Subject: Re: Chinese Physicists Measure Speed of "Spooky Action At a Distance" | MIT Technology Review
From: sarfatti@pacbell.net<mailto:sarfatti@pacbell.net>
Date: Thu, 7 Mar 2013 11:36:59 -0800
To: PhysicsFellows@mail.softcafe.net<mailto:PhysicsFellows@mail.softcafe.net>

" because the “spooky action” cannot be used to send information faster than the speed of light."

Don't be so sure. The Fat Lady has not yet sung on that one. ;-)

The question is whether orthodox quantum theory is complete, or is it a limiting case of a more general theory with pre-sponse entanglement signal nonlocality for living matter?





Notes on my Star Gate book
  • Jack Sarfatti The equation in question is (in index-free shorthand)

    DU/ds = dU/ds + {LC}UU

    This is a tensor equation- a geometric object, and the choices of local coordinate patches of differential geometry are irrelevant.

    We are interested in this book in the heuristic physical meaning of the equations not in the excess baggage of formalism that only obscures the essential physics leading many physicists astray into purely mathematical dead ends perhaps important to pure mathematics but not to physics. The Cornell philosophy of Hans Bethe, Ed Salpeter, Phil Morrison, Tommy Gold and Richard Feynman himself was to get the most physics with the least possible mathematics. This is in accord with Einstein's remark that any intelligent fool can make the subject more complicated than it need be. Indeed, this is the trend we see in modern theoretical physics today.

    DU/ds is what accelerometers measure locally on the test particle being observed.

    {LC}UU is what accelerometers measure locally on the detector observer of the test particle.

    dU/ds is the apparent or kinematical 4-acceleration of the test particle relative to the detector.

    The test particle and the detector are nearly coincident, i.e. their actual space-time separation must be small compared to the local radii of curvature of 4D spacetime for the equation not to break down. For example, at the Earth's surface that curvature radius is about 10^13 cm, so this is not a problem for local experiments.

    The Levi-Civita connection {LC} in Einstein's GR physically describes the fictitious inertial pseudo-forces that appear to act on test particles. These inertial forces are caused by real forces on the local non-inertial frames (LNIFs) measuring the motion of the test particle.

    Newton's 2nd Law is for rest mass m constant

    F = mDV/ds = real 4-force

    V = 4-velocity of test particle

    ds = proper time differential

    DV/ds = dV/ds + {LC}VV = proper 4-acceleration; it is a GCT group tensor

    In a local inertial frame (LIF), {LC} = 0

    This is Einstein's equivalence principle

    In a local non-inertial frame (LNIF)

    {LC} =/= 0

    - m{LC}VV = all the inertial fictitious pseudo-forces that seem to act on the test particle from the POV of the properly accelerating LNIF detector observer, but don't.

    Note the g00 in the denominator that is zero at horizons.

    Note also that the quantum Unruh effect in which vacuum zero point virtual photons are transformed into real black body radiation photons is proportional to the local tensor proper acceleration DU/ds of the detector accelerometer.
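    A numerical illustration of these last two notes (a Python sketch with Earth values; unlike the order-of-magnitude formulas earlier in this post, the factors of 2 and 2pi are kept here):

    import math

    G, c, hbar, kB = 6.67e-8, 3.0e10, 1.05e-27, 1.38e-16   # cgs

    def proper_acc(r, rs):
        # static LNIF observer in Schwarzschild: note the g00^-1/2 factor,
        # which blows up as r -> rs, where g00 = 0 at the horizon
        return (c**2 * rs / (2 * r**2)) / math.sqrt(1.0 - rs / r)

    def unruh_T(g):
        # Unruh temperature, proportional to the proper acceleration
        return hbar * g / (2 * math.pi * c * kB)

    M, R_earth = 5.97e27, 6.37e8            # Earth mass (g) and radius (cm)
    rs = 2 * G * M / c**2                   # ~0.9 cm
    print(proper_acc(R_earth, rs))          # ~981 cm/s^2, the familiar g
    print(unruh_T(proper_acc(R_earth, rs))) # ~4e-20 K, utterly negligible
    print(proper_acc(1.000001 * rs, rs))    # diverging approaching the horizon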


Advanced Intelligence Agency
(a private sector contractor)

Memorandum for the Record

Subject: Technological Surprise Quantum Cryptography et-al - Threat Assessment Analysis

To: The Usual Suspects ;-)

From: A "Hippy" who Saved Physics ;-)

Cut to the chase, bottom line for intelligence analysts - excerpts

In regards to the computational power of computers with access to time machines, it
is straightforward to see that any computation in which an efficient method for checking
the solution exists, a source of random bits, sent through a checking algorithm which then
acts as a c-not on a time machine qubit, can arrive at the correct solution immediately
if the informational content of the solution is sufficiently less than the work function of
the time machine. Since time machine bits may also act as perfectly random sources,
the information may seem to be created from nothing, but one may also think of such
`calculations' as becoming an extremely lucky guesser, due to the re-weighting of histories
by the time machine. ... Conventional cryptography would pose little obstacle to such a
computer. ... public key certification by computer would be almost useless.
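A toy simulation of that `lucky guesser' (Python; post-selection by rejection sampling is used here as a stand-in for the consistency re-weighting of histories, an assumption of this sketch rather than anything in the quoted paper):

import random
random.seed(1)

SECRET = 0b101101                 # a 6-bit 'solution'; checking it is cheap

def check(guess):
    # efficient checking algorithm; acts as the c-not control
    return guess == SECRET

def post_selected_guess(bits, tries=10**6):
    # random bits fed through the checker conditioned on the time machine
    # qubit; a classical simulation can only mimic the re-weighting of
    # histories by discarding the inconsistent ones
    for _ in range(tries):
        g = random.getrandbits(bits)
        if check(g):
            return g
    return None

print(bin(post_selected_guess(6)))    # 0b101101, found 'immediately'
# in the model this works only while the solution's information content
# stays sufficiently below the time machine's work function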

Hawking famously has cited the lack of future tourists as good evidence against time machines.

Disputed by UFO investigators, where Hawking's statement is considered part of the disinformation coverup by the UFO investigators, and the UFO investigators are in turn debunked as kooks, cranks, crackpots, and paranoid conspiracy theorists by the opposition, sometimes called "MAJIC", whose front org is the Committee to Investigate Claims of the Paranormal. However, rising above this din of factional wars of rivals inside the intelligence agencies of the major powers and their agents of influence, cut-outs, useful and useless idiots ;-), it is prudent to assume that whoever is really behind "UFOs" has such literally advanced super-technology at their disposal.

Thermodynamics of Time Machines
Michael Devin
Abstract
In this note, a brief review of the consistent state approach to systems containing closed timelike
curves[CTCs] or similar devices is given, and applied to the well known thermodynamic problem of
Maxwell's demon. The 'third party paradox' for acausal systems is defined and applied to closed
timelike curve censorship and black hole evaporation. Some traditional arguments for chronology
protection are re-examined ...

Since the original version of this paper in 2001[1], there has been a renewed interest in
time machine calculations, springing from a duality between acausal systems constrained
by a quantum analog of the Novikov[2] consistency principle, and the formalism of post-selected
ensembles developed by Aharonov and many others[3-5]. Interest has also grown in
the applications of such systems to computation theory, following the footsteps of Deutsch[6],
who employed a different superselection criterion leading to different physics. ...

In the past ten years the body of work of post-selected ensembles has grown to become the
standard concerning time machines[3],....  Some material has been added to reflect the more recent developments in black hole physics and post-selected systems.



Suppose we take the time looping bit and entangle it with the position of a particle in
a box. The box is divided by a partition into two unequal sections. In the case of a classic
Szilard engine we measure which side of the box the particle is on and then adiabatically
expand the section with the particle to the size of the original box, performing work. By
Landauer's principle we generate one bit of entropy resetting the measurement apparatus,
which is exactly equal to the maximum work we can extract from the engine by placing the
partition in the center. When the particle is entangled with the time machine qubit, the
probability distribution is no longer uniform and net work can be extracted. ...
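A rough numerical reading of that claim, and of the divergence noted below (a Python sketch; the bookkeeping is the textbook Szilard/Landauer accounting, with the time-machine bias p simply put in by hand):

import math

def net_work(f, p):
    # partition splits the box into fractions f and 1 - f; isothermal
    # expansion extracts ln(1/f) or ln(1/(1-f)) (in units of kB*T), while
    # resetting the measurement bit costs ln 2 by Landauer's principle;
    # p = probability of finding the particle on the f side
    extracted = p * math.log(1/f) + (1 - p) * math.log(1/(1 - f))
    return extracted - math.log(2)

print(net_work(0.5, 0.5))            # 0: the unbiased case breaks even at best
for f in (0.1, 0.01, 1e-6):
    # time-machine skew: particle always found on the small side
    print(net_work(f, 1.0))          # net work grows without bound as f -> 0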




As the noise drops to zero for a time machine, the work extractable per bit diverges. A perfectly reliable time machine can therefore violate the second law of thermodynamics by an arbitrarily large amount, but a noisy one has an effective limit. ...

In some of the cases considered in general relativity, with back reactions ignored, we find that CTCs and other time machines act like systems that do not need to borrow from anywhere to have energy. The number of accessible states grows exponentially with energy, and with all microstates equally probable, we naturally arrive at a negative temperature. A similar argument may be used for particle number to give negative chemical potential to the occupation numbers of each field mode. If the number of particles
or energy is not somehow bounded then a divergence can result. This is especially the case when we have the internal entropy naturally maximized by eliminating the interaction of the time machine with its environment due to ignoring back reaction.

The appearance of these divergences is often cited as support for Hawking's chronology protection conjecture[7, 8]. It is assumed that the  fluctuations must destroy the time machine before anything improper can occur. However, if this is the case, then it provides the very mechanism for making time machines well behaved entities with positive temperature. The higher the energy or occupation number of a particular field mode in a time machine, the more it is suppressed by the re-weighting of histories by the amplitude for such a high energy state to scatter onto the same state. In post-selected language the sample of high energy states acceptable to post selection is small because high energy modes tend to decay, and high particle number states tend to dissipate, with exponential probability. ...

The system is capable of extracting arbitrarily large amounts of work from an entangled system. In general we can imagine that systems with very large values of time machine negentropy will behave quite strangely, as the probability of exotic events could be exponentially amplified. ...

In regards to the computational power of computers with access to time machines, it
is straightforward to see that any computation in which an efficient method for checking
the solution exists, a source of random bits, sent through a checking algorithm which then
acts as a c-not on a time machine qubit, can arrive at the correct solution immediately
if the informational content of the solution is sufficiently less than the work function of
the time machine. Since time machine bits may also act as perfectly random sources,
the information may seem to be created from nothing, but one may also think of such
`calculations' as becoming an extremely lucky guesser, due to the re-weighting of histories
by the time machine.

Essentially time machines are entropy pumps, similar to a classical heat engine. Instead of
transporting heat, they transport entropy, pushing a system of particles or a random message
in message space into a lower entropy state, but increasing the entropy of the environment
in some way not yet understood. The computations, like those of a true quantum computer,
are essentially analog computations, in this case effectively physically reversing classical
hard problems in time. Conventional cryptography would pose little obstacle to such a
computer. Instead one would have to devise ambiguous codes, which could be decoded into
a number of intelligible but incorrect messages, leaving the computer with a problem of
interpreting which was significant, a task made hard for entirely different reasons. A `brute
force' entropy attack assisted by a time machine would then more likely generate one of
the red-herring messages. Other unusual protocols might be used to increase security, but
public key certification by computer would be almost useless. ...

Hawking famously has cited the lack of future tourists as good evidence against time
machines. Although no one disputes this, it is an interesting case to consider for the would be
time theorist. One possible explanation for the lack of such `tourist' fields on the microscopic
scale could be something like the quantum zeno effect. The atom is locked in its state and
the cat never dies because we generally have good records of whether or not time travelers
have appeared. For such a traveler, our present would be his past, and such records in that
future of a lack of visitors from the future may act as a similar lock on using tunneling or
entanglement type phenomena as time machines for that purpose. Different possible tourists
may destructively interfere with each other, just as highly non-classical paths for macroscopic
systems do in path integral theory. Consider that the weight of a particular time tourist
scenario is determined by the amplitude for the tourist's state to scatter onto itself at his
later departure. For any large number of bits of the tourist, as those bits decohere with the
environment that weight should decrease exponentially.

A physical example of how one might look for such `tourists' could be realized by exploring
the third party paradox where the receiving channel is measured well before the time machine
exists. The spin measurements of that channel should be random, but if tourism is allowed,
then they may contain a message. If we consider ensembles that may or may not contain a
time machine, it is helpful to note that the weight factor for a particular history is an inner
product of two unit vectors, as well as a noise coefficient. Both of these factors are less than
one, and a sampling from ensembles where the existence of a later time machine depends
on the reception of a message that enables its construction will actually be suppressed
relative to other random possible messages. A statistical `weak censorship' counteracts the
spontaneous emergence of time machines, without absolutely forbidding them. It might
make for an interesting experiment to construct a post-selection equivalent of the tourist
problem, in which selection criteria followed more complex protocols.

In order for tourists to be sufficiently rare, the chronology protection mechanism need
not be absolute. Instead it need only be exponentially difficult for tourists to visit some
location in order for the expectation value of tourists to be finite, and thus hopefully small.

VI. CONCLUSION
In conclusion, time machines, if they exist at all, must possess fundamental limits on their
error rate and waste heat, irrespective of the exact method of construction. These limits can
be thought of as analogous to the old Carnot efficiency of classical heat engines independent
of the specific construction of the engine. Most of the standard paradoxes associated with
time travel are mitigated by considering systems operating within these limits. The study
of acausal models still has much room for development. In the case of renormalization,
badly behaved bare models may form condensates, shifting the vacuum and creating a more
well behaved dressed model. Similarly, acausal bare models may lead to better behaved
approximately causal models when various corrections are accounted for. In cosmology and
landscape theory, some physicists have sought a model for the emergence of the Lorentzian
signature of the metric, a spontaneous symmetry breaking that creates time itself. If such
ambitions are ever to succeed they surely have to entertain causality as potentially only
approximate in the resulting cosmos.

Technical Appendix for Physicists may be skipped by non-Physicists

To new students of quantum mechanics, the Bell inequalities, delayed choice, and quantum eraser experiments have seemed to almost violate causality. The fact that they cannot
is a crucial consequence of the unitary nature of quantum mechanics. One of the most
troubling aspects of the information loss paradox is the apparent loss of unitarity. Not all
non-unitary maps are created equal, and trace over models of lossy processes do generally
preserve causality. Such models seemed adequate until Hawking radiation came along. The
eventual disintegration of the hole broke the analogy of environmental decoherence, opening
up the possibility of `bad' nonunitary processes in some imagined acausal lossy theory
of quantum gravity. The aim of the remaining sections is to explore implications of this
possibility. A quantum eraser is a system that exhibits the extreme nature of the delayed choice
experiment by measuring and then coherently erasing information about two different possible
paths for a system. By the no copy theorem a qubit that is created by measuring another
qubit can only be coherently erased by combining it with the original again. Coherent
erasure makes the erased bit `unrecoverable in principle' and thus restores interference effects
relating to any superposition of the original bit before the creation of the measurement bit.
Two concerns in the information paradox were first, that an evaporated black hole might constitute an `in principle unrecoverable' process, and second that proposed complementarity scenarios would violate the no copy theorem, providing another way to erase measurements.



Both cases lead to breakdown of unitarity and subsequently causality. Complementarity has
to ensure the second scenario of a bit meeting its extra twin can not occur. This appears to
be the primary motivation for the recent 'firewall' models of black hole evaporation.
The inherent non-unitarity of time machines can easily be seen by observing the effect
that this probability skewing has on entangled particle pairs. Consider instead of a particle
in a box, the classic spin entangled pairs of particles. If we should choose one of the entangled particles to be sent an arbitrary distance away, then use the other as a control bit in our
time machine circuit, then the state of the pair becomes in general a mixed state. If we
designate a second communication channel to act as the control of another c-not gate on
the time machine bit, then we may measure a real correlation between that channel and the
spin measurements of the distant spin partner. A single time machine as a third party in
the mutual future of two observers can apparently effect nonlocal communication between
them. Thus the non-unitary effects of a time machine may be felt arbitrarily far away, even
in the past light cone of the arrival of  |out >.

Consider the equivalent case for a post-selected system where a single bit is selected
to be in a state |0> at two different times. In between these two it is acted on by two
controlled not operations, one of an EPR pair, and a second being the desired input bit. The
post-selected statistics of the distant EPR partner will now reflect those of the chosen input
bit. Any time a superselection operator acts on an entangled system of particles to enforce
a particular final state on part of the system, the potential for acausal communications
between two third parties also appears. This `third party paradox', is an important element
in understanding the interaction between time machines and nonunitary dynamics.
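A minimal state-vector sketch of that third-party channel (Python/numpy; the circuit follows the paragraph above, with post-selection of the time-machine bit on |0> standing in for the consistent-state condition):

import numpy as np

I = np.eye(2)
X = np.array([[0., 1.], [1., 0.]])

# qubit order: t (time machine bit), a, b (EPR pair); index = 4t + 2a + b
t0 = np.array([1.0, 0.0])                       # t starts (and ends) in |0>
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
psi = np.kron(t0, bell)

# CNOT with control a, target t
CNOT_at = np.zeros((8, 8))
for t in range(2):
    for a in range(2):
        for b in range(2):
            CNOT_at[4*(t ^ a) + 2*a + b, 4*t + 2*a + b] = 1.0

def distant_stats(i):
    # the desired input bit i controls a second c-not on t (applied classically)
    s = CNOT_at @ psi
    if i == 1:
        s = np.kron(X, np.kron(I, I)) @ s
    s = s[:4]                             # post-select t = |0> at the later time
    s = s / np.linalg.norm(s)
    return np.sum(np.abs(s[1::2])**2)     # P(b = 1) for the distant EPR partner

print(distant_stats(0), distant_stats(1))   # 0.0 1.0: b's statistics read out i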
So far it seems that time machines skew the statistics of ensembles to create effective
nonlinear dynamics. In turn most nonlinear quantum mechanics appears to be exploitable
to create time machines. Explicitly, one time machine can be used to create another, or any
number of others, through the third party paradox. A useful exercise here is to consider
the work done by these `child' machines and how it compares to the work extractable by
the parent alone. Each child `inherits' the noise of its parent, and shares to some degree
the back reaction of its siblings. If the spawning process introduces no additional noise,
then we can shift the arrival time of |out> to an earlier time and find an equivalent system
containing only the parent time loop. This is possible since the duration of the loop is not a
factor in the work function. The maximum work performed by the entire ensemble, minus
any entropy cost for erasing extra measurement bits, should still be less than or equal to
the original work function. ...

Early in the `black hole wars' Hawking tentatively proposed that a theory of density matrices might be considered as a generalization of quantum mechanics capable of handling the apparent lack of unitary evolution in gravitational collapse[9]. This approach was heavily criticized for possible violations of locality or energy conservation[10]. Energy conservation can be maintained, but the trade-off between causality and non-unitarity remains. Any system that can act on a qubit to map orthogonal to non-orthogonal states can be added
to a quantum eraser double interferometer to break the net balance between opposing interference patterns that locally cancel for distant entangled states. It would seem though that if such transitions were possible, then vacuum fluctuations would cause them to occur to any given bit eventually, and thus nonlocal interactions would be everywhere. ...


Hawking and others have contended that all systems containing time machines should
possess entropy in accord with the number of internal states `sent' to the past[11] ...
This scenario is trivially modeled in a post-selection experiment as simply
three measurements of a random bit, in which the first and last measurements are the same
result. ...

the importance of the relative phase information of out states that is crucial to preventing entangled particle pairs from allowing non-local communication. The classic double interferometer fails to detect any local interference effects when observing only one of the photons. The other photon may be in either of two states, and that bit contains either the path information of its cousin, eliminating the interference, or the two outcomes
contribute to two separate interference patterns exactly out of phase, such that the trace over those gives no local interference.

(not necessarily using Glauber coherent non-orthogonal entangled states - JS)
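A small numpy check of that trace statement (a sketch; phi is the local interferometer phase, and the 50/50 beam splitter is modeled as a Hadamard, both assumptions of this illustration):

import numpy as np

H = np.array([[1., 1.], [1., -1.]]) / np.sqrt(2)   # 50/50 beam splitter

def local_click_prob(psi, phi):
    # phase + beam splitter act on photon a only; photon b is traced over
    U = np.kron(H @ np.diag([1.0, np.exp(1j * phi)]), np.eye(2))
    out = U @ psi
    return np.sum(np.abs(out[:2])**2)     # P(photon a exits port 0)

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)          # entangled pair
single = np.kron(np.array([1, 1]) / np.sqrt(2),     # unentangled comparison
                 np.array([1, 0]))

for phi in (0.0, np.pi / 2, np.pi):
    print(local_click_prob(bell, phi),    # always 0.5: no local fringes
          local_click_prob(single, phi))  # (1 + cos(phi))/2: full fringes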

Some black holes are thought to contain ready-made time machines in the form of closed
timelike curves. The troubling behavior of the Kerr metric near the singularity was assumed
to be made safe by being behind the horizon, an early and important result supporting
the cosmic censorship hypothesis. However due to the third party effect, it would appear
that not only does the horizon fail to prevent information from leaving the CTC region,
it leads to non-local communication between points far from the hole. These secondary time
machines can then effectively 'mine' the black hole for negentropy. Some fraction of the
entropy associated with the irreducible mass of the hole should then provide a bound on
this entropy, and therefore some constraints on k, for the CTC region. For the purposes of
chronology protection, horizons alone are ineffective 'transparent censors'. ...

One proposal in resolution to the black hole information paradox is to add a boundary
condition to the singularity[12]. Some critics argue this violates causality[13]. The argument
against it can be illustrated with the following paradox. Under normal circumstances,
information, such as a volume of Shakespeare, falls into a black hole, which then evaporates
via Hawking radiation. If a boundary condition at the singularity is prescribed, then these
fields must be canceled by other contributions as they approach the singularity. These other
contributions are the in-falling components of the pairs of particles appearing from the vacuum,
the outgoing of which constitute the Hawking radiation. Since each pair is strongly
entangled, and the in-falling radiation is forced to match up with the in-falling Shakespeare
volume via the superselection of global field configurations to fit the boundary condition,
then the outgoing radiation must be simply the scrambled volume of Shakespeare. Another
way of considering it is to imagine the field modes reflect off of the singularity, becoming
negative energy time reversed modes. They then travel out of the hole and reflect off the
potential outside the main black hole, becoming again positive energy, forward modes.
The boundary condition acts as a selector of global field configurations, much like the
post-selection operator used to model acausal ensembles. The proposed mechanism `similar
to state teleportation' is in fact the third party paradox communication channel arising
in both time machine and post selected systems. We may employ the same methods of
superselection to generate a time machine via the third party problem. The picture is
complicated slightly though by the presence of the incoming part of the Hawking pairs.
This incoming part may serve as the required noise that bounds the total work extractable
by all third party time machines. If no time machines are spawned this way, the work is
expended adjusting the outgoing radiation into the form of Shakespeare. One flaw in this method of teleportation is also that there is nothing to require that the teleported states leave the black hole before the original states enter it. ...