
Stardrive

Tag » firewall paradox
It is claimed that not all three of these statements are consistent:

1) Hawking radiation is in a pure state.
2) The information carried by the radiation is emitted from the region near the horizon, with low energy effective field theory valid beyond some microscopic distance from the horizon.
3) The infalling observer encounters nothing unusual at the horizon.
 
 
Well, 1) cannot be true: if Hawking radiation is blackbody, it is not in a pure state but a mixed state, with a reduced density matrix that is not an idempotent projection operator.
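A minimal numerical illustration of that pure-versus-mixed distinction (the thermal weights below are arbitrary placeholders, not from the text):

```python
import numpy as np

# A pure state rho = |psi><psi| is an idempotent projector:
# rho^2 = rho and Tr(rho^2) = 1.
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(psi, psi.conj())

# A blackbody (thermal) two-level mixture is not idempotent.
p = np.array([0.73, 0.27])   # arbitrary Boltzmann-like weights
rho_mixed = np.diag(p)

for name, rho in [("pure ", rho_pure), ("mixed", rho_mixed)]:
    print(name,
          "rho^2 == rho:", np.allclose(rho @ rho, rho),
          " Tr(rho^2) =", round(float(np.trace(rho @ rho)), 3))
# pure  rho^2 == rho: True   Tr(rho^2) = 1.0
# mixed rho^2 == rho: False  Tr(rho^2) = 0.606
```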
 
 
So what is all the fuss about? ;-)
 
Throw away 1) and keep 2) and 3)?
 
Furthermore, there is no reason to go hog wild insisting that the universe obeys unitarity at all levels of organization. Why should probability be conserved in the first place? Life does not seem to conserve probabilities. When Feynman gave an early lecture on his Lagrangian formulation of quantum theory, Dirac was there with Einstein, and Dirac asked Feynman whether his theory was “unitary.” Feynman said he had no idea what Dirac even meant at the time. Antony Valentini, for example, has an extended quantum theory that is definitely not unitary. Feynman also asked why observables have to be Hermitian operators. Hermitian operators generate unitary transformations.
 
 
Unitarity is in Hilbert qubit pilot-wave space what orthogonality is in the spacetime continuum. There is nothing sacred and absolute in either. There is no compelling reason to say that inner products of quantum states are invariant under time evolution. It works in a limited range of experiments - scattering experiments - very primitive smashing of things together: brute force, not very subtle.
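To make the claim concrete, a minimal sketch of what "inner products invariant under time evolution" means, and what breaks when the map is not unitary (the specific matrices are arbitrary examples):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_state(n=2):
    v = rng.normal(size=n) + 1j * rng.normal(size=n)
    return v / np.linalg.norm(v)

psi, phi = random_state(), random_state()
before = np.vdot(psi, phi)   # inner product <psi|phi>

# A random unitary (QR of a random complex matrix) preserves <psi|phi>.
U, _ = np.linalg.qr(rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2)))
print(np.isclose(before, np.vdot(U @ psi, U @ phi)))   # True

# A generic non-unitary map (K^dagger K != I) does not.
K = np.array([[1.0, 0.3], [0.0, 0.5]])
print(np.isclose(before, np.vdot(K @ psi, K @ phi)))   # False (generically)
```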
 
 
The S-MATRIX is a crude tool that has been elevated into The Golden Calf by the Priests of Orthodox Physics.
 
 
 

 

Goodbye to linear unitary S-Matrix conservation of information in the black hole firewall debate.

 
 
On Jul 7, 2014, at 2:24 PM, art wagner <wagnerart@hotmail.com> wrote:

Now, combine that with this ....


The ability of the Bohmian formalism to analyze this last type of problems for (open) quantum systems remains mainly unexplored by the scientific community. The authors of this review are convinced that the final status of the Bohmian theory among the scientific community will be greatly influenced by its potential success in these type of problems that present non-unitary and/or nonlinear quantum evolutions.

http://arxiv.org/pdf/1406.3151.pdf

 

Abstract. Bohmian mechanics provides an explanation of quantum phenomena in terms of point particles
guided by wave functions. This review focuses on the formalism of non-relativistic Bohmian mechanics,
rather than its interpretation. Although the Bohmian and standard quantum theories have different
formalisms, both give exactly the same predictions for all phenomena. Fifteen years ago, the quantum
chemistry community began to study the practical usefulness of Bohmian mechanics. Since then, the scientific 
community has mainly applied it to study the (unitary) evolution of single-particle wave functions,
either by developing efficient quantum trajectory algorithms or by providing a trajectory-based explanation
of complicated quantum phenomena. Here we present a large list of examples showing how the Bohmian
formalism provides a useful solution in different forefront research fields for this kind of problems (where
the Bohmian and the quantum hydrodynamic formalisms coincide). In addition, this work also emphasizes
that the Bohmian formalism can be a useful tool in other types of (non-unitary and nonlinear) quantum
problems where the influence of the environment or the global wave function are unknown. This review
contains also examples on the use of the Bohmian formalism for the many-body problem, decoherence
and measurement processes. The ability of the Bohmian formalism to analyze this last type of problems
for (open) quantum systems remains mainly unexplored by the scientific community. The authors of this
review are convinced that the final status of the Bohmian theory among the scientific community will
be greatly influenced by its potential success in these type of problems that present non-unitary and/or
nonlinear quantum evolutions. A brief introduction of the Bohmian formalism and some of its extensions
are presented in the last part of this review.
 
 
PS
 

Are entangled particles connected by wormholes? Support for the ER=EPR conjecture from entropy inequalities

If spacetime is built out of quantum bits, does the shape of space depend on how the bits are entangled? The ER=EPR conjecture relates the entanglement entropy of a collection of black holes to the cross sectional area of Einstein-Rosen (ER) bridges (or wormholes) connecting them. We show that the geometrical entropy of classical ER bridges satisfies the subadditivity, triangle, strong subadditivity, and CLW inequalities. These are nontrivial properties of entanglement entropy, so this is evidence for ER=EPR. We further show that the entanglement entropy associated to classical ER bridges has nonpositive interaction information. This is not a property of entanglement entropy, in general. For example, the entangled four qubit pure state |GHZ_4>=(|0000>+|1111>)/sqrt{2} has positive interaction information, so this state cannot be described by a classical ER bridge. Large black holes with massive amounts of entanglement between them can fail to have a classical ER bridge if they are built out of |GHZ_4> states. States with nonpositive interaction information are called monogamous. We conclude that classical ER bridges require monogamous EPR correlations.
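A quick numerical check of the |GHZ_4> claim in the abstract above; a minimal sketch, assuming the tripartite convention I(A:B:C) = S_A + S_B + S_C - S_AB - S_AC - S_BC + S_ABC (in bits), with the fourth qubit traced out:

```python
import numpy as np

# |GHZ_4> = (|0000> + |1111>)/sqrt(2) as a 16-component vector
psi = np.zeros(16)
psi[0] = psi[15] = 1 / np.sqrt(2)
psi = psi.reshape([2] * 4)

def reduced_rho(keep):
    """Reduced density matrix over the qubits listed in `keep`."""
    out = [i for i in range(4) if i not in keep]
    rho = np.tensordot(psi, psi.conj(), axes=(out, out))
    d = 2 ** len(keep)
    return rho.reshape(d, d)

def S(keep):
    """Von Neumann entropy in bits of the reduced state on `keep`."""
    evals = np.linalg.eigvalsh(reduced_rho(keep))
    evals = evals[evals > 1e-12]
    return float(-(evals * np.log2(evals)).sum())

# Interaction information of qubits A=0, B=1, C=2 (D=3 traced out)
I3 = (S((0,)) + S((1,)) + S((2,))
      - S((0, 1)) - S((0, 2)) - S((1, 2))
      + S((0, 1, 2)))
print(f"I(A:B:C) = {I3:+.3f} bits")  # +1.000: positive, as the abstract states
```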
PPS 
On Jul 7, 2014, at 2:11 PM, JACK SARFATTI <adastra1@me.com> wrote:

Yes, this is consistent with what I told Addinal today about Tegmark multiverse Levels 1, 2 & 3
 
3 is the pilot “mind” field for “material” 1 & 2 that must be linked with wormholes as in ER = EPR, but they must not pinch off for consistent CTC time travel to past histories.
Those wormholes that do pinch off are perhaps the “discontinuities” that Fred Wolf mentioned. 

 
"It was shown recently that replacing classical geodesics with quantal (Bohmian) trajectories gives
rise to a quantum corrected Raychaudhuri equation (QRE). Here we derive the second order Friedmann
equations from the QRE, and show that this also contains a couple of quantum correction
terms, the first of which can be interpreted as cosmological constant (and gives a correct estimate
of its observed value), or as dark matter, while the second as a radiation term in the early universe,
which gets rid of the big-bang singularity and predicts an infinite age of our universe."

On Jul 7, 2014, at 2:01 PM, art wagner <wagnerart@hotmail.com> wrote:

 
PPPS
 
On Jul 6, 2014, at 11:20 PM, fred alan wolf <fawolf@ix.netcom.com> wrote:

          Actually there may be inconsistencies in the Deutsch model. See attached paper. I visited with the physicists who discovered them (Tom Imbo is the head physicist for the group) at UIC when I was in Chicago last May. The bottom line is that there are CTCs that are discontinuous. I quote their paper’s conclusion:
 
Discussion:
 
We have considered Deutsch's model of a non-time traveling system interacting with a time traveler confined to a bounded region, and have demonstrated that the state of the non-time traveler in the asymptotic future can be a discontinuous function of the state in the asymptotic past. Furthermore, we have demonstrated that these discontinuities occur independent of the method of choosing a unique consistent time traveling state, as well as independent of whether Deutsch's assumption regarding the initial composite state or Politzer's generalization is used. Given the phenomenon of discontinuous evolutions within the Deutsch model, we note several possible reactions.
 
(1) Question the assumptions upon which Deutsch's model is based. However, relaxing the two most obvious
of these, as stated in the previous paragraph, does not provide any respite. Thus, the only remaining natural
assumptions to be questioned are that (a) the spatial degrees of freedom can be treated classically, (b) the effect of the systems on the surrounding spacetime can be neglected, and finally that (c) a quantum mechanical (as opposed to field-theoretic) model captures the relevant dynamics. Although over-idealizations can indeed lead to apparent discontinuities, none of (a)-(c) above seems obviously responsible for the discontinuous behavior in Deutsch's model. In particular, it is difficult to believe that there is no imaginable configuration utilizing a discontinuous gate for which these approximations are sufficiently justified.
 
(2) Accept the assumptions of the Deutsch model, but further assume that nature either does not utilize those
gates which are physically discontinuous, or does not allow initial states of the non-time traveler which are near a discontinuity. (Analogous tactics have been considered in the classical case as a way of avoiding the grandfather paradox.) However, this solution is somewhat ad hoc and inelegant. In addition, placing such restrictions on initial states and/or gates sacrifices one of the great strengths of Deutsch's approach, which purports to provide a viable model for any set of initial conditions and any dynamics.
 
(3) Accept that the Deutsch model is correct as writ, but interpret the existence of discontinuous evolutions as
evidence that CTC's are unphysical.
 
(4) Acknowledge that quantum mechanics in the presence of CTC's is sufficiently strange that the existence of these discontinuities is a fitting physical consequence. Further study will be required not only to adequately
address these reactions, but also to answer other interesting questions raised by our results, such as: What are the exact properties of the gates which give rise to such peculiar evolutions? For any such gate, how are the points at which the evolution is discontinuous distributed in the space of initial states? Do these discontinuities occur in other approaches to quantum systems in the presence of CTC's? Regardless, it is clear that discontinuous evolutions are an unavoidable feature of the Deutsch model, and are yet another strange and fascinating consequence of the attempt to bring together quantum mechanics and gravity.
 
            Generally speaking, discontinuities indicate that we may be lacking a required extension of the model. My PhD thesis was about discontinuities appearing in large amplitude (nonlinear) plasma waves; those discontinuities vanished when appropriate care was taken to include the charge separation between ions and electrons that arises when such waves pass through the plasma (charge separation had previously been neglected in such studies, hence my thesis).
            Perhaps we are missing some physics here in the nonlinear Deutsch CTCs, as was the case in the nonlinear plasma wave model I looked at a long time ago.
            Such discontinuous behavior is absent from the Lloyd CTC model and I would be surprised if any cropped up, since it is based on linear quantum physics. Remember, the Deutsch model is inherently nonlinear, and nonlinearity is a new idea being added to quantum physics, so we might expect discontinuous behavior to crop up there, whereas the Lloyd model is not; it is based on linear quantum physics.
 
 
Best Wishes,
 
Fred Alan Wolf Ph.D.  aka Dr. Quantum ®
Have Brains / Will Travel
San Francisco
mailto:fred@fredalanwolf.com 
web page: 
http://www.fredalanwolf.com 
Blog page: 
http://fredalanwolf.blogspot.com/


 
From: JACK SARFATTI [mailto:jacksarfatti@gmail.com] 
Sent: Sunday, July 06, 2014 6:34 PM
To: Robert Addinall

 
Yeah
 
Fred Wolf has been working on that difference
 
On Jul 6, 2014, at 5:45 PM, Robert Addinall <beowulfr@interlog.com> wrote:


Seth Lloyd’s talk at the Perimeter Institute was good, and it appears that the Deutsch and P-CTCs models are incompatible.
 
However, I wonder if it is possible that both could happen – there are consistent CTCs (maybe a lot of them, as in your theory of consciousness), but there are also some inconsistent ones that can bump a time traveler across to a parallel universe.  Do you have any thoughts on that?
 
There are never any inconsistent CTCs, not even in Deutsch’s theory; jumping to the universe next door is a consistent narrative.
 
John Gribbin has the best pop explanation of Deutsch’s theory in his new quantum computer book.

 
 
From: JACK SARFATTI [mailto:jacksarfatti@gmail.com] 
Sent: July-05-14 2:09 PM
To: JACK SARFATTI
Subject: Re: H. Wiseman-- "Bell's Theorem Still reverberates" & Hauke Traulsen's entanglement swapping "FLASH"
 
Thanks Nick
 
Of course your proof is expected because Traulsen only uses linear unitary orthodox quantum theory. Antony Valentini’s post-quantum theory, in contrast, is a totally new nonlinear, non-unitary physics.
 
However, Nick, it is not yet clear to me that your density matrix computation applies to Traulsen’s use of the HOM effect below.
 

Subquantum Information and Computation

(Submitted on 11 Mar 2002 (v1), last revised 12 Apr 2002 (this version, v2))
It is argued that immense physical resources - for nonlocal communication, espionage, and exponentially-fast computation - are hidden from us by quantum noise, and that this noise is not fundamental but merely a property of an equilibrium state in which the universe happens to be at the present time. It is suggested that 'non-quantum' or nonequilibrium matter might exist today in the form of relic particles from the early universe. We describe how such matter could be detected and put to practical use. Nonequilibrium matter could be used to send instantaneous signals, to violate the uncertainty principle, to distinguish non-orthogonal quantum states without disturbing them, to eavesdrop on quantum key distribution, and to outpace quantum computation (solving NP-complete problems in polynomial time).
Also relevant are Seth Lloyd’s schemes for post-selection simulations of quantum computing on CTCs, which should not be the same as linear unitary orthodox QM if the simulation is really faithful.
 

Search Results

1. Time travel theory avoids grandfather paradox - Phys.org
Jul 21, 2010 - The model of time travel proposed by Seth Lloyd, et al., in a recent paper at arXiv.org arises from their investigation of the quantum mechanics ...

2. The quantum mechanics of time travel through post-selected ... - arXiv
by S. Lloyd - 2010
Jul 15, 2010 - We analyze a specific proposal for such quantum time travel, the quantum ... Seth Lloyd, Lorenzo Maccone, Raul Garcia-Patron, Vittorio ...

3. Quantum Time Machine Solves Grandfather Paradox | MIT Technology Review
Jul 19, 2010 - A new kind of time travel based on quantum teleportation gets around ... forward by Seth Lloyd at the Massachusetts Institute of Technology and ...

4. Should Time Travel Be A Moral Imperative? - Forbes
www.forbes.com/sites/.../should-time-travel-be-a-moral-imperativ...
by Bruce Dorminey - Aug 28, 2013 - That's the question I posed to MIT quantum mechanic Seth Lloyd, who ... If time travel is possible, should society be ethically obligated to try and ...

5. The Quantum Mechanics of Time Travel - YouTube
Dec 16, 2010 - Uploaded by QuantumIQC
Dr. Seth Lloyd, an MIT professor and self-described "quantum mechanic," describes the quantum ...

6. Quantum mechanics of time travel - Wikipedia, the free encyclopedia
Lloyd's prescription: An alternative proposal was later presented by Seth Lloyd based upon post-selection and path integrals.

7. Grandfather paradox - Wikipedia, the free encyclopedia
The grandfather paradox is a proposed paradox of time travel first described by the ... Seth Lloyd and other researchers at MIT have proposed an expanded ...

8. NOVA | A Quantum Leap in Computing - PBS
Jul 21, 2011 - MIT's Seth Lloyd, a pioneer of quantum computing, explains its ... you can think of time travel, the process of going from the future into the past, ...

9. Quantum time machine 'allows paradox-free time travel' - The Daily Telegraph
by Tom Chivers - Jul 22, 2010 - Scientists have for some years been able to 'teleport' quantum states from one place to another. Now Seth Lloyd and his MIT team say that, ...
 
On Jul 5, 2014, at 9:26 AM, nick herbert <quanta@cruzio.com> wrote:



Saul-Paul --
 
"Bell's Theorem still reverberates." What a great word "reverberate"!
Thanks for the update on BT.
 
Did a bit more work on Traulsen's PSBA proposal by calculating the Density Matrix BS for a random mix of the 4 Bell States, compared to the Density Matrix PS for a random mix of the 4 uncorrelated polarization states |H>|H>, |H>|V>, |V>|H> and |V>|V>.
 
The result (not surprising) is that PS = BS. The density matrix for both of these situations is the same.
 
Thus quantum mechanics predicts that all statistical averages for these two situations will be the same. Hence no FTL signal (on the average).
 
Hence the only way that Traulsen's PSBA scheme could work is if individual measurement events give distinguishable results for the BS and PS cases, while the average of these results remains identical. Traulsen has proposed a couple of measurement schemes (Fig. 4 of his paper). It would be interesting to calculate the expected outputs of these two measurement schemes to the eight possible inputs (4 BS states and 4 PS states) to see if one can observe patterns in the individual detector responses that might be able to reveal whether a Bell state (BS) was the input, or whether the input was an uncorrelated polarization state (PS).
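Nick's PS = BS result is easy to reproduce numerically; a minimal sketch (only the state labels come from his note, the rest is standard numpy):

```python
import numpy as np

H = np.array([1.0, 0.0])
V = np.array([0.0, 1.0])
kron = np.kron

# The four Bell states (BS)
bell = [
    (kron(H, H) + kron(V, V)) / np.sqrt(2),   # |Phi+>
    (kron(H, H) - kron(V, V)) / np.sqrt(2),   # |Phi->
    (kron(H, V) + kron(V, H)) / np.sqrt(2),   # |Psi+>
    (kron(H, V) - kron(V, H)) / np.sqrt(2),   # |Psi->
]

# The four uncorrelated polarization product states (PS)
prod = [kron(a, b) for a in (H, V) for b in (H, V)]

def equal_mix(states):
    """Density matrix of an equal-probability mixture of pure states."""
    return sum(np.outer(s, s) for s in states) / len(states)

rho_BS, rho_PS = equal_mix(bell), equal_mix(prod)
print(np.allclose(rho_BS, rho_PS))          # True
print(np.allclose(rho_BS, np.eye(4) / 4))   # True: both mixtures are I/4
```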
 
Nick Herbert
 
 
On Jul 4, 2014, at 7:14 PM, Saul-Paul and Mary-Minn Sirag wrote:



Nick,
         
The June 26 issue of Nature has a very interesting article by Howard Wiseman: "Bell's theorem still reverberates".  Wiseman emphasizes the fact that Bell's 1976 paper went somewhat beyond the 1964 paper. In this 1976 paper "local causes" are ruled out.
 
          Wiseman also has a 35-page paper on the arXiv on this same topic.
 
                  
 
<HowardWiseman-TwoBell'sTheorems-19Jun14.pdf>
 
He has posted many other Arxiv papers.
 
<HowardWiseman-ArXivPapers-4Jul14.pdf>
 
All for now;-)
Saul-Paul
----------------------
         
 
On Jul 4, 2014, at 4:52 PM, nick herbert wrote:



Dear Hauke Traulsen --
 
One of my hobbies is attempting FTL communication using quantum entanglement. I have found, like you, that almost nobody is interested in these schemes because "everyone knows this is impossible" so one's time is better spent on the "physics of the possible". However, tho I agree in principle with this consensus it seems to me that there are at least three reasons to devise refutations for FTL schemes:



1. to show off what a good physicist you are -- if a scheme is impossible, it should be easy to refute;



2. Constructing a refutation might deepen your own knowledge of, say, the physics of entanglement, and
 



3. might lead, if not to FTL signaling, possibly to some new result that no one had ever seen before.



I might also add a fourth reason:



4. sheer intellectual curiosity -- a quality in rare supply in these days of "hurry-up" physics.



My FTL efforts started with Alice deciding whether to collapse her A photons in the circularly-polarized basis (R and L photons) or in the plane-polarized basis (H and V photons). Because of entanglement, Bob's photons (previously polarization-undecided) collapse into the same basis. Alice sends a coded  message by switching between CUP (circularly unpolarized light) and PUP (plane-unpolarized light). If Bob can distinguish between a stream of random  R and L photons and a stream of random H and V photons, then he can decode Alice's (FTL) message. Since both CUP light and PUP light (altho seemingly  physically distinct beams of photons) possess exactly the same density matrix, quantum theory predicts that no experiment exists that can distinguish these two (ostensibly different) kinds of light.
 
[Lots of room for philosophy here: H, V, R and L photons are physically different. Yet a random beam of H and V photons cannot be distinguished from a random beam of R and L photons. No one can really say why.]
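For the record, the density-matrix statement can be written in one line. With $|R\rangle, |L\rangle = (|H\rangle \pm i|V\rangle)/\sqrt{2}$, the two mixtures are

$$\rho_{\rm CUP} = \tfrac{1}{2}|R\rangle\langle R| + \tfrac{1}{2}|L\rangle\langle L| = \tfrac{1}{2}I = \tfrac{1}{2}|H\rangle\langle H| + \tfrac{1}{2}|V\rangle\langle V| = \rho_{\rm PUP},$$

so every measurement statistic, which depends only on $\rho$, is identical for the two beams.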
 
I have spent a lot of time devising clever schemes to make the CUP/PUP distinction and have learned a lot of physics. One of my schemes (called FLASH) led directly to the famous quantum no-cloning rule.
 
But that is all in the past.
 
The quest for FTL signaling via quantum entanglement has moved beyond these early failures into fresh new ground -- your clever scheme being the newest.
 
I would characterize second-generation FTL schemes as an attempt to expand the dimensions of Bob's measurement space. In Demetrios Kalamidas's recent FTL device, Bob's photons were coherently mixed with a truncated coherent state which increased the number of Bob's output possibilities. Kalamidas's scheme was recently refuted by a small team of experts.
 
In my ETCALLHOME proposal I used a scheme similar to yours to permit Bob to look at TWO CONSECUTIVE PHOTONS -- hence to expand Bob's Hilbert space from 2 complex dimensions to 4.
 
By "similar to yours" I mean a thought experiment in which, in addition to the common SOURCE O producing a pair of entangled photons (sent to A and B) The SOURCE O also sends a TIMING SIGNAL to A and B so that ALICE and BOB can, if they wish, coherently mix two consecutive photons  by knowledgably adjusting optical delay lines. (This is cheap and easy to do with a thought experiment) This timing information (in your case, you envision "storing the photons" -- also easy to do in the mental lab) allows for a more complex measurement on Bob's part which might allow him to perform a more subtle kind of measurement than is encoded into the density matrix.
 
In my ETCALLHOME experiment, Alice makes no use of her timing signal; she just sends 2-ples of photons, either both CUP or both PUP. Bob, using his timing signal, combines his two photons coherently in a 50/50 beamsplitter and hopes to see whether combining two CUP photons gives a different output than combining two PUP photons. A simple calculation (as explained to me by Lev Vaidman) shows that both pairs of photons yield the same result. No FTL signaling is possible using the ETCALLHOME scheme.
 
=============
 
With this preface behind me, I would like to consider your PSBA scheme.
 
Like ETCALL HOME, PSBA uses timing information (or photon storage) to combine two consecutive photons. Only in your case both Alice and Bob use the timing info, potentially leading to greater possibility of consummating a robust FTL connection.
 
============= 
 
I am only beginning to understand your PSBA scheme. So please correct me if I am wrong.
 
In your scheme Alice switches between two kinds of measurement on her photon 2-ples. Either she chooses to make a Bell-state measurement (BSM), which ENTANGLES distant Bob's 2-ple, or she merely detects the two photons (SSM = separable-state measurement), which leaves Bob's 2-ple unentangled.
 
Using your scheme Alice sends PAIRS OF PHOTONS (2-ples) to Bob that are either mutually ENTANGLED (logical ONE) or mutually UNCORRELATED (logical ZERO). If Bob can discern this difference, either in a single measurement or in a series of measurements on identically prepared pairs, then Bob can decode Alice's FTL signals with the usual apocalyptic consequences -- time machines, causality-violation, grandfather paradoxes, stock-market windfalls and much much more.
 
Have I got your scheme right? Bob's task is to look at a sequence of PHOTON PAIRS and decide whether they are POLARIZATION-ENTANGLED or POLARIZATION UNCORRELATED.
 
==============
 
If I have your scheme right:
 
(Already you can begin to see the outline of a refutation: there are 4 ways that a pair of photons can be entangled (the four Bell states). A random sequence of these Four Degrees of Entanglement might well be experimentally indistinguishable from a random sequence of uncorrelated pairs.)
 
==============
 
Enough for now.
 
Thanks for your imaginative FTL scheme.
 
Nick Herbert
 
 
On Jul 3, 2014, at 2:47 AM, Hauke Traulsen wrote:




Hi Nick,

thanks for your reply. Actually there has been no response regarding my publication so far. ;). 

Regarding your second email: If you want to test the correlation for every single photon pair you need classical communication, that's obviously right.
My hope is that by analyzing a sequence of entangled photon pairs (without additional classical information) one could determine a statistical distribution of detected Bell states which is different from 25%-25%-25%-25% for polarization-entangled photons. 

For example: What about the HOM effect at a symmetrical 50:50 beam splitter?
 

Hong–Ou–Mandel effect

From Wikipedia, the free encyclopedia
The Hong–Ou–Mandel effect is a two-photon interference effect in quantum optics which was demonstrated by three physicists, C. K. Hong, Z. Y. Ou and Leonard Mandel in 1987 from the University of Rochester.[1] The effect occurs when two identical single-photon waves enter a 50:50 beam splitter, one in each input port. When both photons are identical they will extinguish each other. If there are changes in phase, the probability of detection will increase. In this way the interferometer can measure accurately bandwidth, path lengths and timing. http://en.wikipedia.org/wiki/Hong–Ou–Mandel_effect
  1. I expect that for a sequence of entangled photon pairs only pairs in the HV-VH state (antisymmetrical) are able to leave the BS through different exits (even under perfect HOM-dip conditions), which lets one observe and distinguish this state in 25% of all pairs at the BS (by detecting two photons in coincidence on different outputs of the 50:50 BS).



  2. For a sequence of unentangled photon pairs I expect that the photons of each pair leave the 50:50 BS through the same exit (under perfect HOM-dip conditions, 100% of the time), with 50%-50% probability for exit #1 and exit #2.


And for distinguishing 1. and 2. no further classical communication would be necessary. 
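The standard 50:50 beamsplitter algebra does support expectation 1. With input modes a, b mapped to outputs c, d via $a^\dagger \to (c^\dagger + d^\dagger)/\sqrt{2}$ and $b^\dagger \to (c^\dagger - d^\dagger)/\sqrt{2}$ for each polarization,

$$a_H^\dagger b_V^\dagger - a_V^\dagger b_H^\dagger \;\longrightarrow\; c_V^\dagger d_H^\dagger - c_H^\dagger d_V^\dagger ,$$

so the antisymmetric Bell state always exits through different ports, while the three symmetric Bell states bunch into one port; identical unentangled photons bunch the same way, since $a_H^\dagger b_H^\dagger \to \tfrac{1}{2}(c_H^{\dagger 2} - d_H^{\dagger 2})$. In an equal mixture of the four Bell states the coincidence signature therefore shows up in 25% of the pairs.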

What do you think? Is my assumption regarding the HOM Dip and the HV-VH generally wrong, and if so, why?

Thanks for any hint, best regards,
Hauke


On 01.07.2014 at 21:57, nick herbert wrote:
Hauke Traulsen--
 
I am reading with much interest your new FTL signaling scheme. I have somewhat of a history with such schemes, even publishing a book on the subject, "Faster Than Light -- Superluminal Loopholes in Physics" (New American Library, 1988).
Recently I collaborated in refuting a clever FTL scheme devised by Demetrios Kalamidas. For more information on
"The Kalamidas Effect" see these two entries on my Quantum Tantra blog.
 
 
 
I am sure that by now you have received many attempts to refute your scheme. But have you yet received one that really does the job? 
I would appreciate hearing from you if your scheme has already been refuted.
 
In the meantime I am subjecting your paper to careful scrutiny (and am admiring your cleverness) and have forwarded this paper to all members of the team that helped to refute the Kalamidas scheme.
 
warm regards
Nick Herbert
 



-- 
 
Hauke Traulsen
 
Gruppe Technologien
Zentrum für Intelligente Objekte ZIO
 
Fraunhofer-Arbeitsgruppe für Supply Chain Services SCS
Nordostpark 93  |  90411 Nürnberg, Germany
 
Phone +49 911 58061-9548  |  Fax +49 911 58061-9598
hauke.traulsen@iis.fraunhofer.de
www.scs.fraunhofer.de  |  www.zio.fraunhofer.de
 
 
 
 
 
<0908.2655v4discontinous Deutsch CTCs.pdf>
 

On Dec 5, 2013, at 8:00 PM, JACK SARFATTI <jacksarfatti@icloud.com> wrote:


 
From the beginning:
 
First Hawking
 
L = Schwarzschild radial coordinate distance to the classical 2D horizon surface g00 = 0.
 
Newton's surface gravity ~ A^-1/2
 
A = area-entropy of the g00 = 0 surface
 
What they do in Wikipedia above comes down to this:
 
The redshifted Unruh temperature a long distance from the black hole is
 
THawking ~ A^-1/2
 
Stefan-Boltzmann law
 
energy density ~ THawking^4 ~ A^-2
 
Total redshifted power
 
P ~ A (energy density) ~ A^-1
 
A ~ M^2
 
P ~ dM/dt ~ A^-1 ~ M^-2, so integrating M^2 dM ~ dt gives

tlifetime ~ M^3
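As a sanity check on the t ~ M^3 scaling, a minimal Python sketch using the standard evaporation-time formula t = 5120*pi*G^2*M^3/(hbar*c^4); the prefactor is the textbook one, not part of the scaling argument above:

```python
import math

G, hbar, c = 6.674e-11, 1.055e-34, 2.998e8   # SI units
M_sun = 1.989e30                              # kg

def lifetime_seconds(M):
    """Standard Hawking evaporation time for a black hole of mass M (kg)."""
    return 5120 * math.pi * G**2 * M**3 / (hbar * c**4)

t = lifetime_seconds(M_sun)
print(f"{t:.1e} s ~ {t / 3.15e7:.1e} yr")   # ~6.6e74 s ~ 2.1e67 yr
print(lifetime_seconds(2 * M_sun) / t)      # 8.0: doubling M gives 2^3, i.e. t ~ M^3
```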
 
OK now my new prediction following the same argument as above
 
The redshifted thickness gravity Unruh temperature is
 
T' ~ (LA^1/2)^-1/2
 
If we take
 
Lp ~ mp = Planck mass
 
T' ~ (mpM)^-1/2
 
P' ~ AT'^4 ~ A/(L^2A) ~ L^-2 ~ mp^-2
 
dM'/dt ~ mp^-2
 
t' ~ mp^2M << t ~ M^3
 

1) I intuited the connection between the Einstein-Rosen (ER) wormhole and Einstein-Podolsky-Rosen (EPR) quantum entanglement back in 1973 when I was with Abdus Salam at the International Centre of Theoretical Physics in Trieste, Italy. This idea was published in the wacky book “Space-Time and Beyond” (Dutton, 1975), described by MIT physics historian David Kaiser in his book “How the Hippies Saved Physics.” Lenny Susskind, who I worked with at Cornell in 1963-4, rediscovered this ER = EPR connection in the black hole “firewall” paradox. Lenny envisions a multi-mouthed wormhole network connecting the Hawking radiation particles to their entangled twins behind the evaporating event horizon: “each escaping particle remains connected to the black hole through a wormhole” (Dennis Overbye, “Einstein and the Black Hole,” New York Times, August 13, 2013). The no-signaling theorem corresponds to the wormhole pinching off before a light speed limited signal can pass through one mouth to the other. Now we know that traversable wormhole stargates are possible using amplified anti-gravity dark energy. This corresponds to signal-nonlocality in post-quantum theory violating orthodox quantum theory. 

2)      Localizing global symmetries requires the addition of compensating gauge connections in a fiber bundle picture of the universe. Indeed, the original global symmetry group is a smaller subgroup of the local symmetry group. The gauge connections define parallel transport of tensor/spinor fields. They correspond to the interactions between the several kinds of charges of the above symmetries. I shall go into more details of this elsewhere. Indeed localizing the above spacetime symmetries corresponds to generalizations of Einstein’s General Relativity as a local gauge theory.[i] For example, localizing the space and time global translational symmetries means that the Lie group transformations at different events (places and times) in the universe are independent of each other. If one believes in the classical special relativity postulate of locality that there are no faster-than-light actions at a distance, then the transformations must certainly be independent of each other between pairs of spacelike separated events that cannot be connected by a light signal. However, the local gauge principle is much stronger, because it applies to pairs of events that can be connected not only by a light signal, but also by slower-than-light timelike signals. This poses a paradox when we add quantum entanglement. Aspect’s experiment and others since then show that faster-than-light influences do in fact exist in the conditional probabilities (aka correlations) connecting observed eigenvalues of quantum observable operators independently chosen by Alice and Bob when spacelike separated. I shall return to this in more detail elsewhere. However, the no entanglement-signaling postulate is thought by many mainstream theoretical physicists to define orthodox quantum theory. It’s believed that its violation would also violate the Second Law of Thermodynamics. Note that the entanglement signal need not be faster-than-light over a spacelike separation between sender and receiver. It could be lightlike or timelike separated as well. Indeed it can even be retrocausal with the message sent back-from-the-future. John Archibald Wheeler’s “delayed choice experiment” is actually consistent with orthodox quantum theory’s no-signaling premise. The point is that one cannot decode the message encoded in the pattern of entanglement until one has a classical signal key that only propagates forward in time. What one sees before the classical key arrives and a correlation analysis is computed is only local random white noise. However, data on precognitive remote viewing as well as brain presponse data suggest that no entanglement-signaling is only true for dead matter. Nobel Prize physicist Brian Josephson first published on this. I have also suggested it using Bohm’s ontological interpretation (Lecture 8 of Michael Towler’s Cambridge University Lectures on Bohm’s Pilot Wave). Antony Valentini has further developed this idea in several papers. Post-quantum “signal nonlocality” dispenses with the need to wait for the light-speed limited retarded signal key propagating from past to future. Local non-random noise will be seen in violation of the S-Matrix unitarity “conservation of information” postulate of G. ‘t Hooft, L. Susskind et al. Indeed the distinguishable non-orthogonality of entangled Glauber macro-quantum coherent states seems to be the way to get signal nonlocality. This gets us to the “Black Hole War” between Susskind and Hawking about information loss down evaporating black holes. 
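The simplest concrete instance of the localization argument above is the internal U1 case: demanding invariance under $\psi \to e^{i\theta(x)}\psi$ with a spacetime-dependent phase $\theta(x)$ forces the replacement

$$\partial_\mu \;\to\; D_\mu = \partial_\mu + iqA_\mu, \qquad A_\mu \;\to\; A_\mu - \tfrac{1}{q}\,\partial_\mu\theta,$$

where the compensating gauge connection $A_\mu$ is the electromagnetic potential; the same logic applied to the spacetime symmetries yields the gravitational connections discussed in the endnote below.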
It seems that Hawking caved in too fast to Susskind back in Dublin in 2004.



[i] Localizing the four space and time translations corresponds to Einstein’s general coordinate transformations that are now gauge transformations defining an equivalence class of physically identical representations of the same curvature tensor field. However, the compensating gauge connection there corresponds to torsion fields not curvature fields. The curvature field corresponds to localizing the three space-space rotations and the three space-time Lorentz boost rotations together. Einstein’s General Relativity in final form (1916) has zero torsion with non-zero curvature. However, T.W.B. Kibble from Imperial College, London in 1961 showed how to get the Einstein-Cartan torsion + curvature extension of Einstein’s 1916 curvature-only model by localizing the full 10-parameter Poincare symmetry Lie group of Einstein’s 1905 Special Relativity. The natural geometric objects to use are the four Cartan tetrads that correspond to Local Inertial Frame (LIF) detector/observers that are not rotating about their Centers of Mass (COM) that are on weightless zero g-force timelike geodesics.  Zero torsion is then imposed as an ad-hoc constraint to regain Einstein’s 1916 model as a limiting case. The ten parameter Poincare Lie group is subgroup of the fifteen parameter conformal group that adds four constant proper acceleration hyperbolic Wolfgang Rindler horizon boosts and one dilation scale transformation that corresponds to Herman Weyl’s original failed attempt to unify gravity with electromagnetism. The spinor Dirac square roots of the conformal group correspond to Roger Penrose’s “twistors.”

 

My review of Jim Woodward's Making Starships book - V1 under construction
  • Jack Sarfatti: Sarfatti’s Commentaries on James F. Woodward’s book 
    “Making Starships and Star Gates: The Science of Interstellar Transport and Absurdly Benign Wormholes”

    The book has many good insights except for some ambiguous statements regarding:

    1) The equivalence principle that is the foundation of Einstein’s theory of the gravitational field. This seems to be due to the author’s not clearly distinguishing between local frame invariant proper acceleration and frame dependent coordinate acceleration. Thus, the author says that Newton’s gravity force is eliminated in an “accelerating frame.” In fact, it is eliminated in a Local Inertial Frame (LIF) that has zero proper acceleration, though it has coordinate acceleration relative to the surface of Earth for example. All points of the rigid spherical surface of Earth have non-zero proper accelerations pointing radially outward. This violates common sense and confuses even some physicists as well as engineers not to mention laymen. It is a fact of the Alice in Wonderland topsy-turvy surreal world of the post-modern physics of Einstein’s relativity especially when combined with the faster-than-light and back from the future entanglement of particles and fields in quantum theory and beyond. 
    2) I find the author’s discussion of fictitious inertial pseudo forces puzzling. I include the centripetal force as a fictitious force in the limit of Newton’s particle mechanics sans Einstein’s local inertial frame dragging from rotating sources. That is, every local frame artifact that is inside the Levi-Civita connection is a fictitious inertial pseudo force. This includes, Coriolis, centrifugal, Euler, and most importantly Newton’s gravity force that is not a real force. The terms inside the Levi-Civita connection are not felt by the test particle under observation. Instead, they describe real forces acting on the observer’s local rest frame. A real force acts locally on a test particle’s accelerometer. It causes an accelerometer’s pointer to move showing a g-force. In contrast, Baron Munchausen sitting on a cannonball in free fall is weightless. This was essentially Einstein’s “happiest thought” leading him to the equivalence principle the cornerstone of his 1916 General Relativity of the Gravitational Field. 
    3) A really serious flaw in the book is the author’s dependence on Dennis Sciama’s electromagnetic equations for gravity. In fact, these equations only apply approximately in the weak field limit of Einstein’s field equations in the background-dependent case using the absolute non-dynamical globally-flat Minkowski space-time with gravity as a tiny perturbation. The author uses these equations way out of their limited domain of validity. In particular, the Sciama equations cannot describe the two cosmological horizons past and future of our dark energy accelerating expanding observable universe. What we can see with our telescopes is only a small patch (aka “causal diamond”) of a much larger “inflation bubble” corresponding to Max Tegmark’s “Level 1” in his four level classification of the use of “multiverse” and “parallel universes.” Our two cosmological horizons, past and future, that are thin spherical shells of light with us inside them at their exact centers may in fact be hologram computer screens projecting us as 3D images in a virtual reality quantum computer simulation. This is really a crazy idea emerging from Gerardus ‘t Hooft, Leonard Susskind, Seth Lloyd and others. Is it crazy enough to be true? 
  • Jack Sarfatti 4) John Cramer’s Foreword: I agree with Cramer that it’s too risky in the long run for us to be confined to the Earth and even to this solar system. British Astronomer Royal, Lord Martin Rees in his book “Our Final Hour” gives detailed reasons. Of course if a vacuum strangelet develops like Kurt Vonnegut’s “Ice-9”, then our entire observable universe can be wiped out, our causal diamond and beyond shattered, and there is no hope. That is essentially the apocalyptic worst-case scenario of the Bible’s “Revelations” and we will not dwell on it any further. Let’s hope it’s not a precognitive remote viewing like what the CIA observed in the Stanford Research Institute studies in the 1970’s.  Cramer cites the NASA-DARPA 100 Year Star Ship Project that I was involved with in the first two meetings. Cramer’s text is in quotes and italics. There is “little hope of reaching the nearby stars in a human lifetime using any conventional propulsion techniques … the universe is simply too big, and the stars are too far away. … What is needed is either trans-spatial shortcuts such as wormholes to avoid the need to traverse the enormous distances or a propulsion technique that somehow circumvents Newton’s third law and does not require the storage, transport and expulsion of large volumes of reaction mass.”
    Yes, indeed. I conjecture as a working hypothesis based on the UFO evidence that traversable wormhole stargate time travel machines are the only way to go with warp drive used only as a secondary mechanism at low speeds mainly for silent hovering near the surfaces of planets and for dogfights with conventional aerospace craft. The stargates do not have the blue shift problem that the Alcubierre warp drive has although the Natario warp drive does not have the blue shift problem (high-energy collisions with particles and radiation in the path of the starship). Newton’s third law that every force acting on a material object has an equal and opposite inertial reaction force on the source of that force is a conservation law that follows from symmetry Lie groups of transformations in parameters of the dynamical action of the entire closed system of source and material object. This is a very general organizing principle of theoretical physics known as Noether’s theorem for global symmetries in which the transformations are the same everywhere for all times in the universe. For example:
    Space Translation Symmetry → Linear Momentum Conservation
    Time Translation Symmetry → Energy Conservation
    Space-Space Rotation Symmetry → Angular Momentum Conservation
    Space-Time Rotation Symmetry
    Internal U1 EM Force Symmetry → Conserve 1 Electric Charge
    Internal SU2 Weak Force Symmetry → Conserve 3 Weak Flavor Charges
    Internal SU3 Strong Force Symmetry → Conserve 8 Strong Color Charges
  • Jack Sarfatti In a propellantless propulsion system without the rocket ejection of real particles and/or radiation one must include the gravity curvature field (dynamical space-time itself) as a source and sink of linear momentum. Furthermore, if we include quantum corrections to the classical fields there is the remote possibility of using virtual particle zero point fluctuations inside the vacuum as a source and sink of linear momentum. However, the conventional wisdom is that this kind of controllable small-scale metastable vacuum phase transition is impossible in principle and to do so would violate the Second Law of Thermodynamics (extracting work from an absolute zero temperature heat reservoir). Even if we could do the seemingly impossible, propellantless propulsion while necessary is not sufficient for a true warp drive. A true warp drive must be weightless (zero g-force) timelike geodesic and without time dilation for the crew relative to the external observer outside the warp bubble that they were initially clock synchronized with. Localizing global symmetries requires the addition of compensating gauge connections in a fiber bundle picture of the universe. Indeed, the original global symmetry group is a smaller subgroup of the local symmetry group. The gauge connections define parallel transport of tensor/spinor fields. They correspond to the interactions between the several kinds of charges of the above symmetries. I shall go into more details of this elsewhere. Indeed localizing the above spacetime symmetries corresponds to generalizations of Einstein’s General Relativity as a local gauge theory. For example, localizing the space and time global translational symmetries means that the Lie group transformations at different events (places and times) in the universe are independent of each other. If one believes in the classical special relativity postulate of locality that there are no faster-than-light actions at a distance, then the transformations must certainly be independent of each other between pairs of spacelike events that cannot be connected by a light signal. However, the local gauge principle is much stronger, because it applies to pairs of spacelike events that can be connected not only by a light signal, but also by slower-than-light timelike signals. This poses a paradox when we add quantum entanglement. Aspect’s experiment and others since then, show that faster-than-light influences do in fact exist in the conditional probabilities (aka correlations) connecting observed eigenvalues of quantum observable operators independently chosen by Alice and Bob when spacelike separated. I shall return to this in more detail elsewhere. Finally, we have the P.W. Anderson’s anti-reductionist “More is different” emergence of complex systems of real particles in their quantum ground states with quasi-particles and collective mode excitations in soft condensed matter in which the whole is greater than the sum of its parts. This corresponds to spontaneous symmetry breaking of the quantum vacuum’s virtual particles, in its high energy standard model analog, to the Higgs-Goldstone “God Particle” now found at ~ 125 Gev in CERN’s LHC that gives rest masses to leptons and quarks as well as to the three weak radioactivity force spin 1 gauge W-bosons though not to the single spin 1 photon gauge boson and the eight spin strong force gluon gauge bosons. 
In this quantum field theory picture, the near field non-radiating interactions among the leptons and quarks are caused by the exchange of virtual spacelike (tachyonic faster-than-light off-mass-shell) gauge bosons continuously randomly emitted and absorbed by the leptons and quarks. To make matters more complicated unlike the single rest massless U1 photon, the three weak rest massive SU2 W bosons and the eight strong rest massless SU3 gluons carry their respective Lie algebra charges, therefore, they self-interact. A single virtual gluon can split into two gluons for example. The SU3 quark-quark-gluon interaction gets stronger at low energy longer separations. This is called quantum chromodynamic confinement and it explains why we do not see free quarks in the present epoch of our causal diamond observable universe patch of the multiverse. Free quarks were there in a different quantum vacuum thermodynamic phase shortly after the Alpha Point chaotic inflation creation of our observable universe that we see with telescopes etc. Indeed, most of the rest mass of protons and neutrons comes from the confined Heisenberg uncertainty principle kinetic energy of the three real confined up and down quarks and their plasma cloud of virtual zero point gluons and virtual quark-antiquark pairs. The Higgs Yukawa interaction rest masses of three bound real quarks is about 1/20 or less than the total hadronic rest masses.

    The author, James F. Woodward (JFW), introduces Mach’s Principle though in an ambiguous way to my mind. He says that the computation of the rest mass from local quantum field theory as has been in fact accomplished for hadrons by MIT Nobel Laureate, Frank Wilczek et-al using supercomputers is not sufficient to explain the inertia of Newton’s Second Law of Particle Mechanics. This does sound like Occult Astrology at first glance, but we do have the 1940 Wheeler-Feynman classical electrodynamics in which radiation reaction is explained as a back-from-the-future retro causal advanced influence from the future absorber on the past emitter in a globally self-consistent loop in time. Indeed, Feynman’s path integral quantum theory grew out of this attempt. Hoyle and Narlikar, and John Cramer have extended the original classical Wheeler-Feynman theory to quantum theory. Indeed, the zero point virtual photons causing spontaneous emission decay of excited atomic electron states can be interpreted as a back from the future effect. The electromagnetic field in the classical Wheeler-Feynman model did not have independent dynamical degrees of freedom, but in the Feynman diagram quantum theory they do. However, the retro causal feature survives. Therefore the only way I can make sense of JFWs fringe physics proposal is to make the following conjecture. Let m0 be the renormalized rest mass of a real particle computed in the standard model of local quantum field theory. Then, the observed rest mass m0’ equals a dimensionless nonlocal coefficient C multiplied by the local m0 renormalized rest mass. Mach’s Principle is then C = 0 in an empty universe of only real test particles without any sources causing spacetime to bend. Furthermore, C splits into past history retarded and future destiny advanced pieces. Now is there any Popper falsifiable test of this excess baggage?
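Written out as a formula (just restating the conjecture of the paragraph above, with its own symbols):

$$m_0' = C\, m_0, \qquad C = C_{\rm ret} + C_{\rm adv},$$

where $m_0$ is the locally computed renormalized rest mass, $C$ is the dimensionless nonlocal coefficient with past-history retarded and future-destiny advanced pieces, and Mach's Principle is the statement that $C = 0$ in an empty universe.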
  • Jack Sarfatti 1) Springer-Praxis Books in Space Exploration (2013)
    2) Einstein in Zurich over one hundred years ago read of a house painter falling off his ladder saying he felt weightless.
    3) I have since disassociated myself from that project, as have other hard
    ...
  • Jack Sarfatti 4) Roughly speaking, for particle mechanics, the dynamical action is the time integral of the kinetic energy minus the potential energy. The classical physics action principle is that the actual path is an extremum in the sense of the calculus of variations relative to all nearby possible paths with the same initial and final conditions. Richard P. Feynman generalized this classical idea to quantum theory where the actual extremum path corresponds to constructive interference of complex number classical action phases one for each possible path. There are more complications for velocity-dependent non-central forces and there is also the issue of initial and final conditions. The action is generalized to classical fields where one must use local kinetic and potential analog densities and integrate the field Lagrangian density over the 4D spacetime region bounded by initial history and final teleological destiny 3D hypersurfaces boundary constraints. Indeed, Yakir Aharonov has generalized this to quantum theory in which there are back-from-the-future retro causal influences on present weak quantum measurements made between the past initial and future final boundary constraints. Indeed, in our observable expanding accelerating universe causal diamond, these boundary constraints, I conjecture, are our past cosmological particle horizon from the moment of chaotic inflation leading to the hot Big Bang, together with our future dark energy de Sitter event horizon. Both of them are BIT pixelated 2D hologram computer screens with us as IT voxelated “weak measurement” 3D hologram images projected from them. The horizon pixel BIT quanta of area are of magnitude (~10^-33 cm or 10^19 Gev)^2. The interior bulk voxel IT quanta of volume are of magnitude (~10^-13 cm or 1 Gev)^3. This ensures that the number N of BIT horizon pixels equals the number of IT interior voxels in a one-to-one correspondence. The actually measured dark energy density is proportional to the inverse fourth power of the geometric mean of the smallest quantum gravity Planck length with the largest Hubble-sized scale of our future de Sitter causal diamond ~ 10^28 cm. This, when combined with the Unruh effect, corresponds to the Stefan-Boltzmann law of black body radiation that started quantum physics back in 1900. However, this redshifted Hawking horizon blackbody radiation must be coming back from our future de Sitter cosmological horizon not from our past particle horizon.
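A rough numerical check of the geometric-mean dark energy estimate in the comment above; the input lengths are the round numbers quoted in the text, and this only tests orders of magnitude:

```python
# Claim: dark energy density ~ inverse fourth power of the geometric mean of
# the Planck length and the future-horizon scale, i.e. rho ~ hbar*c/(L_P^2 * L_H^2).
hbar_c = 3.16e-17   # erg*cm
L_P = 1.6e-33       # Planck length, cm
L_H = 1.0e28        # future de Sitter horizon scale, cm (rough)

rho_estimate = hbar_c / (L_P**2 * L_H**2)   # erg/cm^3
rho_observed = 5.7e-9                       # ~0.7 * critical density, erg/cm^3

print(f"estimate: {rho_estimate:.1e} erg/cm^3")   # ~1.2e-7
print(f"observed: {rho_observed:.1e} erg/cm^3")   # ~5.7e-9
# Same ballpark: the estimate lands within a couple of orders of magnitude.
```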
  • Jack Sarfatti 5) Localizing the four space and time translations corresponds to Einstein’s general coordinate transformations that are now gauge transformations defining an equivalence class of physically identical representations of the same curvature tensor field. However, the compensating gauge connection there corresponds to torsion fields not curvature fields. The curvature field corresponds to localizing the three space-space rotations and the three space-time Lorentz boost rotations together. Einstein’s General Relativity in final form (1916) has zero torsion with non-zero curvature. However, T.W.B. Kibble from Imperial College, London in 1961 showed how to get the Einstein-Cartan torsion + curvature extension of Einstein’s 1916 curvature-only model by localizing the full 10-parameter Poincare symmetry Lie group of Einstein’s 1905 Special Relativity. The natural geometric objects to use are the four Cartan tetrads that correspond to Local Inertial Frame (LIF) detector/observers that are not rotating about their Centers of Mass (COM) that are on weightless zero g-force timelike geodesics. Zero torsion is then imposed as an ad-hoc constraint to regain Einstein’s 1916 model as a limiting case. The ten parameter Poincare Lie group is subgroup of the fifteen parameter conformal group that adds four constant proper acceleration hyperbolic Wolfgang Rindler horizon boosts and one dilation scale transformation that corresponds to Herman Weyl’s original failed attempt to unify gravity with electromagnetism. The spinor Dirac square roots of the conformal group correspond to Roger Penrose’s “twistors.”
  • JackSarfatti's comment on A Black Hole Mystery Wrapped in a Firewall Paradox via @nytimes http://t.co/I671P9aoP3 1 of 2
    A Black Hole Mystery Wrapped in a Firewall Paradox
    nyti.ms
    A paradox around matter leaking from black holes puts into question various scientific axioms: Either information can be lost; Einstein’s principle of equivalence is wrong; or quantum field theory needs fixing.
  • Jack Sarfatti: Actually I was the first to propose a connection between gravity wormholes and quantum entanglement back in the early 1970's when I was at Abdus Salam's Institute in Trieste, Italy. The idea is explicit in the zany book Space-Time and Beyond (Dutton, 1975) that I co-authored with New Age artist Bob Toben and physicist Fred Alan Wolf. MIT Professor David Kaiser describes this history in his award winning book "How the Hippies Saved Physics." Curious that this article thinks that traversable wormholes are impossible. Many physicists think otherwise. Traversable wormholes held open by gravitationally repulsive exotic dark energy without horizons would of course permit faster-than-light communication via entanglement in violation of orthodox quantum theory. Indeed, Antony Valentini has written papers on such signal nonlocality using David Bohm's interpretation of quantum theory when the "beables" are not in thermal equilibrium.
  • Jack Sarfatti: The black hole horizon is already quite physical without violating Einstein's equivalence principle for the hovering observer, who must have an increasingly large proper acceleration the closer he gets to the horizon. By the Unruh effect, the hovering observer Bob sees a large temperature of real thermal photons. If Bob burns up from hovering too close to the horizon, and if Alice free falls into the black hole in close contact with Bob, then it seems plausible that Alice will feel Bob's heat and indeed catch fire herself. Furthermore, the virtual electron-positron pairs stuck to the horizon in Hawking's radiation mechanism will also feel the high-temperature photon heat and will get enough energy from the gravity field to excite into a real electron-positron plasma within a Compton wavelength thickness, so that free-falling Alice should see that as a quantum firewall. The equivalence principle is a classical physics idea prior to quantum corrections. Hence, I see no real paradox if looked at from this perspective.