Jack Sarfatti

On Jan 3, 2015, at 11:31 PM, Jacques Vallee wrote:

Beowulf is right on. About 1970 Paul Baran (inventor of packet switching at RAND and arguably the true grandfather of the Internet) tested the first radio prototype of the Arpanet by spread spectrum on the range of frequencies of the SFO control tower. He could do that without interference with air operations because the spread spectrum signal was undetectable -- low in the noise....
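The spread-spectrum trick Vallee describes can be sketched in a few lines of numpy: spread each bit over many pseudorandom chips so the per-sample signal sits far below the noise floor, yet correlation against the known code recovers it. The parameters below are illustrative, not Baran's actual system:

```python
import numpy as np

rng = np.random.default_rng(0)

# Direct-sequence spread spectrum, minimal sketch: each data bit is
# multiplied by a long pseudorandom chip sequence, spreading its energy
# so the per-sample signal power sits well below the noise.
n_bits = 100
chips_per_bit = 1000                      # processing gain ~ 30 dB
code = rng.choice([-1.0, 1.0], size=chips_per_bit)

bits = rng.choice([-1.0, 1.0], size=n_bits)
tx = np.repeat(bits, chips_per_bit) * np.tile(code, n_bits)

# Noise amplitude 10x the signal: per-sample SNR = -20 dB,
# so the transmission is buried in the noise floor.
rx = tx + 10.0 * rng.standard_normal(tx.size)

# Despreading: correlate each bit interval with the known code.
# Coherent gain over 1000 chips lifts the bits back out of the noise.
decoded = np.sign(rx.reshape(n_bits, chips_per_bit) @ code)

print("bit errors:", int(np.sum(decoded != bits)))
```

Without knowledge of the chip code, a listener sees only noise-like samples, which is why the signal could share the tower's frequencies without interfering.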






Jack Sarfatti http://stardrive.org



On Jan 3, 2015, at 6:34 PM, Jack Sarfatti <jacksarfatti@icloud.com> wrote:


Ignoring the UFO data in front of our noses is a fatal mistake. Meantime let's see if the flyby anomaly is caused by a small wormhole. There are credible reports by Eric Davis of a small wormhole at the Bigelow ranch in Utah. http://en.wikipedia.org/wiki/Skinwalker_Ranch


Sent from my iPad


On Jan 3, 2015, at 4:07 PM, Robert Addinall wrote:


I actually tend to think that few civilizations will end up building Dyson spheres. Again, my suspicion is that it's possible (though not easy in the initial stages) to develop techniques for generating and containing negative energy/mass, and then you have warp drive/wormholes. At that point you can colonize (or terraform and then colonize) new planets. Most likely you don't want more than two or three billion inhabitants per planet (Earth is probably currently overpopulated). You'll primarily use FTL (some sort of wormholes or else readable quantum entanglement) for communication and not put out significant radio signals. So, I would really expect to *only* see regular planetary systems. We can't really say anything for certain until we get enough telescope resolution to see Earth-like planets and whether (1) they show evidence of biological processes like photosynthesis and oxygen-rich atmospheres, and (2) lights illuminating metropolitan areas.


Even such observations would not rule out intelligent life of very different forms than those found on Earth.


From: Creon Levit, NASA - National Aeronautics and Space Administration

Sent: Saturday, January 3, 2015 5:02 PM



Subject: "zeroth order null result" from WISE for free energy and for UFOs.


More evidence of no high level ET civilizations in our galaxy: http://arxiv.org/pdf/1408.1134.pdf


The Gaia mission, currently in orbit, will provide a much tighter (probabilistic) bound. It is surveying a billion local stars. If any of them has something like a Dyson sphere, we will know.


The Kepler mission found that most stars have planets, and that a significant fraction have habitable planets. So for those like me who do not at present find UFO evidence convincing, these missions, and the negative results from all SETI searches to date, reinforce the Fermi paradox. It leads one either towards “we are alone” or to the great filter.


For an amusing but serious summary of these issues, see Bostrom's essay "Why I hope the search for extraterrestrial life finds nothing."



Stardrive, ISEP, Internet Science Education Project




Jack Sarfatti On Dec 29, 2014, at 3:19 PM, Jack Sarfatti <jacksarfatti@icloud.com> wrote:


From: Hal Puthoff 

Date: December 29, 2014 at 2:01:38 PM PST

To: lensman137@sbcglobal.net, 


Subject: Re: The RAND Corporation on UFOs !


Though overlooked by many, the recently declassified UK MOD report (the so-called Condign Report, interestingly enough!), assembled in 2000 by the Defence Intelligence staff and written to 'get out of the public UFO business,' has within its more than 100 pages a number of gems of technical detail, including an assessment of the EM frequencies hypothesized to be involved in the Rendlesham Forest event. Available on the Internet from the UK National Archives - see below.




<< Unidentified Aerial Phenomena (UAP) in the UK Air Defence Region


The Ministry of Defence has released this report in response to a Freedom of Information request and we are pleased to now make it available to a wider audience via the MOD Freedom of Information Publication Scheme. Where indicated information is withheld in accordance with Section 26 (Defence), Section 27 (International Relations) and Section 40 (Personal Information) of the Freedom of Information Act 2000.

UAP in the UK Air Defence Region: Executive Summary

UAP in the UK Air Defence Region: Volume 1

UAP in the UK Air Defence Region: Volume 2

UAP in the UK Air Defence Region: Volume 3 >>


-----Original Message-----

From: Kim Burrafato <lensman137@sbcglobal.net>

To: Creon Levit 


Sent: Mon, Dec 29, 2014 3:19 pm

Subject: Re: The RAND Corporation on UFOs !


What about the testimony of Base Commander, Colonel Charles Halt, and all of the other airmen who were up close and personal witnesses to the highly strange events at Rendlesham forest? All the people involved in Rendlesham were reliable, extensively vetted RAF Bentwaters USAF security personnel — after all, this was a NATO nuclear weapons storage facility. They would never have attained those security positions if they weren’t exemplary soldiers. Unlike Roswell, where key witnesses weren’t interviewed until many years after the alleged incident, the majority of witnesses in the Rendlesham forest incident are alive and well. Halt maintains to this day that the object he and others observed at Rendlesham was extraterrestrial technology. Despite the apparent lack of physical and photographic evidence to that effect, we cannot discount all that important detailed and reliable eyewitness testimony. And it’s a safe bet that if any physical or photographic evidence was gathered, it has been sequestered deep within the black catacombs of the national security establishment. 


On Dec 29, 2014, at 9:43 AM, creon levit wrote:


Ok I'll read John's book too !-)


On Dec 28, 2014, at 11:25 PM, Colonel John Alexander wrote:


The evidence in favor of UFOs is simply overwhelming, and I agree with Hal's comment on Bentwaters. In my book, UFOs: Myths, Conspiracies, and Realities, it is one of my top cases, as it had physical evidence as well as veridical eyewitnesses. In addition, it was not a singular event. Like the Phoenix Lights and Gulf Breeze, it recurred over long periods of time. That said, the ETH is only one hypothesis and may not be the best fit when all the evidence is considered. As I end my book, whatever it is (they are), the UFO phenomena are more complex than we ever imagined.





Jack Sarfatti On Dec 28, 2014, at 1:13 PM, Robert Addinall wrote:


My current guess is the same as that of the 50s AF generals; probably a small percentage of reports are caused by interstellar vessels. The rest would have mundane explanations and I'm also willing to entertain other explanations; perhaps a handful are some sort of "interdimensional" clouds of energy/organisms that occasionally show up. Some reports seem to indicate rather odd, amorphous shapes and lights, but others, such as those cited in the RAND report, do seem to clearly indicate mechanical craft.


I would consider "killer" proof to be recovery and verification of a physical artifact in the public domain:


1. A spacecraft or substantial component of a spacecraft (i.e., a large piece of wreckage with enough intact components and structure to indicate that it could not have come from any other type of aircraft).


2. An EBE (extraterrestrial biological entity). At least a more or less complete body that could not be mistaken for anything else. Preferably a living being who can talk to reporters, academics, government officials etc on camera.


3. Keep in mind the possibility that a mechanical artifact might also be a self-aware AI that could talk to us. So, #3 is a combination of #1 and #2.


Now if we prove that we can generate and contain negative mass or negative energy density and go on to build a working warp drive or wormhole generator, such a human-made artifact would be highly suggestive - you would probably be justified in making the leap of saying that UFOs are mechanical craft driven by this type of technology and so the AFC explanation is correct. However, in the absence of a physical artifact or being, either mechanical or biological, I feel that we must simply treat the AFC explanation for the small percentage of reports unexplainable by mundane reasons as a good one, but we can't be certain.


Given that FTL travel also necessarily implies possible time travel, some of the craft may be ours from our future light cone, or from civilizations that have become connected with us in some way in our future light cone. I treat this as a subset of the AFC hypothesis. Aliens need not be totally alien. How such back-from-the-future interactions might play out we do not yet know - whether some chronology-protection law of physics enforces consistent closed timelike curves (CTCs), or whether they are actually changing their past/our present.



Jack Sarfatti On Dec 27, 2014, at 11:33 AM, Robert Addinall wrote:


They did quite a good job IMO.


1. Cocteau's estimate of how many highly advanced civilizations may exist in the galaxy was very good and almost exactly how I've tried to articulate the problem at times. I'll probably now use this as a reference. I was surprised at the estimate of 100 million advanced civilizations/average spacing of 10 light years between advanced civilizations. My estimates tended to be an order or two of magnitude lower, but his methodology seems solid even ~45 years later. Of course we now know for certain that most, if not almost all, stars do develop planetary systems, but observing earth sized planets is difficult, so we're still not sure how abundant they are. We do know that a fair number of stars appear to have planets too close or too far to be in a habitable zone, but even that is already taken into account by Cocteau; he estimates 1000 million sun-like stars out of 100 billion stars and drops the number with planets in acceptable orbits to somewhere around 600 million.
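As a back-of-envelope check of how a civilization count translates into average spacing, one can divide an assumed galactic-disk volume by N and take the cube root. The disk dimensions below are round-number assumptions, not figures from the RAND paper; with them, 100 million civilizations come out a few tens of light years apart, which shows how sensitive the ~10 LY figure is to the assumed volume and distribution:

```python
import math

# Back-of-envelope: mean spacing between N civilizations spread
# uniformly through a galactic disk. Disk dimensions are assumed
# round numbers, not values from the paper under discussion.
disk_radius_ly = 50_000
disk_thickness_ly = 1_000
volume_ly3 = math.pi * disk_radius_ly**2 * disk_thickness_ly

def mean_spacing(n_civilizations: int) -> float:
    """Cube root of the volume per civilization, in light years."""
    return (volume_ly3 / n_civilizations) ** (1 / 3)

for n in (10**6, 10**7, 10**8):
    print(f"{n:>12,} civilizations -> ~{mean_spacing(n):6.1f} ly apart")
```

Concentrating civilizations in a thinner disk or a galactic habitable zone shrinks the effective volume and pulls the spacing down toward the quoted figure.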


Interestingly recent observations and computer models seem to suggest that binary and trinary star systems can have planets in stable orbits around each star, so long as the stars orbit a common barycenter at a sufficient distance; indeed some studies claim to have detected planets circling the two main Alpha Centauri stars (the third smaller star would circle the whole system outside of the two local systems). So perhaps Cocteau's estimate is even conservative.


To get ~10 LY average spacing we should expect civilizations in at least two of the following three systems with reasonably sun-like stars: Epsilon Eridani (though it's probably too young), Tau Ceti and Alpha Centauri. To maintain the spacing, places like Gliese 86 would probably have to be inhabited too. So, either there should be loads of activity out there, or else: (a) correctly sized planets in habitable zones are very infrequent for some reason we don't yet understand; (b) for some reason we don't yet understand, life fails to get started or to evolve beyond relatively small, simple forms; (c) civilizations tend to destroy themselves.


I keep an open mind but in the absence of data all I can say is that my instincts suggest that (a), (b), and (c) are wrong, which should mean that Cocteau's methodology holds and that there is a lot going on around the galaxy.


2. Another point where we now have a bit more to go on - the old light speed limit discussion further down in the paper. We now have the Thorne wormhole and Alcubierre warp metrics and the associated requirement for negative energy or mass, and we also have the accelerating expansion of the universe, which suggests that negative energy does exist in the universe. This is much more than having no clue as to how interstellar travel might work. Possibly we've actually already figured out generally how it works, but not the details yet. Obviously we can't build anything like this until we know how to generate and control negative energy.


Things like Jack's idea about changing the flexibility of spacetime by changing the speed of light might be techniques that further augment FTL travel or reduce the negative energy requirement.





On Jan 4, 2015, at 6:28 AM, Paul Murad <ufoguypaul@yahoo.com> wrote:




There are two problems here: either the document is a fake, or the document contains disinformation.




If disinformation is to be successful, it has to have some level of truth or honest information. 




This is a necessity for establishing credibility for the entire document. Whenever we got any Soviet disinformation, the problem was to find those pieces that had legitimate information for credibility. The same would be true if this document is real. Regardless, some real information needs to be established.


Personally I feel the document is a fake because there is no clear identification of a government organization. Government types like to make sure people know where things are going to or coming from. That matters for highly classified documents, so you can refer back with any questions or points that may cause problems. I might have missed this, but with a quick scan I did not see such information...


I read the Einstein/Oppenheimer meeting document - are you implying it was a fake? If you look at it, it reads like a precursor for how this country has treated UFOs and so on. Einstein carries significant weight. If it is a fake, then it was well done! This information is similar to what Col. Corso said regarding early UFO activities, so there are some correlations. The question is how to separate the truth from the lies.


It would be interesting if there was any mention of UFOs in the Einstein papers. I doubt it. Same for John Archibald Wheeler’s papers because he switched to gravity research around 1952 at the peak of the flying saucer craze and he had top security clearance.


Regarding the typographic comments, this does not fly for several reasons. Secretaries may have left errors in documents, or, if critical, the author could have made mistakes. Remember, we did not have Word or document files; people published whatever they could get out of the typewriter.


Finally, the issue of alien communications. Apparently SETI does not work because aliens may not use electromagnetic communications moving at the speed of light. If they do not go faster than the speed of light, the messages would arrive months or years after the events. The only possibilities are a torsion field as predicted by the Russians, or gravity waves, considering recent findings from Podkletnov... Oh, I forgot, it was all disinformation. So is the Kozyrev star experiment, or the claim that jets leave a black hole not as particles from an accretion disk but from the black hole itself because of evaporation…


Gravity waves also move at c.


Now to go back to other more meaningful activities...


Paul Murad

Morningstar Applied Physics, LLC






From: Ryan Wood <rwood@majesticdocuments.com>

To: 'Robert Addinall' <beowulfr@interlog.com>; 'JACK SARFATTI' <jacksarfatti@comcast.net>; 'IFPA GROUP-EUROPE' <ifpagroup@gmail.com> 


Sent: Saturday, January 3, 2015 9:45 PM

Subject: RE: Majestic-12 Group Special Operations Manual -- REAL


If you think it might be disinformation of some sort, see my 10 page paper on this entire issue on majesticdocuments.com http://www.majesticdocuments.com/pdf/psywar.pdf


Or excerpted under authentication at www.specialoperationsmanual.com


WHY disinformation? Who are we trying to deceive? For what purpose? Scare the Russians in '54? If it were disinformation, it was so good that the KGB would have decided to assign more assets to penetrate Wright Patt, Area-51, Kirtland AFB, those people, etc.


After all, the KGB ripped off the bomb secrets with ease. Any logical military/political decision team would AVOID attracting attention to this matter. So the notion of disinformation utterly FAILS.


We know SOM1-01 was printed with a hot-lead printing press of the era, according to the author of the 1958 US Government Printing Office Style Manual. My father (Dr. Bob Wood) and I interviewed him in his home in Virginia more than a decade ago. His read was that SOM1-01 is authentic because of the raised Z in the typography, the use of "screw driver" as two words, and the capitalization of "First Aid," which is now first aid. Even the arrogance of the phrase "Central Intelligence" rather than Central Intelligence Agency suggested to him that the CIA was involved.


I can go on, but that’s not the point. Disinformation is not at all probable. Far more likely that it’s all real.  


Cheers Ryan


Ryan S. Wood


Majic Eyes Only – Earth’s Encounters With Extraterrestrial Technology


14004 Quail Ridge Drive

Broomfield, CO 80020

720-887-8171 (ph)






From: Robert Addinall [mailto:beowulfr@interlog.com] 

Sent: Friday, January 02, 2015 8:33 PM



Subject: Re: Majestic-12 Group Special Operations Manual - Website - BOGUS


I gave a couple of specific reactions, but as I said, none of them positively confirm or deny on their own. Sometimes a manual will use superlatives repeatedly or spend a lot of time on vague generalities about the purpose of an organization which the people reading it should already know (for example, "very highest security" will not often appear, since "highest security" already imparts the gravity of the situation in the context). Overall, Col. Alexander and others of us have a fair bit of experience with NATO-nation military documents, so you get a sense of whether something smells off or not. Any determined disinformation attempt would do a decent job of forging a control page and the initials/signatures of people who should have been there at the time, so again it's very difficult to confirm or reject based on that.


I actually just finished writing another message about why I suspect that a lot of disinformation is out there about UFOs and will send it momentarily.


Incidentally I don't have any problem with you selling reproductions of MJ 12 documents; there is a market for it, and it's also valuable to see what disinformation is out there and to see if there are common threads or bits of good info that can be teased out.




From: Ryan Wood

Sent: Friday, January 2, 2015 9:47 PM



Subject: RE: Majestic-12 Group Special Operations Manual - Website - BOGUS?


If you think it’s a “fabrication and doesn’t ring true,” then those comments are useless; it’s just speculation on your part.


This is 1954 Top Secret stuff… Why do you even think you have a perspective on what would be true or not?


So now, I’ll give you some investigated facts. 


The change control page has the initials of JRT and EWL in it, where those document control / MJ-12 control officers changed pages from ‘54 to ‘57.


We know the manual came from Kirtland AFB UNIT KB-88, so I checked the phone book exhaustively for the JRTs and EWLs in 1955, and sure enough Lt. J.R. Totten (JRT) and Col. Edward Levine (EWL) both lived on base on Perimeter Road. Furthermore, our private detectives interviewed EWL’s family, and they confirmed his “special” military service.


I could go on, but I think it’s just a waste of time.  Please give me specific reasons why you think it’s a fake.


Cheers Ryan


Ryan S. Wood


Majic Eyes Only – Earth’s Encounters With Extraterrestrial Technology


14004 Quail Ridge Drive

Broomfield, CO 80020

720-887-8171 (ph)



From: JACK SARFATTI [mailto:jacksarfatti@comcast.net] 

Sent: Friday, January 02, 2015 3:41 PM

To: IFPA GROUP-EUROPE; rswood@majesticdocuments.com


Subject: Re: Majestic-12 Group Special Operations Manual - Website - BOGUS?


Right, but why is Ryan pushing this? Who really wrote it?



On Jan 2, 2015, at 1:30 PM, Robert Addinall wrote:


Yes it just doesn't ring true.


Of course it depends on the writers and editors, but military manuals from NATO countries usually avoid use of superlatives like "very." The writing doesn't ring true.


Also, a lot of the content is actually somewhat vague, dressed up a bit to appear specific. Again, this can be a problem with real manuals, but it's a warning sign.


We also know, generally, that the MJ-12 conspiracy stuff is smack in the middle of all the disinformation that floats around on this topic.


Taking all the clues together it just smells like a fabrication.


Certain accurate details may have been inserted in it, which is common with disinformation, but overall it's still misdirection.


On Jan 2, 2015, at 1:12 PM, IFPA GROUP-EUROPE <ifpagroup@gmail.com> wrote:


Yes Jack, I concur ......

Pure disinformation... 

This is BS for mass UFO distraction from the real things.




On Fri, Jan 2, 2015 at 9:44 PM, JACK SARFATTI <jacksarfatti@comcast.net> wrote:

Colonel John Alexander thinks the manual is bogus.


On Jan 2, 2015, at 10:08 AM, IFPA GROUP-EUROPE <ifpagroup@gmail.com> wrote:


Jack et al ..

Here are links to the PDF of the "Manual":








On Fri, Jan 2, 2015 at 6:59 PM, JACK SARFATTI <jacksarfatti@comcast.net> wrote:

comments on this?






    • Jack Sarfatti shared a link.
      Begin forwarded message:

      From: "Academia.edu" <notifications@academia.edu>
      Subject: You just got 35 views on "ER=EPR discovered by Jack Sarfatti in 1974"
      Date: November 20, 2013 at 4:26:15 PM PST
      To: jacksarfatti
      Reply-To: "Academia.edu Support" <support@academia.edu>


      You got 35 views from Argentina, the United Kingdom, the United States, Australia, the Islamic Republic of Iran, Israel, Canada, Brazil, Italy, and Spain on "ER=EPR discovered by Jack Sarfatti in 1974".



      From my Starship book under construction
      Only recently, Lenny Susskind and his students working on hologram-universe ideas rediscovered this “ER = EPR”[i] connection in a more mathematically rigorous manner than my precognitive remote-viewing intuitions over forty years ago. Back then, no one else was linking EPR with ER to my knowledge. I conjecture, semi-seriously given the claims of Puthoff and Targ at SRI[ii], that since Lenny and I worked together at Cornell in 1963-4, I was glimpsing his work of 2012 back in 1974.

      1973: H. G. Ellis’s “drainhole,” the first plausible stargate candidate where the gravity wormhole is coupled to a massless negative energy spin zero field. That year is also a year of high strangeness, but that story is not for this book.

      1974: Hawking shows that all black holes radiate black-body radiation[i] whose peak wavelength λmax is roughly the square root of the area-entropy of the black hole’s horizon, i.e., λmax ~ A^(1/2), where the entropy S ~ kB·A/4.
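The λmax ~ A^(1/2) relation can be sanity-checked numerically with standard constants; this is a sketch only, and order-unity factors are not tracked:

```python
import math

# Order-of-magnitude check that a black hole's thermal peak wavelength
# is comparable to the square root of its horizon area (sketch only;
# O(1) factors are not tracked carefully here).
hbar = 1.0546e-34   # J s
c = 2.998e8         # m / s
G = 6.674e-11       # m^3 / (kg s^2)
k_B = 1.381e-23     # J / K
M_sun = 1.989e30    # kg

M = M_sun
T_hawking = hbar * c**3 / (8 * math.pi * G * M * k_B)   # ~6e-8 K
lam_peak = 2.898e-3 / T_hawking                         # Wien's law, m

r_s = 2 * G * M / c**2                 # Schwarzschild radius, ~3 km
sqrt_area = math.sqrt(4 * math.pi) * r_s

print(f"T_H        ~ {T_hawking:.2e} K")
print(f"lambda_max ~ {lam_peak:.2e} m")
print(f"A^(1/2)    ~ {sqrt_area:.2e} m")
print(f"ratio      ~ {lam_peak / sqrt_area:.1f}")   # order unity
```

For a solar-mass hole the peak wavelength and the square root of the horizon area agree to within a factor of a few, as the scaling claims.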

      During this time I conjectured in the pop-physics book “Space-Time and Beyond” that Einstein-Rosen bridges and Einstein-Podolsky-Rosen[ii] quantum entanglement[iii] were two sides of the same coin in some not yet well understood sense. This was a precognitive intuition on my part.

      Remember I wrote the quote below in 1974 almost 40 years ago. See David Kaiser's "How the Hippies Saved Physics" about me and my associates back then. We were way ahead of the pack.

      From the 1975 book Space-Time and Beyond E.P. Dutton co-authored with Fred Alan Wolf and artist Bob Toben - First edition. p. 134 "Each part of space is connected to every other part through basic units of interconnection, called wormholes. Signals move through the constantly appearing and disappearing (virtual) wormhole connections, providing instant communication between all parts of space. These signals can be likened to pulses of nerve cells of a great cosmic brain that permeates all parts of space. This is a point of view motivated by Einstein's general theory of relativity in the form of geometrodynamics. A parallel point of view is given in the quantum theory as interpreted by Bohm. In my opinion this is no accident because I suspect that general relativity and quantum theory are simply two complementary aspects of a deeper theory that will involve a kind of cosmic consciousness as the key concept. Bohm writes of “quantum interconnectedness": 

      However there has been too little emphasis on what is, in our view, the most fundamentally different new feature of all, i.e., the intimate interconnection of different systems that are not in spatial contact ... the well known experiment of Einstein, Podolsky and Rosen ... Recently interest in this question has been stimulated by the work of Bell..." D. Bohm & B. Hiley...

      End of excerpt from 1975 Space-Time and Beyond.

      The Wheeler-Fuller pinch-off would then correspond to signal locality (later called “passion at a distance”) corresponding to unitary linear orthodox quantum theory. Stargate traversable wormholes would correspond to what Antony Valentini would years later call “signal nonlocality” in a more general post-quantum theory that was both non-unitary and nonlinear in the sense later clarified independently by Steven Weinberg[iv] and Henry Stapp. [v]

      [i] http://en.wikipedia.org/wiki/Black-body_radiation

      [ii] http://en.wikipedia.org/wiki/EPR_paradox

      [iii] http://en.wikipedia.org/wiki/Quantum_entanglement

      [iv] http://www.npl.washington.edu/AV/altvw48.html

      Steven Weinberg, Physical Review Letters 62, 485 (1989);

      Joseph Polchinski, Physical Review Letters 66, 397 (1991).

      [v] http://www.fourmilab.ch/rpkp/stapp.html

      Henry Stapp Physical Review A, Vol.50, No.1, July 1994

      [i] http://arxiv.org/pdf/1308.0289v1.pdf

      http://motls.blogspot.com/2013/07/papers-on-er-epr-correspondence.html Lubos Motl 


      [ii] http://www.biomindsuperpowers.com/Pages/CIA-InitiatedRV.html
The theory of relativity deals with the geometric structure of a four-dimensional spacetime. Quantum mechanics describes properties of matter. Combining these two theoretical edifices is a difficult proposition. For example, there is no way of defining a relativistic proper time for a quantum system which is spread all over space. A proper time can in principle be defined for a massive apparatus (‘‘observer’’) whose Compton wavelength is so small that its center of mass has classical coordinates and follows a continuous world line. However, when there is more than one apparatus, there is no role for the private proper times that might be attached to the observers’ world lines. Therefore a physical situation involving several observers in relative motion cannot be described by a wave function with a relativistic transformation law (Aharonov and Albert, 1981; Peres, 1995, and references therein). This should not be surprising, because a wave function is not a physical object. It is only a tool for computing the probabilities of objective macroscopic events.
Einstein’s [special] principle of relativity asserts that there are no privileged inertial frames.

[Comment #3: Einstein's general principle of relativity is that there are no privileged local accelerating frames (AKA LNIFs). In addition, Einstein's equivalence principle says that one can always find a local inertial frame (LIF) coincident with an LNIF (over a small enough region of 4D spacetime) in which, to a good approximation, Newton's 1/r^2 force is negligible ("Einstein's happiest thought"). Therefore, Newton's universal "gravity force" is a purely inertial, fictitious pseudo-force, exactly like the Coriolis, centrifugal and Euler forces that are artifacts of the proper acceleration of the detector, having no real effect on the test particle being measured by the detector. The latter assumes no rigid constraint between detector and test particle. For example, a test particle clamped to the edge r of a uniformly slowly rotating disk will have a real EM force of constraint equal to m ω × (ω × r).]
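The last point in the comment can be checked directly: for a particle clamped at radius r on a disk spinning at angular velocity ω, the real constraint force m ω × (ω × r) has magnitude m ω² r and points inward toward the axis. A minimal numpy sketch with illustrative values:

```python
import numpy as np

# Sketch of the bracketed comment's last point: a particle clamped at
# radius r on a disk rotating at angular velocity omega needs a real
# (e.g. electromagnetic) constraint force m * omega x (omega x r),
# whose magnitude is m * omega^2 * r. Values are illustrative.
m = 2.0                                  # kg
omega = np.array([0.0, 0.0, 3.0])        # rad/s, spin about z-axis
r = np.array([0.5, 0.0, 0.0])            # m, particle on the rim

centripetal = m * np.cross(omega, np.cross(omega, r))
print(centripetal)                       # points inward, toward the axis
print(np.linalg.norm(centripetal))       # equals m * |omega|^2 * |r|
```

An accelerometer riding with the clamped particle registers this force; a freely falling particle beside it registers nothing, which is the sense in which the "gravity force" is fictitious.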
This does not imply the necessity or even the possibility of using manifestly symmetric four-dimensional notations. This is not a peculiarity of relativistic quantum mechanics. Likewise, in classical canonical theories, time has a special role in the equations of motion.

The relativity principle is extraordinarily restrictive. For example, in ordinary classical mechanics with a finite number of degrees of freedom, the requirement that the canonical coordinates have the meaning of positions, so that particle trajectories q(t) transform like four-dimensional world lines, implies that these lines consist of straight segments. Long-range interactions are forbidden; there can be only contact interactions between point particles (Currie, Jordan, and Sudarshan, 1963; Leutwyler, 1965). Nontrivial relativistic dynamics requires an infinite number of degrees of freedom, which are labeled by the spacetime coordinates (this is called a field theory).
Combining relativity and quantum theory is not only a difficult technical question of how to formulate dynamical laws. The ontologies of these theories are radically different. Classical theory asserts that fields, velocities, etc., transform in a definite way and that the equations of motion of particles and fields behave covariantly. … For example, if the expression for the Lorentz force is written … in one frame, the same expression is valid in any other frame. These symbols … have objective values. They represent entities that really exist, according to the theory. On the other hand, wave functions are not defined in spacetime, but in a multidimensional Hilbert space. They do not transform covariantly when there are interventions by external agents, as will be seen in Sec. III. Only the classical parameters attached to each intervention transform covariantly. Yet, in spite of the noncovariance of ρ, the final results of the calculations (the probabilities of specified sets of events) must be Lorentz invariant.
As a simple example, consider our two observers, conventionally
called Alice and Bob,4 holding a pair of spin-1/2
particles in a singlet state. Alice measures sz and finds
+1, say. This tells her what the state of Bob’s particle is,
namely, the probabilities that Bob would obtain + or - 1 if he
measures (or has measured, or will measure) s along
any direction he chooses. This is purely counterfactual
information: nothing changes at Bob’s location until he
performs the experiment himself, or receives a message
from Alice telling him the result that she found. In particular,
no experiment performed by Bob can tell him
whether Alice has measured (or will measure) her half
of the singlet.
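The Alice/Bob bookkeeping here can be made concrete in a few lines of NumPy (an illustrative sketch, not from the Peres paper): conditioning on Alice's +1 outcome gives perfect anticorrelation along the same axis, yet Bob's unconditional reduced state is maximally mixed no matter what Alice does — exactly the "purely counterfactual information" point.

```python
import numpy as np

# Pauli matrices and the spin-1/2 singlet state (|01> - |10>)/sqrt(2)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
rho = np.outer(singlet, singlet.conj())

def spin_projector(theta):
    """Projector onto spin +1 along a direction at angle theta in the x-z plane."""
    n_dot_sigma = np.cos(theta) * sz + np.sin(theta) * sx
    return (np.eye(2) + n_dot_sigma) / 2

# Alice measures along z and finds +1; condition Bob's state on that outcome.
P_alice = np.kron(spin_projector(0.0), np.eye(2))
p_plus = np.real(np.trace(P_alice @ rho))          # P(Alice finds +1) = 1/2
rho_cond = P_alice @ rho @ P_alice / p_plus
for theta in (0.0, np.pi / 3):
    P_bob = np.kron(np.eye(2), spin_projector(theta))
    p_joint = np.real(np.trace(P_bob @ rho_cond))
    # theta = 0 gives 0: perfect anticorrelation along the same axis
    print(theta, p_joint)

# No-signaling: Bob's unconditional statistics ignore Alice entirely.
rho_bob = np.einsum('abac->bc', rho.reshape(2, 2, 2, 2))  # partial trace over Alice
print(np.allclose(rho_bob, np.eye(2) / 2))                # maximally mixed
```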
A seemingly paradoxical way of presenting these results
is to ask the following naive question. Suppose that
Alice finds that sz = 1 while Bob does nothing. When
does the state of Bob’s particle, far away, become the
one for which sz = -1 with certainty? Although this
question is meaningless, it may be given a definite answer:
Bob’s particle state changes instantaneously. In
which Lorentz frame is this instantaneous? In any
frame! Whatever frame is chosen for defining simultaneity,
the experimentally observable result is the same, as
can be shown in a formal way (Peres, 2000b). Einstein
himself was puzzled by what seemed to be the instantaneous
transmission of quantum information. In his autobiography,
he used the words ‘‘telepathically’’ and
‘‘spook’’ (Einstein, 1949). …
In the laboratory, any experiment
has to be repeated many times in order to infer a
law; in a theoretical discussion, we may imagine an infinite
number of replicas of our gedanken experiment, so
as to have a genuine statistical ensemble. Yet the validity
of the statistical nature of quantum theory is not restricted
to situations in which there are a large number
of similar systems. Statistical predictions do apply to
single events. When we are told that the probability of
precipitation tomorrow is 35%, there is only one tomorrow.
This tells us that it may be advisable to carry an
umbrella. Probability theory is simply the quantitative
formulation of how to make rational decisions in the
face of uncertainty (Fuchs and Peres, 2000). A lucid
analysis of how probabilistic concepts are incorporated
into physical theories is given by Emch and Liu (2002).
[My comment #4: Peres is correct, but there is no conflict with Bohm's ontological
interpretation here. The Born probability rule is not fundamental to quantum reality
in Bohm's view, but is a limiting case when the beables are in thermal equilibrium.]
Some trends in modern quantum information theory
may be traced to security problems in quantum communication.
A very early contribution was Wiesner’s seminal
paper ‘‘Conjugate Coding,’’ which was submitted
circa 1970 to IEEE Transactions on Information Theory
and promptly rejected because it was written in a jargon
incomprehensible to computer scientists (this was actually
a paper about physics, but it had been submitted to
a computer science journal). Wiesner’s article was finally
published (Wiesner, 1983) in the newsletter of ACM
SIGACT (Association for Computing Machinery, Special
Interest Group in Algorithms and Computation
Theory). That article tacitly assumed that exact duplication
of an unknown quantum state was impossible, well
before the no-cloning theorem (Dieks, 1982; Wootters
and Zurek, 1982) became common knowledge. Another
early article, ‘‘Unforgeable Subway Tokens’’ (Bennett
et al., 1983) also tacitly assumed the same.
A. The ambivalent quantum observer
Quantum mechanics is used by theorists in two different
ways. It is a tool for computing accurate relationships
between physical constants, such as energy levels,
cross sections, transition rates, etc. These calculations
are technically difficult, but they are not controversial.
In addition to this, quantum mechanics also provides
statistical predictions for results of measurements performed
on physical systems that have been prepared in a
specified way. 
[My comment #5: There is no mention of Yakir Aharonov's intermediate-present "weak measurements,"
with both a history (past) pre-selection and a destiny (future) post-selection constraint. The latter, in
Wheeler delayed-choice mode, would force the inference of real back-from-the-future retrocausality.
This would still be consistent with Abner Shimony's "passion at a distance," i.e. "signal locality,"
in that the observer at the present weak measurement would not know what the future constraint
actually will be. In contrast, with signal nonlocality (Sarfatti, 1976 MIT Tech Review (Martin Gardner) &
Antony Valentini (2002)), such spooky precognition would be possible, as in Russell Targ's reports on
CIA-funded RV experiments at SRI in the mid-70's and 80's.
This is, on the face of it, a gross violation of orthodox
quantum theory as laid out here in the Peres review paper.]
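For readers unfamiliar with Aharonov's scheme, a minimal numerical sketch of a weak value (my illustration, textbook formula A_weak = <f|A|i>/<f|i>, not from the Peres paper): with nearly orthogonal pre- and post-selected states, the weak value of a Pauli operator lands far outside its eigenvalue range [-1, +1].

```python
import numpy as np

# Weak value A_weak = <f|A|i> / <f|i> for A = Pauli sz.
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def weak_value(pre, post, A):
    """Aharonov-Albert-Vaidman weak value with pre-selection |i> and post-selection |f>."""
    return (post.conj() @ A @ pre) / (post.conj() @ pre)

eps = 0.05
pre = np.array([np.cos(np.pi / 4), np.sin(np.pi / 4)])              # |i> = (|0>+|1>)/sqrt(2)
post = np.array([np.cos(np.pi / 4 + eps), -np.sin(np.pi / 4 + eps)])  # nearly orthogonal to |i>

wv = weak_value(pre, post, sz)
print(wv)   # approximately -cot(eps): "amplified" far outside [-1, +1]
```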
The quantum measuring process is the interface
of classical and quantum phenomena. The preparation
and measurement are performed by macroscopic
devices, and these are described in classical terms. The
necessity of using a classical terminology was emphasized
by Niels Bohr (1927) from the very early days of
quantum mechanics. Bohr’s insistence on a classical description
was very strict. He wrote (1949)
‘‘ . . . by the word ‘experiment’ we refer to a situation
where we can tell others what we have done and what
we have learned and that, therefore, the account of the
experimental arrangement and of the results of the observations
must be expressed in unambiguous language,
with suitable application of the terminology of
classical physics.’’
Note the words ‘‘we can tell.’’ Bohr was concerned
with information, in the broadest sense of this term. He
never said that there were classical systems or quantum
systems. There were physical systems, for which it was
appropriate to use the classical language or the quantum
language. There is no guarantee that either language
gives a perfect description, but in a well-designed experiment
it should be at least a good approximation.
Bohr’s approach divides the physical world into ‘‘endosystems’’
(Finkelstein, 1988), which are described by
quantum dynamics, and ‘‘exosystems’’ (such as measuring
apparatuses), which are not described by the dynamical
formalism of the endosystem under consideration.
A physical system is called ‘‘open’’ when parts of
the universe are excluded from its description. In different
Lorentz frames used by observers in relative motion,
different parts of the universe may be excluded. The
systems considered by these observers are then essentially
different, and no Lorentz transformation exists
that can relate them (Peres and Terno, 2002).
It is noteworthy that Bohr never described the measuring
process as a dynamical interaction between an
exophysical apparatus and the system under observation.
He was, of course, fully aware that measuring apparatuses
are made of the same kind of matter as everything
else, and they obey the same physical laws. It is
therefore tempting to use quantum theory in order to
investigate their behavior during a measurement. However,
if this is done, the quantized apparatus loses its
status as a measuring instrument. It becomes a mere intermediate
system in the measuring process, and there
must still be a final instrument that has a purely classical
description (Bohr, 1939).
Measurement was understood by Bohr as a primitive
notion. He could thereby elude questions which caused
considerable controversy among other authors. A
quantum-dynamical description of the measuring process
was first attempted by John von Neumann in his
treatise on the mathematical foundations of quantum
theory (1932). In the last section of that book, as in an
afterthought, von Neumann represented the apparatus
by a single degree of freedom, whose value was correlated
with that of the dynamical variable being measured.
Such an apparatus is not, in general, left in a definite
pure state, and it does not admit a classical
description. Therefore von Neumann introduced a second
apparatus which observes the first one, and possibly
a third apparatus, and so on, until there is a final measurement,
which is not described by quantum dynamics
and has a definite result (for which quantum mechanics
can give only statistical predictions). The essential point
that was suggested, but not proved by von Neumann, is
that the introduction of this sequence of apparatuses is
irrelevant: the final result is the same, irrespective of the
location of the ‘‘cut’’ between classical and quantum worlds.
These different approaches of Bohr and von Neumann
were reconciled by Hay and Peres (1998), who …
8At this point, von Neumann also speculated that the final
step involves the consciousness of the observer—a bizarre
statement in a mathematically rigorous monograph (von Neumann, 1932).
B. The measuring process
Dirac (1947) wrote that ‘‘a measurement always
causes the system to jump into an eigenstate of the dynamical
variable being measured.’’ Here, we must be
careful: a quantum jump (also called a collapse) is something
that happens in our description of the system, not
to the system itself. Likewise, the time dependence of
the wave function does not represent the evolution of a
physical system. It only gives the evolution of probabilities
for the outcomes of potential experiments on that
system (Fuchs and Peres, 2000).
Let us examine more closely the measuring process.
First, we must refine the notion of measurement and
extend it to a more general one: an intervention. An
intervention is described by a set of parameters which
include the location of the intervention in spacetime, referred
to an arbitrary coordinate system. We also have
to specify the speed and orientation of the apparatus in
the coordinate system that we are using as well as various
other input parameters that control the apparatus,
such as the strength of a magnetic field or that of a rf
pulse used in the experiment. The input parameters are
determined by classical information received from past
interventions, or they may be chosen arbitrarily by the
observer who prepares that intervention or by a local
random device acting in lieu of the observer.
[My comment #6: Peres, in my opinion, makes another mistake.
Future interventions will affect past weak measurements.

Back From the Future

A series of quantum experiments shows that measurements performed in the future can influence the present. Does that mean the universe has a destiny—and the laws of physics pull us inexorably toward our prewritten fate?

By Zeeya Merali | Thursday, August 26, 2010
http://discovermagazine.com/2010/apr/01-back-from-the-future#.UieOnhac5Hw ]
An intervention has two consequences. One is the acquisition
of information by means of an apparatus that
produces a record. This is the ‘‘measurement.’’ Its outcome,
which is in general unpredictable, is the output of
the intervention. The other consequence is a change of
the environment in which the quantum system will
evolve after completion of the intervention. For example,
the intervening apparatus may generate a new
Hamiltonian that depends on the recorded result. In particular,
classical signals may be emitted for controlling
the execution of further interventions. These signals are,
of course, limited to the velocity of light.
The experimental protocols that we consider all start
in the same way, with the same initial state ... , and the
first intervention is the same. However, later stages of
the experiment may involve different types of interventions,
possibly with different spacetime locations, depending
on the outcomes of the preceding events. Yet,
assuming that each intervention has only a finite number
of outcomes, there is for the entire experiment only a
finite number of possible records. (Here, the word
record means the complete list of outcomes that occurred
during the experiment. We do not want to use the
word history, which has acquired a different meaning in
the writings of some quantum theorists.)
Each one of these records has a definite probability in
the statistical ensemble. In the laboratory, experimenters
can observe its relative frequency among all the records
that were obtained; when the number of records tends
to infinity, this relative frequency is expected to tend to
the true probability. The aim of theory is to predict the
probability of each record, given the inputs of the various
interventions (both the inputs that are actually controlled
by the local experimenter and those determined
by the outputs of earlier interventions). Each record is
objective: everyone agrees on what happened (e.g.,
which detectors clicked). Therefore, everyone agrees on
what the various relative frequencies are, and the theoretical
probabilities are also the same for everyone.
Interventions are localized in spacetime, but quantum
systems are pervasive. In each experiment, irrespective
of its history, there is only one quantum system, which
may consist of several particles or other subsystems, created
or annihilated at the various interventions. Note
that all these properties still hold if the measurement
outcome is the absence of a detector click. It does not
matter whether this is due to an imperfection of the detector
or to a probability less than 1 that a perfect detector
would be excited. The state of the quantum system
does not remain unchanged. It has to change to
respect unitarity. The mere presence of a detector that
could have been excited implies that there has been an
interaction between that detector and the quantum system.
Even if the detector has a finite probability of remaining
in its initial state, the quantum system correlated
to the latter acquires a different state (Dicke,
1981). The absence of a click, when there could have
been one, is also an event.
The measuring process involves not only the physical
system under study and a measuring apparatus (which
together form the composite system C) but also their
environment, which includes unspecified degrees of freedom
of the apparatus and the rest of the world. These
unknown degrees of freedom interact with the relevant
ones, but they are not under the control of the experimenter
and cannot be explicitly described. Our partial
ignorance is not a sign of weakness. It is fundamental. If
everything were known, acquisition of information
would be a meaningless concept.
A complete description of C involves both macroscopic
and microscopic variables. The difference between
them is that the environment can be considered as
adequately isolated from the microscopic degrees of
freedom for the duration of the experiment and is not
influenced by them, while the environment is not isolated
from the macroscopic degrees of freedom. For example,
if there is a macroscopic pointer, air molecules bounce
from it in a way that depends on the position of that
pointer. Even if we can neglect the Brownian motion of
a massive pointer, its influence on the environment leads
to the phenomenon of decoherence, which is inherent to
the measuring process.
An essential property of the composite system C,
which is necessary to produce a meaningful measurement,
is that its states form a finite number of orthogonal
subspaces which are distinguishable by the observer.
[My comment #7: This is not the case for Aharonov's weak measurements, where
<A>weak = <history|A|destiny>/<history|destiny>.
Nor is it true when Alice's orthogonal micro-states are entangled with Bob's faraway, distinguishably non-orthogonal macro-quantum Glauber coherent (and possibly squeezed) states. See:
  1. "Coherent states," Wikipedia: in quantum mechanics, a coherent state is the specific quantum state of the quantum harmonic oscillator whose dynamics most closely resembles the …
  2. B. C. Sanders, "Review of Entangled Coherent States," arXiv (quant-ph), Dec 8, 2011: a review of entangled coherent state research since its first implicit use in 1967.
|Alice,Bob> = (1/√2)[|Alice +1>|Bob alpha> + |Alice -1>|Bob beta>]
<Alice +1|Alice -1> = 0
<Bob alpha|Bob beta> =/= 0
e.g., taking the partial trace over Bob's states gives Alice's +1 outcome the weight (1/2)[1 + |<Bob alpha|Bob beta>|^2] > 1/2;
this is formally like a weak measurement, where the usual Born probability rule breaks down.
Complete isolation from environmental decoherence is assumed here.
This is a clear violation of "passion at a distance"; the no-entanglement-signaling arguments rest on axioms that are empirically false in my opinion.
"The statistics of Bob’s result are not affected at all by what Alice may simultaneously do somewhere else." (Peres)
is false.
While a logically correct formal proof is desirable in physics, Nature has ways of leapfrogging over its premises.
One can have constrained pre- and post-selected conditional probabilities that are greater than 1, negative, or even complex numbers.
All of these correspond to observable effects in the laboratory — see Aephraim Steinberg's experimental papers,
University of Toronto.]
Each macroscopically distinguishable subspace corresponds
to one of the outcomes of the intervention and
defines a POVM element Em, given explicitly by Eq. (8)
below. …
C. Decoherence
Up to now, quantum evolution is well defined and it is
in principle reversible. It would remain so if the environment
could be perfectly isolated from the macroscopic
degrees of freedom of the apparatus. This demand is of
course self-contradictory, since we have to read the result
of the measurement if we wish to make any use of it.
A detailed analysis of the interaction with the environment,
together with plausible hypotheses (Peres, 2000a),
shows that states of the environment that are correlated
with subspaces of C with different labels m can be treated
as if they were orthogonal. This is an excellent approximation
(physics is not an exact science, it is a science of
approximations). The resulting theoretical predictions
will almost always be correct, and if any rare small deviation
from them is ever observed, it will be considered
as a statistical quirk or an experimental error.
The density matrix of the quantum system is thus effectively
block diagonal, and all our statistical predictions
are identical to those obtained for an ordinary mixture
of (unnormalized) pure states. …
This process is called decoherence. Each subspace
m is stable under decoherence—it is their relative
phase that decoheres. From this moment on, the macroscopic
degrees of freedom of C have entered into the
classical domain. We can safely observe them and ‘‘lay
on them our grubby hands’’ (Caves, 1982). In particular,
they can be used to trigger amplification mechanisms
(the so-called detector clicks) for the convenience of the experimenter.
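The block-diagonalization described here can be caricatured in a few lines (a toy model of my own, not a dynamical calculation): multiplying the off-diagonal pointer coherences by the environment overlap <e_m|e_n> leaves the outcome probabilities on the diagonal untouched while the relative phases decay.

```python
import numpy as np

# Toy decoherence: pointer coherences are suppressed by the overlap of the
# environment states correlated with the two pointer labels.
rho = np.array([[0.6, 0.4],
                [0.4, 0.4]], dtype=complex)   # coherent superposition of two pointer states

def decohere(rho, env_overlap):
    """Scale off-diagonal elements by <e_0|e_1>; the diagonal is untouched."""
    out = rho.copy()
    out[0, 1] *= env_overlap
    out[1, 0] *= np.conj(env_overlap)
    return out

for overlap in (1.0, 0.1, 0.0):
    print(overlap)
    print(decohere(rho, overlap))
# outcome probabilities survive; only the relative phase (coherence) decoheres
```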
Some authors claim that decoherence may provide a
solution of the ‘‘measurement problem,’’ with the particular
meaning that they attribute to that problem
(Zurek, 1991). Others dispute this point of view in their
comments on the above article (Zurek, 1993). A reassessment
of this issue and many important technical details
were recently published by Zurek (2002, 2003). Yet
decoherence has an essential role, as explained above. It
is essential that we distinguish decoherence, which results
from the disturbance of the environment by the
apparatus (and is a quantum effect), from noise, which
would result from the disturbance of the system or the
apparatus by the environment and would cause errors.
Noise is a mundane classical phenomenon, which we ignore
in this review.
E. The no-communication theorem
We now derive a sufficient condition that no instantaneous
information transfer can result from a distant intervention.
We shall show that the condition is
[Am, Bn] = 0,
where Am and Bn are Kraus matrices for the observation
of outcomes m by Alice and n by Bob.
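A quick numerical sanity check of the no-communication claim (my sketch, not the paper's general derivation): summing over the outcomes of Alice's unread local measurement, A_m ρ A_m†, leaves Bob's reduced density matrix exactly as it was.

```python
import numpy as np

rng = np.random.default_rng(0)

def partial_trace_A(rho):
    """Trace out Alice (first qubit) of a two-qubit density matrix."""
    return np.einsum('abac->bc', rho.reshape(2, 2, 2, 2))

# A random pure two-qubit state.
psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi /= np.linalg.norm(psi)
rho = np.outer(psi, psi.conj())

# Alice's projective z-basis measurement as Kraus operators A_m (x) I.
kraus = [np.kron(np.diag([1, 0]).astype(complex), np.eye(2)),
         np.kron(np.diag([0, 1]).astype(complex), np.eye(2))]

rho_after = sum(K @ rho @ K.conj().T for K in kraus)   # unread measurement
print(np.allclose(partial_trace_A(rho_after), partial_trace_A(rho)))
```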
[My comment #8: "The most beautiful theory is murdered by an ugly fact." - Feynman
e.g., Libet-Radin-Bierman presponse in living-brain data;
SRI CIA-vetted reports of remote viewing by living brains. See:
  1. "CIA-Initiated Remote Viewing At Stanford Research Institute" by H. E. Puthoff (Figure 6, left to right: Christopher Green, Pat Price, and Hal Puthoff).
  2. "Harold E. Puthoff," Wikipedia.
  3. "Remote viewing," Wikipedia: remote viewing research by Russell Targ and Hal Puthoff at Stanford Research Institute in the 1970s.
  4. "Dr. Harold Puthoff on Remote Viewing," YouTube, Apr 28, 2011 (uploaded by corazondelsur): Dr. Hal Puthoff is considered the father of the US government's Remote Viewing program.
  5. "Hal Puthoff," Remoteviewed.com: Dr. Harold E. Puthoff is Director of the Institute for Advanced Studies at Austin, a theoretical and experimental physicist specializing in fundamental …]
On Sep 4, 2013, at 9:06 AM, JACK SARFATTI <adastra1@icloud.com> wrote:
Peres here is only talking about Von Neumann's strong measurements not 
Aharonov's weak measurements.

Standard textbooks on quantum mechanics
tell you that observable quantities are represented by
Hermitian operators, that their possible values are the
eigenvalues of these operators, and that the probability
of detecting eigenvalue a, corresponding to eigenvector
|a>, is |<a|psi>|^2, where |psi> is the (pure) state of the
quantum system that is observed. With a bit more sophistication
to include mixed states, the probability can
be written in a general way <a|rho|a> …
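The textbook recipe just quoted is easy to spell out numerically (illustrative values, my sketch): diagonalize a Hermitian "observable," then the Born probabilities <a|rho|a> over its eigenvectors are non-negative and sum to 1.

```python
import numpy as np

# Born rule for a mixed state: p(a) = <a|rho|a> over the eigenvectors |a> of A.
A = np.array([[2, 1 - 1j], [1 + 1j, 0]], dtype=complex)   # Hermitian observable
rho = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)   # valid density matrix

eigvals, eigvecs = np.linalg.eigh(A)
probs = [np.real(eigvecs[:, k].conj() @ rho @ eigvecs[:, k]) for k in range(2)]
print(eigvals, probs, sum(probs))   # probabilities are >= 0 and sum to Tr(rho) = 1
```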
This is nice and neat, but it does not describe what
happens in real lifeQuantum phenomena do not occur
in Hilbert space; they occur in a laboratory. If you visit a
real laboratory, you will never find Hermitian operators
there. All you can see are emitters (lasers, ion guns, synchrotrons,
and the like) and appropriate detectors. In
the latter, the time required for the irreversible act of
amplification (the formation of a microscopic bubble in
a bubble chamber, or the initial stage of an electric discharge)
is extremely brief, typically of the order of an
atomic radius divided by the velocity of light. Once irreversibility
has set in, the rest of the amplification process
is essentially classical. It is noteworthy that the time and
space needed for initiating the irreversible processes are
incomparably smaller than the macroscopic resolution
of the detecting equipment.
The experimenter controls the emission process and
observes detection events. The theorist’s problem is to
predict the probability of response of this or that detector,
for a given emission procedure. It often happens
that the preparation is unknown to the experimenter,
and then the theory can be used for discriminating between
different preparation hypotheses, once the detection
outcomes are known.
Many physicists, perhaps a majority, have an intuitive,
realistic worldview and consider a quantum state as a
physical entity. Its value may not be known, but in principle
the quantum state of a physical system would be
well defined. However, there is no experimental evidence
whatsoever to support this naive belief. On the
contrary, if this view is taken seriously, it may lead to
bizarre consequences, called ‘‘quantum paradoxes.’’
These so-called paradoxes originate solely from an incorrect
interpretation of quantum theory, which is thoroughly
pragmatic and, when correctly used, never yields
two contradictory answers to a well-posed question. It is
only the misuse of quantum concepts, guided by a pseudorealistic
philosophy, that leads to paradoxical results.
[My comment #2: Here is the basic conflict between epistemological vs ontological views of quantum reality.]
In this review we shall adhere to the view that ρ is
only a mathematical expression which encodes information
about the potential results of our experimental interventions.
The latter are commonly called
‘‘measurements’’—an unfortunate terminology, which
gives the impression that there exists in the real world
some unknown property that we are measuring. Even
the very existence of particles depends on the context of
our experiments. In a classic article, Mott (1929) wrote
‘‘Until the final interpretation is made, no mention
should be made of the α ray being a particle at all.’’
Drell (1978a, 1978b) provocatively asked ‘‘When is a
particle?’’ In particular, observers whose world lines are
accelerated record different numbers of particles, as will
be explained in Sec. V.D (Unruh, 1976; Wald, 1994).
1The theory of relativity did not cause as much misunderstanding
and controversy as quantum theory, because people
were careful to avoid using the same nomenclature as in nonrelativistic
physics. For example, elementary textbooks on
relativity theory distinguish ‘‘rest mass’’ from ‘‘relativistic
mass’’ (hard-core relativists call them simply ‘‘mass’’ and ‘‘energy’’).
2The ‘‘irreversible act of amplification’’ is part of quantum
folklore, but it is not essential to physics. Amplification is
needed solely to facilitate the work of the experimenter.
3Positive operators are those having the property that
<ψ|ρ|ψ> ≥ 0 for any state ψ. These operators are always Hermitian.
On Sep 4, 2013, at 8:48 AM, JACK SARFATTI <adastra1@icloud.com> wrote:

Begin forwarded message:

From: JACK SARFATTI <jacksarfatti@icloud.com>
Subject: Quantum information and relativity theory
Date: September 4, 2013 8:33:48 AM PDT
To: nick herbert <quanta@mail.cruzio.com>

The late Asher Peres http://en.wikipedia.org/wiki/Asher_Peres interpretation is the antithesis of the late David Bohm's ontological interpretation http://en.wikipedia.org/wiki/David_Bohm holding to a purely subjective epistemological Bohrian interpretation of the quantum BIT potential Q.
He claims that Antony Valentini's signal non locality beyond orthodox quantum theory would violate the Second Law of Thermodynamics.
Quantum information and relativity theory
Asher Peres
Department of Physics, Technion–Israel Institute of Technology, 32000 Haifa, Israel
Daniel R. Terno
Perimeter Institute for Theoretical Physics, Waterloo, Ontario, Canada N2J 2W9
(Published 6 January 2004)
This article discusses the intimate relationship between quantum mechanics, information theory, and
relativity theory. Taken together these are the foundations of present-day theoretical physics, and
their interrelationship is an essential part of the theory. The acquisition of information from a
quantum system by an observer occurs at the interface of classical and quantum physics. The authors
review the essential tools needed to describe this interface, i.e., Kraus matrices and
positive-operator-valued measures. They then discuss how special relativity imposes severe
restrictions on the transfer of information between distant systems and the implications of the fact that
quantum entropy is not a Lorentz-covariant concept. This leads to a discussion of how it comes about
that Lorentz transformations of reduced density matrices for entangled systems may not be
completely positive maps. Quantum field theory is, of course, necessary for a consistent description of
interactions. Its structure implies a fundamental tradeoff between detector reliability and
localizability. Moreover, general relativity produces new and counterintuitive effects, particularly
when black holes (or, more generally, event horizons) are involved. In this more general context the
authors discuss how most of the current concepts in quantum
information theory may require a reassessment.
Contents:
I. Three Inseparable Theories
A. Relativity and information
B. Quantum mechanics and information
C. Relativity and quantum theory
D. The meaning of probability
E. The role of topology
F. The essence of quantum information
II. The Acquisition of Information
A. The ambivalent quantum observer
B. The measuring process
C. Decoherence
D. Kraus matrices and positive-operator-valued measures (POVM’s)
E. The no-communication theorem
III. The Relativistic Measuring Process
A. General properties
B. The role of relativity
C. Quantum nonlocality?
D. Classical analogies
IV. Quantum Entropy and Special Relativity
A. Reduced density matrices
B. Massive particles
C. Photons
D. Entanglement
E. Communication channels
V. The Role of Quantum Field Theory
A. General theorems
B. Particles and localization
C. Entanglement in quantum field theory
D. Accelerated detectors
VI. Beyond Special Relativity
A. Entanglement revisited
B. The thermodynamics of black holes
C. Open problems
Acknowledgments and Apologies
Appendix A: Relativistic State Transformations
Appendix B: Black-Hole Radiation
References
Quantum theory and relativity theory emerged at the
beginning of the twentieth century to give answers to
unexplained issues in physics: the blackbody spectrum,
the structure of atoms and nuclei, the electrodynamics of
moving bodies. Many years later, information theory
was developed by Claude Shannon (1948) for analyzing
the efficiency of communication methods. How do these
seemingly disparate disciplines relate to each other? In
this review, we shall show that they are inseparably related.
A. Relativity and information
Common presentations of relativity theory employ
fictitious observers who send and receive signals. These
‘‘observers’’ should not be thought of as human beings,
but rather as ordinary physical emitters and detectors.
Their role is to label and locate events in spacetime. The
speed of transmission of these signals is bounded by
c—the velocity of light—because information needs a
material carrier, and the latter must obey the laws of
physics. Information is physical (Landauer, 1991).
[My comment #1: Indeed, information is physical. Contrary to Peres, in Bohm's theory Q is also physical but not material (not a beable); consequently one can have entanglement negentropy transfer without beable (material) propagation of a classical signal. I think Peres makes a fundamental error here.]
However, the mere existence of an upper bound on
the speed of propagation of physical effects does not do
justice to the fundamentally new concepts that were introduced
by Albert Einstein (one could as well imagine
communications limited by the speed of sound, or that
of the postal service). Einstein showed that simultaneity
had no absolute meaning, and that distant events might
have different time orderings when referred to observers
in relative motion. Relativistic kinematics is all about
information transfer between observers in relative motion.
Classical information theory involves concepts such as
the rates of emission and detection of signals, and the
noise power spectrum. These variables have well defined
relativistic transformation properties, independent
of the actual physical implementation of the communication channel.

The key to this is Valentini’s “signal nonlocality” (see below) which I captured here in a particular instance

On Jul 24, 2012, at 5:39 PM, art wagner wrote:

"The aim of this paper is to define in theoretical terms and summarise the available experimental evidence that physical and mental "objects", if considered "information units", may present similar classical and quantum models of communication beyond their specific characteristics. Starting with the Remote State Preparation protocol, a variant of the teleportation protocol, for which formal models and experimental evidence are already available in quantum mechanics, we outline a formal model applied to mental information we defined Remote State Preparation of Mental Information (RSPMI), and we summarise the experimental evidence supporting the feasibility of a RSPMI protocol. The available experimental evidence offers strong support to the possibility of real communication at distance of mental information promoting the integration between disciplines that have as their object of knowledge different aspects of reality, both physical and the mental, leading to a significant paradigm shift in cognitive and information science." http://xxx.lanl.gov/abs/1201.6624

All papers by Khrennikov:  http://arxiv.org/find/quant-ph/1/au:+Khrennikov_A/0/1/0/all/0/1

Subquantum Information and Computation
Antony Valentini
(Submitted on 11 Mar 2002 (v1), last revised 12 Apr 2002 (this version, v2))
It is argued that immense physical resources - for nonlocal communication, espionage, and exponentially-fast computation - are hidden from us by quantum noise, and that this noise is not fundamental but merely a property of an equilibrium state in which the universe happens to be at the present time. It is suggested that 'non-quantum' or nonequilibrium matter might exist today in the form of relic particles from the early universe. We describe how such matter could be detected and put to practical use. Nonequilibrium matter could be used to send instantaneous signals, to violate the uncertainty principle, to distinguish non-orthogonal quantum states without disturbing them, to eavesdrop on quantum key distribution, and to outpace quantum computation (solving NP-complete problems in polynomial time).
Comments:    10 pages, Latex, no figures. To appear in 'Proceedings of the Second Winter Institute on Foundations of Quantum Theory and Quantum Optics: Quantum Information Processing', ed. R. Ghosh (Indian Academy of Science, Bangalore, 2002). Second version: shortened at editor's request; extra material on outpacing quantum computation (solving NP-complete problems in polynomial time)
Subjects:    Quantum Physics (quant-ph)
Journal reference:    Pramana - J. Phys. 59 (2002) 269-277
DOI:    10.1007/s12043-002-0117-1
Report number:    Imperial/TP/1-02/15
Cite as:    arXiv:quant-ph/0203049v2
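Valentini's claim that nonequilibrium matter could "distinguish non-orthogonal quantum states without disturbing them" is sharpest when set against the ordinary quantum limit: the Helstrom bound caps the one-shot success probability for telling two overlapping pure states apart strictly below 1. A minimal sketch of that bound (the function name is mine):

```python
import numpy as np

def helstrom_bound(psi, phi):
    """Optimal success probability for one-shot discrimination of two
    equally likely pure states in ordinary quantum mechanics."""
    overlap = abs(np.vdot(psi, phi))
    return 0.5 * (1.0 + np.sqrt(1.0 - overlap ** 2))

psi = np.array([1.0, 0.0])                # |0>
phi = np.array([1.0, 1.0]) / np.sqrt(2)   # |+>, overlap 1/sqrt(2)
p = helstrom_bound(psi, phi)              # ~0.854: no quantum device does better
```

Signal nonlocality, if it existed, would push this probability to 1, which is why Valentini ties it to breaking quantum key distribution.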

► 3:15
US Scientists Discuss Secret CIA Testing of Uri ...
Aug 13, 2008 - 3 min - Uploaded by mathienco
Dr. Hal Puthoff and laser physicist Russell Targ talking about Uri Geller's secret CIA tests at SRI. ...

► 5:58
Hal Puthoff pt.5 of 5- Remote Viewing and ...
Nov 20, 2007 - 6 min - Uploaded by newrealities
Hal Puthoff pt.5 of 5- Remote Viewing and Consciousness ... of the online series Puthoff discusses the ...

► 32:09
Secret CIA Psychic Lab Experiments - Watch Free ...
Jun 3, 2012 - 32 min
For over 20 years, scientists at Stanford Research Institute (SRI)... (RV) Remote Viewing, and the research ...

CIA-Initiated RV Program at SRI
CIA-Initiated Remote Viewing At Stanford Research Institute. by H. E. Puthoff, Ph. D. Institute for Advanced Studies at Austin 4030 Braker Lane W., #300. Austin ...
Remoteviewed.com - Hal Puthoff
Remote Viewing - About Hal Puthoff and SRI. ... The CIA Star Gate Files ... Dr. Harold E. Puthoff is Director of the Institute for Advanced Studies at Austin.
Remote viewing - Wikipedia, the free encyclopedia
In the early 1970s, Harold E. Puthoff and Russell Targ joined the Electronics and ... In 1972, Puthoff tested remote viewer Ingo Swann at SRI, and the experiment led to a visit ... The initial CIA-funded project was later renewed and expanded.
Dr. Harold E. Puthoff (Remote Viewing) From the Secret Life of ...
Dr. Harold E. Puthoff at The Arlington Institute Speaks of the beginnings of the real ...CIA-Initiated Remote Viewing Program at Stanford Research Institute ...