Henry Stapp on Daryl Bem's evidence for signal nonlocality

Jun 22
On Jun 21, 2011, at 2:42 PM, Ruth Elinor Kastner wrote:

Just a note to say that I agree with much of what Prof. Stapp says here regarding the need to question a 'block world'-type view, and note that my 'possibilist' development of Cramer's Transactional Interpretation ("PTI") makes use of a sub-empirical domain that seems to me very similar to his 'process time'. My book for CUP on PTI is underway and will present these ideas in detail. Due to time constraints I did not quite get to this in my AAAS talk.

best

Ruth

________________________________________

From: Henry P. Stapp [hpstapp@lbl.gov]

Sent: Tuesday, June 21, 2011 5:18 PM

To: JACK SARFATTI

Subject: Re: Retrodiction yes, retrocausation no

From Henry Stapp:

Because my comments at the recent AAAS meeting were mentioned, I would like to give a fuller account of what I said, or at least tried to say, and my opinion on the significance of the Bem results---assumed for present purposes to be veridical and reproducible.

In the first place, the Aharonov papers do not change standard (orthodox) QM by "one iota", in the sense that the predictions of results of observations on the post-selected subsets are precisely the predictions obtained by application of the standard rules. That is why there is no doubt or question about what the predictions are. And there should be no surprise that the predictions are confirmed by the observations: that matching is merely confirmation of the standard rules of QM, applied to the specified empirical conditions.

Jack: Exactly. The practical advantage is psychological, leading to the weak and partial measurement methods that are good for extracting weak signals from strong noise: techniques that might have been discovered by orthodox thinkers, but allegedly were not.

The second key point is that the results reported by Bem are inconsistent with the standard rules of orthodox QM. That is because the Bem results essentially allow the transmission of a "signal" (a message created by the sender) backwards in time.

Yes, this is signal nonlocality! This is what we were looking for as described in David Kaiser's book How The Hippies Saved Physics. The New York Times Book Review (June 19, 2011) is hostile to the book and grossly inaccurate about the physics. Peter Woit's review in American Scientist is much more balanced and accurate. - Jack

The idea of making changes that act "backward in time" needs to be clarified, since it is, in a sense, incompatible with the ordinary common-sense idea of time. Yet the empirical results arising from the Wheeler delayed-choice experiments seem, in some sense, to be saying that choices made now are affecting what happened in the past.

This backward-in-time aspect of QM is compactly captured by the assertion made in the recent book "The Grand Design" by Hawking and Mlodinow: "We create history by our observations, history does not create us" (p. 140).

How can one make rationally coherent good sense out of this peculiar feature of QM?

The way I do this is to introduce the concept of "process time", which is a "time" that is different from the "Einstein time" that is joined to physically described space to give Einstein space-time. (See my chapter in "Physics and the Ultimate Significance of Time", SUNY Press, 1986, ed. David Ray Griffin. Three physicists, D. Bohm, I. Prigogine, and I, presented our ideas about time.)

Jack: Entropy is technically imaginary action; e.g., Hawking's rotation to imaginary time relates quantum field theory to statistical mechanics with a change of space dimension. So would a complex time do what Henry wants? Of course the signature changes: no light cones in imaginary time, so no causal constraints either?
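For readers unfamiliar with the rotation Jack alludes to, the standard Wick rotation can be written schematically (a textbook identity, added here as an aside, not part of the original email):

```latex
% Wick rotation: real time t is continued to imaginary time \tau.
t = -i\tau, \qquad
e^{-iHt/\hbar} \;\longrightarrow\; e^{-H\tau/\hbar}
% With imaginary time made periodic with period \beta\hbar, the
% evolution operator turns into the statistical-mechanical weight:
Z = \operatorname{Tr}\, e^{-\beta H}, \qquad \beta = \frac{1}{k_B T}
```

In the Euclidean (imaginary-time) signature the light cone degenerates, which is the point of Jack's remark about the loss of causal constraints.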

In relativistic quantum field theory (Tomonaga/Schwinger) the quantum state collapses not on an advancing sequence of constant-time surfaces t(i), with t(i+1) > t(i), but rather on an advancing sequence of spacelike surfaces sigma(i): for each i, every point on sigma(i) is spacelike displaced from every other point on sigma(i), and every point on sigma(i+1) either coincides with a point on sigma(i) or lies in the open future light cone of some point on sigma(i).
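As an illustration (a minimal numerical sketch, not from the email), the defining condition on a surface sigma(i), that every pair of its points be spacelike separated, can be checked directly with the Minkowski interval:

```python
import numpy as np

# Minkowski interval with signature (+,-,-,-) and c = 1:
# s^2 = dt^2 - |dx|^2; two events are spacelike separated when s^2 < 0.
def interval2(e1, e2):
    d = np.asarray(e2, float) - np.asarray(e1, float)
    return d[0] ** 2 - np.dot(d[1:], d[1:])

def is_spacelike_surface(events):
    """True if every pair of events on the surface is spacelike separated."""
    return all(interval2(events[i], events[j]) < 0
               for i in range(len(events)) for j in range(i + 1, len(events)))

# A constant-time surface t = 0 is trivially spacelike...
sigma = [(0.0, 0.0, 0.0, 0.0), (0.0, 1.0, 0.0, 0.0), (0.0, 0.0, 2.0, 0.0)]
print(is_spacelike_surface(sigma))        # True

# ...but a "bumpy" surface also qualifies, as long as no point can send
# a light signal to another (|dt| < |dx| for every pair of points).
sigma_bumpy = [(0.0, 0.0, 0.0, 0.0), (0.5, 1.0, 0.0, 0.0), (0.2, 0.0, 2.0, 0.0)]
print(is_spacelike_surface(sigma_bumpy))  # True
```

Events are given as tuples (t, x, y, z); the same check rejects any surface containing a timelike-related pair.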

At each surface sigma(i) a projection operator P(i), or its complement P'(i) = (I - P(i)), acts to reduce the quantum state to some part of its former self.

For each sigma(i) there is a "block universe" defined by extending the quantum state on sigma(i) both forward and backward in time via the unitary time-evolution operator generated by the Schroedinger equation.

Let the index i that labels the sigma(i) be called "process time". Then for each instant i of process time, a new history is defined by the backward-in-time evolution from the newly created state on sigma(i). All predictions about the future are "as if" the future state is the smooth continuation from the newly created past.

This newly created past is the "effective past", in the sense that the future predictions are given by taking this newly created past to be the past.
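A one-qubit toy model (an illustrative sketch of the construction, not Stapp's own code or notation) shows the two ingredients: a projector P reducing the state, with Prob(Yes) + Prob(No) = 1, and backward unitary evolution defining the revised "effective past":

```python
import numpy as np

# One-qubit sketch of "process time": evolve, collapse with a projector,
# then extend the collapsed state backward to get the revised history.
sx = np.array([[0, 1], [1, 0]], complex)   # Pauli x, used as a toy Hamiltonian
I2 = np.eye(2, dtype=complex)

def U(t):
    # exp(-i * sx * t) in closed form for a Pauli matrix (hbar = 1)
    return np.cos(t) * I2 - 1j * np.sin(t) * sx

psi0 = np.array([1, 0], complex)           # state on sigma(0)
psi1 = U(1.0) @ psi0                       # unitary evolution to sigma(1)

P = np.diag([1, 0]).astype(complex)        # projector for the answer "yes"
p_yes = np.linalg.norm(P @ psi1) ** 2
p_no = np.linalg.norm((I2 - P) @ psi1) ** 2
print(p_yes + p_no)                        # Prob(Yes) + Prob(No) = 1 (up to rounding)

collapsed = (P @ psi1) / np.sqrt(p_yes)    # reduced state on sigma(1)

# The "effective past": run the collapsed state backward with U(t)^dagger.
effective_past = U(1.0).conj().T @ collapsed
# It generally differs from psi0: the history is redefined, but nothing is
# transmitted backward -- future predictions simply use this revised past.
print(abs(np.vdot(psi0, effective_past)) ** 2)
```

The overlap printed at the end is less than 1 in general, which is the sense in which the past has been "rewritten" by the collapse.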

In orthodox QM each instant of process time corresponds to an "observation": the collapse at process time i reduces the former quantum state to the part of itself compatible with the increased knowledge generated by the new observation.

The actual physical universe is generated by the always-forward-moving creative process---in the sense that the sequence of surfaces sigma(i) advances into the future, even though there is an associated set of revised "histories".

This is all just standard quantum mechanics, elaborated for clarity.

The element of "randomness" enters Copenhagen/von Neumann QM in the following way:

Each "observation" must be preceded first by a choice on the part of the observer of what aspect of nature he or she wants to probe. The outcome "yes" of the probing action must be a recognizable empirical experience. The "probability" that nature will return the answer "yes" is specified by quantum theory. The probability of "no" is the complement: Prob(Yes) + Prob(No) = 1.

It is a fundamental theorem of QM that IF the standard orthodox statistical rules governing nature's choice of reply are strictly adhered to, then no "message" (information transfer containing info controlled by the sender) can be sent faster than light or backward in time: I cannot receive "now" information controlled by the "freely chosen" actions of a sender acting in my future, in any frame. "Messages" can be sent only into the forward light cone: "normal statistics entails normal causation!"
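This no-signalling theorem can be checked numerically in a few lines (a sketch under the standard Born rule, added for illustration): for a Bell pair, Bob's outcome statistics do not depend on which basis Alice measures.

```python
import numpy as np

# No-signalling under the Born rule, checked for the Bell state
# (|00> + |11>)/sqrt(2): Bob's marginal is independent of Alice's basis.
bell = np.array([1, 0, 0, 1], complex) / np.sqrt(2)

def alice_basis(theta):
    # Projectors for Alice measuring along angle theta in the x-z plane
    plus = np.array([np.cos(theta), np.sin(theta)], complex)
    minus = np.array([-np.sin(theta), np.cos(theta)], complex)
    return [np.outer(v, v.conj()) for v in (plus, minus)]

def bob_marginal(theta):
    # Joint Born probabilities P(a, b), summed over Alice's outcome a
    Pb = [np.diag([1, 0]).astype(complex), np.diag([0, 1]).astype(complex)]
    marg = np.zeros(2)
    for Pa in alice_basis(theta):
        for b in range(2):
            M = np.kron(Pa, Pb[b])
            marg[b] += np.vdot(bell, M @ bell).real
    return marg

print(bob_marginal(0.0))   # [0.5 0.5]
print(bob_marginal(1.1))   # [0.5 0.5] -- Alice's choice leaves Bob's stats fixed
```

Whatever angle Alice picks, the two Born probabilities on Bob's side sum over her outcomes to the same 50/50 marginal.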

Jack: According to this rule, John Cramer's recent modification of the Dopfer experiment, throwing out the coincidence circuit, must fail.

Given the powerful evidence for the validity of the orthodox rules, not only from the thousands of standard pertinent experiments performed during the past century, but also from the strange predictions for post-selection and partial measurements described at this AAAS conference, it is reasonable to try to retain the orthodox rules, insofar as is possible, and not to start from scratch, as was recommended by some speakers.

The simplest way to get the Bem results within the general framework of orthodox QM is to allow a slight biasing of the orthodox statistical rules: a biasing of nature's choices that works to enhance the maintenance of life in biological systems.
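A toy model makes clear why any such biasing is radical (purely illustrative; the distortion p -> p**2 is borrowed from Valentini-style "nonequilibrium" reasoning, not from Stapp): once nature's choices deviate from the Born statistics, a remote basis choice shows up in local statistics, i.e. a signal.

```python
import numpy as np

# Toy "biased statistics": distort the joint Born distribution nonlinearly
# (p_biased ~ p**2) and Bob's local stats start depending on Alice's basis.
alpha, beta = 0.8, 0.6                    # state alpha|00> + beta|11>

def joint_born(theta):
    # Alice measures at angle theta, Bob in the x basis; amplitudes by hand
    c, s = np.cos(theta), np.sin(theta)
    return np.array([
        (alpha * c + beta * s) ** 2 / 2,  # (a=+, b=+x)
        (alpha * c - beta * s) ** 2 / 2,  # (a=+, b=-x)
        (beta * c - alpha * s) ** 2 / 2,  # (a=-, b=+x)
        (beta * c + alpha * s) ** 2 / 2,  # (a=-, b=-x)
    ])

def bob_marginal(theta, biased=False):
    p = joint_born(theta)
    if biased:
        p = p ** 2 / np.sum(p ** 2)       # nature "prefers" likelier outcomes
    return p[0] + p[2], p[1] + p[3]       # (P(b=+x), P(b=-x))

# Born rule: Bob sees 50/50 whatever Alice does -- no signalling.
print(bob_marginal(0.0), bob_marginal(np.pi / 8))
# Biased rule: Bob's statistics shift with Alice's remote setting -- a signal.
print(bob_marginal(0.0, biased=True), bob_marginal(np.pi / 8, biased=True))
```

Under the Born rule both printed marginals are 50/50 for every theta; under the biased rule the theta = pi/8 marginal visibly departs from 50/50, which is exactly the "abnormal statistics entails abnormal causation" loophole Stapp invokes.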

We know already from EPR-type phenomena that nature's choices "here" must have some sort of pertinent access to faraway information: nature's choice at some moment i in process time may have access to the corresponding block universe, which represents weighted potentialities for the future, as they exist at that particular moment i in process time.

Of course, this notion that nature's choices are biased in favor of biology is absolutely opposed to the concepts of classical physics, where everything---except the initial state of the universe---is controlled by purely local mechanical process. This idea that nature creates the initial state, but thereafter maintains a strictly hands-off stance, is carried over to QM by the orthodox statistical rule, which is completely determined by purely physical considerations.

But the Bem experiment, by indicating a possible bio-enhancing bias, may be an empirical indication that the classical-physics notion of a hands-off, basically passive nature may be incorrect!

Of course, this conclusion tends to move science in a direction somewhat concordant with some religions, but that fact is not a proper scientific reason to reject it: the motivation here is strictly to save the principles of orthodox QM in the face of data that, if bona fide, seem to entail a need to revise in some way the strictly orthodox theory. And the simplest revision is to back away from the notion that nature, after creating the physical universe, simply lets it run on automatic. That idea is certainly very agreeable to mathematical physicists, but it is hardly the only way that nature could behave. We may need to tailor the existing theory to accommodate the data, not confine the acceptable data to those that fit the existing (subject to modification) theory!

That, in brief, is the message that I tried to convey at the AAAS meeting!

Jack: Indeed you did, and if you look at B.J. Carr's 2008 review paper you will see that I have been arguing very much the same position, based on the prior reports of Libet, Radin, Bierman, and Puthoff & Targ. Now Bem's data seem to put the last nail in the coffin of the applicability of Orthodox Quantum Mechanics to living matter.

"Can Psychical Research Bridge the Gulf Between Matter and Mind?" by Bernard Carr, Proceedings of the Society for Psychical Research, Vol. 59, Part 221, June 2008, describes my ideas on signal nonlocality violating quantum theory in living matter, as well as Brian Josephson's.

see also

Subquantum Information and Computation

Antony Valentini

(Submitted on 11 Mar 2002 (v1), last revised 12 Apr 2002 (this version, v2))

It is argued that immense physical resources - for nonlocal communication, espionage, and exponentially-fast computation - are hidden from us by quantum noise, and that this noise is not fundamental but merely a property of an equilibrium state in which the universe happens to be at the present time. It is suggested that 'non-quantum' or nonequilibrium matter might exist today in the form of relic particles from the early universe. We describe how such matter could be detected and put to practical use. Nonequilibrium matter could be used to send instantaneous signals, to violate the uncertainty principle, to distinguish non-orthogonal quantum states without disturbing them, to eavesdrop on quantum key distribution, and to outpace quantum computation (solving NP-complete problems in polynomial time).

http://arxiv.org/abs/quant-ph/0203049

---------------------------------------------------------------

On Sat, 18 Jun 2011, JACK SARFATTI wrote:

On Jun 18, 2011, at 5:00 PM, Michael Nauenberg wrote:

I couldn't attend your San Diego meeting, but I had sent a message to Daniel, summarized below, which I had hoped he would share with you.

Michael

Subject: Retrodiction yes, retrocausation no.

Dear Daniel,

I won't be able to attend your meeting, but I had a brief e-mail exchange with Jeff. The best way to summarize my disagreement with the article of Aharonov et al. is that standard quantum mechanics is consistent with "retrodiction", but not with "retrocausation". For a consistent post-selection of states in the future, prior statistical predictions of quantum mechanics are properly given by "conditional" probability relations.

But the Aharonov et al. claim in Physics Today that imposing such future boundary conditions, i.e. retrocausation, does not change quantum mechanics by "one iota" is not valid.

I hope these comments are helpful.

Regards, Michael

I do agree that it's retrocausal signal nonlocality that is the real revolution in physics, as given by Daryl Bem's paper driving the last nail into the coffin of signal locality. That means a violation of Orthodox Quantum Theory (OQT) in living systems. Henry Stapp made this clear in his talk; I have been saying for a long time what Henry now says. On the other hand, both weak measurement and Elitzur's partial measurements lead to useful technology that almost certainly would not have been thought of in the OQT way of thinking.

So, if you like:

retrodiction = weak retrocausality

The real beef, however, is:

signal nonlocality ---> decodable messages received before they are sent, but only if in fact nothing prevents them from being sent in a globally consistent loop.

"A beautiful theory is murdered by an ugly fact." Richard Feynman

OQT = beautiful theory

Bem's experiments = ugly fact

(also of course Libet, Schmidt, Radin, Bierman ...)