
Jul 17

From: Stuart Hameroff <hameroff@u.arizona.edu>

To: JACK SARFATTI <sarfatti@pacbell.net>

Sent: Sun, July 17, 2011 9:05:04 AM

Subject: Re: topological quantum memories - mechanism for tubulin consciousness?

Dear Jack

Thanks for bringing this up. Here's the story.

In 1998 after the Royal Society meeting on

quantum computing, John Preskill gave a talk about topological

quantum error correction on an orthogonal grid. The quantum

algorithm could run along one axis, and quantum error correction

codes on the other, repeatedly intersecting and correcting any

decoherence.

I asked whether this could work on a hexagonal lattice, thinking

of the microtubule A lattice (which has Fibonacci geometry). Preskill

replied, sure, why not? Roger Penrose remarked, how interesting it

would be if the Fibonacci geometry enabled quantum error correction.

Kitaev, Preskill and others generalized the error correction to

topological quantum computing, with pathways of bits/qubits (rather

than states of individual bits/qubits) conveying information. In a 2002

paper (Conduction Pathways In Microtubules, biological quantum computation

and consciousness, Stuart Hameroff, Alex Nip, Mitchell Porter and Jack

Tuszynski, Biosystems 2002, 64(1-3):149-168, on my website under

publications, microtubule biology) we suggested that microtubules

perform topological quantum computing using the Fibonacci pathways

of tubulins as qubits. This reduces the overall information capacity

but greatly enhances resistance to decoherence.
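The capacity-for-robustness trade-off described here has a standard quantitative illustration outside biology: Kitaev's toric code on an L x L torus stores only two logical qubits in 2L^2 physical ones, gaining error-correcting distance L in exchange. A minimal sketch of those code parameters (illustrative only, not a claim about microtubules):

```python
def toric_code_params(L):
    """[[n, k, d]] parameters of Kitaev's toric code on an L x L torus:
    n physical qubits, k logical qubits, code distance d."""
    n = 2 * L * L   # one qubit per edge of the square lattice
    k = 2           # two logical qubits, from the torus topology
    d = L           # shortest topologically nontrivial loop has length L
    return n, k, d

for L in (3, 5, 11):
    n, k, d = toric_code_params(L)
    print(f"L={L}: {n} physical qubits encode {k} logical qubits, distance {d}")
```

The information capacity per physical qubit shrinks as 1/L^2 while the distance, and hence the resistance to local errors, grows linearly with L.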

More recently, Travis Craddock, Jack Tuszynski and I have looked at the

intra-tubulin pathways which could support the mesoscopic/macroscopic

conduction and found quantum channels of non-polar electron clouds in

aromatic amino acids. This was mentioned in the Google workshop talk

you cited, and in the paper by Roger and me in the forthcoming book

Consciousness and the Universe (you have a paper in it as well).

Anirban Bandyopadhyay has evidence for warm temperature resonances

which correspond with the different pathways, and people are talking

about Fibonacci anyons in topological quantum computing. (I don't know

about the fractional charge supposedly required for anyons.)
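For reference, the "Fibonacci" in Fibonacci anyons refers to how their fusion space grows: under the fusion rule tau x tau = 1 + tau, the number of fusion-tree states of n anyons follows the Fibonacci sequence, growing like the golden ratio to the n-th power. A minimal counting sketch (illustrative only):

```python
def fusion_dims(n):
    """Number of fusion-tree states of n Fibonacci anyons (tau),
    split by total charge, using the rule tau x tau = 1 + tau."""
    ones, taus = 0, 1          # a single tau has total charge tau
    for _ in range(n - 1):     # fuse in one more tau at a time
        # 1 x tau -> tau ;  tau x tau -> 1 + tau
        ones, taus = taus, ones + taus
    return ones, taus

for n in range(2, 9):
    ones, taus = fusion_dims(n)
    print(f"{n} anyons: {ones + taus} fusion states (Fibonacci growth)")
```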

So what you have referred to as caged qubits may be caged (in the

sense they are isolated from polar environments)

quantum channels, giving mesoscopic/macroscopic quantum states the

length of microtubules.

The Berkeley quantum photosynthesis group has suggested quantum

coherence as a generalized biological feature. See

http://arxiv.org/PS_cache/arxiv/pdf/1106/1106.2911v1.pdf

The role of the pigment chromophores in the FMO complex,

which they see as quantum conveyors, may be played by non-polar amino acid

electron clouds in microtubules, as Travis Craddock has shown, all

at biological temperature and conditions.

Quantum biology, and quantum consciousness, are heating up!

cheers

Stu

Stuart Hameroff MD

Anesthesiology, Psychology and Center for Consciousness Studies

The University of Arizona, Tucson, Arizona

www.quantumconsciousness.org

Quoting JACK SARFATTI <sarfatti@pacbell.net>:

> Clarifying the Tubulin bit/qubit - Defending the Penrose-Hameroff ...

>

>

> www.youtube.com/watch?v=LXFFbxoHp3s

> 46 min - Oct 28, 2010 - Uploaded by GoogleTechTalks

> Google Workshop on Quantum Biology: Clarifying the tubulin

> bit/qubit - Defending the Penrose-Hameroff Orch OR Model of Quantum

> ...

> YouTube - Quantum Consciousness - Criticism of the Penrose ...

>

>

> www.youtube.com/watch?v=Oqz2frRRpyk

> 9 min - Jul 25, 2010 - Uploaded by PianoIsTheRemedy

> Long article with pictures on Hameroff's Tubulin Theory:

> http://rds.yahoo.com/ _ylt=A0oG7_o6PU5MfWQBDblXNyoA;_ylu ...

> [PDF] Penrose-Hameroff Quantum Tubulin Electrons, Chiao Gravity Antennas ...

> www.valdostamuseum.org/hamsmith/QMINDpaper.pdf

> analogous to tubulin caged electrons. Tegmark has criticized

> Penrose-Hameroff quantum consciousness, based on thermal decoherence

> of any such quantum ...

> Bringing Order through Disorder: Localization of Errors in

> Topological Quantum Memories

> James R. Wootton and Jiannis K. Pachos

> School of Physics and Astronomy, University of Leeds, Leeds LS2 9JT,

> United Kingdom

> (Received 1 March 2011; published 14 July 2011)

>

> Anderson localization emerges in quantum systems when randomized

> parameters cause the exponential suppression of motion. Here we

> consider this phenomenon in topological models and establish its

> usefulness for protecting topologically encoded quantum information.

> For concreteness we employ the toric code. It is known that in the

> absence of a magnetic field this can tolerate a finite initial

> density of anyonic errors, but in the presence of a field anyonic

> quantum walks are induced and the tolerable density becomes zero.

> However, if the disorder inherent in the code is taken into account,

> we demonstrate that the induced localization allows the topological

> quantum memory to regain a finite critical anyon density and the

> memory to remain stable for arbitrarily long times. We anticipate

> that disorder inherent in any physical realization of topological

> systems will help to strengthen the fault tolerance of quantum

> memories.

>

> Topological quantum memories are many-body interacting systems that

> can serve as error-correcting codes [1]. These models possess

> degenerate ground state manifolds in which quantum information may be

> encoded. The size of the model and its energy gap then protect this

> information, preventing local perturbations from splitting the

> degeneracy and hence causing errors [2–5]. However, the dynamic

> effects of perturbations when excitations are present are a serious

> problem for the stability of the memory [6], especially for nonzero

> temperature [7]. Several promising schemes have been proposed [8,9]

> that suggest ways to combat this problem with their own merits and

> drawbacks. In particular, in Ref. [9] it is shown that disorder can

> aid the stability of topological phases. Here we shed new light on

> the issue by showing that disorder in topological memories can induce

> Anderson localization [10]. This exponentially suppresses the dynamic

> effects of perturbations on excited states and allows the memory to

> remain stable.

>

> In the definition of a topological memory, we require the following

> conditions. First, the stored information is encoded within a

> degenerate ground state of a system. Second, we require that the

> memory can be left exposed to some assumed noise model for

> arbitrarily long times without active monitoring or manipulations.

> Finally, measurement of the system after errors have occurred

> extracts both the (now noisy) contents of the memory and an error

> syndrome, allowing a one-off error correction step to be performed to

> retrieve the original stored information.

>

> In topological memories, errors create anyonic excitations. Logical

> errors correspond to the propagation of these anyons around

> topologically nontrivial paths on the surface of the system. While

> the anyons are normally static, a kinetic term emerges in the

> presence of a spurious magnetic field [6]. If this can act unchecked,

> even a single pair of anyons will cause a logical error after a time

> linear with the system size L. This means that the memory is not

> resilient against any nonzero density of anyons initially present in

> the system. We demonstrate that randomness in the couplings of the

> code causes anyons to remain well localized in their initial

> positions. This enables the topological memory to successfully store

> quantum information for arbitrarily long times as long as the

> distribution of anyons is below a critical value. Hence, topological

> quantum memories can be made fault-tolerant against the dynamical

> effects of local perturbations.

>

>

>

> Since disorder will be inherent in any experimental realization of

> topological systems, e.g., with Josephson junctions [14], the effect

> described here is expected to play a significant role in their

> behavior. Localization will also protect against Hamiltonian perturbations

> in other topological models, including those of non-Abelian anyons.

> The prospect of purposefully engineering disorder into topological

> systems to benefit from further localization effects, for both coherent

> and incoherent errors, is a subject of continuing study.
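The Anderson localization invoked in the abstract is easy to see in a toy model. The sketch below (my illustration, not the paper's toric-code calculation) evolves a particle on a 1D tight-binding chain and compares a clean chain, where the wavepacket spreads ballistically, with a strongly disordered one, where it stays pinned near its starting site:

```python
import numpy as np

def spread_after(L=101, W=0.0, t_final=20.0, seed=0):
    """R.m.s. displacement at time t_final of a particle that starts
    at the center of a 1D tight-binding chain (hopping = 1) with
    random on-site energies drawn uniformly from [-W/2, W/2]."""
    rng = np.random.default_rng(seed)
    H = np.diag(rng.uniform(-W / 2, W / 2, L))                      # disorder
    H += np.diag(np.ones(L - 1), 1) + np.diag(np.ones(L - 1), -1)   # hopping
    vals, vecs = np.linalg.eigh(H)
    psi0 = np.zeros(L)
    psi0[L // 2] = 1.0                                 # start at the center
    c = vecs.T @ psi0                                  # expand in eigenbasis
    psi_t = vecs @ (np.exp(-1j * vals * t_final) * c)  # evolve in time
    x = np.arange(L) - L // 2
    return float(np.sqrt(np.sum(x**2 * np.abs(psi_t) ** 2)))

print(spread_after(W=0.0))  # clean chain: ballistic spread over tens of sites
print(spread_after(W=5.0))  # strong disorder: motion exponentially suppressed
```

The disordered wavepacket remains confined to a few sites for arbitrarily long times, which is the mechanism the paper exploits to keep anyons from walking around the torus.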


Jul 16

The Penrose-Hameroff tubulin consciousness theory requires topological quantum memories in which anyon errors are suppressed by disorder-induced Anderson localization within macro-quantum coherent degenerate ground-state manifolds.

See the pdf I just uploaded to quantum computing Library Resources

Jul 15

The recent Leggett-Zeilinger collaboration is very important.

Either they are right and Bohm's nonlocal realism is disproven - or the other way around.

Which is it?

Also, taking quantum theory to the macroscopic scale without emergence of local order parameters from spontaneous symmetry breakdown seems wrong.

I mean the reason for no Schrodinger Tigers (the classical world) must be the emergence of macro-quantum coherent local nonlinear nonunitary Landau-Ginzburg dynamics in ordinary 3D + 1 spacetime from the micro-quantum random substratum of nonlocal linear unitary Schrodinger dynamics in 3N + 1 entangled configuration space. When Leggett says he does not really believe quantum theory on the large scale he may mean something like what I have just said?

PS

Bell showed both realism (e.g. definite Bohm particle trajectories) and locality together violate statistical predictions of quantum theory.

So either one or the other or both must be wrong.

Leggett beyond Bell shows both must be wrong, i.e. nonlocal quantum surreality.
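The Bell violation referred to here can be checked numerically: for a spin singlet the quantum correlation at analyzer angles a and b is E(a,b) = -cos(a-b), and at the standard CHSH settings the combination S exceeds the local-realist bound of 2:

```python
import math

def E(a, b):
    """Quantum correlation of two spin-1/2 measurements on a singlet
    pair, with analyzers at angles a and b (radians)."""
    return -math.cos(a - b)

# standard CHSH measurement settings
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ~ 2.828, above the local-realist bound of 2
```

Any local-realist hidden-variable model is constrained to |S| <= 2, so the quantum value 2*sqrt(2) is what forces giving up locality, realism, or both.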

PPS

Of course Zeilinger's very readable philosophical paper is either in ignorance or denial of the body of experimental work

Libet --> Radin-May-Spottiswoode --> Bierman --> Bem

Puthoff-Targ RV

PEAR Princeton & Global Consciousness Project (Roger Nelson)

Antony Valentini on "signal nonlocality" from violation of the Born probability rule that Zeilinger takes as an absolute truth.

Yakir Aharonov retrocausal weak measurement (with signal locality, "passion at a distance")

Avshalom Elitzur partial measurement (retrocausal entanglement)

...

on outcome independence of the individual quantum event as well as parameter independence where the statistical pattern is independent of the distant setting.

Jul 15

"Quantum mechanics fundamentally

concerns the way in which we observers connect to the universe

we observe. The theory implies that when we measure particles

and atoms, at least one of two long-held physical principles is

untenable: Distant events do not affect one another, and properties

we wish to observe exist before our measurements. One of

these, locality or realism, must be fundamentally incorrect.

... Now Zeilinger and his collaborators ...

In Vienna experiments are testing whether quantum

mechanics permits a fundamental physical reality. ...

(Nobel Laureate Tony) Leggett doesn’t believe quantum mechanics is correct, and

there are few places for a person of such disbelief to now turn.

But Leggett decided to find out what believing in quantum

mechanics might require. He worked out what would happen

if one took the idea of nonlocality in quantum mechanics seriously,

by allowing for just about any possible outside influences

on a detector set to register polarizations of light. Any unknown

event might change what is measured. The only assumption Leggett

made was that a natural form of realism hold true; photons should have measurable polarizations

that exist before they are measured. With this he laboriously derived a

new set of hidden variables theorems and inequalities as Bell once had.

But whereas Bell’s work could not distinguish between realism and locality,

Leggett’s did. The two could be tested. ...

The experiment wouldn’t be too difficult, but understanding it would. It took

them months to reach their tentative conclusion: If quantum

mechanics described the data, then the lights’ polarizations

didn’t exist before being measured. Realism in quantum

mechanics would be untenable ...

In the past decade or so, Zeilinger and his

many collaborators were the first to teleport light, use quantum

cryptography for a bank transaction (with optical fibers in the

sewers of Vienna), realize a one-way quantum computer, and

achieve entanglement over large distances through the air, first

across the Danube River and then between two of the Canary

Islands. Zeilinger’s work had also previously shown the greatest

distinction between quantum mechanics and local realism. ...

“Quantum mechanics is very fundamental, probably

even more fundamental than we appreciate,” he said, “But to

give up on realism altogether is certainly wrong. Going back to

Einstein, to give up realism about the moon, that’s ridiculous.

But on the quantum level we do have to give up realism.” ...

With eerie precision, the results of Gröblacher’s weekend

experiments had followed the curve predicted by quantum

mechanics. The data defied the predictions of Leggett’s model

by three orders of magnitude. Though they could never observe

it, the polarizations truly did not exist before being measured.

For so fundamental a result, Zeilinger and his group needed

to test quantum mechanics again ...

Leggett’s theory was more powerful than Bell’s ...

In mid-2007 Fedrizzi found that the new realism model was

violated by 80 orders of magnitude;

the group was even more assured that

quantum mechanics was correct. ...

Last year Brukner and his student Johannes Kofler decided

to figure out why we do not perceive the quantum phenomena

around us. If quantum mechanics holds universally for atoms,

why do we not see directly its effects in bulk?

Most physicists believe that quantum effects get washed out

when there are a large number of particles around. The particles

are in constant interaction and their environment serves to “decohere”

the quantum world—eliminate superpositions—to create

the classical one we observe. Quantum mechanics has within it

its own demise, and the process is too rapid to ever see. Zeilinger’s

group, which has tested decoherence, does not believe there is a

fundamental limit on the size of an object to observe superposition.

Superpositions should exist even for objects we see, similar

to the infamous example of Schrödinger’s cat. In fact, Gröblacher

now spends his nights testing larger-scale quantum mechanics in

which a small mirror is humanely substituted for a cat.

Brukner and Kofler had a simple idea. They wanted to find

out what would happen if they assumed that a reality similar to

the one we experience is true—every large object has only one

value for each measurable property that does not change. In other

words, you know your couch is blue, and you don’t expect to be

able to alter it just by looking. This form of realism, “macrorealism,”

was first posited by Leggett in the 1980s.

Late last year Brukner and Kofler showed that it does not

matter how many particles are around, or how large an object

is, quantum mechanics always holds true. The reason we see

our world as we do is because of what we use to observe it.

The human body is a just barely adequate measuring device.

Quantum mechanics does not always wash itself out, but to

observe its effects for larger and larger objects we would need

more and more accurate measurement devices. We just do not

have the sensitivity to observe the quantum effects around us.

In essence we do create the classical world we perceive, and as

Brukner said, “There could be other classical worlds completely

different from ours.”

I am not sure if they are correct because of the emergence of

c-number order parameters that obey the nonlinear non-unitary

Landau-Ginzburg equation in ordinary space not the linear unitary

q-number Schrodinger equation in configuration space.

Joshua Roebke, "Reality Tests," Seed Magazine, May/June 2008, http://www.SEEDMAGAZINE.COM


Jul 14

Zeilinger's position is actually arch-conservative, a more sophisticated version of Copenhagen. Retrocausality is verboten.

It is at odds with:

1) Bohm's quantum potential

2) Cramer's transaction

3) Aharonov's post-selection.

4) Valentini's extension of QM to include signal nonlocality

5) Stapp's version of Bem's experiments beyond quantum theory.

6) Avshalom Elitzur's version of retrocausality

7) York Dobyns foundational analysis of retrocausality in physics.

Yet Zeilinger has done remarkable experiments, but all involve non-living matter - except for the Schrodinger Tigers, which still have not progressed to life.

So, Zeilinger could not explain the retro-causal data of Libet, Radin, Bierman, and Bem, nor the remote viewing data of Puthoff and Targ et al.


Jack Sarfatti

Russell Targ sends paper on signal nonlocality in humans http://bit.ly/qCiXBC

Teresa Ellen Will read tonight. Is it getting published?

Jack Sarfatti it is published - read the bylines - font is small it's a reprint

Gareth Lee Meredith

Doctor Sarfatti, do you believe a unified field theory must involve consciousness?

Jack Sarfatti depends what you mean by "unified field theory"

Jack Sarfatti to explain the forces only needs the local gauge principle, spontaneous breakdown of symmetry for some groups, and a few other technical points like parity violation. No need for human consciousness there.

Gareth Lee Meredith I guess I would define it as an understanding of all fundamental concepts? Is consciousness, therefore, a fundamental concept?

Jack Sarfatti look I covered this in my Journal of Cosmology paper vol 14 April 2011 online

Gareth Lee Meredith i shall look for it, but for us that may not find it, could you give us some brief details on your beliefs on this subject? Thank you sir!

Jack Sarfatti no, it's easy to find, just google, link is on front page http://stardrive.org/ this venue is not appropriate - short soundbite cliches not good enough - also I already told you above the basics - answer is no

Gareth Lee Meredith My understanding of the Copenhagen Interpretation, is that observable quantities do not exist unless there is some act of measurement on our system which would define the density. Should observation on objects be reserved for the quantum world, or should the observer have a pivotal role in the development of physical reality?

Gareth Lee Meredith ok, i will google it. Thank you.

Jack Sarfatti Again remember John Wheeler's "The question is: what is the question?" Is the question well-posed? Does it even make sense? Many questions by non-scientists don't.

Jack Sarfatti On reality see the pop papers by Anton Zeilinger I posted links to only yesterday.

Gareth Lee Meredith interesting! indeed. I will check this pop paper, by Anton Zeilinger. I have never heard of him.... Obviously not as famous as you lol :P

Jack Sarfatti Nonlocal realism is not enough says Zeilinger. We need nonlocal surrealism.

Gareth Lee Meredith Or maybe I just haven't crossed the right channels? :P

Gareth Lee Meredith What is non-local surrealism, very shortly put? I have studied physics for years, but I have never come across this terminology...

Jack Sarfatti Ignorance is bliss. Zeilinger is one of the most important physicists alive today because he does actual experiments of fundamental importance.

Jack Sarfatti Read Zeilinger. Do your homework.

Gareth Lee Meredith I will indeed, it's calling me :P

Gareth Lee Meredith Time for me to depart and learn a little, if not, a lot :)

Jack Sarfatti I invented it just now. It means quanta do not have definite properties before detection.

Jack Sarfatti This seems to conflict with Bohm's theory.

Gareth Lee Meredith Explain...??? I think I understand that statement. If there are no properties before the detector fires, then we are assuming a mesh of probabilities, rather than actualities...

Gareth Lee Meredith It's clear I still have a lot to learn. Thank you doctor, it was nice seeing your insights into the matter, and I shall chase your links :P

Jack Sarfatti Bohr said anyone who thinks they understand quantum theory doesn't. Richard Feynman said no one understands quantum theory - I mean quantum reality. This led to the "shut up and calculate" chapter in David Kaiser's book on how I and my pirate crew saved physics in the dark ages, '60s to '90s.

2 minutes ago · Like

Jack Sarfatti One issue is whether Leggett's experiment described by Zeilinger falsifies Bohm's nonlocal realism? In other words is the universe even weirder than Bohm imagined at the micro-quantum level? - nonlocal surrealism?

about a minute ago · Like

Jul
13

Tagged in:

received from Russell Targ

http://dl.dropbox.com/u/5612263/Telepathy %28TIBG%29.pdf

Many mainstream physicists simply deny the evidence as crackpot like the Vatican priests who refused to look through Galileo's telescope.

Jul
11

Tagged in:

Right. Their English is hard to read, of course. In any case, it's a new angle (pun intended) on your earlier schemes. Won't hurt to take a look.

A note on Kaiser's Ch 9.

OK, the key assumption behind no-cloning of an unknown arbitrary quantum state is linearity.

Linearity is broken for coherent states. They are eigenstates of non-Hermitian operators and obey a nonlinear, even non-unitary, Landau-Ginzburg equation that cannot be second quantized to regain the linearity needed for no-cloning.

Also, Glauber's point about spontaneous-emission noise for 1-photon Fock states is of no consequence for coherent states.

And of course they are not orthogonal.
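The non-orthogonality is quantitative: for coherent states the overlap is |⟨α|β⟩|² = exp(−|α−β|²), which is nonzero for any finite separation. A minimal sketch of that formula (plain numpy; the amplitudes are illustrative choices, not from the cited papers):

```python
import numpy as np

def coherent_overlap_sq(alpha, beta):
    """|<alpha|beta>|^2 = exp(-|alpha - beta|^2) for coherent states
    |alpha>, |beta> -- never exactly zero, only exponentially small."""
    return np.exp(-abs(alpha - beta) ** 2)

# "Distinct but not orthogonal": the overlap decays fast with separation
print(coherent_overlap_sq(0.0, 1.0))  # e^-1, clearly non-orthogonal
print(coherent_overlap_sq(0.0, 3.0))  # e^-9, nearly orthogonal but not zero
```

For large separation the states are nearly, but never exactly, orthogonal, which is exactly the "quasi-orthogonal" regime at issue below.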

And from Sanders we know coherent states can be entangled.

There is still your point about modulation - are the non-zero overlaps controllable from one end of the system? This would mean "parameter dependence" not included in Bell's inequality math.

So there are still loose ends concerning the limits of Stapp's theorem. It clearly applies in orthodox situations.

Furthermore, near-field optics (Physics Today, July 2011) violates the Heisenberg-microscope resolution limit. What does that do to QM?

In addition

This week in Physics — July 11, 2011

Viewpoint: Questioning the rules of the game

Časlav Brukner, Physics 4, 55 – Published July 11, 2011

Can quantum theory be derived from more fundamental principles?

But they assume no signaling from the future as a postulate - not helpful.

From: nick herbert <quanta@cruzio.com>

To: JACK SARFATTI <sarfatti@pacbell.net>

Sent: Mon, July 11, 2011 9:22:22 AM

Subject: Re: Nick take a look at this new Bell paper from China (FLASH, QUICK?)

Clever Chinese.

Will look at paper but don't see how this will work.

Transforms task of distinguishing CUP from PUP light

(PUP = plane-unpolarized light)

to task of distinguishing CUP from PUP* light

where PUP* is rotated plane-unpolarized light.

On Jul 10, 2011, at 12:31 PM, JACK SARFATTI wrote:

http://arxiv.org/pdf/0906.0279v5

a new way to attempt a variation on QUICK & FLASH?

This sounds familiar.

"In order to discriminate between the two hypotheses, we must seek a material that can exhibit different effects when circularly and linearly polarized photons pass through it respectively. Note that the usual method of inserting a quarter-wave plate cannot be used here since the photons in one optical path may have two rotation directions. So we make use of roto-optic effect (or Faraday effect) to distinguish between circularly and linearly polarized photons. This is because a linearly polarized photon can be regarded as the combination of left-handed and right-handed circularly polarized components. When it passes through a roto-material, the velocities of the two components are different according to Fresnel’s roto-optic theory. Then there exists a phase shift between the two components. The polarization plane of the photon will rotate and the quantum state will change. As a circularly polarized photon passes through the roto-material, its polarization quantum state will not change since it has only one rotation direction."

Jul
10

Tagged in:

http://arxiv.org/pdf/0906.0279v5

a new way to attempt a variation on QUICK & FLASH?

This sounds familiar.

"In order to discriminate between the two hypotheses, we must seek a material that can exhibit different effects when circularly and linearly polarized photons pass through it respectively. Note that the usual method of inserting a quarter-wave plate cannot be used here since the photons in one optical path may have two rotation directions. So we make use of roto-optic effect (or Faraday effect) to distinguish between circularly and linearly polarized photons. This is because a linearly polarized photon can be regarded as the combination of left-handed and right-handed circularly polarized components. When it passes through a roto-material, the velocities of the two components are different according to Fresnel’s roto-optic theory. Then there exists a phase shift between the two components. The polarization plane of the photon will rotate and the quantum state will change. As a circularly polarized photon passes through the roto-material, its polarization quantum state will not change since it has only one rotation direction."

No, I am trying to have it both ways, not Sanders. There is no problem having it both ways; partial measurements are an obvious idea. Wootters wrote a famous paper on it for the double slit, in the big Wheeler Quantum Measurement book I think (Princeton). You can partially know which slit and still see a fuzzy fringe pattern. Measurements do not have to be perfect to be useful technologically.
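The tradeoff Wootters analyzed is usually stated as the wave-particle duality relation V² + D² ≤ 1: which-path distinguishability D only caps the fringe visibility V, it does not kill it. A minimal sketch (plain numpy; the D values are illustrative):

```python
import numpy as np

def max_visibility(D):
    """Duality relation V^2 + D^2 <= 1: given which-path
    distinguishability D, fringe visibility is at most sqrt(1 - D^2)."""
    return np.sqrt(1.0 - D ** 2)

# Partial which-slit knowledge still leaves a (fuzzy) fringe pattern
for D in (0.0, 0.6, 0.9):
    print(f"D = {D}: max fringe visibility V = {max_visibility(D):.3f}")
```

Even at D = 0.9 the visibility bound is nonzero — exactly the "fuzzy fringe pattern" from imperfect which-slit information.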

In Sanders' setup, maybe we can use a Kerr cell at the a' output to modulate the entanglement signal at b'. But perhaps not: there may be a Catch-22 in that the amount of Kerr phase shift in the a' beam needed to produce the entanglement signal makes the overlap at b' too small to detect it. That would save Stapp's theorem.
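The suspected Catch-22 can be made quantitative with the standard coherent-state overlap: a Kerr phase shift θ takes |α⟩ to |αe^{iθ}⟩, and |⟨α|αe^{iθ}⟩|² = exp(−2|α|²(1 − cos θ)), so the overlap falls off exponentially in both the beam intensity and the modulation depth. A minimal sketch of that tension (plain numpy; the numbers are illustrative, not from Sanders' paper):

```python
import numpy as np

def kerr_overlap_sq(alpha_mag, theta):
    """|<alpha| alpha e^{i theta}>|^2 = exp(-2 |alpha|^2 (1 - cos theta)):
    overlap between a coherent state and its Kerr-phase-shifted copy."""
    return np.exp(-2.0 * alpha_mag ** 2 * (1.0 - np.cos(theta)))

# Bigger modulation (or a brighter beam) -> exponentially smaller overlap
for theta in (0.0, 0.1, 1.0):
    print(f"theta = {theta}: overlap = {kerr_overlap_sq(2.0, theta):.4f}")
```

The modulation needed to imprint a signal is exactly what destroys the non-zero overlap one would need to read it out at the other end.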

Now we have that paper from China that may reawaken Nick's old ideas in QUICK and FLASH?

From: Paul Zielinski <iksnileiz@gmail.com>

To: JACK SARFATTI <sarfatti@pacbell.net>

Sent: Sun, July 10, 2011 12:00:39 PM

Subject: Re: Stapp's theorem for entangled coherent states

Either way I would say this is an interesting result.

It does seem that Sanders is trying to have it both ways with his quasi-orthogonal coherent states (13) with high alpha -- "distinct", but not quite orthogonal.

See paper just now uploaded in Library Quantum Computing.