"However, physics would be fundamentally different. If we break the uncertainty principle, there is really no telling what our world would look like."

Magick without magic.

I announce the conjecture:

Nature is as weird as it can be.
A nonlocal action principle of maximal weirdness, manifesting, e.g., as consciousness.

Post-quantum theory has maximum weirdness (aka signal nonlocality) beyond the minimal weirdness of orthodox quantum theory.

On Nov 18, 2010, at 6:55 PM, JACK SARFATTI wrote:

v3 - expanded to include Part 3
"It's a surprising and perhaps ironic twist," said Oppenheim, a Royal Society University Research Fellow from the Department of Applied Mathematics & Theoretical Physics at the University of Cambridge. Einstein and his co-workers discovered non-locality while searching for a way to undermine the uncertainty principle. "Now the uncertainty principle appears to be biting back."

Non-locality determines how well two distant parties can coordinate their actions without sending each other information. Physicists believe that even in quantum mechanics, information cannot travel faster than light. Nevertheless, it turns out that quantum mechanics allows two parties to coordinate much better than would be possible under the laws of classical physics. In fact, their actions can be coordinated in a way that almost seems as if they had been able to talk. Einstein famously referred to this phenomenon as "spooky action at a distance".

"Quantum theory is pretty weird, but it isn't as weird as it could be. We really have to ask ourselves, why is quantum mechanics this limited? Why doesn't nature allow even stronger non-locality?" Oppenheim says.

However, quantum non-locality could be even spookier than it actually is. It's possible to have theories which allow distant parties to coordinate their actions much better than nature allows, while still not allowing information to travel faster than light. Nature could be weirder, and yet it isn't: quantum theory appears to impose an additional limit on the weirdness.

The surprising result by Wehner and Oppenheim is that the uncertainty principle provides an answer. Two parties can only coordinate their actions better if they break the uncertainty principle, which imposes a strict bound on how strong non-locality can be.
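
To make "how strong non-locality can be" concrete, here is a standard textbook illustration (my own sketch, not part of the press release): in the CHSH game the correlator S obeys |S| <= 2 for any classical (local hidden variable) theory, |S| <= 2*sqrt(2) in quantum mechanics (Tsirelson's bound, the "minimal weirdness"), while a hypothetical no-signalling "PR box" would reach |S| = 4. The Wehner-Oppenheim result ties the quantum ceiling to the uncertainty principle.

    import numpy as np

    # CHSH correlator S = E(a,b) + E(a,b') + E(a',b) - E(a',b')
    # Classical (local hidden variables): |S| <= 2
    # Quantum (Tsirelson's bound):        |S| <= 2*sqrt(2)
    # No-signalling PR box:               |S| = 4
    def E(x, y):
        # Singlet-state correlation for spin measurements along angles x, y
        return -np.cos(x - y)

    a, ap = 0.0, np.pi / 2          # Alice's two measurement settings
    b, bp = np.pi / 4, -np.pi / 4   # Bob's two settings (optimal choice)

    S = E(a, b) + E(a, bp) + E(ap, b) - E(ap, bp)
    print(abs(S), 2 * np.sqrt(2))   # both print 2.828..., still short of 4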

"It would be great if we could better coordinate our actions over long distances, as it would enable us to solve many information processing tasks very efficiently," Wehner says. "However, physics would be fundamentally different. If we break the uncertainty principle, there is really no telling what our world would look like."


http://www.eurekalert.org/pub_releases/2010-11/cfqt-rus111210.php


But it appears we can beat the usual Heisenberg uncertainty limit, which assumes no resolution better than the mean wavelength of the probe photon's wave packet, by using a super-oscillating weak-measurement Heisenberg microscope enhanced with a negative-index-of-refraction metamaterial superlens.


Some references on metamaterial superlenses:

"New Superlens is Made of Metamaterials - Ingenious lens ten times as powerful as conventional ones," Softpedia, Apr 25, 2008. news.softpedia.com/.../New-Superlens-is-Made-of-Metamaterials-84359.shtml

"More on Metamaterials and Superlens over 5 times better than ...," Next Big Future, Jun 28, 2006; includes a PowerPoint tutorial by G. Shvets (University of Texas at Austin) on metamaterials and superlenses for laser plasmas. nextbigfuture.com/.../more-on-metamaterials-and-superlens.html

"Metamaterials for magnifying superlenses," IOM3, Apr 30, 2007; advances in magnifying superlenses reported by two separate US research teams. www.iom3.org/news/mega-magnifiers

Xiang Zhang (Chancellor's Professor and Director, NSF Nano-scale Science and Engineering Center), "Photonic Meta Materials, Nano-scale Plasmonics and Super Lens" (talk abstract, PDF). boss.solutions4theweb.com/Zhang_talk_abs_with_pictures__1.pdf

"3D Metamaterials Nanolens: The best superlens realized so far!" Nano-Optics, Metamaterials, Nanolithography and Academia blog, 2010; paper published in Applied Physics Letters. nanooptics.blogspot.com/2010/.../3d-metamaterials-nanolens-best.html

I. I. Smolyaninov, Y.-J. Hung, and C. C. Davis, "Magnifying Superlens based on Plasmonic Metamaterials," 2008. ieeexplore.ieee.org/iel5/4422221/4422222/04422540.pdf?arnumber...

"Superlens from complementary anisotropic metamaterials," J. Appl. Phys. 102, 116101. link.aip.org/link/JAPIAU/v102/i11/p116101/s1

M. Ambati et al., "Surface resonant states and superlensing in acoustic metamaterials," Phys. Rev. B, May 31, 2007. xlab.me.berkeley.edu/publications/pdfs/57.PRB2007_Murali.pdf

W. T. Lu and S. Sridhar, "Superlens imaging theory for anisotropic nanostructured metamaterials with broadband all-angle negative refraction," Phys. Rev. B 77, 233101 (2008); arXiv:0710.4933 (2007). link.aps.org/doi/10.1103/PhysRevB.77.233101


On Nov 18, 2010, at 4:54 PM, JACK SARFATTI wrote:


The probabilistic nature of quantum events comes from integrating out all the future advanced Wheeler-Feynman retro-causal measurements. This is why past data and unitary retarded past-to-present dynamical evolution of David Bohm's quantum potential is not sufficient for unique prediction as in classical physics. Fred Hoyle knew this a long time ago. Fred Alan Wolf and I learned it from Hoyle's papers back in the late 60's at San Diego State and also from I. J. Good's book that popularized Hoyle's idea. So did Hoyle get it from Aharonov 1964 or directly from Wheeler-Feynman 1940 --> 47?

Note, however, that in Bohm's theory, knowing the pre-selected initial condition on the test-particle trajectory does seem to obviate the need for an independent retro-causal post-selection in the limit of sub-quantal thermodynamic equilibrium, with consequent signal locality, i.e. no remote viewing is possible in this limit for dead matter. However, there may be a hidden retro-causal tacit assumption in Bohm's 1952 logic. Remember, Feynman's action principle is nonlocal in time. One must also ultimately include back-reaction fluctuations of Bohm's quantum potential Q. The test-particle approximation breaks down when the particle hidden variables are no longer in sub-quantal equilibrium because some external pumping excites them, like the excited atoms in a laser lasing above threshold, or as in H. Frohlich's toy model of a biological membrane of electric dipoles.
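
For reference, a minimal statement of the Bohm 1952 machinery this paragraph leans on (standard textbook form, my notation): writing \psi = R e^{iS/\hbar}, the quantum potential is

Q = -\frac{\hbar^2}{2m} \frac{\nabla^2 R}{R}

and it enters the quantum Hamilton-Jacobi equation

\frac{\partial S}{\partial t} + \frac{(\nabla S)^2}{2m} + V + Q = 0

with the particle guided by p = \nabla S. The "back-reaction fluctuations of Q" above are corrections beyond this test-particle picture.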

Yakir Aharonov et al. say that by 1964 "the puzzle of indeterminism ... was safely marginalized" to the Gulag. ;-)

John Bell's locality inequality for quantum entanglement of 1964 changed all that. I had already gotten into a heated argument with Stanley Deser and Sylvan Schweber on this very issue back in 1961 at Brandeis University. I had independently seen the problem Bell saw a few years later, from reading David Inglis's Tau-Theta Puzzle paper in Rev. Mod. Phys. As a mere grad student I was shouted down by Deser and told to "learn how to calculate" - one of the reasons I quit my National Defense Fellowship and went to work for Tech/Ops at Mitre in Lexington, Mass., on Route 2, an Intelligence Community contractor, under Emil Wolf's student George Parrent Jr.

George B. Parrent Jr., "Imaging of Extended Polychromatic Sources and Generalized Transfer Functions," J. Opt. Soc. Am. 51, 143-151 (1961). www.opticsinfobase.org/abstract.cfm?uri=josa-51-2-143
In 1964 Aharonov and two colleagues (Peter Bergmann and Joel Lebowitz) announced that the result of a measurement at t not only influences the future, but also influences the past. Of course, Wheeler-Feynman knew that 25 years earlier. Did they precog Aharonov? ;-)

OK: we pre-select at t0, we measure at t, and we post-select at t1, with

t0 < t < t1

We then have a split into sub-ensembles that correspond to the procedures of scattering measurements described by the unitary S-Matrix.

The statistics of the present measurement at t are different for different jointly pre-(t0) and post-(t1) selected sub-ensembles, and different again from the total pre-selected ensemble obtained by summing over all the joint pre-post sub-ensembles.
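
A minimal statement of these sub-ensemble statistics, the Aharonov-Bergmann-Lebowitz (ABL) rule (suppressing the unitary evolution between the three times for brevity): with pre-selection |\psi\rangle at t0 and post-selection |\varphi\rangle at t1, the probability of outcome a_j of an intermediate measurement of A at t is

P(a_j \mid \psi, \varphi) = \frac{|\langle \varphi | P_j | \psi \rangle|^2}{\sum_k |\langle \varphi | P_k | \psi \rangle|^2}

where P_j projects onto the a_j eigenspace. The denominator is exactly what makes the statistics depend jointly on both selections.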

Note we still have unitary S-Matrix signal locality here. It's not possible to decode a retrocausal message from t1 at t for example.
No spooky uncanny paranormal Jungian synchronicities, no Destiny Matrix is possible in this particular model.

Weak Measurements
We can partially beat Heisenberg's microscope with metamaterial tricks of negative refractive index; we can also beat it if we trade off precision for disturbance. Even less precise weak simultaneous measurements of non-commuting observables can still be sufficiently precise when the error ~ N^1/2 << N for N qubits all in the same single-qubit state. "So at the cost of precision, one can limit disturbance." Indeed, one can get a weak measurement far outside the orthodox quantum measurement eigenvalues:

S(45 degrees) ~ N/2^1/2

i.e. 2^1/2 x the largest orthodox eigenvalue N/2, for N un-entangled qubits pre-selected with spin-z = +1/2 and post-selected with spin-x = +1/2, with error ~ N^1/2.
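
A quick numerical check of this for a single qubit (my own sketch, assuming the standard weak-value formula A_w = <post|A|pre>/<post|pre>): pre-select |z+>, post-select |x+>; the spin component at 45 degrees between x and z then has weak value 1/2^1/2, i.e. 2^1/2 times its largest eigenvalue 1/2, and N such qubits give a total of N/2^1/2.

    import numpy as np

    # Pauli matrices; spin operators are sigma/2 in units hbar = 1
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sz = np.array([[1, 0], [0, -1]], dtype=complex)

    S45 = (sx + sz) / (2 * np.sqrt(2))   # spin at 45 degrees between x and z

    pre = np.array([1, 0], dtype=complex)                 # |z+>
    post = np.array([1, 1], dtype=complex) / np.sqrt(2)   # |x+>

    A_w = (post.conj() @ S45 @ pre) / (post.conj() @ pre)
    print(A_w.real)                  # 0.7071... = 1/sqrt(2)
    print(np.linalg.eigvalsh(S45))   # eigenvalues +-0.5: the weak value lies outside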

"It's all a game of errors."

"Sometimes the device's pointer ... can point, in error, to a range far outside the range of possible eigenvalues."

Larger errors than N^1/2 must occur, but with exponentially decreasing probability.

On the ultra-rare occasions when the post-selection finds Sx = N/2, the intermediate weak reading is N/2^1/2 +- N^1/2, which for large N exceeds the maximum eigenvalue N/2 (since N/2^1/2 ≈ 0.707 N).

This is not a random error.

Superoscillation

The present measurement at t entangles the pointer device with the measured qubit. The future post-selection at t1 destroys that entanglement, leaving the pointer device in a superposition of its legitimate orthodox quantum eigenstates. Superoscillating coherence among the device's eigenstates boosts the reading, as a non-random error, outside the orthodox eigenvalue spectrum to N/2^1/2 > N/2.
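
A minimal sketch of the standard von Neumann bookkeeping behind this (my notation, not taken from the paper): couple the observable A weakly to the pointer momentum \hat{p}, then post-select on |\varphi\rangle; to first order in the coupling g, the pointer wavepacket \Phi(q) is translated by the weak value rather than by an eigenvalue,

\langle \varphi | e^{-i g \hat{A} \otimes \hat{p}} | \psi \rangle \, \Phi(q) \approx \langle \varphi | \psi \rangle \, \Phi(q - g A_w), \qquad A_w = \frac{\langle \varphi | \hat{A} | \psi \rangle}{\langle \varphi | \psi \rangle}

and since A_w need not lie in the spectrum of A, the pointer can come to rest outside the orthodox eigenvalue range.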

Indeed, this beats the limits of Heisenberg's microscope http://www.aip.org/history/heisenberg/p08b.htm

By superposing waves with different wavelengths, one can construct features with details smaller than the smallest wavelength in the superposition. Example:

f(x) = [(1 + a)exp(i2pix/N)/2 + (1 - a)exp(-i2pix/N)/2]^N

where a > 1 is a real number.

Expand the binomial and take the limit x --> 0:

f(x) ~ exp(i2piax)

with an effective resolution of 1/a, far below the smallest wavelength N in the superposition, and arbitrarily fine for large a.
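
A quick numerical check of that limit (my own sketch of the formula above): near x = 0 the phase of f advances like 2 pi a x, an effective wavelength 1/a, even though every component wave has wavelength N.

    import numpy as np

    a, N = 4.0, 20   # a > 1 real, N factors in the product

    def f(x):
        # f(x) = [(1+a)/2 e^{i 2 pi x/N} + (1-a)/2 e^{-i 2 pi x/N}]^N
        return ((1 + a) / 2 * np.exp(2j * np.pi * x / N)
                + (1 - a) / 2 * np.exp(-2j * np.pi * x / N)) ** N

    for x in [1e-3, 1e-2, 1e-1]:
        print(x, np.angle(f(x)), 2 * np.pi * a * x)  # phase tracks 2*pi*a*x near x = 0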

So much for Heisenberg's uncertainty principle in a weak measurement?

to be continued in Part 2

Bear in mind that the ultimate post-selection for every measurement in our observable universe is at our total absorber future event horizon.