Quantum theory is one of the great achievements of 20th-century science, yet physicists have struggled to find a clear boundary between our everyday world and what Albert Einstein called the "spooky" features of the quantum world: cats that could be simultaneously alive and dead, and pairs of photons whose measured properties are correlated across any distance, as if they were communicating instantaneously.

For the past 60 years, the best guide to that boundary has been Bell's Inequality, a limit, derived from John Bell's theorem, on how strongly measurements on any classical system can be correlated. A new paper now shows that Bell's Inequality is not the guidepost it was believed to be. Just as quantum computing brings quantum strangeness closer to our daily lives, we understand the frontiers of that world less well than scientists had thought.
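In its most commonly tested form, due to Clauser, Horne, Shimony, and Holt (CHSH), the inequality states that for two pairs of analyzer settings a, a′ and b, b′, any locally realistic description must satisfy |S| ≤ 2, where S = E(a, b) − E(a, b′) + E(a′, b) + E(a′, b′) and E is the measured correlation between outcomes. Quantum mechanics permits |S| as large as 2√2, about 2.83.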

In the new paper, published in the July 20 issue of Optica, University of Rochester researchers show that a classical beam of light, which would be expected to obey Bell's Inequality, can nevertheless violate it in the lab if the beam is properly prepared to have a particular feature: entanglement, in this case between two degrees of freedom of a single beam, such as its polarization and its spatial profile, rather than between two separate particles.
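As a rough numerical illustration (a sketch, not the authors' analysis), the short Python example below evaluates the CHSH combination for correlations of the singlet-like form E(a, b) = −cos 2(a − b), the same functional form a suitably prepared nonseparable beam can exhibit; the analyzer angles are the standard choices that maximize the violation.

import math

# Correlation between analyzer settings a and b (in radians); this
# singlet-like form arises for entangled photon pairs, and an analogous
# correlation function describes a nonseparable (classically entangled) beam.
def E(a, b):
    return -math.cos(2 * (a - b))

# Standard CHSH angle settings: 0, 45, 22.5, and 67.5 degrees.
a, a_prime = 0.0, math.pi / 4
b, b_prime = math.pi / 8, 3 * math.pi / 8

# Any locally realistic (classical) model must give |S| <= 2.
S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)
print(f"S = {S:.3f}, |S| = {abs(S):.3f}")

Running this prints |S| ≈ 2.828, comfortably above the classical bound of 2 and equal to the 2√2 maximum that quantum mechanics allows.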
