Watch a movie backwards and you'll likely get confused, but a quantum computer wouldn't. That's the conclusion of Mile Gu, a researcher at the Centre for Quantum Technologies (CQT) at the National University of Singapore and at Nanyang Technological University, and his collaborators.
In research published 18 July in Physical Review X, the international team shows that a quantum computer is less in thrall to the arrow of time than a classical computer. In some cases, it's as if the quantum computer doesn't need to distinguish between cause and effect at all.
The new work is inspired by an influential discovery made almost 10 years ago by complexity scientists James Crutchfield and John Mahoney at the University of California, Davis. They showed that many statistical data sequences will have a built-in arrow of time. An observer who sees the data played from beginning to end, like the frames of a movie, can model what comes next using only a modest amount of memory about what occurred before. An observer who tries to model the system in reverse has a much harder task—potentially needing to track orders of magnitude more information.
This discovery came to be known as causal asymmetry. It seems intuitive: after all, modeling a system when time is running backwards is like trying to infer a cause from an effect. We are used to finding that more difficult than predicting an effect from a cause. In everyday life, understanding what will happen next is easier if you know what just happened, and what happened before that.
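For readers who want the quantitative version, the memory cost being compared is usually formalized as statistical complexity in the computational-mechanics literature that Crutchfield's group works in. The sketch below uses that field's standard notation, which does not appear in the article itself: pasts $\overleftarrow{x}$ are grouped into forward causal states $S^{+}$ whenever they predict statistically identical futures,

\[
\overleftarrow{x} \sim \overleftarrow{x}' \iff \Pr\!\left(\overrightarrow{X} \mid \overleftarrow{x}\right) = \Pr\!\left(\overrightarrow{X} \mid \overleftarrow{x}'\right),
\qquad C_\mu^{+} = H\!\left[S^{+}\right],
\]

and grouping futures the same way yields reverse causal states $S^{-}$ with complexity $C_\mu^{-} = H[S^{-}]$. Causal asymmetry is the statement that $C_\mu^{+}$ and $C_\mu^{-}$ need not be equal; for some processes the reverse-time model requires far more memory than the forward one, which is the gap described above.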