The field of quantum computing originated with a question posed by Richard Feynman: is it feasible to simulate the behavior of quantum systems on a classical computer? Feynman argued that it is not, and that a quantum computer would be needed instead [1]. Saleh Rahimi-Keshari of the University of Queensland, Australia, and colleagues [2] have now shown that a quantum process believed to require exponentially many steps to simulate on a classical computer can in fact be simulated efficiently, provided the system in which the process occurs has sufficiently large loss and noise.
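To get an intuition for why loss helps a classical simulator, consider the standard model of linear optical loss, in which each photon independently survives with probability eta (binomial thinning). The toy sketch below, which is not the authors' phase-space method and uses illustrative names (`lossy_fock_samples`, `eta`, `n_photons`), shows how heavy loss collapses the output photon-number distribution toward near-vacuum statistics that are straightforward to sample classically:

```python
import numpy as np

def lossy_fock_samples(n_photons, eta, shots, rng):
    """Monte Carlo samples of the photon number after a loss channel.

    Each of the n_photons input photons survives independently with
    probability eta (binomial thinning), a standard model of linear loss.
    """
    return rng.binomial(n_photons, eta, size=shots)

rng = np.random.default_rng(seed=0)
for eta in (0.9, 0.5, 0.1):
    # Estimate the output photon-number distribution for a 5-photon input.
    counts = np.bincount(lossy_fock_samples(5, eta, 100_000, rng), minlength=6)
    print(f"eta={eta}: output distribution {np.round(counts / counts.sum(), 3)}")
```

As eta shrinks, the distribution concentrates on zero and one photon, illustrating (in a loose, single-mode way) the regime in which loss and noise open the door to efficient classical simulation.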