Moore’s law is already pretty fast. It holds that computer chips pack in twice as many transistors every two years or so, producing major jumps in speed and efficiency. But the computing demands of the deep learning era are growing even faster than that — at a pace that is likely not sustainable. The International Energy Agency predicts that artificial intelligence will consume 10 times as much power in 2026 as it did in 2023, and that data centers in that year will use as much energy as Japan. “The amount of [computing power] that AI needs doubles every three months,” said Nick Harris, founder and CEO of the computing-hardware company Lightmatter — far faster than Moore’s law predicts. “It’s going to break companies and economies.”
One of the most promising ways forward involves processing information not with trusty electrons, which have dominated computing for over 50 years, but with the flow of photons, minuscule packets of light. Recent results suggest that, for certain computational tasks fundamental to modern artificial intelligence, light-based “optical computers” may offer an advantage.
The development of optical computing is “paving the way for breakthroughs in fields that demand high-speed and high-efficiency processing, such as artificial intelligence,” said the University of Cambridge physicist Natalia Berloff.