Since the computer age began, microchips have steadily shrunk. Moore’s Law, articulated in 1965 by Intel co-founder Gordon Moore, predicts, fairly accurately to date, that the number of transistors we can fit on a microchip will double every 18 to 24 months, steadily increasing computer speed and efficiency. Many computer scientists and engineers, however, believe we will soon reach a point where traditional silicon chip circuitry will be too microscopic to work reliably.
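To get a feel for what that doubling rate implies, here is a minimal sketch in Python of how transistor counts compound under Moore’s Law. The baseline of one billion transistors in 2010 and the 24-month doubling period are hypothetical round numbers chosen only to illustrate the exponential trend, not figures from the article.

```python
# Rough illustration of Moore's Law: transistor counts doubling on a
# fixed period. Baseline values below are hypothetical, for illustration only.

def projected_transistors(start_count: int, start_year: int, year: int,
                          doubling_period_years: float = 2.0) -> float:
    """Project a transistor count forward assuming steady doubling."""
    periods = (year - start_year) / doubling_period_years
    return start_count * (2 ** periods)

if __name__ == "__main__":
    # Hypothetical baseline: 1 billion transistors in 2010.
    for year in (2010, 2014, 2018, 2022):
        print(year, f"{projected_transistors(1_000_000_000, 2010, year):,.0f}")
```

Run forward a dozen years, the count grows by a factor of 64, which is why even a small slip in the doubling period matters so much to chipmakers.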

So what’s going to happen? No one is sure yet, but chipmakers are already making moves to safeguard the future of hardware development. This week, IBM announced plans to allocate $3 billion over five years to chip research. While the company's overall R&D expenditures will remain the same, there is a new focus not only on miniaturizing circuitry to 7 nanometers, but also on replacing silicon chips with alternative technologies.

Georgia Tech computer scientist Tom Conte tells Popular Science that 7-nanometer transistors are “basically the size of large atoms. There are a lot of unknown quantum effects” that can’t be controlled, so chipmakers can’t guarantee reliable function.

Of course they are. IBM has always been considerably ahead of the curve in basic research, and it doesn't take a genius to see that silicon-based photolithography is reaching its practical physical limits.