For several decades now, Georgia Tech professor Tom Conte has been studying how to improve computers: "How do we make them faster and more efficient next time around versus what we just made?"

And for decades, the principle guiding much of the innovation in computing has been Moore's Law — a prediction, made by Intel co-founder Gordon Moore, that the number of transistors on a microprocessor chip would double every two years or so. What it's come to represent is an expectation, as The New York Times puts it, that "engineers would always find a way to make the components on computer chips smaller, faster and cheaper."

Lately, faith in Moore's Law has been fading. "I guess I see Moore's Law dying here in the next decade or so, but that's not surprising," Moore said in a 2015 interview with a publication of the Institute of Electrical and Electronics Engineers.