At the International Solid-State Circuits Conference, held on the campus of the University of Pennsylvania in Philadelphia in 1960, a young computer engineer named Douglas Engelbart introduced the electronics industry to the remarkably simple but groundbreaking concept of “scaling.”
Dr. Engelbart, who would later help develop the computer mouse and other personal computing technologies, theorized that as electronic circuits were made smaller, their components would get faster, require less power and become cheaper to produce — all at an accelerating pace.
Sitting in the audience that day was Gordon Moore, who went on to help found the Intel Corporation, the world’s largest chip maker. In 1965, Dr. Moore quantified the scaling principle and laid out a prediction that would carry the impact of a computer-age Magna Carta: the number of transistors that could be etched on a chip would double annually for at least a decade, leading to astronomical increases in computer power.
His prediction appeared in Electronics magazine in April 1965 and was later called Moore’s Law. It was never a law of physics, but rather an observation about the economics of a young industry that ended up holding true for a half-century.
In the early 1960s, a single transistor, about as wide as a cotton fiber, cost roughly $8 in today’s dollars; Intel itself was not founded until 1968. Today, billions of transistors can be squeezed onto a chip the size of a fingernail, and the cost of an individual transistor has fallen to a tiny fraction of a cent.
That improvement — the simple premise that computer chips would do more and more and cost less and less — helped Silicon Valley bring startling advances to the world, from the personal computer to the smartphone to the vast network of interconnected computers that power the Internet.
In recent years, however, the acceleration predicted by Moore’s Law has slipped. Chip speeds stopped increasing almost a decade ago, the time between new generations is stretching out, and the cost of individual transistors has plateaued.
Technologists now believe that new generations of chips will come more slowly, perhaps every two and a half to three years. And by the middle of the next decade, they fear, there could be a reckoning, when the laws of physics dictate that transistors, by then composed of just a handful of molecules, will not function reliably. Then Moore’s Law will come to an end, unless a new technological breakthrough occurs.