To try to express the speed of modern supercomputers in conventional notation is an exercise in absurdity: a petaflop is one quadrillion floating point operations per second, while the next milestone, an exaflop, is one quintillion — a million trillion — calculations per second. Got that?

The easiest way to compare them is to note that an exaflop is a thousand times as large as a petaflop. An exascale computer would be a thousand times as powerful as the fastest supercomputers on earth today, each of which cost in excess of $100 million to construct.
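The ratio above is just SI-prefix arithmetic; a minimal sketch (the constant names are illustrative, not from the article):

```python
# SI orders of magnitude for supercomputer performance.
PETAFLOP = 10**15  # one quadrillion floating point operations per second
EXAFLOP = 10**18   # one quintillion (a million trillion) operations per second

# An exaflop is a thousand times a petaflop.
ratio = EXAFLOP // PETAFLOP
print(ratio)  # prints 1000
```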

"To me the major happening over the past few years is that we were able to reach petascale using conventional technology," says Thom Dunning, head of the National Center for Supercomputing Applications. "It was an evolution of a technology path we've been on for the past five years. We can't reach exascale with the same technology path."

No one knows how we're going to get to that scale, but just about every high-performance computing scientist seems to view it as inevitable. The Defense Advanced Research Projects Agency (DARPA), the government body tasked with funding the riskiest, most far-out research projects, thinks getting there will require that we "reinvent computing."
