Let me say this upfront: I'm not convinced that 'superintelligent' AI is the most pressing threat posed by coming generations of deep learning machines. Indeed, the entire notion of superintelligence may be nothing more than a philosophical 'what if' hypothesis. We simply do not know whether such a thing can in fact be built, developed, or evolved into existence, here on Earth or elsewhere in the cosmos.

Right now we don't even have a convincing quantitative theory of intelligence: one that tells us what we really mean by intelligence ('oh look, it can open a can of beans'), how intelligence actually scales with complexity, and whether or not there is a theoretical maximum.

It could be that intelligence follows an S-shaped curve of growth (a logistic function), like so many natural (and unnatural) phenomena. A logistic curve starts out with nearly exponential growth, but then flattens out, or plateaus, as things saturate. A simple example is idealized population growth, where a rapid increase in the number of organisms plays out against the availability of food or resources, ultimately leveling off.
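To make that shape concrete, here is a minimal sketch of the logistic function applied to the idealized population example. The parameters (a carrying capacity of 1000, an initial population of 10, and a growth rate of 0.5) are purely illustrative choices of mine, not values drawn from any real model of populations or of intelligence.

```python
import math

def logistic(t, carrying_capacity=1000.0, initial=10.0, growth_rate=0.5):
    """Idealized logistic growth: near-exponential at first, saturating later.

    This is the standard solution of dP/dt = r * P * (1 - P/K); the
    parameter values are hypothetical, chosen only to show the S-shape.
    """
    a = (carrying_capacity - initial) / initial
    return carrying_capacity / (1.0 + a * math.exp(-growth_rate * t))

# Early on the population roughly doubles each step; later it levels
# off near the carrying capacity, producing the characteristic S-curve.
for t in range(0, 31, 5):
    print(f"t={t:2d}  population ~ {logistic(t):8.1f}")
```

If intelligence behaved the same way, early gains could look explosive while still being headed toward a ceiling set by whatever plays the role of the finite resource.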
