Generally, computers slow down as they age. Their processors struggle to handle newer software. Apple even deliberately slows its iPhones as their batteries degrade. But Google researchers have published details of a project that could let a laptop or smartphone learn to do things better and faster over time.

The researchers tackled a long-standing challenge in computing known as prefetching. Processors work through data much faster than that data can be pulled from memory. To avoid stalling, computers try to predict which information is likely to be needed next and fetch it in advance. As computers get more powerful, making those predictions well becomes progressively harder.
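To see what a prefetcher does in practice, here is a toy sketch of a classic stride prefetcher in Python. The function name, addresses, and the simple "same gap twice in a row" heuristic are illustrative only; they are not taken from Google's design.

```python
# Minimal sketch of a classic stride prefetcher (illustrative only).
# It watches the gap between successive memory addresses and, when the gap is
# stable, guesses that the next access will continue the pattern.

def stride_prefetch(access_trace, lookahead=1):
    """Yield (current_address, predicted_address) pairs for a hypothetical trace."""
    prev_addr, prev_stride = None, None
    for addr in access_trace:
        if prev_addr is not None:
            stride = addr - prev_addr
            if stride == prev_stride and stride != 0:
                # The pattern held twice in a row: predict it continues.
                yield addr, addr + stride * lookahead
            prev_stride = stride
        prev_addr = addr

# Example: a loop walking an array one 64-byte cache line at a time.
trace = [0x1000 + 64 * i for i in range(8)]
for current, predicted in stride_prefetch(trace):
    print(f"saw {hex(current)}, prefetching {hex(predicted)}")
```

Real programs, of course, rarely access memory in such neat strides, which is exactly where a learned predictor could help.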

In a paper posted online this week, the Google team describes using deep learning—an AI method that employs a large simulated neural network—to improve prefetching. Although the researchers haven’t shown how much this speeds things up, the boost could be big, given what deep learning has brought to other tasks.
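One plausible way to frame prefetching as a deep-learning problem, and this is a sketch of the general idea rather than the paper's actual model, is to treat the sequence of gaps between memory addresses as tokens and train a recurrent network to predict the next one, much like next-word prediction in language modeling. The model sizes, vocabulary, and usage below are invented for illustration and assume PyTorch.

```python
import torch
import torch.nn as nn

class DeltaLSTM(nn.Module):
    """Toy recurrent model that predicts the next address delta in a trace."""
    def __init__(self, vocab_size=50_000, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)   # one id per frequent delta
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)      # scores for the next delta

    def forward(self, delta_ids):
        x = self.embed(delta_ids)          # (batch, seq_len, embed_dim)
        out, _ = self.lstm(x)              # (batch, seq_len, hidden_dim)
        return self.head(out[:, -1, :])    # predict the delta that follows the sequence

# Hypothetical usage: a batch of delta-id sequences drawn from a memory trace.
model = DeltaLSTM()
batch = torch.randint(0, 50_000, (32, 16))   # 32 sequences of 16 past deltas each
logits = model(batch)
predicted_delta_id = logits.argmax(dim=-1)   # candidate offsets to prefetch next
print(predicted_delta_id.shape)              # torch.Size([32])
```

The appeal of this kind of setup is that the predictor is learned from the program's own behavior rather than hard-coded, so it can pick up irregular access patterns that fixed heuristics miss.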

“The work that we did is only the tip of the iceberg,” says Heiner Litz of the University of California, Santa Cruz, a visiting researcher on the project. Litz believes it should be possible to apply machine learning to every part of a computer, from the low-level operating system to the software that users interact with.


Such advances would be opportune. Moore’s Law is finally slowing down, and the fundamental design of computer chips hasn’t changed much in recent years. Tim Kraska, an associate professor at MIT who is also exploring how machine learning can make computers work better, says the approach could be useful for high-level algorithms, too. A database might automatically learn how to handle financial data as opposed to social-network data, for instance. Or an application could teach itself to respond to a particular user’s habits more effectively.
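As a toy illustration of that last idea, one not drawn from the article or from Kraska's own systems, consider replacing a fixed lookup algorithm with a tiny model fitted to the data it actually sees. The same code would learn a different model for, say, timestamps than for user ids; all names and numbers below are hypothetical.

```python
import bisect

def fit_position_model(sorted_keys):
    """Fit position ~ slope * key + intercept by least squares on the data itself."""
    n = len(sorted_keys)
    mean_k = sum(sorted_keys) / n
    mean_p = (n - 1) / 2
    cov = sum((k - mean_k) * (i - mean_p) for i, k in enumerate(sorted_keys))
    var = sum((k - mean_k) ** 2 for k in sorted_keys)
    slope = cov / var if var else 0.0
    return slope, mean_p - slope * mean_k

def learned_lookup(sorted_keys, key, model, window=32):
    """Jump to the model's guess, then correct with a small local search.

    Real systems bound the model's worst-case error; a fixed window stands in for
    that here, so this toy version assumes the fit is reasonably tight.
    """
    slope, intercept = model
    guess = int(slope * key + intercept)
    lo = max(0, guess - window)
    hi = min(len(sorted_keys), guess + window)
    i = bisect.bisect_left(sorted_keys, key, lo, hi)
    return i if i < len(sorted_keys) and sorted_keys[i] == key else None

keys = sorted(range(0, 1_000_000, 7))       # stand-in for one particular workload
model = fit_position_model(keys)
print(learned_lookup(keys, 70_007, model))   # index of the key, or None
```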
