Artificial intelligence not only delivers impressive performance, it also creates significant demand for energy. The more demanding the tasks it is trained for, the more energy it consumes. Víctor López-Pastor and Florian Marquardt, two scientists at the Max Planck Institute for the Science of Light in Erlangen, Germany, present a method by which artificial intelligence could be trained much more efficiently. Their approach relies on physical processes instead of the digital artificial neural networks currently used.
OpenAI, the company behind the artificial intelligence (AI), has not revealed how much energy was required to train GPT-3, the model that makes ChatGPT an eloquent and apparently well-informed chatbot. According to the German statistics company Statista, the training would have required 1,000 megawatt-hours, about as much as 200 German households of three or more people consume annually. While this energy expenditure has allowed GPT-3 to learn whether the word 'deep' is more likely to be followed by 'sea' or 'learning' in its data sets, by all accounts it has not understood the underlying meaning of such phrases.
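The household comparison is easy to sanity-check. The sketch below assumes an annual consumption of roughly 5,000 kWh for a larger German household, a figure that is not given in the article:

```python
# Back-of-the-envelope check of the Statista comparison.
# Assumption (not from the article): a German household with three or
# more people uses roughly 5,000 kWh of electricity per year.

TRAINING_ENERGY_MWH = 1_000        # estimated energy to train GPT-3
HOUSEHOLD_KWH_PER_YEAR = 5_000     # assumed annual household consumption

training_energy_kwh = TRAINING_ENERGY_MWH * 1_000
households = training_energy_kwh / HOUSEHOLD_KWH_PER_YEAR
print(f"Equivalent to about {households:.0f} households for one year")  # ~200
```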
To reduce the energy consumption of computers, and of AI applications in particular, several research institutions have in recent years been investigating an entirely new concept of how computers could process data in the future. The concept is known as neuromorphic computing. Although this sounds similar to artificial neural networks, it in fact has little to do with them, because artificial neural networks run on conventional digital computers. There, the software, or more precisely the algorithm, is modelled on the brain's way of working, but digital computers serve as the hardware: they perform the calculation steps of the neural network in sequence, one after the other, while keeping processor and memory separate.
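A toy example can make this sequential style of computation concrete. The sketch below is only an illustration, not anything from the researchers' work; the layer sizes and activation function are arbitrary assumptions:

```python
# Minimal sketch of how a conventional digital computer evaluates an
# artificial neural network: each layer's matrix multiplication runs
# in sequence, with weights and activations shuttled between memory
# and the processor at every step.
import numpy as np

rng = np.random.default_rng(0)

# Toy three-layer network; the weight matrices live in memory.
layer_shapes = [(8, 16), (16, 16), (16, 4)]
weights = [rng.standard_normal(shape) for shape in layer_shapes]

def forward(x):
    for w in weights:          # one layer after the other, never in parallel
        x = np.tanh(x @ w)     # fetch weights, compute, store the result
    return x

print(forward(rng.standard_normal(8)))
```

Each step must wait for the previous one to finish, and every multiplication moves data back and forth between memory and processor, which is a large part of what makes this way of computing energy-hungry.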