A team of AI researchers at Google DeepMind has developed an AI system that demonstrates social learning capabilities. In their paper published in the journal Nature Communications, the group describes how they built an AI agent that learned new skills in a virtual world by copying the actions of an embedded "expert."
Most AI systems, such as ChatGPT, gain their knowledge through exposure to huge amounts of data, for example from repositories on the Internet. But that approach, as many in the industry have noted, is not very efficient, so researchers continue to look for other ways for AI systems to learn.
One of the most popular approaches among researchers is to mimic the process by which humans learn. Like traditional AI applications, humans learn by exposure to known elements in an environment and by following the examples of others who know what they are doing. But unlike AI applications, humans pick things up without needing huge numbers of examples. A child can learn to play jacks, for instance, after watching others play for just a few minutes, a case of what researchers call cultural transmission. In this new effort, the research team attempted to replicate this process using AI constrained to a virtual world.
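To give a very loose sense of the general idea of learning by copying an expert (this is only an illustrative sketch, not DeepMind's actual method; the corridor task, the function names and the tallying scheme are all invented for this example), the toy Python code below has a learner recover a simple policy purely by observing which action an expert takes in each state:

```python
# Toy sketch of "learning by copying an expert" -- a stand-in for the
# social-learning idea described above, NOT the method from the paper.
# Assumptions: a 6-cell corridor world, a hard-coded expert policy, and a
# learner that simply tallies the expert's observed action in each state.

import random
from collections import Counter, defaultdict

GOAL = 5  # rightmost cell of the hypothetical corridor


def expert_policy(state: int) -> str:
    """The 'expert' always moves toward the goal."""
    return "right" if state < GOAL else "stay"


def demonstrate(episodes: int = 20):
    """Collect (state, action) pairs by watching the expert act."""
    demos = []
    for _ in range(episodes):
        state = random.randint(0, GOAL)
        while state < GOAL:
            action = expert_policy(state)
            demos.append((state, action))
            state += 1  # moving "right" advances one cell
    return demos


def imitate(demos):
    """Learner copies the expert: pick the most-seen action per state."""
    counts = defaultdict(Counter)
    for state, action in demos:
        counts[state][action] += 1
    return {s: c.most_common(1)[0][0] for s, c in counts.items()}


if __name__ == "__main__":
    learned = imitate(demonstrate())
    print(learned)  # e.g. {0: 'right', 1: 'right', ..., 4: 'right'}
```

The point of the sketch is only that a few observed demonstrations can be enough to reproduce the expert's behavior, which is the contrast the article draws with systems trained on enormous datasets.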