A self-driving car powered by one of the more popular artificial intelligence techniques may need to crash into a tree 50,000 times in virtual simulations before learning that it’s a bad idea. But baby wild goats scrambling around on incredibly steep mountainsides do not have the luxury of living and dying millions of times before learning to climb surefootedly without falling to their deaths. And a psychologist’s 3-year-old daughter did not need to practice millions of times before she figured out, on a whim, how to climb through an opening in the back of a chair.
Today’s most powerful AI techniques learn almost everything about the world from scratch, with the help of enormous computational resources. By comparison, humans and animals seem to intuitively understand certain concepts—objects, places, and sets of related things—that allow them to quickly learn how the world works. That raises an important “nature vs. nurture” question: Will AI need built-in versions of the innate cognitive machinery that humans and animals possess in order to achieve a similar level of general intelligence?
Two leading researchers in AI and psychology debated that question head-to-head last night at an event hosted by New York University’s Center for Mind, Brain and Consciousness.
“None of the AI techniques we have can build representations of the world, whether through structure or through learning, that are anywhere near what we observe in animals and humans,” said Yann LeCun, a computer scientist at NYU and director of Facebook Artificial Intelligence Research.