According to a paper published in Nature Computational Science on Friday, the researchers developed a model that bridges the gap between big, externally complex AI networks and the small, internally complex workings of the brain.
Industry experts said the team’s findings could mark a pivotal shift in AI development, prompting further exploration of computing solutions that are not dependent on silicon chips.
Current AI trends largely revolve around building ever-bigger neural networks, an approach that is fuelling concerns about unsustainable energy demands and a lack of interpretability.
In contrast, the human brain – with its 100 billion neurons and around 100 trillion synaptic connections – consumes about 20 watts of power. At the same time, each of the brain's neurons is far more diverse and internally complex than the simple artificial neurons that make up existing AI models.
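To illustrate the distinction between external and internal complexity, the sketch below contrasts a stateless artificial neuron of the kind stacked by the millions in today's networks with a leaky integrate-and-fire (LIF) spiking neuron, one of the simplest models that carries internal dynamics. This is a generic illustration, not the researchers' actual model; the function names and parameter values are hypothetical.

```python
import numpy as np

# A standard artificial neuron: a stateless weighted sum plus a nonlinearity.
# "External complexity" comes from wiring huge numbers of these together.
def artificial_neuron(inputs, weights, bias):
    return max(0.0, float(np.dot(inputs, weights) + bias))  # ReLU activation

# A leaky integrate-and-fire neuron: a membrane potential accumulates input,
# leaks over time, and emits a spike when it crosses a threshold. Even this
# minimal spiking model has internal state that the unit above lacks;
# biological neurons are richer still.
def lif_neuron(input_current, tau=10.0, v_thresh=1.0, dt=1.0):
    v, spikes = 0.0, []
    for i in input_current:
        v += dt * (-v + i) / tau        # membrane potential leaks toward input
        if v >= v_thresh:               # threshold crossed: fire and reset
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)
    return spikes

# The same drive yields a single number from the stateless neuron but a
# temporal spike pattern from the LIF neuron.
rng = np.random.default_rng(0)
x = rng.random(50)
print(artificial_neuron(x[:3], np.array([0.5, -0.2, 0.8]), 0.1))
print(lif_neuron(x * 2.0))
```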