Novel Chinese computing architecture 'inspired by human brain' could lead to AGI, scientists say
Scientists in China have created a new computing architecture that can train advanced artificial intelligence (AI) models while consuming fewer computing resources — and they hope that it will one day lead to artificial general intelligence (AGI).
The most advanced AI models today, predominantly large language models (LLMs) such as ChatGPT and Claude 3, are built on neural networks: layers of interconnected nodes that process data in a way loosely analogous to the human brain, weighing different options to arrive at conclusions.
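To make the idea of layered processing concrete, here is a minimal sketch of a toy neural network in plain Python. The weights and inputs are made-up illustrative values, not taken from any real model; real networks learn millions or billions of such weights from data.

```python
def layer(inputs, weights, biases):
    """One dense layer: each output neuron sums its weighted inputs plus a bias."""
    return [
        sum(w * x for w, x in zip(neuron_weights, inputs)) + b
        for neuron_weights, b in zip(weights, biases)
    ]

def relu(values):
    """Simple nonlinearity: negative signals are silenced, like an inactive neuron."""
    return [max(0.0, v) for v in values]

# A toy network: 3 inputs -> 2 hidden neurons -> 1 output score.
x = [0.5, -1.0, 2.0]
hidden = relu(layer(x, weights=[[0.2, 0.4, -0.1], [-0.3, 0.1, 0.5]], biases=[0.0, 0.1]))
score = layer(hidden, weights=[[1.0, -0.5]], biases=[0.0])[0]
print(score)
```

Stacking many such layers, each feeding the next, is what "deeper, larger and wider" scaling means in practice: more layers and more neurons per layer.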
LLMs are currently limited because they can't perform beyond the confines of their training data and can't reason the way humans do. AGI, by contrast, is a hypothetical system that could reason, contextualize, edit its own code and understand or learn any intellectual task a human can.
Today, creating smarter AI systems relies on building even larger neural networks. Some scientists believe neural networks could lead to AGI if scaled up sufficiently. But this may be impractical, given that energy consumption and the demand for computing resources will also scale up with it.
Other researchers suggest that novel architectures, or a combination of different computing architectures, will be needed to achieve a future AGI system. In that vein, a new study published Aug. 16 in the journal Nature Computational Science proposes a computing architecture inspired by the human brain that its authors say could sidestep the practical problems of scaling up neural networks.
"Artificial intelligence (AI) researchers currently believe that the main approach to building more general model problems is the big AI model, where existing neural networks are becoming deeper, larger and wider. We term this the big model with external complexity approach," the scientists said in the study. "In this work we argue that there is another approach called small model with internal complexity, which can be used to find a suitable path of incorporating rich properties into neurons to construct larger and more efficient AI models."
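The contrast the researchers draw can be illustrated with a textbook example of a neuron with "internal complexity." The leaky integrate-and-fire model below keeps an internal state (a membrane potential) that accumulates input over time and fires when it crosses a threshold; this is a standard simplified spiking-neuron model chosen here for illustration, not necessarily the neuron model used in the study.

```python
class LeakyIntegrateFireNeuron:
    """A neuron with internal dynamics: it integrates input over time,
    leaks charge between steps, and emits a spike when its membrane
    potential crosses a threshold."""

    def __init__(self, leak=0.9, threshold=1.0):
        self.potential = 0.0    # internal state, absent from a stateless neuron
        self.leak = leak        # fraction of potential retained each step
        self.threshold = threshold

    def step(self, current):
        # Decay the stored potential, then add the incoming current.
        self.potential = self.potential * self.leak + current
        if self.potential >= self.threshold:
            self.potential = 0.0  # reset after firing
            return 1              # spike
        return 0                  # no spike

# Feed a constant input: the neuron stays silent while charging, then spikes.
neuron = LeakyIntegrateFireNeuron()
spikes = [neuron.step(0.4) for _ in range(6)]
print(spikes)
```

A conventional artificial neuron, by contrast, is stateless: its output depends only on the current input. The "internal complexity" argument is that giving each unit richer dynamics like these may buy capability that would otherwise require many more simple units.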