Technology
Novel Chinese computing architecture 'inspired by human brain' could lead to AGI, scientists say
Scientists in China have created a new computing architecture that can train advanced artificial intelligence (AI) models while consuming fewer computing resources — and they hope that it will one day lead to artificial general intelligence (AGI).
The most advanced AI models today — predominantly large language models (LLMs) like ChatGPT or Claude 3 — use neural networks. These are collections of machine learning algorithms layered to process data in a way loosely modeled on the human brain, weighing up different options to arrive at conclusions.
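The layered processing described above can be illustrated with a minimal sketch: data flows through successive layers of weighted sums and nonlinearities, and the network "weighs up" options by scoring them. All the sizes and weights below are arbitrary placeholders for illustration, not the architecture from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, w, b):
    """One layer: a weighted sum of inputs passed through a nonlinearity."""
    return np.tanh(x @ w + b)

# A tiny three-layer network: 4 inputs -> 8 hidden -> 8 hidden -> 2 outputs.
w1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
w2, b2 = rng.normal(size=(8, 8)), np.zeros(8)
w3, b3 = rng.normal(size=(8, 2)), np.zeros(2)

def network(x):
    # Each layer's output becomes the next layer's input.
    return layer(layer(layer(x, w1, b1), w2, b2), w3, b3)

scores = network(rng.normal(size=(1, 4)))  # raw scores for two options
choice = int(np.argmax(scores))            # pick the higher-scoring option
```

Scaling such a network up means adding more layers and wider weight matrices, which is exactly the resource cost the article goes on to discuss.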
LLMs are currently limited because they can't perform beyond the confines of their training data and can't reason as well as humans can. AGI, by contrast, is a hypothetical system that could reason, contextualize, edit its own code, and understand or learn any intellectual task that a human can.
Today, creating smarter AI systems relies on building even larger neural networks. Some scientists believe neural networks could lead to AGI if scaled up sufficiently. But this may be impractical, given that energy consumption and the demand for computing resources will also scale up with it.
Other researchers suggest that novel architectures, or a combination of different computing architectures, are needed to achieve a future AGI system. In that vein, a new study published Aug. 16 in the journal Nature Computational Science proposes a novel computing architecture, inspired by the human brain, that its authors say could sidestep the practical issues of scaling up neural networks.
Related: 22 jobs artificial general intelligence (AGI) may replace — and 10 jobs it could create
"Artificial intelligence (AI) researchers currently believe that the main approach to building more general model problems is the big AI model, where existing neural networks are becoming deeper, larger and wider. We term this the big model with external complexity approach," the scientists said in the study. "In this work we argue that there is another approach called small model with internal complexity, which can be used to find a suitable path of incorporating rich properties into neurons to construct larger and more efficient AI models."
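One way to read the "internal complexity" idea is that each artificial neuron is given richer, brain-like dynamics rather than the network simply being made wider or deeper. A classic example of a neuron with internal dynamics is the leaky integrate-and-fire model, sketched below. This is an illustrative stand-in, not the neuron model used in the study, and all parameter values are hypothetical.

```python
def lif_neuron(inputs, tau=10.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """Leaky integrate-and-fire neuron: the membrane voltage leaks toward
    rest, integrates incoming current, and emits a spike (1) whenever it
    crosses threshold, after which it resets."""
    v, spikes = 0.0, []
    for current in inputs:
        v += dt * (-v / tau + current)  # leak term plus input current
        if v >= v_thresh:
            spikes.append(1)
            v = v_reset                 # reset after spiking
        else:
            spikes.append(0)
    return spikes

# A steady input current gradually charges the neuron until it spikes.
spikes = lif_neuron([0.2] * 30)
```

Unlike the stateless weighted sum in a standard neural-network unit, this neuron carries a voltage between time steps, so a single unit already encodes temporal behavior — the kind of per-neuron richness the "small model with internal complexity" approach points to.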