Could computers built like brains be a ‘competition killer’?

image source, Getty Images

image caption, Demand for electricity from data centers is growing rapidly

  • Author, Zoe Corbyn
  • Role, Technology reporter
  • Reporting from San Francisco

Modern computers’ appetite for electricity is growing at an alarming rate.

According to a recent report by the International Energy Agency (IEA), the combined electricity consumption of data centers, artificial intelligence (AI) and cryptocurrencies could double from its 2022 level by 2026.

It is estimated that by 2026 these three sectors could consume roughly as much electricity as Japan does in a year.

Companies like Nvidia—whose computer chips power most AI applications today—are working to develop more energy-efficient hardware.

But could an alternative route be to build computers with a fundamentally different type of architecture that is more energy efficient?

Some companies certainly think so, drawing on the structure and function of an organ that uses a fraction of the power of a regular computer to perform more operations faster: the brain.

In neuromorphic computers, electronic devices mimic neurons and synapses and are interconnected in a way that resembles the electrical network of the brain.

This is nothing new – scientists have been working on this technique since the 1980s.

However, the energy demands of the AI revolution are increasing the pressure to bring this nascent technology into the real world.

Current systems and platforms exist primarily as research tools, but proponents say they could deliver huge gains in energy efficiency.

Those with commercial ambitions include hardware giants such as Intel and IBM.

There are also a handful of small businesses on the scene. “There’s an opportunity waiting for a company that can figure it out,” says Dan Hutcheson, an analyst at TechInsights. “[And] the opportunity is that it could be Nvidia’s killer”.

image source, SpiNNcloud Systems

image caption, SpiNNcloud says its neuromorphic computer will be more energy efficient for AI

In May, SpiNNcloud Systems, a spinout of the Dresden University of Technology, announced that it would begin selling neuromorphic supercomputers for the first time and is taking pre-orders.

“We achieved the commercialization of neuromorphic supercomputers before other companies,” says Hector Gonzalez, its co-CEO.

This is a significant advance, says Tony Kenyon, professor of nanoelectronic and nanophotonic materials at University College London, who works in this area.

“While there is still no killer application … there are many areas where neuromorphic computing will provide significant gains in energy efficiency and performance, and I am confident that we will begin to see widespread adoption of this technology as it matures,” he says.

Neuromorphic computing covers a range of approaches – from simply a more brain-inspired approach to a near-complete simulation of the human brain (which we’re actually nowhere near).

However, there are some basic design features that set it apart from conventional computers.

First, unlike conventional computers, neuromorphic computers do not have separate memory and processing units. Instead, these tasks are performed together on one chip in one place.

Eliminating the need to transfer data between the two reduces power consumption and speeds up processing times, notes Prof Kenyon.

Neuromorphic computers also commonly take an event-driven approach to computing.

Unlike conventional computing, where every part of the system is always on and available to communicate with any other part, activation in neuromorphic computing can be more sparse.

Artificial neurons and synapses activate only when they have something to communicate, just as many neurons and synapses in our brains spring into action only when there is a reason to.

Doing work only when there is something to process also saves energy.
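The event-driven idea can be sketched in a few lines of Python with a toy leaky integrate-and-fire neuron – a standard textbook model in this field, not any vendor's actual chip or API. The neuron's state is updated only when an input spike arrives; between events it does no work at all, with the passive decay applied lazily at the next event.

```python
import math

# Toy event-driven leaky integrate-and-fire (LIF) neuron.
# Illustrative sketch only: real neuromorphic chips implement
# this kind of dynamics directly in hardware.

class LIFNeuron:
    def __init__(self, threshold=1.0, tau=20.0):
        self.threshold = threshold  # firing threshold
        self.tau = tau              # membrane time constant (ms)
        self.potential = 0.0        # membrane potential
        self.last_event = 0.0       # time of the last input event (ms)

    def receive_spike(self, t, weight):
        """Update state only when an input spike arrives at time t (ms)."""
        # Apply the decay accumulated since the last event in one step --
        # no computation happened while the neuron was idle.
        self.potential *= math.exp(-(t - self.last_event) / self.tau)
        self.last_event = t
        self.potential += weight
        if self.potential >= self.threshold:
            self.potential = 0.0  # reset after firing
            return True           # emit an output spike downstream
        return False

neuron = LIFNeuron()
# Three input spikes; the neuron consumes no compute between them.
for t, w in [(1.0, 0.6), (2.0, 0.6), (50.0, 0.6)]:
    fired = neuron.receive_spike(t, w)
    print(f"t={t:5.1f} ms  potential={neuron.potential:.3f}  fired={fired}")
```

The two closely spaced spikes push the potential over threshold and trigger an output spike, while the late, isolated spike arrives after the potential has decayed away and does nothing – the sparse, pay-per-event behavior the text describes.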

And while modern computers are digital – using 1s and 0s to represent data – neuromorphic computing can be analog.

This method of computation relies on continuous signals and can be useful where data coming from the outside world needs to be analyzed.

However, for ease of implementation, most commercially oriented neuromorphic efforts are digital.


The commercial applications considered fall into two main categories.

One of SpiNNcloud’s focuses is providing a more energy-efficient and higher-performance platform for AI applications – including image and video analysis, speech recognition, and the large language models that power chatbots like ChatGPT.

Another is in “edge computing” applications – where data is processed in real time on connected devices rather than in the cloud, under tight power constraints. Autonomous vehicles, robots, mobile phones and wearable technology could all benefit.

However, technical problems persist. The development of the software needed to make the chips work has long been considered a major stumbling block to progress in neuromorphic computing.

Having the hardware is one thing, but it still has to be programmed, and that may require developing a completely different programming style from the one used for conventional computers.

“The potential of these devices is huge … the problem is how to make them work,” sums up Mr Hutcheson, who predicts that it will take at least a decade, if not two, before the benefits of neuromorphic computers are really felt.

There are also price issues. Whether they use silicon, as commercially oriented efforts do, or other materials, creating radically new chips is expensive, notes Prof Kenyon.

image caption, Intel is making ‘rapid progress’ with its neuromorphic computer, says Mike Davies (right)

Intel’s current prototype neuromorphic chip is called Loihi 2.

In April, the company announced that it had combined 1,152 of them to create Hala Point, a massive neuromorphic research system containing more than 1.15 billion artificial neurons and 128 billion artificial synapses.

With a neuron capacity roughly equivalent to an owl’s brain, Intel claims it’s the world’s largest system to date.

At the moment, it is still a research project for Intel.

“[But Hala Point] shows that there is real viability for AI applications,” says Mike Davies, director of Intel’s Neuromorphic Computing Lab.

About the size of a microwave, Hala Point is “commercially relevant” and there is “rapid progress” on the software side, he says.

IBM calls its latest brain-inspired prototype chip NorthPole.

It was introduced last year and is an evolution of the earlier TrueNorth prototype chip. Tests show it is more energy-efficient, space-efficient and faster than any chip currently on the market, says Dharmendra Modha, the company’s chief scientist for brain-inspired computing. He adds that his group is now working to demonstrate that the chips can be combined into a larger system.

“The path to market will be in the future,” he says. One of NorthPole’s big innovations, notes Dr Modha, is that it was co-designed with software, so all the architecture’s capabilities can be used from the start.

Other smaller neuromorphic companies include BrainChip, SynSense and Innatera.

image caption, IBM claims its NorthPole chip is more energy efficient and faster than other chips

The SpiNNcloud supercomputer commercializes neuromorphic computing developed by researchers from TU Dresden and the University of Manchester under the auspices of the EU’s Human Brain Project.

The effort has so far produced two neuromorphic supercomputers for research purposes. The first is the University of Manchester-based SpiNNaker1 machine, which emulates more than one billion neurons and has been operational since 2018.

The second-generation SpiNNaker2 machine at TU Dresden, currently being configured, has the capacity to emulate at least five billion neurons. The commercial systems offered by SpiNNcloud go further still, to at least 10 billion neurons, Mr Gonzalez says.

The future will be one of different types of computing platforms – conventional, neuromorphic and quantum, another emerging type of computing – all working together, says Prof Kenyon.
