Intel has built the world’s largest neuromorphic computer, a machine designed to mimic the workings of the human brain. The company hopes it will be able to run more sophisticated artificial intelligence models than are possible on conventional computers, but experts say there are engineering hurdles to overcome before the device can compete with, let alone surpass, the current state of the art.
Many hopes are pinned on neuromorphic computers because they differ inherently from conventional machines. Whereas an ordinary computer uses its processor to carry out operations and stores data in separate memory, a neuromorphic computer uses artificial neurons that both store data and compute with it, just as our brains do. This removes the need to shuttle data back and forth between components, which is a bottleneck for current computers.
This architecture can bring far greater energy efficiency: Intel claims that its new neuromorphic computer, Hala Point, uses 100 times less energy than conventional machines when solving optimization problems, which involve finding the best solution to a problem subject to certain constraints. It could also open up new ways to train and run AI models that use chains of spiking neurons, processing information more as real brains do, instead of mechanically passing inputs through every layer of artificial neurons, as current models do.
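To make that difference concrete, here is a minimal Python sketch of a leaky integrate-and-fire neuron, a standard simplified model of the kind of spiking unit that neuromorphic chips implement. This is an illustrative toy with made-up parameters, not Loihi code: each neuron holds its own state (a membrane voltage) locally, and downstream work happens only when a spike occurs, not on every input.

```python
# Toy model of spiking, event-driven computation.
# Not Loihi code: the neuron model and parameters are simplified assumptions.

class LIFNeuron:
    """Leaky integrate-and-fire neuron: state lives with the unit."""
    def __init__(self, leak=0.9, threshold=1.0):
        self.v = 0.0              # membrane voltage, stored locally
        self.leak = leak          # fraction of voltage retained each step
        self.threshold = threshold

    def step(self, input_current):
        self.v = self.leak * self.v + input_current
        if self.v >= self.threshold:
            self.v = 0.0          # reset after firing
            return 1              # spike event
        return 0                  # silent: nothing propagates downstream

# A chain of neurons: activity flows only where spikes occur, unlike a
# conventional network layer that recomputes every unit for every input.
chain = [LIFNeuron() for _ in range(3)]
weights = [0.6, 0.8, 1.2]         # synaptic weights between stages

for t in range(10):
    signal = 0.5                  # constant drive into the first neuron
    for neuron, w in zip(chain, weights):
        spike = neuron.step(signal)
        signal = spike * w        # downstream work happens only on a spike
    print(t, [round(n.v, 2) for n in chain])
```

Silent neurons do no work in this scheme, which is where much of the claimed energy saving comes from.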
Hala Point contains 1.15 billion artificial neurons across 1152 Loihi 2 chips and is capable of performing 380 trillion synaptic operations per second. Despite this power, says Mike Davies at Intel, the system occupies just six slots in a standard server rack, a space about the size of a microwave oven. Building even larger machines will be possible, he says. “We built this system at this scale because, frankly, a billion neurons was a nice round number,” he says. “I mean, there was no specific technical engineering challenge that made us stop at this level.”
No other existing device can match Hala Point’s size, although the DeepSouth machine, expected to be completed this year, will be able to perform 228 trillion synaptic operations per second.
The Loihi 2 chips are still early prototypes manufactured in small numbers by Intel, but Davies says the real barrier lies in the layers of software needed to translate real-world problems into a format that can run on a neuromorphic computer. That process, like neuromorphic computing in general, is still in its infancy. “Software has been a very real limiting factor,” says Davies, meaning there would be little point in building an even bigger machine just yet.
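One small piece of that software problem is simply getting ordinary data into spiking form. A common approach, used here purely as an assumed example rather than anything specific to Hala Point’s toolchain, is rate coding, where a value is represented by how often a neuron fires:

```python
# Minimal rate-coding sketch: turn ordinary numbers into spike trains.
# Illustrative only; the encoding scheme here is an assumption, not a
# description of Intel's actual pipeline.
import random

def rate_encode(value, num_steps=20):
    """Encode a value in [0, 1] as a binary spike train whose
    firing rate is proportional to the value (Poisson-style)."""
    return [1 if random.random() < value else 0 for _ in range(num_steps)]

pixel = 0.75                      # e.g. a normalised pixel intensity
train = rate_encode(pixel)
print(train)                      # roughly 75% of steps carry a spike
print(sum(train) / len(train))    # empirical rate approximates the value
```

Encoding is only the first step: real toolchains, such as Intel’s open-source Lava framework, also have to map the resulting networks onto the chips and route spikes between them.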
Intel suggests that a machine like Hala Point could enable AI models that learn continuously, instead of having to be trained from scratch for each new task, as current models must. But James Knight at the University of Sussex, UK, dismisses this as “hype”.
Current models such as ChatGPT are trained using large numbers of graphics cards running in parallel, meaning many chips can work on training the same model at once. But because neuromorphic computers work with only a single input at a time and cannot be trained in parallel, it would likely take decades to train something like ChatGPT on such hardware, he says, and devising ways to let it learn continuously once it is running would be just as challenging.
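For context, the parallelism Knight is referring to is, in its simplest form, data parallelism: each chip computes gradients on its own slice of the training data, and the results are averaged into one shared update. A minimal sketch, with plain Python standing in for a multi-GPU setup and an invented toy model (fitting y = w·x by gradient descent):

```python
# Schematic sketch of data-parallel training: each "worker" (standing in
# for a GPU) computes a gradient on its own shard of the data, and the
# gradients are averaged into one shared update. Toy data and model.

data = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9), (4.0, 8.2)]  # (x, y) pairs
shards = [data[:2], data[2:]]    # one shard per worker
w = 0.0                          # shared model parameter
lr = 0.01

def shard_gradient(w, shard):
    """Gradient d/dw of the mean squared error (w*x - y)^2 over one shard."""
    return sum(2 * (w * x - y) * x for x, y in shard) / len(shard)

for step in range(200):
    grads = [shard_gradient(w, s) for s in shards]  # done in parallel on GPUs
    w -= lr * sum(grads) / len(grads)               # averaged into one update

print(round(w, 3))  # converges near 2.0
```

A machine that can only consume one input at a time has no shard to hand out, which is the crux of Knight’s decades estimate.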
Davies says that while today’s neuromorphic devices are not suited to training large AI models from scratch, he hopes they will one day be able to take pre-trained models and let them learn new tasks over time. “Though the methods are still being researched, this is the problem of continual learning that we believe large-scale neuromorphic systems like Hala Point can solve efficiently in the future,” he says.
He is also optimistic that neuromorphic computers could help with many other problems in computer science, beyond raising efficiency, once the tools developers need to write software for this unusual hardware have matured.
They may also offer a better path towards human-level intelligence, also known as artificial general intelligence (AGI), which many AI experts believe will not be achievable with large language models like ChatGPT. “I think that view is becoming less and less controversial,” says Knight. “The dream is that one day, neuromorphic computing will enable us to create brain-like models.”