UK physicists’ chip could make AI systems more energy efficient


Researchers in the United Kingdom have developed a new computer chip that could make some artificial intelligence (AI) systems far more energy efficient.

Developed by Loughborough University physicists, the device can process data that changes over time directly in hardware, rather than relying on software running on conventional computers.

The researchers claim the approach can be up to 2,000 times more energy efficient than conventional software-based methods for some tasks, though the exact gain depends on the application.

Energy-efficient neuromorphic electronics

“This is exciting because it shows we can rethink how AI systems are built,” said Dr Pavel Borisov, Senior Lecturer in Physics at Loughborough University, who led the research team funded by the Engineering and Physical Sciences Research Council (EPSRC).

“By using physical processes instead of relying entirely on software, we can dramatically reduce the energy needed for these kinds of tasks.”

Published in the journal Advanced Intelligent Systems, the work showcases a niobium oxide-based thin-film memristor device with intrinsic structural inhomogeneity in the form of random nanopores, which was used to perform computational tasks including XOR logic operations, image recognition, and time-series prediction and reconstruction.

“For the latter task we chose a complex three-dimensional chaotic Lorenz-63 time series. By applying three temporal voltage waveforms individually across the device and training the readout layer with electrical current signals from a three-output physical reservoir, we achieved satisfactory prediction and reconstruction accuracy in comparison to the case of no reservoir,” the researchers wrote in the study.

The research team also said the work highlights the potential for scalable, on-chip devices using all-oxide reservoir systems, paving the way for energy-efficient neuromorphic electronics that handle time-dependent signals.

The researchers showed that the device can process time-dependent data and that, when its output is fed into a simple linear model, the combination can identify patterns and make short-term predictions.

They tested the approach on the Lorenz-63 system – a well-known mathematical model of chaos associated with the “butterfly effect”, in which small changes in starting conditions can lead to very different outcomes – as well as on tasks including recognising simple pixelated images of numbers and performing basic logic operations, according to a press release.
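For context, the short Python sketch below shows what the Lorenz-63 model looks like. It uses the standard textbook parameters (sigma = 10, rho = 28, beta = 8/3) and a simple Euler integration step; these choices are illustrative assumptions and are not taken from the study itself.

```python
# Minimal sketch of the Lorenz-63 chaotic system.
# Standard textbook parameters are assumed; the study may use different settings.
import numpy as np

def lorenz63_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz-63 state (x, y, z) by one Euler time step."""
    x, y, z = state
    dx = sigma * (y - x)          # convection rate
    dy = x * (rho - z) - y        # horizontal temperature variation
    dz = x * y - beta * z         # vertical temperature variation
    return np.array([x + dt * dx, y + dt * dy, z + dt * dz])

# Generate a short chaotic trajectory from an arbitrary starting point.
state = np.array([1.0, 1.0, 1.0])
trajectory = np.empty((5000, 3))
for i in range(5000):
    state = lorenz63_step(state)
    trajectory[i] = state          # columns hold the x, y, z components
```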

In these tests, the linear model used the memristor-processed data to predict the short-term behaviour of the chaotic Lorenz system and to reconstruct missing data. The system also correctly identified the pixelated numbers and carried out the basic logic operations, showing that the same device can support a range of different tasks, as per the release.
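As a rough illustration of the general idea – not the authors' actual method – the sketch below reproduces the reservoir-computing recipe entirely in software: a fixed random nonlinear network stands in for the physical memristor, turning the input history into rich features, while only a simple linear readout is trained, here by ridge regression, to predict the next value of the Lorenz x-component. All sizes and parameter values are illustrative assumptions.

```python
# Software sketch of reservoir computing with a trained linear readout.
# A random echo-state-style network stands in for the physical memristor;
# all sizes and constants below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def lorenz63(n, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Generate n Euler steps of the Lorenz-63 trajectory."""
    s = np.array([1.0, 1.0, 1.0])
    out = np.empty((n, 3))
    for i in range(n):
        x, y, z = s
        s = s + dt * np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
        out[i] = s
    return out

# Input: the (normalised) x-component; target: its value one step ahead.
series = lorenz63(4000)[:, 0]
series = (series - series.mean()) / series.std()
u, target = series[:-1], series[1:]

# Fixed random reservoir: its weights are never trained.
n_res = 200
W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.normal(0.0, 1.0, size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # keep the dynamics stable

states = np.zeros((len(u), n_res))
x = np.zeros(n_res)
for t, u_t in enumerate(u):
    x = np.tanh(W @ x + W_in * u_t)   # nonlinear update driven by the input
    states[t] = x

# Linear readout trained by ridge regression: the only trained component.
n_train = 3000
X, y = states[:n_train], target[:n_train]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)

pred = states[n_train:] @ W_out
rmse = np.sqrt(np.mean((pred - target[n_train:]) ** 2))
print(f"One-step-ahead prediction RMSE on held-out data: {rmse:.4f}")
```

The point mirrored here is that the heavy, nonlinear part of the computation is never trained; in the Loughborough device that role is played by the physics of the nanoporous niobium-oxide film rather than by software, which is where the energy savings come from.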

Lower energy consumption

“Inspired by the way the human brain forms very numerous and seemingly random neuronal connections between all its neurons, we created complex, random, physical connections in an artificial neural network by designing pores in nanometre-thin films of niobium oxide as part of a novel electronic device,” said Dr Borisov.

“We showed how one can predict the future evolution of a complex time series using these devices at up to two thousand-times lower energy consumption compared to a standard software-based solution.”


