This New Chip Could Lead to Faster, More Secure AI
A team of researchers including Rajkumar Kubendran, an assistant professor in the University of Pittsburgh's Swanson School of Engineering, has helped develop a new type of computer chip that could run artificial intelligence programs locally rather than relying on the cloud.
Described in a paper published in Nature, this “compute-in-memory” chip is a step toward new ways of using artificial intelligence that are more secure, faster, cheaper and more ecologically friendly.
“Everybody can have this computer processor in their phones or any mobile device,” said Kubendran. “It would save battery when using artificial intelligence apps and make it easier for any device to run them. This development really can find a use in just about any device you can think of.”
Currently, most AI applications are hosted on the cloud: data servers accessed over the internet rather than on the device itself. The time and energy spent moving data over the long distances between these servers and users' devices is one of the main bottlenecks for conventional AI hardware. Compute-in-memory technology makes it possible to offload AI from the cloud to the device itself, eliminating the power-hungry process of shuttling data between separate parts of the computer.
Previous compute-in-memory chips used too much power to run complex AI applications directly on the devices that use them, Kubendran said. The new design the team published in Nature, however, is efficient enough to be used even by battery-powered devices like smart wearables or drones.
Kubendran began this work as a PhD student at the University of California, San Diego. His work on the project continued at Pitt, where he contributed to both collecting data and writing the paper. He explains that the chip was inspired by observations of how our own brain cells store and process information. “This led us to trying to emulate their function and purpose in our technology by building artificial neurons and synapses, in close proximity, similar to their biological counterparts,” he explained.
The eventual goal? Faster, more secure and cheaper AI technologies, making AI more accessible while also saving power and money for consumers.
Image by Nicolle Fuller/Sayo Studio