
MIT develops chip to make AI faster and drain less battery on smartphones


The Massachusetts Institute of Technology (MIT) has developed a special-purpose chip that is significantly faster and more power-efficient than its predecessors.

Specifically, the chip can increase the speed of neural network computations by three to seven times, all while reducing power consumption by 94 to 95 percent.

To achieve this, MIT researchers simplified the chip's design to cut down on the constant data transfers between its memory and processors.

By eliminating this extra work, the chip could, in theory, better handle tasks like advanced speech recognition, cloud computing and facial recognition.

The chip is intended for battery-powered devices such as smartphones, although MIT says it may also be capable of running in household appliances.

“This is a promising real-world demonstration of SRAM-based in-memory analog computing for deep-learning applications,” said Dario Gil, vice president of artificial intelligence at IBM, in an MIT press release.

“The results show impressive specifications for the energy-efficient implementation of convolution operations with memory arrays. It certainly will open the possibility to employ more complex convolutional neural networks for image and video classifications in IoT [the internet of things] in the future.”

Image credit: Pixabay

Source: MIT Via: Engadget


