
Apple researchers’ AI breakthrough could enable generative AI on iPhones

 Apple GPT might be closer than we expect

Researchers at Apple have reportedly made a major advancement in deploying large language models (LLMs) on mobile devices with limited memory, such as iPhones and iPads.

The breakthrough would reportedly allow Apple devices to run powerful AI assistants and chatbots on-device without relying on cloud servers.

Apple has been lagging behind other tech giants in AI, though that is hardly surprising: the company has long favoured getting things right over shipping them quickly.

To run AI assistants and chatbots on-device instead of in the cloud, Apple researchers have devised a method that keeps an LLM's parameters in flash memory, the same type of storage where your apps and photos live, and loads them into RAM on demand (via MacRumors).
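At a high level, the idea is to leave the model's weights on flash storage and pull only the pieces needed for the current computation into RAM. Below is a minimal Python sketch of that general pattern using a memory-mapped file; the file name, matrix sizes and data type are hypothetical and not taken from the paper.

```python
# Minimal sketch (not Apple's implementation): keep weights in a file on
# flash/disk and memory-map it, so only the rows that are actually used
# get read into RAM. Shapes, dtype and file name are hypothetical.
import numpy as np

ROWS, COLS = 8_192, 1_024            # hypothetical layer dimensions
WEIGHTS_PATH = "layer_weights.bin"   # weights live on flash, not in RAM

# Write a dummy weight file once so the example is self-contained.
np.random.rand(ROWS, COLS).astype(np.float16).tofile(WEIGHTS_PATH)

# Memory-map the file: nothing is loaded until specific rows are touched.
weights = np.memmap(WEIGHTS_PATH, dtype=np.float16, mode="r",
                    shape=(ROWS, COLS))

def load_rows(indices):
    """Read only the requested weight rows from flash into RAM."""
    return np.asarray(weights[np.sort(indices)])

# A sparse inference step only touches a small subset of rows.
active = np.random.choice(ROWS, size=128, replace=False)
chunk = load_rows(active)
print(chunk.shape)   # (128, 1024) -- a fraction of the full matrix in RAM
```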

In a research paper titled “LLM in a flash: Efficient Large Language Model Inference with Limited Memory,” the authors describe two main techniques, namely Windowing and Row-Column Bundling.

Windowing: This technique reuses some of the data that the LLM has already processed instead of loading new data every time. This reduces the frequency and amount of data fetching from the flash memory, making the process faster and smoother.
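Here is a rough illustration of the windowing idea in Python: a small cache keeps the data loaded for the last few tokens, and only neurons that are not already cached are fetched from flash. The window size, neuron indices and helper function are made up for the example and are not Apple's implementation.

```python
# Rough sketch of a sliding-window cache (illustrative only): weights for
# neurons used by the last WINDOW tokens stay in RAM, and each new token
# only triggers flash reads for neurons that are not cached yet.
from collections import deque

WINDOW = 2        # hypothetical window size (number of recent tokens)
cache = {}        # neuron index -> weights already resident in RAM
window = deque()  # active-neuron sets for the last WINDOW tokens

def fetch_from_flash(idx):
    """Stand-in for an actual flash read of one neuron's weights."""
    return f"weights[{idx}]"

def weights_for_token(active_neurons):
    """Return weights for this token, reading only the uncached neurons."""
    missing = [i for i in active_neurons if i not in cache]
    for i in missing:                      # flash I/O happens only for the delta
        cache[i] = fetch_from_flash(i)

    window.append(set(active_neurons))
    if len(window) > WINDOW:               # slide the window forward
        expired = window.popleft()
        still_needed = set().union(*window)
        for i in expired - still_needed:   # evict neurons that left the window
            del cache[i]

    return {i: cache[i] for i in active_neurons}

# Consecutive tokens tend to activate overlapping neurons, so most weights
# are reused and only a few new rows are read from flash at each step.
for active in ([1, 2, 3], [2, 3, 4], [3, 4, 5]):
    weights_for_token(active)
    print(sorted(cache))   # [1, 2, 3] -> [1, 2, 3, 4] -> [2, 3, 4, 5]
```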

Row-Column Bundling: This technique groups the data more compactly to read it faster from the flash memory. This speeds up the LLM’s ability to understand and generate language.
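A simplified way to picture row-column bundling: values that are always read together are stored next to each other on flash, so one sequential read replaces two scattered ones. The sketch below bundles a row of one feed-forward matrix with the matching column of the next; the matrix names, dimensions and file name are illustrative assumptions, not the paper's exact layout.

```python
# Illustrative sketch of bundling: store a row of one matrix and the matching
# column of the next matrix contiguously, so one sequential flash read returns
# both. Matrix names, shapes and the file name are assumptions for the example.
import numpy as np

D_MODEL, D_HIDDEN = 64, 256   # hypothetical feed-forward dimensions
w_up = np.random.rand(D_HIDDEN, D_MODEL).astype(np.float16)    # "up" projection
w_down = np.random.rand(D_MODEL, D_HIDDEN).astype(np.float16)  # "down" projection

# Bundle: for neuron i, put w_up's row i and w_down's column i side by side.
bundled = np.concatenate([w_up, w_down.T], axis=1)  # shape (D_HIDDEN, 2 * D_MODEL)
bundled.tofile("bundled_ffn.bin")                   # stored contiguously on flash

# Reading neuron i back is one contiguous chunk instead of two scattered reads.
flash = np.memmap("bundled_ffn.bin", dtype=np.float16, mode="r",
                  shape=(D_HIDDEN, 2 * D_MODEL))
i = 42
chunk = np.asarray(flash[i])
row_up, col_down = chunk[:D_MODEL], chunk[D_MODEL:]
assert np.array_equal(row_up, w_up[i])
assert np.array_equal(col_down, w_down[:, i])
```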

“This breakthrough is particularly crucial for deploying advanced LLMs in resource-limited environments, thereby expanding their applicability and accessibility,” wrote the authors.

This could pave the way for iPhones to run sophisticated AI assistants and chatbots entirely on-device, without the need for an internet connection or cloud servers, which would improve privacy and security.

Apple CEO Tim Cook has already confirmed that the Cupertino-based company is “investing quite a bit” in AI technology and that it will be a component of some of the company’s future products. He also mentioned that Apple will use generative AI in a “responsible” way, aligning with the company’s values and principles.

Past rumours have indicated that Apple is working on its own chatbot, which some refer to as "Apple GPT," and that employees are already using the tool for internal testing. A separate rumour suggested that iOS 18, expected to arrive next year, could bring generative AI features to Siri.

Image credit: Shutterstock

Source: Apple (LLM in a flash: Efficient Large Language Model Inference with Limited Memory), via: MacRumors

