Apple's artificial-intelligence research team has developed a technique that allows AI models to run directly on iPhones.
The technique relies on the iPhone's flash storage, the memory where users' apps, photos, and other files are kept, to run the large language models (LLMs) that underpin the system's artificial-intelligence features.
Chatbots built on large language models, such as ChatGPT and Claude, demand large amounts of data and memory, which is a challenge for consumer devices like iPhones with limited RAM. For that reason, these tools typically run on cloud services rather than on the device itself.
In a recent research paper, Apple's researchers describe techniques that make better use of flash storage to run AI models: they overcome the memory limits by reducing the volume of data transferred from storage and by maximizing storage read throughput, which speeds up loading and processing the model's data.
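The core idea of serving model weights from flash on demand, rather than holding the whole model in RAM, can be illustrated with a small sketch. This is a hypothetical illustration, not Apple's implementation: the file name, matrix sizes, and the notion of "active rows" (standing in for sparsity-aware loading) are assumptions for the example.

```python
import os
import tempfile
import numpy as np

# Hypothetical scenario: a weight matrix too large to keep in RAM sits in
# flash storage, and only the rows needed for the current step are read.
ROWS, COLS = 1024, 256

# Write a weight matrix to a temporary file, standing in for flash storage.
path = os.path.join(tempfile.mkdtemp(), "weights.bin")
np.random.default_rng(0).standard_normal((ROWS, COLS)).astype(np.float32).tofile(path)

# Memory-map the file: the OS pages data in from storage on demand, so
# touching a few rows transfers only those rows, not the whole matrix.
weights = np.memmap(path, dtype=np.float32, mode="r", shape=(ROWS, COLS))

def sparse_matvec(x, active_rows):
    """Multiply x by only the 'active' rows, mimicking sparsity-aware
    loading: rows predicted to contribute nothing are never read."""
    out = np.zeros(ROWS, dtype=np.float32)
    for r in active_rows:
        out[r] = float(weights[r] @ x)  # reads one row from flash
    return out

x = np.ones(COLS, dtype=np.float32)
y = sparse_matvec(x, active_rows=[3, 17, 900])
print(np.count_nonzero(y))  # only the touched rows are populated
```

The point of the sketch is the access pattern: memory-mapping lets the operating system pull in only the pages actually touched, so data transfer scales with the rows used per step instead of the full model size.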
According to the researchers, the new techniques run AI models up to four to five times faster on CPUs and twenty to twenty-five times faster on GPUs. They would let users access AI features such as advanced Siri capabilities, real-time language translation, sophisticated photography features, and augmented reality directly on the iPhone, without relying on Apple's servers.
Apple is expected to lean heavily on artificial-intelligence technologies in the upcoming iOS 18 and in subsequent iPhone and iPad releases.