Here’s why newer iPhones and Vision Pro can’t run Apple Intelligence

Update June 24: Daring Fireball’s John Gruber explains why Private Cloud Compute can’t take over the full functions of Apple Intelligence.

If you were watching Apple demo the main features of Apple Intelligence during the WWDC keynote on Monday, you were likely thinking of all the ways you'd be able to use the new service on your iPhone this fall. However, when it was over, many iPhone users were dismayed to learn it won't work on their phones: Apple Intelligence is off-limits to all but the newest and most expensive models.

While Macs and iPads going back to 2020 will get the benefits of Apple Intelligence, support in the iPhone lineup is restricted to the 15 Pro and 15 Pro Max. That leaves out the iPhone 15 and iPhone 15 Plus, released just a few months ago, as well as all older models still on sale and the iPhone SE.

It might seem like a strange decision, since the A16 chip in the iPhone 15 and iPhone 15 Plus is plenty fast, but a new report from Ming-Chi Kuo sheds some light on the requirements. As he notes, the Neural Engine in the A16 is actually more powerful than the M1's (17 trillion operations per second versus 11 TOPS), so the cutoff isn't about the NPU. Rather, it comes down to memory: A16 devices have 6GB of RAM, versus at least 8GB on every device that supports Apple Intelligence.

He breaks it down even further: “The demand for DRAM can be verified in another way. Apple Intelligence uses an on-device 3B LLM (which should be FP16, as the M1’s NPU/ANE supports FP16 well). After compression (using a mixed 2-bit and 4-bit configuration), approximately 0.7-1.5GB of DRAM needs to be reserved at any time to run the Apple Intelligence on-device LLM.”
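Kuo's 0.7-1.5GB range lines up with simple back-of-the-envelope arithmetic: 3 billion parameters stored at roughly 2 to 4 bits each works out to about 0.75-1.5GB for the weights alone. Here's a minimal sketch of that calculation, using Kuo's figures; the function is purely illustrative and ignores activations and runtime overhead:

```swift
import Foundation

/// Rough DRAM footprint (in GB) for a model's weights alone, given a
/// parameter count and an average bits-per-parameter. Illustrative only:
/// real memory use also includes activations, caches, and runtime overhead.
func weightFootprintGB(parameters: Double, bitsPerParameter: Double) -> Double {
    let bytes = parameters * bitsPerParameter / 8.0
    return bytes / 1_000_000_000.0
}

let params = 3_000_000_000.0 // Kuo's on-device 3B LLM

// The mixed 2-bit / 4-bit quantization Kuo mentions brackets his range.
let lowEnd  = weightFootprintGB(parameters: params, bitsPerParameter: 2) // ~0.75 GB
let highEnd = weightFootprintGB(parameters: params, bitsPerParameter: 4) // ~1.5 GB

print(String(format: "Reserved DRAM for weights: %.2f-%.2f GB", lowEnd, highEnd))
```

On a 6GB phone that slice has to coexist with the OS and foreground apps, which helps explain why 8GB became the floor.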

Over at Daring Fireball, John Gruber explains why devices that don't have enough memory can't just use Private Cloud Compute for most tasks: "The models that run on-device are entirely different models than the ones that run in the cloud, and one of those on-device models is the heuristic that determines which tasks can execute with on-device processing and which require Private Cloud Compute or ChatGPT." He also says the Vision Pro isn't getting Apple Intelligence because the headset "is already making significant use of the M2's Neural Engine to supplement the R1 chip for real-time processing purposes — occlusion and object detection, things like that."
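In other words, the decision about where a task runs is itself made by an on-device model, so a device without the RAM to run the local models can't even decide what to offload. Here's a purely hypothetical sketch of the three-way routing Gruber describes; the type and function names are invented for illustration and are not Apple APIs:

```swift
import Foundation

/// The three destinations the article describes: run locally, hand off to
/// Private Cloud Compute, or route to ChatGPT. (Hypothetical names.)
enum ExecutionTarget {
    case onDevice
    case privateCloudCompute
    case chatGPT
}

/// Invented stand-in for the on-device heuristic Gruber mentions.
/// The real router is itself a local model, which is why devices without
/// enough RAM can't simply push everything to the cloud.
func route(taskComplexity: Double, needsWorldKnowledge: Bool) -> ExecutionTarget {
    if needsWorldKnowledge {
        return .chatGPT
    }
    return taskComplexity < 0.5 ? .onDevice : .privateCloudCompute
}
```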

Rumors have previously claimed that all iPhone 16 models will have 8GB of RAM, and based on the Apple Intelligence requirements, that’s almost certainly the case. Kuo also assumes that future devices will likely start at 16GB of RAM as Apple Intelligence evolves “most likely to a 7B LLM.” Some smartphones, such as the OnePlus 12 and Xiaomi 14, already have 16GB of RAM.

If you're a coder, the situation's a little worse. The new predictive code completion AI in Xcode 16 requires an Apple Silicon Mac with 16GB of RAM, according to Apple's documentation.

When Apple Intelligence arrives with iOS 18 this fall, it will still be in beta. However, reports have said it will nevertheless be a centerpiece feature of the iPhone 16. 
