An interesting thing is happening with iOS 18.4. Visual Intelligence, previously exclusive to the iPhone 16 and iPhone 16 Pro, is being expanded to the iPhone 15 Pro. Visual Intelligence is a feature that uses the rear camera on your iPhone to analyze and identify objects in the viewfinder. It’s essentially Apple’s take on Google Lens.

[Image: iPhone 16 Pro being held, with text next to it describing Visual Intelligence]

With both the iPhone 16e and iPhone 15 Pro, Apple is expanding this Apple Intelligence feature to devices without Camera Control: Visual Intelligence will be accessible through either the Action Button or a Control Center toggle. That's great news, because in my mind this same approach should also provide a way to bring Visual Intelligence to Apple's other product line with a world-facing camera and powerful AI hardware.

Yes, I know, I know… the only people who use the back camera on iPads are old people at concerts. I'm well aware of that narrative. But we longtime iPad users know there are plenty of legitimate uses for the iPad camera. And really, the awkwardness of using the iPad camera only applies to the 13-inch models (if at all). With smaller iPads, especially the iPad mini, using the back camera actually feels pretty natural.

I can imagine working outside with my iPad in the Magic Keyboard case, and just tilting my iPad or repositioning it slightly to discreetly use Visual Intelligence. Or being able to do a quick Visual Intelligence lookup of something when I don’t have my iPhone near me.

The camera hardware on iPads may not be in the same league as a modern smartphone's, but the fact that Google Lens has worked for years on phones with older and worse cameras tells me that iPad hardware is more than up to the task.

Software that can leverage input from a rear camera is a big benefit of the tablet form factor over a traditional laptop. That makes iPad a really attractive platform for AI and augmented reality experiences like Visual Intelligence.

There have been no rumors of Apple expanding Visual Intelligence to iPad, but I think it makes as much sense to have this feature on iPads as it does on iPhones.

One response to “iPadOS 19 Feature Request: Visual Intelligence on iPad”

  1. I bet it shows up soon. The fact they brought it to the iPhone 15 Pro tells me they want to get it on as many devices as possible. That increases the adoption numbers and cultural impact. If the ChatGPT app can use an image taken from the back of an iPad and explain it, there is no reason why Apple can’t pipe it through their Visual Intelligence UI.
