If Apple made one thing clear during Tuesday’s iPhone 13 launch event, it’s that the cameras and processors on these new phones are going to be a big deal. Apple’s keynote even included a short film by Oscar-winning director Kathryn Bigelow and cinematographer Greig Fraser that was shot on the iPhone 13 Pro to show off its new camera tricks.
It’s evident that Apple is marketing the iPhone 13 and 13 Pro to photographers and videographers. But the iPhone 13’s advancements also hint at something bigger: Apple may be laying the groundwork for what’s expected to be its Next Big Thing, an augmented or virtual reality headset.
Upgrades like better cameras, more powerful processors and additional storage options are typical for new phones. But the way these additions come together with other iPhone updates from the past two years suggests Apple is setting up the iPhone to be an augmented reality powerhouse.
The iPhone may one day be the brains of Apple’s rumored AR/VR headset
Rumors have circulated for years that Apple could be working on smart glasses that provide augmented and virtual reality experiences. But unlike most Apple products, which typically leak in detail long before their release, the headset remains hazy: reports so far have painted a mixed picture.
Bloomberg reported last January that Apple is developing an all-in-one AR and VR headset geared toward developers that runs on its most powerful chips. This headset would serve as a precursor to a more mainstream pair of sleek AR glasses, the report said.
But a recent story from The Information, published just days before the iPhone event, describes something entirely different. It suggests that Apple’s headset will run on a less powerful chip and will therefore have to connect to a host device such as the iPhone.
If The Information’s report turns out to be correct, the iPhone 13 lineup certainly seems like a capable host for that kind of wearable device. Apple calls the iPhone 13 Pro’s version of the A15 Bionic chip, which has five graphics processing cores rather than the regular iPhone 13’s four, the fastest chip ever in a smartphone. Apple is pitching this phone at photo and video editors, but better graphics also likely means better performance in AR and VR apps.
Processing power aside, all of Apple’s iPhones are also getting a boost in battery life and more storage space, with the Pro becoming the first iPhone to get a 1TB storage option. Again, these are updates that would likely be necessary if AR apps become more popular, and Apple certainly appears confident they will. That makes me think Apple could be future-proofing these iPhones for a scenario in which we’re all using AR or VR apps on our phones almost daily, or for when Apple’s long-rumored headset finally exists.
The iPhone is gradually becoming better-equipped for AR
On their own, these upgrades don’t suggest anything meaningful about Apple’s ambitions for future products. But as we’ve seen in recent years, Apple is clearly making the iPhone much better at powering AR experiences that live on your phone.
Apple is positioning the iPhone 13’s camera updates, including a new Cinematic mode that automatically shifts focus between subjects, as ideal for media professionals. And Apple is probably right that that’s the most meaningful way those fancy cameras will be put to use in the near term. But cameras that can lock onto subjects more quickly and accurately would be extremely useful for AR apps too, even though Apple didn’t focus on AR during its event.
In addition to upgrading the iPhone’s cameras, Apple has been outfitting its gadgets with sensors that give them a much better sense of their surroundings. That’s key for a technology like AR, which needs to accurately detect objects in the real world in order to function.
The biggest clue came last year when Apple added a lidar scanner to the iPhone 12 Pro and Pro Max, a sensor that detects depth by measuring how long it takes for light to reflect back from an object. Apple hasn’t been subtle about the way lidar can improve augmented reality on the iPhone; it highlighted AR as a key reason for putting lidar in the iPhone in the first place. I can imagine that the iPhone 13 Pro’s upgraded cameras combined with lidar could enable it to run some powerful AR apps.
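This isn’t hypothetical on the developer side: ARKit already exposes lidar data to apps. As a rough illustration, here’s a minimal Swift sketch of how an AR app might opt into lidar-backed scene reconstruction and depth through ARKit; the surrounding app scaffolding and ARView setup are assumed.

```swift
import ARKit
import RealityKit

// Minimal sketch: enabling lidar-backed features in ARKit.
// The ARView and the rest of the app are assumed; this shows only the
// configuration step that lidar-equipped iPhones unlock.
func startLidarSession(on arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()

    // Scene reconstruction builds a live 3D mesh of the room from lidar
    // depth data. It's only supported on lidar-equipped devices, so we
    // check before opting in.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    // Per-frame depth maps (sceneDepth) are likewise gated on lidar hardware.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }

    arView.session.run(configuration)
}
```

That mesh is what lets virtual objects hide behind real furniture and snap convincingly onto real surfaces, exactly the kind of grunt work a tethered headset would hand off to the phone.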
A year earlier, Apple also put an ultrawideband chip in the iPhone for the first time. The iPhone 11 introduced Apple’s U1 chip, which enables location tracking that’s far more precise indoors than GPS. Right now, the iPhone’s ultrawideband tech is primarily used to improve AirDrop and to find lost items via Apple’s AirTag trackers.
Still, it’s another example of how the iPhone is becoming more spatially aware, and it could hold a lot of potential for future AR applications. AirTags are already providing an early indication of how this technology could be used in AR apps.
A feature called Precision Finding, for example, displays prompts on your iPhone’s screen leading you to your lost AirTag. It’s easy to imagine how that could translate to future AR apps that overlay directions on top of the real world instead of on your iPhone’s screen.
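Developers can already tap into that same spatial awareness through Apple’s Nearby Interaction framework, which surfaces the U1 chip’s measurements to apps. Here’s a minimal Swift sketch of reading the distance and direction to a nearby device; it assumes the peer’s discovery token has already been exchanged (say, over a local network connection), since the framework leaves that step to the app.

```swift
import NearbyInteraction

// Minimal sketch: reading UWB distance and direction via Nearby Interaction.
// Exchanging the peer's NIDiscoveryToken beforehand is assumed.
final class FindingSession: NSObject, NISessionDelegate {
    private let session = NISession()

    func start(with peerToken: NIDiscoveryToken) {
        // Nearby Interaction requires a U1-equipped device.
        guard NISession.isSupported else { return }
        session.delegate = self
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    // Called as the U1 chips on both devices update their relative position.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let object = nearbyObjects.first else { return }

        // distance is in meters; direction is a unit vector in the phone's
        // coordinate space (nil when the peer is outside the antennas' view).
        if let distance = object.distance {
            print("Peer is \(distance) meters away")
        }
        if let direction = object.direction {
            print("Direction vector: \(direction)")
        }
    }
}
```

Those two values, a distance in meters and a direction vector, are essentially the raw ingredients of a Precision Finding-style experience.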
And if that’s not enough, Apple’s upcoming iOS 15 also comes loaded with features that seem poised for future AR glasses, as my colleague Scott Stein notes.
Did Apple give the iPhone 13 a better camera, more processing power and longer battery life just for augmented and virtual reality apps? No. These upgrades are useful for every smartphone user, even those who aren’t shooting movies on their iPhones and primarily use their device for snapping pet photos and reading the news. But when you consider these updates in the context of how the iPhone has evolved over the years, it certainly seems like there’s potential for a whole lot more.