I’ve spent the last week wearing Apple’s new AirPods Pro. Not a week straight, I mean, but pretty consistently in all the places where I would usually use one of my other myriad sets of headphones.
In looking at the AirPods Pro as a product, I think there are important things to be gleaned from the choices Apple made in its design: the kind of choices that may lend insight into how Apple is thinking about the wearables market.
Wearables, of course, was the category that was sharply up in the company's most recent quarterly results, and it's clearly a place where Apple is likely to focus attention in the future. And with rumors of Apple's AR goggles/glasses starting to coalesce around next year, the AirPods Pro might key us into how Apple is thinking about entering the still nascent (or perhaps non-existent) market for augmented reality headsets.
Holistic Area Network
As I walked up the street the other day, listening to music on the AirPods Pro, they alerted me that my iPhone was ringing. Without breaking my stride, I glanced at my Apple Watch, saw it was from an unknown number, and tapped the button to dismiss the call. All without pulling out my iPhone.
It’s not the first time I’ve thought of Apple’s wearable devices as part of a “personal area network,” but the reality really hit home in that moment: a wrist-mounted display and sensor package in the form of the Apple Watch marries strikingly with the AirPods Pro, which deliver not only audio for entertainment purposes, but notifications as well.
An augmented reality heads-up display would seem to enhance this even further, completing a constellation of devices that provides a variety of ways to interact with your technology. Treating such a device more like an external display than a piece of self-sufficient hardware à la Microsoft's HoloLens would free the company up to design a much lighter, simpler gadget. And if there's one thing we can all agree Apple likes, it's light, thin hardware.
Talk to me, Siri
With the latest update, iOS 13.2, Apple has enabled a feature for AirPods that feels distinctly like an augmented reality capability: namely, the ability for Siri to read messages to you as they arrive. Rather than just providing an audible alert chime, Siri will lower the volume of whatever audio is playing and then read the entire message. The virtual assistant will even describe emoji sent in messages, or relate the action from a tapback in iMessage ("John laughed at your text", for example).
This kind of seamless integration could be a preview of what a pair of augmented reality glasses from Apple might provide. Sure, such a device could simply barrage you with notifications like your Apple Watch, but does anybody really want that? Instead, it will almost certainly let you choose specifically which apps are allowed to provide you with information. And, in truest Apple fashion, it may not even let that many apps take advantage of your attention in its earliest versions.
And if you’re wondering how Siri would work in a visual environment, look no further than the Siri watch face on the Apple Watch. Granted, it’s an interface that’s still in need of improvement, but there’s something to be said for using machine learning to surface only the notifications and data you need, when you need them.
Isolation and transparency

One of the most interesting features of the AirPods Pro is their spectrum of isolation technologies, from noise-canceling mode to the Transparency feature. Apple’s certainly not the first company to implement such capabilities, but by making these modes quick to toggle between, the company strongly signals that it expects them to be used, and switched between, frequently.
The Transparency mode, which essentially pipes in outside sound from the AirPods Pro’s mics, lets users interact with the world without having to remove the AirPods. Think of it as an augmented reality mode, while noise canceling is more like virtual reality, insulating you in a world where your audio is the more important factor. I found it handy, for example, when waiting for my name to be called at the coffee shop.
Could a similar set of features find their way into whatever heads-up display hardware Apple might be developing? I don’t find it hard to imagine an ability to toggle between an AR mode where you are getting information about the world around you and a “transparent” mode where the glasses focus more on the world itself. That’ll be particularly important if you’re ever expected to wear these devices while doing a task like, say, driving a car; some sort of Do Not Disturb mode would seem to be a must-have. Because in investing in a wearable heads-up display, Apple will have to walk a very fine line between “augmented” and “reality.”