Apple Augmented Reality Play: Glasses and Much More
Apple’s investment in augmented reality is getting a lot of media play today on the heels of a Bloomberg report on the team:
Apple has built a team combining the strengths of its hardware and software veterans with the expertise of talented outsiders, say the people, who requested anonymity to discuss internal strategy. Run by a former Dolby Laboratories executive, the group includes engineers who worked on the Oculus and HoloLens virtual reality headsets sold by Facebook and Microsoft as well as digital-effects wizards from Hollywood. Apple has also acquired several small firms with knowledge of AR hardware, 3D gaming and virtual reality software.
This follows the script Apple used in creating its watch and in its self-driving car project. The watch succeeded in reaching the market and racking up impressive sales, but the car didn’t fare as well. So the question is whether Apple AR will be more like the watch, the car, or, better yet, the iPhone.
Where the Car Went Wrong
Apple got a couple of years into the car project before realizing that it would take an enormous investment in areas not likely to return large margins: dealerships, a support network, and the actual car itself. While Apple is good at delivering entertainment and a great user experience, cars are mainly about the task of delivering people from point A to point B with a minimum of hazard, fuss, and expense.
Utilitarian products are not Apple’s game. More importantly, the contribution of Apple’s software would have been dwarfed by the strictly automotive task, leaving no large margins to capture.
And I suspect that Apple realized that revolutionizing the driving experience requires expertise Apple doesn’t quite have yet. The AR play is a way to bridge that gap, so the car wasn’t dropped so much as delayed until these other technologies are developed. AR is a way to make them come to life.
AR is More than Gadgetry
Bloomberg seems to underestimate what’s involved in AR. It’s not simply a matter of mating AR glasses to an iPhone and declaring victory. AR is a new paradigm that will need all the other pieces of a comprehensive new way of using computers, software, and content.
Apple realizes this, and added people to the team with unusual skills for a computer company:
Apple has also recruited people with expertise in everything from 3D video production to wearable hardware. Among them, the people say: Cody White, former lead engineer of Amazon’s Lumberyard virtual reality platform; Duncan McRoberts, Meta’s former director of software development; Yury Petrov, a former Oculus researcher; and Avi Bar-Zeev, who worked on the HoloLens and Google Earth.
Apple has rounded out the team with iPhone, camera and optical lens engineers. There are people with experience in sourcing the raw materials for the glasses. The company has also mined the movie industry’s 3D animation ranks, the people said, opening a Wellington office and luring several employees from Weta Digital, the New Zealand special-effects shop that worked on King Kong, Avatar and other films.
This suggests a desire to create an immersive entertainment experience that’s integrated with real life. Apple doesn’t want to take you away from your daily tasks to immerse you in a movie or a game; it wants to add a thrill factor to daily life. That’s what Apple does, after all.
Siri is the Missing Piece
Regardless of how much computing power Apple can add to the iPhone, it won’t be enough to create the full AR experience. To map imagery and sound onto real life with a good sense of context, we’re going to need cloud computing horsepower on the back end. AR is best thought of as the user interface part of an immersive computing experience.
The new AR glasses will be supplemented by something like Apple’s AirPods – earphones you can always wear that include high-quality mics. With AirPods we can enjoy hands-free communication with Siri. And with a new, improved Siri, we have access to the cloud.
So AR is a way to bring AI into daily life: it gives us the information we need, when we need it, in a way that doesn’t interfere with the things we need to do.
Examples of AR in Daily Life
Some people are already thinking about how this will work. Blogger Jason Calacanis speculated a week ago about some things he’d like Siri to do. His list is pretty utilitarian, but a couple of scenarios are suggestive:
- Hey Siri, set Waze to take me to my office on the 280 unless it’s seven minutes more than the 101 and set my Tesla to 84 miles per hour, but slow down to 72 miles per hour if Waze reports the police are ahead…
- Imagine someone walks up to you speaking Japanese, and it’s automatically translated for you.
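The first scenario is really just a conditional rule an assistant would evaluate continuously. A minimal sketch of that rule in Python, purely for illustration — the function names, signatures, and thresholds are hypothetical and do not correspond to any real Siri, Waze, or Tesla API:

```python
# Hypothetical sketch of the route-and-speed rule from the first scenario.
# Nothing here is a real Siri/Waze/Tesla interface.

def choose_route(eta_280_min: float, eta_101_min: float,
                 tolerance_min: float = 7.0) -> str:
    """Prefer the 280 unless it costs more than `tolerance_min`
    extra minutes compared with the 101."""
    if eta_280_min - eta_101_min > tolerance_min:
        return "101"
    return "280"

def target_speed_mph(police_reported: bool) -> int:
    """Drive at 84 mph, but slow to 72 mph if police are reported ahead."""
    return 72 if police_reported else 84

print(choose_route(35, 30))    # 280 is only 5 min slower -> "280"
print(choose_route(40, 30))    # 280 is 10 min slower -> "101"
print(target_speed_mph(True))  # police ahead -> 72
```

The point of the sketch is how small the decision logic is: the hard parts — parsing the spoken request, fetching live traffic, and actuating the car — are exactly the cloud-backed, context-aware services the article argues Siri must supply.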
I’m sure you can think of other ways to employ an AI with access to the world around you as well as massive databases of personal and public information to make your life better.
The New Vistas Thing
Once we’re used to having a context-aware information buddy following us when we want one, it’s easy to see how the next generations of cars and TVs will look. They will have to adapt to our new way of doing things.
And if Apple is successful in bringing information-rich AR into daily life, consumers will demand that all of our machines adapt as well. So that means Siri-enabled (or Alexa-enabled if Amazon wins) devices in the car, the home, and the office.
But an AR-focused phone supplemented with a new headset is the next step along the path. Apple certainly appears to be doing the right thing. And Microsoft isn’t just sitting back waiting for this to happen either: it’s building up its AI/research staff to 5,000 people so as not to miss the Next Big Thing.