Some interesting details from Mark Gurman for Bloomberg on Apple’s AR and VR hardware projects led by Mike Rockwell:
He started building his team in late 2015, and what grew into a 1,000-strong group of engineers went to work developing two products aimed at upending the VR and AR segments. A device code-named N301 would take the best of both VR and AR—the first an all-encompassing digital experience for gaming and consuming content, and the second a tool for overlaying information such as text messages and maps in front of a viewer. The other device, N421, a lightweight pair of glasses using AR only, is more complex.
N301 was initially designed to be an ultra-powerful system, with graphics and processing speeds previously unheard of for a wearable product. The processing capabilities were so advanced—and produced so much heat—that the technology couldn’t be crammed into a sleek headset. Instead, Rockwell’s team planned to sell a stationary hub, which in prototype form resembled a small Mac, that would connect to the headset with a wireless signal. In Rockwell’s early version, the headset would also be able to operate in a less-powerful independent mode.
Which sounds like a lot of the other attempts to do truly “magical” VR and AR, from Magic Leap to Oculus: to reach that level, they all still need some sort of external computing device. This, it seems, did not sit well with Jony Ive:
Ive balked at the prospect of selling a headset that would require a separate, stationary device for full functionality. He encouraged Rockwell and his team to redevelop N301 around the less powerful technology that could be embedded entirely in the device. Rockwell pushed back, arguing that a wireless hub would enable performance so superior that it would blow anything else on the market out of the water. The standoff lasted for months.
And then:
As for the impasse between Rockwell and Ive, Chief Executive Officer Tim Cook ultimately sided with the design chief. Although the headset now in development is less technologically ambitious than originally intended, it’s pretty advanced. It’s designed to feature ultra-high-resolution screens that will make it almost impossible for a user to differentiate the virtual world from the real one. A cinematic speaker system will make the experience even more realistic, people who have used prototypes say. (The technology in the hub didn’t go entirely to waste: Some is being recycled to build the powerful processors Apple plans to announce next week for its Macs, replacing components made by Intel Corp.)
Still, dispensing with the hub means graphics won’t be as good as they might have been, and the download speeds could be slower. It will also probably make the experience less lifelike than originally hoped. For Ive, who left last year after almost three decades at the company, a more realistic experience was potentially problematic: He didn’t want Apple promoting technology that would take people out of the real world. According to people familiar with the matter, he preferred the concept of the N421 glasses, which would keep users grounded in reality while beaming maps and messages into their field of vision.
Given all the hoopla and touting by Apple of their move to their own silicon for the Mac, it’s sort of surprising they couldn’t use the iPhone itself as the “external hub” for these devices. Again, the chips in the ARM Macs would seem to be the very same ones that power the iPhone (well, technically the iPad, for the dev kit, at least). Perhaps Apple is worried about battery drain or the CPU running too hot if these things require too much computing power. Or maybe they’re just waiting for the A14 or A15 or A16…
Still, it’s an interesting sidenote about the rift with Ive here. Not just over the hardware, but over the philosophy of what these devices are trying to do… Though it’s hard to hold that argument when you’ve left, of course. And Apple is clearly evolving. How these products land — in 2022 and 2023, it sounds like — will be a true test of the “new” Apple (beyond Services, that is).