The last few years of fluctuation in excitement around XR have uncovered a few lessons and realizations. For example, VR hardware faces consumer resistance and price sensitivity, while mobile AR has greater nearer-term revenue potential.
Another trend we’ve begun to track is that XR tech providers often trade capability for scale. In other words, reining in initial aspirations for high-end specs, some have realized that lower-grade — but more affordable, scalable and “good enough” — products can be strategic.
The biggest example of this principle is ARCore. It eventually replaced its forebear, Tango, because the hardware and logistical barriers to bringing Tango to market were too high. Tango’s costly optics didn’t fly with the cost-conscious smartphone OEMs with which Google partnered.
So once Google developed good-enough AR software that works on the RGB cameras of hundreds of millions of existing smartphones, it pointed all guns in that direction. The other important lesson here is the role software can play in working around XR’s hardware bottlenecks.
More evidence of the capability/scale tradeoff can be seen in Oculus Go. The lower-end device will be an attempt to jumpstart Mark Zuckerberg’s goal of a billion people in VR. That won’t happen with higher-end VR, given a price point that’s untenable for mainstream consumers.
Another example lies in WebXR versus apps. The vestiges of the app-happy smartphone era won’t always be optimal for XR. Due to interoperability, accessibility and other market-expanding factors, browser-based XR will often be better than apps — even if technologically inferior.
Lastly, there’s marker-based AR — a primitive form of AR relative to the SLAM-based scene mapping in ARKit and ARCore. But could markers in traditional media make AR sessions more accessible and frequent? Facebook thinks so, using markers in its AR-centric Ready Player One campaign.
Poisoning the Well
Of course, there are downsides to sacrificing quality to achieve scale. At XR’s early and fragile stage, where haters abound, it’s important to create truly compelling experiences. That’s at odds with any strategy that involves compromising quality, given the old adage about first impressions.
This is what Oculus CTO John Carmack calls “poisoning the well.” Unlike its parent, and fitting for a hardware play, Oculus’ product strategy is to not release too early (don’t we know it). The thinking is that sub-par experiences can turn off prospective adopters, who may not return for years.
NVIDIA’s Martina Sourada expressed a similar devil’s advocate position on Oculus Go at December’s VRARA panel discussion that ARtillry moderated. She argued that the best of both worlds — untethered, while also high quality — will only be realized when Santa Cruz arrives.
Timing is Everything
And that’s just it: It’s a matter of timing. Moore’s Law and other technological enablers will reach that sweet spot of price/accessibility/quality. But it’s not here yet, hence the tradeoff between capability and scale. Premium quality is obviously ideal, but not if there’s no market to use it.
In addition to some of the tactics above, like trading capability for scale and getting creative with software, “unifying technologies” will also play a role. Among our 2018 predictions, these include things like the AR Cloud, which counteracts the platform fragmentation that slows things down.
Meanwhile, the XR industry’s 2016 exuberance has been reined in a bit. The good news is that it wasn’t wrong or overstated… it was just early — similar to the early-2000s e-commerce bubble. XR will make its way back to those loftier aspirations, but not until the market develops a bit.
That will happen over the next 3–5 years, bringing a less binary combination of capability and scalability. Then we’ll get to realize some of those early visions, including scaled-up Tango-like phones, more ubiquitous high-end VR and all the other trappings of premium XR.