One of the biggest hurdles facing anyone wanting to deliver AR/VR content right now is that every implementation requires its own packaging of content data. Some of this is a result of the “game” and “app” ecosystems these experiences come from, but mostly there is simply no alternative.
Content cannot be delivered as a broadcast stream because there is no definition of what that stream is, and without one there is no standard viewing “environment” to build on. There are some attempts to work on this – YouTube’s 360 video is an interesting way of delivering one component of immersive content, but it’s not an extensible technology that others can build on; it’s essentially only a movie player. A content creator cannot, for instance, embed a 360 video as one of many elements in a deliverable program.
And so content creators also have to be technologists capable of building worlds of mixed elements inside of an app or game metaphor. Each experience is a one-off, individually crafted delivery of heterogeneous content. But most of this content is really just reconfigured instances of only a handful of different kinds of data – 2D, 3D, static, animated, geometry, images, navigable, etc. – and that repetition could be exploited into not only a consistent data exchange “format”, but also a consistent experience environment. A content provider would construct, not an app or game, but a container of elements and descriptors, deliverable as a “unit” to any compliant experience environment – the way a broadcast network delivered TV shows, bounced off satellites, thrown across the airwaves or down cables to a TV set that decoded and displayed the experience.
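To make the “container of elements and descriptors” idea concrete, here is a minimal sketch of what such a unit’s manifest might look like. Everything in it is hypothetical – the format identifier, the field names, and the element types are all invented for illustration; no such standard exists.

```python
import json

# A hypothetical manifest for a deliverable "unit" of immersive content.
# Every field name and type name here is invented for illustration only.
manifest = {
    "format": "immersive-package/0.1",   # hypothetical format identifier
    "title": "Example Experience",
    "elements": [
        {"id": "backdrop", "type": "video-360", "src": "assets/backdrop.mp4"},
        {"id": "stage",    "type": "geometry",  "src": "assets/stage.obj"},
        {"id": "score",    "type": "audio",     "src": "assets/score.wav"},
        {"id": "poster",   "type": "image-2d",  "src": "assets/poster.jpg"},
    ],
    # Descriptors tell a compliant player how to assemble the elements,
    # much as NTSC told a TV set how to decode a broadcast signal.
    "descriptors": {"navigable": True, "anchor": "stage"},
}

# The point of standardization: a player only needs to understand the
# handful of element types, not each one-off app.
known_types = {"video-360", "geometry", "audio", "image-2d"}
assert all(e["type"] in known_types for e in manifest["elements"])

print(json.dumps(manifest, indent=2))
```

The same few data kinds – 2D, 3D, audio, 360 video – recombine into any experience, which is exactly the repetition the essay argues a standard could exploit.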
But what would that package look like? How can we all agree? What are the NTSC, MPEG, JPEG, OBJ, and WAV of VR? Is it a file? Is it a file aggregation container? There are a lot of questions to answer, but freeing content creators from worrying about the technology of the viewing experience could give them the freedom other creators have had for years. Filmmakers don’t have to worry about the inner mechanical workings of projectors, writers don’t have to worry about how printing presses work, and AR/VR content creators should not have to worry about writing apps.
It’s easy to imagine the content and potential of future alternative and trans-media experiences – what they might look like, might sound like, might feel like, smell like? What’s more difficult is to imagine delivery channels of sufficient scale to get those experiences to an audience. At GDC this year there were a lot of companies with amazing demos of amazing technology that played amazingly – at least when all the proprietary hardware could connect to the proprietary “store” to play proprietary file formats. In an age of Open Source and Open Standards, the Achilles heel of VR might just be an attachment to closed platforms and formats.
“It has to be as easy to use as a toaster” – or a TV, but probably easier than a microwave oven. It reminds me a lot of the mid-90s “multi-media” revolution, which was a lot of promise plagued by driver problems, hardware incompatibilities, and an explosion of file formats.
As easy as TV – but why is TV easy? Because TV is a standard – NTSC, PAL, SECAM, and now digitally with MPEG. More than once I saw demos false-start with an error saying that the hardware couldn’t connect to the *brand-name* store. Not only is this the monetization tail wagging the physical playback dog, it also points out that the VR world is not only siloed, but apparently even garrisoned. Currently I can author content in Unity, or Unreal, or something else that turns into something that can only play back in a corresponding engine through a dedicated app, and while this is great for a game approach, it’s a pain in the ass if what I want is a TV/toaster level barrier to entry.
How do we get there from here? We need the NTSC of VR. And not a monolithic beast that tries to be everything to everybody, but something that standardizes basic delivery of content. It’s the old 85/10/5 approach – figure out how to automate 85%, facilitate 10%, and make the last 5% possible – if painful. Standardized delivery is that 85%. Just as broadcast TV never replaced movies, a VR standard won’t obviate the need for cutting edge experiences to be delivered on bleeding edge hardware. But what VR/AR and other immersive media experiences need is widespread adoption, and the only way to get that is with a smart way of encapsulating most of the functionality needed in a standardized wrapper that can leverage existing online delivery channels. YouTube and Vimeo can both play back video files, or those files can be hosted somewhere else entirely and served up through an embedded link. It’s a delivery ecosystem that the web knows, and knows how to monetize already; it’s just waiting for standard packages to deliver.
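To see how little new plumbing the web would need, consider how it already routes media by file type. The sketch below registers a made-up package extension and MIME type – both invented for illustration, not any real standard – and shows that existing infrastructure would then resolve it exactly as it resolves an `.mp4` today.

```python
import mimetypes

# Hypothetical: suppose a standard immersive package existed with the
# extension ".ivp" (name invented for illustration). One registration
# and the web's existing type machinery can route it like any video file.
mimetypes.add_type("application/x-immersive-package", ".ivp")

# Existing infrastructure resolves the new type like any other media file.
print(mimetypes.guess_type("experience.ivp")[0])
# -> application/x-immersive-package

# For comparison, the long-standardized case the essay points to:
print(mimetypes.guess_type("show.mp4")[0])
# -> video/mp4
```

The delivery channels, caches, and embed mechanisms are all type-agnostic; only the standard package format itself is missing.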