Standard Delivery of AR/VR Packages

It’s easy to imagine the content and potential of future alternative and trans-media experiences – what they might look like, sound like, feel like, even smell like. What’s more difficult is to imagine delivery channels of sufficient scale to get those experiences to an audience. At GDC this year there were a lot of companies with amazing demos of amazing technology that played amazingly. At least when all the proprietary hardware could connect to the proprietary “store” to play proprietary file formats. In an age of Open Source and Open Standards, the Achilles heel of VR might just be an attachment to closed platforms and formats.

“It has to be as easy to use as a toaster” – or a TV, but probably easier than a microwave oven. It reminds me a lot of the mid-90s “multi-media” revolution, which was a lot of promise plagued by driver problems, hardware incompatibilities, and an explosion of file formats.

As easy as TV – but why is TV easy? Because TV is a standard – NTSC, PAL, SECAM, and now digitally with MPEG. More than once I saw demos false-start with an error saying the hardware couldn’t connect to the *brand-name* store. Not only is this the monetization tail wagging the physical-playback dog, it also points out that the VR world is not just siloed, but apparently garrisoned. Currently I can author content in Unity, or Unreal, or something else that turns into something that can only play back in the corresponding engine through a dedicated app. That’s great for a game approach, but it’s a pain in the ass if what I want is a TV/toaster-level barrier to entry.

How do we get there from here? We need the NTSC of VR – not a megalithic beast that tries to be everything to everybody, but something that standardizes basic delivery of content. It’s the old 85/10/5 approach – figure out how to automate 85%, facilitate 10%, and make the last 5% possible, if painful. Standardized delivery is that 85%. Just as broadcast TV never replaced movies, a VR standard won’t obviate the need for cutting-edge experiences delivered on bleeding-edge hardware. But what VR/AR and other immersive media experiences need is widespread adoption, and the only way to get that is with a smart way of encapsulating most of the needed functionality in a standardized wrapper that can leverage existing online delivery channels. YouTube and Vimeo can both play back video files, or those files can be hosted in one place and served up through an embedded link. It’s a delivery ecosystem the web knows, and already knows how to monetize; it’s just waiting for standard packages to deliver.
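To make that concrete, here’s a purely hypothetical sketch of what such a standard package could reduce to – a manifest describing the assets and capabilities of an experience. No such spec exists; every field name below is my own placeholder for the kind of metadata a toaster-simple player would need:

```python
import json

# A purely hypothetical manifest -- no such standard exists, and every
# field name here is an assumption, not a real spec.
manifest = {
    "format_version": "0.1",
    "title": "Example Experience",
    "entry_scene": "scenes/main.scene",
    "assets": ["scenes/main.scene", "textures/atlas.png", "audio/ambience.ogg"],
    "capabilities": {"stereo": True, "positional_tracking": False},
}

# Written alongside the assets, this is all a "toaster-simple" player
# would need to discover, fetch, and launch the experience.
with open("package_manifest.json", "w") as f:
    json.dump(manifest, f, indent=2)
```

The point isn’t the particular fields – it’s that a dumb, hostable file like this is the kind of thing existing web channels already know how to serve and embed.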


Borrowing a Historical Lens

The comment on this image – that the photo is “as well composed as the best Renaissance art” – is interesting. Firstly, it’s as well composed as the best Romantic/Neoclassical art, some 400 years after the Italian Renaissance (there’s more time between Da Vinci and Géricault than between Géricault and Picasso). But our collective apathy toward our own cultural history is not the most interesting part. The power in this image is not so much in its composition directly, but in how much that composition makes the content of the image play in the same intellectual/emotional space as the old paintings it borrows from. We read the brawl with all the heroic grandeur of Neoclassical history painting because we are seeing it through the lens of Neoclassical history painting. And so the struggle in the Ukrainian Parliament becomes “The Struggle in the Ukrainian Parliament”, and the picture becomes a grand satire within the cognitive dissonance that exists between a schoolyard fight among old men in suits and the “really important stuff” we see in “real” Art museums.

(need to find image attribution and update)

#artsnob #usingmydegree #agingpoststructuralism

#Siggraph2013 #effectsOmelet

The “Effects Omelet” presentation at SIGGRAPH is always a great source for inspired, on-the-ground creativity from VFX artists and TDs. David Lipton, Head of Effects at DreamWorks Animation, gave a particularly interesting talk about how he achieved the Jack Frost frost effect in DWA’s “Rise of the Guardians”.

It was an interesting use of old-school approaches to get more controllable, artistic results. The frost needed to be a highly stylized, very art-directable and expressive effect, in which Jack’s staff would freeze objects by propagating elegant, icy arabesques that skated across surfaces, covering them in stylized frost patterns.

Lipton said that they were helped immensely by the copious notes, reference images and concept art prepared by the Art Department.  This gave him and his team a very distinct target to aim for, and helped to narrow the problem at hand.

The first approaches were simulation-based, but proved hard to control, especially because the effect itself needed to be an expressive actor in the film, with its performance often directing the eye through key story moments. The winning approach was to look far back into the history of computer graphics to an old standby: cellular automata. These are systems in which the cells of a grid, like squares on a checkerboard, follow simple rules that determine how each cell becomes filled by its neighbors. In this case the rules determined how ice would grow from square to square as time progressed. The speed at which the squares were filled defined paths, like roadways, along which the delicate, stylized crystal patterns would be constructed. Because the automata live in a grid, the rules could be “painted” in like pixels in a digital photo, providing a high degree of control. The end result was a controllable yet very organic-looking crystal propagation that added a sense of magic and expressiveness to the scenes.
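For the curious, here’s a minimal sketch of that kind of grid automaton – my own toy illustration of the general idea, not DWA’s production code. The paintable speed map stands in for the artist’s control, and the recorded fill times are what would define the “roadways” for the crystal geometry:

```python
import random

# Toy grid automaton for frost growth -- an illustration of the idea
# described above, not DWA's production code. A paintable "speed" map
# controls how readily frost jumps from cell to cell; each cell's
# recorded fill time defines the paths crystal geometry would follow.

SIZE = 32
speed = [[1.0] * SIZE for _ in range(SIZE)]       # per-cell growth speed, artist-paintable
fill_time = [[None] * SIZE for _ in range(SIZE)]  # automaton step at which each cell froze

def grow(seed, steps=200):
    """Propagate frost outward from a seed cell, one automaton step at a time."""
    sx, sy = seed
    fill_time[sy][sx] = 0
    frontier = {seed}
    for step in range(1, steps + 1):
        next_frontier = set()
        for x, y in frontier:
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < SIZE and 0 <= ny < SIZE and fill_time[ny][nx] is None:
                    if random.random() < speed[ny][nx]:
                        fill_time[ny][nx] = step   # neighbor freezes this step
                        next_frontier.add((nx, ny))
                    else:
                        next_frontier.add((x, y))  # retry from this cell next step
        frontier = next_frontier
        if not frontier:  # everything reachable has frozen
            break

grow((SIZE // 2, SIZE // 2))
```

Paint the speed map toward zero and the frost slows or stops; paint bright lanes into it and the growth races down them – which, in spirit, is the pixel-level control Lipton described.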

Siggraph 2013 Emerging Tech: Girish Balakrishnan

#Siggraph2013 Emerging technology
Girish Balakrishnan, a master’s candidate from Drexel University, was demonstrating his performance-capture camera rig made entirely of commodity consumer components. It’s centered around an iPad and attached PlayStation 3 controllers that provide the rig’s spatial tracking as well as the user-interface components.

The virtual world which the camera operator navigates is provided as a Unity game-engine scene running on the iPad. As the operator moves through space, the iPad displays that motion through a virtual camera in the game scene – like Avatar on a beer budget. The iPad integrates data from the PlayStation controllers with its own, storing it as a file that can be imported into Maya or MotionBuilder.
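The talk didn’t specify the file format, so here’s a hypothetical sketch of the kind of export step such a rig might perform – the column names and layout below are my assumptions, not Balakrishnan’s actual pipeline:

```python
import csv

# Hypothetical export of tracked camera motion, one row per frame.
# The columns (translation + Euler rotation in degrees) are assumptions;
# the talk only says the data imports into Maya or MotionBuilder.
def export_camera_track(samples, path="camera_take.csv"):
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["frame", "tx", "ty", "tz", "rx", "ry", "rz"])
        for frame, (translate, rotate) in enumerate(samples, start=1):
            writer.writerow([frame, *translate, *rotate])

# One dummy frame: camera at eye height, turned 90 degrees on Y.
export_camera_track([((0.0, 1.6, 0.0), (0.0, 90.0, 0.0))])
```

On the Maya side, each row could then be turned into keyframes on a camera (e.g. via maya.cmds.setKeyframe, one attribute per column), which is roughly what “importable into Maya” implies.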

Balakrishnan has been interested in performance capture for years and feels that the current crop of tools leaves users too tethered to the mouse and keyboard. He wants to change that using tablets, commodity cameras, and game technology. His enthusiasm for the project might just make it a reality.

In its current configuration it can serve as a low-budget indie game-production tool or a very inexpensive previs tool for independent film and video production. Girish is looking into incorporating new HD cameras like the Blackmagic to build a more robust camera performance-capture system that could expand the creative palette of independent filmmakers. Performance in the venue was hampered by the huge amount of wireless interference in the Emerging Technologies hall, but it would be interesting to see how it performs in its intended environment – the mocap green-screen stage in your garage.