They say you shouldn’t hijack the head-tracking data stream of the Oculus Rift; visuals should not be separated from the human vestibular system...but rules were meant to be broken. Why? Because there’s so much more to VR than gaming.
This is not to say games aren’t becoming movies! I found myself strangely immersed in Naughty Dog’s “The Last of Us”, more than in any tent-pole movie I’ve seen in the past few months. Such is the power of CG movies, uncanny valley be damned.
Defining a language for 360 degree look-around movies: You know how it all began oh so long ago (OK, four years ago) when the language of film-making was being defined / re-written for S3D. Well, it’s time for a re-write again. Immersive 360 film-making is set to explode, geared - at least at the start - for an audience of teens to mid-forties, and telling stories in this medium requires quite a different skill-set to master.
Citizen Kane, although a 2D film, gave modern 3D film-makers enough clues back in the day on how to use the medium of S3D effectively… but no one really had the patience to listen. Lighting, depth of field and yes - even hijacking the head-tracking stream - can work when creating movies on a 360 degree canvas.
When I started investigating this exciting medium a few months ago, alarm bells would go off when I asked on Oculus Rift / game engine forums about intercepting the head-tracking and orientation info of these devices - but that’s because, so far, only games have been designed for VR.
It is becoming evident that, beyond the gimmicky interactive look-around voyeuristic possibilities offered by the medium, serious directors and storytellers will want to retain control of the “frame” if they are to be enticed into creating movies in virtual reality.
So what could an immersive 360 Director’s tool-box look like?
• Lighting - With audiences tempted to look around a scene, a director and VR DoP can use the age-old technique of spot-lighting areas of importance.
• 360 positional sound - Maybe Dolby Atmos gets interested; chances are an Atmos SDK is already in the works to create sound-scapes that can aid in directing an audience’s attention.
• Depth of field - The pet peeve of stereoscopic 3D film-making unless done correctly, this technique is worth exploring in an immersive 360 environment to guide audience attention. At least it needn’t be the lead-by-the-nose experience it sometimes becomes when abused by inexperienced DPs and directors on 2D films.
• Limiting the horizontal FoV - There is no rule per se that every scene should offer a full wrap-around 360 degree view for the audience to explore. The horizontal field of view can be restricted for certain shots. This is a creative call, and it will contribute to the flavour of the overall movie experience the film-maker is crafting.
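For a sense of the arithmetic behind that last point: in an equirectangular master frame the full width maps to 360 degrees, so a restricted field of view occupies a proportional slice of the pixels. A minimal sketch (frame sizes are illustrative, not from any particular camera or project):

```python
def fov_pixel_span(frame_width_px: int, fov_deg: float) -> int:
    """Horizontal pixels covered by a restricted field of view in an
    equirectangular frame, whose full width spans 360 degrees."""
    return round(frame_width_px * fov_deg / 360.0)

# A 180-degree shot fills only half of a 4096-pixel-wide equirect frame;
# the rest can be vignetted, darkened, or simply left out of the render.
```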
Advanced Tools for Immersive 360 Storytelling: I am currently working on ‘MAYA’ - a mixed-media motion comic for the Oculus Rift and other VR devices, including cell-phone VR such as Google’s Cardboard, the Durovis Dive and others that allow almost any smartphone, Android or iOS, to play back VR movies and experiences.
In the first scene after the title, a girl – Maya – stands at the window. That is what the director intends the audience to see, and the rest of the room has subdued lighting. That is... unless the wearer turns their head around, which triggers the bedlamps visible in the scene to increase in intensity.
In this instance, the head-tracking hijack comes into play. The next scene shows the girl framed on the bed. This cut happens irrespective of where the wearer of the Rift is looking. Yes, it is a forced cut, and it puts the scene bang in the centre of view.
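MAYA’s actual Unity implementation isn’t shown here, but the core maths of such a forced cut is simple: rotate the whole scene by whatever yaw offset puts the framed subject dead ahead of the viewer at the instant of the cut. A yaw-only sketch (the function name is my own, not from any engine API):

```python
def forced_cut_offset(head_yaw_deg: float, subject_yaw_deg: float) -> float:
    """Yaw offset (degrees) to apply to the whole scene so the new shot's
    subject lands at the centre of the wearer's view, regardless of where
    they were looking when the cut happened."""
    return (head_yaw_deg - subject_yaw_deg) % 360.0

# After applying the offset, the subject's world yaw coincides with the
# viewer's current head yaw - the shot arrives "bang centre".
```

A full implementation would use quaternions and include pitch, but the principle is the same: the scene moves to meet the gaze, rather than waiting for the gaze to find the scene.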
The important point to be aware of is this: the same rules as for stereoscopic 3D storytelling apply; mismatched depth splicing should be avoided.
Green-screening the crew out: This idea came to me when I glanced at an image of what I later realised was a paratrooper in the movie; I initially thought the crew/equipment had been covered in green for later keying and wire removal. While I have not looked at the actual feasibility of stereoscopically replacing a background plate after removing green-screen-clad crew or equipment, I am confident it should be possible, even when dealing with de-warping and stitching the 360 image.
Compositing in 360: There’s every reason to believe a competent NUKE programmer-artist could write a warping matrix to composite elements directly over a spherical or equirectangular sequence of images or video.
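The core of any such warp is the mapping between 3D view directions and equirectangular pixel coordinates: longitude becomes the horizontal axis, latitude the vertical. A minimal sketch of the forward mapping (the axis conventions - z straight ahead, y up - are my assumption, not NUKE’s):

```python
import math

def dir_to_equirect(x: float, y: float, z: float,
                    width: int, height: int):
    """Map a 3D view direction to (u, v) pixel coordinates in an
    equirectangular image. z points 'straight ahead', y points 'up'."""
    lon = math.atan2(x, z)                            # -pi .. pi
    lat = math.asin(y / math.sqrt(x*x + y*y + z*z))   # -pi/2 .. pi/2
    u = (lon / (2.0 * math.pi) + 0.5) * width         # left .. right
    v = (0.5 - lat / math.pi) * height                # top .. bottom
    return u, v
```

A compositor would run the inverse of this mapping per output pixel to pull samples from the element being inserted into the stitched plate.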
Jaunt VR - the people behind “The Mission VR”, a 360 immersive film currently in production - are certainly proficient enough in the algorithm and CUDA coding department to pull it off.
Virtual Reality film making gear
Capture: The current camera system used by Jaunt VR is said to be a rig of 14 GoPro cameras in stereoscopic configuration, capturing a wrap-around view of a scene in S3D.
Now - I’ll admit I’m sitting on the fence about actual “scanline”-level CMOS sync of GoPros in a stereo config, much less seven pairs of them!… Yet I’ll give the rig and the experts behind it the benefit of the doubt. It’s possible that sync drift is negligible with today’s oscillators/controllers.
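To put scanline-level sync in perspective: at 30 fps and 1080 lines, one scanline lasts roughly 31 microseconds, so even a small relative clock error between two free-running cameras adds up to a full line of drift within seconds. A back-of-the-envelope sketch (the figures are illustrative, not GoPro specs):

```python
def seconds_to_one_line_drift(fps: float, lines: int, ppm: float) -> float:
    """Time for two free-running cameras to drift apart by one scanline,
    given their relative oscillator tolerance in parts per million."""
    line_time = 1.0 / (fps * lines)   # duration of one scanline, seconds
    return line_time / (ppm * 1e-6)   # seconds until drift == line_time

# At 30 fps, 1080 lines and a 20 ppm relative error, the pair drifts a
# whole scanline apart in about a second and a half.
```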
Stereo 3D conversion houses: 360 film-making might actually put the spotlight on 2D-to-3D conversion - a much-needed service for film-makers and a new business avenue for conversion studios to explore! Compositing in stereo 3D on the stitched 360 image is only a few NUKE nodes away for competent conversion houses.
Software: For MAYA - a 360 motion comic - I’ve chosen the excellent Cinema Director from the Cinema Suite authoring package. It’s the closest one can get to an NLE system for an otherwise scripting-heavy game engine such as Unity.
I had to customise the software to a certain extent to get the tools I needed to perform those cuts for the Oculus Rift.
Most NLE (non-linear editing) systems were late to catch the stereoscopic 3D train, and the same will probably be the case for 360 VR support. Toolkits will need to be written to control different aspects of a VR experience. So what would a time-line look like for an immersive 360 VR film? Take a look at the screenshot from MAYA, below.
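One way to picture such a timeline is as a sequence of shots carrying VR-specific attributes alongside the usual in and out points. A minimal sketch (the field names and durations are my own invention, not Cinema Director’s API):

```python
from dataclasses import dataclass

@dataclass
class VRShot:
    name: str
    duration_s: float    # shot length on the timeline, seconds
    forced_cut: bool     # recentre the view at the shot's start
    h_fov_deg: float     # horizontal field of view offered (360 = full)

# The two MAYA scenes described earlier, as timeline entries:
timeline = [
    VRShot("maya_at_window", 8.0, forced_cut=False, h_fov_deg=360.0),
    VRShot("maya_on_bed",    5.0, forced_cut=True,  h_fov_deg=180.0),
]
```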
Immersive VR eye-wear: Credit for the device that started the VR revival goes to the Oculus Rift. The recent acquisition of Oculus VR by Facebook for a staggering $2 billion should prove that VR is here to stay.
Equally important are initiatives and products coming from Sony with its Morpheus eyewear, Samsung’s own venture, and a special nod to long-time VR headwear experts TDVision and their soon-to-be-released ImmersiON VR eyewear.
Smartphones can easily be converted into immersive 360 video playback hardware; the Durovis Dive and even Google’s “Cardboard” show how low-cost this can be.
Hardware and software are becoming available faster than the talent to produce immersive 360 content. Are audiences ready?
Gamers have always wanted beefier gaming rigs and laptops, and the same holds true for their screens. With VR devices, it’s like having an IMAX strapped to their faces. Yes, they are ready - and they’d like to watch their movies the same way. But VR is not limited to gamers.
The luxury of an immersive large-screen environment, and the privacy and intimacy it offers, cannot be discounted. A long-haul flight is but one venue that comes to mind where VR eye-wear would be in high demand…
Education is another. Already, 360 documentaries, complete with Sir David Attenborough’s voice, are being readied for when these devices go mainstream.
As a film-maker, will your storytelling skills evolve for the next generation of audiences?
Clyde DeSouza is currently working on a self-funded 360 Motion comic ‘MAYA’. The project is open to involvement and contributor perks include end screen credits. Contact the author at email@example.com for information on how to get involved.