Virtual filmmaking

Virtual camera technologies are enabling new forms of storytelling and collapsing the boundaries between production and post.

The traditionally compartmentalised filmmaking process, which moves linearly from pre-production through principal photography, editorial and post-production, is collapsing under the advance of technologies from virtual cameras to computer simulations and digital characters.

Post facilities such as London’s Framestore have long argued that the work they do to create films like Gravity should be considered integral to production, and that the term ‘post’ denigrates the collaboration their artists have with key creatives on the design and storytelling of the finished piece.

“Filmmakers are becoming more comfortable with digital technology,” says editor Jeffrey Ford, ACE (Iron Man 3; Captain America: Civil War). “Rather than making a film in the order of pre-production, production and post, these disciplines are happening all at once. Directors can be confident of pulling off longer takes as a stylistic approach because the technology gives them greater flexibility than they’ve ever had to take the brave choice toward cutting less often.”

“The idea of virtual production has been around for many years but with faster graphics and faster computing power we can do more of it in real-time,” says Andy Morley, visual effects supervisor (Gods of Egypt, Fantastic Beasts and Where to Find Them) at Cinesite. “Tools like Ncam rely on real-time tracking technologies that didn’t exist ten years ago,” he adds.

Broadly, two types of data can be turned into virtual visuals. The first is a computer-generated model of a world, built in a 3D software package like Maya, which can run live through Autodesk’s MotionBuilder or a game engine like Unreal Engine. The second is created by motion and performance capture, in which actors are filmed live on set and their performance is translated into a digital animated character in real time.
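
As a rough sketch of the second data type, the loop below reads per-frame joint rotations from a capture feed and retargets them onto an engine-side skeleton. The MocapStream and CharacterRig classes are hypothetical stand-ins for whatever the capture vendor and the engine (MotionBuilder, Unreal and so on) actually expose.

```python
import math
import time

# Minimal sketch of streaming motion-capture joint rotations onto a
# digital character in real time. MocapStream and CharacterRig are
# hypothetical stand-ins, not any vendor's API.

class MocapStream:
    """Fakes a capture feed: yields {joint_name: rotation_deg} per frame."""
    def read_frame(self, t):
        # A single swinging elbow, purely for illustration.
        return {"elbow_r": 45.0 * math.sin(t)}

class CharacterRig:
    """Stand-in for an engine-side skeleton."""
    def __init__(self):
        self.joints = {}

    def set_rotation(self, joint, degrees):
        self.joints[joint] = degrees

stream, rig = MocapStream(), CharacterRig()
start = time.time()
for _ in range(5):                       # an endless loop on a real stage
    t = time.time() - start
    for joint, rot in stream.read_frame(t).items():
        rig.set_rotation(joint, rot)     # retarget capture data to the rig
    print(rig.joints)
    time.sleep(1 / 24)                   # hold to the project frame rate
```

On a real stage the update would happen inside the engine's own tick, but the shape of the data flow is the same.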

Morley explains, “On set, when using this technology, a director or DP (as well as the cast) can view a digital representation of the environment, a spacecraft or massive cityscape for example, giving them a far better guide to the composition of the shot than was possible by just looking at a huge green screen.”

Pre-visualisation, the process of visualising complex scenes before filming, is typically done by a small group of creative artists led by the director and DP. Vfx facilities can do this work themselves, allowing for a smoother transition into their pipelines, where the bulk of their work will then be the animation and rendering of these sequences.

Post-visualisation is another stage, in which non-final-quality CGI is overlaid onto live-action plates to aid editorial decisions; again, the same pre-visualisation artists often do this work. Having this reference helps the vfx post house later on, and Cinesite is currently working on a high-end film project where post-visualisation has been supplied.
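
The compositing step behind post-visualisation is essentially an alpha ‘over’. A minimal sketch, with small arrays standing in for real frames (none of this is Cinesite's pipeline code):

```python
import numpy as np

# Sketch of a post-visualisation overlay: a straight-alpha "over"
# composite of rough CGI onto a live-action plate. Tiny arrays stand
# in for real frames; cg_rgb/cg_alpha would come from a quick render.

h, w = 4, 4
plate = np.full((h, w, 3), 0.5)                      # live-action background
cg_rgb = np.zeros((h, w, 3)); cg_rgb[..., 1] = 0.8   # rough green CG element
cg_alpha = np.zeros((h, w, 1)); cg_alpha[1:3, 1:3] = 1.0

# Straight-alpha over: out = fg * alpha + (1 - alpha) * bg
postvis = cg_rgb * cg_alpha + (1.0 - cg_alpha) * plate
print(postvis[2, 2], postvis[0, 0])                  # CG pixel vs untouched plate
```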

Morley also runs FuzzyLogicFX, a boutique creative team that is currently building a digital world set in Alaska for a future film project, all inside the Unreal game engine.

“We use Oculus virtual reality headsets to look around the world and help make creative decisions such as camera placement and lighting choices,” he says. “It is less expensive than going on a location recce, plus if we have something in the digital set we do not like, we can change it!” He likens the creative experience to “a theatrical exploration of a virtual space, making it easier to make decisions than by looking at 2D images on a monitor.”

Cinesite is developing virtual camera technologies to give artists a better idea of a DP’s vision and to simulate a real camera far more faithfully. The current interface of a mouse and keyboard gives the artist no real feel for how a film camera moves and handles.
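
One simple way to give mouse input the weight of a physical camera is to run it through a spring-damper, so moves accelerate and settle rather than snap. The constants below are illustrative assumptions, not Cinesite's actual tool:

```python
# Filtering raw input through simple inertia so a virtual pan behaves
# like a weighted camera head. Constants are illustrative only.

def damped_follow(current, velocity, target, stiffness=8.0, damping=5.0, dt=1/24):
    """One integration step of a spring-damper toward the target pan angle."""
    accel = stiffness * (target - current) - damping * velocity
    velocity += accel * dt
    current += velocity * dt
    return current, velocity

angle, vel, target = 0.0, 0.0, 30.0   # pan from 0 to 30 degrees
for frame in range(24):               # one second at 24 fps
    angle, vel = damped_follow(angle, vel, target)
print(round(angle, 2))                # eases toward 30 rather than jumping there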

Morley advises, “On any project you are telling a story and delivering a narrative. Ultimately that takes priority, no matter how cool a technology might be.”

The Jungle Book

Robert Legato, ASC, pioneered virtual production techniques for James Cameron’s Avatar in 2009, and worked out how to use 3D for a non-action movie for Martin Scorsese’s Hugo in 2011.

“What I’ve been pushing for since Avatar are tools that allow me to behave the way I want on the set because I’m used to doing analogue work, live-action work,” he said at the FMX visual effects festival in Stuttgart earlier this year. “I’m not sure what the angle of the shot is until I see it. Filmmaking is intuitive, it is best done by trial and error and live in the moment on set.”

Translating this ‘analogue creativity’ into digital form is what Legato, along with director Jon Favreau, attempted for The Jungle Book. Disney’s adaptation of its own classic cartoon also returns to Rudyard Kipling’s source material and is already an Oscar candidate for next year.

The production team made the bold decision to film the movie without any outdoor locations, shooting only stills for reference in India.

“It’s a live action movie not an animated movie,” insists Legato. “This is the first movie since Avatar where a whole movie is shot using virtual cameras this way. Since 30% of Avatar is live action, and every frame of Jungle Book contains visual effects, this picture is more complex.”

Only actor Neel Sethi, who plays Mowgli, and a handful of props and set pieces that he touches are real elements. All the creatures and environments are CG, animated by MPC with the exception of the primates (handled by Weta Digital), spanning 1,200 shots or 95 minutes of screen time.

A team of senior artists, led by MPC VFX Supervisor Adam Valdez, helped visualise and plan the script and storyboard development and guide the filming on set.

The process enabled director of photography Bill Pope, ASC, to pre-visualise real and virtual elements together in order to make all of the detailed cinematography-related decisions for imagery created in a computer.

“First, we had an animatic version, as you would on an animated film, then a motion-capture version that we edited, and then, finally, we took that and shot the kid as though he were an element,” explains Legato.

Rough animations and textures helped Pope decide where the virtual camera should go, how to block scenes, what lens he would use (were he shooting live action) and the angle of light.

“If you were location scouting you’d have a map and a solar programme to tell you where the sun is going to be and you pick and choose angles based on that,” says Legato. “This process is the same except that we pick and choose angles and feed that back into the computer.”
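
The ‘solar programme’ Legato describes amounts to standard solar-position maths. A minimal sketch, using the textbook declination/hour-angle approximation (the location and date are just examples):

```python
import math

# Rough sketch of a "solar programme": sun elevation and azimuth from
# latitude, day of year and local solar time. Good enough for picking
# angles on a scout, not for survey work.

def sun_position(lat_deg, day_of_year, solar_hour):
    lat = math.radians(lat_deg)
    # Approximate solar declination for the day
    decl = math.radians(-23.44 * math.cos(math.radians(360 / 365 * (day_of_year + 10))))
    hour_angle = math.radians(15 * (solar_hour - 12))     # 15 degrees per hour
    sin_alt = (math.sin(lat) * math.sin(decl)
               + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    alt = math.asin(sin_alt)
    cos_az = ((math.sin(decl) - math.sin(alt) * math.sin(lat))
              / (math.cos(alt) * math.cos(lat)))
    az = math.acos(max(-1.0, min(1.0, cos_az)))           # measured from north
    if solar_hour > 12:                                   # afternoon: west of south
        az = 2 * math.pi - az
    return math.degrees(alt), math.degrees(az)

# Example: mid-June, roughly 34 degrees north, 3 pm solar time
print(sun_position(34.0, 167, 15.0))
```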

The lighting choices and camera angles were then fed to the sound stages in LA for the live-action direction of Sethi. Favreau used a simulcam: an Arri stereo rig synchronised with a virtual camera monitor showing live composites of the virtual environments and animated creatures. The virtual camera is switchable between virtual backgrounds, foregrounds and live action.

“Since time on a stage is expensive, the pre-visualisation allows you to figure out exactly the schematic of how you are going to shoot the scene,” says Legato. “We had the analogue freedom to just choose when we cut to the close-up, and we picked it like we’d normally do in live action. That became the blueprint that we were going to bring to the blue screen stage to recreate specifically that shot. And we knew with great authority that it would fit into the whole because we’d already seen it edited together.”

This means that a director or cinematographer’s creative intent is not being interpreted by someone reading a storyboard or by an artist somewhere in post long after shooting. “Traditional storyboard frames eliminate all the other angles that could well be the choice of the DP on the day of the shoot,” says Legato. “With this method, we have all the coverage at the disposal of the DP and director. It’s trial and error — but this retains the analogue creative element while reducing the cost of what it would normally be to shoot a scene this way.”

To achieve a feeling of infinity — that Mowgli and the creatures could walk endlessly through the jungle — the production shot on a 40-foot turntable using a projector as a light source. The turntable was integrated with a computer program which controlled the projector and replicated how the light would look if the characters were walking up a hill or were in deep shade.
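
As a heavily simplified illustration of the idea, and assuming an overhead key light, the projector's output could be modulated by the virtual terrain's slope and an assumed canopy-shade factor. The function and numbers below are illustrative, not the production's control software:

```python
import math

# Illustrative only: dim the projector by the Lambert cosine of the
# virtual slope (assuming an overhead key) plus an assumed canopy-shade
# factor, giving the flavour of the turntable rig.

def projector_intensity(base, slope_deg, canopy_shade):
    """Scale the base output for an uphill walk under overhead foliage."""
    slope_factor = max(0.0, math.cos(math.radians(slope_deg)))
    return base * slope_factor * (1.0 - canopy_shade)

print(projector_intensity(1.0, 0.0, 0.0))    # flat, open ground: full light
print(projector_intensity(1.0, 30.0, 0.6))   # uphill in deep shade: much dimmer
```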

With Technicolor, Valdez created a LUT which would give the animators the exact light ratios and colour curves to work from.
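
Mechanically, a LUT is just a lookup table baked from a colour curve. A minimal 1D sketch follows; the 2.2 gamma curve is an assumption for illustration, where the production LUT would be a measured 3D cube:

```python
import numpy as np

# Bake a colour curve into a 1D lookup table, then index pixels
# through it. The gamma lift is an assumed stand-in curve.

SIZE = 256
curve = np.linspace(0, 1, SIZE) ** (1 / 2.2)         # assumed tone curve
lut = (curve * 255).astype(np.uint8)                 # bake to an 8-bit table

frame = np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8)  # stand-in plate
graded = lut[frame]                                  # per-channel table lookup
print(frame[0, 0], "->", graded[0, 0])
```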

“For the most part what you are seeing is something that could have been conventionally filmed,” says Legato. “But the discipline was to use the computer as a camera. We treated it as strictly a live action camera photographing with the same fidelity, lens choices and aberrations, spherical distortions and dust on the lens as if we’d shot it.”

So for Pope, no stranger to VFX having shot Spider-Man 2 and The Matrix trilogy, the virtual camera had to be indistinguishable from an actual one in order to achieve the photo-real ‘as if filmed on location’ look.

MPC put 800 artists to work over two years to create 54 animal species, 58 sets and 500 variations of plants and trees for set dressing. The data took up some 1,984 terabytes and consumed 240 million render-farm hours. Valdez suggests each shot cost about $80,000.

Although Disney gave its production a budget to match ($175m), the processes pioneered in film have a habit of trickling downstream and may eventually impact TV. In that sense, Legato hopes that The Jungle Book ends up as a “game-changer”.

“It is no longer just a visual effect — it is just another way to make a movie,” he says. “CG is no longer a dirty word, and visual-effects artists are no longer technicians. They are filmmakers.”

Computational cinematography and VFX

Recent advances in camera technology and computer processing could render even the cutting-edge approach of The Jungle Book obsolete.

Silicon Valley tech developer Lytro is launching a camera able to record images in five dimensions through a microlens array comprising 2.5 million lenslets. The technique, known as light-field, captures the depth, direction and intensity of light rays in a scene as data to enable an extraordinary degree of flexibility in altering the image in postproduction.

“Currently, key creative decisions such as the frame rate are ‘baked in’ at the time of capture. Light-field means you can computationally change the shutter time, frame rate and aperture after the event,” says Jon Karafin, head of light-field video, product management, Lytro. “For the film and TV community, that means they can get shots that previously weren’t possible. Computers, storage and bandwidth have finally caught up to a point that makes light-field practical, with the level of quality for demanding production.”
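
Refocusing after the fact is the classic light-field trick: sum the sub-aperture views with shifts proportional to their position in the lens. The 3x3 grid below is a toy stand-in for a real capture, which has far more views and proper calibration:

```python
import numpy as np

# Shift-and-add refocusing, the idea behind "change the aperture after
# the event": average the sub-aperture views, each shifted in
# proportion to its (u, v) position in the lens.

def refocus(subviews, alpha):
    """subviews: {(u, v): image}; alpha selects the virtual focal plane."""
    acc, n = None, 0
    for (u, v), img in subviews.items():
        shifted = np.roll(np.roll(img, int(round(alpha * u)), axis=0),
                          int(round(alpha * v)), axis=1)
        acc = shifted if acc is None else acc + shifted
        n += 1
    return acc / n

# Tiny synthetic light-field: 3x3 views of one scene, offset per view
rng = np.random.default_rng(0)
base = rng.random((8, 8))
views = {(u, v): np.roll(np.roll(base, u, axis=0), v, axis=1)
         for u in (-1, 0, 1) for v in (-1, 0, 1)}

# alpha = -1 realigns every view onto the original plane: sharp focus
print(np.allclose(refocus(views, alpha=-1.0), base))   # True
```

Sweeping alpha moves the focal plane through the scene, which is why focus, and by extension simulated aperture, becomes a post-production decision.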

While cinematographers may baulk at the idea, the system can simulate existing lenses, for example by applying an anamorphic lens to scenes in post.

It is being offered for rent, inclusive of Lytro technicians, server and software, from $120,000, and is initially being targeted at sequences that would later involve vfx-intensive work.

Other light-field innovations are already in use assisting visual effects. OTOY, which has Google mogul Eric Schmidt on its board, has developed a means of creating ultra-realistic computer animations. Its work has been used for vfx in Avatar and Gravity.

“Light-fields have been used to capture actors to create exacting digital doubles for special effects use,” explains Jules Urbach, OTOY founder and CEO. “Now, light-fields can be used to capture environments with the same level of detail. Every reflection, refraction, and absorption of a field of light is modelled as it would be in the real world.”
