I am soaring above the majestic Mumbai skyline at night, an uninterrupted view of the tops of skyscrapers just metres below. I survey my bird’s-eye view of the city in every direction before landing safely on the balcony of a high-rise apartment block. I then remove the Samsung Gear VR headset strapped around my head and re-adjust to the reality of the ITP office to discuss Dirrogate - the virtual reality film I was just immersed in - with its creator, Clyde DeSouza.
DeSouza, a specialist in stereoscopic 3D filmmaking and an early exponent of VR as a medium for film, started work on Dirrogate in early 2015 and completed the project about seven months later in September 2015. The film, which runs to just under 10 minutes, was majority funded by Dubai-based MasterMedia, and is based on DeSouza’s hard sci-fi novel, Memories With Maya.
Both the novel and the film explore the idea of virtual reality and the ability of artificial intelligence to create digital incarnations of people who have died, and the effect of this on human relationships.
The film, which can be viewed on Samsung Gear and Oculus Rift headsets, immerses viewers in the world DeSouza created. By demonstrating the merits of VR as a medium for film, it also shows that the heady ideas explored in the novel may be much closer to reality than we think. The film has already garnered some 35,000 views on VRideo, a platform that hosts VR experiences. It is also available on YouTube360 and is the only indie film on SamsungVR, where it sits alongside Hollywood VR productions such as The Martian VR Experience.
The film consists of various indoor and outdoor locations, which were filmed and then rendered as 3D virtual environments in software including Adobe After Effects, Premiere and iClone. The film starts with an impressive free flight over central Mumbai before revealing a giant panel that presents the film as a graphic novel, allowing the viewer to select which scene to leap into.
The three main characters in the film, protagonist Dan, Maya and her brother Krish, were created using tailored animated figures from software called Daz3D. The figures, while clearly animated, appear quite realistic, and the fact that they are life-size and 3D is initially quite jarring for viewers not used to VR. The viewer is also free to look in any direction throughout the film and experiences total depth of vision, making it a truly immersive experience.
DeSouza confirms that he is also now working on a “deep VR” version of the film in which the viewer will be able to walk around the sets and interact with certain objects, courtesy of Oculus’s motion tracking technology. He has already completed about 40% of the conversion process on the original Dirrogate and expects to complete work on the new version in the next few months. In the upgraded version, billed as Dirrogate Deep VR, the user will experience a world that is something like a cross between a film and a game. While the viewer will be led through a clear narrative, there will also be the freedom to move around and interact with the virtual space.
Returning to the groundwork on the original version, DeSouza explains the complexity of making the film, for which he used numerous software suites including Adobe After Effects and Premiere, Octane Render, iClone and Daz3D.

Starting with the sets, DeSouza shot most of the locations, including an apartment and balcony in Dubai Marina, part of Mumbai train station and a nightclub, from different angles on a Ricoh camera. He then fed the images into a 3D engine, Unity3D, to create a 3D model of each location. This produced the sets in which the main characters relate the story, and it represented one of the simpler aspects of making the film.

Creating life-size characters and showing them moving around in a fluid way was more difficult. DeSouza generated the characters in Daz3D, a software suite that provides realistic 3D models and is popular with advertising agencies and other media companies. It allows users to create CG characters with a mix-and-match approach, selecting figure type, face, hairstyle and clothes. Once the characters were assembled, DeSouza exported them to Octane Render, where he rendered them into what is known as the ‘equirectangular’ format, which gives a true 360-degree feel. The characters were then composited back into After Effects.
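The equirectangular format DeSouza mentions is the standard way of flattening a full sphere of view into a rectangular frame: longitude maps to the horizontal axis and latitude to the vertical, so a single image covers 360 degrees around and 180 degrees top to bottom. A minimal sketch of that mapping (the axis conventions here are an assumption chosen for illustration, not taken from DeSouza's pipeline):

```python
import math

def dir_to_equirect(x, y, z, width, height):
    """Map a 3D view direction to pixel coordinates in an
    equirectangular (360 x 180 degree) image.
    Convention (assumed for illustration): +z is forward,
    +x is right, +y is up; the image centre is straight ahead."""
    lon = math.atan2(x, z)                               # -pi..pi, 0 = forward
    lat = math.asin(y / math.sqrt(x * x + y * y + z * z))  # -pi/2..pi/2
    u = (lon / (2 * math.pi) + 0.5) * width              # left..right
    v = (0.5 - lat / math.pi) * height                   # top..bottom
    return u, v
```

Looking straight ahead lands in the centre of the frame, while looking directly right lands three-quarters of the way across, which is why a flat equirectangular frame looks stretched until it is wrapped back around the viewer in a headset.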
To do the actual animation and make the characters move around in an authentic way, DeSouza exported the characters in a format compatible with iClone, a semi-professional software that works well for character motion. “It’s got really good motion control, as in if you want to lift a hand or make a person move, it’s very intuitive,” DeSouza says.
He then exported the animated sequences to Octane Render for the equirectangular formatting that gives the 360-degree effect, and for compositing. Here the characters were rendered with an alpha channel - the CG equivalent of a green screen - before being taken back into After Effects to be placed into their proper settings.
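The alpha channel works like a per-pixel matte: it records how opaque each pixel of the rendered character is, so the compositor knows how much of the background should show through. Layering a character over a set follows the standard Porter-Duff "over" operator; the toy sketch below uses straight (non-premultiplied) alpha as an illustration of the principle, not of what After Effects does internally:

```python
def over(fg_pixel, bg_pixel):
    """Composite a foreground pixel over a background pixel.
    Pixels are (r, g, b, a) tuples with channels in 0..1,
    using straight (non-premultiplied) alpha."""
    r1, g1, b1, a1 = fg_pixel
    r2, g2, b2, a2 = bg_pixel
    out_a = a1 + a2 * (1.0 - a1)
    if out_a == 0.0:
        return (0.0, 0.0, 0.0, 0.0)  # fully transparent result
    blend = lambda c1, c2: (c1 * a1 + c2 * a2 * (1.0 - a1)) / out_a
    return (blend(r1, r2), blend(g1, g2), blend(b1, b2), out_a)
```

Where the character is fully opaque the set disappears behind them; where the alpha falls off, at hair edges for example, the set blends through, which is what lets a rendered figure sit convincingly inside a filmed location.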
DeSouza faced a number of challenges while making the film, not least an absence of tools tailored for making a true 3D VR film. “One of the problems was there was no plugin to monitor my progress in a headset. There was no way of smoothly looking at it through the timeline itself. I had to do a shot – a still image – take it out as a jpeg, put it into another software and then view it. If there was something wrong I’d have to do the whole thing again, tweak it a little bit and view it again. It was like a nightmare but that was how I did it,” DeSouza says.
Thankfully, in the 18 months or so since DeSouza started work on the project, the VR filmmaking industry has come a long way in terms of tools, and there are now at least two options that allow VR animators to monitor their progress. “One is from a company called Mettle based in Canada. From the Adobe Premiere timeline it allows you to see a scene in 360 degrees. The other solution is from GoPro, which bought a company called Kolor and offers a free plug-in,” DeSouza says.
He adds that Adobe also recently released an update to Premiere that allows users to edit 360-degree footage inside After Effects and Premiere. But DeSouza admits that the biggest challenge while making the film was getting the stereoscopic compositing right. “In 2D it is very easy to render something like someone standing on the ground. But in stereoscopic, if you’re a little off you will get the person floating in the air or sinking into the ground,” he says.
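The floating-or-sinking effect DeSouza describes comes from on-screen parallax: the horizontal offset between a point's position in the left- and right-eye images determines whether the viewer perceives it at, behind, or in front of the screen plane. A similar-triangles sketch of the relationship (a textbook model, not DeSouza's actual tooling):

```python
def screen_parallax(ipd, screen_dist, object_dist):
    """On-screen horizontal parallax (same units as ipd) for a point
    at object_dist, viewed from screen_dist away by eyes ipd apart.
    Zero parallax puts the point on the screen plane; positive
    pushes it behind the screen (it sinks back); negative pulls it
    in front (it floats out). Simple similar-triangles model."""
    return ipd * (1.0 - screen_dist / object_dist)
```

A character standing on the ground needs their feet rendered at exactly the parallax of the ground plane at that distance; a small error in either direction produces precisely the floating or sinking DeSouza had to hand-correct frame by frame.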
Part of the problem was having to composite in stereoscopic 3D without the support of dedicated software tools. To this end, DeSouza’s experience of regular stereoscopic 3D filmmaking came into its own, although it still left a few patience-testing procedures. “The headache was literally taking out and exporting one frame, checking it, tweaking it a bit, again checking it, tweaking it a bit, again checking it and then locking it in, so the whole manual process of doing it was the biggest challenge,” he says.
DeSouza learned a lot from making the film, including the fact that when shooting a 360-degree stereoscopic film, the reverse 180-degree section of the footage exhibits a seemingly unusual (but entirely predictable) effect. Indeed, when viewing the reverse portion - that is, the rear 180-degree portion of the 360-degree film - the footage plays incorrectly because the images intended for the viewer’s left and right eyes are swapped. “The reverse 180-degrees effectively gets swapped, so the left eye becomes the right eye,” DeSouza explains. “You end up with a big seam in there that needs to be cleaned out.”
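In an equirectangular frame where longitude 0 (straight ahead) sits at the centre, the rear 180 degrees occupies the outer quarters of the image width. A toy sketch of the kind of clean-up DeSouza describes, swapping those rear columns between the two eye images (the frame layout is an assumption for illustration; the function and its convention are not from DeSouza's pipeline):

```python
def fix_rear_eye_swap(left_img, right_img):
    """Swap the rear-180-degree columns between left- and right-eye
    equirectangular images, correcting the eye swap that appears
    behind the camera in some stereo 360 renders.
    Images are lists of rows (lists of pixels); the centre column
    is longitude 0 (forward), so the rear hemisphere spans the
    leftmost and rightmost quarters of the width."""
    height, width = len(left_img), len(left_img[0])
    rear_cols = list(range(width // 4)) + list(range(width - width // 4, width))
    for row in range(height):
        for col in rear_cols:
            left_img[row][col], right_img[row][col] = \
                right_img[row][col], left_img[row][col]
    return left_img, right_img
```

Swapping in place like this removes the inverted depth behind the viewer, though as DeSouza notes the boundary between the two halves still leaves a seam that has to be painted out.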
Editing and colour grading were done in After Effects and Adobe Premiere and were relatively straightforward. However, there are some important factors that budding 360-degree VR filmmakers should remember. “It is important because today when you are shooting in stereoscopic 360 you are probably shooting with many cameras, so you have to make sure all the F-Stops are correct and the white balance is correct across all of them,” DeSouza says.
“It is a bit of a challenge because you are shooting 360-degrees. If you’re outside in the daytime, the sun is always going to be there, so you have to match the ISO settings and everything needs to be matched properly, or else you need good colour graders later to try to minimise any problems between the cameras.”
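One crude way to quantify the matching DeSouza describes is to compare two cameras in the region where their views overlap and derive per-channel correction gains. The sketch below is a toy illustration of that idea, not a professional grading workflow:

```python
def match_gains(ref_overlap, cam_overlap):
    """Per-channel (r, g, b) gains that bring one camera's overlap
    region to the same mean level as a reference camera's overlap.
    Each overlap is a list of (r, g, b) pixel tuples in 0..1.
    Multiplying the second camera's pixels by these gains roughly
    equalises exposure and white balance between the pair."""
    gains = []
    for ch in range(3):
        ref_mean = sum(p[ch] for p in ref_overlap) / len(ref_overlap)
        cam_mean = sum(p[ch] for p in cam_overlap) / len(cam_overlap)
        gains.append(ref_mean / cam_mean if cam_mean else 1.0)
    return tuple(gains)
```

A gain above 1.0 on one channel means that camera ran darker or cooler than the reference in that channel; getting the f-stops and white balance right on set keeps all the gains close to 1.0 and spares the colour grader later.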
He adds that this extra level of complexity may explain why a lot of people who are doing VR films at the moment are moving more towards CG, or doing a composite between green screen and live action on a CG background.
With Dirrogate proving a hit on free platforms including VRideo and YouTube360, DeSouza is keen to pay tribute to MasterMedia, the production and broadcast consultancy led by founder and managing director Hasan R. Sayed Hasan. MasterMedia came on board as producer and majority-funded the original version of Dirrogate shortly after DeSouza started work on the project, and royalties from the film will be shared between the two.
While the film has only been distributed as a free download to date, DeSouza is signing a paid deal for distribution in Korea. “It’s through a distribution site, and they are probably going to translate it into Korean,” he says. There are also positive signs from China, with DeSouza in early discussions with a potential distributor. Based on these and other discussions, DeSouza sees significant potential for VR films in Asia. “These countries are really moving ahead and have got distribution platforms in place, which I have not even seen in the West right now. Everyone is talking about distribution of VR films but it seems that these guys are ahead.”
With much of the film being set in Mumbai, DeSouza is also planning to get the film translated into Hindi, while being careful not to compromise the science behind the plot and themes.
As Digital Studio went to press, DeSouza was also busy preparing to submit the film for the Proto Awards, an awards scheme dedicated to VR content.
Looking forward, DeSouza hopes to use his learnings from Dirrogate to take on some commercial projects. He sees particular potential to use the “graphic novel” format of Dirrogate to create short VR film versions of mainstream films. “I am going to approach filmmakers of upcoming films and try to see if they are interested in getting a graphic novel out to publicise their film. It’s a quicker development path rather than doing a full-fledged VR film that would typically run to 20 minutes or so. With a graphic novel format you can step into the pages of the book and then the scene plays out around you,” he says.
DeSouza is also convinced that VR is not only here to stay, but that many of the ideas he explores in his film, and the novel it is based on, will become a reality quicker than we think. “It’s the way it’s going,” he says.
With his latest project, Dirrogate Deep VR, Clyde DeSouza is pushing the boundaries of VR filmmaking. While the original Dirrogate told a story in VR, allowing the viewer to look in any direction they wanted, the new version will make full use of Oculus’s positional tracking to allow the viewer to also move around the sets and interact with certain objects. For example, in one of the early scenes, on the balcony of an apartment, the viewer will be able to pick up and read the novel, Memories With Maya. To an extent, the viewer also gets the chance to interact with the characters. One of the central characters, Maya, invites the viewer into the apartment and the viewer can choose to go in, or they can explore the balcony. If they opt for the latter, they will automatically be transported to the next scene after about a minute, allowing the narrative to continue.