Journey to the Center of the Earth: 3-D Coming at Ya!
Eric Brevig's remake of the Jules Verne classic, Journey to the Center of the Earth, is not only chock full of vfx but also represents the first big live-action test case for 3-D. There are virtual environments, fully CG creatures, atmospheric effects, water simulations, set extensions, digital doubles and complex live-action integration. As for the stereoscopic impact -- evident to those fortunate enough to catch the film in digital 3-D -- the vfx team found achieving it extremely challenging.
No fewer than five visual effects firms worked on the production. All five -- Meteor Studios (now defunct), Hybride (now sold to Ubisoft), Frantic Films, Mokko and Rodeo FX -- did CG environments, complex simulation work and stereo compositing. In addition, Meteor did CG creatures (the dinosaur). Hybride also did flowing water (the river) and CG creatures (carnivorous plants, glowing birds and dandelions). Frantic Films did water (the ocean) and water creatures (the flying fish and water dinosaur).
Coordinating their efforts and overseeing all the visual effects work on the film was a monumental task, one handled quite capably by overall visual effects supervisor Chris Townsend, who worked hand-in-hand with Brevig (himself a vfx supervisor turned director) to ensure that all the effects synced up in the stereo format.
"Everything had to work in stereo," Townsend stated. "All the tricks we are so used to using in the world of feature film visual effects in a mono world, had to be reconsidered." That proved to be, as one might expect, the most challenging aspect of the production. "In 2-D, if you need to show that an object is a hundred feet away, elevating the black levels, reducing its scale, maybe blurring a little to imply depth cueing, works. These are the tricks we know. Working in stereo -- all that goes out the window. In stereo the object literally has to be placed a hundred feet away, in the virtual world. That means you have to create a CG 3-D environment, which is created to scale -- for every shot. There is no cheating; the rulebook has changed. Those tricks, those sleights of hand, with which we are so familiar, they disappear. If it doesn't work in stereo, it doesn't work. That was the biggest challenge: learning that the way we work has changed."
Everything from composition of a shot within the third dimension to integrating live-action elements so that they seamlessly fit within a scene had to work not only from a 2-D aesthetic point of view but also from a stereo perspective.
"Often we worked with sub pixel accuracy to ensure that a splash happened upon the surface of an ocean rather than floating above, or a bluescreen foot walked upon a CG plane rather than under it. The level of detail required to ensure that the audience wouldn't think, 'Hey, there's something wrong about that image,' was intense. We look at the world in stereo, that's our life. Creating a film in stereo takes us one step closer to that reality that we know so well. You don't need to be a visual effects expert to tell if something is wrong in stereo. You just need to have stereo vision! However, understanding the problem and knowing how to solve it does require a different level of expertise."
So how did Townsend and the rest of the visual effects team solve the problem?
"As we photographed the entire film in stereo, using specially designed rigs holding two cameras, ensuring that the two 'eyes' matched was another challenge. Lenses are physical things which, even though they are built to pretty tight tolerances, are never identical. The two images created can be off in scale, rotation, vertically misaligned, have a focus or depth of field mismatch, have a different exposure. All these things need to be adjusted so that the images are normalized with the only variation between the two being a horizontal offset, like your eyes. Otherwise, viewers will have a hard time resolving the two images. Even if they can, eyestrain and headaches will ensue. Not good for a full-length feature film!"
All of the vfx sequences were prevised by Persistence of Vision. "One sequence, the floating rocks, was also prevised in stereo, which allowed us to test out some of our thoughts on the interocular distances (how far apart the lenses should be) and convergence planes. We did an early test of an actress in a cave environment, shooting with the stereo cameras, to put everything through a dry run, to test out stereo pipelines in some of the facilities and to do some (image) development in stereo."
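Those previs tests answer a concrete question: how wide can the interocular go before the far background diverges uncomfortably on a big screen? A back-of-the-envelope version of that calculation, with wholly illustrative numbers (none of these figures come from the production):

```python
def max_interocular_mm(budget_mm, focal_mm, sensor_w_mm,
                       conv_m, far_m, screen_w_mm):
    """Largest interocular keeping the farthest object's screen parallax
    within budget_mm, for parallel cameras converged by image shift."""
    spread = 1.0 / (conv_m * 1000.0) - 1.0 / (far_m * 1000.0)  # 1/mm
    return budget_mm * sensor_w_mm / (focal_mm * screen_w_mm * spread)

# e.g. a 65 mm parallax budget (roughly human eye spacing), a 21 mm lens
# on a 22 mm-wide sensor, converged at 4 m with a 50 m background, shown
# on a 12 m-wide screen, caps the interocular near 25 mm:
print(max_interocular_mm(65.0, 21.0, 22.0, 4.0, 50.0, 12000.0))
```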
As for the software used on the production, since there were multiple vfx firms involved, each used its own platforms.
"Each had to be enhanced to create compatibility with stereo. Some were tuned with proprietary GUIs to try to simplify working with two corresponding eyes. For others it was just a matter of designing a stereo workflow.
"On the set we used hollow cubes and grided markers to help the camera tracking in post, but rig and lens metadata was also encoded into the DPX frames. This information was then used by most vendors to further simplify the camera tracking process."
Townsend also credits FrameCycler as a viewing tool of major importance.
"Being able to view shots in stereo was key. Initially I reviewed work on a monitor, using Iridas' FrameCycler using active shutter glasses. The ability to analyze full resolution shots in realtime, in stereo, in my own time, without the constraints of having to book a theater, was incredibly important. Discerning what is right and wrong about a stereo image is complex. There are so many aspects, which can cause an image to be wrong. Is an element within the image flipped, misaligned, out of synch, in the wrong stereo space? Are the left and right eyes of different exposure? Are there photographic anomalies (lens flares, blur, dust, etc) that appear in one eye but not the other? All these things have to be studied in order to move the shot along, from a stereo point of view. And that doesn't even take into account the aesthetics of the shot itself; that was a whole other ball game."
Another viewing tool that Townsend used was Acuity's QuVis software.
"Having viewed the work on a monitor, I then reviewed the work on a 23-foot screen, using dual projectors, passive circular polarized glasses and Acuity's QuVis software. This allowed me to look at the work from an audience's perspective and to examine the stereo space more accurately."
Townsend admits that working in stereo is working in a whole other dimension. "It relies on techniques that we, as an industry, are only just learning, but the future promises to be richer and far more immersive, from an audience's point of view."
Frantic Films VFX, a division of Prime Focus Group, served as a lead visual effects provider on the film. The filmmakers came to Frantic because of the studio's expertise in creating believable digital water effects using its proprietary fluid simulation suite, Flood. Frantic also created three digital characters end-to-end -- the Razorfish, Plesiosaur and Trilobite -- developing a flexible character pipeline using Autodesk 3ds Max. Custom plug-ins were scripted to manage data interchange from rigging to animation, modeling and lighting. This non-linear approach provided a more practical workflow. In the event that changes were called for, the team didn't have to halt the entire production pipeline.
The Frantic team was led by Vancouver Visual Effects Supervisor Chris Harvey, Winnipeg Visual Effects Supervisor Mike Shand, Visual Effects Producer Randal Shore and Head of Software Mark Wiebe.
"I think by far the most challenging aspect is the fact that most of the traditional 2-D compositing tricks don't work in stereo," Shand said. "There is usually a lot you can do in 2-D to finesse a shot, fix CG issues and whatever. However, in stereo it needs to be done in 3-D so that the element is grounded correctly in z-space and has depth within itself."
Some of the vfx facilities on Journey to the Center of the Earth opted for a 2-D workflow, converting to 3-D only at the very end, but Frantic decided to work in 3-D the whole way through.
"We looked at the various ways we could composite the show," Shand explained. "We considered having the second eye auto generated through advanced scripting, which would require some additional artist tweaking in the end. However, very early on we received some test footage to play with. We found that our compositing package Eyeon Fusion could easily handle the memory requirements of compositing both eyes simultaneously.
"We also found that if we paired the eyes into a single image, that it made managing all the elements far simpler for the compositor. So, if you were to apply a color correct blur or whatever, it would be applied to each eye evenly, and at any time the shot could be rendered no matter what the state and screened in stereo. If a stereo problem was found, then it was very quick and simple for the artist to find the problem in the flow, since both eyes remained together the whole way through.
"This was very important because our actors were surrounded by CG rain and splashes, and so a lot of work went into seating everything into the right stereo depth. We created a variety of custom tools within Eyeon Fusion that gave us the ability to view the shots in anaglyph stereo at any stage in the composite. This gave the artists the ability to assess their shots in stereo at their workstations before we would screen them. In the end, we were very satisfied with this approach."
Frantic also did custom development to facilitate how the metadata translated to the actual camera rig to simplify final 3-D renders. The 3-D camera systems used to shoot Journey were equipped to change their interocular distance dynamically, making the process of tracking and then applying a basic offset to the second camera impossible. Fortunately, the cameras recorded additional metadata that captured all of the animated interocular movements. Frantic wrote tools to extract this data and used it to generate the second camera. The tools also allowed for some additional tweaking to correct imperfections in the information recovered from the footage.
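A hedged sketch of that second-camera step: because the interocular animates, the right eye can't be a fixed copy of the tracked left camera; instead it is rebuilt per frame by sliding the left camera along its own x-axis by the recorded interocular. The matrix conventions below (row vectors, camera-to-world, metres) are assumptions, not Frantic's actual tool:

```python
import numpy as np

def right_cameras(left_to_world, interocular_m):
    """left_to_world: (N, 4, 4) per-frame camera-to-world matrices.
    interocular_m:  (N,) per-frame interocular from the rig metadata."""
    right = left_to_world.copy()
    # Camera-local x-axis expressed in world space (row-vector convention).
    x_axis = right[:, 0, :3] / np.linalg.norm(right[:, 0, :3],
                                              axis=1, keepdims=True)
    # Slide the translation row along local x by the animated interocular.
    right[:, 3, :3] += x_axis * interocular_m[:, None]
    return right
```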
Marc Rousseau, the vfx supervisor from Mokko, discussed their compositing contributions: "Compositing in a non-stereo feature is all about cheating. Cheat perspective, cheat distances, cheat rotopaint, etc. In the stereo world, this is not possible at all. Everything we have been working really hard at mastering over the last 15 years is now obsolete, so we needed to go back to the basics of compositing and CG and forget all the cheats we have come to use."
Pierre Raymond, Hybride's visual effects producer and supervisor, singled out interaction with a 3-D environment as the most problematic aspect.
"Our biggest challenge was to have live actors and various objects interact in 3-D stereoscopic. Not only was it imperative for the animation to look natural, realistic and fluid, but it also had to interact perfectly with the actors' actions in a three-dimensional environment. Another challenge was striking the right balance between stereoscopic images and visual comfort throughout all 234 shots. And so, it was with these parameters in mind that we approached this daring project, which allowed us to utilize all of our departments and know-how."
Aaron Dem, the former VP of production for Meteor and now president of production for Lumiere VFX, said they worked on several sequences. "We did full CG virtual environments for all of these sequences. We did effects animations throughout these sequences. We did the dinosaur. For the Mine Ride, [there was] a two-minute CG sequence and full environment that we did. We also did the Mine Crash and the Mine Entrance. Those were full CG environments as well.
"Challenges working on a stereo 3-D film were multiple. One was basically manipulating the stereoscopic images when Eric wanted the photography repositioned. We had to redo a whole layout and then figure out all the camera information and the 3-D information and then re-track that, replace it in a full layout and then re-render all the elements. Obviously the tracking was difficult and the rotopainting was difficult. The tracking primarily was the most difficult part.
"We built a proprietary software to deal with the lens issues within Maya and also we received some other data from the camera so we used that as well to get us started on our pre-tracking. Then we used 3D Equalizer. They [3D Equalizer] worked hand-in-hand with Meteor to develop new tools as well to tackle the challenges that we had to deal with in stereoscopic.
"The project was challenging because it was the first live-action stereoscopic film produced with the new technologies. So, being on set for almost all the shoot, you got to see first hand how the camera technology worked and processed daily. We'd shoot something then go into a screening room to make sure it looked great.
"Just the process of re-lighting stereo shots for a film of this magnitude was definitely a challenge because Eric had a big toolbox to work with, and he definitely wanted to use every tool in his possession. So, it was a challenge to keep up with him, and it was a pleasure to work with him. I think at the end of the day, it'll prove the new technology is here to stay, and it's a great theatrical experience."
By J. Paul Peszko, VFXWorld