Seeing Production and Post in Stereo
When he started doing research into post-production workflows, David Cummins got more interested in 3D than in DIs. And once Quantel upgraded its Pablo finishing system to handle stereoscopic workflows, Cummins figured it was time to get involved in 3D in a big way. He's now the director of business development at Stereoscope in Burbank, where he's working with Managing Member Jeff Pierce. Right now, Stereoscope is a post and DI facility with 3D capabilities, but in the near future Cummins plans to have a full-bore production-and-post company running live-action 3D movies through the pipeline. We called him up to ask why he thought now was the right time to go deep with 3D.
What convinced you that now was the right time to open a 3D facility?
The man behind the idea is a gentleman named Jeff Pierce. I was consulting for a company last year that wanted to get into the DI business. I started looking at 3D and I said, “I think you should forget about buying film scanners and film recorders. It’s all going to a file-based workflow with digital cinema cameras. I think you should start focusing on 3D.”
What got me thinking about 3D was Shamrock Holdings, Roy Disney's investment company, putting $50 million into Real D last year. I got on Quantel's radar because I thought Quantel was farther ahead than anyone else in terms of robust solutions, and they invited me to a demo, at Vince Pace's place, of a Pablo that had stereoscopic output. I called Jeff Pierce up and said, "Hey, I think 3D on the post side is real. We should look at this technology."
Jeff and I decided the post market was nearly ripe, but we'd have to be careful about how we got in. Jeff took it from there. He wrote a business plan for Stereoscope, got it funded, and we built a post facility for this company. But we're actually an entertainment company; we're here to produce content as well. Post is our primary focus for now.
So you decided the business case for 3D is strong...
It’s looking viable, for sure.
...and at the same time saw the Pablo as an anchor for a post workflow.
Yes. The strength of the Pablo is you can edit, you can composite, you can do color correction and spatialization all on the same box. Scratch and Iridas are great technology, but if you're spatializing or color-correcting and all of a sudden the director wants to change a scene or an effect, you have to go all the way back through the process. You have to go back to editorial, you have to go back to the online and reconform and re-import everything. On the Pablo, for the most part, you can do it all in real time.
And your post pipeline is up and running now?
Yes. We're probably a little ahead of the demand. We were expecting it to come sooner, but the writers' strike slowed everything down. At last look, there were 20 to 30 greenlit 3D features in the pipeline, and with SAG still at the table, nothing new is being greenlit until that's resolved. It's impacted 2D work, but it's been even harder on the 3D side. We're doing 2D DI work on the box now, and we're mastering images for a 3D feature film that we're producing in house.
Is it live-action?
It's an outdoor-adventure, adrenaline type of project. It'll probably be shot in locations worldwide, and hopefully it will be ready to go in 2009.
To date, there’s a feeling that mainstream Hollywood still sees 3D largely as something you do to animation — or to kids’ movies — to juice them up. U23D has changed some of that, and Hannah Montana to some degree.
You had live-action music projects. Hannah Montana was a natural, but that was more of a phenomenon because she had a built-in audience among the teen crowd. That was a brilliant move by Disney. You’re not selling one movie ticket, you’re selling two or three, because Mom and Dad are going to go, too. That’s one of the compelling things about 3D. If it’s the right project, especially for kids, they’ll come back and see it again.
The project we’re working on will probably be cross-generational. Grandparents will be watching it with their grandkids because they’ve all enjoyed the sport together. The kids will probably go back and see it again.
How pervasive can 3D become? I’ve heard people insist that in, say, 10 years, substantially everything is going to be shot in 3D.
My objectivity may be clouded because I’ve joined the 3D club. I drank the 3D Kool-Aid, and it’s pretty cool. It’s so compelling to look at, and it changes your experience. The physiological effect of seeing images recreated the way you see them in real life is submersive and immersive at the same time. You forget where you are if you watch something compelling.
You're going to hear announcements soon about 3D television sets. Between Mitsubishi and Samsung, they've probably sold more than two million 3D-ready sets. The technology is way ahead of the content. There are emerging format standards — and there may or may not be a format battle — but there are people who can get 3D played back from Blu-ray. That's another vehicle for delivering 3D. When everything goes digital, the spectrum for digital [broadcast] is going to be one channel of high-def or five channels of standard-def. You could experiment with delivering two channels of near high-def, left-eye/right-eye encoded. There are a lot of ways to do it, and it's not rocket science.
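As a back-of-the-envelope illustration of that channel arithmetic, here is a short Python sketch. Every bitrate in it is an assumption chosen for illustration, not a figure from the interview or from any broadcast spec:

```python
# Back-of-the-envelope arithmetic for the broadcast-spectrum claim.
# All bitrates are illustrative assumptions, not spec or interview figures.

MUX_MBPS = 19.4   # rough capacity of one digital broadcast multiplex (assumed)
HD_MBPS = 18.0    # a single full high-def channel (assumed)
SD_MBPS = 3.8     # a single standard-def channel (assumed)

# One multiplex carries one HD channel, or roughly five SD channels:
hd_channels = int(MUX_MBPS // HD_MBPS)   # -> 1
sd_channels = int(MUX_MBPS // SD_MBPS)   # -> 5

# Splitting the same multiplex into two "near high-def" streams,
# one per eye, leaves roughly this bitrate for each:
per_eye_mbps = MUX_MBPS / 2.0            # -> 9.7 Mbps per eye

print(f"HD channels per multiplex: {hd_channels}")
print(f"SD channels per multiplex: {sd_channels}")
print(f"Per-eye bitrate for a left/right split: {per_eye_mbps:.1f} Mbps")
```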
So the technology is in place. It’s just a question of who wrangles it and how?
It boils down now to some format issues, delivery issues. We were at a meeting of the 3D@Home Consortium that just formed [at NAB in Las Vegas], and the mantra was content, content, content. There's just not enough content. In Japan, BS 11 [one of a number of channels transmitted via broadcasting satellite over Japan] is broadcasting about four hours of 3D a day into the home right now. In February, Germany started testing in-home delivery of 3D, and about two months ago a rugby tournament was broadcast live in 3D [by the BBC]. So Europe and Japan, as usual, are way ahead of us.
How much work is required to translate 3D from a large screen to a home screen?
That's a good question. I don't think anyone knows the answer yet. Right now everyone is shooting 3D with paired cameras, either on a parallel rig or with a beam splitter that allows the cameras to get closer together. But you introduce problems when you do that. When you have light hitting one lens directly and light being reflected off the mirror into the other lens, you have a slight color-temperature difference happening. No mirror is perfect, so you have some distortion issues to deal with and have to create a mesh to un-distort the image.
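For a sense of what that mesh-based un-distortion step can look like, here is a minimal Python/OpenCV sketch. The file names and the warp terms are hypothetical stand-ins; a real mesh would come from calibrating the rig's mirror:

```python
import cv2
import numpy as np

# Minimal sketch of mesh-based un-distortion for the reflected-eye image
# of a beam-splitter rig. The mesh below is hypothetical; in practice it
# would be measured in a calibration pass against the mirror.

img = cv2.imread("right_eye_frame.png")   # hypothetical file name
h, w = img.shape[:2]

# Start from an identity mapping (each output pixel samples itself)...
map_x, map_y = np.meshgrid(np.arange(w, dtype=np.float32),
                           np.arange(h, dtype=np.float32))

# ...then perturb it with a made-up smooth warp standing in for the
# measured mirror distortion (a real calibration mesh replaces these terms).
map_x += 2.0 * np.sin(map_y / 150.0)
map_y += 1.5 * np.sin(map_x / 180.0)

# Resample the image through the mesh to "un-distort" it.
undistorted = cv2.remap(img, map_x, map_y, interpolation=cv2.INTER_LINEAR)
cv2.imwrite("right_eye_undistorted.png", undistorted)
```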
Some very prominent 3D people are saying, "The way things are in the field, you have to fix the interocular distances in post." That's a fix-it-in-post mentality that we don't agree with. We're looking at camera technology that's been around for 20 or 30 years to simplify the process. If you shoot 3D, you have to be able to pull convergence in the same way you would pull focus in a 2D feature. For 3D to hit critical mass on the production side, somebody is going to have to come up with camera technology that does what it's supposed to do. A DP has to be able to get up to speed without being a trained specialist.
I remember talking to filmmakers who were shooting on HDCAM for the first time. And I'm saying, "OK, let me talk to your DP. Don't blow out the whites, don't crush the blacks, don't try to get that look in your HD camera." Now the studios are shooting 3D, and the director's being told, "Here's your interocular, here's your 'depth-view,' but we're going to tweak that in post and sync everything up." If it's going to be interpreted in post, there's going to be some confusion there. It ought to be that the director or the DP sets the Z-space out in the field or on the set, and that's what they create. Post-production technology is now becoming part of the creative process.
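To make the relationship between interocular, convergence, and on-screen depth concrete, here is a simplified Python sketch of the standard stereo-parallax approximation. All of the numbers in it (interaxial, focal length, sensor and screen widths) are illustrative assumptions, not values from the interview:

```python
def screen_parallax_mm(interaxial_mm, focal_mm, sensor_w_mm, screen_w_mm,
                       convergence_m, subject_m):
    """Approximate on-screen parallax for a converged stereo pair.

    Objects at the convergence distance land on the screen plane (zero
    parallax); nearer objects come off the screen (negative parallax),
    farther objects sit behind it (positive). Small-angle approximation.
    """
    c_mm = convergence_m * 1000.0
    z_mm = subject_m * 1000.0
    # Disparity at the sensor plane, in millimeters:
    sensor_disparity = focal_mm * interaxial_mm * (1.0 / c_mm - 1.0 / z_mm)
    # Magnify up from sensor width to projection-screen width:
    return sensor_disparity * (screen_w_mm / sensor_w_mm)

# A close subject shot with a full 65 mm interaxial, shown on a 10 m screen:
p = screen_parallax_mm(interaxial_mm=65, focal_mm=25, sensor_w_mm=24,
                       screen_w_mm=10_000, convergence_m=5, subject_m=2)
print(f"{p:.0f} mm of parallax")   # about -200 mm: painful to fuse
```

Run with these assumed numbers, the sketch shows why interocular has to be set deliberately: a nearby subject at full human interaxial produces roughly 200 mm of negative parallax on a big screen, far past what audiences can comfortably fuse.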
And eventually a 3D effect is going to be much the same as the choice of color or framing — a specific creative decision that has to be respected?
It has to be respected, but the 3D guys have to say, “If you try to go for that, that’s going to be an eye-ripper. That’s not going to work. We better back off on that.” The only way to get a sense of that is to shoot dailies, look at them, and say, “Oh no, that’s too extreme.” The Pablo can re-spatialize the Z-space, and that may be something you have to do. But there’s going to be a debate, and confusion.
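One simple form of that re-spatialization is horizontal image translation: sliding the two eyes against each other moves the whole scene relative to the screen plane. The sketch below is a minimal, hypothetical NumPy version of the idea, not a description of how the Pablo does it:

```python
import numpy as np

def horizontal_image_translation(left, right, shift_px):
    """Crude Z-space re-spatialization by horizontal image translation.

    A positive shift adds that many pixels of parallax to every feature,
    pushing the whole scene back toward or behind the screen plane; a
    negative shift pulls it forward. Edges are cropped so both eyes keep
    the same width. `left` and `right` are HxWx3 image arrays.
    """
    s = abs(int(shift_px))
    if s == 0:
        return left, right
    if shift_px > 0:
        # Crop the left eye's left edge and the right eye's right edge:
        # every feature's left-eye x drops by s, so parallax grows by s.
        return left[:, s:], right[:, :right.shape[1] - s]
    # Mirror-image crop: parallax shrinks by s, and the scene comes forward.
    return left[:, :left.shape[1] - s], right[:, s:]

# Hypothetical blank frames standing in for a real stereo pair:
left = np.zeros((1080, 1920, 3), dtype=np.uint8)
right = np.zeros((1080, 1920, 3), dtype=np.uint8)
l2, r2 = horizontal_image_translation(left, right, 12)
print(l2.shape, r2.shape)   # both eyes are now 1080 x 1908
```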
Let's face it. I still have people coming in who don't understand HD. They don't understand the difference between 60i and 24p. They don't understand 1920x1080 as opposed to 1280x720. They don't understand HD, and now you're going to be shooting two channels of HD — or maybe, in some cases, on a 65mm film camera. In post, we're going to have not just two channels but several new challenges. We hope the left camera is shooting at the same frame rate as the right camera. We hope the interocular was kept within a range we can work with when we pull convergence. We hope all those things happen. If they don't, we'll be fixing it in post. But the studios will be saying, "What's taking you so long?"
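A first line of defense against those mismatches is a simple dailies QC pass confirming the two eyes agree before anything is conformed. The sketch below uses hypothetical metadata dictionaries standing in for whatever a real file probe would return:

```python
# Minimal stereo-dailies sanity check: the two eyes must agree on the
# basics before conforming. These dicts are hypothetical stand-ins for
# the metadata a real probe of each camera file would report.

left_meta  = {"fps": 23.976, "width": 1920, "height": 1080, "frames": 86400}
right_meta = {"fps": 23.976, "width": 1920, "height": 1080, "frames": 86400}

for key in ("fps", "width", "height", "frames"):
    if left_meta[key] != right_meta[key]:
        raise ValueError(f"Eye mismatch on {key}: "
                         f"left={left_meta[key]} right={right_meta[key]}")
print("Left and right eyes agree; safe to conform.")
```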
By Bryant Frazer, StudioDaily