Live Stereo 3-D Acquisition
The main issue live-action stereo 3-D projects have to contend with is that no two cameras are exactly alike. The slightest inconsistencies in alignment, distortions and aberrations from the lenses, focus breathing, lens flare and specular reflections can produce discomfort or break the stereo illusion of depth. Some lenses even introduce subtle anamorphic squeezes.
The problem is particularly acute when zooming. Because no two lenses track identically, even a fractional misalignment will lead to uncomfortable 3-D viewing. During a zoom, the image will deviate around the center of the lens not only horizontally but also vertically.
Specialized 3-D motorized camera rigs, automated by bespoke software, are designed to control the interaxial distance (the distance between the lens axes) and convergence (the point at which the left- and right-camera optical axes cross) and to virtually eliminate pitch, yaw and roll between the cameras. Such motorized rigs also ensure that focus, iris and zoom (FIZ) are linked as closely as possible.
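As a rough illustration of the parameter set such a rig controller manages, here is a minimal Python sketch. All function and field names are hypothetical; no real rig protocol is implied. The point is simply that FIZ values are mirrored to both cameras, while interaxial and convergence remain the stereo-specific axes:

```python
from dataclasses import dataclass

@dataclass
class RigState:
    """Hypothetical motorized-rig state (illustrative only)."""
    interaxial_mm: float   # distance between the two lens axes
    convergence_m: float   # distance at which the optical axes cross
    focus_m: float         # FIZ values are set once and mirrored to
    iris_stop: float       # both cameras so the pair stays matched
    zoom_mm: float

def drive_pair(state: RigState) -> None:
    """Mirror the same FIZ values to both eyes; only interaxial and
    convergence are driven on the stereo baseplate itself."""
    for eye in ("left", "right"):
        # print() stands in for whatever servo protocol a real rig uses
        print(f"{eye}: focus={state.focus_m}m iris=f/{state.iris_stop} "
              f"zoom={state.zoom_mm}mm")
    print(f"baseplate: ia={state.interaxial_mm}mm conv={state.convergence_m}m")

drive_pair(RigState(interaxial_mm=45.0, convergence_m=8.0,
                    focus_m=8.0, iris_stop=4.0, zoom_mm=25.0))
```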
There can still be small inconsistencies because of chromatic and spherical aberrations and zoom breathing, and those may need to be addressed in a post-production environment that allows metadata tracking throughout the pipeline.
For live productions, a reliance on post is not an option, so obtaining accurate results at source is critical. It is vital that timecode references are genlocked and that computer control is established over zoom, interaxial distance and focal length while each camera's metadata is saved.
Rig Selection
Although some initial tests for broadcast stereo 3-D in Europe used nonmotorized (passive) rigs to reduce costs and did without the zoom function, BSkyB is setting the bar high after striking a deal with 3ality Digital to outfit a dedicated Sky 3-D OB vehicle (in tandem with integrator Sony and OB supplier Telegenic). The technology is perceived as expensive to hire, but it is renowned for its robustness, so equipment errors should be minimized.
BSkyB's philosophy chimes with that of other broadcasters looking to produce live events in stereo 3-D: make as much use of existing broadcast infrastructure as possible, leveraging existing investments in cameras, lenses with digital servo drives, production switchers and fiber cabling.
There is an increasing volume and variety of rigs available, and growing competition should drive costs down, although a live stereo 3-D shoot will cost anywhere between 15 percent and 50 percent more than current HD budgets. The cost profile of stereo is expected to mirror that of HD with premiums reducing over time as demand rises, technology proliferates and OB crews become trained stereographers.
The main commercial rigs for live stereo 3-D production are made by 3ality Digital, Pace, Binocle, Element Technica, P+S Technik and imARTis (SwissRIG). A range of cameras, from standard broadcast models to minicams such as the SI-2K and high-end imagers such as REDs, can be attached to these rigs or used with T-block adaptors.
Rigs can usually be configured as mirrored (beam-splitter) systems for close work with wide lenses or arranged side by side for use with longer focal lengths. Mirrored arrangements are indispensable but lose a stop of light in each eye and can introduce color distortion to each channel, as well as cause issues with polarized light reflected off some surfaces. They also tend to be heavier because of the additional mirror and infrastructure. Lightweight models are available or in development and will prove more suitable for Steadicam. Either way, a robust rig is vital: the more the cameras shift around, the harder they are to adjust.
Regardless of the rig, nothing is more crucial than balancing the lenses before shooting, and it's important to recognize, isolate and diagnose errors when calibrating each dual position.
Monitoring can be conducted locally at the camera or in the truck on professional stereo displays. The Transvideo monitor takes dual HD-SDI feeds and displays anaglyph (monochrome or color), an LCD shutter-glasses mode or a 50/50 color overlay. Polarized monitors (from JVC or Hyundai, using Xpol filters) take a single side-by-side HD stream (often via the Inition StereoBrain or 3ality Stereo Image Processor). Standard HD monitors can be fed anaglyph, 50/50 overlay or luma-difference images by the StereoBrain or 3ality processors.
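For illustration, the three monitoring modes named above can be approximated in a few lines of numpy. This is a sketch assuming 8-bit RGB frames, not a description of any vendor's implementation:

```python
import numpy as np

def anaglyph(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Red channel from the left eye, green/blue from the right
    (classic red/cyan anaglyph; assumes RGB channel order)."""
    out = right.copy()
    out[..., 0] = left[..., 0]
    return out

def mix_5050(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """50/50 overlay: misaligned edges show up as ghosting."""
    return ((left.astype(np.uint16) + right) // 2).astype(np.uint8)

def luma_difference(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Luma difference: matched areas go mid-gray; parallax reads as
    bright/dark edges, which is what the 3-D puller watches."""
    w = np.array([0.2126, 0.7152, 0.0722])   # Rec. 709 luma weights
    return np.clip(128 + (left @ w - right @ w), 0, 255).astype(np.uint8)
```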
The motorized rigs are operated in conjunction with software. For example, the 3ality processor enables the creation of a look-up table (LUT) for any given pair of zooms. To set the rig up, the lens pair is put through a 16-point line-up in which the lenses are pulled to their full focal length and back again. By lining up on close and distant targets, the operator can dial in those points so that the moving baseplate will shift accordingly (left, right, up or down) for any given focal length. The processor logs all changes that need to be made so the zoom stays centered. An additional LUT tracks the speed of each lens relative to the other so that the image stays the same size from the beginning to the end of zoom travel. For the rigors of an outdoor shoot (temperature, wind, knocks), there is a manual override that lets an operator make adjustments live.
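A minimal sketch of how such a LUT might behave, assuming illustrative (not measured) correction values and simple linear interpolation between the 16 calibrated points:

```python
import numpy as np

# Hypothetical 16-point zoom line-up: at each calibrated focal length,
# the correction the moving baseplate must apply so the two images stay
# centered. Values here are illustrative, not measured.
focal_mm  = np.linspace(4.5, 90.0, 16)
x_corr_mm = np.array([0.00, 0.02, 0.03, 0.05, 0.06, 0.08, 0.09, 0.11,
                      0.12, 0.14, 0.15, 0.17, 0.18, 0.20, 0.21, 0.23])
y_corr_mm = x_corr_mm * 0.4   # vertical drift, typically smaller

def baseplate_correction(f_mm: float):
    """Interpolate the LUT between calibrated points so a correction is
    available at any focal length, not just the 16 measured ones."""
    return (float(np.interp(f_mm, focal_mm, x_corr_mm)),
            float(np.interp(f_mm, focal_mm, y_corr_mm)))

print(baseplate_correction(50.0))   # correction at an uncalibrated zoom
```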
3-D Truck Design
To keep costs down, the design of 3-D-capable OB trucks is likely to adhere to the footprint of existing HD trucks as far as possible. Systems integrators are working on the assumption that 3-D is an upgrade path, not a new build. The bulk of a redesign will be remapping the router so that there's enough connectivity around the existing truck to tie left- and right-eye streams together.
With 3-D, the viewer wants more time to explore the image because there is more information, particularly in the main wide shots. There is consensus that a conventional 2-D edit will not work for 3-D: direction needs to be slower, and fewer camera angles are needed. Consequently, 2-D and 3-D broadcasts of the same live event are likely to be produced separately, and fewer camera positions will be required for 3-D. That saving is balanced, however, by the current necessity of one additional “3-D puller” per camera pair, so trucks may require modification for seating.
The 3-D pullers' work will be overseen by a stereographer responsible for the overall 3-D design and for supervising the depth balance across all cameras. The stereographer will create a “depth script” for the coverage, perhaps wanting to deliver greater 3-D punch in the first five minutes or to alternate the 3-D effect with the ebb and flow of play.
The stereographer imparts that information to the 3-D pullers, who view the image difference (foreground and background separation) from their stereo pair on monochromatic displays overlaid with a grid. The grid lines afford them an easy way of controlling the crucial interaxial spacing and convergence as the FIZ parameters change.
Sky Sports, for example, is working to an overall depth budget of 3 percent, defined as a percentage of screen width; in the main, the budget is 1 percent in front of the screen and 1 percent behind for a naturalistic feel. Sky Sports advises that for most sports, it is safe to go to ±2 percent for occasional actions (such as objects coming dramatically close to camera). As the camera operators pull zoom, the 3-D pullers adjust the convergence and interaxial distance accordingly to stay within the depth budget set by the stereographer.
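Because the depth budget is expressed as a percentage of screen width, the corresponding on-screen parallax is simple arithmetic. For a 1920-pixel HD raster (an assumed frame size, for illustration):

```python
WIDTH_PX = 1920   # assumed HD frame width

def budget_px(percent: float) -> float:
    """Screen parallax allowed by a depth budget given as % of screen width."""
    return WIDTH_PX * percent / 100.0

print(budget_px(1.0))   # 19.2 px: the everyday limit in front of/behind screen
print(budget_px(2.0))   # 38.4 px: the occasional 'dramatic' excursion
```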
In theory, all 3-D pulling will occur in the vision control room while the director operates in the production area as normal. The director will be directing from 2-D monitors alongside a final 3-D view monitor and will trust the stereographer's skill that when a cut is called, the camera is prepared with the correct interaxial spacing and convergence. It's possible that each 3-D puller will also have his own small 3-D color monitor to glance at, aside from the black-and-white difference monitor.
Clearly this represents an untenable number of crew for the long term, but software processing boxes such as Sony's are being designed to be capable of managing two camera pairs in the future.
It's also important to note that in BSkyB's tests, the impact of eye fatigue on OB operators looking at monochromatic 3-D screens for several hours was an issue — particularly for sports such as football, where the convergence point is always moving. Its solution is to employ a “floating” convergence technician to rotate staff at intervals.
Mixing 2-D and 3-D
It is likely that additional 2-D HD feeds will be used to augment live 3-D coverage. The 2-D feeds would pass through an image conversion tool that uses software algorithms to artificially push the background layer into positive parallax away from the foreground layer to give a sense of 3-D. This can be performed in real time. There are obvious cost-saving benefits, and an additional camera position will give the director a greater element of choice.
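As a crude sketch of that layer-based principle (real converters estimate depth per pixel and fill the occlusion holes this version ignores; the mask, shift value and function name are hypothetical):

```python
import numpy as np

def fake_depth(frame: np.ndarray, bg_mask: np.ndarray, shift_px: int = 10):
    """Push a masked background layer into positive parallax by shifting it
    opposite ways in each eye (positive parallax = the right-eye image of a
    point sits to the right, so the layer reads as behind the screen)."""
    left, right = frame.copy(), frame.copy()
    shifted_l = np.roll(frame, -shift_px, axis=1)   # background moves left
    shifted_r = np.roll(frame,  shift_px, axis=1)   # background moves right
    left[bg_mask] = shifted_l[bg_mask]
    right[bg_mask] = shifted_r[bg_mask]
    return left, right
```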
Electronically Controlled Rigs
In the longer term, it's envisaged that a master stereographer will oversee and set the depth while much of the manipulation is handled electronically rather than manually. Misalignments caused by zoom lenses are also intended to be corrected electronically. Sony is developing such a 3-D processor box that can deliver camera-pair alignment and correct for errors introduced in the rig, including image geometry and color matching. It offers stereographic engineers another way to manage alignment, in addition to mechanical alignment on the rig, for controlling live 3-D capture.
What the box will not be able to do, according to some experts, is simulate a change in interaxial spacing, which defines the 3-D volume. This will still need motorization to physically move the cameras closer together or farther apart as the shot dictates. Camera setup would be quick as long as basic adjustments of vertical offsets and interaxial distance were observed, with dual images fed to the camera control unit (CCU) as normal.
From the CCU, the left- and right-eye signals would be fed to the stereo processor (one per camera pair), with the results monitored on-screen by the 3-D puller. A zoom function enables the stereographer to home in electronically on an image, with the processor automatically rescaling the image or adjusting for lens tilt.
The next advance is to retrieve data from the lens itself. Such metadata already exists, stored in the front end of the camera, but until now has not been necessary in the OB van. That data can be routed down to the CCU as part of the HD-SDI stream so nothing needs altering in the OB truck; it's just ancillary data encoded into the video stream. The processor will decode the data and, based on the focal length, make corrections electronically to the image without needing to physically shift the rig.
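A sketch of that idea, with everything hypothetical except the principle: the focal length arriving as metadata selects an image-space correction instead of a physical rig move. Integer pixel shifts via np.roll stand in for the sub-pixel resampling a real processor would perform:

```python
import numpy as np

def electronic_correction(frame: np.ndarray, focal_mm: float, lut) -> np.ndarray:
    """Correct one eye's image per focal length without moving the rig.
    `lut` maps focal length to (dx, dy) pixel offsets, e.g. derived from
    the 16-point line-up described earlier."""
    dx_px, dy_px = lut(focal_mm)
    out = np.roll(frame, int(round(dy_px)), axis=0)   # vertical correction
    return np.roll(out, int(round(dx_px)), axis=1)    # horizontal correction
```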
The software processor would include color correction and balance tools so that the dual outputs could be matched. Other discrepancies in camera racking would be corrected automatically. The delay incurred by such prototype systems is currently two to three frames; this is likely to decrease and, in any case, is not unfamiliar to broadcasters used to working with virtual studio-style setups.
3-D Cameras
Several manufacturers are developing single-body stereo cameras, which would reduce the weight, footprint and probable cost of twin-bodied rigs. Panasonic is the first out of the blocks with a single-body dual-lens camcorder that will ship around IBC2010. The camcorder packs two nonexchangeable 12X zooms, a camera head and a memory card recorder into a 3kg body. In essence, it is two distinct cameras recording two streams compressed with long-GOP AVCHD onto SDHC/SD cards, with each recording maintained separately until post production. A remote control tethered to the camera head enables an operator to adjust the convergence point while in use.
A single-body dual-lens camera will be useful for certain applications, such as goal-line POV shots or the occasional Steadicam, because the apparatus is lighter. For most applications, such as the primary camera position in a stadium gantry, side-by-side cameras are needed to open up the interaxial distance and force the 3-D perspective.
According to a leading stereographer, fixed-interaxial devices are a red herring. It is a common misconception that 63mm, the average interocular spacing for humans, is some sort of holy grail. In fact, 63mm is just one of many interaxial spacings that can be used for 3-D.
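The standard parallax formula for a parallel rig converged by horizontal image shift makes the point concrete (the numbers below are illustrative assumptions, not measured values):

```python
def parallax_pct(b_mm: float, f_mm: float, sensor_w_mm: float,
                 conv_m: float, z_m: float) -> float:
    """Screen parallax as a percentage of image width:
    p = (f * b / w_sensor) * (1/C - 1/Z); positive means behind the screen."""
    return 100.0 * (f_mm * b_mm / sensor_w_mm) * (
        1.0 / (conv_m * 1e3) - 1.0 / (z_m * 1e3))

# A 63mm interaxial on a 100mm lens (2/3in sensor, ~9.6mm wide), converged
# at 30m with the background at 100m, already overshoots a 1 percent budget:
print(parallax_pct(63, 100, 9.6, 30, 100))   # ~1.53
```

On a long lens, the “human” 63mm blows the budget, while a distant wide shot may need far more than 63mm to register any depth at all, which is precisely the stereographer's point.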
By Adrian Pennington, BroadcastEngineering